Why Blocking Bot Traffic is the Single Best Thing You Can Do for Your AdSense Revenue
How cleaning up bad traffic, tuning PHP, and adding Redis transformed ad quality across all my sites — and why most publishers never think to do this.

Most people chasing AdSense revenue obsess over content, keyword research, and ad placement. Very few ever look at the other side of the equation — the tsunami of garbage traffic quietly sabotaging everything they've built. I know, because I've been there.
After implementing a serious round of bot-blocking and traffic filtering across my own websites — steps I covered in detail in my Plesk VPS speed and security guide — I started noticing something unexpected. Not only were my sites faster and more secure, but AdSense revenue began to tick upward. Some sites that had flatlined at zero impressions started picking up genuine visitors. And perhaps most noticeably, the rubbish placeholder ads — the "Congratulations, you've won!" banners and the blank grey boxes — started being replaced by real, relevant, properly paid advertisements.
This wasn't a coincidence. There's a clear mechanism at work here, and once you understand it, you'll wonder why nobody talks about it more.
🤖 What Even Is Bot Traffic?
Before we get into the AdSense side of things, it's worth being clear about what we're actually dealing with. When people say "bot traffic," they're referring to automated, non-human visits to your website. Not every bot is malicious — Googlebot crawling your pages to index them is perfectly fine and exactly what you want. The problem is the other kind.
Bad bot traffic falls into two main categories. The first is General Invalid Traffic (GIVT), which covers known crawlers, data centre visitors, and basic automated scrapers. These aren't necessarily trying to harm you — they might be price scrapers, content aggregators, or research tools — but they're not humans and they're not going to click an ad with any genuine intent.
The second and more dangerous category is Sophisticated Invalid Traffic (SIVT). These are the real troublemakers. They use botnets to mimic genuine human browsing behaviour — they'll load pages, appear to scroll, dwell for a few seconds — all designed to look legitimate. Some are part of organised click fraud operations. Others are deployed by competitors to drain ad budgets. Either way, they are poison for any website that relies on advertising.
Research suggests bots account for nearly half of all internet traffic. Not a small fraction. Nearly half. If you haven't taken steps to filter it, a significant chunk of what appears in your analytics right now isn't human.
💸 How Bad Traffic Directly Destroys AdSense Revenue
Most site owners assume that more traffic, even fake traffic, must be neutral at worst. It isn't. Bad traffic actively works against you in several connected ways.
Google Sees It Too — and Penalises You For It
Google's AdSense system runs its own invalid traffic detection in real time. It analyses IP addresses, browser fingerprints, click patterns, and session behaviour to identify non-human interactions. When it spots them, it doesn't just ignore those clicks — it claws back the revenue associated with them at the end of the month. You might see a month of what looks like reasonable earnings, only to find a chunk quietly removed when your payment is processed. Worse, if Google sees persistent invalid traffic patterns, it can limit ad serving, reduce your ad quality tier, or in extreme cases suspend your account entirely. You don't have to be doing anything wrong. If bad actors are sending bot traffic to your site, you can be penalised for it anyway.
It Pollutes Your Signals, Causing Google to Serve Junk Ads
Here's something that doesn't get talked about nearly enough. Google's AdSense system doesn't just serve any old ad to your visitors — it analyses your audience to serve contextually relevant, high-value advertisements. It uses crawlers like Mediapartners-Google and Google-Display-Ads-Bot to understand what your site is about and who visits it. When your traffic is loaded with bots, the audience signal Google is reading becomes distorted. The system can't get a clear picture of who your real visitors are, what they're interested in, or what geographic markets they represent. The result? It falls back on low-value, generic ad inventory — the filler ads. The placeholders. The "click here, you've won a prize" nonsense that earns fractions of a penny and makes your site look untrustworthy.
It Wrecks Your Analytics Data
Bot traffic inflates your pageview numbers and completely destroys meaningful metrics like bounce rate, session duration, and conversion rates. If a significant portion of your "visitors" are bots that land on a page and immediately leave, your bounce rate looks terrible. Your session duration looks terrible. If they trigger fake conversion events, your conversion data is meaningless. Google's systems are watching these engagement signals. A site that appears to have terrible engagement is going to be treated differently to a site with genuine, engaged human visitors. Bot-polluted metrics tell Google the wrong story about your site.
It Consumes Server Resources, Slowing Down the Real Experience
Bad bots don't just visit your site — they hammer it. Some are scanning for vulnerabilities, some are scraping content at high speed, and others are part of distributed attacks. All of that consumes your server's CPU, memory, and bandwidth. A slower site means real visitors get a worse experience. Slower page loads mean lower Core Web Vitals scores, which affects SEO rankings, which affects how much genuine traffic finds you in the first place. And slower ad loading means ads are less likely to be seen — reducing viewability rates, a metric advertisers pay attention to when setting CPM bids.
Bot traffic isn't a background nuisance you can safely ignore. It's actively degrading every system that supports your AdSense earnings — your analytics, your account standing, your server performance, and the quality of ads being served.
🛡 The Cloudflare Effect: What Happens When You Block the Junk
When I went through the process of properly securing my servers and websites — blocking bad IP ranges, enabling Cloudflare's bot protection, setting firewall rules for suspicious traffic patterns, and configuring country-level restrictions where they made sense — the change wasn't instant, but it was real.
Sites that had flatlined started picking up visitors. Not floods of traffic — just the genuine visitors who had presumably always been there, now visible because the noise had been removed. The analytics started telling a coherent story. And the AdSense ads that appeared when I visited my own pages stopped being placeholder garbage and started being actual relevant advertisements.
Cloudflare operates at the network edge, which means it can intercept bad traffic before it ever reaches your server. That's important. It's not just about keeping your analytics clean — it's about reducing server load, improving response times, and ensuring that when Google's own crawlers visit your site, they're seeing it perform at its best.
While you want to block malicious bots aggressively, you also need to make sure you're not accidentally blocking the bots that help your AdSense. Google's Mediapartners-Google and Google-Display-Ads-Bot crawlers need unrestricted access to your pages to serve properly targeted ads. If Cloudflare's Bot Fight Mode blocks them, your ad quality will suffer.
The solution is simple: create a custom Cloudflare firewall rule that explicitly allows these user agents through, while the broader bot-blocking rules remain in force.
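As a sketch, this pairing can be expressed in Cloudflare's Rules language — one rule that lets Google's crawlers through, and one that blocks unwanted geographies (the field names follow Cloudflare's documented expression syntax; the country codes are placeholders, not recommendations):

```
# Rule 1 -- action: Skip / Allow, evaluated before the blocking rules,
# so Google's ad and search crawlers are never challenged:
(http.user_agent contains "Mediapartners-Google") or
(http.user_agent contains "Google-Display-Ads-Bot") or
(http.user_agent contains "Googlebot")

# Rule 2 -- action: Block, for regions with no realistic connection
# to your audience ("XX" and "YY" are placeholder country codes):
(ip.geoip.country in {"XX" "YY"})
```

Note that user-agent strings can be spoofed, so treat the allow rule as a complement to Cloudflare's own verified-bot detection rather than a replacement for it.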
| Traffic Type | What to Do | Effect on AdSense |
|---|---|---|
| Malicious bots / scrapers | BLOCK | Cleans audience signal, reduces invalid clicks |
| Data centre / proxy IPs | BLOCK | Removes fake impression sources |
| Irrelevant geo traffic | BLOCK | Improves audience quality score |
| Mediapartners-Google | ALLOW | Required for relevant ad targeting |
| Google-Display-Ads-Bot | ALLOW | Required for ad approval & quality |
| Googlebot | ALLOW | Required for SEO & rankings |
📊 Why AdSense Account Health Matters More Than Raw Clicks
One thing that took me a while to fully appreciate is that AdSense isn't just a tap you turn on. It's a relationship with Google's advertising network, and that relationship has a health score attached to it.
An account with a clean history — consistent genuine traffic, low invalid traffic rates, good site performance — gets access to better ad inventory. Advertisers paying premium rates want their ads appearing in front of real humans on quality sites. Google's system routes that premium inventory to publishers it trusts.
An account with repeated invalid traffic warnings, suspicious click patterns, or degraded engagement metrics gets pushed down the priority list. It still serves ads, but it gets the bottom of the barrel — the unsold inventory that nobody wanted to pay much for. This is one reason why two seemingly similar sites can have wildly different RPM figures. It's not always about the content or the niche. Sometimes it's simply about traffic quality.
Protecting your account health by proactively blocking bad traffic is an investment in your long-term earning potential. You're not just cleaning up today's data — you're building the kind of account reputation that gets you access to better ads tomorrow.
⚡ The Speed Side of the Equation: PHP Settings and Redis in Plesk
Blocking bad traffic was the biggest single change I made, but it worked alongside something equally important — actually making the sites fast. And on a Plesk VPS, two things made an immediate and noticeable difference: tuning the PHP settings and setting up Redis as an object cache.
This matters more than most people realise when it comes to AdSense. A faster site isn't just a better experience for visitors — it's a signal Google actively measures and rewards.
PHP OPcache: Small Tweaks, Big Impact
Out of the box, PHP on a shared or default VPS setup isn't optimised. It's configured conservatively so it works for everyone, which usually means it's not ideal for anyone. In Plesk, you have per-domain control over PHP settings, which is genuinely useful once you know what to change.
The key area that made a real difference was enabling OPcache, which compiles PHP scripts into bytecode so they don't have to be re-parsed on every single page request. With OPcache properly enabled and configured — setting a sensible number of cached files, turning on file revalidation, enabling fast shutdown — pages start serving noticeably faster because PHP isn't re-doing work it already did five seconds ago.
The performance improvement on a WordPress site with OPcache configured properly versus without is substantial. Dynamic page generation that previously took several hundred milliseconds drops to a fraction of that for cached requests. That's real-world speed that visitors and Google both notice.
In Plesk, go to each domain's PHP settings and ensure OPcache is enabled. Set `opcache.memory_consumption` to at least 128 MB, enable `opcache.validate_timestamps` so code changes aren't missed, and set `opcache.max_accelerated_files` high enough to cover all your WordPress PHP files. These three settings alone will shave meaningful milliseconds off every page request.
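As a concrete sketch, the equivalent directives in php.ini (or Plesk's additional configuration directives box) look roughly like this — the values are sensible starting points for a typical WordPress install, not tuned numbers:

```ini
; Enable OPcache and give it enough room for the whole codebase
opcache.enable=1
opcache.memory_consumption=128        ; shared cache size, in MB
opcache.max_accelerated_files=10000   ; enough slots for all WordPress PHP files
opcache.validate_timestamps=1         ; re-check files so deploys aren't missed
opcache.revalidate_freq=60            ; only re-check each file every 60 seconds
opcache.fast_shutdown=1               ; ignored on PHP 7.2+, where it is always on
```

The trade-off to understand is `validate_timestamps` plus `revalidate_freq`: with these values, a changed file can take up to 60 seconds to be picked up, in exchange for far fewer stat calls per request.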
Redis: Giving Your Database a Memory
WordPress by default hits the database for almost everything — every page load, every widget render, every option lookup. On a busy or even moderately trafficked site, this creates a queue of database queries that stacks up and slows everything down.
Redis changes this by acting as an in-memory object cache sitting between WordPress and the database. When something has been fetched from the database once, Redis stores it in memory. The next time WordPress needs it, it grabs it from RAM instead — which is orders of magnitude faster than a database query.
Setting Redis up in Plesk and connecting it to WordPress via a caching plugin means your database stops being the bottleneck it was before. Pages load faster. The server handles more concurrent visitors without buckling. And critically, the Time to First Byte (TTFB) — one of the metrics Google uses to evaluate site performance for Core Web Vitals — improves significantly.
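For illustration, with the widely used Redis Object Cache plugin (an assumption — other object-cache drop-ins use different constants), the WordPress side of the connection is a few lines in wp-config.php:

```php
// wp-config.php -- point the object-cache drop-in at the local Redis instance.
// These constants are the ones read by the Redis Object Cache plugin.
define( 'WP_REDIS_HOST', '127.0.0.1' );
define( 'WP_REDIS_PORT', 6379 );
define( 'WP_REDIS_DATABASE', 0 );
// Prefix cache keys so multiple sites can share one Redis instance safely.
define( 'WP_CACHE_KEY_SALT', 'example-site:' );
```

After enabling the drop-in from the plugin's settings page, every option lookup and repeated query result is served from RAM instead of MariaDB.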
I set this up as part of the process described in the Plesk VPS guide, and the difference in server response times was immediately visible in both Cloudflare's analytics and Google Search Console. Within hours of Redis going live, the cache hit count was climbing into the hundreds of thousands.
Why Speed Directly Affects AdSense
Here's the link that gets missed. Google's AdSense system doesn't just look at your traffic — it looks at your site quality, and page speed is a significant part of that.
Faster pages mean ads load faster. An ad that loads before the visitor scrolls past it gets counted as a viewed impression — a metric known as viewability. Advertisers pay significantly more for ads that are actually seen. Low viewability rates — which happen when pages load slowly and visitors scroll past before ads render — push down your effective CPM. You're serving impressions, but they're not qualifying as viewed ones, so they earn less.
The combination of blocking bad traffic and genuinely speeding up the site creates a compounding effect. The bots are gone, so the real visitors are visible. The pages are fast, so those real visitors stay longer and engage more. The engagement signals are clean and positive, so Google's systems respond by serving better ads. Each improvement feeds the next one.
🔍 The SEO Connection: Why Cleaner Traffic Brings More Real Visitors
There's another layer to this that's worth understanding. Google's ranking algorithms use engagement signals as part of how they evaluate site quality. A site with artificially inflated traffic but terrible engagement metrics — because bots are distorting everything — can actually end up being ranked lower than it deserves.
When you remove the bot noise and your real engagement metrics become visible, Google gets a more accurate picture of how genuine users interact with your content. If real visitors are staying, reading, and navigating through your site, that positive signal starts carrying more weight in rankings. Better rankings mean more organic traffic. More organic traffic means more genuine ad impressions. More genuine ad impressions means more AdSense revenue.
The cleanup doesn't just help you today — it starts a positive feedback loop that compounds over time.
🔧 Practical Steps to Block Bot Traffic and Protect Your AdSense
If you want to replicate what I've done across my own sites, here's where to focus:
- **Set up Cloudflare and configure it properly.** Even the free tier gives you meaningful bot protection. Enable Bot Fight Mode, set your security level appropriately, and make sure you've configured it to pass real visitor IP addresses through to your server so Google can accurately assess your audience.
- **Create a whitelist for Google's ad bots.** In Cloudflare's firewall rules, add exceptions for user agents containing "Mediapartners-Google" and "Google-Display-Ads-Bot". These crawlers need to reach your pages to serve relevant ads. Block everything else you don't trust, but let these through without question.
- **Block traffic from high-risk countries if it's not your audience.** If you're running UK- or US-focused sites and you're getting floods of traffic from regions that have no realistic connection to your content, geo-blocking is perfectly reasonable. You're not losing real potential visitors — you're removing bot hotspots from your audience signal.
- **Check your server logs regularly.** Cloudflare will show you what it's blocked, but your server logs tell a more complete story. Look for IPs making dozens of requests in seconds, user agents that look like scripts rather than browsers, and patterns of traffic hitting the same pages repeatedly from different IPs.
- **Enable bot filtering in Google Analytics.** GA4 automatically excludes known bots using the IAB/ABC international bot list, but sophisticated bots that mimic human behaviour can still slip through. Pair GA4's built-in filtering with your Cloudflare rules for the best coverage.
- **Watch your AdSense invalid traffic reports.** In your AdSense dashboard, you can see what percentage of your traffic is being flagged as invalid. If that number is high and then drops after implementing your blocking rules, you're on the right track. A lower invalid traffic rate is directly associated with better account standing and access to higher-quality ad inventory.
- **Tune your PHP OPcache settings in Plesk.** Go into your Plesk panel and enable OPcache for each domain. Set a generous `opcache.memory_consumption` (128 MB or higher), enable `opcache.validate_timestamps` so changes aren't missed, and set a reasonable `opcache.max_accelerated_files` to cover all your WordPress PHP files. This alone will shave meaningful milliseconds off every page request and reduce the CPU load on your server.
- **Set up Redis as a WordPress object cache.** Install Redis on your server, enable the Redis extension for PHP in Plesk, then connect it to WordPress using a caching plugin. Once it's running, your database stops being hammered on every page load. The result is faster TTFB, better Core Web Vitals scores, and a server that handles traffic spikes without slowing to a crawl — all of which feeds directly into better ad viewability and stronger Google quality signals.
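To make the log-checking step concrete, here's a small shell sketch. The log path is an assumption — point it at your server's combined-format access log — and it falls back to a built-in sample so the commands can be tried anywhere:

```shell
#!/bin/sh
# Sketch: surface the noisiest IPs and script-like user agents in an access log.
# LOG defaults to a common Apache path (an assumption; adjust for your server).
LOG="${LOG:-/var/log/apache2/access.log}"

# Fall back to a tiny built-in sample when no real log is readable.
if [ ! -r "$LOG" ]; then
  LOG=$(mktemp)
  cat > "$LOG" <<'EOF'
203.0.113.9 - - [10/May/2025:10:00:01 +0000] "GET /page HTTP/1.1" 200 512 "-" "python-requests/2.31"
203.0.113.9 - - [10/May/2025:10:00:02 +0000] "GET /page HTTP/1.1" 200 512 "-" "python-requests/2.31"
198.51.100.7 - - [10/May/2025:10:00:05 +0000] "GET /about HTTP/1.1" 200 256 "-" "Mozilla/5.0"
EOF
fi

echo "== Top IPs by request count =="
awk '{print $1}' "$LOG" | sort | uniq -c | sort -rn | head -10

echo "== IPs sending script-like user agents =="
grep -Ei 'curl|wget|python|scrapy|go-http-client' "$LOG" | awk '{print $1}' | sort -u
```

A single IP dominating the first list, or any IP in the second, is a candidate for a Cloudflare block rule rather than an outright server-side ban, so the block happens at the edge before the request costs you resources.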
📈 What to Expect After Cleaning Up Your Traffic
The results aren't always dramatic overnight. What you're doing is removing noise so the real signal can be heard. In my experience, here's what tends to happen over the weeks following a proper bot cleanup and speed optimisation:
Your reported visitor numbers may actually drop even though your real audience is the same size it always was. This is a good sign. What you're seeing now is closer to the truth. The engagement metrics — bounce rate, session duration, pages per session — will improve because they're now reflecting actual human behaviour rather than bot noise.
And over the medium term, the improved engagement signals and cleaner analytics will start feeding back into better organic search rankings, which brings more real traffic, which improves the whole cycle again.
🎯 The Bottom Line
Most AdSense optimisation advice focuses on what you put on your pages. Very little focuses on who — or what — is actually visiting them. That's a mistake.
Bad bot traffic isn't just background noise you can learn to live with. It's actively corrupting your analytics, suppressing your ad quality, exhausting your server, and potentially putting your AdSense account health at risk. And a slow site makes all of that worse — because even your genuine visitors won't stick around long enough for the ads to load, let alone be seen and clicked.
Cleaning up the traffic and speeding up the server work together. One removes the noise. The other makes the signal as strong as it can be. Done together, they create conditions where AdSense can actually do what it's supposed to.
The internet is noisier and more hostile than it's ever been. Protecting your traffic quality isn't optional anymore. It's the foundation everything else is built on.
🖥 Want the Full Technical Setup?
The complete guide covering Cloudflare WAF rules, Redis installation, PHP-FPM tuning, MariaDB optimisation, and server hardening on Plesk is all here.
Read the Complete Plesk VPS Guide →



