Why Your Website Analytics Are Probably Lying to You


You check Google Analytics. Your website got 5,000 visitors last month. Your bounce rate is 45%. Your most popular page is the blog post you wrote about productivity tips. Your average session duration is 2 minutes and 30 seconds.

Armed with this data, you make decisions. More blog content, since that’s what people want. Tweak the landing page because the bounce rate seems high. Feel good about the 5,000 visitors because it’s up from 4,200 last month.

Here’s the problem: almost none of those numbers mean what you think they mean.

Your Traffic Numbers Are Inflated

That 5,000-visitor figure includes a significant percentage of bot traffic. Not just the obvious crawlers from Google and Bing — those are filtered by most analytics tools. The problem is the sophisticated bots that mimic human behaviour: they load pages, trigger JavaScript, and appear in your analytics as legitimate sessions.

Imperva’s annual bot traffic report has consistently found that bots account for 40-50% of all internet traffic. While major analytics platforms filter known bot signatures, they don’t catch everything — particularly the newer generation of bots designed specifically to evade detection.

For a small business website, bot contamination might inflate traffic numbers by 15-30%. Your 5,000 visitors could be 3,500 real humans. That changes the ROI calculation on your marketing spend considerably.
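
As a back-of-envelope illustration, the adjustment is just a discount applied to the reported figure. The 15-30% contamination range above is an assumption, not a measurement of your site:

```python
def estimated_human_visitors(reported, bot_share_low=0.15, bot_share_high=0.30):
    """Rough range of real human visitors after discounting assumed bot traffic.

    The bot-share figures are assumptions -- replace them with whatever your
    own server-log analysis suggests for your site.
    """
    low = round(reported * (1 - bot_share_high))
    high = round(reported * (1 - bot_share_low))
    return low, high

print(estimated_human_visitors(5000))  # (3500, 4250) -- not 5,000 real people
```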

Bounce Rate Is Misunderstood

Bounce rate — the percentage of visitors who leave without interacting beyond the initial page — is one of the most misused metrics in web analytics.

A 45% bounce rate sounds concerning. But context matters enormously. If someone lands on your blog post about “how to fix a leaking tap,” reads the entire article, gets their answer, and leaves — that’s a bounce. The visitor got exactly what they needed. That’s a success, not a failure.

Conversely, a low bounce rate might mean visitors are clicking around aimlessly because they can’t find what they’re looking for. More page views don’t always mean better engagement.

In GA4 (Google Analytics 4), the bounce rate metric has been redefined. It’s now the inverse of “engagement rate” — a session is “engaged” if it lasts longer than 10 seconds, includes a conversion event, or has 2+ page views. This is better than the old definition, but it’s still an arbitrary threshold that doesn’t necessarily correlate with business value.
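
To make that definition concrete, here’s a minimal sketch of the classification. The Session class and the function names are illustrative, not part of any GA4 API:

```python
from dataclasses import dataclass

@dataclass
class Session:
    duration_seconds: float
    page_views: int
    had_conversion: bool

def is_engaged(s: Session) -> bool:
    # GA4's rule: engaged if the session lasted longer than 10 seconds,
    # included a conversion event, or had two or more page views.
    return s.duration_seconds > 10 or s.had_conversion or s.page_views >= 2

def bounce_rate(sessions: list[Session]) -> float:
    # In GA4, bounce rate is simply the share of sessions that were NOT engaged.
    engaged = sum(is_engaged(s) for s in sessions)
    return 1 - engaged / len(sessions)

sessions = [Session(8, 1, False), Session(45, 1, False), Session(5, 3, False)]
print(f"{bounce_rate(sessions):.0%}")  # 33% -- only the first session counts as a bounce
```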

Session Duration Is Basically Made Up

Average session duration has always been one of analytics’ dirtiest secrets. The way most analytics tools calculate session duration is fundamentally flawed.

Here’s how it works: the tool records a timestamp when the visitor loads Page A, and another when they load Page B. The gap between those timestamps is recorded as the time spent on Page A, and the session duration is the span from the first recorded hit to the last. Sounds reasonable, right?

The problem: if the visitor only views one page (which, for many websites, is the majority of sessions), there’s no second timestamp. The analytics tool records the session duration as zero seconds. These zero-second sessions are then included in the average, dragging it down dramatically.

Your “2 minutes and 30 seconds average session duration” probably means “visitors who looked at multiple pages averaged five minutes, but half your visitors were recorded as zero seconds because they only viewed one page.” The actual average time people spend on your site is unknowable from standard analytics data.
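
A toy calculation makes the distortion obvious (the figures are invented to match the example above):

```python
# Session durations as a hit-based tool records them: single-page sessions get
# no second timestamp, so they're logged as zero seconds.
recorded = [0, 0, 0, 0, 0, 300, 270, 310, 290, 330]  # seconds, illustrative

overall = sum(recorded) / len(recorded)
multi_page = [d for d in recorded if d > 0]

print(f"Reported average session duration: {overall:.0f}s")  # 150s (2:30)
print(f"Average across multi-page sessions: {sum(multi_page) / len(multi_page):.0f}s")  # 300s (5:00)
```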

GA4 has improved this somewhat with its “engagement time” metric, which uses the Page Visibility API to estimate how long the page was actually in the foreground. It’s better, but still imperfect — it stops counting when the user switches to another tab, and it can’t tell the difference between someone reading intently and someone who left the browser open while making coffee.

Attribution Is a Mess

Where did your visitors come from? Your analytics says 40% from organic search, 25% from direct, 15% from social, 10% from email, 10% from referral.

That “direct” category is the analytics equivalent of a junk drawer. It includes genuinely direct traffic (people typing your URL), but also:

  • Visitors from links in mobile apps (which often don’t pass referrer data)
  • Visitors clicking links in PDFs, Word documents, or desktop applications
  • Traffic from HTTPS sites to HTTP sites where the referrer is stripped
  • Visitors using browsers or extensions that block referrer data
  • Dark social — links shared in messaging apps like WhatsApp and Signal

When 25% of your traffic is “direct,” you have a significant blind spot in understanding how people actually find your website.

Similarly, UTM tracking — the standard method for tagging campaign links — only works when consistently applied. If your email marketing links are tagged but your social media links aren’t, your analytics will undercount social traffic and overcredit email. This is a process problem, not a technology problem, but it’s rampant in small businesses where nobody owns the analytics setup end to end.

What to Do About It

Accept imprecision. Web analytics are directional indicators, not accounting ledgers. Treat them as “roughly right” rather than “precisely correct.” Month-over-month trends are more reliable than absolute numbers. If traffic goes from roughly 3,000 to roughly 4,500, that’s a real signal. Whether it’s exactly 4,487 or 4,623 doesn’t matter.

Focus on conversion metrics. Instead of obsessing over traffic volume, track what actually matters for your business: form submissions, phone calls (use call tracking numbers), quote requests, purchases. These are harder to inflate with bots and harder to misinterpret. A 20% increase in contact form submissions is a clearer success signal than a 20% increase in page views.

Implement proper UTM tagging. Create a simple UTM naming convention and apply it to every marketing link you create. Google’s Campaign URL Builder is free and takes 30 seconds per link. Consistent tagging dramatically improves attribution accuracy.
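
If you’d rather script the tagging than paste every link into the URL builder, the mechanics are simple. This sketch assumes a lowercase naming convention, which is just one reasonable choice:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters to a URL using one consistent convention."""
    parts = urlparse(url)
    utm = urlencode({
        "utm_source": source.lower(),
        "utm_medium": medium.lower(),
        "utm_campaign": campaign.lower(),
    })
    query = f"{parts.query}&{utm}" if parts.query else utm
    return urlunparse(parts._replace(query=query))

print(add_utm("https://example.com/pricing", "Newsletter", "email", "March-Promo"))
# https://example.com/pricing?utm_source=newsletter&utm_medium=email&utm_campaign=march-promo
```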

Use multiple data sources. Cross-reference analytics data with other signals. Does your analytics traffic spike match a corresponding spike in enquiries? If traffic went up 30% but enquiries stayed flat, the “traffic” might not be real. Lightweight, privacy-focused analytics tools like Plausible or Fathom provide alternative data points and are far less likely to be blocked by ad blockers (which can cause Google Analytics to undercount by 20-40%).
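
One crude way to formalise that cross-check, with hypothetical figures and an arbitrary 15-point threshold:

```python
def pct_change(previous: float, current: float) -> float:
    return (current - previous) / previous * 100

# Month-over-month figures from two independent sources (hypothetical numbers).
traffic_change = pct_change(4200, 5000)   # ~19% up, according to analytics
enquiry_change = pct_change(38, 39)       # ~3% up, according to your inbox/CRM

# An arbitrary 15-point gap as the "worth investigating" threshold.
if traffic_change - enquiry_change > 15:
    print("Traffic grew far faster than enquiries -- treat the spike with suspicion.")
```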

Get an annual analytics audit. Have someone who understands web analytics review your setup once a year. They’ll find misconfigured goals, duplicate tracking code, spam referrals inflating your numbers, and filter issues you didn’t know existed. It’s a few hundred dollars for clarity that could change how you allocate your marketing budget.

Your website analytics aren’t worthless — they’re just not as straightforward as the dashboards suggest. Understanding their limitations doesn’t mean ignoring them. It means reading them with appropriate scepticism and making decisions based on patterns and conversions rather than headline numbers that make you feel good but might not be real.