Server-side analytics vs JavaScript analytics.
Every popular analytics tool — Google Analytics, Plausible, Fathom, Simple Analytics, GoatCounter, Umami — uses the same architecture: a JavaScript snippet that fires in the visitor's browser. That architecture has a hard limit. Roughly 30-60% of real web traffic isn't a human in a browser — it's bots, crawlers, attackers, server-to-server requests. JS-only analytics can't see any of it.
Server-side analytics captures requests at the server layer, before any browser is involved. Below is the technical comparison and a recommendation: when JS is enough, when you need server-side, and when (as we believe) you want both.
The architecture difference
JavaScript analytics works like this:
- Visitor's browser requests /page.html
- Server returns HTML containing <script src="ga.js">
- Browser parses, downloads ga.js, executes it
- Script fires a beacon to GA's servers with the visit metadata
This pipeline fails when:
- The "visitor" is a script that doesn't run JS (bots, crawlers, attackers)
- The visitor is using uBlock Origin / Ghostery / similar (blocks the GA domain)
- The visitor is on a connection that times out before ga.js finishes loading
- The request never returns HTML (REST API responses, image hits, file downloads)
- JavaScript silently errors out before the beacon fires
Server-side analytics works differently:
- Visitor's request hits your server (Apache / Nginx / PHP-FPM)
- Your application code (or a server-side beacon) captures the request and forwards metadata to the analytics layer
- Capture happens regardless of what the visitor does next
This pipeline doesn't care about JavaScript, ad-blockers, browser quirks, or response type. If the request hit your server, it's counted.
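To make "captures requests at the server layer" concrete: every request that reaches PHP already carries this metadata, no JavaScript required. A minimal sketch; the function name and field names here are illustrative, not Radar's actual schema:

```php
<?php
// Everything below is available server-side for every request that reaches
// PHP, whether it came from a browser, a bot, an RSS reader, or sqlmap.
function radar_collect_request_metadata(): array {
    return [
        'path'       => $_SERVER['REQUEST_URI']     ?? '',  // /wp-json/, /feed/, /.env probes...
        'method'     => $_SERVER['REQUEST_METHOD']  ?? '',
        'status'     => http_response_code(),               // works for non-HTML responses too
        'user_agent' => $_SERVER['HTTP_USER_AGENT'] ?? '',  // GPTBot, AhrefsBot, Feedly, ...
        'referrer'   => $_SERVER['HTTP_REFERER']    ?? '',
        'ip'         => $_SERVER['REMOTE_ADDR']     ?? '',
        'ts'         => time(),
    ];
}
```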
What each layer captures
| What gets captured | Server-side (Radar) | JavaScript (GA / Plausible / Fathom) |
|---|---|---|
| Human visitors with browsers + JS enabled | ✓ | ✓ |
| Human visitors with ad-blockers active | ✓ | ✕ |
| AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) | ✓ | ✕ |
| SEO bots (Ahrefs, Semrush, Majestic, etc.) | ✓ | ✕ |
| RSS readers (Feedly, Inoreader, NewsBlur) | ✓ | ✕ |
| Uptime monitors (UptimeRobot, Pingdom) | ✓ | ✕ |
| Attackers (sqlmap, wpscan, nuclei, brute-force) | ✓ | ✕ |
| REST API requests (/wp-json/, /api/) | ✓ | ✕ |
| Web Vitals (LCP, INP, CLS — client-side perf) | ✓ | ✓ |
| JS errors / rage clicks / scroll depth | ✓ | Partial |
| UTM attribution + channels | ✓ | ✓ |
SysWP Radar uses both layers: a JavaScript pixel for human visitors (Web Vitals + UTM + engagement) AND a server-side beacon (via a PHP plugin on WordPress, or a one-line pixel snippet on any platform) for everything JS can't see. The two are complementary, not competing.
When JavaScript-only is enough
- You only care about human conversion funnels (e-commerce checkout, SaaS signup)
- You're OK ignoring 30-60% of total traffic
- You don't run a content site that AI crawlers care about
- You're not on WordPress (most attacks target WP specifically)
- You don't need to know which bots are eating your bandwidth
In this case, Plausible or Fathom are great. They're focused, fast, privacy-first, and they do their job well within their scope.
When you need server-side analytics
- Content site / blog / docs: AI crawlers are now a meaningful percentage of "traffic". You want to optimize for citation in ChatGPT / Claude / Perplexity answers. You can't optimize what you can't measure.
- WordPress site: /wp-login.php, /xmlrpc.php, /wp-json/ are constant attack surfaces. JS analytics can't see any attacks. Server-side captures them all.
- E-commerce / SaaS with public API: API requests (GET /api/products) don't run JS. Server-side capture shows you which endpoints are hit, by whom, how often.
- News / publishing: RSS readers (Feedly, Inoreader, NewsBlur) consume /feed/ with no browser. With JS-only analytics you're entirely blind to your most loyal subscribers.
- Security-conscious teams: scanners hitting /.env, /.git/config, vulnerability probes, sqlmap injection attempts are invisible to JS analytics but visible at the server layer.
How SysWP Radar implements both layers
One platform, two capture layers:
- JavaScript pixel (one async <script> tag, ~2 KB): handles human browser sessions. Captures Web Vitals using PerformanceObserver, UTM params, scroll depth, rage clicks, JS errors. Runs on any platform.
- Server-side beacon: for WordPress, it's the SysRadar plugin that hooks into shutdown and ships requests via fastcgi_finish_request() + blocking=false (zero latency for visitors; sketched below). For non-WP, an HTTP webhook or Apache log shipper.
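As a rough illustration of that pattern (this is not the actual SysRadar plugin source; the collector URL and payload fields are placeholders), a WordPress shutdown hook that ships a beacon without delaying the visitor could look like this:

```php
<?php
// Hypothetical sketch: fire-and-forget server-side beacon from a WP plugin.
add_action('shutdown', function () {
    // On PHP-FPM, flush the response to the visitor before doing anything else.
    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request();
    }

    wp_remote_post('https://collector.example.com/beacon', [  // placeholder URL
        'blocking' => false,  // don't wait for the collector's reply
        'timeout'  => 2,
        'headers'  => ['Content-Type' => 'application/json'],
        'body'     => wp_json_encode([
            'path'       => $_SERVER['REQUEST_URI']     ?? '',
            'method'     => $_SERVER['REQUEST_METHOD']  ?? '',
            'user_agent' => $_SERVER['HTTP_USER_AGENT'] ?? '',
            'referrer'   => $_SERVER['HTTP_REFERER']    ?? '',
            'ts'         => time(),
        ]),
    ]);
});
```

Because the hook runs on shutdown and the request is non-blocking, the response has already gone out to the visitor before the beacon fires.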
The classifier (a server-side Bayesian-style scoring model + curated rules) sorts each captured request into 9 categories in real time: humans, verified bots, AI crawlers, SEO crawlers, RSS readers, health checks, WP internal, attackers, unknown. The dashboard shows them aggregated, drillable, with hourly time-series.
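To illustrate the "curated rules" half of that idea (the real classifier's patterns, weights, and verification steps aren't shown here; this is a hypothetical sketch), a first pass over user agent and path might look like:

```php
<?php
// Hypothetical rule-based first pass; a real classifier would combine this
// with scoring and verification (e.g. reverse-DNS checks for "verified bots").
function radar_classify(string $path, string $userAgent): string {
    $rules = [
        'ai_crawler'   => '/GPTBot|ClaudeBot|PerplexityBot/i',
        'seo_crawler'  => '/AhrefsBot|SemrushBot|MJ12bot/i',
        'rss_reader'   => '/Feedly|Inoreader|NewsBlur/i',
        'health_check' => '/UptimeRobot|Pingdom/i',
        'attacker'     => '/sqlmap|wpscan|nuclei/i',
    ];
    foreach ($rules as $category => $pattern) {
        if (preg_match($pattern, $userAgent)) {
            return $category;
        }
    }
    // Path-based heuristic: probes for secrets are attacks regardless of UA.
    if (preg_match('#^/\.env|^/\.git/#', $path)) {
        return 'attacker';
    }
    return 'unknown'; // a scoring model would refine this into human vs unknown
}
```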
FAQ
Doesn't server-side analytics violate privacy more than JS analytics?
Can I run Radar alongside Google Analytics / Plausible?
Does the server-side beacon slow my site?
No. The WordPress plugin ships the beacon from a shutdown hook with blocking=false + fastcgi_finish_request(), meaning visitor latency is zero. The HTTP request to Radar happens after WordPress has already sent the response to the visitor.
What if my site doesn't use PHP?
Capture both layers in one install.
Free forever plan. JS pixel + server-side beacon, one signup. See the difference in your first hour.
Create free account →
First 100 paying customers lock in 50% lifetime discount.