We recently ran into a nasty issue on a client project. In BigQuery, backend events were popping up from users who didn’t exist in any session data. Just backend hits floating in the void. No user context.
Harmless? Not quite. Behind these orphaned events were real ad dollars quietly going to waste. Thousands — sometimes tens of thousands — gone because attribution was falling apart.
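Spotting these ghosts is straightforward once you know what to look for. Below is a rough sketch of the kind of check you could run against a standard GA4 BigQuery export (`events_*` tables); the project, dataset, and dates are placeholders, and the idea is simply to flag `user_pseudo_id`s that never fire a `session_start` or `page_view`.

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

// user_pseudo_ids that only ever show up on backend-style hits:
// no session_start, no page_view, no client context at all.
const query = `
  SELECT
    user_pseudo_id,
    COUNT(*) AS backend_only_events
  FROM \`my-project.analytics_123456.events_*\`
  WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
  GROUP BY user_pseudo_id
  HAVING COUNTIF(event_name IN ('session_start', 'page_view')) = 0
  ORDER BY backend_only_events DESC
`;

async function findGhostUsers(): Promise<void> {
  const [rows] = await bigquery.query({ query });
  console.log(`${rows.length} user_pseudo_ids have backend events but no session data`);
}

findGhostUsers().catch(console.error);
```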
The root cause? Identity handling with Measurement Protocol + GTM Server-Side.
When you run GA4 server-side tagging properly with FPID (an HTTP-only, first-party cookie), user identity revolves around FPID. Stable, server-managed, clean attribution.
But when developers send Measurement Protocol hits using data from the client, they usually can't access FPID: it's an HTTP-only cookie, visible only to the server. So they grab what's available in JavaScript, the client_id from the old _ga cookie. Now you're injecting events with a completely separate identifier into GA4.
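To make that concrete, here is roughly what the pattern looks like in practice. This is an illustrative sketch rather than code from the project: the measurement ID, API secret, and event payload are placeholders, and the identity problem is the same whether the hit leaves the browser directly or the client_id is shipped to the backend first.

```typescript
// The _ga cookie looks like "GA1.1.1234567890.1700000000";
// the GA4 client_id is the last two segments ("1234567890.1700000000").
function getClientIdFromGaCookie(): string | undefined {
  const match = document.cookie.match(/(?:^|;\s*)_ga=GA\d\.\d\.([^;]+)/);
  return match?.[1];
}

async function sendServerStyleEvent(): Promise<void> {
  const clientId = getClientIdFromGaCookie();
  if (!clientId) return;

  // This identifier has nothing to do with the server-managed FPID,
  // so GA4 sees a brand-new "user" for every hit sent this way.
  await fetch(
    "https://www.google-analytics.com/mp/collect" +
      "?measurement_id=G-XXXXXXX&api_secret=YOUR_SECRET",
    {
      method: "POST",
      body: JSON.stringify({
        client_id: clientId,
        events: [{ name: "purchase", params: { currency: "EUR", value: 99.9 } }],
      }),
    }
  );
}
```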
The outcome:
- Backend events tied to “ghost” users.
- Duplicate users and sessions (FPID and the JS client_id counted as different people).
- Enrichment breaks because GA4 can't stitch the two identities together.
Suddenly your beautiful reports are built on fragmented data, and the marketing spend decisions based on them get messier too.
There are solutions:
- Return FPID to the client via the server response (the DataLayer push workaround from Stape): the server returns FPID, the frontend captures it, and developers pass it into Measurement Protocol (see the first sketch after this list).
- Store a mapping table (JS client_id ↔ FPID) in BigQuery or Firestore, and match identifiers during server-side event generation or directly in your database (see the second sketch).
- Fully switch to the JS client_id even server-side: technically easier, but highly vulnerable to ad blockers and cookie restrictions.
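For the first option, here's a minimal sketch of the client side of that hand-off. It assumes the server container pushes the value into dataLayer under a custom event; the `fpid_received` event name, the `fpid` key, and the `/api/orders` endpoint are placeholders, not Stape's actual naming.

```typescript
// Pull the FPID value that the server response pushed into dataLayer.
function getFpidFromDataLayer(): string | undefined {
  const dataLayer = (window as any).dataLayer as
    | Array<Record<string, unknown>>
    | undefined;
  const entry = dataLayer?.find(
    (item) => item.event === "fpid_received" && typeof item.fpid === "string"
  );
  return entry?.fpid as string | undefined;
}

// Ship the identifier to the backend together with the business payload,
// so server-side Measurement Protocol hits reuse the same identity.
async function submitOrder(order: Record<string, unknown>): Promise<void> {
  await fetch("/api/orders", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ ...order, fpid: getFpidFromDataLayer() }),
  });
}
```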
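For the second option, a sketch of the mapping-table idea with Firestore. The collection name, document shape, and lookup flow are assumptions for illustration, not a prescribed schema.

```typescript
import { Firestore } from "@google-cloud/firestore";

const db = new Firestore();

// Whenever a request carries both identifiers (the server can read the
// HTTP-only FPID cookie, the page sends its JS client_id), record the pair.
async function upsertIdentityMapping(jsClientId: string, fpid: string): Promise<void> {
  await db.collection("identity_map").doc(jsClientId).set(
    { fpid, updatedAt: new Date() },
    { merge: true }
  );
}

// Later, when the backend builds a Measurement Protocol event, it resolves
// the JS client_id it was given back to the FPID-based identity.
async function resolveFpid(jsClientId: string): Promise<string | undefined> {
  const snapshot = await db.collection("identity_map").doc(jsClientId).get();
  return snapshot.exists ? (snapshot.data()?.fpid as string) : undefined;
}
```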
In our case, we took the first route and simply surfaced FPID to the developers on the frontend. Clean data, proper attribution, and, more importantly, ad budgets finally stopped bleeding.
If you work with GA4 to BigQuery exports, be sure to check out my SQL cheat sheet.