The recent fix by Google makes the session_start and first_visit events more consistent by including event parameters, such as traffic source details, that used to be missing. However, there are still many issues related to these two events. At best, they are just a bit off; in the worst cases, they are complete junk.
Incremental tables make it possible to maintain tables built on large source data by running only small periodic queries. In this post, I’ll share three different methods for performing an incremental refresh. The code examples are all based on the GA4 BigQuery export data.
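As a rough illustration of what an incremental refresh can look like, a MERGE statement can upsert only the most recent daily partitions of the GA4 export into a target table instead of rebuilding it from scratch. This is only a sketch; the project, dataset, and column names below are placeholders, not taken from the post:

```sql
-- Sketch: incrementally refresh a sessions table from the GA4 daily export.
-- `my_project.analytics_123456` and `my_dataset.sessions` are placeholder names.
MERGE `my_dataset.sessions` AS target
USING (
  SELECT
    user_pseudo_id,
    (SELECT value.int_value FROM UNNEST(event_params)
     WHERE key = 'ga_session_id') AS session_id,
    MIN(event_timestamp) AS session_start_ts
  FROM `my_project.analytics_123456.events_*`
  -- Scan only the last few daily tables instead of the full export
  WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
    DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
  GROUP BY user_pseudo_id, session_id
) AS source
ON target.user_pseudo_id = source.user_pseudo_id
   AND target.session_id = source.session_id
WHEN MATCHED THEN
  UPDATE SET session_start_ts = source.session_start_ts
WHEN NOT MATCHED THEN
  INSERT (user_pseudo_id, session_id, session_start_ts)
  VALUES (source.user_pseudo_id, source.session_id, source.session_start_ts)
```

The lookback window (here three days) trades a small amount of reprocessing for resilience against late-arriving export tables.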
The daily BigQuery export schedules for GA4 are notoriously unpredictable. Sometimes the data is processed in the early-morning hours, sometimes at noon, and sometimes not even during the following day. In this article, I’ll walk you through how to set up a Cloud Workflow that compiles and executes the selected queries in your Dataform project.
This blog post will show how to recreate the GA4 session traffic source dimensions using BigQuery export event data. The result will be a lookup table containing the last non-direct traffic source for each unique session ID.
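The general shape of such a lookup table can be sketched with ARRAY_AGG, picking one source value per session from the repeated event_params field. This is a simplified assumption of the approach, not the post's actual query; table names are placeholders and the handling of "(direct)" visits is omitted:

```sql
-- Sketch: one traffic-source row per GA4 session.
-- `my_project.analytics_123456` is a placeholder dataset name.
SELECT
  user_pseudo_id,
  (SELECT value.int_value FROM UNNEST(event_params)
   WHERE key = 'ga_session_id') AS session_id,
  -- Take the last non-null source value seen within the session
  ARRAY_AGG(
    (SELECT value.string_value FROM UNNEST(event_params)
     WHERE key = 'source')
    IGNORE NULLS ORDER BY event_timestamp DESC LIMIT 1
  )[SAFE_OFFSET(0)] AS session_source
FROM `my_project.analytics_123456.events_*`
GROUP BY user_pseudo_id, session_id
```

A real implementation would also need to exclude direct traffic when a non-direct source exists earlier in the session, which is where most of the complexity lives.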
In this post, I will go through how to set up a simple event stream from server-side Tag Manager directly to BigQuery. The idea is to keep the setup as simple as possible, so that it’s suitable for different kinds of experimentation and debugging needs.
One of the significant improvements in GA4 over the old Universal Analytics is its event-based data model and the free data export to BigQuery. This article is a collection of some of the most common SQL formulas, plus a few extras, needed when working with the GA4 event data in BigQuery.
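One of the most common formulas when working with this data is extracting a single event parameter from the repeated event_params field with a correlated subquery. A minimal sketch, assuming placeholder project and dataset names:

```sql
-- Sketch: pull one event parameter (page_location) out of event_params.
-- `my_project.analytics_123456` is a placeholder dataset name.
SELECT
  event_name,
  event_timestamp,
  (SELECT value.string_value
   FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location
FROM `my_project.analytics_123456.events_*`
-- Limit the scan to yesterday's daily export table
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d',
  DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
```

Note that string, integer, float, and double parameters live in different fields of the value struct, so the subquery must select the field matching the parameter's type.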
Unlike some other Google tools, Search Console lacks a free BigQuery data export. In this article, I’ll guide you through setting up your own BigQuery export using Node.js and Google Cloud Functions.