Before Google released Google Analytics 4, only GA 360 users could send raw events from Google Analytics to BigQuery. Because GA 360 is not free (pricing starts at $70,000 per year), not every business could use raw data exploration.
If you use GA4, you can follow Google’s official documentation on how to set up BigQuery Export.
For Google Universal Analytics, Google still does not provide a free option to export raw data to BigQuery or any other database. But what if you don’t plan to migrate to GA4 and still want the benefits of BigQuery raw data? I decided to solve this problem with GTM Server Side tagging and created this how-to for you.
How I send GA events from the browser to a BigQuery table
- gtag.js sends a GA event to the Google Tag Manager Server container
- The Universal Analytics client receives the request and triggers the JSON HTTP Request tag
- The JSON HTTP Request tag sends the data to the URL of the Request To Google Cloud Storage function
- The function saves the data to your GCS bucket
- BigQuery queries the data through an external table backed by the GCS bucket
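To make the data handoff concrete, here is a small illustration of the second and third steps: a Universal Analytics hit arrives as Measurement Protocol query parameters, and the tag forwards it as a JSON object. This is a sketch of the idea, not the actual GTM client or tag code; the parameter names are standard Measurement Protocol fields.

```javascript
// Sketch only: turn a Measurement Protocol hit (query string) into the
// kind of JSON object the JSON HTTP Request tag would forward.
function hitToEvent(queryString) {
  const params = new URLSearchParams(queryString);
  return Object.fromEntries(params.entries());
}

// "v" = protocol version, "t" = hit type, "tid" = property ID, "dp" = page path
const event = hitToEvent('v=1&t=pageview&tid=UA-12345-1&dp=%2Fhome');
console.log(event.t);  // "pageview"
console.log(event.dp); // "/home"
```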
Things you should know about this flow before you start the how-to
- BigQuery can use a Google Cloud Storage bucket as an external table. More info about this feature can be found in the official documentation.
- You can store 5 GB of data on GCS for free. One GA request is about 1 KB, so that’s roughly 5,242,880 requests. More details about GCS pricing can be found on the pricing page.
- The first 2 million Cloud Functions invocations per month are free, which is more than enough for our task, but remember to check usage and pricing if you already use Cloud Functions.
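The free-tier arithmetic from the list above is easy to verify:

```javascript
// 5 GB of free GCS storage at roughly 1 KB per GA request
const freeBytes = 5 * 1024 * 1024 * 1024; // 5 GB
const requestBytes = 1024;                // ~1 KB per request
console.log(freeBytes / requestBytes);    // 5242880 requests fit for free
```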
I decided to divide this how-to into two parts: first, how to save the incoming request data on GCS, and second, how to query this data with BigQuery. I hope this helps you get the desired result faster.
How to store data from a Google Analytics client inside Google Cloud Storage
Using this how-to, you can store data from any GTM Server Side client, not only Google Analytics.
- We will start by creating a Google Cloud Storage bucket that will contain our raw event data. Search for the Storage product in the GCP search bar, then click “Create Bucket”.
- Choose a name for your bucket, click “Continue”, and choose the “us-central1” region. Leave the other options at their default values.
- Now you need to create a Cloud Function that receives data from the Google Tag Manager Server-Side container. Go to the Google Cloud Console, open the Cloud Functions section, and click “Create Function”.
- Fill in the function name and select the “Allow unauthenticated invocations” authentication option. Copy the “Trigger URL” somewhere; we will use it in the next steps. Click the “Next” button at the bottom of the page.
- Set the “Entry point” field to “init”. Copy the content of this file and paste it into the code window. Change ‘your-bucket-name’ to the name of the bucket you created in the previous steps, and change ‘very-secret-random-key’ to some random key; we will need it later. Set the runtime to Node.js 10. One more thing: you need to edit the package.json file. Put this file’s content into it.
- Now you can click the “Deploy” button. Google Cloud needs a few minutes to create the function for the first time, so give it a little time to finish deploying.
- Time to create a tag inside the GTM Server container. If you don’t have a Google Tag Manager Server-Side container, you can follow this instruction to create one. You also need the JSON HTTP Request tag template installed in your GTM container; if you don’t have it, follow this instruction to add it. Create a new JSON HTTP Request tag and fill “Destination URL” with the URL you got when creating the Cloud Function in step 4, appending your secret key like this: “?key=your-secret-key”. Don’t forget to add the trigger you use to detect Universal Analytics requests (check my screenshots for this step).
- Publish the GTM Server container changes and generate test requests to the GA tag; you can simply open your website. If you have done everything right, you will see files generated in the Google Cloud Storage bucket.
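The linked function file is the source of truth for step 5; as a rough sketch of what such an “init” entry point does, it might look like the following. The bucket name, the key check, and the object naming below are my assumptions for illustration, not the exact contents of the linked file.

```javascript
// Hypothetical sketch of the Cloud Function entry point. It rejects
// requests without the secret key, then writes the request body to the
// bucket as one JSON object per event.
const BUCKET_NAME = 'your-bucket-name';      // the bucket from step 1
const SECRET_KEY = 'very-secret-random-key'; // the key from step 5

// The storage client is injected so the logic is easy to test. In the
// deployed function you would create a real client:
//   const { Storage } = require('@google-cloud/storage');
//   exports.init = makeHandler(new Storage());
function makeHandler(storage) {
  return async (req, res) => {
    if (req.query.key !== SECRET_KEY) {
      res.status(403).send('Forbidden'); // wrong or missing ?key=
      return;
    }
    // A unique object name per request avoids overwrites.
    const name = `event-${Date.now()}-${Math.random().toString(36).slice(2)}.json`;
    await storage.bucket(BUCKET_NAME).file(name).save(JSON.stringify(req.body));
    res.status(200).send('ok');
  };
}
```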
If you set up storing raw events on GCS correctly, congratulations! The hardest part is done. Let’s follow a few more steps before you can start querying the data.
How to query data in Google Cloud Storage using BigQuery
- Go to the BigQuery product in GCP and click the “Create dataset” button.
- Leave all options at their defaults; just fill in the name field.
- Now let’s create a table in this dataset. Click “Create table”. In the source selector, choose “Google Cloud Storage”. The “Select file from GCS bucket” field has to look like this: “your-bucket-name/*.json”. Set the table type to “External table”, choose any table name you want, and check the “Auto detect” option. At the end of the form, click the “Create table” button.
- Now you can query your data. Click the button to query the table, set “*” as the field list, and run the query. If you have done everything correctly, you will see raw GA data in the query results.
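For reference, a query against the external table might look like the one below. The project, dataset, and table names are placeholders, and the available columns depend on what your JSON HTTP Request tag actually sends.

```sql
SELECT *
FROM `your-project.your_dataset.your_table`
LIMIT 100;
```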
Please feel free to ask any questions about GA events export in the comments or our customer support channels.