How do I export platform data into BigQuery?

For paying subscribers looking to export SEO data into Google BigQuery for visualisation in Google Data Studio, Power BI, etc.

Written by Laurence O'Toole

To export your Authoritas SEO platform data into Google's BigQuery service, so that you can run complex SQL queries on large datasets or use the data as the basis for a dashboard in Google Data Studio or another visualisation tool, you will need to complete the following steps.

We will shortly be publishing a series of powerful Google Data Studio reports which utilise these BigQuery datasets to help you build and customise a beautiful set of SEO reports in no time at all. The first iteration of these reports will be published in early June 2021.

N.B.: Currently this is available free of charge only to paying monthly or annual subscribers on the Business Expert package and to our paying Agency Providers and Agency Partners.

If you have a legacy BigQuery integration, please speak to your client manager about how to migrate to this new, enhanced data feed. Migration is possible, but it is essential that you create a new BigQuery project and service key and upload them to the platform, or you risk overwriting your existing data feed.


What types of SEO platform data can be exported into BigQuery?

We have recently added a new BigQuery integration option which allows you to export the following data from your Authoritas platform projects into a single, dedicated BigQuery project that you create by following the instructions set out below.

  • Keyword Ranking Data (including tags)

  • Share of Search

  • Google Search Console

  • Google Analytics

  • Website crawl

  • Backlink data

How often is the data updated in BigQuery?

Every time a new event is created on the platform, we sync that activity to BigQuery as soon as possible. This means that if you create Google Data Studio reports or other visualisations using the data exported from the platform, you can be confident the data is up to date.

Getting Started - How do I create a BigQuery project and service key?

N.B.: You only need to create one BigQuery project and service key!

We will send all your platform data to this single project, but each platform project's data will sit in a separate dataset with identical table names, making it easy to copy a single Google Data Studio report and connect it to the BigQuery data you want. Each platform project will have a dataset containing the tables illustrated below.

The data structure in your BigQuery instance will be:

BigQuery Project > Dataset > Tables

[Screenshot: BigQuery project > dataset > tables structure]
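Once the export has run, each table can be queried with standard SQL from the BigQuery console or from code. Below is a minimal Python sketch of building such a query; the project, dataset, and table names are hypothetical placeholders, so substitute the names you see in your own BigQuery console:

```python
# Sketch: build a Standard SQL query against one exported table.
# All identifiers below are hypothetical placeholders; use the project,
# dataset, and table names shown in your own BigQuery console.
GCP_PROJECT = "my-gcp-project"      # the single BigQuery project you created
DATASET = "authoritas_project_1"    # one dataset per platform project
TABLE = "keyword_rankings"          # e.g. the keyword ranking export

def ranking_query(project: str, dataset: str, table: str, limit: int = 100) -> str:
    """Return a Standard SQL query over a fully qualified table name."""
    return (
        f"SELECT *\n"
        f"FROM `{project}.{dataset}.{table}`\n"
        f"LIMIT {limit}"
    )

query = ranking_query(GCP_PROJECT, DATASET, TABLE)
print(query)

# Executing it requires the google-cloud-bigquery client library and
# valid credentials, e.g.:
#   from google.cloud import bigquery
#   client = bigquery.Client(project=GCP_PROJECT)
#   rows = client.query(query).result()
```

The same fully qualified `project.dataset.table` form is what Google Data Studio's BigQuery connector points at when you change a template's data source.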




(You are responsible for paying your own Google Cloud Platform costs. At the time of writing, the first 1 TB of data processed with BigQuery each month is free, but please check Google's latest BigQuery pricing.)
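As a rough illustration of how that free allowance works, here is a back-of-envelope estimate in Python. The 1 TB free tier matches the note above, but the $5-per-TB on-demand rate is an assumption for illustration only; always check Google's current pricing:

```python
# Back-of-envelope estimate of monthly BigQuery on-demand query costs.
# The 1 TB monthly free tier matches the note above; the $5/TB rate is
# an assumed illustrative figure, not current pricing.
FREE_TB_PER_MONTH = 1.0
ASSUMED_USD_PER_TB = 5.0  # hypothetical; check Google's pricing page

def estimated_monthly_cost(tb_processed: float) -> float:
    """USD cost for a month's query processing after the free tier."""
    billable_tb = max(0.0, tb_processed - FREE_TB_PER_MONTH)
    return billable_tb * ASSUMED_USD_PER_TB

print(estimated_monthly_cost(0.4))  # inside the free tier: 0.0
print(estimated_monthly_cost(3.0))  # 2 TB billable at the assumed rate: 10.0
```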

1) Log in to your Google Cloud Console

2) Create a Service Account

[Screenshot: IAM & Admin > Service Accounts]
  • Go to IAM & Admin > Service Accounts and follow the wizard

2 a)

  • Just provide a name and (optionally) a description. Google will automatically create the Service account ID:

[Screenshot: service account name and ID]

2 b)

  • Assign the service account you've just created the 'BigQuery User' role:

[Screenshot: role selection showing 'BigQuery User']

2 c) Grant users access to this service account

  • This is an optional step. You can just click on the DONE button here.

3) Create a key file

  • From the Service Accounts view (IAM & Admin > Service Accounts), click on the Actions column on the right-hand side and select 'Create key'.

[Screenshot: 'Create key' action in the Actions column]

This will ask if you want to create a JSON file or a P12 file:

[Screenshot: key type dialog with JSON and P12 options]
  • Leave 'JSON' selected and click on 'CREATE'. This will download a JSON file to your machine.

  • If you now go to IAM & Admin > IAM, you should see your service account listed as a Member with the 'BigQuery User' role assigned to it. If you don't see this (highlighted in the orange box below), you will need to manually add a Member, ensuring the Member name matches the service account and that the 'BigQuery User' role is assigned to it:

[Screenshot: IAM members list with the service account highlighted]
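Before uploading the key in Step 5, you may want to sanity-check the downloaded file. The sketch below validates the standard fields found in Google service-account JSON keys; the function name and example path are our own, for illustration:

```python
# Sanity-check a downloaded service-account key file before uploading it.
# The required fields are standard in Google service-account JSON keys;
# the function name and example path are illustrative, not platform APIs.
import json

REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key_file(path: str) -> str:
    """Raise ValueError if the file does not look like a service-account
    key; return the service account's email address on success."""
    with open(path) as fh:
        key = json.load(fh)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']!r}")
    return key["client_email"]

# Usage (the file name is illustrative):
#   print(check_key_file("my-gcp-project-1234abcd.json"))
```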

4) Add the BigQuery integration to one or more projects in your Authoritas account

  • Log in to Authoritas, go to Settings > Integrations and click on 'Google Big Query':

[Screenshot: Integrations page showing the 'Google Big Query' option]

Select one or more Authoritas projects to sync with BigQuery:

[Screenshot: project selection for the BigQuery sync]

5) Upload the JSON file

Finally, just upload the JSON file you created in Step 3. The platform will then send your project data to BigQuery. You will then be able to take a Google Data Studio template and populate it with your own data by changing the template's data source to point at your BigQuery instance.

N.B.:

  1. If you have a legacy BigQuery integration that is already running, please ensure you upload a different set of credentials here, or you risk overwriting your existing data feed. If you need assistance, please message the team.

  2. You must enable billing on your Google BigQuery account, or Google will automatically delete the data after 60 days! We also use some advanced features to stream data to BigQuery; these may be unavailable or limited on the free BigQuery plan, which may impact your SEO reports. Please note, this does not necessarily mean you will start incurring additional costs straight away, as Google offers a reasonably generous free allowance.
