BigQuery

Data anomalies monitoring as dbt tests

Supports BigQuery since version 0.3.4


BigQuery connection profile

After installing Elementary’s dbt package (by running dbt deps), you can generate Elementary’s profile for use with edr by running the following command within your dbt project:

dbt run-operation elementary.generate_elementary_cli_profile

The command prints to the terminal a partially filled template of the profile that edr needs. Fill in the missing authentication fields and copy the completed profile to ~/.dbt/profiles.yml.
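
For example, a minimal sequence from the root of your dbt project (assuming the Elementary package is already listed in your packages.yml) might look like this:

# Install the Elementary dbt package declared in packages.yml
dbt deps

# Print a partially filled 'elementary' profile template to the terminal;
# copy it into ~/.dbt/profiles.yml and complete the authentication fields
dbt run-operation elementary.generate_elementary_cli_profile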

The profiles.yml file to connect to BigQuery should contain a profile in the following format:

BigQuery
## BIGQUERY ##
## By default, edr expects the profile name 'elementary'.      ##
## Configure the database and schema of elementary models.     ##
## Check where 'elementary_test_results' is to find it.        ##

elementary:
  outputs:
    default:
      type: bigquery

      ## Service account auth, other options require other configs ##
      method: service-account
      keyfile: [full path to your keyfile]

      project: [project id]
      dataset: [dataset name] # elementary dataset, usually [dataset name]_elementary
      threads: 8
      location: [dataset location]
      priority: interactive
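
If you are not sure which dataset Elementary writes to, one way to find it (as the comments above suggest) is to look up where the elementary_test_results table lives. A minimal sketch using the bq CLI, assuming your datasets are in the US multi-region (replace region-us with your location):

# List the dataset(s) that contain the 'elementary_test_results' table
bq query --use_legacy_sql=false \
  'SELECT table_schema
   FROM `region-us`.INFORMATION_SCHEMA.TABLES
   WHERE table_name = "elementary_test_results"'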

We support the same format and connection methods as dbt. Please refer to dbt’s BigQuery documentation for further details.
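
Once the profile is in place, a quick sanity check is to run one of edr’s commands and confirm it can connect to BigQuery, for example (assuming edr is installed in the same environment):

# Generate Elementary's local observability report using the new profile
edr report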

Create BigQuery service account

For Elementary to be able to access the jobs-by-project data in the information schema (the INFORMATION_SCHEMA.JOBS_BY_PROJECT view), there are two options:

  1. Create a service account with the Project Owner or BigQuery Admin role. For this option, skip ahead to Create service account.
  2. Recommended: Create a service account with lower permissions by using a custom role. For this option, you need to create the custom role first.

Create custom role

  1. In the Cloud Console, go to: IAM & Admin > Roles

  2. Click on ‘CREATE ROLE’

  3. Give the role a title, description, etc.

  4. Click on ‘+ ADD PERMISSIONS’

  5. Using the filter, find and add the permissions bigquery.jobs.listAll and bigquery.jobs.create, then click ‘ADD’.

  6. Click ‘CREATE’ and make sure the new role was created and is now in the roles list.
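
If you prefer the gcloud CLI over the Cloud Console, the same role can be created with a command along these lines (the project ID my-project and role ID elementary_role are placeholders; use your own values):

# Create a custom role with only the two required job permissions
gcloud iam roles create elementary_role \
  --project=my-project \
  --title="Elementary" \
  --description="Minimal permissions for Elementary job monitoring" \
  --permissions=bigquery.jobs.create,bigquery.jobs.listAll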

Create service account

  1. In the Cloud Console, go to: IAM & Admin > Service Accounts

  2. Click on ‘CREATE SERVICE ACCOUNT’

  3. Fill in the service account name (‘elementary’) and account description (‘Elementary Data’) and click ‘CREATE AND CONTINUE’.

  4. Configure the relevant permissions for this new service account. As described above, there are two options:
     - Option 1: Choose BigQuery Admin or Owner.
     - Option 2 (recommended): Choose the following 3 roles: BigQuery Data Editor, BigQuery User, and the custom role you created with the instructions above (a gcloud sketch of these grants appears after this list).

  5. The last step is optional, so skip it if you don’t need to manage this service account with another service account, and press ‘DONE’.

  6. Press the dots icon to the right of your new service account and select ‘Manage keys’.

  7. Press ‘ADD KEY’ and select ‘Create new key’.

  8. Select the ‘JSON’ option radio button and press ‘CREATE’.

  9. This will automatically generate and download a JSON file with the private key for this service account. This JSON file provides the credentials to programmatically connect and work with your BigQuery environment.

  10. Add the full path of this JSON file to your connection profile under ‘keyfile’.
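
The Cloud Console steps above can also be scripted with the gcloud CLI. A rough equivalent, reusing the placeholder project ID my-project and the custom role ID elementary_role from the earlier sketch:

PROJECT_ID=my-project
SA_EMAIL=elementary@${PROJECT_ID}.iam.gserviceaccount.com

# Create the 'elementary' service account
gcloud iam service-accounts create elementary \
  --project=${PROJECT_ID} \
  --display-name="Elementary Data"

# Grant BigQuery Data Editor, BigQuery User and the custom role
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.user"
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
  --member="serviceAccount:${SA_EMAIL}" --role="projects/${PROJECT_ID}/roles/elementary_role"

# Create and download a JSON key; point 'keyfile' in profiles.yml to this path
gcloud iam service-accounts keys create ~/elementary-keyfile.json \
  --iam-account=${SA_EMAIL}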

Have a question?

We are available on Slack; reach out for any kind of help!