Troubleshooting: BigQuery destination
BigQuery API error: Access Denied
This error typically means that the Service Account you created in Google Cloud Platform (whose .json key file you uploaded in the importer's settings) is missing the required permissions. Please make sure that your Service Account has these roles:
- bigquery.dataEditor
- bigquery.jobUser
or these individual permissions:
- bigquery.tables.create
- bigquery.tables.updateData
- bigquery.jobs.create
For tips on creating a service account with the needed permissions, please see this article: https://help.coupler.io/article/190-how-to-get-google-cloud-key-file
Note: after changing the roles, you need to generate a new JSON key and upload it to Coupler.io as your BigQuery destination connection.
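If you want to double-check that the new key file has the required access before re-running the importer, you can test it outside Coupler.io. Below is a minimal sketch using the google-cloud-bigquery Python client; the key path, dataset, and table name are placeholders for illustration, not part of Coupler.io itself.

```python
# A minimal verification sketch, assuming the google-cloud-bigquery package is
# installed, "key.json" is the path to your downloaded service account key,
# and "my_dataset" is an existing dataset in your project (both placeholders).
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("key.json")

# Exercises bigquery.jobs.create: raises a 403 error if the jobUser role is missing.
query_job = client.query("SELECT 1 AS ok")
print(list(query_job.result()))

# Exercises table creation/deletion: fails if the dataEditor role
# (or equivalent permissions) is missing.
table_id = f"{client.project}.my_dataset.coupler_permission_check"
client.create_table(bigquery.Table(table_id), exists_ok=True)
client.delete_table(table_id, not_found_ok=True)
print("The service account key has the required access.")
```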
BigQuery table shows column names as string_field_0
Coupler.io uses BigQuery's auto schema detection by default. In some cases, BigQuery cannot generate the schema properly:
- If all columns in your dataset have text values.
- If the dataset you are trying to save is empty.
In these cases, BigQuery automatically names all columns string_field_0, string_field_1, string_field_2, and so on.
To avoid the first issue, make sure your source data contains at least one boolean, date/time, or integer field.
To avoid the second issue, make sure your dataset is not empty: extend your reporting period or adjust any filters you applied.
Once the dataset is fixed, re-run the importer in "replace" mode to update the table structure.
If the importer runs in "append" mode, the same issue can persist even when the data looks correct in the preview, because the existing BigQuery table schema is not updated (see the sketch after this list). You can either:
- Run the importer once in "replace" mode, then switch it back to "append" mode if you need it, or
- Create a new importer from scratch, run it in "replace" mode first, and then switch it to "append" mode.
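For context, this is roughly what happens under the hood: an append-style load reuses the destination table's existing schema, while a replace-style load overwrites the table and re-detects the schema. The sketch below illustrates the difference with the google-cloud-bigquery Python client; the project, dataset, table, and file names are placeholders, and this is not Coupler.io's actual implementation.

```python
# Illustrative sketch only, assuming the google-cloud-bigquery package and
# placeholder project/dataset/table/file names.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,
    # "Replace" behaviour: the table is overwritten and its schema re-detected,
    # so the string_field_0 names disappear.
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    # "Append" behaviour (WRITE_APPEND) keeps the existing table schema,
    # which is why the bad column names persist in append mode.
)

with open("data.csv", "rb") as source_file:  # placeholder source file
    load_job = client.load_table_from_file(
        source_file, "my_project.my_dataset.my_table", job_config=job_config
    )
load_job.result()  # wait for the load to complete
```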
If you don't want to change the dataset, you can specify the BigQuery schema manually. Read more here: How to generate BigQuery schema to define column types manually?
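For reference, a manually specified schema looks roughly like the sketch below, using the google-cloud-bigquery Python client. The column names, types, and file/table names are examples only and should match your own data.

```python
# A minimal sketch of loading data with an explicit schema instead of
# auto-detection. Column names/types and file/table names are examples only.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=False,      # use the schema below instead of auto-detection
    schema=[
        bigquery.SchemaField("order_id", "STRING"),
        bigquery.SchemaField("order_date", "DATE"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

with open("data.csv", "rb") as source_file:
    client.load_table_from_file(
        source_file, "my_project.my_dataset.my_table", job_config=job_config
    ).result()
```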