Ideas

BigQuery input to TD via Data Connector

Import data from BigQuery by specifying a BigQuery SQL statement.

To import data from BigQuery today, a TD user can dump a BigQuery job result into GCS and then use the GCS data connector. However, this requires a lot of manual work and makes the data pipeline difficult to manage for existing BigQuery users.

It would be much easier if we had a BigQuery input data connector: the user would just specify a BigQuery SQL statement, and the connector would handle all of the underlying complexity.

While we do aim to make this available, please note that there is a workaround today. Specifically, you can output from BigQuery to the GCS object store, and then use our GCS Data Connector to import into Treasure Data.
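The workaround above can be sketched with the `bq` CLI. The project, dataset, table, and bucket names below are hypothetical placeholders:

```shell
# Step 1: export the BigQuery table to GCS as CSV
# (hypothetical project/dataset/bucket names).
bq extract \
  --destination_format=CSV \
  'my-project:my_dataset.my_table' \
  'gs://my-td-staging/my_table_*.csv'

# Step 2: point the TD GCS Data Connector at the exported
# objects (the bucket and path prefix go into the connector's
# load configuration) to import them into Treasure Data.
```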

If a BigQuery table has RECORD (nested) columns, CSV-formatted export to GCS is not available. JSON- or Avro-formatted export is possible, but the output carries the nested structure and type information, and is usually difficult for the Data Connector to parse.
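For such a table, the export itself still works in newline-delimited JSON format (nested RECORD fields become nested JSON objects); the difficulty is on the parsing side. A sketch with hypothetical names:

```shell
# CSV export fails for tables with RECORD columns, but
# newline-delimited JSON export succeeds. The nested fields
# are emitted as nested JSON objects, which the flat-schema
# Data Connector cannot easily ingest.
bq extract \
  --destination_format=NEWLINE_DELIMITED_JSON \
  'my-project:my_dataset.my_nested_table' \
  'gs://my-td-staging/my_nested_table_*.json'
```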

Firebase's data export to BigQuery is one example of this case.

In that case, the user has to export data from the original table into a temporary table, using a query that produces CSV-compatible (flattened) output, and then export the data from the temporary table to GCS. There is no way to output a query result to GCS directly. So the total workflow is as follows:

1. Run a flattening query and write the result to a temporary table.
2. Export the temporary table to GCS in CSV format.
3. Import the GCS objects into Treasure Data with the GCS Data Connector.
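This workflow can be sketched with the `bq` CLI. The dataset, table, and bucket names below are hypothetical, and the flattening query assumes a Firebase-style `event_params` array just for illustration:

```shell
# Step 1: flatten the RECORD columns into a temporary table,
# since `bq extract` cannot export a query result directly.
bq query \
  --use_legacy_sql=false \
  --destination_table='my_dataset.events_flat' \
  'SELECT event_date, event_name,
          (SELECT value.string_value FROM UNNEST(event_params)
           WHERE key = "page_location") AS page_location
   FROM `my-project.my_dataset.events_20240101`'

# Step 2: export the flattened temporary table to GCS as CSV.
bq extract \
  --destination_format=CSV \
  'my-project:my_dataset.events_flat' \
  'gs://my-td-staging/events_flat_*.csv'

# Step 3: import gs://my-td-staging/events_flat_*.csv into
# Treasure Data with the GCS Data Connector, then drop the
# temporary table.
bq rm -f -t 'my_dataset.events_flat'
```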