When performing an import with the UPDATE or DELETE operation, you need to manually specify the columns to be used as a primary key.
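As a rough illustration of why the primary key columns matter, here is a hypothetical sketch of how an UPDATE statement could be built from them: the specified columns go into the WHERE clause to match the target record, and all remaining columns are updated. The function and table names are assumptions, not Skyvia's actual implementation.

```python
# Hypothetical sketch: the user-specified primary key columns identify the
# record to update; the remaining columns become the SET clause.
def build_update(table, row, pk_columns):
    set_cols = [c for c in row if c not in pk_columns]
    set_clause = ", ".join(f"{c} = @{c}" for c in set_cols)
    where_clause = " AND ".join(f"{c} = @{c}" for c in pk_columns)
    return f"UPDATE {table} SET {set_clause} WHERE {where_clause}"

print(build_update("dataset.contacts",
                   {"Id": 1, "Name": "Ann", "Email": "ann@example.com"},
                   ["Id"]))
# UPDATE dataset.contacts SET Name = @Name, Email = @Email WHERE Id = @Id
```

Without the primary key columns, there is nothing to put into the WHERE clause, so the operation cannot locate the records to update or delete.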

Google BigQuery Connections

To create a Google BigQuery connection, perform the following steps:

1. Open the Connections page by clicking Connections in the menu on the left side of the page.

2. Click the New button.

3. In the pane that opens, click Google BigQuery.

4. In the New Connection dialog box that opens, specify a connection Name that will be used to identify the connection.

5. Click Log In with Google BigQuery.

6. In the window that opens, enter your Google BigQuery credentials and click Sign in.

7. Click the Allow button.

8. Specify the Project Id and DataSet Id to connect to.

9. Specify the name of the bucket (Cloud Storage Bucket) that will be used for temporary uploads of CSV files when performing Import or Replication to Google BigQuery. The uploaded files are deleted after the corresponding operations finish.

10. Optionally, click the Advanced button and set the Command Timeout interval. It specifies how long to wait before terminating an attempt to execute a command and generating an error. Note that this is the time to wait for any server reply after the command is sent to the server; it doesn't include the time necessary to fetch data if the command selects data.

11. Here you can also optionally enable legacy SQL syntax by selecting the Use Legacy SQL check box.

12. In the Advanced settings, you can also disable bulk import by clearing the Use Bulk Import check box. This setting affects only import packages that use the INSERT operation with BigQuery as a target.

By default, such packages import data in the following way: Skyvia writes data into multiple temporary CSV files, uploads them to Google Cloud Storage, and then tells Google BigQuery to import data from these CSV files. These actions are performed in parallel, providing the best performance. After the CSV files are imported, they are deleted. However, when data is imported this way, it's not possible to obtain a per-record error log.
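The staging step above can be sketched as follows. This is only an illustration of splitting records into temporary CSV chunks; the chunk size and file naming are assumptions, and the actual upload to the Cloud Storage bucket and the BigQuery load job are omitted.

```python
import csv
import os
import tempfile

# Sketch of the staging step: split records into multiple temporary CSV
# files, which could then be uploaded to a Cloud Storage bucket and loaded
# into BigQuery. Chunk size and file names are illustrative assumptions.
def stage_to_csv(records, fieldnames, chunk_size=2):
    paths = []
    tmp_dir = tempfile.mkdtemp()
    for i in range(0, len(records), chunk_size):
        path = os.path.join(tmp_dir, f"chunk_{i // chunk_size}.csv")
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(records[i:i + chunk_size])
        paths.append(path)
    return paths

files = stage_to_csv(
    [{"Id": 1, "Name": "Ann"}, {"Id": 2, "Name": "Bob"}, {"Id": 3, "Name": "Eve"}],
    ["Id", "Name"])
print(len(files))  # 3 records with chunk_size=2 -> 2 files
```

Because whole files are loaded at once, failures are reported per file or per load job rather than per record, which is why no per-record error log is available in this mode.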

If you disable bulk import, Skyvia uses INSERT statements to import data. This allows you to obtain a per-record error log but is much slower. Besides, you may quickly hit the BigQuery limit of 1000 INSERT statements per day per table. Thus, disabling bulk import for BigQuery is not recommended; consider it only if you need to import a small amount of data and need a per-record error log, or if you don't want to use Google Cloud Storage for temporary files.
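The row-by-row alternative can be sketched like this: one INSERT per record, with failures collected into a per-record error log. The `execute` callback stands in for a real BigQuery client call; the function and table names are assumptions for illustration only.

```python
# Sketch of the non-bulk path: one INSERT statement per record, which makes
# a per-record error log possible but is far slower than bulk loading.
def import_rows(table, rows, execute):
    errors = []
    for i, row in enumerate(rows):
        cols = ", ".join(row)
        vals = ", ".join(f"@{c}" for c in row)
        sql = f"INSERT INTO {table} ({cols}) VALUES ({vals})"
        try:
            execute(sql, row)  # stand-in for a real BigQuery client call
        except Exception as exc:
            errors.append((i, str(exc)))  # per-record error log
    return errors

def fake_execute(sql, params):
    # Simulated backend that rejects one record.
    if params["Id"] == 2:
        raise ValueError("duplicate key")

log = import_rows("dataset.contacts",
                  [{"Id": 1, "Name": "Ann"}, {"Id": 2, "Name": "Bob"}],
                  fake_execute)
print(log)  # [(1, 'duplicate key')]
```

The trade-off is visible here: each record costs a separate statement, so large imports both run slowly and consume the per-table statement quota quickly.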