Note that you can leverage concurrency to copy file slices in parallel. This is useful when you want to backfill slices that occurred in the past.

Caveat: concurrent copy activities that access the same host via a UNC path with different user accounts may fail with an error such as "multiple connections to a server or shared resource by the same user, using more than one user name, are not allowed". This is an OS restriction enforced for security reasons. To work around it, either schedule the copy activities on different gateways, or install the gateway on the host itself and use "localhost" or "local" instead of the UNC path.

Now it is fairly easy to author a copy activity that copies records to Azure Blob based on time slices, using macros in the Oracle reader query. Note that you can specify a concurrency number to run the qualified copy activities in parallel:
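As a sketch of what such an activity might look like, here is a copy activity definition that uses the SliceStart and SliceEnd macros in the Oracle reader query and sets a concurrency of 4. The table, column, and dataset names (MyTable, timestampcolumn, OracleInput, AzureBlobOutput) are hypothetical placeholders:

```json
{
  "name": "OracleToBlobCopy",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "OracleSource",
      "oracleReaderQuery": "$$Text.Format('SELECT * FROM MyTable WHERE timestampcolumn >= TO_DATE(\\'{0:yyyy-MM-dd HH:mm}\\', \\'YYYY-MM-DD HH24:MI\\') AND timestampcolumn < TO_DATE(\\'{1:yyyy-MM-dd HH:mm}\\', \\'YYYY-MM-DD HH24:MI\\')', SliceStart, SliceEnd)"
    },
    "sink": {
      "type": "BlobSink"
    }
  },
  "inputs": [ { "name": "OracleInput" } ],
  "outputs": [ { "name": "AzureBlobOutput" } ],
  "policy": {
    "concurrency": 4
  },
  "scheduler": {
    "frequency": "Hour",
    "interval": 1
  }
}
```

With concurrency set to 4, up to four hourly slices are processed in parallel, which speeds up backfilling historical slices considerably.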

Specify Encoding for Text Files

Though UTF-8 encoding is quite popular, text files in Azure Blob often follow other encodings for historical reasons. With the newly introduced encodingName property, you can now specify the encoding by code page name for tables of TextFormat type, e.g. "gb2312" or "windows-1255". If the value is omitted, the connector falls back to UTF-8 as usual, unless the BOM denotes another Unicode encoding. Refer to this link for the supported encoding names.
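For illustration, a blob dataset of TextFormat type with encodingName set might look like the following. The dataset, linked service, and folder names are hypothetical:

```json
{
  "name": "AzureBlobTextInput",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "AzureStorageLinkedService",
    "typeProperties": {
      "folderPath": "mycontainer/myfolder",
      "format": {
        "type": "TextFormat",
        "encodingName": "gb2312",
        "columnDelimiter": ","
      }
    },
    "availability": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}
```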

In your database, define a stored procedure with the same name as SqlWriterStoredProcedureName. It handles input data from your specified source and inserts it into the output table. Note that the parameter name of the stored procedure should be the same as the tableName defined in the Table JSON file.
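A minimal T-SQL sketch of such a stored procedure is shown below. It assumes a hypothetical target table dbo.MyTable with ID and Name columns, a table type MyTableType matching the copied schema, and a Table JSON file whose tableName is "MyTable" (hence the @MyTable parameter name):

```sql
-- Table type describing the shape of the incoming data
CREATE TYPE [dbo].[MyTableType] AS TABLE
(
    [ID] INT NOT NULL,
    [Name] NVARCHAR(50)
)
GO

-- Procedure name should match SqlWriterStoredProcedureName;
-- the parameter name must match the tableName in the Table JSON file
CREATE PROCEDURE [dbo].[spWriteMyTable]
    @MyTable [dbo].[MyTableType] READONLY
AS
BEGIN
    INSERT INTO [dbo].[MyTable] (ID, Name)
    SELECT ID, Name FROM @MyTable
END
```

Because the procedure receives the whole batch as a table-valued parameter, it can also apply custom logic (such as upserts or transformations) before writing to the output table.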