If you already have a Microsoft Azure account and use Azure containers for storing and managing your data files, you can use those existing containers and folder paths when bulk loading data into Snowflake.
This set of topics describes how to use the COPY command to load data from an Azure container into tables.

Loading from block, append, and page blobs is supported. Unloaded files are created as block blobs. For information about these blob types, see the
Azure documentation on blob types.

As illustrated in the diagram below, loading data from an Azure container is performed in two steps:

Step 1:

Snowflake assumes the data files have already been staged in an Azure container. If they haven’t been staged yet, use the upload interfaces/utilities provided by Microsoft to stage the files.
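One of the upload utilities Microsoft provides is the AzCopy command-line tool. The following is a minimal sketch of staging local CSV files into a container with AzCopy; the account name, container name, folder path, and SAS token are all placeholders you would replace with your own values:

```shell
# Upload local data files to an Azure container (placeholder names and SAS token).
azcopy copy "data/*.csv" \
  "https://myaccount.blob.core.windows.net/mycontainer/load/?<sas-token>"
```

You could equally use the Azure Portal or an Azure SDK; Snowflake only requires that the files end up in the container before the COPY operation runs.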

Step 2:

Use the COPY INTO <table> command to load the contents of the staged file(s) into a Snowflake database table. You can load directly from the container, but
Snowflake recommends creating an external stage that references the container and using the external stage instead.
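As a sketch of the recommended approach, the following creates an external stage that references the container and then loads from it. The stage name, table name, file format name, account/container URL, and SAS token are placeholders for illustration:

```sql
-- Create an external stage referencing the Azure container (placeholder values).
CREATE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/load/'
  CREDENTIALS = ( AZURE_SAS_TOKEN = '?sv=...' )
  FILE_FORMAT = mycsvformat;

-- Load all matching staged files into the target table.
COPY INTO mytable
  FROM @my_azure_stage
  PATTERN = '.*[.]csv';
```

Loading directly from the container instead would mean repeating the URL and credentials in every COPY statement, which is why the external stage is preferred.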

Regardless of the method you use, this step requires a running, current virtual warehouse for the session. The warehouse provides the compute resources to perform the actual insertion of rows into
the table.
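If no warehouse is currently active for the session, you can resume and select one before running the COPY command. A minimal sketch, assuming a warehouse named mywh already exists:

```sql
-- Resume the warehouse if it is suspended, then make it current for the session.
ALTER WAREHOUSE mywh RESUME IF SUSPENDED;
USE WAREHOUSE mywh;
```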

Tip

The instructions in this set of topics assume you have read Preparing to Load Data and have created a named file format, if desired.
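For reference, a named file format is created once and then reused by stages and COPY statements. A minimal sketch, where the format name and options are placeholders chosen for illustration:

```sql
-- Named file format for pipe-delimited CSV files with a header row (placeholder name).
CREATE OR REPLACE FILE FORMAT mycsvformat
  TYPE = 'CSV'
  FIELD_DELIMITER = '|'
  SKIP_HEADER = 1;
```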