This section describes bulk data loading into Snowflake tables using the COPY INTO <table> command. The process is essentially the same whether you are loading from data files on your local file system, in Amazon S3 buckets, or in Microsoft Azure containers.

Currently, Amazon S3 and Microsoft Azure are the only supported services for staging external data. Snowflake assumes the files have already been staged in one of these locations. If they haven’t been staged yet, upload them using the interfaces or utilities provided by the service that hosts the location.
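For example, files destined for an S3 staging location can be uploaded with the AWS CLI. This is only a sketch; the bucket name, path, and file name below are placeholders, and your account must already have write access to the bucket:

```
# Upload a local data file to the S3 staging location (names are illustrative).
aws s3 cp data/contacts1.csv s3://mybucket/load/

# Confirm the file is present before running COPY INTO <table>.
aws s3 ls s3://mybucket/load/
```

Microsoft Azure provides analogous tooling (e.g. the az CLI or AzCopy) for uploading files to a container.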

Use the COPY INTO <table> command to load the contents of the staged file(s) into a Snowflake database table.

This step requires a running virtual warehouse that is also the current warehouse (i.e. in use) for the session. The warehouse provides the compute resources needed to perform the actual insertion of rows into the table.
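Putting the two requirements together, a load typically sets the session’s current warehouse and then runs COPY INTO <table>. The warehouse, table, stage, and file-name pattern below are placeholder names for illustration:

```
-- Make a running warehouse the current warehouse for the session.
USE WAREHOUSE mywarehouse;

-- Load staged CSV files whose names match the pattern into the target table.
COPY INTO mytable
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"')
  PATTERN = '.*contacts[1-5].csv';
```

If the warehouse is suspended, Snowflake can resume it automatically (depending on the warehouse’s auto-resume setting) when the COPY statement executes.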