I would remove all constraints and turn off the identity column (just to make it as simple as possible). If the data imports, then you can turn everything back on. I hope you don't have to go through the export process every time you try this. You know that
you can save the SQL script after the export process and start over from there. As a matter of fact, once you have your tables created, you can just run the BCP command yourself from a command window (look at the bottom of the generated T-SQL script for the BCP
arguments to get the parameters). You can watch how many records are uploaded and then restart BCP where the last batch succeeded. That is basically what SQLAzureMW does.
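For example, a manual import with a restart might look something like this. The server, database, credential, and file names below are placeholders; take the real arguments from the bottom of the generated script:

```shell
REM Hypothetical example: import a BCP data file in batches of 10,000 rows.
REM -n = native format, -b = batch size, -E = keep identity values.
bcp MyDb.dbo.SurveysResults in C:\SQLAzureMW\BCPData\dbo.SurveysResults.dat ^
    -S myserver.database.windows.net -U myuser -P mypassword ^
    -n -b 10000 -E

REM If the upload dies partway through, only complete batches are committed.
REM Restart at the first uncommitted row with -F. For instance, if 50,000
REM rows committed, restart at row 50001:
bcp MyDb.dbo.SurveysResults in C:\SQLAzureMW\BCPData\dbo.SurveysResults.dat ^
    -S myserver.database.windows.net -U myuser -P mypassword ^
    -n -b 10000 -E -F 50001
```

Because each batch is its own transaction, a failure only costs you the in-flight batch, not the whole table.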

Hi, does it get very far into the BCP process? Or does it die right off the bat? What version of SQL Server are you running on your local machine? How big is the table causing you issues? Is this something that I can copy to my machine and try (as long
as it is not top secret)? What is your target server in Azure? When you type in BCP /v in a command window, what is the version that you get?

I will take a look. But as a side note, I would temporarily bump the database up from an S0 to an S1 or S2 (or even higher if you want) for the database migration. Once everything has been migrated, bump it back down to an S0 (or whatever size you
need).
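You can change the tier in the portal, or from a command window with sqlcmd; this is a sketch with placeholder server, database, and credential names:

```shell
REM Hypothetical example: temporarily scale an Azure SQL database up to S2
REM for the migration. Run against the master database on the logical server.
sqlcmd -S myserver.database.windows.net -d master -U myadmin -P mypassword ^
    -Q "ALTER DATABASE [MyDb] MODIFY (SERVICE_OBJECTIVE = 'S2');"

REM After the migration finishes, scale back down to S0.
sqlcmd -S myserver.database.windows.net -d master -U myadmin -P mypassword ^
    -Q "ALTER DATABASE [MyDb] MODIFY (SERVICE_OBJECTIVE = 'S0');"
```

The scale operation is asynchronous, so give it a few minutes to complete before kicking off the upload.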

I pulled your data and put it in a local database. Then I ran SQLAzureMW and went through the full process. Everything went well. Here is my summary:
BCP output file: c:\SQLAzureMW\BCPData\dbo.SurveysResults.dat.txt
7/20/2016 10:23:36 PM --> Copied 70010500 of 70010500 (100%)
Clock Time (ms.) Total : 12209141 Average : (5734.27 rows per sec.)
Processing finished at 7/20/2016 10:23:36 PM -- UTC -> 7/21/2016 3:23:36 AM
Total processing time: 3 hours, 23 minutes and 30 seconds

You can see that it took 3 hours and 23 minutes to upload. I created an S2 database; you might want to try that. I will also use SQL Azure to back up my database to blob storage. We might want to copy that to your Azure blob storage so you can load from it.
But it is just that one table.

Yes, you should be able to, as long as they are loading different tables. SQLAzureMW kicks off 8 parallel BCP processes when migrating a database; you can set the number of parallel BCP processes you want in SQLAzureMW.exe.config. Something else I do for large
(many-table) databases is to use multiple machines. Once the database schema is created on your target server, you can use BCP (or SQLAzureMW) on different machines to upload data to different tables.
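On any one machine, the idea is just to fire off BCP against a different table per process. From a command window that can be as simple as the sketch below (again, all names are placeholders; `start` launches each upload in its own window so they run in parallel):

```shell
REM Hypothetical example: launch BCP uploads for two different tables in
REM parallel. Each "start" opens a separate window running one bcp process.
start "bcp Orders" bcp MyDb.dbo.Orders in C:\SQLAzureMW\BCPData\dbo.Orders.dat ^
    -S myserver.database.windows.net -U myuser -P mypassword -n -b 10000

start "bcp SurveysResults" bcp MyDb.dbo.SurveysResults in C:\SQLAzureMW\BCPData\dbo.SurveysResults.dat ^
    -S myserver.database.windows.net -U myuser -P mypassword -n -b 10000
```

Since the two processes write to different tables, they don't block each other; spreading them across machines just removes the local network bottleneck as well.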

Thanks George, I finally got it working on 3 machines. It was actually quicker to copy the data files over to one of my VMs in Azure and then do the BCP from there; the speed was impressive. I have Comcast at
home and you know what that means. :)

Quick question, thought you might be able to help here. I can’t seem to execute an insert statement against a table. I don’t know what the issue is, so I tried to drop or rename it so that I can recreate it.