Steve, bear in mind that bcp/bulk insert is a low-level utility that trades features for speed. The above example isn't truly comma-delimited; quoting a field that contains the delimiter character is a common convention, but it's outside the scope of bcp.

Moving data to and from SQL Server, especially when the other entity involved is a disparate data source, is normally done with DTS. DTS does handle delimiters escaped in that fashion.

Echoing Rob's comments on VFP, that product has never (imho) received the attention that VB or Access has, and certainly not the attention that SQL Server has. Moving to SQL Server is definitely upsizing - I'm curious what aspects of SQL Server you find disadvantageous compared to VFP.

My (unfounded) discontent is really born of lack of familiarity with SQL Server. Otherwise, I'm just a long-time Visual FoxPro advocate (harking back to old FoxPlus days), who never got Mr. Gates' ear.

This article was a great help. I have added bulk import functionality to my Newsletter System (built using the Mail Queue article!). Just to let you guys know, I built my script with a cursor over the temp table. Why did I do this? I needed to add an Xref record to match new subscribers with the appropriate email list. It was easy: just loop through your records checking for a previous email address (so as not to create duplicates), then insert if the email address doesn't exist and use @@IDENTITY to create the Xref.
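The check-then-insert logic described above can be sketched like this - a Python/sqlite3 sketch, with SQLite's `lastrowid` standing in for SQL Server's @@IDENTITY; the table and column names are my own invention, not from the article:

```python
import sqlite3

# In-memory stand-ins for the subscriber and Xref tables (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Subscribers (SubscriberID INTEGER PRIMARY KEY, Email TEXT UNIQUE);
    CREATE TABLE ListXref (SubscriberID INTEGER, ListID INTEGER);
""")

def add_subscriber(email, list_id):
    # Check for a previous email address first, so as not to create duplicates.
    row = conn.execute(
        "SELECT SubscriberID FROM Subscribers WHERE Email = ?", (email,)
    ).fetchone()
    if row:
        sub_id = row[0]
    else:
        cur = conn.execute("INSERT INTO Subscribers (Email) VALUES (?)", (email,))
        sub_id = cur.lastrowid  # the @@IDENTITY equivalent in SQLite
    # Always create the Xref record tying the subscriber to this list.
    conn.execute(
        "INSERT INTO ListXref (SubscriberID, ListID) VALUES (?, ?)", (sub_id, list_id)
    )

add_subscriber("a@example.com", 1)
add_subscriber("a@example.com", 2)  # duplicate email: reuses the same SubscriberID
print(conn.execute("SELECT COUNT(*) FROM Subscribers").fetchone()[0])  # → 1
print(conn.execute("SELECT COUNT(*) FROM ListXref").fetchone()[0])     # → 2
```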

quote:Bear in mind that bcp/bulk insert is a low-level utility that trades features for speed.

I don't agree that this is a valid rationale. The code to deal with quoted CSV fields, including quote-escaping, is trivial and requires about 3 bits of state; the effect on speed would be negligible. The benefit of recognizing a data format generated by other pieces of Microsoft software would be significant.
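To back that claim up, here is a sketch of such a parser in Python - the only state is an `in_quotes` flag plus one character of lookahead for the doubled-quote escape:

```python
def parse_csv_line(line):
    """Split one quoted-CSV line into fields, handling the doubled-quote escape."""
    fields, buf, in_quotes = [], [], False
    i = 0
    while i < len(line):
        c = line[i]
        if in_quotes:
            if c == '"':
                if i + 1 < len(line) and line[i + 1] == '"':
                    buf.append('"')  # "" inside a quoted field is a literal quote
                    i += 1
                else:
                    in_quotes = False  # closing quote
            else:
                buf.append(c)
        else:
            if c == '"':
                in_quotes = True  # opening quote
            elif c == ',':
                fields.append(''.join(buf))  # unquoted comma ends the field
                buf = []
            else:
                buf.append(c)
        i += 1
    fields.append(''.join(buf))
    return fields

print(parse_csv_line('1,"Smith, John","He said ""hi"""'))
# → ['1', 'Smith, John', 'He said "hi"']
```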

quote:Bear in mind that bcp/bulk insert is a low-level utility that trades features for speed.

The above statement is true, though - bcp/bulk insert is faster than DTS and has fewer features. Yes, I'm assuming DTS is slower because it has more features; I suppose other factors could be in play, but why not make DTS as fast as possible?

quote:The benefit of recognizing a data format generated by other pieces of Microsoft software would be significant.

Using DTS it does, of course. As for why MSFT chooses not to implement this sort of CSV parsing in bcp... given the significant improvement in DTS from version 7 to 8, I wouldn't be surprised if Yukon continued the trend and bcp was discontinued altogether.

I learned this so long ago I'd be embarrassed to say how long ago it was, but in truth I'm too old to remember how long ago it was. And if I had to guess I would imagine the MS coders who developed bcp had the same idea. I'm pretty sure the definition of a "delimiter" is some UNUSED, NON-DATA character that can always be discarded.

I don't see any valid rationale for using a quoted CSV file, especially if it creates the kind of issues discussed here. You really have to go out of your way to find software that DOESN'T allow a tab-delimited export, or some other delimiter character. Even dBase II could do it, PFS:Professional File, I mean, COME ON! Unless you're grabbing a file from some ancient mainframe (like ENIAC or Colossus) I don't agree that quoted CSV files are mandatory.

And just to throw another monkey wrench in there, whaddya do if your data has embedded quotation marks?
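For the record, the usual convention (later codified in RFC 4180) is to escape an embedded quote by doubling it - Python's csv module, for one, round-trips this cleanly:

```python
import csv
import io

# Write a field containing a quotation mark; the writer quotes the field
# and doubles the embedded quote.
buf = io.StringIO()
csv.writer(buf).writerow(['5" pipe', 'plain'])
print(buf.getvalue().strip())  # → "5"" pipe",plain

# Reading it back reconstructs the original field exactly - no data lost.
row = next(csv.reader(io.StringIO(buf.getvalue())))
print(row)  # → ['5" pipe', 'plain']
```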

Actually, I would not be surprised to see BCP removed in future versions. When I was at the PASS conference in January, I attended a presentation on bulk loading data. They said BCP was the slowest way to do it. The code for BCP hasn't changed since either 4.2 or 6 - I can't remember which. Thinking about it more, though, I don't think they'll remove BCP; they'll just tell you not to use it.

Oddly enough, BULK INSERT was much, much faster. They used the same format files and options, so I had assumed they were the same program under the hood. Apparently BULK INSERT uses a completely different code base to import data.

They might, but they've included BULK INSERT into T-SQL, so the functionality will still exist. I worship DTS but many times it's not the right tool to use, especially in a stored procedure.

And I KNOW that tab characters don't exist in my data. I just know. They don't. Believe me, I know.

And I've got a challenge for you, Arnold: the reason the + sign doesn't appear in the preview window is character escaping. If you can fix that, A) you're a hero, and B) I'll cave in on quoted CSV files and recommend them to everyone I see.

To me, if I escape a character that can't be properly reconstructed, that means I'm losing data, and perhaps escaping is not a good thing after all. And I still think it's much easier to change the delimiter and completely avoid the need to escape ANYTHING.

Graz, I have had mucho, mucho problems with bulk insert in SQL 7. It seems to be connected to using sp_OA* procs and FSOs in other parts of my batch... I had to bag bulk insert in favor of a bcp solution just to get it to run.

After reading this old thread (I was having this problem myself not long ago) I'm quite amused. For those who yell "DON'T PUT COMMAS..." blah blah blah, you should back off and realize we don't often get to choose. Several of our B2B partners export data from a mainframe in exactly this format. My bitching about the format won't get it changed.

You haven't been around nearly as long as some of your posts would indicate. A lot of serious, high-quality code was written in Fox by some very talented people. Everything from accounting applications to large projects involving satellite tracking for NASA.

I wrote in it exclusively from 86 until 94 and made a nice living at it. Switched to VB, then C#/.NET and haven't looked back.

You're right. Microsoft basically killed it by steering everyone down the VB path. But then, FP/VFP was never a beginner's toy. That was probably its demise more than anything. And most of those wannabes are back to selling cars or whatever since the bust. Back to the big boys' world again.

quote:You haven't been around nearly as long as some of your posts would indicate.

Ha! I was a die-hard Clipper programmer back then. FoxBase (before MS bought it) was the new kid, and was just turning into FoxPro. I know what it's capable of; I just never had a chance to use it.

I'm getting an error when trying to BULK INSERT from a pipe-delimited text file: "Bulk insert data conversion error (type mismatch) for row x, column x". The incoming column is a date field, but in dd-mmm-yy format (for example, 17-APR-04). The damn thing works on one SQL Server, but I can't get it to import on my other SQL Server. I can't understand why it would work in one place but not in another (same text file!).

The destination field is datetime and the format file is using the proper host file data length of 24 (as specified for datetime fields in the documentation for format files).
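One thing worth checking - a guess on my part, since nothing in the post confirms it - is whether the two servers differ in their default language or DATEFORMAT settings, since SQL Server's interpretation of date strings (including month-name abbreviations like APR) depends on them. The format itself is unambiguous when parsed with an explicit format string; a quick Python check of the sample value:

```python
from datetime import datetime

# The feed's dd-mmm-yy format (sample value from the post above), parsed
# explicitly rather than relying on whatever the server's defaults happen to be.
d = datetime.strptime("17-APR-04", "%d-%b-%y")
print(d.date())  # → 2004-04-17
```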