Export Excel worksheet as CSV with pipe "|" character as delimiter

I cannot find the previous question I asked about this. Someone provided a process by which I could change the delimiter for Excel "CSV" files from a comma to the pipe character. I need this because some of the text fields contain embedded commas that I don't want to change, but I need the data in CSV format for uploading with the SQL Server Bulk Insert command.

At any rate, when I open the Excel spreadsheet manually and save the file as a CSV (also manually), I get the pipe delimiter. But if I open the spreadsheet via Access automation and save the worksheet:

osht.SaveAs Replace(oWbk.FullName, ".xlsx", ".csv"), xlCSV

I get a comma separated file instead of pipe separated. Can anyone provide any recommendations which will allow me to automate saving the selected sheet of the XL file as a pipe delimited file?
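One thing that may be worth testing (an assumption on my part, I have not verified it against your files): Worksheet.SaveAs takes a Local argument, and when it is True, Excel saves using the machine's regional settings, including the list separator, which is what the manual File > Save As path appears to be honoring:

' Sketch only: Local:=True asks Excel to save against the regional
' settings (including the custom list separator) instead of the
' default US/comma behavior that automation otherwise uses.
osht.SaveAs FileName:=Replace(oWbk.FullName, ".xlsx", ".csv"), _
            FileFormat:=xlCSV, Local:=True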



It was a recommendation similar to Ryan's: changing the list separator in the Regional settings, which allows the export to work properly when I perform this task manually:
1. Open file in Excel
2. Click File => SaveAs
3. Select the CSV format from the save as dialog
4. Click OK.

But for some reason, when I open the file with VBA automation and perform a sheet.SaveAs operation, it does not use that list separator setting (the pipe).

@Aikimark, when performing a true CSV export (with comma delimiters), it does. But this does not help, as the SQL Server Bulk Insert process does not recognize these quotes and still separates the fields based on the embedded commas. That is why I want to use the pipe as the separator, because there are no fields in the data that contain a pipe character.

@John, Yes, I could use that technique, but the files that this procedure will act upon may contain as many as 250,000 records, and I was hoping that I could get the sheet.SaveAs method to work properly with this delimiter.
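For what it's worth, if a cell-by-cell loop is the bottleneck at 250,000 records, reading the whole file in one shot and splitting it in memory is usually far faster than looping per row. A rough sketch (the path is a placeholder):

' Read the entire text file into one string, then split into records.
' Much faster than 250,000 separate reads or cell accesses.
Dim iFile As Integer, sAll As String, vLines As Variant
iFile = FreeFile
Open "c:\YourFolder\data.csv" For Input As #iFile
sAll = Input$(LOF(iFile), #iFile)   ' whole file in one read
Close #iFile
vLines = Split(sAll, vbCrLf)        ' one array element per record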

@Gustav, the manual process works now, it is the automated process which is not working properly.

Personally I wouldn't tamper with system settings to get the desired functionality. You can always forget to set something back, and then everything just stops working. If the FSO approach crashes with many records, that's a different case; I will try to check.

@Sham, interesting. That is a big deal. But since the user will be uploading up to a dozen of these files at a time, long-running loops like John suggested will lock up their computer, potentially for hours. I need something that works better. Maybe there is a way to change that system setting briefly while the code runs, and then set it back after it completes. More testing needed.

WHERE #AL is defined as a temp table in which all of the fields are defined as nvarchar( ). After importing into this temp table, the SP runs an Append query that performs data type translations between the nvarchar( ) fields in the temp table and the actual field types in the reporting database table.

Unfortunately, the Bulk Insert process of SQL Server treats every occurrence of the field terminator (whether embedded inside quotes or not) as a field terminator, so it would break:
a, "abc,def", 1
into 4 columns:
a
"abc
def"
1
when performing the bulk insert.
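VBA's own Split behaves the same way, which is a quick way to see the problem (a minimal illustration, not your code):

' Split ignores quoting entirely, just like Bulk Insert's
' FIELDTERMINATOR: every comma is a field break.
Dim parts() As String
parts = Split("a,""abc,def"",1", ",")
' parts now holds 4 elements: a / "abc / def" / 1
' i.e. the quoted field has been cut in two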

I saw this and rigged up something that may be of value.
If you "Import" the CSV file into Access you can run code like this to create a text string with Pipes as delimiters.

It's a long shot, but it seemed to work for me (presuming I understand your situation).
Meaning:
I created a simple file in Excel (with a "fullname" field containing names in the format "Coachman, Jeffrey" to test)
...and saved it as a CSV file while in Excel.
I then imported this Excel CSV file into Access and ran code like this:

Dim rst As DAO.Recordset
Dim str As String
Dim fld As DAO.Field

'Your Excel CSV file that you import into Access as a table named "Change2Pipe"
Set rst = CurrentDb.OpenRecordset("Change2Pipe")

'Get the Field Names
For Each fld In rst.Fields
    'Add a | between each one
    str = str & fld.Name & "|"
Next fld
'Trim the trailing "|"
str = Left(str, Len(str) - 1) & vbCrLf

'Loop all the records
Do Until rst.EOF
    'Loop all the fields
    For Each fld In rst.Fields
        'Add each field value and concatenate a "|"
        str = str & fld.Value & "|"
    Next fld
    'Trim the trailing "|"
    str = Left(str, Len(str) - 1)
    rst.MoveNext
    'make new line
    str = str & vbCrLf
Loop
'Trim the trailing "CRLF"
str = Left(str, Len(str) - 2)
rst.Close
Set rst = Nothing

MsgBox str

'Create the new Pipe Delimited text file
Open "c:\YourFolder\Change2Pipe.txt" For Output As #1
'Insert the string variable into it
Print #1, str
'Close/save this file.
Close

As a test, I then took this newly created "PSV" text file and was able to successfully import it back into Excel; it opened OK and the data presented well (correctly defined fields, and the comma in the Full Name field was preserved).
As a double-check, I was also able to import this PSV text file back into Access (and again, it opened and the data presented well).

I'm on the run, ...but if you want the sample files, ...I can get them to you tomorrow...
;-)

SQL Server 2008 R2 will not accept those settings in the Bulk Insert command (as I understand it, they were not added until SQL Server 2017):

FORMAT = 'CSV'
FIELDQUOTE = '"'

I'm going to have to go back to the drawing board and see whether the client can get the original file exported from their accounting software as a CSV, and determine whether there is a way for them to strip commas from the fields during the export process. If not, I'm probably going to have to read the CSV files, parse out the commas embedded within text fields, and then remove the embedded quotes. I don't know how long this will take, but I will give it a test later this week.

I also found that Excel was converting several text fields that look like numbers into numbers, and then reformatting them in scientific notation, which corrupted the data as well.
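If it helps, one way to avoid that coercion (hedged, since I don't know how the file is being opened; the path and column indexes below are placeholders) is to open the CSV with Workbooks.OpenText and force the offending columns to text via FieldInfo:

' Sketch: force columns 1 and 2 to be treated as text so Excel does
' not coerce number-like strings into scientific notation.
' From Access automation without an Excel reference, use 2 in place
' of the xlTextFormat constant.
Workbooks.OpenText FileName:="c:\YourFolder\data.csv", _
    DataType:=xlDelimited, Comma:=True, _
    FieldInfo:=Array(Array(1, xlTextFormat), Array(2, xlTextFormat))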

Thanks for the suggestions, I ended up with a solution similar to each of these.

I opened the file as a text file using VBA, read each row of the text file as a string, parsed each string on the commas, then reconstructed the string, with some logic that accounted for string segments that started or ended with a quote.

"ABC,DEF"

would be broken into the array elements "ABC and DEF". So if an array element started with a quote, the code accounted for that: instead of stripping the leading quote and inserting a comma, it simply stripped the leading quote. Then, as the next element was read, the code would see that it ends with a " and would strip that quote and append that array element to the previous one, followed by a comma.

But then I found some records where a field contained "ABC, DEF, GHI", so I had to add logic to account for multiple commas embedded within a single field and return ABC DEF GHI.
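A minimal sketch of that rejoin logic (the function name is mine, and it assumes, as in this data, that fields are only quoted when they contain commas):

' Hedged sketch: split a comma-delimited line, rejoin the pieces of any
' quoted field with spaces, and emit the fields pipe-delimited.
Function ToPipeLine(ByVal sLine As String) As String
    Dim parts() As String, sOut As String
    Dim i As Long, bInQuote As Boolean
    parts = Split(sLine, ",")
    For i = LBound(parts) To UBound(parts)
        If bInQuote Then
            ' still inside a quoted field: rejoin with a space, not a pipe
            sOut = sOut & " " & parts(i)
            If Right$(parts(i), 1) = """" Then bInQuote = False
        Else
            If i > LBound(parts) Then sOut = sOut & "|"
            sOut = sOut & parts(i)
            ' a leading quote without a trailing one means the field was split
            If Left$(parts(i), 1) = """" And Right$(parts(i), 1) <> """" Then bInQuote = True
        End If
    Next i
    ' strip the quotes that wrapped the comma-bearing fields
    ToPipeLine = Replace(sOut, """", "")
End Function

' e.g. ToPipeLine("a,""abc,def"",1") returns a|abc def|1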

I'll be writing an article on this process shortly.

By replacing the embedded commas with spaces and then stripping double spaces I was able to accomplish what I was looking for.