Question from 1234_AP on Oct 18 at 4:59 AM
I have a few CSV files that need importing via my C# application into SQL Server tables. I have done this before but found it pretty boring and tedious. Does anyone know of any tools or methods to speed up this process?

They are correct to use SSIS to import multiple .csv files into your SQL Server tables. In the Business Intelligence Development Studio (BIDS), create a new Integration Services project and look at using the 'Foreach Loop' container to iterate through your .csv files.
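Outside of SSIS, the same iterate-and-load pattern can be sketched in a few lines. This is a sketch only: the folder path is whatever you point it at, and collecting rows stands in for the data-flow task that would bulk-insert each file.

```python
import csv
import glob
import os

def load_all_csvs(folder):
    """Iterate over every .csv file in `folder` (the Foreach Loop analog)
    and collect its rows; a real loader would bulk-insert each batch
    into SQL Server instead of returning them."""
    rows = []
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        with open(path, newline="") as f:
            rows.extend(csv.reader(f))
    return rows
```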

Wouldn't the absence of indexes on the table(s) help with import performance? Seems to me the overhead of keeping indexes up to date as data is being bulk loaded would slow it down. If possible, drop indexes, load the data and then re-create them. Someone please validate this logic.
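To make the drop-load-recreate idea concrete, here is a small helper that generates the T-SQL you would run before and after the bulk load. The table and index names are illustrative, not from the original post, and in practice you would read them from sys.indexes rather than hard-code them.

```python
def index_rebuild_statements(table, indexes):
    """Build the T-SQL to drop the given nonclustered indexes before a
    bulk load and re-create them afterwards. `indexes` is a list of
    dicts with 'name' and 'columns' keys (names here are made up)."""
    drops = [
        "DROP INDEX {0} ON {1};".format(ix["name"], table)
        for ix in indexes
    ]
    creates = [
        "CREATE INDEX {0} ON {1} ({2});".format(
            ix["name"], table, ", ".join(ix["columns"]))
        for ix in indexes
    ]
    return drops, creates
```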

If so, you could probably build an import routine that creates a staging table on the fly by reading the header, and then imports the data all as text. Once it's in the database, it's easy to route it to the final table using queries that convert the data.
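As a sketch of that header-driven approach: read the first row of the CSV and emit a CREATE TABLE where every column is plain text. The temp-table name and the NVARCHAR(4000) width are assumptions for illustration; pick whatever suits your data.

```python
import csv
import io

def staging_table_sql(csv_text, table="#staging"):
    """Read the header row of a CSV and emit a CREATE TABLE for a
    staging table where every column is imported as text; typed
    conversion happens later, inside the database."""
    header = next(csv.reader(io.StringIO(csv_text)))
    cols = ", ".join(
        "[{0}] NVARCHAR(4000)".format(h.strip()) for h in header)
    return "CREATE TABLE {0} ({1});".format(table, cols)
```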

Thanks all of you.
Actually, I have a CSV file to import, and I have created a table according to the CSV file, but in one case the CSV file has more columns than the table does.
I want to load it into a temp table first, count the columns there, and then import into the other table. Is that possible? If yes, please reply; otherwise, please point me in the right direction.
Thanks again.

How do I create an editable package in SQL Server? Please give me code and explain it; I am confused about that.
One more question: what if the CSV file is large, with around 50 million records and many columns? Please help me.

Create a staging table with one wide varchar column.
Using the import wizard, import the file into the table as "ragged right". Parse the contents of the table to determine the column layout, then call the correct procedure to process that format.
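The dispatch step can be sketched like this: count the delimited fields in a staged row and map that count to a handling procedure. The format map and procedure names are invented for illustration; your real layouts may need a smarter signature than a field count.

```python
def detect_format(raw_line, formats, delimiter=","):
    """Count delimited fields in one raw staged row and return the
    name of the stored procedure registered for that layout.
    Unrecognized layouts fall through to a catch-all handler."""
    field_count = len(raw_line.split(delimiter))
    return formats.get(field_count, "usp_process_unknown")
```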

My question is:
I have a table in the database with a defined structure. When I import a CSV file whose columns match the table, the import works, but I have another CSV file whose structure does not match the table. How can I import it without defining the table structure again and again?
Please help me…

Also, how can I speed up the import?

We have responded to your question numerous times with essentially the same
answer.

You have not given us any data samples, possible table structures, or any
significant detail that we can respond to other than "I have a variable
import file I want to code for."

We cannot see your data, read your mind, hear your thoughts, or work with "I want to code the ghost file. Please help me." We don't even know if your receiving DB is normalized. If you have no clue, then we are equally clueless. At this point I would suggest you either look at a site like http://www.vworker.com/ (formerly RentACoder) or give us a clue about your industry and data.

Without that pertinent information we can't help you. We are volunteers here. Not a single one of us is making a dime from trying to help you.

How do you expect to NOT define the table structure again and again if it changes? The only way around it, and not a good solution, would be a varchar(max) column you insert the entire row into, and then parse each record out of this "staging" table.
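The parsing half of that workaround can be sketched as follows: split each single-column staged row on the delimiter and pad short rows so every output row has the same width. This is an illustration only; real data would also need quoting rules and type conversion handled.

```python
def parse_staged_rows(rows, delimiter=","):
    """Split each single-column staged row into fields, padding short
    rows with empty strings so every output row is the same width
    (width taken from the widest row in the batch)."""
    split = [r.split(delimiter) for r in rows]
    width = max(len(s) for s in split) if split else 0
    return [s + [""] * (width - len(s)) for s in split]
```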