All replies

I cannot understand why you need to return 210,000 records in one operation. In any case, you need to close all the connections associated with all the Data Adapters, and make sure all the DataReaders are wrapped in using statements, which call Dispose for you automatically.

If you're trying to fill a DataSet with 2.1 million rows, it's not terrifically surprising that you're running out of memory. Remember that the DataRow is not just an array of data. It contains state and version information about each row, and each row is also pointed to by one or more indexes that the DataTable is maintaining internally.

If you need to keep information about 2.1 million things in memory, you probably don't want to use a DataTable for the purpose. All of the operations that you'd need a DataTable for - data binding, state and version management, filtering, sorting - will run slowly with that many rows, and will often grab huge chunks of memory as they run, too. A single statement (DataRow[] r = t.Select("1 = 1")) can eat up 8MB or so.

It may make more sense to build a lightweight class with a very few fields, use a SqlDataReader to execute the query, create an object for every row it reads, and add each object to an in-memory collection like a list or dictionary. That will give you much more fine-grained control over the amount of memory your program uses.
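A minimal sketch of that approach, assuming a hypothetical Orders table with Id, CustomerId, and Total columns (the names and connection string are illustrative, not from the original post):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Lightweight class: just the fields you need, none of the
// state/version/index overhead that a DataRow carries.
public sealed class OrderSummary
{
    public int Id;
    public int CustomerId;
    public decimal Total;
}

public static class OrderLoader
{
    public static List<OrderSummary> LoadOrders(string connectionString)
    {
        var orders = new List<OrderSummary>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, CustomerId, Total FROM Orders", conn))
        {
            conn.Open();
            // The reader streams one row at a time; only our small
            // objects accumulate in memory.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    orders.Add(new OrderSummary
                    {
                        Id = reader.GetInt32(0),
                        CustomerId = reader.GetInt32(1),
                        Total = reader.GetDecimal(2)
                    });
                }
            }
        }
        return orders;
    }
}
```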

That said, my instinct is that if you're moving 2.1 million of anything over a wire, that's probably a design defect. There are certainly applications where it's not, don't get me wrong. But the first thing I would do in your case is figure out if I can use a smaller working set of rows. Is the application aggregating data? Maybe it can be aggregated in a query instead. That sort of thing.
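To illustrate the aggregation idea: instead of pulling millions of detail rows over the wire and summing them on the client, let the server do the GROUP BY. This is only a sketch with assumed table and column names:

```csharp
using System.Data.SqlClient;

// Hypothetical: a few thousand aggregate rows cross the wire
// instead of millions of detail rows.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT CustomerId, SUM(Total) AS CustomerTotal " +
    "FROM Orders GROUP BY CustomerId", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            int customerId = reader.GetInt32(0);
            decimal total = reader.GetDecimal(1);
            // use the pre-aggregated values here
        }
    }
}
```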

I have always struggled against the bad habit of assuming that memory is no longer a problem.

I know that this is a mistake.

I remember a beginner programmer who managed to crash an HP-UX machine with 32 GB of memory by firing off a single query that returned 1,000,000 records from Oracle 9i.

I think the problem is that people have everything they need to write simple and effective programs, but they often don't want to think about what is happening on the server.

It's true that a SqlDataReader is easy to use. Its only drawback is that you cannot go backwards, so you have to process your rows quickly, and sometimes it's difficult to find the time to think about the best way to organize the program.

A View is a good tool for exposing specific columns: if you are pulling data from many tables, a View gives you a single virtual table to access. SQL Server 2005 comes with the ROW_NUMBER function, which lets you page through a large dataset more efficiently; in a web application it is very easy to get out-of-memory exceptions otherwise. In SQL Server you can use the TOP clause and ROW_NUMBER to return subsets of data as needed. Here are three links with code for web applications, because I am a web developer.
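A sketch of ROW_NUMBER paging from C#, assuming the same hypothetical Orders table as above (table, columns, and parameter names are illustrative):

```csharp
using System.Data.SqlClient;

// Page through Orders 100 rows at a time using ROW_NUMBER
// (available in SQL Server 2005 and later).
string pageSql =
    "SELECT Id, CustomerId, Total " +
    "FROM (SELECT Id, CustomerId, Total, " +
    "             ROW_NUMBER() OVER (ORDER BY Id) AS RowNum " +
    "      FROM Orders) AS numbered " +
    "WHERE RowNum BETWEEN @first AND @last";

int pageSize = 100;
int pageIndex = 0; // zero-based page number

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(pageSql, conn))
{
    cmd.Parameters.AddWithValue("@first", pageIndex * pageSize + 1);
    cmd.Parameters.AddWithValue("@last", (pageIndex + 1) * pageSize);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // only one page of rows is ever held in memory
        }
    }
}
```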

If all you're doing is reading rows and writing them to a text file, you should consider using a SqlDataReader to fetch the rows from the result set one at a time, rather than pulling all of them into a DataTable.
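A minimal sketch of streaming rows straight to a text file this way; the query, columns, and output path are placeholders:

```csharp
using System.Data.SqlClient;
using System.IO;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Id, Name FROM Customers", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (var writer = new StreamWriter("customers.txt"))
    {
        while (reader.Read())
        {
            // Each row is written and then discarded, so memory use
            // stays flat no matter how many rows the query returns.
            writer.WriteLine("{0}\t{1}",
                reader.GetInt32(0), reader.GetString(1));
        }
    }
}
```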
