Re: How to read all data from a large table with minimum locking

It was not the intent of Carrie's article to say that No Spool is only for small tables. That option can be used with any size table to avoid the spool as long as the entire table is being exported.

The select statement in Export allows a locking modifier, so LOCKING FOR ACCESS can be specified for each select.
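A minimal FastExport script sketch illustrating both points - requesting No Spool on the .BEGIN EXPORT and putting a LOCKING FOR ACCESS modifier on the select (database, table, and file names here are placeholders, and SPOOLMODE support depends on your FastExport version):

```sql
.LOGTABLE MyDB.Export_Log;            /* restart log table */
.LOGON tdpid/myuser,mypassword;

.BEGIN EXPORT SESSIONS 8
   SPOOLMODE NOSPOOL;                 /* skip the spool; valid only for a full-table select */

.EXPORT OUTFILE sales_data.dat
   MODE RECORD FORMAT FASTLOAD;

LOCKING ROW FOR ACCESS                /* access lock: does not block concurrent writers */
SELECT * FROM MyDB.Sales;

.END EXPORT;
.LOGOFF;
```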

It would be valuable to us if you could describe the required processing that cannot be done in the database. We are always looking for use cases like that to give to our engineers so they can build new capabilities into the database engine - allowing the processing to be done in the engine and taking advantage of the power of the parallel system that is already in place.

Re: How to read all data from a large table with minimum locking

ToddAWalter wrote:

It was not the intent of Carrie's article to say that No Spool is only for small tables. That option can be used with any size table to avoid the spool as long as the entire table is being exported.

The select statement in Export allows a locking modifier, so LOCKING FOR ACCESS can be specified for each select.

@ToddAWalter Thanks! That's very useful information. If I can use No Spool with any table, then this solution looks really promising.

ToddAWalter wrote:

It would be valuable to us if you could describe the required processing that cannot be done in the database. We are always looking for use cases like that to give to our engineers so they can build new capabilities into the database engine - allowing the processing to be done in the engine and taking advantage of the power of the parallel system that is already in place.

The processing part does not really care about the data structure - it expects only a binary or text stream, so it cannot make use of the database's capabilities. When I wrote earlier that it's impossible to move the functionality into the database, I meant rather that it is infeasible and impractical. It's a generic solution that supports a number of data sources, developed over the years with limited documentation and its own language for defining processing rules. Any attempt to migrate the functionality to use Teradata mechanisms would require engineering time far beyond any acceptable time frame.

Re: How to read all data from a large table with minimum locking

ToddAWalter wrote:

The select statement in Export allows a locking modifier, so LOCKING FOR ACCESS can be specified for each select.

The only problem I see now is the .LOGTABLE directive, which requires write permission to at least one database where the log table will be created. I understand that under normal circumstances this table is needed to create a restart log. In my case I don't really need it, as I can simply run the export again if something goes wrong. I would prefer to run the FastExport tool with only read access to the database. Is that possible?
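For reference, the directive in question is mandatory in classic FastExport, but the log table does not have to live in the database being exported - a common workaround is to point it at any database where the user does hold CREATE TABLE rights, such as the user's own default database (the names below are placeholders):

```sql
/* The restart log table can live in any database where the user  */
/* holds CREATE TABLE rights - e.g. the user's own database -     */
/* while the exported table itself needs only read access.        */
.LOGTABLE my_user.FExp_Restart_Log;
.LOGON tdpid/my_user,mypassword;
```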