
Unanswered: SQL Loader and large records

I have been able to use SQL Loader with fairly conventional data.
However, I am having a problem with data exported from a full text search database. For those of you who do not know, full text search databases have one very large field, called something like OCRText, which holds the ASCII representation of the text of a document. When exported to an ASCII file, this field can be quite large and exceeds the 1048576 bytes that Oracle allows for a physical record.

Could someone please help me out? My loader routine crashes after the 22nd document (i.e. row) because the OCR field is so large.

I have the O'Reilly book on SQL Loader, and I still need some help, as I do not see how the CONCATENATE and CONTINUEIF clauses will solve this.
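One way around the physical-record limit is to keep the OCR text out of the primary data file entirely and load it from secondary files with SQL*Loader's LOBFILE clause, so the primary record only carries a filename. A minimal control-file sketch, assuming a hypothetical table named documents with a numeric doc_id and the CLOB OCRText column, and a data file listing each document's ID and the name of the file holding its OCR text:

```
LOAD DATA
INFILE 'documents.dat'
INTO TABLE documents
FIELDS TERMINATED BY ','
(
  doc_id    INTEGER EXTERNAL,
  -- FILLER: read the filename from the record but do not load it
  ocr_fname FILLER CHAR(255),
  -- load the whole secondary file into the CLOB column
  OCRText   LOBFILE(ocr_fname) TERMINATED BY EOF
)
```

With this layout, each record in documents.dat stays tiny (an ID and a filename), and the 1048576-byte physical-record limit never comes into play no matter how large an individual OCR file is.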

reply for SQL Loader

Please note that my table defines the OCRText field as a CLOB.
Also, I can expand my CHAR definition from 120 million to 240 million bytes, but the program only gets slower. For instance, if I change the CHAR definition to 1.2 GB, Oracle loads one document every 20 minutes, which is too slow.