
Use ERRORS=N in your sqlldr command line or parameter file, where N is an arbitrarily large number. I don't know if there is any upper limit for this parameter. As you've already found out, the default is 50.
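A minimal sketch of both ways to pass the parameter (the usernames, file names, and the value 1000000 are just placeholders, not anything from this thread):

```shell
# Option 1: give ERRORS directly on the command line
sqlldr userid=scott/tiger control=load.ctl data=weekly.dat errors=1000000

# Option 2: put the settings in a parameter file, e.g. load.par:
#   userid=scott/tiger
#   control=load.ctl
#   data=weekly.dat
#   errors=1000000
# and then invoke sqlldr with it:
sqlldr parfile=load.par
```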

Jurij Modic
ASCII a stupid question, get a stupid ANSI
24 hours in a day .... 24 beer in a case .... coincidence?

I did a lot of data conversion from legacy systems into Oracle Apps. The reason I'm telling you this is that when I read your thread I think: why do you need so many errors? I think you should look at your data and clean it before you start to load. Too many errors in the SQL*Loader stage is a sign of bad data. Better to take care of it before you start inserting data using SQL*Loader; look at the logic of the data creation. And if I'm wrong in my argument, why not use the power of SQL*Loader to clean the data first, and then end up with only the errors that are important and help you learn about potential problems in that data?

Jmodic is right... When you issue your sqlldr command, also give ERRORS=n at the command line to specify your required number.

Well... shawish_sababa,

I too did a lot of data transfer from Unisys mainframe datasets to Oracle tables after importing them from Unisys. There were instances, like loading millions of records, where some corrupt records would get in between, and if we had to sort them out, we had to redo the entire process right from the beginning: import from the dataset, check the records manually to confirm the absence of corrupt ones, and meanwhile the data feed had to be stopped for the time being...

Well... in this case, we can at least get to a point where we know exactly which records contain corrupt data, and we can minimise our effort on the required cleanup without the rest of the workflow being held up.

shawish_sababa, the reason I need to allow a high number of errors is because it will be a batch process every week which will load 6-7 million records. Manually cleaning those huge files every week is virtually impossible; instead, cleaning the rejected records and feeding them back in is a better idea.
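That reject-and-refeed cycle can be sketched roughly like this. SQL*Loader writes rejected records to a bad file (the data file name with a .bad extension by default, or whatever BAD= names); the file and account names below are hypothetical:

```shell
# Weekly load: let the good 6-7 million rows through,
# capturing rejects instead of aborting on the default 50 errors
sqlldr userid=app/secret control=weekly.ctl data=feed.dat \
       bad=feed.bad errors=1000000

# Later: clean up the rejected records (feed.bad holds them in the
# original input format) and load just the corrected ones back in
sqlldr userid=app/secret control=weekly.ctl data=feed_cleaned.dat
```

The advantage is that only the small bad file ever needs manual attention, not the full weekly feed.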