I am trying various ways of importing data from a fairly large XML file into database tables, and I keep hitting the wall of 2^16 node occurrences, which generates an error.
Has anyone run into the same problem and found a workaround?

The content is an ISO 20022 formatted file with a bank-to-customer statement (camt.053) containing 70k+ transactions: one upper level being the statement, and 70k+ entries for that statement.

My first approach was:
1. load the XML file in a table with a CLOB column
2. "load" the CLOB content into an xmltype variable

This approach worked fine while the number of entries was moderate (up to 10k), but as the number of entries grew, the processing time grew exponentially, e.g. 5.000 rows in 5 minutes but an estimated 8 hours for 70.000 rows; I didn't wait for it to finish.
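For reference, the first approach might be sketched like this. All table, column, and element names here are my assumptions (the original script isn't shown), and the ISO 20022 namespace is omitted for brevity:

```sql
-- Sketch of approach 1; names are hypothetical, namespace omitted.
CREATE TABLE stmt_stage (id NUMBER PRIMARY KEY, xml_doc CLOB);

DECLARE
  v_xml   XMLTYPE;
  v_count NUMBER;
BEGIN
  -- "Load" the CLOB content into an xmltype variable.
  SELECT XMLTYPE(xml_doc) INTO v_xml FROM stmt_stage WHERE id = 1;

  -- Fetching the i-th entry with extract() re-traverses the whole
  -- in-memory DOM on every iteration, so the total cost grows much
  -- faster than linearly with the number of entries.
  v_count := v_xml.extract('count(//Ntry)').getNumberVal();
  FOR i IN 1 .. v_count LOOP
    NULL;  -- read //Ntry[i]/... values and insert into target tables
  END LOOP;
END;
/
```

The per-iteration DOM traversal would explain why 5.000 rows are fast but 70.000 are not.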

To avoid this bottleneck I tried approach 2:
1. created a table of xmltype
2. converted my CLOB into xmltype

3. created an index on xmltype_table.xml_content
4. same loops and fetches as in the first approach, but from the xmltype table, hoping that the index and the lack of CLOB-to-XML conversion would speed things up
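In outline, that second setup could look like this (again hypothetical names; the index shown is an unstructured XMLIndex on an XMLType object table):

```sql
-- 1. Table of XMLType (storage defaults depend on the DB version).
CREATE TABLE xml_stage OF XMLType;

-- 2. Convert the CLOB into XMLType on insert.
INSERT INTO xml_stage
  SELECT XMLTYPE(xml_doc) FROM stmt_stage WHERE id = 1;

-- 3. XMLIndex on the content of the object table.
CREATE INDEX xml_stage_ix ON xml_stage (OBJECT_VALUE)
  INDEXTYPE IS XDB.XMLINDEX;
```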

The result was exactly the same as in approach 1 :-( , but with a twist:
If I fetched the n-th occurrence, Oracle returned the value of the n-th occurrence concatenated with the value of the (n+2^16)-th occurrence :-O

I have just written a small example with 76.000 node occurrences and it works normally, so evidently the problem is in the server resources. I have been trying this on a test environment with inferior hardware; I'll try importing the data on a better server.
I would never have imagined that an 84 MB XML file would be too much to handle in this day and age.

It's probably worth mentioning that Odie used an XMLType table CREATE statement which, in 11.2.0.x, defaults to securefile binary XML storage. The underlying storage, that is, picking the right one for your use case, makes a HUGE difference in performance.

Have a look at "*Oracle XML DB : Choosing the Best XMLType Storage Option for Your Use Case* (PDF) Jan 2010" on http://www.oracle.com/technetwork/database/features/xmldb/index.html and bear in mind that your storage, which is almost certainly XMLType basicfile/CLOB storage, will be deprecated in 12c, and for good reason...
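The securefile binary XML storage can also be requested explicitly instead of relying on the version default; a minimal sketch (table name hypothetical):

```sql
-- Explicit securefile binary XML storage (11g syntax).
CREATE TABLE xml_stage OF XMLType
  XMLTYPE STORE AS SECUREFILE BINARY XML;
```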

Thank you for the suggestion, I have already read the recommendations and chose the Binary XML storage.
I had to leave for a meeting and will try Odie's insert statement tomorrow. I have already tried inserting data this way, but master/detail separately; I will try this option too.
I suspect that the test server might not be up to the challenge; I don't know what kind of hardware it is based on. If I don't achieve better results with Odie's insert (and I suspect I won't), I'll try running the whole thing on a more powerful server.

I pointed the finger at Oracle first because of a very strange behaviour when trying to extract data from xmltype tables: if you have more than 2^16 node occurrences, using the extract function returns the n-th value AND the (n+2^16)-th value. I need to try this out on a different machine to exclude the client etc., but that's a whole different story; this approach is not suitable for retrieving large quantities of data anyway.

Yes, I have noticed; CLOB-based storage was horribly slow, and xmltype binary storage was about 1000x faster. Unfortunately, it threw the "out of memory" error in less than 5 minutes while inserting via SQL/XMLTable :-/
I'll try Odie's insert and call the client's DBA about the server's HW details, hoping to find out that the problem resides there, so that it won't occur in the production environment.

Possibly... but just for you to know I tested my sample code on a simple laptop PC with 2GB of RAM, and Oracle 11g XE.

The two-step approach (master first, then details) might actually be faster than the multitable INSERT because of the overhead introduced by the analytic function (I used ROW_NUMBER to work around a bug with the FOR ORDINALITY clause).
Will test that tomorrow too.
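For illustration only (this is not Odie's original statement), the two-step idea could be sketched roughly as follows; the target tables, column names, and abbreviated camt.053 paths are my assumptions:

```sql
-- Step 1: master row (the statement). Hypothetical names throughout.
INSERT INTO statements (stmt_id, iban)
SELECT s.stmt_id, s.iban
FROM   xml_stage t,
       XMLTABLE(XMLNAMESPACES(DEFAULT 'urn:iso:std:iso:20022:tech:xsd:camt.053.001.02'),
                '/Document/BkToCstmrStmt/Stmt'
                PASSING t.OBJECT_VALUE
                COLUMNS stmt_id VARCHAR2(35) PATH 'Id',
                        iban    VARCHAR2(34) PATH 'Acct/Id/IBAN') s;

-- Step 2: detail rows (the 70k+ entries), via chained XMLTABLEs.
INSERT INTO entries (stmt_id, entry_no, amount, ccy)
SELECT s.stmt_id, e.entry_no, e.amount, e.ccy
FROM   xml_stage t,
       XMLTABLE(XMLNAMESPACES(DEFAULT 'urn:iso:std:iso:20022:tech:xsd:camt.053.001.02'),
                '/Document/BkToCstmrStmt/Stmt'
                PASSING t.OBJECT_VALUE
                COLUMNS stmt_id VARCHAR2(35) PATH 'Id',
                        entries XMLTYPE      PATH 'Ntry') s,
       XMLTABLE(XMLNAMESPACES(DEFAULT 'urn:iso:std:iso:20022:tech:xsd:camt.053.001.02'),
                '/Ntry'
                PASSING s.entries
                COLUMNS entry_no FOR ORDINALITY,
                        amount   NUMBER      PATH 'Amt',
                        ccy      VARCHAR2(3) PATH 'Amt/@Ccy') e;
```

Splitting the load this way avoids the analytic function, at the cost of walking the document twice.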

I pointed the finger at Oracle first because of a very strange behaviour when trying to extract data from xmltype tables, if you have more than 2 ^16^ node occurences using the extract function returns the n ^th^ value AND the (n+2 ^16^ ) ^th^ value.

As Marco pointed out, storage strategy is the key.
As soon as you move to PL/SQL processing, i.e. involving transient XMLType instances, Oracle can't use streaming evaluation anymore and falls back to DOM processing.

Always check the explain plans for your queries. If you see an "XPATH EVALUATION" step, you're on the right track.
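A quick way to check, sketched against a hypothetical xml_stage table of XMLType (namespace omitted for brevity):

```sql
-- Inspect the plan of the extraction query and look for the
-- "XPATH EVALUATION" row source.
EXPLAIN PLAN FOR
SELECT s.stmt_id
FROM   xml_stage t,
       XMLTABLE('/Document/BkToCstmrStmt/Stmt'
                PASSING t.OBJECT_VALUE
                COLUMNS stmt_id VARCHAR2(35) PATH 'Id') s;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```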

From the same URL as above, try applying the techniques demonstrated in "Oracle XML DB : Best Practices to Get Optimal Performance out of XML Queries (PDF) Nov 2011", combined with XMLType binary XML storage, and get rid of the deprecated XML operators and functions (extract, extractvalue, etc.). Start using XMLExists etc. instead, as mentioned in the "new features/_DEPRECATED_" section of the 11.2 manual.
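Side by side, the substitution looks roughly like this (a sketch with hypothetical table name, camt.053 namespace omitted for brevity):

```sql
-- Deprecated style (Oracle-specific, DOM-based):
SELECT extractvalue(OBJECT_VALUE, '/Document/BkToCstmrStmt/Stmt/Id')
FROM   xml_stage t
WHERE  existsnode(OBJECT_VALUE, '/Document/BkToCstmrStmt/Stmt') = 1;

-- SQL/XML standard replacement:
SELECT XMLCAST(
         XMLQUERY('/Document/BkToCstmrStmt/Stmt/Id'
                  PASSING OBJECT_VALUE RETURNING CONTENT)
         AS VARCHAR2(35))
FROM   xml_stage t
WHERE  XMLEXISTS('/Document/BkToCstmrStmt/Stmt' PASSING OBJECT_VALUE);
```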

I have narrowed down the problem - it's in the data!
As soon as I eliminate all the fields that contain non-ASCII-7 characters, everything works fine. If I select a single column containing special characters, the out-of-memory error is back, even in a select-only situation; no insert is needed at all.
The problem might be that the data is UTF-8 encoded while our database uses the EE8MSWIN1250 charset. I intended to convert the data after retrieving it with an XMLTable select, but I might have to convert it before importing it into the database; I'll try this out to narrow the problem down further.
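One way to move the conversion up front would be to declare the file's encoding when the XMLType is created, so the UTF-8 to EE8MSWIN1250 conversion happens once at load time rather than in the query path. A sketch, with hypothetical directory, file, and table names:

```sql
DECLARE
  -- Hypothetical directory object and file name.
  v_bfile BFILE := BFILENAME('XML_DIR', 'camt053.xml');
  v_xml   XMLTYPE;
BEGIN
  -- XMLType(bfile, csid): the source character set is stated
  -- explicitly, so the content is converted to the database
  -- charset up front.
  v_xml := XMLTYPE(v_bfile, NLS_CHARSET_ID('AL32UTF8'));
  INSERT INTO xml_stage VALUES (v_xml);
  COMMIT;
END;
/
```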

After extensive testing I have to discard my hypothesis that the problem resides in UTF-8 encoded special characters. I have created a clean source XML with 100.000 entries and tried to import it into a table, resulting in an error every time, on 3 different servers, all running 11.2.0.3 on 64-bit Linux.

Could you please take a look at the test case to see if you can spot an obvious error in my insert script, and run the test case on your server?