How much memory should I expect when writing about 100,000 EntityObjects to an Oracle 10 database in a single SaveChanges() call?

We have a migration framework that creates an object tree of about 100,000 objects. When I save the context, memory usage grows rapidly from 250 MB to 1.6 GB, and because my machine has only 2 GB, an OutOfMemoryException is thrown after about one minute.

If I call SaveChanges() after migrating every 1,000 master objects, everything works fine and memory usage stays around 400 to 500 MB.
But then it is not one transaction, and I cannot roll back all the data on errors.

(SaveChanges() reports about 35,000 objects, including the associations, for 1,000 master objects.)
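For reference, the batched approach described above might look like this sketch. The context type `MigrationEntities`, the `Masters` entity set, the `sourceMasters` sequence, and the `Convert` helper are all assumptions for illustration, not names from the original code:

```csharp
// Hypothetical sketch: flush the ObjectContext every 1,000 master objects
// so the change tracker (and memory usage) stays bounded.
using (var context = new MigrationEntities()) // assumed context type
{
    int count = 0;
    foreach (var source in sourceMasters)     // assumed input sequence
    {
        context.Masters.AddObject(Convert(source)); // assumed set and helper
        if (++count % 1000 == 0)
            context.SaveChanges(); // flushes this batch to the database
    }
    context.SaveChanges(); // flush the remainder
}
```

The drawback is exactly the one stated above: each SaveChanges() commits independently, so an error in a later batch cannot roll back the earlier ones.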

I tried TransactionScope, but (because of the identity columns?) the transaction crashes with an "open cursor" error from Oracle at the beginning of the second batch of 1,000 records. I am also not sure it would avoid the large memory consumption...

Is there a solution for handling such a moderate number of objects in one transaction?

It is not easy to isolate the problem from the application, because I need the data of two schemas plus some logic; I hope to find time in the next few days...

Up to now I have experimented with Pooling=false, and it does not work: now I am getting index errors for duplicate keys,
although I only do inserts and all keys are generated via sequences in insert triggers.

But now I have found the explicit BeginTransaction() method on the ObjectContext connection.

Because I do not need multiple connections, this is sufficient for our application!
I do not need TransactionScope anymore.

Combined with calling SaveChanges() after every 1,000 new records to avoid the OutOfMemoryException, everything works now.
The code writes 180,000 objects to the database in about 13 minutes.
Reading and conversion take less than one minute; the rest of the time is spent in SaveChanges().
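The combined approach might be sketched as follows. Again, `MigrationEntities`, `Masters`, `sourceMasters`, and `Convert` are hypothetical names; the relevant points are the explicit BeginTransaction() on the context's connection and the rollback on error:

```csharp
// Hypothetical sketch: one explicit transaction spanning all batches.
using (var context = new MigrationEntities()) // assumed context type
{
    context.Connection.Open();
    using (var tx = context.Connection.BeginTransaction())
    {
        try
        {
            int count = 0;
            foreach (var source in sourceMasters)           // assumed input
            {
                context.Masters.AddObject(Convert(source)); // assumed helper
                if (++count % 1000 == 0)
                    context.SaveChanges(); // flush inside the open transaction
            }
            context.SaveChanges(); // flush the remainder
            tx.Commit();           // all batches become visible at once
        }
        catch
        {
            tx.Rollback(); // undoes every flushed batch on error
            throw;
        }
    }
}
```

One caveat: the default SaveChanges() accepts the tracked changes after each flush, so retrying after a rollback would require a fresh context; for the all-or-nothing migration described here, that is not a problem.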