Tackling large XML files

I have a requirement to port a large amount of data from one DB to another via XML. We estimate the XML file (it includes data from more than a score of tables) to be around 20 MB on the filesystem. During upload, if I have to load this XML file into memory as a DOM object, it will take up almost 40 MB (or more?). Is there any way to avoid loading the whole XML and instead extract only portions, say table by table, and then process the data phase by phase? (Due to some technical limitations, we found the option of having a separate XML file for each table not viable.) SAX and JDOM have been suggested as possible solutions. I am new to both, and as there is a time constraint, I would appreciate it if anyone could throw more light on the best approach.

I was faced with a similar problem: an RDF file containing thousands of objects, and I solved it with SAX. Use SAX to parse the file; that way you avoid building the whole tree in memory. To create a large output document, use a simple PrintWriter and stream the text out; for smaller documents, JDOM is fine.
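A minimal sketch of the SAX approach, assuming each table's data is wrapped in `row` elements (the element names here are hypothetical, not from the original post). The handler buffers only one row at a time, so memory use stays flat no matter how large the file is:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class TableStreamReader {

    /** Collects the text of every <row> element without building a DOM tree. */
    static class RowHandler extends DefaultHandler {
        final List<String> rows = new ArrayList<>();
        private StringBuilder current;

        @Override
        public void startElement(String uri, String local, String qName, Attributes atts) {
            if ("row".equals(qName)) {
                current = new StringBuilder();   // start buffering one row
            }
        }

        @Override
        public void characters(char[] ch, int start, int length) {
            if (current != null) {
                current.append(ch, start, length);
            }
        }

        @Override
        public void endElement(String uri, String local, String qName) {
            if ("row".equals(qName)) {
                rows.add(current.toString());    // row complete: hand it off here
                current = null;                  // only one row is held in memory
            }
        }
    }

    public static List<String> parseRows(InputStream in) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        RowHandler handler = new RowHandler();
        parser.parse(in, handler);
        return handler.rows;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<table name=\"employee\"><row>Alice</row><row>Bob</row></table>";
        List<String> rows = parseRows(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(rows);
    }
}
```

In a real port you would not accumulate the rows in a list (that defeats the purpose); instead, inside `endElement` you would write each completed row straight to the target DB and discard it, so the 20 MB file is processed with only a few KB resident at once.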