Data Migrator 7.7.03 seems to run slower and slower as a flow processes, starting out at 20,000 records per minute even on tough transforms but dragging down to 50 or even fewer records per minute as the flow processes more and more records. At first I thought this was only for my XML transform, but now I'm seeing it with SQL*Server as well.

Question -- has anyone come up with ways to mitigate this? I'm at 2800 records and it has slowed to a crawl.

If you are using a load type of "Insert/Update" with an "If the record exists" value of "Reject" or "Update", then for every input record a SELECT is done against the target table on the key column(s) to see if there is a match. As the target table grows larger, that lookup takes longer and the load process slows.
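A minimal sketch of that per-record lookup pattern, using Python and sqlite3 as a stand-in for the target RDBMS (the single key column `id` and table name `target` are assumptions for illustration; DataMigrator's generated SQL will differ):

```python
import sqlite3

# In-memory table standing in for the RDBMS target.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

def insert_update(record):
    """One SELECT per input record, as Insert/Update with Reject/Update does."""
    rid, val = record
    # Existence check: this is the per-record SELECT that slows as the table grows.
    row = con.execute("SELECT 1 FROM target WHERE id = ?", (rid,)).fetchone()
    if row is None:
        con.execute("INSERT INTO target (id, val) VALUES (?, ?)", (rid, val))
    else:
        # "Update" overwrites the existing row; "Reject" would skip it instead.
        con.execute("UPDATE target SET val = ? WHERE id = ?", (val, rid))

for rec in [(1, "a"), (2, "b"), (1, "a2")]:
    insert_update(rec)

print(con.execute("SELECT id, val FROM target ORDER BY id").fetchall())
# → [(1, 'a2'), (2, 'b')]
```

The cost is two round trips per input record, one of which scans or probes an ever-growing target table.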

Instead of "Reject" you could use "Include", which for an RDBMS target sends every record to the database; the database rejects any duplicate anyway with a unique constraint violation. In most cases this is faster.
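The same idea in the sqlite3 sketch: skip the existence check entirely and let the unique constraint on the key do the work (again, the table and column names are illustrative assumptions):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

rejected = 0
for rid, val in [(1, "a"), (2, "b"), (1, "dup")]:
    try:
        # No per-record SELECT: just insert and let the constraint decide.
        con.execute("INSERT INTO target (id, val) VALUES (?, ?)", (rid, val))
    except sqlite3.IntegrityError:
        rejected += 1  # duplicate rejected by the database itself

print(rejected)
# → 1
```

One round trip per record instead of two, and no lookup whose cost grows with the target table.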

If there won't be duplicates, you can speed things up even more by using a load type of "Insert records from memory", which inserts a block of records at a time.
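In the sqlite3 sketch, block insertion corresponds to a batched `executemany` call: one statement per block rather than one per record (the block of 1,000 rows here is an arbitrary illustrative size):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")

# A block of records prepared in memory, then sent in one call.
records = [(i, f"row{i}") for i in range(1000)]
con.executemany("INSERT INTO target (id, val) VALUES (?, ?)", records)
con.commit()

print(con.execute("SELECT COUNT(*) FROM target").fetchone()[0])
# → 1000
```

This only works when the input is known to be duplicate-free, since a single constraint violation can fail the whole block.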

On the other hand, if you need to use "Update", the best option for an RDBMS source would be to license the Change Data Capture add-on for DataMigrator and process just the changes.

I just want to let you know that our developers have rewritten the XML output routines for 7.7.06 to dramatically enhance the throughput when writing very large XML documents. In one customer case a flow that previously took 30 minutes now completes in 30 seconds. Of course YMMV.

Because this is such a significant improvement and would benefit other customers, I requested that this change be retrofitted into 7.7.05M. For John Edward, however, I regret that it's not possible to retrofit it into earlier releases.

It's available in 7.7.05M gen 136, which can be downloaded for 64-bit Windows; builds for other platforms can be requested from Hotline.