The Attunity driver for extracting data from Oracle is fast, but if you are pulling millions of rows and throughput is slower than anticipated, check the following advanced setting.

Right-click the Oracle Data Source and click “Show Advanced Editor…”.

The default BatchSize is 100, which means that as soon as 100 rows are fetched, they are sent down the pipeline. In a cloud environment, each send down the pipeline carries extra overhead, so fetching millions of rows in batches of 100 degrades performance. Instead, increase the BatchSize value to 10000 and you will feel the difference. 10000 is my magic number and has worked in every situation for me, but you may have to find your own by experimenting.
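The reasoning above can be sketched as a back-of-the-envelope model: the fixed per-batch cost (network round trips, pipeline marshalling) scales with the number of batches, not the number of rows, so raising BatchSize from 100 to 10000 cuts that overhead by a factor of 100. The cost constant below is purely illustrative, not a measured value.

```python
import math

def pipeline_overhead(rows, batch_size, per_batch_cost_ms=5):
    """Estimate the total fixed overhead of pushing `rows` through the
    pipeline in chunks of `batch_size`. `per_batch_cost_ms` is a made-up
    illustrative constant, not a measurement of any real environment."""
    batches = math.ceil(rows / batch_size)
    return batches * per_batch_cost_ms

rows = 10_000_000
print(pipeline_overhead(rows, 100))     # 100,000 batches -> 500,000 ms of overhead
print(pipeline_overhead(rows, 10_000))  # 1,000 batches   ->   5,000 ms of overhead
```

Whatever the true per-batch cost is in your environment, the ratio between the two settings is what matters, which is why experimenting to find your own magic number pays off.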

About Vishal Monpara

Vishal Monpara is a full stack Solution Developer/Architect with 13 years of experience, primarily with the Microsoft stack. He currently works in the retail industry, moving 1's and 0's from geographically dispersed hard disks to geographically dispersed users, leveraging geographically dispersed team members.

I have a question which is a little different from the topic, but I couldn’t get an answer to it elsewhere, so I am back here 🙂
I am trying to do an incremental load using the Attunity connector to connect to the database, but I am not sure what I should be using. I read that we can’t use a Lookup, so I have been trying a Merge Join, but somehow I haven’t been able to achieve anything. There is a picture of the package at the link below. It would be great if you could help me out: https://stackoverflow.com/questions/51777894/incremental-loading-using-attunity-connector