1. How many API calls does Salesforce allow your company to have every 24 hours? (This can be found on the company information page on Salesforce)
2. How many records would be in the ContentVersion table?
3. What are the 5 largest files in that table by ContentSize?

1. How many API calls does Salesforce allow your company to have every 24 hours? 10 million
2. How many records would be in the ContentVersion table? - around 6,000
3. What are the 5 largest files in that table by ContentSize? - the average file size is about 5 MB
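For reference, question 3 can be answered directly with a SOQL query ordered by ContentSize; a minimal sketch (field names are from the standard ContentVersion object):

```sql
-- SOQL: the five largest files in ContentVersion by byte size.
-- ContentSize is the size of the document in bytes.
SELECT Id, Title, ContentSize
FROM ContentVersion
ORDER BY ContentSize DESC
LIMIT 5
```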

When you are ready to Replicate them down, go to the Registry Settings page of the DBAmp Configuration Program on the server and set the MaxBase64 Size value to a number larger than your largest ContentSize in the ContentVersion object. Then, run the following:
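The original message ends before the command itself. A minimal sketch of the replicate call, assuming the DBAmp linked server is named SALESFORCE (substitute your own linked server name):

```sql
-- Rebuild the local ContentVersion table from Salesforce,
-- including the VersionData blob column.
-- 'SALESFORCE' is the DBAmp linked server name; adjust as needed.
EXEC SF_Replicate 'SALESFORCE', 'ContentVersion'
```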

Yes, as long as you set the MaxBase64 Size field in the Registry Settings to a number larger than your largest ContentSize before running SF_Replicate. Also, just as an FYI, downloading blob (binary) data is the worst-performing functionality of the Salesforce server, so this Replicate could take a while.

Currently, the SF_Replicate command is not bringing back all documents: if a document uploaded to Salesforce is not shared with the SSIS_Data_Ingeration user, it is not retrieved. We want to retrieve all documents without having to share each one with SSIS_Data_Ingeration.

Is there any option to use USING SCOPE Team?
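For context, USING SCOPE is a SOQL clause that filters records by ownership scope rather than by sharing rules; a sketch of what such a query looks like (whether SF_Replicate can pass this clause through is a question for DBAmp support, not something confirmed here):

```sql
-- SOQL (not T-SQL): restrict results by ownership scope.
-- Valid filterScope values include Mine, Team, and Everything.
SELECT Id, Title, ContentSize
FROM ContentVersion
USING SCOPE Team
```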

And what would be the best solution to handle this issue and retrieve all documents through SF_Replicate?