By comparing the current version of a file to a previous copy, it is possible to determine which portions of the file have changed and then send only those changes. The problems with these techniques (we’ll call them “delta analysis”) are:

You need to reserve as much free disk space as the total size of the files you are backing up, so that a local copy of every file can be stored for comparison (in addition to the free space needed to compress the data before it is transmitted).

The CPU overhead of finding the file changes can be significant. In some cases, performing a delta analysis takes longer than simply compressing and sending the entire file each time.

If a file is newly created, or has not changed for a while, there may be no previous copy to compare it against. In most of these cases, delta analysis provides no benefit at all.

Maintaining local file caches, performing the analysis, and rebuilding each file at the remote end into its original form is a complex process, and that complexity makes it more prone to errors and failures.

Security (the protection of your data) is reduced, since for the delta process to work your data must exist unencrypted on the remote server for at least some period of time.

Configuring and fine-tuning these types of systems requires more effort and trial and error.

Depending on which parts of a file changed, or how much of it changed, a delta analysis may be unable to determine the changes effectively and must then fall back to backing up the entire file.
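To make the approach concrete, here is a minimal sketch of one common form of delta analysis: splitting both versions of a file into fixed-size blocks, hashing each block of the stored local copy, and transmitting only the blocks whose hashes differ. The block size, hash choice, and function names below are illustrative assumptions, not any particular product’s implementation, and real systems (which may use rolling checksums to handle insertions that shift block boundaries) are considerably more involved.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative block size (assumption)

def blocks(data: bytes, size: int = BLOCK_SIZE):
    """Split data into fixed-size blocks; only the last may be shorter."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def compute_delta(old: bytes, new: bytes):
    """Return (index, block) pairs for blocks that changed or were added."""
    old_hashes = [hashlib.sha256(b).digest() for b in blocks(old)]
    delta = []
    for i, block in enumerate(blocks(new)):
        h = hashlib.sha256(block).digest()
        if i >= len(old_hashes) or old_hashes[i] != h:
            delta.append((i, block))
    return delta

def apply_delta(old: bytes, delta, new_len: int) -> bytes:
    """Rebuild the new file from the old copy plus the delta (remote side)."""
    out = blocks(old)
    for i, block in delta:
        if i < len(out):
            out[i] = block      # replace a changed block
        else:
            out.append(block)   # the file grew
    return b"".join(out)[:new_len]
```

Note how the sketch already exhibits the drawbacks listed above: `compute_delta` needs the full `old` copy on local disk, hashes every block on every run, and if a byte is merely inserted (shifting every later block boundary) nearly all block hashes change and the whole file is sent anyway.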