but your Internet connection dropped and interrupted the scp command before it finished copying files from the remote server to the local server. If you only have a few files to copy, you can simply run the same scp command again until everything has been copied from the remote server to the local server. But what if you have a huge amount of data (say, tens of gigabytes) to pull down from the remote server? An Internet disconnection in the middle of an scp transfer is devastating in this situation, because rerunning scp restarts the copy from scratch, re-transferring files that have already been downloaded to your local server. That is a waste of time.

No sweat. I have a solution for you: use the rsync command to sync the remote files to local files instead. Existing files will be skipped, and rsync will only download files that are new or missing on the local server. Of course, you can also reverse the direction of the copy, from the local server to the remote server, using rsync. The command right after this paragraph shows how to stop wasting time and resume the copying of files from the remote server to the local server when scp gets interrupted.

The rsync command mentioned above uses the -e option to append the ssh command, so the transfer runs through SSH for secure file copying. Basically, the -e option specifies the remote shell to be used. The other options are -a (archive mode, equivalent to -rlptgoD, which preserves more file attributes than -r alone), -v (verbose output), -z (compress file data during transfer for faster copying), and -h (output numbers in a human-readable format). By using rsync this way, you can resume the process of copying files from the remote server to the local server when scp failed to complete the job the first time around.