

Super User is a question and answer site for computer enthusiasts and power users.

I want to copy a number of large files that will compress well from a mapped drive to my local hard drive. Will zipping the files make the overall processing time shorter?

What I am wondering is: can the zip program compress the files in place on the mapped drive, or does it need to pull them across the network (into local memory) to zip them? If it's the latter, and the archive is then written back to the mapped drive, the data would cross the network three times: once to read the raw files for zipping, once to put the zipped file back on the mapped drive, and once more to copy the zipped file to my local drive. If that is the case, am I better off just copying the files across uncompressed?

My other option is to remote into a server which is geographically close to the drive I am mapping, and zip it from there, then come back to my local pc and copy the zipped files.

You do know that there are at least a dozen programs that can "zip", right?
– Ignacio Vazquez-Abrams Jan 9 '12 at 23:54

Yes Ignacio, and if there is one that can do the zipping at the remote location, that would be great, but I don't think it is possible. I use 7-Zip and WinZip. Can you tell me if it is possible?
– iamdudley Jan 10 '12 at 0:21

1 Answer

Any program that creates a ZIP file basically reads some files and writes out a ZIP file. So if you're zipping files located on a remote computer, those files have to be read over the network, uncompressed. The ZIP program writes the ZIP file locally, so no further network activity occurs. In terms of network traffic, it is no different from first copying the files to a local disk and zipping them afterwards.
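To illustrate (a sketch of the data flow, not what 7-Zip or WinZip literally do internally): the archiver reads its inputs from wherever they live and writes the archive locally, so only the read side would cross the network. A temporary directory stands in for the mapped drive here so the snippet runs as-is; in practice `source_dir` would be something like `Z:\reports`.

```python
import tempfile
import zipfile
from pathlib import Path

# A temp directory stands in for the mapped network drive (e.g. Z:\reports)
# so this sketch is runnable anywhere.
work = Path(tempfile.mkdtemp())
source_dir = work / "remote"           # stands in for the mapped drive
source_dir.mkdir()
(source_dir / "data.txt").write_text("compressible " * 1000)

archive = work / "reports.zip"         # written to the local disk only

# Each file is read once, uncompressed, from the "remote" side;
# the compressed bytes are written locally.
with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for f in sorted(source_dir.rglob("*")):
        if f.is_file():
            zf.write(f, arcname=f.relative_to(source_dir))
```

Either way, every source byte travels the network exactly once, uncompressed, which is the point above.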

This is true regardless of what program you use to create a ZIP archive.

If you can remote into that server, compress the files there, and then copy over the already-zipped files, you will save network traffic, provided the files compress effectively (some files do not compress well, e.g. most video files, encrypted files, etc.).
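If you want to check up front whether the files are worth compressing at all, compressing a small sample gives a rough estimate. This is a sketch; `estimated_ratio` and the 1 MiB sample size are made up for illustration:

```python
import zlib

def estimated_ratio(path, sample_bytes=1 << 20):
    """Compress the first chunk of a file to estimate compressibility.

    A ratio near (or above) 1.0 means zipping won't save network traffic;
    a small ratio means compressing before transfer is likely worthwhile.
    """
    with open(path, "rb") as f:
        sample = f.read(sample_bytes)
    if not sample:
        return 1.0
    return len(zlib.compress(sample, 6)) / len(sample)
```

Text and logs typically come back well under 0.5; already-compressed media comes back at 1.0 or slightly above.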

If your network is not 100% reliable and the transfer will take considerable time, it is better to copy the uncompressed files first and zip them later, rather than zipping directly over the network. If a network error occurs while zipping, you have to start the process over from the beginning, whereas some file-copy programs can restart an aborted/failed copy without re-transmitting data that already arrived.
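As a rough sketch of what such restartable copiers do (robocopy's `/Z` restartable mode is a real example on Windows): check how much of the file already arrived, then append only the remainder. `resume_copy` is a hypothetical helper, and it assumes the partial destination bytes are intact:

```python
from pathlib import Path

CHUNK = 1 << 20  # 1 MiB buffer

def resume_copy(src: Path, dst: Path) -> None:
    """Copy src to dst, skipping any bytes dst already holds.

    Simplified sketch of a restartable copy: after a failed transfer,
    re-run and continue from the last byte that made it across.
    """
    done = dst.stat().st_size if dst.exists() else 0
    with src.open("rb") as fin, dst.open("ab") as fout:
        fin.seek(done)          # skip what already arrived
        while chunk := fin.read(CHUNK):
            fout.write(chunk)
```

A zip operation interrupted mid-stream has no equivalent of this seek-and-append step, which is why the copy-then-zip order is more forgiving on a flaky link.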

Thanks @haimg. I thought that was the case, but you never know when there might be a bit of programming magic out there that does the seemingly impossible.
– iamdudley Jan 10 '12 at 0:25

Actually, the location where the archiving app creates its temporary archives is a configurable option, but by default it's the user's %temp% directory.
– Hydaral Jan 10 '12 at 1:17

Depending on the protocol (SMB sucks, NFS not so much), it can be MUCH quicker to remote-zip a large batch of files than to transfer them, even if the actual file size doesn't shrink much.
– Chris Nava Jan 10 '12 at 4:53

@ChrisNava: It can be, but it can also be the opposite: e.g. on a high-latency, high-bandwidth link, starting two file transfers simultaneously might be faster than running one after another.
– haimg Jan 10 '12 at 5:00