
Download an 80 MB folder with 4000 files

Hello all. On my web server (a Linux VPS), I have a fairly large image folder: about 4000 files totalling roughly 80 MB. What is the best way to download all of them to my computer?

FTP truncates the listing at 2000 files, and I don't think it would be very efficient anyway. I seem to recall using an ssh2 command to transfer files between two servers; would that be possible in this case?
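If you have SSH access to the VPS, you can sidestep FTP's per-file overhead by streaming the whole folder as a single tar archive over one connection, or by using rsync. A minimal sketch; `user@example.com` and `/var/www/images` are placeholders for your own login and path, and the runnable part below only demonstrates the tar-stream technique locally:

```shell
# Over SSH (one connection, one stream -- replace host and path):
#   ssh user@example.com 'tar czf - -C /var/www images' > images.tar.gz
#   tar xzf images.tar.gz
#
# Or rsync, which is resumable and only re-sends changed files:
#   rsync -avz user@example.com:/var/www/images/ ./images/

# Local demonstration of the same tar-stream idea: the pipe below stands
# in for the ssh hop between server and client.
src=$(mktemp -d); dst=$(mktemp -d)
for i in $(seq 1 10); do echo "img $i" > "$src/img$i.png"; done
tar czf - -C "$src" . | tar xzf - -C "$dst"
ls "$dst" | wc -l   # → 10
```

The big win over FTP is that 4000 small files become one continuous stream, so you pay connection/handshake cost once instead of 4000 times.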

I have another question about this folder: would Apache serve the images faster if they were split into smaller folders? Do 4000 images in one folder mean slower responses or higher system resource usage?

You are downloading them to your local machine, right? Then maybe a free FTP client like FileZilla will do?

Splitting them into smaller subfolders will not help; Apache looks files up by path rather than scanning the directory. And since the whole set is only 80 MB, my suggestion, if possible, is to use Apache's caching capability to keep the images in memory (you can specify which directories or file types to cache). I think that will improve performance.
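For reference, one way to do this in Apache 2.4 is mod_cache backed by a shared-memory provider (mod_cache_socache with shmcb). A minimal sketch, not a drop-in config: the `/images` path is an assumption, and module file locations vary by distribution (on Debian/Ubuntu you would run `a2enmod cache cache_socache socache_shmcb` instead of the LoadModule lines):

```apache
# Load the caching modules (paths vary by distro/build).
LoadModule cache_module modules/mod_cache.so
LoadModule cache_socache_module modules/mod_cache_socache.so
LoadModule socache_shmcb_module modules/mod_socache_shmcb.so

# Cache responses for the image folder in shared memory.
CacheSocache shmcb
CacheEnable socache "/images"

# Only cache objects up to ~100 KB each in memory.
CacheSocacheMaxSize 102400
```

That said, for static images a long browser-cache lifetime (mod_expires) often matters more than server-side caching, since repeat visitors then skip the request entirely.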