I'm running Windows Server 2008, Service Pack 2, 64-bit, on quad-core servers with 32 GB of RAM. I regularly work with files that are 100 GB in size, and I've run into a rather odd problem.

If I log on to Server A and copy one of these large files to Server B, things work as expected. It takes a considerable amount of time, of course, but both machines remain responsive. However, if I log on to Server B and copy the file FROM Server A, then Server A begins consuming memory at an alarming rate until memory is full; all other work on that server comes to a complete halt, and copy progress slows to a trickle.

This also happens if I have a program reach across the network and read the file sequentially.

I've tried COPY, XCOPY, and ROBOCOPY, all with the same result.

So how do I prevent Windows from stupidly trying to buffer the entire file as it's copying? If I have to, I'll ensure that all copies are From A to B, but that's a less than ideal solution. It seems to me like a "server" operating system should be able to handle this scenario without trouble.
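To be clear about what I mean by "not buffering the entire file": what I want is a copy loop that only ever holds one fixed-size chunk in memory at a time. A rough sketch of that loop (in Python for illustration; the 1 MiB chunk size is an arbitrary choice, and note that on Windows the process-level loop alone wouldn't stop the system file cache from growing; that, as I understand it, would take opening the file via CreateFile with FILE_FLAG_NO_BUFFERING or FILE_FLAG_SEQUENTIAL_SCAN):

```python
import os

CHUNK = 1024 * 1024  # 1 MiB per read; arbitrary, tune to taste


def chunked_copy(src, dst):
    """Copy src to dst one fixed-size chunk at a time, so the copying
    process itself never holds more than CHUNK bytes in memory.

    Note: this bounds only the process's own buffer.  The OS file cache
    can still grow; on Windows, suppressing that would mean opening the
    source with FILE_FLAG_NO_BUFFERING (or FILE_FLAG_SEQUENTIAL_SCAN)
    via CreateFile, which Python's built-in open() does not expose.
    """
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            buf = fin.read(CHUNK)
            if not buf:
                break
            fout.write(buf)
```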

Interesting idea. But RichCopy doesn't appear to work in command-line mode. The help says "richcopy source destination flags". Running that command (e.g. "richcopy \\server\path\filename .\filename") shows me the splash screen and then exits without displaying any information. I need a command-line tool, not a GUI.
–
Jim Mischel Aug 25 '09 at 21:08