Corrupt Zip File with no Extension

I am using the AddDirectory method to add several folders to a zip file. I then save the zip file to the output stream. I have some users complaining of corrupt zip files.
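For reference, a minimal sketch of the pattern I'm describing, using DotNetZip's ZipFile.AddDirectory and Save(Stream). The folder paths, file name, and headers below are placeholders, not my exact code:

```csharp
// Sketch of an ASP.NET handler that zips folders straight into the response.
// Paths and the download file name are illustrative only.
Response.Clear();
Response.BufferOutput = false;  // stream the zip as it is built
Response.ContentType = "application/zip";
Response.AddHeader("Content-Disposition", "attachment; filename=\"export.zip\"");

using (ZipFile zip = new ZipFile())
{
    zip.AddDirectory(Server.MapPath("~/App_Data/folder1"), "folder1");
    zip.AddDirectory(Server.MapPath("~/App_Data/folder2"), "folder2");
    zip.Save(Response.OutputStream);   // write directly to the output stream
}
HttpContext.Current.ApplicationInstance.CompleteRequest();
```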

I also get the error when I manually test the app on the server; however, when I debug the code locally, I cannot recreate the issue.

I found a discussion thread that was on point (Thread 64325). I am having the same type of issue, but I definitely never "accidentally" created a zip that I am adding to the new zip. After downloading, when using the standard Windows unzip program, I get a corrupt file error. If I use WinRar, I find a file with no file extension. If I rename it and give it a .zip extension, everything is there.

Try this: remove the call to HttpContext.Current.ApplicationInstance.CompleteRequest(), and replace it with Response.Close().

There was a time when a sample shipped with that CompleteRequest in it, but that is wrong. I have since discovered that using it can result in corrupt zip files. You should use Response.Close() instead.
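In other words, the end of the handler would look something like this (sketch only; the folder path is a placeholder):

```csharp
using (ZipFile zip = new ZipFile())
{
    zip.AddDirectory(folderPath, "content");  // folderPath is a placeholder
    zip.Save(Response.OutputStream);
}
// End the response here, rather than calling
// HttpContext.Current.ApplicationInstance.CompleteRequest():
Response.Close();
```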

Thank you for replying. I use this library in a few apps and I really like working with it. In the other apps, I am zipping on the fly client side and uploading to the server. Works like a charm.

Response.Close() resulted in a definitely corrupt zip. This time around even WinRar throws an error when unzipping - unexpected end of archive. Could this be caused by the Response ending abruptly before the process is finished? It all happens
on the same thread, so I am not sure if that is possible. Perhaps I shouldn't make assumptions there. Is the process async?

C:\Users\stulo.FVTC\Desktop\Wisc-Online-2010-Nov-09-202621.zip: Unexpected end of archive!
C:\Users\stulo.FVTC\Desktop\Wisc-Online-2010-Nov-09-202621.zip: CRC failed in Wisc-Online-2010-Nov-09-202621. The file is corrupt

After trying every possible combination of ending the response, with or without flushing, I ended up with variations of corruption.

After a little digging, I found out that Response.Flush is only necessary if the response is not buffered. After a bit more research, it occurred to me that setting BufferOutput to false may have been the cause of the issue. I commented
out that line and the files downloaded intact. So as I understand it, the user will now have to wait for the entire zip to finish before the download begins. I can live with that.
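The change that fixed it for me, in sketch form (the folder path and file name are placeholders):

```csharp
Response.Clear();
// Response.BufferOutput = false;  // <-- commenting this out fixed the corruption;
                                   //     BufferOutput defaults to true, so the whole
                                   //     zip is buffered before transmission begins
Response.ContentType = "application/zip";
Response.AddHeader("Content-Disposition", "attachment; filename=\"export.zip\"");

using (ZipFile zip = new ZipFile())
{
    zip.AddDirectory(folderPath, "content");  // folderPath is a placeholder
    zip.Save(Response.OutputStream);
}
```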

My theory is that somehow allowing the download to start while the zip was being processed caused the error. I know for sure now that it is all on the same thread, but... possibly a race condition?

I couldn't judge the cause unless I was able to see the zip files produced in each case, or reproduce it here. Your understanding is correct - buffering usually means the entire response is buffered before IIS begins to transmit the first byte to the
requester. For a large zip this can be a large problem; for smaller zips, no problem. The other side effect is, of course, memory usage on the server. If you have many concurrent requests, they will ALL buffer their results before sending. This
can cause memory usage to spike. Test to be sure.

Regarding the differences in generating zips with buffering and without... Zips written to non-seekable streams use a slightly different format than those written to seekable streams. When you turn buffering off, Response.OutputStream becomes non-seekable,
and DotNetZip uses the slightly different metadata format (described in some other locations as "bit 3 encoding" - check the zip spec for what this means). Some zip utilities do not properly read bit-3 encoded zip files. One notable example
is, I think, the built-in zip handling in Mac OS. It will classify such a zip file as corrupted, though other tools can read and extract the zip just fine. WinZip has never exhibited a problem handling bit-3 zip files, as far as I know. The
DotNetZip tools and library can read zips with or without bit-3 encoding.
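If you want to check whether a given zip was written this way, bit 3 lives in the general-purpose bit flag of each local file header, two bytes at offset 6, little-endian, per the PKWARE APPNOTE. A quick sketch (assumes the file begins with a local file header, which is true for typical zips):

```csharp
using System.IO;

// Reports whether the first entry's general-purpose bit flag has bit 3
// (the "data descriptor" flag) set. Sketch only.
static bool FirstEntryUsesBit3(string path)
{
    using (var br = new BinaryReader(File.OpenRead(path)))
    {
        if (br.ReadUInt32() != 0x04034b50)   // "PK\x03\x04" local header signature
            throw new InvalidDataException("Not a local file header");
        br.ReadUInt16();                     // version needed to extract
        ushort flags = br.ReadUInt16();      // general-purpose bit flag
        return (flags & 0x0008) != 0;        // bit 3: CRC/sizes in trailing data descriptor
    }
}
```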

If you're happy, then I guess it's no problem. It's troubling to me that turning off buffering gives you corruption, and I'll want to investigate that for myself. But if you're satisfied, then I guess this particular issue is closed.