There appears to be a bug in gzip decompression in QuickBMS. I can extract a few files at a time this way, but when I extract multiple gzip archives using one script the program runs out of memory. However, if I start halfway through the same file, it extracts the files it failed on at first.

Uhmm, I have used the virtua_fighter_5.bms script on my website and had no problems during the extraction. I have even run it in a loop ("for goto 0 ... next"), and it still uses only about 50 megabytes of RAM in total.

I did not know you had made a script; I will try it on some files I had problems with. Why do you need to set the endianness to little for gzip? I did not set it to little in the script I made, and I was able to extract all the files, but only about 1-3 per pass. I'll try to replicate the problem for you.

I guess that is the reason for the memory consumption you saw. In short, gzip files store the size of the original file at their end, and to read it I used the function that respects the global endianness. In the next version I will remove that, because this is proof that even on big-endian systems gzip keeps the size at the end in little endian.
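To illustrate the point above: per RFC 1952, the last 4 bytes of a gzip member are the ISIZE field, the uncompressed size modulo 2^32, and it is always stored little-endian regardless of the host platform. A minimal Python sketch (the sample data is just an illustration) of reading it with a fixed little-endian unpack instead of any global endianness setting:

```python
import gzip
import struct

# Hypothetical sample data, compressed in memory.
original = b"hello world" * 100
blob = gzip.compress(original)

# RFC 1952: the final 4 bytes of a gzip member are ISIZE,
# the uncompressed size mod 2**32, ALWAYS little-endian.
# Using "<I" forces little-endian, independent of the platform.
isize = struct.unpack("<I", blob[-4:])[0]

print(isize == len(original) % 2**32)  # True
```

Reading those same 4 bytes with a big-endian unpack (">I") on a big-endian setup would yield a huge bogus size, which matches the runaway memory allocation described above.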
