I have some huge log files that are too big to open in TextPad. I need a script that takes one of these giant log files and splits it into a series of text files of at most 500 MB each.

The main issue is runtime. My first attempt crawled through the file line by line, appending each line to the current chunk and checking after every write that the chunk was still under the 500 MB limit, but that approach was on track to take about four hours to finish.

I'm looking for a trick to avoid line-by-line iteration, or some library I don't know about, that would get this done faster - I don't necessarily need a completely written program.