Listing Files in a Directory with Too Many Files over FTP

I ran into an issue last week trying to list the files in an FTP folder with over 100,000 files in it. The problem is obvious once you think about it: retrieving that much data and spitting it out to the screen simply takes too long.

I wrote a little PHP script to do this myself and worked out a way to keep it from timing out in the browser. There are a couple of tricks to this, as you will see from the code. The first trick is to hold back output until the script is complete, or until some large batch of files has been processed; that way we don't spend a lot of time on calls to echo, or on rendering per se. The second trick is not to buffer so much data that the script exhausts its memory.

The first trick starts around line 6, where we declare the variable $output and assign it “” (an empty string); then on line 12 we append the guts of the file data to it. The second trick is realized on line 14, where we take the file count modulus 10000 and, on every ten-thousandth file, output the buffer and reset it.
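The approach described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the original listing, so its line numbers will not match the ones referenced in the text; the directory path and the 10,000-file batch size are assumptions.

```php
<?php
// Sketch of the batched-output technique described above.
// Trick 1: accumulate entries in $output instead of echoing each one.
// Trick 2: flush and reset the buffer every 10,000 files so memory stays bounded.

set_time_limit(0);                 // let the script run past the default time limit

$dir    = '/path/to/ftp/folder';   // hypothetical path
$output = '';                      // trick 1: buffer output here
$count  = 0;

$handle = opendir($dir);
if ($handle === false) {
    die("Cannot open $dir");
}

while (($file = readdir($handle)) !== false) {
    if ($file === '.' || $file === '..') {
        continue;
    }
    $output .= $file . "<br>\n";   // the "guts" of the file data
    $count++;

    // trick 2: on every ten-thousandth file, output the buffer and reset it
    if ($count % 10000 === 0) {
        echo $output;
        $output = '';
        flush();                   // push the batch out to the browser
    }
}
closedir($handle);

echo $output;                      // output whatever is left in the buffer
echo "Total files: $count\n";
```

Calling flush() after each batch keeps data trickling to the browser, which both shows progress and prevents the connection from appearing stalled.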