I know that it's possible to stop this error using set_time_limit at the start of the script. However, this may be disabled on some systems.
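
For reference, this is the call I mean (a value of 0 lifts the limit entirely):

<?php
// Lift the default 30-second execution limit for this script.
// Note: this has no effect when PHP runs in safe mode, and hosts
// can block it outright via the disable_functions directive.
set_time_limit(0);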

Is there a better way to prevent this error?

Sounds like a script you want to distribute. You should prevent a single directory from getting too large, so that you can work in more manageable chunks. You can split the procedure across multiple HTTP requests until the whole job is done, a bit like pagination. Use multiple directories, or store filenames in a database, to make the jobs easier to split up and resume.
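
A rough sketch of that idea (loadFileList() and backupFile() are hypothetical helpers, and the offset is passed back in via the query string):

<?php
// Process the file list in chunks of $batchSize per HTTP request,
// redirecting to the next chunk until everything is done.
$batchSize = 200;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

// loadFileList() would build the full list once and cache it in a
// database or file; backupFile() would copy one file into the archive.
$files = loadFileList('/home/user/public_html');

foreach (array_slice($files, $offset, $batchSize) as $file) {
    backupFile($file);
}

if ($offset + $batchSize < count($files)) {
    // More work left: hand off to a fresh request before the limit hits.
    header('Location: backup.php?offset=' . ($offset + $batchSize));
    exit;
}

echo 'Backup complete.';

Each request stays well under the time limit because it only ever touches $batchSize files.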

What I am actually doing is taking a backup of the entire "public_html" directory of an account, and yes, the script will be distributed.

My question at the moment is: how can I list all the files inside the "public_html" directory without the script timing out after 30 seconds?
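
One way is to walk the tree with RecursiveDirectoryIterator and write the paths out as you go; building the list is usually quick even for a big tree, and it's the copy/zip step that needs splitting across requests. A rough sketch (the root path and list file are assumptions):

<?php
// Walk public_html recursively and append every file path to a list
// file, which later requests can then process in chunks.
$root = '/home/user/public_html';           // assumed account path
$list = fopen('/tmp/backup-list.txt', 'a'); // assumed scratch file

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root)
);

foreach ($iterator as $file) {
    if ($file->isFile()) { // skips directories, including . and ..
        fwrite($list, $file->getPathname() . "\n");
    }
}

fclose($list);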