I've been battling this script for a week now, trying to get it to work within memory limitations. Server limits aren't the problem (yet); the frustration is working with a script that can't respect the limits it is configured to work within. When the script is allocated 100 MB of RAM, it should surely be able to operate within that (or at least fall back to some chunking code). This may be implemented in the browser-triggered crawl, but the cron-based crawl has consistently failed with an over-memory error: the script itself exits, stating that it is asking for more memory than the config says it should.

However, I have now managed to get it to run, but it still dies. Here is the last line of the command-line output plus the error (domain removed for privacy):

it looks like your server configuration doesn't allow to run the script long enough to create full sitemap. Please try to increase memory_limit and max_execution_time settings in php configuration at your host (php.ini file) or contact hosting support regarding this.
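For reference, these are the two php.ini directives the error message refers to. The values below are illustrative examples, not recommendations for any particular host:

```ini
; php.ini — directives named in the error message (example values only)
memory_limit = 256M
max_execution_time = 300   ; seconds; PHP CLI runs default to 0 (unlimited)
```

Note that cron jobs usually run PHP via the CLI binary, which may read a different php.ini than the web server does, so the two crawl modes can see different limits.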

This comes when I configure a maximum memory amount for the script (in this case 256 MB). Shouldn't the script be able to handle this situation and chunk the data appropriately? Having seen this message after crawling completes regardless of how much memory I allow it to use (first 32 MB, then 100 MB, now 256 MB), my suspicion is that the part of the script that checks how much memory it should use, and then tries to behave accordingly, is failing.
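For what it's worth, the fallback behaviour I'd expect looks roughly like this. This is a minimal sketch in Python, not the script's actual code; the function name `crawl_in_chunks`, the byte budget, and the JSON chunk format are all my own assumptions about what "chunking" would mean here:

```python
import json
import os

def crawl_in_chunks(urls, budget_bytes, out_dir):
    """Collect crawl records in memory, but flush the buffer to a
    numbered chunk file on disk whenever its estimated size reaches
    the configured budget, instead of dying with an out-of-memory error."""
    buffer, chunk_files, used = [], [], 0

    def flush():
        nonlocal buffer, used
        path = os.path.join(out_dir, f"chunk_{len(chunk_files)}.json")
        with open(path, "w") as f:
            json.dump(buffer, f)
        chunk_files.append(path)
        buffer, used = [], 0

    for url in urls:
        record = {"url": url}          # a real crawler would store more fields
        used += len(json.dumps(record))  # rough size estimate for this record
        buffer.append(record)
        if used >= budget_bytes:
            flush()
    if buffer:                          # write whatever remains at the end
        flush()
    return chunk_files                  # chunks can be merged into one sitemap later
```

A final merge pass over the chunk files can then stream them into the sitemap one at a time, so peak memory stays near the budget rather than growing with the size of the site.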

Any help you can offer would be greatly appreciated. I'd welcome any thoughts on how to get this to finally work and, perhaps more importantly, how to resume after it has failed in this way, as I am getting somewhat tired of waiting 8+ hours to see whether the script will work.
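On the resume question: I don't know this script's internals, but a crawler can survive a crash if it checkpoints its state atomically after each batch of pages. A minimal sketch of the idea (the file name and state layout here are hypothetical, not taken from the script):

```python
import json
import os

def load_state(path):
    """Return the saved crawl state, or a fresh one if no checkpoint exists."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {"pending": [], "done": []}

def save_state(state, path):
    """Write the state to a temp file, then rename it into place.
    The rename is atomic, so a crash mid-write can never leave a
    half-written (corrupt) checkpoint behind."""
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)
```

With something like this in place, a restarted run would load the checkpoint and pick up the `pending` queue instead of recrawling 8+ hours of already-done work from scratch.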