Hi, my site took 52 hours to crawl through 260k pages. This is slower than desired.

How can I increase the speed of crawling?

I see the feature 'Make a delay between requests, X seconds after each N requests'. What is the default value if left blank, and by what increments should I decrease the X seconds delay to boost the crawl speed?

With a website of this size, the best option is to create a limited sitemap - with the "Maximum depth" or "Maximum URLs" option set so that it gathers about 100,000-200,000 URLs. These would be the main pages, serving as a "roadmap" sitemap for search engines.

The crawling time itself depends mainly on the website's page generation time, since the crawler fetches the site similarly to search engine bots. For instance, if it takes 1 second to retrieve each page, then 1,000 pages will be crawled in about 16 minutes.
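The arithmetic above can be sketched as a quick estimate. This is only a back-of-the-envelope model (the function name and the per-page timings are illustrative, not measured values from the crawler); it also optionally models the 'X seconds after each N requests' delay setting:

```python
def crawl_time_minutes(pages, seconds_per_page=1.0,
                       delay_every=0, delay_seconds=0.0):
    """Rough crawl-duration estimate, assuming pages are fetched
    sequentially. delay_every/delay_seconds model the
    'X seconds after each N requests' setting (0 = no extra delay)."""
    total = pages * seconds_per_page
    if delay_every and delay_seconds:
        total += (pages // delay_every) * delay_seconds
    return total / 60

# 1,000 pages at 1 s each -> about 16.7 minutes
print(round(crawl_time_minutes(1000), 1))

# 260,000 pages in 52 hours implies roughly 0.72 s per page on average
print(round(crawl_time_minutes(260_000, 0.72) / 60, 1))
```

Working backwards the same way, 260k pages in 52 hours comes out to roughly 0.7 seconds per page, so speeding up page generation (or limiting the URL count as suggested above) is where the biggest gains are.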