Google Drastically Changes Crawl Limits In Search Console

For the past ten days or so we have been following the changes to the quota for how many URLs you can submit to Google via the Fetch as Google feature in Google Search Console.

Well, Google has updated the help document on this topic and the crawl limits have now been drastically changed. The first option went from 500 per month to 10 per day (roughly 300 per month), while the second option went from 10 per month to 2 per day.

Before the limits per option were:

Crawl only this URL submits only the selected URL to Google for re-crawling. You can submit up to 500 individual URLs this way within a 30-day period.

Select Crawl this URL and its direct links to submit the URL as well as all the other pages that URL links to directly for re-crawling. You can submit up to 10 requests of this kind within a 30-day period.

Now the limits per option are:

Crawl only this URL submits only the selected URL to Google for re-crawling. You can submit up to 10 individual URLs per day.

Select Crawl this URL and its direct links to submit the URL as well as all the other pages that URL links to directly for re-crawling. You can submit up to 2 of these site recrawl requests per day.

Here are screenshots of the help document.

Before:

After:

Again, overall I don't think this is a big deal, as most SEOs don't use this tool that often. And when they do, it should normally be on a very limited basis. XML sitemaps and normal crawl methods are often the best way to handle bulk content indexing with Google.
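As a quick illustration of the sitemap route, here is a minimal XML sitemap following the sitemaps.org protocol. The URL and date below are placeholders, not from the article; you would list your own pages and submit the file through Search Console.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-01</lastmod>
  </url>
</urlset>
```

Unlike the fetch tool's per-day submission caps, a sitemap can list thousands of URLs at once, which is why it scales better for bulk indexing.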

Hat tip to:

@rustybrick check out googles “Ask Google to recrawl your URLs” article. The quota has changed to 10 crawls for an individual URL per day and 2 crawls of URL + direct links per day. Interesting change!