Launching the Searchpartner.pro website was an interesting experiment in measuring Google's crawl rate. The domain had been parked at a registrar for nearly a year, so Googlebot and other crawlers would have known about it but would not have found any content. This may have been a negative factor in the subsequent crawl rate.

Before launching the website, all the appropriate steps were taken to ensure rapid crawling and indexing:

Creation of all relevant pages, with informational pages of high quality and narrow focus

Implementation of appropriate META data

Validation of all links and HTML markup

Implementation of crawler support files such as robots.txt and an XML sitemap
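As a reference for the last item, a minimal sketch of the two crawler support files is shown below. The file contents and the URLs in them are illustrative, not the actual files deployed on the site.

```text
# robots.txt — allow all crawlers access to the whole site
# and advertise the sitemap location (URL is illustrative)
User-agent: *
Allow: /

Sitemap: https://searchpartner.pro/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml — one <url> entry per indexable page; paths and values are illustrative -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://searchpartner.pro/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The `changefreq` and `priority` fields are optional hints defined by the Sitemaps protocol; Google treats them as suggestions rather than directives.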

Finally, the sitemap was registered with Google and the site was brought online... and then the waiting began.

It took more than two days (approximately 57 hours) after the sitemap was registered for Google to actually parse it. Google reported no errors.

It took three more days after parsing the sitemap for Googlebot to actually crawl the site.

More than 24 hours after crawling the site, Google had added only three pages to its index.

It seems that the days of "launch today, indexed tomorrow" are in the past. Even when a website is published according to Google's best practices, Google appears to be somewhat overwhelmed at this point, and crawl rates for new sites are being delayed.

Two unknowns:

Does leaving a domain parked for a long time negatively impact the initial crawl rate?