I got the message in Google Webmaster Tools that the crawl rate change has been taken into account:

But after more than three days, nothing has changed: still one request every ten seconds.

See here:

My webserver is very fast and can handle up to twenty simultaneous connections. And my website is brand new, which means Google is almost the only one crawling it. After more than 30,000 successful requests (no 404s), I think there's something going on... or maybe this is just a bug?

How often does your content change? Because even though they take this GWT setting into consideration, they ultimately decide what the crawl frequency will be for your site. And it really depends on how popular your site is and how frequently you update it. A site with low PageRank that doesn't get updated often won't be crawled as frequently, no matter what you set these values to. And even if they're crawling at your set speed for now, they may change it again in the future if they deem it inappropriate for your particular site.
–
Lèse majesté Feb 12 '12 at 0:26

Hi, no offense, but I explained in my question that my webserver is very fast and can handle up to twenty simultaneous connections. A few hours after my question everything went fine (see my own answer). So I calculated this: 2 requests per second × 3,600 seconds × 10 hours (the morning after) = 72,000. I ran awstats this morning and guess what? It found 71,101 new qualified records (one way to check this against the raw access log is sketched below). Conclusion: the Googlebot crawl configuration has been properly applied, and my website can handle far more than that :) . Ain't life good?
–
Olivier Pons Feb 12 '12 at 10:14
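
For anyone who wants to reproduce that back-of-the-envelope check, here is a rough sketch that counts Googlebot requests in a combined-format access log and compares the observed rate with the expected 2 × 3,600 × 10 = 72,000. The log path and log format are assumptions on my part, not something from the original post.

```python
# Rough sketch: count Googlebot hits in a combined-format access log and
# compare against the expected total of 2 req/s * 3600 s * 10 h = 72,000.
# The log path and timestamp format below are assumptions.
from datetime import datetime
import re

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path
# e.g. 66.249.66.1 - - [12/Feb/2012:08:15:31 +0100] "GET /page HTTP/1.1" 200 ...
TS_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})')

timestamps = []
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:       # only count Googlebot requests
            continue
        m = TS_RE.search(line)
        if m:
            timestamps.append(datetime.strptime(m.group(1), "%d/%b/%Y:%H:%M:%S"))

if timestamps:
    span = (max(timestamps) - min(timestamps)).total_seconds() or 1
    print(f"Googlebot requests: {len(timestamps)}")
    print(f"Average rate: {len(timestamps) / span:.2f} req/s over {span / 3600:.1f} h")
print(f"Expected at 2 req/s for 10 h: {2 * 3600 * 10}")
```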

As I said, even if it's set to your desired crawl rate right now, it could change in the future if Google changes its mind. The speed of your server is irrelevant. Most servers can handle a lot more than 2 RPS, but Google's not going to keep crawling your site at 2 RPS if it detects your site is only updated once every 6 months. What purpose would that serve? I also very much doubt the change was due to a Google engineer manually tweaking the crawl rate for your site after reading your question.
–
Lèse majesté Feb 12 '12 at 10:55

Today Google went down to one URL every three seconds, which is very annoying. You're right: in the end they do whatever they want, and this "customization" is kind of useless.
–
Olivier Pons Feb 13 '12 at 12:02


It's not useless. It's just not as simple as you telling Google what to do and them doing it. It's one of the factors that Google looks at, but they still have ultimate discretion. The same applies to the country-targeting setting, sitemaps, and other features.
–
Lèse majesté Feb 13 '12 at 12:07

2 Answers

Crawling is the process by which Googlebot discovers new and updated
pages to be added to the Google index.

We use a huge set of computers to fetch (or "crawl") billions of pages
on the web. The program that does the fetching is called Googlebot
(also known as a robot, bot, or spider). Googlebot uses an algorithmic
process: computer programs determine which sites to crawl, how often,
and how many pages to fetch from each site.

Google's crawl process begins with a list of web page URLs, generated
from previous crawl processes, and augmented with Sitemap data
provided by webmasters. As Googlebot visits each of these websites it
detects links on each page and adds them to its list of pages to
crawl. New sites, changes to existing sites, and dead links are noted
and used to update the Google index.

That said, we can say this: the more Google visits your site, the more information gets indexed. The downside is that the traffic to your server is affected.
Google's resources are involved too... so if your pages don't need to be crawled again (Google decides that), it will not follow the speed you set.
The setting is a suggestion, not a rule.

If your site needs it, Google will do it (at least in the beginning, until it notices that your content doesn't really need to be indexed that often).

[Update] Okay, after a few days of analysis, it seems that Google doesn't actually ask for the "crawl speed" it should use, but for the "maximum crawl speed" the webserver can handle. This makes a huge difference to me.

So, for example, I've set it to 10 connections per second, and sometimes there's only 1 "crawl" every ten seconds, and sometimes it goes up to 10 "crawls" per second.
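
To watch that variation in practice, here is a rough sketch (same assumptions as above: a combined-format access log at a hypothetical path) that buckets Googlebot hits per second, so you can see both the quiet stretches and the bursts near the configured maximum.

```python
# Rough sketch: bucket Googlebot hits per second from a combined-format
# access log. The log path and timestamp format are assumptions.
from collections import Counter
import re

LOG_PATH = "/var/log/apache2/access.log"  # hypothetical path
TS_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})')

per_second = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" in line:
            m = TS_RE.search(line)
            if m:
                per_second[m.group(1)] += 1   # key = one-second timestamp

if per_second:
    print(f"Seconds with at least one Googlebot hit: {len(per_second)}")
    print(f"Peak hits in a single second: {max(per_second.values())}")
    print(f"Busiest seconds: {per_second.most_common(5)}")
```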

Any settings that control crawl rate (including those in sitemap.xml) only influence how you'd like Google to crawl your pages. You can manipulate the settings to give parts of your site preference over others (because the crawlers may only spend a limited time on your site), or to limit how often Google can crawl your content (some sites have been brought to their knees by crawlers in the past), but the schedule for when the crawlers run is controlled by Google and Google alone. A minimal sitemap example is sketched after this comment.
–
Evan Plaice Feb 14 '12 at 20:56
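
As an illustration of the sitemap hints mentioned in that comment, here is a minimal sketch that generates a sitemap.xml with <changefreq> and <priority> values. The URLs and values are placeholders; as the comment says, these are only hints, and Google alone decides the actual crawl schedule.

```python
# Minimal sketch of generating a sitemap with <changefreq>/<priority> hints.
# The URLs and values are placeholders; these are hints, not commands.
from xml.etree.ElementTree import Element, SubElement, ElementTree

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = Element("urlset", xmlns=NS)

pages = [
    ("https://www.example.com/",         "daily",   "1.0"),
    ("https://www.example.com/archive/", "monthly", "0.3"),
]
for loc, changefreq, priority in pages:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "changefreq").text = changefreq  # hint, not a guarantee
    SubElement(url, "priority").text = priority      # relative to your own pages

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```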