Network unreachable: robots.txt unreachable
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.

Yet, I can access both my robots.txt and sitemap.xml. I have read other posts here and there, but could not work out what is causing this issue. Does anyone know?

2 Answers

You probably want to check with your site's hosting provider to find out more. These messages essentially mean that Googlebot can't reach your site at all, which generally points to something on the hosting side. For what it's worth, we see a lot of "unreachable" failures for the domain overall, so if you want the site listed in Google's search results, it would make sense to figure out why these requests are being dropped.

Unfortunately, the fact that you can access the files doesn't guarantee that Google can. Are you doing any user-agent detection or similar that might be interfering? Try downloading a page with the Fetch as Googlebot tool and see if it encounters any problems – if that works, everything else should, too.
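Separately from reachability, once Googlebot can fetch the file it is worth sanity-checking that the robots.txt rules themselves don't block the sitemap. A minimal local check using Python's standard-library `robotparser` – the rules and the `example.com` URLs below are placeholders, not the asker's actual file:

```python
from urllib import robotparser

# Placeholder robots.txt content -- substitute what your server really
# serves. A file that disallows everything (or an HTML error page served
# in place of robots.txt) can also cause sitemap crawl failures.
ROBOTS_TXT = """\
User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With an empty Disallow rule, Googlebot is allowed to fetch the sitemap.
print(rp.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # True
```

This only validates the parse rules locally; it says nothing about whether Googlebot's requests actually reach the server, which is what the error above is about.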

Depending on your site, there may be a few days or more between Google's attempts to access the file, so the problem may have been transient (a server outage, say) and already resolved. If the Fetch tool doesn't report any problems, I'd wait a while to see whether the issue recurs before doing more.

No, I am not doing user-agent detection. I did try a Fetch as Google and got an 'Unreachable robots.txt' too. But I am gonna wait a little to see if the issue disappears...
– JVerstry Jan 21 '13 at 9:58