Robots.txt is on the HubPages server. It's where HubPages staff specify which pages search robots should not scan. For some reason they chose to hide these hubs of yours from search engines. You have no access to robots.txt, so you can do nothing about it.

Huh. I looked too, and most of my top performers were listed as blocked as well. I wonder if that has anything to do with my ads suddenly getting a lot of five-cent clicks when I was getting fifty cents or more before.

Our robots.txt doesn't block the Google AdSense bot from any page (it's separate from the crawler for search, although they probably do share data). I suspect these occasional messages are simply errors, but it's hard to tell without seeing the entire blocked URL.

And, interestingly, translate.googleusercontent.com was not in my Allowed Sites but had been displaying my Google AdSense code in the last week.

If you're worried, check back in your Site Diagnostics in a week, since I think they crawl no more than once a week. When this has happened to me before, the blocked listing disappeared at the next crawl.

Any time the HTML from your hub is served on another domain, such as through Google's translation service or a proxy, it will be that site's robots.txt that determines whether the Google Mediapartners bot can crawl it. There is nothing that can be done about those.

If you are seeing just normal HubPages URLs (i.e. http://hubpages.com/hub/...) in that list, I have no good explanation for it (we aren't seeing any of those ourselves). However, if you are seeing AdSense ads on the hub (in particular, well-targeted AdSense ads), then I wouldn't worry too much about it. It's probably just a glitch.

What if you're not seeing targeted ads? Some of mine that are on the above list are not targeted ads. On my native habitat for leopard geckos hub, the ads include crazy white teeth, truth about belly fat, fix.op

It's the first time I have been into those diagnostics, and I found 4 blocked.

They say: "Robots.txt file: If you have a robots.txt file, the page requesting Google ads may be marked as 'disallow' in your robots.txt file. You can update your robots.txt file to give us access by following these instructions. If you do not have a robots.txt file, please create one in the root directory of your domain and then update it by following these instructions. If you don't know whether you have a robots.txt file, please consult your web provider or webhost and ask them to modify the robots.txt file on your domain to comply with these instructions."

Do we or don't we have one in the root directory of our domain? Please help, I know nothing about that...
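For what it's worth, the "instructions" Google links to in that message boil down to adding a rule for its AdSense crawler (the Mediapartners-Google user agent) to the robots.txt at the domain root. Something like this (an illustration, not the actual hubpages.com file):

```
# Allow Google's AdSense crawler everywhere, even if other
# crawlers are restricted elsewhere in the file.
User-agent: Mediapartners-Google
Disallow:
```

An empty Disallow line means nothing is disallowed for that user agent. But since robots.txt lives at the domain root, only HubPages staff can change it on hubpages.com; individual Hubbers can't do this themselves.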

cashmere, take a look back about 10 posts, where Paul Deeds (HubPages staff) gave a good explanation. I looked at the robots.txt file ... HubPages does not block any individual hubs at all.

Two things can be happening that are confusing to see...

(1) If you give referral links to people and they click on them, your affiliate code is placed in that other Hubber's ads. I think for 10% of the time, so you get 10% of their earnings for the referral. As I understand it, that 10% is part of HubPages' 40%, so the other Hubber still gets his or her 60%. Anyway, when Google AdSense and Analytics see your affiliate code in the other Hubber's hub, that hub will appear in your report. This is all normal, and it is what gives you the credit for the referral.

(2) The other thing that's confusing is that some web sites may serve your HTML (possibly in a frame) to show what you have written. If Google sees that the domain part of the URL is not one of your approved domains, then it will be rejected. This has nothing to do with anything that HubPages is doing.

Hope that clarifies a few things for those following this thread.
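If anyone wants to double-check this sort of thing themselves, Python's standard library can parse a robots.txt and tell you whether a given crawler may fetch a given URL. The robots.txt below is a made-up sample for illustration, not the real hubpages.com file:

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt: the AdSense crawler (Mediapartners-Google)
# is allowed everywhere, while general crawlers are kept out of a
# hypothetical /private/ section.
sample = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(sample.splitlines())

# The AdSense crawler can fetch an ordinary hub URL...
print(rp.can_fetch("Mediapartners-Google", "http://hubpages.com/hub/some-hub"))
# ...while a general crawler is blocked from the /private/ section.
print(rp.can_fetch("Googlebot", "http://hubpages.com/private/page"))
```

You could point `RobotFileParser.set_url()` at a live robots.txt instead of parsing a string, which is a quick way to verify whether a "blocked" report in AdSense diagnostics matches what the file actually says.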

I'm having the same issue as Cashmere. And I've noticed a dramatic drop in my AdSense earnings since this started over the weekend. I have had several consecutive days without even getting $0.01 for page impressions, and I have hundreds. There has to be something more to this problem. I have 4 hubs all showing blocked, even after a new crawl that added the 4th hub to the blocked list. Here is what is showing on the AdSense Diagnostics report:

I don't seem to be having the problems I read about in this thread. Everyone... try adding the URLs of all your hubs directly into your AdSense account. See if that helps, and report your results here after a few days. That is what I'm doing. I am guessing that Google is looking for the URL with "?done" at the end, as "Benz B" reported. I don't know where Google is picking that up from. But if you put the URLs in yourself without anything extra, then Google should have no trouble finding them.

I just received the below message from AdSense: "Publisher ID missing from ads.txt files. You need to add your publisher ID to the following ads.txt files: hubpages.com/ads.txt. This will prevent a potentially..."
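For anyone curious what that fix looks like: ads.txt is a plain text file at the domain root, one record per line, following the IAB ads.txt format. The publisher ID below is a placeholder; the real one would be the pub- ID from your own AdSense account:

```
# hubpages.com/ads.txt -- format of each record:
# <ad system domain>, <publisher ID>, <relationship>, <certification authority ID>
google.com, pub-0000000000000000, DIRECT, f08c47fec0942fa0
```

As with robots.txt, the file lives at the domain root, so on hubpages.com only staff can edit it; this is something HubPages has to add on their end.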

Just after being moved from Squidoo, I tried to apply for an AdSense account, which failed for this reason: "Insufficient content: To be approved for AdSense and show relevant ads on your site, your pages need to..."

Hi, I was checking my AdSense account, and in the reports section a number of my URLs appeared as blocked URLs, with the reason stated as the robots.txt file. What is a robots.txt file? Does a blocked URL mean that the...