Webmaster Tools: 500,000 Inter-Website Links

I will quickly put you in the picture… I have several travel websites which, in a nutshell, advertise holiday cottages for different areas of England. All the sites are on the same server and have different content but the same site template.

The problem… I have recently noticed something rather strange in Webmaster Tools.

It is showing an enormous number of links between some of my websites (500,000). On further investigation, Webmaster Tools has pinpointed a button called “enquire now”, which simply brings up a pop-up window allowing users to send an email enquiry to the property owner.

Webmaster Tools is suggesting that every time an “enquire now” button appears on website A, it links to website B via an intermediate link (the enquiry form pop-up/iframe).
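In other words, Webmaster Tools seems to believe my pages contain something like the following. To be clear, this is a simplified sketch of what it is reporting, not my actual code, and the domain and file names are placeholders:

    <!-- Hypothetical sketch of what Webmaster Tools appears to be reporting. -->
    <!-- If the iframe src resolved to a URL on website B, the button would   -->
    <!-- count as an indirect link from website A to website B.               -->
    <a href="#" onclick="document.getElementById('enquiry-popup').style.display='block'; return false;">enquire now</a>

    <div id="enquiry-popup" style="display:none">
      <iframe src="http://www.website-b.example/enquiry-form.php?property=123"></iframe>
    </div>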

However, I have studied the source code of both the page the “enquire now” button is on and the enquiry form iframe, and neither links out to any other website at all.

It is all rather puzzling.

I am concerned that if Google thinks I have added 500,000 links between my sites, it will believe I am trying to spam.

Does anyone know a way I can tell Google not to follow/crawl the “enquire now” link?

I had thought of rel="nofollow", however I am not sure this is the kind of use it was designed for.
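If it is appropriate, I assume it would just go on the anchor, something like this (a simplified example; the href here is a placeholder, not my actual markup, which opens the pop-up iframe):

    <!-- Hypothetical example: rel="nofollow" added to the enquiry link. -->
    <!-- The href is a placeholder; the real button opens a pop-up.      -->
    <a href="/enquiry-form.php?property=123" rel="nofollow">enquire now</a>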

Alternatively, an explanation of why Google thinks I am linking between my sites when I am actually not would be even better!

If the code for the button uses hexadecimal characters in any of its internal URL references, that may be the source of your problem. I recently found this to be an issue on a Website where a widget for a social media button had been encoded with hex characters in a URL reference (not a LINK, but a reference in the widget code), and the search engine was crawling non-existent URLs because it could not properly translate the hex characters into proper URLs.
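For example, something along these lines (a made-up snippet for illustration, not the actual widget code I saw):

    <!-- Hypothetical widget code with a hex-escaped URL reference.           -->
    <!-- Decoded, the string is http://example.com/share, but a crawler that  -->
    <!-- scrapes the raw source may try to fetch a garbled, non-existent URL. -->
    <script>
      var shareUrl = "\x68\x74\x74\x70\x3a\x2f\x2f\x65\x78\x61" +
                     "\x6d\x70\x6c\x65\x2e\x63\x6f\x6d\x2f\x73\x68\x61\x72\x65";
    </script>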

That's something to look for. Hope it helps. If it doesn't, you may want to share an image of the code.

If you have an auto-generating "ghost link", you're going to see the problem escalate as the search engines keep trying to fetch non-existent content. I think that IS a serious problem, especially for small Websites that are not on dedicated or co-located servers, because the crawlers will chew up a lot of resources, shortening the life spans of the servers' CPUs and hard drives (and as someone who has burned through many servers, I say with all confidence that it is a real issue).

There may be a second-order negative effect, as the search engine may eventually conclude that most of a Website's navigation is broken. I don't know that that is what they would conclude, but if I were running the crawler, I would be wary about giving much consideration to a Website where most of the URLs I'm trying to fetch are broken.

ON EDIT: And as for "nofollow", it turns out that Google *DOES* crawl those links. According to a recent post (or comment -- I forget which) from ex-Googler Pedro Dias on Google+, GoogleBot will still use NoFollowed links for discovery but will not associate the linking page with the destination (supposedly -- even though we all know that NoFollowed links are reported in Webmaster Tools).
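So if you genuinely need to keep the crawler off the enquiry form, robots.txt is more reliable than nofollow, since a Disallow rule stops compliant crawlers from fetching the URL at all. Something like this, assuming the form lives under a path such as /enquiry/ (a placeholder -- use whatever path the pop-up actually loads):

    # Hypothetical robots.txt entry; /enquiry/ is a placeholder path.
    # Disallow stops compliant crawlers from fetching the pop-up form,
    # which rel="nofollow" alone does not guarantee.
    User-agent: *
    Disallow: /enquiry/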

I think fixing broken code -- if there is any -- is the best solution.