Google SEO News and Discussion Forum

I was penalised in December last year for 'inorganic links' pointing to my website.

I have since backtracked and made a massive effort to have these links removed by contacting webmasters and the people who created the links.

This work continues, but it now seems that Google has amplified this 'penalty' with an algorithmic component that effectively targets the same things (Penguin).

I am now becoming rather despondent as I feel that this problem runs too deep and that I will simply be unable to undo the damage that has been done. I have some excellent naturally garnered backlinks but they seem to have been tarnished by these unfortunate low quality links.

I am thinking that it may be best to abandon ship and redirect as many of the 'good links' as I can to another place. I am rather fond of my domain name; it is certainly something that people seem to remember and it associates well with the content of the website. I currently use the www version of the domain and the non-www version 301s to it. I have analysed the 'link-scape' of the non-www version and it seems perfectly clean. What is the best way to cut off the link juice (good and bad, I'm sure it will have to be both) whilst minimising disruption to my users? Is this even possible without shifting to another domain entirely?

I was hoping somebody would have some insight into or thoughts on this. I am just going to wait and see what happens for now but I am thinking longer term if I can't get things fixed it might be a worthwhile switch.

No, that is when the penalty was initiated. I received an email (one that could be replied to) in February stating that it was related to inorganic links, and I have been busy trying to remove them and sending reconsideration requests (or replying to the email). Again, www and non-www are treated as separate entities in WMT and each has different links pointing to it.

I suppose I will ask a few 'simple' questions to try to work this out.

I know it is a bit unconventional but this is an unconventional situation I suppose. Really what I want to do is 'start again' as far as linking is concerned with minimal direct impact to my visitors.

If I were to implement a 302 redirect from www to non-www would this block off 'link juice' flow (good and bad) whilst allowing the non-www to accumulate and build up authority of its own? Would rankings depend on the new non-www authority that is building up rather than the previous 'www' authority?
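For anyone wondering what that would look like in practice, here is a sketch of a host-level 302 in an Apache `.htaccess` file (assuming `mod_rewrite` is available; `example.com` is a placeholder for the real domain):

```apache
# Hypothetical sketch: temporarily (302) redirect every request on the
# www host to the same path on the non-www host. Unlike R=301, a 302
# signals a temporary move and is generally not treated as a permanent
# transfer of the URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=302,L]
```

The `[R=302]` flag makes the temporary status explicit; `R` on its own also defaults to 302 in mod_rewrite.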

If you are mostly concerned about your visitors then just ignore what G is telling you and concentrate on quality content and a good visitor experience. We webmasters have no control over who links to us. If G thinks some of those BLs are not up to their high standards then those BLs should be ignored. It is not our job to clean up their SERPs. We already serve them our content at no charge; maybe if we started charging G for using our content they would put more appreciation into this process. What do you think?

Whilst I subscribe to this logic myself, the reality of the situation is that Google has penalised the website for external factors (links) which are 'outside the quality guidelines'. This new 'Penguin' update seems to have simply intensified this. There is always the option to just sit tight and see what happens, but I can't help thinking it would be easier to start again. At least in a way that, as far as I'm aware, can be reverted if I can't get where I need to be.

I certainly think I've got the content element under control and my website has not been a target of any of the recent Google Panda updates. I can also redirect and build upon the powerful links and know I will get plenty of natural backlinks in the future. At the moment I am getting a fair whack of my traffic from people discussing my website on various forums and in that sense I am not wholly dependent on Google. I just see a lot of wasted potential at the moment.

A 301 WILL definitely pass bad links: you're going to be clean for a week or two and may see improvements, but eventually everything will come back to what you have now. If most of those spam links point to the main page, there is nothing you can do but remove those links or abandon the domain.

Well that is my point and certainly how I thought the 301 would act. The links only point to the www version of the domain - the non-www has a completely separate link profile and they are effectively treated as different domains (or subdomains). By switching off the www version the links would point to a dead end; they wouldn't transfer to the non-www version. The logistical challenge here is redirecting the users appropriately if they visit the www version without the search engines passing on the link value from www to non-www.

By removing any useful content from there. To give an extreme example - return a 404. A manual redirect. Same way you'd move to a new domain without a 301. Google gives clear guidance and indication that the 'www' and 'non-www' versions of the website are treated as separate entities and are ranked separately, just like different domains. A 301 is out of the question as it transfers everything (including link attributes and profile), but the other options aren't. And I wouldn't set anything as a preferred domain in GWT because I don't want link power to transfer.

This might be a stupid question, but a 404 page would cut off all pagerank/trust metric flow, wouldn't it? I mean, if you link to various pages from a page that returns a 404 response code, those links are purely for the users, not the search engines?

You got that right. A 404 as an HTTP status in the header is exactly as you said. It is not actually a "page", and the text message/content that is returned with it means nothing. Google just sees that the URL is "not found".

Just an update on this. I have implemented a 410 (Gone) response for the 'www' version of my website after requesting removal of the site in Google. I then put up my content on the 'non-www' and things seemed good... at first.
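In case it helps anyone following along, a host-level 410 can be done with mod_rewrite's `G` (Gone) flag. A sketch, assuming Apache and a placeholder `example.com`:

```apache
# Hypothetical sketch: answer every request on the www host with
# 410 Gone, while leaving the non-www host completely untouched.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^ - [G,L]
```

Because the condition matches on `HTTP_HOST`, the same document root can keep serving normal 200 responses on the non-www hostname.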

It seems Google is trying to crawl the www pages quite frequently. It should have been seeing these 410 errors for a few months now. Reading a little more into things, one of Google's own help articles mentions that you also need a robots.txt block in place if you want to remove an entire site or directory. I can't really create a robots.txt specific to the 'www' version of my site (can I?), so I am wondering what to do here. How long does it take before Google will actually 'give up' on the www version? Or do I really have to move to a new domain here?
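For what it's worth, a host-specific robots.txt is possible when both hostnames share the same document root: rewrite requests for `/robots.txt` on the www host to a separate file. A sketch, assuming Apache mod_rewrite; the file name `robots-www.txt` and `example.com` are placeholders:

```apache
# Hypothetical sketch: when /robots.txt is requested on the www
# hostname, serve robots-www.txt instead (a file containing e.g.
# "User-agent: *" / "Disallow: /"). The non-www host keeps serving
# its own normal robots.txt.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-www.txt [L]
```

One thing to weigh up: a full robots.txt block would also stop Googlebot from fetching the www pages at all, so it would no longer see the 410 responses on them.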

For some pages yes. Well I was at first. But I have been going through what I thought were my 'good' links and making sure they point to the non-www version... But now I have 'manual action' on the site related to unnatural links. I can't work out if it is due to links that I re-pointed, new links acquired or that Google is actually looking at some links from the www site. This has basically muddied the water completely.

I do see some examples where WMT lists a link "via this intermediate link" and the intermediate link is a 'www' version of the page. I haven't redirected it in any way, and this concerns me because it is as if they are redirecting automatically. I have no preferred domain set in WMT and I am serving a 410 on all of the www pages, so I don't know what is going on.

Edit: I've had a good look through in WMT and actually a lot of instances of 'via this intermediate link' are there. I have a feeling something has gone wrong here.

It could be that the original URLs weren't properly removed from the index and Google has sort of redirected automatically. The number of 'via this intermediate link' instances and just the rankings of certain pages seems to indicate this might be what has happened.

I am running a little test at the moment. I have the 410 returned on all 'www' pages. I have requested removal of a specific page from the www version (it isn't displaying in the search results of course - only non www pages are). Strangely this has caused the non-www version to drop from the index. What's up with that? It is like no matter what I do it is viewing them as one and the same... Although the list of URLs removed in this tool is unique for both the 'www' and 'non-www'.

Well this is very interesting. I posed a question to John Mueller about this via the Webmaster Central office-hours, September 28, 2012 (b) hangout. He addresses the question 16 minutes in and basically says that Google does equate them as one domain. Returning a 410 on the affected version can work 'to some extent' but isn't as clean as starting on a new domain. He specifically said that algorithmically it is treated as one website!