
Owning all of the top 10 positions for one keyword

I would like to know if Google considers it bad for one person to occupy every single spot in the top 10 for the same keyword.

All of the websites cover the same topic as the keyword, but have unique content. The websites don't link to one another.

To be clear, I am not asking whether it is a good idea in practical terms (that it will be harder to rank, find links, etc., or that the lower positions don't bring enough traffic). I really just would like to know if it is against Google's guidelines. Any thoughts or sources are greatly appreciated.

6 Responses

I agree with Tom and Lynn. It would be technically possible: if you make sure there is no connection between the sites (WHOIS, theme, content, different host/IP range, etc.), there would be no way for Google to know one person is behind them.

But the amount of work to do that with white hat methods would be incredible (even with black hat it would be very hard) for any decent keyword worth ranking for. I don't think it's possible these days.

In most if not all such cases you should ask yourself whether this is a 'normal' situation for a 'normal' search query, and I think you will find the answer is no. If the sites are all owned by the same person, it is certainly borderline whether searchers would find enough variety in the results for Google to say it is returning the *best* results. Further, for any semi-competitive keyword this situation would be quite unusual, so I would ask myself whether it is worth the bother of maintaining all these sites and tracking their rankings. Chances are that at some point the algorithm will mix things up and some (all?!) of the sites will move out of the top 10.

So, officially against the guidelines (assuming no funny stuff)? Probably not. An unnatural situation that is unlikely to be very stable and/or worth the effort of maintaining? Probably.

10 domains, each with unique content. Then link builders build links to those sites so they become the ten ranked first on Google.

Basically, I meant what Tom Roberts answered to.

So, Tom, in your view this should not be done because there is always a risk. Do you have sources for that opinion? That's what I would think too, but I'm looking for more evidence: for example, people who have been penalized, an official Matt Cutts statement, or a study of several top 10s by a big name like SEOmoz.

If you had 10 individual websites with completely unique content that did not link together and each one offered a unique value, it could happen.

The unique value factor is key here - does each website not only have unique content on the matter, but also a new theme, style and way of portraying the message? If they do, I could see it happening.

But even then, I can't imagine Google would be happy if they found out all the sites belonged to one webmaster. It would take a manual review, but even if the WMT information was different, the site themes were different, the WHOIS registrar was different, the backlink profiles were different (basically everything was different), if they suspected the same person was behind it all, they would simply deindex the sites.

Hey, it's their search engine. We have quality guidelines and ideas of what we can or cannot do, but at the end of it there is a manual team who can penalise who they choose. This is a situation where I can imagine they would.

It's worth mentioning that I have seen a top 10 where 10 different websites which fell under one subsidiary trading name ranked for a pretty commercial keyword. So yes, it's possible.

Also worth pointing out is that 9/10 of those sites are now nowhere to be found.

This is a black hat technique known as "page stuffing." It's probably not as easy to do these days due to Panda and Penguin, but the technique involves ranking one page (say, a homepage) for a certain keyword and then duplicating the content so that each copy would also rank, pushing the competition down by up to 10 positions so that one website floods a certain search term.

Luckily, crawlers can now detect when a page is duplicated and simply won't rank it, and Google will penalize the offender.
