Google SEO News and Discussion Forum

Over the last year, Google has switched from the "host crowding" method of displaying multiple results from the same site to a system that displays all of those results inline on the SERPs.

e.g., host crowding was the old method:

Site Result
    --- indented 2nd site result

The new system displays all the results inline. In many cases, up to ten results can be from the same site.

The result has been, for many of us, a frustrating (if not fatal) loss of Google search relevance. The only way many of us have gotten around it is to become very good friends with the minus operator "-" (which has also quit working at times) to remove the offending site from the results.
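For readers unfamiliar with the operator being discussed: appending `-site:domain` to a query is the standard way to exclude a domain from Google's results. A minimal sketch of building such a query URL (the `q` parameter and `-site:` operator follow Google's documented search syntax; `example.com` is just a placeholder):

```python
from urllib.parse import urlencode

def google_query_url(terms, exclude_sites=()):
    """Build a Google search URL that excludes results from the given
    domains using the minus operator combined with the site: filter."""
    q = terms + " " + " ".join(f"-site:{s}" for s in exclude_sites)
    return "https://www.google.com/search?" + urlencode({"q": q.strip()})

print(google_query_url("blue widgets", ["example.com"]))
# https://www.google.com/search?q=blue+widgets+-site%3Aexample.com
```

Of course, as the poster notes, this only works as long as Google honors the operator.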

Google has not talked about this change much at all. Despite numerous people talking about the loss of relevance with the new system, Google has continued to turn up this dial.

For me, the new system has been all but the final nail in the coffin of my Google usage. I can't recall a SINGLE time, other than a navigational query, that I have EVER clicked on a result from a site showing 4 or more results from the same site. I had been using Google for about 20-40% of my queries (always starting at Bing), which was actually a pretty good track record for Google if you consider that it means Bing was failing for me 20-40% of the time.

The only real news here is that, for the first time, Matt Cutts talks about the issue in a video, but he never does, in fact, fully answer the question. I have yet to hear any major voice say this is a plus.

I definitely see this as a thing that's going to stick around a while. It seems that in a lot of areas Google has developed this because the primary format being entered with a search is a question. The problem is that you can obtain a lot of results from any one site that just plain suck. It definitely gives an advantage to particular domains and site set-ups. In fact, some domains likely saw traffic increases of perhaps a hundredfold, while others saw drops of 50% or more in many cases. Logically, Bing should be showing an uptick in searches, because it's not as dramatic in that direction and offers a quicker re-route to the potential answer.

Initially, when variations on multiple Sitelinks started appearing, I thought it was great that Google considered that a particular client site was so dominant in the field that it deserved the extra exposure. Since Panda, though, maybe even earlier... I've also come to see that this is not always a great thing... that Google is also looking for overlapping material and sites that have too much of it, and it was providing the extra exposure in order to cull things down. I've felt that Google's been using various search refinements to calibrate both its own interface and also to compare user intent and engagement with all other pages it returns for similar queries.

The tough part for Google, I think, is that if people spend a huge amount of time on Amazon, eg, comparing similar product pages (and I know I sometimes do when comparing models, checking reviews, etc)... then search engineers might naturally think that Google should be returning pages in a number proportional to this time. It does make a degree of sense, and I don't think that Google decided to roll out this particular brand authority change lightly, even as a test.

To make a social/political analogy, one which I think goes deeper than just coincidental matching of numbers... in the US, we have the Senate (with two senators from each state), and then also a House of Representatives, where the number of representatives is more or less proportional to population. Fairness turns out to be a complex problem, though, having to do, eg, with the greater relative importance of small towns in sparsely populated states, along with many other issues.

Host crowding is like the senators... two listings per domain, more if it's a navigational query. Brand Authority is analogous to the size and importance of large states.

Please... I don't want to turn this into a political discussion, so let's discuss the model, not the government. ;)

That makes sense too, but I think an algo that tells Google to give people 29 Amazon results in a row is an algo that needs some work. Maybe they need to add a tweak for sites that have a huge reputation - call it the "we get it already!" factor. ;)

And I'm saying that as a searcher. It feels like going to the grocery store, and they only carry one brand of cake mix in three different flavors. Sure, I get that it's the most popular brand and these are the most popular flavors, but I want some variety. Now, the grocery store has an excuse: they can only stock so many of these physical items for people to take off the shelves. But the average searcher probably thinks Google is able to carefully parse every website out there. If all they return is three flavors of one brand, repeated over and over, it's really extremely irritating.

@Robert Charlton I agree with you 100%. I'm sure site owners with multiple positions are currently on a high, but I think it's real cause for concern. I see a few quality sites with multiple listings, but many are duplicate, auto-generated garbage.

Our site's position in the SERPs is relatively the same for many keywords, only amaz.. is the first 6-10 results, c.. is the next 4 results, and m.. is the next 2. If we rank 3rd or 4th, we can be at the bottom of page 2 instead of 3rd or 4th on page 1. Where we used to rank 10th, we're nowhere to be found.

As a USER, I like the way that google often displays Yahoo answers, where they will list the most relevant page as a full result with a snippet, and then have one-line links with the title under that result.

I don't know why they wouldn't do that for the eHows of the world, since often the answers on various forums are SIGNIFICANTLY better than anything written by any of the Demand Media content farms.

Agree with Petehall. I cannot imagine how G can keep showing one whole page of YouTube or eHow or wikiHow results forever. Actually, right now it seems that content farms are thriving like NEVER before. If Panda has been a disaster all along, this is the lowest point.

I must add, though, that it is understandable that one website may have more than one page on the same topic, and it may be better to show those results; but when you get page after page of the same website, it doesn't seem like the best user experience. If I find that the Japan Times is the best English-language website for news on Japan, I don't need to be shown every page they've ever written on Japanese politics. That is what site search is for.

If it was just Google showing more pages from a site that has multiple useful pages, why does it favor huge brands with nothing but product and review pages? How many pages for the same product do you want? It's clearly an attempt at crowding out, not crowding in. I presume the strongest pages are being shown first, and yet you still see the same domains on pages 3, 4, etc., despite having already passed the so-called strongest pages. If this is just trying to help the user by showing more useful pages from one site, what does it say about their ability to identify the strongest pages if they are allowing this to run so many pages deep?

I am searching the same term, once in Firefox and once in IE. Firefox shows 100 results on page 1 with major host crowding going on - some sites have 6 results. IE shows 10 results, with some host crowding at the bottom of page 1.

If I make a search for a "blue car," I don't want 10 results from 3 websites, and 9 results on page 2 from one website, because it just makes me work harder to find what I need. Lots of websites advertise "blue cars," so what do certain sites do to deserve all the positions? Does Google allow multiple results in AdWords? No - because that's where they want you. It's a very simple squeeze which pushes up the cost per click.

I don't get Google Recipes in Australia, but the mac and cheese search gives me numerous results from a local recipe site, followed by numerous results from another site ad infinitum (SERP of 100). When I look closer, it's not necessarily "bad". This was a vague search, and the SERP gave various kinds of mac and cheese recipes, e.g. tuna, classic, creamy, etc. Searching for tuna mac and cheese recipes reduces the same-domain results, but it's still annoying if it's not your own domain. :)

I have seen 20+ results in other niches of late and feel for the SEO agencies who work in those niches.

anallawalla - Thanks for this. In the past, I've noticed that Google has frequently been using localization to serve up different results for testing purposes, and I've looked for examples in the US, but stupidly hadn't checked them internationally. I just checked google.com, .com.au, .ca, and .co.uk... and what I'm seeing, IMO anyway, clearly confirms that this is a test, serving up different sets of choices to various countries.

I don't know whether I'm seeing exactly what you're seeing, but I'm seeing multiple results per domain in all countries, often but not always in groups of more than two pages.

Some local sites are being dropped into different slots on all serps outside of the US... and, along with some obviously dominant international "brands", different sites are being tested in many positions.

On very superficial examination, the same site is dominant near the top in all countries... but Australia shows local results in #1 & #2 slots, and UK in #1 slot. US and Canada are the most similar. BBC, which is the #1 result in the UK, is #15 in Australia. Only US returns the Google Recipes features... and again, this isn't toggled "on" until an ingredient check-box is checked.

Related searches suggested at the bottom of the serps are different in each country.

It's too late at night to try to parse and describe other patterns, except to note that it takes several pages before various ingredients or specifics start to show up in recipe titles.

Here are working links to default search pages in the four countries noted, with no search terms entered....

Things are getting a lot worse. Google seems to be pushing brand names even more than ever toward the top. They are totally saturating the SERPs, much like listings used to in the old search engine Inktomi in the '90s. I am noticing the first 9 out of 10 listings belonging to one domain, with the first two listings each having 4 sitelinks. Then they have another 9 listings between results 10 and 20.

Another example had 19 of the first 20 results from the same domain, with the first listing having 6 sitelinks.

No way do I disagree... the posted result is horrible (ten pages, many of them utility pages, from the same domain)... only it's not what I'm seeing, searching either for...

[s*uthbeach diet]

or... [s*uth beach diet]

(I've used an asterisk in the posted queries instead of an "o" to avoid unnecessarily skewing results.)

Either way, in the serps that I see, I'm only seeing s*uthbeachdiet.com returned once on the first page. Sorry, no time for a screen capture at the moment. I hope others will confirm.

I've noted several times here that this has got to be testing on Google's part, and, somewhere in the forum, tedster has posted that he's not always seeing the same brand authority serps for some queries that others are reporting.

To expand just slightly on my thought that this is a test... in the past Google has been offering up Sitelinks and multiple results to provide more exposure for results. I'm guessing that to leave no stone unturned, Google has been showing these multiple results for domains as yet another variant on Sitelinks.

I should note that I'm searching from the SF Bay Area, and that if I reset my location to Austin, TX, I cannot duplicate Brett's results. That said, Brett posted his results on July 13, and it's now July 15.

It's also likely that Google may let the data rule, and that if showing ten results from the same site provides more user satisfaction, they may well go with that. In the SERP Brett posted, though, there's no way I can imagine Google users liking those results.

I am seeing a lot of pages from that domain being returned on encrypted.google.com when I click the link shared by Brett, but not in the Google domain for my country, or when I do a search for that key phrase on encrypted.google.com.

Might be that you see more if you have visited the site (or maybe a local version of the site in the case of TA)?

I'm seeing the same as BT, but I have visited the site (and possibly searched for it) in the past. And of course, even though I have my results set not to be 'personalized' and my history set not to be recorded, we all know how that works in practice.

RC, I too am seeing a more varied SERP, and I'm in the Los Angeles area.

That said, what Brett posted looks a lot like when I search for my no-site-link domain names as "my domain." But my domain name is not a keyphrase - it's a brand, so it doesn't have any real competition for that phrase, and anyone searching "my domain" IS looking for results from my domain.

That makes me wonder if one thing Google's testing is how it weighs domain names in queries.

Without a doubt it was likely a test at first, but I certainly wouldn't pen my name to that at this point. Why? Because the results radiate the impression you're in bed with big business. Simple logic shows that in many areas you're likely narrowing the top results to 50-75 info and article sites for answers and tossing the rest of the heap anywhere you please. It reminds me a little of what technocrats might do with their idea that they know what's best for you. At least AltaVista fell because of the competition, but when you have none it's a whole different ball game. With all the data freely given to Google, they've tested what impact losing thousands of good to very good sites would have on their earnings, and it is likely minimal. In fact, a test like this might pretty well establish the final scientific proof of it.

The reason Brett was seeing so many results from the same domain is that his search URL had &num=100 in it, which shows 100 results per page.

I think it has long been known that Google moves up results from the same domain when more than one result is shown.

If you change Brett's URL to be &num=10 then you get much more varied results on the first page.

I checked Google's encrypted subdomain (SSL) and the ordinary google.com domain, and they are pretty much the same; hence the difference comes from the num parameter, not from whether the search was on the encrypted domain.
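The point about `&num=` can be illustrated with a small sketch that rewrites the parameter on a search URL so the two SERPs can be compared directly (the `q` and `num` parameter names are as they appear in Google's public search URLs; the grouping behavior itself, of course, happens on Google's side):

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def set_results_per_page(search_url, num):
    """Return the same Google search URL with its num= parameter
    rewritten, e.g. to compare a 10-result page against a 100-result
    page (which tends to show far more same-domain grouping)."""
    parts = urlparse(search_url)
    params = parse_qs(parts.query)
    params["num"] = [str(num)]           # overwrite (or add) the num parameter
    new_query = urlencode(params, doseq=True)
    return urlunparse(parts._replace(query=new_query))

url = "https://www.google.com/search?q=mac+and+cheese&num=100"
print(set_results_per_page(url, 10))
# https://www.google.com/search?q=mac+and+cheese&num=10
```

Opening both versions side by side is the quickest way to confirm whether the "wall of one domain" effect is a ranking change or just the per-page result count.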