Last Thursday, Google took the next step in its continuing efforts to focus on high-quality sites by announcing a new feature that allows users to weigh in on the quality debate and block particular sites from their future search results. A similar feature is available in the new search engine Blekko, which allows users to limit their search results to a select group of sites. Another search engine, DuckDuckGo, has chosen to exclude so-called content farm sites from its results entirely. Here is a quick rundown on how Google’s new blocking feature works.

A user submits a search query on Google and the results are served. The user clicks on one of the result links and then hits the back button to return to the search results page. That specific result will now have a new option underneath its description that reads “Block all example.com results”. This blocking option is only available immediately after returning to the Google results page from a specific site. If the user clicks on another search result and then returns to the Google results page, only the most recently visited result site will be available for blocking. Also, not surprisingly, the blocking feature does not apply to any of the sponsored results.

After a site/domain has been blocked, if it would normally appear in a future search query, a message will appear on the page informing the user that one or more sites have been blocked from the results, with options to view the blocked results and to manage all blocked sites. This gives users the ability to remove a site from their block list should they choose to do so later.
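Conceptually, the behavior described above amounts to filtering each result set against a per-user blocklist of domains. Here is a minimal sketch in Python of that idea. To be clear, this is purely illustrative: the function, the domain names, and the data structures are hypothetical, and nothing here reflects Google’s actual implementation.

```python
from urllib.parse import urlparse

def filter_results(results, blocked_domains):
    """Split a list of result URLs into visible and blocked lists,
    based on a per-user set of blocked domains."""
    visible, blocked = [], []
    for url in results:
        domain = urlparse(url).netloc
        if domain in blocked_domains:
            blocked.append(url)   # hidden, but recoverable via "show blocked"
        else:
            visible.append(url)
    return visible, blocked

# Hypothetical example: one user has blocked example.com.
results = [
    "https://example.com/article",
    "https://goodsite.org/page",
]
visible, blocked = filter_results(results, {"example.com"})
```

Note that the blocked results are set aside rather than discarded, mirroring the feature’s “view the blocked results” option, and that removing a domain from the set effectively unblocks it.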

So, on the surface, we have a fairly standard feature that will allow users to customize search results to suit their own needs and (hopefully) deliver more relevant results to each individual user. Actually setting up the blocking is relatively easy, but it does require a couple of steps and probably won’t be obvious to most Google users. If I hadn’t known to look for the “block all results” link, I probably wouldn’t have noticed it when I went back to the search results page from a result site that didn’t deliver what I was looking for. To me it feels a bit like Google is testing the waters on individual blocking to see how users will incorporate the feature into their search behavior, without making it too big of a deal – at least initially.

Things could really get interesting when you consider what Google may do with this personalized blacklist data across all of its users. On an individual scale, it just means that if I don’t like a particular site/domain, I can block it from all future search results. But on a global scale, it creates a database of sites that users consider to be low quality. Google will be collecting this data from all of its users, but it has stated that, at this point, it will not use the stats on blocked domains as part of its regular search algorithm. The company will be looking at the data and says that it may end up as part of its website evaluation criteria in the future.

With Google’s ramped-up efforts to improve search result quality, it certainly seems like blocked-site data could become an important piece of the algorithm puzzle in the near future. If individual blocking becomes a significant part of the search algorithm, and if Google makes the feature more visible (for example, by showing it immediately on all results), then it could have a major impact on future search results and on the SEO industry as a whole. It would create an entirely new factor for companies to consider in their SEO and overall site-building efforts.

What effects do you think this new blocking feature could have on the future of SEO?

Great comments, Jim. I think the blocking option is going to be really fascinating to watch and see what kind of impact it has – especially if it becomes a more prominent user option. Will users take advantage of the option in large numbers? How much weight will Google give user blocking in future algorithm updates? While it could play a role in increasing the quality of search results, I can also envision unethical marketers using it to blacklist competitor websites and hurt their search rankings – much like multiple negative reviews are posted to hurt competitor businesses in various industries. But I’m hoping that Google has thought that side of it through and has some plans to minimize this practice.

Nwmhloans

This reminds me of when Google introduced the sandbox and competitors would spam your URL all over the internet to get you sandboxed. Since we all work with multiple IP addresses now, it seems like one could essentially do the same thing!