Could Google+ Play a Role in the Upcoming Penguin Update?

Michael Garrity

Posted on 5.13.2013

On Friday morning, Matt Cutts, the person in charge of Google’s webspam team, announced via Twitter that the search engine would be rolling out the next generation, version 2.0, of its Penguin update (originally released in April of last year).

This update will likely have considerable ramifications on the way that Google sorts and ranks pages on its search engine results pages (SERPs), as these types of overhauls entail altering the search algorithm, as opposed to just an index refresh.

Penguin 2.0

The first Penguin update focused on black hat SEO and on links and link quality, part of the company’s apparent desire to keep "spammy" sites stuffed with low-quality links from reaching the upper echelons of the SERPs (or even the first couple of pages, for that matter). It has been tweaked a few times since then: the first refresh (Penguin 1.1, if you will) simply cleaned up how the algorithm update was applied when indexing pages, while the second refresh added the Disavow Links Tool, which puts the onus of identifying and removing bad links on webmasters.

But this new update promises to be much more significant and should resonate more with all websites and webmasters. In a video released yesterday (see the bottom of this post), Cutts said that the upcoming Penguin update, which is expected in “a few weeks,” will likely go deeper and have more of an impact than Penguin 1.0, but he didn’t really elaborate beyond that. He also explained that Google is looking to give special ranking "boosts" to sites that are authorities in a specific industry, community or space, meaning those sites will be returned above less authoritative sites for related queries. He didn’t say how Google will determine that authority, however.

Naturally, this has led many Web pros to wonder just what this new update will entail, and how it may affect their own performance in the SERPs. For now, all we can do is speculate.

What about Google+?

One likely scenario involves Google’s quietly growing social network, Google+, which passed Twitter in Dec. 2012 to become the second largest online social network in terms of monthly active users (according to a Global Web Index study). Google started out by aggressively (over)marketing G+ when it first went live, but the company has slowly backed off in favor of quietly integrating the social network with its myriad other products.

It wouldn’t be at all surprising, then, to see Google trying to incorporate G+ more intricately into its search engine, as well. And since the original Penguin was largely about links and link quality, one might expect Penguin 2.0 to put a special emphasis on links appearing on the social network. That is, content that gets linked to on or from Google+ will be weighted more heavily, or at least with more credibility, than if it were just a link on a random website or blog.

In other words, Google could end up viewing links that also appear somewhere on Google+ as being of a generally higher quality than others, so it may end up giving them a boost in the SERPs. And if that’s the case, it wouldn’t be surprising if Google Search also paid attention to the activity around links on Google+ (e.g. shares, +1s, etc.) in order to better determine just how “quality” they may actually be.

Should this end up being the case, brands and content publishers in particular will want to become more active on Google+, sharing links on the social network to give them more authority and credibility in the SERPs, particularly if they expect those links to appear on other websites or blogs.

Of course, this is all just speculation, and Google is staying quiet on the subject until the new update is released, as it always does. Still, it would be an interesting way for the company to quietly nudge people toward using Google+ more often. The downside would be the creation of an increasingly insular, exclusive “Google Universe” that may alienate Web professionals and users even more than it already has.