Penguin 4.0 algorithm update: key takeaways and optimisation tips

Posted on September 27th, 2016,
by Dateme Tubotamuno

The weekend is always a good time to catch up on exciting Premier League games and boxing unification fights. My weekend is golden: a memorable period of the week when I expect more drama from Arsenal’s Emirates Stadium than from the world of search engine optimisation.

Surprisingly, or perhaps unsurprisingly given Google’s nature, this past weekend Gary Illyes from the Google Search Ranking Team published a post about the new Penguin 4.0 algorithm update. For almost 707 days, search marketers had been pondering and chatting about the next Penguin update. There have been many speculations and false alarms about a Penguin update in the past, but this time it was an official statement from the search giant that captured our attention. To be clear, this post is not about the large flightless seabird mostly found in countries like Chile and Argentina; it is about a Google algorithm named after the famous bird.

What is the Penguin algorithm?

The Penguin algorithm was launched in April 2012 and aims to demote the rankings of websites that have gained links through spammy, inorganic and unethical means. About 3.1% of English search queries were believed to be affected by the original Penguin algorithm. Links pointing to a website are an important signal Google uses to evaluate the importance of a given site and rank it accordingly. The weight given to links in search rankings prompted some digital marketing professionals to employ deceitful ways of gaining links, negatively impacting the search user experience.

Key points from Penguin 4.0 algorithm update

It is now real-time: Prior to this update, websites affected by the Penguin algorithm had to wait for the next refresh to be re-evaluated. For example, a website that had taken the necessary steps to remove or disavow spammy links would have had to wait over 700 days (the gap between the last update and this one) to be reassessed and have its rankings reinstated. The good news is that reassessment is now real-time, which means Google is able to restore rankings after the next crawl. As Google crawls most websites on an almost daily basis, the chances are that rankings can get back to normal in a few days or weeks rather than 700+ days.

It is granular and not site-wide: In the past, if a single web page had bad or unnatural links, Penguin could negatively impact the rankings of the entire site. With this update, Google only devalues the rankings of the affected page. In the words of the Google Search Ranking Team: “Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting the ranking of the whole site.”

It is still one of the ~200 ranking factors: Penguin remains one of the roughly 200 signals Google is believed to use to determine how it ranks a website. Last November, Google also released its Search Quality Rater Guidelines, the document its human evaluators use to assess the relevance and quality of sites.

Google will no longer confirm future Penguin updates: Going forward, Google will no longer confirm future Penguin updates, which means search marketers will have to keep a close watch on their link profiles and search rankings to spot any dip or uplift.

Penguin 4.0 will devalue and not demote websites: Days after the announcement, Gary Illyes from the Google Webspam Team shared this on how Penguin 4.0 differs slightly from previous webspam algorithms: “Traditionally, webspam algorithms demoted whole sites. With this one, we managed to devalue spam instead of demoting AND it’s also more granular AND it’s real-time. Once the rollout is complete, I strongly believe many people will be happier, and that makes me happy.” In essence, this means Penguin 4.0 will not demote the ranking of a site as a result of unintentional spammy links. Penguin 4.0 is able to devalue these links on its own, so submitting a disavow file may not be necessary. On the other hand, Gary Illyes advises webmasters that submitting a disavow file could still help Google discredit spammy links. Overall, I personally think it might still be necessary to submit a disavow file in extreme cases where you feel spammy links are hurting your site’s performance. Below are steps you can take to alert Google to these spammy links.

How to make the best of the Penguin 4.0 algorithm update

Link profile/health analysis: It is important to use tools such as Moz Open Site Explorer, Majestic and other relevant tools to assess your link profile and detect any suspicious links.
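If you export your backlink list from one of these tools, a simple script can do a first triage pass before you review anything by hand. The watch-lists below are my own illustrative assumptions, not criteria used by any tool or by Google; flagged domains should only be treated as candidates for manual review:

```python
# Illustrative heuristic for triaging an exported backlink list.
# The TLD and anchor-text watch-lists are assumptions for the sake of
# the example; tune them to what you actually see in your own profile.
SUSPECT_TLDS = {".xyz", ".top", ".click"}        # assumed watch-list
SUSPECT_ANCHORS = {"cheap", "casino", "pills"}   # assumed watch-list

def flag_suspicious(backlinks):
    """backlinks: list of (source_domain, anchor_text) tuples.

    Returns the domains worth reviewing manually.
    """
    flagged = []
    for domain, anchor in backlinks:
        tld_hit = any(domain.endswith(tld) for tld in SUSPECT_TLDS)
        anchor_hit = any(word in anchor.lower() for word in SUSPECT_ANCHORS)
        if tld_hit or anchor_hit:
            flagged.append(domain)
    return flagged
```

A script like this only narrows the list; whether a link is genuinely unnatural still needs a human judgement call.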

Google Search Console link analysis: You can also review the sites that link to your website via Google Search Console. It lets you see the number of links originating from each site and the pages they point to, and it is a good way to spot suspicious links and sites that could harm the rankings of your core pages.

Disavowing suspicious links: Google has a simple disavow tool that lets you submit links you believe are spammy. The standard procedure is first to contact the site owners and ask them to remove these links. If that approach fails, you can submit a disavow file for Google to discredit the links and, in time, reinstate the affected keyword and page rankings.
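For reference, a disavow file is a plain UTF-8 text file with one entry per line: either a full URL or a `domain:` prefix to disavow every link from a site. Lines starting with # are comments. The domains below are placeholders, not real examples of spam:

```text
# Links I asked the site owners to remove, without success
# Disavow every link from an entire domain:
domain:spammy-links.example
# Disavow links from a single page:
http://another-site.example/bad-links-page.html
```

Once uploaded via the disavow tool in Search Console, the file replaces any previously submitted one, so keep a single master copy and re-upload it in full each time.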

Server log analysis: Search Console allows you to examine Google’s crawl rate and the number of pages crawled, but not the actual pages. You can use a variety of tools such as Splunk, Sumo Logic, Loggly or the Screaming Frog Log File Analyser to ascertain the actual pages crawled. I use the Screaming Frog Log File Analyser, which is free for smaller sites with about 100 daily crawl events. This process will help you determine whether Google has re-crawled a page that had a spammy link. If after a week or two that page has not been re-crawled by Googlebot, you can use the ‘Fetch as Google’ option in Google Search Console to request a re-crawl, so that any Penguin algorithm de-ranking effects can be re-evaluated.

The Penguin 4.0 algorithm update is a welcome development in the life of a search marketer. We no longer have to wait the best part of two years for the reassessment of websites. And since Google will no longer confirm future Penguin refreshes, we will have to keep a closer eye on our websites’ link profiles and rankings.
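If you would rather not use a dedicated tool, the same check can be sketched in a few lines of Python. This assumes your server writes the common Apache/NGINX “combined” log format; the sample below is a minimal illustration, not a production log parser, and the regex will need adjusting if your log format differs:

```python
import re
from datetime import datetime

# Matches one line of the assumed Apache/NGINX "combined" log format,
# capturing the timestamp, requested path and user-agent string.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"\S+ (?P<path>\S+) [^"]*" \d+ \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def last_googlebot_crawl(log_lines, path):
    """Return the datetime of the most recent Googlebot hit on `path`,
    or None if Googlebot has not requested it in these log lines."""
    latest = None
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m or m.group("path") != path:
            continue
        if "Googlebot" not in m.group("agent"):
            continue  # skip browsers and other crawlers
        ts = datetime.strptime(m.group("time"), "%d/%b/%Y:%H:%M:%S %z")
        if latest is None or ts > latest:
            latest = ts
    return latest
```

Run this over the page that carried the spammy link: if it returns None, or a date before you cleaned up the links, Googlebot has not yet seen the fixed page and a re-crawl request is worth submitting. Note that user-agent strings can be spoofed, so for a rigorous check Google recommends verifying Googlebot via reverse DNS.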
