Are You Stuck In Google’s Penalty Box?

Could Your Webmaster Tools Account Be Keeping You There?

Washington Capitals #25 Jason Chimera argues with the referee from the penalty box.

Anyone who has ever fallen foul of Google’s Manual Webspam reviewers will tell you there is no better feeling than hearing a manual action has been revoked. It’s like edging the puck past Dominik Hasek on the breakaway to snatch the Stanley Cup win in triple overtime.

The world looks different. You feel the rush of elation, savor the sense of relief and start seriously thinking about visitors flocking back to your site.

But what if nothing changes? What if recovery comes with a whimper and not a bang? Suddenly you find yourself stuck like Jason Chimera in that penalty box. What could be stopping your recovery? You’ve done your job as a Grinder. The work is done and you have the green light from the Webspam team.

Manual action revoked – problem solved.

Maybe. Maybe not.

First, it’s important to understand that recovery doesn’t happen instantly. In most cases you will not get back to pre-penalty traffic and visibility levels without a lot of extra work. In the case of unnatural links penalties, this is overwhelmingly the case. You might be out of the penalty box, but perhaps more is broken than you think.

Many links that influenced search visibility before the penalty will have been devalued or eliminated in the process of resolving it. Very few site owners are lucky enough to recover completely, with their sites returning to the same or greater visibility than they enjoyed pre-penalty.

Congratulations if you are one of the lucky few. If not, where do you go from here?

Start asking questions

The first question is “do I know everything there is to know?” This is where your Google Webmaster Tools account comes into play.

Chances are when you first noticed a traffic problem, you went to Google Webmaster Tools and checked the Manual Actions Viewer. If you already received the good news that a manual action was revoked, you should now see a "No manual webspam actions found" message when you choose Search Traffic -> Manual Actions in the left sidebar of Webmaster Tools.

The bad news is, this may not be the end of the story. You cannot know for sure that you have seen the last of manual actions unless you have claimed a Webmaster Tools profile for each active variant of your site.

What are variants?

There will always be a number of different versions of your website. That is just the nature of site structure. The two most obvious examples are the www and non-www versions within the http protocol. While you may have created just one set of files, the www version is actually a subdomain, and search engines recognize the two as separate and distinct versions of your site. We call them variants.

If you are actively using the https protocol to secure files on your server, you will have another two active site variants – the www and non-www secured versions. All subdomains will also be seen as a variant. Perhaps the two most common examples you may need to think about are the blog. and mobile. subdomains.
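To make this concrete, here is a minimal sketch (in Python, using the placeholder domain example.com) of how the variants multiply; each protocol and hostname combination below would need its own verified Webmaster Tools profile:

```python
from itertools import product

# Placeholder values for illustration; substitute your own domain and
# whichever subdomains (blog., mobile., etc.) you actually serve.
protocols = ["http", "https"]
hosts = ["example.com", "www.example.com", "blog.example.com"]

# Every protocol + hostname combination is a distinct variant
# in the eyes of a search engine.
variants = [f"{p}://{h}/" for p, h in product(protocols, hosts)]

for v in variants:
    print(v)
```

Even this modest setup yields six variants, which is why a single verified profile can leave you blind to most of your site.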

By now you’re thinking “that’s all very interesting, but what does it have to do with failing to recover from a penalty?”

If you missed Google's announcement that Webmaster Tools now handles each variant separately, or for whatever reason felt the change wasn't helpful to you, it's time to take a closer look at this quote from the post:

In order to see your data correctly, you will need to verify all existing variants of your site (www., non-www., HTTPS, subdirectories, subdomains) in Google Webmaster Tools. We recommend that your preferred domains and canonical URLs are configured accordingly.

Note that if you wish to submit a Sitemap, you will need to do so for the preferred variant of your website, using the corresponding URLs. Robots.txt files are also read separately for each protocol and hostname.
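To illustrate that last point (again using the placeholder domain example.com), each protocol and hostname combination reads its own robots.txt, and the Sitemap directive should point at URLs on your preferred variant:

```
# Served at https://www.example.com/robots.txt and applies ONLY to
# the https + www variant; every other variant reads its own file.
User-agent: *
Disallow:

# Sitemap URLs should use the preferred variant of the site.
Sitemap: https://www.example.com/sitemap.xml
```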

What they didn’t say

What isn’t mentioned in the post is that Manual Spam Actions and the Disavow Links Tool are also handled separately within each site variant. What does this mean? Put simply, if you cannot see all active variants in Google Webmaster Tools, you cannot know that there are no more manual actions in place.

Let’s put that another way:

It is entirely possible there are manual actions that you are completely unaware of.

Think about that for a moment. You worked hard, fixed a bunch of problems and at the end of it all your manual action was revoked. You're fairly sure there are no major technical issues on the site, but the long-awaited climb to recovery just is not happening.

Let’s face it. Things are going nowhere. Time to claim profiles for each active variant and check the Manual Actions Viewer in each. Now you will see whether unseen manual actions are at the heart of the problem.

Adding new profiles in Google Webmaster Tools

It’s simple to add a new profile to your Webmaster Tools account. Log in and click Add A Site at the top right side of the account dashboard.

When you click this button you will see a screen which allows you to type in the specific variant to create your new profile. In this case we are creating a variant for the secured, non-www version of the site.

What happens next?

Information in the Manual Actions Viewer usually propagates to new variant profiles almost immediately, so it won’t take long for you to see whether there are other manual actions in the mix.

While manual actions for unnatural linking are what most webmasters think of first, there are actually twelve different manual actions in the webspam reviewers’ arsenal.

If you discover manual actions in other variant profiles, you will need to regroup and prepare for another cleanup effort. If you have already resolved a manual action of the same type, that is good news: you know exactly what you need to do to fix the new problems.

What if there are no manual actions?

Once you are sure there are no more manual actions in play, you need to think about other issues that affect site visibility.

If the original manual action was caused by unnatural linking, it’s likely site visibility is also being influenced by the Penguin algorithm. Since this is a purely algorithmic adjustment, there is no appealing to the referee. Even if they adjudicated in your favor, there is no way to override Penguin. The only sensible way to approach the problem now is to eliminate the influence of any remaining unnatural links with the Disavow Links Tool.

Again, disavow submissions are handled separately for each site variant. This means you need to submit a disavow file for each variant to ensure that Googlebot recognizes the disavow across the entire site.

Remember to follow best practice recommendations when creating disavow files. This means disavowing entire domains unless you have a very good reason not to.

If you are uncertain about creating disavow files use this free tool. All you need to do is upload a list of links to generate a valid entire domain disavow file. The tool takes care of all the formatting for you. Grab the new file and you are ready to upload to Google’s Disavow Links Tool.
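If you would rather see what such a tool does under the hood, here is a minimal sketch (hypothetical code, not the tool mentioned above) that collapses a list of link URLs into a domain-level disavow file in the format Google's Disavow Links Tool accepts:

```python
from urllib.parse import urlparse

def build_disavow(urls):
    """Collapse a list of link URLs into one 'domain:' line per host,
    the broad, entire-domain form recommended as best practice."""
    domains = sorted({urlparse(u).hostname for u in urls if urlparse(u).hostname})
    lines = ["# Domain-level disavow file generated from a link list"]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines)

# Hypothetical example links, for illustration only.
links = [
    "http://spammy-directory.example/widgets/page1.html",
    "http://spammy-directory.example/widgets/page2.html",
    "https://paid-links.example/our-sponsors",
]
print(build_disavow(links))
```

Note that many URLs on the same host collapse to a single domain: line, which is exactly the entire-domain approach described above.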

This is all very confusing, where do I start?

We thought things might get a little crazy, so we made this Penalty Resolution Flowchart to help coach you through to the end. Just follow the steps and it will help guide you safely to the final minutes of the game.

Does this mean I get more link data?

Strangely, there are just a few places where Webmaster Tools profiles for site variants do not return individualized data, and link data is one of them: it is only returned for one variant of a site. Just when you thought you might finally catch a break from the Webmaster Tools team, your hopes are dashed. When you click “Links to My Site” in other variant profiles, Google Webmaster Tools invariably returns “No Data.” You won’t be enjoying expanded link data from those variant profiles.

Mystery solved

Sid the Kid Kisses the Stanley Cup

There has long been debate about whether a manual action can be resolved using link data from Google Webmaster Tools alone. There are certainly people who have successfully resolved manual penalties using nothing other than Webmaster Tools link data.

For the vast majority of sites the solution is not that simple. A much broader data mining effort is required to uncover adequate samples of unnatural links.

Many wonder why some sites seem to benefit from more accurate and useful data than others. Some speculate that the size of a site could be a factor. This seems reasonable, since Google limits the amount of data available in the download. Still, there seems to be more happening here than meets the eye. Why do some webmasters feel they are getting a worse deal from their Webmaster Tools account than others?

Actually, the answer is simple. The link data download contains only links that point to the specific variant. If you happen to have verified the non-www variant and links have been built primarily to that variant, then the link data provided in Webmaster Tools will be quite useful.

If, on the other hand, you verified that same non-www variant, but all unnatural links to your site point only to the blog. subdomain, it makes sense that there will be significant deficiencies in the data you get from Google. This explains why adding link data from third-party tools and other known link sources naturally improves the effectiveness of link cleanup projects.

There are many more implications in this variant model for Google Webmaster Tools whether you are dealing with manual actions or algorithmic effects.

It is easy to see the advantages for some in handling site data this way, but for sites impacted by manual actions it can be disastrous.

Google recently invited us to tell them what we would like to see from Websearch and Webmaster Tools in 2015. The long list of contributors suggesting they do away with this variant-specific approach gives the impression that many feel something is broken. In a recent Office Hours Hangout, John Mueller made it clear these suggestions will be looked at when planning for the future. That’s John’s way of saying don’t expect them to be implemented any time soon.

So if you’re going to stay in the game long enough to lift that trophy, you need to chase down as much site data as you can and pay attention to variants when working on your link cleanup.