The Pandeguin Penalty: What to do if your website has been hit by both Penguin and Panda

If you’ve been following my blog recently, then you know how much Penguin analysis I’ve been performing. Since the algorithm update first hit on 4/24, I’ve been working hard at analyzing websites impacted by Penguin. You can read my previous posts to learn more, including my most recent post detailing my findings based on analyzing over 60 websites hit by Penguin. Well, now I’ve analyzed over 80 websites hit by the update, and I decided to write a new post covering a tough subject. Unfortunately, I’ve had several companies reach out to me that have run into the perfect storm of algorithm updates, so I thought it would be helpful to cover it today. More on what the perfect storm is soon.

Penguin or Panda?

Many of the companies contacting me about the update automatically believe they were hit by Penguin. That’s not shocking, considering how much Penguin coverage there has been since 4/24. But what many companies aren’t aware of is that Google rolled out a Panda update on 4/19, and then a Panda refresh on 4/27. Panda is another algorithm update, initially launched in early 2011, and I’ll cover it in greater detail soon. In addition, it seems there was another unofficial update around 5/11, which appears to have affected sites that were previously hit by Panda. Google denied a Penguin or Panda update when asked about it, but I can tell you that there was some type of update.

So, with the algorithm sandwich in full effect, I’m seeing a lot of confusion with webmasters not understanding which update actually hit their websites. That led to my latest Search Engine Journal post about how to determine which algorithm update hit your website. Based on the popularity of that post, you can tell how big of a problem it was… And knowing which update hit your site is extremely important, since Penguin and Panda target different problems. I’ll explain more about each update a little later in this post.

Introducing Pandeguin – Fear the Beast

During my analysis, I’ve unfortunately come across several situations where websites were not only hit by Panda or Penguin, but instead, they were hit by both Panda and Penguin. Yes, this is the worst possible scenario for a website, based on the recent algo updates. These sites presumably had low quality, thin content, in addition to having horrible link profiles. Needless to say, it’s critically important to know that you’ve been hit by both in order to rectify the situation.

Pandeguin Trending:

After explaining the Pandeguin situation to webmasters hit by both Panda and Penguin, a long period of silence typically followed. Then came the question, “OK, now what should I do?” That’s a great question, and the answer depends on the site in question. Penguin is more acute, while Panda is deeper and broader. That said, there are some top-level recommendations I would advocate following for webmasters hit by Pandeguin. Before I cover those bullets, let’s take a step back and quickly review Panda and Penguin 1.0.

What is Panda? A Primer

Panda was first released in February of 2011 and targets low quality content. It’s a rolling update, which means it is rolled out periodically (typically once per month). This means that if you were hit by Panda and made changes to rectify the situation, you wouldn’t know if those changes worked until the next update gets rolled out.

Many sites have been affected by Panda, and there wasn’t a hard rule about why specific sites were getting hit. That led to a lot of confusion. For example, there were sites hit with duplicate content, thin content, affiliate content, scraped content, etc. Webmasters were forced to take an extensive look at their sites and content and make hard decisions. For example, gutting content, or moving it to another domain, subdomain, etc. Should they 301 the URLs, or 404 them? The confusion led to Google releasing the famous 23 questions that webmasters should ask themselves about the quality of their content. Although helpful, Google still didn’t clearly explain what was causing a site to get caught in the Panda filter.

What is Penguin? A Primer

In March of 2012, Google began hinting that a major update targeting “over optimization” would be rolling out soon. Nobody knew when it would roll out, what the update would target, etc. We just knew that Google was going to target webspam. When the update first rolled out, many called it the “Over Optimization Penalty”, which then turned into the “Webspam Algorithm Update”, and then was officially called “Penguin” by Google.

After performing heavy analysis once Penguin rolled out, it became extremely apparent that the update was very inbound link-heavy. Although there are many forms of webspam, unnatural inbound links were absolutely hammered. After analyzing 80+ websites, I can tell you that inbound links are the core problem being targeted by Penguin 1.0. Now, I fully expect future versions of Penguin to target additional types of webspam, so inbound links are just the start (in my opinion). Like Panda, Penguin will be rolled out periodically. You won’t know if the changes you implement actually work until Penguins come knocking again.

As you can see, the two algorithm updates are very different. As a webmaster, you don’t want to fix low quality content when you’re hit by Penguin and you don’t want to fix inbound links if you’ve been hit by Panda. But what about if you were hit by both updates? As I said earlier, I’ve had several companies reach out to me that were hit by both. Needless to say, they have a lot of work to do. But where do they start? Let’s take a look at some top-level recommendations for sites hit by Pandeguin.

Top-level recommendations for companies hit by both Panda and Penguin:

1. Start with Penguin, it’s more focused at this point:

As I explained earlier, Penguin 1.0 was more focused on inbound links. If you were hit by Penguin 1.0, chances are you had a poor link profile filled with unnatural links. I would begin here, and start to analyze and then prune links. Perform an inbound link analysis and organize your links by quality. Then target the ones you want to nuke, and then execute. Panda is a deeper algorithm update at this point and requires much more analysis and work. Start with Penguin and move quickly.
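To make the "organize your links by quality" step concrete, here is a minimal sketch of a link triage script. The CSV columns, domains, and authority scores below are hypothetical placeholders, not the output of any specific tool; in practice you would export inbound link data from whichever link analysis tool you use and adapt the column names and thresholds to your own situation.

```python
import csv
import io

# Hypothetical inbound link export -- the column names and scores here
# are assumptions for illustration, not a real tool's format.
raw_csv = """source_domain,anchor_text,domain_authority
example-directory.com,cheap widgets,12
news-site.com,Widget Co review,65
spammy-links.net,buy widgets online,5
forum-profile.org,widgets,30
industry-blog.com,widget manufacturing guide,48
"""

def triage_links(csv_text, low_threshold=20, high_threshold=40):
    """Bucket linking domains into remove / review / keep by a quality score."""
    buckets = {"remove": [], "review": [], "keep": []}
    for row in csv.DictReader(io.StringIO(csv_text)):
        score = int(row["domain_authority"])
        if score < low_threshold:
            buckets["remove"].append(row["source_domain"])
        elif score < high_threshold:
            buckets["review"].append(row["source_domain"])
        else:
            buckets["keep"].append(row["source_domain"])
    return buckets

buckets = triage_links(raw_csv)
print(buckets["remove"])  # the domains to target for link removal first
```

The thresholds are deliberately arbitrary; the point is simply to turn a large link export into a prioritized removal list before the next Penguin update rolls out.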

2. Move to Panda, it’s a deeper update:

Since there are a number of problems that could have caused Panda to hit your site, you really should have a professional SEO analyze your website. I’ve written previously about SEO audits here on my blog, and I’m a firm believer that audits are the most powerful deliverable in all of SEO. You need to determine the risks on your website from a Panda standpoint. Is there duplicate content, is it just thin content, does your site look too affiliate-heavy to Google, are you scraping content, etc.? Once you fully understand your current state, you can start to form a plan of attack. Panda changes could be more complex, depending on what you need to refine. It’s not as simple as pruning links. You might need to develop an entirely new strategy for your website or business.

3. Execute, Wait, and Adjust

As I mentioned earlier, both updates will be rolled out periodically. This means you need to wait until they are rolled out to know if the changes have succeeded with lifting the penalty. This also means you need to move quickly. If each update is rolled out monthly, then you need to analyze the problem, map out changes, and execute those changes before the next update. If not, you can miss your window of opportunity. If you miss the window, you might blow an entire month. If your business relies on Google traffic to survive, that can be extremely costly. Once the updates roll through, you can determine what worked and what didn’t. Then you need to adjust quickly. I wish Panda and Penguin were live all the time, but they aren’t at this point.

4. Get Search Analytics In Order

As you can imagine, in order to accurately analyze the situation, you need your search analytics in order. That includes your analytics package, like Google Analytics, Omniture, WebTrends, etc. In addition, you should have Google Webmaster Tools and Bing Webmaster Tools set up. If you are using Google Analytics, you can create advanced segments for various categories of organic keywords. That will enable you to quickly analyze core sets of data. In addition, you should use annotations to document changes in GA. You also might want to set up custom reports, based on your own organic search situation.

In Google Webmaster Tools, you should be tracking a number of items, including the Search Queries report (which will show you the number of impressions, clicks, average position, etc. for queries that returned your site in Google search results). You can also see the percentage of change for core metrics in this report. You should also be exporting your data from Google Webmaster Tools, since the data only goes back 90 days.
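Because Google Webmaster Tools only retains 90 days of data, it helps to merge each export into a running archive you control. Here is a minimal sketch of that idea; the CSV column names and query data are hypothetical placeholders, not Google's exact export format, so adapt them to whatever the downloaded file actually contains.

```python
import csv
import io
from datetime import date

# Hypothetical Search Queries export -- column names here are
# assumptions for illustration, not Google's exact CSV headers.
todays_export = """query,impressions,clicks,avg_position
widget reviews,1200,85,4.2
buy widgets,900,40,7.8
"""

def archive_export(history, csv_text, snapshot_date):
    """Merge a dated export into a running archive keyed by (date, query)."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        history[(snapshot_date, row["query"])] = {
            "impressions": int(row["impressions"]),
            "clicks": int(row["clicks"]),
            "avg_position": float(row["avg_position"]),
        }
    return history

history = {}
archive_export(history, todays_export, date(2012, 5, 28).isoformat())
# Run on a schedule (e.g. a weekly cron job) so data older than the
# 90-day window is never lost. Persist `history` to disk or a database.
```

Keeping a dated archive like this lets you compare query performance before and after each Panda or Penguin rollout, long after the 90-day window has expired.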

Summary – Overcoming Pandeguin

Based on speaking with many webmasters since Penguin hit, I know how frustrating it can be when you’ve been impacted by an algorithm update. But some business owners have a bigger problem to deal with, namely Pandeguin. If you’ve been hit by both algorithm updates, then follow the recommendations I provided in this post to begin building back search traffic from Google. It might be a long road back, but you need to start somewhere. Good luck.

I still don’t understand why you support Google Analytics that much. Yes, it is a great product, but at the same time you give Google more data that can bite you in the rear. Google is only so powerful because of the amount of data it collects from all the different sources. The more you feed them, the more they can hurt you.

Thanks for your comment. I can’t argue with your main point. I love Google Analytics, and it has increased in power over the years. But, it is another mechanism Google can leverage to understand your site, user activity, etc.

There are plenty of other packages out there, though. You are absolutely free to use them. :)

IMHO it’s not in Google’s best interest to “hurt” any legitimate site, that’s not what their intention is.

The data they collect is used to determine which of the websites in THEIR index provide the most valuable information, and which of those results are a “best match” when compared to the search terms used. It’s necessary for all search engines to get this right in order to survive.

Why give them more data? Again, IMHO, if you own, administer or otherwise optimise a truly legitimate site, then providing more data to Google can only help to prove your site’s legitimacy. In the long term, this can only help your site to appear higher in the results than those sites that seem relevant but less legitimate (spammy).

There seem to be a lot of people out there who think that Google is “out to get them” or that Google will somehow use their data against them. Here’s the bottom line:

* If you know that a search engine won’t like what you’re doing, you’re doing it wrong.
* If you are doing things the right way and are concerned that a Google update could destroy your (or your clients’) business by temporarily moving you down in the search results, then it’s time to rethink your business strategy.

Syates, you make some great points. Many people are simply concerned with how much data Google is able to collect about websites (and individuals). But I agree with your point about “doing things right” and that you shouldn’t be worried if you are. I’m a big fan of Google Analytics (and you’ll find several of my posts in the GA Help Center). :)

Thanks for your comment, Al. If a company has been buying links, gaining unnatural links, etc., then it should definitely look to remove what it can. Penguin was hyper-focused on unnatural links, so rectifying the situation would be a smart thing to do before the next Penguin update rolls out.