The Danger of Crossing Algorithms: Uncovering The Cloaked Panda Update During Penguin 3.0

The author's views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Penguin 3.0 was one of the most anticipated algorithm updates in recent years when it rolled out on October 17, 2014. Penguin hadn't run for over a year at that point,
and there were many webmasters sitting in Penguin limbo waiting for recovery. They had cleaned up their link profiles, disavowed what they could, and were
simply waiting for the next update or refresh. Unfortunately, Google was wrestling with the algo internally and over twelve months passed without an
update.

So when Pierre Far finally
announced Penguin 3.0 a few days later on October 21, a few things
stood out. First, this was
not a new algorithm, as Gary Illyes had explained it would be at SMX East. It was a refresh, and it underscored
the potential problems Google was battling with Penguin (cough, negative SEO).

Second, we were not seeing the impact that we expected. The rollout seemed to begin with a heavier international focus, and the overall U.S. impact was
underwhelming, to say the least. There were definitely many fresh hits globally, but there were a number of websites that should have recovered but didn't
for some reason. And many are still waiting for recovery today.

Third, the rollout would be slow and steady and could take weeks to fully complete. That's unusual, but makes sense given the microscope Penguin 3.0 was
under. And this third point (the extended rollout) is even more important than most people think. Many webmasters are already confused when they get hit
during an acute algorithm update (for example, when an algo update rolls out on one day). But the confusion gets exponentially worse when there is an
extended rollout.

The more time that goes by between the initial launch and the impact a website experiences, the more questions pop up. Was it Penguin 3.0 or was it
something else? Since I work heavily with algorithm updates, I've heard similar questions many times over the past several years. And the extended Penguin
3.0 rollout is a great example of why confusion can set in. That's my focus today.

Penguin, Pirate, and the anomaly on October 24

With the Penguin 3.0 rollout, we also had
Pirate 2 rolling out. And yes, there are
some websites that could be impacted by both. That added a layer of complexity to the situation, but nothing like what was about to hit. You see, I picked
up a very strange anomaly on October 24. And I clearly saw serious movement on that day (starting late in the day ET).

So, if there was a third algorithm update, then that's
three potential algo updates rolling out at the same time. More about this soon,
but it underscores the confusion that can set in when we see extended rollouts, with a mix of confirmed and unconfirmed updates.

Penguin 3.0 tremors and analysis

Since I do a lot of Penguin work, and have researched many domains impacted by Penguin in the past, I heavily studied the Penguin 3.0 rollout. I
published a blog post based on the first ten days of the update, which included some interesting findings for sure.

And based on the extended rollout, I definitely saw Penguin tremors beyond the initial October 17 launch. For example, check out the screenshot below of a
website seeing Penguin impact on October 17, 22, and 25.

But as mentioned earlier, something else happened on October 24 that set off sirens in my office. I started to see serious movement on sites impacted by
Panda, and not Penguin. And when I say serious movement, I'm referring to major traffic gains or losses all starting on October 24. Again, these were sites heavily dealing with Panda that had
clean link profiles. Check out the trending below from October 24 for several
sites that saw impact.

A good day for a Panda victim:

A bad day for a Panda victim:

And an incredibly frustrating day for a 9/5 recovery that went south on 10/24:

I saw this enough that I tweeted heavily about it and
included a section about Panda in my Penguin 3.0 blog post. And
that's when something wonderful happened, and it highlights the true beauty and power of the internet.

As more people saw my tweets and read my post, I started receiving messages from other webmasters explaining that
they saw the same exact thing, and on their websites dealing with Panda and not Penguin. And not only
did they tell me about it, they
showed me the impact.

I received emails containing screenshots and tweets with photos from Google Analytics and Google Webmaster Tools. It was amazing to see, and it confirmed
that we had just experienced a Panda update in the middle of a multi-week Penguin rollout. Yes, read that line again. Panda during Penguin, right when the
internet world was clearly focused on Penguin 3.0.

That was a sneaky move, Google… very sneaky. :)

So, based on what I explained earlier about webmaster confusion and algorithms, can you tell what happened next? Yes, massive confusion ensued. We had the
trifecta of algorithm updates with Penguin, Pirate, and now Panda.

Webmaster confusion and a reminder of the algo sandwich from 2012

So, we had a major algorithm update during two other major algorithm updates (Penguin and Pirate) and webmaster confusion was hitting extremely high
levels. And I don't blame anyone for being confused. I'm neck deep in this stuff and it confused me at first.

Was the October 24 update a Penguin tremor or was this something else? Could it be Pirate? And if it was indeed Panda, it would have been great if Google told
us it was Panda! Or did they want to throw off SEOs analyzing Penguin and Pirate? Does anyone have a padded room I can crawl into?

Once I realized this was Panda, and started to communicate the update via Twitter and my blog, I had a number of people ask me a very important question:

"Glenn, would Google really roll out two or three algorithm updates so close together, or at the same time?"

Why yes, they would. Anyone remember the algorithm sandwich from April of 2012? That's when Google rolled out Panda on April 19, then Penguin 1.0 on April 24,
followed by Panda on April 27. Yes, we had three algorithm updates all within ten days. And let's not forget that the Penguin update on April 24, 2012 was the
first of its kind! So yes, Google can, and will, roll out multiple major algos around the same time.

Where are we headed? It's fascinating, but not pretty

Panda is near real-time now

When Panda 4.1 rolled out on September 23, 2014, I immediately disliked the title and version number of the update. Danny Sullivan named it 4.1, so it stuck. But for
me, that was not 4.1… not even close. It was more like 4.75. You see, there have been a number of Panda tremors and updates since P4.0 on May 20,
2014.

I saw what I was calling "tremors"
nearly weekly based on having access to a large amount of Panda data (across sites, categories, and countries).
And based on what I was seeing, I reached out to John Mueller at Google to clarify the tremors. John's response was great and confirmed what I was seeing.
He explained that there
was not a set frequency for algorithms like Panda. Google can roll out an algorithm, analyze the
SERPs, refine the algo to get the desired results, and keep pushing it out. And that's exactly what I was seeing (again, almost weekly since Panda 4.0).

When Panda and Penguin meet in real time…

…they will have a cup of coffee and laugh at us. :) So, since Panda is near-real time, the crossing of major algorithm updates is going to happen.
And we just experienced an important one on October 24 with Penguin, Pirate, and Panda. But it could (and probably will) get more chaotic than what we have now.
We are quickly approaching a time where major algorithm updates crafted in a lab will be unleashed on the web in near-real time or in actual real time.

And if organic search traffic from Google is important to you, then pay attention. We're about to take a quick trip into the future of Google and SEO. And
after hearing what I have to say, you might just want the past back…

Google's brilliant object-oriented approach to fighting webspam

I have presented at the past two SES conferences about Panda, Penguin, and other miscellaneous disturbances in the force. More about those "other
disturbances" soon. In my presentation, one of my slides looks like this:

Over the past several years, Google has been using a brilliant, object-oriented approach to fighting webspam and low quality content. Webspam engineers can
craft external algorithms in a lab and then inject them into the real-time algorithm whenever they want. It's brilliant because it isolates specific
problems, while also being extremely scalable. And by the way, it should scare the heck out of anyone breaking the rules.

For example, we have Panda, Penguin, Pirate, and Above the Fold. Each was crafted to target a specific problem and can be unleashed on the web whenever
Google wants. Sure, there are undoubtedly connections between them (either directly or indirectly), but each specific algo is its own black box. Again,
it's object-oriented.

Now, Panda is a great example of an algorithm that has matured to where Google highly trusts it. That's why Google announced in June of 2013 that Panda
would roll out monthly, over ten days. And that's also why it matured even more with Panda 4.0 (and why I've seen tremors almost weekly).

And then we had Gary Illyes explain that Penguin was moving along the same path. At SMX East,
Gary explained that the new Penguin algorithm (which clearly didn't roll out on October 17) would be structured in a way where subsequent updates could be rolled out more easily.
You know, like Panda.

And by the way, what if this happens to Pirate, Above the Fold, and other algorithms that Google is crafting in its Frankenstein lab? Well my friends, then
we'll have absolute chaos and society as we know it will crumble. OK, that's a bit dramatic, but you get my point.

We already have massive confusion now… and a glimpse into the future reveals a continual flow of major algorithms running in real time, each of which
could pummel a site into the ground. And of course, with little or no sign of which algo actually caused the destruction. I don't know about you, but I just
broke out in hives. :)

Actual example of what (near) real-time updates can do

After Panda 4.0, I saw some very strange Panda movement for sites impacted by recent updates. And it underscores the power of near-real time algo updates.
As a quick example,
temporary Panda recoveries can happen if you
don't get out of the gray area enough. And now that we are seeing Panda tremors almost weekly, you can experience potential turbulence several times per
month.

Here is a screenshot from a site that recovered from Panda, didn't get out of the gray area, and reentered the strike zone just five days later.

Holy cow, that was fast. I hope they didn't plan any expensive trips in the near future. This is exactly what can happen when major algorithms roam the web
in real time. One week you're looking good and the next week you're in the dumps. Now, at least I knew this was Panda. The webmaster could tackle more
content problems and get out of the gray area… But the ups and downs of a Panda roller coaster ride can drive a webmaster insane. It's one of the
reasons I recommend making
significant changes when
you've been hit by Panda. Get as far out of the gray area as possible.

An "automatic action viewer" in Google Webmaster Tools could help (and it's actually being discussed internally by Google)

Based on webmaster confusion, many have asked Google to create an "automatic action viewer" in Google Webmaster Tools. It would be similar to the "manual
actions viewer," but focused on algorithms that are demoting websites in the search results (versus penalties). Yes, there is a difference by the way.

The new viewer would help webmasters better understand the types of problems being targeted by algorithms like Panda, Penguin, Pirate, Above the
Fold, and others. Needless to say, this would be incredibly helpful to webmasters, business owners, and SEOs.

So, will we see that viewer any time soon? Google's John Mueller
addressed this question during the November 3 webmaster hangout (at 38:30).

John explained they are trying to figure something out, but it's not easy. There are so many algorithms running that they don't want to provide feedback
that is vague or misleading. But, John did say they are discussing the automatic action viewer internally. So you never know…

A quick note about Matt Cutts

As many of you know, Matt Cutts took an extended leave this past summer (through the end of October). Well, he announced on Halloween that he is
extending his leave into 2015. I won't go crazy here talking about his decision overall, but I will
focus on how this impacts webmasters as it relates to algorithm updates and webspam.

Matt does a lot more than just announce major algo updates… He actually gets involved when collateral damage rears its ugly head. And there's not a
faster way to rectify a flawed algo update than to have Mr. Cutts involved. So before you dismiss Matt's extended leave as uneventful, take a look at the
trending below:

Notice the temporary drop off a cliff, then 14 days of hell, only to see that traffic return? That's because Matt got involved. That's the
movie blog fiasco from early 2014 that I heavily analyzed. If
Matt was not notified of the drop via Twitter, and didn't take action, I'm not sure the movie blogs that got hit would be around today. I told Peter from
SlashFilm that his fellow movie blog owners should all pay him a bonus this year. He's the one that pinged Matt via Twitter and got the ball rolling.

It's just one example of how having someone with power out front can nip potential problems in the bud. Sure, the sites experienced two weeks of utter
horror, but traffic returned once Google rectified the problem. Now that Matt isn't actively helping or engaged, who will step up and be that guy? Will it
be John Mueller, Pierre Far, or someone else? John and Pierre are greatly helpful, but will they go to bat for a niche that just got destroyed? Will they
push changes through so sites can turn around? And even at its most basic level, will they even be aware the problem exists?

These are all great questions, and I don't want to bog down this post (it's already incredibly long). But don't laugh off Matt Cutts taking an extended
leave. If he's gone for good, you might only realize how important he was to the SEO community after he's gone. And hopefully that won't be because
your site just tanked as collateral damage during an algorithm update while Matt was off running a marathon or trying on new Halloween costumes. Then where will you be?

Recommendations moving forward

So where does this leave us? How can you prepare for the approaching storm of crossing algorithms? Below, I have provided several key bullets that I think
every webmaster should consider. I recommend taking a hard look at your site
now, before major algos are running in near-real time.

Truly understand the weaknesses with your website. Google will continue crafting external algos that can be injected into the real-time algorithm.
And they will go real-time at some point. Be ready by cleaning up your site now.

Document all changes and fluctuations the best you can. Use annotations in Google Analytics and keep a spreadsheet updated with detailed
information.

Along the same lines, download your Google Webmaster Tools data monthly (at least). After helping many companies with algorithm hits, that
information is incredibly valuable, and can help lead you down the right recovery path.

Use a mix of audits and focus groups to truly understand the quality of your site. I mentioned in my post about
aggressive advertising and Panda
that human focus groups are worth their weight in gold (for surfacing Panda-related problems). Most business owners are too close to their own content and
websites to accurately measure quality. Bias can be a nasty problem and can quickly lead to bamboo-overflow on a website.

Beyond on-site analysis, make sure you tackle your link profile as well. I recommend heavily analyzing your inbound links and weeding out unnatural
links. And use the disavow tool for links you can't remove. The combination of enhancing the quality of your content, boosting engagement, knocking down
usability obstacles, and cleaning up your link profile can help you achieve long-term SEO success. Don't tackle one quarter of your SEO problems. Address
all of them.

Remove barriers that inhibit change and action. You need to move fast. You need to be decisive. And you need to remove red tape that can bog down
the cycle of getting changes implemented. Don't water down your efforts because there are too many chefs in the kitchen. Understand the changes that need
to be implemented, and take action. That's how you win SEO-wise.

Summary: Are you ready for the approaching storm?

SEO is continually moving and evolving, and it's important that webmasters adapt quickly. Over the past few years, Google's brilliant object-oriented
approach to fighting webspam and low quality content has yielded algorithms like Panda, Penguin, Pirate, and Above the Fold. And more are on their way. My
advice is to get your situation in order now, before crossing algorithms blend a recipe of confusion that makes it exponentially harder to identify, and
then fix, problems riddling your website.

Now excuse me while I try to build a flux capacitor. :)

About Glenn Gabe —
Glenn Gabe is a digital marketing consultant at G-Squared Interactive and focuses heavily on advanced SEO, SEM, social advertising, and web analytics. You can read more of Glenn’s posts on his blog, The Internet Marketing Driver. You can also follow Glenn on Twitter and Google+.

Yes, one thing is sure: Google is going to cause confusion now that it has rolled out the Penguin 3.0 update, because they don't know how to handle it. On the other hand, we should always be ready to face these kinds of problems in the Search Engine Optimization field, because Google can release any kind of new update at any time, and it could directly affect rankings.

Something clearly has gone wrong with Penguin. First, it was a refresh when it was supposed to be a new algo. Second, we have the multi-week rollout. And then we have many sites expecting some type of recovery, but they haven't seen any at all. It's not good. :)

I'm seeing the exact same thing on one of our websites. It couldn't be Penguin, and looking at the things we don't do quite right yet, it's got to be content-related, and thus Panda instead of Penguin.

Was the impact on 10/24? I'm glad you are clear which algo hit the site. And I agree, if it was 10/24, then you are looking at Panda.

Regarding confirmation, Google has only been confirming major updates (like Panda 4.0, 4.1, Penguin 3.0, etc.) The subsequent Panda tremors/updates haven't been confirmed, although they are clear based on having access to a lot of Panda data. And thank goodness for that. :) Expect more posts from me soon about the next wave of updates.

Great post, Glenn, and terrific job keeping all the P-words very clear and organized.

Around here we've been seeing something that almost looks like anti-Panda in a few verticals, with bare-bones low-quality pages displacing entries from multiple competitors. With Penguin & Panda both rolling out, there is tons of confusion to be had.

Also, kudos on your visuals here. Mad scientist leads to madder scientist leads to zombie apocalypse and a world without Matt Cutts.

It is dangerous to rely on what others are speculating to diagnose whether a hit was Penguin or Panda. Look at your own site: when it was hit, and what parameters the site has (spammy links vs. terrible user metrics or quality, from my list at themoralconcept.net/pandalist.html, for example), and err on the side of "it could be Panda," because fixing that means making your site better, which is a good expense anyway.
Fixing Penguin means deleting links and likely damaging your ranking. There is no proof to show a naked disavow does a DAMN thing.

I also think that in our industry it is important to begin to draw the separation between an SEO and a technical SEO in an effort to better educate the client. There are just too many self proclaimed SEOs that really haven't a clue when it comes to the nuts and bolts of how search engines/algos operate.

Josh, totally agree that webmasters should not make decisions solely based on another site's data. I've always said you need a lot of Panda data to pick up the unconfirmed updates (and there have been many). Unfortunately, most webmasters don't have access to that, but I do (across sites, categories, and countries). I only post about updates when I'm sure there's been one (by checking all of the sites I have access to).

But the combination of trusted sources of data and the realization that your own site got hit can lead someone down the right path.

Also, I've found many business owners and webmasters are too close to their own sites and content to objectively measure "quality". And I've seen many go down the wrong path, based on their own analysis. And that leads to spinning SEO wheels, which results in a waste of time, resources, and money. That's why it's important to have someone helping you that has worked heavily with algo updates (whether that's internally or someone from the outside).

But as I mentioned in my post, we are approaching a time where multiple major algos will run in near-real time or actual real time. So to me, it's important that sites rectify their problems NOW.

Don, I totally agree. When I was leading SEO in-house and at a large agency, I used to have a ten question test that would quickly enable me to identify the level someone was at (from a technical SEO standpoint). If they did well there, then we could go deeper. But many times, the interview would end right after the test.

For large-scale websites, in particular, the wrong moves could destroy SEO. It's not for the faint of heart. :) I unfortunately see this often during SEO technical audits. It's not pretty.

If you'd be willing to share, Glenn, I'd love to know what those ten questions were. ;) I know there are areas in which I could be stronger... knowing which you find most important could be educational.

Thanks for your comment Doc. That's a great idea, and maybe will turn into another blog post. :) I haven't updated that list in a while (I used it when I was running SEO in-house and then when I was at a large agency). But I think I'll update the list and then post it.

It was really helpful, since it concisely tackled tech SEO items that are important, recent news from the search engines, and also incorporated some real-world scenarios. Again, after the ten questions, I had a solid feel for where the person was. And then we could always go deeper if all went well. But for advanced level positions, I needed to make sure their SEO knowledge was sound (and up to date).

Great post, really enjoyed reading it. When I first read the title, I imagined a Ghostbusters image and "don't cross the streams"!

You've made the future look pretty bad. I'm sure Google isn't going to share much about their updates for fear of manipulation. They claim they want to help smaller companies, but a lot of the time the information never really trickles down to help them.

Sorry if I missed it, but did you work out what the "third" update was, or was it just another Panda slap?

I also wanted to point out that Matt Cutts mentioned another reason for his extension: he didn't want to be a "lightning rod" for black hat anger, which I really get. He got a lot of grief for algo updates, etc. I can imagine it must really wear on you.

Chris, I think the future of Google/SEO is going to be fascinating. I wouldn't say it's going to be bad... but the reality of major algos all running in near-real time or actual real time is going to make it exponentially harder to identify and rectify problems. So maybe "scary" is the right word. :)

The object-oriented approach I mentioned above is brilliant and can enable Google to isolate specific problems/areas. I fully expect Google to keep moving down that path. That's why I think it's ultra-important to fix problems NOW, before more major algorithms go real-time (and we have several that can cross at any time).

The 10/24 update was Panda. I have a boatload of data backing that up (sites seeing significant movement were battling Panda, not Penguin, and they had clean link profiles). I hope that helps!

Regarding the image, a screenshot from Ghostbusters would have been awesome! Actually, I had a number of possible blog post titles... and each could have yielded another movie. :) Thanks for your comment.

I too noticed a Panda-like effect on one domain in particular which started on 18th October, bang on the launch of Penguin 3.0. In this instance it was negative movement. I gave it a few days to let the dust settle before performing a thorough analysis of the decline.

On initial glance, the date of the drops (which correlates directly with the launch of Penguin) could have easily led to Penguin being labelled as the cause, but it just didn't add up. The link profile in itself is relatively clean; sure, there are a handful of old-school-type links which may be flagged as lesser quality than desired, but it's not the type of link profile you would immediately associate with Penguin-related factors.

We noted drops on several long-tail keywords on deep internal pages which had no associated links, let alone ones of a poor quality. It crossed my mind that the drops on the long-tail keywords could have been the knock-on effect of devaluation elsewhere, but core internal pages as well as the homepage of the website remain unaffected.

Moreover the long-tail keywords in question were synonyms or variants of other keywords which were poorly reflected within the content of each affected web page, again pointing more towards the likelihood of a Panda related shift rather than Penguin.

This is only scratching the surface. I spent quite some time digging pretty deep and I can say with 100% confidence that the impact, in this case at least, is Panda related.

Your situation is really interesting... I didn't see Panda movement on 10/18, but I think this highlights the possible problems when major algorithms cross. The only movement I saw around that time was Penguin (and more international movement than U.S.)

If you can email me the domain, I wouldn't mind taking a look. Since we are seeing Panda tremors so frequently, it's entirely possible that was Panda. But again, I saw significant movement on 10/24 Panda-wise and not on 10/18.

I'll PM you the domain so you can take a look. I'd be interested to hear your thoughts but it's a relatively small website so not sure how much data you'll be able to gather for your investigation. I may be able to share some of our data with you.

The ranking drops definitely started on 10/18. It'd be easy to blame Penguin going by dates alone, but all the signs point more towards Panda.

I agree that record keeping, and using that data to make decisions, is a must. It's so easy to jump ship or make incorrect wholesale changes when an update comes along or there is a flurry of blog posts on some perceived do or don't.

It is concerning that there may not be an 'inside contact' to take up issues with. Every public facing company needs that if only for their own PR.

Having said that there are quite clearly defined concepts for SEO so there is not too much room for excuses. I mean once you truly put the visitor first you are most of the way there.

Thanks for your comment Michael. And I agree, too many webmasters are not keeping the proper records SEO-wise. There are times those notes can really help identify true drops in traffic and get you moving down the right path. And GWT data is incredibly important to download (at least once per month). 90 days doesn't cut it for many sites...

Regarding Matt Cutts, the movie blog situation I detailed above is a great example of when someone like Matt can quickly rectify collateral damage. I still believe that niche would not be the same right now if he didn't get involved. And many of those sites got pummeled based on the update (only to return to normal traffic levels once Matt got involved). Crazy...

When you're hit by Panda for having, for example, 100 thin-content pages, how does the actual recovery process work? If you update 10 pages with fresh, quality content, will Google rank those 10 pages back, or even higher than they used to be? Or will all 100 pages need to be refreshed with quality content? And what if you had also been hit by Penguin at the same time? You clean up all the bad links but still have thin content, or the other way around. I wonder how Google treats these kinds of situations.

Great questions. When you've been hit by Panda, you need to fix the problems causing the Panda attack. "Low quality content" can mean several things, so a deep audit is usually needed to surface all Panda-related problems.

Once you fix the problems, Google needs to recrawl the site, measure user engagement again, etc. That can take a while (which can be months...) Then you need another Panda update or refresh in order to recover. I've seen Panda tremors almost weekly since Panda 4.0, so this isn't a Penguin-like situation where you need to wait forever... But it can still take a while as Google reevaluates the website.

Regarding Pandeguin (that's the name I've given sites that were impacted by both Panda and Penguin), I've seen very interesting connections between the two. I wrote a case study about a Pandeguin recovery on Search Engine Journal, and you should check that out.

My advice is to tackle each algo thoroughly. Depending on the site, you might choose to start fixing one set of problems before others. Or, if you have the necessary resources, you can tackle both Panda and Penguin problems at the same time. Then you'll need a Panda update/refresh to recover from Panda and then a Penguin refresh/update to recover from Penguin. And yes, one still might keep the site down, even after recovering from the other. i.e. You might see a bump during Panda, but not a stronger recovery until Penguin. Or vice versa.

I often say the relationship between Panda and Penguin is "complicated". :) I hope that helps.

Thank you for the reply, Mr. Gabe. We are currently struggling with Panda. What most annoys me is that we have no idea which algo we are dealing with. We have a website in Europe (non-English), so some algos go global but roll out at an extremely slow pace.

Anyway, we cleaned our link profile, and are constantly focusing on making fresh and great content.

My opinion on both Penguin and Panda is that both were way too aggressive. Lots of website owners with no SEO knowledge got hit, which should never have happened. But who am I to say what is fair or not? :)

Thanks again for the reply, nice to read your post.

Edit:

You say several times that the Panda recovery process "can take several months," and in the linked article you say you had to wait for six months.

I'm pretty confident in saying that I've read many times that you have to wait for the monthly refresh. Of course, you've got the experience, but are they completely wrong, or am I seeing it wrong? Thanks again.

Panda isn't necessarily monthly anymore. That's my point when I mention the "tremors" I have seen since Panda 4.0 (almost weekly). But recovery isn't as easy as waiting for the next update. There are typically a number of important and deep changes that need to take place. Then Google needs to recrawl and measure user engagement again. It's typically not a quick turnaround...

As a quick example, there were several companies I was helping with severe Panda 4.0 hits that implemented a boatload of changes. They did not recover until Panda 4.1 (on 9/23). That's four months and underscores the importance of making sure you never get hit in the first place. I hope that helps.

Glenn, thanks for laying out the Panda/Penguin update landscape so clearly for all of us. After our Twitter exchange just before Halloween, I figured a long-form post like this was coming! :)

I know Panda "tremors" are happening frequently now (and can make things a little confusing) but what's your opinion on what Penguin 3.0 is doing right now? Is it still just in a "beta" or testing mode right now? Did Google really launch it and then dial it back due to problems with results?

Also, given that Panda is essentially real-time and seemingly a part of Google's "normal" algorithm, can a new Panda tremor affect a site that's never been hit by Panda iterations of the past? If a site hasn't been hit by Panda before - barring drastic changes to website structure, re-designs, etc. - can't we basically rule it out moving forward?

Or do you think Panda has gotten more "strict," involving more thin content factors or warning signs? Would love to hear your thoughts!

As you've made clear, it will only get more complicated and require more data to properly analyze what has happened to a website's organic performance, so I'm just seeing if there are ways we can rule out certain updates moving forward.

Thanks for your comment, Brady! Regarding Penguin 3.0, you can read my post that includes findings from analyzing the first ten days of the rollout. But beyond that, I think P3.0 has been extremely underwhelming. There have been many fresh hits, but not nearly the level of recovery we were expecting. I fear Google is seeing collateral damage, negative SEO, and more. I plan to write more about this in the coming weeks. Stay tuned.

Regarding Panda tremors, yes, they can definitely impact a site that's never been hit before. I've had a number of sites reach out to me during unconfirmed updates about fresh hits. It's one of the reasons I believe webmasters should continually be analyzing their sites from a Panda standpoint (analyzing content, engagement, etc.) You need to nip new problems in the bud. If not, you can get impacted by subsequent Panda updates or tremors. I hope that helps!
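To make the "continually analyzing your site from a Panda standpoint" advice a bit more concrete, here's a minimal Python sketch that flags pages worth a content review. The field names (`word_count`, `bounce_rate`) and the thresholds are illustrative assumptions, not anything Google has published; in practice you'd pull this data from your own analytics and crawl exports.

```python
# Hypothetical sketch: flag pages that look "thin" AND show weak engagement.
# Thresholds and field names are assumptions for illustration only.

def flag_thin_pages(pages, min_words=300, max_bounce=0.85):
    """Return URLs whose word count is low and whose bounce rate is high."""
    return [
        p["url"]
        for p in pages
        if p["word_count"] < min_words and p["bounce_rate"] > max_bounce
    ]

sample = [
    {"url": "/a", "word_count": 120, "bounce_rate": 0.92},  # thin + high bounce
    {"url": "/b", "word_count": 900, "bounce_rate": 0.40},  # healthy page
]
print(flag_thin_pages(sample))  # → ['/a']
```

Pages the sketch flags aren't automatically "Panda problems," but they're the ones to review first before the next update or tremor rolls through.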

Ok, so you're thinking the Panda algo has evolved to be more precise, maybe even more "strict," with regard to content-related issues? I.e., the definition of "thin" evolves, where content sits on the page gets evaluated more precisely, where it falls in the HTML, new factors get added as spammers evolve, ha, etc.?

Absolutely, I agree we have to be constantly analyzing (why I'm asking you now :-D).

More and more, SEO is becoming (or already is) a proactive marketing channel rather than a defensive, reactive one.

This is why you need someone skilled in SEO to get involved when you notice something's up in your analytics. A novice isn't going to understand it, and it's likely they'll just make things worse. Panda at the same time as Penguin, very tricky indeed, Google! And good luck with that flux capacitor!

Yes, it's definitely going to get tougher as more major algorithms cross. It's crazy to think we just had Penguin, Pirate, and Panda all rolling out at the same time. And I'll let you know how the flux capacitor is going. I think we'll need it sooner than later. :)

Great write-up, Glenn. Thanks very much for shedding some light on this fiasco. I have a question I'm hoping you can give your opinion on. One of my sites lost rankings with this last Penguin update for only one of its major keywords, which had been in the #1 spot for close to a year. It dropped to the #9-#15 spots after October 18th, and now it bounces between #60 and #10 on any given day. All other keywords that were ranking stayed relatively stable, but losing that one phrase cut organic traffic in half.

Looking at the anchor text for the links to that particular page shows I'm WAY heavy on the anchor/phrase that got hit. 100% my fault; I never realized it was that out of whack. I have no reason to believe this keyword phrase was hit because of poor link quality, but rather because of anchor-text over-optimization for that single phrase.

In your experience, will diluting the anchor text with more branded and raw-URL anchors help even out what the algo recognized as over-optimized for that phrase? Seeing as the rankings are still in flux for that keyword, do you think it will take a full Penguin refresh to see any positive recovery? I'm going to test this anyway, because it needs to be done; I'm just curious if you've seen a scenario like this in the past and have any opinion on it. Thanks.

Hard to say without analyzing the domain. If you want to email me, you can via my contact page on my website (gsqi.com). Penguin 3.0 is still rolling out and the flux seen since Thanksgiving was P3.0 (as confirmed by Google today). But if you were hit by Penguin 3.0 on 10/18, I'm betting there are some unnatural links in your profile using exact match anchor text from low quality sites. Again, hard to say without analyzing the domain, but I haven't seen Penguin hits without unnatural links. I hope that helps.
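For readers wondering how "way heavy" a single anchor actually is, a quick distribution check over an exported anchor list makes it obvious. A minimal sketch, assuming the anchors come from your own link-research tool export (the sample profile below is made up for illustration):

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of the overall backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

# Illustrative profile: two exact-match anchors, one branded, one raw URL.
profile = ["blue widgets", "blue widgets", "Acme Co", "https://acme.example"]
dist = anchor_distribution(profile)
print(dist["blue widgets"])  # → 0.5, i.e. half the profile is exact match
```

There's no official "safe" percentage, but seeing one exact-match phrase dominate the distribution is exactly the kind of imbalance the commenter above describes.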

Interesting post. I like the fact that many other webmasters shared their GWT data. I feel like this can really pinpoint what impacted results when we see gains and losses around a specific date from many users. It would be nice if everyone shared this data on a regular basis, considering Google rolls out tons of updates.
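One way to act on that shared data is to compare daily organic traffic (e.g., exported from Google Webmaster Tools) against known update dates. A rough sketch, using only the two dates confirmed in this post (Penguin 3.0 on 10/17/2014 and Panda 4.1 on 9/23/2014) and hypothetical drop thresholds:

```python
from datetime import date

# Update dates mentioned in this post; extend the map from industry trackers.
UPDATES = {date(2014, 10, 17): "Penguin 3.0", date(2014, 9, 23): "Panda 4.1"}

def drops_near_updates(daily, window=3, threshold=0.3):
    """Flag days where traffic fell more than `threshold` versus the
    previous measured day, within `window` days of a known update."""
    flagged = []
    days = sorted(daily)
    for prev, cur in zip(days, days[1:]):
        if daily[prev] and (daily[prev] - daily[cur]) / daily[prev] > threshold:
            for when, name in UPDATES.items():
                if abs((cur - when).days) <= window:
                    flagged.append((cur, name))
    return flagged

# Illustrative data: a 45% drop the day after Penguin 3.0 launched.
traffic = {date(2014, 10, 16): 1000, date(2014, 10, 18): 550}
print(drops_near_updates(traffic))  # → [(datetime.date(2014, 10, 18), 'Penguin 3.0')]
```

A correlation like this is a starting point, not proof; as the post explains, extended rollouts and overlapping algorithms mean the date alone can't tell you which update hit you.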