Black Hat SEO Recovery Analysis – Part 2

To help our client recover from negative Google penalties – specifically the Penguin and Panda algorithmic updates – we used the first month to do a detailed analysis of their site, identify the problems, and determine the most effective solutions. SEO is never one-size-fits-all, so we tailored our approach to them to maximize its efficacy.

Our research included data aggregated from a number of tools including:

Google Analytics: If it isn’t on your site, put it on today. We used the existing Analytics data to look at SEO results, visitor behavior, and site traffic quality, and to quantify a baseline for them. If you don’t know where you’re starting from, SEO is useless. We noticed that their traffic had slowly diminished over time, but we only had about six months of data to work with (and the penalties appear to have hit well before this).

Webmaster Tools: We looked at all the backlinks, manual actions, crawl data and errors, and the robots.txt file to check for errors, issues, and more. The good news? No manual actions! The bad news? There were some shady-looking backlinks that we sent through the disavow tool. We also reviewed their crawl stats and Googlebot activity.

A note about the disavow tool: Most webmasters shouldn’t have to use this tool (maybe 1% of sites, I’m thinking). However, if you or your previous SEO company engaged in shady behavior, then it might become necessary. This client has no business being linked from a Russian directory that might as well have a big flashing “SPAM” gif on it, so we wanted to disavow sites like that. Sometimes aggressive disavowal might be necessary; in this case, however, most of the worst links had already been deleted. The other side of the coin is that if you disavow too much, you throw away legitimate link equity and won’t rank for anything. It’s a balancing act, but better to be safe in this respect.
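For anyone who hasn’t used it: the disavow tool takes a plain-text file, one domain or URL per line, with `#` marking comments. The domains below are made-up placeholders, not the client’s actual links, but the format is what Google expects:

```
# Spammy directory; our removal requests were ignored
domain:spam-directory.example
# Disavow a single bad page rather than the whole site
http://another-site.example/paid-links-page.html
```

Prefer `domain:` entries for sites that are spam through and through, and individual URLs when only one page is the problem.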

Moz Analytics: We pulled data from Moz Analytics, Open Site Explorer, and other Moz tools. Interestingly, GA, Webmaster Tools, Moz, and Majestic SEO each painted a different picture of the backlink profile, which demonstrates just how important it is to diversify your data across more than one tool.
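Since no two exports agreed, we reconciled them by hand. As a rough illustration of why that matters (the tool lists and domains below are invented for the example, not the client’s data), comparing the linking domains each tool reports looks something like this:

```python
from collections import Counter

# Hypothetical linking-domain lists, as exported from each tool
# (real exports are CSVs with many more columns).
sources = {
    "Webmaster Tools": {"example-blog.com", "spamdir.example", "partner.example"},
    "Moz":             {"example-blog.com", "partner.example", "news.example"},
    "Majestic":        {"spamdir.example", "partner.example", "old-forum.example"},
}

# The union is the fullest picture -- no single tool sees it all.
all_domains = set().union(*sources.values())

# Domains reported by only one tool: the gap you close by diversifying.
counts = Counter(d for domains in sources.values() for d in domains)
unique_to_one = {d for d, c in counts.items() if c == 1}

print(f"{len(all_domains)} linking domains total; "
      f"{len(unique_to_one)} reported by only one tool")
```

In this toy example, two of the five linking domains show up in only one tool’s export – relying on a single data source would have missed them entirely.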

Majestic SEO: I love Majestic (and I was not paid to say that). Their backlink data is some of the best available, with their trust/citation flow graphs. We were also able to see that an enormous number of links had been deleted since the last crawl, which, considering the poor quality of the links they used to have, is a pretty good thing.

Visistat: While we didn’t use much of their data in this case, I happen to think their tool is super cool – albeit a little expensive for the average SMB.

gShift Labs: gShift is our SEO software of choice; we used it to track their social mentions and signals and their keyword positions over the month, and to monitor their analytics in a broader scope.

By the end, we had roughly eight spreadsheets of raw data, four analytics dashboards, four Moz Analytics reports, a report from gShift, and our master report (which ran about seven or eight pages).

Conclusions: The recovery work they’ve done up to this point has been effective and is helping; we’re getting them the rest of the way there. Most of the worst links have already been removed, duplicate content has been taken off the site, target keywords have been identified, and we have a full picture of the existing incoming traffic. Without an accurate picture of where you are, you have no roadmap for where you’re going, nor any way to tell whether the work has been effective after six months, a year, or more. This month, the plan is to optimize their site – removing “over-optimizations,” cleaning up the content, using the existing keyword data to maximize visibility on target keywords, and focusing on the on-page content.

Stay posted for next month’s results and how we’ve made an impact on their site, their sales, and their bottom line.