Some of the use cases explained in this case study are not available in lower plans.

The LRT Superhero Plan (and higher) includes all our 25 link data sources and allows you to perform link risk management, competitive research, professional SEO and backlink analysis for your own or your competitors' sites. You get to see the full picture of your website's backlink profile, and this can make all the difference for your SEO success.

A Case Study about recovering from Penguin and getting Google penalty relief

Introduction

This is the story of Tradebit – the digital goods download platform (think MP3s, eBooks, etc.) that has suffered through Panda, Penguin and probably every other Google update of the last 8 years, since it was founded by Ralf Schwoebel in 2005.

However, this case study shows you EXACTLY what happened with the Penguin penalties, from the first Penguin update in April 2012 to the most recent one, Penguin 2.0, in May 2013.

YES, recovery took over a year!

The site also received a manual penalty, which was finally revoked on June 4, 2013.

Learn what Ralf, CEO of Tradebit, went through for over a year and what it took to get Google penalty relief.

Let’s take a look at the reasons for the drop in visibility that tradebit.com experienced after a year of Penguin updates.

This research looks into how a site CAN recover from the feared Penguin.

Thanks for sharing your story, Ralf! We would greatly appreciate you sharing this story as well!

The crashes

The first Panda updates, up to March 2012, had only minor effects on the platform (traffic dropped by 15%), but revenue remained stable, and it was pretty clear that long-tail searches (like “artist name MP3 download”, etc.) kept driving the converting traffic. Then the Penguin updates came along:

At the end of March 2012 we also felt the first Panda update with a major impact on revenue (referred to as Panda 3.4). In combination with the Penguin updates, we quickly ran into problems.

The traffic loss became visible: Google Analytics showed a new, lower level, another roughly 15% drop. This time we also felt it in the long tail, and therefore in revenue.

Paralyzed looking for a solution

So far we had survived the Panda updates pretty well, but the link profile was obviously a problem. We continued to work on content quality and again put more emphasis on press and public relations work. But at the end of summer 2012 we just had to face our own weaknesses and focus on strengthening our affiliates and sales partners.

Coming from early-2000s SEO, it was clearly time to rethink the whole approach, forget everything about content and link building, and learn it fresh.

Link Detox launch in August 2012 – finally

I knew about LinkResearchTools and used it once in a while to find new links or to avoid the wrong link partners. So I was very interested in the Link Detox launch from Christoph Cemper's team and did my first run quite early after the start of the platform. It showed around 40% bad/toxic links in my link profile, which I at first refused to believe. But lacking any other data, I took that list and worked on the obviously bad ones. It was the first time I started to remove links by emailing people and asking them to delete the links on their sites.

Analyzing bad links

Another approach we started to implement by the end of summer 2012 was examining the link neighborhood of the links reported by the Link Detox tool. First we took the reported links and matched them with the links from Google Webmaster Tools, because we suspected those to be the ones that mattered most. We got confirmation on that later in the year.
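That matching step can be sketched in a few lines. This is a minimal illustration, not the actual workflow: the lists, names and the naive domain extraction are all assumptions; a real audit would compare full exports from both tools.

```python
from urllib.parse import urlparse

def root_domain(url):
    # Naive host extraction; a real audit would use a public-suffix list.
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def prioritize(detox_links, gwt_links):
    """Keep only the Link Detox findings whose domain also appears in the
    Google Webmaster Tools export - the overlap both tools agree on."""
    gwt_domains = {root_domain(u) for u in gwt_links}
    return [u for u in detox_links if root_domain(u) in gwt_domains]

detox = ["http://spamblog.example/post1", "http://forum.example/t/99"]
gwt = ["http://www.spamblog.example/post1", "http://other.example/"]
print(prioritize(detox, gwt))  # → ['http://spamblog.example/post1']
```

Working the resulting overlap list "worst first" gives a manageable priority queue instead of millions of raw links.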

If one of these link sources was also linking to sites that looked suspicious or spammy, we tried to get our links off that source. The success rate of these efforts was pretty low, and it was annoying, boring work! The notifications sent by Google were not helping the cause either.

And it was not just me complaining. So after Google announced its “Disavow” tool, I was pretty excited to share a panel with one of the Spam Team members at SEOkomm 2012 in Salzburg.

Meeting the Google Spam Team at SEOkomm

It was the perfect time to give Uli Lutz, who took the heat of sitting on that panel, my “yes, I am guilty and I want to better myself, but how?” speech, and to point out that Google's information policy did not help us webmasters.

After that panel discussion, I got direct feedback from the Spam Team and blogged about the hints and tips that came back on my private blog.

From that moment on I was convinced that we could get Tradebit back and out of that penalty box with a few concentrated days of work on the link profile, which we did.

One of my major complaints was that WMT was showing almost 4.5 million links, and that it is impossible for a small company like ours to dive deep into that amount of backlinks manually. I think Christoph Cemper, who also sat on this panel, understood my problem completely, because a few weeks later he sent me a link to the new, bigger and better Link Detox tools.

More investigation with Link Detox for 5 million links

In January I got access to the new Link Detox version that could process up to 5 million links. What a relief. Now I could upload all sorts of link data I had from the past, including link building reports and, most importantly, the data from Google Webmaster Tools. Not that they show even 10% of the links I have, but I found it valuable that I could aggregate all those links in Link Detox and match them for my priority approach (worst first).

Besides matching those links, we also had to dive deeper into our specific case and look at the links that came especially from affiliates and partners with a lot of blogs, forums and online activity.

Here is the whole story that I published after the “reconsideration requests”:

The condensed way to recovery

The first reconsideration request I sent, after the first use of the disavow tool, was shortly after SEOkomm, in December 2012. It was denied within a few days with the usual canned message.

After the first denial, we needed to expand the activity and formalize our link handling. We specifically did:

Affiliate program links changed

All our affiliate links basically went to existing pages with “?aid=123” added to the URL. If the affiliate ID was set, a cookie was dropped and a 301 redirect led the customer to the real landing page. That was passing link juice, and I still believe it was one of the major problems for our site. We had reached a size with so many affiliates that we no longer had a personal connection to them and no control over where these links were placed.

So we decided to point all new links to a neutral, blocked subdomain (visit.tradebit.com) and forbid the Google bot to read it. Bottom line: no more link power from affiliates. That felt bad for an old-school SEO, but it was obviously a very important step in the process.

So in short all URLs that were

www.tradebit.com/something?aid=123

are now

visit.tradebit.com/something?aid=123

That “visit.tradebit.com” subdomain has its own robots.txt and blocks Google from spidering it by disallowing the files that redirect visitors:

User-agent: *
Disallow: /visit.php
Disallow: /visit.php?myfileid=
Disallow: /visit.php?previewid=

… for example!
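The redirect endpoint itself (visit.php) is not published in the article, so the following is only a hypothetical sketch of what such a handler does: read the affiliate ID from the query string, drop a cookie, and redirect to the real landing page. The parameter name aid comes from the URLs above; the cookie name and landing URL are assumptions.

```python
from urllib.parse import parse_qs

LANDING = "https://www.tradebit.com"

def visit(environ):
    """Hypothetical affiliate redirector on the blocked subdomain:
    set an affiliate cookie, then redirect to the real landing page.
    Because robots.txt disallows this script, no link juice is passed."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    aid = qs.get("aid", [""])[0]
    path = environ.get("PATH_INFO", "/")
    headers = [("Location", LANDING + path)]
    if aid:
        headers.append(("Set-Cookie", "aid=%s; Path=/" % aid))
    return "301 Moved Permanently", headers

status, headers = visit({"PATH_INFO": "/something", "QUERY_STRING": "aid=123"})
print(status, headers)
```

The key design point is that the affiliate tracking keeps working for human visitors (cookie plus redirect) while the search engine never follows the hop, so the affiliate links stop counting as backlinks.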

Bad scraper links disavowed

Sites that scraped our content and put it up as their own landed in our “disavow domain list”. We also disavowed spammy-looking blogs that did not clean up their comments. I guess there was some negative SEO going on against our site, and we had to take action on those links. In its last version, the list of scrapers and spam sites contained about 250 entries.
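A domain-level disavow list like the one described uses Google's documented file format: one entry per line, with domain: prefixes for whole domains, plain URLs for single pages, and # for comments. The entries below are placeholders, not the actual list:

```
# scrapers republishing our product pages
domain:scraper-site.example
domain:spammy-blog.example
# a single URL instead of a whole domain
http://some-forum.example/thread?id=42
```

Disavowing at the domain level is usually the safer choice for scrapers, since they tend to duplicate content across many URLs.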

Paid links removed

We also started to remove all paid links from platforms like TLA and other link sellers – that had already begun after Pubcon in November 2011, when Matt Cutts was (again) talking about paid links and how link networks are discovered.

We did not just blindly cancel the links; we did it in waves, combined with press releases and other content marketing activities. So we had already left that realm when the announcement of last June hit SearchEngineLand.

After all that, we filed another reconsideration request at the beginning of March 2013, which was also denied.

With that demotivating mail behind us, we did not do anything else to clean up our link profile, because it was hard to think of anything else to do. We focused on improving our site and not taking on new bad links or content for a while, until I filed a new reconsideration request in May 2013, which finally got us back.

Back in the game

On the 4th of June 2013 we finally got the mail that the manual spam penalty was revoked. Part of it reads:

“… Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site...”

That was exactly one month ago and we slowly see improvements in traffic and ultimately in conversion.

Additional steps taken

Because the link profile and the content were not the only things we worked on in the past year, it is worth mentioning a few SEO- and conversion-related steps that were taken and could also have influenced the recovery process.

New Design: we did a site redesign in January, which was mainly a layout change. The new layout on www.Tradebit.com is now a responsive design that is usable on smartphones. While doing so, we made sure to avoid duplicate content pages where possible.

Trust Seals: we added seals and links to verification sites to our domain, like the security certificates for “https” verification.

Improve Onpage Conversion: another big step was improving conversion and time on site. We made sure to offer more related products, more blog posts and related articles on certain topics (like specific reviews of software products).

We have spent almost a year on the topic of “quality” when it comes to our whole setup and communication on the site. Recovery from a Google penalty was a process, and it took more than a year to officially say: “yes, we are back”.

Most likely there is no shortcut!

Good luck with your recovery story – let me know how it goes.

Ralf

This case study was written by Ralf Schwoebel, CEO of Tradebit and long-time user of LinkResearchTools and Link Detox and was reviewed and approved by Christoph C. Cemper.

A word from Christoph C. Cemper

This case study was written by Ralf, and I'm very glad that he finally found relief in his penalty situation. I watched and consulted on many of these steps, and I am happy that Link Detox finally helped Ralf spot the really shady things in his link profile.

Thanks & Congrats again Ralf!

What do you think about this Recovery Case Study?

Let us know!

With the Superhero Plan you can perform link audits, link cleanup, link disavow boost, competitive research, professional SEO and backlink analysis for your own or your competitor's sites.

You can avoid a Google Penguin Penalty! Learn all about the Real-Time Google Penguin Update in this free webinar.

Join our free 21 Day Link Strategy Training below

Signup to receive snack sized bits of knowledge about Link Research Strategies, SEO Tactics & features of LinkResearchTools platform and technologies.

Bonus: signup you will also receive a free copy of the eBook "7 Golden Rules of Link Building" for immediate download.

Ralf started creating eCommerce solutions in 1995 with Germany.net and has grown with the topic since then. In 2002 he became a successful online marketer, combining his technical skill with the need for traffic and conversion generation. He is the founder and CEO of Tradebit, Inc. – the download marketplace. The platform went online in 2005.

21 Comments

Chris on July 16, 2013 at 10:58 pm

Thanks for the positive story and suggestions. I resubmitted my disavow today, but had not included a spammy scraper. Thanks to your article, I went back and added him. I was hit with the Penguin 2.0 algorithm and am hoping for the best. I used a combination of Link Detox (love it) and Majestic to segment specific spam that was created by a site to boost a forum post he/she had created on our site that ultimately linked back to their site. I was shocked at some of the garbage pointing to my site. The post has since been deleted, but now comes the tough work of cleaning up my link profile. I wish I had been more diligent about checking Webmasters. I also wish Google would have said that links can harm your site, but obviously that ship has sailed.

I’m really happy for Ralf, and it is a truly inspiring story. I don’t want to kill the party, but where a manual penalty is present and lifted, it suggests that it wasn’t a Penguin penalty; it was, as already written, a manual penalty.

Penguin is an algorithmic penalty; you can’t file a reconsideration request and get it lifted. So far the only working way to deal with Penguin is to change your domain.

On the other hand, if you got a manual penalty, then putting in lots of work can help you.

Great writeup and inspiring news. Really amazing that it happened just a few days after penguin 2.0. It took a client we are working with a month to realize Penguin 2.0 was real and we could not just wait it out.
Aggressively looking through the link profile now and have done 2 batches in LRT and then sent them to another tool for removal and tracking. Considering doing a third pass or to do a disavow in a few weeks.
Interesting problem with scraper sites and embedded links. I used to embed links to help spread my links if my copy was stolen. Now that may not be a good strategy.
Hoping to get approved for the first reconsideration request.

How does one know that you were hit by a manual penalty as opposed to just the Penguin algo? My sites seemed to get hit last April, but not for every keyword; the majority took hits but some did not, though gradually over time. I know I had Panda issues too, which caused more of a drop. Would a manual penalty pretty much affect every page/keyword on the site?

I strongly recommend showing some effort first by manually removing whatever bad links pointing to your domain you can, before using disavow. It seems there is no quick way to recovery; time and effort are needed. So if this is a client, understanding is needed for the long term. Otherwise it will possibly be a case of needing a new domain.

Good to see the manual penalty recovery. This aspect is all about due diligence and honesty. I am interested in the level of recovery. Stats taken from Searchmetrics are not showing a significant recovery in visibility. So while the manual link penalty has been removed, from the outside looking in it looks more like a partial recovery. Perhaps you can elaborate?

Lee is totally right: this is a partial recovery, but it is the part that matters most to us: REVENUE!

Since the first Panda it was VERY clear to us, that our income comes from long tail searches.

“lady gaga bad romance karaoke”

and the like…

The penalty that hit us most was the devaluation of the link juice, which let these long-tail pages drop in rankings. These are back; in fact, we have almost no major keyword rankings anymore, but we seem to attract specific long-tail “download” terms!

I used this tool and recovered one site suffering from Penguin. The client had bought a lot of bad links. But for two others it didn’t work, and I’m not sure why. They were both in first place before Penguin. One is still on the third page.

I read that for every link submitted to the disavow tool you need to make a comment for it to be effective. Do you think that is true? I did this but it didn’t help.

I also need this tool, but how can I get it, and how do I use it? Last year I had a lot of traffic, but suddenly I lost everything. I don’t know why, but I am still struggling with this problem.