This post tells the story of a client who enlisted our help in removing a manual link penalty imposed by Google as the result of a former employee's ‘black-hat’ SEO work. I have outlined the process we followed to identify the problematic links, how we went about removing them, and how we ultimately ensured the client's website was back in compliance with Google's webmaster guidelines.

Six months ago, an unnamed client approached us requesting our help. They had suffered a severe drop in website traffic as a result of an ‘unnatural links’ penalty from Google. This penalty was caused by the work of one of their former employees, who had built thousands of inbound links to their site from poor quality article sharing sites, SEO directories and other completely unrelated sites that had clearly been used to artificially improve the rankings of their website. The client was notified of the penalty by a warning message in their webmaster tools account, around the same time they noticed a drop in their website traffic:

Stage one: Uncovering the problematic links

The first thing I had to do once I’d reviewed the note was to carry out some research on the subject. I read numerous articles on unnatural link warnings and how people had been unable to get them removed successfully due to lack of data, so I ensured that I collated the largest collection of inbound link data possible. I used Open Site Explorer, Ahrefs Site Explorer, Majestic SEO and Webmaster Tools’ own ‘Links to your site’ to download detailed lists of inbound links to the unnamed client’s site, and in doing so noticed that each set of data reported a slightly different set of inbound links. Therefore I’d suggest getting a list of links from as wide a range of sources as possible and collating it into one master spreadsheet of inbound links.

Once I had created a spreadsheet compiling the inbound link data from each of the aforementioned sources, I de-duplicated the list and saved the document. Working through multiple links from the same domain would have proved an arduous task, so I created a separate document using the ‘text to columns’ function in Excel to reduce the master list of URLs to a list of domains linking to the site, and once again de-duplicated this list, leaving me with around 9,000 linking domains. By this point I had created two lists: one of every URL linking to the client's site (multiple links from each domain) and one simply consisting of the linking domains.
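If you'd rather not do this consolidation in Excel, the same de-duplication and domain extraction can be scripted. The sketch below is illustrative only – it assumes each tool's export is a plain list of full URLs (including the http:// scheme):

```python
from urllib.parse import urlparse

def consolidate(link_lists):
    """Merge link exports from several tools into one de-duplicated list
    of URLs, then reduce those URLs to a list of unique linking domains.
    Assumes each export is an iterable of full URLs (with scheme)."""
    all_urls = set()
    for links in link_lists:
        all_urls.update(url.strip() for url in links if url.strip())
    # Normalise "www.example.com" and "example.com" to the same domain
    domains = {urlparse(url).netloc.lower().removeprefix("www.")
               for url in all_urls}
    return sorted(all_urls), sorted(domains)
```

Feeding in the Open Site Explorer, Ahrefs, Majestic and Webmaster Tools exports would then produce the two master lists described above.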

I began working my way through the list of domains, using a combination of Google's PageRank toolbar, SEOmoz's MozBar and common sense to identify any poor quality domains linking to the client's site. This included highlighting any sites from totally unrelated industries and sites with numerous links using suspicious-looking anchor text. Once I'd finished working through the domains, I was left with a list of 3,000+ ‘poor quality’ domains and another list of natural links to the client's site.
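For a list this size, a first automated pass can speed up the manual triage. The sketch below is a loose illustration – the data shape, the PageRank threshold and the keyword list are all assumptions rather than output from any of the tools mentioned above, and anything it flags still needs a human check:

```python
# Anchor-text phrases that often indicate manipulative links (illustrative list)
SUSPICIOUS_ANCHORS = {"cheap", "buy now", "best price", "payday loans"}

def triage(domains):
    """Split domains into 'poor quality' candidates and likely-natural links.
    Each entry is assumed to be a dict with 'domain', 'pagerank' and
    'anchor_texts' keys (a made-up shape, not a real tool export)."""
    poor, natural = [], []
    for d in domains:
        anchors = " ".join(d["anchor_texts"]).lower()
        looks_spammy = any(phrase in anchors for phrase in SUSPICIOUS_ANCHORS)
        if d["pagerank"] <= 1 or looks_spammy:
            poor.append(d["domain"])
        else:
            natural.append(d["domain"])
    return poor, natural
```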

Stage two: Removing any poor quality inbound links

As the inbound links to the client's site had been placed there by one of their former employees, we had no login information with which to remove poor quality directory listings and spammy articles. We therefore had to request that each and every link to the client's site be removed by the webmaster of the domain in question. To do this, I uploaded the list of poor quality links I'd initially prospected into Buzzstream. Buzzstream automatically generated contact information for around half of the sites in question, with the remainder proving rather difficult to contact. Many of these sites had no contact information visible, so I used WHOIS to try to find contact details for the individual who registered each domain. Once I had found all the contact details possible, I started sending emails to the webmasters of the poor-quality sites using Buzzstream, and had a very good response rate using the following strongly worded email:

Stage three: Following up

Once I'd sent emails to the webmasters of each poor quality website linking to the client's site, I waited a week before sending a follow-up email to those who had failed to respond. This resulted in over 50 further responses and also gave us more evidence to use in our eventual reconsideration request to Google.

Stage four: Disavow

Google's disavow tool was designed so that webmasters can report unwanted inbound links to their site. By submitting a URL or an entire domain via the disavow tool, you're effectively asking Google to ignore those links when assessing your site.

I used the disavow tool to submit a file listing all the poor-quality domains that were unresponsive to my email requests. A typical disavow file looks like:
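In Google's documented format, lines starting with `#` are comments, `domain:` entries disavow every link from that domain, and bare URLs disavow individual pages. The domains below are made-up examples:

```
# Webmaster contacted on 01/03/2013 and 08/03/2013 - no response
domain:spammydirectory-example.com

# Individual article pages we could not get taken down
http://www.articlesite-example.com/seo-article-1.html
http://www.articlesite-example.com/seo-article-2.html
```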

Stage five: The reconsideration request

Once I had submitted the remaining poor-quality domains using Google's disavow tool, it was time to prepare a reconsideration request. This is essential if you're to overturn a penalty that Google has placed on a website, and it should include as much detail as possible. I put together Google Docs containing evidence of the emails I'd sent and the responses I'd received accepting my requests. I also included Google Docs listing the inbound links I'd prospected, with notes on the actions I'd taken to try to get them removed. In addition, I made sure the owner of the business wrote the request themselves, explaining that they had learnt their lesson and would never carry out that kind of work again. Once I'd included as much detail as possible and attached the evidence of the process highlighted above, I submitted the reconsideration request via the client's Webmaster Tools account.

Success: Manual spam action revoked

Around a week after I submitted the request, I received the following message from Google:

As Google's message states, the client's rankings won't suddenly return to where they once were. However, I have already started to see a marked improvement in their keyword performance, which indicates that Google are starting to reflect the change in their search results.

The reality is, this kind of penalty can easily be avoided, and I'd strongly encourage any company to think carefully about who they employ to carry out SEO work on their behalf so they don't encounter a similar scenario. Companies should concentrate on producing good quality content that people will naturally link to. There should be no need for aggressive, unnatural link building – it may have worked a few years ago, but Google are now far better at spotting this kind of activity, as highlighted above.

As always, if you have any questions then please leave a comment below. We’d love to hear from you!