08 Aug Google Panda – Problems and Solutions

If you have lost traffic in the search results and suspect it is related to the Google Panda penalty, this article will give you the means to diagnose whether you likely have a problem, along with some best practices to resolve your issues and move forward with your marketing.

The overall purpose of this article is to help you understand what Panda is looking for along with some solutions for common problems.

Panda relates to issues with your site itself, and by fixing those issues you can get out from under the belly of the beast and reclaim your rankings.

Thin Content & Web Spam

The single aim of the Google Panda update is to return better, higher quality results in its web search by filtering out web spam and thin content. To do this, Google has identified several criteria that signify that a site, or pages on a site, may be of lower quality and should not rank as well as they otherwise might have done. This is a problem if your site is flagged, as you may lose considerable traffic. Likewise, if you have a new site and it is clobbered by Panda, you may never know and subsequently may never reach your true ranking potential.

Problems & Solutions

The remainder of this article will detail the various aspects that Panda is designed to catch along with some solutions for you to resolve these issues and regain your rightful positioning.

1. Duplicate Content

If you run a content management system then chances are you have some duplicate pages. In most instances these are product pages that appear on more than one URL, or category pages that feature the exact same set of products. This can be problematic, as these pages essentially compete with each other for relevance and value, with the result that they are bumped down the search results or do not appear at all.

Any modern system should resolve duplicate product pages with either canonical URLs or 301 redirects to remove the duplication, and internal linking should be consistent and always point at the main canonical page.
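As a simple illustration, each duplicate URL can carry a canonical link element in its head section pointing at the preferred version (the domain and path below are hypothetical):

```html
<!-- Placed in the <head> of every duplicate product URL.
     The href (hypothetical here) is the one page that should
     collect all of the ranking signals. -->
<link rel="canonical" href="https://www.example.com/widgets/red-widget/" />
```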

Other common duplications are found in blogging software such as WordPress, where the various taxonomies such as author, category and date archives can lead to duplicated pages. Again, this is easily resolved by applying the noindex meta tag, and many SEO plugins provide simple, page-level functionality to apply it.
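Whether applied by hand or via a plugin, the tag itself is a one-liner in the page head (the "follow" directive, which lets crawlers still follow links on the page, is a common but optional addition):

```html
<!-- Keeps archive pages out of the index while still allowing
     crawlers to follow the links they contain. -->
<meta name="robots" content="noindex, follow" />
```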

2. Thin or Almost Duplicate Content

E-commerce and blog software again make it very easy to create near duplicate pages, which then have difficulty ranking as relevance for a given term is distributed between several pages. To make matters worse, Panda often drags other pages into this negative grouping, so the problem can spread to additional pages.

As an example, if you had a product category page for Widgets, and then several product pages that were identical except for, say, the colour of the Widget (Red Widget, Blue Widget, Green Widget, etc.), then these could be near duplicates. These products then get flagged as thin content or near duplicates, and they drag down the category page that just happened to be your primary search landing page.

The best solution here will vary depending on your situation and the software you are using for your site. If you have multiple product pages that vary only by a product detail such as colour or size then most modern e-commerce platforms will allow you to pool all of these into a single page. The user then selects the product option when ordering. This provides a way to pool all of the product relevance into a single page that handles all of the various product options.

Where this option is not available and you have determined that neither your category nor your product pages are ranking as they should, you may have to take more drastic measures. One option is to use the meta noindex tag on the product pages and focus all of the ranking power on the product category page.

3. Paginated Category Pages

The same problems can happen with category pages on e-commerce sites or blogs where there are several pages that are all largely similar. Google does a pretty good job with this usually, but you can certainly help it do better, control the user experience, and ensure any ranking power or link juice is focused on a single page rather than spread across several similar but competing pages.

Your options here are as follows:

Add a View All page and a canonical on each of the category pages that points to this page.

Add the rel=next & rel=prev tags to the component pages to indicate a series of pages.

Either solution will work well, and they can be combined: if Google determines that a single page in the series is most relevant to a query, that page can be returned; for more general queries, Google can return the first page in the series, or the View All page if it is not so long as to be unwieldy.
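As an illustration, the second page of a hypothetical four-page category series would carry both tags in its head, pointing at its neighbours (the URLs are made up):

```html
<!-- Page 2 of a hypothetical 4-page series: declare the
     previous and next pages so Google treats them as one set. -->
<link rel="prev" href="https://www.example.com/widgets/page/1/" />
<link rel="next" href="https://www.example.com/widgets/page/3/" />
```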

4. True Thin Content

For whatever reason, be it search manipulation or a legitimate purpose, if you have a series of pages on your site that contain only a sentence or two then you may find these pages are penalised and subsequently have a knock-on effect on other pages. Often individual FAQ items, where the answer is a single sentence, can be subject to these problems.

Review all of your pages and make sure they all provide enough value as a search result, and ensure that the actual content outweighs the template elements (the unique ratio).

5. External Duplication

Having copies of your content on other sites can be a big problem and can indicate that the content is of low value. There are several scenarios where this can happen, with the worst being the use of manufacturers' product descriptions that are already in use on many other sites. Using this kind of content, you become just another result in a sea of similar results and bring nothing new to the table; subsequently, you are unlikely to feature in any set of results for this product, as there are already several other sites with the same content.

There are many other causes of external duplication:

Affiliate sites

Content has been copied, stolen or scraped

Copying and pasting of content onto other sites

In an ideal world, if someone stole your content, Google would have you down as the owner and these other sites would be penalised. Unfortunately, this does not always work as it should, and especially where you have an old site that has been around for some time, you may find several other sites have stolen your content. The best solution in this case is to simply rewrite your content where practical to do so; you will probably find it needs a refresh anyway.

You can always contact the other sites, send a legal letter or even file a DMCA request, and in some cases that should be done, but often the work involved is heavier lifting than simply rewriting your content.

Where you have duplicate content across other web properties that you own, or you have authorised affiliates to use your content, then you should either provide a separate set of content for affiliates that does not match your own or use cross-domain canonical tags to indicate the version that should rank.
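A cross-domain canonical uses exactly the same mechanism as the on-site version, except that the href on the affiliate's page points back at your own domain (both URLs below are hypothetical):

```html
<!-- Placed in the <head> of the affiliate's copy of the page,
     telling search engines the original on your domain should rank. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```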

If you have not used it before, the Copyscape tool is a powerful means of identifying duplicate content.

6. Ad Heavy Content

Ad heavy content is treated much the same as thin content: if there are more adverts on your page than content then chances are these pages provide a poor user experience, and Google is simply not going to return them in a web search. Of course, this may not always be the case, but Google is a computer, and computers make decisions based on various factors, so if your ad heavy pages are no longer ranking, consider either turning down the ads or turning up the content.

So, ensure you have more content than adverts and make sure these pages all provide value.

Dates with a Panda

Just so you can be sure that you have a problem with Panda and not some other issue, you can check your site analytics and see if you lost traffic on one of the dates the Panda algorithm was rolled out or refreshed.

February 24, 2011: Google Panda released in the US – 12% of search queries are affected

April 11, 2011: First refresh with some recoveries and more sites hit

May 10, 2011: Google uses manual data from sites blocked in Chrome to further tweak the algorithm

June 16, 2011

July 23, 2011

August 12, 2011: Panda rolled out globally and on non-English versions of the search engine

September 28, 2011

October 13, 2011

November 18, 2011

January 18, 2012

February 27, 2012

March 23, 2012

April 19, 2012

April 27, 2012

June 8, 2012

June 25, 2012

July 24, 2012

Updates to Panda are now almost monthly, and if you have made changes you need to wait until the next data refresh to see if your recovery attempts were successful. Likewise, if you suddenly lose traffic, referring to the dates of the Panda and Penguin filter updates can give you some indication of what your problem may be.

The Penalty Zoo

Panda is not the only penalty in full effect: there is also the Google Penguin penalty, which goes after link spam and link building done to manipulate search results. There have also been several other large and aggressive changes this year that have seen link networks taken down, directories deindexed, and a more aggressive set of manual penalties for sites that frequently break the rules in a quest for top results.

What this really means is that any loss of traffic or failure to rank could be due to any one of these issues or, as I am seeing more frequently, a combination of issues, so it is important to perform a full penalty audit and understand exactly what your problems are before you take action.

If you need help diagnosing your problems or with a strategy for recovery from one or more of the Google penalties then please get in touch.

Marcus Miller

marcus@bowlerhat.co.uk

Marcus is our Digital Strategist. He’s been working in the industry for nearly 20 years and wears many (bowler) hats as a highly technical developer and SEO, and even has a fancy computer science degree to prove it.