Google Panda/Farmer - How to protect your traffic

A single update to Google’s algorithm has led to a massive dip in
traffic for tens of thousands of websites.

The Panda/Farmer update is potentially one of the most significant
changes that the search engine giant has enacted in a very long time.
But what is it, and how will it affect your rankings?

What is Google Farmer/Panda?

Panda (dubbed Farmer by sections of the tech press) is a major update to
the Google search algorithm. It is intended, like most updates, to
increase the quality of the results that Google returns. This time
around, Google is particularly targeting so-called ‘content farms’:
sites that ‘scrape’ content from other locations, or that produce
‘shallow’ content which nevertheless ranks highly.

Or, at least, this is what has been deduced from Google’s
characteristically cryptic announcements. The search engine has denied
that the update is specifically targeting content farms, but Matt Cutts,
one of its leading lights, has made clear that Google’s intention is to
clamp down on scraper sites, while a post on Google’s official blog
said that “attention has shifted…to content farms.”

Who is being affected?

At the moment, Panda only affects US-based results. But it is widely
presumed that the update will be rolled out worldwide in time – so it is
important that you consider ways to mitigate its impact.

Panda has already had a devastating impact. Some websites have seen
their traffic drop by as much as 40 per cent almost overnight and, by
Google’s own calculations, the update has affected some 12 per cent of
search queries.

But my website isn’t a content farm…

This is part of the reason that Panda has caused such anger amongst
website owners. Although the update was intended to hit content farms
and scrapers, it would appear that it has falsely identified a large
number of sites as being of low quality. This has meant that many
good quality, legitimate sites have been unfairly penalised.

Google has said that it is tweaking the algorithm, and that it aims to
make it more accurate. But this is unlikely to provide much solace to
those who have already seen their traffic plummet without reason.

What can I do?

SEO experts are still trying to understand the full implications of
Panda/Farmer. But some tips have already emerged to help you mitigate
the impact of the update on your site.

Check if Panda is the problem. Look at your analytics. When did
you start noticing a dip in traffic? Panda came into effect on
February 24th, so if the dip occurred well before or long after this
date, it could well be a result of some other change you have made to
your
it could well be a result of some other change you have made to your
site. Given that Panda currently only affects US-based queries, you
can also pinpoint the problem by filtering for US-only visits in your
analytics.
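As an illustration, that before/after comparison can be sketched in a few lines of Python. The row format here (dicts with "date", "country" and "visits" keys, as csv.DictReader would yield from an analytics export) is an assumption, not a fixed format – adapt it to whatever your analytics tool actually exports:

```python
# Sketch: test whether a traffic dip lines up with Panda's rollout
# (24 February 2011) and is concentrated in US visits.
# Assumes rows of dicts with "date", "country" and "visits" keys,
# e.g. as produced by csv.DictReader over a hypothetical export.
from collections import defaultdict
from datetime import date, timedelta

PANDA_DATE = date(2011, 2, 24)

def daily_totals(rows, us_only=False):
    """Sum visits per day, optionally filtered to US traffic only."""
    totals = defaultdict(int)
    for row in rows:
        if us_only and row["country"] != "US":
            continue
        totals[date.fromisoformat(row["date"])] += int(row["visits"])
    return totals

def average(totals, start, end):
    """Mean daily visits over the half-open window [start, end)."""
    days = [v for d, v in totals.items() if start <= d < end]
    return sum(days) / len(days) if days else 0

def dip_report(rows, window_days=14):
    """Percent change in average daily visits across the Panda date,
    for all traffic and for US-only traffic."""
    report = {}
    for label, us_only in (("all", False), ("us", True)):
        totals = daily_totals(rows, us_only)
        pre = average(totals, PANDA_DATE - timedelta(days=window_days), PANDA_DATE)
        post = average(totals, PANDA_DATE, PANDA_DATE + timedelta(days=window_days))
        report[label] = round((post - pre) / pre * 100, 1) if pre else 0.0
    return report
```

If the "us" figure falls much further than the "all" figure, and the dip begins around February 24th, Panda is a plausible culprit.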

Look at individual pages. Panda operates on a page-level basis. It
considers whether or not individual pages on your site are likely to
be providing low-value, shallow, or scraped content, and penalises
them accordingly. You can therefore work out which parts of your site
are falling foul of the algorithm by identifying the pages that have
performed particularly badly since the change.
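That page-level comparison can be sketched as follows, assuming you have per-page visit counts for equal-length windows before and after the update. The page paths and the 20 per cent threshold are purely illustrative:

```python
# Sketch: rank pages by their traffic drop since the update, to spot
# which parts of a site Panda may have hit. Assumes per-page visit
# totals for equal-length windows before and after 24 February
# (a hypothetical export from your analytics tool).
def worst_hit_pages(before, after, threshold=-20.0):
    """Return (page, percent_change) pairs for pages whose traffic
    fell by more than `threshold` per cent, worst first."""
    drops = []
    for page, pre in before.items():
        if pre == 0:
            continue  # avoid division by zero for pages with no prior traffic
        post = after.get(page, 0)
        change = (post - pre) / pre * 100
        if change <= threshold:
            drops.append((page, round(change, 1)))
    return sorted(drops, key=lambda item: item[1])
```

Pages near the top of the resulting list are the ones to review first for thin, duplicated, or low-value content.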

It is important to remember, though, that Google has said that entire
sites may be treated less favourably as a result of individual pages.
It is therefore important that you try to create great content on
every page of your site.

Create deep, unique content. The basic tenets of SEO remain the
same. Don’t try to second-guess the algorithm; instead, try to create
the sort of content that you would like to see on the web: high
quality, unique, authoritative content. The days of blatant
keyword-spamming, where whole pages were just filled with nonsense,
are basically gone. But many website owners still think they can drive
traffic by building content solely around those keywords. This isn’t
enough – and this is exactly what Panda has targeted. Think about what
you can do to make your content unique, and how you can make your
content valuable to users.

If you’re aggregating, do more. Of course, some sites have a valid
reason for aggregating content.
You might be presenting information on a specific topic from a range
of different bodies, for example. In these cases, it is vital that you
also provide some ‘added value’. Add some high-quality content of your
own – something that is more than just ads, and that is regularly
updated where relevant. This will help to illustrate to Google that
yours is a legitimate aggregation operation.

Consider rel=canonical. The same goes for duplicate content. There
are plenty of situations
in which it is legitimate for sites to duplicate content, either
across their own pages or across multiple domains. But regardless of
your intentions, if you handle this clumsily you risk being penalised.

Google has indicated that the rel=canonical link element is a good
way of handling duplicate content across several domains. Google’s
own documentation covers it in more detail.
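As a minimal sketch, a duplicate page can point Google at the original version by placing a single link element in its head (the URL below is a placeholder):

```html
<!-- In the <head> of the duplicate page, pointing at the original -->
<link rel="canonical" href="http://www.example.com/original-article/" />
```

Google then treats the canonical URL as the preferred version for ranking purposes, rather than penalising the duplicates.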

The implications of Panda/Farmer are still being digested. But broadly,
the principles remain the same: Google will treat you well if you create
unique, authoritative, high-quality content.