
Questions have been swirling around duplicate content and international SEO for years. Even with Google’s official stance (which we’ll look at later) and the personal experience of numerous SEOs, the fear of duplicate content penalties persists.

The purpose of this post is to try to address the main concerns while simultaneously dispelling a few paralyzing fears:

1. FACT: SEOs are Rightly Concerned About Duplicate Content

Original, quality content is the backbone of on-page SEO. Even as the emphasis shifts to overall site quality as opposed to page-by-page optimization, the need for original, quality content remains as important as ever, if not more so.

We already know about canonical issues, and have the canonical tag (attribute rather) to help clean up the mess.
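For reference, a canonical annotation is just a link element in the page head; the URL below is a placeholder:

```html
<!-- Placed in the <head> of each duplicate/variant page; the href is a placeholder -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />
```

Google then treats the annotated page as a copy of the canonical URL and consolidates indexing signals there.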

We have XML sitemaps to ensure all our content is indexed, hopefully correctly.
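A minimal sitemap entry looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/widgets/blue-widget/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
</urlset>
```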

We can assign ownership (particularly relevant for news sites) with the source tags (original and syndicated).
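For reference, these are experimental Google News meta tags placed in the head of an article page; the URLs here are placeholders:

```html
<!-- On a syndicating publisher's copy, pointing at the original article -->
<meta name="syndication-source" content="https://www.example.com/news/original-story/" />
<!-- On the page that first broke the story -->
<meta name="original-source" content="https://www.example.com/news/original-story/" />
```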

In my humble opinion, Google has every right to be concerned with and about duplicate content. Like any big corporation (and Google is certainly big), it has stakeholders to whom it is accountable and for whom it is responsible. See this quote from Google:

“However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.” (emphasis is mine)

To take the thought further:

If the users have a poor user experience they may simply run the search again – on another search engine.

If there are fewer users, there is less revenue.

If there is less revenue, there are likely to be lay-offs.

If there are lay-offs, there is likely to be uncertainty, which (like the revenue loss) will negatively affect share prices and shareholder returns – ergo, bad business…

3. FEAR: We’ll Get Hit with a Duplicate Content Penalty!

Duplicate content ‘penalties’ exist, but they have never been labeled as such, because the action taken by the search engine is not so much ‘penalizing’ the site for spammy content as adjusting the algorithm to ensure a better user experience. This is also rather rare, and there is a significant difference between a ‘penalty’ and an ‘algorithmic spam adjustment’. See what Google says:

“In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.” (emphasis is mine)

There is most definitely a duplicate content filter that can feel like a penalty. But there is, again, a very big difference between ‘penalty’ and ‘filter’:

If a website were to be penalized, it would be a deliberate act on the search engine’s part to downgrade the site’s rankings or demote its listings.

A filter, on the other hand, is simply the engine’s best attempt to provide its users with the listing(s) most relevant to their search query, and the filter is applied on a query-by-query basis. See what Google says:

“Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list.”
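If you would rather make that choice yourself, the printer version can be kept out of the index with a robots meta tag (or pointed at the regular version with a canonical link); a minimal sketch:

```html
<!-- In the <head> of the printer-friendly version only -->
<meta name="robots" content="noindex, follow" />
```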

4. FEAR: We Can Never Leverage our Content Assets & Resources Globally as we’ll be Penalized or even Banned!

Clearly, part of this fear has been dealt with above…

As to the rest of it: yes, Google does apply a filter, but it is done by geo-targeting. Much as different language renditions or translations of the exact same content will be filtered by language, content targeted at different international audiences will be filtered by geo-target. See what Google says:

“Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs.

This is generally not a problem as long as the content is for different users in different countries.

While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible for all pages and variations from the start…”

Does that seem contradictory? It shouldn’t… Refer back to Fact 1 above; I’ll copy the relevant information here for you:

“However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.” (emphasis is mine)

A user in South Africa is not going to see identical English-language UK, Australian, Irish or Kiwi results when searching on Google.co.za.

If there are duplicated pages across multiple ccTLDs, the South African searcher on Google.co.za will most likely see the .co.za result.
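You can help that geo-filter along with rel="alternate" hreflang annotations, which tell Google which regional URL to show to which audience; the domains below are hypothetical:

```html
<!-- In the <head> of every regional variant, each listing all of its alternates -->
<link rel="alternate" hreflang="en-za" href="https://www.example.co.za/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
```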

The .com page will show for .com searches – and may encroach (at least initially) on new country target-market engines, particularly if the ccTLDs are hosted State-side. Why? Simple offline SEO math: the US population is much bigger than that of other primarily English-speaking countries (roughly 10x the population of Canada, 4x that of the UK, 11x that of Australia, and so forth), and hence US pages and sites have the (somewhat ‘doh!’-factor) likelihood of gaining both greater traffic and better-quality backlinks – faster and more regularly too. Remember, we’re assuming duplication of content across ccTLDs or across same-language country borders.

Closing Thoughts:

With global online marketing and sales becoming ever more prevalent, economically viable and important to many countries, it is of benefit to any company engaging on a global scale or growing internationally to confidently – with a good SEO company or in-house team – go about:

Effectively sharing quality, relevant content resources to achieve economies of scale and greater global exposure in their target countries without fear of ‘penalty’ or negative filtering; thus effectively leveraging what and where they can – an enormous benefit to businesses in the global online space.

Localizing as much as possible, not simply because of any latent duplicate content fears, but because same-language countries have very different vocabularies and unique colloquialisms.

Leveraging (as a result of tactical and strategic changes) winning test-page content and layouts, blog posts, articles, converting content and copy, and more.

One closing thought: content or page recipes that work in one country may not actually be your best option in a different country or new market, even if they do speak the same language. Don’t assume that taking from the source will provide you with the ‘winning ticket’. Measure and test as much as possible in-market and, of course, develop unique, original, target-specific content as quickly as possible from those learnings. Just remember: Google’s not out to get YOU; it’s out to keep its stakeholders happy – as any good business practice requires.