The Problem with Testing in SEO

Let me start by saying I’m a huge fan of testing in SEO and other online marketing disciplines. Proper tests can take away a lot of the FUD (Fear, Uncertainty, and Doubt) that exists in the online marketing industry.

And there are several search engine prophets out there that are doing a superb job testing SEO myths and the impact of new search features. David Harry at SEO Dojo has done some great stuff, and so has the crew at SEOmoz (though a bit hit and miss).

The problems that exist with testing in SEO aren’t deliberate. I’m convinced that pretty much all tests performed by SEOs start with the best of intentions: to help uncover truths and add quality information and best practices to our methods.

But nonetheless there are some fundamental issues that need to be addressed:

1. Search engine optimisers aren’t scientists

Most search engine optimisers aren’t scientists. There are a few notable exceptions (Marie-Claire Jenkins from the Science for SEO blog to name but one – and if you’re not reading that blog you really should), but generally SEOs seem to come from backgrounds in sales, marketing, PR, journalism, and IT.

This means that most SEOs are unfamiliar with the methodology of proper scientific testing – controls, double-blinds, statistical variances, etc. The inevitable result is that many tests performed by SEOs suffer from intrinsic shortcomings that may influence their outcomes. The tests may be biased towards a certain result, the results themselves may be misinterpreted, and often there is no adequate control within the test.
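To illustrate the kind of statistical rigour that’s often missing, here’s a minimal sketch (with made-up numbers, purely hypothetical) of a two-proportion z-test – a standard way to check whether the difference between two variants in a split test is statistically significant or plausibly just noise:

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: compares the success rate (e.g. clicks
    per impression) of variants A and B against the null hypothesis
    that there is no real difference between them."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis
    p = (successes_a + successes_b) / (n_a + n_b)
    # Standard error of the difference between the two proportions
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical split test: variant A got 120 clicks from 2,000
# impressions, variant B got 150 clicks from 2,000 impressions.
z = two_proportion_z(120, 2000, 150, 2000)
# |z| > 1.96 would indicate significance at the 95% confidence level
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

Note that in this example the difference looks meaningful at first glance (a 25% relative lift in clicks), yet it doesn’t clear the 95% significance bar – exactly the sort of result that gets proclaimed a Grand Truth without a proper test.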

2. The SEO industry has an abundance of egos

By its very nature the online marketing industry as a whole, and the SEO industry specifically, is filled with egos. This is not a bad thing – before you can market someone else’s product effectively you need to be able to market yourself. Truly effective marketing starts with unfailing belief in the product you’re selling. And when that product contains your own skills and services, you need an abundance of self-confidence. (I’m certainly no exception.)

The result is that many SEOs conducting tests are inclined to attach too much value to their interpretation of the outcome. There’s a tendency to proclaim Grand Truths of SEO based on little more than a single test (which is often flawed as per point 1).

3. SEO tests are aimed at moving targets

Where most conventional science tries to understand the inner workings of the universe, SEOs are trying to understand the inner workings of human constructs. Search engines, for all their complexity, are built by humans for a specific purpose. And search engines are not static – they’re continuously changing, adapting to new trends and developments.

On Monday a well-conducted SEO test can reveal an incredibly valuable insight into an aspect of Google, only for this to become obsolete on Friday when Google’s engineers make a small tweak to their algorithms. But it might take months or even years for that obsolete insight to be disproved and discredited.

This is not least because of point 2. SEOs are often loath to let go of their discoveries, or to frame them with the inherent uncertainty these discoveries deserve. As a result these Grand SEO Truths continue to exist on blogs and in books long after they’ve lost their validity, where they are often taken at face value by newcomers to the industry.

So should we all stop doing tests? Heck no. We need proper tests to help improve the quality of our industry, fight the spread of old knowledge and outdated practices, and keep the search engines on their toes.

But we need to try and make sure we do things right. I’m not saying every SEO that wants to run a test should first get a degree in a scientific discipline (though it would help), but perhaps we can work together and use the enormous amounts of skill, knowledge, and experience contained in our industry to codify a set of best practices for running SEO tests.

There’s been a lot of talk about qualifications and certification for SEO professionals and companies, and maybe this can be a part of that process.

I’d love to hear the thoughts and ideas of other SEOs on the matter, so please leave a comment.

Hit the nail on the head Barry!
“The SEO industry has an abundance of egos” is quote of the day for me!

http://www.dennissievers.com Dennis Sievers

Ouch, my ego is hurt now Barry

But you’re right. We don’t test things the scientific way. On the other hand, we can (almost) measure everything we test, and with that, we do prove things. We live in a digital world, which means 1 = yes and 0 = no. And we SEOs tend to look at things that way too.

http://www.siliconcloud.com robert callaghan

Hi Barry,

I totally agree, we need to test and analyse everything we do. I go by the saying: get found, convert, analyse, and repeat best practices. There’s no point in doing SEO unless it drives leads; we all want more leads at the end of the day. Basically it keeps us in jobs, so if you can show your clients increased traffic and the process involved, brilliant. We use HubSpot, which combines SEO, social media, and blogging analytics all in one platform.

Rob

http://www.themediaflow.com Nichola Stott

You raise some really interesting philosophical concepts about the nature of understanding, and proof of concept. I do believe that testing, and scientific testing, is an important part of understanding how some elements of an algorithm may work at a certain point in time – however, mainly because of point 3, I think we need to widen what informs our opinions towards a broader empirical approach; even that, if we agree with Thomas Kuhn, is always influenced by prior experience and our innate assumptions.

I guess what I’m trying to say is “the scientific method” is just as flawed as any other. IMHO

Nic

http://www.adamus.nl Barry Adams

Thanks all for your comments.

@Dennis: ideally SEO would be as straightforward as binary, but it isn’t (and I think that’s a good thing). Many testers make that jump from measuring to ‘proving’ a bit too quickly – correlation isn’t causality, and our own biases and preconceived notions often get in the way.

@Nichola: I agree that applying the scientific method to SEO is not ideal, but I do think that the scientific method is the best tool we have. Imho it’s less flawed than any alternative approach. Science has its share of prejudices and bias, but it at least strives to be transparent and objective. Whether or not we should apply the scientific method to something as fluid and changeable as SEO, well, that’s one of the questions we should try and answer.