SEO Is Not Science

I’m not sure if having a computer science degree makes me a scientist or not, but I do know how to spot science. See, science has a definition. According to Wikipedia, that definition is: a systematic enterprise of gathering knowledge about the world and organizing and condensing that knowledge into testable laws and theories.

It’s that last bit that causes all sorts of problems for traditional SEO. We SEOs are pretty good at gathering and organizing knowledge. We’re even better when it comes to sharing it, but when it comes to testable laws and theories, we kind of suck.

Ever since Branko’s presentation at SMX (which I liveblogged, by the way), I’ve been thinking about testing and the scientific method – and how little of it we actually do in the SEO community. It prompted me to start testing META descriptions with Alan Bleiweiss. Even though we both “knew” what the result was going to be, we ran a scientific test anyway – and that’s what the SEO community needs more of.

The problem, though, is that in addition to being technologists, SEOs also have to be marketers – and marketing and science don’t have the best relationship. Marketing tends to be about making claims, whereas science is more about collecting and publishing data. As SEO/marketers we not only have to do the research, we then have to go in front of a client and convince them why they should spend money to make changes. Often, data alone isn’t enough to sway a client.

A successful SEO is one who is able to wear a science hat and a marketing hat, but a great SEO is one who knows when to wear each one of those hats.

It’s not that SEO can’t be science, it’s just that we’re not doing a good enough job of making it a science – and that’s bad for the industry. Unverifiable claims are one of the quickest ways to get the snake oil reputation. Science is how that nasty reputation can be avoided.

Unfortunately, many SEOs (myself included) get too caught up in the lifestyle to worry about actually doing SEO. Talking about SEO, doing the conference circuit, and living the A-list Twitter life are all things that come to those who make SEO claims – but sometimes we get so caught up in that life and in making those claims that we forget to actually test them.

Sometimes, it’s our own egos that prevent what we do from being called science. “I’ve been doing this for X years, it works, I don’t need to test that, everybody knows it.” How many times have you said something like that? Read through the comments on Alan’s META description test and count the passionate opinions there that are based solely on ego without any data to support them. There are quite a few.

The true SEO scientist doesn’t just make a claim. He gathers data, then posts that data for others to examine. The problem, though, is that posts full of data don’t get retweeted, and they don’t get onto the front page of Sphinn. Posts full of claims, however, do get lots of retweets.

It’s that community aspect of SEO that’s holding us back from achieving our true potential. I can hear a few people muttering under their breath “Oh, he’s just jealous that he’s not an A-lister” and well, that’s true, but it’s not the motivation for this post. I firmly believe that all the A-list SEO people have earned their status. They did awesome work, wrote great blog posts, and did everything else to earn the success they’re enjoying today.

Science and fandom don’t mix.

The problem with reputation, though, is that people stop questioning A-listers. They’re no longer required to produce data or back up their claims. While they’ve earned that right, it’s not good for science. A-listers and SEO rockstars can be wrong too. Nobody’s perfect. Just look at how many advocated PageRank sculpting several months after Google quietly changed how nofollow worked. (Then, look at how many refused to admit they were wrong.)

Pay attention next time Danny Sullivan or Lisa Barone writes a blog post. Almost instantly they’ll get 10-12 retweets. (And usually they deserve them, too, as they write awesome stuff.) The problem here isn’t the retweets; it’s the number of people who retweet before they’ve actually had time to read the post. If retweets are the social equivalent of links on web pages, then many people are turning their Twitter feeds into free-for-all directories. You wouldn’t recommend a doctor you’ve never visited, so why would you recommend a blog post you haven’t read? What if Danny’s blog had been hacked to include a Viagra post? I bet he’d still get several blind retweets. Stuff like that doesn’t help SEO get taken seriously as a science.

We’ve all heard that the squeaky wheel gets the grease, but in SEO it’s the loudest shouter who gets the attention. Usually those shouting loudest know their shit, but not always. It’s important for SEO that we continue to scrutinize and think critically about what people are saying – no matter who’s saying it. Everybody makes mistakes.

When we’re dealing with more than 200 inputs, as we are in SEO, it’s very easy to mix signals, miss relationships, or confuse correlation with causation. Rather than simply making claims, we as an industry need to focus on becoming more scientific. Don’t be afraid to run tests, share your data, and discuss methodologies. You may be right, you may be wrong, but either way you’ll be starting a scientific discussion in which everyone is bound to learn something – and that will make us all better SEOs.
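To make that concrete, here’s a minimal sketch (in Python, with made-up click and impression counts) of the kind of test anyone can run and share – a two-proportion z-test checking whether one set of META descriptions earns a significantly different click-through rate than another. Everything here is hypothetical, not data from any real test:

```python
import math

def two_proportion_z(clicks_a, impressions_a, clicks_b, impressions_b):
    """z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    # Pooled CTR under the null hypothesis that both variants perform the same
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    return (p_b - p_a) / se

# Hypothetical data: auto-generated descriptions (A) vs. hand-written ones (B)
z = two_proportion_z(clicks_a=120, impressions_a=10000,
                     clicks_b=165, impressions_b=10000)
print(round(z, 2))  # |z| > 1.96 means p < 0.05, two-tailed
```

With these numbers z comes out around 2.7, so the difference would be statistically significant – but the point isn’t the arithmetic, it’s publishing the raw counts so others can check the work.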

SEO in its current form is not science. It’s starting to be though. Several great people and companies are attempting to put the science back in SEO, and I’m excited to see where our industry can go once more people get on board.

Ryan Jones is an SEO from Detroit. By day he works as a manager of SEO & Analytics at SapientNitro where his team performs SEO for Fortune 500 clients. By night he's either playing hockey or attempting to take over the world with his own websites - which he would have already succeeded in doing had it not been for those meddling kids and their dog. The views expressed here have not been paid for and belong only to Ryan, not any of his employers or clients. Follow Ryan on Twitter at: @RyanJones, add him on Google+ or visit his personal website: www.RyanMJones.com

Comments

I need to say the “So you want to test SEO” session was one of my favorites up in Seattle. It’s odd though – I’m not a pure tech – I’m mostly a marketing and business management specialist. Yet I guess the reason I love the scientific process as it relates to our industry is that the more something is proven to be true, the more confident I am in implementing it and the better decisions I can make when auditing sites.

And when an audit is on a client site that generates millions of dollars in revenue each year, basing as many of my action plan recommendations as possible on actual science lets me sleep better at night 🙂

Nice post – although I would say that the reason an A-class SEO marketer such as Danny or Lisa gets loads of retweets and references over an A-class SEO scientist is that they ARE A-class marketers – and so know how to evoke enough passion in the audience to promote their posts.

I appreciate your thoughts here, Ryan. I think another aspect of SEO science that’s really tricky is that while traditional science tests hypotheses against entities that don’t change or change very slowly (the universe, biological organisms, etc.), SEO’s “universe” is the Google algo, and it’s changing all the time. What we tested and learned 5 years ago might be irrelevant today, and we can’t rest on our laurels for too long.

That brings up another issue – the fact that many “scientific” tests in our field don’t properly factor in things such as the majority of changes that Matt Cutts DOESN’T announce or discuss, let alone the concept that we really don’t know the nature of the claimed “200 factors”. So quite often, tests may “appear” to have causality when in truth they very well may not.

@Alan – It’s tough when so many ranking factors are interdependent. For example, someone was suggesting the other day that exact-match domains perform well because they tend to attract exact-match anchor text. You can try to separate the two, and while the claim turns out not to be 100% true, there’s definitely SOME truth to it. It’s hard to separate the factors out completely.

I’m a firm believer, though, that we should keep trying and support each other’s efforts, even when those efforts are imperfect. I’m not saying we can’t be critical – just that I want to see more attempts at SEO science and that we can all learn from each other, even the failed attempts (maybe ESPECIALLY the failed attempts).

Nice post, Ryan. I’d love to see more “here’s what I did, here’s what I saw” posts and fewer claims, but it’s still okay to make claims… it’s just a good idea to keep the two separate.

It gets political, too. If you make claims and show data in one post, some people will be inhibited by your rock star status and NOT comment, even though they have value to add. If you simply reported findings, they’d comment. That’s why I think it’s best to report findings objectively, and then report claims in a separate post… to keep the conversations productive.

If a post I write gets more retweets than someone else writing the same post, then clearly I’ve branded myself as someone worth listening to and who knows their shit on a given topic. Which is my job. If you want the same attention (as anyone, not me – lots of people rightly receive way more attention than I do), then you need to earn the same reputation. I make no apologies for being able to use the brand that I put in the time to build.

Second, I’m not arguing that you don’t deserve the attention. You write quality posts and do know your shit.

What I’m talking about are the people who “vouch” for you simply based on your name, without actually reading what you’re saying.

I mean, let’s face it, I’ve fucked up in the past, and I’m sure you’ve made mistakes too. All I’m asking is that people at least take the time to read what you’re saying before they say “hey, this is good shit.”

There’s nothing wrong with people forming their own opinions instead of just retweeting everything anybody famous posts as an attempt to give them a virtual hand job. (OK, I’ve been drinking.)

Unfortunately, many people believe things simply because somebody said them, without judging the ideas on their merits.

I wish more people would take the time to actually read stuff, think about it critically, and form their own opinion before accepting it as gospel truth.

Usually, you can be trusted to know your shit, but the scientific approach involves scrutinizing it, not just believing it because you said so. That sounds more like religion than science to me.

You can dream of a world where making good content keeps everyone happy and Good Content makes for a better web, but what you’re really doing is asking everyone to volunteer their time and effort to clean up the mess Google has made (and keeps making) on the way to the bank.

Billions and billions in profits put into individual bank accounts of Google shareholders (Google employees), at the expense of a polluted web, and you want us all to fix it by writing good content for free (or next to nothing).

John, I have to look at Graywolf’s article with a high degree of skepticism.

If you click over to that website, you’ll find that most of the articles aren’t really “great content” – they’re just pandering to Digg/StumbleUpon users. They’re all about “What’s the average Digg user like?” and “Differences between Digg and StumbleUpon users.”

Things like that are bound to get a ton of diggs and stumbles, but that doesn’t necessarily make them great content. The author is assuming that diggs and stumbles mean the content is linkworthy.

I’d argue that outside of Digg/StumbleUpon and the like, average internet users don’t find this content useful or interesting.

I’d also like to see what keywords the person was tracking. After reading some of the articles, they only seem optimized for terms like “digg” and “stumbleupon” – and he’s not going to rank for those terms.

My argument here is that good content still works. I’ve never had a problem with it on any of my sites. Perhaps his idea of “good” isn’t as “good” as he thinks it is. As somebody who rarely uses Digg and StumbleUpon (I know, I know), I didn’t find the content useful at all.

He should focus on writing content for all users, not just Digg users. Sure, it works for gaming Digg and getting on the front page, but that’s about it.

@Ryan – you should consider that for Graywolf, those Digg and StumbleUpon topics might be exactly his targets. You seem to suggest that, in order for Google to value your content within your niche, it needs to appeal to a general internet audience? That seems odd. It also seems counterintuitive… all we’d see ranking are celebrity articles and freakshow stuff. Wait a sec….

Anyway, in the niche he plays in, his articles are far higher quality than typical. He uses original data, makes interesting graphs, comes to conclusions, etc. It’s not multimedia content describing how to tie a necktie… but to his audience of marketers, understanding what makes Digg tick is very important (and appreciated by that marketplace). I believe he was comparing that content effort to various lesser content forms, or even the “Top 10 Signs You’re Not Going to Be a Rockstar” stuff that seems to work really well in Google.
