How intention may influence search result click-through [Conversion Conference]

A few months ago I was invited to speak at the excellent Conversion Conference London. We have a strong conversion rate optimisation knowledge base, but the title for this particular session was in fact: “SEO and SEM vs. CRO – Tactics for Optimising Both Search & Conversion”. Around the time I agreed to speak on the panel for the December conference, SlingshotSEO had published their click-through rate study (first, via SEOmoz), while MiraMetrix were working on an interesting eye-tracking study for Dr Pete to publish later on SEOmoz.

The Problem with CTR Studies

While click-through rate studies are helpful, useful and interesting, they do have limitations. Even the best studies don’t take universal search results into account, and while the data tends to aggregate many hundreds if not thousands of search queries, the query intent is not recorded. Click-through varies with variables beyond ranking position. All of this got me thinking – could we measure how a searcher’s intention (for example, informational query intent vs transactional) might affect the click-through rate on a search results page? At the time, no tools existed to help me reach a conclusion.

SERP Turkey

Probably not much more than two weeks ago (and around a week before the conference!), Tom Anthony launched his handy new A/B split testing tool: SERP Turkey. Suddenly, it became possible to A/B test variations of search results while recording the number of clicks each variation achieved. While the early version of the tool has obvious limitations, it is incredibly useful and easy to use. I’d like to thank Tom – he actually coded a change to the tool as a feature request to enable me to copy entire tests (saving me a great deal of time!). Thank you, Tom.

A Simple Test To Measure the CTR in a Search Result Variation

Ready for Conversion Conference, Fabian and I set about designing a simple test. We wanted to see if we could measure differences in click-through when searchers were given two different scenarios: an “informational” search scenario for the query “pool tables” and a “transactional” search scenario for the same query. In each scenario we served variations of search results to our searchers in which the position 2 result had differently worded snippets.

Searchers were sourced via Mechanical Turk (approval rating over 95% plus Master Categoriser status). There are obvious caveats to using mTurk, though Tom’s “Great White Shark” test did show that the integrity of the Turkers’ choices was of a good standard. Here were our scenarios; each variation received around 120 searches in which the time spent examining the search results exceeded five seconds.

Scenario 1: You’re researching Pool Tables. Click the result that would be most relevant to you.

Scenario 2: Imagine you’re about to buy a Pool Table. Click the result that would be most relevant to you.

The Test Search Pages

There were 3 variations of our search result pages. As SERP Turkey doesn’t support rich snippets, the third variation was actually a hack – and the time it took to set it up meant we only tested it in the transactional scenario. The only change in each variation was the wording or presentation of the second result. No other changes were made.

Informational Variation

Transactional Variation

Transactional Variation w/Rich Snippet

What Were the Results?

Our hypothesis was that in a research scenario, a searcher may prefer the snippet with broader, informational terms: “experts”, “beginner”, “advanced”, “comprehensive”, “information”, “plenty”, “learning” and “schools”. In a transactional scenario, the searcher may respond to the more transactionally worded snippet: “buy”, “free delivery”, “sale”, “prices”, “buy today”. Here is a typical results page from one of the scenarios.

Result: informational intent

In the informational scenario, our more “informationally” worded variation won. The transactional variation received an 11.34% CTR, while the informational version received 16.36%. You can see that an extra line of text snuck through QA, which was a little annoying – however, I’m not convinced this skewed the results in the way we might expect, as the later transactional test may show.

Result: transactional intent

In the transactional scenario, the more “transactionally” worded snippet won. Our transactional variation received 19.35% of the clicks, while the informational snippet received an 11.35% CTR. Interesting – if the lengthier snippet were skewing CTR, surely this test would have highlighted the issue?

Final result: CTR when we introduce a rich snippet

As I mentioned earlier, we only introduced a third variation (the inclusion of a rich snippet) in the transactional scenario. The final result was perhaps unsurprising: the plain transactional variation received a 21.52% CTR vs the informational snippet at 14.93%, while our rich-snippet, transactionally worded variation won at 26.32% CTR.

The Obvious Caveats

Firstly, I’ll get the obvious caveats out of the way. This is one query – though I feel click-through will always vary based on the individual query and the searcher’s intent at that exact moment. Our variations were far from eloquently written; they were instead designed to place emphasis on the types of terms that imply purchase or information. Aside from the fact that the searchers were mTurk based, I’d also like more traffic to play with. Because we chose the highest-rated Turkers, each test has only reached around 120 usable results so far. It’ll keep running until there are 500, which at this rate will be sometime in January 2012! We also identified an anomaly where position 4 was beating or equalling position 1’s CTR in two tests. We think this may be down to the density of the phrase “pool tables” in the snippet and title, but we need to test for that.
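The sample-size caveat can be sanity-checked with a standard two-proportion z-test. Here's a minimal sketch in Python (stdlib only); the click counts are illustrative approximations of the reported CTRs, not the raw test data:

```python
from math import erf, sqrt

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for a difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)            # pooled CTR under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p
    return z, p_value

# Roughly 14/120 (~11.7%) vs 20/120 (~16.7%) clicks on the position-2 result
z, p = two_proportion_z_test(14, 120, 20, 120)
print(f"z = {z:.2f}, p = {p:.3f}")
```

At around 120 searches per variation, a gap of five percentage points typically falls short of conventional significance thresholds, which is one more reason to let the tests run on towards 500 results.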

As this was one query, the only conclusion I’d take away from this as a reader is that CTR will vary depending on intent, and that you should learn how to test your own variations to find a winner. With that said, I think it’s very exciting to have tools like SERP Turkey to make these kinds of tests far, far easier.

The Excitement, for Me at Least

The thing that excites me is this: we can fight for better click-through without moving the position we rank in, simply by experimenting with different wordings in our meta descriptions. That’s good to know. It’s good to know that a variation of a snippet may be able to persuade searchers to click a second-place result more than they would a first-place result. Win!

Improve the Way Your Site Looks in Search Results

Improving the way our sites appear to searchers in search results pages is something we really should be doing more of. Later in my Conversion Conference presentation, I showed how using two 160(ish)-character sentences in a page’s meta description (this one, to be completely clear) can result in Google displaying a more relevant snippet to the user. That was an idea I first discovered on Dave’s blog. For the sake of brevity (I am being hurried along to pack and fly to Australia as I write!), I’ll ask you to flick through the presentation below to see exactly how I decided to word that snippet based on analytics keyword referrer data.
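If you want to apply the two-sentence meta description tactic at scale, a quick length check helps. This is a rough sketch in Python; the helper names and the naive full-stop split are my own assumptions, and the 160-character budget is approximate since Google's actual snippet cut-off varies:

```python
SNIPPET_LIMIT = 160  # rough per-sentence budget; the real cut-off varies

def sentence_lengths(meta_description):
    """Naively split a meta description on '. ' and return sentence lengths.

    Assumes simple prose without abbreviations or decimal numbers.
    """
    parts = [s.strip() for s in meta_description.rstrip(".").split(". ")]
    return [len(part) + 1 for part in parts]  # +1 to count the full stop

def fits_snippet(meta_description, limit=SNIPPET_LIMIT):
    """True if every sentence could be shown whole within the limit."""
    return all(length <= limit for length in sentence_lengths(meta_description))

description = ("Learn how pool tables are made and what to look for. "
               "Buy a pool table today with free delivery on every order.")
print(sentence_lengths(description), fits_snippet(description))
```

The idea is simply that each sentence, on its own, should fit within a displayable snippet, so Google can surface whichever sentence best matches the query.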

Responses

I think it goes back to the personas from web design. Ideally, each persona should have a specific conversion goal; SEO/SEM should work to target these personas and align with the conversion tracking within the web analytics solution. That data should provide insights to improve the CRO process.