
Google CTR study: top three positions deliver 35% of traffic

The recent publication of a new click-through rate (CTR) study by an agency called Slingshot has sent ripples through the SEO community.

Why is this important to anyone, you may ask? Let me enlighten you…

Ever since SEO was coined as a term, brands employing the services of agencies have asked what they are likely to get as a result of deploying it.

To answer this, agencies for the most part used to focus on the first and most logical impact of SEO: rankings.

In fact, in some cases a commercial model was built around it, whereby the agency only got remunerated for positions gained (typically banded by page: first, second and third; or by position: first to tenth, 11th to 20th, and so on; and in some cases top three results only).

In those days only a brave (and foolish) man would forecast. Back then it was the wild west of SEO! Search engine algorithms only had a couple of factors influencing them. Some even returned results alphabetically.

One day you could rank top using white text on a white background; the next you'd be pipped by someone keyword-stuffing a cloaked, meta-refreshed landing page.

Obviously, for a transactional site, or even just a brochureware site, ranking visibility was only the front line of measuring success, and it became logical to look beyond visibility to the visits that came from it.

However, very few providers shared any relevant data to make forecasting possible. Two notable exceptions were Wordtracker (still available today) and Overture, the latter returning paid search click data from its alliance with Yahoo.

It wasn’t until Google launched its keyword tool and traffic estimator as part of its AdWords offering that we got anywhere near accurate data to use. Even then, the insight generated from it was suitably crude.

You could now pick keywords based on their ‘search volume’. However, there was virtually no correlation between search volume and actual traffic as measured by the site's analytics.

One word of warning, and a common misapprehension of Google’s keyword tool data: it’s collected from Google's entire advertising inventory. That means the returned counts include clicks on adverts from all around the web, not just search.

Obviously, back in 2006 search results looked very different than they do now. It’s also worth pointing out the following: 1) users were much less internet savvy, 2) connections were much slower, and 3) AOL users were cretins!

Multiplying the search volume from Google’s keyword tool by the click through rate from the leaked AOL data suddenly gave us the ability to determine how much traffic you might get from any top ten position in Google.
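That multiplication can be sketched in a few lines. This is a minimal illustration only: the per-position CTR figures below are placeholders standing in for the leaked AOL curve, not the exact published values.

```python
# Sketch: forecast monthly organic clicks by multiplying keyword search
# volume by a position-based CTR curve (figures are illustrative, not the
# actual AOL or Slingshot values).

CTR_BY_POSITION = {1: 0.423, 2: 0.119, 3: 0.084}  # hypothetical per-position CTRs

def forecast_clicks(monthly_search_volume, position, ctr_curve=CTR_BY_POSITION):
    """Estimate monthly clicks for a keyword at a given ranking position."""
    ctr = ctr_curve.get(position, 0.0)  # positions off the curve get zero
    return round(monthly_search_volume * ctr)

# e.g. a keyword with 10,000 monthly searches ranking 2nd:
estimate = forecast_clicks(10_000, 2)
```

Swap in whichever CTR curve you trust (AOL, Optify, Slingshot) and the mechanics stay the same, which is exactly why the choice of study matters so much.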

The trouble was that no one could actually determine whether these figures were still wildly out. The only way to get impression data, and so work out the percentage click-through rate behind the traffic you were actually getting, was through Pay Per Click, and that was like comparing apples and oranges, with numerous other factors as caveats.

Google Webmaster Tools resolves this issue.

So what impact does this CTR study have on me?

Take a look at the data comparison table from the four different ‘studies’ below. The (conditional formatting) colour scheme lets you easily see where the highest click-through rates were (green) and the lowest (red):

Quite a change between 2006 and 2011. What’s most amazing is the difference between the potential click throughs in total.

For the top three results, according to the Slingshot study, you can now only expect to receive 35.5% of the available traffic. This is as opposed to almost twice that (62.5%) in 2006.

Equally impressive is the drop in total available searches for page one results (top ten). You can now only expect 52.4% of available searches on page one, as opposed to a whopping 89.6% in 2006!

Qualifying the study data

Now that we are able to see impressions by keyword in Google Webmaster Tools, we can determine the actual click-through rate of your keywords. I’ve always been wisely sceptical, so to check this data out I conducted three real case studies.

The data is interesting, but first here’s some relevant background info on the sample set. All figures are from August. Client A is in the travel sector, Client B is an e-commerce retailer, and Client C is in utilities.

There are obviously a number of caveats to take into account when qualifying this data against your client's, not least the size of the data set and the stability of positional rankings. However, for non-brand terms the click-through rates were much higher than expected.
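The observed CTR itself is just clicks divided by impressions per keyword. A minimal sketch, assuming a Webmaster Tools-style export (the field names here are assumptions, not the real export schema):

```python
# Sketch: derive observed CTR per keyword from an impressions/clicks export.
# Row fields ('keyword', 'impressions', 'clicks') are hypothetical names.

def observed_ctr(rows):
    """Return {keyword: clicks / impressions} for rows with impressions."""
    out = {}
    for row in rows:
        if row["impressions"]:  # skip zero-impression rows to avoid division by zero
            out[row["keyword"]] = row["clicks"] / row["impressions"]
    return out

sample = [
    {"keyword": "cheap flights", "impressions": 12000, "clicks": 1800},
    {"keyword": "energy tariffs", "impressions": 4000, "clicks": 220},
]
ctrs = observed_ctr(sample)  # e.g. 1800 / 12000 = 15% for 'cheap flights'
```

It's these observed figures, keyword by keyword, that you then hold up against the study's curve to see whether your clients over- or under-perform it.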

Now comes the interesting bit: forecasting. Knowing this data from our clients, is it wise to anticipate such stonkingly good CTRs from existing visibility?

I’d suggest not if you are embarking on a relationship with a client afresh. But if you are already engaged, and through your work have achieved fixed top three positions that earn these CTRs, then hell yeah, why not?

Forecasting just got easier, but it’s still the acumen and experience to anticipate your own SEO ranking capabilities that nails accuracy.

Huh. Thanks for deleting my comment. I guess you can't take criticism? I bet this one will mysteriously disappear as well - at least have the guts to respond to your critics rather than silencing them.

If the top three places take 35% and the first page takes 52% - are we to assume that the remaining 48% of clicks are lost to PPC ads and position-11 SERPs (and beyond)?

I'd be interested to know how this data tallies with PPC out-click data... The Google AdWords accounts for the above stated clients might well give you a combined first-page SEO and PPC out-click percentage of 95%, leaving page 2 of the SERPs and beyond to share only 5% of clicks.

Seems to me the SEO CTRs will vary dramatically based on the presence and relevance of PPC ads. Blended search must also come into play - you are going to get video ads, maps, products n' all sorts cropping up.

I wonder if the changing face of the SERPs has anything to do with the dramatic percentage drop. Now the search results are filled with videos, images, local listings and more BEFORE you get to the actual listings. That's a lot more noise to contend with, even if you are ranking in the top 3.

The 2006 AOL data was pre-universal search, and before the phenomenon that scattered the heatmap around the page which for a while became known as 'chunking & fencing'!


Adam Lee, Managing Director at No Pork Pies

I've always felt that historical CTR studies don't take into account brand search spikes, but my main concern was with variables in industry based behaviour.

When Tim looked into the research on three different sector clients, it was very interesting to see the position 1-3 split across the sectors. E-commerce had a fairly even split, possibly due to the price-sensitive nature of the results, whereas Travel and Utilities had a spike in position 1.

Tim, would you say this could indicate that a price-sensitive keyword should focus more on its message and price, regardless of where it ranks in the top three?

Great minds think alike. I covered much of the same ground in a recent posting, but check out my part 2 where I further calibrated the curve against real world data, to make it easy to calculate organic opportunity vs. position.

There are a number of problems, especially abandoned searches, you have to account for.

Also, it's important not to use GWT impression data, as if your client is ranking 11 for a keyword, that means they are on page 2. You'll only see the page 2 impressions registering on GWT, there won't be any evidence of the page 1 only searches that occurred, so it's not a reliable way to gauge opportunity - Adwords data itself is the way to go.

Interesting insight, but as Tim mentions there are a number of factors that would dilute the keyword volume - the SEOmoz blog on eye tracking, for one, which surely will dilute the potential volume of search traffic, as will the PPC percentage?

Secondly, I think you have to take this analysis with a pinch of salt; surely it depends on what sector you operate in. If you're selling a competitive product, e.g. football shirts, the potential keyword volume would be diluted by the volume of AdWords activity?

Your own data seems to contradict the findings of the slingshot report, as does the evidence from our own clients. Whilst I can see that web users are getting more "savvy" and using search results more intelligently, a 50% drop in first place click throughs in a year would be a phenomenal shift.

I would suggest either the last report (Optify) or the new one (Slingshot) - or both - are wrong. I simply cannot see such a sea change happening in that space of time, especially when it doesn't seem to be backed up by experience of actual sites.

The only way I can see that this would happen is if another parameter has been included, or excluded - such as including the Adwords into the mix, adding in results from Shopping search (where users have historically been more likely to look deeper) etc etc. The other factor which could have a bearing is if research search strings have also been included, where again users will dig deeper.

Apologies to anyone whose comments were not published immediately. This is due to our spam filter, which sometimes picks up genuine comments.

@Rand You're absolutely right. Tim did initially add the link, but I forgot to put it back in after rewording the first para. The link's in there now, and Slingshot should hopefully get a few more downloads of an excellent study.
