This post is a brief follow-up, giving some extra context and some of the things that Chris and I noticed whilst conducting this research.

tl;dr: Across 1,056 SERPs we looked at, not one of them was an exact match to the previous hour — that’s 0% stability.

When looking into some ranking discrepancies for a client, Chris noticed, via some manual position checks, that results seemed to be moving around as often as hourly.

To quantify the scale of what was happening, we set up a small experiment that has been running over the past two weeks.

Methodology

Using our in-house rank tracking platform at StrategiQ, we set up hourly checks for the top 100 results in Google for a small set of 6 keywords. We then tracked the movement of each unique URL we saw in the SERP for just over a week.

The keywords we’re looking at are fairly competitive with around 20 major players competing for the top spots on Google.co.uk.

It’s worth noting that whilst we tried to account for as many variables as possible (targeting the same location, using IPs from the same range, etc.), we haven’t accounted for potential differences caused by hitting different datacentres, which may have impacted the results. Dr Pete actually looked into this way back in 2012, so I’d recommend checking out his post too.

Results

At first glance, it certainly looks like there’s a fair bit of volatility. The graph below shows positions for one particular SERP we tracked, with each line representing a unique URL and showing its position changes over time.

If you look at the top 10 in particular, you can see indications of sites seemingly fighting for positions.

We saw a similar story, to a greater or lesser extent, across all of the different keywords we analysed.

Another thing we noticed is that totally new URLs, ones that hadn’t previously appeared anywhere in the top 100, can rank fairly highly (page 2–3) for as little as a few hours, then disappear entirely again.

We can also see that there’s a fairly constant baseline of movement. In fact, across the 1,056 SERPs we looked at, not one of them was an exact match to the previous hour.
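The check behind that 0% figure is straightforward to express in code. Here’s a hypothetical sketch (not our actual tracking platform) of the comparison, assuming each hourly snapshot is stored as an ordered list of the top-100 URLs: two consecutive snapshots count as stable only if they contain the same URLs in the same order.

```python
def count_stable_snapshots(snapshots):
    """Count hourly snapshots that exactly match the previous hour's snapshot."""
    stable = 0
    for prev, curr in zip(snapshots, snapshots[1:]):
        if prev == curr:  # same URLs, in the same order
            stable += 1
    return stable

# Toy example: three hourly snapshots of a three-result SERP.
hourly = [
    ["a.com", "b.com", "c.com"],
    ["a.com", "c.com", "b.com"],  # positions 2 and 3 swapped
    ["a.com", "c.com", "b.com"],  # identical to the previous hour
]
print(count_stable_snapshots(hourly))  # → 1 (one of two hour-over-hour comparisons)
```

In our data, this count was zero across all 1,056 hour-over-hour comparisons.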

Top 10 vs Top 100

Something consistent across all of the terms we looked at was that however chaotic the lower-ranking results were, page one would remain fairly consistent, with the top 10 domains either staying put or just flipping one or two positions between themselves.

One explanation could be that the URLs on page 1 have higher authority, so a larger change in ranking signals is required to overtake them, whereas smaller changes are enough to reshuffle results below page 1. But that’s a question for another study!
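One way to quantify this page-one stability would be to compare average position movement between rank bands. The sketch below is a hypothetical approach (not what our platform does), assuming the same ordered-URL-list snapshots as before; a URL that drops out of the tracked set between snapshots is scored as if it fell just off the end of the list.

```python
def band_movement(prev, curr, band):
    """Average absolute position change between two snapshots, for URLs
    whose *previous* rank falls within `band` (a range of 1-based positions)."""
    prev_pos = {url: i + 1 for i, url in enumerate(prev)}
    curr_pos = {url: i + 1 for i, url in enumerate(curr)}
    deltas = [
        abs(curr_pos.get(url, len(curr) + 1) - pos)  # missing URL -> just off the end
        for url, pos in prev_pos.items()
        if pos in band
    ]
    return sum(deltas) / len(deltas) if deltas else 0.0

# Toy example: positions 1-2 swap, position 3 holds, position 4 drops out.
prev = ["a.com", "b.com", "c.com", "d.com"]
curr = ["b.com", "a.com", "c.com", "e.com"]
print(band_movement(prev, curr, range(1, 3)))  # → 1.0
print(band_movement(prev, curr, range(3, 5)))  # → 0.5
```

Run over every consecutive pair of snapshots with bands like `range(1, 11)` and `range(11, 101)`, this would give a movement figure per band per hour, making the page-one vs lower-results contrast directly measurable.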

Summary

Whilst just a quick experiment and a sample of what we’ve noticed so far, this is yet another reminder that Google’s results are in a constant state of unpredictable change, even ignoring major algorithm updates. Gone are the days of high flux signalling an algorithm shift: change appears fairly constant, and far more frequent than most people realise. To put this in perspective, below is the same chart that we started with, but looking at daily snapshots instead.

Whilst still fairly volatile, it puts in perspective how much of the true picture is being missed!

Combine this with the rise of more real-time features in the SERPs (such as news or tweet results) and we can see that the rank tracking we do for our sites captures just one of many variations of a given SERP, so it should really only be used as a benchmark rather than a primary KPI.

The original tweet we put out before this post has generated some interesting discussion over on Twitter — feel free to chime in with your thoughts!