Google Broad Core algorithm update: Is Google testing SERPs quality?

Google suggests that this update has nothing to do with the quality of content, but instead focuses on improving the quality of the SERPs.

We would tend to agree, as the only real losses we’ve seen at Pi Datametrics – while dramatic – tended to be short-lived and occurred in the run-up to the update itself.

So if Google wasn’t testing quality, what were they testing?

I turned to the SERPs to have a look; focussing on the period just before, during and after the recent update. I asked Google a relatively simple question, then analysed the results to detect any rumblings or suspicious flux.

Chart 1: Testing the Google Broad Core algorithm update

Google Query: What’s the best toothpaste?

I’ve focussed primarily on content that was visible on page 1 or 2 at the start of this year.

We can clearly see all these pages drop out of the top 100, then reappear on the same day. This occurred multiple times over a five-week period.

Seven websites all performed pretty well (visible on pages 1 and 2), with a further two sites that had no previous visibility appearing midway through the shakeup (Expertreviews [dark pink] and Clevelandclinic [dark blue]).

The obvious shakeup started on 24 January, roughly five weeks before the algorithm was said to have fully rolled out (Sunday 4 March).

What we have here is a pattern we’ve seen many times before; something that is only visible with access to daily data on the entire SERPs landscape. It looks like a period of testing before the full rollout, which is only to be expected.
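The signature described here can be checked programmatically if you can export daily rank positions per domain from your rank tracker. A minimal sketch (the data shape and domain names are illustrative assumptions, not Pi Datametrics output; `None` stands for "outside the top 100"):

```python
# Flag days on which several tracked domains simultaneously drop out of
# the top 100 -- the coordinated pattern that suggests algorithmic testing.
# ranks[domain] is a list of daily positions; None = not in the top 100.

def coordinated_dropout_days(ranks, min_sites=3):
    """Return day indices where at least `min_sites` domains that were
    ranked the previous day all fall out of the top 100 together."""
    domains = list(ranks)
    days = len(next(iter(ranks.values())))
    flagged = []
    for day in range(1, days):
        dropped = [d for d in domains
                   if ranks[d][day - 1] is not None and ranks[d][day] is None]
        if len(dropped) >= min_sites:
            flagged.append(day)
    return flagged

# Toy example: three sites vanish together on day 2, then return on day 3.
ranks = {
    "siteA": [4, 5, None, 6],
    "siteB": [9, 9, None, 10],
    "siteC": [14, 13, None, 15],
}
print(coordinated_dropout_days(ranks))  # [2]
```

A single site disappearing on its own would not trigger the flag; it is the simultaneity across domains that marks the event as algorithmic rather than isolated.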

In the chart above we can see the flux continuing from 5 February onwards. Every site involved experiences almost the exact same pattern of visibility loss.

Things finally settle down on 8 March. At first glance, it looks like all sites regain their original positions. On closer inspection, however, all came out slightly worse off, by an average of just over two positions; the smallest drop was one position (which can be painful on page one) and the largest was six.
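That "slightly worse off" finding is simple to quantify: compare each site's settled position after the update with its position beforehand. A sketch, using placeholder positions rather than the real chart values:

```python
# Average net position change once the SERP settles.
# Positive delta = positions lost. The numbers below are illustrative
# placeholders, not the actual pre- and post-update chart data.

def average_position_loss(before, after):
    """Mean rank change across domains present in both snapshots."""
    deltas = [after[d] - before[d] for d in before if d in after]
    return sum(deltas) / len(deltas)

before = {"siteA": 3, "siteB": 7, "siteC": 12}  # early-January positions
after  = {"siteA": 4, "siteB": 9, "siteC": 18}  # settled post-update positions
print(average_position_loss(before, after))  # drops of 1, 2 and 6 -> 3.0
```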

Daily SERPs data shows you when to act and when to sit tight

If this chart says one thing, it is DON’T PANIC if you drop out of the top 100 for a term you care about!

Just keep monitoring the SERPs every day. If you’ve ruled out content cannibalisation, it could well be a period of algo testing, as with the Broad Core update.

If you’ve put the searcher first and created the kind of rich content that will satisfy them, then the chances are you will recover from these testing times. You could even do better than recover: the Expertreviews site above, after injecting a long-form, socially popular and recently updated piece of content into its ecosystem, moved from nowhere to position three, nudging all others down a peg.

Chart 3: Content that matched user intent was safe during the update

The only two websites entirely unaffected by all of this were Reviews.com and Which.co.uk, proving that a combination of first-mover advantage, relevance and fantastic authority ensures high visibility and algorithmic stability:

So, the immediate questions are – who has benefited from this shakeup? What happened in the gaps between the spikes? Who’s lost out and why? Are we now seeing a SERP more aligned with the intent of the searcher?

Chart 4: Who benefited from the early shakeup?

It wasn’t Expertreviews or Clevelandclinic. They benefited later.

Let’s introduce some of the momentary winners who gained visibility during the downtime of all sites:

Businessinsider.com benefited from the initial shakeup. It has some great content, but it hadn’t been updated since October 2017. It had been indexed all this time, but only really became visible when Google pushed the previously well-positioned sites out. Result? It survived the shakeup and ended on page one.

The same happened to the Colgate page. Note its /en-us/ URL path; arguably, it shouldn’t be visible in the UK at all. This page only provided a list of toothpaste types, e.g. ‘Fluoride’ or ‘Tartar control’. This didn’t answer my question or match my intent (Q: What’s the best toothpaste?). Result? Colgate ended up dropping back to page five after the shakeup.

The Amazon page simply displays a list of its bestsellers in toothpaste. From a content perspective, it’s not that inspiring. Result? Ended up dropping back to page three.

So the question is: If I were searching for “What’s the best toothpaste?” which of these new pages would I prefer?

All pages are mobile friendly, but if I really wanted to know what the best toothpaste was, I’d definitely prefer to read the Businessinsider.com page – coincidentally the only page that moved up to page one following the shakeup and remained there.

In other words, the only page to satisfy my intent was in fact the only page that remained visible post-shakeup. This page, to me, answers my question perfectly.

What do these insights prove about the Broad Core update?

Based on our testing, we can deduce that this algorithm is concerned with optimising search results to support user intent, rather than to explicitly audit quality.

Why?

Because losses were not drastic, meaning we can rule out a penalty of any kind.

Of all winners, none appeared to rise as a result of content updates.

Some sites with strong, relevant content seemingly lost rankings in Google UK because they were intended for the US market. This suggests that Google was auditing relevancy factors beyond content alone (i.e. location signals such as the domain or URL locale), to serve the best results and satisfy user intent.

In this respect, Google’s core update was concerned with the nature rather than the quality of content.

What better way to test the match of content nature with searcher intent than by shaking up the SERPs for a couple of weeks to determine user reaction?

Should you panic when your content visibility nosedives?

If your content visibility drops, it’s always necessary to carry out checks to ensure you have done everything within your power to mitigate the issue.

In the face of an algorithm update (like Google Broad Core), however, the best advice is to do nothing but monitor the SERPs closely.

If it is algorithmic testing, you most certainly won’t be the only one involved. Other sites will follow the exact same pattern down to the day. That’s a big clue that it’s algorithmic rather than isolated.
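One rough way to act on that clue is to compare the set of days on which your domain left the top 100 with each competitor's set. A sketch under the same assumed data format as before (hypothetical domains; `None` = outside the top 100):

```python
# Distinguish an isolated drop from algorithm-wide testing: if competitors'
# out-of-top-100 days overlap heavily with yours, the event is likely
# algorithmic; if no one else shares your pattern, investigate your own site.

def dropout_days(series):
    """Day indices on which the domain was outside the top 100."""
    return {i for i, pos in enumerate(series) if pos is None}

def overlap(a, b):
    """Jaccard similarity of two day sets (1.0 = identical pattern)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

ranks = {
    "you":   [3, None, None, 4],
    "rival": [8, None, None, 9],
    "other": [15, 15, 14, 16],   # unaffected throughout, like Reviews.com above
}
mine = dropout_days(ranks["you"])
for domain in ("rival", "other"):
    print(domain, overlap(mine, dropout_days(ranks[domain])))
# rival 1.0  -> same pattern down to the day: algorithmic
# other 0.0  -> no shared drop days
```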

You simply can’t know whether Google is testing its algorithm unless you have daily SERPs data and visibility on the performance of the entire SERPs landscape for the content you’re interested in. We use the Pi Datametrics SEO platform.