Tag Archive | "Traffic"

Google’s increasing dominance of their own search engine results pages (SERPs) has kicked up a lot of panic and controversy in the SEO industry. As Barry Adams pointed out on Twitter recently, this move by Google is not exactly new, but it does feel like Google has suddenly placed their foot on the accelerator:

I find it hilarious that SEOs are suddenly annoyed that Google is aggressively taking over some verticals with in-SERP features. They’ve been doing that for years.

What do you think the EU antitrust case is about?! Or do you suddenly care because it affects your clients?— Barry Adams (@badams) March 15, 2018

Follow that Twitter thread and you’ll see the sort of back-and-forth these changes have started to create. Is this an ethical move by Google? Did you deserve the business they’re taking in the first place? Will SEO soon be dead? Or can we do what we’ve always done and adapt our strategies in smart, agile ways?

It’s hard to think positively when Google takes a stab at you, as it did with this move on Ookla:

But regardless of how you feel about what’s happening, local packs, featured snippets, and SERP features promoting Google properties like News, Images, Flights, Videos, and Maps are riding on a train that has no plans of stopping.

To give you an idea of how rapidly these changes are occurring, the image below is what the SERP rankings looked like in November 2016 for one of our client’s key head terms:

And this image is the SERP for the same keyword by early December 2017 (our client is in green):

Who is this blog post for?

You’re likely reading this blog post because you noticed your organic traffic has dropped and you suspect it could be Google tanking you.

Traffic drops tend to come about from four main causes: a drop in rankings, a decrease in search volume, ranking for fewer keywords, or SERP features and/or advertising depressing your CTRs.

If you have not already done a normal traffic drop analysis and ruled out the first three causes, then your time is better spent doing that first. But if you have done a traffic drop analysis and reached the conclusion that you’re likely to be suffering from a change in SERP features, then keep reading.

But I’m too lazy to do a full analysis

Aside from ruling everything else out, a strong indication that SERP features are to blame is a significant drop in clicks in Google Search Console (either broadly or for specific queries) while average position stays static and impressions hold nearly steady.

I’ll keep harping on about this point, but make sure that you check clicks vs impressions for both mobile and desktop. Do this both broadly and for specific key head terms.

When you spend most of your day working on a desktop computer, it’s easy in this industry to forget how much mobile actually dominates the scene. On desktop, the impact these features have on traffic is not as drastic; but on a mobile device, it’s not uncommon to scroll through around four full screens before organic listings appear.
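As a rough sketch of that clicks-vs-impressions check, assuming you’ve exported two Search Console performance reports to CSV (the `query`/`clicks`/`impressions`/`position` column names below are hypothetical; adjust them to match your export), you could flag queries whose clicks fell sharply while average position stayed roughly static:

```python
import csv

def flag_serp_feature_suspects(before_csv, after_csv,
                               click_drop=0.30, position_tolerance=2.0):
    """Compare two GSC query exports and flag queries whose clicks
    dropped sharply while average position stayed roughly static --
    the classic fingerprint of a new SERP feature soaking up CTR."""
    def load(path):
        with open(path, newline="") as f:
            return {row["query"]: row for row in csv.DictReader(f)}

    before, after = load(before_csv), load(after_csv)
    suspects = []
    for query, b in before.items():
        a = after.get(query)
        if a is None:
            continue
        b_clicks, a_clicks = float(b["clicks"]), float(a["clicks"])
        b_pos, a_pos = float(b["position"]), float(a["position"])
        if b_clicks == 0:
            continue
        clicks_fell = (b_clicks - a_clicks) / b_clicks >= click_drop
        position_static = abs(b_pos - a_pos) <= position_tolerance
        if clicks_fell and position_static:
            suspects.append(query)
    return suspects
```

Run it once on a desktop export and once on a mobile export; per the point above, the two lists can look very different.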

From there, the steps to dealing with a Google-induced traffic drop are roughly as follows:

Narrow down your traffic drop to the introduction of SERP features or an increase in paid advertising

Figure out what feature(s) you are being hit by

Gain hard evidence from SEO tools and performance graphs

Adapt your SEO strategy accordingly

That covers step one, so let’s move on.

Step 2.0: Figure out which feature(s) you are being hit by

For a comprehensive list of all the different enhanced results that appear on Google, Overthink Group has documented them here. To figure out which one is impacting you, follow the steps below.

Step 2.1

Based on your industry, you probably already have an idea of which features you’re most vulnerable to.

Are you an e-commerce website? Google Shopping and paid advertising will be a likely candidate.

Do you tend to generate a lot of blog traffic? Look at who owns the featured snippets on your most important queries.

Are you a media company? Check and see if you are getting knocked out of top news results.

Do you run a listings site? Maybe you’re being knocked by sponsored listings or Google Jobs.

Step 2.2

From there, sanity check this by spot-checking the SERPs for a couple of the keywords you’re concerned about to get a sense of what changed. If you roughly know what you’re looking for when you dig into the data, it will be easier to spot. This works well for SERP features, but a change in the amount of paid advertising will be harder to spot this way.

Once again, be sure to do this on both mobile and desktop. What may look insignificant from your office computer screen could be showing you a whole different story on your mobile device.

Step 3.0: Gain hard evidence from SEO tools and performance graphs

Once you have a top level idea of what has changed, you need to confirm it with SEO tools. If you have access to one, a historical rank tracking tool will be the most efficient way to dig into how your SERPs are evolving. I most frequently use STAT, but other great tools for this are Moz’s SERP features report, SEOmonitor, and SEMRush.

Using one of these tools, look back at historical data (either broadly or for specific important keywords) and find the date the SERP feature appeared if you can. Once you have this date, line it up with a dip in your organic traffic or other performance metric. If there’s a match, you can be pretty confident that’s to blame.
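The before/after comparison can be sketched in a few lines. This assumes you’ve already pulled daily click counts for the query (from Search Console or your analytics tool) into a simple date-to-clicks mapping; the split date comes from your rank tracker:

```python
from datetime import date
from statistics import mean

def clicks_change_around(daily_clicks, feature_date):
    """daily_clicks maps datetime.date -> clicks for one query.
    Returns (avg_before, avg_after, percent_change), splitting the
    series at the date the SERP feature appeared."""
    before = [c for d, c in daily_clicks.items() if d < feature_date]
    after = [c for d, c in daily_clicks.items() if d >= feature_date]
    avg_before, avg_after = mean(before), mean(after)
    return avg_before, avg_after, (avg_after - avg_before) / avg_before * 100
```

A sharp negative percent change that coincides with the feature’s first appearance (and no movement in average position) is the match you’re looking for.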

For example, here’s what this analysis looked like for one of our clients on a keyword with a regional search volume of 49,500. They got hit hard on mobile first by the appearance of a local pack, then an events snippet 10 days later.

This was the clicks and impression data for the head term on mobile from Google Search Console:

This case is another strong reminder that when you’re analyzing these changes, you must check both mobile and desktop. Features like knowledge panels are much more intrusive on mobile devices than they are on desktop, so while you may not be seeing a dramatic change in your desktop traffic, you may on mobile.

For this client we improved their structured data so that they showed up in the event snippet instead, and were able to recover a good portion of the lost traffic.
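By way of illustration, event rich results are driven by schema.org Event markup embedded in the page as JSON-LD. This is a minimal sketch with invented event details, not our client’s actual markup:

```python
import json

# A minimal schema.org Event snippet; every value here is an invented
# placeholder -- swap in your real event data before using it.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example Conference 2018",
    "startDate": "2018-06-01T09:00",
    "endDate": "2018-06-01T17:00",
    "location": {
        "@type": "Place",
        "name": "Example Hall",
        "address": "123 Example Street, London",
    },
    "url": "https://www.example.com/events/example-conference",
}

# Embed the output inside <script type="application/ld+json"> ... </script>
print(json.dumps(event, indent=2))
```

Validate the result with Google’s structured data testing tool before shipping it; missing required properties will keep you out of the snippet.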

How to adapt your SEO strategy

You may not be able to fully recover, but here are some different strategies you can use depending on the SERP feature.

Have you tried bidding to beat Google?

I cover what to do if you’re specifically losing out on organic traffic due to paid advertising (spoiler alert: you’re probably gonna have to pay), but paid advertising can also be used as a tactic to subvert Google SERP features.

For example, Skyscanner has done this by bidding on the query “flights” so they appear above the Google Flights widget:

Accelerated Mobile Pages (AMP)

AMP is a project sponsored by Google to improve the speed of mobile pages. For a lot of these challenges, implementing AMP may be a way to improve your rankings as Google SERPs continue to change.

If you’ve noticed a number of websites with AMP implemented are ranking on the first page of SERPs you care about, it’s likely worth investigating.

If you are a news website, implementing AMP is absolutely a must.

Featured snippets and PAA boxes

If you’re losing traffic because one of your competitors owns the featured snippets on your SERPs, then you need to optimize your content to win them back. I’ve already written a post on the Distilled blog about tactics for stealing featured snippets, which you can read here.

In summary, though, you have a chance to win a featured snippet if:

The ones you’re targeting are pretty volatile or frequently changing hands, as that’s a good indication the owner doesn’t have a strong hold on it

You rank higher than the current owner, as this indicates Google prefers your page; the structure of your content simply needs some tweaking to win the snippet

If you’ve identified some featured snippets you have a good chance of stealing, compare what the current owner has done with their content that you haven’t. Typically it’s things like the text heading the block of content and the format of the content that differentiates a featured snippet owner from your content.

Local packs

At SearchLove London 2018, Rob Bucci shared data from STAT on local packs and search intent. Local SEO is a big area that I can’t cover fully here, but if you’re losing traffic because a local pack has appeared that you’re not being featured in, then you need to try to optimize your Google My Business listing for the local pack if you can. For more in-depth instructions on how to get featured in a local pack, read here.

Unfortunately, it may just not be possible for you to be featured, but if it’s a query you have a chance of appearing in the local pack for, you first need to get set up on Google My Business with a link to your website.

Once you have Google My Business set up, make sure the contact and address information is correct.

Reviews are incredibly important for anyone competing within a local pack: not just high ratings, but also the number of reviews you’ve received. You should also consider creating Google Posts. In a lot of spaces this feature has yet to be taken advantage of, which means you could get a jumpstart on your competitors.

Paid advertising

More queries are seeing paid advertisements now, and there are also more ads appearing per query, as told in this Moz post.

If you’re losing traffic because a competitor has set up a PPC campaign and started to bid on keywords you’re ranking well for, then you may need to consider overbidding on these queries if they’re important to you.

Unfortunately, there’s no real secret here: either you gotta pay or you’re going to have to shift your focus to other target queries.

If you haven’t already added structured data to your website, you need to, as it will help you stand out on SERPs with lots of advertising. Wrapped into this is the need to earn good reviews for your brand and for your products.

Google Shopping

Similar to paid advertising, if the appearance of Google Shopping sponsored ads has taken over your SERPs, you should consider whether it’s worth you building your own Google Shopping campaign.

Again, structured data will be an important tactic to employ here as well. If you’re competing with Google Shopping ads, you’re competing with product listings that have images, prices, and reviews directly in the SERP results to draw in users. You should have the same.

Look into getting your pages implemented in Accelerated Mobile Pages (AMP), which is sponsored by Google. Not only has Google shown that it favors AMP pages, but better site speed will also lead to better conversion rates for your site.

To see if implementing AMP may be beneficial to your business, you can read some case studies of other businesses that have done so here.

Knowledge panels and carousels

Knowledge panels such as the one below appear for broad informational searches, and rarely on high-converting keywords. While they are arguably the most imposing of all the SERP features, unless you’re a content site or CelebrityNetWorth.com, they probably only steal some of your less valuable traffic.

If you’re losing clicks due to knowledge panels, it’s likely happening on queries that can typically be satisfied by quick answers, and therefore by users who might have bounced from your site anyway. You won’t be able to beat a knowledge panel for quick answers, but you can optimize your content to satisfy affiliated longer-tail queries that users will still scroll to the organic listings to find.

Create in-depth content that answers these questions and make sure that you have strong title tags and meta descriptions for these pages so you can have a better chance of standing out in the SERP.

In some cases, knowledge panels may be something you can exploit for your branded search queries. There’s no guaranteed way to get your content featured in a knowledge panel, and the information presented in them does not come from your site, so they can’t be “won” in the same way as a featured snippet.

To get into a knowledge panel, you can try using structured data markup or try to get your brand on Wikipedia if you haven’t already. The Knowledge Graph relies heavily on existing databases like Wikipedia that users directly contribute to, so developing more Wikipedia articles for your brand and any personal brands associated with it can be one avenue to explore.
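One relevant piece of markup here is schema.org Organization with `sameAs` links pointing Google at the Wikipedia page and social profiles that describe your brand. This is a sketch with placeholder URLs, not a guaranteed route into a panel:

```python
import json

# Hypothetical Organization markup; the "sameAs" array tells Google
# which external profiles (Wikipedia, social accounts) refer to the
# same entity as this site. All URLs below are placeholders.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://en.wikipedia.org/wiki/Example_Brand",
        "https://twitter.com/examplebrand",
    ],
}

# Embed the output inside <script type="application/ld+json"> ... </script>
print(json.dumps(organization, indent=2))
```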

Search Engine Journal has some tips on how to implement both of these strategies and more in their blog post here.

Google Jobs

Google Jobs has taken up huge amounts of organic real estate from listing sites. It will be tough to compete, but there are strategies you can employ, especially if you run a niche job board site.

Shifting your digital strategy to integrate more paid advertising (so you can sit above Google) and to generate content in other areas, like news websites and advice boards, can help you.

To conclude

Look, I’d be lying to you if I said this was good news for us SEOs. It’s not. Organic is going to get more and more difficult. But it’s not all doom and gloom. As Rand Fishkin noted in his BrightonSEO speech this September, if we create intelligent SEO strategies with an eye towards the future, then we have the opportunity to be ahead of the curve when the real disruption hits.

We also need to start integrating our SEO strategies with other mediums; we need to be educated on optimizing for social media, paid advertising, and other tactics for raising brand awareness. The more adaptable and diverse your online marketing strategies are, the better.

Google will always be getting smarter, which just means we have to get smarter too.

“If you define SEO as the ability to manipulate your way to the top of search rankings, then SEO will die. But if you define SEO as the practice of improving a website’s visibility in search results, then SEO will never die; it will only continue to evolve.”

Search, like nearly every other industry today, will continue to come against dramatic unanticipated changes in the future. Yet search will also only continue to grow in importance. It may become increasingly more difficult to manipulate your way to the top of search results, but there will always be a need to try, and Google will continue to reward content that serves its users well.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

We rely pretty heavily on Google, but some of their decisions of late have made doing SEO more difficult than it used to be. Which organic opportunities have been taken away, and what are some potential solutions? Rand covers a rather unsettling trend for SEO in this week’s Whiteboard Friday.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re talking about something kind of unnerving. What do we, as SEOs, do as Google is removing organic search traffic?

So for the last 19 years or 20 years that Google has been around, every month Google has had, at least seasonally adjusted, not just more searches, but they’ve sent more organic traffic than they did that month last year. So this has been on a steady incline. There’s always been more opportunity in Google search until recently, and that is because of a bunch of moves, not that Google is losing market share, not that they’re receiving fewer searches, but that they are doing things that makes SEO a lot harder.

Some scary news

Things like…

Aggressive “answer” boxes. So you search for a question, and Google provides not just necessarily a featured snippet, which can earn you a click-through, but a box that truly answers the searcher’s question, that comes directly from Google themselves, or a set of card-style results that provides a list of all the things that the person might be looking for.

Google is moving into more and more aggressively commercial spaces, like jobs, flights, products, all of these kinds of searches where previously there was opportunity and now there’s a lot less. If you’re Expedia or you’re Travelocity or you’re Hotels.com or you’re Cheapflights and you see what’s going on with flight and hotel searches in particular, Google is essentially saying, “No, no, no. Don’t worry about clicking anything else. We’ve got the answers for you right here.”

We also saw for the first time a seasonally adjusted drop, a drop in total organic clicks sent. That was between August and November of 2017. It was thanks to the Jumpshot dataset. It happened at least here in the United States. We don’t know if it’s happened in other countries as well. But that’s certainly concerning because that is not something we’ve observed in the past. There were fewer clicks sent than there were previously. That makes us pretty concerned. It didn’t go down very much. It went down a couple of percentage points. There’s still a lot more clicks being sent in 2018 than there were in 2013. So it’s not like we’ve dipped below something, but concerning.

New zero-result SERPs. We absolutely saw those for the first time. Google rolled them back after rolling them out. But, for example, if you search for the time in London or a Lagavulin 16, Google was showing no results at all, just a little box with the time and then potentially some AdWords ads. So zero organic results, nothing for an SEO to even optimize for in there.

Local SERPs that remove almost all need for a website. Then local SERPs, which have been getting more and more aggressively tuned so that you never need to click the website, and, in fact, Google has made it harder and harder to find the website in both mobile and desktop versions of local searches. So if you search for Thai restaurant and you try and find the website of the Thai restaurant you’re interested in, as opposed to just information about them in Google’s local pack, that’s frustratingly difficult. They are making those more and more aggressive and putting them more forward in the results.

Potential solutions for marketers

So, as a result, I think search marketers really need to start thinking about: What do we do as Google is taking away this opportunity? How can we continue to compete and provide value for our clients and our companies? I think there are three big sort of paths — I won’t get into the details of the paths — but three big paths that we can pursue.

The first one is pretty powerful and pretty awesome, which is investing in demand generation, rather than just demand serving, but demand generation for brand and branded product names. Why does this work? Well, because let’s say, for example, I’m searching for SEO tools. What do I get? I get back a list of results from Google with a bunch of mostly articles saying these are the top SEO tools. In fact, Google has now made a little one box, card-style list result up at the top, the carousel that shows different brands of SEO tools. I don’t think Moz is actually listed in there because I think they’re pulling from the second or the third lists instead of the first one. Whatever the case, frustrating, hard to optimize for. Google could take away demand from it or click-through rate opportunity from it.

But if someone performs a search for Moz, well, guess what? I mean we can nail that sucker. We can definitely rank for that. Google is not going to take away our ability to rank for our own brand name. In fact, Google knows that, in the navigational search sense, they need to provide the website that the person is looking for front and center. So if we can create more demand for Moz than there is for SEO tools, which I think there’s something like 5 or 10 times more demand already for Moz than there is tools, according to Google Trends, that’s a great way to go. You can do the same thing through your content, through your social media, and through your email marketing. Even through search you can search and create demand for your brand rather than unbranded terms.

2. Optimize for additional platforms.

Second thing, optimizing across additional platforms. So we’ve looked, and YouTube and Google Images account for about half of the overall volume that goes to Google web search. So between these two platforms, you’ve got a significant amount of additional traffic that you can optimize for. Images has actually gotten less aggressive. Right now they’ve taken away the “view image directly” link so that more people are visiting websites via Google Images. YouTube, obviously, this is a great place to build brand affinity, to build awareness, to create demand, this kind of demand generation to get your content in front of people. So these two are great platforms for that.

There are also significant amounts of web traffic still on the social web — LinkedIn, Facebook, Twitter, Pinterest, Instagram, etc., etc. The list goes on. Those are places where you can optimize, put your content forward, and earn traffic back to your websites.

3. Optimize the content that Google does show.

Local

So if you’re in the local space and you’re saying, “Gosh, Google has really taken away the ability for my website to get the clicks it used to get from Google local searches,” go into Google My Business and optimize the information you provide so that people who perform that query will be satisfied by Google’s result. Yes, they won’t get to your website, but they will still come to your business, because you’ve optimized the content Google is showing through Google My Business such that those searchers want to engage with you. I think this sometimes gets lost in the SEO battle. We’re trying so hard to earn the click to our site that we forget that a lot of search experiences end right at the SERP itself, and we can optimize there too.

Results

In the zero-result SERPs, Google was still willing to show AdWords, which means if we have customer targets, we can use remarketing lists for search ads (RLSA), or we can run paid ads and still optimize for those. We could also try and claim some of the data that might show up in zero-result SERPs. We don’t yet know what that will be after Google rolls it back out, but we’ll find out in the future.

Answers

For answers, the answers that Google is giving, whether that’s through voice or visually, those can be curated and crafted through featured snippets, through the card lists, and through the answer boxes. We have the opportunity again to influence, if not control, what Google is showing in those places, even when the search ends at the SERP.

All right, everyone, thanks for watching for this edition of Whiteboard Friday. We’ll see you again next week. Take care.


There is no doubt that Google Analytics is one of the most important tools you could use to understand your users’ behavior and measure the performance of your site. There’s a reason it’s used by millions across the world.

But despite being such an essential part of the decision-making process for many businesses and blogs, I often find sites (of all sizes) that do little or no data filtering after installing the tracking code, which is a huge mistake.

Think of a Google Analytics property without filtered data as one of those styrofoam cakes with edible parts. It may seem genuine from the top, and it may even feel right when you cut a slice, but as you go deeper and deeper you find that much of it is artificial.

If you’re one of those who haven’t properly configured their Google Analytics and you only pay attention to the summary reports, you probably won’t notice that there’s all sorts of bogus information mixed in with your real user data.

And as a consequence, you won’t realize that your efforts are being wasted on analyzing data that doesn’t represent the actual performance of your site.

To make sure you’re getting only the real ingredients and prevent you from eating that slice of styrofoam, I’ll show you how to use the tools that GA provides to eliminate all the artificial excess that inflates your reports and corrupts your data.

Common Google Analytics threats

As most of the people I’ve worked with know, I’ve always been obsessed with the accuracy of data, mainly because as a marketer/analyst there’s nothing worse than realizing that you’ve made a wrong decision because your data wasn’t accurate. That’s why I’m continually exploring new ways of improving it.

As a result of that research, I wrote my first Moz post about the importance of filtering in Analytics, specifically about ghost spam, which was a significant problem at that time and still is (although to a lesser extent).

While the methods described there are still quite useful, I’ve since been researching solutions for other types of Google Analytics spam and a few other threats that might not be as annoying, but that are equally or even more harmful to your Analytics.

Let’s review, one by one.

Ghosts, crawlers, and other types of spam

The GA team has done a pretty good job handling ghost spam. The amount of it has been dramatically reduced over the last year, compared to the outbreak in 2015/2017.

However, the millions of current users and the thousands of new, unaware users that join every day, plus the majority’s curiosity to discover why someone is linking to their site, make Google Analytics too attractive a target for the spammers to just leave it alone.

The same logic can be applied to any widely used tool: no matter what security measures it has, there will always be people trying to abuse its reach for their own interest. Thus, it’s wise to add an extra security layer.

Take, for example, the most popular CMS: WordPress. Despite having some built-in security measures, if you don’t take additional steps to protect it (like setting a strong username and password or installing a security plugin), you run the risk of being hacked.

The same happens to Google Analytics, but instead of plugins, you use filters to protect it.

In which reports can you look for spam?

Spam traffic will usually show as a Referral, but it can appear in any part of your reports, even in unsuspecting places like a language or page title.

Sometimes spammers will try to fool you by using misleading URLs that are very similar to known websites, or they may try to get your attention by using unusual characters and emojis in the source name.

Regardless of the type of spam, there are three things you should always do when you think you’ve found some in your reports:

Never visit the suspicious URL. Most of the time they’ll try to sell you something or promote their service, but some spammers might have some malicious scripts on their site.

This goes without saying, but never install scripts from unknown sites; if for some reason you did, remove them immediately and scan your site for malware.

Filter out the spam in your Google Analytics to keep your data clean (more on that below).

If you’re not sure whether an entry on your report is real, try searching for the URL in quotes (“example.com”). Your browser won’t open the site, but instead will show you the search results; if it is spam, you’ll usually see posts or forums complaining about it.

If you still can’t find information about that particular entry, give me a shout — I might have some knowledge for you.

Bot traffic

A bot is a piece of software that runs automated scripts over the Internet for different purposes.

There are all kinds of bots. Some have good intentions, like the bots used to check copyrighted content or the ones that index your site for search engines, and others not so much, like the ones scraping your content to clone it.

In either case, this type of traffic is not useful for your reporting and might be even more damaging than spam both because of the amount and because it’s harder to identify (and therefore to filter it out).

It’s worth mentioning that bots can be blocked at your server to stop them from accessing your site completely, but this usually involves editing sensitive server files, which requires a high level of technical knowledge, and as I said before, there are good bots too.

So, unless you’re receiving a direct attack that’s draining your server resources, I recommend you just filter them in Google Analytics.

In which reports can you look for bot traffic?

Bots will usually show as Direct traffic in Google Analytics, so you’ll need to look for patterns in other dimensions to be able to filter it out. For example, large companies that use bots to navigate the Internet will usually have a unique service provider.
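To make the service provider pattern concrete, the expression below is the kind of thing you might paste into a GA custom exclude filter on the ISP organization dimension, sketched here in Python so you can test it first. The provider names are examples of hosting/cloud organizations bots often come from, not a vetted blocklist; build yours from what you actually see in your reports:

```python
import re

# Example exclude expression for the "service provider" dimension.
# These org names are illustrative -- verify against your own data
# before excluding anything, or you may drop real visitors.
BOT_ISP_PATTERN = re.compile(
    r"amazon|google llc|microsoft corp|ovh|hetzner",
    re.IGNORECASE,
)

def looks_like_bot_isp(service_provider):
    """Return True if the ISP organization string matches the pattern."""
    return bool(BOT_ISP_PATTERN.search(service_provider))
```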

I’ll go into more detail on this below.

Internal traffic

Most users get worried and anxious about spam, which is normal — nobody likes weird URLs showing up in their reports. However, spam isn’t the biggest threat to your Google Analytics.

You are!

The traffic generated by people (and bots) working on the site is often overlooked despite the huge negative impact it has. The main reason it’s so damaging is that in contrast to spam, internal traffic is difficult to identify once it hits your Analytics, and it can easily get mixed in with your real user data.

There are different types of internal traffic and different ways of dealing with it.

Direct internal traffic

Testers, developers, marketing team, support, outsourcing… the list goes on. Any member of the team who visits the company website or blog for any purpose could be contributing internal traffic.

In which reports can you look for direct internal traffic?

Unless your company uses a private ISP domain, this traffic is tough to identify once it hits you, and will usually show as Direct in Google Analytics.

Third-party sites/tools

This type of internal traffic includes traffic generated directly by you or your team when using tools to work on the site; for example, management tools like Trello or Asana.

It also includes traffic coming from bots doing automatic work for you; for example, services used to monitor the performance of your site, like Pingdom or GTmetrix.

Some types of tools you should consider:

Project management

Social media management

Performance/uptime monitoring services

SEO tools

In which reports can you look for internal third-party tools traffic?

This traffic will usually show as Referral in Google Analytics.

Development/staging environments

Some websites use a test environment to make changes before applying them to the main site. Normally, these staging environments have the same tracking code as the production site, so if you don’t filter it out, all the testing will be recorded in Google Analytics.
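A common fix is a “valid hostname” include filter on the view, so that only hits from your production hostnames are recorded. Sketched in Python for testing (with `example.com` standing in for your real hostnames), the expression would look something like:

```python
import re

# Include-filter expression on the hostname dimension: keep only
# production hostnames. "example.com" is a placeholder -- list every
# legitimate hostname you serve the tracking code from, or you'll
# silently drop valid traffic.
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$")

def keep_hit(hostname):
    """Return True if a hit from this hostname should be recorded."""
    return bool(VALID_HOSTNAME.match(hostname))
```

Note that an include filter like this also screens out archive/cache hostnames such as `web.archive.org`, covered in the next section.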

In which reports can you look for development/staging environments?

This traffic will usually show as Direct in Google Analytics, but you can find it under its own hostname (more on this later).

Web archive sites and cache services

Archive sites like the Wayback Machine offer historical views of websites. The reason you can see those visits on your Analytics — even if they are not hosted on your site — is that the tracking code was installed on your site when the Wayback Machine bot copied your content to its archive.

One thing is for certain: when someone goes to check how your site looked in 2015, they don’t have any intention of buying anything from your site — they’re simply doing it out of curiosity, so this traffic is not useful.

In which reports can you look for traffic from web archive sites and cache services?

You can also identify this traffic on the hostname report.

A basic understanding of filters

The solutions described below use Google Analytics filters, so to avoid problems and confusion, you'll need a basic understanding of how they work, and you should check a few prerequisites first.

Things to consider before using filters:

1. Create an unfiltered view.

Before you do anything, it's highly recommended to make an unfiltered view; it will help you track the efficacy of your filters. Plus, it works as a backup in case something goes wrong.

2. Make sure you have the correct permissions.

You will need edit permissions at the account level to create filters; edit permissions at view or property level won’t work.

3. Filters don’t work retroactively.

In GA, aggregated historical data can't be modified, so filters only affect data collected after they're applied. That's why the sooner you apply the filters to your data, the better.

4. The changes made by filters are permanent!

If your filter is not correctly configured because you didn’t enter the correct expression (missing relevant entries, a typo, an extra space, etc.), you run the risk of losing valuable data FOREVER; there is no way of recovering filtered data.

But don’t worry — if you follow the recommendations below, you shouldn’t have a problem.

5. Wait for it.

Most of the time you can see the effect of the filter within minutes or even seconds after applying it; however, officially it can take up to twenty-four hours, so be patient.

Types of filters

There are two main types of filters: predefined and custom.

Predefined filters are very limited, so I rarely use them. I prefer to use the custom ones because they allow regular expressions, which makes them a lot more flexible.

Within the custom filters, there are five types: exclude, include, lowercase/uppercase, search and replace, and advanced.

Here we will use the first two: exclude and include. We’ll save the rest for another occasion.

Essentials of regular expressions

If you already know how to work with regular expressions, you can jump to the next section.

REGEX (short for regular expressions) are text strings prepared to match patterns with the use of some special characters. These characters help match multiple entries in a single filter.

Don’t worry if you don’t know anything about them. We will use only the basics, and for some filters, you will just have to COPY-PASTE the expressions I pre-built.

REGEX special characters

There are many special characters in REGEX, but for basic GA expressions we can focus on three:

^ The caret: used to indicate the beginning of a pattern,

$ The dollar sign: used to indicate the end of a pattern,

| The pipe or bar: means “OR,” and it is used to indicate that you are starting a new pattern.

When using the pipe character, you should never ever:

Put it at the beginning of the expression,

Put it at the end of the expression,

Put 2 or more together.

Any of those will mess up your filter and probably your Analytics.
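To see why, here’s a quick sketch in Python, whose re module behaves similarly to GA filter patterns (both use partial matching):

```python
import re

# A well-formed alternation: matches "apple" or "banana" anywhere in the text.
good = re.compile(r"apple|banana")

# A trailing pipe creates an EMPTY alternative, which matches every string.
# In a GA exclude filter, that would silently filter out ALL of your data.
bad = re.compile(r"apple|banana|")

print(bool(good.search("cherry")))  # False: no branch matches
print(bool(bad.search("cherry")))   # True: the empty branch matches anything
```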

A simple example of REGEX usage

Let’s say I go to a restaurant that has an automatic machine that makes fruit salad, and to choose the fruit, you must use regular expressions.

This super machine has the following fruits to choose from: strawberry, orange, blueberry, apple, pineapple, and watermelon.

To make a salad with my favorite fruits (strawberry, blueberry, apple, and watermelon), I have to create a REGEX that matches all of them. Easy! Since the pipe character “|” means OR, I could do this:

REGEX 1: strawberry|blueberry|apple|watermelon

The problem with that expression is that REGEX also considers partial matches, and since pineapple also contains “apple,” it would be selected as well… and I don’t like pineapple!

To avoid that, I can use the other two special characters I mentioned before to make an exact match for apple: the caret “^” (begins here) and the dollar sign “$” (ends here). It will look like this:

REGEX 2: strawberry|blueberry|^apple$|watermelon

The expression will select precisely the fruits I want.

But let’s say, for demonstration’s sake, that the fewer characters you use, the cheaper the salad will be. To optimize the expression, I can take advantage of REGEX partial matching.

Since strawberry and blueberry both contain “berry,” and no other fruit in the list does, I can rewrite my expression like this:

Optimized REGEX: berry|^apple$|watermelon

That’s it — now I can get my fruit salad with the right ingredients, and at a lower price.
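The whole fruit salad example can be verified in Python, where re.search mirrors the partial-match behavior of filter patterns (a quick sanity check, not part of GA itself):

```python
import re

fruits = ["strawberry", "orange", "blueberry", "apple", "pineapple", "watermelon"]

def pick(pattern):
    # re.search gives partial matches, like a filter pattern does.
    return [f for f in fruits if re.search(pattern, f)]

print(pick(r"strawberry|blueberry|apple|watermelon"))   # pineapple sneaks in
print(pick(r"strawberry|blueberry|^apple$|watermelon")) # exact match fixes it
print(pick(r"berry|^apple$|watermelon"))                # optimized, same result
```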

3 ways of testing your filter expression

As I mentioned before, filter changes are permanent, so you have to make sure your filters and REGEX are correct. There are 3 ways of testing them:

Right from the filter window: just click “Verify this filter.” It’s quick and easy; however, it’s not the most accurate, since it only takes a small sample of data.

Using an online REGEX tester: very accurate and colorful. You can also learn a lot from these, since they show you exactly which parts match and give you a brief explanation of why.

Using an in-table temporary filter in GA: you can test your filter against all your historical data. This is the most precise way of making sure you don’t miss anything.

If you’re doing a simple filter or you have plenty of experience, you can use the built-in filter verification. However, if you want to be 100% sure that your REGEX is correct, I recommend building the expression in the online tester and then rechecking it with an in-table filter.

Quick REGEX challenge

Here’s a small exercise to get you started. Go to this premade example with the optimized expression from the fruit salad case and test the first 2 REGEX I made. You’ll see live how the expressions impact the list.

Now make your own expression to pay as little as possible for the salad.

Remember:

We only want strawberry, blueberry, apple, and watermelon;

The fewer characters you use, the less you pay;

You can do small partial matches, as long as they don’t include the forbidden fruits.

Tip: You can do it with as few as 6 characters.

Now that you know the basics of REGEX, we can continue with the filters below. But I encourage you to put “learn more about REGEX” on your to-do list — they can be incredibly useful not only for GA, but for many tools that allow them.

How to create filters to stop spam, bots, and internal traffic in Google Analytics

Back to our main event: the filters!

Where to start: To avoid being repetitive when describing the filters below, here are the standard steps you need to follow to create them:

Go to the admin section in your Google Analytics (the gear icon at the bottom left corner),

Under the View column (master view), click the button “Filters” (don’t click on “All filters” in the Account column):

Click the red button “+Add Filter” (if you don’t see it or you can only apply/remove already created filters, then you don’t have edit permissions at the account level. Ask your admin to create them or give you the permissions.):

Then follow the specific configuration for each of the filters below.

The filter window is your best partner for improving the quality of your Analytics data, so it will be a good idea to get familiar with it.

Valid hostname filter (ghost spam, dev environments)

Prevents traffic from:

Ghost spam

Development hostnames

Scraping sites

Cache and archive sites

This filter may be the single most effective solution against spam. In contrast with other commonly shared solutions, the hostname filter is preventative, and it rarely needs to be updated.

Ghost spam earns its name because it never actually visits your site. It’s sent directly to the Google Analytics servers using a feature called Measurement Protocol, which under normal circumstances allows tracking from devices you wouldn’t imagine could be traced, like coffee machines or refrigerators.

Real users pass through your server, then the data is sent to GA; hence it leaves valid information. Ghost spam is sent directly to GA servers, without knowing your site URL; therefore all data left is fake. Source: carloseo.com

The spammer abuses this feature to simulate visits to your site, most likely using automated scripts to send traffic to randomly generated tracking codes (UA-0000000-1).

Since these hits are random, the spammers don’t know who they’re hitting; for that reason, ghost spam will always leave a fake or (not set) hostname. Using that logic, a filter that includes only valid hostnames will leave all ghost spam out.
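To make the mechanics concrete, here is a rough sketch of what a Measurement Protocol pageview hit looks like (the parameter names v, tid, cid, t, dh, and dp are from the Universal Analytics Measurement Protocol; the spam values are, of course, hypothetical). Nothing is sent here; the payload is only built and printed:

```python
import random
from urllib.parse import urlencode

# A ghost spammer never loads your site; they just fire payloads like this at
# Google's collection endpoint with RANDOM property IDs. The hostname (dh)
# is whatever they choose, which is why it never matches your real domain.
payload = urlencode({
    "v": 1,                                         # protocol version
    "tid": f"UA-{random.randint(1, 9999999)}-1",    # randomly guessed tracking ID
    "cid": "35009a79-1a05-49d7-b876-2b884d0f825b",  # arbitrary client ID
    "t": "pageview",
    "dh": "free-seo-offers.example",                # fake hostname (hypothetical)
    "dp": "/a-page-you-never-had",
})
print(payload)  # a spammer would POST this to the GA /collect endpoint
```

Because the property ID is guessed at random, the hit lands in a stranger’s reports with a hostname the spammer invented, which is exactly what the valid hostname filter catches.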

Where to find your hostnames

Essentially, a hostname is any place where your GA tracking code is present. You can get this information from the hostname report:

Go to Audience > Technology > Network, then at the top of the table change the primary dimension to Hostname.

If your Analytics is active, you should see at least one: your domain name. If you see more, scan through them and make a list of all the ones that are valid for you.

Types of hostname you can find

The good ones:

Your domain and subdomains: yourdomain.com

Tools connected to your Analytics: YouTube, MailChimp

Payment gateways: Shopify, booking systems

Translation services: Google Translate

Mobile speed-up services: Google weblight

The bad ones (by bad, I mean not useful for your reports):

Staging/development environments: staging.yourdomain.com

Internet archive sites: web.archive.org

Scraping sites that don’t bother to trim the content: the URL of the scraper

Spam: most of the time they will show their URL, but sometimes they may use the name of a known website to try to fool you. If you see a URL that you don’t recognize, just think, “Do I manage it?” If the answer is no, then it isn’t your hostname.

(not set) hostname: it usually comes from spam. On rare occasions it’s related to tracking code issues.

Below is an example of my hostname report (from the unfiltered view, of course; the master view is squeaky clean).

Now with the list of your good hostnames, make a regular expression. If you only have your domain, then that is your expression; if you have more, create an expression with all of them as we did in the fruit salad example:

Hostname REGEX (example)

yourdomain.com|hostname2|hostname3|hostname4

Important! You cannot create more than one “Include hostname filter”; if you do, you will exclude all data. So try to fit all your hostnames into one expression (you have 255 characters).

The “valid hostname filter” configuration:

Filter Name: Include valid hostnames

Filter Type: Custom > Include

Filter Field: Hostname

Filter Pattern: [hostname REGEX you created]
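One way to sanity-check your hostname expression before saving the filter is a few lines of Python, whose re.search mirrors the partial-match behavior of GA filter patterns. The hostnames below are hypothetical; note the escaped dots, since an unescaped “.” matches any character:

```python
import re

# Hypothetical include expression; dots are escaped because "." in a regex
# matches ANY character, not just a literal dot.
VALID_HOSTNAMES = r"yourdomain\.com|googleusercontent\.com"

hits = [
    "yourdomain.com",
    "blog.yourdomain.com",               # subdomains match too (partial match)
    "translate.googleusercontent.com",   # Google Translate
    "spammy-site.xyz",                   # ghost spam hostname
    "(not set)",                         # typical ghost spam
]

kept = [h for h in hits if re.search(VALID_HOSTNAMES, h)]
print(kept)
```

Because matching is partial, legitimate subdomains are kept automatically; the flip side is that a staging environment hosted on a subdomain of the same domain would also match, so that one needs its own exclude filter.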

Campaign source filter (Crawler spam, internal sources)

Prevents traffic from:

Crawler spam

Internal third-party tools (Trello, Asana, Pingdom)

Important note: Even if these hits are shown as a referral, the field you should use in the filter is “Campaign source” — the field “Referral” won’t work.

Filter for crawler spam

The second most common type of spam is crawler spam. Crawlers also pretend to be a valid visit by leaving a fake source URL, but in contrast with ghost spam, they do access your site and therefore leave a correct hostname.

You will need to create an expression the same way as the hostname filter, but this time, you will put together the source/URLs of the spammy traffic. The difference is that you can create multiple exclude filters.

Filter for internal third-party tools

Although you can combine your crawler spam filter with internal third-party tools, I like to have them separated, to keep them organized and more accessible for updates.

The “internal tools filter” configuration:

Filter Name: Exclude internal tool sources

Filter Type: Custom > Exclude

Filter Field: Campaign source

Filter Pattern: [tool source REGEX]

Internal Tools REGEX (example)

trello|asana|redmine

If one of the tools you use internally also sends you traffic from real visitors, don’t filter it. Instead, use the “Exclude Internal URL Query” filter below.

For example, I use Trello, but since I share analytics guides on my site, some people link them from their Trello accounts.

Filters for language spam and other types of spam

The previous two filters will stop most of the spam; however, some spammers use different methods to bypass the previous solutions.

For example, they try to confuse you by showing one of your valid hostnames combined with a well-known source like Apple, Google, or Moz. Even my site has been a target (not saying that everyone knows my site; it just looks like the spammers don’t agree with my guides).

However, even if the source and host look fine, the spammer injects their message into another part of your reports, such as the keyword, the page title, or even the language.

In those cases, you will have to take the dimension/report where you find the spam and choose that name in the filter. It’s important to consider that the name of the report doesn’t always match the name in the filter field:

Report name → Filter field

Language → Language settings

Referral → Campaign source

Organic Keyword → Search term

Service Provider → ISP Organization

Network Domain → ISP Domain

Here are a couple of examples.

The “language spam/bot filter” configuration:

Filter Name: Exclude language spam

Filter Type: Custom > Exclude

Filter Field: Language settings

Filter Pattern: [Language REGEX]

Language Spam REGEX (Prebuilt)

\s[^\s]*\s|.{15,}|\.|,|^c$

The expression above excludes fake languages that don’t meet the required format. For example, take these weird messages appearing instead of regular languages like en-us or es-es:

Examples of language spam
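You can check what the prebuilt expression catches with a quick Python test (re.search mirrors the partial-match behavior of the filter field; the spam string is a made-up example):

```python
import re

LANG_SPAM = r"\s[^\s]*\s|.{15,}|\.|,|^c$"

def is_spam_language(value):
    # Flags values containing spaces, 15+ characters, dots, commas, or a lone
    # "c": none of these appear in a legitimate language code like "en-us".
    return re.search(LANG_SPAM, value) is not None

print(is_spam_language("en-us"))  # legitimate
print(is_spam_language("c"))     # fake one-letter "language"
print(is_spam_language("Secret.example.com You are invited!"))  # injected message
```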

The organic/keyword spam filter configuration:

Filter Name: Exclude organic spam

Filter Type: Custom > Exclude

Filter Field: Search term

Filter Pattern: [keyword REGEX]

Filters for direct bot traffic

Bot traffic is a little trickier to filter because it doesn’t leave a source like spam, but it can still be filtered with a bit of patience.

The first thing you should do is enable bot filtering. In my opinion, it should be enabled by default.

Go to the Admin section of your Analytics and click on View Settings. You will find the option “Exclude all hits from known bots and spiders” below the currency selector:

It would be wonderful if this took care of every bot — a dream come true. However, there’s a catch: the key here is the word “known.” This option only takes care of known bots included in the “IAB known bots and spiders list.” That’s a good start, but far from enough.

To start your bot trail search, click on the Segment box at the top of any report, and select the “Direct traffic” segment.

Then navigate through different reports to see if you find anything suspicious.

Some reports to start with:

Service provider

Browser version

Network domain

Screen resolution

Flash version

Country/City

Signs of bot traffic

Although bots are hard to detect, there are some signals you can follow:

An unnatural increase of direct traffic

Old versions (browsers, OS, Flash)

They visit the home page only (usually represented by a slash “/” in GA)

Extreme metrics:

Bounce rate close to 100%,

Session time close to 0 seconds,

1 page per session,

100% new users.

Important! If you find traffic that checks off many of these signals, it is likely bot traffic. However, not all entries with these characteristics are bots, and not all bots match these patterns, so be cautious.
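If you export rows from the Direct traffic segment, you can score them against these signals in a few lines of Python. The data and thresholds below are hypothetical; treat a high score as a prompt to investigate, not as proof of a bot:

```python
# Score a GA row (hypothetical export) against the bot signals above.
def bot_score(row):
    signals = [
        row["bounce_rate"] >= 0.99,          # bounce rate close to 100%
        row["avg_session_seconds"] <= 1,     # session time close to 0
        row["pages_per_session"] <= 1.0,     # one page per session
        row["percent_new_users"] >= 0.99,    # ~100% new users
        row["landing_page"] == "/",          # home page only
    ]
    return sum(signals)

suspicious = {"bounce_rate": 1.0, "avg_session_seconds": 0,
              "pages_per_session": 1.0, "percent_new_users": 1.0,
              "landing_page": "/"}
normal = {"bounce_rate": 0.42, "avg_session_seconds": 95,
          "pages_per_session": 3.2, "percent_new_users": 0.55,
          "landing_page": "/pricing"}

print(bot_score(suspicious), bot_score(normal))
```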

Perhaps the report that has helped me identify bot traffic the most is the “Service Provider” report. Large corporations frequently use their own Internet service provider, so their name shows up directly in this report.

I also have a pre-built expression for ISP bots, similar to the crawler expressions.

IP filter for internal traffic

We already covered different types of internal traffic: the one from test sites (with the hostname filter) and the one from third-party tools (with the campaign source filter).

Now it’s time to look at the most common and damaging of all: the traffic generated directly by you or any member of your team while working on any task for the site.

To deal with this, the standard solution is to create a filter that excludes the public IP (not private) of all locations used to work on the site.

Examples of places/people that should be filtered

Office

Support

Home

Developers

Hotel

Coffee shop

Bar

Mall

Any place that is regularly used to work on your site

To find the public IP of the location you are working at, simply search for “my IP” in Google. You will see one of these versions:

IP version → Example

Short IPv4 → 1.23.45.67

Long IPv6 → 2001:0db8:85a3:0000:0000:8a2e:0370:7334

No matter which version you see, make a list with the IP of each place and put them together with a REGEX, the same way we did with other filters.

IP address expression: IP1|IP2|IP3|IP4 and so on.

The static IP filter configuration:

Filter Name: Exclude internal traffic (IP)

Filter Type: Custom > Exclude

Filter Field: IP Address

Filter Pattern: [The IP expression]
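If you build the expression programmatically, remember to escape the dots; in a regex, a bare “.” matches any character. A small sketch (the IPs are placeholders from the documentation ranges):

```python
import re

# Placeholder office/home IPs; replace with your own list.
internal_ips = ["203.0.113.10", "198.51.100.25"]

# re.escape turns "203.0.113.10" into "203\.0\.113\.10" so the dots
# only match literal dots.
ip_regex = "|".join(re.escape(ip) for ip in internal_ips)
print(ip_regex)

print(bool(re.search(ip_regex, "203.0.113.10")))  # internal hit: excluded
print(bool(re.search(ip_regex, "203.0.113.99")))  # regular visitor: kept
```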

Cases when this filter won’t be optimal:

There are some cases in which the IP filter won’t be as efficient as it used to be:

You use IP anonymization (required by the GDPR). When you anonymize the IP in GA, the last octet is changed to 0. This means that if your IP is 1.23.45.67, GA will record it as 1.23.45.0, so that’s the value you need to put in your filter. The problem is that you might also be excluding other IPs in the same range that are not yours.

Your Internet provider changes your IP frequently (Dynamic IP). This has become a common issue lately, especially if you have the long version (IPv6).

Your team works from multiple locations. The way of working is changing — now, not all companies operate from a central office. It’s often the case that some will work from home, others from the train, in a coffee shop, etc. You can still filter those places; however, maintaining the list of IPs to exclude can be a nightmare.

You or your team travel frequently. Similar to the previous scenario, if you or your team travels constantly, there’s no way you can keep up with the IP filters.

If one or more of these scenarios applies to you, then this filter is not optimal; I recommend trying the “Advanced internal URL query filter” below instead.

URL query filter for internal traffic

If there are dozens or hundreds of employees in the company, it’s extremely difficult to exclude them when they’re traveling, accessing the site from their personal locations, or mobile networks.

Here’s where the URL query comes to the rescue. To use this filter, you just need to add a query parameter, such as “?internal”, to any link your team uses to access the site:

Internal newsletters

Management tools (Trello, Redmine)

Emails to colleagues

Also works by directly adding it in the browser address bar

Basic internal URL query filter

The basic version of this solution is to create a filter to exclude any URL that contains the query “?internal”.

Filter Name: Exclude Internal Traffic (URL Query)

Filter Type: Custom > Exclude

Filter Field: Request URI

Filter Pattern: \?internal
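A quick check of which request URIs the pattern catches (the URIs are hypothetical; note that \? escapes the question mark, so the word “internal” elsewhere in a URL is not affected):

```python
import re

PATTERN = r"\?internal"  # the exclude pattern from the filter above

uris = [
    "/blog/new-post?internal",                  # employee newsletter link
    "/blog/new-post?internal&utm_source=mail",  # works with extra parameters
    "/blog/new-post",                           # regular visitor: kept
    "/internal-tools-review",                   # no "?", so NOT excluded
]

excluded = [u for u in uris if re.search(PATTERN, u)]
print(excluded)
```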

This solution is perfect for instances where the user will most likely stay on the landing page; for example, when sending a newsletter to all employees to check a new post.

If the user will likely visit more than the landing page, then the subsequent pages will be recorded.

Advanced internal URL query filter

Although this solution is a bit more complicated to set up, once it’s in place:

It doesn’t need maintenance

Any team member can use it, no need to explain techy stuff

Can be used from any location

Can be used from any device, and any browser

To activate the filter, you just have to add the text “?internal” to any URL of the website.

That will insert a small cookie in the browser that will tell GA not to record the visits from that browser.

And the best part is that the cookie will stay there for a year (unless it is manually removed), so the user doesn’t have to add “?internal” every time.

Bonus filter: Include only internal traffic

On some occasions, it’s interesting to know the traffic generated internally by employees — maybe because you want to measure the success of an internal campaign, or just because you’re a curious person.

In that case, you should create an additional view, call it “Internal Traffic Only,” and use one of the internal filters above. Just one! Because if you have multiple include filters, the hit will need to match all of them to be counted.

If you configured the “Advanced internal URL query” filter, use that one. If not, choose one of the others.

The configuration is exactly the same — you only need to change “Exclude” for “Include.”

Cleaning historical data

The filters will prevent future hits from junk traffic.

But what about past affected data?

I know I told you that deleting aggregated historical data is not possible in GA. However, there’s still a way to temporarily clean up at least some of the nasty traffic that has already polluted your reports.

For this, we’ll use an advanced segment (a subset of your Analytics data). There are built-in segments like “Organic” or “Mobile,” but you can also build one using your own set of rules.

To clean our historical data, we will build a segment using all the expressions from the filters above as conditions (except the ones from the IP filter, because IPs are not stored in GA; hence, they can’t be segmented).

You just need to follow the instructions on that page and replace the placeholders. Here is how it looks:

In the actual template, all text is black; the colors are just to help you visualize the conditions.

After importing it, to select the segment:

Click on the box that says “All users” at the top of any of your reports

From your list of segments, check the one that says “0. All Users – Clean”

Lastly, uncheck the “All Users”

Now you can navigate through your reports and all the junk traffic included in the segment will be removed.

A few things to consider when using this segment:

Segments have to be selected each time. A way of having it selected by default is by adding a bookmark when the segment is selected.

You can edit the segment at any time to update it, or to add and remove conditions (open the list of segments, then click “Actions,” then “Edit”).

The hostname expression and third-party tools expression are different for each site.

If your site has a large volume of traffic, segments may sample your data when selected, so if the little shield icon at the top of your reports turns yellow (normally it’s green), try choosing a shorter period (e.g., one year, six months, or one month).

Conclusion: Which cake would you eat?

Having real and accurate data is essential for your Google Analytics to report as you would expect.

But if you haven’t filtered it properly, it’s almost certain that it will be filled with all sorts of junk and artificial information.

And the worst part is that if you don’t realize your reports contain bogus data, you will likely make poor decisions when deciding on the next steps for your site or business.

The filters I share above will help you prevent the three most harmful threats polluting your Google Analytics and keeping you from a clear view of your site’s actual performance: spam, bots, and internal traffic.

Once these filters are in place, you can rest assured that your efforts (and money!) won’t be wasted on analyzing deceptive Google Analytics data, and your decisions will be based on solid information.

And the benefits don’t stop there. If you’re using other tools that import data from GA — for example, WordPress plugins like GADWP, Excel add-ins like AnalyticsEdge, or SEO suites like Moz Pro — the benefits will trickle down to all of them as well.

Besides highlighting the importance of the filters in GA (which I hope I made clear by now), I would also love for the preparation of these filters to give you the curiosity and basis to create others that will allow you to do all sorts of remarkable things with your data.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

My name is Patrick Curtis, and I’m the founder and CEO of Wall Street Oasis, an online community focused on careers in finance founded in 2006 with over 2 million visits per month.

User-generated content and long-tail organic traffic are what have built our business and community over the last 12+ years. But what happens if you wake up one day and realize that your growth has suddenly stopped? This is what happened to us back in November 2012.

In this case study, I’ll highlight two of our main SEO problems as a large forum with over 200,000 URLs, then describe two solutions that finally helped us regain our growth trajectory — almost five years later.

Two main problems

1. Algorithm change impacts

Ever since November 2012, Google’s algo changes have seemed to hurt many online forums like ours. Even though our traffic didn’t decline, our growth dropped to the single-digit percentages. No matter what we tried, we couldn’t break through our “plateau of pain” (I call it that because it was a painful ~5 years trying).

2. Quality of user-generated content

Related to the first problem, 99% of our content is user-generated (UGC), which means the quality is mixed (to put it kindly). Like most forum-based sites, some of our members create incredible pieces of content, but a meaningful percentage of our content is also admittedly thin and/or low-quality.

How could we deal with over 200,000 pieces of content efficiently and try to optimize them without going bankrupt? How could we “clean the cruft” when there was just so much of it?

Fighting back: Two solutions (and one statistical analysis to show how it worked)

1. “Merge and Purge” project

Our goal was to consolidate weaker “children” URLs into stronger “master” URLs to utilize some of the valuable content Google was ignoring and to make the user experience better.

For example, instead of having ~20 discussions on a specific topic (each with an average of around two to three comments) across twelve years, we would consolidate many of those discussions into the strongest two or three URLs (each with around 20–30 comments), leading to a much better user experience with less need to search and jump around the site.

Changes included taking the original post and comments from a “child” URL and merging them into the “master” URL, unpublishing the child URL, removing the child from sitemap, and adding a 301 redirect to the master.

Below is an example of how it looked when we merged a child into our popular Why Investment Banking discussion. We highlighted the original child post as a Related Topic with a blue border and included the original post date to help avoid confusion:

This was a massive project that involved some complex Excel sorting, but after 18 months and about $50,000 invested (27,418 children merged into 8,515 masters to date), the user experience, site architecture, and organization are much better.

Initial analysis suggests that the percentage gain from merging weak children URLs into stronger masters has given us a boost of ~10–15% in organic search traffic.

2. The Content Optimization Team

The goal of this initiative was to take the top landing pages that already existed on Wall Street Oasis and make sure that they were both higher quality and optimized for SEO. What does that mean, exactly, and how did we execute it?

We needed a dedicated team with some baseline industry knowledge, so we formed a team of five interns from the community who were already familiar with the common topics.

We looked at the top ~200 URLs over the previous 90 days (by organic landing page traffic) and listed them out in a spreadsheet:

We held five main hypotheses of what we believed would boost organic traffic before we started this project:

Longer content with subtitles: Increasing the length of the content and adding relevant H2 and H3 subtitles to give the reader more detailed and useful information in an organized fashion.

Changing the H1 so that it matched more high-volume keywords using Moz’s Keyword Explorer.

Changing the URL so that it also was a better match to high-volume and relevant keywords.

Adding a relevant image or graphic to help break up large “walls of text” and enrich the content.

Adding a relevant video similar to the graphic, but also to help increase time on page and enrich the content around the topic.

We tracked all five of these changes across all 200 URLs (see image above). After a statistical analysis, we learned that four of them helped our organic search traffic and one actually hurt.

Summary of results from our statistical analysis

Increasing the length of the articles and adding relevant subtitles (H2s, H3s, and H4s) to help organize the content gives an average boost to organic traffic of 14%

Improving the title or H1 of the URLs yields a 9% increase on average

Changing the URL decreased traffic on average by 38% (this was a smaller sample size — we stopped doing this early on for obvious reasons)

Including a relevant video increases the organic traffic by 4% on average, while putting an image up increases it by 5% on average.

Overall, the boost to organic traffic — should we continue to make these four changes (and avoid changing the URL) — is 32% on average.
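A quick reconstruction of that arithmetic (my assumption: the 32% figure simply adds the four uplifts; if the effects compounded multiplicatively instead, the total would be slightly higher):

```python
# Reported average uplifts from the four changes that helped.
uplifts = {
    "longer content + subtitles": 0.14,
    "improved title/H1": 0.09,
    "relevant image": 0.05,
    "relevant video": 0.04,
}

# Simple additive total, consistent with the reported figure.
additive = sum(uplifts.values())
print(f"additive: {additive:.0%}")

# If the effects multiplied instead of adding, the lift would be a bit higher.
compounded = 1.0
for u in uplifts.values():
    compounded *= 1 + u
print(f"compounded: {compounded - 1:.0%}")
```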

Key takeaway:

Over half of that gain (~18%) comes from changes that require a minimal investment of time. For teams trying to optimize on-page SEO across a large number of pages, we recommend focusing on the top landing pages and easy wins first, before deciding if further investment is warranted.

We hope this case study of our on-page SEO efforts was interesting, and I’m happy to answer any questions you have in the comments!


Picture this scenario. You’re a new employee that has just been brought in to a struggling marketing department (or an agency brought on to help recover lost numbers). You get access to Google Analytics, and see something like this:

(Actual screenshot of the client I audited)

This can generate two types of emotional response: excitement or fear (or both). The steady decline in organic traffic excites you because you have so many tactics and ideas that you think can save this company from spiraling downward out of control. But there’s also the fear that these tactics won’t be enough to correct the course.

Regardless of whether these new tactics would work or not, it’s important to understand the history of the account and determine not only what is happening, but why.

The company may have an idea of why the traffic is declining (i.e. competitors have come in and made ranking for keywords much harder, or they did a website redesign and have never recovered).

Essentially, this boils down to two things: 1) either you’re struggling with organic optimization, or 2) something was off with your tracking in Google Analytics, has since been corrected, and hasn’t been caught.

In this article, I’ll go over an audit I did for one of my clients to help determine if the decline we saw in organic traffic was due to actual poor SEO performance, an influx in competitors, tracking issues, or a combination of these things.

I’ll be breaking it down into five different areas of investigation:

Keyword ranking differences from 2015–2017

Did the keywords we were ranking for in 2015 change drastically in 2017? Did we lose rankings and therefore lose organic traffic?

Top organic landing pages from 2015–2017

Are the top ranking organic landing pages the same currently as they were in 2015? Are we missing any pages due to a website redesign?

On-page engagement metrics

Did bounce rate, pageviews, and time on site change in ways that point to a tracking problem rather than a real traffic loss?

External data from SEMrush and Moz

Looking at the SEMrush organic traffic cost metric, as well as Moz metrics like Domain Authority, and comparing against competitors.

Goal completions

Did our conversion numbers stay consistent throughout the traffic drop? Or did the conversions drop in correlation with the traffic drop?

By the end of this post, my goal is that you’ll be able to replicate this audit to determine exactly what’s causing your organic traffic decline and how to get back on the right track.

Let’s dive in!

Keyword ranking differences from 2015–2017

This was the starting point for my audit, specifically because the most obvious explanation for a decline in traffic is a decline in keyword rankings.

I wanted to look at what keywords we were ranking for in 2015 to see if we significantly dropped in the rankings or if the search volume had dropped. If the company you’re auditing has had a long-running Moz account, start by looking at the keyword rankings from the initial start of the decline, compared to current keyword rankings.

I exported keyword data from both SEMrush and Moz, and looked specifically at the ranking changes of core keywords.

March was a particularly strong month across the board, so I narrowed it down and exported the keyword rankings in:

March 2015

March 2016

March 2017

December 2017 (so I could get the most current rankings)

Once the keywords were exported, I went in and highlighted in red the keywords that we were ranking for in 2015 (and driving traffic from) that we were no longer ranking for in 2017. I also highlighted in yellow the keywords we were ranking for in 2015 that were still ranking in 2017.
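This red/yellow highlighting amounts to a set comparison, which is easy to script. Here's a minimal Python sketch of the same diff, using made-up keyword data in place of the real exports:

```python
# Hypothetical sketch of the red/yellow comparison: diff two keyword-ranking
# exports to find lost, retained, and newly ranking keywords.
# The data below is illustrative, not the client's actual export.

rankings_2015 = {"online scheduling": 13, "online scheduler": 9, "free scheduler": 4}
rankings_2017 = {"online scheduling": 15, "saas appointment scheduling": 2}

lost = sorted(set(rankings_2015) - set(rankings_2017))      # highlight in red
retained = sorted(set(rankings_2015) & set(rankings_2017))  # highlight in yellow
new = sorted(set(rankings_2017) - set(rankings_2015))

for kw in retained:
    delta = rankings_2017[kw] - rankings_2015[kw]  # positive = dropped in rank
    print(f"{kw}: {rankings_2015[kw]} -> {rankings_2017[kw]} ({delta:+d})")
```

The same approach scales to a full CSV export: load each year's file into a dict of keyword to position, then diff.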

2015 keywords:

2017 keywords:

(Brand-related queries and URLs are blurred out for anonymity)

One thing that immediately stood out: in 2015, this company was ranking for five keywords that included the word "free." They have since changed their offering, so it made sense that in 2017, we weren't ranking for those keywords.

After removing the free queries, we pulled the “core” keywords to look at their differences.

March 2015 core keywords:

Appointment scheduling software: position 9

Online appointment scheduling: position 11

Online appointment scheduler: position 9

Online scheduling software: position 9

Online scheduler: position 9

Online scheduling: position 13

December 2017 core keywords:

Appointment scheduler: position 11

Appointment scheduling software: position 10

Online schedule: position 6

Online appointment scheduler: position 11

Online appointment scheduling: position 12

Online scheduling software: position 12

Online scheduling tool: position 10

Online scheduling: position 15

SaaS appointment scheduling: position 2

There were no particular red flags here. While some of the keywords moved down one or two spots, new ones jumped up. These small movements didn't explain the nearly 30–40% drop in organic traffic, so I checked this off my list and moved on to organic landing pages.

Top organic landing page changes

Since the dive into keyword rankings didn't explain the decline in traffic, the next thing I looked at was the organic landing pages. I knew this client had switched CMS platforms in early 2017 and had done a few small redesign projects over the past three years.

After exporting our organic landing pages for 2015, 2016, and 2017, we compared the top ten (by organic sessions) and got the following results.

2015 top organic landing pages:

2016 top organic landing pages:

2017 top organic landing pages:

Because of their redesign, you can see that the subfolders changed between 2015/2016 to 2017. What really got my attention, however, is the /get-started page. In 2015/2016, the Get Started page accounted for nearly 16% of all organic traffic. In 2017, the Get Started page was nowhere to be found.
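This comparison can also be done programmatically: a quick diff of the top landing page exports surfaces pages that vanished. The paths below are hypothetical placeholders, not the client's actual URLs:

```python
# Diff top organic landing pages between years to spot pages that disappeared
# (and new ones that replaced them). Paths are hypothetical placeholders.
top_2015 = ["/", "/get-started", "/features", "/pricing", "/blog"]
top_2017 = ["/", "/features", "/pricing", "/blog", "/integrations"]

vanished = [p for p in top_2015 if p not in top_2017]
appeared = [p for p in top_2017 if p not in top_2015]

print("vanished:", vanished)
print("appeared:", appeared)
```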

If you run into this problem and notice pages missing from your current top organic pages, the Wayback Machine is a great way to uncover why: it lets you see what a web page looked like in the past.

When we looked at the /get-started URL in the Wayback Machine, we noticed something pretty interesting:

In 2015, their /get-started page also acted as their login page. When people were searching on Google for “[Company Name] login,” this page was ranking, bringing in a significant amount of organic traffic.

Their current setup sends logins to a subdomain that doesn’t have a GA code (as it’s strictly used as a portal to the actual application).

That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.

While this helped explain some of the traffic loss, the next thing we looked at was the on-page metrics to see if we could spot any obvious tracking issues.

Comparing on-page engagement metrics

Looking at the keyword rankings and organic landing pages provided a little bit of insight into the organic traffic loss, but it was nothing definitive. Because of this, I moved to the on-page metrics for further clarity. As a disclaimer, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.

Looking at the same top organic pages, I compared the on-page engagement metrics.

2015 on-page metrics:

2016 on-page metrics:

2017 on-page metrics:

While the overall engagement metrics changed slightly, the biggest and most interesting discrepancy I saw was in the bounce rates for the home page and Get Started page.

According to a number of different studies (like this one, this one, or even this one), the average bounce rate for a B2B site is around 40–60%. Seeing the home page with a bounce rate under 20% was definitely a red flag.

This led me to look into some other metrics as well. I compared key metrics between 2015 and 2017, and was utterly confused by the findings:

Looking at the organic sessions (overall), we saw a decrease of around 80,000 sessions, or 27.93%.

Looking at the organic users (overall), we saw a similar trend, with a decrease of around 38,000 users, or 25%.

When we looked at page views, however, we saw a much more drastic drop:

For the entire site, we saw a 50% decrease in pageviews, or a decrease of nearly 400,000 page views.

This didn’t make much sense, because even if we had those extra 38,000 users, and each user averaged roughly 2.49 pages per session (looking above), that would only account for, at most, 100,000 more page views. This left 300,000 page views unaccounted for.

This led me to believe that there was definitely some sort of tracking issue. The inflated pageview count paired with the abnormally low bounce rate suggested the tracking code might be firing twice and double-counting pageviews.

However, to confirm these assumptions, I took a look at some external data sources.

Using SEMrush and Moz data to exclude user error

If you have a feeling that your tracking was messed up in previous years, a good way to confirm or deny this hypothesis is to check external sources like Moz and SEMrush.

Unfortunately, this particular client was fairly new, so their Moz campaign data didn't reach back to the high organic traffic period in 2015. If yours does, a good place to start is the Search Visibility metric (as long as your primary keywords have stayed the same). If that metric has changed drastically over the years, it's a good indicator that your organic rankings have slipped quite a bit.

Another good thing to look at is domain authority and core page authority. If your site has had a few redesigns, moved URLs, or anything like that, it’s important to make sure that the domain authority has carried over. It’s also important to look at the page authorities of your core pages. If these are much lower than when they were before the organic traffic slide, there’s a good chance your redirects weren’t done properly, and the page authority isn’t being carried over through those new domains.

If, like me, you don’t have Moz data that dates back far enough, a good thing to check is the organic traffic cost in SEMrush.

Organic traffic cost can change because of a few reasons:

Your site is ranking for more valuable keywords, making the organic traffic cost rise.

More competitors have entered the space, making the keywords you were ranking for more expensive to bid on.

Usually it’s a combination of both of these.

If our organic traffic really was steadily decreasing for the past 2 years, we’d likely see a similar trendline looking at our organic traffic cost. However, that’s not what we saw.

In March of 2015, the organic traffic cost of my client's site was $14,300.

In March of 2016, the organic traffic cost was $22,200.

In December of 2017, the organic traffic cost spiked all the way up to $69,200. According to SEMrush, we also saw increases in keywords and traffic.
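Put another way, the traffic cost grew sharply between each snapshot, the opposite of what a real decline would produce. A quick check on the quoted figures:

```python
# Growth in SEMrush organic traffic cost between the snapshots quoted above.
traffic_cost = {"2015-03": 14_300, "2016-03": 22_200, "2017-12": 69_200}

snapshots = list(traffic_cost.items())
growths = []
for (prev_label, prev), (label, cur) in zip(snapshots, snapshots[1:]):
    g = (cur - prev) / prev
    growths.append(g)
    print(f"{prev_label} -> {label}: {g:+.0%}")  # both periods show growth
```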

Looking at all of this external data re-affirmed the assumption that something must have been off with our tracking.

However, as a final check, I went back to internal metrics to see if the conversion data had decreased at a similar rate as the organic traffic.

Analyzing and comparing conversion metrics

This seemed like a natural final step into uncovering the mystery in this traffic drop. After all, it’s not organic traffic that’s going to profit your business (although it’s a key component). The big revenue driver is goal completions and form fills.

This was a fairly simple procedure. I went into Google Analytics to compare goal completion numbers and goal completion conversion rates over the past three years.

If your company is like my client's, there's a good chance you're taking advantage of the maximum of 20 goals that Analytics can track simultaneously. However, to make things easier and more consistent (since goal configurations can change), I looked only at buyer-intent conversions. In this case that meant Enterprise, Business, and Individual edition form fills, as well as Contact Us form fills.

If you’re doing this on your own site, I would recommend doing the same thing. Gated content goal completions usually have a natural shelf life, and this natural slowdown in goal completions can skew the data. I’d look at the most important conversion on your site (usually a contact us or a demo form) and go strictly off those numbers.

For my client, you can see those goal completion numbers below:

Goal completion name | 2015  | 2016  | 2017
---------------------|-------|-------|------
Contact Us           |   579 |   525 |   478
Individual Edition   | 3,372 | 2,621 | 3,420
Business Edition     | 1,147 | 1,437 | 1,473
Enterprise Edition   | 1,178 | 1,053 |   502
Total                | 6,276 | 5,636 | 5,873

Conversion rates:

Goal completion name | 2015  | 2016  | 2017
---------------------|-------|-------|------
Contact Us           | 0.22% | 0.22% | 0.23%
Individual Edition   | 1.30% | 1.09% | 1.83%
Business Edition     | 0.46% | 0.60% | 0.76%
Enterprise Edition   | 0.46% | 0.44% | 0.29%
Average              | 0.61% | 0.58% | 0.77%

This was pretty interesting. Although there was clearly some fluctuation in the goal completions and conversion rates, there were no drops anywhere near the scale of our nearly 40,000-user decline from 2015 to 2017.
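The mismatch is easy to quantify: if the session drop were real, conversions should have fallen by a similar fraction at a roughly constant conversion rate. Using the totals from the table above:

```python
# Cross-check: compare the change in goal completions against the reported
# ~28% drop in organic sessions. Totals are from the table above.
total_goals = {"2015": 6_276, "2016": 5_636, "2017": 5_873}

goal_change = (total_goals["2017"] - total_goals["2015"]) / total_goals["2015"]
session_change = -0.2793  # organic session change reported earlier

print(f"goal completions changed {goal_change:+.1%}")  # roughly -6%
print(f"organic sessions changed {session_change:+.1%}")
# A ~6% dip in conversions vs. a ~28% session drop points at tracking,
# not a real traffic loss.
```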

All of these findings further confirmed that we were chasing an inaccurate goal. In fact, we spent the first three months working together to try and get back a 40% loss that, quite frankly, was never even there in the first place.

Tying everything together and final thoughts

For this particular case, we had to go down all five of these roads in order to reach the conclusion that we did: Our tracking was off in the past.

However, this may not be the case for your company or your clients. You may start by looking at keyword rankings and realize that you're no longer ranking on the first page for ten of your core keywords. If that's the case, you've quickly discovered your issue, and your game plan should be investing in your core pages to get them ranking again for those keywords.

If your goal completions are way down (by a similar percentage as your traffic), that’s also a good clue that your declining traffic numbers are correct.

If you’ve looked at all of these metrics and still can’t seem to figure out the reasoning for the decrease and you’re blindly trying tactics and struggling to crawl your way back up, this is a great checklist to go through to confirm the ominous question of tracking issue or optimization issue.

If you’re having a similar issue as me, I’m hoping this post helps you get to the root of the problem quickly, and gets you one step closer to create realistic organic traffic goals for the future!


Almost every consultant or in-house SEO will be asked at some point to investigate an organic traffic drop. I’ve investigated quite a few, so I thought I’d share some steps I’ve found helpful when doing so.

Is it just normal noise?

Before you sound the alarm and get lost down a rabbit hole, you should make sure that the drop you’re seeing is actually real. This involves answering two questions:

A.) Do you trust the data?

This might seem trivial, but at least a quarter of the traffic drops I’ve seen were simply due to data problems.

The best way to check on this is to sense-check other metrics that might be impacted by data problems. Does anything else look funky? If you have a data engineering team, are they aware of any data issues? Are you flat-out missing data for certain days or page types or devices, etc.? Thankfully, data problems will usually make themselves pretty obvious once you start turning over a few rocks.

One of the more common sources of data issues is simply missing data for a day.
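A quick way to run that check is to scan your daily export for gaps, since a missing or zeroed-out day is a strong hint of a data problem rather than a real drop. A minimal sketch with simulated data:

```python
# Scan daily traffic data for missing days. The data here is simulated:
# one day is deleted to stand in for a gap in a real export.
from datetime import date, timedelta

daily = {date(2018, 3, 1) + timedelta(days=i): 1_000 for i in range(7)}
del daily[date(2018, 3, 4)]  # simulate a missing day

start, end = min(daily), max(daily)
missing = [start + timedelta(days=i)
           for i in range((end - start).days + 1)
           if (start + timedelta(days=i)) not in daily]

print("missing days:", missing)
```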

B.) Is this just normal variance?

Metrics go up and down all the time for no discernible reason. One way to quantify this is to use your historical standard deviation for SEO traffic.

For example, you could plot your weekly SEO traffic for the past 12 months and calculate the standard deviation (using the STDEV function on Google Sheets or Excel makes this very easy) to figure out if a drop in weekly traffic is abnormal. You’d expect about 16% of weeks to be one standard deviation below your weekly average just by sheer luck. You could therefore set a one-standard-deviation threshold before investigating traffic drops, for example (but you should adjust this threshold to whatever is appropriate for your business). You can also look at the standard deviation for your year-over-year or week-over-week SEO traffic if that’s where you’re seeing the drop (i.e. plot your % change in YoY SEO traffic by week for the past 12 months and calculate the standard deviation).

SEO traffic is usually pretty noisy, especially on a short time frame like a week.
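Here's a minimal sketch of that one-standard-deviation threshold, with illustrative weekly numbers rather than real analytics data:

```python
# Flag the latest week only if it falls more than one standard deviation
# below the baseline average. weekly_sessions is illustrative sample data.
from statistics import mean, stdev

weekly_sessions = [10_200, 9_800, 10_500, 9_900, 10_100, 10_400, 9_700,
                   10_300, 10_000, 9_600, 10_200, 6_900]  # last week looks low

baseline = weekly_sessions[:-1]
threshold = mean(baseline) - stdev(baseline)  # one-standard-deviation floor

latest = weekly_sessions[-1]
if latest < threshold:
    print(f"Investigate: {latest} is below {threshold:.0f}")
else:
    print("Within normal variance")
```

The same pattern works for year-over-year or week-over-week percentage changes: plot the series, take its standard deviation, and only investigate drops that exceed your chosen threshold.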

Let’s assume you’ve decided this is indeed a real traffic drop. Now what? I’d recommend trying to answer the eleven questions below, at least one of them will usually identify the culprit.

Questions to ask yourself when facing an organic traffic drop

1. Was there a recent Google algorithm update?

If there was an algorithm update, do you have any reason to suspect you’d be impacted? It can sometimes be difficult to understand the exact nature of a Google update, but it’s worth tracking down any information you can to make sure your site isn’t at risk of being hit.

2. Is the drop specific to any segment?

One of the more useful practices whenever you’re looking at aggregated data (such as a site’s overall search traffic) is to segment the data until you find something interesting. In this case, we’d be looking for a segment that has dropped in traffic much more than any other. This is often the first step in tracking down the root cause of the issue. The two segments I’ve found most useful in diagnosing SEO traffic drops specifically:

But there will likely be plenty of other segments that might make sense to look at for your business (for example, product category).

3. Are you being penalized?

This is unlikely, but it’s also usually pretty quick to disprove. Look at Search Console for any messages related to penalties and search for your brand name on Google. If you’re not showing up, then you might be penalized.

Rap Genius (now Genius) was penalized for their link building tactics and didn’t show up for their own brand name on Google.

4. Did the drop coincide with a major site change?

This can take a thousand different forms (did you migrate a bunch of URLs, move to a different JavaScript framework, update all your title tags, remove your navigation menu, etc?). If this is the case, and you have a reasonable hypothesis for how this could impact SEO traffic, you might have found your culprit.

5. Did you lose ranking share to a competitor?

If you’ve lost rankings, it’s worth investigating the specific keywords that you’ve lost and figuring out if there’s a trend. Did your competitors launch a new page type? Did they add content to their pages? Do they have more internal links pointing to these pages than you do?

GetStat’s Share of Voice report lets you quickly see whether a competitor is usurping your rankings

It could also just be a new competitor that’s entered the scene.

6. Did it coincide with a rise in direct or dark traffic?

If so, make sure you haven’t changed how you’re classifying this traffic on your end. Otherwise, you might simply be re-classifying organic traffic as direct or dark traffic.

7. Has there been a change to the search engine results pages you care about?

You can either use Moz’s SERP features report, or manually look at the SERPs you care about to figure out if their design has materially changed. It’s possible that Google is now answering many of your relevant queries directly in search results, put an image carousel on them, added a local pack, etc. — all of which would likely decrease your organic search traffic.

8. Is the drop specific to branded or unbranded traffic?

If you have historical Search Console data, you can look at the number of branded clicks vs. unbranded clicks over time. You could also look at this data through AdWords if you spend on paid search. Another simple proxy for branded traffic is homepage traffic (for most sites, the majority of homepage traffic will be branded). If the drop is specific to branded search, then it's probably a brand problem, not an SEO problem.

9. Did a bunch of pages drop out of the index?

Search Console’s Index Status Report will make it clear if you suddenly have way fewer URLs being indexed. If this is the case, you might be accidentally disallowing or noindexing URLs (through robots.txt, meta tags on the page, or HTTP headers).

Search Console’s Index Status Report is a quick way to make sure you’re not accidentally noindexing or disallowing large portions of your site.

10. Did your number of referring domains and/or links drop?

It’s possible that a large number of your backlinks have been removed or are no longer accessible for whatever reason.

A sudden drop in backlinks could be the reason you’re seeing a traffic drop.

11. Is SEM cannibalizing SEO traffic?

It’s possible that your paid search team has recently ramped up their spend and that this is eating into your SEO traffic. You should be able to check on this pretty quickly by plotting your SEM vs. SEO traffic. If it’s not obvious after doing this whether it’s a factor, then it can be worth pausing your SEM campaigns for specific landing pages and seeing if SEO traffic rebounds for those pages.

To be clear, some level of cannibalization between SEM and SEO is inevitable, but it's still worth understanding how much of your traffic is being cannibalized and whether the incremental clicks from your SEM campaigns outweigh the loss in SEO traffic (in my experience they usually do, but it's worth checking!).

That’s all I’ve got — hopefully at least one of these questions will lead you to the root cause of an organic search traffic drop. Are there any other questions that you’ve found particularly helpful for diagnosing traffic drops? Let me know in the comments.

