Why we shouldn’t make content decisions based on analytics data

Maybe you want to take a new approach to your content. Or maybe you’re developing a new content strategy for your school, college or university. But where do you begin?

An audit of your existing content and a review of your site analytics seems like a sensible starting point, yes? But is this really the right place to look for the data to inform new content decisions, especially major decisions?

I think not.

Site analytics can and do tell us something about our content, such as:

– Whether a change we’ve made is having a positive (or negative) impact… as long as we’ve made it in a controlled way (so we can be confident that the effect wasn’t caused by something else we changed)

– Whether visitors are sticking with our content (good ‘time on page’ data)

– Whether one thing performs better than another (A/B testing).

However, when it comes to major redevelopments and new sites, we need to treat our analytics data with a degree of scepticism. Site analytics are a reflection of the use of what we currently have, not an accurate indicator of what we should or could have.

In other words, they give us a window to the “what is”, and that can be overly restrictive – even dangerous – if taken instead as a vision for “what might be”.

Let’s say we’re designing a new student portal. Our analytics tell us that the FAQs section of our current student portal is the most visited section of the existing site, and that our visitors spend a lot of time there. If we rely on analytics alone, we’re going to find ourselves saying, “Hey, we need to make the FAQs section even bigger, better and more prominent. Hell, we could even get rid of the rest of the portal and just have FAQs”.

What if the reality is that they visit the FAQs so much because they know that they can’t find information anywhere else? Because the portal is so bad to begin with. What if they spend ages there because they find FAQs completely unnavigable and not because they love the section so much? Analytics data tells us nothing about this.

Another situation: we’ve been producing amazing advice and guidance blog posts written by students for students for months. The analytics tell us that nobody is looking at them. So we decide that they’re a waste of resource and we should stop doing them, right? But what if they weren’t well signposted? What if nobody knew they were there or knew how to get to them? Relying on analytics alone won’t tell us that. They’ve given us an incomplete story.

The problem with analytics is that it’s all too easy to see them as the whole story, when actually they are just an intro, or background context. Especially in a major redevelopment project and when designing a content overhaul.

This is why, when making major content decisions, we mustn’t rely on analytics data alone. It leaves us in danger of simply recreating, over and over, things that aren’t really serving our audiences or us. Conversely, it can encourage us to abandon content that might have high value to us and our audiences, but that is underperforming only because the current site structure or promotional activity isn’t serving it well enough.

So what’s the alternative?

Another seemingly obvious solution exists in audience research. We go to our users and ask them, “What would you like?” Sounds sensible? Of course it does! But what if they don’t know what’s possible or what they could have? In this scenario, instead of treating them as users of your site, you’re asking them to be content designers and do your job for you. “Tell us what you want” sounds noble, but it puts them under the pressure of being creative and imagining solutions they’re not aware they need.

So we just take a stab in the dark then?

Well, no. That’s not what I advocate either. Instead I recommend that you:

– Take the data and explore the analytics, but add much more to them

– Conduct user tests and over-the-shoulder tests to see what’s really going on

– Push the boundaries of your own thinking in the content strategy and content design process so you’re able to test at the extreme edges of possibility, not just iterate on bad practice. You won’t necessarily end up somewhere obscure, but by using that to test with, you can explore new boundaries

– Craft in-person research with your users that presents them with possibilities and creative options, and have them comment on those solutions (I’ve lost count of the times I’ve run focus groups where suggesting something is met with an “oooohhh, that would be great, I wouldn’t have thought of that”)

– Be prepared to occasionally run A/B tests on the big things, not just tiny details.
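If you do run A/B tests on the big things, it’s worth checking that the difference you see isn’t just noise before acting on it. As a minimal sketch (the variant names and visitor numbers below are entirely made up for illustration, not from any real portal), a two-proportion z-test in plain Python looks like this:

```python
from math import sqrt, erfc

def ab_test_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided p-value for a two-proportion z-test.

    Answers: could the difference in conversion rate between
    variants A and B plausibly be down to chance alone?
    """
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis of no real difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided p-value via the normal distribution
    return erfc(abs(z) / sqrt(2))

# Hypothetical example: current portal layout vs a radical restructure
p = ab_test_p_value(200, 1000, 260, 1000)
print(f"p-value: {p:.4f}")
```

A p-value below the conventional 0.05 threshold suggests the difference is unlikely to be chance; anything above it means you probably need more visitors, or a bolder variant, before drawing conclusions.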

I’d love to hear more about your approaches to creating new content that doesn’t rely only on analytics. Or, if you completely disagree with my view, then equally I’d love to hear your experiences that have told you a different story. Let’s have a chat.