Category: Using analytics

Why do news organizations persist in using total page views as a measure of success? Perhaps because if you're afraid of numbers then you're even more afraid of bad numbers, or numbers that tell you that your site isn't as successful as you want.

As with unique visitors and time on site, page views is a deeply flawed metric for understanding how a news organization is growing and retaining audiences.

If the number of page views goes up, it could be a good thing. Or, it could be bad.

If page views go down, it could be a bad thing. Or – you guessed it – it could be good.

We would all like to think that a soaring number of page views means lots of people are eagerly pawing through our sites reading everything that's written. However, how many times have you gone to a site and clicked on, say, 12 pages, fruitlessly looking for something?

This is counted as:

1. One unique visitor
2. One visit
3. 12 page views

And one dissatisfied person who may not come back.

The page views metric rewards the bad design and navigation that many news sites have (sorry). Most news sites persist in using section titles that are the same as their legacy media product (e.g., "Local News," "Life"), leaving audiences – if they're so inclined – to have to click numerous times before landing on a story about a particular city or activity like gardening.

Or, a site breaks up a story into multiple pages, which can be annoying to a reader and reduces the possibility the reader will read the entire story and rate it, e-mail it or leave a comment. What could be counted as one page view with a comment is counted as, say, five page views.

If a site is redesigned and readers can find what they want with fewer clicks, total page views will – should – go down.

To truly grow, a news org must understand every action its audiences are taking on its sites. These are challenging times that require news sites to experiment and try many different things. Not all things will work, which means sometimes the numbers will be bad. But – you guessed it – that's a good thing. We have to know if something's not working so we can fix it.

Someday I’ll give up web analytics and move on to something real like pottery or something, but until then I’ll keep fighting the good fight to get news orgs to stop using monthly unique visitors as an indicator of success.

It’s tempting, I know, to count UVs because the number of monthly subscribers is the standard for print, and the number of people Nielsen says watched a program is the standard for TV.

But technology has made everything different. Strategy is more important than ever, and understanding audiences, not just counting them, is essential.

UVs are counted by tallying the number of cookies – in effect, browsers or computers – that go to a site. This means UVs are always significantly overcounted or undercounted.

If one person uses three computers, it’s counted as three unique visitors.

Conversely, a number of people going to one computer – for example, at a school or library – means that UVs will be undercounted.
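Both errors are easy to see in a toy model. This sketch (hypothetical people and machines, assumed for illustration) counts “unique visitors” the way a cookie-based tool does – one per cookie – and compares that to the number of actual people:

```python
# Toy model of cookie-based unique-visitor counting.
# Each page view is recorded as (person, cookie); the tool only sees cookies.
overcount = [  # one person, three machines
    ("alice", "work-pc"),
    ("alice", "home-pc"),
    ("alice", "phone"),
]
undercount = [  # three people sharing one library computer
    ("bob", "library-pc"),
    ("carol", "library-pc"),
    ("dan", "library-pc"),
]

def unique_visitors(views):
    """What a cookie-based tool reports: one 'visitor' per distinct cookie."""
    return len({cookie for _, cookie in views})

def actual_people(views):
    """What we'd actually want to know: distinct humans."""
    return len({person for person, _ in views})

print(unique_visitors(overcount), actual_people(overcount))    # 3 UVs, 1 person
print(unique_visitors(undercount), actual_people(undercount))  # 1 UV, 3 people
```

Same tool, same month, and the count is triple the truth in one case and a third of it in the other.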

(So that’s the reason why news orgs use total UVs – would they use this number if it were consistently undercounted? Don’t think so…)

Mason notes that while the UV metric is “particularly important for those sites that are dependent on advertising revenues as a major source of income,” it “must always be treated with caution and never taken at face value.”

How do you measure mobile? It’s a mess, even for web analytics gurus like Judah Phillips at Monster.com, who said as much in an interview with IQWorkforce:

“The mobile space is interesting to me too, but it’s very much like traditional web analytics on smaller screens with some absolutely crazy data collection, sessionization, and visitorization challenges.”

Huh? Let’s start at the beginning.

Just as millions of other people, I have a mobile phone. However, I just discovered that I don’t have a “phone.” I have a “device,” or just simply, “a mobile.” That’s right – “mobile” is now a noun.

I guess the definition of “phone” is now limited to something on which you only make or take calls. And, it turns out, even the lowest end mobiles – or lean mobiles – at least have texting capabilities. (If you want to sound like a techie, use SMS, or short message service. Sheesh.)

Up until last year I had a lean mobile with a camera. I loved using the camera but didn’t send photos to anyone because it would have cost me $15, just like that. I didn’t send any texts because each one made or received cost $0.20 each, which meant every time I got an unsolicited text from a company or an unknowing friend I was a little annoyed. It wasn’t the cost. It was just the principle of it all. I neither wanted nor needed these texts, and I had no control over receiving them.

I’m on a smartphone now, an iPhone 3GS, for no particular reason other than it sounded fun. I’m paying $5/month extra to make or get 200 texts, and I’ve found texting pretty useful – so useful that if I find myself doing more than 200 texts/month I’ll probably pay the AT&T rip-off unlimited-text fee of $20/month. I’ve also been doing everything else everyone else does – reading news stories, tweeting, updating my Facebook status, checking in for flights, buying things.

And I have no idea whether I’ve been using mobile apps or the mobile web.

It turns out “There are ‘three worlds’ in mobile: apps, mobile Web and SMS. In the case of smartphone owners, they will use all three to varying degrees.” (From Internet2Go.net, March 2009.)

You know what “three worlds” means. Three different sets of metrics.

And, guess what? Apps are device-specific, which means there are different sources of metrics for iPhone apps (and no, contrary to popular belief, the iPhone hasn’t taken over the world), BlackBerry, Palm Pre, etc.

And….

Mobile web browsers (e.g., Safari on an iPhone) are also device-specific.

And….

All of those mobile usage numbers from comScore, Ground Truth (a mobile measurement firm) and the like only measure one mobile world, the world of mobile web browsers. They don’t measure usage from apps. And how many people use mobile apps rather than the mobile web, especially for Facebook and Twitter? I dunno – a lot?

The mobile usage numbers may all be flawed, but they all do point to mobile’s continuing rapid growth. So, unfortunately, that means we’ve got to understand what new nouns like “sessionization” and “visitorization” mean.

I exercise regularly and don’t smoke, but I avoid reading Prevention.com. It’s just too annoying to be reminded of all of the other “smart ways to live well.” When I see “by the editors of Prevention.com,” I have this picture in my mind of a bunch of really healthy people popping out cheerful stories like “5 Vitamins Your Bones Love” and “10 Reasons You’re Always Exhausted.”

Now I have another (annoying) vision, thanks to MinOnline‘s story about how well Prevention is using web analytics. Of course a staff that is so pragmatic and probably always mentally alert would resist “going for the cheap link grab and traffic spike” – the junk food of web analytics.

Prevention stays “on its own brand message and [courts] the kinds of audiences that it and its advertisers really want. ‘We got back to engaging with our customer in the ways we knew they wanted us to engage with them,’ says [vp/digital Bill] Stump. Fishing for any and all eyeballs and courting simple traffic spikes in the search-driven universe doesn’t pay off in the end. ‘You get waves of traffic, but the tide goes back out and what are you left with?’ Instead, by keeping to the needs of the ‘core customer’ in everything that goes out to syndication or into the e-mail newsletters, prevention.com is courting the people who tend to stay.

“Now, each big wave raises the sea level for all of prevention.com’s metrics, says Stump. In the last two years, overall page views climbed 60%. In the last year, the number of visits per user went up 12%. But it is the engagement metrics of which Stump is proudest. ‘The number that warms my heart,’ he says, ‘is page views per visitor that are up 49%.’ That means the new visitors are sticking with the site and drilling much deeper than they ever have before. ‘In general, advertisers want an engaged audience. They want the metrics that show that people value your brand and come to you for something that is unique. We own natural health and fitness and beauty. We are the authentic voice.'”

Time spent on a site or a visit ranks right up there with total page views and monthly unique visitors as widely quoted metrics masquerading as indicators of success for news organizations.

No, it's not a crime to misuse a metric, but isn't it a shame to waste your time on something that's not absolutely essential to your site's success?

Plus, the way that time spent is calculated is flawed. All web metrics are flawed somewhat, but time spent is really misleading.

More on the ugly methodology later – let's tackle time spent's uselessness first. In other words, if the methodology were acceptable would time spent still be a key performance indicator?

Advertisers have always made decisions based on the level of engagement a news org's audiences have with its brand and content. But both content and the ways people use and interact with content are different – and thus the way engagement is measured is different, too.

In the past, time spent was an important measure of engagement for news orgs and advertisers. People spent whole chunks of time with one medium or another. Readership surveys measured time spent per day or per week.

Because these were surveys, time spent was based on self-reported information. It was what people said they did vs. what they actually did.

But it didn't matter whether what people said matched with what they did. What mattered was how engaged people felt. People who reported they spent an hour a day with Monday's newspaper but actually only spent twenty minutes believed they spent a large chunk of time and attention with a news org.

In stark contrast, web advertising decisions depend on knowing actual behavior as reported via rows upon rows of numbers ruthlessly pouring out every second. Among many other things, advertisers track the number of times their ads come up and are clicked upon. Sites and audiences are more niche and are highly segmented. The algorithms for and definitions of "engagement" vary for every site and every company.

Time spent just isn't a good indicator of engagement. Someone who spends five minutes a day on a site, goes to five different stories each visit and adds comments twice a week is clearly more engaged than someone who comes onto a site for 30 minutes a week and clicks idly on a few pages while talking on the phone.

How many times have you spent 30 minutes or so on a site, flipping and flapping through what seems like a million page views in a fruitless attempt to find something? Maybe you spent 30 minutes in such a visit once – and never went back.

A news org's success in the long-term will be based not on how much time people spend on a site but what they do once they're there.
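As for the ugly methodology promised earlier: most page-tag analytics tools derive time on site from the gaps between consecutive page-view timestamps, so the time a reader spends on the last page of a visit – which is the only page of a single-page visit – never gets measured at all. A minimal sketch of that logic (simplified; real tools layer sessionization rules on top):

```python
def time_on_site(timestamps):
    """Approximate a visit's duration the way page-tag tools commonly do:
    sum the gaps between consecutive page-view timestamps (in seconds).
    The last page has no 'next view' to subtract from, so however long
    the reader lingers there, that time is lost."""
    if len(timestamps) < 2:
        return 0  # a single-page visit registers as zero seconds
    return timestamps[-1] - timestamps[0]

# A reader opens three pages at 0s, 30s and 50s, then reads the last
# page for ten minutes before closing the tab:
print(time_on_site([0, 30, 50]))  # 50 -- the ten minutes of reading vanish
print(time_on_site([0]))          # 0  -- one page, however long it was read
```

So a site full of single-page visits from deeply absorbed readers can report a near-zero average time on site.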

In the late 1990s, “America Online” was the shiny new company everyone watched, feared and tried to copy. Just “AOL” now, it’s hardly as fresh or inspiring. With its new CEO, logos and use of web analytics to select the stories it covers and evaluate its reporters, has AOL once again become a news organization to watch?

AOL’s announcement that it will employ “judicious use of Web-analytics software” sparked the expected flutter of coverage. It’s admitted to using data to inform (dictate?) news decisions, so you could be led to believe that AOL is adopting a true audience-based approach. However, after reading the Feb. 22 story in BusinessWeek and the reactions gathered by Media Post News, it seems like AOL is still using a traditional advertising-based mass media strategy. It’s still trying to be all things to all people. It’s just using web analytics to decide what those things are.

“Audience growth and audience engagement have to be the things that we judge the most off of our journalist investments,” AOL CEO Tim Armstrong is quoted as saying. So far, so good.

Armstrong also said that “brand ads should be a lot bigger on the Internet today,” talking about how online advertising revenue should pick up. But there was no mention about AOL’s own brand strategy, something that would answer the question of “What is AOL?” for audiences and advertisers once and for all. On which niches will it focus? How much of its content will be unique and compelling enough to those niche audiences so that they’ll come back regularly?

“The right approach to the content business is to KNOW YOUR AUDIENCE, or the people that come to your site, and create a product for THEM. AOL’s approach is clearly not centered on this….it’ll drive up page views and therefore, revenue but that’s not likely to last as the industry becomes more analytics savvy. Today, a million uniques with zero session times, high bounce rate and no repeat visitors isn’t seen as a sign of a lack of audience but in the not too distant future it will.”

I haven’t researched AOL myself, so I don’t know if all of the details in the BusinessWeek and Media Post News stories really reflect what AOL is doing. So I’ll just note some things news orgs should think about when using web analytics to inform news decisions and evaluate journalists.

Evaluating success (either a site’s or a journalist’s) by total page views doesn’t work. A large number of page views may just indicate visitors got there by mistake or clicked around trying to find something. Plus, dynamic content (Flash, etc.) will not be counted as page views. Page views can be a useful metric, but only when combined with other metrics – such as ratios – that give context.
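A ratio such as page views per visit is trivial to compute and immediately more telling than a raw total. A quick sketch with hypothetical numbers:

```python
def pages_per_visit(page_views, visits):
    """Page views divided by visits: a crude but useful context-giver.
    A high value can mean deep reading -- or readers clicking around,
    lost; either way it prompts a question the raw total never would."""
    return page_views / visits

# The same total page views can tell two very different stories:
print(pages_per_visit(120_000, 10_000))  # 12.0 pages/visit -- engaged, or lost?
print(pages_per_visit(120_000, 60_000))  # 2.0 pages/visit -- shallow, or efficient?
```

The ratio doesn’t answer the question by itself, but it tells you which question to ask – which is exactly the context a lone total lacks.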

Engagement can’t be determined by web traffic or behavioral data alone. Attitudinal research is essential to find out why people do or don’t come to a site regularly, what they want and what they’re not finding.

If journalists are going to be held accountable for web traffic and audience engagement will they also have control over the factors that drive traffic, such as design, navigation and marketing? Or will they just submit their stories and hope for the best?

“AOL is even considering sharing a portion of quarterly profits with staffers whose work fetches the most page views.” BusinessWeek

How will traffic goals be set? If journalists will be rewarded for generating “traffic” (however it’s defined), will they be fired if they don’t? Will the benchmarks or starting points – and the time journalists have to reach the required traffic levels – be based on whether a topic is already established or whether it’s one a news org wants to nurture and grow because the topic is essential to achieving its strategy?

“Tacked to the newsroom walls in AOL’s downtown Manhattan headquarters are pages and pages of Web traffic data.” BusinessWeek

Uh, this would cause even me to shut down. It’s definitely not “judicious use of Web-analytics software.” Does AOL have a few key performance indicators that everyone understands and on which they can focus as a team?

Software and reports don’t make decisions; people do. Successful use of web analytics depends on the decision-makers understanding and using the information correctly. If news orgs believe the success of their websites depends on being truly audience-focused then they must also ensure the analytical resources and processes are there as well.

AOL may stumble again but at least it’s trying something different. I look forward to learning from AOL whether it succeeds or fails.

YouTube‘s become a verb and a household name, but I’ll always see it as an organization that’s brought metrics into the lives of the common people (those who have broadband Internet, anyway). The “Most Popular” and “Featured Videos” are seen worldwide, sometimes garnering millions of views. “Hey, did you see….” is usually accompanied by something like “…and it has x million views on YouTube!”

Number of views is great for little else other than bragging rights. It’s one of the “famous” metrics (web analytics guru Avinash Kaushik‘s term) that “are staring you in the face when you crack open any analytics tool” but “barely contain any insight.”

Yep, for anyone in the content business, number of views is right up there with hall of famers number of page views and monthly unique visitors.

YouTube has pushed all of its account holders – no matter how amateur – to use meaningful metrics. In March 2008 it launched Insight, its “video analytics tool for all users,” along with some almost-preachy instructions on how to use metrics to get more people to watch your videos and, of course, come more often to YouTube.

The Insight tool allows you to track “community engagements” (there’s that word again) in terms of ratings, comments, and favorites. YouTube doesn’t want you to settle for people just watching your video. People have to show, in a measurable way, that they not only watched it but also reacted to it.

At the very least people should give a star rating (one is bad, five is good). Rating is easy, quick and anonymous. Tagging a video as a favorite is the next rung. And if they’re really engaged, they’ll leave comments.

But, as anyone who’s ever spent any time at all on YouTube knows, many comments are spam, obscene and irrelevant – just noise. The value of social media metrics, though, lies in looking beyond what James Kobelius in Information Management points out is an “often low and laughable” signal-to-noise ratio.

Kobelius notes that “if you crawl, correlate, categorize, mine, and explore it with the right tools….[this unstructured information] can yield unexpected insights….The intelligence value of any individual tweet [or comment] in isolation is negligible….Intelligence emerges from the aggregate.”

If you can stomach a few obscenities, look at this thought in Encyclopaedia Dramatica about YouTube view fraud and how the ratio of VPC, or views per comment, “is the most accurate way to determine if anyone” cares. “A high VPC usually means view fraud has been committed.”

The example in ED shows that a video with 136,097 views and 3,529 comments has a VPC of about 38.6, a low number that indicates this is a video “that people actually find funny.” The video with 296,413 views, 541 comments and thus a VPC of 547.9 is probably something nobody really cares about.
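VPC is just views divided by comments. A sketch using the figures quoted above:

```python
def views_per_comment(views, comments):
    """Views per comment (VPC): the lower the number, the larger the
    share of the audience that bothered to react. Encyclopaedia
    Dramatica reads a high VPC as a sign of view fraud or indifference."""
    return views / comments

print(round(views_per_comment(136_097, 3_529), 1))  # 38.6 -- low: people cared
print(round(views_per_comment(296_413, 541), 1))    # 547.9 -- high: crickets
```

Like pages per visit, it’s a ratio, not a verdict: the threshold for “high” will differ by topic and audience.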

I calculated some VPCs from this week’s “Most Popular” videos and came up with some numbers that I don’t know what to do with yet. To see if VPC can be used as a key performance indicator, I’ll need to calculate VPCs and crawl through the cacophony of a variety of news videos. VPC may never be “famous,” but it might be insightful.