Measurement Strategies: Balancing Outcomes and Outputs

I’m finding myself in a lot of conversations where I’m explaining the difference between “outputs” and “outcomes.” It’s a distinction that can go a long way when it comes to laying out a measurement strategy. It’s also a distinction that can seem incredibly academic and incredibly boring, at least to the unenlightened!

Outputs are simply things that happen as the direct result of some tactic. For instance, the number of impressions for a banner ad campaign is an output of the campaign. Even the number of clickthroughs is an output: in and of itself, there is no business value in a clickthrough, but it is something that is a direct result of the campaign.

An outcome is direct business impact. “Revenue” is a classic outcome measure (as is ROI, but this post isn’t going to reiterate my views on that topic), but outcomes don’t have to be directly tied to financial results. Growing brand awareness is an outcome measure, as is growing your database of marketable contacts. Increasing the number of people who are talking about your brand in a positive manner in the blogosphere is an outcome. Visits to your web site are an outcome, although if you wanted to argue that they are really just an aggregated output measure (the sum of the outputs of all of the tactics that drive traffic to your site), I wouldn’t put up much of a fight.

Why Does the Distinction Matter?

The distinction between outputs and outcomes matters for two reasons:

1. At the end of the day, what really matters to a business are outcomes; if you’re only measuring outputs, then you are doing yourself a disservice.

2. Measuring both outputs and outcomes can help you determine whether your best opportunities for improvement lie with adjusting your strategy or with improving your tactics.

Your CEO, CFO, CMO, COO, and even C-3PO (kidding!) — the people whose tushes are most visibly on the line when it comes to overall company performance — care that their Marketing department is delivering results (outcomes) and is doing so efficiently through the effective execution of tactics (outputs).

Campaign Success vs. Brand Success

Avinash Kaushik wrote a post a couple of weeks ago about the myriad ways to measure the results of a “brand campaign.” Avinash’s main point is that “this is a brand campaign, so it can’t be measured” is a cop-out. If you read the post through an “outcomes vs. outputs” lens, you’ll see that measuring “brand” tends to be more outcome-weighted than output-weighted. And (I didn’t realize this until I went back to look at the post as I was writing this one), the entire structure of the post is based on the outcomes you want for your brand — attracting new prospects, sharing your business value proposition more broadly, impressing people about your greatness, driving offline action, etc.

Avinash’s post focuses on “brand campaigns.” I would argue that all campaigns are brand campaigns — while they may have short-term, tactical goals, they’re ultimately intended to strengthen your overall brand in some fashion. You have a strategy for your brand, and that strategy is put into action through a variety of tactics — direct marketing campaigns, your web site, a Facebook page, press releases, search engine marketing, banner ads, TV advertising, and the like. Many tactics are in play at once, and they all act on your brand in varying degrees:

And, of course, you also have happenstance working on your brand — a super-celebrity makes a passing comment about how much he/she likes your product (or, on the other hand, a celebrity who endorses your product checks into rehab), you have to issue a product recall, the economy goes in the tank, or any of these happen to one of your competitors. You get the idea. The picture above doesn’t illustrate the true messiness of managing your brand and all of the other arrows that are acting on it.

Oh, and did I mention that those arrows are actually fuzzy and squiggly? It’s a messy and fickle world we marketers live in! But, here’s where outcomes and outputs actually come in handy:

In a perfect world, you would measure only outcomes for your tactics…which would mostly mean you would actually measure at some point after the arrows enter the brand box above, but…

You don’t live in a perfect world, so, instead, you find the places where you can measure the brand outcomes of your tactics, but, more often than not, you measure the outputs of your tactics (measuring closer to the left side of the arrows above), which means…

You actually measure a mix of outcomes and outputs, which is okay!

Tactics are what’s going on on the front lines. Their outputs tend to be easily measurable. For instance, you send an e-mail to 25,000 people in your database. You can measure how many people never received it (output: bouncebacks), how many people opened it (output), how many people clicked through on it (output), and how many people ultimately made a purchase (outcome). Except the outcome…is probably something you wildly undercount, because it can be darn tough to track all of the people for whom the e-mail played some role in influencing their ultimate decision to buy from your company. The outputs can also be measured very soon after the tactic is executed (open rate is a highly noisy metric, I realize, but it is still useful, especially if you measure it over time for all of your outbound e-mail marketing), whereas outcomes often take a while to play out.
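To make the output/outcome split concrete, here is a minimal sketch of the e-mail example. All of the counts are hypothetical (the post only gives the 25,000 figure), and the tracked-purchase number is deliberately labeled as a likely undercount:

```python
# Hypothetical e-mail campaign counts; only "sent" comes from the post.
sent = 25_000
bounced = 1_250    # output: never delivered
opened = 5_500     # output: opens (a noisy metric, but trendable over time)
clicked = 900      # output: clickthroughs
purchased = 45     # outcome: tracked purchases (almost certainly an undercount)

delivered = sent - bounced

# Outputs are available almost immediately; the outcome trickles in later.
metrics = {
    "bounce_rate": bounced / sent,
    "open_rate": opened / delivered,
    "clickthrough_rate": clicked / delivered,
    "conversion_rate": purchased / delivered,
}

for name, value in metrics.items():
    print(f"{name}: {value:.2%}")
```

The point of the sketch is the asymmetry: the first three metrics are outputs you can read the next morning, while the conversion rate is an outcome that both lags and understates the campaign’s real influence.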

At the same time, if you ignored measuring the tactics and, instead, focused solely on measuring your brand, you would find that you were measuring almost exclusively outcomes (see Avinash’s post and think of typical corporate KPIs like revenue, profitability, customer satisfaction, etc.)…but you would also find that your measurements have limited actionability, because they reflect a complex amalgamation of tactics.

So, What’s the Point?

Measure your brand. Measure each of your tactics. Accept that measurement of the tactics is heavily output-biased and measurable on a short cycle, while measurement of your brand is heavily outcome-biased and is a much messier and sluggish beast to affect.

Watch what happens:

If your brand is performing poorly (outcomes), but your tactics are all performing great (outputs), then reconsider your strategy: you chose tactics that are not effective

If your brand is performing well…cut out early and play some golf! Really, though, if your brand is performing well but your tactics are performing poorly, you may still want to scrutinize your strategy, as you’re succeeding in spite of yourself!

The key is that tactics are short-term, and driving improvement in how they are executed — through process improvements, innovative execution, or just sheer opportunism — is an entirely different exercise (operating on a different — shorter — time horizon) than your strategy for your brand. Measure them both!

Comments

I would make it a requirement of every campaign to define not only the measurable outputs it’s looking for and the business-related outcomes it’s after, but also the intangible educational/informational outcomes: the insights about your company, brand, campaign, and tactics that you couldn’t really put into a particular measurement plan. Things you may have suspected, or may have had no clue about, that add a personal dimension to the people exposed to the campaign.

Outcomes like:
– This target audience doesn’t like message style X
– Time of day / day of week really seems to affect results

And then there are the ones that weren’t really part of the campaign, but that expose things in the company that need to be addressed:
– Oh, I guess funnel/process A really is quite broken
– Now that we’ve collected all these emails, we don’t really have a focused way to market to them
– Looks like we didn’t set up enough segments/custom reports for referring traffic/campaigns

That last set of outcomes is my favorite – we’ve used data from a campaign to gather insights about internal processes and procedures that will make every campaign going forward more successful. Anyone can fill the bucket, but it takes some good observations to fix the leaks.