
It’s the battle of the Tag Management…Conferences these
days. I was down in San Diego for Tealium’s Digital Velocity this past week and
I'm looking forward to Ensighten’s Agility Conference this week. Even the names are
kind of the same! I don’t generally favor scheduling competing conferences so
close together (Jim Sterne and I, for example, work to keep a nice separation
between X Change and eMetrics events), but that’s kind of the way things are in
the Tag Management world these days. It’s growing, it’s crowded, it’s
competitive and it’s not all lovey-dovey.

That’s a shame really, because I admire all of the companies
in the space. Tagman pioneered it years before the rest of us saw and
understood the need. And though Ensighten wasn’t the first TMS, to my mind they were
the first to understand the unique application of Tag Management for Web
analytics. They saw and helped popularize the idea that Tag Management was more
than universal tagging, and their GUI was the first that specifically targeted Web analytics and its particular and special challenges. That triggered a wave of great products that have steadily leapfrogged the
state-of-the-art (and each other) to deliver vastly improved capabilities and
experiences in a very short time. BrightTag’s Server-to-Server methodology is a fundamental re-think
of the potential for digital data collection and they serve that atop a robust
platform for analytics and non-analytics tags. Satellite’s elegant and comprehensive GUI for
Web Analytics tagging has re-defined what a TMS can do in the interface –
greatly extending the scope of the TMS and lowering the bar on the expertise
you need to implement. Tealium’s newest
release continues the trend in both directions – greatly improved user interface AND
advanced data collection (DataCloud). We’ve seen these companies almost
continually re-define what Tag Management is and can be.

Maybe friendly is over-rated.

Here are a few thoughts from Digital Velocity. The core of Tealium’s new release is fresh evidence of the rapid advancement in the user interface for tag management – both on the fixed Web and, increasingly, the mobile Web. For hands-on
users of these systems, it’s a huge boon.

What makes DataCloud potentially important? With any tag
management system, you have the ability to direct data to your own servers. To
create, in effect, a real-time event-level data feed that frees you from the
necessity of getting a data feed with 24-hour latency from your Web analytics vendor. There is a bit of cost to be paid, of course – you lose some of the processing that happens vendor-side and is incorporated into those feeds, which means you have to do a bit more work getting the feed set up and clean.
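
What does that redirect look like in practice? Here’s a minimal sketch, in Python, of the kind of collection endpoint you might run yourself to receive event-level hits as they happen. The endpoint, parameter handling, and file-based storage are all hypothetical illustrations, not any vendor’s actual implementation:

```python
# A toy event collector: a tag fires a beacon at this server and each hit
# lands as one JSON line. Everything here (port, storage) is illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import json, time

class CollectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Each query-string parameter is one piece of the event payload
        # the tag would otherwise have sent only to the vendor.
        event = {k: v[0] for k, v in parse_qs(urlparse(self.path).query).items()}
        event["received_at"] = time.time()
        # Append to a local file; a real feed would land in a queue or warehouse.
        with open("events.jsonl", "a") as f:
            f.write(json.dumps(event) + "\n")
        self.send_response(204)  # beacons expect no response body
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), CollectHandler).serve_forever()
```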

DataCloud, however, is more than this simple re-direct
capability. The beauty of DataCloud is that it captures ALL of the event streams (i.e. from all of your tags) and routes them to a single, cloud-based location.
But that’s not the whole story either. DataCloud creates a universal key across
all those streams. In other words, your
ad serving tags (for instance) will now have a keyed identifier that matches
your web analytics tags – visitor to visitor! For everybody. Anonymous visitors and logged-in customers. Every tag, every visitor. That’s cool. This is Genesis done in a completely different,
simpler, cleaner fashion. It’s a great idea.
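
To see what that universal key buys you, here’s a toy sketch in Python of the cross-stream join it enables. The field names ("uid" and so on) are hypothetical stand-ins, not Tealium’s actual schema:

```python
# Two event streams from different tags, both stamped with the same
# visitor key -- so ad serving data joins to web analytics data directly.
ad_events = [
    {"uid": "v123", "campaign": "spring_sale", "impressions": 4},
    {"uid": "v456", "campaign": "retargeting", "impressions": 2},
]
wa_events = [
    {"uid": "v123", "pages_viewed": 12, "converted": True},
    {"uid": "v456", "pages_viewed": 3,  "converted": False},
]

# With a shared key, the cross-system join is a simple lookup.
wa_by_uid = {e["uid"]: e for e in wa_events}
joined = [{**ad, **wa_by_uid[ad["uid"]]} for ad in ad_events if ad["uid"] in wa_by_uid]
for row in joined:
    print(row)
```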

Which isn’t to say there aren’t issues with this approach as
well. Naturally, it only works well with tag-based feeds. You still have to
integrate your non-tagged data the old-fashioned, sweaty way. Nor is it a
panacea for all tagged data. After all, not everything you need to know about
the event is necessarily captured client-side at the tag. So you are still
likely to have to integrate feeds from digital vendors. But that integration
will be internal key to internal key (their own keys) and you’ll have already
solved the cross-system join which is, after all, the trickiest one.

One other thought from Digital Velocity. Like all the TMS
companies, Tealium has been growing like gangbusters. With their recent hiring of Jeff Lunsford as CEO (and not just Jeff – it felt like a Veterans Day reunion down in San Diego with all the WebSideStory guys there), it’s clear
that they’ve upped their game as a software and technology company. This is a
team with deep roots in SaaS development and they are putting together a lot of
the same pieces that helped build WebSideStory. That makes for a tight-knit
and highly professional technology team. That’s not an easy thing to build from
scratch and it may prove to be a critical differentiator. When you’re
evaluating enterprise software it is as much about the company as the code, and
Tealium looks to be building a great company.

As one of the Tealium guys remarked to me, “We’re putting
the band back together!”

It sure looks that way. And if they aren’t quite on a
mission from God, they are nevertheless on a crazy, wild, Blues Brothers kind of ride.

How is it possible that with 2013 only 3 weeks old I’m
already behind schedule? I’d hoped to open 2013 with my resolutions blog
(check), finishing up the strategy series (check), and then doing a recap of
all of 2012 writings to highlight favorite pieces and revisit some of the
thinking (un-check). Perhaps I should have tackled that in December, but
somehow the holidays just got in the way. I still hope to do that big recap,
but before I go there, I wanted to highlight this past week’s release of our
first whitepaper for 2013: Measuring and
Evaluating your Social Media Effort.

In this year’s resolutions, I argued that measurement and analytics are often much less impactful than we’d hope or expect, and I laid out
a half-dozen things that enterprise analytics should be focused on to make
measurement matter. Social media wasn’t directly on that list and plays a part
only in the final item around revamping online customer research. So does it
really matter?

For most Semphonic clients (large-scale, well-established, multi-channel enterprises), social media is not the marketing channel that
matters most. It’s not where our clients spend the most money. It’s not where
they put the most effort. It’s not where they get the best returns. But there
are two aspects of social media that make the potential for measurement impact
unusually high. First, and probably most important, is that social media
efforts and techniques are still evolving and maturing rapidly. Few
organizations believe that they have “figured out” social media, so the demand
for and interest in measurement is very high. Second, the organization of the
enterprise around social media has not coalesced into a single common model. Many
enterprises haven’t decided on organization or ownership of social media
functions and those that have may not have fully grasped the implications of
their decisions.

So while social media may not be the biggest piece of most
digital marketing programs, it may be a piece that measurement can impact
dramatically.

Measuring and
Evaluating your Social Media Effort provides a comprehensive view of how to
think about measurement in social media and how measurement can drive not only
the actual social media program but the enterprise organization around social
media.

The goal of measurement, after all, is clarity. A good measurement
foundation for social media begins with a functional consideration of what social
media might be used for and accomplish. These use-cases not only drive down
into the actual measurement required, they drive “up” into the
pieces of the organization that should own them. So creating a good measurement
foundation for social media will often provide helpful clarity around
organization.

That’s why Measuring and Evaluating your Social Media
Effort opens with a look at the wide range of different business functions
that social media can support: from viral campaigns to customer support to
brand research.

The argument here is simple. The term “social media” doesn’t
capture one thing that can be packaged up in a single place in the enterprise
and measured in a single way. Instead, it encompasses a wide variety of purposes,
owners, measurements, and meaningful analytics. That diversity has deep implications for how you organize social media efforts across the enterprise. It's almost impossible (and usually a mistake), when an enterprise has a diversity of social media interests, to lodge ownership in a single place.

Given that diversity of functions, how can the enterprise
approach social media measurement? In the white paper, we lay out Semphonic’s answer.
It’s an answer rooted in our approach to digital measurement in
general. Start with your audience. Break audience down by purpose. Measure
accordingly. It’s Two-Tiered Segmentation for the social media world.
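
As a toy illustration of that audience-then-purpose approach (the classification rules here are invented for the sketch, not Semphonic’s actual methodology), the mechanics might look something like this in Python:

```python
# Two-tiered segmentation in miniature: tier one is WHO the visitor is,
# tier two is WHAT they came to do; metrics are then rolled up per cell.
from collections import defaultdict

visits = [
    {"audience": "customer", "entry": "support_forum", "duration": 340},
    {"audience": "prospect", "entry": "product_page",  "duration": 95},
    {"audience": "customer", "entry": "product_page",  "duration": 60},
]

def purpose(visit):
    # Tier two: infer visit purpose from behavior (a crude rule for the sketch).
    return "get_help" if visit["entry"] == "support_forum" else "research"

cells = defaultdict(list)
for v in visits:
    cells[(v["audience"], purpose(v))].append(v["duration"])

for (audience, purp), durations in sorted(cells.items()):
    print(audience, purp, "avg duration:", sum(durations) / len(durations))
```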

Of course, moving this digital measurement approach into
social media requires some significant adaptations. Not the least of these is
understanding how to begin the process of segmentation in social media and the
critical role of sampling in that segmentation process. Almost every social
media measurement you make is part of some sample, even when your goal is to
measure the entire population. The size and complexity of the social
environment make true population capture nearly impossible.

Typically, social media measurement works by capturing a
very large percentage of the conversations taking place. Surely if you’re capturing 80% of all conversations, your conclusions are likely to be much more accurate than traditional opinion research, where samples were often no more than tiny fractions of the actual population…

Not really.

It’s actually better to have a small but truly random sample of
a population than a large non-random sample. The process of listening in social
media often introduces significant non-random filters of the population, making
even very large captures potentially misleading for measurement.
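
A quick simulation makes the point. Assume a population where the true negative-opinion rate is 30%, but the listening filter over-captures the vocal, negative minority (the capture rates here are invented for illustration):

```python
# Small random sample vs. large biased capture.
import random
random.seed(42)

population = [1] * 30_000 + [0] * 70_000  # 1 = negative opinion; true rate 30%
random.shuffle(population)

# Large but biased capture: negatives are twice as likely to be collected.
biased = [p for p in population if random.random() < (0.9 if p else 0.45)]
# Small but truly random sample.
srs = random.sample(population, 1_000)

print(f"biased capture: n={len(biased):,}  estimate={sum(biased)/len(biased):.1%}")
print(f"random sample:  n={len(srs):,}   estimate={sum(srs)/len(srs):.1%}")
# The biased estimate lands near 46% despite capturing ~58,000 people;
# the 1,000-person random sample lands close to the true 30%.
```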

If deciding on your audience is job one in social media, getting your samples right (or close to right) is job two.

With audience and function defined, the next step in
effective social media measurement is to create appropriate measurements of
success. As with all digital measurement, good KPIs must be set within the
context of audience and function. The whitepaper carves out a set of likely
KPIs for functions as different as communities, customer support, viral
campaigns, and PR.

These KPIs deepen typical social media measurement and are
meant to provide a powerful kick-start to effective thinking about metrics and
reporting in social dashboards.
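
By way of illustration only – these are common KPIs for each function, not necessarily the set the whitepaper proposes – a function-to-KPI mapping might look like this:

```python
# Hypothetical examples of function-specific social media KPIs.
KPIS_BY_FUNCTION = {
    "community":        ["active members / registered members", "repeat contribution rate"],
    "customer support": ["issues resolved in-channel", "time to first response"],
    "viral campaign":   ["amplification rate (shares per exposure)", "downstream conversions"],
    "pr":               ["share of voice vs. competitors", "sentiment among target audiences"],
}

for function, kpis in KPIS_BY_FUNCTION.items():
    print(f"{function}: {', '.join(kpis)}")
```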

Sometimes, of course, it’s as important to know what NOT to do as it is to get positive guidance. The whitepaper addresses a class of really bad social media KPIs and, using two examples (Brand Mentions and Facebook Fans), shows why they are misleading or wrong-headed.

Finally, the whitepaper delves into the implications for
technology. Just as measurement imperatives drive up to organization and down
to metrics, they also drive across to technology. That tie between measurement imperatives and the technology stack is a big piece of what I’ve been working to connect in the series on Digital Measurement
Strategy. With so many different functions, types of success and organizational
owners, it’s no surprise that, in our view, choosing a single “social media
technology” is wrong-headed. The white paper lays out some key groupings of
technology by function and, in particular, delves into the role of listening
tools for data collection.

You can download the entire white paper (for free of course)
on the Semphonic website. Download here!

In a series of posts at the end of last year, I outlined the
steps to create a comprehensive digital measurement strategy. Building a
strategy begins with a careful, objective assessment of the existing digital measurement program. In the next step, a high-level model of
the digital business is constructed. The model is designed to focus attention
on what needs to be captured to understand and optimize the digital channel.
Using that Model, a Data Science Roadmap is created. The Roadmap describes the
real-world analytics projects that need to be tackled. These projects are used
to drive the technology stack and data integration requirements. These
requirements encapsulate everything you need as an organization to create the
digital measurement system envisioned in the business model. In the final step
of the process, all of these steps are brought together into a single,
prioritized plan. The plan describes the analysis projects, the order
they’ll be tackled, and the technology and human resources required to execute those projects.

What I like about this process is how closely it ties each
step together. The ultimate output of a strategy is a budget, and in an
enterprise, you’re always fighting for some larger piece of the pie. That makes
it imperative that your budget build a tight bridge between what you’re asking
for and what you’re promising to deliver.

Heaven knows that budget processes are usually anything but
rational. On the national political stage, such processes are visibly risible.
But it’s not always much better in large private enterprise. Naturally,
everybody wants more money and everyone thinks they have good reasons for it. In
the world of analytics and measurement, it isn’t easy to prove your case. We
may sometimes benefit (and I for one thank the stars) from the trendiness of “big
data” and “analytics”, but we suffer constantly from the perception that investment
in measurement produces a “soft” return and that we are a discipline that has
consistently over-promised and under-delivered.

By closing the loop between what you’re asking for and what
you will deliver (and how that delivery will help optimize the digital
channel), you can forestall many of the common enterprise concerns. You’ll also
be delivering a budget and a strategy that are not only much more transparent and tightly connected than is common in digital analytics but also better than 99% of the budget requests senior managers see. Creating
a tight linkage between real strategic thinking and tactical budget requests
just isn’t the norm.

Fortunately, once we’ve gotten to this step in the process,
we have almost everything we need already in place. We have the current state
(Assessment), desired state (Model), the project plans (Data Science Roadmap),
and the resourcing requirements by project. All that’s really needed now is a prioritization of the project plans.

Not that prioritization is an easy exercise.

I think it’s best handled in two steps. Recall that we broke
up each of the steps into system components. So, for example, the business
model, data science plan and data integration strategy are all grouped within
systems:

By tackling a system at a time, you can reduce the number of projects to a more manageable and understandable level:

This slide summarizes a powerful story. It describes the
analysts necessary to tackle the measurement within the Conversion System. It
explains what those analysts are committed to producing (and when they’ll
produce it) and it shows the technology tools necessary for their tasks. In
conjunction with the Data Science Roadmap and the Business Model, it tells a
complete story: what you want to accomplish, how you’ll do it, and the
resources necessary to get the job done.

This is also a powerful antidote to corporate Pollyanna thinking
of the sort that goes along with having 3 people on your measurement team while
you talk about building a world-class analytics program. Digital programs need
to be able to show what it means to do the job right (the Model), how they plan
to do it (the Roadmap), and the resources necessary for that.

If, for example, you want the basic site analytics to
optimize conversion (Topology, Use Cases, and Funnel Analysis) done in the
first half of 2013, you need two analysts devoted to these projects. Cut an
analyst, and you know exactly what you’ll lose. Fund another analyst, you know
exactly what you’ll get.

To me, that’s the essence of good planning, good budgeting,
and good strategy.

It’s hard not to get caught up in the New Year’s frenzy of
evaluation, assessment, and prediction. 2012 was, by a large margin, the most
successful year ever at Semphonic and that mirrors the broader industry.
Analytics is becoming ubiquitous and the need for measurement and data science
has come closer to cliché than to evangelism. That’s all good. It’s great to be
on the right side of history. These are exciting times.

But change in the real-world has a funny way of confounding
your expectations and happening both much faster and much more slowly than you
believe. It’s a conundrum wrapped up nicely by that old cliché that the more
things change, the more they stay the same.

A lot has changed in digital analytics. There's been a dramatic leap into
big data platforms and the explosion of Hadoop. We've seen significant advances in Web analytics tools: deeper segmentation, more detailed data, and better integration with personalization and testing. True real-time personalization engines and CMSes designed to support Web personalization have emerged and carved out real market share. Mobile measurement within analytics and warehousing tools has evolved rapidly, delivering vastly improved capabilities. And a new generation of Social Media measurement software has shifted the landscape from toys to tools.

So what’s stayed the same? I think it’s the actual use of measurement
to drive the digital channel.

For every client that hires Semphonic to analyze and improve
their digital performance, there are three that hire us to audit their tags,
build them reports, or help figure out a strategy. This despite the fact that
we’re a company that sells hard on analytics. Having the right tags, good
reports and a real strategy are all valuable. Even critical. But I think the
percentages should be reversed. Tags, reports and strategy are all there to
help produce analysis that drives change.

The truth is that for a lot of companies, having measurement
and analytics is about having measurement and analytics. Not using them. It’s a
lot of hand-waving and empty gestures.

If you’re a digital marketing manager, a good goal for 2013
is to drive measurement use. That simple. I think it’s time to stop focusing on the constituent
pieces (technology, infrastructure, reporting, analysis) and start focusing on
the application of measurement. For me personally, I want to focus our
analytics practice in 2013 on clients who let us really drive their digital
channel.

So how do you do that?

Here are half-a-dozen things I think every enterprise should
be doing in 2013 to drive measurement USE:

Creating an independent team that
does nothing but analyze and suggest optimizations for digital marketing spend
(regardless of your agency setup). At minimum, this team should develop and use
a Media Mix Model, a model of incremental lift and attribution, and an analysis
of program variance.

Creating a measurement driven
testing plan with Use-Case Analysis and Segmentation. This analysis should
identify core use-cases on the site by segment and define a testing plan for
each based on key behavioral findings. It should include a buffet of smaller
tactical recommendations and at least one larger testing hypothesis for
improvement in each use-case. Analysis to Testing is THE cornerstone of
creating a virtuous cycle of measurement and change in your digital properties.

Creating a basic personalization
system for online. Personalization is the most powerful method of using
analytics to drive impact. The tools now exist (from CQ5 to Celebrus to SAS to Causata)
to support effective digital
personalization. You can’t afford to let this capability languish. For most
organizations, it is the ultimate use of digital analytics.

Changing from reporting to
forecasting. When I review my work in 2012, no theme seems more resonant to me than the idea that the vast majority of current measurement is about “showing the current state”. I liken this to a weatherman whose only instrument is a
thermometer. That makes for a pathetic weather report. It’s time to move from
building thermometers to building barometers. Forecasting is the easiest path
to analysis there is. It’s a way to force the organization to understand the
way the business works, the key levers of change, and the holes in the
measurement system (quite similar, in fact, to what I’ve been talking about in the Strategic Plan posts). A minimal sketch of this reporting-to-forecasting shift follows this list.

Using analysis to nail down at least
one key business question. I don’t care if it’s establishing a lifetime value
model, a lead assessment model, a lead assignment model, a Web to Call-Center
proclivity model, etc. The goal should be to start answering big questions.
Pick an important problem and DO WHATEVER IT TAKES to build an analysis that
drives optimization.

Fundamentally revamping your online
customer research. The current state of online survey research and use of
customer attitudes is pathetic. Consolidate and standardize all your customer
attitudes research in a warehouse, create an enterprise-wide customer attitudes
reporting system, and make sure that survey research is focused on researching
key attitudes and drivers of choice that can be used to support my 5 previous
steps. I don’t know of a single enterprise in the world that is doing this well.
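
Here, as promised, is the reporting-to-forecasting sketch referenced above: a plain least-squares trend projected forward on toy weekly data. A real forecast would add seasonality, marketing drivers, and uncertainty bands – this only shows the shift in posture from describing last week to predicting the next one:

```python
# From thermometer to barometer: project a fitted trend forward
# instead of only reporting the current state. All numbers are toy data.
weekly_visits = [10_200, 10_650, 11_000, 11_300, 11_900, 12_150]

n = len(weekly_visits)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(weekly_visits) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, weekly_visits))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

for weeks_ahead in (1, 2, 3):
    forecast = intercept + slope * (n - 1 + weeks_ahead)
    print(f"week +{weeks_ahead}: ~{forecast:,.0f} visits")
```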

My hope is that in 2014, we can look back at
2013 as a year of great change, not in the things that always seem to evolve,
but in the things that never seem to. By concentrating on these types of goals,
we can make 2013 the year in which measurement actually mattered.