The Impact Matrix | A Digital Analytics Strategic Framework

The universe of digital analytics is massive and can seem as complex as the cosmic universe.

With such big, complicated subjects, we can get lost in the vast wilderness or become trapped in a silo. We can wander aimlessly, or feel a false sense of either accomplishment or frustration. Consequently, we lose sight of where we are, how we are doing and which direction is true north.

I have experienced these challenges on numerous occasions myself. Even simple questions like “How effective is our analytics strategy?” elicit a complicated set of answers, instead of a simple picture the CxO can internalize. That’s because we have to talk about tools (so many!), work (collection, processing, reporting, analysis), processes, org structure, governance models, last-mile gaps, metrics ladders of awesomeness, and… so… much… more.

Soon, your digital analytics strategic framework that you hoped would provide a true north to the analytics strategy question looks like this…

The frameworks above cover just one dimension of the assessment (!). There is another critical framework to figure out how you can take your analytics sophistication from wherever it is at the moment to nirvanaland.

There are two deeply painful outcomes of the approaches you see in the pictures above (in which you’ll also see my work represented).

1. Obvious:

No CxO understands the story we are trying to tell – or, even the fundamentals of what we do in the world of analytics. Therefore, they are inclined to remain committed to faith-based decision-making and continue to starve analytics of the attention and investment it deserves.

2. Non-obvious:

Leaders of analytics organizations do not truly appreciate the wonderful effectiveness, or gross ineffectiveness, of their analytics practice (people, process, tools). You see… None of the currently recommended frameworks and maturity models aids analytics leaders in truly understanding the bottom line impact of their work. The result is analytical strategies that are uninformed by reality, and driven by new tool features, random expert recommendations and shiny objects (OMG we have to get offline attribution!).

When one grasps these two outcomes – blind business leaders, blind analytics leaders – it is simply heartbreaking.

Simplifying Complexity.

The dilemma of how to simplify this complexity, to create sighted business and analytics leaders, has lingered with me for quite some time. I’ve intended to create a simple visual that absorbs the scale, complexity and many moving parts.

I wanted to create a visual that would function as a diagnostic tool to determine if you are lost, trapped in a silo or wandering aimlessly. It would help you realize the extent to which analytics impacted the business bottom line today, and what your future analytics plans should accomplish.

Then one day, a magic moment.

During a discussion around planning for measurement, a peer was struggling with a unique collection of challenges. He asked me a couple of questions, and that sparked an idea.

I walked up to the whiteboard, and excitedly sketched something simple that abstracted away the complexity – and yet preserved the power of smarter thinking at the same time.

Here’s the sketch I drew in response:

Yes, it was an ugly birth. But, to me, the proud parent, it was beautiful.

It took a sixteen hour direct flight to Singapore for the squiggly sketch to come to life – where else, in PowerPoint!

The end result was just five slides. As the saying goes: It's not the ink, it's the think.

I want to share the fully fleshed out, put into practice and refined version of those five slides with you today. Together, they’ll help you fundamentally rethink your analytics practice by 1. understanding data’s actual impact on your company today and 2. picking very precise and specific things that should be in your near and long-term analytics plans.

The Impact Matrix.

To paint a simple picture of the big, complicated world of analytics, the whiteboard above shows a 2×2 matrix.

The business impact is on the y-axis, illustrated from Super Tactical to Super Strategic.

The time-to-useful is on the x-axis, illustrated from Real-Time to 6-Monthly.

Before we go on… Yes, breaking the x-axis into multiple time segments creates a 2×5 matrix, and not a 2×2. Consider that to be the price I’ve paid in order to make this more actionable for you. :)

Diving a bit deeper into the y-axis… Super Tactical is the smallest possible impact on the business (fractions of pennies). Super Strategic represents the largest possible impact on the business (tens of millions of dollars).

The scale on the y-axis is exponential. You’ll notice the numbers in light font between Super Tactical and Super Strategic go from 4 to 10 to 24 to 68 and onward. This demonstrates that impact does not grow in uniform steps – every step up delivers a massively higher impact.

Diving a bit deeper into the x-axis… While most data can be collected in real-time now, not all metrics are useful in real-time.

As an example, Impressions can be collected in real-time and they can also become useful in real-time (if actioned, they can have a super tactical impact – fractions of pennies). Customer Lifetime Value on the other hand takes a long time to become useful, over months and months (if actioned, it can have a super strategic impact on the business – tens of millions of dollars).

Here is a representation of these ideas on the Impact Matrix:

[You can download an Excel version of the Impact Matrix at the end of this post.]

Impressions can be used in real-time for decision-making by your display, video and search platforms (e.g., via automation). You can report Gross Profit in real-time, of course, but doing so is almost entirely useless. It should be deeply analyzed monthly to yield valuable, higher impact actionable insights. Finally, Lifetime Value will require perhaps the toughest strategic analysis, from data accumulated over months, and the action takes time to yield results – but they are magnificent.

Pause. Reflect on the above picture.

If you understand why each metric is where it is, the rest of this post will fill you with euphoric joy rarely experienced without physical contact.

The Impact Matrix: A Joyous Deep Dive.

In all, the Impact Matrix contains 46 of the most commonly used business metrics – with an emphasis on sales and marketing. The metrics span digital, television, retail stores, billboards, and any other presence of a brand you can think of. You see more digital metrics because digital is more measurable.

Some metrics apply across all channels, like Awareness, Consideration and Purchase Intent. You’ll note the most critical bottom line metrics, which might come from your ERP and CRM systems, are also included.

Every metric occupies a place based on business impact and time of course, but also in context of other metrics around it.

Here’s a magnified view that includes the bottom left portion of the matrix:

Let’s continue to internalize impact and time-to-useful by looking at a specific example: Bounce Rate. It’s in the row indicating an impact of four, and in the Weekly time-to-useful column. While Bounce Rate is available in real-time, it is only useful after you’ve collected a critical amount of data (say, over a week).

On the surface, it might seem odd that a simple metric like Bounce Rate has an impact of four while TV GRPs and % New Visits are lower. My reason for that is the broader influence of Bounce Rates.

Effectively analyzing and acting on Bounce Rates requires the following:

* A deep understanding of owned, earned and paid media strategies.

* The ability to identify any empty promises made to the users who are bouncing.

* Knowing the content, including its emotional and functional value.

* The ability to optimize landing pages.

Imagine the impact of those insights; it is well beyond Bounce Rates. That is why Bounce Rate garners more weight than Impressions, Awareness and other common metrics.

When designating a metric as a KPI, this is your foremost consideration: depth of influence.

With a better understanding of the Impact Matrix, here’s the full version:

[You can download an Excel version of the Impact Matrix at the end of this post.]

As you reflect on the filled out matrix, you’ll note that I’ve layered in subtle incentives.

For example, if you were to compute anything Per Human, you would need to completely revamp your identity platforms (a strategy I’ve always favored: Implications Of Identity Systems On Incentives). Why should you make this extra effort? Notice how high those metrics sit on the business impact scale!

Other hidden features.

The value of voice of customer metrics is evident by their high placement in context of the y-axis. Take a look at where Task Completion Rate by Primary Purpose and Likelihood to Recommend are, as an example. They are high in the hierarchy due to their positive impact on both the business and the company culture – thus delivering a soft and hard advantage.

You’ll also note that most pure digital metrics – Adobe, Google Analytics – sit in the tactical range of bottom line impact. If all you do day and night is just those metrics, this is a wake-up call in context of your actual impact on the company, and the impact of that on your career.

At the top-right, you’ll discover my obsession with Profit and Incrementality, which form the basis of competitive advantage in 2018 (and beyond). Analyzing these metrics not only fundamentally changes marketing strategy (think tens of millions of dollars for large companies); their insights can change your company’s product portfolio, your customer engagement strategies and much more.

The matrix also includes what is likely the world’s first widely available machine learning-powered metric: Session Quality, which you’ll find roughly in the middle. For every session on your desktop or mobile site, Session Quality provides a score between 1 and 100 as an indication of how close the visitor is to converting. The number is computed based on a ML algorithm that has learned from deep analysis of your user behavior and conversion data.

Pause. Download the full resolution version of the picture. Reflect.

It is my hope that the placement of each of the 46 metrics will help you add metrics that might be unique to your work. (Share them in comments below, add to our collective knowledge.)

With a better understanding of the matrix, you are ready to overcome the two problems that broke our hearts at the start of the post – and do something super-cool that you did not think you could.

Action #1: Analytics Program Maturity Diagnostic.

Enough theory, time for some real, sexy work.

The core driver behind creation of the Impact Matrix was the non-obvious problem #2: How much does your analytics practice matter from a bottom line perspective?

YOU matter if you have a business impact. You’ll have a business impact if your analytics practice is sophisticated enough to produce metrics that matter. See the nice circular reference?

:)

In our case we measure maturity not by evaluating people, process, and layers upon layers of tools; rather, we measure maturity by evaluating the output of that entire song and dance.

Answer this simple question: What metrics are most commonly used to make decisions that drive actual actions every week/month/more?

Ignore the metrics produced as an experimental exercise nine months ago. Ignore the metrics whose only purpose is to float along the river of data pukes. Ignore the metrics you wish you were analyzing, but don’t currently.

Reality. Assess reality. No point in fooling yourself.

Take the subset of metrics that actively drive action, and change the font color for them to green in the Impact Matrix.
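As a rough sketch of this exercise, here is how the matrix and the “green” subset might look as data. The metric names, impact numbers and time buckets below are a tiny illustrative slice, not the full 46-metric matrix:

```python
# A tiny, illustrative slice of the Impact Matrix: each metric has a
# business-impact score (y-axis) and a time-to-useful bucket (x-axis).
# "active" marks the metrics that actually drive decisions today --
# the ones you would color green.
matrix = [
    {"name": "Impressions",    "impact": 1,  "time": "real-time", "active": True},
    {"name": "Bounce Rate",    "impact": 4,  "time": "weekly",    "active": True},
    {"name": "Gross Profit",   "impact": 40, "time": "monthly",   "active": False},
    {"name": "Lifetime Value", "impact": 68, "time": "6-monthly", "active": False},
]

# The honest self-reflection: which metrics are green, and how high
# up the impact scale does your practice actually reach?
green = [m for m in matrix if m["active"]]
max_active_impact = max(m["impact"] for m in green)

print([m["name"] for m in green])  # the metrics that drive action today
print(max_active_impact)           # the ceiling of your current impact
```

If your `max_active_impact` is stuck in the single digits, that is the wake-up call the matrix is designed to deliver.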

For a large European client with a multi-channel existence, here’s what the Impact Matrix looked like after this honest self-reflection:

More of the digital metrics are green because there are more digital metrics, period. You can see that the company’s marketing strategy spans television and other offline advertising, including retail.

You’ll likely recognize many of these metrics as the ones that your analytics practice outputs every day. They represent the result of a lot of hard work by the company’s employees and external analytics partners.

We are trying to answer the how-much-does-the-analytics-practice-matter question. You can see that more sharply now.

For this company most green metrics cluster in the bottom-left quadrant, with most having an impact of ten or under on the y-axis impact scale. There is one clear outlier (Nonline Direct Revenue – a very difficult metric to compute, so hurray!).

As every good consultant knows, if you have a 2×2 you can create four thematic quadrants. In our case the quadrants include Solid Foundation, Intermediate, and Advanced:

For our company, the maturity of the analytics practice fit mostly in the Solid Foundation quadrant.

Is this a good thing?

It depends on how long the analytics practice has been around, how many Analysts the company has, how much money it has invested in analytics tools, the size of their agency analytics team, so on and so forth.

If they have a team of 50 people spending $18 mil on analytics investment each year, over the last decade, with 12 tools and 25 research studies each year… You can now infer that this is not a good thing.

Regardless, the Impact Matrix now illuminates clearly that highly influential metrics are underutilized. These are the metrics that, with deeper thought, patience and analysis, deliver big bottom line impact.

Recommendation Uno:

Conduct this exercise for your own company. Identify the metrics actively being used for decision-making. Which quadrant reflects the maturity of your analytics program? With the investment in people, process, tools, and consultants, are you in a quadrant where your bottom line impact is super strategic?

Recommendation Dos:

Identify your target quadrant. In this instance the company could move bottom-right and then up. They could also move top-left and then top-right. The choice depends on business strategy and the current people, process, tools reality. The end goal should be obvious: you ultimately want the Advanced quadrant lit up. But you can’t go from Beginner to Advanced directly – evolution works smarter than revolution. (If your Solid Foundation quadrant is not lit up, do that first.)

Recommendation Tres:

Create a specific plan for the initiatives you need to undertake to get to your next desired quadrant. You’ll certainly need new talent, you’ll need a stronger strategic leader (less ink, more think), you’ll need to identify specific analytics projects to deliver those metrics, and you’ll most definitely need funding. Divide the plan into six-month segments with milestones for accountability.

The good news is that it is now, finally, clear where you are going AND why you are going there. Congratulations!

Recommendation Cuatro:

Start a cultural shift. Share the results of your assessment, the green and black reflection of the current reality, with the entire company. Inspire each Marketer, Finance Analyst, Logistics Support Staff, Call Center Manager, and every VP to move one step up or one step to the right. If they currently measure AVOC, challenge them to move to Unique Page Views or Click-thru Rate. It will be a small challenge, but it will improve sophistication and, as you can see in the matrix, the impact on the bottom line.

Most companies wait for some Jesus-Krishna hybrid to descend from heaven and deliver a glorious massive revolution project (overnight!). These never happen. Sorry, Jesus-Krishna. Instead, what it takes is each employee moving a little bit up and a little bit to the right while the Analytics team facilitates those shifts. Small changes accumulate big bottom line impact over time.

So. What’s your quadrant? And, what’s your next right or next up move?

Action #2: Aligning Metrics & Leadership Altitude.

When offered data, everyone wants everything.

People commonly believe that more data means better results. Or that if an Agency provides a 40-tab, font-size-8 spreadsheet full of numbers, they must have done a lot of work – hence better value for money. Or a VP wants two more histograms representing seven dimensions squeezed into her one-page dashboard.

If more data equaled smarter decisions, there would be peace on earth.

A core part of our job, as owners of the analytics practice, is to ensure that the right data (metric) reaches the right person at the right time.

To do so, we must consider altitude (aka the y-axis).

Altitude dictates the scope and significance of decisions. It also dictates the frequency at which data is received, along with the depth of insights that need to accompany the data (IABI FTW!). Finally, altitude determines the amount of time allotted to discuss findings.

Managers have a lower altitude; they are required to make tactical decisions – impacting, say, tens of thousands of dollars. VPs have a higher altitude; they are paid a ton more in salary, bonus and stock because they carry the responsibility for making super strategic decisions – impacting tens of millions of dollars.

This problem has a beautifully elegant solution if you use the Impact Matrix.

Slice the matrix horizontally to ensure that the metrics delivered to each leader are aligned with their altitude.
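As a minimal sketch of that horizontal slicing, one could route metrics by impact score. The threshold values and metric scores below are illustrative assumptions, not values prescribed by the matrix:

```python
# Horizontal slicing: route each metric to the leadership layer whose
# altitude matches its business impact. Thresholds are hypothetical,
# loosely mirroring the ~40-and-higher Super Strategic band.
def altitude(impact):
    if impact >= 40:
        return "VP"          # Super Strategic decisions
    if impact >= 10:
        return "Director"
    if impact >= 4:
        return "Manager"
    return "Automate"        # Super Tactical: no human needed

# Illustrative impact scores for a handful of metrics.
metrics = {"Impressions": 1, "Bounce Rate": 4,
           "Gross Profit": 40, "Lifetime Value": 68}

routing = {name: altitude(score) for name, score in metrics.items()}
print(routing)
```

The point of the exercise is the routing itself: if the metrics landing on the "VP" slice are not the ones on your actual VP dashboards, you have found your misalignment.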

[You can download an Excel version of the Impact Matrix at the end of this post.]

VPs sit at a decision-making altitude that is squarely in the Super Strategic realm – on our scale, ~40 and higher. This collection of metrics powers heavy decisions that require abundant business context and deep thinking, and that will influence broad change. Analysts will need time to conduct proper analysis and obtain the IABI.

You can also see that nearly all metrics delivered to the VPs arrive monthly or even less frequently. Another reflection of the fact that their altitude requires solving problems that will connect across orgs, across incentives, across user touch points, etc.

So. Are the metrics on your VP Dashboards/Slides the ones in Super Strategic cluster?

Or. Is your analytics practice such that your VPs spend their time making tactical decisions?

Below the VP layer, you’ll see metric clusters for slightly less strategic impact on the company bottom line for Directors. The time-to-useful also changes on the x-axis for them. Following them is the layer for managers, who make even more frequent, tactical decisions.

The last layer is my favorite way to improve decision making: Removing humans from the process. :)

Recent technical advancements allow us to use algorithms – machine learning – to automate decisions driven by metrics that have a Super Tactical impact. For example, there is no need for any human to review Viewability because advanced display platforms optimize campaigns automatically against this metric. In fact, an expensive human looking at reports with that metric will only slow things down – eliminating the fractions-of-a-penny impact that the metric delivers.

Recommendation Cinco:

Collect the dashboards and main reports created by your analytics practice. Cluster them by altitude (VP, Directors…). Identify if the metrics being reported to each leadership layer are the ones being recommended by the Impact Matrix.

For example: Does your last CMO report include Profit per Human, Incremental Profit per Non-line Channel, % Contribution of Non-line Channels to Sales? If yes, hurray! Instead, if they are reporting Awareness, Consideration, Intent, Conversions, Bounce Rate… Sad time. Why would your CMO use his or her valuable time making tactical choices? Is it a culture problem? Is it a reflection of the lack of analytical savvy? Why?

Put simply, the big and complicated is not so big and not so complicated. This simple analysis will help identify core issues that are stymieing the contribution data can make to smarter, faster, business success.

Recommendation Seis:

Kick off a specific initiative to tackle automation. If data is available in real-time and useful in real-time, there are algorithms out there that can make decisions for humans. If there is a limitation, it is all yours (people, bureaucracy, connection points, etc.).

For the other layers, action will depend on what the problem is. It could require new leadership in the analytics team, it could require a shift in company culture, or it could require a different engagement model across layers (managers, directors, VPs). One thing adjusting the altitude will certainly require: Change in how employees are compensated.

As you notice above, the strength of the matrix is in its ability to simplify complexity. That does not mean that you don’t have to deal with complexity – but you can be more focused about it now!

Action #3: Strategic Alignment of Analytical Effort.

One more slicing exercise for our matrix, this time for the analytics team itself.

Analytics teams face a daunting challenge when figuring out what type of effort to put into tackling the fantastic collection of possibilities represented in the Impact Matrix.

That challenge is compounded by the fact that there is always too much to do and too few people to do it with. Oh, and don’t get me started on time! Why are there only 24 hours in a day??

So, how do we ensure that each type of metric gets an optimal analytical approach?

Slice the matrix vertically along the time-to-useful dimension…

[You can download an Excel version of the Impact Matrix at the end of this post.]

For any metric that is useful in real-time, we have to unpack the forces of automation. Campaigns can be optimized based on real-time impressions, clicks, visits, page views, cost per acquisition etc. We need to stop reporting these, and start feeding them into our campaign platforms like AdWords, DoubleClick etc. With simple rules – ranges mostly – automation platforms can do a better job of taking action than humans.
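The “simple rules – ranges mostly” idea can be sketched in a few lines. The metric (cost per acquisition), the target range, and the bid adjustments below are all hypothetical; real platforms like AdWords apply far richer logic:

```python
# A range-based automation rule, the kind a campaign platform can apply
# to a real-time metric without any human in the loop. Target range and
# adjustment factors are made-up illustrations.
def adjust_bid(current_bid, cpa, target_range=(8.0, 12.0)):
    """Nudge the bid so cost-per-acquisition stays inside the target range."""
    low, high = target_range
    if cpa > high:                          # acquisitions too expensive: bid down
        return round(current_bid * 0.9, 2)
    if cpa < low:                           # cheap acquisitions: bid up for volume
        return round(current_bid * 1.1, 2)
    return current_bid                      # inside the range: leave it alone

print(adjust_bid(1.00, 15.0))  # CPA too high -> lower bid
print(adjust_bid(1.00, 5.0))   # CPA low -> raise bid
print(adjust_bid(1.00, 10.0))  # in range -> unchanged
```

Even a dumb rule like this, executed every few minutes, beats a human reviewing a report once a week – which is exactly why these metrics belong in the Automate slice.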

If you are investing in machine learning talent inside your team, even the narrowly intelligent algorithms they build will learn faster and quickly surpass humans for these simple decisions.

With the day-to-day sucking of life spirit reduced, tactical impact decisions automated, the analytics practice has time to focus on metrics that have a longer time-to-useful and need deeper human analysis to extract the IABI.

For metrics available weekly or within a few weeks, reporting to various stakeholders (mostly Managers and Directors) should adequately inform decisions. Use custom alerts, trigger threshold targets and more to send this data to the right person at the right time. For weekly time-to-useful metrics, your stakeholders have enough tactical context that you don’t need to spend time on deep analysis; these metrics inform tactical decisions.

More role clarity, a thoughtful shift of the burden to the stakeholders, and more free time to focus on what really matters.

Where time-to-useful is in the monthly range, you are truly heading into strategic territory. Reflect on the metrics there – challenging, strategic, Director and VP altitude. It is no longer enough to just report what happened; you need to identify why it happened and the causal impact of those why factors. This will yield insights with millions of dollars of potential impact on the company. That means you’ll need to invest in ensuring your stories include more than just insights: specific recommended actions and predicted business impact, too. Amazingly, you’ll have just as much text as data in your output (that’s how you know you are doing it right!).

Finally, we have the pinnacle of analytics achievement. Our last vertical slice includes metrics that measure performance across customer segments, divisions and channels, among other elements. This is where meta-analysis comes into play, requiring even more time and even more complex analytical techniques: pull data into BigQuery or similar environments where you can do your own joins, unleash R, and use statistical modeling techniques (hello random forests!) to find the most important factors affecting your company’s performance.

The distribution of your analytical team’s effort across these four categories is another method of assessing maturity as well as ensuring optimal impact by the precious few analytical resources. For example: If most of your time is occupied by providing data to decision-makers for metrics in the Automate and Reporting vertical slices, you are likely in the beginner stage (and not having much impact on the business bottom line).

Recommendation Siete:

Find an empty conference room. Project all the work your team has delivered in the last 30 days on the screen. Cluster it by Automated, Reporting, Analysis and Meta-Analysis. Roughly compute what percentage of the team’s time was spent in each category. What do you see? Is the distribution optimal? And, are the metrics in each cluster the ones identified by the Impact Matrix?
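The rough computation in that conference room might look like this. The hours per category are invented for illustration; plug in your own team's numbers:

```python
# Recommendation Siete as a back-of-the-envelope computation: cluster
# the last 30 days of deliverables by category, then compute the share
# of team time spent in each. The hours below are hypothetical.
hours = {
    "Automated":     10,
    "Reporting":     60,
    "Analysis":      25,
    "Meta-Analysis":  5,
}

total = sum(hours.values())
share = {cat: round(100 * h / total) for cat, h in hours.items()}
print(share)
```

In this made-up example the team spends 60% of its time on Reporting and only 5% on Meta-Analysis – per the post, a signal of a beginner-stage practice with limited bottom-line impact.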

The answers to these questions will cause a fundamental re-imagination of your analytics practices. The implications will be deep and wide (people, process, tools). That is how you get on the road to true nirvanaland.

#sisepuede

At the core of the Impact Matrix is the only thing that matters: the business bottom line. Using two simple dimensions, impact and time-to-useful, you can simply explain three unique elements of any successful analytics practice. The reflections are sometimes painful, but bringing them to light allows us to take steps toward systematic improvement of our analytical practice.

When your CMO asks, “How effective is our analytics strategy?”, what’s your answer? How simply can you frame it? What are the primary inputs to your near and long-term analytics evolution plans? If your VPs are getting the metrics in the Advanced quadrant, what strategies have been effective in getting you there? If you’ve successfully implemented pattern matching and advanced classification meta-analysis techniques, care to share your lessons with us?

Please share your feedback about the Impact Matrix, and answers to the above questions, via comments below. I look forward to the conversation.

I'm betraying my nerdery here, but after the intro to the Matrix my pulse quickened. I could not wait to read the unpacking of the next element! Yes, even I can't believe I'm saying that about an article on analytics frameworks.

The x and y-axis are such simple manifestations of what really matters. Yet, the reason this article is four thousand words is that the simple somehow captures all the complexity.

My favourite bits were the vertical and horizontal slicing. I colored our most used metrics in green, and the vertical slice paints an unforgiving picture for us. A lot of work to do.

Sage advice as always, my good sir. I can't argue with you nor do I think you'll find anyone here disagreeing too much.

That said, I've seen entire boards get swayed and make the right decision for the wrong reason. For example, let's say a company is obsessed with one vanity metric that goes up in a time period at the same time that an underappreciated, yet truly important KPI improves. What do you say here? Anything?

I feel like I've made huge mistakes by pointing out that some numbers people care about don't really matter and they should be impressed by something else instead. In fact, I'm finding that it's the easiest way to lose a deal in sales so I've stopped doing it as much :)

Does it ever pay to let people make good decisions based on bad data or does it typically come back to bite you at some point? These situations are always awkward for me. What do you do?

When I have an opportunity to sell an idea, I work hard to quantify the impact of that good idea. Ex: I love the % Contribution to Nonline Sales as a metric. It is very expensive to produce (via MMMs). I need a leader to give me money to do it. I lead with the benefit – by how much can I improve the bottom line (save money or increase revenue). Then, if the leader loves something else, it at least encourages them to try to quantify the impact of what they love. Most of the time they can't, and we move with the quantified idea.

Every time I feel like I'm catching up to you, you move the goal further out, Avinash.

It was ABO, but you created something better with DMMM. We had that going and you created See-Think-Do-Care. We had good success with that and were feeling settled but you have once again dropped something smarter with the Impact Matrix!

Thank you for constantly pushing the industry thinking up and to the right.

To add to the matrix, I find myself fixating on getting the most efficient CPM for my display advertising regardless of the quality of the placement, or the quality of any on-site sessions. While I would place CPM as a super tactical metric between real-time and monthly (and it is absolutely something I automate on certain platforms), I still fixate on this metric because the websites for the accounts I manage are of poor quality. By poor quality, I mean the websites have slow loading times, no conversions or micro-conversions, and the information to gain from the sites is pretty minimal.

In my situation, page depth is a fairly irrelevant metric because I can pick and choose the most relevant page to use as a landing page for any ad type (these websites have roughly 6 pages that are relevant to consumers, not including social links to Facebook, Instagram, etc.). This takes away a lot of meaning from a ton of on-site metrics like bounce rate and page value, and makes it extremely hard to accurately measure other useful metrics like Session Quality (which these sites cannot meet the requirements for) and consideration level.

Unfortunately, I have no control over the layout, complexity, places to put specific tracking code, or goal paths of these websites. Also, I'm not aware of a good way to unify offline success with on-site metrics. With the technology I use, I'm able to get much more information out of ad platforms than the website, but I'd like to make more use out of analyzing website data.

Avinash, do you or any of the other great minds here have recommendations for moving up and right just within the lower left quadrant for marketers with incredibly basic websites that represent local brick-and-mortar stores?

Steve: It feels like we should have a longer conversation, I sense there is a lot more in your story that I don't understand. Let me still try to add some value.

CPMs are a super tactical metric. And most modern platforms where you buy display ads can take into account both the CPM and Outcomes (macro conversions at least) to optimize your placements. They do so using Machine Learning – algorithms that take into account hundreds of thousands (not a metaphor) of variables to optimize for the highest value for you. This means you don't have to worry about your placement websites being poor – the platform will take that into account.

Page Depth is less a metric related to landing pages, and more a reflection of the value delivered to the user in their experience on your site.

Overall, if all you have to go on is the ad platform – regardless of whether it is AdWords or your friendly small ad network – it will always impose a limit on the insights you can get. It will be almost impossible to get out of the Solid Foundation quadrant (or even cover it completely). This will also limit the consulting rate you can charge or the salary increases you can get.

My recommendation: Hand off as much as you can to automation, and try to move one metric up or one metric to the right of where you are today. Take some of the time this progress delivers to reflect on whether you would like to consider other roles where you can use the full power of your brilliance.

Frameworks thus far, including mine linked at the top of this post, obsess about activity. They were simply too self-obsessive: look at all dimensions of analytics activity, and take for granted that movement in activity equates to business outcomes. That was rarely true.

The Impact Matrix does not care about all the analytics activity. It cares about the outcome of that activity, and what the expected size of impact is from the outcomes. The change of perspective, looking at the business instead of the mirror, delivers a radically different analytics strategy.

There is one more step still left: Figuring out how to make that new strategy come to life. In this, self-reflective analytics frameworks and maturity models still play a role.

I absolutely agree – the Impact Matrix and the notion of maturity go hand in hand. The idea of "time to impact" in your matrix is reflected in the fact that growing one maturity level typically takes a year (in my model at least, and this is because humans are involved, and humans need time to truly embrace a data-informed culture!). Obsessing over objectives and delivering KPIs is a recipe for failure if those efforts aren't aligned with other key process areas (i.e. the 5 dimensions of my model) – which is why "balancing" the spider chart is also super important.

Note that initially my model had 6 key process areas (Objectives and Scope were distinct). After regrouping those two into one, I realized that a balanced score draws a star on the spider chart.

So if analysts want to become superstars, they should embrace your matrix and the DAMM :)

P.S. Honestly, I haven't worked on the DAMM in a while – but I frequently receive feedback about how relevant and useful it is, even though it was conceived a decade ago!

I've been a long-time fan of Occam's Razor. Through that lens, what stood out to me is the evolution of the recommendations here.

In your early posts, the solution to the CMO question was to use better metrics, or smarter reports. Then you recommended more types of data with Web Analytics 2.0. The Digital Marketing & Measurement Model emphasized the point of view that data was not enough; management and strategy were key connections for analysts. With the Impact Matrix you are pushing us to completely absorb the management lens by ignoring reports and tools in that discussion.

On one hand, it is humbling for this long-time analytics practitioner to see that exposing tools and governance and analytics detail is not effective in improving management's ability to discern our impact. On the other hand, as Steven mentions, I am glad you are helping us evolve so that we are not stuck in old beliefs.

I can't say. It was a proprietary metric I'd created for just one client. I can say it measures something incredibly valuable that has a third order impact – over months and months – related to the company's fundamental existence. Hence, super valuable.

– What was the reasoning behind labeling (brand) awareness as relatively tactical? Most (How Brands Grow) brand marketers see this as the holy grail, and they might agree less with the overview because of this, making it harder to commit to the exercise?

– I personally like the metric 'share of branded queries within a category' as a KPI for the See stage, as it reflects actual behavior. Would this be a valuable addition to the overview?

Hi Avinash, congrats on this DAS framework – it makes a lot of sense because it is pretty actionable and challenges analysts to go further in their mission to nurture a better analytical culture inside organisations, which I believe still remains the most sceptically viewed and underestimated outcome.

From my point of view, "time-to-useful" is the main concept here in terms of "design thinking" – it should be demanding enough to draw the most out of the whole approach.

I really try to implement a recommendation from each read and strive for incremental improvement. My analytics gets better every day.

I still struggle to convince leadership of the importance of your recommendations. I'm pretty excited to implement the Impact Matrix with my newest client. They are low on the analytics maturity scale, so there should be plenty of opportunity to implement these ideas.

Charles: First, I want to stress that it is hard to influence leadership. Over time I've found that I have to walk away from "data this and data that" and re-frame my perspective to be from their view. This sounds simple, but it takes work. I do two things: 1. What three things are they really trying to get done for the business? They, not me (so many times I would take the business in a totally different direction, but that is initially irrelevant). 2. What can I do for each of those three things to help achieve them 50% faster?

This investigation (#1) and these solutions (#2) will get them to stop and listen. For each solution I create, I also build a quick and dirty model: "Charles, to get to goal #2 50% faster, you would have to do xyz with data, requiring abc investment, and my model predicts that in addition to that 50% speed, you will also make $4.7 million in revenue." If I can get to that, it changes minds. Even stubborn ones (it turns out people love money more than their opinions, beyond a certain point!).
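A "quick and dirty" model like the one described can be just a few lines of arithmetic. Here is a minimal sketch; the function name, the numbers, and the simple linear uplift assumption are all hypothetical illustrations, not the actual model from the story:

```python
# Hypothetical "what-if" model: if better data helps us reach a business
# goal X% faster, how much incremental revenue might that translate into?

def whatif_revenue(baseline_revenue, speedup, capture_rate):
    """Project incremental revenue from achieving a goal faster.

    baseline_revenue: annual revenue tied to the goal
    speedup: fraction of time saved (e.g. 0.5 for 50% faster)
    capture_rate: assumed fraction of the accelerated value
                  that actually lands as incremental revenue
    """
    return baseline_revenue * speedup * capture_rate

# Example: a $25M goal, delivered 50% faster, with an assumed
# 37.6% capture rate.
incremental = whatif_revenue(25_000_000, 0.5, 0.376)
print(f"Projected incremental revenue: ${incremental:,.0f}")
# → Projected incremental revenue: $4,700,000
```

The point is not the precision of the model; it is that attaching even a rough dollar figure to "abc investment" reframes the conversation from analytics activity to business outcomes.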

I love to stand on the shoulders of giants, so I would add another tab: technology investment. I have a horrible feeling that for many organisations the total tech spend (vendor cost, deployment and support) would be much greater in the bottom left than in the top right.

It should be possible in most organisations to get the system costs and make this quite accurate. Again I think it would help in achieving focus on what really matters.

It would be another way of crystallizing the reason why someone might be in a quadrant. Additionally, I can also imagine it playing a role in ensuring future promises are not incongruous with investment decisions.

Sorry to rain on the parade of admiration here, but I see a big fail in your model (sorry, I really am a fan of your analysis for the most part, and your book sent me down the analytics road a few years ago). Here is my issue: take the screenshot of your filled-out X-Y grid. Now blur it so that you don't see any words, just placements. What do you see? A line. The data points are strongly correlated, collecting along a straight line. When I see two items that strongly correlated, I ask myself why I am collecting and plotting both, when they are essentially saying the same thing. You talk all the time about data pukes, and the futility and confusion that result from too much unnecessary information. When I look at what you have produced, what it tells me is that there is a strong correlation between impact and timeline (something I might have some other squabbles with). But only one axis gives any useful information – the second axis is redundant.

Jay: I'm one of those people who believe that unless there is some rain on every parade, it is hard to truly appreciate how lovely it is to have a parade. :)

I welcome your rain. I am a tiny bit confused by your comment; if what I say below does not address your feedback, please let me know.

The placements on the matrix are not data points; rather, they are metrics. Hence there is no intent on my part for you to draw a straight line through them, as if they were data points, to see if there are correlations – implied or otherwise. Though, when we do have data points – say, Conversion Rates from Offline Campaigns – that is exactly what we would do.

Here's what you should see: Metric A is at position x5, y2. That illuminates how long it takes before the metric becomes useful, even if it is available in real time (x5), and how impactful it will be on the bottom line (y2). Then consider Metric B: look at its position, and internalize how much more or less value it will bring to the business.
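The comparison described above can be pictured as nothing more than each metric holding an (x, y) coordinate. A tiny sketch of that idea; the metric names and placements below are hypothetical illustrations, not taken from the actual spreadsheet:

```python
# Each metric lives at an (x, y) position on the Impact Matrix:
#   x = time-to-useful (1 = real time ... 5 = months)
#   y = business impact (1 = super tactical ... 5 = super strategic)
# Placements here are made up for illustration.
impact_matrix = {
    "Bounce Rate":     (1, 1),  # useful immediately, tactical
    "Conversion Rate": (2, 3),
    "Lifetime Value":  (5, 5),  # months to become useful, huge impact
}

def more_impactful(metric_a, metric_b):
    """Return whichever metric has the higher business impact (y)."""
    return max(metric_a, metric_b, key=lambda m: impact_matrix[m][1])

print(more_impactful("Bounce Rate", "Lifetime Value"))  # → Lifetime Value
```

Two metrics at the same y but different x are not "saying the same thing": they promise the same impact, but one will take far longer (and be far harder) to obtain.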

The time-to-useful dimension is there to represent what it takes for a metric to become significant enough to use and, as you move to the right, how freaking hard it is to get. I believe it is critical to get analysts and business leaders to understand this. Else (to pick just one use case) people run around promising real-time campaign optimization using real-time lifetime value. A silly proposition.

I hope this helps. Delighted to receive more rain, honestly, and to try again.

It would be really beneficial for me if you could explain the metrics in the spreadsheet. For example, what is "Task Completion Rate (by PP)" or "profit per human"? No doubt I am fairly new to the analytics world, just starting my career.

Even more, if you could add some note why it is there (your explanation for the Bounce Rate is completely stunning), that would be breathtaking.

And again – thank you a lot, even if you just leave the article and the spreadsheet as it is.

After a month of rolling out the Impact Matrix in our org, it has been amazing to see the impact.

Our first read did not paint a great picture of our analytics impact. We were mostly clustered in the bottom left. I was worried about the reaction of our senior management. Surprisingly, they were excited about the reality check. I think your stress on who sees what data was helpful, as they realized there was a different way than feeding them tactical metrics.

We've created a 180 day plan to move some of our key tactical metrics one step up and one step to the right per your guidance.

As you say, it is marketing heavy, so I was wondering where you would place some of the more PR-focused metrics?

For instance, positive organic coverage in the media (in this case, the media is the audience activity we are trying to capture; are they responding to our efforts by writing about us), and organic coverage leading to consideration (here we would consider whether the readers of said coverage visited the campaign website or read other related materials – data you can obtain with, for example, Trendkite PT attribution tool).

Can these two metrics that refer to two different audiences (the media versus the public) both live in the same matrix?

This is an exceptional simplification of a space that often seems chaotic and confusing. It is very clear where each piece of the puzzle fits. Beyond that it is now easy to see the disconnect between where we are and where we need to go.

We are on the bottom left of the matrix. This is a sad realization. You've helped us understand that our overwhelming focus on activity has not set us up for success.

In a world where a video is marked as viewed when the first three seconds are completed (!!), AOVC measures the number of video ad impressions where the ad was visible and audible all the way through completion of the video.

I was wondering if you could elaborate a little bit more on how this framework works together with your other models – more specifically, the Digital Marketing & Measurement Model and the ABO model.
E.g., for a company just starting to get serious with analytics, would you recommend starting with the Digital Marketing & Measurement Model and then adjusting for where the identified metrics and KPIs sit on the matrix above?

Martin: I'm sorry for the confusion. Consider it the result of my personal evolution as I learn and grow more. :)

Acquisition, Behavior, Outcomes is a framework you should apply to individual reports and dashboards. It helps ensure that you are showing the full picture that will allow your leaders to make the smartest decisions.

Digital Marketing & Measurement Model is a good framework to identify the critical few metrics the business should focus on. The ABO mindset is included in there. I still think of the DMMM as the best way to get to simplicity.

The Impact Matrix (this post) is best for larger companies that are thinking about what they are measuring and how their broad measurement structures are organized; it gives shape and focus to their analytics strategy. Hence, it solves a bigger and more complex collection of problems (ones that larger companies face).

Thanks a lot for this valuable article. This matrix raises a huge question for me. What would be your advice for computing some of the hardest metrics in the Impact Matrix (Q/U, incremental profit per channel, incremental revenue via testing…)? I mean, where to start? From my experience it is really hard to gather the data to compute those advanced KPIs. Do you have any learnings to share? Where do you usually start when you convince your clients to compute those KPIs?

Finally, do you have a clear definition (or some links with in-depth explanations) of those metrics that may turn them into more easily understandable KPIs for top management? I know that if today I knock on the door of VPs talking about Q/U, incremental profit per channel, incremental revenue via testing… everyone will be lost.

Thomas: The most important thing to remember is that you want to do the Advanced Metrics because of the competitive advantage they'll build for you. They are hard to measure, hence most won't. :)

The start is always the same, as you suggest: collecting data. In my case, it has been a combination of having technical savvy on the team and chocolate for the IT teams who will give us access to the data (and don't forget chocolate for the legal team, who will also need to rightly review your request to ensure compliance).

You convince clients by showing them "what-if" scenarios of what you can compute.

Q/U is a proprietary metric. But the others need very little explanation. Profit per Channel is the profit delivered by every marketing channel (ensure you are including total cost of ownership, as outlined in this post on How to Measure the Super Bowl). Same for Revenue via Testing, etc.
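Once the total cost of ownership is gathered, Profit per Channel is simple arithmetic. A minimal sketch; the channel names, cost categories, and numbers below are made up purely for illustration:

```python
# Profit per Channel = revenue attributed to a channel minus its *total*
# cost of ownership (media spend + people + tools/agency fees, etc.),
# not just the media spend alone.

def profit_per_channel(revenue, media_spend, people_cost, tool_cost):
    """Profit delivered by one marketing channel, net of total cost of ownership."""
    return revenue - (media_spend + people_cost + tool_cost)

channels = {
    # channel: (revenue, media spend, people cost, tools/agency cost)
    "Paid Search": (900_000, 400_000, 120_000, 30_000),
    "Email":       (350_000,  20_000,  90_000, 15_000),
}

for name, figures in channels.items():
    print(f"{name}: ${profit_per_channel(*figures):,.0f}")
```

The hard part, as noted above, is not the formula; it is getting access to the cost and revenue data in the first place.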

If you are investing in machine learning talent inside your team, even the narrowly intelligent algorithms they build will learn faster and quickly surpass humans for these simple decisions.