Seven Steps to Creating a Data Driven Decision Making Culture.

The title of my presentation at the Washington DC Emetrics summit was: Creating a Data Driven Web Decision Making Culture – Lessons, Tips, Insights from a Practitioner.

My hope was to share tips and insights that might help companies move from just having lots and lots of data to creating cultures where decisions are made not on gut-feel, or the proverbial seat of the pants, but rather based on data.

In this post I hope to share the essence of some of the main ideas communicated in the speech. The format is: words from the slide followed by a short narrative on the core message of the slide. Hope you find it useful.

State of the Union….

* Time to implementation: five minutes
* Tools are just that, sadly
* Humans love gut (literally in some cases :))
* Math is hard

Core Message: The biggest challenge in our current environment is that it is trivial to implement a tool; it takes five minutes. But tools are limiting and can only give us data. What compounds the challenge is that we all have a deep tendency to make decisions that come from who we are, shaped by our life experiences.

Based on my humble experience of the last few years here are seven common sense recommendations for creating a data driven company culture……

Core Message: The most common mistake in web analytics is to slap a clickstream tool (Omniture, WebTrends, HBX / WebSideStory, CoreMetrics etc.) on the website and start sending out reports chock-full of clickstream KPIs. That works for a couple of months, and then you lose the audience. Sit down with your core audience and figure out what motivates them and how their personal salary / bonus is determined. Start with measuring these Outcomes metrics (revenue, leads, profit margins, improved product mix, number of new customers etc.).

Once your audience figures out that you exist to make them successful (and not to spam them with reports) they will be your eternal friends, and over time you can slowly start to help them evolve from Outcomes to some pretty complex clickstream analysis and KPIs.

Core Message: There is a lot of confusion between what is reporting and what is analysis. Analysis in our world is hard to do: data, data everywhere, and nary an insight anywhere. Reporting is going into your favorite tool and creating a bazillion reports in the hope that one of them will spark action in you or your users. That is rarely the case.

An additional challenge is that both reporting and analysis can take over your life; you will have to make an explicit choice about what you want to spend time on. Remember: if at the end of x hours of work your table / graph / report is not screaming out the action you need to take, then you are doing reporting and not analysis.

# 5 Depersonalize decision making

* "HiPPOs" rule the business world: Highest Paid Person's Opinion
* It is never about you, it can't be about you
  * Benchmarking is awesome
  * Leverage competitive analysis
  * Experimentation and testing rocks
* Execution strategy:
  * Transparency, standardization, looking outside in
  * Be a slave to customer centricity: it's about your customers (internal & external)

Core Message: I can't say it any better: HiPPOs rule the world. They overrule your data, they impose their opinions on you and your company's customers, they think they know best (sometimes they do), and their mere presence in a meeting prevents ideas from coming up. The solution to this problem is to depersonalize decision making; simply don't make it about you or what you think. Go outside, get context from other places. Include external or internal benchmarks in your analysis. Get competitive data (we are at x% of zz metric and our competition is at x+9% of zz metric).

Be incessantly focused on your company's customers and drag their voice to the table (for example via experimentation and testing, or via open-ended survey questions). Very few people, HiPPOs included, can argue with a customer's voice; the customer, after all, is the queen / king! : )

# 4 Proactive insights rather than reactive

* "Traditional Web Analytics" = going "forward" while looking out of the rear view mirror and driving in reverse!
* Get ahead of the train, earn a seat at the strategy table
* Execution strategy:
  * Don't wait for questions to be asked
  * Attend "operational" meetings and sessions
  * Drag in best practices from outside
  * You can no longer be just a "web analyst"; now it's healthy doses of "web smart guy/gal"
  * 20% of your time should be providing analysis no one asked for and only you can perform

Core Message: Web Analytics is "rear view mirror" analysis; by the time you get the data, even in real time, it is already old. This complicates things quite a bit. In order to get ahead, don't wait until someone stops by asking for a report. Get ahead of the game. Attend strategy and operational meetings. Be aware of the upcoming changes to your site, your campaigns or your acquisition options. Before you are asked, have a plan to analyze the impact and proactively present results. You will win kudos, and, because of who you are, you will have provided better analysis than what might have been asked for (or worse, they might just keep doing stuff and never know if it works).

That last bullet above is very important: if you are an Analyst, and not a report writer, 20% of your time should be devoted to poring over data and doing analysis that no one asked for but only you can do, because you are the only smart one in the family.

Core Message: Almost every company hires for the position of an Analyst, often a Senior Analyst, and then quickly proceeds to convert them into report writers. "Here is our Omniture / WebTrends / HBX tool, here is a list of all our internal data consumers, and here are all the reports that they need." This is a perfect job for a summer intern (they come with the additional benefit of wanting to work really, really hard for no pay). The job of a management team that wants to see a data driven culture is to first empower their analysts. This means giving them the strategic objectives of the website and then getting out of the way. Make sure that the workload of the analysts is such that they can spend 80% of their time doing analysis. Hire critical thinkers.

Data driven cultures rarely exist on Reporting. They thrive and prosper on analysis, by one person or by every person in the organization.

# 2 Solve for the Trinity

* ClickStream is 33% of the input, on its best day
* ClickStream = only the What; Research = adds the Why; Outcomes = the How Much (as in: are you kidding, we only made this much? :))
* Execution strategy:
  * If you only have clickstream, get the others
  * Integrate clickstream, outcomes, surveys, usability, open text VoC
  * Start with How Much, move to What and grow into Why

Core Message: I am sure you are all bored to death hearing me talk about the Trinity strategy (click here if you are not bored). The lesson here is simple: doing only clickstream analysis does not create a data driven culture, because clickstream data can't consistently provide deeply impactful analysis. Normal business people have a hard time digesting the amazing limits to which we stretch clickstream data. Bring in other sources of data that make for richer, fuller-picture analysis. This will make it much easier to connect with your users and the things that they find valuable and can understand.

Secret sauce: Start with the How Much, evolve to the What, then strive for the Why (or why not, if that is where you find yourself : ).

# 1: Got Process?

* Report publishing / emailing schedule is not a process
* Web decision making can't be ad hoc or just post facto
* Decision making is a journey, not a destination
* Execution strategy:
  * Steal from / be inspired by Process Excellence, adapt as necessary
  * Identify core web processes, push to identify operations, define success metrics, put a decision making process in place
  * Get stakeholders to have skin in the game

Core Message: This is perhaps the single biggest difference between cultures that achieve the mythical status of being data driven and those that languish. Process helps create frameworks that people can understand, follow and, most importantly, repeat. Process Excellence (Six Sigma) can also help guide you and ensure that you are actually focusing on the Critical Few metrics, and help establish goals and control limits for your metrics so that it becomes that much easier to stay focused and execute successfully.

Processes don't have to be complex, scary things. The picture shared was that of a simple PowerPoint slide that, using a very visual flow, illustrated exactly what the process for executing an A/B or multivariate test was, end to end. It showed who is responsible for each step and what deliverables are expected. Very easy to do. But now not just you but everyone knows what to do. At the end of the day it is process that creates culture. Do you have structured processes in your company?

One critical bonus recommendation……

# 0 Ownership of web analytics: Business

* Think, imagine, move at the pace of business
* Ownership close to outcomes, proactive and analytical needs
* Successful Web Analytics usually, not always, sits outside IT
* Execution strategy:
  * Identify the website / web strategy owner for your company
  * Consider moving your Analytics function (all of it) over to them
  * Insist that the Analytics function own and drive holistic reporting, analysis and testing strategies
  * Create and measure success metrics for your Analytics team

Core Message: I get asked this question all the time: who should own web analytics? Most companies don't have a single team that owns web analytics end to end. There is a team in IT responsible for the tag, another team in the PMO responsible for gathering requirements, yet another team, usually fractured all over or in IT, responsible for creating reports, and someone else responsible for looking at the data and doing something, or usually nothing.

Web analytics should be owned by a business function, optimally the one that owns the web strategy (not the web site, web strategy). That will align measurement of the success of the strategy very closely with ownership of the strategy. This will also ensure that the team has the air cover it needs, the business has skin in the game and usually, though not always, business teams have a different mindset than IT and can think smart and move fast (this is not to say IT can’t, I have spent four years in IT myself : )).

In summary: Data Driven Organizations……..

* Focus on Customer Centric Outcomes
* Reward analysis and not number of emailed reports
* While measuring success against benchmarks
* Which is achieved by empowering your analysts
* Who solve for the Trinity, not just clickstream
* Using a well defined process
* That is owned and driven by the business function

How is your company doing? Do you have a culture that fosters some or all of the above? Have you observed strategies that work for you? Have you tried some or all of the above and it still did not lead to success? Please share your tips, feedback, and success stories via comments.

Comments

It was an outstanding presentation! You're on my top 3 of eMetrics Washington 2006 ;-).

I really like the way you think, and you make lots of sense, as Aurélie said. I also liked a lot your point about 20% of time to be spent on reports no one requests.

I've learned with time that this is essential. When we were starting WA at OX2 I always argued with Aurélie because I thought she was very often going way too far compared with what the client had requested. I see now that this is our strength. We allow ourselves to go a little bit further, bringing (or at least trying to bring) insights to our customers.

Getting back to your call for feedback, I just might have a comment. It's regarding the boss's pay check. When I see how Robbin interpreted your saying, in her post called The Web: Is it Really about Money?, I guess that you should maybe specify that this advice applies when you're an analyst within a company. I don't think it will always work that well when you're an external consultant. So maybe just clarifying that issue is my only remark.

René: Thanks so much for the feedback, it is exactly the kind of critique I was looking for. At the summit Robbin had also been kind enough to share her feedback on the money element. We had a great conversation and I have some good food for thought on how I can better frame this point in future presentations.

Your post is excellent as usual but I am a bit confused when you say:
"Time to implementation: five minutes"

Well, we try to convince our clients to spend time (more than five minutes) to plan their tagging.
We advise them to do a "plan de marquage", which can be translated as a tagging map.
Which label to give to a page, under which heading to place a page (not necessarily the same as on the website), which parameters to insert in their tag code, etc…
This work forces people to think about what they expect from their tool.
It will dramatically influence the "quality of data", of reports and of analysis. I think that it can't be neglected.

Nice summary. I am now even more bitter that I didn't see your presentation of this (I don't remember what session I was in). I especially appreciate your thoughts on the difference between reporting and analysis, and the importance of trusting your analysts enough to find things that truly scare you about your website. I find that a lot of people think that an analyst's job is to find data that supports an action, rather than the other way around. It's a big cart-before-horse problem that your post goes a long way towards addressing.

Great post Avinash. Wish I could have made it out to the eMetrics Summit to see it live. Would have been nice to grab the slide deck as well as your post from the site.

I especially like the HiPPO portion. Interesting that others may have mistaken the point as something like "your opinion doesn't matter, you're working for the man". Personally, I've found significant success with highlighting how we're doing compared to others and bringing out customer comments; it now makes up a significant portion of my reporting and has even forced some deeper analysis. It has stopped my HiPPO in his tracks and made him sit up and listen.

Benoit: Very good point. Tagging a site can be quite a complicated experience depending on which web analytics vendor is being used and how much data the website is willing to cough up.

There are vendors that will allow you to have one standard tag on the entire website and they will capture 90% of the data you need instantly. So that implementation can be five minutes (throw the javascript tag in a global website element like a footer, hit save and go get a bottle of wine).

If your vendor can't support a standard tag on the site and needs to have multiple tags on the site or you need to pass a lot of variables via javascript to your web analytics tool then here is the recommendation:

1) Implement a "standard" simple tag on the site as soon as you can.

2) Learn from that implementation what your company really needs and where the gaps are in data capture / web analytics tool.

3) Strategically update the tags on pages as you have to. Get smarter over time.

4) Buy REL Software's Web Link Validator to check that the right pages have the right tags (or if they are missing tags altogether). It costs between $95 and $495, and it does a lot more than check for missing tags.
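The tag audit in step 4 can also be roughly approximated with a small script. This is a minimal sketch, not a substitute for a real validator: the `TAG_SNIPPET` value and any URLs you feed it are hypothetical placeholders, so swap in whatever string actually identifies your vendor's tag.

```python
# Minimal sketch of a tag audit: flag pages whose HTML lacks the vendor's
# JavaScript tag. TAG_SNIPPET below is a hypothetical placeholder.
from urllib.request import urlopen

TAG_SNIPPET = "analytics.js"  # substring identifying your vendor's tag

def untagged_pages(page_html, snippet=TAG_SNIPPET):
    """Given a {url: html} mapping, return the URLs missing the snippet."""
    return sorted(url for url, html in page_html.items() if snippet not in html)

def fetch(url):
    """Fetch a page's HTML (network access required)."""
    return urlopen(url).read().decode("utf-8", errors="replace")
```

Feed it `{url: fetch(url) for url in your_page_list}` and anything it returns is a page to re-tag. A commercial crawler will of course handle redirects, frames and dynamically injected tags that a naive substring check misses.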

I have to admit I am a bit biased against "let's make a master plan before we go out there and implement perfectly the first time" (and this is not what you are saying, of course; I have heard it said by other folks). Since business users ask for what they want and not what they need (until after you give them what they wanted), I think an iterative approach works best.

Very good summary of what's important in developing an analytics program. I think that the core of the issue is that reporting is easy and requires very little skill and responsibility for the outcome. Analysis requires you to put your reputation on the line. However, if the data is correct and the evidence is clear, it should be a confident action and recommendation.

Great stuff, keep it coming! I wish I could have attended and seen you deliver this in person, but enjoyed reading summary and how you presented it here.

If you're an analyst, look for a boss who wants and knows how to use an analyst and will let you do your thing.

If you're a web site or marketing program "owner" with bottom-line accountability, look for an analyst who uses data to generate important questions and strategic ideas. Hire someone who has demonstrated that he or she knows how to help you use their work to grow the business and advance your career. Or save your job.

Avinash, what have you done in terms of data presentation in the past to best satisfy the various stakeholders' needs? Obviously a CEO wants to see different metrics than a Marketing Manager, but what do you think is the most effective way to present the data to both of them?

Manoj: You have asked a very complex question, hard to answer with a pithy reply. I'll try.

Just as you segment data infinitely to find insights, so we have to segment and present exactly the targeted data to each stakeholder.

My usual recommendation for Senior Management (say VP and higher) is to present only the core metrics that measure what the company strategy is solving for. The dashboard should fit on one page, mostly graphs, and not show more than seven metrics. Each should be measured against a goal or benchmark (remember context is king, and you also don't want them to think).

For others my recommendation is to empower them to find the data they need and help themselves. If you do a good enough job with the Sr. Management dashboard you can bet they will put pressure on the organization to measure metrics / KPIs that have "line of sight" (relevance) to what is on their dashboard.

Hope this helps (I realize it is probably not as expansive as what you might be looking for, in my speeches I have shown real life "pictures" but they can't go on the blog for obvious reasons).

The only open point which annoys me every day is the question about benchmarks. I'm working for an IT company in Germany and we have a lot of country websites. We have already defined several KPIs, but the first question I always hear is: "Is it a positive or negative result?"

How can I answer this question?

What comparisons of which data make sense? I have already realized that I can't even compare the results between our different country websites, because there are so many influencing factors which make a comparison useless (e.g. different country means different markets, different prospects, different awareness levels of our brand etc.).

I'm also aware that a comparison between our websites and the websites of competitors would raise the same problems/questions.

How are you dealing with the question about benchmarks? How do you evaluate your KPIs? Is it reasonable to define goals for every year, e.g. "in 2007 we'd like to have a conversion rate of 20.6%"?

Mafalda: In my speech this was covered under step # 5 (Depersonalize decision making). The core thought was that if you want to make headway with numbers and trends then it can't be about you/me; we have to bring benchmarking (internal and external) or customer context. This way it is not about us; it is what "someone else" is saying.

As regards to benchmarking I had shared two suggestions:

1) "Internal" Benchmarking: Simply use the data you have to create "benchmarks". For example, never create a 7-day or 12-month trend; always do 8 days or 13 months. The cool thing is you just gave your users a "benchmark" from how you were performing in an earlier comparable time period. This is one small example.

The other way to do internal benchmarking is what you suggest. Look at trends and patterns over x amount of time or for y customer segments and create your own benchmark. For example, for the last year conversion has been 2.4%, and now we have three new VPs for Conversion and we are doing SEM, so we should now have a Conversion of 3.4%. Now 3.4% might just be a way to start the conversation.

If you end up doing better than 3.4% then you dig deeper, and if you end up lower then you dig deeper. Either way the analysis has started.
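That "one extra period" trick is easy to bake into whatever pulls your numbers. A minimal sketch with made-up figures, where the oldest of 13 monthly conversion rates doubles as the internal benchmark:

```python
# Minimal sketch: pull a 13-month series so the oldest month doubles as an
# internal benchmark for the 12-month trend. All figures are made up.

def trend_with_benchmark(monthly_rates):
    """Split a 13-value series into (benchmark month, last-12-months trend)."""
    return monthly_rates[0], monthly_rates[1:]

rates = [2.4, 2.5, 2.3, 2.6, 2.7, 2.5, 2.8, 2.9, 3.0, 2.8, 3.1, 3.3, 3.4]
benchmark, trend = trend_with_benchmark(rates)
delta = trend[-1] - benchmark  # positive = beating the comparable period
```

Either sign of `delta` is a reason to dig deeper, which is the point: the benchmark starts the analysis rather than ending it.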

2) External Benchmarking: Get benchmarks from outside. For example, we use the ACSI (www.theacsi.org) for measuring Customer Satisfaction. We don't have to say your site is bad; we have an external benchmark that says that. Another example: the last shop.org benchmark for conversion was 2.2% for online retailers, so that is something we can use. Finally, a source such as HitWise is a great resource for benchmarking (you can benchmark how much search traffic you get vs. your competitors, or what your share of keywords is vs. others, etc.).

As regards to different countries, I am sure there is something you can start with. I would look for that, even if you can find one metric. Giving all of them one thing that they can all be "benchmarked" against can be a great way to motivate the right behavior.

Here's a suggestion: Run a simple pop-up survey with two questions on all your websites….

1) Why are you visiting our website today? (Open ended answer.)

2) Were you able to complete the task that you came to the site for? (Answers: Yes or No.)

Now benchmark all of them against this. You have a winner on your hands. :)
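The Yes/No answers to question 2 collapse into a single task completion rate per site, which gives every country site the same benchmark. A minimal sketch with made-up sites and responses:

```python
# Minimal sketch: turn question 2 (task completed: yes/no) into a task
# completion rate per site and rank the sites. All data is made up.

def completion_rate(responses):
    """responses: list of booleans, True meaning the task was completed."""
    return round(100.0 * sum(responses) / len(responses), 1)

survey = {
    "site-de": [True, True, False, True],
    "site-fr": [True, False, False, True],
    "site-uk": [True, True, True, False],
}
rates = {site: completion_rate(r) for site, r in survey.items()}
ranked = sorted(rates.items(), key=lambda kv: kv[1], reverse=True)
```

The lowest-ranked site is where you start reading the open-ended answers to question 1.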

I was in charge of INFO management for a large FMCG and they placed limited importance on it. It was like banging your head against a brick wall. IT didn't get it either – too focused on transaction capture and monitoring and control reports for finance.

Avinash, enjoyable reading. Especially liked the point on reporting versus analysis. To add to the above, would it not be valuable to i) think of every business decision as a choice, ii) phrase that choice as a specific question, iii) spend enough time on the wording and bent of the question till one is satisfied that it is in fact the most relevant question, and then iv) look for data that may help answer that question?

To illustrate, if two people walk out of a meeting looking for solutions to the question – "What should we do to grow the company?", one could interpret that as "..grow net profit" and might take a look at cost data. The other might interpret the same as "..grow revenue" and might ignore cost data completely. What if both had the second interpretation?

Another example is the question "What discount in price should I give customer X?" (in response to a customer asking for a 5% discount). Let's assume facts show you this is a highly profitable customer you want to keep. Maybe the customer is really looking for some kind of financial compromise, and perhaps you can keep him happy by adding 30 days to his payment terms (and be better off). If you had asked the question "What additional incentive can I give my customer to keep his business?", you may have been looking for a different set of data.

My belief is that such situations are commonplace in organizations. We tend to hurry through the questions, and spend all our time looking for answers. The big question may well be "For which question do I need an answer?".

I stopped reading mid-way through because I just had to say thanks for the common sense tip that I think will help me finally acclimate to my new company: "HiPPOs rule the business world". I may have to put it up at work to remind me that in order to convince our HiPPOs (and I know I won't always be able to) I'll need depersonalized figures, facts and analysis, gathered from other int'l HiPPOs, in order to get ideas and recommendations seriously considered in my company.

I'm late to your presentation about Creating Data Driven Culture, but it is still timely and relevant! I watched the video on YouTube recently, and I wish I'd known it was outlined here on your blog already. It would have saved me a lot of time, as I'm sure your notes are much better than mine! I was trying to capture both what you said and the thoughts I had about what you said. It was like your talk was about the proverbial 'elephants in the room' that don't always come to the forefront of conversation, and definitely not with the clarity with which you spoke them. Sometimes the corporate dynamic has a way of over-complicating the simple, while over-simplifying the dynamic. Way too much going on in that video – in a good way! :-)

I found a lot of valuable food for thought, but I especially love that you broached and responded to the question of WHO should own the data in organizations. I like your point that it should be owned by the people who are responsible for (whose necks are on the line for) the Outcomes. Ideally, I see that as a co-ownership among multiple depts (i.e. Marketing, Sales, Production), not just a single dept.

Avinash –
I have nothing to do with web analytics; I probably don't even know what it means. However, I live, breathe and get paid for analysis.

I can drop the word "web" from your blog and it is still right on the money. Every word is so true for just about any business's decision making. Or should I drop the word "business" too? Thanks for sharing your thoughts.

Without the power of data, any decision making will only be derailed by the loudest or funniest guy in the room!

Mine is definitely coming years after this post, but I found this particular read priceless. I train people in advanced analysis using Excel, but most of the time I fail to track my own business metrics. Analysis is good, but what percentage of these analyses are actually used by corporate HiPPOs? I once worked for a 501(c) in Africa, and it seems most of the data I generated remained in the C-level's inbox for months because we believed analysis was the thing.

I'm sorry I missed last week's eMetrics conference in Washington, D.C. Avinash Kaushik presented and posted the highlights on his blog, Occam's Razor. There are some strong insights here, especially useful as metrics/analytics come under my management in the weeks to come. […]

[…] What we're reading: There are some pretty fantastic resources on the web for people who use Google Analytics, and those interested in learning more. We want to mention a few blogs on web analytics generally and on Google Analytics that we've been reading. We highly recommend these to all of you who use data to back up your online decisions.

ROI Revolution Blog: ROI Revolution is a Google Analytics Authorized Consultant (GAAC). This frequently updated blog contains interviews with web analytics experts, as well as Google Analytics tips and in-depth explanations of reports with screenshots. Great reading. Take a look at these two recent posts: Start at the Beginning: Making Sense of the Google Analytics Toolbox by Meredith Smith, and Understanding Google Analytics' Data Over Time Report by Michael Harrison.

GA Experts Blog: A European GAAC affiliated with Omega Digital Media, and a very informative Google Analytics-focused blog addressing practical questions and offering some pretty ingenious solutions. Learn about a new filter called "Override Bid Term Filter" that will show you the actual search keywords that brought a visitor to your site, not just the keyword that you bid on in your PPC account, in the recent post How to Get Detailed PPC Keyword Data from Google Analytics.

This Just In: Written by Justin Cutroni, who works at EpikOne, a one-stop, do-it-all GAAC on the east coast, which has its own informative blog. Justin posts helpful troubleshooting articles that help clarify Google Analytics and make it even more understandable, useful, and accessible. Check out Justin's recent posts: Google Analytics: How to Tell When Something is Wrong, and Google Analytics Configuration Mistake #3: Third Party Domains.

Occam's Razor: Written by Avinash Kaushik, head of web research and analytics at Intuit, and a vocal and visible analytics practitioner, advocate, and thought leader. Every web analyst, marketer, webmaster, IT specialist, and executive should read his recent post: Seven Steps to Creating a Data Driven Decision Making Culture. Posted by Jeff Gillis, Google Analytics Team […]

[…] You may have read Ronny's paper or heard Avinash's talk about the HiPPO – (Highest Paid Person's Opinion), but there is more to the story than a fancy acronym for Africa's most dangerous animal – and your meeting's biggest foe. It has become analyst lore because of the great image that it imparts and because of the message it delivers. […]

You have a great web analytics tool? Check.(That was easy!) You have smart people that not only report data but provide insights? Check. (Congratulations, these are hard to find) Now the main challenge for your team is driving change into…

[…]
Is your organization making the shift to a more data-driven model? It can sometimes be a frustrating experience. Yes, the benefits are vast, but the transformation can take years, and you may come up against resistance here and there.

Following up on an older post called "Seven Steps to Creating a Data-Driven Decision Making Culture," Avinash Kaushik, Analytics Evangelist for Google, posted a new blog offering guidelines for creating a data-driven boss. Even if you often rely on your gut when it comes to decision-making, you're sure to find some gems in there — whether you lack analytical expertise, or you’re the analysis guru in your company.
[…]

[…] Using analytics moves website improvement past opinions into hard data. No longer will site design and architecture be based on subjective viewpoints — not even on HiPPOs — the Highest Paid Person’s Opinions. […]

[…] If your company is like most companies, implemented ideas and then changes to your website's layout, design, navigation and other features are usually made based on internal decisions by HiPPOs (Highest Paid Person's Opinion). Decisions based on irrational factors such as "your competitor's site looks a certain way, so we should do it too", "Amazon does it", "my ego wants it", and even "it looks prettier". […]

[…]
Too Many Cooks in the Kitchen Result in Mush
I'm sure you've seen it: websites where there's such a jumbled mess of content on a webpage, you know it's a battle among internal stakeholders. Typically, you see this jumble of information on large corporate websites, in particular large tech companies with HW and SW products. (I know this, as I used to do it.)

Being SNOWED is far worse than Avinash's acronym of HiPPO (Highest Paid Person's Opinion). At least with a HiPPO there will be some direction of where the website is headed.

What's worse is having too many masters to serve, resulting in serving none. I created this little acronym, which I encourage you to share with your stakeholders; hopefully your website won't be SNOWED in either.
[…]

Give your online marketing staff the autonomy to make good decisions regardless of the HiPPO (Highest Paid Person's Opinion) in the room. Did you know that the Obama campaign had an 81-person new media team that grew to nearly 170 people by the end of the campaign?

Hire talented, trained staff. (I would argue that this also means you actually need to pay for their talent. The number of nonprofit openings for online marketing staff with extremely low salaries demonstrates, I think, that many nonprofits don't "get" that hiring talented online marketing staff is worth it.) From the report, Chief Technology Officer Michael Slaby had this piece of advice:
[…]

[…]
Turns out, he had a post from February of this year (2007) on the exact subject of Reporting vs. Analysis. The first thing I noticed was that he referenced a post from Avinash Kaushik that touches on the subject as well. This really is a small world, apparently (see my last entry in this blog…where I referenced an Avinash post!).
[…]

[…]
This is why we do things differently. We lead with something totally free, and we ask our clients to judge us according to the value that we deliver. We can take such a radical position because we lead with data. In the infinitely measurable, infinitely accountable medium that is the web, data is the lifeblood of every good organization. Data in the right analytical hands leads naturally to insight. We help site owners instrument and deploy low-cost voice-of-customer and web analytics tools, thereby opening up powerful and always-on data capture streams. We measure, we analyze, we offer up insight. In this way, we earn trust, credibility, influence, and the right to be taken seriously. If the fit is right, we then move on with the execution and production of critical changes, thereby optimizing the website and aligning its key goals with the needs and desires of its visitors.
[…]

[…]
Some may argue that you should do the statistical correlation first, and I wouldn't disagree; but the point here is that you can get some pretty powerful information with just some raw data and Excel. HiPPOs (as Avinash likes to call the Highest Paid Person's Opinion) care about statistics, but they don't like to see them. They would rather see a chart that tells a story.
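The "raw data plus a spreadsheet" point can be sketched in a few lines of code as well. Here is a minimal, hypothetical example of a Pearson correlation between two invented weekly series (the spend and order numbers, and the column names, are all made up for illustration; this is not from the post being quoted):

```python
# Pearson correlation between two hypothetical weekly series,
# the kind of quick check you could also do in Excel with CORREL().

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ad_spend = [120, 150, 90, 200, 170]   # invented weekly spend
orders   = [14, 18, 11, 24, 20]       # invented weekly orders

r = pearson(ad_spend, orders)
print(round(r, 3))  # a value near 1: the two series move together
```

A chart of the two series side by side would tell the HiPPO-friendly version of the same story.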
[…]

[…]
I first came across the term when I started reading Occam's Razor by Avinash Kaushik. Essentially, a HiPPO is:
Highest Paid Person’s Opinion
The person who has the biggest pull in the decision-making process, and as such most wills are bent to theirs. An idiotic HiPPO can really mess up your strategy. This is the person who wants you to rank for a keyword because they think it represents the business best, not because it converts well or is even in the target market. This is the person who doesn't give a hoot about doing it right; they want results either way. This is the person who decides on a whim to get rid of a core section of the site "because they didn't like it" without informing their SEO. This is the person who consistently shoots down valuable ideas just because they don't want to understand.
[…]

[…]
What gets measured gets improved. This isn’t 1995 – a hit counter isn’t going to cut it. There’s no excuse not to have at least Google Analytics (or something) tracking and providing information. What to look at? Let’s see – top content, pages with high bounce rates, referring sites, search keywords… the list goes on. Oh, and remember: reporting is not the same as analysis. So let’s figure out what we really need to measure (let’s call them KPIs) and set some goals.
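As a hedged sketch of what "figuring out what to measure" can look like once you move past a hit counter, here is a hypothetical computation of one of the KPIs mentioned, bounce rate, per landing page from invented pageview rows (the session IDs, pages, and the single-pageview definition of a bounce are illustrative assumptions, not from the quoted post):

```python
# Bounce rate per landing page from hypothetical (session_id, page) rows.
# A "bounce" here is a session with exactly one pageview.
from collections import defaultdict

pageviews = [
    ("s1", "/home"), ("s1", "/pricing"),
    ("s2", "/blog/post-1"),
    ("s3", "/home"),
    ("s4", "/blog/post-1"),
    ("s5", "/home"), ("s5", "/signup"),
]

sessions = defaultdict(list)
for sid, page in pageviews:
    sessions[sid].append(page)

entries = defaultdict(int)   # sessions that started on this page...
bounces = defaultdict(int)   # ...and viewed nothing else
for pages in sessions.values():
    landing = pages[0]
    entries[landing] += 1
    if len(pages) == 1:
        bounces[landing] += 1

for page, n in sorted(entries.items()):
    print(f"{page}: {bounces[page] / n:.0%} bounce rate over {n} sessions")
```

In practice a tool like Google Analytics computes this for you; the point of the sketch is that a KPI is just a question asked of raw data.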
[…]

[…]
This view of data as our savior returned around 2006 as the answer to handling HiPPOs, or the Highest Paid Person's Opinion. I agree that a decision made because an uninformed executive said so is bad. But the opposite extreme, a decision made because the data said so, is just as bad. Why? Many people misunderstand this sentiment as meaning the data can make decisions for us. A mindless decision is often a bad decision.
[…]

[…]
Encouraging efficiency and the importance of metrics – "data-driven decision-making" – would improve the chances of an organization actually synthesizing what an acquired team has to offer. Putting a framework around how existing team members can be innovative within the organization would make any injection of new ideas that much more welcome and effective. Then the "fresh eyes" have their best shot at ensuring everyone is successful.
[…]

[…]
As most of us have probably experienced, many web designers (and HiPPOs) seem to take the approach of "if I think it's pretty, then it'll work," which is terrific when they're the only ones who will ever visit the site. Unfortunately a firm's customers don't always have the same preferences as the developers, and too many times the result is a site experience that is not just hard for visitors to use, but at times simply irrelevant. So with the idea of "letting the customer drive" in mind, One To One has recently witnessed some amazing long-term potential for our clients in the form of testing new, dynamic content on landing pages and deeper areas of a site, all through an approach that includes, but isn't limited to, the analytic methods commonly referred to as A/B/N and multivariate testing, a.k.a. MVT.
[…]

[…]
And Avinash has a good post on Seven Steps to Creating a Data Driven Decision Making Culture, where he talks about making the data easily relevant to the needs of stakeholders and showing specific wins in those areas.
[…]

[…]
The social media review is an audit of the brand’s current strategy, the community sentiment, and current conversation topics. Since each aspect of the review changes from time to time, it’s important to create a social media audit every quarter and before every new campaign.

Compiling the research into one place helps outline exactly what data is important, whether there are any overlaps in marketing channels, and how the brand can further optimize their strategy.
[…]

[…]
If you’re not sure what a HiPPO is in the context of web analytics, then read this post by Avinash Kaushik. In summary though, he’s the highest paid person impacting your job, and his opinion can often make a big difference.
[…]

[…]
But sadly that's not always the case in the boardroom and the office, where other factors come into play (like Avinash Kaushik's HiPPO analogy: Highest Paid Person's Opinion). You might be a brand manager under 30 who likes it, but you have a 48-year-old CMO who hates it, or who just arrived and feels it's run its course. And across the table sits Finance – you know, the guys who give out the budgets – and they feel that while, yes, sales are going well, it's really the result of that new distribution strategy Superstar Steve just implemented.
[…]

[…]
Yet Evidence Based Marketing is preferable to the alternative, based on gut, fads, gurus' revelations, the Highest Paid Person's Opinion (HiPPO), and just plain observation. The alternative is rife with errors that are not easy to see or measure: cognitive errors. We suffer from multiple cognitive biases like the Recency Effect, Availability Bias, Selection Bias and Survivorship Bias.
[…]

[…]
A great insight into the power of evidence-based decision making. Solid facts, presented well, are hard to ignore. Twain was probably also thinking about the need to have an approach to testing and experimentation when he said this. There’s no better way to silence the HIPPO (highest paid person’s opinion) than to have concrete evidence from testing about why one approach is better than the other.
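One common way to turn a test into that kind of concrete evidence is a two-proportion z-test on the conversion rates of the two approaches. This is a minimal sketch with invented visitor and conversion counts (the numbers and the 5% significance threshold are illustrative assumptions, not from the quoted post):

```python
# Two-proportion z-test: is variant B's conversion rate significantly
# different from variant A's? Counts below are hypothetical.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2))  # |z| > 1.96 is roughly significant at the 5% level
```

With evidence like this in hand, "which approach is better" stops being a matter of opinion, highly paid or otherwise.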
[…]

[…]
You’ve probably heard of HIPPO. Avinash talks a lot about how you can combat the Highest Paid Person’s Opinion with analytics, analysis and testing. Here’s one of his blog posts from waaay back in 2006 where he mentions the HIPPO problem: 7 steps to creating a data-driven decision making culture.
[…]

[…]
First, they depersonalize decision making. Forcing everyone to back up their assertions with data, facts, or intelligent reasoning, helps create a culture where the best ideas win, rather than what Avinash Kaushik calls the HiPPO rule: the highest paid person’s opinion wins. Second, they remind us of the unavoidable uncertainty inherent in so many decisions. Because more often than we’re comfortable with, the answer to these questions will be “I don’t know.”
[…]

[…]
Website design is fraught with opinion and bruised egos; as a designer told me recently, he had to 'develop very tough skin'. For my part, it always brings to mind a phrase I borrow from Avinash of Occam's Razor, which is HiPPO.
[…]

[…]
New sources of competitive advantage will be created and there won’t be a HiPPO in sight. For more, check out Wired magazine’s great article on the merits of A/B Testing, and the Decision Hacker blog, and Avinash Kaushik’s Seven Steps to Creating a Data-Driven Culture.
[…]

[…]
Despite this body of evidence, many people persist in acting as if such a priori opinions are reliable. This is The HiPPO Syndrome. HiPPO = Highest Paid Person’s Opinion: an acronym that was first popularized by Avinash Kaushik and by our Microsoft Experimentation Platform team. I designed HiPPO stress toys, posters and web pages that helped popularize the acronym.
[…]

[…]
For a truly data-driven organization to function at its peak, the ethos of analysis must permeate the entire organization. If data-driven thinking is only characteristic of the marketing team, you’ll be missing out on a wealth of rich and actionable data. If your organization is not there yet, take some advice from Avinash Kaushik on how to convince the rest of your organization to focus on the data, and to ditch reporting for true analysis.
[…]

[…]
No sooner do you let it be known, mostly inadvertently, that you are about to send out a survey to customers than the incessant requests (and commands) start from your co-workers (and bosses) to add just one more question to it. Just one more question they have been dying to find the answer to, but have never gotten around to running a survey (or anything else) to answer. Just one question, right? What harm can it do? Sure, you are not opening the floodgates and adding everyone's question, just one question to satisfy the HiPPO?
[…]

[…]
Papyrus: papyrus, the thick paper-like material produced from the pith of the papyrus plant, is back. First manufactured in Egypt as far back as the third millennium BC, it is actually still used by communities living in the vicinity of swamps. We've heard the requests loud and clear: papyrus report exports would be an exciting option that would provide a "wow" factor at your next presentation and make your data more tangible. Just be sure you keep them in a dry climate, where papyrus is most stable. Next time your HiPPO questions the data, break out the papyrus and abacus and prove them wrong!
[…]

[…]
I borrowed this headline from Avinash Kaushik’s presentation on Seven Steps to Creating a Data Driven Decision Making Culture. Gut decisions might feel good, but data-driven decision making takes personal conflict and feelings out of the situation, and focuses the discussion on the facts we have in front of us. Avinash’s post talks about
[…]

[…]
I think the most challenging part for small and midsize businesses is finding the time or resources to do so. There's no doubt it's an investment, but it's the first step in building a data-driven culture. It's a step that, in my opinion, will continue to pay dividends."
[…]