Why Your Analytics are Failing You

Many organizations investing millions in big data, analytics, and hiring quants appear frustrated. They undeniably have more and even better data. Their analysts and analytics are first-rate, too. But managers still seem to be having the same kinds of business arguments and debates — except with much better data and analytics. The ultimate decisions may be more data-driven but the organizational culture still feels the same. As one CIO recently told me, “We’re doing analytics in real-time that I couldn’t even have imagined five years ago but it’s not having anywhere near the impact I’d have thought.”

What gives? After facilitating several Big Data and analytics sessions with Fortune 1000 firms and spending serious time with organizations that appear quite happy with their returns on analytic investment, a clear “data heuristic” has emerged. Companies with mediocre to moderate outcomes use big data and analytics for decision support; successful ROA—Return on Analytics—firms use them to effect and support behavior change. Better data-driven analyses aren’t simply “plugged in” to existing processes and reviews; they’re used to invent and encourage different kinds of conversations and interactions.

“We don’t do the analytics or business intelligence stuff until management identifies the behaviors we want to change or influence,” says one financial services CIO. “Improving compliance and financial reporting is the low-hanging fruit. But that just means we’re using analytics to do what we are already doing better.”

The real challenge is recognizing that using big data and analytics to better solve problems and/or make decisions obscures the organizational reality that new analytics often requires new behaviors. People may need to share and collaborate more; functions may need to set up different or complementary business processes; managers and executives may need to make sure existing incentives don’t undermine analytic-enabled opportunities for growth and efficiencies.

For example, at one medical supply company, integrating the analytics around “most profitable customers” and “most profitable products” has required a complete re-education of the account sales and technical support teams, both for “upselling” and “educating” clients on higher value-added offerings. The company realized that these analytics shouldn’t simply be used to support existing sales and services practices but treated as an opportunity to build a more consultative sales and support organization.

The quality of big data and analytics, ironically, mattered less than the purpose to which they were put. The most interesting tensions and arguments consistently revolved around whether the organization would reap the greatest returns from using analytics to better optimize existing process behaviors or get people to behave differently. But the rough consensus was that the most productive conversations centered on how analytics changed behaviors rather than solved problems.

“Most people in our organization do better with history lessons than with math lessons,” one consumer product analytics executive told me. “It’s easier for people to understand how new information and metrics should change how they do things than getting them to understand the underlying algorithms … We’ve learned the hard way that ‘over-the-wall’ data and analytics isn’t the way for our internal customers to get value from our work.”

Getting the right answer—or even asking the right question—turns out not to be the dominant concern of high-ROA enterprises. The questions, the answers—the data and the analytics—are undeniably important. But how those questions, answers, and analytics align, or conflict, with individual and institutional behaviors matters more. Sometimes even the best analytics can provoke counterproductive behaviors. Don’t fail your analytics.