Over the last 10 years, the Business Intelligence (BI) and Analytics software landscape has experienced two major trends. The first was a wave of acquisitions that consolidated some of the largest players. 2007 was the peak of this trend, starting with Oracle's purchase of Hyperion ($3.3B), followed by SAP's acquisition of Business Objects (BO) ($6.78B), and ending the year with IBM's acquisition of Cognos ($4.9B). Before being acquired, Hyperion, Cognos, and BO had positioned themselves as visionaries and leaders in the space, and the behemoths took notice and action.

The second major trend saw the rise of a newer class of tools like Tableau, Birst, and Qlik, boasting higher-quality visualizations, self-service, speed to delivery, and ease of use. Architecturally, these tools are leaner and require a smaller footprint. They have eliminated many of the burdensome installation and configuration tasks, in turn eliminating much of the need for IT intervention. The emergence of these new tools has significantly eroded the legacy vendors' stronghold on the market, dropping their share to 70% of sales in 2013, as depicted below, and even further in recent years.

In some situations, users adopt these newer technologies as a complement to their traditional offerings. In others, they are brought in to replace the older tools altogether. Unlike traditional enterprise BI software, the newer class of applications requires far less overhead while producing largely similar (and at times more appealing) reports and visualizations. Newer tools are capitalizing on features like high-speed data compression and in-memory capabilities, allowing users to consume and analyze data more quickly and efficiently.

However, comparing legacy tools to newer ones isn't entirely fair. The two classes of technology are innately different and not necessarily designed to satisfy the same analytic needs. The newer offerings generally lack capabilities for pixel-perfect reporting, scorecards, events and notifications, report bursting, a robust user portal, and a common semantic layer. Both new and traditional technologies promote the same goal of enabling Business Intelligence but can vary greatly across the deployment and support spectrum.

New Technologies, Limited Governance

Although organizations continue to gravitate towards newer BI and analytics technologies, the points above highlight an important issue: while newer technologies eliminate some of the inefficiencies of the past, they expose a number of data governance challenges. This challenge is best illustrated by the explosion of Microsoft Excel, debatably the most prevalent self-service BI and analytics tool on the market. Excel spread-marts containing key business rules flood analysts' computers and get buried across the enterprise. Solutions become departmentalized, and ultimately decentralized, which is one of the largest catalysts for a BI program to deploy and govern a traditional enterprise reporting solution such as IBM's Cognos or SAP's Business Objects. When properly managed, a governed solution requires validation of the reports and metrics delivered, providing end users in different departments of the business with consistent results: everyone throughout the enterprise uses the same business definitions for dimensions like MEMBER, CUSTOMER, PRODUCT, and MATERIAL, and validates to the same counts and totals. This challenge of getting everyone in the organization on the same sheet of music is further exacerbated by the newer technologies that quickly enable data consumption.
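To make "validating to the same counts" concrete, here is a minimal sketch in Python of the kind of reconciliation check a governed BI program might automate. The extract data, the `governed_total` value, and the `distinct_count` helper are all hypothetical illustrations, not part of any particular vendor's tooling:

```python
# Hypothetical example: two departmental extracts that should agree on the
# enterprise definition of CUSTOMER. A governed program reconciles each
# extract against a single validated total before its reports are trusted.

def distinct_count(rows, key):
    """Count distinct values of `key` across a list of record dicts."""
    return len({row[key] for row in rows})

# Sample extracts (in practice, pulled from each department's BI tool).
sales_extract = [
    {"CUSTOMER": "C001"}, {"CUSTOMER": "C002"}, {"CUSTOMER": "C003"},
]
finance_extract = [
    {"CUSTOMER": "C001"}, {"CUSTOMER": "C002"}, {"CUSTOMER": "C003"},
]

# The validated, enterprise-wide CUSTOMER count (hypothetical figure).
governed_total = 3

for name, extract in [("sales", sales_extract), ("finance", finance_extract)]:
    count = distinct_count(extract, "CUSTOMER")
    status = "OK" if count == governed_total else "MISMATCH"
    print(f"{name}: {count} distinct customers ({status})")
```

A check like this is deliberately simple; the point is that every department's numbers are measured against one agreed definition rather than each spread-mart's own.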

It's not uncommon to see an organization adopt a newer tool like Tableau and then discount the reports and analytics it produces due to a lack of confidence and trust in the results. This stems directly from a mismanaged rollout and a lack of governance. To instill trust and maximize return on investment (ROI), it is vital that an organization adopts and adheres to a set of standards for the tool's implementation, as well as for the development, testing, and governance of the data that sustains it and the reports and analytics it produces.

In part two, we'll look at how that is done and provide insight into governing leading-edge technologies with traditional techniques.