David Menninger's Analyst Perspectives

There’s a lot going on in search technology still, or again, depending on your perspective. We’ve analyzed search in a business context periodically over the years, and I want to add some more analysis of the business side of search now, prompted by announcements over the last two months from Endeca, IBM Cognos, MarkLogic and QlikView, all of which include significant enhancements to existing search capabilities in their most recent product upgrades.

Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user’s perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally, Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture to increase performance and scalability.
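To make the columnar point concrete, here is a minimal, hypothetical sketch in Python (not Tableau’s actual implementation; the data and field names are made up) of why storing each field as its own contiguous array favors analytic aggregates over row-at-a-time storage:

```python
# Illustrative contrast between row-oriented and column-oriented storage
# for an analytic scan; data and field names here are hypothetical.

rows = [
    {"region": "East", "product": "A", "sales": 120.00},
    {"region": "West", "product": "B", "sales": 95.50},
    {"region": "East", "product": "B", "sales": 78.25},
]

# Row-oriented: every whole record is touched even though the query
# needs only one field.
total_row_oriented = sum(r["sales"] for r in rows)

# Column-oriented: each field is stored as its own contiguous array, so
# an aggregate scans just the one column it needs. In-memory columnar
# engines exploit this locality (plus compression) for speed and scale.
columns = {
    "region": ["East", "West", "East"],
    "product": ["A", "B", "B"],
    "sales": [120.00, 95.50, 78.25],
}
total_columnar = sum(columns["sales"])

assert total_row_oriented == total_columnar  # same answer, cheaper scan
```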

Interest in and development of in-memory technologies have increased over the last few years, driven in part by the widespread availability of affordable 64-bit hardware and operating systems and by the performance advantages in-memory operations provide over disk-based operations. Some software vendors, such as SAP with its rapidly advancing High-Performance Analytic Appliance (HANA) project, have even suggested that we can put our entire analytic systems in memory.
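The 64-bit point is easy to quantify: a 32-bit process can address at most 4 GiB, which caps how much data fits in memory, while 64-bit addressing removes that ceiling for practical purposes. A quick back-of-the-envelope calculation (illustrative arithmetic only):

```python
# Why 64-bit hardware matters for in-memory analytics: address space.
GIB = 2**30  # bytes in one gibibyte

addressable_32bit = 2**32  # 32-bit pointers reach at most 2^32 bytes
addressable_64bit = 2**64  # theoretical 64-bit limit; real CPUs expose less

print(addressable_32bit / GIB)  # 4.0 GiB -- too small for large datasets
print(addressable_64bit / GIB)  # 17179869184.0 GiB, i.e. 16 EiB
```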

In the weeks leading up to and as part of its Information On Demand Conference that my colleague assessed, IBM introduced version 8.5 of InfoSphere Information Server and several related product updates. As my colleague suggested earlier, IBM has an ambitious agenda to provide comprehensive information management capabilities through a combination of product development and acquisitions. The breadth of this portfolio is impressive, and InfoSphere Information Server 8.5 makes significant strides in tying the various pieces together.

On October 25, IBM introduced Cognos 10 at its Information on Demand and Business Analytics Forum in Las Vegas, which I attended to take a closer look at the technology following my examination of it at IBM’s Business Analytics analyst summit in September. According to Rob Ashe, IBM’s general manager of business analytics, Cognos 10 has been in development for more than six years. You’re probably aware that in that period IBM made a variety of acquisitions, including Cognos itself. These acquisitions and their impact on the new product are clearly evident in this release.

My colleague recently wrote about QlikView, noting its rapid ascent, its unusually robust support for mobile technology platforms among BI vendors and its integration with SAP. On the occasion of the release of a major product revision, QlikView 10, I’d like to add my perspective on the company and its most recent release. I first learned of QlikView about five years ago while working on the TM1 product line, which, like QlikView, is a 64-bit, in-memory analytic technology supporting business intelligence needs across business and IT.

If you enjoyed my previous blog, “Hadoop Is the Elephant in the Room,” perhaps you’d be interested in what your organization might do with Hadoop. As I mentioned, the Hadoop World event this week showcased some of the biggest and most mature Hadoop implementations, such as those of eBay, Facebook, Twitter and Yahoo. Those of you who need 8,500 processors and 16 petabytes of storage like eBay likely already know about Hadoop. But is Hadoop relevant to organizations whose data volumes are smaller, yet still substantial?
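For readers wondering what a first Hadoop job actually looks like, here is a minimal word-count sketch in the Hadoop Streaming style, where the mapper and reducer are ordinary scripts reading stdin and writing stdout. It is a generic illustration, not taken from any of the implementations mentioned above, and the file name is hypothetical:

```python
#!/usr/bin/env python3
# Minimal word count in the Hadoop Streaming style: Hadoop feeds input
# splits to the mapper's stdin, sorts the mapper's tab-separated
# (key, value) output by key, then feeds it to the reducer's stdin.
import sys


def mapper(lines):
    # Emit "word<TAB>1" for every word seen.
    for line in lines:
        for word in line.split():
            print(f"{word}\t1")


def reducer(lines):
    # Input arrives sorted by key, so all counts for a word are adjacent.
    current, count = None, 0
    for line in lines:
        word, value = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")


if __name__ == "__main__":
    # Run with "map" or "reduce" as the only argument.
    mapper(sys.stdin) if sys.argv[1] == "map" else reducer(sys.stdin)
```

Part of Hadoop’s appeal at more modest data volumes is that the same two functions can be tested on a laptop with a shell pipeline (cat input.txt | ./wordcount.py map | sort | ./wordcount.py reduce) and then submitted unchanged to a cluster through Hadoop’s streaming interface.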

Earlier this week I attended Hadoop World in New York City. Hosted by Cloudera, the one-day event was by almost all accounts a smashing success. Attendance was approximately double that of last year. There were five tracks filled mostly with user presentations. According to Mike Olson, CEO of Cloudera, the conference’s tweet stream (#hw2010) was one of the top 10 trending topics of that morning. Cloudera did an admirable job of organizing the event for the Hadoop community rather than co-opting it for its own purposes. Certainly, this was not done out of altruism, but it was done well and in a way that respected the time and interests of those attending.

I attended the IBM Business Analytics Analyst Summit in Ottawa, and while I can’t tell you much about what was discussed there due to confidentiality restrictions that will be lifted shortly, I can share with you some of my own observations regarding the state of BI, particularly what’s wrong with it. By “wrong,” I mean: why aren’t adoption rates higher? Why aren’t users more satisfied? Our Ventana Research benchmark research on BI indicates that only 37 percent of organizations are satisfied or very satisfied with their BI efforts.

With the ongoing spate of mergers and acquisitions in the software industry, I’d like to offer some perspective on what these acquisitions mean to software-purchasing organizations. Think of the software industry as a thriving ecosystem, with large software companies at the top of the food chain. Small companies are formed and, if they have an interesting idea and some good marketing, they grow. The really good ones may continue to grow and be independent, but most of the good ideas eventually get bought by larger, more established companies.

Here’s a big shout-out to the Ventana Research community. I’m happy to be here. I think it would be appropriate to introduce myself and tell you a little bit about why I’m here and what I hope to accomplish as a member of the Ventana Research team.

Analyst Perspective Policy

Ventana Research’s Analyst Perspectives are fact-based analysis and guidance on business, industry and technology vendor trends. Each Analyst Perspective presents the view of the analyst, an established subject matter expert, on new developments, business and technology trends, findings from our research, or best practice insights.

Each is prepared in accordance with Ventana Research’s strict standards for accuracy and objectivity and is reviewed to ensure it delivers reliable and actionable insights. It is reviewed and edited by research management and approved by the Chief Research Officer; no individual or organization outside of Ventana Research reviews any Analyst Perspective before it is published. If you have any issue with an Analyst Perspective, please email the Chief Research Officer at ChiefResearchOfficer@ventanaresearch.com.

About the Analyst

David Menninger

David brings to Ventana Research over twenty-five years of experience, during which he has brought to market some of the leading-edge technologies that help organizations analyze data to support a range of action-taking and decision-making processes.