This is the blog of David M. Raab, marketing technology consultant and analyst. Mr. Raab is Principal at Raab Associates Inc. The blog is named for the Customer Experience Matrix, a tool to visualize marketing and operational interactions between a company and its customers.

Thursday, March 27, 2008

I had an email yesterday from Blink Logic, which offers on-demand business intelligence. That could mean quite a few things, but the most likely definition is indeed what Blink Logic provides: remote access to business intelligence software loaded with your own data. I looked a bit further and it appears Blink Logic does this with conventional technologies, primarily Microsoft SQL Server Analysis Services and Cognos Series 7.

At that point I pretty much lost interest because (a) there’s no exotic technology, (b) quite a few vendors offer similar services*, and (c) the real issue with business intelligence is the work required to prepare the data for analysis, which doesn’t change just because the system is hosted.

Now, this might be unfair to Blink Logic, which could have some technology of its own for data loading or the user interface. It does claim that at least one collaboration feature, direct annotation of data in reports, is unique. But the major point remains: Blink Logic and other “on-demand business intelligence” vendors are simply offering a hosted version of standard business intelligence systems. Does anyone truly think the location of the data center is the chief reason that business intelligence has so few users?

As I see it, the real obstacle is that most source data must be integrated and restructured before business intelligence systems can use it. It may be literally true that hosted business intelligence systems can be deployed in days and users can build dashboards in minutes, but this only applies given the heroic assumption that the proper data is already available. Under those conditions, on-premise systems can be deployed and used just as quickly. Hosting per se has little benefit when it comes to speed of deployment. (Well, maybe some: it can take days or even a week or two to set up a new server in some corporate data centers. Still, that is a tiny fraction of the typical project schedule.)
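To make the "integrated and restructured" point concrete, here is a minimal sketch of the kind of work that happens before any BI tool sees the data. The source systems, field names, and key formats are all invented for illustration: a CRM extract and a billing extract identify the same customer differently, so the keys have to be reconciled before the two can be joined into one analysis-ready table.

```python
# Hypothetical extracts from two source systems. Note the same customer
# appears under "C-001" in one system and "c001" in the other.
crm_rows = [
    {"cust_id": "C-001", "full_name": "Acme Corp", "region": "East"},
]
billing_rows = [
    {"customer": "c001", "revenue": 12500.0},
]

def normalize_key(raw):
    """Strip punctuation and case differences so keys from both systems match."""
    return "".join(ch for ch in raw.lower() if ch.isalnum())

def integrate(crm, billing):
    """Join the two extracts on the normalized key into one analysis-ready table."""
    revenue_by_key = {normalize_key(r["customer"]): r["revenue"] for r in billing}
    return [
        {
            "key": normalize_key(r["cust_id"]),
            "name": r["full_name"],
            "region": r["region"],
            "revenue": revenue_by_key.get(normalize_key(r["cust_id"]), 0.0),
        }
        for r in crm
    ]

integrated = integrate(crm_rows, billing_rows)
```

Even this toy version shows why the work doesn't disappear when the system is hosted: someone still has to know that "C-001" and "c001" are the same customer, and real key-matching rules are rarely this simple.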

If hosting isn't the answer, what can make true “business intelligence on demand” a reality? Since the major obstacle is data preparation, anything that requires less preparation will help. This brings us back to the analytical databases and appliances I’ve been writing about recently: Alterian, Vertica, ParAccel, QlikView, Netezza and so on. At least some of them do reduce the need for preparation because they let users query raw data without restructuring or aggregating it. This isn’t because they avoid SQL queries, but because they offer a great enough performance boost over conventional databases that aggregation and denormalization are not necessary to return results quickly.
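The "query raw data" idea can be sketched in a few lines. SQLite stands in here for an analytical engine, and the schema is invented; the point is only that a fast enough engine lets users aggregate raw transaction rows at query time, instead of waiting for someone to build and maintain a pre-aggregated summary table.

```python
import sqlite3

# Raw, un-summarized transaction rows loaded as-is (invented schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("East", 100.0), ("East", 50.0), ("West", 75.0)],
)

# Aggregation happens on demand, against the raw rows -- no summary
# table or cube was prepared in advance.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
```

With a conventional warehouse, queries like this over billions of rows would be too slow without pre-built aggregates; the analytical databases' claim is that their performance makes the on-the-fly version practical.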

Of course, performance alone can’t solve all data preparation problems. The really knotty challenges like customer data integration and data quality still remain. Perhaps some of those will be addressed by making data accessible as a service (see last week’s post). But services themselves do not appear automatically, so a business intelligence application that requires a new service will still need advance planning. Where services will help is when business intelligence users can take advantage of services created for operational purposes.

“On demand business intelligence” also requires that end-users be able to do more for themselves. I actually feel this is one area where conventional technology is largely adequate: although systems could always be easier, end-users willing to invest a bit of time can already create useful dashboards, reports and analyses without deep technical skills. There are still substantial limits to what can be done – this is where QlikView’s scripting and macro capabilities really add value by giving still more power to non-technical users (or, more precisely, to power users outside the IT department). Still, I’d say that when the necessary data is available, existing business intelligence tools let users accomplish most of what they want.

If there is an issue in this area, it’s that SQL-based analytical databases don’t usually include an end-user access tool. (Non-SQL systems do provide such tools, since users have no alternatives.) This is a reasonable business decision on their part, both because many firms have already selected a standard access tool and because the vendors don’t want to invest in a peripheral technology. But not having an integrated access tool means clients must take time to connect the database to another product, which does slow things down. Apparently I'm not the only person to notice this: some of the analytical vendors are now developing partnerships with access tool vendors. If they can automate the relationship so that data sources become visible in the access tool as soon as they are added to the analytical system, this will move “on demand business intelligence” one step closer to reality.
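The "automate the relationship" idea amounts to polling the analytical database's metadata catalog and exposing anything new to the access tool without manual setup. Here is a hypothetical sketch of that loop, with SQLite's sqlite_master playing the role of the analytical engine's catalog; the table names are invented.

```python
import sqlite3

def discover_sources(conn):
    """Return the table names currently visible in the database catalog."""
    cur = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    )
    return [row[0] for row in cur]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE web_traffic (day TEXT, hits INTEGER)")
before = discover_sources(conn)   # sources visible to the access tool so far

# A new data set is added to the analytical system...
conn.execute("CREATE TABLE campaigns (name TEXT, spend REAL)")
after = discover_sources(conn)    # ...and appears without manual configuration
```

In a real integration the access tool would run this kind of discovery over ODBC/JDBC metadata calls rather than a SQLite-specific query, but the principle is the same: new data sources become queryable as soon as the analytical system knows about them.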

3 comments:

Good post. I whole-heartedly agree that the ETL process is the most difficult problem of any BI implementation. However, I think you miss a key point in regards to hosted BI.

On-demand BI allows companies to have a BI solution without the large upfront cost of hardware, software, and IT staffing (or specific IT skill sets) that traditional implementations require. On-demand therefore becomes a means of greatly opening the BI market to small and medium-sized companies that previously didn't have the resources for a large Cognos implementation but are, in their own right, accruing plenty of data.

I feel this is the area where on-demand solutions will be first to take hold. I could be wrong, but it seems a company like LucidEra is tackling the ETL problem by focusing primarily on the installed base of Salesforce.com. I feel that this integration, or other similar vertical integration to create added value "in the cloud," can really help companies offload some of the baggage of maintaining systems while still being able to focus on the important aspects, such as analysis in this case.

In another case, it seems like newcomer Good Data is taking a step-by-step approach to the ETL problem, supporting "incremental" ETL that allows much faster time to initial results and the ability to incorporate additional data as the need arises. The collaboration features are also worth mentioning, and again these are made possible and easy by using the web as a platform.
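The incremental approach the comment describes can be sketched as a watermark-based load: rather than reloading everything on each run, track the highest row id (or timestamp) already loaded and pull only what is newer. All names here are invented for illustration, not Good Data's actual mechanism.

```python
# Hypothetical source extract; "id" is assumed to grow monotonically.
source = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 20.0},
]

def incremental_load(source_rows, warehouse, last_seen_id):
    """Append only rows newer than the watermark; return the new watermark."""
    new_rows = [r for r in source_rows if r["id"] > last_seen_id]
    warehouse.extend(new_rows)
    return max((r["id"] for r in new_rows), default=last_seen_id)

warehouse = []
watermark = incremental_load(source, warehouse, 0)          # initial load
source.append({"id": 3, "amount": 30.0})                    # new data arrives
watermark = incremental_load(source, warehouse, watermark)  # only row 3 moves
```

The payoff is exactly what the comment claims: the first load yields results quickly, and each later run touches only the new data instead of repeating the full preparation.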

So, I wouldn't write off SaaS or hosted BI just yet. It seems as though this space is still in the early stages and there are many promising companies. I think you're absolutely correct, though, in pointing out that preparation of data before analysis is the biggest obstacle to overcome, and it will take a lot of ingenuity to figure this out.

You're right--hosted BI removes many of the financial and technical barriers that small companies in particular face. Above all, it prevents an in-house IT department that lacks the necessary skills from becoming a bottleneck.

Where do you work? I prefer commenters to identify their business affiliations.

At LucidEra, our belief is that BI needs to be simple to set up and simple to use. The SaaS model is attractive not only because of its subscription-based pricing, but also because of the focus on prebuilt, yet configurable, analytic applications for business people, not tools for IT and developers.

As vendors focus on analytic innovation and customer success, the trend from on-premise to on-demand will continue in the BI market.