Microsoft Power BI, Analysis Services, MDX, DAX, M, Power Pivot and Power Query

PASS Summit Day 2: The Aftermath

Well, that last blog post sparked a bit of a discussion, didn’t it? Indeed, I’ve spent the last few days doing a lot of talking to various different groups of people – PASS attendees, fellow MVPs, Microsoft – about what was or wasn’t said in the various announcements made at PASS, what I did or didn’t mean, and how people are interpreting or misinterpreting the news. And now it’s time to follow up with another blog post to explain myself better and say what’s happened since Thursday; you may also want to read this official statement about the roadmap from TK Anand here before carrying on reading this post: http://blogs.technet.com/b/dataplatforminsider/archive/2010/11/12/analysis-services-roadmap-for-sql-server-denali-and-beyond.aspx

First of all, let me start by making it clear that Analysis Services overall is alive and well, and in fact has a greatly increased role to play in the BI stack in Denali. My original post pretty much said as much. Some of the confusion, though, stems from the fact that ‘Analysis Services’ in Denali will have two distinct parts:

1) The UDM, or Analysis Services cubes, which is what we have today. Some people refer to it as MOLAP SSAS but I don’t like this description: it highlights the storage mode when in fact I consider its distinguishing feature to be its multidimensional view of the world. Personally I couldn’t care less about storage modes and can’t wait to see Vertipaq replace MOLAP, but I do care about multidimensionality and its advantages when it comes to BI – some BI applications, typically ones which need complex calculations, can only be built using a true multidimensional OLAP database. I’d say anyone who thinks the point of using the UDM is that MOLAP is (or has been) faster than relational database engines has completely missed the point. However, multidimensionality is complex and somewhat inflexible, and that’s what puts lots of people off.

2) The new BI Semantic Model, BISM. This is what’s new in Denali, and features a more relational, tabular way of modelling data as well as the new Vertipaq storage engine. BISM is a little bit multidimensional (it is after all still SSAS under the covers) but not much: that’s exactly why it’s easier to use, more flexible and appropriate for a wider range of BI applications. It will be a massive asset to the MS BI stack and make building many types of BI applications quicker and easier. It will probably not do everything that the UDM does, though, precisely because it is not as multidimensional.
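The practical difference between the two models shows up most clearly in their calculation languages. As a hedged illustration – the cube, table and column names below are hypothetical, invented for this sketch rather than taken from any announcement – here is a year-on-year growth calculation written first as an MDX calculated member against a UDM cube, and then as a rough DAX equivalent in a BISM/tabular model:

```mdx
-- Sketch only: year-on-year growth as an MDX calculated member in a UDM cube.
-- [Sales Cube], the [Date].[Calendar] hierarchy and [Sales Amount] are
-- illustrative names, not from the announcements.
WITH MEMBER [Measures].[Sales YoY Growth] AS
    ([Measures].[Sales Amount]
     - ([Measures].[Sales Amount],
        ParallelPeriod([Date].[Calendar].[Calendar Year], 1)))
    /
    ([Measures].[Sales Amount],
     ParallelPeriod([Date].[Calendar].[Calendar Year], 1)),
    FORMAT_STRING = 'Percent'
SELECT {[Measures].[Sales Amount], [Measures].[Sales YoY Growth]} ON COLUMNS,
       [Date].[Calendar].[Calendar Year].MEMBERS ON ROWS
FROM [Sales Cube]
```

```dax
-- Sketch only: the same idea as a DAX measure in a tabular/BISM model.
-- Sales[SalesAmount] and 'Date'[Date] are illustrative names.
Sales YoY Growth :=
( SUM ( Sales[SalesAmount] )
    - CALCULATE ( SUM ( Sales[SalesAmount] ),
                  SAMEPERIODLASTYEAR ( 'Date'[Date] ) ) )
  / CALCULATE ( SUM ( Sales[SalesAmount] ),
                SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```

The tabular version needs no dimension or hierarchy design at all, which is exactly the flexibility BISM promises; but more elaborate multidimensional logic – scoped assignments, many-to-many patterns, calculations that navigate arbitrary hierarchies – is where the UDM still earns its keep.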

The point I was trying to make in my original post was that the announcements made at PASS, as I and everyone I spoke to there interpreted them, made me very concerned (to say the least) for the future of the UDM and the multidimensional model. First of all there was the news that Microsoft was putting all of its development efforts into Vertipaq and BISM, while the UDM was (for yet another release) getting very few obvious improvements. Then there was the news that Project Crescent was only going to support BISM as a data source and not the UDM, which made it seem like the UDM was a second class citizen in this regard. And finally there was a lack of clarity in the roadmap which meant I wasn’t sure whether BISM was meant to replace the UDM or not, or whether BISM would ever be able to do the same things that the UDM can do today.

This is what caused all the commotion, and I’m pleased to say that after a lot of what’s generally referred to as ‘free and frank discussion’ behind the scenes the guys at Microsoft understand what happened. In part there was a failure of communication: I don’t think the Analysis Services team ever meant to send out a negative message about the UDM, and they were a bit surprised at my reaction. TK’s recent post that I link to above is a very clear and positive statement about the future of the UDM. But words need to be backed up by actions, and Microsoft know there need to be some changes to the Denali roadmap so that customers receive the right signals. As a result I hope to see a little bit more love shown to the UDM in Denali, to reassure all of us who have invested in it that Microsoft still cares about it; I also know that Microsoft are looking again at ways that Crescent can work with existing UDM applications; and I hope to see a clearer long-term vision to show how anyone investing in the UDM today will have the option, if they want, to move smoothly over to BISM when they feel they are ready. An argument about semantics is in no-one’s interests (I couldn’t help thinking of this); what I care about is that I’ll have all the cool new stuff that BISM will give me and still be able to do everything I can do today in the UDM, and that we’ll have all the power of relational and multidimensional modelling when we’re building our BI solutions.

So let’s be positive. There was a bit of a bust-up, but we’re all friends again now and I think the SSAS community is better off for having had this all come out now rather than later – and the fact that we can even have this type of discussion shows the strength and vibrancy of the community. I’m not afraid of change and I know it has to happen; I’m confident that the changes we see coming in Denali will be for the better. However I’m also a lot happier now that existing Microsoft BI customers have had this reassurance that they won’t be left stranded by these changes.

Well, it looks like Microsoft talked to you and wants to “assure” everyone that all is well. Yeah, just like ProClarity, right? I think that a new technology direction is a good thing – but not when you push the timeline out many years to get a good, stable solution. Long term, this is probably a good thing, but short term, MSFT will see a loss in market share… When the client tools AND the server technology are both in flux, how many customers will want to invest in that dream? Check back with MSFT in about 5 years and they will be winners.

Hi Chris,
I just wanted to thank you for your posts and the open discussion they caused. I shared your fears about the future of our beloved SSAS as we know it today, and when the Microsoft guys in T.K. Anand’s PASS session about “New Developments in Corporate BI” seemed, after some discussion, to be forced to tell us that “UDM (OLAP) models are not being deprecated”, the only thing I thought about was the unspoken “not yet”.
Therefore I think it was very important to clarify the roadmap right now, before the rumours spread to our customers and people lose confidence in SSAS. At least I have calmed down a little and can now enjoy the upcoming discussions about all the cool new stuff introduced last week.

I personally think that the long-term future of the UDM is best highlighted by the lack of a front-end tool. With BISM, MS have released a new model built on new technologies, all of which look good, and at the same time they are releasing a new, flashy front end for it. But they will not put any effort into a proper front end for the UDM.

BUT, MS bought ProClarity to provide this functionality as part of their product suite and have now killed it off without a replacement. They haven’t even got Excel connectivity working properly yet.

Yet for BISM, they are providing all sorts of new front ends for it. In my mind, I still think MS is going down their route of everything should be possible in a single click. And Multi-Dimensional doesn’t fit that.

Very nice follow up post Chris, thank you. I have been guilty of using the term MOLAP SSAS when in fact talking about the UDM and agree with your statements wholeheartedly about multidimensionality, its power and importance.

Another option in the MS BI stack for making things easier is very cool, but I am anxious to find out where the line is drawn between going with BISM or the UDM for a project. Most projects I undertake start very simple and then get very complex once the client understands the power of dimensionality. It sounds to me – or at least my opinion is – that BISM is a great stepping stone from traditional transactional reporting, but I am just wondering what the “cut-off” will be for it. I keep thinking of it as being like a granularity decision: one that, if made wrongly at the beginning of a project, is very painful to change later.

Chris, you were the right guy to say the right thing at the right time. Please keep being honest and say what you think. I think your first post on the BISM direction was spot on and was stated with sincere concern. A little blowing off steam can be a good thing if it gets the right people’s attention. Your follow-up was also completely appropriate and not at all a retraction. This is why so many look to you as an industry leader. I have my concerns about the new BISM/PowerPivot direction but also trust that the SSAS product team are smart enough to do this correctly with the community behind them – not that it won’t be a bumpy road.
If we just agree with everything and become part of the marketing engine, no one will take us seriously.

Thanks Chris for taking us through this and engaging the community in discussion.

Isn’t it time for us to change the way we do BI? Denali (is this a jumbled “Denial”, looking at the events after the Denali announcements?) and the new roadmap will usher in a new era for Microsoft and help them survive fierce competition, with the semantic model, real-time analytics, columnar databases, StreamInsight and self-service BI driven by cloud offerings.
Rather than analysing the past, let us find out how we are travelling today. That should be the new BI.

Need some help?

As well as being a blogger, I'm an independent consultant specialising in Analysis Services, MDX, DAX, Power BI, Power Query and Power Pivot. I work with customers from all round the world solving design problems, performance tuning queries and delivering training courses, and I am happy to work on short-term engagements. For more details see http://www.crossjoin.co.uk