Search results matching tags 'PowerPivot' and 'PASS'
http://sqlblog.com/search/SearchResults.aspx?o=DateDescending&tag=PowerPivot,PASS&orTags=0

Discount for PASS Business Analytics Conference 2013 #passbac #ssas #sqlpass
http://sqlblog.com/blogs/marco_russo/archive/2013/03/12/discount-for-pass-business-analytics-conference-2013-passbac-ssas-sqlpass.aspx
Tue, 12 Mar 2013 12:02:00 GMT
<p>One month ago <a href="http://sqlblog.com/blogs/marco_russo/archive/2013/02/08/first-spring-conference-pass-business-analytics-conference-and-sql-bits-passbac-sqlbits-sqlpass.aspx">I wrote about my sessions</a> at PASS Business Analytics Conference 2013, in Chicago, IL on April 10-12, 2013. If you still have not registered, you can save $200 by using the code <strong>BAC228BL</strong>, and you should hurry, because there is an additional discount if you <a href="http://passbaconference.com/Register.aspx">register</a> by March 15, 2013.</p> <p>If you are too lazy to click through to the previous post, I will speak in two sessions:</p> <ul> <li><strong>Modern Data Warehousing Strategy</strong></li> <li><strong>Self-Service Data Modeling</strong></li> </ul> <p>And now that Data Explorer Preview has been made public, I can disclose that Data Explorer will be covered in my Self-Service Data Modeling session! I thought about writing an article about Data Explorer, but there is already good coverage and I suggest you read these blogs:</p> <ul> <li><a href="http://www.sqljason.com/2013/03/introduction-to-data-explorer-preview.html">Introduction to Data Explorer Preview for Excel</a> by Jason Thomas</li> <li><a href="http://cwebbbi.wordpress.com/category/data-explorer/">Several posts</a> by Chris Webb</li> <li><a href="http://blogs.msdn.com/b/dataexplorer/archive/2013/02/27/announcing-microsoft-data-explorer-preview-for-excel.aspx">Announcement</a> on the Data Explorer Team blog</li> </ul>

LASTDATE dates arguments and upcoming events #dax #tabular #powerpivot
http://sqlblog.com/blogs/marco_russo/archive/2012/10/01/lastdate-dates-arguments-and-upcoming-events-dax-tabular-powerpivot.aspx
Mon, 01 Oct 2012 17:04:00 GMT
<p>Recently I had to write a DAX formula containing a LASTDATE within the logical condition of a FILTER: I found that its behavior was not the one I expected, and I investigated further.
In the end, I wrote up my findings in <a href="http://www.sqlbi.com/articles/usage-of-dates-argument-in-a-row-context/">this article on SQLBI</a>; they apply to any Time Intelligence function with a &lt;dates&gt; argument.</p><p>The key point is that when you write</p><p><strong>LASTDATE( table[column] )</strong></p><p>you actually obtain something like</p><p><strong>LASTDATE( CALCULATETABLE( VALUES( table[column] ) ) )</strong></p><p>which converts an existing row context into a filter context.</p><p>Thus, if you have something like</p><p><strong>FILTER( table, table[column] = LASTDATE( table[column] ) )</strong></p><p>the FILTER will return all the rows of the table, whereas you probably want to use</p><p><strong>FILTER( table, table[column] = LASTDATE( VALUES( table[column] ) ) )</strong></p><p>so that the filter context existing before FILTER is executed is used to evaluate VALUES( table[column] ), avoiding the automatic expansion into CALCULATETABLE, which would hide that filter context. A small worked example follows at the end of this post.</p><p>If after reading the <a href="http://www.sqlbi.com/articles/usage-of-dates-argument-in-a-row-context/">article</a> you want more insights, read Jeffrey Wang's post <a href="http://mdxdax.blogspot.com/2011/01/dax-time-intelligence-functions.html">here</a>.</p><p>These days I'm speaking at <a href="http://www.sqlpass.org/sqlrally/2012/nordic/">SQLRally Nordic 2012</a> in Copenhagen, and next week I will be in <a href="http://www.sqlbi.com/courses/ssas-workshop-cologne-oct2012/">Cologne (Germany)</a> for an SSAS Tabular Workshop, while Alberto will teach the same workshop in <a href="http://www.sqlbi.com/courses/ssas-workshop-amsterdam-oct2012/">Amsterdam</a> one week later. Both workshops still have seats available, and the Amsterdam one has an early-bird discount until October 3rd!</p><p>Then, in November, I expect to meet many blog readers at <a href="http://www.sqlpass.org/summit/2012/">PASS Summit 2012</a> in Seattle, and I hope to find the time to write more articles on interesting aspects of Tabular and PowerPivot. Stay tuned!</p>
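<p>Here is the worked example mentioned above: a minimal sketch of both variants written as complete measures. The Sales table, its columns and the measure names are hypothetical, used only for illustration; the only difference between the two measures is the VALUES call inside LASTDATE.</p>
<pre>
-- Hypothetical model: a Sales table with a Date column and an Amount column.

-- Correct variant: VALUES ( Sales[Date] ) is evaluated in the filter context that
-- exists before FILTER runs, so LASTDATE picks the last visible date and FILTER
-- keeps only the rows of that date.
Sales On Last Date :=
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( Sales, Sales[Date] = LASTDATE ( VALUES ( Sales[Date] ) ) )
)

-- Problematic variant: inside the row context of FILTER, the Sales[Date] passed to
-- LASTDATE is expanded into CALCULATETABLE ( VALUES ( Sales[Date] ) ), so the row
-- context becomes a filter context and LASTDATE returns the date of the current row.
-- The condition is therefore always true and FILTER returns all the rows of Sales.
Sales On Last Date Wrong :=
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( Sales, Sales[Date] = LASTDATE ( Sales[Date] ) )
)
</pre>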
SQL Pass meeting in Zurich after #powerpivot workshop #ppws
http://sqlblog.com/blogs/marco_russo/archive/2011/03/30/sql-pass-meeting-in-zurich-after-powerpivot-workshop-ppws.aspx
Wed, 30 Mar 2011 05:44:00 GMT
<P>These are the last days to register for the <A href="http://www.powerpivotworkshop.com/courses.htm#zurich">PowerPivot Workshop in Zurich</A> on April 4-5, 2011. If you are willing to attend, hurry up!<BR>Moreover, on Thursday evening there will also be a <A href="http://www.sqlpass.ch/">Swiss PASS Chapter</A> meeting in Zurich, and Alberto Ferrari will present the session we already delivered in Copenhagen two weeks ago. This is the abstract:</P>
<BLOCKQUOTE>
<P><STRONG>PowerPivot / BISM and the future of a BI Solution</STRONG></P>
<P>The next version of Analysis Services will offer the BI Semantic Model (BISM), which is based on Vertipaq, the same engine that runs PowerPivot. DAX and PowerPivot were created as tools for Excel users, but in Denali they will be available in Microsoft's corporate BI stack, as part of Analysis Services. The impact of this technology is huge, because many assumptions made today for an OLAP cube (star schema models, surrogate keys and so on) might no longer be the optimal way to design a complete BI solution. <BR>This session is about this impending change: after an initial introduction to PowerPivot, DAX and the Vertipaq changes relevant to this topic, and some considerations about the design impact on the Data Warehouse, Data Marts and the ETL pipeline, the session will become an open discussion with all attendees, in order to share experiences, needs and technical challenges, and to understand future directions in the corporate BI world.</P></BLOCKQUOTE>
<P>If you want to participate in the evening meeting, please <A href="mailto:marco.russo@sqlbi.com">contact me</A> and I'll forward your request to the right people.</P>

Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology
http://sqlblog.com/blogs/stacia_misner/archive/2011/03/15/34174.aspx
Tue, 15 Mar 2011 23:04:53 GMT
<p>It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do, don’t despair. All sessions are recorded and will be available soon. Keep an eye on the <a href="http://www.sqlpass.org/24hours/spring2011/" target="_blank">24 Hours of Pass page</a> for details.</p> <p>And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken into two 12-hour days, so <a href="http://www.sqlpass.org/24hours/spring2011/SessionsbySchedule.aspx" target="_blank">check out the schedule</a> to see if there’s a session that interests you and fits your schedule. I’m pleased to announce that my business colleague Erika Bakse (<a href="http://erikasblog.datainspirations.com/" target="_blank">Blog</a> | <a href="http://twitter.com/BakseDoesBI" target="_blank">Twitter</a>) will be presenting on Day 2, her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!)</p> <p><strong>Multidimensional Thinking: The Presentation</strong></p> <p>My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract:</p> <p>“Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different from the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.”</p> <p>If you watched the presentation and want a copy of the slides, you can <strong><a href="http://datainspirations.com/uploads/MultidimensionalThinking.pdf" target="_blank">download a copy here</a></strong>. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did!</p> <p><strong>Kimball All the Way</strong></p> <p>There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources, and one that gets into much more detail about all kinds of scenarios (well beyond the basics!), is <a href="http://www.amazon.com/Data-Warehouse-Toolkit-Complete-Dimensional/dp/0471200247/ref=sr_1_2?ie=UTF8&amp;qid=1300145259&amp;sr=8-2" target="_blank">The Data Warehouse Toolkit (Second Edition)</a> by Ralph Kimball. Anything from Kimball or the <a href="http://kimballgroup.com/" target="_blank">Kimball Group</a> is worth reading.</p> <p>Kimball material might take reading and re-reading a few times before it makes sense.
From my own experience, I found that I had to build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn’t click with me initially. I’ve had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right!</p> <p><strong>Schema Generation</strong></p> <p>One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method you’re comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. Yes, the subject was Integration Services, but as part of that presentation, I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I <a href="http://blog.datainspirations.com/2010/09/13/building-a-data-mart-with-integration-services/" target="_blank">blogged about this presentation</a> in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops.</p> <p>Just as an FYI, there were two more Webcasts to finish the story begun with the data: <a href="http://www.idera.com/Events/RegisterWC.aspx?EventID=148" target="_blank">Accelerating Answers with Analysis Services</a> and <a href="http://www.idera.com/Events/RegisterWC.aspx?EventID=149" target="_blank">Delivering Information with Reporting Services</a>.</p> <p>If you want to cut to the chase and learn how to use Analysis Services to build the tables, you can see the <a href="http://msdn.microsoft.com/en-us/library/ms174954.aspx" target="_blank">Using the Schema Generation Wizard</a> topic in Books Online.</p>

The Microsoft BI Roadmap: BISM, UDM and Beyond
http://sqlblog.com/blogs/marco_russo/archive/2010/11/15/the-microsoft-bi-roadmap-bids-udm-and-beyond.aspx
Mon, 15 Nov 2010 13:05:00 GMT
<P>Microsoft recently announced a new roadmap for its BI architecture. The next version of SQL Server, codenamed “Denali”, is going to introduce a new semantic model named BISM (Business Intelligence Semantic Model). Analysis Services will host it, and it will be queryable through MDX and DAX. DAX was introduced in PowerPivot as an expression language; in Denali it will be extended to also provide query capabilities, but it will keep its nature as a “functional” language.</P> <P>A more complete description of this roadmap has been published in a <A href="http://blogs.technet.com/b/dataplatforminsider/archive/2010/11/12/analysis-services-roadmap-for-sql-server-denali-and-beyond.aspx">blog post by the SSAS development team</A>.
Since we still don’t have a working beta product to test (CTP1 of Denali doesn’t include any new SSAS features), I can only offer some considerations based on the information I gathered at PASS Summit 2010 and during private meetings and conversations with members of the SSAS development team. You can of course read other interesting posts from <A href="http://cwebbbi.wordpress.com/2010/11/14/pass-summit-day-2-the-aftermath/">Chris Webb</A> and <A href="http://prologika.com/CS/blogs/blog/archive/2010/11/13/business-intelligence-semantic-model-the-good-the-bad-and-the-ugly.aspx">Teo Lachev</A> to see some of the concerns that the announcements made at PASS have raised in the MS BI community.</P> <P>In the long term, Microsoft’s strategy is to provide a BI platform for everyone, offering the same basic building blocks to any user interested in building a data model for any kind of reporting or analytical need. Many have tried to do the same in the past, and Microsoft itself tried by introducing the UDM (Unified Dimensional Model) several years ago. The UDM is great for building models that can be expressed in a multidimensional way, but it might be too complex for simple reporting purposes. Its learning curve requires a certain investment just to start a simple project, and many developers who are used to SQL simply refuse to approach MDX and the UDM just to build a few reports. For these reasons, and also to counter other vendors’ products, Microsoft is going to introduce a new “big thing”: BISM.</P> <P>To describe BISM, the best thing is to look at PowerPivot today. You define a model simply by defining tables, relationships and calculations, the latter written in DAX. These concepts are very familiar to both Excel users and developers who are used to relational databases. So, why not use SQL? The reason is that in PowerPivot (and later in BISM) relationships are part of the model, whereas in an RDBMS a relationship is just a relational constraint. And, most important, DAX is a language that is very simple at the beginning and can be learned in a very incremental way. Under the covers there is a calculation engine called Vertipaq. It is very fast: faster than any competitor, and also faster than the columnstore indexes that will be implemented in SQL Server Denali. But BISM will also allow querying an underlying relational database in pass-through mode (in Denali, only SQL Server will be supported for this type of real-time usage), something that is very important to establish BISM as the “unified model” for any reporting need. Finally, to query BISM you can use MDX and, in Denali, also DAX (which will be extended for this purpose), making it easier to express a query over a set of unrelated tables, something that would be nearly impossible in MDX and the UDM today.</P> <P>BISM sounds very promising, and the long-term strategy is very consistent. What has caused concern for many of us is the transition strategy. After many discussions and much thought, this is the roadmap I can share with you:</P> <UL> <LI><STRONG>UDM is here to stay</STRONG>. It is a full multidimensional model that can be used to create complex models with complex calculations. If your business model fits well in a multidimensional model, this is something that can make your life easier. </LI> <LI><STRONG>BISM will not replace UDM</STRONG>. At least, it will not replace all the features of the UDM anytime soon.
In the long term, BISM will be able to satisfy the requirements of any data analysis and reporting need, but in its first release it will not have this level of coverage. </LI> <LI><STRONG>BISM will be far better than similar products from other vendors</STRONG>, even if the UDM will remain more advanced than BISM for very specific requirements. At least, this is Microsoft’s goal. If you look at BISM and the UDM from this perspective, the overall architecture makes much more sense. BISM will be much more interesting than the UDM to customers who are used to other BI technologies, which are less advanced than the UDM but good enough for their requirements. </LI> <LI><STRONG>Existing UDM implementations will continue to work in SSAS</STRONG>. There is no reason to plan a migration now. Only after BISM is released in a version that can satisfy all the existing requirements of your project might a migration be considered, and even then it will not be required, because the UDM is not going to be deprecated. The recent case study of a 12TB cube implemented by Yahoo! should be a good point in support of this statement. </LI> <LI>New projects starting before the Denali release should be implemented using the UDM. Only in cases where the UDM doesn’t fit the requirements (e.g. massive leaf-level calculations resulting in low performance) should an early adoption of Denali be considered. </LI> <LI>New projects starting after the Denali release should be implemented in BISM if it fits all the requirements. Many projects that wouldn’t have been implemented in the UDM today (because some SSRS reports on an RDBMS are “good enough”) might be considered for a BISM implementation. This is probably the key selling point for Microsoft: getting <B>new customers</B> for Analysis Services by offering BISM as a more affordable entry point for a BI solution than the UDM. Ideally, this category will also contain all those projects that today would be implemented in the UDM just because it is the only “semantic model” available to let a user navigate data with Excel. </LI> <LI>In the years to come, as BISM becomes ever more feature-complete compared to the UDM, it will become a viable alternative to it. Only time and user adoption will tell whether BISM will be able to completely replace the UDM. From my point of view, it will require at least three release cycles to reach a point of real competition, which means we will see new projects starting on the UDM at least until 2015. Considering Microsoft’s traditional support policy, any investment made in the UDM will be safe at least until 2025/2030. That is a very long time. </LI> </UL> <P>Thus, I’m really confident in the server-side strategy. I still need to hear more news about the client side, even if the rumors sound better than the actual evidence.</P> <UL> <LI><STRONG>Excel</STRONG> is the primary BI client tool. It navigates data by using MDX, and it natively supports both UDM and BISM. It seems that there is an important ongoing effort that will see the light in the next release of Excel. I really don’t have any other information here, and I can only speculate that some of the former ProClarity features will be implemented inside Excel. What I do know is that the resources devoted to the BI client part of Excel are greater than ever today.
</LI> <LI><STRONG>Crescent</STRONG> is the codename for a new ad hoc reporting and data visualization tool that functionally resembles <A href="http://en.wikipedia.org/wiki/Microsoft_Data_Analyzer"><STRONG>Data Analyzer</STRONG></A>. Yes, it is completely new, much more graphical, more interactive… but the basic idea is fundamentally the same. It is (like Data Analyzer was) a tool complementary to Excel, not an alternative to it. This tool was supposed to generate queries only in DAX, which would exclude the possibility of querying an existing UDM model. However, I would wait a few weeks for an official statement from Microsoft about Crescent support for existing UDM models. </LI> <LI><STRONG>Reporting Services</STRONG> and <STRONG>Report Builder</STRONG> should support BISM natively. Today they already support the UDM through MDX. They should be able to query BISM in MDX as well, but DAX support should be considered to make life easier for developers who are not used to MDX. I don’t have information about this kind of support, but it would be the natural evolution. </LI> <LI>I haven’t heard any news about <STRONG>PerformancePoint</STRONG>, but I can imagine it will get BISM support as a natural evolution as well. However, because PerformancePoint should be aligned with Excel, I suppose we will see a new version of Excel and PerformancePoint only in 2013. In any case, MDX will be available to query BISM from PerformancePoint, in case a Service Pack with BISM support is not released in time. </LI> </UL> <P>As you can see, we are just at the beginning of a major wave of innovation in the BI space. In this case, the innovation starts from self-service BI and will grow until it reaches corporate BI at a more pervasive level. A key point of the Microsoft strategy is the Vertipaq engine. Only recently have I started to understand how disruptive this technology can be. I know very well that many UDM cubes today run on servers that have more RAM than the cube size. Not every project is inside these boundaries, but many are. And with Vertipaq compression, the bar is simply higher.</P> <P>Finally, this is my advice for current and future BI developments:</P> <UL> <LI>If you are a company that wants to start a BI project, don’t wait: go with the UDM now. </LI> <LI>If you are a BI firm or consultant, start your DAX training by using PowerPivot. It is an excellent tool for prototyping, and you can use it to train yourself and to prepare proofs of concept of BI models for your customers; for now, continue the implementation using the UDM. A minimal sketch of what that first DAX looks like follows right after this list. (Commercial: my <A href="http://www.amazon.com/dp/0735640580/?tag=se04-20">recent book</A> has several chapters about DAX, too.) </LI> <LI>When a feature-complete CTP of Denali becomes available later next year (maybe not very soon), start exploring it to understand its capabilities and whether they fit your requirements. </LI> <LI>Once BISM reaches a feature set that satisfies your requirements for a new project, start considering it, because development time might be considerably lower and the required skills could be easier to build, especially if your data model is not too complex. </LI> <LI>Whatever you do in your professional life, if you are reading this blog you have to learn DAX. You can start today, and my recent book can be a good starting point, also covering more advanced data models and calculations. </LI> </UL>
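<P>As mentioned in the list above, here is a minimal sketch of the kind of DAX you write on the first day of that PowerPivot training. The Sales table, its columns and the measure names are hypothetical, used only for illustration, and the "name := expression" form is just a convention for listing measures; in the PowerPivot window you type only the expression itself.</P>
<pre>
-- Hypothetical Sales table with SalesAmount, TotalCost and Channel columns.

-- Calculated column: evaluated row by row when the table is processed.
Sales[Margin] = Sales[SalesAmount] - Sales[TotalCost]

-- Measure: aggregates whatever the current filter context selects in the PivotTable.
Total Margin := SUM ( Sales[Margin] )

-- Measure with CALCULATE: the same aggregation, restricted to a single channel.
Internet Margin := CALCULATE ( SUM ( Sales[Margin] ), Sales[Channel] = "Internet" )
</pre>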
<P>A final thought is about MDX. I know that mastering MDX is hard, but I cannot say that DAX is that much simpler. Yes, it is simpler at the beginning, but for more complex calculations the required DAX expression might be more complex than the corresponding MDX one. Coming from a relational (SQL) background, DAX is more intuitive at the beginning; but coming from an MDX background, it is easier to learn the more advanced parts of DAX, which allow you to create the more complex and powerful expressions that solve real-world problems in a very efficient way. Thus, your investments in MDX are preserved as well. Your MDX queries will still run, and you will still be able to write new MDX queries. But the more important asset you have is the MDX knowledge and understanding, which puts you in pole position to really master DAX too, even if further study will still be required.</P>

PASS Summit 2010 approaching
http://sqlblog.com/blogs/marco_russo/archive/2010/11/04/pass-summit-2010-approaching.aspx
Thu, 04 Nov 2010 22:41:00 GMT
<P><A href="http://www.sqlpass.org/summit/na2010">PASS Summit 2010</A> is approaching and I'm facing the dilemma of choosing between concurrent sessions in certain timeslots. I'm not alone in this situation, and discussing it with other people only increases my doubts, because I keep finding more reasons to look at <EM>other</EM> sessions as well, instead of finding reasons to exclude something. However, it seems that there will be some interesting announcements about Business Intelligence during the <A href="http://www.sqlpass.org/summit/na2010/Agenda/Keynotes.aspx">keynotes</A> (yes, there will be 3 keynotes, one per day), which <A href="http://www.sqlpass.org/summit/na2010/livekeynotes.aspx">will be available via Live Streaming</A>!</P>
<P>In the meantime, we're also preparing the <A href="http://www.sqlbi.com/workshop">PowerPivot workshop</A> in Amsterdam (December 1-2, 2010). The early-bird discount expires in a week, so don't wait: register today! <A href="http://www.sqlbi.com/workshop">www.sqlbi.com/workshop</A></P>

Many-to-many relationships in PowerPivot – more samples!
http://sqlblog.com/blogs/marco_russo/archive/2010/10/20/many-to-many-relationships-in-powerpivot-more-samples.aspx
Wed, 20 Oct 2010 16:55:00 GMT
<p>Alberto <a href="http://sqlblog.com/blogs/alberto_ferrari/archive/2010/10/19/powerpivot-and-many-to-many-relationships.aspx">wrote a really fine post</a> that should help you better understand how many-to-many relationships work in PowerPivot, using a lot of images that explain some concepts better than a thousand words. The example shown by Alberto is also interesting for another reason: it can open your mind about why Vertipaq (the engine of PowerPivot) is so important for scenarios that today cannot be covered very well by the MOLAP engine of Analysis Services. If you think of the many complex calculations that cannot be simplified by working on aggregations, and of the many complex models that cannot be correctly represented through a star schema, you will start to understand why Vertipaq will be so useful when it is implemented in Analysis Services for corporate BI.</p> <p>If you attend <a href="http://sqlpass.eventpoint.com/topic/details/BID321">PASS 2010</a> in Seattle or the <a href="http://www.sqlbi.com/workshop">PowerPivot Workshop</a> in the Netherlands, I will be happy to share thoughts and ideas about possible future uses of Vertipaq!</p>

PowerPivot book in stock!
http://sqlblog.com/blogs/marco_russo/archive/2010/10/10/powerpivot-book-on-stock.aspx
Sun, 10 Oct 2010 15:28:00 GMT
<P>My <A href="http://www.amazon.com/dp/0735640580/?tag=se04-20"><FONT color="#0066cc">Microsoft® PowerPivot for Excel® 2010: Give Your Data Meaning</FONT></A> book has been printed and is now in stock at many bookstores!</P> <P>This is another piece of good news from the past week, which has been full of good news. Alberto and I <A href="http://www.powerpivot-info.com/post/578-interview-with-marco-russo-and-alberto-ferrari-about-their-new-book-about-powerpivot">have been interviewed by Vidas Matelis</A> about the book, and we’ve received several registrations for our first <A href="http://www.sqlbi.com/workshop/">PowerPivot Workshop</A> in Europe in December.</P> <P>Our book is also available in the App Store for iPhone, iPod Touch and iPad. I tried it: the pictures are maybe too small for an iPhone screen, but the text looks very good and you can easily search through the book.</P> <P>Meanwhile, I’ve been working on several proofs of concept for new modeling techniques using the PowerPivot engine, which are very promising in view of the next version of Analysis Services. But I can’t say much about that: we have to wait another month to get more news at the next <A href="http://www.sqlpass.org/summit/na2010/">PASS Summit</A> in Seattle. Later this week, in another post, I will describe the two sessions I will present there.</P> <P>Now, I look forward to getting feedback from actual readers of the book!</P>
PowerPivot book RTM! And some DAX thoughts…
http://sqlblog.com/blogs/marco_russo/archive/2010/09/17/powerpivot-book-rtm-and-some-dax-thoughts.aspx
Fri, 17 Sep 2010 08:40:03 GMT
<p>Yesterday I had a nice time delivering the <a href="http://www.sqlbi.com/sqlbimethodology.aspx">SQLBI Methodology</a> session at <a href="http://www.sqlpass.org/24hours/fall2010/">24 Hours of PASS</a>. A few hours later, I got the news about the official RTM of the PowerPivot book I wrote with Alberto. The complete title is <a href="http://www.amazon.com/dp/0735640580/?tag=se04-20">Microsoft® PowerPivot for Excel® 2010: Give Your Data Meaning</a>, and we hope it will be a good book both for advanced Excel users (who are the primary target) and for any BI developer/analyst who wants to learn how to use this tool.</p> <p>You can read the introduction in the <a href="http://blogs.msdn.com/b/microsoft_press/archive/2010/09/16/rtm-d-today-microsoft-174-powerpivot-for-excel-174-2010-give-your-data-meaning.aspx">Microsoft Press announcement</a>. As you can see, DAX is a first-class citizen in this book. Many users will not need to read more than half of the book (even if they will probably copy and paste some formulas from the other chapters), but if you want to build complex models with PowerPivot, you have a lot of examples and advanced models to investigate and learn from. Often, the best solution is a mix of good model design, data cleansing and DAX expressions.</p> <p>Now, I know that many people think PowerPivot doesn’t matter much: they work on corporate BI, and self-service BI is not an option in their company. Well, if you are using SSAS and you want to be ready when the next version of Analysis Services arrives (maybe next year?), then <u>you have to learn DAX</u>. Learning it today with PowerPivot will put you in pole position when the new SSAS version is released (and DAX will probably be improved). And if you, like me, have a strong MDX background… well, remember the days when you learned MDX? When you were trying to use a SQL approach and hitting the wall of how different MDX is? The same will happen with DAX.</p> <p>Thus, take your time. You will learn the syntax of DAX in a few hours or days, but you will really learn how to use DAX only over a few weeks or months. Well, I needed months, really. There was no documentation when I started. Now you have Books Online, videos, articles, blog posts and, also, a brand new book! No more excuses now! The small sketch below shows what I mean by the difference between learning the syntax and learning the language.</p>
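<p>A minimal illustration of that difference, over a hypothetical model with a Sales table related to a Products table (all names are illustrative only): the first measure reads almost like SQL and takes minutes to learn, while the second one only makes sense once you start reasoning about how CALCULATE manipulates the filter context, which is the part that takes weeks or months.</p>
<pre>
-- Hypothetical Sales fact table related to a Products dimension table.

-- Day-one DAX: an aggregation any SQL or Excel user reads immediately.
Total Sales := SUM ( Sales[SalesAmount] )

-- Filter-context thinking: ALL ( Products ) inside CALCULATE removes the filter
-- on Products, so the measure returns the share of the current selection
-- over all products, whatever slicers or rows are in place.
Sales % of All Products :=
SUM ( Sales[SalesAmount] )
    / CALCULATE ( SUM ( Sales[SalesAmount] ), ALL ( Products ) )
</pre>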