This release includes a number of small enhancements and fixes including the following:

Enhancement: Allowing "Unlimited" Dataset sizes from PowerPivot – previously results were buffered through an internal memory structure that had a 2 GB limit, which resulted in most typical queries failing at around the 2 million row mark. We've now implemented a new streaming interface which removes this limit, and we have run tests exporting over 6 million records and creating a 6 GB CSV file.

Fix: Default Separators so that the option to run queries with non-US separators is used correctly when set in the Options screen

Fix: Tracing Query Plans for PowerPivot

Fix: Setting Focus to the Find box after typing Ctrl-F

Fix: "Linked Excel" output when connected to an Analysis Services Multi-Dimensional cube so that it always includes Cube=[CubeName] in the connection string

Fix: Crash in "Define and Expand Measure" when run against a Power BI model with a measure with the same name as one of the column names

In addition to the specific issues fixed above, numerous stability enhancements have been added as a result of crash reports that have been logged. Thanks to those of you who have submitted these reports when the program crashes, especially those of you who have taken the extra time to note down some extra information about what you were doing when the crash occurred. With some of the crash reports it's easy to figure out what happened from the stack trace and screenshot, but in other cases it's quite difficult. We have also seen some reports that appear to be from .Net faults or issues in some of the third-party libraries that we are using.

We've just found out that a January 2018 security update to the .Net framework for Windows 7 SP1 has been causing crashes in DAX Studio when accessing the File menu. Unfortunately this issue is outside of our control and affects any WPF-based Windows app which references the Windows font collection (which DAX Studio does in the Options window).

This fault typically manifests as a fatal DAX Studio crash with a CrashReporter dialog which reports an "MS.Internal.FontFace.CompositeFontParser.Fail" exception and a message saying "No FontFamily element found in FontFamilyCollection that matches current OS or greater: Windows7SP1"

Hopefully a follow-up patch for this will be released soon that will remove the need for people to apply manual fixes for this.

The major change in this version is to the tracing engine. We’ve introduced a new trace type, made some changes to the way the tracing windows operate, and incorporated some enhancements to crash reporting and logging.

Changes to the way trace windows work

Previously, when you clicked on a trace button the window opened and the trace started, and when you switched off the trace the window closed. The running of the trace and the visibility of the window were closely linked.

In v2.7 we have removed that tight linkage. When you click on a trace button the window opens and the trace still starts as it used to, but when you switch off the trace the window now remains open. The table below shows the two new states that trace windows now have.

| v2.6 and earlier | v2.7 or later |
| --- | --- |
| Window Visible – Trace Running | Window Visible – Trace Running |
| N/A | Window Visible – Trace Paused ** |
| N/A | Window Visible – Trace Stopped ** |
| Window Closed – Trace Stopped | Window Closed – Trace Stopped |

** the two new states in v2.7

All trace windows now have a number of additional controls in their title area.

Start – starts a paused or stopped trace

Pause – pauses a running trace

Stop – stops a running trace

Clear – clears any information captured in the current trace window

The tabs for the traces now also have an indicator to show their state so that you can see the state of a given trace at a glance. In the image below you can see that the All Queries trace is stopped, while the Query Plan trace is running and the Server Timings trace is paused. Note that while a trace is paused the server side trace is still active it’s just the DAX Studio UI that is paused, so expensive trace events like Query Plans can still have an impact on the server.

The other side effect of this change is that if a .dax file is saved while a trace window is open, when that file is re-opened the trace window will also re-open with the saved trace information, but the trace will now be in a stopped state (previously the trace would open and re-start). This prevents accidentally overwriting the saved information and also means that the saved trace information will open even if you cancel the connection dialog. (In v2.6 or earlier, cancelling the connection would cause the saved trace information not to open.)

The “All Queries” trace

The new trace type is called “All Queries” – which captures all queries against the current connection, regardless of the client application. This is useful for capturing queries from other client tools so that you can examine them.

When the trace is active it will capture all events from any client tool. The screenshot below shows a capture session that was running against a Power BI Desktop file. When you hover over the queries the tooltip shows you a larger preview of the query, and double-clicking on the query text copies it to the editor.

The “All Queries” trace has a few additional buttons in the title bar area.

The following button in the trace window title will copy all the queries matching the current filter to the editor pane.

The Filter button shows and hides the Filter controls, the clear filter button will clear any filter criteria from the filter controls.

Filters can be set for a specific type of query (DAX/MDX/SQL/DMX), for a duration range, username, database or query text. The filters all do a “contains” style search that matches if the text you type appears anywhere in the field.

Note: You cannot have the "All Queries" trace running at the same time as the Server Timings or Query Plan traces, as DAX Studio currently only runs one trace session per query window and these traces apply different filters to the trace events. We expect that the normal workflow would be to run the All Queries trace to collect a set of queries, then stop the All Queries trace and look at the long-running queries, maybe re-running some of the captured queries with one or both of the other trace types to help isolate performance issues.

Enhanced Tooltips

The amount of information in the tooltips for columns has been greatly extended. By default we now show the format string, min and max values, the number of distinct values and a sample of 10 values. There are now settings under File – Options that let you turn off either the basic statistics or the sample data information (or both) in the tooltip.

Crash Reporting

We’ve added the CrashReporter.Net component that will catch fatal crashes and give the user an option to submit a crash report containing the exception message and stack trace information that may help us resolve the issue.

Logging Improvements

We also now support a logging hotkey – holding down the left SHIFT key while starting up Excel or the DAX Studio standalone will start debug-level logging. There is a link in the Help – About dialog that will open the log folder. Alternatively, type Win+R, enter %APPDATA%\DaxStudio\logs and click OK, or put that address into the address bar in Windows Explorer and hit Enter.

Other Miscellaneous Fixes

Added support for opening .msdax files generated by SSMS

Fixed output of the time portion of datetime columns when exporting to CSV

Fixed a bug where table list sometimes did not update after changing the connection

Fixed missing define measure option for hidden measures

Fixed crash when right-clicking on the Query History window

Added installer support for the SQL 2017 versions of AMO and ADOMD – if you have both of these the installer should no longer insist on downloading the 2016 versions of these libraries.

Yesterday's release of DAX Studio 2.6.0 started off OK. We'd had over 400 downloads, but then at some point browsers started to treat the installer as a malicious file. We're not sure why, but all of them have started throwing up warnings. Chrome seems to be the worst offender, saying that the file is malicious and only giving you the option to discard it.

Both Firefox and Edge give you a very scary warning, but will let you download the file.

The advice on the Chrome "learn more" link is very generic and there does not appear to be any way of submitting for a re-assessment of this judgement.

I checked both the original file and the file downloaded from codeplex (just in case something had infected the file after it was uploaded to codeplex) using Google's virustotal.com site, and both times 0 out of 55 antivirus scanners reported any issues.

We seem to somehow have gotten into the habit of doing pre-Christmas releases every year, so why break with tradition now :)

The latest release of DAX Studio has a couple of new features as well as a bunch of small fixes.

The biggest single feature is the support for multiple result sets. Both SSAS 2016 and Power BI support sending multiple EVALUATE statements in a single batch. You can now do the same thing in DAX Studio and we will generate a numbered tab for each result set.

We've changed the Connection dialog so that connection types that are unavailable are just disabled instead of being hidden. There is also a little help icon with a tooltip which indicates why the particular option is disabled to help those that are new to DAX Studio.

We've added the ability to connect to SSDT Integrated Workspaces.

There is now a setting under File – Options to allow you to opt-in for notifications of pre-release builds. So when you launch DAX Studio if there is a new pre-release version available you will get a toast notification. I don't think we will always do a pre-release build, but there have been a number of times where it would have been nice to get a few more people testing out new functionality before doing the final release.

When querying measures, the formatting from the data model is now applied. Note that to do this we look for matches between the column names in the result set and the measures in your model. So if you use functions like ADDCOLUMNS or SUMMARIZE you need to make sure to give the output columns the same name as the underlying measure if you want formatting applied.
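For example, a query shaped like the following (the 'Product' table, [Color] column and [Sales Amount] measure are illustrative names, not from the release notes) would pick up the measure's format string because the output column carries the same name as the measure:

```dax
EVALUATE
SUMMARIZE (
    'Product',
    'Product'[Color],
    -- named to match the [Sales Amount] measure, so its formatting is applied
    "Sales Amount", [Sales Amount]
)
```

If the extension column were named, say, "Total Sales" instead, no matching measure would be found and the values would come back unformatted.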

And there have been a bunch of minor enhancements:

A link has been added to the download page from Help - About when a newer version is available.

Added parsing of record counts in Server Timings for SQL 2016 / Power BI Desktop.

Improved metadata search performance and the search box now stays open when it has keyboard focus.

There are also a number of fixes in this release, some minor, but some also address some annoying crashes:

Fixed an issue where Query Plans and Server Timings would not start when connected to PowerPivot.

Fixed an error when using the locale setting in the connection dialog

Fixed an issue with hidden animations running causing background CPU usage even when the app was idle.

Fixed crashes when refreshing metadata (also affects the automatic refresh when switching back to DAX Studio after editing your model).

Fixed PowerPivot connections so that they stay connected when you open another Excel file.

When I ran Redgate Reflector over Microsoft.AnalysisServices.Core.dll I came across this little gem:

It checks whether the object is of type IQueryBinding from the Microsoft.AnalysisServices.Core namespace, and as far as I can see nothing currently implements this interface. What this means is that if you pass any of the built-in binding classes to the Process method, it will always throw a NotImplemented exception.

The next version of DAX Studio has just been released. You can download this release and read the release notes here

Note: In this release we have updated the versions of the Microsoft ADOMD.Net and AMO libraries we reference to use the SQL 2016 versions. This gives us scope to access some of the new functionality in Power BI and SQL Server 2016, but may mean that you are prompted to download these when you upgrade from a previous version.

Some of the highlights of this release are:

New Features

Added an option to trace Direct Query events

There is now an option under File > Options where you can enable extra events in Server Timings for Direct Query based models. These events add extra overhead so you should only enable this option before you start tracing a Direct Query model and you should disable this option once you are finished.

Added Dynamic syntax highlighting

Earlier versions of DAX Studio contained hard-coded lists of syntax highlighting keywords and functions. The lists of keywords and functions used for syntax highlighting are now dynamically discovered from the data source. This has advantages when dealing with Power BI in particular, which can get new functionality added from one month to the next.

Added rows and KB to server timings tab

Analysis Services 2016 and Power BI have added information to the server timing events about the number of rows and the size of data returned by each of the timing events. If this information is found it is now surfaced in the Server Timings tab.

Optimized DaxFormatter calls

The old version of the DaxFormatter.com API required a second call if there was an error to find out the details of the error. The nice people at DaxFormatter.com have updated their API so that this is no longer necessary.

Added an option to specify the default separator style

In the 2.4 release we introduced an option where you could convert on demand between the two different separator styles, but all queries had to be executed using the UK/US style.

The UK/US style is where a comma (,) is used as the list and thousands separator and the period (.) is used as the decimal separator.

e.g. EVALUATE FILTER ( 'Product' , 'Product'[List Price] > 1.25 )

The European/Other style is where a semi-colon (;) is used as the list separator, the thousands separator is a period (.) and the comma (,) is used as the decimal separator.

e.g. EVALUATE FILTER ( 'Product' ; 'Product'[List Price] > 1,25 )

Now you can choose which style you want to use as your default in File > Options menu.

Added an error message when you attempt to open a .dax file that no longer exists

Prior to this version, if you clicked on an entry in your recent file list which pointed to a file that had been renamed or deleted, you would just get a blank window with no idea what went wrong. Now an error is posted to the output window telling you what went wrong.

Bug Fixes

Fixed a bug where server timing traces were also listening for query plan events

Fixed incorrect removal of square brackets from MDX results

Fixed a race condition that happened sometimes when trying to capture both Query Plans and Server Timings

The next version of DAX Studio has just been released. You can download this release and read the release notes here

Some of the highlights of this release are:

Toggle Delimiters

A toggle delimiter feature has been added to switch between regional settings styles (commas vs semi-colons). Currently queries still need to be executed using the "US" style comma delimiters, but if you have expressions using semi-colon delimiters then you can easily switch back and forth. This can also be handy if you find an example expression online that is using different delimiters than what you have in your model.

This will swap between using comma delimiters and semi-colon style delimiters.

Intellisense Improvements

Added a Function Insight tooltip to intellisense which will appear after the opening bracket for a function, showing the parameters for the function as well as its description.

The loading and caching of intellisense data has also been vastly improved. The slight delay when starting to type a query against a large model should be gone, and you should no longer see issues when trying to edit a query in a window that is currently executing a query.

Often when you have a requirement to show a measure for the TopN of a given attribute the users will want to see the subtotal of all the other items.

With multi-dimensional cubes and MDX you can create calculated members in a dimension, but you can’t do this in a tabular model using DAX. You could create a calculated column, but then your TopN would be fixed and could not respond dynamically to changes in filters, as calculated columns are calculated and stored at processing time.

So one way to make this work is to have a placeholder row (or rows) with "Others" in the table which you are doing the TopN over.

The first step in this technique is to create a Rank measure (this saves having to re-type the rank logic multiple times). In this example I’m doing a top 5 Brands.

Brand Rank:=RANKX ( ALL ( Brand[BrandName] ), [Sales Amount] )

Then we create a top 5 measure that returns the sales amount if the rank is less than or equal to 5, returns the sum of the sales for all the remaining ranks if the current brand is "Others", and otherwise returns BLANK().
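The original post showed this measure as a screenshot. As a sketch only – built from the names introduced above ([Sales Amount], [Brand Rank] and Brand[BrandName]), so the exact logic in the original may differ – it could look something like:

```dax
Top5 Sales :=
IF (
    [Brand Rank] <= 5,
    [Sales Amount],
    IF (
        -- on the "Others" placeholder row, fold in everything outside the top 5
        HASONEVALUE ( Brand[BrandName] )
            && VALUES ( Brand[BrandName] ) = "Others",
        SUMX (
            FILTER ( ALL ( Brand[BrandName] ), [Brand Rank] > 5 ),
            [Sales Amount]
        )
    )
)
```

Note the inner IF has no else branch, so any other brand outside the top 5 returns BLANK() and drops out of the pivot table.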

Then if we create a pivot table with this measure we get the following result. In the picture below I’m showing the raw Sales amount so you can see how the amounts get re-assigned in the Top5 calculation.

This also works as you change the pivot table. When we introduce the Segment on the rows we can show a figure for “Others” within each segment. (Note that for this to work I had to create 2 placeholder “Others” rows, one in each segment). There is a link at the bottom of this post to the file that these screenshots were taken from.

At this point, I’m pretty happy. It seems to be working well, although calculating ranks is rather expensive so I’m wondering how it will work against larger data sets. I know that there are definitely ways to make this faster if you are using the current generation of DAX engines (Excel 2016, Power BI or SQL 2016), as we have access to things like variables and the INTERSECT function which we could use in conjunction with TOPN. But I think there are also opportunities to make this faster with “v1” DAX.

Running this algorithm against the Adventure Works Product Name column results in a performance profile like the following (using the server timings feature of DAX Studio)

This shows 86% of the time being spent in the formula engine. I’m pretty sure that a fair bit of this time is spent calculating ranks greater than 5 for the “Others” calculation. So I thought it might be faster to do a TOPN in descending order and get everything except for the first 5. The formula for that is the following
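That formula appeared as a screenshot in the original post. A sketch of just the "everything except the top 5" part (again using only the names introduced above – this is an illustration, not necessarily the original formula) might be:

```dax
Others Sales :=
SUMX (
    FILTER (
        ALL ( Brand[BrandName] ),
        -- keep only the brands that are NOT in the top 5 by sales
        NOT CONTAINS (
            TOPN ( 5, ALL ( Brand[BrandName] ), [Sales Amount] ),
            Brand[BrandName], Brand[BrandName]
        )
    ),
    [Sales Amount]
)
```

The idea is that TOPN only has to find the 5 best brands, so there is no need to compute a full rank for every brand just to decide whether it falls into "Others".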

This latest release includes a lot of UI polish features and we also now have support for SQL 2016 and Excel 2016 which is mainly an update to the installer and some changes to the Query Plan and Server Timings features to deal with some changes to the xmlSql that comes back in the trace events.

Following the theory that a picture is worth a thousand words – below are screenshots of the changes in this release.

The File –> Open menu now includes a list of recently opened files.

For performance testing you can now set the run button to always do a Clear Cache before executing a query. This mode is easily selectable using the new arrow menu on the run button.

The modal dialogs all have updated styling and now include a shaded overlay so that the active portions of the screen are clearly visible.

An options pane has been added to the File menu for setting global program options

A Query History pane has been added which records all of the queries run by Dax Studio. If you have the Server Timings feature enabled the Server duration and the FE / SE timing are also tracked. You can double click on an item in the query history to insert it back into the editor. This is great for performance tuning as you can easily see which variation of the query was fastest and returned the expected number of rows and then bring it back into the editor.

The metadata pane now loads asynchronously. In earlier versions the loading of the metadata pane was a blocking operation and the user interface could go unresponsive for a short time while loading large models. Now the metadata is loaded on a background thread so the interface remains responsive, and the pane that is updating is greyed out to indicate that the load is still in progress.

The new “Define Measure” feature, which is a right-click option on a calculated measure, is a great way to see how a measure was defined without opening up the whole model, or to use as a starting point for testing some variations on the logic.

There are also a number of small bug fixes and tweaks, and a number of issues that were raised on codeplex have been fixed (we always tag closed issues with the release they were fixed in).

Just released an update to DAX Studio bundling up the following fixes and enhancements:

Support for the upcoming release of Power BI Desktop

Dynamic Ribbon resizing

Remembering of Window size and location

Fixed display of hierarchies in Excel 2013

Fixed quoting of keywords in table names

Fixed an error when duplicate column names are returned in a query

Fixed an issue when changing databases when connected to a Multi-Dim server

Fixed an issue when connecting as a non-admin

I’m particularly happy with the dynamic resizing that is part of the Fluent.Ribbon control. It took a bit of effort to figure out, but it provides a much better experience than having to scroll the ribbon when window size is reduced.

So on a wide-screen display you still get the full ribbon in all its glory

Then on an older 1024px wide screen a few of the labels in the Edit and Find groups drop off

Then things dynamically shrink until the groups finally start to collapse into dropdown menus. So if you want to go split-screen on a small display everything still works, even at 500px wide, without needing to scroll the ribbon.

Below is an outline of the new functionality you will find in this release.

Intellisense

DAX Studio now has intellisense support. It's a fairly basic implementation, but I think I've got it working fairly smoothly. Note that it’s based on the best practice of always prefixing columns with the table name and never prefixing a measure with a table name. So if you type ‘table name’[ you will get intellisense for all the columns in the ‘table name’ table, but if you just type [ you will be presented with a list of all the measures in the model.

The intellisense brings up table, column and measure metadata information from the current model:

As well as functions and keywords, including the descriptions for functions. The function metadata comes from the DISCOVER_FUNCTIONS DMV, so it should accurately reflect the capabilities of the current data source (so you will see some additional functions when connected to newer data sources such as Power BI Designer):
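If you want to browse function metadata by hand, a related schema rowset can be queried directly from a DAX Studio query window. (MDSCHEMA_FUNCTIONS is a standard rowset for function metadata; it is not necessarily identical to the DISCOVER_FUNCTIONS list used internally.)

```dax
-- returns one row per function exposed by the connected engine
SELECT * FROM $SYSTEM.MDSCHEMA_FUNCTIONS
```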

Tracing (Query Plans & Server Timings)

One of the most significant new features is support for the tracing features (Server Timings & Query Plans) against PowerPivot. This has involved a significant amount of work re-building the internal code which handles this, but the end effect is that the Query Plan and Server Timings buttons are now enabled when connected to a PowerPivot model.

We've also altered the layout of the server timings information

I found when I was reading the results that I was always comparing Total Duration vs Total CPU and comparing the ratio of these two figures, so I've added a little "parallelism factor" under the Total CPU. I also found that I was looking to compare FE vs SE duration and the number of SE Queries to the number of SE Cache hits, so I've moved them onto the same lines.

Support for Multi-Dimensional SSAS Servers

Although it’s a little bit of a novelty, DAX Studio now properly supports connecting to multi-dimensional servers. This kind of worked previously if the default database for your multi-dimensional instance only had a single cube, but threw an error if it had multiple cubes. I’m now injecting a Cube= parameter into the connection string when connected to a multi-dimensional server, so this should no longer be an issue.

Support for Multiple Power BI Designer instances

We now have support for multiple instances of Power BI Designer, so if you have more than one copy of the designer open you can see the names of the .pbix files that you have open and choose which one to connect to.

Highlighting of unnatural hierarchies

Unnatural hierarchies can have a negative impact on performance so we highlight those in the metadata pane.

Bug Fixes

There have been a number of minor bugs that have been fixed along the way, including a number that were not officially reported, but were discovered during testing.

FIXED - Doubling up of backslash characters when using DAX Formatter

FIXED - Unable to connect to SSAS servers when not an administrator

FIXED - changing the connection for a query window would sometimes throw an error

FIXED - cannot connect to a PowerPivot model where the file path contains single quotes

Known Issues

The following issue which was first identified in 2.1.1 still remains:

"Linked Table" output for the Excel add-in only works for PowerPivot models if there is at least one table in the model that was created by linking data from an Excel table. We have not been able to find a workaround for this yet; it looks like it might be due to a bug in the Excel object model.

I saw a question yesterday on the MSDN forums asking how to go about exposing the LastProcessed datetime per table. Marco has a great post here (https://www.sqlbi.com/articles/last-process-date-in-ssas-tabular/) about a number of ways to get the last processed date at the database level, but none of these techniques work at the table level.

Looking through the various DMVs that are available, none of them seemed to have this information apart from DISCOVER_XML_METADATA, which returns a large XML document that can be tricky to query. However, the information in this DMV is also used to populate the AMO object model, so it should be possible to use AMO to pull out the last processed datetime per table and then push it in using a variation of processAdd push-mode processing.

I started by creating a new table in my database called LastProcessed using the following query as a source

SELECT CAST('' as varchar(256)) as TableName, Getdate() as LastProcessed WHERE 1=2

This generates an empty table with a string column and a datetime column which will be the target for our processAdd push processing.

I was not exactly sure how to construct the XMLA for the processAdd command, so I created a simple SSIS package using a Dimension Processing Destination in a DataFlow task and captured the XMLA it produced using SQL Profiler. The command that Profiler captured was missing the <parameters> section that you would use for pushing a batch of data, so I adapted one of Greg Galloway’s excellent XMLA processAdd examples and then wrapped that in a PowerShell script.

Once that succeeds you can either expose the whole table or you could hide it and build measures to just expose the dates for specific tables.

Below is an example of querying this table in my adjusted Adventure Works database. You can see that ‘Internet Sales’ and ‘Reseller Sales’ have been processed at different times to the other tables in the database.
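The query itself appeared as a screenshot in the original post. A sketch of reading the table back – just a plain table scan over the names defined in the SQL source query above – would be:

```dax
EVALUATE 'LastProcessed'
ORDER BY 'LastProcessed'[LastProcessed] DESC
```

Sorting by the LastProcessed column makes it easy to spot tables (like ‘Internet Sales’ and ‘Reseller Sales’ in the screenshot) that were processed at a different time to the rest.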

Biml Details

There are a huge number of changes to Biml in this release. Below are highlights of some of the more notable changes and those that were most requested by users.

BREAKING CHANGES

RootNode.PackageProjects has now moved entirely into RootNode.Projects, including all namespaces. This is done to better support other project types such as analysis services projects. The mitigation is to change all instances of the <PackageProjects> tag to the <Projects> tag.

MITIGATION: If backwards compatibility with older versions of Mist is required, the PackageProjects/Projects element name can be conditionalized based on the tools version.

Removed QuerySource and Destination. These were abstractions that wrapped OleDbSource/AdoNetSource and OleDbDestination/AdoNetDestination in a single interface that offered the intersection of their features and automatically chose the correct component type to emit based on the referenced connection type. These language elements were not particularly useful, were very narrowly used, and their removal will prevent unintentional use by new customers.

MITIGATION: Change any instances of QuerySource or Destination to the appropriate OleDbSource/AdoNetSource and OleDbDestination/AdoNetDestination

All instances of the word Hadron have been removed from the project. Most notably, namespaces that included “Hadron” have 1:1 replacements with “Biml”

MITIGATION: Empty classes were added in the removed namespaces to prevent code with legacy namespace imports from breaking scripts

AstDimensionNode and AstFactNode have been removed. SSAS metadata is now added underneath the AnalysisMetadata element of table objects.

This is unlikely to affect most BIDSHelper users, since SSAS emission is only supported in Mist.

AnalysisServices connection property OverriddenConnectionString is renamed to ConnectionString. The Provider property is removed as it is no longer needed

This is unlikely to affect most BIDSHelper users, since SSAS emission is only supported in Mist.

Biml Language Changes

Directives

Added code directive to reference C# or VB files with code directly from Biml files. This enables a variety of scenarios around creating more complex and reusable .NET code for use within BimlScripts

Nested include files now work

Other

Added ObjectTag property to all Biml API classes. This is essentially a .NET Dictionary object that enables storage of .NET objects in a similar way to annotations in Biml.

Added support for external script project files in the language

Added GetPropertyValue method to all Biml API objects. This provides another option to get object property values when doing dynamic code generation.

Added Parse methods to all Biml API objects. This enables easy parsing of Biml code snippets to create Biml API objects. It is the inverse of the GetBiml functionality.

ConstraintMode has been set to default to Parallel, so that it no longer needs to be specified for most package objects

Added BuildSettings object to AstRootNode to access build settings in a central location

Added FriendlyName convenience property to columns

Fixed Biml reference resolution code so that references are updated even when DisableAllChangedEventInfrastructure is set to true. This is essential to enable the use of utility methods like IsUsedInPrimaryKey within live scripts on tables also imported within the live scripts

We are currently looking to decommission an old reporting portal based on another technology stack and replace it with SSRS running in SharePoint integrated mode. However, one of the requirements was to support a “pipe delimited” CSV export format.

There are a number of posts you can find about changing reportserver.config when running SSRS in standalone mode, but I could not find much information about how to do this when running in SharePoint integrated mode.

This code loops over all the service applications in the current farm, so even if you have a multi-tenancy type situation it *should* work, although in that situation you should do your own testing. Our farm only has the one SSRS service application.
