A while ago I was asked whether SSAS supports the transaction settings in SSIS. To answer that, we really need to understand what happens when we set the TransactionOption property to Required on an SSIS container.

When you switch on transactions in SSIS, it uses the Distributed Transaction Coordinator (DTC) to coordinate them. The DTC is a Windows service that coordinates transactions across multiple processes and/or machines.

I did not expect that SSAS would support DTC transactions, but I could not find a source that stated this explicitly. (I'll talk a bit more later about why I did not expect this to be supported.) SSAS does use a transaction system internally to keep things consistent within itself, but that is different from DTC transactions.

So I set up a simple package to test this. I added a sequence container with TransactionOption set to Required, then added two SSAS processing tasks separated by an empty sequence container that I could use to create a breakpoint between them.

Then I started the package and queried the DISCOVER_LOCKS DMV so that I could see the locks that were created during the processing.
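DISCOVER_LOCKS is exposed as a dynamic management view, so it can be queried with a simple DMX-style SELECT from any tool connected to the SSAS instance (e.g. SSMS):

```sql
SELECT * FROM $SYSTEM.DISCOVER_LOCKS
```

The LOCK_TYPE column in the result set is the one of interest here.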

If SSAS supported DTC transactions we would expect to see these locks held for the life of the transaction. The locks are important because they prevent other processes from reading or writing data that has not yet been committed. You can see in the screenshot below that we have lots of locks with a LOCK_TYPE of 2 or 4 (which are processing read and write locks).

But when we get to the breakpoint between the two processing operations, you can see in the screenshot below that all the processing locks have been released. The only locks visible are the LOCK_COMMIT_READ (type 8) locks held by the DMV query itself (all queries take out this sort of lock to prevent a commit operation occurring in the middle of a query). At this point the updated version of the Products dimension is available for querying, even though our SSIS transaction has not committed.

Then when we proceed to the next processing step the locks for the next operation are taken out.

Then once everything has finished, all the processing locks are released again.

I mentioned earlier that I did not expect SSAS to support distributed transactions. The reason is that if you inserted data into your data warehouse and processed your cube(s) in one transaction, a failure would mean that the whole transaction, including the insert into the data warehouse, would be rolled back, and I'm struggling to think of any time when this would be useful. It would make it really hard to troubleshoot issues when the data that caused the issue has been removed.

So hopefully, if you've made it this far, you can see that SSAS does not support distributed transactions (which is what SSIS transactions are), and that this is a good thing.

If you do want to process your SSAS objects within an SSAS transaction, that is the default behaviour: all objects are processed sequentially in a single transaction. But you can change this to process objects in parallel, or to process each object within its own transaction. Generally you will want to process everything in a single transaction, as this ensures everything remains in a consistent state.
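For reference, the same choice can be expressed directly in XMLA. The sketch below (the database ID and dimension ID are placeholders I've made up, not values from the original post) processes objects in parallel but still inside one SSAS transaction, because Transaction="true" is set on the Batch:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
       Transaction="true">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>MyOlapDatabase</DatabaseID>
        <DimensionID>Products</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
    <!-- additional Process commands can be listed here -->
  </Parallel>
</Batch>
```

Setting Transaction="false" instead would commit each processing command separately, which corresponds to the "separate transactions" option in the dialog.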

Below is the processing options dialog from the SSIS "Analysis Services Processing" task showing the various transactional options.

Comments on this post

Hi Darren, I wasn't surprised either. However, I would have liked distributed transactions to include SSAS. I feel that it is a common requirement to incrementally process a measure group and update a SQL control table with the high water mark. For example, I would like to incrementally process a measure group (which might take a while) and then do a singleton update on a SQL control table with the new high water mark, within the same transaction.

Hi Richard, I think you could probably achieve the result you are after by using explicit transactions in SSAS. Obviously if the processing fails it's easy enough to detect that and not update the control table. But if there is an issue updating the control table I guess you want to roll back the processing? I think this could be done with a script. I'll have to see if I can find some time to do some experiments.
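As a rough, untested sketch of what such a script might do: BeginTransaction, CommitTransaction and RollbackTransaction are real XMLA commands, but the overall flow below is my assumption, and all of the commands would need to be sent on the same SSAS session for the transaction to hold together:

```xml
<!-- 1. Start an explicit transaction on the session -->
<BeginTransaction xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" />

<!-- 2. Send the Process command(s) on the same session -->

<!-- 3a. If the SQL control table update succeeded, commit: -->
<CommitTransaction xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" />

<!-- 3b. Otherwise, roll the processing back: -->
<RollbackTransaction xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" />
```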

