Overview: Quick tip about using T-SQL in an Agent Job to refresh a SQL Server Reporting Services snapshot report, rather than a schedule.

SSRS has the capability to schedule the refresh of report snapshots via a report-specific schedule or a shared schedule. However, what if you don’t have a specific time you want the refresh to run? Alternatively, what if you want the snapshot to be refreshed after an event occurs, such as ETL completion?

The first step is finding the ReportID (aka ItemID) assigned to the report by ReportServer. The following query will return several pieces of information, including ReportID (aka ItemID):
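A query along these lines does the job (a sketch against the standard ReportServer catalog tables — dbo.Catalog, dbo.ReportSchedule, and dbo.Schedule; substitute your own report name in the WHERE clause):

```sql
USE ReportServer;  -- your report server catalog database

SELECT Cat.ItemID AS ReportID,
       Cat.Name   AS ReportName,
       Cat.Path,
       Sched.ScheduleID
FROM dbo.[Catalog] AS Cat
     INNER JOIN dbo.ReportSchedule AS RS
         ON Cat.ItemID = RS.ReportID
     INNER JOIN dbo.Schedule AS Sched
         ON RS.ScheduleID = Sched.ScheduleID
WHERE Cat.Name = N'YourReportName';  -- placeholder: your report's name
```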

Using this technique, you could add one or more T-SQL step(s) to kick off the refresh of snapshots after ETL completes (or whatever other event you wish to trigger the refresh).
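Such a job step boils down to one call to the AddEvent procedure in the ReportServer database — the same call found inside the Agent jobs SSRS generates for its own schedules. A sketch (the GUID is a placeholder; verify the exact @EventData value against the T-SQL inside the SSRS-generated Agent job for your schedule):

```sql
USE ReportServer;

-- 'ReportExecutionUpdateSchedule' triggers a refresh of the
-- report execution snapshot; @EventData is the ScheduleID GUID
-- returned by the query shown earlier (placeholder value here)
EXEC dbo.AddEvent
     @EventType = N'ReportExecutionUpdateSchedule',
     @EventData = N'00000000-0000-0000-0000-000000000000';
```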

If you are controlling refreshes via an Agent job step, you probably also want to make sure the report-specific schedule is set to “Once” or turn it off completely. If you leave the schedule active (such as the “Once” option shown below), you will still have an Agent Job present that is associated with this schedule (the same way subscriptions are handled). The name of this Agent job can be found by referring to the Sched.ScheduleID field in the query above.

Overview: An introduction to using snapshots in SQL Server Reporting Services, including an explanation of the differences between a “Report Execution Snapshot” and a “Report History Snapshot.”

Intro to the Ways Users Can Run Reports in Reporting Services

1. On Demand. Most reports typically are executed on demand – i.e., the query in the dataset runs and the report is displayed at the time the report is requested by the user. This is the default in “Processing Options” within Report Manager (accessed via the “Manage” menu option).

2. Cached Reports. If you have a long-running query that cannot be tuned further, one option you might look into is caching. With this method, the first user that executes the report waits for the report to render. Subsequent users view the cached report until the cache expires (after a specified number of minutes or on a schedule). Getting the benefit of caching depends on subsequent users having the same security and the same parameter choices that the first user had.

3. Snapshots. If you have a long-running query, or perhaps a dataset you don’t want users accessing at any & all times of the day, then a snapshot might be a great option. A snapshot can be thought of as a pre-executed report, usually run on a schedule. The rest of this blog entry focuses on using snapshots.

Prerequisites for Using a Snapshot in Reporting Services

There are a couple of requirements in order to use a snapshot. If you are familiar with subscriptions in SSRS, these will sound familiar.

a. Stored credentials for the data source. Windows authentication is not acceptable when a report is scheduled. If you aren’t using a data source with the ID and Password stored (so that data level security is a non-issue when it runs), you’ll get the message “Credentials used to run this report are not stored.”

Tip: Create a separate shared data source that has a name such as “MCGardenCenterDW_StoredCreds” so you know at a glance it’s using a specific ID and Password as opposed to Windows Authentication (assuming Windows auth is your default). You may even put any data sources with stored credentials into their own folder with limited permissions.

b. Defaults for all Parameters. Another prerequisite for using snapshots is that all parameters need a default value defined. This makes sense as there’s no user interaction when a snapshot is being populated at, for instance, 4am.

Tip: Sometimes you might have to be a bit clever in order to ensure all parameters have a default. For instance, if the default is always “current fiscal period” that continually changes, you can make this happen with an intermediary hidden parameter – it might just take a bit of extra time to think up a good solution.
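One way to feed such a hidden parameter is a small dataset query that looks up the current fiscal period. A sketch, assuming a date dimension (the dbo.DimDate table and FiscalPeriod column names are hypothetical — adapt to your own model):

```sql
-- Returns the current fiscal period so a hidden parameter
-- can use this dataset as its default value at snapshot time
SELECT FiscalPeriod          -- hypothetical column
FROM dbo.DimDate             -- hypothetical date dimension
WHERE FullDate = CAST(GETDATE() AS date);
```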

Scheduling a Snapshot

As you are perusing the options when you “Manage” a report, you will notice there are two different schedules associated with snapshots. First, there are the scheduling options on the “Processing Options” page:

Secondly, there are scheduling options on the “Snapshot Options” page:

Initially you might say “huh?” or wonder which should be set. The key here is that there are really two kinds of snapshots…

Two Different Types of Snapshots in Reporting Services

On the Processing Options page, you would schedule a “Report Execution” snapshot. The primary purpose for a Report Execution snapshot is usually to improve performance by reducing report rendering time, or to ensure queries are passed to the source database at very specific times.

Conversely, on the Snapshot Options page, you are setting up a “Report History” snapshot. The primary purpose for a Report History snapshot is to keep a copy of the report at a specific point in time.

So, basically you want to make sure you set the schedule associated with the purpose you’re trying to accomplish. Usually it’s just one or the other, but it could be both depending on the requirements. Microsoft explains it like this:

“Snapshots that are generated as a result of report execution settings have the same characteristics as report history snapshots. The difference is that there is only one report execution snapshot and potentially many report history snapshots. Report history snapshots are accessed from the History page of the report, which stores many instances of a report as it existed at different points in time. In contrast, users access report execution snapshots from folders the same way that they access live reports. In the case of report execution snapshots, no visual cue exists to indicate to users that the report is a snapshot.”

Avoiding the “Selected Report is Not Ready for Viewing” Error

Let’s say on the Processing Options page you checked the radio button to “render this report from a report snapshot.” Then you scheduled it on the Snapshot Options page. Sounds reasonable, right? You can even view the report within the Report History just fine. However, when you click on the report name (i.e., the normal way to run a report) you get a message: “The selected report is not ready for viewing. The report is still being rendered or a report snapshot is not available.”

Why the message? And where’s the report? What happened is that with the setup just described, a Report History snapshot now exists but no Report Execution snapshot exists. So, SSRS has no report to render given this circumstance. The resolution depends on what you are trying to accomplish, but you probably want to schedule it on the Processing Options page instead. (Alternatively, if the data rarely changes you could control when it’s created by using the Apply button option. The Apply button is also useful for generating the first snapshot for testing.)

Viewing Report Execution Snapshots within Report History

There’s one more really important thing to be aware of. Let’s say you schedule your Report Execution snapshot within the Processing Options. The report runs fine, but you don’t see it within Report History.

By default the option to “Store all report snapshots in history” is not checked. If you want to see a Report Execution snapshot within history, you’ll want to check the box to do so within Snapshot Options. If you’re only using Report Execution snapshots for this report, you’ll probably want to store them in the history. However, if you happen to be using both Report Execution and Report History snapshots for the same report, you might want to leave it unchecked so they don’t get mixed up – i.e., if you are doing both, then the schedules will differ for a reason.

Letting the User Know When the Data was Refreshed

Since a snapshot will render data at a particular point in time, it’s very kind to the end users of the report to display the “data as of” date in the report header (or footer, however you have it standardized). To ensure it displays when the snapshot was created, you’ll want to use the ExecutionTime global field (as opposed to Now() or Today() type of functions). For example:
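A textbox expression along these lines does the trick (the label text and format string are just illustrative — adjust to your own standard):

```vb
="Data as of: " & Format(Globals!ExecutionTime, "MM/dd/yyyy h:mm tt")
```

For a snapshot, Globals!ExecutionTime reflects when the snapshot was created, not when the user happens to be viewing it.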

Overview: Quick tip about using Power View in Excel 2013 re: how data refreshes are handled, as well as requirements for data connectivity and Internet connectivity.

Power View Refresh Behavior Upon Opening the Excel File

Recently I had an “aha” moment when I realized Power View acts differently than a PivotTable when you open a previously saved Excel 2013 file. When you open a previously saved Excel file, a PivotTable renders its last saved state (if the external data connection is *not* set to “Refresh data when opening the file”). I usually explain to new self-service users that this PivotTable behavior can be a bit of a security hole if an Excel file is emailed to someone who doesn’t have formal security to see the data – the recipient can see the data last saved in the PivotTable, even though they couldn’t refresh it unless they have permission to do so.

However…Power View does not behave the same way. When you open a previously saved Excel file, it immediately wants to refresh the Power View sheet when you click on it. As far as I’m aware, Power View cannot render its last saved state, nor can it render a snapshot as of a point in time.

For example, the following screen shot displays a Power View sheet where the external Tabular model data source is unavailable. The Power View message says “Sorry, something went wrong while loading the model for the item or data source ‘DataSourceName’. Verify that the connection information is correct and that you have permissions to access the data source.”

This behavior tells us we need to have connectivity to our external data source in order to view the Power View report.

Online Connectivity Requirements for using Power View

From what I’ve learned so far, there are the following online requirements for Power View reporting in Excel:

Access to External Data Sources (i.e., if accessing an external Tabular model instead of an embedded PowerPivot model). This might mean a worker has to VPN into their office if using an Excel file while at home, for instance.

Internet Connectivity if using Maps (Power View integrates with Bing Maps).

Internet Connectivity if using Image URLs (as opposed to binary images embedded in the data model).

Noticed anything else along these lines? Please leave me a comment and I’d be happy to add it.

I am delighted to announce that Microsoft has granted me (me?!?) the MVP Award. The official description of this award: “Microsoft Most Valuable Professionals are exceptional technical community leaders worldwide who actively share their high quality real world expertise with others.” The Microsoft MVP public site is at: http://mvp.microsoft.com.

As the week has progressed since the announcement, finding the words to express my feelings about it is becoming harder rather than easier – there are so many people in the community doing great things! Not sure how I made the cut, but I’m looking forward to learning from all the other MVPs and Microsoft folks I’ll be exposed to, and continuing to contribute BI content to the community.

Many times I’ve said the Microsoft user community is very different from others. The extent to which information is freely shared is astounding. We should all greatly appreciate the ability to fire up our search engine and find advice which saves us time and teaches us something. So many people invest time and effort to make that happen. There are blogs, user groups, books, SQL Saturdays, webinars, Twitter, forums – with many great friendships made along the way.

This week I’ve been thinking a lot about how much our work lives depend on being fearless. What I mean by that – technology changes all the time, business methods vary, best practices evolve, we change, and the people around us change and grow. Many times we find ourselves doing something unfamiliar. The more experience I gain, the less I fear the unknown and the more confidence I have that we as a team can “figure it out.” The sheer volume of information that’s available to us in the Microsoft community to improve our skills makes “figuring it out” faster, easier, and with less risk for poor decisions.

Now I don’t want to pull a Sally Field moment here, but there’s a few people I want to call out. A heartfelt thanks goes to Rafael Salas who submitted an MVP nomination on my behalf. Wayne Snyder was instrumental in giving me the confidence to start blogging. My good friends Javier Guillén and Jason Thomas – these guys are always willing to review a draft, exchange ideas, or just go for a beer – they make me smarter by association. My employer, Intellinet, has been extremely good to me - the level of support and encouragement I’ve received from Leo Furlong and everyone at Intellinet has been amazing. Finally, my husband Bob is my rock. That says it all.

Today I received a really nice plaque and certificate. See the little 2013 disc on the left side of the MVP plaque? If an MVP award is renewed the following year, the recipient receives another disc to slide onto the plaque. Cool, huh?

Overview: Quick tip about how to insert a Power View worksheet in Excel 2013 that points to an external data connection (as opposed to an internal PowerPivot model). Also includes a tidbit at the end about why the behavior of data connections makes Power View unable to be used for snapshot reporting.

What Doesn’t Work With an External Data Connection

You’ve probably noticed on Excel’s Insert menu a choice for Power View.

The Insert menu works beautifully when a PowerPivot model is embedded within the same Excel workbook that is open. However, if an embedded PowerPivot model is not available, instead you get a message: “Power View needs data to work with.”

So, let’s investigate a different technique in order to make an external connection to an Analysis Services Tabular model.

Connecting to an External Data Source

If your connection already exists, choose it in Existing Connections and skip to Step 4.

1. On the Data menu, choose From Other Sources > From Analysis Services.

2. Select the Tabular Model you want, then Next.

3. Modify the File Name and Friendly Name if you’d like, then Finish.

4. Change the radio button to Power View Report (instead of the default PivotTable Report).

You probably also want to visit the Properties pane. By default, the checkbox to “Refresh data when opening the file” is not selected.

If you leave the “Refresh data when opening the file” option unchecked (which is the default), when the file is reopened Power View will render a message “Please refresh to see the data for this Power View sheet.” Doing a Refresh fixes it, but it’s one extra click every single time you open the file.

Inherently what this means is that Power View cannot be used as a tool for snapshot reporting. With an Excel PivotTable, unless you choose to refresh the data you see the last saved state. Power View, however, does not behave the same way – it won’t render the last saved state. It must be refreshed. Therefore, to save a snapshot at a point in time the report would have to be exported.

To change the setting after it’s been set up, go to Connections on the Data menu, then choose Properties.

Overview: How to produce a trace file for the IT or BI Support team to review when troubleshooting an error within PowerPivot running on an individual user machine.

Let’s say you’ve been working with PowerPivot for a while and suddenly an error starts to occur. Perhaps this error has popped up when you try to refresh the data. In the following screen shot, the error message states “We couldn’t refresh the connection. Please go to existing connections and verify they connect to the file or server.”

In the above screen shot, the Work Item of “AutoSales” is the friendly name given in the PowerPivot model for the first table it’s trying to process – it’s not the actual name of the data connection. So, the friendly table name may or may not be descriptive enough to infer what the actual data source is. In this situation, you will probably be able to resolve this issue by going to Existing Connections on the Home tab > select the appropriate PowerPivot Data Connection and choose Edit > then Test Connection. The test will almost certainly fail if the data refresh failed.

The screen shot above which displays the Server and Database name is excellent information to provide to the IT/BI team for troubleshooting connectivity.

But…what if it’s not so simple? What if you need more information to figure out what the workbook is actually doing? Enter the trace file…

Creating a PowerPivot Trace File

PowerPivot has an option to create a trace (*.trc) file which will describe, in excruciating detail, each step that is happening behind the scenes. In certain situations, having all of this detail can be tremendously helpful.

To generate a trace file, go to the PowerPivot menu in the Excel window (not the PowerPivot window) > select the “Client tracing is enabled” checkbox.

This will enable tracing for this one workbook, while it remains open - i.e., for this session only. This behavior is nice because you don’t have to remember to turn the tracing off; it terminates when the workbook is closed.

Finding Information in the Trace File

In our situation, we can quickly scroll down and spot the error in the trace file. The descriptive message appears at the bottom of the window for each line.

The full error is a mouthful, but the description from the trace does give more info than the initial PowerPivot screen did: OLE DB or ODBC error: Login timeout expired; HYT00; A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.; 08001; Named Pipes Provider: Could not open a connection to SQL Server [53]. ; 08001. A connection could not be made to the data source with the DataSourceID of 'b3c4d1db-0d27-477c-bff6-8a020d0841c4', Name of 'SqlServer SQLIMAGE2012 AutoSalesSourceDW'. An error occurred while processing table 'AutoSales'. The current operation was cancelled because another operation in the transaction failed.

If you scroll up to earlier entries in the trace file, you can find the actual query being executed against the source:

It also captures information such as how calculated measures are being created:

There can be a lot of volume to scroll through, but a trace file captures lots of excellent information. In some cases, sending the trace file to the IT/BI Support Team will provide enough details to figure out what’s gone wrong.

Reading a Trace File

An application which is able to read a *.trc file is required. Common applications like Notepad or Excel cannot read all the contents of a trace file:

To read *.trc files on your machine, you’ll need the SQL Server Profiler which comes as part of a SQL Server installation. If you are a business user who uses PowerPivot but not SQL Server, installing the Developer Edition of SQL Server is probably overkill, but it is an option if you don’t have IT support to help.