SSRS Export En Masse

Have you ever found yourself needing to export all of the reports from a Report Server? I have needed to extract every RDL from a Report Server on more than one occasion. Reporting Services 2008 would have you do that one report at a time – a monotonous, repetitive chore. That doesn’t work very well for me – and especially not if you have a ton of reports to export.

I combed the internet to see if there was a quick-to-implement way to do this – and found some recommendations. Most of them required installing another utility or purchasing other software. I did find one site, however, that was extremely useful for what I wanted. Brett Stateham used a few queries to display the report XML that is stored in the Catalog table of the ReportServer database. You can read about that here.

Reading his blog post on the topic, I got the distinct idea that I could use a query to extract the XML and save it to an RDL file. To verify this, I copied the XML (as returned by the queries on Brett’s page) into Notepad and saved the file with an RDL extension. Then I added that file back into a solution in Visual Studio and ran the report – sweet. It worked.

Now that I knew it could be done as simply as he showed, it was time to take it a step further. I needed something that would export numerous reports, each to its own individual file. Hmmm, what could I use? Having done something similar in the past for a single XML file in SSIS, I decided I would just use SSIS. Thinking it through, I figured this should be pretty simple in appearance (a file destination object, an Execute SQL Task, a data source, a file source, a ForEach Loop – you get the idea).

As I started to work through the solution, I found that I was overthinking it a bit and that the solution could be even easier. That made me wonder if a script task would be helpful here. I decided to research whether there was a quick way to write the XML to a file via a script task. Guess what – there is. Using VB .Net, there are simple methods available inside a script task for writing to the filesystem.

So, without further ado, let’s take a look at the package.

First, let’s create a few variables for use throughout the package.

objReport – This is of type Object and will be used to store the result set of our TSQL statement from an Execute SQL Task.

ReportExportPath – Type is String and will be the destination file path. The last character of this variable should be a “\”.

ReportName – Type is String and is for use by the ForEach Loop. This variable will receive the report name on each iteration through objReport and will later be used by the Script Task to name the output RDL file.

ReportSourcePath – Type is String. This variable helps extract only the reports from a specific location. In many cases, reports are stored in different folder paths and you may not need all folders’ contents.

ReportXML – Type is String. Similar in function to ReportName.

DBName – Type is String. This value will be used to override the Initial Catalog of the connection string.

ServerName – Type is String. This value will be used to override the Server Name of the connection string.

Next, we need a connection manager to act as the data source for this package. I created an ADO.Net connection and named it ReportSourceDB. Once it is created, use expressions (from the Properties window) to override the InitialCatalog and ServerName values with the variables already mentioned.
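For reference, those two property expressions on the ReportSourceDB connection manager map one property to one variable each (the property names below are as they appear for an ADO.Net connection manager):

```
InitialCatalog : @[User::DBName]
ServerName     : @[User::ServerName]
```

With those in place, the same package can be pointed at a different server or database just by changing the variable values.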

Now that we have that stuff out of the way, we can begin building out the package.

Above is the entire control flow for this package. I adapted, albeit ever so slightly, the script we discussed at the beginning into the following for use in this package.

Transact-SQL

WITH ItemContentBinaries AS
(
  SELECT
     ItemID, Name, [Type]
    ,CASE Type
       WHEN 2 THEN 'Report'
       WHEN 5 THEN 'Data Source'
       WHEN 7 THEN 'Report Part'
       WHEN 8 THEN 'Shared Dataset'
       ELSE 'Other'
     END AS TypeDescription
    ,CONVERT(varbinary(max), Content) AS Content
  FROM ReportServer.dbo.Catalog
  WHERE Type IN (2, 5, 8)
    AND LEFT(Path, LEN(@ReportPath)) = @ReportPath
),
--The second CTE strips off the BOM if it exists...
ItemContentNoBOM AS
(
  SELECT
     ItemID, Name, [Type], TypeDescription
    ,CASE
       WHEN LEFT(Content, 3) = 0xEFBBBF
         THEN CONVERT(varbinary(max), SUBSTRING(Content, 4, LEN(Content)))
       ELSE Content
     END AS Content
  FROM ItemContentBinaries
)
--The outer query gets the content in its xml representation...
SELECT
   Name
  ,CONVERT(xml, Content) AS ContentXML --xml
FROM ItemContentNoBOM

This script goes in the Execute SQL Task. The task is configured to retrieve the full result set and store it in objReport. A parameter is specified and passed to the @ReportPath placeholder in the first CTE of the above query. The following images should help clarify the configuration a bit.

General Tab:

Parameter Mapping Tab:

Result Set Tab:
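In case the screenshots do not come through, the key settings on the task amount to the following (the parameter name matches the @ReportPath placeholder in the query; the exact data type shown is an assumption based on a typical ADO.Net setup):

```
General:           Connection = ReportSourceDB, SQLSourceType = Direct input,
                   ResultSet = Full result set
Parameter Mapping: User::ReportSourcePath -> @ReportPath (Input, String)
Result Set:        Result Name 0 -> User::objReport
```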

So far so good. This is now getting all of the data that we need from the ReportServer database. We can now pass this information to the ForEach Loop container and write the XML out to RDL files. The configuration of the ForEach Loop is not too complicated and looks like this:

Collection Tab:

Variable Mappings Tab:
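Again, should the screenshots not render, the loop settings boil down to this (the index order follows the Name and ContentXML columns returned by the query, in that order):

```
Collection:        Enumerator = Foreach ADO Enumerator,
                   ADO object source variable = User::objReport,
                   Enumeration mode = Rows in the first table
Variable Mappings: User::ReportName -> Index 0
                   User::ReportXML  -> Index 1
```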

Inside this ForEach Loop container we have the Script Task that was displayed. This was probably the trickiest (yet extremely simple) part of the whole exercise for me. I have used script tasks in the past for various jobs and was worried this might be a bit more complicated. So let’s start with a couple of pictures.

There is no need to use a ReadWrite variable in this task. This task simply reads the variable and then writes the contents of that variable out to a file. Note that the script language is set to VB 2008. The default (for me) was C# – and I changed that. Once the variables are specified for ReadOnlyVariables, click the Edit Script button at the bottom of the screen.

In the new window that opens, highlight everything and overwrite it all with the following.

The important line here is the one containing My.Computer.FileSystem inside Public Sub Main(). Note how I am using the variables we have created to this point to build the file name (the first parameter inside WriteAllText()) and to write the contents of the file from the ReportXML variable (the second parameter inside WriteAllText()).
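The script itself appears in a screenshot; in case it does not render, a minimal sketch of the Main routine – assuming the three variables above are listed in ReadOnlyVariables, and not necessarily character-for-character what the screenshot shows – looks like this:

```vb
Public Sub Main()
    ' Build the destination file name from the export path and report name,
    ' then write the report XML out as an .rdl file (False = do not append).
    My.Computer.FileSystem.WriteAllText( _
        Dts.Variables("ReportExportPath").Value.ToString() & _
        Dts.Variables("ReportName").Value.ToString() & ".rdl", _
        Dts.Variables("ReportXML").Value.ToString(), _
        False)

    Dts.TaskResult = ScriptResults.Success
End Sub
```

Because WriteAllText creates the file if it does not exist and overwrites it otherwise, re-running the package simply refreshes the exported RDLs.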

From here, it is simply a matter of testing the package. Run it and check the directory you specified in the ReportExportPath variable. This little project will save me an enormous amount of time just in exporting the reports to files from the database. Yay – no more repetitive download tasks done through Report Manager.

One comment from Brett: “Jason, glad you found my article useful, this is a great use of it. As a side note, have you looked at SSIS’s ‘Export File’ data flow transformation component? It might simplify your SSIS package and give you a no-code solution.”
