Shared Data Set for improving performance?

Question

I have a report for the members of a community. The report shows all the data related to all the members of that community. I have implemented it using a main report and a subreport.

Issue:

In my main report I show details of the community, while in the subreport I pass the MemberId (the key in my DB) to find all the information about that particular member. To fetch all the required information, I have written 7 stored procedures. Therefore, if I have 100 members in
the community, I am hitting the DB 700 times to generate my report.

My Solution:

I am now thinking of creating a shared dataset to bring all the data onto the server, and then in my subreport I'll filter the data related to each particular member.

Question:

1. Do I need to change my implementation? My report server is on the DB server itself, so we are hitting the DB 700 times but we are not increasing network traffic. Please explain this as well.

2. Is this the right approach?

3. Our project uses a clustered environment, so can we cache the data on the server?

Answers

Hitting the DB 700 times to generate the report would, for sure, result in bad performance.

Alternatively, for the subreport, first make the dataset a shared one, query the data for all 100 members in one connection, and cache it (say, for 30 minutes). Since the data is cached on the first request, any subsequent request can use this cached data
and filter it by the value passed from the main report.

Here are the steps to achieve this:

1. Create shared datasets. Since you are using 7 stored procedures to get the data, you need to create 7 shared datasets.

2. Update your stored procedures to return complete details, meaning you bring back the details for all members of the selected community (100 members in your case).

3. Add cache settings for these datasets by browsing the report server.

4. Add these datasets to your subreport and filter them by the parameter passed from the main report (in your case, I believe this is the community member's MemberId).
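As an illustration of step 2, one of the per-member procedures could be reworked to take the community as its parameter instead of a single MemberId. The procedure, table, and column names below are assumptions for illustration only, not your actual schema:

```sql
-- Hypothetical rewrite: instead of fetching one member per call,
-- the procedure returns rows for every member of the community,
-- so the shared dataset can be queried once and cached.
CREATE PROCEDURE dbo.GetMemberDetailsByCommunity
    @CommunityId INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT m.MemberId,
           m.Name,
           m.JoinDate          -- plus whatever columns the report needs
    FROM dbo.Members AS m
    WHERE m.CommunityId = @CommunityId;
END
```

In the subreport, the MemberId parameter is then applied as a dataset filter rather than a query parameter (for example, a filter with expression `=Fields!MemberId.Value`, operator `=`, and value `=Parameters!MemberId.Value`), so every subreport instance reuses the same cached shared dataset instead of opening a new connection.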

Is there any problem with caching, since the cache becomes too large for 100 or more members? As far as I know, SSRS stores its cache
in RAM on the server. Because of this, can there be any negative effects?