SSRS - simulate slicers using filtering

I've seen websites explain how to use a tablix to list possible parameters for a report (another tablix) to simulate slicers. This technique relies on passing parameters and the report being refreshed.
The issue is that this is too slow for where I want to use it. I am wondering if there is a way to load all the data for my report up front and then use a similar technique to create a slicer that limits the data using a filter on the dataset.
Another option I thought about was simply hiding the rows that didn't match what was selected in the "slicer".

A good way to speed up the processing time is to use caching. Go to "Manage Processing Options" in your report manager (or SharePoint, depending on what you're using). Set it to use cached data. Play around with the duration and cache refresh schedule settings to suit your situation.
You may also benefit from applying your parameters as Dataset filters as opposed to passing them into the query. This can help ensure that a cached version of the report will be available. It really depends on how big the dataset is and how many combinations of parameters you're trying to allow.
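As a rough sketch of that idea (the table, column, and parameter names below are placeholders, not from your report): keep the "slicer" parameter out of the dataset query so one cached snapshot can serve every selection, then apply the parameter as a dataset or tablix filter.

```sql
-- Hypothetical sketch: the dataset query pulls all rows once, so a single
-- cached copy of the report can serve any "slicer" selection.
-- Table and column names are placeholders.
SELECT Region, Product, SalesAmount
FROM dbo.Sales;
-- Note: no @Region parameter in the WHERE clause.

-- The "slicer" is then applied as a dataset (or tablix) filter in the report,
-- e.g. Expression: =Fields!Region.Value  Operator: =  Value: =Parameters!Region.Value
```

This also lines up with the hidden-rows idea in the question: the same parameter can drive a row Hidden expression instead of a filter, without touching the query.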

Related

Using Row Group Expressions In SSRS Causing Preview To Run Very Slow

I have a stepped report with 5 row groups, each of which uses an expression as the value to group on. The user selects a value for up to 5 different parameters, and those parameters determine how the report is grouped.
With certain combinations of parameters the report may take 2+ minutes to run in the preview pane in Visual Studio. When I deploy the report to the SSRS server, the same combinations of parameters run in 10 seconds or less. Out of curiosity, I created a copy of the report, removed the expressions from each of the row groups, and specified a plain field to group on. That copy previews just as fast as the report runs when viewed on the server.
Anyone have any idea what may be going on here?
I'm currently using Visual Studio Enterprise Update 3.
I realize this is an older issue, but maybe this will be of some help to you.
I am experiencing similar symptoms to what you have described, and it comes down to using dynamic fields in grouping expressions. Apparently any other expression can use dynamic fields without issue, but groupings incur a huge performance hit.
This link outlines the symptoms and causes.
I don't know if it was in that particular link or not, but I read that the ReportViewer control has a bug when rendering to HTML that only appears when the control is running under .NET 4.0 or higher. Under .NET 3.5 the performance is pretty much the same as running the RDL in the Report Viewer application.
If you are able to do so, the quickest way to get the ReportViewer component to perform is to set the IIS application pool to run under .NET 2.0 (or 3.5 if that option is there, since they're basically the same runtime).
If your application needs .NET 4.0 or higher for other functionality, there isn't much I've been able to figure out short of rewriting the report to remove dynamic references from groupings, but that significantly reduces the interactivity of the report (no expand/collapse options in table rows/columns).
For reference, I had a report, where the query took about 2 minutes to execute, and then rendered almost instantly afterwards if I loaded the RDL in SSRS. That same report rendered using the ReportViewer control in .Net 2.0 took about 2 min 30 sec to fully render. Using the exact same code, but changing the app pool to use .Net 4.0, the report has been "rendering" for roughly 15 minutes now and still hasn't come back.
First of all, and a bit obvious: if you are running your report on a development machine it won't have the same computing power as a server. Even if you are running VS from the server, it won't run as fast as a deployed report.
As you may have noticed, the cause of the slow performance is the grouping expression. It seems your report is taking too long to process, so check this article from a TechNet post. I copied and pasted a paragraph below:
Many levels of nested and adjacent groups in a Tablix data region can affect report processing performance. Consider both the level of grouping, the number of group instances, and the use of aggregate functions which require evaluating after group, filter, and sort expressions are applied.
I recommend you try to perform the grouping in the data source at the SQL level; you can also pass parameters to the query to set the desired grouping there, as in the sketch below.
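As a rough illustration (the table, columns, and the @GroupBy parameter below are made up, not taken from your report), the dynamic grouping can be pushed into the dataset query so the row group in the tablix binds to a plain field instead of an expression:

```sql
-- Hypothetical sketch: compute the group key in SQL from a report parameter
-- (@GroupBy), then let the tablix row group bind to the plain GroupKey field.
WITH src AS (
    SELECT
        CASE @GroupBy
            WHEN 'Region'   THEN Region
            WHEN 'Category' THEN Category
            ELSE Product
        END AS GroupKey,
        SalesAmount
    FROM dbo.Sales
)
SELECT GroupKey, SUM(SalesAmount) AS TotalSales
FROM src
GROUP BY GroupKey;
```

The report's row group then groups on =Fields!GroupKey.Value, which avoids the expression-based grouping that causes the slow preview.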
Let me know if this helps.

MS Access - Adding unbound fields at design time

I would like to create an Access report in which the record source is created via ADO code and then set as the record source for the report when the report is run. The problem I am running into is how to add fields to the report, since the recordset is bound to the report at run time and not design time. Is there a way I can manually add the field and make sure the field name matches what will be provided in the recordset's field collection? Thank you.
The standard solution to this problem is to add all your fields in design view, up to the maximum available, hide all of them, and show only the ones you need at runtime. Michael Kaplan explained that when he designed the Access Replication Conflict Resolver, this is the approach he used, precisely because adding controls at runtime quickly uses up the lifetime limit on the number of controls on a form (700+, but I can't recall the exact number).
It's also just a bad idea, as @Jeff O says, to make design changes at runtime. For one, it means you can never distribute an MDE.
There are several ways to do it, but all of them have their issues; see, for example, 'create dynamic report using VBA'.
Other questions have found the folly in working in design mode programmatically.

Dynamically loading SubReport data in SSRS

This is in SSRS 2008.
I've created a report with a tablix, and embedded in the tablix there is a subreport. This subreport contains a lot of information (and I mean a lot: it takes more than 45 seconds to load).
I don't want to show it, or to be more precise, I don't want to process/load its data when the report initially runs. Note that I don't want to just not display it, because then all the data is still processed at initial load time.
Instead, I only want the subreport to be processed (and the data pulled down), individually, when I display it (e.g. clicking a + sign to toggle it, or any other option such as clicking an image/link, while staying in the same report). I also don't want to open the subreport in a different tab or anything like that.
Does anyone know a solution? Maybe there is an onLoad method, dynamic subreport or something like that?
I tried looking into the DataElementOutput attribute, but that's readOnly... anything else?
Before you jump through a lot of hoops to implement a workaround, have you analyzed the execution plan for the query which delivers the data for the subreport? If you haven't, it's worth checking whether performance can be improved by adding a covering index that helps the query optimizer deliver the data quickly.
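For instance (purely illustrative; the table, key column, and included columns are assumptions, not taken from your report), a covering index for the subreport's query might look like this:

```sql
-- Hypothetical covering index: seek on the subreport's filter column and
-- include the selected columns so the query never touches the base table pages.
CREATE NONCLUSTERED INDEX IX_OrderDetail_OrderID_Covering
    ON dbo.OrderDetail (OrderID)
    INCLUDE (OrderDate, Quantity, Amount);
```

Match the key column to the subreport's WHERE clause and the INCLUDE list to its SELECT list.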
Hope this helps,
Bill
So, as it turns out, the problem was in fact because I had the subreport repeated for each row in the parent report. Thus, the subreport was called numerous times...
And according to MS, there is no way to dynamically process individual subreports. I had to solve this by splitting out the subreport (at least the data-heavy parts) onto a separate page. :(

Should I use SQL Reporting Services 2008 for my reporting engine?

I would like to use SQL Reporting Services 2008 to generate my reports, but I want to use my own UI for specifying the report type, columns, parameters and everything. I want to be able to take these criteria, kick off an asynchronous request to SSRS, and have the report emailed to me. Is this possible? I don't want to go all the way down the road of researching SQL Reporting Services 2008 only to find that it doesn't do what I need it to do.
Also, I will have a ton of DB partitions that the data will need to be pulled from. Some reports will need to pull data from only one of these, but others may need to span different databases. Is it possible, when sending a report request to SSRS, to specify which server name/database to pull the data from? Is it possible to tell it to take the data from multiple databases and combine it? Thanks.
Like Crystal Reports, ActiveReports and other report generators, SSRS has two basic elements behind each report: the SQL query and the report layout. No matter what tool you use for the SQL -- it can be inline SQL in the report or a call to a stored procedure -- it's going to be the same query. Multiple databases are fine as long as you can specify them up front.
You can have parameterized queries, so the user is prompted to input the relevant filters (customer ID, product group, date range, whatever).
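For example, a parameterized dataset query might look like the sketch below (the table and parameter names are just placeholders):

```sql
-- Hypothetical parameterized dataset query; @CustomerID, @StartDate and
-- @EndDate map to report parameters that SSRS prompts the user for.
SELECT OrderID, OrderDate, TotalDue
FROM dbo.Orders
WHERE CustomerID = @CustomerID
  AND OrderDate BETWEEN @StartDate AND @EndDate;
```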
Doing the report layout is similar to other tools -- you drag and drop controls like labels onto the report, and set their formatting.
SSRS does provide a lot of options for distributing the report, including email. You can embed the report in an ASP.Net web page, leave it on the report server site for users to browse to, run it in the wee hours of the morning and cache it so every user doesn't have to wait for the lengthy query to run.
It's a great tool. I think it will be worth your effort to experiment with it. I would wait on creating the customized UI until you've exhausted the possibilities inherent in the tool.
SSRS is not designed with this scenario in mind; for that matter, I am not sure that any out-of-the-box reporting solution is going to have an elegant solution for this. While SSRS (like others) can do what you are asking, it is by no means quick or easy. You seem to be looking for an advanced ad hoc solution with dynamic sourcing of the data. I would first question the requirements and determine whether the business scenario really justifies such an implementation. I would weigh custom building a solution against your learning curve with a BI reporting solution. You may find that it is easier to just build something on your own.
I think the heterogeneous dynamic database mashup is probably going to be the most challenging part.
Depending on what your scalability requirements are, one place that has that part covered, and a report writer, is Access. (Duck! Incoming!)
I think you may be creating a rod for your own back to a certain extent, as RS ships with a few interfaces for report creation.
Mind you, the end product is an RDL file, which is nothing but XML, so you can write them by hand if you really like.
Multiple data sources are supported, but combining them in a single control/chart/etc. is not, so if you want to do that you'll need to set up a cross-database capability in one of your data sources before the report request; see the sketch below.
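As a rough sketch (server, database, and table names are placeholders), one way to set that up on SQL Server is to combine the sources in the query itself, via three/four-part names or a linked server, so SSRS sees a single dataset:

```sql
-- Hypothetical cross-database query: the report gets one result set even
-- though the rows come from two databases. "WestServer" stands in for a
-- linked server you would have to configure first.
SELECT OrderID, TotalDue, 'East' AS Source
FROM EastDB.dbo.Orders
UNION ALL
SELECT OrderID, TotalDue, 'West' AS Source
FROM WestServer.WestDB.dbo.Orders;
```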

Union being overridden in OBIEE

I have used a 'Union' to combine two reports showing different dates. When I open the report on its own it works fine, with the two dates shown. However, when I drill to the (union) report from another report on the dashboard, it only shows one date. Is there a dashboard filter that I may need to remove, or is there some other explanation? I cannot seem to get the report to function as required from the drill.
Tough to answer - need more info. Some possibilities:
You're not really drilling to the report you think you are. You can learn a lot by looking at the log to see what's happening in the target report.
You may be passing values from the first report that exclude the date in the second report. Check your filters in the target report. Look at the generated WHERE clause in the log. Run the SQL directly against the database. Is the date still excluded?
You want to start by checking the physical SQL, which is written to nqquery.log. It can also be obtained by viewing the log in Answers directly. Another way is to go to Administration > Issue SQL, enable presentation server cache, set the debug level to 7, execute the SQL, and click the view log link. The physical SQL can also be tested against the actual database, if that is where your data is coming from. Examining both your Logical SQL and your Physical SQL can resolve a lot of minor issues, including this one.
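For example, here is a minimal sketch of raising the log level for a single request from Administration > Issue SQL (the subject area and columns, "Sales", "Time"."Date", "Facts"."Revenue", are placeholders for the ones your union report uses):

```sql
-- Hypothetical logical SQL: the LOGLEVEL request variable forces detailed
-- logging for this one request so the physical SQL and applied filters show
-- up in nqquery.log / the View Log link.
SET VARIABLE LOGLEVEL = 7;
SELECT "Time"."Date", "Facts"."Revenue"
FROM "Sales";
```

Comparing the WHERE clause generated by the standalone report with the one generated by the drill should show whether the dashboard is passing a filter that excludes the second date.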