How do I change the query execution order in SSRS?

How do I control which stored proc runs first in SSRS? My second stored proc needs to use the temp table data from the first stored proc. Thanks so much.

Dataset Execution Order
http://blogs.msdn.com/b/robertbruckner/archive/2008/08/07/dataset-execution-order.aspx

What you are proposing is a significantly bad idea. If both datasets share a table, can you merge them into one result set, then filter or aggregate it in the report?
That said, have you tried reordering the datasets in the RDL (XML) file? I imagine Reporting Services will run these in order, though it may run them asynchronously. No guarantees.
EDIT:
Adolf's link confirms it: they do run in parallel unless you set "Use Single Transaction" on the data source. Then they run in the order they appear in the RDL file, as I suspected.

If you use a value from the first procedure's dataset to populate an input parameter of the second procedure, this should ensure that they are called in the desired order.
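As a rough sketch of that approach (the dataset and field names here are invented for illustration): in the second dataset's parameter mapping, set the parameter value to an expression that reads from the first dataset, for example:
=First(Fields!TempTableKey.Value, "Dataset1")
Since the second query cannot run until Dataset1 has returned that value, the execution order is forced.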
That said, I agree with jimconstable - it would make more sense to use a single dataset and filter out any unwanted results.

Related

Where to Aggregate Using Microsoft Reporting Services?

I'm working on my first SSRS report and I haven't been able to find general guidelines as to how to create reports. Specifically, I would like to know what the general approach is when aggregate data is needed on a report. For example, let's say I need to show the following in my report:
Pancakes ---34
Eggs----------56
Bacon--------73
I have several more rows like the above that need to show aggregate data. I'm currently grouping the whole row by type and then in each cell I'm showing a count as follows: [Count(Status)].
My report is already taking 45+ seconds to run. Is it generally preferable to do aggregation like this in the query? Or does this depend on the amount of data being returned? Any pointers are greatly appreciated. Thanks!
As with all SQL answers: it depends.
But generally, do your aggregation in SQL. SQL Server is much better at performing aggregation than the report layer. Bringing back fewer rows also reduces your data transfer and the amount of data which SSRS needs to process. Usually you would only want to do the aggregation at the report layer if there are other constraints which make doing it in the SQL query impossible, or if doing so would make the report more difficult to maintain in the future. (There's certainly something to be said for sacrificing a bit of performance in the name of maintainability.) One case would be when you need to display all of the data and returning two datasets is either too complicated or actually slows down the performance of the report.
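For instance, a minimal sketch of pushing the count down into the query (table and column names are made up for illustration):
SELECT FoodType, COUNT(Status) AS StatusCount
FROM dbo.OrderItems
GROUP BY FoodType;
The report then just displays StatusCount per row instead of computing [Count(Status)] over thousands of detail rows.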
As a side note, if your report is taking 45+ seconds to run then likely your SQL is not optimized very well or your report is doing a lot of complicated calculations. The more work you can put back on the SQL server the better your performance will be. SQL Server is made for crunching numbers and doing aggregations so certainly let it do what it does best when you can.
YMMV, so always do performance testing for different methods to see what works best.

disable passing sp parameter

I have an SSIS package with a Foreach Loop container that runs stored procedures. The loop container passes parameters to the stored procedures. In some cases I need to pass all parameters to the stored procedure, and in other cases I need to pass only one parameter. Is there any way to set whether a parameter should be passed or not? Maybe it is possible using an Expression on the loop container?
Possibly. Maybe. Probably. As the question stands, it's rather hard to say.
The need is to supply parameters, or not, based on "logic not supplied in the question." You could have multiple Execute SQL Tasks with precedence constraints powering the different tasks.
An alternative take would be to use an Expression for your query and pass the parameters in as part of the text. I called out some reasons you might not want to do that over on this answer.
Yet another approach could be to null out the parameters in a task where needed, as @TI referenced.
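If the stored procedures are yours to change, one hedged sketch of that idea (procedure, table, and column names below are invented) is to give the optional parameters NULL defaults so a caller can simply omit them:
CREATE PROCEDURE dbo.ProcessItem
    @RequiredId   int,
    @OptionalFrom date = NULL,  -- defaults let the caller skip these
    @OptionalTo   date = NULL
AS
BEGIN
    SELECT ItemId, CreatedOn
    FROM dbo.Items
    WHERE ItemId = @RequiredId
      AND (@OptionalFrom IS NULL OR CreatedOn >= @OptionalFrom)  -- NULL means "no filter"
      AND (@OptionalTo   IS NULL OR CreatedOn <  @OptionalTo);
END
The Execute SQL Task can then always issue the same call and let the defaults handle the cases where a parameter isn't needed.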
If you need specifics, please edit your question to contain specifics.

Filter on dataset or on tablix

I have two tablixes on a report, and one dataset providing data to both. I have a multiselect parameter on the report and, based on the values of that parameter, I need to filter the result.
Is there any distinction and, if there is, which is the better solution:
creating the filter directly on the dataset, or on the tablix? Both of them will give me the same result, but which is correct or better?
If in doubt, always get as close to your data source as possible. The ideal is to filter things in a SQL Server view or stored procedure, since this can be optimised; the next best thing is to filter in an SSRS dataset.
Filtering in a tablix item should be a last resort, and will make reports run considerably more slowly.
There: that's my penn'orth!
There's no definite answer to this question that can apply to all situations.
In your case, since both Tablix objects need the same filter, I'd apply it at the Dataset level; that way you're not duplicating code/logic in the report.
Or even consider applying the filter when generating the Dataset, e.g. if it's from a Stored Procedure, implement a suitable WHERE clause to filter at the database level. That way there is less data being transferred unnecessarily.
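As a hedged sketch of filtering at the database level with a multiselect parameter (table, column, and parameter names are invented; SSRS handles expanding a multi-value parameter for a text-based dataset query):
SELECT SaleId, Region, Amount
FROM dbo.Sales
WHERE Region IN (@Region);  -- @Region is the report's multiselect parameter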
I would go with whatever is the best for ease of coding and maintainability. Out of the options you've presented Dataset filtering seems to be the way to go.

Conditional split based on array variable

I need something like a T-SQL IN statement to filter records in a Conditional Split based on an array variable (or something similar).
I need to have a list of items that a column can be filtered on.
As Filip has indicated, there is no IN operator in the expression language. I did come up with some options, though, as this sounded like an interesting problem.
My long analysis is on my blog: Filter list in SSIS
Conditional split
If you can transform your list of values into a delimited string, then you can use FINDSTRING with the current value to determine whether it's in the list (to avoid partial matches, you would typically wrap each value, including the ends of the list, in the delimiter). This provided the best throughput for my testing scenario: (FINDSTRING(@[User::MyListStr], [MyColumn], 1)) > 0
Script task
I had assumed using a List in a script task to determine membership would provide the best performance but I was wrong. Row.IsInList = MyListObj.Contains(Row.MyColumn);
Lookup/Cache Connection Manager
The third approach I had come up with was dumping the list into a Cache Connection Manager and then using that in a Lookup task. I thought this was the easiest to conceptualize and maintain, but the performance was lacking.
Conclusion
For this problem domain, the FINDSTRING approach was the most efficient, by a considerable margin. The other approaches consistently averaged a throughput within 7 rows per millisecond of each other. I did find it interesting that the standard deviation of the FINDSTRING approach fluctuated so much. While this box is older and slower, there was not a considerable amount of activity going on during the package executions.
There is no IN operator in the SSIS expression language, nor any similar operator, so you can't do this with built-in expressions and the built-in Conditional Split alone. But you can do one of the following:
use a Script Transformation to check whether the column's value is in the variable array and add an additional column (a flag) with value 1 if it is and 0 if not, then use the Conditional Split on this flag added in the Script Transformation, or
better still, put the values in a database table and then use a Lookup or Merge Join to check whether a matching row exists (a sketch of this follows)
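A hedged sketch of that second option (the table name and values are made up): stage the list in a table, then point a Lookup transformation's query at it; rows that find a match flow down the Lookup's match output, replacing the Conditional Split test.
CREATE TABLE dbo.FilterList (FilterValue varchar(50) NOT NULL PRIMARY KEY);
INSERT INTO dbo.FilterList (FilterValue) VALUES ('Alpha'), ('Beta'), ('Gamma');
-- Lookup transformation query:
SELECT FilterValue FROM dbo.FilterList;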

SSRS Performance

I have created an SSRS report that retrieves 55,000 records using a stored procedure. When executing the stored proc directly it takes just 3 seconds, but when executing from the SSRS report it takes more than one minute. How can I solve this problem?
The additional time could be due to Reporting Services rendering the report in addition to querying the data. For example if you have 55,000 rows returned for the report and the report server then has to group, sort and/or filter those rows to render the report then that could take additional time.
I would have a look at the way the data is being grouped and filtered in the report, then review your stored procedure to see if you could offload some of that processing to the SQL code, maybe using some parameters. Aim to reduce the number of rows returned to the report to the minimum needed to render it, and preferably avoid doing the grouping and filtering in the report itself.
I had this problem because of parameter sniffing in my SP. When I ran my SP in SQL Management Studio, I recreated it with a new execution plan (and the call was very fast), but my reports used the old, bad plan (built for another sequence of parameters) and the load time was much longer than in SSMS.
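If you suspect the same cause, one quick hedged check (the procedure name is a placeholder) is to mark the proc for recompilation so the next call builds a fresh plan:
EXEC sp_recompile N'dbo.MyReportProc';
If the report speeds up afterwards, parameter sniffing was likely the culprit; the local-variable and OPTION (RECOMPILE) suggestions further down are more durable fixes.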
In the ReportServer database you will find a table called ExecutionLog. You need to look up the catalog ID of your report and check the latest execution instance; this can tell you the break-up of the times taken, for data retrieval, for processing, for rendering, etc.
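For example (assuming the default ReportServer database name; ExecutionLog3 is the view shipped with recent versions):
SELECT TOP (10)
    ItemPath, TimeStart,
    TimeDataRetrieval, TimeProcessing, TimeRendering
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;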
Use the SSRS Performance Dashboard reports to debug your issues.
Archaic question, but since things like this keep recurring: my "quick and dirty" solution to improve SSRS, which works perfectly in large enterprise environments (I render reports that can have 100,000+ lines daily), is to properly set the InteractiveSize of the page (for example, setting it to A4 size, 21 cm). When InteractiveSize is set to 0, all results are rendered as a single page, and this literally kills the performance of SSRS. In cases like that, queries that take a few seconds on your DB can take forever to render (or cause an out-of-memory exception unless you have tons of redundant hardware on your SSRS server).
So, for queries/SPs that execute reasonably fast on a direct call but retrieve a large number of rows, set InteractiveSize and you won't need to bother with other, more sophisticated solutions.
I had a similar problem: a query that returns 4,000 rows and runs in 1 second on its own was taking so long in SSRS that it timed out.
It turned out that the issue was caused by the way SSRS was handling a multi-valued parameter. Interestingly, if the user selected multiple values, the report rendered quickly (~1 second), but if only a single value was selected, the report took several minutes to render.
Here is the original query that was taking more than 100x longer to render than it should:
SELECT ...
FROM ...
WHERE filename IN (@file);
-- @file is an SSRS multi-value parameter passed directly to the query
I suspect the issue was that SSRS was bringing in the entire source table (over 1 million rows) and then performing a client-side filter.
To fix this, I ended up passing the parameter into the query through an expression, so that I could control the filter myself. That is, in the "DataSet Properties" window, on the "Parameters" screen, I replaced the parameter value with this expression:
=JOIN(Parameters!file.Value,",")
... (I also gave the result a new name: filelist) and then I updated the query to look like this:
SELECT ...
FROM ...
WHERE ',' + @filelist + ',' LIKE '%,' + FILENAME + ',%';
-- @filelist is passed to the query as the following expression:
-- =JOIN(Parameters!file.Value,",")
I would guess that moving the query to a stored procedure would also be an effective way to alleviate the problem (because SSRS basically does the same JOIN before passing a multi-value parameter to a stored procedure). But in my case it was a little simpler to handle it all within the report.
Finally, I should note that the LIKE operator is maybe not the most efficient way to filter on a list of items, but it's certainly much simpler to code than a split-string approach, and the report now runs in about a second, so splitting the string didn't seem worth the extra effort.
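As an aside, on SQL Server 2016 and later the split-string approach takes less effort than it once did, since STRING_SPLIT is built in. A hedged sketch, reusing the same elided query shape and @filelist parameter as above:
SELECT ...
FROM ...
WHERE FILENAME IN (SELECT value FROM STRING_SPLIT(@filelist, ','));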
Obviously getting the report running correctly (i.e. taking the same order of magnitude of time to select the data as SSMS) would be preferable, but as a workaround: would your report support execution snapshots (i.e. no parameters, or parameter defaults stored in the report)?
This will allow a scheduled snapshot of the data to be retrieved and stored beforehand, meaning SSRS only needs to process and render the report when the user opens it. This should reduce the wait to a few seconds (depending on what processing the report requires; YMMV, test to see if you get a performance improvement).
Go to the report's properties tab in Report Manager, select Execution, change to "Render this report from a report execution snapshot", and specify your schedule.
The primary solution to speeding up SSRS reports is to cache them. If one does this (either by preloading the cache at 7:30 am, for instance, or by caching the reports on hit), one will find massive gains in load speed.
Please note that I do this daily and professionally, and am not simply waxing poetic on SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not like initial reports taking long and your data is relatively static over the day (a daily general ledger or the like), you may increase the cache lifespan.
Finally, you may also opt for business managers to instead receive these reports via email subscriptions, which will send them a point-in-time Excel report which they may find easier and more systematic.
You can also use parameters in SSRS to allow for easy parsing by the user and faster queries. In the query builder, type IN (@SSN) under the Filter column that you wish to parameterize; you will then find the parameter created in the parameter folder just above the data sources in the upper left of your BIDS GUI.
(If you do not see the data source section in SSRS, hit CTRL+ALT+D.)
See a nearly identical question here: Performance Issues with SSRS
A few things can be done to improve the performance of the report:
1. Enable caching on the report manager and set a time period to refresh the cache.
2. Apply indexing on the backend database tables that are used as sources for the report. Although your stored procedure is already returning the data quickly, indexing can further improve performance at the backend level.
3. Use shared datasets instead of using embedded datasets in the report and apply caching on all these datasets as well.
4. If possible, set the parameters to load default values.
5. Try to reduce the data that is selected by the stored procedure, e.g. if the report contains historical data which is of no use, a filter can be added to exclude that data.
I experienced the same issue. Query ran in SQL just fine but was slow as anything in SSRS. Are you using an SSRS parameter in your dataset? I've found that if you use the report parameter directly in certain queries, there is a huge performance hit.
Instead, if you have a report parameter called @reportParam, simply do the following in the dataset:
declare @reportParamLocal int
set @reportParamLocal = @reportParam
select * from TableA a where a.field = @reportParamLocal -- use the local copy, not @reportParam
It's a little strange; I don't quite know why it works, but it does for me. (Copying the value into a local variable is the classic workaround for parameter sniffing: SQL Server can't sniff the local variable's value, so it builds a generic plan instead of reusing one tuned to a previous parameter value.)
One quick thing you may want to look at is whether elements on your report could be slowing down execution.
For example, I have found massive execution-time differences when converting between datetimes. Do any elements on the report use the CDate function? If so, you may want to consider doing your formatting at the SQL level.
Conversions in general can cause a massive slowdown, so take the time to look into your dataset and see what may be the problem.
This is a bit of a mix of the answers above, but do your best to get the data back from your stored procedure in the simplest and most finished format. I do all my sorting, grouping and filtering up on the server. The server is built for this and I just let reporting services do the pretty display work.
I had the same issue ... it was indeed the rendering time, but more specifically it was because the sort was being done in SSRS. Try moving your sort to the stored procedure and removing any sort from the SSRS report. On 55K rows, this will improve things dramatically.
Further to @RomanBadiornyi's answer, try adding
OPTION (RECOMPILE)
to the end of your main query in the stored procedure.
This will ensure the query is recompiled for different parameters each time, in case different parameters need a different execution plan.
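As a hedged sketch of where the hint goes (procedure, table, column, and parameter names are placeholders):
ALTER PROCEDURE dbo.MyReportProc
    @StartDate date
AS
SELECT OrderId, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate >= @StartDate
OPTION (RECOMPILE);  -- build a new plan for this statement on every run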