SSRS 2012 Reporting Services parallel dataset retrieval

I am trying to optimize a report for SSRS 2012 and using SQL Profiler I can see that the datasets are being processed one at a time instead of in parallel.
The checkbox to request a single transaction ("Use single transaction when processing the queries") is NOT checked.
I can't find any other setting that controls parallel execution.
The data source is an embedded data source.
Everything I find on the internet about parallel execution quotes a Microsoft blog post from about a decade ago, which states that SSRS 2008 defaulted to parallel retrieval unless that single-transaction box is checked; the assumption seems to be that nothing has changed and this is still the default behavior.
It would appear that the box serves a different purpose: running in a single transaction allows a temp table created in one dataset to be referenced by a later dataset, because the datasets are not only serialized but processed in their listed order (top to bottom). So that option is about persistence of objects and data within the transaction, not about parallel versus serialized execution.
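For example (simplified object names), a dataset pair like this only works with that box checked, because the datasets then share one connection and transaction and run top to bottom:

    -- Dataset1 (listed first): builds and returns the intermediate results
    SELECT  CustomerId, SUM(Amount) AS Total
    INTO    #CustomerTotals
    FROM    dbo.Orders
    GROUP BY CustomerId;

    SELECT CustomerId, Total FROM #CustomerTotals;

    -- Dataset2 (listed later): can still see the temp table from Dataset1
    SELECT  c.CustomerName, t.Total
    FROM    #CustomerTotals AS t
    JOIN    dbo.Customers   AS c ON c.CustomerId = t.CustomerId;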
With the box unchecked, the datasets appear to be called in the order their fields are processed, but the Profiler results still indicate that only one dataset is retrieved at a time.
So, is there a verified way to fetch multiple datasets simultaneously?

No, there aren't any other settings that control this behavior besides the one you described. There are, of course, other ways around it if efficiency is an issue for you; for example, you could look into caching the results before the report runs.
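A rough sketch of that idea (object names invented): a SQL Agent job materialises the expensive query into a summary table shortly before the report is due, so the report's dataset becomes a cheap SELECT.

    -- Job step, scheduled a few minutes before the report is needed
    TRUNCATE TABLE dbo.rpt_SalesSummary;

    INSERT INTO dbo.rpt_SalesSummary (Region, OrderDate, OrderCount, TotalAmount)
    SELECT  Region, OrderDate, COUNT(*), SUM(Amount)
    FROM    dbo.Orders
    GROUP BY Region, OrderDate;

    -- The report dataset then just reads the pre-built table:
    -- SELECT Region, OrderDate, OrderCount, TotalAmount FROM dbo.rpt_SalesSummary;

Shared dataset caching or report snapshots in Report Manager can achieve a similar effect without extra tables.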

Subscriptions delayed behind single report running thousands of times

I find it difficult to think I'm the first to run into this, so either my searching ability has truly become sad or the solution is so apparent that nobody has asked before. Mox nix, I must ask.
We run SSRS 2012 with a few hundred reports on it and over a hundred subscriptions, and generally speaking it works fine.
This is a Native-mode box; we have a separate box for the SharePoint-integrated version.
The monthly 'statement' report runs via a data-driven subscription that feeds it over 100K personIds to process and export to PDF on a file server. The SQL takes 0.3 seconds and the PDF not much more; there are just so many of them. So when this one runs, all other subscriptions queue up behind it and wait, often until it completely finishes. Not good. Month-end reports are important to a few departments.
My question: can I set the priority of this report somehow (or use some other setting) to allow other reports to process when they are scheduled?
It just boggles my mind that this is even an issue, but it is.
Thanks for any insights-
Craig
There's no way to set an individual report's priority, unfortunately.
I would first look at reducing the number of required executions, if possible. Failing that, you can try manually separating the recipients into batches and assigning a different subscription to each batch: sub 1 could handle rows 1 - 5000, sub 2 rows 5001 - 10000, and so on. Stagger the execution times and other subscriptions will be able to slip in between the batch-processing subscriptions.
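For example, each batch's data-driven subscription query could carve out its own slice of the recipients by row-number range (table and column names are guesses at your schema):

    -- Subscription 1: rows 1 - 5000; subscription 2 uses 5001 - 10000, and so on.
    WITH numbered AS (
        SELECT  PersonId,
                ROW_NUMBER() OVER (ORDER BY PersonId) AS rn
        FROM    dbo.StatementRecipients
    )
    SELECT  PersonId
    FROM    numbered
    WHERE   rn BETWEEN 1 AND 5000;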
But really, if you truly need to generate 100k-plus reports, I don't think SSRS is the best option. Have you considered SSIS instead?
Dean Kalanquin has an excellent examination of how subscriptions in SSRS work, if you're looking to see what's happening under the hood:
http://blogs.msdn.com/b/deanka/archive/2009/01/13/diagnosing-and-troubleshooting-subscriptions.aspx

SSRS reports time out on dynamic queries after a few days

We're using SSRS 2012 with a number of reports driven by a Query.CommandText reference to a stored procedure that executes dynamic SQL (sp_executesql). These are consumed from a web application where the user specifies the report, criteria, etc. After a few days, the report requests will time out, even though the underlying stored procedure executes within a few seconds (the same stored procedure feeds both a search results screen and the report). Other reports that do not use dynamic SQL continue to execute fine. The only remedy we've found is to restart the SSRS service. After the initial spin-up, the same report will execute within a few seconds.
The SSRS logs don't seem to point to any issue, though I'm certainly not an expert at reading them. Comparing a slow run to a quick one, the only difference seems to be the timestamps, which are evenly spread out between the start and the end. We do see "ReportProcessingException: There is no data for the field at position xx", but on both the slow and fast runs. Running the report from the Reports portal takes about 10 minutes when it's in slow mode.
My suspicion is that some caching is going on and SSRS is influencing the SQL execution plan.
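Would something like the following be a sensible way to test the cached-plan theory the next time a report goes into slow mode? (Object names below are placeholders, not our real ones.)

    -- Option A (code change): append a recompile hint to the dynamic statement
    -- inside the procedure so each call gets a plan built for its own parameters:
    --     SET @sql = @sql + N' OPTION (RECOMPILE)';

    -- Option B (no code change): evict just this procedure's cached plan and
    -- see whether the report immediately recovers.
    DECLARE @handle VARBINARY(64);

    SELECT TOP (1) @handle = plan_handle
    FROM   sys.dm_exec_procedure_stats
    WHERE  database_id = DB_ID()
      AND  object_id   = OBJECT_ID(N'dbo.usp_SearchReportData');  -- placeholder

    IF @handle IS NOT NULL
        DBCC FREEPROCCACHE (@handle);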
Any suggestions or requests for more specifics would be very welcome.

Should proactive caching be used instead of processing dimensions?

I am confused about best practice for updating cube data throughout the day. We have a small order-processing environment in which I would like to update a dashboard showing order statuses. I am able to get this working by creating an SSIS package and scheduling it to run every 4 minutes.
But when I disable the SSIS job above and instead turn on Real-time ROLAP on all the dimensions and the cube, nothing ever changes in the dashboard. Do I misunderstand the purpose of proactive caching?
I'm using SQL Server Standard for the database containing our production data, but our Analysis Services server is Enterprise, in case that makes a difference. I'd also be willing to use Automatic or Scheduled MOLAP if that works.
No, you haven't misunderstood it. I think you have configuration issues.
I assume the job you disabled was copying data from your database to your data warehouse, right?
And your cube reads from your data warehouse, right?
So now your production database is being updated (by your application), but the changes are not being pushed to the data warehouse that the cube reads from (because the job is off).
Proactive caching (especially with ROLAP) is a way to get your data live without having to schedule a cube refresh every X minutes. But the job that populates your DW must still be running.
My guess is that the package you disabled was, besides updating the DW, also refreshing the cube. Check its source.
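For illustration only (every object name is invented), the DW-load step such a package typically contains looks something like this; this part has to keep running no matter how the cube itself is refreshed:

    -- Incremental load of the order-status fact from production into the DW
    MERGE dw.FactOrderStatus AS tgt
    USING (SELECT OrderId, StatusCode, LastUpdated
           FROM   ProdDb.dbo.Orders) AS src
        ON tgt.OrderId = src.OrderId
    WHEN MATCHED AND src.LastUpdated > tgt.LastUpdated THEN
        UPDATE SET tgt.StatusCode  = src.StatusCode,
                   tgt.LastUpdated = src.LastUpdated
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (OrderId, StatusCode, LastUpdated)
        VALUES (src.OrderId, src.StatusCode, src.LastUpdated);

The cube-refresh step the package probably also contained would be a separate Analysis Services processing task; ROLAP or proactive caching can replace that step, but not the relational load above.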

SQL Server query optimisation

I am optimising a large stored procedure in SQL Server 2008 which uses a lot of dynamic SQL. It is a search query with a number of optional parameters, and short of coding for every possible combination of parameters, dynamic SQL has proven to be the most efficient way to execute it. The SQL string is built with parameters and then passed to sp_executesql with the parameter list. When run in SSMS with any combination of parameters it executes very quickly (<1s) and returns results. When run from a Windows Forms application, however, it sometimes takes considerably longer.
I have read that a difference in the ARITHABORT option can cause this (it defaults to ON in SSMS and OFF in ADO.NET), but I am unsure whether turning it on fixes the issue or merely masks it. Does the difference in settings affect the query itself, or does it just mean that SQL Server will use different cached execution plans? If so, should clearing the cache and statistics level the playing field?
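From what I've read, one way to confirm whether SSMS and the application really are using different cached plans (rather than guessing) is to compare the set_options attribute of the cached plans for the same statement, something like this (the procedure name is a placeholder):

    -- Different set_options values for the same SQL text usually means the
    -- two callers compiled separate plans (ARITHABORT is one of the bits).
    SELECT  st.text,
            cp.usecounts,
            pa.attribute,
            pa.value
    FROM    sys.dm_exec_cached_plans                          AS cp
    CROSS APPLY sys.dm_exec_sql_text (cp.plan_handle)         AS st
    CROSS APPLY sys.dm_exec_plan_attributes (cp.plan_handle)  AS pa
    WHERE   pa.attribute = 'set_options'
      AND   st.text LIKE '%usp_MySearch%';   -- placeholder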
I have also read differing points of view on the OPTION (RECOMPILE) setting. My understanding is that when sp_executesql is used with a parameter list, each combination of parameters produces its own execution plan; however, as the possible combinations of parameters are finite, this should still result in a set of optimised queries. Other sources say it should be turned on at the start of any SP that uses dynamic SQL.
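For context, the pattern I'm describing is roughly this (object names simplified; the first two variables would really be the procedure's optional parameters):

    DECLARE @CustomerId INT  = 42,      -- optional parameters of the real SP
            @FromDate   DATE = NULL;

    DECLARE @sql    NVARCHAR(MAX) = N'SELECT OrderId, Amount FROM dbo.Orders WHERE 1 = 1',
            @params NVARCHAR(MAX) = N'@CustomerId INT, @FromDate DATE';

    IF @CustomerId IS NOT NULL SET @sql += N' AND CustomerId = @CustomerId';
    IF @FromDate   IS NOT NULL SET @sql += N' AND OrderDate >= @FromDate';
    -- Some sources suggest also appending: SET @sql += N' OPTION (RECOMPILE)';

    EXEC sp_executesql @sql, @params, @CustomerId = @CustomerId, @FromDate = @FromDate;

Each distinct SQL string produced this way gets its own cached plan, which is the behaviour I described above.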
I realise that different situations require different settings; however, I am looking to understand these further before trying them arbitrarily on my very busy 24x7 production server. Apologies for the rambling, I guess my question boils down to:
What causes SQL to run differently in SSMS and Windows Forms?
If it is ARITHABORT, then is this an issue related to execution plans, or should I turn it on as a server default?
What is the optimal way to run queries with dynamic SQL?
Run a trace in SQL Profiler to see what's actually being submitted to the server. Of course, you need to be aware of the impact of traces on production servers; in my experience, very short traces limited to a small set of events are not a big problem for servers that don't have a very high transactions-per-second load. You can also run the trace server-side, which reduces its impact, so that's an option for you.
Once you see what's actually being submitted to the database, that may help you understand the problem. For example, some DB libraries prepare statements (getting a handle to a sort of temporary stored procedure), which can be costly if it is done for each issuance of the query, and it isn't needed with sp_executesql. Anyway, there's no way of knowing for sure whether it will be helpful until you try it.
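If you want to go server-side, a minimal trace can be scripted with the sp_trace_* procedures instead of the Profiler GUI; a rough sketch (file path and size are placeholders, and the SQL Server service account needs write access to the folder):

    DECLARE @traceid INT,
            @maxsize BIGINT = 50,   -- MB
            @on      BIT    = 1;

    EXEC sp_trace_create @traceid OUTPUT, 0, N'C:\Traces\app_vs_ssms', @maxsize, NULL;

    -- Event 10 = RPC:Completed, 12 = SQL:BatchCompleted
    -- Column 1 = TextData, 10 = ApplicationName, 13 = Duration
    EXEC sp_trace_setevent @traceid, 10, 1,  @on;
    EXEC sp_trace_setevent @traceid, 10, 10, @on;
    EXEC sp_trace_setevent @traceid, 10, 13, @on;
    EXEC sp_trace_setevent @traceid, 12, 1,  @on;
    EXEC sp_trace_setevent @traceid, 12, 10, @on;
    EXEC sp_trace_setevent @traceid, 12, 13, @on;

    EXEC sp_trace_setstatus @traceid, 1;   -- start the trace
    -- ... reproduce the slow call from the Windows Forms app ...
    -- EXEC sp_trace_setstatus @traceid, 0;  -- stop
    -- EXEC sp_trace_setstatus @traceid, 2;  -- close and remove the definition
    -- SELECT * FROM sys.fn_trace_gettable(N'C:\Traces\app_vs_ssms.trc', DEFAULT);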

max time a report should take to generate report

I am just curious to know how long, in minutes, the reporting service should take to generate a report when it returns 1 MB of data. Assume the views and tables are properly indexed. This is SSRS with server-side report generation.
Report generation time has two components:
- Data Acquisition time
- Render Time
So for 1 MB of data, how many records (rows) are we talking? How many pages will the report have? How many controls per page? Does the report use charting? These are the factors that will determine generation time.
For most reports, data acquisition time is the most significant factor. Your report is never going to run faster than the raw data acquisition, so if you are using SQL, the report can't generate faster than the time required to run the query. I have seen queries that return far more than 1 MB of data very quickly. I have also seen queries that return very little data, yet run for a long time.
On the render side, there are a couple of things that can cause a report to run slowly. The first is report aggregation: if a report needs to receive all of the records before it starts rendering, its performance will suffer, to a degree that depends on the reporting tool. With large data sets (more than 10,000 records), you can get significant improvements in rendering by doing the aggregation at the source (the database). The other is charting, which typically involves heavy rendering overhead and aggregation.
Most reporting systems allow you to build in timers or logging that will help you performance-tune the report. It is best to build in timers that tell you what percentage of the time the report spends getting the data and what percentage it spends rendering. Once you have this information, you will know where to focus your energies.
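In SSRS specifically, you don't even need to add your own timers: the ReportServer catalog logs that split for every execution (the view below exists in SSRS 2008 R2 and later; the report path filter is a placeholder):

    -- Run against the ReportServer catalog database
    SELECT TOP (20)
           ItemPath,
           TimeStart,
           TimeDataRetrieval,   -- ms running the dataset queries
           TimeProcessing,      -- ms grouping, sorting and aggregating
           TimeRendering,       -- ms producing the output format
           [RowCount],
           ByteCount
    FROM   dbo.ExecutionLog3
    WHERE  ItemPath LIKE '%MyReport%'
    ORDER BY TimeStart DESC;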
If you are really trying to evaluate the performance of the reporting tool, the best way is to build a report that either reads a flat file or generates the data through code. In other words, eliminate the impact of the database and see how fast your reporting tool can generate pages.
Hope this helps.
How long is acceptable? Depends on what it's doing, how much it's run, things like that. Anything below 30 seconds would be fine if it's run once every day or two. If it's run once a week or once a month that number could be a lot higher.
The report itself is generally very fast; if you're seeing a hang-up, you may want to check the execution time of the query that generates the data. A complex query can take a long time, even if it only returns a little data...
I've found, when using BIRT and other reporting systems, that the best improvements tend to come from offloading most of the work to the database at the back end.
In other words, don't send lots of data across the wire and sort or group it locally. The database is almost certainly going to outperform you with its ORDER BY and GROUP BY clauses and optimised indexes (among other things).
That way, you get faster extraction of the data you want AND less network traffic.
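As a trivial sketch (invented table and column names), ask the database for the handful of aggregated rows the report actually displays instead of pulling over the detail rows:

    DECLARE @FromDate DATE = '20240101',   -- would be report parameters
            @ToDate   DATE = '20240201';

    SELECT  r.RegionName,
            COUNT(*)      AS OrderCount,
            SUM(o.Amount) AS TotalAmount
    FROM    dbo.Orders  AS o
    JOIN    dbo.Regions AS r ON r.RegionId = o.RegionId
    WHERE   o.OrderDate >= @FromDate
      AND   o.OrderDate <  @ToDate
    GROUP BY r.RegionName
    ORDER BY r.RegionName;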
As several said already, a general question like this really can't be answered. However, I wrote up Turbo-charge Your Report Speed – General Rules & Guidelines (disclaimer - I'm the CTO at Windward Reports, a competitor of SSRS). I think that will help you look for what you can do to speed up the process.
And with all the caveats that the specifics matter a lot: on a 3 GHz workstation we generally see 7-30 pages/second. Keep in mind these are numbers for Windward, not SSRS.