I've got a report in SSRS that contains two subreports, but it's taking a very long time to show the final report. According to the SSRS Execution Log, the report is spending ~140ms in data retrieval, ~20 minutes in processing, and ~20ms in rendering. If I remove either of the subreports (it doesn't matter which one), the processing time drops to ~10 minutes. If I remove both subreports, it drops to ~2s. Quite obviously I have to do something about the subreports; I'll probably try to integrate them into the main report.
Does the "TimeDataRetrieval" statistic in the execution log represent the aggregate data retrieval time for the report and all subreports, or is it just for the main report (meaning the data retrieval times for subreports are actually being added to the "TimeProcessing" stat)?
Furthermore, when I run the main report within BIDS (Visual Studio), the entire report returns in a couple of seconds. Why would a report plus two subreports render completely within a couple of seconds in BIDS, yet take around 20 minutes when viewed from the reporting server? Both are accessing the same SQL DB (data retrieval is only 140ms from the SSRS server), and the SSRS/SQL server should have plenty of power (it's running in an 8-core VM that doesn't break 1% CPU usage).
Related
We want to show data in the browser from the reporting server, but sometimes it fails to load and takes more than 3 minutes. Is there a better approach to get the data faster?
There can be many factors behind a report that is slow or failing to load. The first step would be analyzing the execution log (on the server where Reporting Services is installed, in the report server database, probably the ReportServer db, view ExecutionLog2). There you can see three crucial columns: TimeDataRetrieval, TimeProcessing and TimeRendering.
Extract from this discussion
TimeDataRetrieval - contains the sum of all DataSet durations
TimeProcessing - The number of milliseconds spent in the processing engine for the request
TimeRendering - The number of milliseconds spent after the Rendering Object Model is exposed to the rendering extension
That way you will know whether you need to tune your query or your report, and that's a good start.
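For example, a quick way to see those columns side by side for recent runs (a sketch that assumes the default ReportServer database name):

    -- Recent executions with their timing breakdown (most recent first).
    -- Assumes the default ReportServer catalog database name.
    SELECT  ReportPath,
            TimeStart,
            TimeDataRetrieval,
            TimeProcessing,
            TimeRendering,
            Status
    FROM    ReportServer.dbo.ExecutionLog2
    ORDER BY TimeStart DESC;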
I am trying to optimize a report for SSRS 2012 and using SQL Profiler I can see that the datasets are being processed one at a time instead of in parallel.
The checkbox to request one transaction is NOT checked.
I can't find any other setting on parallel execution.
The data source is an embedded data source.
Every item I find on the internet about parallel execution quotes a Microsoft blog from about a decade ago stating that SSRS 2008 defaulted to parallel execution unless that single-transaction box is checked, and the assumption is that nothing has ever changed, so this is still the default behavior.
It would appear that the box has a different purpose since running in one transaction allows a temp table created in one dataset to be referenced in a later dataset - they are not only serialized but processed in their listed order (top to bottom). So that is about persistence of objects and data instead of parallel vs serialized.
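For example, a pattern like this only works for me when that single-transaction box is checked (dataset, table, and column names here are illustrative):

    -- Dataset 1 (listed first in the report): builds a temp table and
    -- returns its contents. Table/column names are illustrative.
    SELECT  CustomerId,
            SUM(Amount) AS Total
    INTO    #CustomerTotals
    FROM    dbo.Orders
    GROUP BY CustomerId;

    SELECT CustomerId, Total
    FROM   #CustomerTotals;

    -- Dataset 2 (listed below dataset 1): with the single-transaction box
    -- checked it runs later on the same connection, so the temp table
    -- created above still exists.
    SELECT  c.CustomerName,
            t.Total
    FROM    #CustomerTotals AS t
    JOIN    dbo.Customers   AS c
            ON c.CustomerId = t.CustomerId;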
Without the box checked it appears they are called in the order the fields are processed, but profiler results indicate that only one dataset is retrieved at a time.
So, is there a verified way to fetch multiple datasets simultaneously?
No, there aren't any other settings to control this behavior besides the one you described. Of course there are always other ways around this if efficiency is an issue for you. For example, you could look into caching the results before the report runs.
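As one hedged sketch of that idea (not a built-in setting, and all object names below are made up): pre-stage the expensive aggregation into a summary table on a schedule, for example with a SQL Agent job, so each report dataset is only a cheap read and serialized execution costs very little.

    -- Scheduled step (e.g. a SQL Agent job): rebuild the summary before
    -- the report is requested. All names are illustrative.
    TRUNCATE TABLE dbo.rpt_SalesSummary;

    INSERT INTO dbo.rpt_SalesSummary (Region, SalesMonth, Total)
    SELECT  Region,
            DATEADD(MONTH, DATEDIFF(MONTH, 0, OrderDate), 0),
            SUM(Amount)
    FROM    dbo.Orders
    GROUP BY Region,
             DATEADD(MONTH, DATEDIFF(MONTH, 0, OrderDate), 0);

    -- Report dataset: a trivial read, so serialized execution costs little.
    SELECT  Region, SalesMonth, Total
    FROM    dbo.rpt_SalesSummary;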
We have a couple of long-running reports in SSRS. They render fine in ReportViewer, but when we try to export to PDF, the export times out after some time. We increased httprequesttimeout in the ReportServer Web.config and it works fine in our offshore environment. Our onshore environment is a shared one, and the client is not inclined to change settings in a shared environment. We have tried changing the individual reports' timeout properties (both "Do not time out" and manually setting the timeout to a large number). Unfortunately, these seem to be overridden by the ReportServer web.config entry httprequesttimeout, which is 2 minutes by default.
Is there a way to override an individual report's timeout settings instead of changing the ReportServer web.config?
Maybe it's about the execution time-out value. I've found something in the Microsoft documentation; hope it helps.
How report execution time-out values are evaluated

The report server evaluates running jobs at 60 second intervals. At each 60 second interval, the report server compares actual process time against the report execution time-out value. If the processing time for a report exceeds the report execution time-out value, report processing will stop.

Note that if you specify a time-out value that is smaller than 60 seconds, the report may execute in full if processing starts and completes during the quiet part of the cycle when the report server is not evaluating running jobs. For example, if you set a time-out value of 10 seconds for a report that takes 20 seconds to run, the report will process in full if report execution starts early in the 60 second cycle.
Here is the link for details:
https://learn.microsoft.com/en-us/sql/reporting-services/report-server/setting-time-out-values-for-report-and-shared-dataset-processing-ssrs?view=sql-server-2017
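If it helps, one way to see which executions are actually running past a given limit is to query the execution log; this is only a sketch, assuming the default ReportServer database and a two-minute threshold:

    -- Executions whose combined time exceeds two minutes (120,000 ms).
    -- Assumes the default ReportServer catalog database name.
    SELECT  ReportPath,
            TimeStart,
            TimeDataRetrieval + TimeProcessing + TimeRendering AS TotalMs,
            Status
    FROM    ReportServer.dbo.ExecutionLog2
    WHERE   TimeDataRetrieval + TimeProcessing + TimeRendering > 120000
    ORDER BY TotalMs DESC;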
We're using SSRS 2012 with a number of reports driven by a Query.CommandText reference to a stored procedure executing dynamic SQL (sp_executesql). These are consumed from a web application where the user specifies the report, criteria, etc. After a few days, the report requests will time out, even though the underlying stored procedure executes within a few seconds (the same stored procedure feeds both a search result screen and the report). Other reports that do not use dynamic SQL continue to execute fine. The only remedy we've found is to restart the SSRS service. After the initial spin-up, the same report will execute within a few seconds.
The SSRS logs don't seem to point to any issue, though I'm certainly not an expert at reading them. Comparing a slow run to a quick one, the only difference seems to be the timestamps, which are spread out evenly between the start and the end. We do see "ReportProcessingException: There is no data for the field at position xx", but on both the slow and fast runs. Running the report from the Reports portal takes about 10 minutes when it's in slow mode.
My suspicion is that some caching is going on and SSRS is influencing the SQL execution plan.
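If that's the case, one experiment I can think of is forcing a fresh plan for the dynamic statement; this is only a sketch with made-up names, not our actual procedure:

    -- Illustrative only (not our real procedure): adding OPTION (RECOMPILE)
    -- to the dynamic statement forces a fresh plan on every execution, so
    -- if the slowdowns stop, a stale cached plan is the likely culprit.
    DECLARE @sql nvarchar(max) = N'
        SELECT  OrderId, CustomerId, OrderDate, Amount
        FROM    dbo.Orders
        WHERE   OrderDate >= @FromDate
        OPTION (RECOMPILE);';

    EXEC sp_executesql
         @sql,
         N'@FromDate datetime',
         @FromDate = '20120101';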
Any suggestions or requests for more specifics would be very welcome.
I'm developing a report in SSRS for display in a public location within the company. The report shows up-to-the-minute data on department activity, so it is set to refresh every 30 seconds or so.
The problem I'm running into is that once every few hours, the report throws an error (rsErrorReadingNextDataRow and rsProcessingAborted). All I have to do is hit refresh in the browser (the report is rendered in an internet browser) and it reruns and charts along happily for a while.
After looking at the ReportServer execution log, it appears that these errors coincide with a phenomenal spike in processing time. On average, each time the report refreshes, processing time is 400ms to 800ms, but when it spikes it goes to ~90000ms (yes, five zeroes) and then throws this error.
Being new to SSRS I am not sure where to begin looking for the root cause.
Can anyone give me pointers as to where I can start to find out what is causing the processing time to rocket like that? Data retrieval is stable at ~5000ms, and rendering time is stable at around 200ms. It's only the processing that goes haywire.
Some background on the report and data:
The data is pretty straightforward. It's based on a view that pulls transactions from the last 7 weeks. The number of records therefore ebbs and flows each week, at around 8,000 records. When SELECT * FROM the view is run in SSMS, the query takes about 5 seconds. I'll work on speeding that up after I resolve this processing issue.
No parameters are used, though the view does use getdate() to figure out which records to show from base tables.
No stored procedures are used.
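For reference, the view is shaped roughly like this (a simplified sketch; table and column names are made up):

    -- A simplified sketch only; table and column names are made up.
    CREATE VIEW dbo.vw_RecentActivity
    AS
    SELECT  t.TransactionId,
            t.Department,
            t.TransactionDate,
            t.Amount
    FROM    dbo.Transactions AS t
    WHERE   t.TransactionDate >= DATEADD(WEEK, -7, GETDATE());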
The report itself is made up of 6 panes within a single tablix, none of which has to draw more than a few dozen marks (except one, which is a map of US states).
The report does have one feature that may be related, though I am not sure how. The report definition filters the dataset based on a mod 3 of the built-in execution time variable, rotating the report to show "This Month", "This Week", or "Today" activity. There is an additional mod 3 on execution time which rotates visibility for the panes. The result is that every 30 seconds, a report with a different combination of charts shows up on a rotating time frame.
Don't know if this could be a cause, but it's the only element of the report that makes it remotely fancy. Everything else about the report is actually rather straightforward/plain.
While I would love to identify and eliminate the spikes, even a mechanism to automatically refresh on error, or refresh on timeout (or something to that effect) would do the trick. I need to be able to launch the report in the morning and have it run unmanned all day without any human interaction, refreshing every 30 seconds for the duration of business hours.
Just a recap of the comments:
It might be possible to push the web page into a frame, then add JavaScript to the main page to periodically refresh the frame. It wouldn't fix the underlying problem, but it might help.
It might be worth looking at the performance monitoring tools in SQL Server; it might be a database issue (a temp table filling up, etc.).
You noted that adding a WITH (NOLOCK) hint seemed to help locate the problem.
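For completeness, a minimal illustration of that hint (the view name is made up); NOLOCK reads uncommitted data, so it trades consistency for fewer blocking waits:

    -- Illustrative only: apply the hint to the report's source query.
    SELECT  TransactionId, Department, TransactionDate, Amount
    FROM    dbo.vw_RecentActivity WITH (NOLOCK);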