Since one of the recent SQL Server Data Tools updates for Visual Studio 2013, the preview rendering has changed drastically.
Installed Version: Microsoft SQL Server Data Tools - enu (12.0.30919.1)
Visual Studio now spawns a separate process to render the preview data, without any output or diagnostic information. This preview service is extremely slow when displaying large amounts of data.
For a comparison:
A statistical report that displays around 20 pages of static, precalculated data normally renders instantly when executed in our production environment.
When previewed in Visual Studio, the exact same report may take 40-60 seconds to render.
Is there any way to optimize this sub-optimal behavior? Currently I have to change one tiny layout/grouping detail in around 30 reports, and the additional rendering time during development adds up quickly.
Related
I am using Visual Studio 2019 with the Microsoft Reporting Services Projects extension v2.6.7. The problem I am facing: I have a report that processes about 60k records; the report is complex and has groups, repeating headers, dataset filters, and VB code.
The stored procedure used for this report runs in less than 10 seconds, and when the report is deployed to the Report Server it finishes rendering in less than 2 minutes. But when I run the same report in Visual Studio in preview or run mode (Report Viewer), the report runs for a whopping 17-20 minutes. I have used SQL Profiler and can see that the stored procedure execution time is almost the same as the report execution time. The stored procedure is designed to handle the parameter sniffing issue, and I don't see any problem with the procedure itself.
On the report side, I have tried the settings that could impact performance, such as KeepTogether=false and the interactive size. They look fine.
I also tried adding WorkingSetMaximum to increase memory, but still no luck. The client I am working with requires the RDLC file to be integrated into their app and does not want to deploy to a Report Server, for their own reasons.
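For reference, WorkingSetMaximum is a Report Server service setting, not a local ReportViewer one, which may explain why it had no effect on an RDLC running in-process. It lives in RSReportServer.config under the Service element; a sketch with illustrative values (in kilobytes), not recommendations:

```xml
<!-- RSReportServer.config (Report Server service only; values are in KB).
     WorkingSetMaximum caps the service's memory; WorkingSetMinimum is the
     floor below which it will not release memory. Numbers here are examples. -->
<Service>
  <WorkingSetMaximum>4194304</WorkingSetMaximum>
  <WorkingSetMinimum>2097152</WorkingSetMinimum>
</Service>
```

A local ReportViewer (RDLC) runs inside the host application's process and is governed by that process's memory, so these server-side knobs do not apply to it.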
How can I make my report run faster in Visual Studio preview mode / Report Viewer (run mode), so that its performance matches what I get on the Report Server?
Also, if anyone could explain whether there is a difference in how report rendering works on the Report Server versus in preview mode, that would help.
Edit 1 - The Report Server and the database are both configured on my laptop; there is no difference in configuration.
Edit 2 - Another observation I have gathered by running SQL Profiler: during preview mode the connection is kept open, and the data retrieval time accounts for the full report run time; the two are the same. But when I run the report through Report Manager from the same machine, the procedure completes in seconds and even the report renders faster. And as I have mentioned above, I have taken care of parameter sniffing. I am now trying to understand whether the SSRS engine treats report rendering and data retrieval differently in preview versus when the report is deployed to Reporting Services.
I came across this Q&A discussion on MSDN. I tried to replicate it, and changing the trust level for CAS in the config file fixed the problem. But I still have a question about how the Report Viewer in Visual Studio behaves: is there a similar setting in the application config that can be used to improve development and test performance in Visual Studio?
MSDN Blog
Use the existing framework, but force the use of legacy CAS (code access security) security:
In WinForms: <NetFx40_LegacySecurityPolicy enabled="true" />
In an ASP.NET application: <trust legacyCasModel="true" level="Full" />
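For completeness, the two settings above sit in the respective config files roughly as follows (a sketch; the surrounding elements are standard .NET config boilerplate):

```xml
<!-- WinForms: app.config -->
<configuration>
  <runtime>
    <!-- Reverts the process to the pre-.NET 4.0 CAS policy model -->
    <NetFx40_LegacySecurityPolicy enabled="true" />
  </runtime>
</configuration>

<!-- ASP.NET: web.config -->
<configuration>
  <system.web>
    <!-- Legacy CAS model with full trust for the application -->
    <trust legacyCasModel="true" level="Full" />
  </system.web>
</configuration>
```

Note that enabling legacy CAS affects security behavior for the whole process, so it is worth confirming the rest of the application tolerates it before adopting this as more than a development-time workaround.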
I am working on an existing SSRS report. It works when viewed in a browser from the report server (~30 seconds to load), but loads for quite a while in the Visual Studio preview (~10 minutes). The queries themselves are fast, so something about the report specifically is causing the slowdown.
As for what I have tried so far: deleting the cached .data file (no fix), deploying the report (it works fine when deployed), recreating the data sources, and attempting to avoid bad query plans from parameter sniffing (see Fast query runs slow in SSRS). None of these fixed the problem. The problem only happens when using the local preview.
Many articles describe viewing the execution log to figure out which stage of the report (query, data processing, rendering) is taking so much time. I would like to do this so I can narrow down the cause, but the report works fine on the report server, just not in the local preview. How can I get the equivalent of the execution log when running a local preview, given there is no report server database?
I suspect that because forcing the query plans to recompile doesn't solve the problem, and because the report works fine with the same query against the same server when deployed, the actual problem might be some odd rendering bug. I have heard this can happen with certain pagination settings, but I want to confirm which general area this problem is in.
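If pagination is the suspect, one element worth checking directly in the .rdl file is InteractiveSize: a nonzero InteractiveHeight makes the renderer compute soft page breaks, which can be expensive in preview. A sketch of the relevant fragment (element names per the RDL schema; the 8.5in width is just an example):

```xml
<!-- In the .rdl: InteractiveHeight of 0 disables soft page breaks,
     which is a common cause of slow interactive (preview) rendering. -->
<Page>
  <InteractiveHeight>0in</InteractiveHeight>
  <InteractiveWidth>8.5in</InteractiveWidth>
</Page>
```

Disabling soft page breaks changes how the report paginates interactively, so this is a diagnostic step as much as a fix: if preview suddenly becomes fast, the time was going into pagination.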
My system
Windows 7, 64 bit
Microsoft Visual Studio Community 2013. Version 12.0.40629.00 Update 5
The question
I'm just getting started with SSRS, and I'm switching a lot between Design and Preview mode.
It frustrates me that this takes quite a while, particularly when I'm only adjusting the sizes of graphs, tables, etc. Is there any way to preview the report without reloading the data?
Unfortunately, there is no way to preview without running the queries. If it doesn't run a query, it (and you) can't know what data to expect and therefore what the report should look like.
Imagine if you were returning a textbox colour from your query for example. Without running the query the report would have no idea what colour the textbox should be.
To speed up the time it takes to generate the preview, is it possible to run the query against less data so it executes faster? For example, against a sample set that is only 10% of the total rows in your production database.
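One low-friction way to do that is a development-only row cap in the dataset query; a sketch, where @MaxRows is a hypothetical report parameter (defaulted to NULL so production behavior is unchanged) and the table name is a placeholder:

```sql
-- Development-only row cap for faster previews.
-- @MaxRows is a hypothetical INT report parameter; NULL means "no cap",
-- so a deployed report with the default value behaves exactly as before.
SELECT TOP (ISNULL(@MaxRows, 2147483647))
       s.*
FROM   dbo.SalesFact AS s      -- placeholder: your actual dataset query
ORDER BY s.SaleDate;
```

During design you set @MaxRows to, say, 500 so each preview only pulls a handful of rows; before deploying, you leave the default NULL in place (or remove the parameter entirely).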
I have SQL Server 2008 R2 with SSRS. I have created an SSRS report that may contain up to 3,000,000 rows.
When I tried to generate such a huge report, I saw the following:
The stored procedure (the one that brings the data into the report) ran for 50 seconds.
After that, the SSRS ReportingServicesService.exe started to consume a lot of memory; its working set grew to 11 GB. This took 6 minutes, and then the report generation failed with the following error message:
An error has occurred during report processing. (rsProcessingAborted)
There is not enough space on the disk.
“There is not enough space on the disk.” – this probably refers to the drive on that server that holds the Windows page file. The drive had 14 GB of free space.
A note: the report was not designed as a single-page report; it is divided into pages of 40 rows each. When I try to generate the same report with 10,000 rows, it takes just 1 minute.
The question is: can this be fixed somehow?
SSRS is extremely ill-suited to this kind of scenario; tools like BCP or SSIS are much better suited to the task. The question you ask ("can my situation be fixed?") is not really answerable here, other than by you demonstrating that it can be done.
In my experience though, I wouldn't think of trying to get 3 million rows to work in SSRS.
If you insist or are compelled to try anyways, here's a few things you can do to improve the situation:
Dive into the RDL and remove everything you don't need: font instructions, dimensions, images, etc. Check regularly in the designer that the file is still valid. You could even consider rebuilding the report with as few extra features as possible.
Move any expressions or dynamic SSRS bits into the query.
Remove all formatting and formats for cells.
Increase disk and memory space. If you are compelled to generate such big reports in SSRS, you're going to need it. Close every other application you can; your PC or server is going to need all the resources it can get for this. Normally I'd consider this a non-option / an indication that you need different tooling, but I'm beginning to sound like a broken record :)
Choose the export format wisely. Excel or PDF will take many additional resources; the CSV renderer, for example, will be much friendlier.
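On a deployed report, the rendering format can be forced through URL access rather than the Report Manager UI, which skips the HTML viewer entirely; a sketch (server name and report path are placeholders):

```
https://myserver/ReportServer?/Sales/BigExport&rs:Command=Render&rs:Format=CSV
```

Note the endpoint is ReportServer (the web service), not Reports (the Report Manager portal); rs:Format accepts the installed renderers, e.g. CSV, PDF, EXCELOPENXML.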
I have observed performance problems when rendering a 1000-page ServerReport using Microsoft.ReportViewer.WebForms versions 10 and 11 in Visual Studio 2013, running against a SQL Server 2012 Report Server.
This is not a query performance problem, since the underlying stored procedure returns the data in a few seconds. Running the report through Report Manager returns an HTML or PDF report in 30 seconds. It is only when the report is returned through the Report Viewer that an additional delay of several minutes occurs. Neither PageCountMode.Estimate nor local processing of the report has helped.
It appears to be a rendering issue. This is unexpected, because the rendering should be happening on the server. The client machines are less powerful than the server, and we do in fact want the server to render the report.
Some of the rendering is in fact being done on the client, despite the fact that this is a ServerReport.
Examination of the ExecutionLog2 view in the ReportServer database shows that the RPL format is returned in about 30 seconds. However, even though this is a ServerReport, the client then spends several minutes apparently performing additional rendering. There is no additional delay with older versions of the ReportViewer, which receive HTML4.0 instead of RPL. There is also no additional delay when all rendering is performed on the Report Server, as through Report Manager.
It would be possible to request HTML4.0 format through ServerReport.Render(), except that Render() cannot be called explicitly for a WebForms ReportViewer Control.
For large reports, it would be useful to be able to prevent the client from receiving RPL or performing any rendering. There is a Microsoft Connect item about this.