How to show large SSRS reports in the browser

We want to show data in the browser from the reporting server, but sometimes it fails to load or takes more than 3 minutes. Is there a better approach to get the data faster?

There can be many factors behind a report being slow or failing to load. The first step would be analyzing the execution log (on the server where Reporting Services is installed, in the report server database, probably the ReportServer db, view ExecutionLog2). There you can see three crucial columns: TimeDataRetrieval, TimeProcessing and TimeRendering.
An extract from that discussion:
- TimeDataRetrieval - contains the sum of all DataSet durations
- TimeProcessing - the number of milliseconds spent in the processing engine for the request
- TimeRendering - the number of milliseconds spent after the Rendering Object Model is exposed to the rendering extension
That way you will know whether you need to tune your query or your report, and that's a good start.
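For example, a minimal sketch (assuming the default ReportServer catalog database name; adjust it to match your install) that pulls the slowest recent executions so you can see which of the three buckets dominates:

```sql
SELECT TOP (50)
       ReportPath,
       TimeStart,
       TimeDataRetrieval,   -- milliseconds spent running the dataset queries
       TimeProcessing,      -- milliseconds spent in the processing engine
       TimeRendering,       -- milliseconds spent in the rendering extension
       [Status],
       ByteCount,
       [RowCount]
FROM   ReportServer.dbo.ExecutionLog2
ORDER BY TimeDataRetrieval + TimeProcessing + TimeRendering DESC;
```

If TimeDataRetrieval dominates, tune the dataset queries; if TimeProcessing or TimeRendering dominates, look at the report design (grouping, expressions, page size) instead.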

Related

Kingswaysoft SSIS create Contacts Slow

I’m currently doing some testing for an upcoming data migration project and came across Kingswaysoft which seemed like it would be ideal for this purpose.
However I’m currently testing importing 225,000 contact records into a new sandbox Dynamics 365 instance and it is on course to take somewhere between 10 and 13 hours.
Is this typical of the speeds I should expect or am I doing something silly?
I am setting only some out of the box fields such as first name, last name, dob and address data.
I have a staging contact SQL database holding the 225k records to be uploaded.
I have the CRM Destination Component set up to use multi-threading with a batch size of 250 and up to 16 threads.
I have tested using both Create and Upsert, and both are very slow.
Am I doing something wrong? I would have expected it to be much quicker.
When it comes to data loads into Dynamics 365 Online, the most important factor affecting performance is network latency. You should try to put the data migration solution as close as possible to the Dynamics 365 Online server. If you have the configuration right, you should be able to achieve something like 1 to 2 million records per hour. The speed you are getting is too slow; there must be something else going on. There are many other things that can affect data load performance, but start with network latency first. We have some other tips shared at https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/help-manual/crm/advanced-topics#MaximizedPerformance, which you should check out.
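Just to put the question's own numbers in perspective (this is my arithmetic, not part of the answer above), a rough back-of-envelope shows how far off the observed throughput is:

```sql
-- Numbers taken from the question: 225,000 records, batch size 250,
-- 16 threads, and a projected 10-13 hour run (11.5 h used here).
DECLARE @records   float = 225000;
DECLARE @hours     float = 11.5;
DECLARE @batchSize float = 250;
DECLARE @threads   float = 16;

DECLARE @recPerSec float = @records / (@hours * 3600);     -- ~5.4 records/second overall
DECLARE @secPerBatchPerThread float =
        @batchSize / (@recPerSec / @threads);              -- ~740 seconds per 250-record batch

SELECT @recPerSec            AS RecordsPerSecond,
       @secPerBatchPerThread AS SecondsPerBatchPerThread;
-- Roughly 12 minutes per batch per thread, far longer than a normal
-- service round trip, so something beyond the package settings
-- (latency, throttling, server-side processing) is slowing things down.
```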

Data driven SSRS subscription delayed execution on the first run

I have a weird issue with one of the data driven subscriptions on SSRS.
The subscription is a timed subscription that generates invoices (pdf/excel) and gets triggered by a stored procedure.
The issue we are facing is that the first run always takes 30-60 minutes regardless of how many invoices are being generated. Once the first run has completed, the subsequent runs complete in under a minute throughout the day.
There is a second version of the same report that is run manually, and it runs fine (ruling out any delays with the data extraction bit).
I have looked at some other questions here, but that didn't help identify the problem:
SQL Reporting services: First call is very slow
SSRS report subscription not working sometime
Without knowing more about the query, data, database setup, other processes, etc., it is quite difficult to say for sure. But if I had to guess, based on your description, it sounds like the query plan cache is lost and is rebuilt on the first run of the day. Without the plan the query can be less efficient. Each subsequent run will use the plan created on the first run, and will therefore run more quickly. There are a number of reasons the query plan could be wiped from cache: a recompile, other queries using too much memory, not enough system memory to begin with, etc.
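If you want to test that theory (a sketch, assuming the report's data source is SQL Server and you can recognize the invoice query by a fragment of its text; 'YourInvoiceProc' below is a placeholder), the plan cache DMVs will show when the current plan was compiled and how often it has been reused:

```sql
SELECT  st.[text]         AS query_text,
        qs.creation_time,                        -- when the current plan was compiled
        qs.last_execution_time,
        qs.execution_count,
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_microseconds
FROM    sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE   st.[text] LIKE '%YourInvoiceProc%'       -- placeholder, substitute your procedure name
ORDER BY qs.last_execution_time DESC;
```

A creation_time that resets around the time of the slow first run would support the lost-plan explanation.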
Hope that helps!

SSRS reports time out on dynamic queries after a few days

We're using SSRS 2012 with a number of reports driven by a Query.CommandText reference to a stored procedure executing dynamic SQL (sp_executesql). These are consumed from a web application where the user specifies the report, criteria, etc. After a few days, the report requests will time out, even though the underlying stored procedure executes within a few seconds (the same stored procedure feeds a search result screen and the report). Other reports that do not use dynamic SQL continue to execute fine. The only remedy we've found is to restart the SSRS service. After the initial spin-up, the same report will execute within a few seconds.
The SSRS logs don't seem to point to any issue, though I'm certainly not an expert at reading them. Comparing a slow run to a quick one, the only difference seems to be the timestamps, spread evenly between the start and the end. We do see "ReportProcessingException: There is no data for the field at position xx", but on both the slow and fast runs. Running the report from the Reports portal takes about 10 minutes when it's in slow mode.
My suspicion is that some caching is going on and SSRS is influencing the SQL execution plan.
Any suggestions or requests for more specifics would be very welcome.
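Not from the thread, but one cheap way to test the plan-caching suspicion is to force a fresh plan for the dynamic statement and see whether the timeouts go away. The sketch below is generic, and the view and parameter names are placeholders, not the actual objects:

```sql
-- @sql and dbo.SomeSearchView are placeholders for whatever the real
-- procedure builds; the point is the OPTION (RECOMPILE) hint on the
-- dynamic statement, which compiles a fresh plan on every execution.
DECLARE @sql nvarchar(max) = N'
    SELECT  col1, col2
    FROM    dbo.SomeSearchView
    WHERE   created >= @fromDate
    OPTION (RECOMPILE);';

EXEC sys.sp_executesql
     @sql,
     N'@fromDate datetime2',
     @fromDate = '2014-01-01';
```

If that removes the slowdown, the issue is likely a cached plan built for unrepresentative parameter values (parameter sniffing) rather than SSRS itself.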

SSRS Data Driven Subscription Multiple reports running concurrently

We have a 2010 BI Sharepoint (SSRS 2012) site that has links to several databases:
Database A will be available at 12:00am
Database B will be available at 1:00am
Database C will be available at 2:00am
So I have a shared schedule set up for each database at the above times. How many reports should I have running in each shared schedule? For now I only have 10 sample reports, and they all kicked off and ran within the same second (maybe some kicked off a couple of seconds later). From that I infer they don't run in order, but rather asynchronously.
So, what is the limit, and will it kill my server's performance if I have hundreds of reports running? Or should I make a schedule and limit each schedule to run about 30-40 reports?
I cannot tell from your question whether you are creating a snapshot or an email delivery... In any case, there are tools you can use to fine-tune scheduling outside of SSRS. Within SSRS, I would recommend that you stagger your report requests if it makes sense for huge reports; you should stagger your schedules to some degree.
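If it helps to see how loaded each shared schedule already is, a query like this against the report server catalog counts the subscriptions attached to each schedule (a sketch only; these are internal, undocumented tables that may differ between versions, and ReportServer is the assumed database name):

```sql
-- Count subscriptions attached to each shared schedule.
SELECT  s.Name                   AS ScheduleName,
        COUNT(rs.SubscriptionID) AS SubscriptionCount
FROM    ReportServer.dbo.Schedule       AS s
JOIN    ReportServer.dbo.ReportSchedule AS rs ON rs.ScheduleID = s.ScheduleID
GROUP BY s.Name
ORDER BY SubscriptionCount DESC;
```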

Maximum time a report should take to generate

I am just curious to know how long, in minutes, the reporting service takes to generate a report when it returns 1 MB of data. Assume it uses views and the tables are properly indexed. This is SSRS reporting with server-side generation.
Report generation time has two components:
- Data Acquisition time
- Render Time
So for 1 MB of data, how many records (rows) are we talking about? How many pages will the report have? How many controls per page? Does the report use charting? These are the factors that will determine generation time.
For most reports, data acquisition time is the most significant factor. Your report is never going to run faster than the raw data acquisition. So if you are using SQL, the report can't generate faster than the time required to run the query. I have seen queries that return far more than 1 MB of data very quickly. I have also seen queries that return very little data yet run for a long time.
On the render side, there are a couple of things that can cause a report to be slow. The first is report aggregation. If a report needs to receive all of the records before it starts rendering, its performance will suffer; how much depends on the reporting tool. With large data sets (more than 10,000 records), you can see significant improvements in rendering by doing the aggregation at the source (the database). The other is charting, which typically involves heavy rendering overhead and aggregation.
Most reporting systems allow you to build in timers or logging that will help you to performance tune the report. It is best to build timers into the report that will tell you what percentage of time the report is spending getting the data, and what percentage is spent rendering. When you have this information, you will know where to focus your energies.
If you are really trying to evaluate the performance of the reporting tool, the best way is to build a report that either reads a flat file or generates the data through code. In other words, eliminate the impact of the database and see how fast your reporting tool can generate pages.
Hope this helps.
How long is acceptable? Depends on what it's doing, how much it's run, things like that. Anything below 30 seconds would be fine if it's run once every day or two. If it's run once a week or once a month that number could be a lot higher.
The report itself is generally very fast, if you're seeing a hangup you may want to check the execution time of the query which generates the data. A complex query can take a long time, even if it only returns a little data...
I've found, when using BIRT and other reporting systems that the best improvements tend to come by offloading most of the work to the database at the back end.
In other words, don't send lots of data across the wire and sort or group it locally. The database is almost certainly going to outperform you with its SQL ORDER BY and GROUP BY clauses and its indexes (among other things).
That way, you get faster extraction of the data you want AND less network traffic.
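As an illustration only (the table and column names below are made up), the difference is between shipping every detail row to the report and letting the database return the pre-aggregated rows the report actually displays:

```sql
-- Table and column names are made up for the example.

-- Instead of pulling every detail row and grouping it in the report:
-- SELECT Region, Amount FROM dbo.Sales;

-- ...return the pre-aggregated rows the report actually displays:
SELECT   Region,
         SUM(Amount) AS TotalAmount,
         COUNT(*)    AS OrderCount
FROM     dbo.Sales
GROUP BY Region
ORDER BY TotalAmount DESC;
```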
As several said already, a general question like this really can't be answered. However, I wrote up Turbo-charge Your Report Speed – General Rules & Guidelines (disclaimer - I'm the CTO at Windward Reports, a competitor of SSRS). I think that will help you look for what you can do to speed up the process.
And with all the caveats that the specifics matter a lot: on a 3 GHz workstation we generally see 7-30 pages/second. Keep in mind these are numbers for Windward, not SSRS.