Currently we have all our users sign up for subscriptions (almost all at 8 AM daily) to each report they need, and we do not have caching enabled on the reports. Is it safe to say that each time a subscription runs it performs a full DB lookup and report generation? If we enable caching for, say, 30 minutes, would that reduce the DB workload?
Yes, the individual report is going to run the query and generate the report every time. If you had one subscription sending to multiple people, it would only occur once. Sounds like caching would be a good idea.
We wanted to show data in the browser from the reporting server, but sometimes it fails to load and takes more than 3 minutes. Is there a better approach to get the data faster?
There can be many factors behind a report being slow or failing to load. The first step would be analyzing the execution log (on the server where Reporting Services is installed, in the report server database, probably the ReportServer db, view ExecutionLog2). There you can see three crucial columns: TimeDataRetrieval, TimeProcessing and TimeRendering.
Extract from this discussion
TimeDataRetrieval - the sum of all DataSet durations
TimeProcessing - the number of milliseconds spent in the processing engine for the request
TimeRendering - the number of milliseconds spent after the Rendering Object Model is exposed to the rendering extension
That way you will know whether you need to tune your query or your report, and that's a good start; a sample query is sketched below.
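As a minimal sketch (assuming the default ReportServer catalog database name), something like this surfaces the slowest recent executions and shows how their time splits across those three columns:

SELECT TOP (50)
    ReportPath,
    TimeStart,
    TimeDataRetrieval,   -- ms spent running all DataSet queries
    TimeProcessing,      -- ms spent in the processing engine
    TimeRendering,       -- ms spent rendering the output
    [Status]
FROM ReportServer.dbo.ExecutionLog2
ORDER BY TimeDataRetrieval + TimeProcessing + TimeRendering DESC;

If TimeDataRetrieval dominates, tune the query; if TimeRendering dominates, look at the report design and export format.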
I have a weird issue with one of the data driven subscriptions on SSRS.
The subscription is a timed subscription that generates invoices (pdf/excel) and gets triggered by a stored procedure.
The issue we are facing is that the first run always takes 30-60 minutes regardless of how many invoices are being generated. Once the first run has completed, subsequent runs complete in under a minute throughout the day.
There is a second version of the same report that is run manually, and it runs fine (ruling out any delays with the data extraction bit).
I have looked at some other questions here, but that didn't help identify the problem:
SQL Reporting services: First call is very slow
SSRS report subscription not working sometime
Without knowing more about the query, data, database setup, other processes, etc., it will be quite difficult to say for sure. But if I had to guess, based on your description, it sounds like the query plan cache is lost and is rebuilt on the first run of the day. Without the plan, the query can be less efficient. Each subsequent run will use the plan created on the first run, and will therefore run more quickly. There are a number of reasons the query plan could be wiped from cache: a recompile, other queries using too much memory, not enough system memory to begin with, etc.
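One way to test that theory (a sketch only; the procedure name is a placeholder for whatever your subscription actually calls) is to check whether a cached plan still exists for the subscription's stored procedure just before the slow morning run:

SELECT
    p.objtype,
    p.usecounts,                 -- how many times the cached plan was reused
    s.last_execution_time,
    t.[text]
FROM sys.dm_exec_cached_plans AS p
JOIN sys.dm_exec_query_stats AS s
    ON p.plan_handle = s.plan_handle
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) AS t
WHERE t.[text] LIKE '%YourInvoiceProc%';  -- hypothetical procedure name

If no row comes back overnight but one appears after the first run, the plan-cache explanation fits.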
Hope that helps!
I find it difficult to think I'm the first to run into this, so either my searching ability has truly become sad or the solution is so apparent that nobody has asked before. Mox nix (no matter), I must ask.
This is SSRS 2012 with a few hundred reports on it and over a hundred subscriptions, and generally speaking it works fine.
This is a Native box, we have a separate box for the SharePoint version.
The monthly 'statement' report is data-driven and is fed over 100K personIds to process and export to PDF on a file server. The SQL takes 0.3 seconds, the PDF not much more. There are just so many of them. So when this one runs, all the others queue up behind it and wait, often until it completely finishes. Not good; month-end reports are important to a few departments.
My question- can I set the priority of this report somehow (or some other setting) to allow for other reports to process when they are scheduled?
It just boggles my mind that this is even an issue, but it is.
Thanks for any insights-
Craig
There's no way to set an individual report's priority, unfortunately.
I would look at reducing the number of required executions, if possible; failing that, you can try manually separating them into batches and assigning a different subscription to each batch. Sub 1 could handle rows 1 - 5000, sub 2 rows 5001 - 10000, and so on. Stagger the execution times and other subscriptions will be able to slip in 'in between' the batch-processing subscriptions; a sketch of the batch query is below.
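As a rough sketch (the table and column names are assumptions, not from your setup), each batch subscription's data-driven query could select one slice of the person list by batch number:

DECLARE @BatchNo int = 1, @BatchSize int = 5000;  -- sub 1 uses 1, sub 2 uses 2, ...

WITH Numbered AS (
    SELECT PersonId,                                 -- assumed column name
           ROW_NUMBER() OVER (ORDER BY PersonId) AS rn
    FROM dbo.StatementRecipients                     -- assumed source table
)
SELECT PersonId
FROM Numbered
WHERE rn >  (@BatchNo - 1) * @BatchSize
  AND rn <= @BatchNo * @BatchSize;

Each subscription gets its own copy of the query with a different @BatchNo, and its own staggered schedule.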
But really, if you truly need to generate 100k-plus reports, I don't think SSRS is the best option. Have you considered SSIS instead?
Dean Kalanquin has an excellent examination of how subscriptions in SSRS work, if you're looking to see what's happening under the hood:
http://blogs.msdn.com/b/deanka/archive/2009/01/13/diagnosing-and-troubleshooting-subscriptions.aspx
We have a 2010 BI SharePoint (SSRS 2012) site that has links to several databases:
Database A will be available at 12:00 AM
Database B will be available at 1:00 AM
Database C will be available at 2:00 AM
So I have a shared schedule set up for each database at the above times. How many reports should I have running in each shared schedule? For now I only have 10 sample reports, and they all kicked off and ran within the same second (maybe some kicked off a couple of seconds later). From that I infer they don't run in order, but rather asynchronously.
So, what is the limit, and will it kill my server's performance if I have hundreds of reports running? Or should I make more schedules and limit each one to around 30-40 reports?
I cannot infer from your question whether you are creating snapshots or email deliveries... in any case, there are tools you can use to fine-tune scheduling outside of SSRS. Within SSRS, I would recommend that you stagger your report requests if it makes sense for huge reports. In any case, you should stagger your schedules to some degree.
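To see how the load is actually distributed before deciding how far to stagger, a quick look at the ReportServer catalog (assuming the default database name) shows how many reports are attached to each schedule and when it next fires:

SELECT s.ScheduleID,
       s.NextRunTime,
       COUNT(rs.ReportID) AS ReportCount   -- reports attached to this schedule
FROM ReportServer.dbo.Schedule AS s
JOIN ReportServer.dbo.ReportSchedule AS rs
    ON rs.ScheduleID = s.ScheduleID
GROUP BY s.ScheduleID, s.NextRunTime
ORDER BY s.NextRunTime;

A schedule with a very high count relative to the others is the first candidate to split.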
Is it possible to cache a query or report the first time it is run? It seems that opening a report re-queries the data source. For certain queries, the data source does not change frequently enough for me to worry about a cache being out of date (users are notified when the database changes), and it would be much easier for the users to open the report instantly rather than having to wait several minutes every time they want to see the data (though I realize that if they close the file the caches will be lost - that's OK).
Data comes from an ODBC connection to Oracle, using Access 2003.
Most server databases do a lot of caching around ad hoc SQL statements: they cache the execution plan and keep recently read data pages in memory, so if the same SQL statement comes across the wire again, the result usually comes back much faster than retrieving it all from scratch. Oracle (in recent versions) can go further and cache the actual result set via its server result cache; SQL Server caches plans and data pages rather than results, but the effect for a repeated query is similar. Along with the caching, of course, is some form of checking to ensure the cached data is still up to date. I don't know what level of control the DBA has over how this works, but you might look into implementing this server-side. A temp table might also be a solution.
Could you maybe keep the report open the entire time the database is open? Open it hidden when the database is opened.
' Open the report hidden so its data stays loaded (acHidden is the WindowMode argument)
DoCmd.OpenReport "YourReport", acViewPreview, , , acHidden
Then never close it while the database is open.
Alternatively, since you can tolerate cache staleness, you could store the report's data in a local table for faster access. Since the users know when new data is available, give them a command button that empties the local table and re-pulls the latest Oracle data, along the lines of the sketch below.
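A minimal sketch of that refresh, assuming a local table named LocalReportCache and a linked Oracle table named OracleOrders (both names hypothetical), run as two queries (for example, two DoCmd.RunSQL calls) from the button's click event:

DELETE FROM LocalReportCache;

INSERT INTO LocalReportCache (OrderId, CustomerName, Amount)
SELECT OrderId, CustomerName, Amount
FROM OracleOrders;

With the report's record source pointed at LocalReportCache, opening the report reads only the local copy, so the only wait is at refresh time.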