I find it difficult to think I'm the first to run into this, so either my searching ability has truly become sad or the solution is so apparent that nobody has asked before. Mox nix, I must ask.
We're running SSRS 2012 with a few hundred reports on it and over a hundred subscriptions, and generally speaking it works fine.
This is a Native-mode box; we have a separate box for the SharePoint-integrated version.
The monthly 'statement' report is data-driven and is fed over 100K personIds to process and export to PDF on a file server. The SQL takes 0.3 seconds, the PDF not much more. There are just so many of them. So when this one runs, all others queue up behind it and wait, often until it completely finishes. Not good. Month-end reports are important to a few departments.
My question: can I set the priority of this report somehow (or is there some other setting) to allow other reports to process when they are scheduled?
It just boggles my mind that this is even an issue, but it is.
Thanks for any insights-
Craig
There's no way to set an individual report's priority, unfortunately.
I would look at reducing the number of required executions, if possible. Failing that, you can try manually separating them into batches and assigning a different subscription to each batch: sub 1 could handle rows 1-5,000, sub 2 rows 5,001-10,000, and so on. Stagger the execution times and other subscriptions will be able to slip in between the batch-processing subscriptions.
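As a rough sketch of what each batch's subscription query might look like (dbo.Person and PersonId are made-up names standing in for whatever currently feeds the data-driven subscription):

-- Made-up names: dbo.Person / PersonId stand in for whatever feeds the
-- data-driven subscription. Each batch subscription gets its own query.
-- Subscription 1: first 5,000 ids
SELECT PersonId
FROM dbo.Person
WHERE PersonId BETWEEN 1 AND 5000;

-- If the ids are not contiguous, NTILE can carve the set into roughly equal
-- batches; subscription N filters on BatchNo = N.
WITH Batched AS (
    SELECT PersonId, NTILE(20) OVER (ORDER BY PersonId) AS BatchNo
    FROM dbo.Person
)
SELECT PersonId
FROM Batched
WHERE BatchNo = 1;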
But really, if you truly need to generate 100k-plus reports, I don't think SSRS is the best option. Have you considered SSIS instead?
Dean Kalanquin has an excellent examination of how subscriptions in SSRS work, if you're looking to see what's happening under the hood:
http://blogs.msdn.com/b/deanka/archive/2009/01/13/diagnosing-and-troubleshooting-subscriptions.aspx
I'm working with MS Reporting Services 2016. I noticed that the application domain is set by default to recycle every 12 hours. The impact on users after a recycle is either a slow response from Reporting Services or a failed report. Both disappear after a refresh of the report, but this is not ideal.
I have come across an SO answer where people suggest that you can turn off the scheduled recycle by setting the configuration attribute RecycleTime to zero.
I have also read about writing a script to manually restart Reporting Services, which also recycles the app domain, followed by a script that simply loads a report at a controlled time to remove the first-load issues. However, this all seems like a workaround to me and I would rather not have to do it.
My concern is that there must be a logical reason for having the scheduled recycle time, but I cannot find any information explaining this. Does anyone know if there is a negative impact from turning off the scheduled application domain recycle?
RecycleTime is a setting aimed at making sure SSRS isn't consuming RAM it doesn't need and potentially starving the rest of the machine. Disabling the refresh essentially removes the ability to claw back memory used during a brief period of intensive processing.
If you are confident your machine is suitably resourced, you can turn the refresh off. If not, schedule the refresh for an out-of-hours time and define a Cache Refresh Plan to cache any super-important reports immediately afterwards, to minimise user impact.
Further reading here: https://www.mssqltips.com/sqlservertip/2735/prevent-sql-server-reporting-services-slow-startup/
I guess I'm possibly oversimplifying this, but SSRS was designed to recycle every 12 hours (the default) for a reason. If it ain't broke, don't fix it. In my case, I wanted to control when the recycle occurred. I execute a one-line PowerShell script from a SQL Agent job at 6:50 am, then generate a subscription report at 7 am, which kick-starts SSRS, and the users do not see any performance degradation.
# Restart the SSRS Windows service; this also recycles the report server app domain
Restart-Service 'ReportServer'
Leaving the RecycleTime setting in the SSRS config file at 720 minutes lets the recycle occur again at 6:50 pm. Subscription reports generate throughout the night, so if a human gets on SSRS after hours there should be no performance issue because the system is already running.
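For reference, here is a rough sketch of how that Agent job could be set up in msdb (the job and schedule names are placeholders; adjust the time and service name to your instance):

-- Sketch only: create a daily 6:50 am SQL Agent job whose single step runs
-- the PowerShell one-liner above. Names are placeholders.
USE msdb;
EXEC dbo.sp_add_job      @job_name = N'Recycle SSRS';
EXEC dbo.sp_add_jobstep  @job_name = N'Recycle SSRS',
                         @step_name = N'Restart ReportServer service',
                         @subsystem = N'PowerShell',
                         @command = N'Restart-Service ''ReportServer''';
EXEC dbo.sp_add_schedule @schedule_name = N'Daily 0650',
                         @freq_type = 4,             -- daily
                         @freq_interval = 1,
                         @active_start_time = 65000; -- 06:50:00
EXEC dbo.sp_attach_schedule @job_name = N'Recycle SSRS', @schedule_name = N'Daily 0650';
EXEC dbo.sp_add_jobserver   @job_name = N'Recycle SSRS';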
Are we possibly overthinking it?
I'm developing a report in SSRS for display in a public location within the company. The report shows up-to-the-minute data on department activity, so it is set to refresh every 30 seconds or so.
The problem I'm running into is that once every few hours, the report throws an error (rsErrorReadingNextDataRow and rsProcessingAborted). All I have to do is hit refresh in the browser (the report is rendered in an internet browser) and it reruns and charts along happily for a while.
After looking at the ReportServer execution log, it appears that these errors coincide with a phenomenal spike in processing time. On average, each time the report refreshes, processing time is 400-800 ms, but when it spikes it goes to ~90,000 ms (yes, five digits) and then throws this error.
Being new to SSRS, I am not sure where to begin looking for the root cause.
Can anyone give me pointers as to where I can start to find out what is causing the processing time to rocket like that? Data retrieval is stable at ~5,000 ms, and rendering time is stable at around 200 ms. It's only the processing that goes haywire.
Some background on the report and data:
The data is pretty straightforward. It's based on a view that pulls transactions from the last 7 weeks, so the number of records ebbs and flows each week at around 8,000 records. When SELECT * FROM the view is run in SSMS, the query takes about 5 seconds. I'll work on speeding that up after I resolve this processing issue.
No parameters are used, though the view does use getdate() to figure out which records to show from base tables.
No stored procedures are used.
The report itself is made up of 6 panes within a single tablix, none of which has to draw more than a few dozen marks (except one, which is a map of US states).
The report does have one feature that may be related, though I am not sure how. The report definition filters the dataset based on a mod 3 of the built-in execution time variable, rotating the report to show "This Month", "This Week", or "Today" activity. There is an additional mod 3 on execution time which rotates visibility for the panes. The result is that every 30 seconds, a report with a different combination of charts shows up on a rotating time frame.
Don't know if this could be a cause, but it's the only element of the report that makes it remotely fancy. Everything else about the report is actually rather straightforward/plain.
While I would love to identify and eliminate the spikes, even a mechanism to automatically refresh on error, or refresh on timeout (or something to that effect) would do the trick. I need to be able to launch the report in the morning and have it run unmanned all day without any human interaction, refreshing every 30 seconds for the duration of business hours.
Just a recap of the comments:
- It might be possible to push the web page into a frame, then add javascript to the main page to periodically refresh the frame. It wouldn't fix the underlying problem, but it might help.
- It might be worth looking at the performance monitoring tools in SQL Server - it might be a database issue (a temp table filling up, etc.).
- You noted that adding a WITH (NOLOCK) hint seemed to help locate the problem (a quick sketch is below).
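For example, if the dataset query is a straight select from the view, the hint can be applied there (the view name is a placeholder; keep in mind NOLOCK permits dirty reads, so it trades consistency for not being blocked):

-- Placeholder view name; the hint propagates to the view's base tables.
SELECT *
FROM dbo.vwDepartmentActivity WITH (NOLOCK);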
In my MS Access application I have several forms that are very data-intensive (several subforms based on even more tables). My users are complaining that when opening the data across the network the load times are unbearably long.
I do have a split front-end/back-end setup, using the excellent autofe application.
One solution I have come up with is that instead of DoCmd.Close when the user clicks the "Save & Close" button, I do Me.Visible = False. The user then has the long wait the first time the form is opened after the application loads, but for later loads performance is improved by a noticeable amount.
So far this has been working fairly well. I am just concerned that there may be some gotchas hidden in this strategy that I haven't encountered yet.
My users aren't overly intelligent, and I don't use the application myself, so I can't expect to get meaningful feedback if something is behaving erratically.
Has anyone else employed this strategy successfully, or does anyone know of a good reason not to do it?
Yes, that strategy is similar to recipe #8.1 Accelerate the Load Time of Forms from the second edition of the Access Cookbook. However, that recipe pre-loads a set of forms, with WindowMode:=acHidden, at database startup. So the tradeoff is that database startup takes longer, but subsequent form opens (for the pre-loaded forms) are comparatively fast.
The discussion for that recipe didn't mention any drawbacks for that technique. In limited use, I haven't discovered any. And since it seems to improve your users' experience, I would continue to use it.
Beyond that, I would take a hard look at the amount of data your forms pull from the back-end database. Limit the number of rows retrieved as the Record Sources for the main form and subforms, and give the user a method to select a different record or a small set of records. Also make sure you use indexing to support the Record Source WHERE and ORDER BY clauses, and avoid WHERE conditions that use functions, which force a full table scan to figure out which rows to exclude. Similar considerations apply to combo and list boxes which use saved queries or SELECT statements as their Row Sources; if you can't limit the rows, at least make sure to optimize data retrieval.
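To illustrate the point about functions in WHERE clauses, here is a minimal Access SQL sketch (Orders, OrderID, and OrderDate are made-up names, and since Access SQL doesn't allow comments the explanation goes here): the first query wraps the indexed date field in a function, so every row has to be scanned; the second expresses the same filter as a range that an index on OrderDate can satisfy.

SELECT OrderID, OrderDate
FROM Orders
WHERE Year(OrderDate) = 2015;

SELECT OrderID, OrderDate
FROM Orders
WHERE OrderDate >= #1/1/2015# AND OrderDate < #1/1/2016#;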
First off, just hiding the form is not too bad, I think.
I would dig a bit more into WHY your load times are so long. You mentioned several subforms. Are they all displayed at the same time, or are they on the various pages of a Tab control?
In the latter case, you could quite easily unbind the subforms that are not visible and bind them in the page's Click event. That makes a big difference in performance.
EDIT:
Also, a bit out of scope for this question, but good for every performance issue:
- Did you double-check that the foreign keys in the related tables are properly indexed? (A sketch follows below.)
- Make sure the back end is regularly compacted.
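For the foreign key point, a one-line sketch (OrderDetails and OrderID are made-up names): the index can be added in the table's design view, or by running a DDL query against the back-end, something like:

CREATE INDEX idxOrderDetailsOrderID ON OrderDetails (OrderID);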
Are you making sure that the data gets refreshed in an appropriate timeframe?
Yes, I've done the same thing myself in very complex forms which had about 10 or 15 tabs, each with a subform. It worked for at least ten years. You have to watch for various form-level values or unbound controls which you assume start as Null or zero. But once it's running smoothly it should run just fine. We had to do this back in the Access 97 days because Access would crash with out-of-memory errors after the users had opened and closed various forms thousands of times per day.
How can I convert a database from MySQL to MS SQL Server 2005?
You can use SSIS to copy over the table data to the new structure, but that is the easy part. Next you need to check all your SQL code to make sure it will still work. This link can help you see the differences in how each database implements SQL:
http://troels.arvin.dk/db/rdbms/
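To give a flavour of the sort of thing that typically needs rewriting (a simplified, made-up example; the real differences run much deeper):

-- MySQL
SELECT `name`, IFNULL(phone, 'n/a') AS phone
FROM customers
ORDER BY name
LIMIT 10;

-- SQL Server 2005 equivalent
SELECT TOP 10 [name], ISNULL(phone, 'n/a') AS phone
FROM customers
ORDER BY name;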
While you are converting, you might also consider whether now is a good time to do some refactoring.
The key piece of doing a conversion, though, is to make sure that everything is automated and reproducible. You are going to want to do this several times in dev before moving to prod data. And when you go to prod, you will need to take the database down for maintenance, or you will end up having data added to the old database after you have moved the data from that table to the new one.

You might even want to build the process to copy over the bulk of the records before the maintenance window, and then during the maintenance window only move the new records or records which have changed since the main move. This will depend on how big your database is and how long you will have to be down to move the records. If it can be done in one step without being down for longer than your system can tolerate, it is better to do that. Another choice for a large database might be a client-by-client data movement, so that instead of being down for everyone for a full day, you are only down for a couple of hours per client. Again, this depends on your database design and how feasible that might be to set up and do.
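A rough sketch of that delta step, assuming the bulk copy lands rows in a staging table on the SQL Server side and the source rows carry a reliable last-modified timestamp (every table and column name here is invented):

-- Invented names. @cutoff is when the bulk copy was taken; only rows
-- touched since then are applied during the maintenance window.
DECLARE @cutoff datetime;
SET @cutoff = '20090601';

-- New rows that arrived after the bulk copy
INSERT INTO dbo.Customers (CustomerID, CustomerName, LastModified)
SELECT s.CustomerID, s.CustomerName, s.LastModified
FROM staging.Customers AS s
WHERE s.LastModified > @cutoff
  AND NOT EXISTS (SELECT 1 FROM dbo.Customers AS t
                  WHERE t.CustomerID = s.CustomerID);

-- Existing rows that changed after the bulk copy
UPDATE t
SET t.CustomerName = s.CustomerName,
    t.LastModified = s.LastModified
FROM dbo.Customers AS t
JOIN staging.Customers AS s ON s.CustomerID = t.CustomerID
WHERE s.LastModified > @cutoff;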
Whatever you do, make sure the users are fully aware of what you are doing and when, well in advance, so they can plan. Also avoid times of the month for the change that would coincide with a need for the database to be up and running - I'm thinking in terms of don't close the payroll database the day that payroll runs, or the financial database when end-of-fiscal-year tasks need to be done or monthly reports run, etc. I don't know if you have any of those issues, but it is good to consider them if you do and work around those periods. If the users say, "No, we can't do that on Friday", then find out why - they may have a really good reason why the day you chose to implement is bad for their own work schedules.
Here is an application that will do the conversion for you:
http://www.spectralcore.com/fullconvert/tutorials/convert-mysql-to-mssql-sql-server.php
This white paper by Microsoft may also help:
http://technet.microsoft.com/en-us/library/cc966396.aspx
I am just curious to know how long, in minutes, Reporting Services takes to generate a report when it returns 1 MB of data. Assume it uses views and the tables are properly indexed. This is SSRS with server-side generation.
Report generation time has two components:
- Data Acquisition time
- Render Time
So for 1 MB of data, how many records (rows) are we talking about? How many pages will the report have? How many controls per page? Does the report use charting? These are the factors that will determine generation time.
For most reports, data acquisition time is the most significant factor. Your report is never going to run faster than the raw data acquisition. So if you are using SQL, the report can't generate faster than the time required to run the query. I have seen queries that return far more than 1 MB of data very quickly. I have also seen queries that return very little data, yet run for a long time.
On the render side, there are a couple of things that can cause a report to be slow. The first is report aggregation: if a report needs to receive all of the records prior to starting rendering, its performance will suffer, to a degree that depends on the reporting tool. With large data sets (more than 10,000 records), you can get significant improvements in rendering by doing the aggregation at the source (the DB). The other is charting, which typically involves heavy rendering overhead and aggregation.
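A minimal sketch of aggregating at the source (table and column names are invented): instead of shipping every detail row to the report and letting it group them, have the database return one row per group.

-- Invented names. Rather than returning every detail row and letting the
-- report aggregate, return one pre-aggregated row per region and month.
SELECT Region,
       DATEADD(MONTH, DATEDIFF(MONTH, 0, SaleDate), 0) AS SaleMonth,
       SUM(Amount) AS TotalAmount,
       COUNT(*)    AS SaleCount
FROM dbo.Sales
GROUP BY Region, DATEADD(MONTH, DATEDIFF(MONTH, 0, SaleDate), 0);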
Most reporting systems allow you to build in timers or logging that will help you to performance tune the report. It is best to build timers into the report that will tell you what percentage of time the report is spending getting the data, and what percentage is spent rendering. When you have this information, you will know where to focus your energies.
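Since the question mentions SSRS: a lot of that instrumentation is already built in. The ExecutionLog views in the ReportServer catalog database record, per execution, how many milliseconds went to data retrieval, processing, and rendering (the view is ExecutionLog3 on 2008 R2 and later; older versions expose ExecutionLog/ExecutionLog2). For example:

-- Standard SSRS catalog view and columns; times are in milliseconds.
SELECT TOP 50
       ItemPath,
       TimeStart,
       TimeDataRetrieval,   -- running the dataset queries
       TimeProcessing,      -- grouping, sorting, expressions
       TimeRendering,       -- producing the output format
       [RowCount],
       Status
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;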
If you are really trying to evaluate the performance of the reporting tool, the best way is to build a report that either reads a flat file or generates the data through code. In other words, eliminate the impact of the database and see how fast your reporting tool can generate pages.
Hope this helps.
How long is acceptable? It depends on what the report is doing, how often it's run, things like that. Anything below 30 seconds would be fine if it's run once every day or two. If it's run once a week or once a month that number could be a lot higher.
The report itself is generally very fast; if you're seeing a hang-up, you may want to check the execution time of the query which generates the data. A complex query can take a long time, even if it only returns a little data.
I've found, when using BIRT and other reporting systems, that the best improvements tend to come from offloading most of the work to the database at the back end.
In other words, don't send lots of data across the wire and sort or group it locally. The database is almost certainly going to outperform you with its SQL ORDER BY and GROUP BY clauses and its optimized indexes (among other things).
That way, you get faster extraction of the data you want AND less network traffic.
As several people have said already, a general question like this really can't be answered. However, I wrote up Turbo-charge Your Report Speed – General Rules & Guidelines (disclaimer: I'm the CTO at Windward Reports, a competitor to SSRS). I think it will help you look for ways to speed up the process.
And with all the caveats that the specifics matter a lot: on a 3 GHz workstation we generally see 7-30 pages/second. Keep in mind these are numbers for Windward, not SSRS.