I have a problem with SQL Server 2008 Reporting Services: the report is sometimes too slow to render (it takes more than 30 minutes), although when I took the query and executed it in SQL Server Management Studio it didn't take more than 25 seconds.
The query returns a large table (about 5000 rows) that I use to draw a pie chart in the report. I tried to optimize the query so that it returns only 4 rows, but the report was slow again.
What confuses me is that sometimes the report (with different input) is as fast as the query (about 30 seconds). I thought it might be down to a low number of users, so I asked some colleagues to view it at the same time, but the reports were still fast. I tried changing the configuration, but had no luck.
I've been searching for a solution to this problem for more than two months, so if anyone could help me with this I would be very thankful.
If you have access to the ReportServer SQL database, execute the following query (or similar) against the ExecutionLog view:
SELECT TimeStart, TimeEnd, TimeDataRetrieval, TimeProcessing, TimeRendering, Status, ReportID
FROM ExecutionLog
This will provide you with a good breakdown of your report executions (with different parameters).
Pay close attention to TimeDataRetrieval, TimeProcessing and TimeRendering.
A high value in any of these columns will show you where your bottleneck is.
One problem that I have experienced in the past is returning a fairly large dataset to the report (5000 rows is large enough for this scenario) and then using the inbuilt SSRS filtering: rendering becomes very slow, and this results in a very high TimeRendering value.
Grouping and filtering should be done at the database layer; they do not perform well with large amounts of data when performed in the SSRS report itself, as in the sketch below.
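A minimal sketch of the idea, with hypothetical table and column names: aggregate in the query so the report only ever receives the handful of rows the pie chart needs.
-- hypothetical names: pre-aggregate in T-SQL instead of in the report
SELECT Region, SUM(SalesAmount) AS TotalSales
FROM dbo.Sales
GROUP BY Region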
Related
I have a SQL view which takes 4-5 seconds to run, with no filters, if I run it within SSMS. If I try to open the linked "table" in Access 2010, it times out.
In Options - Client Side Settings, I set the OLE/DDE timeout to 0 and the ODBC timeout to 0. I still get "ODBC--call failed. [Microsoft][ODBC SQL Server Driver]Query timeout expired (#0)". Once I click OK, I get another message: "[current application name] can't open the table in Datasheet view".
I just don't understand why I can't open this linked table in Access, when the underlying view only has 88 records right now. There are a lot of columns but few rows, and it only takes a few seconds to run in SSMS. Why does it time out and have such a problem as a linked table in Access?
Any help is greatly appreciated.
Thanks!
So I was looking at this puzzle with a colleague. It would be difficult, and performance would still be poor, to translate this 118-line query with 30 table joins into an Access query.
Instead, I am breaking the sections of the giant view into separate smaller views. Each independent view will be joined in an Access query, so that each section of the query can be filtered independently, allowing for smaller result sets and improving overall performance (see the sketch below).
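A minimal sketch of the split, with hypothetical object names (the real view has far more joins): each section of the giant view becomes its own small view, and Access joins the linked views in its own query so each side can be filtered before the join.
-- hypothetical names: one small view per section of the original giant view
CREATE VIEW dbo.vw_OrderHeaders AS
SELECT o.OrderID, o.CustomerID, o.OrderDate
FROM dbo.Orders AS o;
GO
CREATE VIEW dbo.vw_OrderTotals AS
SELECT d.OrderID, SUM(d.Quantity * d.UnitPrice) AS OrderTotal
FROM dbo.OrderDetails AS d
GROUP BY d.OrderID;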
I am an ex-MultiValue developer who over the last 6 months has been thrust into the world of SQL, and I apologise in advance for the length of the question. So far I have got by with general instinct (and maybe some ignorance!) and a lot of help from the good people on this site answering previously asked questions.
First some background …
I have an existing reporting database (SQL Server) and a new application (using MySQL) that I am looking to copy data from at 30-minute, hourly or daily intervals (the interval will be based on reporting needs). I have a linked server created so that I can see the MySQL database from SQL Server, and I have the relevant privileges on both databases to do reads/writes/updates etc.
The data I am looking to move to reporting on the 30-minute or hourly schedule is typically header/transaction data by nature, and the tables have both created and modified date/time stamp columns available for use.
Looking at the reporting DB's other feeds, MERGE is the statement used most frequently across linked servers, but to other SQL Server databases. The MERGE statements also seem to do a full table-to-table comparison, which in some cases takes a while (>5 mins) to complete. Whilst MERGE seems to be a safe option, I do notice a performance hit on reporting whilst the larger tables are being processed.
Looking at delta loads only, using dynamic date ranges (e.g. between -1 hour:00:00 and -1 hour:59:59) on the created and modified timestamps, my concern is that the failure of any one job execution could leave the databases out of sync.
Rather than initially asking for specific SQL statements, what I am looking for is a general approach/statement design for the more regularly (hourly) executed statements, the ideal being to safely perform delta loads of just the new or modified rows over a SQL Server to MySQL connection.
I hope the information given is sufficient; any help/suggestions/pointers to reading material are gratefully accepted.
Thanks in advance
Darren
I have done a bit of “playing” over the weekend.
The approach I have working pulls the data (inserts and updates) from MySQL via OPENQUERY into a CTE. I then merge the CTE into the SQL Server table.
The OPENQUERY part seems slow (by comparison to other linked tables), but the MERGE is much faster because the amount of source data is limited.
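For anyone after the shape of this, here is a minimal sketch under stated assumptions: MYSQL_LINK is a hypothetical linked server name, the table and column names are placeholders, and the pull window is deliberately wider than the schedule (2 hours for an hourly job) so a single failed run does not leave rows behind.
-- the WHERE clause runs on the MySQL side, limiting the rows pulled across
;WITH src AS (
    SELECT *
    FROM OPENQUERY(MYSQL_LINK,
        'SELECT id, col1, col2, modified_at
         FROM source_table
         WHERE modified_at >= DATE_SUB(NOW(), INTERVAL 2 HOUR)')
)
MERGE dbo.target_table AS tgt
USING src
    ON tgt.id = src.id
WHEN MATCHED AND src.modified_at > tgt.modified_at THEN
    UPDATE SET tgt.col1 = src.col1,
               tgt.col2 = src.col2,
               tgt.modified_at = src.modified_at
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, col1, col2, modified_at)
    VALUES (src.id, src.col1, src.col2, src.modified_at);
-- no WHEN NOT MATCHED BY SOURCE clause: the source is only a time window,
-- so rows outside it must not be touched in the target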
I have an SSRS report which uses 4 datasets, each retrieving approx 1000 records.
These records are presented in SSRS using grouping (row-wise/column-wise) to generate group counts and percentages.
There are 4 such tables.
The procedures used to fetch the data run in less than 30 seconds in SQL Server, but the report takes much longer to render:
TimeDataRetrieval  TimeProcessing  TimeRendering
1250648            219             214
(values in milliseconds)
The behaviour is:
In Visual Studio it gives the error:
Object has been disconnected or does not exist at the server.
When deployed locally, it sometimes renders the report, but takes 10 minutes or more.
Could you please suggest what could be the issue?
This could be a "parameter sniffing" issue. Try adding OPTION (RECOMPILE) to the end of your SQL code.
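As a minimal, hypothetical illustration (the table, column and parameter names are placeholders), the hint forces SQL Server to compile a fresh plan for each execution's actual parameter values instead of reusing the first plan it sniffed:
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @CustomerID
OPTION (RECOMPILE)  -- recompile per execution, trading some CPU for a parameter-appropriate plan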
I have run into a problem selecting large data from SQL Server. I have a view with 200 columns and 200,000 rows, and I am using the view for Solr indexing.
I tried to select the data with paging, but it took a lot of time (more than 6 hours). Now I am selecting it without paging and it takes 1 hour, but SQL Server uses a lot of memory.
What is the best method or approach to select large data in such situations from SQL Server 2008 R2?
Thanks in advance.
200k rows is not that much and definitely shouldn't take 6 hours, not even 1 hour.
I did not understand whether the problem is in the actual select or in bringing the result to the application.
I would recommend running the select with NOLOCK to ignore blocking; maybe your table is being accessed by other processes while you are running the query:
SELECT * FROM dbo.YourTable WITH (NOLOCK)
If the problem is in bringing the data to the application, you'll need to provide more details on how you are doing it.
I'd suggest taking a look at your execution plan. Look at the properties in the very first operator and see if you're getting "Good Enough Plan Found" or a timeout. If it's the latter, you may have a very complicated view or you may be nesting views (a view calling a view). See what you can do to simplify the query in order to give the optimizer a chance to create a good execution plan.
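A quick way to check for view nesting is to query the catalog views; the sketch below (assuming SQL Server 2008 or later) lists every view that references another view.
-- list views that reference other views (a sign of nesting)
SELECT OBJECT_NAME(d.referencing_id) AS referencing_view,
       d.referenced_entity_name     AS referenced_view
FROM sys.sql_expression_dependencies AS d
JOIN sys.views AS v
    ON v.object_id = d.referencing_id
WHERE d.referenced_id IN (SELECT object_id FROM sys.views)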
I have a query that takes roughly 2 minutes to run. It's not terribly complex in terms of parameters or anything, and the report itself doesn't do any truly extensive processing; it basically just spits the data straight out in a nice format. (Actually, one of the reports doesn't format the data at all, just returns a flat table meant to be manipulated in Excel.)
It's not returning a massive set of data either.
Yet the report takes upwards of 30 minutes to run.
What could cause this?
This is SSRS 2005 against a SQL 2005 database btw.
EDIT: OK, I found that with the addition of WITH (NOLOCK) in the report query, it takes the same time as the query does through SSMS. Why would the query be handled differently if it comes from Reporting Services (or Visual Studio on my local machine) than if it comes from SSMS on my local machine? I saw the query running in Activity Monitor a couple of times in SLEEP_WAIT mode, but not blocked by anything...
EDIT2: The connection string is:
Data Source=SERVERNAME;Initial Catalog=DBName
Is it definitely the query taking a long time to run, or is the processing being done by the server slow? Some reports call queries multiple times. For instance, if you have a subreport inside of a paging list control, each page of that report calls the query separately. So maybe there's something the report is doing with the data that's causing the delay?
How large is the data set that is returned by your query? If it is very large the majority of the time that is taken on the report server could be related to the time it takes the report to render. To be sure you could look at the ExecutionLog table on the report server to see if the TimeRendering is a large number in comparison to the overall execution time.
I think this is not uncommon; we have looked into similar issues ourselves.
From memory, one thing that we did notice was that our subreport had parameters, and we had configured the "possible values" to be queried from the database.
I think that every time the subreport runs, SSRS re-queries the possible values of the parameters (and runs any other queries in your report, even if you don't use the results).
In this case, once we were happy the subreport was working OK, we removed the queries for validating the parameter values and allowed "any value", assuming the parent report would not feed us bad parameter values.
A tad late to the party, but for anybody from the future having a similar problem.
Parameter sniffing
If a stored procedure with parameters is being used, it might be due to a phenomenon called 'parameter sniffing'.
In short, the first time a stored procedure is executed from SSRS, an execution plan based on the specified parameter values is determined. This execution plan is then stored and reused every time the stored procedure is executed from SSRS, even though it might not be optimal for future parameter values.
For an excellent and more extensive explanation have a look at: https://www.brentozar.com/archive/2013/06/the-elephant-and-the-mouse-or-parameter-sniffing-in-sql-server/
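A common workaround, sketched here with hypothetical procedure, table and column names: copy the parameters into local variables inside the procedure, so the optimizer plans against average column statistics rather than the first sniffed values (adding OPTION (RECOMPILE) to the statement is the other usual fix).
CREATE PROCEDURE dbo.usp_GetOrdersByCustomer  -- hypothetical procedure
    @CustomerID int
AS
BEGIN
    -- copying the parameter hides the runtime value from the optimizer,
    -- so the plan is built from average statistics, not the sniffed value
    DECLARE @LocalCustomerID int = @CustomerID;

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @LocalCustomerID;
END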
Other questions
Also have a look at this similar question:
Fast query runs slow in SSRS