Fast query runs slow in SSRS 2

This is basically the same question as "Fast query runs slow in SSRS"; however, nothing I could see in there worked for me.
Context:
Created a Report Builder dataset to use for my report, query is:
SELECT DISTINCT A.A
, A.B
, A.C
, A.A + ', (' + A.B + '), : ' + CAST(A.C AS VARCHAR) D
FROM Table1 X
INNER JOIN TableA A ON X.Key = A.Key
WHERE (X.Col1 IN (@Param1))
AND (X.Col2 IN (@Param2))
UNION
SELECT '(All)', '(Select to apply all)', 0, '<APPLY ALL>'
ORDER BY C, A
OPTION (OPTIMIZE FOR UNKNOWN)
I have 3 multi-valued parameters. The first (@Param1) and second (@Param2) lists load fine, and the second is not dependent on the first. The third list, which uses the dataset described above, is dependent on both 1 and 2, as you can see from the query.
I added the OPTION (OPTIMIZE FOR UNKNOWN) clause as an attempt to speed it up, but to no avail. I do not even know how to declare a multi-valued parameter anyway, so this was my only option to try at this stage.
After I finish selecting values from my second dropdown/parameter and move to the third, the report loading icon churns for about 5 seconds, then it just halts and nothing happens. I never get to select from my third dropdown unless I am prepared to wait until it finishes, which seems to be about 20 minutes, so after a while I just kill the web page and start again. When I run the report query in SQL Server it runs in a second, and when I run the report in the Report Builder IDE it works fine, with a delay of seconds rather than 20 minutes. Running via SharePoint/SSRS, however, takes 20 minutes.
Does anyone have a suggestion to try?

I found my own solution more by accident than by ingenious thought, and it was purely a mistake on my part, so sorry to waste anyone's time. It still left me a little perplexed, though, or perhaps annoyed is a better way to describe it.
Apparently one of my attempts to improve performance, made a week or so earlier, also included caching the dataset. I had "Cache shared dataset" turned on with a specific daily schedule, but no "Cache Refresh Plan", if that makes a difference. It probably does, as the report may have still been using the cached version; in other words, changing a dataset does not appear to force an automatic refresh when the dropdown on the report loads its data.
In the process of updating my query, after I had set up the caching, I went from straight SQL to a multi-statement table-valued function (with OPTION (OPTIMIZE FOR UNKNOWN)), then to a stored procedure that called my function. So after updating the query in the dataset and refreshing the report that used it, the cache meant my latest change was probably not being applied. I turned off the cache, and my report's third parameter (dropdown) started responding. It is still quite a bit slower running in SharePoint than in the Report Builder IDE, where it is instantaneous, but it is down to about 10 seconds as opposed to an infinite amount of time.
So the questions I am left with are: was using a stored procedure actually effective, or was it the OPTION hint on the query? And whether to cache or not to cache, if caching causes a non-obvious problem, i.e. not refreshing until the next scheduled period; I am not sure whether having a "Cache Refresh Plan" would make any difference. More importantly, why is running a query via SharePoint so much more involved than via SQL Server Management Studio? I am sure that if I wrote my own custom web page calling the same query it would also be instantaneous. Anyhow, things to go back and experiment with if I can be bothered.
Obviously data changes over time, and the report, similar to another post on this site, was working fine and then all of a sudden became non-responsive. Anyhow, it is working well enough for now without a need to cache.
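For anyone who wants to try the stored-procedure route mentioned above, here is a minimal sketch of one common pattern, assuming SQL Server 2016+ for STRING_SPLIT. The procedure name is hypothetical, and each multi-valued parameter would be passed in as a comma-joined string via an expression like =JOIN(Parameters!Param1.Value, ",") on the dataset's Parameters screen:
CREATE PROCEDURE dbo.usp_Param3List      -- hypothetical name
    @Param1List varchar(max),            -- =JOIN(Parameters!Param1.Value, ",")
    @Param2List varchar(max)             -- =JOIN(Parameters!Param2.Value, ",")
AS
BEGIN
    SET NOCOUNT ON;
    SELECT DISTINCT A.A
    , A.B
    , A.C
    , A.A + ', (' + A.B + '), : ' + CAST(A.C AS VARCHAR) AS D
    FROM Table1 X
    INNER JOIN TableA A ON X.[Key] = A.[Key]
    -- Split the joined strings server-side instead of relying on SSRS
    -- to expand the IN (...) lists:
    WHERE X.Col1 IN (SELECT value FROM STRING_SPLIT(@Param1List, ','))
      AND X.Col2 IN (SELECT value FROM STRING_SPLIT(@Param2List, ','))
    UNION
    SELECT '(All)', '(Select to apply all)', 0, '<APPLY ALL>'
    ORDER BY C, A;
END;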

Related

SSRS Report TimeProcessing

I have a report which is taking an extremely long time to run. It brings back around 187k rows and takes around 5 minutes. When I checked the ExecutionLog table in the ReportServer database, I noticed that TimeProcessing is higher than TimeDataRetrieval and TimeRendering.
Things I did to improve performance:
I made sure that the KeepTogether property of the report is False so that paging is allowed.
The report is not doing any grouping, filtering, or sorting. Everything is taken care of at the database level.
Caching is not an option for this report, as the report has parameters and they are dynamic values.
The only thing I noticed in the report is that a few columns have format expressions, which cannot be removed as they are mandatory and were requested by my client. The report has a few date columns, and I was using the following expression:
=IIF(Globals!RenderFormat.Name = "RPL","M/d/yyyy","M/d/yyyy hh:mm tt")
I tried to run the report without the above expression, but I didn't notice any drastic change in performance.
I also noticed that TimeProcessing for the report is not constant; it keeps changing. How can I make it constant? Are there any other things I should change or check to improve the performance of the report?

MySQL - Select from view or direct select

I have created a view on a simple table. My problem is that my average execution time of a select on that view is about 29 seconds.
However, if I run the select statement which describes the view directly, the query executes in about 0.015 seconds.
Now, I have looked up some info, and here and here people basically say that it should be roughly the same, since a view is just a stored query.
Is it possible that I have this much of a difference in time? I have tried using SQL_NO_CACHE to make sure no cache is used so I get representative data when testing both options.
I would prefer to keep my view unless I have no other option for reducing the cost.
After a lot of research and trial and error, I have concluded that even on simple queries and views, there can be a huge performance difference between selecting * from a view and running the select query that defines the view directly.
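One thing worth checking (an assumption on my part, not something confirmed in the post): if the view's definition forces MySQL to use the TEMPTABLE algorithm, the whole view result is materialized before your WHERE clause is applied, which can easily explain a 29 s vs 0.015 s gap. A sketch, with hypothetical view and table names:
-- Recreate the view with an explicit MERGE algorithm so MySQL can push
-- conditions down to the base table's indexes. MERGE only applies when
-- the definition is mergeable (no DISTINCT, GROUP BY, UNION, etc.).
CREATE OR REPLACE ALGORITHM = MERGE VIEW simple_view AS
SELECT id, col1, col2
FROM simple_table;

-- Compare timings with the query cache bypassed, as in the question:
SELECT SQL_NO_CACHE * FROM simple_view WHERE id = 42;
SELECT SQL_NO_CACHE id, col1, col2 FROM simple_table WHERE id = 42;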

MySQL: How to work with multiple connections?

I have developed a script which works with a large MySQL database. The script runs on IIS, MySQL, and ASP Classic. It selects 10,000 or 100,000 records, works through the records one by one, and updates the database. Everything works, but performance is very slow. The slowness is not caused by the select or update statements or a slow server, but by working through those records one by one, making some changes, and then updating.
For example
SELECT * FROM mytable
WHERE title IS NULL OR title = ''  -- originally "isempty(title)", which is not a MySQL function
ORDER BY LENGTH(title) DESC LIMIT 100000;
Then working through those 100,000 records one by one takes, say, 100,000 minutes. So I want to run the same script in 2 or 3 browsers, let's say IE, Chrome, and Firefox.
I was thinking of doing it like this, but I am not sure whether it is possible.
On IIS, when the script runs in browser 1, it selects 100,000 records and starts working on them, making some changes. When it runs in browser 2, it selects 100,000 with the same condition, but fewer matching records remain in the database, so it might select 90,000 and start working. Since browser 1 started a little earlier, it may already have made some changes, so while both threads work, each has to see the other's changes and work with them: for example, if the title is already finished on the current record, skip it and choose another one. Is that possible? I am not sure; I have never used the cursor location and cursor type or whatever.
Let's say 101,000 records are in the database; script 1 starts first and selects 100,000 rows. After 100 minutes browser 2 starts, but by the time browser 2 selects its 100,000 rows, browser 1 has already finished 10,000, so browser 2 will get only 91,000 records. But since the two browsers work on the same records, how can they see each other's changes?
Is there any solution for my situation? I am not a MySQL expert, that's why I don't know what to do.
I am sorry for my English, but I hope you understand my question.
UPDATE:
This is not caused by any script problem, a slow server, or anything else. It is slow because between DO WHILE RS.EOF and LOOP I do lots of things; it doesn't really take one minute per record, that was just an example. But I was thinking of running 2 or 3 instances of the script simultaneously.
ASP Classic does not support the type of multi-threading you are looking for; however, you could write a COM component or something similar that does, and call it from your page.
Unless there is some sort of input required from the user, you could also write a server-side task in VBScript/PowerShell/Python/etc. to occasionally run through the data and perform whatever task it is you are trying to accomplish. It's hard to be specific when the question isn't very specific.
Having said all that, it really does sound like there are more problems with your code than you realize; it's hard to point them out without a concrete example in front of us. If you haven't already, double-check that the bottlenecks are where you think they are.
I've used a crude ASP Profiler in the past to look for where the specific bottlenecks are in the ASP/VBScript sites I still maintain, and on a few occasions I've found that the problem was in the least likely spots.
The bottom line is that your question is missing a fair amount of information for providing useful answers, and seems to make some assumptions that might not necessarily be true. Show us some code, provide us with some data, and you'll probably get better answers.
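For what it's worth, a common pattern for letting two or three instances share the same table without stepping on each other is to have each instance atomically claim a batch of rows before processing it. A sketch; the worker_id column is hypothetical and would need to be added to the table:
-- Each worker tags a batch of unclaimed rows with its own id in one
-- atomic UPDATE, so two concurrent scripts never grab the same record.
UPDATE mytable
SET worker_id = 'worker-1'
WHERE (title IS NULL OR title = '')
  AND worker_id IS NULL
LIMIT 1000;

-- The worker then processes only the rows it claimed:
SELECT * FROM mytable WHERE worker_id = 'worker-1';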

Speed Up - Multiple MySQL Queries on one Page

So I have a real estate website with a page that runs about 5 queries to build a statistics page. I am wondering if there is a way to speed this up, optimize, or combine the queries so the page runs faster. Right now it takes up to 5 seconds to load.
Query:
SELECT COUNT(`listing_num`) as `count`,
AVG(`price`),
AVG(`square_feet`),
AVG(`bedroom_total`),
AVG(`bathroom_total`),
MIN(`price`),
MAX(`price`),
MIN(`square_feet`),
MAX(`square_feet`),
MIN(`bathroom_total`),
MAX(`bathroom_total`),
MIN(`bedroom_total`),
MAX(`bedroom_total`),
MIN(`psf`),
MAX(`psf`),
AVG(`psf`)
FROM `Res_Active2`
WHERE `status` != 'S'
So I run this query about 6 different times on the page, with the WHERE clause changed in each, so that I can display stats for sold properties, active properties, under-contract properties, etc.
What is the right and fast way to do this? Can I use a cache, combine the SQL, anything? I need to speed this page up. Thanks.
Try just setting up the MySQL query cache. The query will be executed only once and the result reused by all the other queries.
To enable the MySQL cache, see the MySQL query cache documentation.
I am pretty sure it will be enough for you to just add the following to your /etc/my.cnf:
query_cache_size=30M
If that does not help, you can create a special table that holds the result of that query and have an external script update it every X minutes.
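Alternatively, since the six variants presumably differ only in the status filter, you can often collapse them into a single pass over the table with GROUP BY. A sketch, assuming `status` is what distinguishes the six versions of the query:
SELECT `status`,
COUNT(`listing_num`) AS `count`,
AVG(`price`), MIN(`price`), MAX(`price`),
AVG(`square_feet`), MIN(`square_feet`), MAX(`square_feet`),
AVG(`bedroom_total`), MIN(`bedroom_total`), MAX(`bedroom_total`),
AVG(`bathroom_total`), MIN(`bathroom_total`), MAX(`bathroom_total`),
AVG(`psf`), MIN(`psf`), MAX(`psf`)
FROM `Res_Active2`
GROUP BY `status`;
-- One row per status; the page then picks the rows it needs instead of
-- issuing six separate aggregate queries.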

SSRS Performance

I have created an SSRS report that retrieves 55,000 records using a stored procedure. When executed directly, the stored procedure takes just 3 seconds, but when run from the SSRS report it takes more than a minute. How can I solve this problem?
The additional time could be due to Reporting Services rendering the report in addition to querying the data. For example, if 55,000 rows are returned and the report server then has to group, sort, and/or filter those rows to render the report, that could take additional time.
I would have a look at the way the data is being grouped and filtered in the report, then review your stored procedure to see if you could offload some of that processing to the SQL code, maybe using some parameters. Aim to reduce the number of rows returned to the report to the minimum needed to render it, and preferably avoid doing the grouping and filtering in the report itself.
I had this problem because of parameter sniffing in my stored procedure. When I ran the stored procedure in SQL Server Management Studio, recreating it produced a new execution plan (and the call was very fast), but my reports were using the old, bad plan (built for a different set of parameters) and the load time was much longer than in SSMS.
In the ReportServer database you will find a table called ExecutionLog. Look up the catalog ID of your report and check the latest execution instance; this will give you the breakdown of the times taken for data retrieval, processing, rendering, etc.
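A sketch of such a lookup, using the ExecutionLog3 view that ships with recent SSRS versions (the report name is a placeholder):
SELECT TOP (10)
    ItemPath, TimeStart,
    TimeDataRetrieval, TimeProcessing, TimeRendering,
    [RowCount], Status
FROM ReportServer.dbo.ExecutionLog3
WHERE ItemPath LIKE '%MyReport%'  -- placeholder report name
ORDER BY TimeStart DESC;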
Use the SSRS Performance Dashboard reports to debug your issues.
An archaic question, but because things like this keep recurring: my "quick and dirty" solution to improve SSRS, which works perfectly in large enterprise environments (I render reports that can have 100,000+ lines daily), is to properly set the InteractiveSize of the page (for example, setting it to A4 size, 21 cm). When InteractiveSize is set to 0, all results are rendered as a single page, and this literally kills the performance of SSRS. In cases like that, queries that take a few seconds on your DB can take forever to render (or cause an out-of-memory exception unless you have tons of redundant hardware on your SSRS server).
So, for queries or stored procedures that execute reasonably fast when called directly and retrieve a large number of rows, set InteractiveSize and you won't need to bother with other, more sophisticated solutions.
I had a similar problem: a query that returns 4,000 rows and runs in 1 second on its own was taking so long in SSRS that it timed out.
It turned out that the issue was caused by the way SSRS was handling a multi-valued parameter. Interestingly, if the user selected multiple values, the report rendered quickly (~1 second), but if only a single value was selected, the report took several minutes to render.
Here is the original query that was taking more than 100x longer to render than it should:
SELECT ...
FROM ...
WHERE filename IN (@file);
-- @file is an SSRS multi-value parameter passed directly to the query
I suspect the issue was that SSRS was bringing in the entire source table (over 1 million rows) and then performing a client-side filter.
To fix this, I ended up passing the parameter into the query through an expression, so that I could control the filter myself. That is, in the "DataSet Properties" window, on the "Parameters" screen, I replaced the parameter value with this expression:
=JOIN(Parameters!file.Value,",")
... (I also gave the result a new name: filelist) and then I updated the query to look like this:
SELECT ...
FROM ...
WHERE ',' + @filelist + ',' LIKE '%,' + FILENAME + ',%';
-- @filelist is passed to the query as the following expression:
-- =JOIN(Parameters!file.Value,",")
I would guess that moving the query to a stored procedure would also be an effective way to alleviate the problem (because SSRS basically does the same JOIN before passing a multi-value parameter to a stored procedure). But in my case it was a little simpler to handle it all within the report.
Finally, I should note that the LIKE operator is maybe not the most efficient way to filter on a list of items, but it's certainly much simpler to code than a split-string approach, and the report now runs in about a second, so splitting the string didn't seem worth the extra effort.
Obviously getting the report running correctly (i.e. taking the same order of magnitude of time to select the data as SSMS) would be preferable, but as a workaround, would your report support execution snapshots (i.e. no parameters, or parameter defaults stored in the report)?
This allows a scheduled snapshot of the data to be retrieved and stored beforehand, meaning SSRS only needs to process and render the report when the user opens it. It should reduce the wait to a few seconds, depending on what processing the report requires (YMMV; test to see whether you get a performance improvement).
Go to the report's properties tab in Report Manager, select Execution, change to "Render this report from a report execution snapshot", and specify your schedule.
The primary solution for speeding up SSRS reports is to cache them. If you do this (either by preloading the cache at, for instance, 7:30 am, or by caching the reports on first hit), you will find massive gains in load speed.
Please note that I do this daily and professionally, and am not simply waxing poetic about SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not like initial report loads taking long and your data is static, i.e. a daily general ledger or the like where the data is relatively static over the day, you may increase the cache lifespan.
Finally, you may also opt to have business managers receive these reports via email subscriptions, which will send them a point-in-time Excel report that they may find easier and more systematic.
You can also use parameters in SSRS to allow for easy parsing by the user and faster queries. In the query builder, type IN(@SSN) under the Filter column that you wish to parameterize; you will then find the parameter created in the Parameters folder just above Data Sources in the upper left of your BIDS GUI. (If you do not see the data source section in SSRS, hit CTRL+ALT+D.)
See a nearly identical question here: Performance Issues with SSRS
A few things can be done to improve the performance of the report:
1. Enable caching in Report Manager and set a time period to refresh the cache.
2. Apply indexing to all the backend database tables used as sources in the report (see the sketch after this list). Although your stored procedure is already taking very little time to return the data, indexing can still improve performance at the backend level.
3. Use shared datasets instead of embedded datasets in the report, and apply caching to all of these datasets as well.
4. If possible, set the parameters to load default values.
5. Try to reduce the data selected by the stored procedure, e.g. if the report contains historical data that is of no use, add a filter to exclude it.
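For point 2, a minimal sketch of the kind of index meant here; the table and column names are hypothetical, and the idea is to index the columns the report's WHERE and JOIN clauses actually touch:
CREATE NONCLUSTERED INDEX IX_Sales_ReportDate
ON dbo.Sales (ReportDate)        -- hypothetical table and filter column
INCLUDE (Region, Amount);        -- covering columns the report selects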
I experienced the same issue: the query ran fine in SQL but was as slow as anything in SSRS. Are you using an SSRS parameter in your dataset? I've found that if you use the report parameter directly in certain queries, there is a huge performance hit.
Instead, if you have a report parameter called @reportParam, simply do the following in the dataset:
declare @reportParamLocal int
set @reportParamLocal = @reportParam
select * from Table A where A.field = @reportParamLocal
It's a little strange; I don't quite know why it works, but it does for me. (The usual explanation is parameter sniffing: the optimizer cannot sniff the value of a local variable, so it builds a generic plan instead of reusing one tuned for a previously cached parameter value.)
One quick thing you may want to look at is whether elements on your report could be slowing down the execution.
For example, I have found massive execution-time differences when converting between datetimes. Do any elements on the report use the CDate function? If so, you may want to consider doing your formatting at the SQL level.
Conversions in general can cause a massive slowdown, so take the time to look into your dataset and see what may be the problem.
This is a bit of a mix of the answers above, but do your best to get the data back from your stored procedure in the simplest and most finished form. I do all my sorting, grouping, and filtering on the server; the server is built for this, and I just let Reporting Services do the pretty display work.
I had the same issue. It was indeed the rendering time, but more specifically it was because the SORT was being done in SSRS. Try moving your sort to the stored procedure and removing any sort from the SSRS report; on 55K rows, this will improve things dramatically.
Further to @RomanBadiornyi's answer, try adding
OPTION (RECOMPILE)
to the end of your main query in the stored procedure.
This will ensure the query is recompiled for different parameters each time, in case different parameters need a different execution plan.
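A sketch of where the hint goes; the procedure, table, and column names are hypothetical:
ALTER PROCEDURE dbo.usp_ReportData   -- hypothetical report procedure
    @StartDate date,
    @EndDate   date
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, Amount
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @StartDate AND @EndDate
    OPTION (RECOMPILE);  -- compile a fresh plan for each parameter set
END;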