I'm tasked with reporting on survey data using Reporting Services 2008.
My challenge is this:
a survey has any number of questions
a question is one of three types (a numerical eval, a yes/no question, or a free text)
To handle this, I decided to use subreports on my main report: I defined one report for each of the three question types, and now when I'm reporting on a survey, I basically dynamically create an RDL for the survey report, using the three question-type reports as subreports.
That actually works quite nicely so far - but I'm facing one major problem: how do I get the data into the subreports?
The approach I see right now is to have each (sub)report per question type define its own data set, based on a shared data source, to extract the values from the database. I'm pretty sure this would work - but I am not very keen on having potentially 5, 10, 20 subreports going off to the database to get their data independently.
What I was hoping for was being able to fetch the data once for the whole survey, on the "main" report, and then just feed the appropriate subset of data into each subreport as it's being rendered - but I can't seem to find any way to do this...
Am I missing something totally obvious? I haven't had much exposure to Reporting Services, and my last project with it was four years ago (with Reporting Services 2000) - so there's a good chance I am just blind to the obvious solution :-) Please let me know!
Thanks for any hints, pointers to good articles or blogs on Reporting Services, and any help at all!
Marc
The usual way is to pass parameters (like a date range) from the main report to the subreports, and then the subreports take care of everything else. To improve performance, see if you can render the subreports from a cache or snapshot. The cache stores the report for each combination of parameters passed, so after the first "database hit" some or most subreports may actually be returned from the cache.
I struggled with the same issue. But there is a way you can achieve reasonable performance using a "cached shared dataset". Basically, the subreports all use one larger dataset that includes the rows for every subreport. By using a dataset filter, each subreport can then filter out its own rows.
This is only available in the 2008 version, however.
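A minimal sketch of what that shared dataset could look like (the table and column names here are assumptions, not from the original post); each subreport then filters the rows down to its own question type:
-- Hypothetical shared dataset: one query returns rows for all question types.
SELECT QuestionID, QuestionType, NumericValue, YesNoValue, FreeTextValue
FROM dbo.vw_SurveyResults
WHERE SurveyID = @SurveyID
-- Each subreport then applies a dataset filter, for example:
--   Expression: =Fields!QuestionType.Value   Operator: =   Value: Numeric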
If the parameters are the final data you require, then just use them and create a dummy dataset in the subreport - you could just have 'SELECT 1 AS DUMMY' as the SQL (this is assuming that the subreports have distinct layouts from one another).
Or perhaps you can rethink the 'master' dataset with a function or table function?
It would still tax the SQL Server, but at least it would be doing it in one hit, and the drain on the RS box would be less.
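For example, a single table-valued function could do the one hit, and each question-type subreport's dataset would just select from it. This is only a sketch with made-up table and column names:
-- Hypothetical: one table-valued function fetches everything for the survey in one hit.
CREATE FUNCTION dbo.tvf_SurveyResults (@SurveyID int)
RETURNS TABLE
AS
RETURN
(
    SELECT q.QuestionID, q.QuestionType, a.NumericValue, a.YesNoValue, a.FreeTextValue
    FROM dbo.SurveyQuestions q
    JOIN dbo.SurveyAnswers a ON a.QuestionID = q.QuestionID
    WHERE q.SurveyID = @SurveyID
);
-- A question-type subreport's dataset then reduces to something like:
-- SELECT * FROM dbo.tvf_SurveyResults(@SurveyID) WHERE QuestionType = 'Numeric'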
Is there an issue with SPLIT(JOIN()) functions in SSRS 2012?
Here's why I ask...
I've just set up an SSRS 2012 server. I have an existing report I built in SSRS 2008 R2, which pulls from a 2005 database. I created a new project in MVS 2010 and added the existing rdl.
When I preview the report, the performance is at least 5 times worse than it is when I preview it in MVS 2008. I ran a trace and found that it took quite a while for SSRS to even execute the SP. Once it did, it rendered quickly.
I was trying to think of something that might slow down the SP's execution. The only thing I came up with is that I have a lot of multi-valued parameters I pass into the SP using SPLIT(JOIN()) functions. Have those been replaced by something new in 2012? If not, I don't even know where to start looking for the problem. My initial google searches have turned up nothing.
Has anyone experienced this problem or know of a list of things that worked well in 2008 but not so well in the new version?
I hope this question isn't too vague. Thanks for reading!
EDIT: I feel silly. I just traced the 2008 report execution and it turns out it does the same thing, which I never noticed before. The rendering is really quick after the SP shows up in Profiler. So...I have no clue what the problem could be. Any help would be GREATLY appreciated!
I would just stick with a predicate like:
Where thing in (@Sets)
where the 'Sets' parameter could come from another dataset I created, obtained from SQL like:
Select 'Brett' as Name
Union
Select 'Anna'
Union
Select 'John'
Union
Select 'Jenny'
Simply choose to get the data for 'Sets' from 'get data from a dataset'. Once the parameter is set up, SQL Server 2008 R2 and higher should do the lifting for you in figuring out the predicate clause: the expression Where thing in (@Sets) actually translates to:
Where thing in ('Brett', 'Anna', 'John', 'Jenny')
The primary solution to speeding up SSRS reports and decreasing server load is to cache the reports. If one does this (either by preloading the cache at 7:30 am, for instance, or by caching the reports on hit), one will find massive gains in load speed.
Please note that I do this daily and professionally and am not simply waxing poetic on SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not like the initial report taking long and your data is relatively static over the day (a daily general ledger or the like), you may increase the cache life-span.
Finally, you may also opt to have business managers instead receive these reports via email subscriptions, which will send them a point-in-time Excel report which they may find easier and more systematic.
You can also use parameters in SSRS to allow for easy filtering by the user and faster queries. In the query builder, type IN(@SSN) under the Filter column for the field you wish to parameterize; you will then find the parameter created in the Parameters folder just above Data Sources in the upper left of your BIDS GUI.
(If you do not see the data source section in SSRS, hit CTRL+ALT+D.)
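For example (the table and column names here are just placeholders), the dataset query might look like the following; when @SSN is a multi-value parameter, SSRS expands the IN list for you at run time:
-- Hypothetical dataset query with a multi-value parameter.
-- SSRS turns IN(@SSN) into IN('111-11-1111', '222-22-2222', ...) when the report runs.
SELECT p.SSN, p.LastName, p.FirstName
FROM dbo.Persons p
WHERE p.SSN IN (@SSN)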
See a nearly identical question here: Performance Issues with SSRS
Create a UDF which will take a list separated by commas (or some other delimiter you want to use) and return a table you can join on.
https://blogs.msdn.microsoft.com/amitjet/2009/12/11/convert-comma-separated-string-to-table-4-different-approaches/
Then you can set up a parameter in your sproc such as @TheList varchar(max).
You should then be able to use it in a JOIN, use it to create a temp table and then join on that in your query, or use it as a sub-select.
We use this quite often, and have found that if you are primarily using values which are integers, then the returned table should be a table of INTs to increase performance.
Pseudo example:
declare @TheList varchar(max)
set @TheList = '1,2,3,4,5,6,7,8'
select *
from dbo.MyRecords r
join dbo.udf_CreateArrayTable(@TheList) at on r.RecID = at.RecID
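If it helps, here is one minimal version of such a UDF (a simple loop-based split; the function name and the int RecID column just mirror the pseudo example above, and the linked article shows faster approaches):
-- Hypothetical split function matching the pseudo example above:
-- takes a comma-separated list and returns a table of INTs to join on.
CREATE FUNCTION dbo.udf_CreateArrayTable (@TheList varchar(max))
RETURNS @Result TABLE (RecID int)
AS
BEGIN
    DECLARE @Pos int;
    SET @Pos = CHARINDEX(',', @TheList);
    WHILE @Pos > 0
    BEGIN
        INSERT INTO @Result (RecID) VALUES (CAST(LEFT(@TheList, @Pos - 1) AS int));
        SET @TheList = SUBSTRING(@TheList, @Pos + 1, LEN(@TheList));
        SET @Pos = CHARINDEX(',', @TheList);
    END
    IF LEN(@TheList) > 0
        INSERT INTO @Result (RecID) VALUES (CAST(@TheList AS int));
    RETURN;
END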
I have about 35 queries in a large MS Access (2007) database that I would like to use in a report. However, I don't want to create reports for each query, but would rather include multiple queries in one report. The other questions relating to this on SO seem to be related to tables, but I am mainly concerned with just queries.
I have looked at several solutions so far, none of which seem to help:
1) DLookup - returns one value. I need to populate a report with many values from each query.
2) Subreports - I have tried to create text boxes to link my query data, but since they are unbound, it won't let me.
3) Yelling at it. I keep yelling at Access and it doesn't seem to help.
4) One giant query to get the values from all the smaller queries - it doesn't recognize the expressions I built in the smaller queries. Example: CustomerCount = DCount("[Customer_ID]","[Customers]"). Error: 'CustomerCount' is unrecognized.
Any ideas would be greatly appreciated.
Thanks,
If you are creating a report from one query that is made up of multiple subqueries, you can do this by populating textboxes. Looking at the properties on the report > Data tab > Control Source, you would place the name of your subquery from your giant query and then the value that you want.
For example:
Master query
SELECT
[Deb<30].CountOfACCT_ID, [Deb<30].SumOfB001,
[Deb<60].CountOfACCT_ID, [Deb<60].SumOfB001,
[Deb<90].CountOfACCT_ID, [Deb<90].SumOfB001,
[Deb>90].CountOfACCT_ID, [Deb>90].SumOfB001,
TodaysHD.CountOfACCT_ID, TodaysHD.SumOfB001,
TodaysLD.CountOfACCT_ID, TodaysLD.SumOfB001,
Part.CountOfACCT_ID, Part.SumOfB001
FROM [Deb<30], [Deb<60], [Deb<90], [Deb>90], TodaysHD, TodaysLD, Part;
Each of the items in the FROM portion are subqueries being pulled into the master query. Then in your report you would populate the Control Source with [Deb<30].CountOfACCT_ID or whatever value you need from the master query. I have this setup in multiple reports in my application so it should work.
As for 2), you should be able to bind your controls; your subreport should have a record source.
Otherwise, try listboxes; their graphical appearance isn't exactly flexible, but they might get the job done.
regards,
//t
I'm currently in the process of creating an extensive Access report (~50 calculated fields) for a client and while I've gotten 99% of the report down, I'm having trouble handling the Sum and Count logic on the report.
In a nutshell, the report is intended to list the attendees at an event, tally up the number of attendees at each track/course, and also list any outstanding dues which will be paid at the door. In addition, and this is where I'm having trouble, the report also shows the revenue from each track/course along with a breakdown of the revenue from commuters vs. people staying overnight.
At the moment, all of the formulas follow fairly similar structures so despite having 50 fields, 99% of those are simply adjusting the fields to fit the relevant variables.
For the report now with revenue, my issue is that although I have the fee data specified in the data tables, Microsoft Access keeps zeroing out the calculations rather than displaying the total on the fly.
Here are two of the formulas I'm using:
=((Count([W1]))*[Fee Charged W1])
=Sum([Room & Meals])
Earlier today I think I pinpointed the problem to the fact that Microsoft Access is showing the report essentially per individual rather than displaying one report for the entire dataset. I've confirmed this because I manually checked the data and noticed some fields had null values which explained the null tallies.
My question now is whether there is a way to assign a default value for variables in reports so that the report cancels out any invalid data in the database table? I imagine using VBA would help, but I have little experience in that realm.
Thanks very much in advance for any assistance
Just found the solution which was right under my nose the entire time. I just had to configure the fields to use a Running Sum (also called a Cumulative Total).
The answer came right from a page in the Microsoft Office Website: http://office.microsoft.com/en-us/access-help/summing-in-reports-HA001122444.aspx
I'm using SQL Server Reporting Services and the report designer that comes with Visual Studio. I've got a really big report. It's actually so large that Visual Studio hangs (sometimes for hours at a time) or just crashes when I make changes.
There is precious little I can do to solve the problem, so I've decided to just move the bottom half of the report into a sub-report. So, I started with one enormous, unresponsive report and ended with two small, manageable reports -- surprisingly, this actually works.
One problem: my subreport uses the same data as my main report. Right now, it populates its dataset by re-querying the database. The extra round-trip to the database causes the report to take twice as long to generate: up from 45 minutes to 1 1/2 hours.
I'd like to avoid hitting the database again, and instead use the same dataset in both reports.
How can I share or pass a dataset between a report and subreport?
I think this can help you:
http://www.gotreportviewer.com/subreports/index.html
Supplying data for the subreport - the SubreportProcessing event
To supply data for the subreport you have to handle the SubreportProcessing event. Note that this event is on the LocalReport object. You can add an event handler like this:
private void MainForm_Load(object sender, EventArgs e)
{
    this.reportViewer1.LocalReport.SubreportProcessing += new SubreportProcessingEventHandler(MySubreportEventHandler);
}
Below is an example for the event handler. In this example LoadSalesData is defined to return a DataTable.
void MySubreportEventHandler(object sender, SubreportProcessingEventArgs e)
{
    e.DataSources.Add(new ReportDataSource("Sales", LoadSalesData()));
}
If your report has multiple subreports you can look at the ReportPath property of SubreportProcessingEventArgs and supply data for the corresponding subreport. You may also want to examine the values of Parameters property of SubreportProcessingEventArgs and only return the subset of data that corresponds to the subreport parameters, as mentioned here.
I'm pretty sure you can't. You're probably better off looking for ways to redesign the report entirely so that it's not so large... not to mention the various problems with subreports when exporting to Excel.
I have several reports that the SQL is so complex in that it locks up Visual Studio when I try to edit it. In these reports I go straight into the Code view and edit the XML directly, which works. I also do this when Visual Studio mysteriously makes columns slightly wider than I set them at. However, I doubt you'd want to go down this path if you are editing the layout of the report too much.
Instead of running your query in the report, would it be possible to build a table using a stored procedure that both reports use? The first report runs the stored procedure to build the table, and then both reports simply query that table. Watch for concurrency problems if the report can be run by multiple users.
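A rough sketch of that idea, with made-up object names: the stored procedure fills a staging table once, and both reports then just select from it. Keying the staging rows by a parameter (here a report date) is one way to soften the concurrency caveat.
-- Hypothetical: one procedure does the expensive work once;
-- both reports then read from dbo.ReportStaging instead of re-running the heavy query.
CREATE PROCEDURE dbo.usp_BuildReportStaging
    @ReportDate date
AS
BEGIN
    DELETE FROM dbo.ReportStaging WHERE ReportDate = @ReportDate;

    INSERT INTO dbo.ReportStaging (ReportDate, CustomerID, OrderID, Amount)
    SELECT @ReportDate, o.CustomerID, o.OrderID, o.Amount   -- the expensive query goes here
    FROM dbo.Orders o
    WHERE o.OrderDate = @ReportDate;
END
-- Both reports' datasets then reduce to something like:
-- SELECT CustomerID, OrderID, Amount FROM dbo.ReportStaging WHERE ReportDate = @ReportDate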
Have you tried using a list within a list, where both lists use the same dataset, and then filtering the inner list to display only records linked to the outer list?
As far as the execution time, 45 minutes seems like an awful long time in the first place. I'm assuming you've done some analysis of the execution plan to verify your query or stored procedure is using meaningful indexes?
Hope this helps,
Bill
You can do it using a dummy parameter:
i. Create a parameter in your main report 'MyData' and tick 'internal'
ii. Set default value of 'MyData' to your data-set
iii. Set the sub-report parameter with the expression
=Parameters!MyData.Value
Hope this helps,
Duncan
If you create a table, you can merge all the cells of the details row and put a subreport as the contents. Then set the parameter of the subreport to the field you want to run the subreport against.
Jason
I want to group by a report item, but that's not allowed.
So I tried creating a parameter...not allowed as well.
Tried referencing from footer...failed again.
This is somewhat complicated.
Let me explain:
I have textbox22; its value is:
=Code.Calc_Factor(Fields!xx.Value, Fields!yy.Value...)
This is embedded VB code in the report that's called for each row to calculate a standard factor.
Now to calculate the deviation from the standard factor, I use textbox89, whose value is:
=(Fields!FACTOR.Value - ReportItems!textbox22.Value)/ReportItems!textbox22.Value
Don't get confused between Fields!FACTOR.Value and textbox22.Value, they are different.
Fields!FACTOR.Value is the factor used, textbox22.Value is what it should be (standard factor).
Now I want to create a group which splits deviations into 2 groups, > 1% or not.
So I tried creating a group:
=IIF(ReportItems!textbox89.Value > 1,0,1)
...But then SSRS complains about using report items.
I have run into a similar problem of using report items in the past, but this is a new case!
Any help greatly appreciated.
Have you tried adding a calculated field to your dataset?
Here is how it works:
While you are in the layout view of the report, open the "Datasets" tool window (in my environment it is on the left).
Right-click on the dataset you are working with and add a field; you can use a calculated field and build your formula appropriately.
Then you should be able to group on this field.
-Dan
I'm not 100% sure that someone won't have some magic solution for this, but I have run across similar problems myself in the past. I believe (but could be wrong) that the problem Reporting Services is having is that it only renders once, and what you're asking it to do is render the data before rendering the grouping, which it doesn't do.
The only way I have ever been able to produce the exact results I need is to make the data rendering happen exclusively in the SQL (through the use of table variables usually) and then use Reporting Services merely as a display platform. This will require that your factoring algorithm gets expressed in the T-SQL within the stored procedure you will likely have to write to get the data in shape. This would appear to be the only way to achieve your end result.
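To make that concrete, here is only a rough sketch (the table names and the factor function are assumptions): the factor, the deviation and the grouping flag are computed in T-SQL, and the report then groups on the precomputed DeviationGroup column instead of on a report item.
-- Hypothetical sketch: compute the standard factor, the deviation and the
-- grouping flag in T-SQL so the report only has to group on DeviationGroup.
DECLARE @Calc TABLE
(
    RowID int,
    FACTOR decimal(18, 6),
    StdFactor decimal(18, 6),
    Deviation decimal(18, 6),
    DeviationGroup int
);

INSERT INTO @Calc (RowID, FACTOR, StdFactor)
SELECT s.RowID, s.FACTOR, f.StdFactor
FROM dbo.SourceRows s
CROSS APPLY dbo.tvf_CalcStandardFactor(s.xx, s.yy) f;  -- stand-in for Code.Calc_Factor as a table-valued function

UPDATE @Calc
SET Deviation = (FACTOR - StdFactor) / StdFactor,
    DeviationGroup = CASE WHEN (FACTOR - StdFactor) / StdFactor > 1 THEN 0 ELSE 1 END;  -- mirrors the IIF above

SELECT RowID, FACTOR, StdFactor, Deviation, DeviationGroup
FROM @Calc;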
This has the bonus feature of separating report design and presentation from data manipulation.
Sorry I couldn't provide a SSRS solution, maybe someone else will know more.