I am working with TFS 2013 and generating reports with SSRS. I have been tasked with creating a detailed report that shows all steps and acceptance criteria for each test case, along with the available parameters. I have searched the web extensively and have not found any examples that show the test steps.
Thank you in advance.
In most instances the Test Steps fields are not set to reportable by default, so they are not populated in the normal reporting databases. You'll need to set them to reportable or create a new data source targeting your TFS transactional databases instead of the warehouse/cube databases.
Targeting the transactional databases with reports is not recommended, since it can affect TFS performance while reports run, especially for a field like Test Steps, which may need to pull a lot of data at once.
Use the witadmin command to change the reportability of the Test Steps field. Setting it to detail will put the information only in the warehouse, while dimension will put it into both the warehouse and the cube. Where you want it depends on how you write your reports. Once set, the reportability type can only be changed in limited ways, so please keep this in mind when deciding how you want it set.
witadmin changefield /collection:http://yourcollectionURL:8080/tfs/YourCollectionName /n:<nameOfFieldToChange> /reportingtype:<dimension|detail>
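For example, a possible invocation, assuming the default Test Case Steps field reference name Microsoft.VSTS.TCM.Steps (verify the actual reference name with witadmin listfields first, and note that the server may restrict which field types can have their reportability changed):
witadmin changefield /collection:http://yourcollectionURL:8080/tfs/YourCollectionName /n:Microsoft.VSTS.TCM.Steps /reportingtype:detail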
See the reportable attributes section for details.
I have a folder with around 15 reports in it; these are Report Server reports. Running each report individually would take a while, so I want them to run together. What I want to be able to do is somehow run all of the reports in this folder; is this possible?
This is somewhat of an ambiguous question. Let me explain. What are you asking specifically?
Q: Can you run multiple reports at the same time?
A: Yes, and there are several ways to accomplish this.
1. Use SQL Agent jobs (see the sketch after this list).
2. Use batch files with Task Scheduler.
3. Use an SSIS package and have an agent run it at specific times, etc.
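For option 1, here is a minimal T-SQL sketch of a SQL Agent job step that fires the existing subscription for every report under a given folder by queuing its timed-subscription event, exactly as if each schedule had just fired. It assumes the default ReportServer catalog database name and that each report already has a subscription defined; the folder path is a placeholder.

-- Queue the timed-subscription event for every subscribed report in one folder.
DECLARE @SubscriptionID nvarchar(260);

DECLARE sub_cursor CURSOR FOR
    SELECT CONVERT(nvarchar(36), s.SubscriptionID)
    FROM ReportServer.dbo.Subscriptions AS s
    JOIN ReportServer.dbo.[Catalog] AS c ON c.ItemID = s.Report_OID
    WHERE c.[Path] LIKE '/MyReportFolder/%';  -- placeholder folder path

OPEN sub_cursor;
FETCH NEXT FROM sub_cursor INTO @SubscriptionID;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- AddEvent queues the subscription just as if its own schedule had fired.
    EXEC ReportServer.dbo.AddEvent
        @EventType = N'TimedSubscription',
        @EventData = @SubscriptionID;

    FETCH NEXT FROM sub_cursor INTO @SubscriptionID;
END

CLOSE sub_cursor;
DEALLOCATE sub_cursor;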
Hopefully none of the reports depends on another. Another thing you have to take into consideration is how hard you will be hitting the SSRS or SQL server: running them all at one time may take longer than one at a time, depending on the capacity of the SQL Server and which tables are going to be locked up during each of these processes.
You might want to give a little more detail in your question...
I would recommend an SSIS package, especially as it is one of the options presented by @Michael and can also email the Excel workbook, which you mentioned in an earlier comment.
The following resource covers quite well the execution and export of an SSRS report using SSIS, including code you will need as a starting point: Executing an SSRS Report from an SSIS Package.
You could save some time in coding the solution by using the following custom Task that can be integrated into SSIS: SSIS ReportGenerator Task.
There is one problem in your requirements, though: merging reports into one Excel workbook. I assume you want a separate sheet for each report within the same workbook?
Reporting Services can use multiple worksheets (to divide a report up into pages, a.k.a. pagination), but only for a single report; it can't merge separate reports into one Excel file. This can be accomplished with custom code, however. There's a somewhat basic example here: Merging workbooks into a master workbook with a separate sheet for each file.
One way to run all the reports at once is to add a subscription to each of them and set the same subscription start time in all of the reports. Once the start time arrives, all the reports will run simultaneously and generate an Excel/PDF (or whatever format is specified) file at the shared location.
Morning folks.
This problem is very isolated but very annoying. It only happens with one customer I do work for, and can happen when creating new datasets or amending existing ones, but it doesn't happen every time.
So let's start with a new dataset. I right-click and choose Add Dataset, give the dataset a name, select the data source, and then select Stored Procedure. This is where the fun begins. I start to type the SP name and BOOM, Report Builder crashes. I know I can just pick from the list, but in the case of this client the list of SPs is massive, so starting to type the name narrows down the list.
This can also happen if I amend the name of an SP within an existing dataset.
It has to be something to do with configuration, as this doesn't happen with any of my other customers. The only difference with this site is that they use Citrix, but I can't see how that would affect this.
So, Report Builder 3.0 connecting to a SQL Server 2008 R2 instance. Any suggestions?
Thanks in advance
It doesn't seem to be an issue on the SSRS side. Have the user connect to the database and execute the SP directly to check its performance. It could also be a hardware or configuration issue on that client's machine; you could suggest they use Process Monitor to gather more detailed information.
I have a tabular model that I've processed and deployed.
I'm having a problem getting SSRS to reflect the newly deployed information. I have a shared Dataset accessing a shared Data Source. When I run the MDX in the query designer of the Dataset, the correct numbers are returned. When I run the report, however, the old numbers still show. I've tried deleting the .DATA file but it didn't help.
EDIT:
I've verified that the problem is in the SSAS database itself. I queried it with drillthrough from SSMS and saw that it is returning rows that aren't in the source views any more. They used to be, but no longer.
This almost seems to be some crazy caching issue. I've rebooted and dropped/redeployed the SSAS database and no luck.
Any thoughts?
I would suggest a few steps.
1. Ensure you are connecting to the correct tabular model.
2. Expand the tables in the tabular model, right-click one of the tables, and click "Process". Check all the additional tables in the model.
3. Change "Process Default" to "Process Full" (Process Default does not always work correctly).
4. Click OK.
You should now see the model process table by table.
I would close and re-open the report.
Actually I would completely ignore the BIDS / Visual Studio Preview pane as it is riddled with bugs and inconsistencies and proves nothing (assuming your end users aren't using Visual Studio).
Instead I would deploy the report for each test run to a test environment / folder on the host server (Report Manager / SharePoint). As well as being a realistic and meaningful test, this has many advantages such as being able to leave multiple IE tabs open with various parameter combinations set, then just refresh them after a Deploy to retest.
I'd like to create a simple report that shows files that currently have pending changes (checked out) from a TFS 2008 server. I know that I can use the "Find in Source Control" option from Team Explorer, but I would rather generate a reporting services report. Ideally, I'd be able to show when the file was checked out and the user that checked it out, but that's not imperative.
If the data isn't pushed to the TFS data warehouse by default, then I'd like to find the relational table(s) in the SQL Server instance that would need to be queried.
I've spent some time digging around the TFS data warehouse and looking at all of the canned Reporting Services reports that I can get my hands on, but everything seems to be geared towards work items, check-ins associated with work items, etc...
If you're looking for some easy-to-read data and are not too worried about printouts, have a look at the TFS Sidekicks application by Attrice. It's very helpful, and if you have the correct permissions you'll be able to see all the checked-out files.
http://www.attrice.info/cm/tfs/
I doubt the information you're looking for is in the data warehouse and even if it was it might not be fresh enough for your purposes. By default the warehouse is updated once an hour.
You could use SSRS to report directly against the TFSVersionControl database but I would not recommend going this route. The database is not documented and chances are very good that it will change in the next version. It could also have performance implications if your queries are not written correctly.
A better solution would be to use the TFS web services as your SSRS data source. There are services you can call to get all files that are checked out. This information is always current, and the queries it runs are highly optimized.
Example command line (Studio 2008):
"C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\tf.exe" status /recursive /user:*
I would like to use SQL Reporting Services 2008 to generate my reports, but I want to use my own UI for specifying the report type, columns, parameters and everything. I want to be able to take these criteria, and then kick off an asynchronous request to SSRS and have the report emailed to me. Is this possible? I don't want to go all the way down the road of researching SQL Reporting Services 2008 only to find that it doesn't do what I need it to do. Also, I will have a ton of DB partitions that the data will need to be pulled from. Some reports will need to pull data from only one of these, but other ones may actually need to span different databases. Is it possible when sending a report request to SSRS to specify what servername/database to pull the data from? Is it possible to tell it to take the data from multiple databases and combine it? Thanks.
Like Crystal Reports, ActiveReports and other report generators, SSRS has two basic elements behind each report: the SQL query and the report layout. No matter what tool you use for the SQL -- it can be inline SQL in the report or a call to a stored procedure -- it's going to be the same query. Multiple databases are fine as long as you can specify them up front.
You can have parameterized queries, so the user is prompted to input the relevant filters (customer ID, product group, date range, whatever).
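For example, a typical dataset query might look like the following sketch (the table and parameter names are placeholders); with a SQL Server data source, SSRS automatically creates matching report parameters for the @ variables:

SELECT o.OrderID, o.CustomerID, o.OrderDate, o.TotalAmount
FROM dbo.Orders AS o
WHERE o.CustomerID = @CustomerID
  AND o.OrderDate BETWEEN @StartDate AND @EndDate  -- user-supplied date range filter
ORDER BY o.OrderDate;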
Doing the report layout is similar to other tools -- you drag and drop controls like labels onto the report, and set their formatting.
SSRS does provide a lot of options for distributing the report, including email. You can embed the report in an ASP.Net web page, leave it on the report server site for users to browse to, run it in the wee hours of the morning and cache it so every user doesn't have to wait for the lengthy query to run.
It's a great tool. I think it will be worth your effort to experiment with it. I would wait on creating the customized UI until you've exhausted the possibilities inherent in the tool.
SSRS is not designed with this scenario in mind; for that matter, I am not sure that any out-of-the-box reporting solution is going to have an elegant answer for this. While SSRS can do what you are asking (as can others), it is by no means quick or easy. You seem to be looking for an advanced ad-hoc solution with dynamic sourcing of the data. I would first question the requirements and determine whether the business scenario really justifies such an implementation. I would weigh custom-building a solution against your learning curve with a BI reporting solution. You may find that it is easier to just build something on your own.
I think the heterogeneous dynamic database mashup is probably going to be the most challenging part.
Depending on what your scalability requirements are, one place that has that part covered, and a report writer, is Access. (Duck! Incoming!)
I think you may be creating a rod for your own back to a certain extent as RS ships with a few interfaces for report creation.
Mind you, the end product is an RDL file, which is nothing but XML, so you can write them by hand if you really like.
Multiple data sources are supported, but combining them in a single control/chart/etc. is not, so you'll need to set up a cross-database capability in one of your data sources prior to the report request if you want to do that.
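For example, one common approach (just a sketch; the server, database, and table names are placeholders, and the remote server is assumed to be configured as a linked server) is to do the combining in the dataset query itself with three- and four-part names, so the report only ever receives a single result set:

SELECT c.CustomerID, c.Region, s.TotalSales
FROM SalesDB.dbo.Customers AS c
JOIN RemoteServer.FinanceDB.dbo.SalesTotals AS s  -- four-part name via a linked server
    ON s.CustomerID = c.CustomerID;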