I have a tabular model that I've processed and deployed.
I'm having a problem getting SSRS to reflect the newly deployed information. I have a shared Dataset accessing a shared Data Source. When I run the MDX in the query designer of the Dataset, the correct numbers are returned. When I run the report, however, the old numbers still show. I've tried deleting the .DATA file but it didn't help.
EDIT:
I've verified that the problem is in the SSAS database itself. I queried it with drillthrough from SSMS and saw that it is returning rows that used to be in the source views but are no longer there.
This almost seems to be some crazy caching issue. I've rebooted and dropped/redeployed the SSAS database and no luck.
Any thoughts?
I would suggest a few steps.
Ensure you are connecting to the correct tabular model.
Expand the tables in the tabular model, right-click one of the tables, and click "Process". Check all the additional tables in the model so they are processed as well.
Change "Process Default" to "Process Full" (Process Default does not always work correctly).
Click Ok.
You should now see the model process table by table.
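If you would rather script this than click through the dialog, the same full process can be issued as an XMLA batch from an XMLA query window in SSMS (roughly what the Script button in the Process dialog generates). A minimal sketch, where YourTabularModel is a placeholder for your database ID:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <!-- Placeholder: use the database ID from your database's properties -->
      <DatabaseID>YourTabularModel</DatabaseID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>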
I would close and re-open the report.
Actually I would completely ignore the BIDS / Visual Studio Preview pane as it is riddled with bugs and inconsistencies and proves nothing (assuming your end users aren't using Visual Studio).
Instead I would deploy the report for each test run to a test environment / folder on the host server (Report Manager / SharePoint). As well as being a realistic and meaningful test, this has many advantages such as being able to leave multiple IE tabs open with various parameter combinations set, then just refresh them after a Deploy to retest.
We had a problem earlier when deploying a single report to the production environment: for reasons we don't understand, SSRS decided to also overwrite the Data Source associated with the report, with settings that do not even match those currently in the project.
We want to understand why/how this happens and what we need to be doing to control it; i.e., what are we missing about SSRS that we need to be aware of?
The steps we took were as follows:
Before starting: This is to update an existing report, not a new report, so the Prod report server already has the Data Source and the (old) Report Definition. The Data Source (shared) does not need to be changed at all, nor do we believe we did anything that should have prompted SSRS to do so. We only intended to overwrite the old report definition with the new one.
Data Source within project modified to point to the production Sql Server source
Deployment settings within the project modified to point to the production Report Server
The single report deployed (literally by right-clicking TheReport.rdl in the Solution Explorer and then clicking Deploy). That is everything we did. We did not deploy or change anything else.
Expected result: report definition on the prod server overwritten with the new report. Data Source completely unchanged, because why would it be? We didn't deploy that (and in any case, the one in the project is pointing to prod, so it shouldn't even matter if it did)
Actual result: Report overwritten as expected. Data Source also overwritten... with the old dev settings. Not the ones currently in the project. All the other reports sharing this data source suddenly stop working or display dev server data.
What are we doing wrong? SSRS quietly overwriting the associated Data Source on the server when deploying a single report seems dangerous (we would likely have missed that it had even happened, had the data on these particular dev and live environments been similar enough) so I presume we are missing something we should be doing/checking when deploying reports, but are at a loss as to exactly what.
There is a configuration setting that controls whether your data sources and datasets are copied to the report server. You can change it by right-clicking your report project and selecting Properties, which opens the property pages; the relevant options are OverwriteDataSources and OverwriteDatasets.
With these set to False, datasets and data sources are deployed to the SSRS server only if they do not already exist there.
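For reference, these deployment options are stored per configuration in the .rptproj file itself. A sketch of what the relevant section typically looks like (the server URL and folder names below are placeholders):

<Configuration>
  <Name>Release</Name>
  <Options>
    <TargetServerURL>http://yourserver/reportserver</TargetServerURL>
    <TargetDataSourceFolder>Data Sources</TargetDataSourceFolder>
    <TargetReportFolder>YourProject</TargetReportFolder>
    <!-- False: existing data sources on the server are left untouched on deploy -->
    <OverwriteDataSources>false</OverwriteDataSources>
  </Options>
</Configuration>

Keeping this file in source control makes it easier to spot when a deployment configuration is accidentally pointing at dev.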
Hope this will work.
I am working with TFS 2013 and generating reports with SSRS; however, I have been tasked with creating a detailed report showing all steps and acceptance criteria for each test case, listing the available parameters. I have searched and searched the web and have not found any examples that show the test steps.
Thank you in advance.
In most instances the Test Steps fields are not set to reportable by default so they are not populated in normal reporting databases. You'll need to set them to reportable or create a new data source targeting your TFS transaction databases instead of the Warehouse/Cube databases.
It is not recommended to target the transaction databases with reports, since this can affect the performance of TFS when reports are run, especially for a field like Test Steps, which might need to pull a lot of data at once.
Use the witadmin command to change the reportability of the Test Steps field. The detail option will only put the information in your Warehouse, while the dimension option will put it into both the Warehouse and the Cube. Where you want it depends on how you write your reports. Once set, the reportability type can only be changed in limited ways, so please keep this in mind when deciding how you want it set.
witadmin changefield /collection:http://yourcollectionURL:8080/tfs/YourCollectionName /n:<nameOfFieldToChange> /reportingtype:<dimension|detail>
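For example, assuming the Test Steps field uses the standard reference name Microsoft.VSTS.TCM.Steps (verify the actual name in your collection with witadmin listfields first):

witadmin changefield /collection:http://yourserver:8080/tfs/DefaultCollection /n:Microsoft.VSTS.TCM.Steps /reportingtype:detail

Note that the change applies to the whole collection, not a single project.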
See the reportable attributes section for details.
I have created an SSAS project, and my cubes, data sources, and so on work perfectly fine the first time I deploy the cube. But if I change the data in my database and click Cube > Process in BIDS, it will not reflect the new changes, even though retrieving the table data in SQL Server Management Studio shows the data has changed.
I have also tried updating the cube in an SSIS package using the Analysis Services Processing Task. However, the changes in my underlying data are not shown; it stays the same. Can anybody give me a few possible scenarios that can cause this problem?
Much appreciated
Thanks in Advance
First step is to verify that the datasource ON THE SERVER (not locally on your dev machine) is set to the correct database.
Are you processing the whole project or just the cube? I noticed that I have to process the top level item in the solution explorer. Processing just the cube was not enough.
Do you get any error messages?
Try processing the dimensions first and then process the cube.
I am not an expert myself, but as I understand it there is a difference between updating ("process") and deploying ("deploy").
So when you create or add dimensions you will first have to deploy the cube to the analysis server and then process it.
Later when there is new data in your datasource, you can just process the cube (it's already deployed) to update the data.
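If you want to script that "just process" step rather than click through BIDS, a minimal XMLA sketch would look like the following (both IDs are placeholders; take the real ones from the object properties in SSMS, then run it from an XMLA query window):

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>YourOlapDatabase</DatabaseID>
      <CubeID>YourCube</CubeID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>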
Try this:
Identify which Dimension or Fact Table data you have updated.
Go to the SSAS project you created in Visual Studio for the cube.
In the Solution Explorer, expand the Dimensions folder.
Right-click the dimension (or the dimension related to the fact table) whose data was updated, and click Process.
Once the Process window opens, confirm that Object Name is the object you selected and that Process Option is set to "Process Update".
Then click Run at the bottom of the Process window.
Verify that processing succeeded; if it failed, fix the errors.
Go to the Browser tab of the cube and check that the data is updated.
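If you end up repeating these steps often, the same Process Update can be scripted as XMLA and run from SSMS instead of the dialog. A sketch with placeholder IDs:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>YourOlapDatabase</DatabaseID>
      <!-- The ID of the dimension whose data changed -->
      <DimensionID>YourDimension</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
</Batch>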
Any changes made to the cube structure, i.e. adding new measures or business intelligence, will be reflected merely by processing the cube; there is no need to deploy the whole project again. However, if any changes are made to the underlying data or data structure, then the whole project needs to be deployed.
Changes can be seen right after processing/deployment in Cube > Browser, but will take some time to show up in SSRS because of caching.
I am trying to print a report that contains a bar graph using the report viewer, but running into an error. My reporting server is running SQL Server 2005 Reporting Services SP3 on Windows Server 2003 SP2.
Here are some steps that will reproduce the problem (at least for me)...
On a clean machine, I open up the report, and it displays fine.
I then click the print button, and I am prompted to install the RSClientPrint ActiveX control. The control downloads and installs fine.
I then click the print button again, and the print dialog appears.
I select a printer, and click "OK". A message box appears that has the following text (including the spelling error)...
An error occured during printing. (0x80004005)
Any other report I try to print works fine. The only difference between this report and the other ones is that it contains a bar graph. If I remove the graph from the report, redeploy it, and then re-run it, it prints without getting that error.
As far as I know, it is not isolated to a specific machine. It happens to every customer I have talked to, and a variety of machines here in the office.
Has anyone seen anything like this? I have seen similar posts on the web suggesting to uninstall video drivers on the reporting server (thinking the GDI DLLs have become corrupt), install service packs, etc. I have tried every suggestion, but haven't found a good solution yet.
Thanks.
I ended up having to use a paid Microsoft incident on this, but it is resolved now. The issue was that I had a matrix in my report that had dynamic columns. Depending on exactly which date range you picked, the report could have n number of columns. In my case, when a date range was chosen that produced three or more of these dynamic columns, it would cause the matrix to become too large and run outside of the margins of the report.
The report would run and display fine with the matrix being too large, but the incredibly non-descriptive error would display whenever the report was printed or exported.
I resolved the issue by reducing the size of other columns and the overall font size in the report. This prevents the matrix from running off the page in the case of date ranges that produce three dynamic columns. It doesn't solve it in the general case (four or more columns will make it fail), but is good enough for my current purposes.
Microsoft didn't have a fix for the general case (such as a way to make the matrix fixed width).
I figured I should answer this in case anyone else runs across it.
-David
I'd like to create a simple report that shows files that currently have pending changes (checked out) from a TFS 2008 server. I know that I can use the "Find in Source Control" option from Team Explorer, but I would rather generate a reporting services report. Ideally, I'd be able to show when the file was checked out and the user that checked it out, but that's not imperative.
If the data isn't pushed to the TFS data warehouse by default, then I'd like to find the relational table(s) in the SQL Server instance that would need to be queried.
I've spent some time digging around the TFS data warehouse and looking at all of the canned Reporting Services reports that I can get my hands on, but everything seems to be geared towards work items, check-ins associated with work items, etc...
If you're looking for some easy-to-read data and are not too worried about printouts, have a look at the TFS Sidekicks application by Attrice. It's very helpful, and if you have the correct permissions, you'll be able to see all the checked-out files.
http://www.attrice.info/cm/tfs/
I doubt the information you're looking for is in the data warehouse, and even if it were, it might not be fresh enough for your purposes. By default the warehouse is updated once an hour.
You could use SSRS to report directly against the TFSVersionControl database but I would not recommend going this route. The database is not documented and chances are very good that it will change in the next version. It could also have performance implications if your queries are not written correctly.
A better solution would be to use the TFS web services as your SSRS data source. There are services you can call to get all files that are checked out. This information is always current, and the queries it runs are highly optimized.
Example command line (Studio 2008):
"C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\tf.exe" status /recursive /user:*