We had a problem earlier when deploying a single report to the production environment: for reasons we don't understand, SSRS decided to also overwrite the Data Source associated with the report, with settings that do not even match those currently in the project.
We want to understand why/how this happens and what we need to be doing to control it - i.e. what are we missing about SSRS that we need to be aware of?
The steps we took were as follows:
Before starting: This is to update an existing report, not a new report, so the Prod report server already has the Data Source and the (old) Report Definition. The Data Source (shared) does not need to be changed at all, nor do we believe we did anything that should have prompted SSRS to do so. We only intended to overwrite the old report definition with the new one.
Data Source within project modified to point to the production SQL Server source
Deployment settings within the project modified to point to the production Report Server
The single report deployed (literally by right-clicking TheReport.rdl in the Solution Explorer and then clicking Deploy). That is everything we did. We did not deploy or change anything else.
Expected result: report definition on the prod server overwritten with the new report. Data Source completely unchanged, because why would it be? We didn't deploy that (and in any case, the one in the project is pointing to prod, so it shouldn't even matter if it did)
Actual result: Report overwritten as expected. Data Source also overwritten... with the old dev settings. Not the ones currently in the project. All the other reports sharing this data source suddenly stop working or display dev server data.
What are we doing wrong? SSRS quietly overwriting the associated Data Source on the server when deploying a single report seems dangerous (we would likely have missed that it had even happened, had the data on these particular dev and live environments been similar enough) so I presume we are missing something we should be doing/checking when deploying reports, but are at a loss as to exactly what.
That is controlled by a project-level setting that determines whether your data sources and datasets are copied to the report portal or not. You can change it by right-clicking your report project and selecting Properties, which opens the property pages; the overwrite options (OverwriteDataSources and OverwriteDatasets) are there.
With those options set to False, deployment creates the data source and dataset on the SSRS server only if they do not already exist on the server; existing ones are left alone.
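If you want to confirm what actually ended up on the server after a deployment, here is a minimal sketch against the ReportService2010 SOAP endpoint (the server URL and data source path are placeholders; assumes Windows PowerShell and default credentials):

# Connect to the report server's SOAP API (adjust the URL to your server).
$rs = New-WebServiceProxy -Uri 'http://yourserver/ReportServer/ReportService2010.asmx?wsdl' -UseDefaultCredential

# Read back the shared data source definition and check where it now points.
$ds = $rs.GetDataSourceContents('/Data Sources/YourSharedDataSource')
$ds.Extension       # e.g. SQL
$ds.ConnectString   # should still be the production connection string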
Hope this will work.
I'm getting the following error when publishing a report from the command line using the report scripting tool:
System.Web.Services.Protocols.SoapException: The definition of this report is not valid or supported by this version of Reporting Services. The report definition may have been created with a later version of Reporting Services, or contain content that is not well-formed or not valid based on Reporting Services schemas
There are other questions regarding this error and the solution always seems to be to either install SQL Server 2016 or to change my reports project to output for 2014. This does, in fact, work but it shouldn't be necessary. My SQL Server version is 13.0.4446.0, which is 2016. Furthermore, I can publish just fine from within VS with it set to 2016.
Anyone know what's causing this?
A couple of things to try:
What version of VS are you using? I came across an issue where I'd open a solution in VS 2015 (I think) and it upgraded report definitions without telling me. When you pick a deployment target in VS it changes the file on the fly, not the source file in the solution. Hence the different results you see.
You can check this by downloading a report from SSRS and opening the RDL in a text editor. Compare it with your solution RDL and check the header. The schema at the top is what you want to compare:
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition" xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner">
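If you'd rather not eyeball the headers, here is a rough PowerShell sketch of the same comparison (both file names are placeholders):

# Load the solution copy and the copy downloaded from the server.
[xml]$localRdl  = Get-Content -Raw '.\TheReport.rdl'
[xml]$serverRdl = Get-Content -Raw '.\TheReport_downloaded.rdl'

# The root namespace shows which report definition schema each file targets.
$localRdl.DocumentElement.NamespaceURI
$serverRdl.DocumentElement.NamespaceURI

If the two namespaces differ, that is the on-the-fly conversion described above.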
Another option: Have you tried deploying with PowerShell? Check out the SSRS GitHub for commands but it will be 2, maybe 3 lines of code. Very easy to use.
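For reference, a hedged sketch of that route using the ReportingServicesTools module from the SSRS GitHub (cmdlet and parameter names can differ between module versions; the server URI and target folder are placeholders):

Install-Module ReportingServicesTools -Scope CurrentUser

# Upload a single RDL into the target folder, overwriting the existing report definition.
Write-RsCatalogItem -ReportServerUri 'http://yourserver/ReportServer' `
                    -Path '.\TheReport.rdl' `
                    -RsFolder '/Production Reports' `
                    -Overwrite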
I have a report in Microsoft Access 97 (yes, I know) that works properly on my client's copy of Server 2003. However, when I try to run it on my own copy of Server 2003, the report crashes Access immediately. It does the same in Server 2008 R2 and Server 2012 R2. Now, the interesting thing is that the query underneath that report runs perfectly, which I would think rules out things like MDAC. There's nothing unusual about the report -- no strange fonts or graphics -- at least nothing obvious. If I can get it working in an environment identical to the client's, then I can move it forward to other operating systems. Where should I start looking for the cause?
Usually this kind of crash is related to missing references. You need to go to the VBA Editor (ALT + F11) -> Tools -> References and check whether any are flagged as MISSING.
Import everything from your current project into a new blank project; that seems to be the only resolution I have been able to find online.
The information that I found can be viewed here: http://www.pcreview.co.uk/threads/access-97-crashes-opening-a-report.3901588/
What worked for me, don't ask me why:
I found that printing worked, and so did Layout View.
Open the report in Layout View
Save the report
It works!
Running into an issue where, after upgrading SQL Server from 2008/R2 to 2012, SSRS reports rendered to PDF display "Page x of 0" at the bottom (x = the page number of the report). This only occurs in PDF.
I open the reports, and the correct usage of the globals is there. If I re-deploy the report to the upgraded SSRS, it works fine.
It appears to be an upgrade issue. Applying the SQL Server 2012 SP1 doesn't help either.
Any ideas?
So we opened a ticket with MS. In a nutshell, when running an in-place upgrade, the RDLs stored in the ReportServer DB are "left alone". Because MS changed Globals!TotalPages to Globals!OverallTotalPages and the RDLs are never touched during the upgrade process, the error occurs.
This also explains why, when you take the unmodified original report and re-upload/deploy it, it works.
This is fine if you're just updating your local server, but we have 4500+ installations, each of which can support up to N users, and 180+ reports. We get a trouble ticket whenever a customer upgrades, and that can be a lot of trouble tickets.
Not sure why MS just didn't "upgrade" Globals!TotalPages to Globals!OverallTotalPages. We didn't get an answer for that one.
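If anyone needs to find which report definitions still reference the old global, here is a rough sketch that scans a folder of RDL files (the folder path is a placeholder; pull the files from source control or export them from the server first):

# List every RDL that still uses the pre-2012 built-in field.
Get-ChildItem -Path '.\Reports' -Filter *.rdl -Recurse |
    Select-String -Pattern 'Globals!TotalPages' -List |
    Select-Object -ExpandProperty Path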
When I open up the solution that contains SSIS packages created by a colleague, I get this awkward error that tells me nothing about what I'm supposed to do to fix it.
He left instructions to take all the "variables" out of the connection string in the dtsx file manually before opening up the solution. I have done that; now, when I try to view the package in the designer, I just get an image of a red X and this message.
EDIT: You cannot see any design elements, and there are no tabs across the top to switch to errors or data flows, just a gray center area on the screen with a red X and the message; it's like Visual Studio dies in the process of reading the dtsx file.
The question is rather unspecific, so it's of course difficult to get on the right track here. All of the given answers focus on different issues. I would say that PeterX had the best guess. The reason for the error could be as simple as a modified data source.
I came across the "error output has no corresponding output" bug quite often when adding a new column to a table that needs to be processed by an existing SSIS package. This bug came along with an error message saying that a "Value does not fall within the expected range".
A newly added column needed to be processed by an existing SSIS package. The expected behavior is that SSIS will recognize that there is a new column and let you select it on the Columns page of the OLE DB Source task. However, when opening the OLE DB Source task for the first time after having modified the table, I got the error message "Value does not fall within the expected range." twice: once when opening the editor and once when opening the Columns page of the editor. Within the Advanced Editor of the OLE DB Source task, the new column showed up in the OLE DB Source Output Columns tree, but not in the OLE DB Source Error Output Columns tree. This is the actual underlying problem behind the error message. Unfortunately, there seems to be no way to add the missing column manually.
To solve the problem, remove and re-add the newly added column on the Columns Page of the normal Editor as mentioned by Jeff.
It is worth mentioning that the data source of the OLE DB Source task was a modified MDS view. Microsoft Dynamics CRM – as mentioned in the related thread – uses views, too. That leads me to the conclusion that using views as a data source may produce either of the above-mentioned errors when modifying data types or adding/removing columns.
Related thread: Error "...The OLE DB Source.Outputs[OLE DB Source Output].Columns[XXXXXXXX] on the non-error output has no corresponding output"
The described workaround refers to Visual Studio 2008 Version 9.0.30729.4462 QFE with Microsoft .NET Framework 3.5 SP1. The database is SQL Server 2008 R2 (SP2).
I had to delete and recreate the OLE DB data source in my Data Flow - this is where I got the error. I also noted that I had to "re-select" the "OLE DB connection manager" in the drop-down list to force it to recognise the new connection.
This was probably a combination of getting the solution from TFS (where I noticed the data sources didn't come across properly and it complained about a missing connection GUID) and/or copying and pasting the elements from another package.
(For BIDS 2008).
I had this issue with my OLE DB Source component (using a SQL command) after adding new columns to the database, and it wouldn't let me select columns or do anything else to add the new columns.
I'm working with an Oracle database, and the only way I could get it to update was to change the SQL query to select 1 from dual, and preview it. Then revert it back to my old query.
You get a similar message if someone uses EncryptAllWithUserKey as the ProtectionLevel. However, I believe the message is slightly different (even though you get a grey design surface with a red X).
Have you tried viewing the file in Notepad? Is it just a series of GUIDs or is there anything in it that is humanly readable? If it doesn't have any readable code, then it was probably encrypted with the user key.
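A quick way to check without the designer, as a sketch (the file name is a placeholder):

# EncryptAllWithUserKey here (or its numeric equivalent in older package formats)
# would explain why the rest of the file is unreadable.
Select-String -Path '.\ThePackage.dtsx' -Pattern 'ProtectionLevel'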
If the employee deployed the packages to a server and used SQL Server as the deployment destination (not File System or SSIS Package Store) then you can download the packages to your machine. Just connect to the SQL Server Integration Services engine, expand Stored Packages, expand MSDB, expand the relevant folder, right-click on the package, and click Export Package. Save the file on your local machine and open it. The package will probably lose annotations and pretty formatting, but otherwise it should be identical to what the employee deployed.
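If you prefer to script that export, here is a rough sketch that reads the stored package XML straight out of msdb with Invoke-Sqlcmd (the server name is a placeholder; the table name applies to SSIS 2008 and later):

# Dump the name and definition of every package stored in msdb.
Invoke-Sqlcmd -ServerInstance 'YourSqlServer' -Database 'msdb' -Query @"
SELECT  name,
        CAST(CAST(packagedata AS varbinary(max)) AS xml) AS package_xml
FROM    dbo.sysssispackages;
"@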
I just struck the same issue. After flailing about for a bit, I found the solution was to edit the Solution Configuration.
The Solution Configuration appeared to have a matching Project configuration, as shown:
However clicking the drop-down arrow for that Project (SSIS-Advance in this example) revealed that there was no Project Configuration for that project called Production - Sub Reports. I'm not sure how that came about - this Solution has a 7-year history and many developers.
Anyway once I created a New Project configuration (using that same drop-down menu), it is all happy now.
If it has Oracle data sources, you may need to install the Microsoft Connectors v4.0 for Oracle by Attunity:
https://www.microsoft.com/en-us/download/details.aspx?id=52950
I also had to use VS 2015 - the version originally used to create the project and package.
I had this exact problem and installing these connectors and using VS 2015 fixed the issue.
I had this occur as well when I tried to call a stored procedure with OUTPUT parameters with OLE DB.
I found this: http://sqlsolutions.blogspot.com/2013/04/ssis-value-does-not-fall-within.html, which resolved my issue. The relevant action was to rename the SSIS parameter mappings to '0', '1', etc.
So for example, when calling dbo.StoredProc @variable0 = ?, @variable1 = ? OUTPUT, @variable2 = ?;, in the parameter mapping dialog you would name the parameters '0', '1', '2' to correspond to those. Ah, SSIS <3
I get this when I do not follow the convention for parameter naming, e.g. not naming the parameters 0, 1, 2, ... in the right order for OLE DB connections.
The details are documented here.
In your connection manager, convert your connections to package level instead of project level
Deleting the connection manager, re-creating it, and setting up the SSIS package again solved the problem.
I got this issue after using Add Existing Connection Manager in an SSIS project. I was just importing a project connection manager (.conmgr) from a different project into my project. My solution to fix the issue was:
Deleting the imported .conmgr
Recreating it from scratch