How to permanently fix destination column maps in SSIS 2014

I have an SSIS 2014 solution in Visual Studio 2015. The solution has multiple configurations. The solution is under source control in Git. The project is set for project deployment.
The issue I'm having is that the column mappings in the destinations keep disappearing every time anybody makes a change and checks it back into Git. The disappearances do not happen in the same areas.
I have tried recreating the project, then the whole solution, but this keeps happening. I even recreated the solution and put it in a different Git repository. This is the first time in 10+ years of working with SSIS that I've seen anything like this.
So far I haven't found anything on the different search engines.
EDIT:
The sources are just regular OLE DB sources with T-SQL queries that are typed in the "SQL command text:" box. Nothing fancy.
The project has one master package and 8 child packages. The child packages are called using the "Execute Package Task." All the child packages are self-contained, so no parameters are passed from the master package.
The disappearances do not happen in the same areas
This means that the destination columns get unmapped at random points in the data flows of any of the 8 child packages. It's different every time any of us opens the packages to check on the package logic. The destinations are just OLE DB destinations, so nothing fancy here either.
So any time anybody on the team makes changes to any of the packages, any number of the child packages will randomly unmap columns in their data flows. There's no pattern to this behavior. That's why it's so maddening.

Related

Sudden increase in failures of packages executed against Sharepoint/Office 365 from SSIS

Beginning yesterday afternoon (12/9, Central US time) we saw a marked increase in SSIS package execution failures. These packages have been in operation for several months and experienced no failures on 12/8. Initially I brushed it off as temporary, but now it seems as if "none" of them are working. Several of these packages run hourly, with the first failure around 10:30 on 12/9. Between 10:30 and 15:00 "most" succeeded, but after 15:00 on 12/9 most failed.
I'm testing with a relatively simple dataflow package. I have two sources (SQL and Sharepoint). From the sources, I compare the two and then update the Sharepoint list with any changes that have been made (the SQL query is the authoritative record). The source Sharepoint list is the same list that is being updated. As a further test, I removed all steps except querying the Sharepoint list and sorting it. The initial query still fails.
Errors are happening inconsistently within the dataflow package. For example, since I've been testing this morning, I had one run (and only one) that made it through the package to the point where it should have tried to add, update, or delete list items. The table comparison resulted in updates to the Sharepoint list, and the package failed when attempting to update the records. Most runs (including all recent attempts) fail when the dataflow initially queries the Sharepoint list. There are only two records on the Sharepoint list and two records in the SQL table.
I'm connecting to Sharepoint using MS Graph. Testing the connection (Connection Manager) within VS 2019 has succeeded every time. I've verified that the secret I'm using is not expired; I created a new secret and am receiving the same error. 'Usually' previewing the Sharepoint source succeeds, but not always, and even when it does, attempting to debug and run the package fails. I'm not seeing any alerts on Microsoft or Azure that would indicate the problem is there, though I feel like something must have changed.
I have opened a support ticket with CozyRoc and they have directed me to open a ticket with Microsoft. Microsoft's support request workflow is directing me here.
In the production All Executions report, the error I'm getting back is:
"Data Flow Task:Error: Attempt to read message string for 0xc02090f5 failed with error 0xc02090f2. Make sure all message related files are registered."
Initial research pointed me toward a data typing issue, but I've not changed anything in our Sharepoint, SSIS, or SQL environment that would have changed the data types.
This appears to be very repeatable so I can try providing more information if needed.
Looks like the answer was to wait. After about a week, the issue resolved as quickly as it appeared. I didn't find a way to report my issue directly to Microsoft without adding a support plan. I was in the process of finding alternative methods to address our needs when it resolved 'by itself'.

SSRS Create development environment from Live server

I've inherited a live SSRS server and have been asked to amend a lot of reports that are on there.
Is there a quick way I can "export" all of the reports/data sources to a local instance so I can develop against it using BIDS?
e.g. Can I copy the ReportServer database from Production?
What else would I need to do?
I'd like to be able to have a development copy of everything, with data sources pointing to copies of the production databases but with the same names. That way I could rewrite the report, redefine any required stored procedures locally, and then just deploy the new RDL to the server along with the ALTER PROCEDURE scripts.
Is that possible or even sensible!?
Personally, with the volume you mentioned in the comments (30 RDLs and 3 databases), I wouldn't recommend an automated clone of the entire Reporting setup from production to local. Instead, I'd suggest the following:
Reports
Go to the web front-end for your report server (typically http://yourserver/reports). Find each report, open it, and on the Properties tab click the Edit button. This button does not do what you might expect (edit the report inside the browser); instead it offers you a download of the RDL file. Save all the RDL files in one folder on disk.
With 30 reports, manually downloading them may take you an hour, max. This will probably beat most automated approaches. And since you should only need to do this step once...
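If you'd rather script this step than click through each report, the definitions can also be pulled straight out of the report server's catalog with a query. A minimal sketch, assuming the default database name ReportServer:

    -- Pull every report definition (RDL) out of the report server catalog.
    -- Content is stored as image; converting via VARBINARY(MAX) to XML
    -- yields readable RDL that you can save to disk.
    SELECT c.[Path],
           c.[Name],
           CONVERT(XML, CONVERT(VARBINARY(MAX), c.Content)) AS ReportRdl
    FROM ReportServer.dbo.[Catalog] AS c
    WHERE c.[Type] = 2;  -- 2 = report, 5 = shared data source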
Databases
It's not entirely clear from the question, but if you only have production databases and no DTAP setup yet, now may be a good time to start with that. You could host clones of the 3 production databases on a test server, or possibly in your dev environment. Note that the schema is what matters here (it should be the same as production); the data doesn't have to be entirely up to date.
Alternatively you can skip this bit and develop your reports against the production databases, assuming you can create connections from your dev machine to the production databases. Up to you.
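If you do go the cloning route, a plain copy-only backup and restore per database is usually all it takes. A minimal sketch; the database name, logical file names, and paths below are hypothetical:

    -- On the production server; COPY_ONLY avoids disturbing the backup chain.
    BACKUP DATABASE SalesDB
    TO DISK = N'\\backupshare\SalesDB_dev.bak'
    WITH COPY_ONLY, COMPRESSION, INIT;

    -- On the dev/test server; MOVE relocates the data and log files.
    RESTORE DATABASE SalesDB
    FROM DISK = N'\\backupshare\SalesDB_dev.bak'
    WITH MOVE N'SalesDB'     TO N'D:\SQLData\SalesDB.mdf',
         MOVE N'SalesDB_log' TO N'D:\SQLLogs\SalesDB_log.ldf',
         RECOVERY;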
Visual Studio / BIDS
This bit has a few parts to it:
Create a new reports project and solution in Visual Studio.
Add the existing RDL files you've downloaded earlier.
Depending on how the reports were set up, you may need to add shared data sources in your project, to get your reports up and running.
After all this, you should be able to preview your reports from Visual Studio (either with data coming from the "cloned" databases, or directly from production).
At this point you should also be able to safely make changes and preview/test them before deploying them.
Be sure to add the solution, reports, etc. to your version control system of choice.
Deployment
Once you've made changes you want to deploy to the reportserver, you have two basic options:
Deploy them using BIDS (see also the deployment properties MSDN page)
Go back to the web front-end, find the report, open the Properties tab again, click the Update button. This allows you to re-upload the RDL file with the changes you've made.
From now on you can just rinse and repeat on making updates and deploying the reports. No need for cloning/exporting the entire SSRS instance to keep things in sync.

TFS Requirements Overview Report showing wrong data

I have an interesting issue with TFS reports. When I run the QUERY: Team Queries->Planning and Tracking->Work Breakdown, I see the correct information, which is to say that I see the work items, etc. that are entered into TFS. However, when I run the REPORT: Reports->Project Management->Requirements Overview I see that same data PLUS data that is no longer in the system.
Important information:
* I am using TFS 2010
* When I originally created this project, I used a Microsoft Project plan to upload the work items. Before my team started using it, I decided to forget about Project and just use the web/studio interface, so I used the query "Delete all items" to clean the database.
While the clean worked in all other cases, this report seems to be holding on to those items, and I would like to know if there is a way to fix that. It has been several weeks, and I have run the cube reports to see if it is updating (everything updates fine).
Anyone have a clue what's going on here?
I'm not familiar with the query that you mention, but if you delete work items, the delete may not have been propagated to your warehouse (and subsequently the cube). If you have a relatively small number of work items in your TFSWorkItemTracking database, it may be a good idea to rebuild your TFSWarehouse, which will then refresh your cube.
Take a look at the SetupWarehouse.exe command, which should be installed on your application tier. This could take anywhere from an hour to a day to run, depending on the size of your version control and work item tracking databases, so you may want to do it off hours. It shouldn't affect the day-to-day execution of TFS, just the reports.
The above is for TFS 2008 Only. Per Matthew below, here's the answer for TFS 2010
From what I found, SetupWarehouse.exe no longer exists in TFS 2010. In the Administration Console, under Application Tier->Reporting, there is an option called "Start Rebuild". Using this completely resolved my problem. Thank you. It should be noted that there is NO feedback from clicking on "Start Rebuild". At first it looked like the admin panel hung, then it came back without feedback. It took about an hour for reports to start working again, which is the only way I knew it was done.
If you ever get into a situation again where you need to permanently get rid of one or more work items, you should get the TFS Power Tools. The TFPT utility has a "destroywi" command that allows you to permanently (and safely) remove work items from TFS.
Power Tools are available here: http://msdn.microsoft.com/en-us/vstudio/bb980963

Recover an SSIS project from an SSIS package

I was developing an SSIS project, but I accidentally erased it. However, I kept a copy of the SSIS package. So my question is: is it possible to recover the project using the package? Or is there some way to read the package content so I can start the project over?
Thanks
I don't remember there being anything too essential stored in the project files for SSIS projects - you can create a new project and then 'Add Existing Item...' to add the package(s).
@Will gave you the correct solution. Project files are XML files that list which packages are part of a project. You can add an existing package back without any issues. You can even manually add a node by editing the file directly; I used to find this useful before BIDS Helper offered sorting capabilities.
You may also want to implement a version control system if you are working with SSIS. Every once in a blue moon a package gets into a funky, unrecoverable state and we have to roll back to a previous version to get it working again. This happens about 4 times a year for a team of 6 people working on 100-200 packages. Also, you will never lose a package again, even if you erase it on the server and your local copy is wiped out.

How to fix SSIS: "Value does not fall within the expected range"?

When I open up the solution that contains SSIS packages created by a colleague, I get this awkward error that tells me nothing about what I'm supposed to do to fix it.
He left instructions to manually take all the "variables" out of the connection string in the dtsx file before opening the solution. I have done that; now when I try to view the package in the designer, I just get an image of a red X and this message.
EDIT: You cannot see any design elements: no tabs across the top to switch to errors or data flows, just a gray center area on the screen with a red X and the message. It's like Visual Studio dies in the process of reading the dtsx file.
The question is rather unspecific, so it's of course difficult to get on the right track here. All of the given answers focus on different issues. I would say that PeterX had the best guess: the reason for the error could be as simple as a modified data source.
I came across the bug "error output has no corresponding output" quite often when adding a new column to a table that needs to be processed by an existing SSIS package. This bug came along with an error message saying that a "Value does not fall within the expected range".
A newly added column needed to be processed by an existing SSIS package. The expected behavior is that SSIS will recognize that there is a new column and offer it for selection on the Columns page of the OLE DB Source task. However, when opening the OLE DB Source task for the first time after having modified the table, I got the following error message twice, once when opening the editor and once when opening the Columns page of the editor: "Value does not fall within the expected range."

Within the Advanced Editor of the OLE DB Source task, the new column showed up in the OLE DB Source Output Columns tree, but not in the OLE DB Source Error Output Columns tree. This is the actual underlying problem behind the error message. Unfortunately, there seems to be no way to add the missing column manually.
To solve the problem, remove and re-add the newly added column on the Columns Page of the normal Editor as mentioned by Jeff.
It is worth mentioning that the data source of the OLE DB Source task was a modified MDS view. Microsoft Dynamics CRM, as mentioned in the related thread, uses views too. That leads me to the conclusion that using views as a data source may produce either of the above-mentioned errors when modifying data types or adding/removing columns.
Related thread: "Error ... The OLE DB Source.Outputs[OLE DB Source Output].Columns[XXXXXXXX] on the non-error output has no corresponding output"
The described workaround refers to Visual Studio 2008 version 9.0.30729.4462 QFE with Microsoft .NET Framework 3.5 SP1. The database is SQL Server 2008 R2 (SP2).
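To make the trigger concrete, here is a purely hypothetical sketch (view and column names invented) of the kind of change that leaves the error output without a corresponding column:

    -- Hypothetical source view behind the OLE DB Source task.
    -- Adding a column like Email makes it appear on the normal output,
    -- but not on the error output, until it is removed and re-added
    -- on the Columns page of the editor.
    ALTER VIEW dbo.vw_Customer
    AS
    SELECT CustomerId,
           FirstName,
           LastName,
           Email  -- newly added column
    FROM dbo.Customer;
    GO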
I had to delete and recreate the OLE DB data source in my Data Flow; this is where I got the error. I also noted that I had to re-select the OLE DB connection manager in the drop-down list to force it to recognize the new connection.
This was probably a combination of getting the solution from TFS (where I noticed the data sources didn't come across properly and it complained about a missing connection GUID) and/or copying and pasting the elements from another package.
(For BIDS 2008).
I had this issue on my OLE DB Source component with a SQL command after adding new columns to the database, and it wouldn't let me select columns or do anything else to pick up the new columns.
I'm working with an Oracle database, and the only way I could get it to update was to change the SQL query to SELECT 1 FROM dual and preview it, then revert it back to my old query.
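For reference, the temporary stand-in is just this (Oracle syntax, since the source was Oracle); preview it, then restore the original query:

    -- Minimal throwaway query that forces SSIS to re-read the source metadata.
    SELECT 1 FROM dual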
You get a similar message if someone uses EncryptAllWithUserKey as the ProtectionLevel. However, I believe the message is slightly different (even though you get a grey design surface with a red X).
Have you tried viewing the file in Notepad? Is it just a series of GUIDs, or is there anything in it that is humanly readable? If it doesn't have any readable code, then it was probably encrypted with the user key.
If the employee deployed the packages to a server and used SQL Server as the deployment destination (not the File System or the SSIS Package Store), then you can download the packages to your machine. Just connect to the SQL Server Integration Services engine, expand Stored Packages, expand MSDB, expand the relevant folder, right-click the package, and click Export Package. Save the file on your local machine and open it. The package will probably lose annotations and pretty formatting, but otherwise it should be identical to what the employee deployed.
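If you'd rather pull the package definitions with a query than click through SSMS, packages deployed to MSDB under the package deployment model can be read from msdb.dbo.sysssispackages. A minimal sketch; the package name is hypothetical:

    -- packagedata holds the .dtsx XML for packages stored in MSDB.
    SELECT p.[name],
           CAST(CAST(p.packagedata AS VARBINARY(MAX)) AS XML) AS PackageXml
    FROM msdb.dbo.sysssispackages AS p
    WHERE p.[name] = N'MyPackage';  -- hypothetical package name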
I just struck the same issue. After flailing about for a bit, I found the solution was to edit the Solution Configuration.
The Solution Configuration appeared to have a matching Project configuration.
However, clicking the drop-down arrow for that project (SSIS-Advance in this example) revealed that there was no project configuration for that project called Production - Sub Reports. I'm not sure how that came about; this solution has a 7-year history and many developers.
Anyway, once I created a new project configuration (using that same drop-down menu), it is all happy now.
If it has Oracle data sources, you may need to install the Microsoft Connectors v4.0 for Oracle by Attunity:
https://www.microsoft.com/en-us/download/details.aspx?id=52950
I also had to use VS 2015 - the version originally used to create the project and package.
I had this exact problem and installing these connectors and using VS 2015 fixed the issue.
I had this occur as well when I tried to call a stored procedure with OUTPUT parameters via OLE DB.
I found this: http://sqlsolutions.blogspot.com/2013/04/ssis-value-does-not-fall-within.html, which resolved my issue. The relevant action was to rename the SSIS parameter mappings to '0', '1', etc.
So, for example, when calling dbo.StoredProc @variable0 = ?, @variable1 = ? OUTPUT, @variable2 = ?;, in the parameter mapping dialog you would name the parameters '0', '1', '2' to correspond to those. Ah, SSIS <3
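As a minimal sketch of the convention (procedure and variable names are hypothetical; the 0/1/2 naming is the point), for, e.g., an Execute SQL Task over an OLE DB connection:

    -- SQLStatement property; the ? markers are positional for OLE DB.
    EXEC dbo.UpdateCustomer @CustomerId = ?, @NewBalance = ? OUTPUT, @ModifiedBy = ?;

    -- Parameter Mapping dialog entries, in this exact order:
    --   Parameter Name: 0   Variable: User::CustomerId   Direction: Input
    --   Parameter Name: 1   Variable: User::NewBalance   Direction: Output
    --   Parameter Name: 2   Variable: User::ModifiedBy   Direction: Input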
I get this when I do not follow the convention for parameter naming, e.g. not naming parameters 0, 1, 2, ... in the right order for OLE DB connections.
The details are documented here.
In your connection manager, convert your connections to package level instead of project level.
Deleting the connection manager, recreating it, and setting up the SSIS package again solved the problem.
I got this issue after using Add Existing Connection Manager in an SSIS project: I was importing a project connection manager (.conmgr) from a different project into my project. My solution to fix the issue was:
Deleting the imported .conmgr
Recreating it from scratch