SSIS package fails after column order changes

I have an SSIS package that does a simple read from a flat file and inserts into SQL Server 2005. It was running fine on one computer (computer 1) with both the source and destination pointing locally.
The package then got moved to another computer (computer 2), again with everything pointing locally, and started failing. After looking into this for a while, it turned out to be that the columns of the destination table were in a different order on the two machines. The package was failing because it was trying to write data to the wrong table columns. That is, on computer 1 the columns were A,B,C and on computer 2 they were C,A,B - the package was trying to write A's data into C on computer 2, etc.
Am I missing something here? Does SSIS really depend on column order when writing to an OLE DB destination, instead of the column names? Or do I have a bad setting?

SSIS reads the metadata from the connection and stores it with the mapping. Sometimes it can detect a change and will give a validation error (in which case the package will fail at validation and you will have to alter the package to correct the problem). You can sometimes see this in the designer if you open the package, and it will offer to correct the columns.
In some cases, it will not detect changes during the validation phase, and it will fail during the insert.
So my question is, did it fail during validation or a later execution phase?

It's the same when you use Oracle as a source or destination: it doesn't detect column changes and fails during execution. I update the Data Flow task every time the column order changes or a new column is added.

Have you checked the column mappings in the data flow task? Sometimes the mapping disappears or points to the wrong column. Since your SSIS package was moved, there's a good chance the mapping is corrupted. I think you just need to correct the mapping and that should fix it.
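As an aside, the behavior is analogous to a positional INSERT in T-SQL. SSIS stores the destination metadata it saw at design time, so a mapping built against one column order can act like the positional form below (table and column names are illustrative):

```sql
-- On computer 1 the table is (A, B, C); on computer 2 it is (C, A, B).
-- Positional form: depends entirely on column order, so it silently
-- targets the wrong columns when the order differs:
INSERT INTO dbo.Target VALUES ('a-data', 'b-data', 'c-data');

-- Name-based form: order-independent, and what you effectively get
-- after re-mapping the destination by column name:
INSERT INTO dbo.Target (A, B, C) VALUES ('a-data', 'b-data', 'c-data');
```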

Related

SSIS package wrote 0 rows

Yes, I read the other questions on the same topic, but they do not cover my issue.
We run two environments: DEV and PROD. The two were synched last week, meaning they ought to contain the same data, run the same SSIS packages, and use the same source data.
However, today we had a package on PROD go through its usual steps (3 tables being truncated, and then loaded from OLE DB source to OLE DB destination, one after the other). The package finished without throwing an error, and the first 2 tables contain data, whereas the last one does not.
On DEV, everything looks fine.
I went through the package history, and it actually shows it wrote 0 rows.
Yesterday, however, it worked as intended.
When I manually ran the package, it wrote data. When I click "Preview", it displays data. When I manually run the source query, it consistently returns data, the same amount of rows, every time. The SSIS catalog has not been updated (no changes were deployed to PROD between yesterday and today).
The source query does not use table variables, but it does use CTEs. I have seen suggestions to add SET NOCOUNT ON, and am willing to accept this could be an explanation. However, those answers seem to indicate the package never writes any data, whereas this package has worked successfully before, and works successfully on DEV.
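For reference, the SET NOCOUNT ON suggestion just means prefixing the source query, e.g. (the CTE body here is a hypothetical stand-in for the real query):

```sql
SET NOCOUNT ON;  -- suppress "N rows affected" messages some providers misread

WITH src AS (
    SELECT id, amount      -- hypothetical columns
    FROM   dbo.SourceTable -- hypothetical table
)
SELECT id, amount
FROM   src;
```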
Does anyone have an explanation I can give my customer as to why one package suddenly chose not to write any data, and how can I ensure this won't happen again, for either this package or any of the other packages?
This can be tricky. Try the following:
1. Under Integration Services Catalogs -> SSISDB -> project -> (right-click) Reports -> Standard Reports -> All Executions, check whether the ETL job lost contact with the warehouse at any point.
2. If you have logging enabled, try to see at which task_name your package started returning 0 rows:
select
data_stats_id,
execution_id,
package_name,
task_name,
source_component_name,
destination_component_name,
rows_sent
from
ssisdb.catalog.execution_data_statistics
How are you handling transactions and checkpoints? This is important if you want to know the root cause of this issue. It may be that a loss of connectivity forced a rollback of any writes to the warehouse.
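You can also cross-check the execution log for warnings and errors around the same window; a sketch against the standard SSISDB catalog (message_type 110 = warning, 120 = error):

```sql
select operation_id,
       message_time,
       event_name,
       message
from   ssisdb.catalog.event_messages
where  message_type in (110, 120)   -- warnings and errors only
order  by message_time desc;
```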
As it turns out, the issue was caused by an oversight.
Because we run DEV and PROD on the same server (we know, and have recommended the customer at the very least consider using different instances), we use variables to point at the proper environment (set via environment variables).
The query feeding this particular package was updated, and apparently rather than using the variable to switch databases, the database was hard-coded (likely as a result of testing, and then forgetting to restore the variable). The loads for DEV and PROD run at the same time, and we suspect that while PROD was ready, DEV was still processing the source tables, and thus 0 rows were returned.
We only found this out today because the load again ran fine right until this morning. I was too late to catch it using Profiler, but because it was only this package, I checked, and spotted the hardcoded reference to _DEV.
Thanks everyone for chiming in.

SSIS Data Source Data Types

I'm in the process of creating a suite of packages to import data from our ERP system running on Informix IDS 11.7 into SQL server (2012).
Using SSIS to import this data I've come across an issue. I can access the data in 2 ways, using an ODBC connection and an ADO.NET data source, or using the OLEDB connection and provider.
Using ODBC is about 3 times slower (conservatively!), so naturally I'm keen to move away from that.
However the problem is, when I use OLEDB the data source is reporting the wrong data types.
NVARCHAR data types on the server are being reported as VARCHAR (DT_STR) to SSIS. This causes problems when importing, as any Unicode data that comes in causes the package to fail. There's no opportunity to do data conversions here; the package fails as soon as the data hits the data source component. I can set the component to ignore these errors and it will run fine, but with missing data, which isn't acceptable at all.
I've tried setting DB_LOCALE and CLIENT_LOCALE in setnet32; it doesn't have any effect.
When you create the OLE DB data source it complains about the default code page not being found, so you have to set the "AlwaysUseDefaultCodePage" property to true for that warning to go away. However, setting the default code page doesn't seem to actually change its behavior; it still tries to bring this data through as VARCHAR even if I set the code page to something like 65001.
So, my question is, how can I either get the OLEDB driver from Informix working correctly, or set/force SSIS to think this data is actually DT_WSTR, not DT_STR?
Thanks!
So, to summarise the steps required to get this working (for posterity's sake at least).
When setting up your OLEDB connection you need to add the 2 parameters -
RSASWS=TRUE;
UNICODE=TRUE;
These 2 parameters are NOT shown in the GUI for the connection, at least with the Informix 4.1 driver.
To add these in you need to modify the "ConnectionString" property of the connection, adding these 2 properties to the end of the connection string. Be aware that this property gets overwritten each time you open the connection GUI. You will need to make sure you modify the connection string manually after each time you enter this interface.
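As a sketch, the edited ConnectionString might end up looking like this (the provider name, server alias, database, and credentials are placeholders for your own values):

```
Provider=Ifxoledbc;Data Source=mydb@ol_myserver;User ID=informix;Password=*****;RSASWS=TRUE;UNICODE=TRUE;
```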
The other option for setting the connection string is using variables (or parameters in SSIS on SQL 2012), so any changes that get automatically made to the connection string will be corrected at run time (they actually get corrected at design time when using parameters).
One final caveat I've found with this data source: it appears to pad out nvarchars with trailing spaces, as if they were nchars. This could be an issue with my source data, but it's something to check if you're setting this up; you may need to add a trim step.

SSIS ‘Data Flow Task’ No records in flat file destination

Please forgive my initial posting being a question instead of a solution.
I’ve got two SSIS packages that basically do the same thing. The last step of both is a ‘Data Flow Task’ that queries the database and attempts to write the results to a flat file. One of the packages builds the flat file correctly, the other builds the file but doesn’t populate it with any records. Running SQL Server 2008 R2.
This is in a University setting involving transferring degree_codes and demographics between two systems. The degree_code package is functioning, the demographics one isn't. Both 'Data Flow Tasks' consist of an OLE DB Source linked to a Flat File Destination (tab-delimited text). Both packages display the correct data set when previewing the OLE DB Source.
In the Flat File Destination, the mappings are correct in both packages. However, when previewing the data, the degree details display correctly, but there are no records in the demographic preview. That's also true when looking at the connection managers. And when the packages run, the degree_codes file is correct while the demographics file only contains a header. It seems there is a problem with the link between the OLE DB Source and the Flat File Destination.
Both packages run with only a warning about shared global memory impacting performance. I’ve deleted and rebuilt the non-functioning Data Flow Task and connection managers without fixing the problem. At this point I am at a loss of which direction to go and don’t know how to diagnose the problem. Have any of you folks run into a similar situation or do you have any suggestions how to chase it down. I’d be grateful for any solutions.
Try exporting the data to a temp table in your database. If the data is saved there, the issue is with the file connection; if not, your query needs to be rewritten.
Verify that the query columns you are executing on the tables match and that the data types are as expected in the output. Try setting them all as string types initially and check whether it works; after it executes successfully you can apply the correct data types as needed.
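The cast-everything-to-string idea would look roughly like this in the OLE DB Source query (table and column names are hypothetical):

```sql
SELECT CAST(student_id  AS varchar(20)) AS student_id,
       CAST(birth_date  AS varchar(30)) AS birth_date,
       CAST(postal_code AS varchar(20)) AS postal_code
FROM   dbo.Demographics;  -- once this writes rows, restore the real data types
```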

Timeout issue during data transfer from MySQL to SQL Server using SSIS

I am trying to transfer 67,714,854 rows from MySQL to SQL Server using SSIS. The package times out after transferring 14,282,990 rows. I changed the time out property to 0 also, but that didn't help.
How do I resolve this issue?
I found a hacky solution: put a limit at the end of your query. I was facing the same problem with an ADO.NET connection to MySQL. Although it doesn't solve the underlying problem, it at least gets the work done.
SSIS: 2008 R2.
MySQL: 5.0
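A sketch of the LIMIT workaround in MySQL (table, key, and batch size are hypothetical); you would run the data flow repeatedly, advancing the offset each pass:

```sql
-- Pull the 67M-row source in fixed-size chunks instead of one huge result
SELECT *
FROM   source_table
ORDER  BY id                  -- a stable ordering key is required for paging
LIMIT  1000000 OFFSET 0;      -- next pass: OFFSET 1000000, then 2000000, ...
```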
On your OLE DB Destination connection, which "Data access mode" have you selected? If you have selected "Table or view - fast load" (this is the default), then there will be a "Maximum insert commit size" specified. You can try one of two things: 1) change the commit size to a larger number; or 2) try the other data access mode, "Table or view". Since you're getting a timeout error, I suspect that option 1 may not help (since you're already getting a timeout with a smaller value), so try option 2. Although that will likely result in slower performance, it may actually complete successfully. (You could then try @Siva's approach and split the output across multiple destinations to improve performance.)
(Note: I'm referring to what's available in SQL Server 2008 R2, if you're using previous versions, it may be slightly different)
If none of the above work, you could also try to create a new SSIS package from scratch by running the SQL Server Import Wizard (right-click on your database in SQL Server Management Studio and select Tasks/Import Data). Follow the wizard screens and near the end make sure you check the box to save the SSIS package, choosing a file location to save it to. Typically, the resulting SSIS package will be a functional package (and then you can make whatever further modifications you like to it).
Does MySQL give you the error or are you using PHP (or another language) to transfer the data and does that timeout? In the case of the latter, in PHP you can set the script timeout to infinite using this:
set_time_limit(0);
Either way, based on the information given, I'm not sure what type of database it is, but typically I would set up a cron script to transfer the data bit by bit in order to keep the load at an acceptable level. Please give more information...

How to fix SSIS: "Value does not fall within expected range"?

When I open up the solution that contains SSIS packages created by a colleague, I get this awkward error that tells me nothing about what I'm supposed to do to fix it.
He left instructions to take all the "variables" out of the connection string in the dtsx file manually before opening the solution. I have done that; now when I try to view the package in the designer I just get an image of a red X and this message.
EDIT: You cannot see any design elements, no tabs across the top to switch to errors or data flows. Just a gray center area on the screen with a red X, and the message; it's like Visual Studio dies in the process of reading the dtsx file.
The question is rather unspecific, so it's of course difficult to get on the right track here. All of the given answers focus on different issues. I would say that PeterX had the best guess. The reason for the error could be as simple as a modified data source.
I came across a bug, "error output has no corresponding output", quite often when adding a new column to a table that needs to be processed by an existing SSIS package. This bug came along with an error message saying that a "Value does not fall within the expected range".
A newly added column needed to be processed by an existing SSIS package. The expected behavior is that SSIS will recognize that there is a new column and offer it for selection on the Columns page of the OLE DB Source task. However, when opening the OLE DB Source task for the first time after having modified the table, I got the following error message twice: "Value does not fall within the expected range." The error message showed up when opening the editor and again when opening the Columns page of the editor. Within the Advanced Editor of the OLE DB Source task, the new column showed up in the OLE DB Source Output Columns tree, but not in the OLE DB Source Error Output Columns tree. This is the actual underlying problem behind the error message. Unfortunately, there seems to be no way to add the missing column manually.
To solve the problem, remove and re-add the newly added column on the Columns Page of the normal Editor as mentioned by Jeff.
It is worth mentioning that the data source of the OLE DB Source task was a modified MDS view. Microsoft Dynamics CRM – as mentioned in the related thread – uses views, too. That leads me to the conclusion that using views as a data source may produce either of the above-mentioned errors when modifying data types or adding/removing columns.
Related thread: "The OLE DB Source.Outputs[OLE DB Source Output].Columns[XXXXXXXX] on the non-error output has no corresponding output"
The described workaround refers to Visual Studio 2008 Version 9.0.30729.4462 QFE with Microsoft .NET Framework 3.5 SP1. The database is SQL Server 2008 R2 (SP2).
I had to delete and recreate the OLE DB Data source in my Data Flow - this is where I got the error. I also noted I had to "re-select" the "OLE DB connection manager" in the drop-down-list to force it to recognise the new connection.
This was probably a combination of getting the solution from TFS (where I noticed the data sources didn't come across properly and it complained about a missing connection GUID) and/or copying and pasting the elements from another package.
(For BIDS 2008).
I had this issue for my OLE DB Source component with an SQL command after adding new columns to the database, and it wouldn't let me select columns or anything else to add the new columns.
I'm working with an Oracle database, and the only way I could get it to update was to change the SQL query to select 1 from dual, and preview it. Then revert it back to my old query.
You get a similar message if someone uses EncryptAllWithUserKey as the ProtectionLevel. However, I believe the message is slightly different (even though you get a grey design surface with a red X).
Have you tried viewing the file in Notepad? Is it just a series of GUIDs or is there anything in it that is humanly readable? If it doesn't have any readable code, then it was probably encrypted with the user key.
If the employee deployed the packages to a server and used SQL Server as the deployment destination (not File System or SSIS Package Store) then you can download the packages to your machine. Just connect to the SQL Server Integration Services engine, expand Stored Packages, expand MSDB, expand the relevant folder, right-click on the package, and click Export Package. Save the file on your local machine and open it. The package will probably lose annotations and pretty formatting, but otherwise it should be identical to what the employee deployed.
I just struck the same issue. After flailing about for a bit, I found the solution was to edit the Solution Configuration.
The Solution Configuration appeared to have a matching Project configuration, as shown:
However clicking the drop-down arrow for that Project (SSIS-Advance in this example) revealed that there was no Project Configuration for that project called Production - Sub Reports. I'm not sure how that came about - this Solution has a 7-year history and many developers.
Anyway once I created a New Project configuration (using that same drop-down menu), it is all happy now.
If it has Oracle data sources, you may need to install the Microsoft Connectors v4.0 for Oracle by Attunity:
https://www.microsoft.com/en-us/download/details.aspx?id=52950
I also had to use VS 2015 - the version originally used to create the project and package.
I had this exact problem and installing these connectors and using VS 2015 fixed the issue.
I had this occur as well when I tried to call a stored procedure with OUTPUT parameters with OLE DB.
I found this: http://sqlsolutions.blogspot.com/2013/04/ssis-value-does-not-fall-within.html, which resolved my issue. The relevant action was to rename the SSIS parameter mappings to '0', '1', etc.
So for example, when calling dbo.StoredProc @variable0 = ?, @variable1 = ? OUTPUT, @variable2 = ?;, in the parameter mapping dialog you would name the parameters '0', '1', '2' to correspond to those. Ah, SSIS <3
I get this when I do not follow the convention for parameter naming, e.g. not naming parameters 0, 1, 2, ... in the right order for OLE DB connections.
The details are documented here.
In your connection manager, convert your connections to package level instead of project level
Deleting the connection manager and re-creating it in the SSIS package solved the problem.
I got this issue after using Add Existing Connection Manager in an SSIS project. I was just importing a project connection manager from a different project (.conmgr) into my project. My solution to fix the issue was:
Deleting the imported .conmgr
Recreating it from scratch