SSIS: cannot convert between unicode and non-unicode string data types

I am working in SQL Server 2008 and BIDS (SSIS). I have a simple data flow task in which I'm importing a flat file into an OLE DB destination. On the OLE DB destination, I'm getting an error message, stating "cannot convert between unicode and non-unicode string data types".
Now, I know one solution is to put a Data Conversion transformation between the flat file source and the OLE DB destination. However, my concern is why this is happening in the first place. In the connection manager for the flat file, all columns are string (DT_STR) data types, and the Unicode option is unchecked. Similarly, all columns in the destination table (upon inspection of the metadata in SSMS) are varchar data types. So there is no nvarchar-to-varchar mapping going on at all.
Why does SSIS think a Unicode-to-non-Unicode mapping is happening? And is there an easier way to resolve this than inserting a data conversion step for the affected columns?

[Since I don't seem to be allowed to comment on the question, I'm having to put my question here.]
Have you checked the table you're trying to insert the data into to see if the columns in the table are varchar or nvarchar? The SSIS metadata could be out of sync with the database table.
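For reference, one quick way to confirm what the database actually has is to query the catalog views and compare the results against the SSIS destination metadata; the table name below is a placeholder.
    -- List each column's type and length in the destination table so it can be
    -- compared against the SSIS destination metadata. Table name is a placeholder.
    SELECT c.name       AS column_name,
           t.name       AS data_type,
           c.max_length AS max_length_bytes
    FROM sys.columns AS c
    JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = OBJECT_ID(N'dbo.MyDestinationTable')
    ORDER BY c.column_id;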

Related

How to handle frequently changing data types in SSIS

I have a project containing numerous SSIS packages. These simple packages are loading data from an external server to my internal database and not making any data transformations. I cannot control the quality and the structure of the data in the source and get it 'as is'. Most of the data comes in the nvarchar data type.
My problem is that the owner of the source database is frequently changing the length of the nvarchar fields (say, from nvarchar(500) to nvarchar(510)). This makes my packages crash on truncation.
How can I set up my packages so that they ignore any length changes and simply truncate the data if needed? As far as I understand, I should set up the 'Error Output', but I am not sure whether I should work with the ODBC Source Output or with the OLE DB Destination Input.
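One possible approach, if the source query can be edited, is to cast every column to the length the destination expects so that longer values are truncated before they enter the data flow. A rough sketch, assuming the source accepts T-SQL-style syntax; the table and column names are made up:
    -- Pin each column to the length the destination expects; anything longer
    -- is cut off at the source instead of failing the package. Names are hypothetical.
    SELECT CAST(CustomerName AS NVARCHAR(500)) AS CustomerName,
           CAST(AddressLine  AS NVARCHAR(500)) AS AddressLine
    FROM   dbo.SourceTable;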

Extract data from Sybase ASE 15 using SSIS 2008 in Unicode format

I'm attempting to use SQL Server Integration Services (SSIS) 2008 R2 to extract data from a Sybase ASE 15 database.
I've managed to configure the OLE DB Source with the correct connection information and can see the tables and data. However, no matter what I try, it always returns DT_STR columns.
I would like to have the data returned in Unicode format, without using a Derived Column / Data Conversion task, as the destination tables are all defined with NVARCHAR (DT_WSTR) columns and it would be a pain to have to go through every column just to change the type.
Is there a way to define the connection string / set defaults on the login / other method to ensure that the OLE DB Data Source returns DT_WSTR columns instead of DT_STR when running a query?
Many thanks,
John
Right-click on the OLE DB Source and select the Advanced Editor, then go to the Input and Output Properties tab.
Select the column that you're importing and change its DataType in the properties pane on the right.
Alternatively you could have your OLE DB Source use a query to return its columns. Your SELECT statement can then CAST() your columns into the type you need.
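For example, something along these lines in the source query; the table and column names are invented, and since Sybase ASE's Unicode types are unichar/univarchar, whether the provider then reports them as DT_WSTR still depends on the driver:
    -- Convert the varchar columns to a Unicode type in the query itself so the
    -- OLE DB Source can pick them up as wide strings. Names are hypothetical.
    SELECT CONVERT(univarchar(100), customer_name) AS customer_name,
           CONVERT(univarchar(200), address_line)  AS address_line
    FROM   dbo.customers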
I don't know why you don't want to use a Derived Column / Data Conversion transformation, though. It seems like it would be the same amount of work, if not more.

Why are all the records not being copied from a CSV to a SQL table in an SSIS package

I am trying to copy data from a flat file to a SQL table using SSIS.
I have a Data Flow Task where I have created a Flat File Source pointing to the csv file and an OLE DB Destination pointing to the table I want the data in.
The problem I am facing is that when I run the package, only 2,621 rows are copied to the SQL destination table, although there are about 170,000 records in the CSV. I'm not sure why this is happening.
Thanks in advance.
This could be a number of things. This is what comes to mind:
The connection string to your flat file is overwritten by a variable expression or a package configuration. Check SSIS -> Package configurations or the Expressions properties on your connection manager.
The DataRowsToSkip property on your flat file connection manager is set to a value greater than zero.
The metadata definition of your flat file is incorrectly configured in your connection manager. See properties such as Format, Row delimiter, Column delimiter, etc., and use the preview function to check the output.
The error output on your flat file source is set to Ignore failure, meaning that lines which SSIS cannot process (due to, e.g., incompatible data types) are ignored without warning.
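As a first sanity check, it can also help to compare the destination row count against the number of data rows in the file; the table name below is a placeholder:
    -- Compare this count against the CSV's line count minus the header row.
    -- Table name is a placeholder.
    SELECT COUNT(*) AS loaded_rows
    FROM   dbo.MyDestinationTable;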

SSIS package creation for integrating MSSQL and MySql dbs

I am trying to create an SSIS package for integrating between MSSQL and MySQL. I have no prior experience working with BIDS or SSIS, and I am following the instructions from here.
I added the OLE DB Source, Lookup, Conditional Split, OLE DB Destination and OLE DB Command components to the Data Flow and configured the connection managers and column mappings up to the Conditional Split component.
From here, I am facing two problems:
1) After configuring the OLE DB Destination, an error symbol appears on the component saying it could not convert between unicode and non-unicode string data types. To solve this, I tried to insert a Data Conversion component between the Conditional Split and the Destination and configured it for the problematic column, but that doesn't seem to help.
2) While configuring the OLE DB Command, the right-hand column in the Column Mappings tab shows zero columns. I have added the SQL command with question marks, so I guess it should be showing columns named "Param_0", "Param_1", etc., if I am not wrong. I even tried to add them manually from the Input and Output Properties tab, but then it shows a warning that the external columns for the OLE DB Command are out of sync with the data source.
What am I missing here?
Thanks
The way you describe your first problem, it sounds like it should work. Here are a couple of things to check.
The data conversion component creates a new column for the converted data. Make sure you are referring to it in your following transformations and destination.
Right-click on the Data Conversion component and select Advanced Editor. Select the Input and Output Properties tab in the Advanced Editor. Expand the Data Conversion Output branch of the tree view and select your new column. Ensure that the Data Type Properties show the data type that you want to convert to. If these values are not right, then something is wrong with the setup of the component.
For your second problem, the issue can frequently be caused by an error with the SqlCommand value. First, make sure the Connection Manager is correct on the Connection Manager tab. Switch to the Column Mappings tab. Near the bottom of the form, there may be a warning message that indicates that the SQL statement cannot be prepared. In other words, SSIS can't figure out what the statement is supposed to do. Address any problems with the SQL statement and switch back to the Column Mappings tab. The columns will appear once the SQL statement can be parsed.
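To illustrate, the SqlCommand for an OLE DB Command is normally a plain parameterized statement with ? placeholders, which SSIS then exposes as Param_0, Param_1, and so on; the table and column names below are hypothetical:
    -- Each ? becomes an input parameter (Param_0, Param_1, ...) that is mapped
    -- to an input column on the Column Mappings tab. Names are hypothetical.
    UPDATE dbo.TargetTable
    SET    StatusCode   = ?,
           ModifiedDate = ?
    WHERE  TargetId     = ?;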
If you want to avoid the conversion issues, then change your destination table column types from char/varchar to nchar/nvarchar. I'm pretty sure you will need to use an ADO.NET connector for the MySQL source and destinations; you should be able to read data from the MySQL source and write to the MSSQL database without using anything other than source and destination components.
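If you go that route, widening a destination column is a one-line change per column; the names below are placeholders, and keep in mind that nvarchar stores two bytes per character, so the table will grow accordingly:
    -- Change a destination column from varchar to nvarchar so the Unicode data
    -- coming from MySQL maps straight across. Names are placeholders.
    ALTER TABLE dbo.MyDestinationTable
    ALTER COLUMN CustomerName NVARCHAR(255) NULL;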

OLE DB to get BlobColumn Data in SSIS Dataflow

When I use an ADO.NET source in a Data Flow to read a BLOB column and pass it to a Script Component, the Script Component performs further validations on each column and generates master/child error records: a master record for each row and a child record for each error column. This works fine.
As I need to parameterize my source, I can't use the ADO.NET source and instead need to use the OLE DB Source, which supports parameters. When I use the OLE DB Source, the Script Component doesn't recognise the BLOB data being passed to it; it reports data type problems, i.e., converting non-Unicode to Unicode.
How can this be done?
Regards
Can you confirm what your source database is (SQL Server, Oracle, etc.)?
I had the same problem using the 'Oracle OLE DB Provider for Oracle' data source. The provider seems to convert every varchar into an nvarchar. I solved this by adding a Data Conversion component and explicitly converting all nvarchar columns to varchar there.
The new columns are included in the output of this component, so you can link them to the fields on your spreadsheet.
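If editing the source query is an option, an explicit cast there can achieve the same thing without the extra component. A rough sketch, assuming an Oracle source and made-up names; whether the provider still reports the result as Unicode depends on the driver:
    -- Cast the columns the provider reports as nvarchar back down to VARCHAR2
    -- in the source query. Table and column names are hypothetical.
    SELECT CAST(document_name AS VARCHAR2(200)) AS document_name,
           document_blob
    FROM   my_schema.documents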