SSIS 2010 Configure Flat File Source Editor for Polish Language

I have pipe-delimited flat files containing data in the Romanian and Polish languages/characters.
The first row contains the column_names in English.
I enabled the "Unicode" check box in the Flat File Connection Manager Editor, but the columns are displayed as unknown characters (in the Columns tab).
I need to map this data to a table in SQL Server 2012, but in the OLE DB Destination Editor I am getting the same single column as the "Available Input Column".

Try opening the source file in a text editor like Notepad++.
The editor will tell you which code page has been used.
In your Flat File Connection Manager, use the code page displayed.
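For example, assuming Notepad++ reports the encoding in its Encoding menu, the corresponding Flat File Connection Manager settings would look roughly like this (a sketch; note that the "Unicode" check box expects UTF-16, so leave it unchecked when setting an explicit code page):

Unicode:   unchecked
Code page: 65001 (UTF-8)                  -- if Notepad++ shows "UTF-8"
Code page: 1250 (ANSI - Central European) -- if it shows ANSI with Polish text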

Source setting:
In the Import/Export Wizard, when you select the flat file, set the Code page to 65001 (UTF-8).
Destination setting:
Click Edit Mappings and, while mapping the flat file columns to the table columns, choose nvarchar as the destination column type for the required columns.
Run the package (or click Preview) and check the table data; it should work.
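As a minimal sketch of why nvarchar matters here (the table and column names below are hypothetical, not from the original post):

-- Hypothetical destination table: nvarchar preserves Unicode
-- characters; plain varchar would mangle ł, ś, ț, etc.
CREATE TABLE dbo.ImportedData
(
    Id   int IDENTITY(1,1) PRIMARY KEY,
    City nvarchar(100) NOT NULL
);

-- The N prefix marks a Unicode literal; without it the special
-- characters may be silently replaced with '?'.
INSERT INTO dbo.ImportedData (City) VALUES (N'Łódź'), (N'București');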

Related

SSIS Adding Text Qualifier to Imported Data

I am trying a CData driver to import reports from QuickBooks Desktop and export them as a flat file. The fields are
GL_ID, Debit, Credit
However, when SSIS loads this file, the fields are not wrapped in text qualifiers (GL_ID = Hello, World instead of GL_ID = "Hello, World"), which means that when I import using comma delimiters, any field that already contains a comma splits apart.
How can I add the missing text qualifier so that fields which have commas in their text are not split up when using a comma delimiter?
Highlighted example: a row with .... ,LLC loads as 4 columns instead of 3.
You need to edit the Flat File Connection Manager (not pictured) that is being used by the Flat File Destination.
In the Flat File Connection Manager, on the General page, you can specify the Text qualifier (the default is <none>).
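For example, with the text qualifier set to a double quote ("), a value containing a comma is written as one field instead of two (sample data, not from the original post):

GL_ID,Debit,Credit
"Hello, World",100.00,0.00

When this file is re-imported with the same qualifier, the row parses back into 3 columns.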

BLOB error when mapping nvarchar columns with the same fixed length in SSIS

I am using SSIS to move data between environments, and I am getting the following error quite often inside Lookup components when mapping the input columns to the output columns:
I fixed the problem in most locations; using nvarchar(MAX) as the type was the cause. But I am still getting it even when the type of both the input and output columns is nvarchar(100). Any idea why I am getting this error? I tried a data conversion on the source data beforehand, but without any success.
EDIT
Below you can find screenshots from my lookup's configuration (named lookup update rows)
EDIT 2
When I open the .dtsx file related to the project in a text editor, I see several data types set to nText (as shown below), which I think is the cause of my problem.
dataType="nText"
cachedDataType="nText"
I changed these lines to the following, respectively:
dataType="wstr"
length="100"
cachedDataType="wstr"
cachedLength="100"
But when I build, my changes disappear and the nText types are set once again.
The solution to get rid of the BLOB types is to change the data types (SSIS data types) of the components within the data flow in the Advanced Editor:
For each component, right-click it and choose "Show Advanced Editor".
Go to the "Input and Output Properties" tab.
For all the input and output columns listed there, change the data type from DT_NTEXT to DT_WSTR, choosing an appropriate length as well.
This didn't work for me, as I was using an ODBC data source.
I had to CAST my blob table fields to varchar(max) using the SQL command text box in the ODBC Source Editor, and then go to the Advanced Editor and change all the ODBC Source output columns I had CAST to DataType string [DT_STR].
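As a minimal sketch of such a SQL command (the table and column names here are hypothetical, not from the original post):

-- CAST the blob column to varchar(max) so the ODBC Source no longer
-- exposes it as a BLOB column.
SELECT Id,
       CAST(Notes AS varchar(max)) AS Notes
FROM dbo.SourceTable;

The cast column can then be set to string [DT_STR] (or Unicode string [DT_WSTR]) on the source's output in the Advanced Editor.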
Hope this helps someone.
What solved my problem: my source had a string constraint of 50 characters while my destination was varchar(max). I changed the metadata of the destination column that was giving me the error from max to 50. Problem solved.

SSIS package to export data into a CSV file to FTP

I'm creating an SSIS package to get a .csv file onto my local server and transfer it to FTP.
When I get my CSV onto FTP and open it in Excel, my data gets shifted over into other columns. Is there any kind of internal setting I need to change?
I also tried different text qualifiers, but that still did not work.
It sounds like there may be hidden characters in your data set. If you are using commas, you may want to consider a less common character for the delimiter, such as a pipe "|". For instance, an address may naturally contain commas; if a pipe shows up in an address field it's probably a typo, and far less likely. Things that shift data cells are often characters like tabs and CRLF. You can also open your data set in a text editor like Notepad++ and choose "Show All Characters" under the "View -> Show Symbols" menu to see exactly which character it is. If it's rampant in your data set, you can use the REPLACE function within a Derived Column transformation to scrub the data as it comes out of the data source.
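Alternatively, you can scrub the stray characters in the source query itself. A minimal T-SQL sketch (the table and column names are hypothetical):

-- Strip tabs (CHAR(9)) and CRLF (CHAR(13), CHAR(10)) that would
-- otherwise shift values into neighboring CSV columns.
SELECT REPLACE(REPLACE(REPLACE(Address, CHAR(9), ' '), CHAR(13), ''), CHAR(10), '') AS Address
FROM dbo.Customers;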

Junk characters at the beginning of file obtained via column transformations in SSIS

I need to export varbinary data to files. But when I do it using the Export Column transformation in SSIS, the exported files are corrupt: there are a few junk characters at the start of each file, and on removing them the file opens fine.
A similar post about BCP says that these leading bytes specify the data length.
How can I address this issue in SSIS?
Thanks
The Export Column transformation is used to convert varbinary data to files. I tried something similar using AdventureWorks, which has image-type (varbinary) data.
The following query is used as the source query. I modified it, since the table does not contain the full paths needed to write the image files.
SELECT [ProductPhotoID]
      ,[ThumbNailPhoto]
      ,'D:\SSISTesting\ThumbnailPhotos\' + [ThumbnailPhotoFileName] AS ThumbnailPhotoPath
      ,[LargePhoto]
      ,'D:\SSISTesting\LargePhotos\' + [LargePhotoFileName] AS LargePhotoPath
      ,[ModifiedDate]
FROM [Production].[ProductPhoto]
Used the Export Column transformation (also available in 2005 and 2008) and configured it as follows.
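In the Export Column Editor, each binary column is mapped to the column holding its output path. Roughly, for the query above (a configuration sketch; the option settings are typical choices for binary files, not taken from the original post):

Extract Column: ThumbNailPhoto   File Path Column: ThumbnailPhotoPath
Extract Column: LargePhoto       File Path Column: LargePhotoPath
Force Truncate: checked (overwrite existing files on re-run)
Write Byte-Order Mark: unchecked (images are binary, not text)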
Mapped the rest of the columns to the destination.
After running the package, all the image files are written into the respective folders (D:\SSISTesting\ThumbnailPhotos\ and D:\SSISTesting\LargePhotos\).
Hope this helps!

How to assign a text qualifier in a flat file destination?

We have an SSIS package which reads from a DB, creates a flat file from that info, and drops it to a file server.
I recently made an update to the package's query which is used against the DB, adjusted the column mappings, and placed it under the SQL Job which ran the SSIS package before.
The problem is that the text qualifier in the flat file should be a quotation mark: ". But when I checked the flat file it produced, the text qualifier shown is: _x0022_
I investigated the Text Qualifier property of the DestinationConnectionFlatFile, and it is set to a quotation mark: "
How can I ensure the flat file will have a quotation mark as its text qualifier?
Here is a previous answer I found when this happened to me:
SSIS exporting data to flat file renders double quotes as hexadecimal characters
Additionally:
This issue occurs because of an installation problem. Another symptom of the same problem: if you are loading from a file into a database table and the file contains 100 records, only 99 records get loaded; the last record gets skipped.
I had the same issue; to fix it I reinstalled:
1) MS Visual Studio
2) MS BI Studio
in the sequence mentioned above.
Given below are two solutions:
Solution 1: Open the package (the raw .dtsx XML) in Notepad and edit the value of the "TextQualifier" property of the affected object from _x0022_ to &quot; (the XML entity for a double quote).
Solution 2: Open the package in the designer and set the "TextQualifier" property of the Flat File Connection Managers (FFD, SRC, SOURCE) to a literal double quote: ".