How do I assign a text qualifier in a flat file destination in SSIS?

We have an SSIS package which reads from a DB, creates a flat file from that info, and drops it to a file server.
I recently updated the query the package runs against the DB, adjusted the column mappings, and put the package back under the SQL job that ran it before.
The problem is that the text qualifier in the flat file should be a quotation mark: ". But when I checked the flat file it produced, the text qualifier that appears is the literal text _x0022_.
I investigated the Text Qualifier property for the DestinationConnectionFlatFile, and it is set to a quotation mark: "
How can I ensure the flat file will have a text qualifier of quotation mark?
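For illustration (with made-up values), rows in the produced file come out like the first line below, when they should look like the second:
_x0022_12345_x0022_,_x0022_Smith, John_x0022_,_x0022_New York_x0022_
"12345","Smith, John","New York"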

Here is a previous answer I found when this happened to me:
SSIS exporting data to flat file renders double quotes as hexadecimal characters

Additionally,
This issue can be caused by a faulty installation. If you hit this sort of issue when loading from a file into a database table and the file contains 100 records, only 99 records get loaded to the database; the last record gets skipped.
I had the same issue, and to fix it I reinstalled
1) MS Visual Studio
2) MS BI Studio
in the sequence mentioned above.

Given below are two solutions:
Solution 1: Open the package in Notepad and edit the value in the "TextQualifier" element of the affected object to &quot; (ampersand, quot, semicolon), the XML entity for a quotation mark.
Solution 2: Open the package and replace the value in the "TextQualifier" of the Flat File Connection Managers (FFD, SRC, SOURCE) with ".
Thanks,
Prakash.A
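For orientation, the .dtsx file is plain XML, so the setting is visible in Notepad; the relevant fragment looks roughly like the two lines below. Element and attribute names vary between SSIS versions, so treat the markup here as an illustrative sketch rather than the exact text in your package:
<DTS:Property DTS:Name="TextQualifier" xml:space="preserve">_x0022_</DTS:Property>  <!-- before: qualifier saved in its escaped form -->
<DTS:Property DTS:Name="TextQualifier" xml:space="preserve">&quot;</DTS:Property>  <!-- after: the XML entity for a quotation mark -->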

Related

MySQL import - CSV - file refuses to be properly imported

I'm trying to import the following file into a MySQL Db:
https://drive.google.com/drive/folders/1WbRdNgqVre3wN4DpJZ-08jtGkJtCDJNQ?usp=sharing
Using the "data import wizard" on MySql Workbench, for some reason I'm getting "218\223 lines imported successfully", whereas the file contains close to 100K.
I tried looking for special chars around lines 210-230, also removing all of them, but still the same happens.
The file is a CSV of Microsoft Bing's geo locations, used in Microsoft Advertising campaigns, downloaded from Microsoft's website (using an ad account there).
I've been googling, reading, StackOverflowing, playing with the file and different import options...
I tried cutting the file into small bits, and the newly created file was completely corrupt somehow...
Encoding seems to be UTF-8, line breaks all "\n". I tried changing them all into "\r\n" using notepad++, but still the same happens.
File opens normally in Excel, looks normal, passes CSVlint.io...
The only weird thing is that the file contains quotes on some of the values but not on the rest (e.g. line 219). Yeah, I know it sounds like this would be the problem, but I removed that line, and all the other lines with quotes, and it still happens... I also tried loading with ENCLOSED BY '"', see below.
I also tried using SQL statements to import:
LOAD DATA LOCAL INFILE 'c:\\Users\\Gilad\\Downloads\\GeoLocations.csv'
INTO TABLE aw_geo_map_bmsl
FIELDS TERMINATED BY ','
LINES TERMINATED BY '/n'
IGNORE 1 ROWS;
(tried also with: ENCLOSED BY '"')
(had to add OPT_LOCAL_INFILE=1 to the connection on Advanced for MySQL Workbench to be allowed access to local files on my computer)
This gives 0 rows affected.
Help?
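One likely culprit in the statement above: LINES TERMINATED BY '/n' uses a forward slash, so MySQL looks for a literal "/n" that never occurs, reads the whole file as a single line, and IGNORE 1 ROWS then skips it, which would match the "0 rows affected" result. A corrected sketch, assuming the file really is UTF-8 with \n line endings and double-quoted text fields, would be:
LOAD DATA LOCAL INFILE 'c:\\Users\\Gilad\\Downloads\\GeoLocations.csv'
INTO TABLE aw_geo_map_bmsl
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;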
Epilogue: In the end I just gave up on all these import wizards and did it the old "make your SQL statements from Excel" way.
I imported the CSV data into Excel. Watch out: in this case I found I needed to use Excel's own data import wizard (which worked perfectly) in order to change the encoding to UTF-8; Excel 2010 had defaulted to "windows" encoding, which was wrong.
After processing the data a bit to my liking, I used the following Excel code:
=CONCATENATE("INSERT INTO aw_geo_map_bmsl (`Location Id`,Name,`Canonical Name`,`Location Type`,Status,`Adwords Location Id`)
VALUES (",
A2,
",""",B2,"""",
",""",C2,"""",
",""",D2,"""",
",""",E2,"""",
",",F2,");")
to generate an INSERT statement for every line. I then copied the results, pasted them as values only, pasted those into an editor, removed the extra quotes Excel adds, and ran the script in MySQL Workbench, which executes it line by line (it takes some time, but you can see the progress).
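For illustration, one generated statement comes out roughly like this (the values here are invented):
INSERT INTO aw_geo_map_bmsl (`Location Id`,Name,`Canonical Name`,`Location Type`,Status,`Adwords Location Id`)
VALUES (1023191,"Paris","Paris,Paris,France","City","Active",1006094);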
Saved me hours of unsuccessfully playing around with "automatic tools" which fail for unknown reasons and don't give proper logs ootb.
Warning: do NOT do this for unsanitized code as it's vulnerable to SQL injection. In this case it was data from Microsoft so I know it's fine.

Import fails from CSV file into SQL Server 2012 table

I am trying to import a rather large (520k rows) .CSV file into a SQL Server 2012 table. The file uses a delimiter of ";".
Please do not edit my delimiter. It is ";". I know that may seem strange, but that is what they used; it is not just a semicolon.
I don't think the delimiter is the issue because I replaced it with a tab and it seemed to be okay. When I try importing the file, I get a text truncation error, but I set the column to 255 just to be sure it had plenty of room.
Even when I delete the row, the next row causes the error. I don't see any offending characters in the data, so I am at a loss as to what the issue is.
I ended up using the EOL Conversion and selected Windows format in Notepad++ and then created a script to import the data.
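The import script itself isn't shown; a minimal BULK INSERT sketch along those lines (table name and file path are placeholders) might look like:
BULK INSERT dbo.ImportTarget            -- placeholder table name
FROM 'C:\data\extract.csv'              -- placeholder path
WITH (
    FIELDTERMINATOR = '";"',            -- the literal quote-semicolon-quote delimiter described above
    ROWTERMINATOR = '\r\n',             -- Windows line endings after the Notepad++ EOL conversion
    FIRSTROW = 2                        -- skip the header row
);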

SSIS 2012 extracting bool from .csv failing to insert to db "returned status 2"

Hi all, quick question for you.
I have an SSIS2012 package that is reading a flat file (.csv) and is loading it into a SQL Server database table. However, I am getting an error for one of the columns when loading the OLEDB Destination:
[Flat File Source [32]] Error: Data conversion failed. The data conversion for column "Active_Flag" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
I am wondering if this is because in the flat file (which is comma delimited), the values are literally spelled out "TRUE" or "FALSE". The advanced page on the flat file properties has it set to "DT_BOOL" which I thought was right. It was on DT_STRING originally, and that wasn't working either.
In the SQL server table the column is set up as a bit, and allows nulls. Is this because it is literally typed out TRUE/FALSE? What's the easiest way to fix this?
Thanks for any advice!
It actually turned out there was a blank space in front of "True"/"False" in the file. It was just bad data and I missed it. Fixing that solved my issue. Thank you though; I did try that, and when it didn't work, that's when I knew it was something else.
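As an aside, when the source data can't be cleaned, one workaround (not what was needed here) is to land the flag as text in a staging table and convert it in T-SQL, where trimming is easy. A rough sketch with hypothetical table names:
INSERT INTO dbo.TargetTable (Active_Flag)                                      -- bit column
SELECT CASE WHEN UPPER(LTRIM(RTRIM(Active_Flag))) = 'TRUE' THEN 1 ELSE 0 END   -- trim the stray space, then map TRUE/FALSE to 1/0
FROM dbo.Staging_Import;                                                       -- Active_Flag loaded here as varchar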

SQL Server Export Unicode & Import via SSIS

(SQL Server 2008)
So here's my task:
I need to export query results to a file, and then import that file using SSIS into another DB.
Specific to the task, the data contains every awkward Unicode character you can think of, so delimiting with commas, pipes, etc. is out of the question.
Here are the options SSMS gives me for export format:
Column Aligned
Comma/Tab/Space delimited
Custom delimiter
And here are the options SSIS gives me for a flat file data source:
Delimited (custom)
Fixed Width
Ragged Right
So given that a delimiter character is out of the question... I cannot see another method that both SSMS and SSIS agree on.
Fixed width, perhaps?
It seems strange that two such closely related MS products have such different options.
Or have I missed something here?
Any advice appreciated!
It seems you need to try out different combinations of options while creating the delimited flat file (for your exported query result).
Try setting the code page to UTF-8, with and without the Unicode option. Also set the text qualifier to " (or any character of your choice that you think might work), and try different options for the column delimiter.
Once you are able to create the delimited file, apply the same settings to the file while importing it into the other DB.

Junk characters at the beginning of file obtained via column transformations in SSIS

I need to export varbinary data to files. But when I do it using column transformations in SSIS, the exported files are corrupt. There are a few junk characters at the start of the file; on removing them, the file opens fine.
A similar post about BCP says that these characters specify the data length.
I would like to know how to address this issue in SSIS.
Thanks
The Export Column transformation is used for converting the varbinary data to files. I have tried something similar using AdventureWorks, which has image-type varbinary data.
The following query is used as the source query. I have modified it, since the table does not contain the full path for writing the image files.
SELECT [ProductPhotoID]
,[ThumbNailPhoto]
,'D:\SSISTesting\ThumnailPhotos\' + [ThumbnailPhotoFileName] AS ThumbnailPhotoFilePath
,[LargePhoto]
,'D:\SSISTesting\LargePhotos\' + [LargePhotoFileName] AS LargePhotoFilePath
,[ModifiedDate]
FROM [Production].[ProductPhoto]
I used the Export Column transformation (also available in 2005 and 2008), configured so that the photo columns are the data to extract and the corresponding path columns supply the file paths.
The rest of the columns were mapped to the destination.
After running the package, all the image files are written into the respective folders (D:\SSISTesting\ThumnailPhotos\ and D:\SSISTesting\LargePhotos\).
Hope this helps!