SSIS Handling a Flat File Missing a Text Qualifier - csv

I'm currently designing an SSIS package to import some CSV files and need to account for various error types. One of the errors is an incorrect or missing text qualifier.
For example: "col1","col2","col3/,"col4"
The package is currently throwing the error "[ProductMaster CSV [66]] Error: The column delimiter for column "Column 2" was not found.", which is what I would expect to see in this situation.
Apparently getting the file initially sent in the correct format isn't an option at the moment.
I've tried changing the file to have no text qualifier, but this then falls over if there is a comma in a field, so it is not a viable solution.
Is there any way of handling this?

I use a third-party tool to read CSV files and it handles this type of situation. If you must do something on your own, I would import the entire line into one column and then parse it with either a stored procedure or a script component.
There are plenty of solutions out there, some free and some with a minimal cost.
I have never found a way to handle this with SSIS connection managers 'out of the box'.
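For illustration, here is a rough sketch of that parsing idea in Python (in an SSIS script component you would write the equivalent in C#). The parse_loose_csv helper, and the assumption that field values never contain the exact sequence ",", are mine, not part of the original suggestion.

```python
import re

def parse_loose_csv(line):
    """Tolerantly split a qualified CSV line, even when a closing
    qualifier is missing, e.g. "col1","col2","col3/,"col4".
    Assumes the sequence "," never occurs inside a field value."""
    line = line.rstrip("\r\n")
    # Strip the opening qualifier of the first field and the closing
    # qualifier of the last field, if present.
    if line.startswith('"'):
        line = line[1:]
    if line.endswith('"'):
        line = line[:-1]
    # Split on the field boundary: optional qualifier, comma, qualifier.
    return re.split(r'"?,"', line)

print(parse_loose_csv('"col1","col2","col3/,"col4"'))
# ['col1', 'col2', 'col3/', 'col4']
```

The same logic carries over to a script component: read each raw line into a single column, split it this way, map the pieces to the output columns, and redirect any row that does not produce the expected field count.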

To solve this issue, look at your file format in a text editor such as Notepad++. If your file uses CR as the row delimiter, make sure you don't use (") as the text qualifier, and choose CR as the header row delimiter. This should work 100%.


SSIS Exporting to Flat File Destination (CSV) - Custom Property EscapeQualifier Not Working (Undocumented?)

Many questions have been asked on this topic, but I can't find anything specifically addressing what I see in Visual Studio 2017 (SSDT). A Custom Property named "EscapeQualifier" exists for a flat-file destination component in an SSIS project. Unfortunately, setting this to true doesn't seem to do anything.
Searching the official documentation from MS doesn't even show that the property exists.
On the surface, using this option seems to be a very elegant solution to the common issue of creating a "real" CSV file when the data being exported contains the double-quote character. If it worked as it seems it should, then it would double any double-quotes (or similarly escape whatever character you defined as your text-qualifier) for all quotable fields in the destination.
The solutions for "the CSV problem" that I've been able to find suggest modifying the specific data via transforms or at the data-retrieval level, but that's very impractical to do on each and every text-qualified data column.
To add insult to injury, I found a KB article from MS that suggests "exporting to CSV" is an official thing in SSDT.
KB4135137 - SSMS and SSDT do not escape double quotation marks when you export data as CSV
For example, you export a table into CSV format in a SQL Server Integration Services (SSIS) project.
This article suggests that the double-quotes not being escaped are a bug that has been fixed. Maybe it has, but only for the "Save results as..." option within SSMS. I still don't see any possible way to specify a true CSV export in an SSIS package, and this "EscapeQualifier" option gave me false hope.
Does this "EscapeQualifier" option ever do anything? If so, how do I get it to work? If not, is there another universal solution to the SSIS export to CSV issue?
Note: I created a pull request to add information about this property to Microsoft Docs.
As mentioned in the Flat File Destination properties, the EscapeQualifier property is used to:
When text qualifier is enabled, specifies whether the text qualifier in the data written to the destination file will be escaped or not
To test this property, I created a package that transfers data from one flat file to another.
In the source flat file connection manager, the Text Qualifier is set to <none>, while in the destination flat file connection manager the text qualifier is set to ". The source flat file only contains the following value: my name is "hadi".
I set the EscapeQualifier property to True in the flat file destination and executed the package. The destination file then contained the following value: "My name is ""hadi""", which means that this property worked as expected.
Make sure that you have set a text qualifier in the flat file connection manager to ensure that this property will work as expected.
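For what it's worth, the behaviour described above is easy to reproduce outside SSIS. Here is a minimal sketch of the equivalent logic (the quote_field helper is my own illustration, not part of the SSIS API):

```python
def quote_field(value, qualifier='"'):
    """Wrap a value in the text qualifier and double any qualifier
    characters already present in the data (RFC 4180 style), which is
    what EscapeQualifier appears to do on write."""
    return qualifier + value.replace(qualifier, qualifier * 2) + qualifier

print(quote_field('my name is "hadi"'))
# "my name is ""hadi"""
```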

Access Link Text Wizard - Syntax error in PARAMETER clause

I'm trying to link to a text file from the Access Link Text Wizard (as I've successfully done hundreds of times before) but this time I'm getting an error stating
Syntax error in PARAMETER clause.
No parameterized query is being used so I'm at a bit of a loss, I'm only trying to link to a text file.
In Access 2016 I go to the External Data tab and click Text File. I browse to my tab-delimited text file and select the Link to the data source creating a linked table option. As soon as I click OK it gives me the error above.
My text file is very simple with 2 columns and about 100 rows of data. This file is created from a Stored Procedure in SQL Server 2016 using BCP. If I manually create a text file with test data using the same format I don't get the error, which leads me to believe it may be some data in the file causing the error? I can't figure out how to attach text files to my question so any suggestions are welcome.
EDIT: I copied all data from the offending file into a new text file and it linked properly so it's not the data. I am often creating text files from a SQL Server Stored Procedure then linking to it from Access. This is the first time I've experienced this particular error.
EDIT2: I recreated a text file manually with the same data from the offending file and named it the same this time (Procedure Class Listing.txt) and I got the error. Is something wrong with this title???
EDIT3: Sorry for so many edits. I tried naming the file without the spaces and it links properly. I have linked to files with spaces in the name before so I don't understand.
FINAL EDIT: So it appears that a text file starting with the word "Procedure" followed by a space is giving me this error. I can remove all spaces (ProcedureClassListing.txt) and it works fine (which is the solution I'm going with).
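For anyone automating that workaround, here is a minimal sketch (the paths are made up) that renames the BCP output to strip the spaces before Access links to it:

```python
import os

# Hypothetical paths: the linked text file fails when its name starts with
# "Procedure" followed by a space, so rename the BCP output first.
src = r"C:\Exports\Procedure Class Listing.txt"
dst = r"C:\Exports\ProcedureClassListing.txt"

if os.path.exists(src):
    os.replace(src, dst)  # overwrites any earlier export with the same name
```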

"Text was truncated or one or more characters had no match in the target code page."

This might be a really old question, but I am getting this error using an SSIS package that someone created. I checked the error, and it seems that a column in the flat file the package is reading has a lot of blank spaces, or maybe tabs, that make the field longer than it should be.
1. I tried changing the OutputColumnWidth from 50 to 100 or more, but it is not working.
2. I searched on this site, but the answers all refer to CSV files or add a step that creates an XLS file. I need to see whether it is possible to solve this without doing that, because the other option I thought of was to exclude the file from being read.
Any advice is appreciated...
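One way to attack this without changing the package is to pre-clean the file. Here is a minimal sketch, assuming a pipe-delimited flat file and made-up file names, that strips trailing blanks and tabs from every field before SSIS reads it:

```python
import csv

# Assumed file names and delimiter; adjust to the real flat file layout.
with open("input.txt", newline="", encoding="utf-8") as src, \
     open("input_trimmed.txt", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter="|")
    writer = csv.writer(dst, delimiter="|")
    for row in reader:
        # Drop the trailing blanks/tabs that push fields past the
        # declared OutputColumnWidth.
        writer.writerow([field.rstrip(" \t") for field in row])
```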

Puppet - CSV file header

I'm writing a Puppet (3.6.2) module that reads data fields from a CSV file via the extlookup function, and I cannot figure out how to tell extlookup that the first line is the header row. Does extlookup support this? If not, can anyone recommend an external function I could import and use?
thanks,
PS - Yes I know about hiera, and having the data in YAML or JSON files but my requirement is CSV files only.
Brandon
The behavior of extlookup() is pretty well documented. It makes no special provision for column headers, which are by no means an inherent feature of CSV format. Indeed, if your header line is not readable as a data line, then your file is not CSV at all.
Supposing that your file is indeed valid CSV, the absolute simplest solution would be to ignore the issue. It presents a problem only if the first column heading duplicates an actual or potential data name. If it does not, then you will never look up or use the pseudo-value represented by the first row.
If your file in fact is not CSV on account of its first line, or if the first column name conflicts with a real data name, then it seems the next best alternative would be to just remove that line, or to avoid creating it in the first place. I don't see any reason why one of these should not be possible.
I know about hiera, and having the data in YAML or JSON files, but my requirement is CSV files only.
How sad. Do be aware that extlookup() has long been deprecated, and it was removed from Puppet 4.
I'm inclined to suggest you implement a translator from CSV to Hiera-friendly YAML, and use Hiera in your module. Alternatively, Hiera supports custom backends, and it's not too hard to write one. I am unaware of an existing CSV backend for Hiera, but you could write one. Ignoring a header line would then be under your control, and you would simultaneously achieve a measure of future-proofing.
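A minimal sketch of such a translator, assuming a simple two-column key/value CSV with a header row and made-up file names (this is an illustration, not an existing tool):

```python
import csv
import yaml  # PyYAML, an assumed external dependency

# Read a CSV whose first row is a header and emit a flat, Hiera-friendly
# YAML mapping of key -> value. The column layout is an assumption.
with open("data.csv", newline="") as f:
    reader = csv.DictReader(f)          # DictReader consumes the header row
    key_col, value_col = reader.fieldnames[:2]
    data = {row[key_col]: row[value_col] for row in reader}

with open("common.yaml", "w") as out:
    yaml.safe_dump(data, out, default_flow_style=False)
```

Running the translator ahead of each Puppet run (or as a build step) keeps the data in CSV for whoever maintains it, while Hiera only ever sees YAML.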

SSIS Package not reading the last row in flat file

I have an SSIS package which loads an .EXT file into my database table.
The package's Flat File Connection Manager Editor properties are:
Format: Ragged Right
Code Page: 1252 ANSI (Latin-I)
Text Qualifier: <None>
Header Row Delimiter: <LF>
While previewing the file before loading, I am able to see all the rows in the Columns and Preview tabs of the Flat File Connection Manager Editor.
But when actually loading the file, only the last record does not get imported into the table.
It was loading fine before, and the package still processes the file on a daily basis.
Only for two days' files was the last record not imported, and I am trying to find the root cause.
I suspected something was wrong with the files, but I cannot find any differences between the working and non-working versions.
Please suggest how to resolve this. Kindly let me know if any further information is required.
I ran into the same issue and did some research to find a solution that worked for me. Apparently the SSIS package had gone through a conversion from an earlier version at one point. When the conversion was done, the text qualifier property on the flat file connection was mangled. It had originally been <none>, but the conversion changed it to _x003C_none_x003E_. I opened the flat file connection manager and changed the text qualifier property on the general tab back to the proper value of <none>.
Credit goes to this thread for providing the answer.
I had a similar issue. My flat file didn't have any text qualifiers. When I added a text qualifier, the package ran successfully. My guess is that the file is read as text and the CRLF is not recognized on the last line.
If you can provide a sample of the data from the file, that would help narrow it down.
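If the missing-CRLF theory is the culprit, a quick pre-load check along these lines (the file name and CRLF row delimiter are assumptions) would confirm or fix it:

```python
# Append a final CRLF if the last record of the flat file lacks one, so
# the last row is recognized. File name and delimiter are assumed.
path = "daily_feed.ext"

with open(path, "rb") as f:
    data = f.read()

if data and not data.endswith(b"\r\n"):
    with open(path, "ab") as f:
        f.write(b"\r\n")
```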