MySQL: importing a CSV file with phpMyAdmin without cell quotes

I have a huge CSV file with the data like this:
ID~Name~Location~Price~Rating
02~Foxtrot~Scotland~~9
08~Alpha~Iceland~9.90~4
32~ForestLane~Germany~14.35~
The issue is that when importing with phpMyAdmin, it asks for "Columns enclosed with:" and "Columns escaped with:". The trouble is that this CSV doesn't have quotes around the cells.
If I leave this blank, it gives the error: Invalid parameter for CSV import: Columns escaped with
Is there a way to import without having quotes on the CSV?

I can reproduce this behavior. I'll bring it up on the phpMyAdmin development discussion list, but in the meantime you can work around it by entering some nonsense character for "Columns escaped with" and leaving "Columns enclosed with" blank. Pick a character that doesn't occur anywhere in your data, say " or £, and use it for "Columns escaped with". For instance, I have a data set where I know £ doesn't exist, so I can use it as the "Columns escaped with" character; since your data contains no escaped characters, any character you enter there will do.
I'll update if I can provide any more useful information, but certainly that workaround should allow you to import your data.

Related

formatting MySQL output to valid CSV or XLSX

I have a query whose output I format and dump onto a CSV file.
This is the code I'm using,
(query.....)
INTO OUTFILE '/tmp/dump.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
However when I open the CSV in Google Sheets or Excel, the columns are broken up into hundreds of smaller ones.
When I open the CSV in a plain text editor, I see that the column values themselves contain quotes (single and double), commas, and line breaks.
Only the double quotes are escaped.
Even though the double quotes are escaped, they are dropped when interpreted by Google Sheets and Excel.
I tried manually editing the CSV entries, escaping the commas and such, but no luck. The commas still break the columns. However, in a couple of instances they didn't break the column, and I am not able to figure out why.
So my question is: how do I correctly format the output to accommodate these characters and dump it to a CSV, or even an XLSX, in case a CSV can't handle situations like these?
For context, I'm operating in a WordPress environment. If there is a solution in PHP, that can work too.
EDIT ::
Here is a sample line from the CSV,
"1369","Blaze Pannier Mounts for KTM Duke 200 & 390","HTA.04.740.80200/B","<strong>Product Description</strong><span data-sheets-value=\"[null,2,"SW Motech brings you the Blaze Pannier Brackets for the Duke 200 & 390. "]\" data-sheets-userformat=\"[null,null,15293,[null,0],11]\">SW Motech brings you the Blaze Pannier Brackets for the Duke 200 & 390.</span>"," <strong>What's in the box? </strong><span data-sheets-value=\"[null,2,"2 Quick Lock SupportsnMounting materialnMounting Instructions"]\" data-sheets-userformat=\"[null,null,15293,[null,0],null,[null,[[null,2,0,null,null,[null,2,13421772]],[null,0,0,3],[null,1,0,null,1]]],[null,[[null,2,0,null,null,[null,2,13421772]],[null,0,0,3],[null,1,0,null,1]]],[null,[[null,2,0,null,null,[null,2,13421772]],[null,0,0,3],[null,1,0,null,1]]],[null,[[null,2,0,null,null,[null,2,13421772]],[null,0,0,3],[null,1,0,null,1]]],null,0,1,0,null,[null,2,0],"calibri,arial,sans,sans-serif",11]\">2 Quick Lock SupportsMounting materialMounting Instructions</span> ","Installation Instructions"
From RFC 4180
If double-quotes are used to enclose fields, then a double-quote
appearing inside a field must be escaped by preceding it with
another double quote. For example:
"aaa","b""bb","ccc"
Any double quotes inside fields enclosed in double quotes need to be escaped with another double quote. So given the three field values abc, ab"c, and ", the expected formatting would be abc,"ab""c","""".
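As a sanity check, Python's csv module implements exactly this RFC 4180 doubling, so it can be used to regenerate (or verify) a problem line; the three field values below are the ones from the example above:

```python
import csv
import io

# The three field values from the example: abc, ab"c, and a lone "
rows = [["abc", 'ab"c', '"']]

buf = io.StringIO()
# QUOTE_MINIMAL quotes only fields that need it and doubles any
# embedded double quotes, per RFC 4180
writer = csv.writer(buf, quoting=csv.QUOTE_MINIMAL, lineterminator="\r\n")
writer.writerows(rows)

print(buf.getvalue())  # abc,"ab""c",""""
```

Re-writing the dump through a writer like this yields a file that Sheets and Excel agree on, since both expect quote doubling rather than backslash escapes.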

Using Excel to create a CSV file with special characters and then Importing it into a db using SSIS

Take this XLS file
I then save this XLS file as CSV and then open it up with a text editor. This is what I see:
Col1,Col2,Col3,Col4,Col5,Col6,Col7
1,ABC,"AB""C","D,E",F,03,"3,2"
I see that the value in column C was stored as AB""C: the column value was enclosed in quotes, and the double quote character in the data was doubled to indicate that the quote occurs within the data rather than terminating the column value. I also see that the value for column G, 3,2, is enclosed in quotes to make clear that the comma occurs within the data rather than starting a new column. So far, so good.
I am a little surprised that not all of the column values are enclosed in quotes, but even this seems reasonable once I assume that Excel only adds text qualifiers when a special character like a comma or a double quote exists in the data.
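That exported line is valid RFC 4180, which is easy to confirm with a standards-following parser such as Python's csv module (used here purely as an independent check):

```python
import csv
import io

# The line Excel produced when saving the XLS as CSV
line = '1,ABC,"AB""C","D,E",F,03,"3,2"\r\n'

# A conforming parser strips the enclosing quotes and undoes the doubling
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['1', 'ABC', 'AB"C', 'D,E', 'F', '03', '3,2']
```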
Now I try to use SQL Server to import the csv file. Note that I specify a double quote character as the Text Qualifier character.
And a comma character as the column delimiter. However, note that SSIS imports column 3 incorrectly, i.e., it does not translate the two consecutive double quote characters into a single occurrence of a double quote character.
What do I have to do to get Excel and SSIS to get along?
Generally people avoid the issue by using column delimiter characters that are LESS LIKELY to occur in the data, but this is not a real solution.
I find that if I modify the file from this
Col1,Col2,Col3,Col4,Col5,Col6,Col7
1,ABC,"AB""C","D,E",F,03,"3,2"
...to this:
Col1,Col2,Col3,Col4,Col5,Col6,Col7
1,ABC,"AB"C","D,E",F,03,"3,2"
i.e., removing the two consecutive quotes in column C's value, the data is loaded properly. However, this is a little confusing to me. First of all, how does SSIS determine that the double quote between the B and the C is not terminating that column value? Is it because the following character is not a comma (the column delimiter) or a row delimiter (CRLF)? And why does Excel export it this way?
According to Wikipedia, here are a couple of traits of a CSV file:
Fields containing line breaks (CRLF), double quotes, and commas
should be enclosed in double-quotes. For example:
"aaa","b CRLF
bb","ccc" CRLF
zzz,yyy,xxx
If double-quotes are used to enclose fields, then a double-quote
appearing inside a field must be escaped by preceding it with
another double quote. For example:
"aaa","b""bb","ccc"
However, it looks like SSIS doesn't like it that way when importing. What can be done to get Excel to create a CSV file that could contain ANY special characters used as column delimiters, text delimiters or row delimiters in the data? There's no reason that it can't work using the approach specified in Wikipedia, which is what I thought the old MS DTS packages used to do...
Update:
If I use Notepad change the input file to
Col1,Col2,Col3,Col4,Col5,Col6,Col7,Col8
"1","ABC","AB""C","D,E","F","03","3,2","AB""C"
Excel reads it just fine
but SSIS returns
The preview sample contains embedded text qualifiers ("). The flat file parser does not support embedding text qualifiers in data. Parsing columns that contain data with text qualifiers will fail at run time.
Conclusion:
Just like the error message says in your update...
The flat file parser does not support embedding text qualifiers in data. Parsing columns that contain data with text qualifiers will fail at run time.
Confirmed bug in Microsoft Connect. I encourage everyone reading this to click on this aforementioned link and place your vote to have them fix this stinker. This is in the top 10 of the most egregious bugs I have encountered.
Do you need to use a comma delimiter?
I used a pipe delimiter with no text qualifier and it worked fine. Here is my output from the text file.
1|ABC|AB"C|D,E|F|03|3,2
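If you go the pipe route, the conversion itself can be scripted rather than done by hand. This sketch re-parses the quoted line from the question and re-emits it pipe-delimited with no text qualifier; it is only safe under the assumption that | never occurs in the data:

```python
import csv
import io

# The original comma-delimited, quote-qualified line from Excel
src = io.StringIO('1,ABC,"AB""C","D,E",F,03,"3,2"\r\n')
dst = io.StringIO()

reader = csv.reader(src)
# QUOTE_NONE with quotechar=None writes every field verbatim; that
# is only safe because we assume '|' never appears in the data
writer = csv.writer(dst, delimiter="|", quoting=csv.QUOTE_NONE,
                    quotechar=None, lineterminator="\r\n")
writer.writerows(reader)

print(dst.getvalue())  # 1|ABC|AB"C|D,E|F|03|3,2
```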
You have 3 options in my opinion.
Read the data into a stage table.
Run any update queries you need on the columns
Now select your data from the stage table and output it to a flat file.
OR
Use pipes as your delimiters.
OR
Do all of this in a C# application and build it in code.
You could send the row to a script in SSIS and parse and build the file you want there as well.
Using text qualifiers and "character" delimited fields is problematic for sure.
Have Fun!

How to handle single and double quotes in an XLSX spreadsheet when converted to CSV for phpMyAdmin import

My client is providing me with an XLSX spreadsheet that, in some columns, can have single and/or double quotes. I'm opening it in LibreOffice and saving it as a CSV. Then I try to import it in phpMyAdmin, but every time the import gets tripped up on a line with either single or double quotes, depending on which I indicate to use for escaping.
When saving the XLSX as a CSV I select UTF-8 for encoding (it's defaulting to Windows-1252), comma for column delimiter, leave "Save cell content as shown" checked. For "Text delimiter" and "Quote all text cells", I've tried both options each (single and double quotes for delimiter and checked/unchecked for Quote).
Then in phpMyAdmin, for the import I leave UTF-8 selected, columns enclosed with double-quote (or single quote, matching what I selected in LibreOffice) and for columns escaped with I've tried backslash and double-quote and single-quote.
In ALL cases I keep getting the error "Invalid column count in CSV input on line n." The line number depends on what I selected for column escape/delimiter (single or double quote). If I selected double quote as delimiter, I get the error on the first line that has a column with an unescaped single quote in it, and vice versa for single quote delimiters.
How can I get this spreadsheet imported with both single and double quotes in the cells?
Okay, after some more research and "fiddling around" I figured it out.
In my situation, for the import I selected CSV using LOAD DATA, used double quotes for "columns enclosed with", and cleared the columns escaped with.
LOAD DATA apparently tells phpMyAdmin to allow MySQL to handle the file directly. I'm not sure why this would affect my issue if I can specify delimiters, etc for the "regular" CSV import selection, but it seems to have worked for me!
Hope this helps someone else out.

OpenOffice - CSV export: is there a default escape character?

As far as I can see, when saving a file as CSV, OpenOffice encloses all strings in quote characters.
So is there any need for an escape character?
and related to this question:
Does OpenOffice have a default escape character?
I'm also wondering if there is a way to choose the escape character when saving an OpenOffice file as CSV. phpMyAdmin was not accepting a 9,000-line, 50+ column spreadsheet in .ods format, and there doesn't seem to be a way to choose the escape character when saving as CSV.
So I had to save as CSV, open it in Word, and use some find/replace tricks to change the escape character to \ (backslash). The default is to use double quotes to escape double quotes, and phpMyAdmin won't accept that format.
To properly convert the file to use \ (back-slash) to escape double-quotes, you have to do this:
Pick a placeholder character string, e.g. 'abcdefg', that does
not occur anywhere in the csv.
Find/replace """ (three double-quotes in a row) with the placeholder. This is to prevent possibly incorrect results in the next step.
Find/replace "" (two quotes in a row, representing one quote that should be escaped), with \" (back-slash double-quote). If you did this without find/replacing """ it's conceivable you could get a result like "\" instead of \"". Better safe than sorry.
Find/replace the placeholder string with \"" (back-slash double-quote double-quote).
That will work unless you happen to have more than one double quote in a row in your original text fields, which would result in as many as five double quotes in a row in the CSV exported from the .ods or .xlsx file (two double quotes for each escaped double quote, plus another double quote if it's at the end of the field).
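The three find/replace steps above can be sketched as plain string replacements; the sample line and the placeholder string are both assumptions for illustration:

```python
# Sample CSV fragment: one interior escaped quote (ab"c) and one field
# ending with a quote, which yields three quotes in a row at the boundary
text = '"ab""c","ends"""'

PLACEHOLDER = "abcdefg"  # assumed not to occur anywhere in the file

# 1. Protect runs of three quotes (escaped quote at a field boundary)
step1 = text.replace('"""', PLACEHOLDER)
# 2. Turn interior doubled quotes into backslash-escaped quotes
step2 = step1.replace('""', '\\"')
# 3. Restore the protected runs as backslash-quote plus closing quote
result = step2.replace(PLACEHOLDER, '\\""')

print(result)  # "ab\"c","ends\""
```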
Escaping within quotes makes life easier for tools parsing the CSV file.
In a recent version of LibreOffice (3.4.4), the CSV export was not handled correctly by phpMyAdmin. Since LibreOffice doesn't provide an escape character, the phpMyAdmin's default "CSV" import feature "Columns escaped with:" didn't work well. The data was always inconsistent.
However, using the option "CSV using LOAD DATA" did work, but only if the value in the "Columns escaped by" option was removed. I presume phpMyAdmin uses MySQL's LOAD DATA command directly, and thus control is passed to MySQL for data processing. In my scenario it resulted in an accurate data import.

mysqlimport and double-quotes

We have a large tab-delimited text file (approximately 120,000 records, 50 MB) that we're trying to shove into MySQL using mysqlimport. Some fields are enclosed in double quotes, some not. We're using the fields-optionally-enclosed-by='\"' switch, but the problem is that some of the field values themselves contain double quotes (indicating inches), so a delimited field value might be something like "ABCDEF19"". Make sense?
We have no control over the source of the file, so we can't change the formatting there. I tried removing the fields-optionally-enclosed-by switch, but then the double-quotes that surround the values are imported.
The records with quotes in the values are getting seriously messed up. Is there a way we can tell mysqlimport that some fields are optionally enclosed by quotes but may still contain quotes? We've thought about a global search and replace to escape the double quotes in field values. Any other suggestions?
If your data includes quotes inside the body of a quoted field without escaping them somehow, you have a problem: you can't guarantee that mysqlimport will handle it properly.
Massage the data first before trying to insert it in this way.
Luckily, it is tab-delimited, so you can run a regex to replace the quotes with a delimited version and then tell mysqlimport the delimiter.
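One way to do that massaging, sketched here with a hypothetical sample line: strip whatever enclosing quotes are already present, then re-emit every field enclosed, so that interior quotes (the inch marks) come out doubled, which is a form MySQL's LOAD DATA accepts for the ENCLOSED BY character:

```python
import csv
import io

# Hypothetical tab-delimited input: some fields enclosed, some not,
# one field containing an inch mark
raw = '123\tABCDEF19"\t"quoted value"'

out = io.StringIO()
# QUOTE_ALL encloses every field and doubles any interior quotes
writer = csv.writer(out, delimiter="\t", quoting=csv.QUOTE_ALL,
                    lineterminator="\r\n")
for line in raw.splitlines():
    fields = line.split("\t")
    # Drop pre-existing enclosing quotes, keep interior ones
    fields = [f[1:-1] if len(f) >= 2 and f[0] == f[-1] == '"' else f
              for f in fields]
    writer.writerow(fields)

print(out.getvalue())
```

Since every field is now enclosed, you can then import with fields-enclosed-by='\"' rather than the optionally-enclosed variant.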
You could import it with the quotes (fields-optionally-enclosed-by switch removed) and then run a check where if the value has double quotes at the beginning and end (assuming none of the values have inches at the beginning) then truncate by 1 character from the beginning and end to remove the extra quotes you got from importing.
EDIT: after reading kekoav's response I have to agree that if you are able to manipulate the file before importing that would be a much wiser option, but if you are forced to remove quotes afterwards, you could use something like this:
UPDATE table
SET column =
  IF(
    LEFT(table.column, 1) = '"' AND RIGHT(table.column, 1) = '"',
    MID(table.column, 2, LENGTH(table.column) - 2),
    table.column
  );
for every 'column' in 'table'