I'm trying to load a CSV into MySQL using the LOAD DATA INFILE technique. It's working fine, but I have a problem where some columns use double quotes and some do not.
Example:
something,123,something,"Bauer, Jack",123,something
What happens is the commas inside the quotes break the import, so my data is all jacked up at the end. I'm not sure how to get the import to treat commas inside the double quotes as part of the field rather than as separators.
mysql --user=<USER> --password=<PASS> -e "LOAD DATA INFILE '<FILENAME>' INTO TABLE <TABLENAME> FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' (col1, col2, col3, ...)" <DATABASE>
You need to execute the LOAD DATA INFILE statement with the additional option
FIELDS OPTIONALLY ENCLOSED BY '"'
Thus the whole statement becomes:
LOAD DATA INFILE '<FILENAME>' INTO TABLE <TABLENAME>
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(col1, col2, col3, ...)
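For reference, the same option slots into the one-liner from the question (a sketch using the same placeholders; note that the inner double quote has to be escaped for the shell):
mysql --user=<USER> --password=<PASS> -e "LOAD DATA INFILE '<FILENAME>' INTO TABLE <TABLENAME> FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' (col1, col2, col3, ...)" <DATABASE>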
For further reading, please consult the excellent MySQL Reference Manual, e.g. the edition for MySQL 5.1 GA.
Hopefully you have figured out what to do by now, but if you haven't, hopefully this will help you out.
I have had trouble in Excel 2007 exporting to a customized CSV file. I found that you can change the field terminator here: http://www.tek-tips.com/viewthread.cfm?qid=1635599&page=15
I prefer to use the pipe | character as it is uncommon, and a little more flexible when dealing with inch (") and foot (') marks in measurements.
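For example, loading such a pipe-delimited file would look something like this (a sketch; the placeholders here are mine, not from the original post):
LOAD DATA INFILE '<FILENAME>' INTO TABLE <TABLENAME>
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(col1, col2, col3, ...);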
I have two records for example like this
11,avec myName à EX,Ex,0,2021-06-25
22,"andone \"ttt\"",Ex,0,2021-06-25
I am trying to load this into a MySQL table with LOAD DATA, and every time I try it, the quotes get cut off or the backslashes don't show up.
I need to know if this is even possible. Can those two records go into a table and look exactly like they are in the CSV file?
I am using MySQL and trying
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example;
Use the ENCLOSED BY and ESCAPED BY options.
LOAD DATA LOCAL INFILE 'example.csv'
INTO TABLE example
FIELDS OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\';
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2, col3, col4, col5);
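To check what actually got stored (a sketch, assuming the example table above): with ENCLOSED BY '"' and ESCAPED BY '\\', the \" sequences in the second row are loaded as literal double quotes.
SELECT * FROM example;
-- the quoted column of the second row should come back as: andone "ttt"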
In MySQL 8 this may give you an error; please read
https://dev.mysql.com/doc/refman/8.0/en/load-data.html#load-data-file-location
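A common cause in MySQL 8 is that LOCAL loading is disabled by default. A quick sketch of the relevant checks, assuming you have privileges to change server variables (the client also has to be started with --local-infile=1):
SHOW VARIABLES LIKE 'local_infile';     -- OFF by default in MySQL 8
SET GLOBAL local_infile = 1;            -- allow LOAD DATA LOCAL INFILE
SHOW VARIABLES LIKE 'secure_file_priv'; -- restricts where non-LOCAL LOAD DATA may read from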
Windows uses LINES TERMINATED BY '\r\n'.
If the file comes from a Linux system, use LINES TERMINATED BY '\n'.
It is hard to tell without seeing the file which parameters would help exactly, so you should try some variants.
Also, an editor with hex-view capability helps in such cases to analyse the structure.
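If no hex editor is handy, a quick check from inside MySQL (a sketch, assuming the last column of the example table above, col5, is a string column) is to HEX() that column in a loaded row; a trailing 0D means the file has Windows \r\n endings but was loaded with '\n':
SELECT HEX(col5) FROM example LIMIT 1; -- a trailing 0D is a stray carriage return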
I am using LOAD DATA INFILE to upload a 2 GB CSV file into MySQL and getting some warnings indicating that some of the cells include a semicolon, even though the actual CSV file I am uploading is semicolon-separated.
I suppose that changing the delimiter from semicolon may be an option, but I do not have control over the export I am working with. I'm not sure how to do this without a find-and-replace, which obviously wouldn't work, as it would just replace the semicolons that need to remain in the fields they were exported to.
Here is what I am using:
load data infile '/file/path/bigfile.csv'
into table table_name fields terminated by ';'
LINES TERMINATED BY '\r\n' ignore 1 lines
(Column1, Column2, Column3, ....);
I guess the ultimate question is: can you use a regex as a delimiter when using LOAD DATA INFILE? I'm having trouble finding the answer.
You have to specify the field separator in the LOAD DATA command, as in:
LOAD DATA LOCAL INFILE 'abc.csv' INTO TABLE abc
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
FIELDS TERMINATED BY ';' is the main part to edit in your case.
Also, you might change the delimiter from semicolon to something else.
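Applied to the statement from the question (a sketch; this assumes the fields that contain semicolons are wrapped in double quotes in the export):
LOAD DATA INFILE '/file/path/bigfile.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(Column1, Column2, Column3, ....);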
I have a large MySQL table (8-9k records, I'm using phpMyAdmin) with some values that are descriptions, with commas.
I need to export the SQL table to CSV to use on a new platform, and I can't for the life of me solve this issue:
When there are commas in certain fields, the data gets shoved to the next column and things get out of order.
I JUST NEED the EXACT table in MySQL to become a CSV. Any tips, ideas?
Thanks.
Use this query:
SELECT * INTO OUTFILE 'filename.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM [tablename]
Source: http://www.wausita.com/2011/02/mysql-export-csv/
Simply go to the database's Export tab in phpMyAdmin. Be sure to enclose your fields; this way values are wrapped in " so that commas inside them are not mistaken for separators.
Otherwise, you can use mysqldump from the command line.
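If you go the mysqldump route, a sketch of a CSV-style export (placeholders are mine; --tab writes a data file per table into the given directory, which the server must be allowed to write to):
mysqldump --user=<USER> --password --tab=/tmp \
  --fields-terminated-by=',' --fields-optionally-enclosed-by='"' \
  <DATABASE> <TABLENAME>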
Or you can export data with a query:
SELECT *
FROM table
INTO OUTFILE '/tmp/table.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
My solution was exporting as an .xlsx from MySQL. The columns were rendered perfectly. Using several combinations of enclosing/escaping characters was to no avail.
I then saved the .xlsx as a .csv and it was fine.
Thanks for your time guys.
I'm migrating an old Sybase database to a new MySQL db.
Since Sybase can export data to .dat files (something similar to CSV), I decided to use that.
The problem is that Sybase uses commas as the column separator, and commas inside strings are not treated as separators because the strings are enclosed in single quotes; MySQL, however, does not handle that by default.
Is there a way to solve the problem? Here's my query:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
Thank you in advance.
If they are enclosed by single quotes try this:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
ENCLOSED BY '\''
LINES TERMINATED BY '\r\n';
If that doesn't work then you should do ENCLOSED BY '''' but I'm 99.99% certain the first is correct.
I'm not familiar with how DAT files may differ from CSV, but you can use CSV as the common format.
You can export from Sybase using BCP and forcing the output to use CSV format
bcp database.dbo.tbl_name out c:\temp\output.csv -c -t, -U user_name -P password -S server_name
Then import into MySQL with
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
I've been using the following command to export MySQL data to a CSV file.
SELECT * INTO OUTFILE 'output.csv' FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' FROM table1;
It works for simple tables with simple data. However, if the table contains HTML tags, double quotes, single quotes, ASCII characters, etc., it does not work properly, i.e. it will put tabs and new lines in incorrect places, breaking up data where it shouldn't. How can the SQL statement above be improved to export data containing HTML?
I have tried a SELECT ... INTO OUTFILE statement and then a LOAD DATA INFILE statement, and everything is OK; the HTML text was exported/imported without any mistakes (on MySQL 5.5).
Try adding the ENCLOSED BY option; it should help, e.g.:
SELECT *
INTO OUTFILE 'output.csv'
FIELDS TERMINATED BY '\t' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM
table1;
LOAD DATA INFILE 'output.csv'
INTO TABLE table1
FIELDS TERMINATED BY '\t' ENCLOSED BY '"'
LINES TERMINATED BY '\n';