MySQL import CSV only gets half the lines

I use the following command to read in a csv:
LOAD DATA INFILE '/Users/Tyler/Desktop/players_escaped.txt'
INTO TABLE players
FIELDS TERMINATED BY ','
ENCLOSED BY '^'
LINES TERMINATED BY '\n';
The csv looks like this:
^1^,^False^,^False^,^Ovie^,^Soko^,^6^,^8^,^210^,^^,^M^,^London^,^^,^^,^^,^^,^0^,^2009^,^^,^False^,^False^,^{299F909C-88D9-4D26-8ADC-3EC1A66168BB}^,^844^,^2013^,^^,^^,^0^,^^,^^,^2011-02-16 20:53:34.877000000^,^^,^2011-02-16 20:53:34.877000000^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^1^,^2^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^^
^2^,^False^,^False^,^Jordan^,^Swing^,^6^,^6^,^200^,^^,^M^,^Birmingham^,^AL^,^35218^,^^,^^,^0^,^2009^,^^,^False^,^False^,^{299F909C-88D9-4D26-8ADC-3EC1A66168BB}^,^844^,^2013^,^^,^^,^0^,^^,^^,^2011-02-16 20:53:34.877000000^,^^,^2011-02-16 20:53:34.877000000^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^1^,^2^,^^,^^,^^,^^,^^,^^,^^,^^,^^,^^
I also tried \ as a delimiter and got the same results.
I'm only getting the odd-numbered rows.
There are 250k records in the csv.
Thanks

Have you checked the line endings of the .txt file?
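If the file does turn out to have Windows-style \r\n endings, one thing to try is adjusting the line terminator; a minimal sketch, reusing the statement from the question:
LOAD DATA INFILE '/Users/Tyler/Desktop/players_escaped.txt'
INTO TABLE players
FIELDS TERMINATED BY ','
ENCLOSED BY '^'
-- assuming the file was written with CRLF line endings
LINES TERMINATED BY '\r\n';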

Related

Is there a way to import a CSV comma delimited file into mysql?

Hi there, I am new to web development.
I am trying to import a CSV file into MySQL Workbench using the 'Table Data Import Wizard'. However, I have read that my file needs to be a CSV (MS-DOS), or I get the following error: "Can't analyze file. Please try to change encoding type. If that doesn't help, maybe the file is not: csv, or the file is empty."
I cannot use a CSV (MS-DOS) as my data contains a lot of different special characters, including those from Nordic Europe. When I convert my CSV (comma delimited) to CSV (MS-DOS), the special characters are no longer the same.
Is there a way to import a CSV comma delimited file into MySQL Workbench? Or is there a better solution to getting my data into the table, such as keeping the special characters the same in the MS-DOS file somehow?
You can import regular CSVs without an issue, just make sure the encoding matches.
Something like
LOAD DATA
INFILE 'yourfile.csv'
INTO TABLE tablename
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\n'
IGNORE 1 LINES
should work. If your CSV doesn't have headers, remove the IGNORE 1 LINES clause. If your formatting is different, change the enclosing and terminating characters accordingly.
You can look up the exact syntax in the manual.
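If the wizard's complaint really is about encoding, LOAD DATA also lets you state the file's character set explicitly; a sketch, assuming the file is saved as UTF-8 and reusing the placeholder names from the example above:
LOAD DATA INFILE 'yourfile.csv'
INTO TABLE tablename
-- tell MySQL how the file is encoded so the Nordic characters survive the import
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;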
Your CSV should work fine. You just need to use LOAD DATA INFILE.
You will likely need to define these settings, though:
LOAD DATA INFILE 'c:/tmp/discounts.csv'
INTO TABLE discounts
-- comma separated? maybe pipes '|'?
FIELDS TERMINATED BY ','
-- what surrounds input and is it optional? then add OPTIONALLY before ENCLOSED
ENCLOSED BY '"'
-- what is at the end of each line?
LINES TERMINATED BY '\n'
-- how many header rows are there, if any?
IGNORE 1 ROWS;

Export JSON data from MySQL table to CSV

I used the following command to export some fields of a MySQL table, including a JSON field (attributes), into a CSV file:
SELECT name, attributes, product_url FROM products INTO OUTFILE '/var/lib/mysql-files/toys.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
But I get each key-value pair of attributes (the JSON field) in separate columns.
How to get all those key-values(attributes column of MySQL table) in a single column of CSV file?
I found a solution that was enough to get my job done. I exported those fields into a TSV instead of a CSV using the following slightly modified command:
SELECT name, attributes, product_url FROM products INTO OUTFILE '/var/lib/mysql-files/toys.tsv' FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
Still, if anyone has an exact solution to the problem, that would be greatly appreciated.
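One thing that may qualify as an exact solution, though untested against your data: with the default ESCAPED BY '\\', MySQL writes embedded double quotes as \", which some CSV readers don't understand, so they split the JSON on its commas. Doubling the quotes yourself and turning MySQL's escaping off produces RFC 4180 style output that standard parsers read back as a single column; a sketch using the column names from the question (note that INTO OUTFILE refuses to overwrite an existing file, so the old toys.csv must be removed first):
SELECT name,
       -- double any embedded quotes so CSV parsers keep the JSON in one column
       REPLACE(attributes, '"', '""') AS attributes,
       product_url
FROM products
INTO OUTFILE '/var/lib/mysql-files/toys.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY ''
LINES TERMINATED BY '\n';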

How to LOAD csv file with extra empty row?

I've got a CSV file which I need to load into a MySQL database. The problem is that every row of data is followed by an empty row:
open_test_uuid|open_uuid|asn|bytes_download
O0037c645-0c7b-4bd0-a6dc-1983e6d0f814|Pf13e1f22-92f6-4a49-9bd3-2882373d0266|25255|11704458

O0037c645-0c7b-4bd0-a6dc-1983e6d0f814|Pf13e1f22-92f6-4a49-9bd3-2882373d0266|25255|11704458
I tried different combinations of the LOAD command but nothing works.
LOAD DATA LOCAL INFILE '/Users/BauchMAC/Desktop/details_201301.csv'
INTO TABLE netztest
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n\r' <--HERE should be some magic line
IGNORE 1 ROWS;
I have no idea how to solve that...
Thank you for every idea
You could try
LINES TERMINATED BY '\n\n'
or
LINES TERMINATED BY '\r\n\r\n'
and if that doesn't work you could pre-process the file before feeding it to MySQL.
If you're on Linux you could run this:
cat file.csv | grep -v '^\s*$' > fixed.csv
UPDATE
Perhaps a little Perl will do the trick:
perl -e '$_=`cat file.csv`; s/[\r\n]+/\n/g; print'
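If neither terminator variant matches the file, another workaround is to load it as-is with LINES TERMINATED BY '\n' and then delete the rows the blank lines produced; a rough sketch, assuming those blank lines come through as rows whose first column is empty:
-- hypothetical cleanup after the import
DELETE FROM netztest
WHERE open_test_uuid = '' OR open_test_uuid IS NULL;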

Import CSV data into SQL database

I have a table with the following fields:
Name (varchar)
Ranking (int)
Age (int)
Favourite Court (varchar)
and a CSV file like this
name;age;current_ranking;favourite_court
sample1;22;5;Hard
sample2;21;6;Clay
I try to import it with MySQL using:
LOAD DATA LOCAL INFILE 'c:/players.csv' INTO TABLE players FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n';
or
LOAD DATA LOCAL INFILE 'c:/players.csv' INTO TABLE players FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n' (name,age,current_ranking,favourite_court);
with or without the file header (that is, name;age...).
It does not work; the fields are messed up. Why?
You need to name the columns from your table, not the columns as headed in the file:
LOAD DATA LOCAL INFILE 'c:/players.csv' INTO TABLE players
FIELDS TERMINATED BY ';'
IGNORE 1 LINES
(Name, Age, Ranking, `Favourite Court`);
Also, as noted under LOAD DATA INFILE Syntax:
Note
If you have generated the text file on a Windows system, you might have to use LINES TERMINATED BY '\r\n' to read the file properly, because Windows programs typically use two characters as a line terminator. Some programs, such as WordPad, might use \r as a line terminator when writing files. To read such files, use LINES TERMINATED BY '\r'.
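Putting the two points together, a sketch that assumes the file was generated on Windows (hence '\r\n') and still carries the header row shown in the question:
LOAD DATA LOCAL INFILE 'c:/players.csv' INTO TABLE players
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\r\n'
-- skip the name;age;current_ranking;favourite_court header
IGNORE 1 LINES
(Name, Age, Ranking, `Favourite Court`);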

Exporting data escaping new lines not working

I need to export data from a mysql table to a csv file. Two fields in particular are giving me trouble. Their data types are varchar and TEXT.
Data in these columns contain all sorts of gross characters, particularly new lines (\n) and tabs (\t). I need to export these into a .csv file for a migration.
Unfortunately, I cannot enclose the fields with '"' because the destination database doesn't support that formatting.
So, my query looks like the following:
SELECT `varchar_col`,`text_col` FROM `db`.`tbl` INTO OUTFILE '/path/to/my/file.csv' FIELDS TERMINATED BY '\t' ESCAPED BY "\\" LINES TERMINATED BY '\n';
When I look at my output file (I used gedit and nano), I simply see that each new line or tab in the file is preceded by a backslash (see example below). I would like each new line or tab to read as the literal characters '\n' or '\t' instead of actual new lines and tabs.
Example of problem with new line-
field value of:
value one
value two
exported to .csv gives me:
value one\
value two\
instead of:
value one\nvalue two
Can anyone tell me what I'm doing wrong?
Thanks
I'm not quite sure. But is this what you want? (Output the literal characters '\n' or '\t' instead of actual new lines and tabs.)
SELECT * FROM block INTO OUTFILE '/var/tmp/d.csv' FIELDS TERMINATED BY '\\t' ESCAPED BY "\\" LINES TERMINATED BY '\\n';
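Another way to get the literal characters '\n' and '\t' into the output, rather than relying on ESCAPED BY, is to rewrite them in the SELECT itself; a sketch with the column and table names from the question (carriage returns could be handled the same way if present):
SELECT
  -- turn real tabs and newlines into the two-character sequences \t and \n
  REPLACE(REPLACE(`varchar_col`, '\t', '\\t'), '\n', '\\n'),
  REPLACE(REPLACE(`text_col`, '\t', '\\t'), '\n', '\\n')
FROM `db`.`tbl`
INTO OUTFILE '/path/to/my/file.csv'
FIELDS TERMINATED BY '\t'
-- escaping is off so the inserted backslashes are written through untouched
ESCAPED BY ''
LINES TERMINATED BY '\n';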