Different line break characters in CSVs - MySQL

I use LOAD DATA LOCAL INFILE to upload csv files to MySQL.
If the csv was created by a mac, I include LINES TERMINATED BY '\r' in the query.
If the csv was created by MS Office, I include LINES TERMINATED BY '\n' in the query.
If the csv was created in OpenOffice, I omit LINES TERMINATED BY altogether.
Is there a way I can formulate my query so the csv will be uploaded regardless of how it was created?

Look into stripping or normalizing the line terminators before the load. There are a number of resources that cover \r\n issues and how to fix them across different character sets.

Related

Loading data into Mysql table

I'm quite sure this question has probably been posted here somewhere, but I can't find it, and my problem seems a bit different. I tried loading a .txt file from my local computer into a MySQL table, but the output comes out really scrambled: the data end up all over the place, not in the order I want.
Could anyone explain why that is, and how to fix it?
I used the query; LOAD DATA LOCAL INFILE \home\minilab\desktop\data.txt1.txt INTO clientrecord;
Make sure the values in each line match the fields in the table. If the data are separated by ',', use FIELDS TERMINATED BY ','; if separated by tabs, use FIELDS TERMINATED BY '\t'. Also add LINES TERMINATED BY '\n'.
Also, please share some sample data from your infile.
Ex:
LOAD DATA LOCAL INFILE '/home/minilab/desktop/data.txt1.txt' INTO TABLE
clientrecord FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
If you don't use the proper formatting, the values will be scattered across the table.
reference: http://dev.mysql.com/doc/refman/5.7/en/load-data.html

LOAD DATA INFILE not upload all the lines in a huge file

I wrote this SQL statement:
LOAD DATA LOCAL INFILE '"+exportDataFile+"' INTO TABLE ex_patients_variations FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
This statement should load a huge file into the database (right now it contains over 1 million lines, and in the future it could contain billions of lines).
The problem is that it does not load all of the lines in the file.
I have run it several times, and each run loads a different number of lines (between 550,000 and 660,000, even though the file has more than 1 million).
How can I solve this and load all of the file's lines into the database in one go?
If this is a one-time job, you can split the file into pieces with
split --lines=100000 FILENAME
which turns a 1-million-line file into 10 files of 100,000 lines each.
You can also try raising max_allowed_packet in my.cnf to a higher value; you must restart the server for the change to take effect.
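As a sketch of the split approach (GNU coreutils; the file names are just examples):

```shell
# Sketch: split a sample file into 100-line pieces (GNU coreutils split).
seq 250 > sample.tsv
split --lines=100 --numeric-suffixes --additional-suffix=.tsv sample.tsv chunk_
wc -l chunk_*.tsv
# Each chunk_NN.tsv can then be loaded with its own
# LOAD DATA LOCAL INFILE 'chunk_NN.tsv' ... statement.
```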

PHPMyadmin tool slow in importing

One of my customers reports that importing a 1 MB CSV file takes far longer than usual (more than 30 minutes), which is abnormal. When using the format type "CSV using LOAD DATA", it throws an error.
SQL query
LOAD DATA INFILE '/tmp/phpzfQQR8' INTO TABLE `course1` FIELDS TERMINATED BY ';'
ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\r\n'
MySQL said:
1085 - The file '/tmp/phpzfQQR8' must be in the database directory or be readable by all
The plain CSV import does work, but it is abnormally slow. Could someone help me with this quickly?
Instead of "CSV using LOAD DATA", try just "CSV". The "CSV using LOAD DATA" format makes phpMyAdmin issue a LOAD DATA INFILE statement, which reads the file on the server side and therefore needs the FILE privilege and a server-readable file (hence error 1085); the plain "CSV" format parses the file in PHP and avoids that requirement.

skip header of csv file to be imported in mysql workbench

What I did is to type this query:
SELECT * FROM report;
then clicked Import Records from an External File in MySQL Workbench. It worked and the data was imported into my database. The problem is that the header row was imported as well.
What should I do to import everything except the header?
I've never really used MySQL Workbench, but the best way to import a .csv is with LOAD DATA INFILE.
Example:
LOAD DATA INFILE '{$file_path}' INTO TABLE {$table} FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES
Note that your LINES TERMINATED BY value may differ depending on how your .csv is formatted. It could also be '\n' or, less commonly, '\r'.

Easy way to import rows into MySQL from csv file

I have a csv file that corresponds to what our user table looks like. Essentially we need to do a bulk import, what would be a good way to do this in MySQL without writing a custom script that just issues an SQL insert?
You could use LOAD DATA INFILE, e.g.:
load data local infile 'uniq.csv' into table tblUniq
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
You can also use phpMyAdmin, which makes this very easy for developers: choose the file, select CSV, and load it.
You can download it from
http://www.phpmyadmin.net/home_page/downloads.php (install Apache along with it).
You can also use WampServer, which includes phpMyAdmin (a good option on Windows).