Import big CSV file into MySQL - mysql

I am trying to import a 300 MB CSV file into a MySQL table. I am using this command:
LOAD DATA INFILE 'c:/csv/bigCSV.csv' IGNORE
INTO TABLE `table`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
It works great with small files (1 MB and so on), but when I try to load a big file like the one mentioned, MySQL Workbench (which I use to execute my queries) runs the command, everything looks OK and green, but 0 rows are affected. No changes at all in the table.
I am 10000% sure that the table is OK, because when I take a portion of that file (e.g. 1 MB) and load it into the same table, it works fine.
Has anyone had this kind of problem?
Thank you.

I have "solved" it. Don`t know why and I feel stupid for not playing with the statement earlier but like this:
LOAD DATA INFILE 'c:/csv/eventsbig.csv' IGNORE
INTO TABLE std9
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
Without the "IGNORE 1 LINES" at the end, it works with files of any size.
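Note that besides dropping IGNORE 1 LINES, the working statement also changed the line terminator from '\r\n' to '\n'. If the big file actually uses Unix '\n' line endings, a terminator of '\r\n' never matches, MySQL then treats the whole file as a single line, and IGNORE 1 LINES skips that one line, which would explain the "0 rows affected" result. A minimal sketch that keeps the header-skipping clause (table name and path reused from the question, so adjust to your setup):
-- Sketch: keep IGNORE 1 LINES, but match the file's actual Unix line endings.
LOAD DATA INFILE 'c:/csv/bigCSV.csv' IGNORE
INTO TABLE `table`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;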

LOAD DATA LOW_PRIORITY LOCAL INFILE 'C:\\Learning\\App6_DBconnection\\CC.csv'
INTO TABLE `test`.`ccd`
CHARACTER SET armscii8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES (`Cd_Type`, `Full_Name`, `Billing_Date`);
This will work even for large data sets of more than 1.5 million records.
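Note that LOAD DATA LOCAL INFILE only works when local-infile support is enabled on both the server and the client; the exact settings depend on the MySQL version and client, so treat the following as a sketch:
-- Check whether the server allows LOCAL loads, and enable it if needed
-- (requires the appropriate privilege; the client must also allow it,
-- e.g. start the mysql client with --local-infile=1).
SHOW VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;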

Related

Can I LOAD DATA with ugly data that has escape characters and quotes?

I have two records, for example, like this:
11,avec myName à EX,Ex,0,2021-06-25
22,"andone \"ttt\"",Ex,0,2021-06-25
I am trying to load this into a MySQL table with LOAD DATA, and every time I try it the quotes get cut off or the backslashes don't show up.
I need to know if this is even possible. Can those two records go into a table and look exactly like they do in the CSV file?
I am using MySQL and trying:
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example;
Use the ENCLOSED BY and ESCAPED BY options.
LOAD DATA LOCAL INFILE 'example.csv'
INTO TABLE example
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\';
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2, col3, col4, col5);
In MySQL 8 this may get you an error; please read
https://dev.mysql.com/doc/refman/8.0/en/load-data.html#load-data-file-location
Windows uses LINES TERMINATED BY '\r\n';
if the file comes from a Linux system, use LINES TERMINATED BY '\n'.
It is hard to tell without seeing the file which parameters would help exactly, so you should try some variants out.
Also, an editor with hex-view capability helps in such cases to analyse the structure.
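As an end-to-end illustration, a hypothetical table for the five fields in the sample rows (the column names and types below are assumptions, not from the original post) should load with the inner quotes preserved:
-- Hypothetical table matching the five fields in the sample rows.
CREATE TABLE example (
  id       INT,
  label    VARCHAR(100),
  code     VARCHAR(10),
  flag     TINYINT,
  added_on DATE
);

-- With OPTIONALLY ENCLOSED BY and ESCAPED BY, "andone \"ttt\"" is stored
-- with its inner quotes intact (assuming Unix '\n' line endings).
LOAD DATA LOCAL INFILE 'example.csv'
INTO TABLE example
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n';

-- Quick check of what actually landed in the table.
SELECT * FROM example;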

MySQL COLUMNS TERMINATED BY DEL character?

I am trying to load large text files that are delimited by the DEL character. In ColdFusion I can use chr(127), but that doesn't seem to work here:
LOAD DATA LOCAL INFILE "C:\\inetpub\\vhosts\\mysite.com\\dataload.data"
INTO TABLE tbl_data
COLUMNS TERMINATED BY '\0x7F'
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
I have been searching and trying a number of things that look like they could be the DEL character. Does anyone have any suggestions for the COLUMNS TERMINATED BY line?
After a lot of searching, I found that my original statement was close. The answer is:
COLUMNS TERMINATED BY X'7F'
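Putting that back into the original statement (path and table name taken from the question), the full command would look roughly like this:
-- Same statement as above, with the DEL delimiter written as a hex literal.
LOAD DATA LOCAL INFILE "C:\\inetpub\\vhosts\\mysite.com\\dataload.data"
INTO TABLE tbl_data
COLUMNS TERMINATED BY X'7F'
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;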

Load Data Local Infile SQL

I have an extract of data whose fields are separated by this character: |
I need to load this data into a MySQL database table, but I'm struggling with the LINES TERMINATED BY clause.
Can anyone please help me identify the correct method to terminate these fields? My example data is below.
I've already tried '"|"' but this doesn't work for the first and last fields.
Example:
00000000|OLD TRUCK|9|13|02|Z |Z |9999|111|99|ZZ|ZZ|ZZ|ZZ||ZZ|ZZ|99|999|2
Try something like this:
LOAD DATA LOCAL INFILE 'example.txt' INTO TABLE example_table
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
This worked for me
LOAD DATA LOCAL INFILE 'd:/test.csv' INTO TABLE test
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
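If the extract was produced on Windows, each line may actually end in '\r\n'; with LINES TERMINATED BY '\n' the stray '\r' then ends up in the last column. A hedged variant for that case (path and table name reused from the answer above):
-- Variant for a Windows-produced extract: consume the carriage return too.
LOAD DATA LOCAL INFILE 'd:/test.csv' INTO TABLE test
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\r\n';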

CSV import omits a field

I'm trying to import a table from a CSV file in phpMyAdmin.
This is my table:
1 |Evan|Chigur |Male|1987-05-25|codefiscale0123|Via Calatafini 17, Palermo, 23451|user#user.it|marco_r|ee11cbb19052e40b07aac0ca060c23ee
22 |Hank|Zappasorca|Male|1987-05-25|codefiscale0123|Via Calatafini 17, Palermo, 23451|user#user.it|marco_r|ee11cbb19052e40b07aac0ca060c23ee
When I try to fill my table with this, the first field is omitted.
I've already checked the correspondence of the fields between the CSV and the MySQL table.
Can anyone help me?
I use this query:
LOAD DATA LOCAL INFILE 'C:\\wamp\\tmp\\php7E3C.tmp' REPLACE INTO TABLE `bw_users`
FIELDS TERMINATED BY '|'
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '*'
How does this work for you?
LOAD DATA INFILE 'C:\\wamp\\tmp\\php7E3C.tmp' REPLACE INTO TABLE bw_users
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
For Windows, change to:
LINES TERMINATED BY '\r\n'
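If the first field still gets dropped, explicitly listing the target columns rules out a column-mapping problem. The column names below are hypothetical placeholders, since the real schema isn't shown:
-- Explicit column list (hypothetical names) so the first CSV field
-- is mapped to a column instead of being silently discarded.
LOAD DATA INFILE 'C:\\wamp\\tmp\\php7E3C.tmp' REPLACE INTO TABLE bw_users
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\r\n'
(id, first_name, last_name, gender, birth_date, fiscal_code, address, email, username, password_hash);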

MySQL LINES TERMINATED BY

What are the potential choices for LINES TERMINATED BY in MySQL?
I have a CSV file that I am trying to upload, and neither \r\n, \n, nor \r works.
Try inspecting the file using a binary-displaying utility like od (http://swoolley.org/man.cgi/1/od). That will show you immediately how your lines are terminated.
I agree with brian.
The query which I used is as follows.
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE test_field
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
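When none of the terminators seem to work, it also helps to look at what MySQL itself reports right after the load; a quick diagnostic sketch:
-- Run immediately after the LOAD DATA statement above.
SHOW WARNINGS;                    -- per-row truncation / field-count messages
SELECT COUNT(*) FROM test_field;  -- how many rows actually arrived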