Skipped lines on MySQL LOAD DATA LOCAL

Here is an example of my CSV file:
direction,latitude,longitude,metrictimestamp,odometer,routecode,speed,device_deviceid,vehicle_vehicleid
180,-3.724404,-38.557694,20180201025934,161245809,0,0,148469,33089
0,-3.878595,-38.533493,20180201025955,19030291,0,0,8155064,34489
135,-3.744851,-38.545571,20180201025959,55697826,0,3,134680,32040
When I execute the import query, a huge number of lines are reported as skipped. Any ideas why these lines are being skipped?

You can use SHOW WARNINGS after LOAD DATA to see what went wrong. Note that by default it doesn't show all 35 million warnings, just the first few.
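For example, something along these lines lets you keep and page through more of the warnings (a minimal sketch; max_error_count controls how many warning messages the server stores per statement):
SET SESSION max_error_count = 1000;   -- keep more warnings than the default
-- ... run the LOAD DATA statement here ...
SHOW COUNT(*) WARNINGS;               -- total number of warnings generated
SHOW WARNINGS LIMIT 10;               -- inspect the first few of them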
That said, your data doesn't include anything for 'id', so that's probably at least part of the issue. Another is that you have ENCLOSED BY '"' even though your fields are not actually enclosed in double quotes.
See the LOAD DATA documentation for how to choose which fields to populate:
LOAD DATA LOCAL INFILE 'C:/Users/Public/Downloads/Dados.csv'
INTO TABLE data
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(direction, latitude, longitude, metrictimestamp,
odometer, routecode, speed, deviceid, vehicleid)
As id is AUTO_INCREMENT, it will be populated automatically.

Related

Can I LOAD DATA with ugly data that has escape characters and quotes?

I have two records, for example:
11,avec myName à EX,Ex,0,2021-06-25
22,"andone \"ttt\"",Ex,0,2021-06-25
I am trying to load this into a MySQL table with LOAD DATA, and every time I try it, the quotes get cut off or the backslashes don't show up.
I need to know if this is even possible. Can those two records go into a table and look exactly like they are in the CSV file?
I am using MySQL and trying
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example;
Use the ENCLOSED BY and ESCAPED BY options.
LOAD DATA LOCAL INFILE 'example.csv'
INTO TABLE example
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\';
LOAD DATA LOCAL INFILE 'example.csv' INTO TABLE example
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2, col3, col4, col5);
In MySQL 8 this may give you an error; please read
https://dev.mysql.com/doc/refman/8.0/en/load-data.html#load-data-file-location
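If the error you hit is the common "Loading local data is disabled" one, LOCAL loading usually has to be enabled on both the server and the client (this is an assumption about which error you are seeing; the linked page has the details):
SET GLOBAL local_infile = 1;  -- on the server (or local_infile=1 in my.cnf)
-- and start the client with LOCAL loading allowed, for example:
-- mysql --local-infile=1 -u user -p dbname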
Windows uses LINES TERMINATED BY '\r\n'; if the file comes from a Linux system, use LINES TERMINATED BY '\n'.
It is hard to tell without seeing the file which parameters would help exactly, so you should try some variants out.
An editor with hex-view capability also helps in such cases to analyse the structure.

Import big CSV file into MySQL

I am trying to import a 300 MB CSV file into a MySQL table. I am using this command:
LOAD DATA INFILE 'c:/csv/bigCSV.csv' IGNORE
INTO TABLE table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
And it works great with small files (1 MB and so on), but when I try to load a big file like the one mentioned, MySQL Workbench (which I use to execute my queries) runs the command, everything is OK and green, but 0 rows are affected. No changes at all in the table.
I am 10000% sure that the table is OK, because when I take a portion of that file, e.g. 1 MB, and load it into the same table, it works fine.
Has anyone had this kind of problem?
Thank you.
I have "solved" it. Don`t know why and I feel stupid for not playing with the statement earlier but like this:
LOAD DATA INFILE 'c:/csv/eventsbig.csv' IGNORE
INTO TABLE std9
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
Without the "IGNORE 1 LINES" at the end and it works with files of any size.
LOAD DATA LOW_PRIORITY LOCAL INFILE 'C:\\Learning\\App6_DBconnection\\CC.csv'
INTO TABLE `test`.`ccd`
CHARACTER SET armscii8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES (`Cd_Type`, `Full_Name`, `Billing_Date`);
This will work even for large data sets of more than 1.5 million records.

MySQL load data infile from CSV (fields optionally enclosed by quotes, numbers have comma thousand separators)

Sorry for the wordy title. I'm running the following query:
LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE tablename
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY ',\r\n'
I have a CSV with data in the following format:
1111, 2222, "32", "1,234.304", 1023.53,
All lines are terminated with an extra comma.
When attempting to load, I get an error when trying to put the "1,234.304" type field into a NUMERIC(12,4) column. I end up with 1.0000 and a warning that the data was truncated. I had expected 1234.304.
Edit: It seems that doing a regular insert for a value which has comma thousand-separators upsets MySQL. Is there a way to modify the load command for the desired behavior?
Break the problem down into two steps:
Get the data into the database
Get the data into the table/format you want it in
If you get all the data into a table with VARCHAR fields, you can easily manipulate it and explicitly convert the values into the types you want them in, rather than relying on whatever implicit conversion you get with LOAD DATA LOCAL INFILE.
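For example, a rough sketch of the two-step approach (the staging table and column names here are made up for illustration; the target column is assumed to be the NUMERIC(12,4) one from the question):
-- Step 1: load everything as plain text into a staging table
CREATE TABLE staging_raw (
    col1 VARCHAR(32),
    col2 VARCHAR(32),
    col3 VARCHAR(32),
    amount_raw VARCHAR(32),
    col5 VARCHAR(32)
);
LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE staging_raw
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY ',\r\n';
-- Step 2: convert explicitly while copying into the real table,
-- stripping the thousands separators before casting
INSERT INTO tablename (col1, col2, col3, amount, col5)
SELECT col1, col2, col3,
       CAST(REPLACE(amount_raw, ',', '') AS DECIMAL(12,4)),
       col5
FROM staging_raw;
Alternatively, LOAD DATA's own SET clause can do the same in one step, e.g. reading the field into a user variable and writing SET amount = REPLACE(@amount, ',', '').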

LOAD DATA LOCAL INFILE MySQL/PHP issue

Here is my MySQL table structure
id | tracking_number | order_id
Here is the structure of the CSV file:
(Sometimes the order_id is missing, and this seems to be causing issues)
"1R2689Y603406","33097"
"1R2689Y603404","33096"
"1R2689Y603414",
"1R2689Y603429","33093"
"1R2689Y603452",
Here is my current SQL Query which isn't working:
(The file is being uploaded, and is being read correctly, it's the query itself which is causing issues)
$sql = 'LOAD DATA LOCAL INFILE "'.$_FILES['my_file']['tmp_name'].'"
INTO TABLE table_tracking
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY "\""
LINES TERMINATED BY "\n"
(tracking_number,order_id)';
mysql_query($sql) or die(mysql_error());
What is wrong with my query? Thanks for any help!
Edit: Changed CSV structure to represent missing data that sometimes occurs.
Changed query to match the one I am now using
Just trying here (not 100% confident), but did you try adding the destination columns, like this:
mysql_query("LOAD DATA LOCAL INFILE '".$_FILES['my_file']['tmp_name']."'
INTO TABLE table_tracking FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(tracking_number,order_id)");
You have three columns in your table but are only providing two values.
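If it is the rows with a missing order_id that are the problem, one option (not from the original answer, just a sketch; the file path is a placeholder and it assumes the order_id column is nullable) is to read the field into a user variable and turn the empty value into NULL:
LOAD DATA LOCAL INFILE '/path/to/uploaded.csv'
INTO TABLE table_tracking
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(tracking_number, @order_id)
SET order_id = NULLIF(@order_id, '');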
This worked for me:
$sql = 'load data local infile "c:/users/ramon/desktop/1.csv"
into table test fields terminated by ","
optionally enclosed by "\""
lines terminated by "\n"';
mysql_query($sql) or die(mysql_error());
You also probably need to make sure that your third column has a default value.
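For instance (purely illustrative; the INT type is an assumption, since the question doesn't show the column definitions), allowing NULLs gives the column a usable default:
ALTER TABLE table_tracking MODIFY order_id INT NULL DEFAULT NULL;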
Instead of $_FILES['my_file']['tmp_name'], try
dirname(__FILE__)."/".$_FILES['my_file']['tmp_name']
dirname(__FILE__) will give the full directory path like 'C:/path/to/directory'.

Importing space-delimited data into MySQL

I have file in the following format:
`e00` `e01` `e02` `e03`
`e10` `e11` `e12` `e13`
Trying to import the data with
LOAD DATA INFILE 'file' INTO TABLE foo FIELDS TERMINATED BY ' ' ENCLOSED BY '`'
only seems to get the first 3 fields of each line. Is there a way to load the data without altering the file format?
Let's all jump in the way-back machine to answer a 5-year-old question!
The fact that the last item is not being loaded is a big hint. According to the manual:
If LINES TERMINATED BY is an empty string and FIELDS TERMINATED BY is
nonempty, lines are also terminated with FIELDS TERMINATED BY.
So, it's looking for a trailing space at the end of the line.
You could add a space to the end of each line in your input file, or try LINES TERMINATED BY '\n'.
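Putting it together, something along these lines should pick up all four fields per line (assuming the file uses Unix newlines; 'file' and foo are the names from the question):
LOAD DATA INFILE 'file' INTO TABLE foo
FIELDS TERMINATED BY ' ' ENCLOSED BY '`'
LINES TERMINATED BY '\n';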