MySQL load infile from csv NULL last column

I have a .csv file that contains six columns of data. Occasionally, the last column contains a NULL value, like so:
2014-07-11 23:55:00,1,245,0.05,0.01,0.0003
2014-07-11 23:57:00,1,245,0.05,0.01,\N
2014-01-17 20:14:00,2,215,0.05,0.009,0.002
I'm attempting to load this into a local database, so from the MySQL console I run the following code:
LOAD DATA LOCAL INFILE 'C:/<redacted>/data.csv'
INTO TABLE tbl_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
This method does work just fine when the column that contains the NULL value is moved so that it isn't in the last column, but it throws warnings back at me when the data is formatted as shown above.
Query OK, 71735 rows affected, 49 warnings
Warning: Incorrect decimal value: '\N\r' for column 'energy' at row 5253
I followed advice from this thread and attempted to use a user variable, but that didn't appease the MySQL gods either.
LOAD DATA LOCAL INFILE 'C:/<redacted>/data.csv'
INTO TABLE tbl_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(time, device, voltage, amps, power, @energy)
SET energy = nullif(@energy,'\N');
Oh, and the table structure is set up like so, and I'm running MySQL 5.6.17:
+---------+--------------+------+---------+
| NAME    | TYPE         | NULL | DEFAULT |
+---------+--------------+------+---------+
| time    | datetime     | No   | None    |
| device  | tinyint(1)   | No   | None    |
| voltage | smallint(3)  | No   | None    |
| amps    | decimal(6,3) | No   | None    |
| power   | decimal(6,3) | No   | None    |
| energy  | decimal(7,4) | Yes  | NULL    |
+---------+--------------+------+---------+
What am I doing wrong here?

Rimas was correct: the last clause in my initial statement should have been:
LINES TERMINATED BY '\r\n'
The relevant documentation is worded as follows:
If you have generated the text file on a Windows system, you might have to use LINES TERMINATED BY '\r\n' to read the file properly, because Windows programs typically use two characters as a line terminator. Some programs, such as WordPad, might use \r as a line terminator when writing files. To read such files, use LINES TERMINATED BY '\r'.
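With the terminator fixed, the \N in the last field is read as NULL natively, so the user-variable workaround isn't needed at all. A minimal sketch of the corrected statement (same path and table as above):
LOAD DATA LOCAL INFILE 'C:/<redacted>/data.csv'
INTO TABLE tbl_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';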

Related

MySQL "LOAD DATA INFILE" query, issue with "TERMINATED BY"

I have a hashes table with 2 columns, hash | plain,
and a text file that looks like this:
acbd18db4cc2f85cedef654fccc4a4d8:foo
37b51d194a7513e45b56f6524f2d51f2:bar
4e99e8c12de7e01535248d2bac85e732:foo:bar
I'm trying to execute this query:
LOAD DATA LOCAL INFILE 'file.txt' INTO TABLE hashes COLUMNS TERMINATED BY ':' LINES TERMINATED BY '\n'
The issue is, for the hash 4e99e8c12de7e01535248d2bac85e732, it will only insert foo, not foo:bar, because of COLUMNS TERMINATED BY ':'.
How can I make it "only split once" to fix this issue?
You could load into a user variable and use a bit of string manipulation.
drop table if exists t;
create table t
(hash varchar(100),plain varchar(100));
LOAD DATA INFILE 'file.txt'
INTO TABLE t
LINES TERMINATED BY '\r\n'
(@var)
set
hash = substring_index(@var,':',1),
plain = replace(@var,substring_index(@var,':',1),'')
;
select *
from t;
+------+----------+
| hash | plain    |
+------+----------+
| abc  | :def     |
| abc  | :ghi     |
| abc  | :def:ghi |
+------+----------+
3 rows in set (0.001 sec)
Note that I have used '\r\n' to load this properly; you should test for your environment.
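If the leading ':' kept in plain is unwanted, a variation on the same idea takes everything after the first colon instead (a sketch against the same table and file); it also sidesteps a subtle pitfall of REPLACE(), which would strip any later occurrence of the hash string as well:
LOAD DATA INFILE 'file.txt'
INTO TABLE t
LINES TERMINATED BY '\r\n'
(@var)
set
hash = substring_index(@var,':',1),
plain = substring(@var, locate(':',@var) + 1);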

How to import some columns of csv file to mysql phpmyadmin [duplicate]

I have a 350MB file named text_file.txt containing this tab delimited data:
345868230 1646198120 1531283146 Keyword_1531283146 1.55 252910000
745345566 1646198120 1539847239 another_1531276364 2.75 987831000
...
MySQL Database name: Xml_Date
Database table: PerformanceReport
I have already created the table with all the destination fields.
I want to import this text file data into MySQL. I googled and found some commands like LOAD DATA INFILE, but I'm quite confused about how to use them.
How can I import this text file data?
It should be as simple as...
LOAD DATA INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport;
By default LOAD DATA INFILE uses tab delimited, one row per line, so should take it in just fine.
Walkthrough on using MySQL's LOAD DATA command:
Create your table:
CREATE TABLE foo(myid INT, mymessage VARCHAR(255), mydecimal DECIMAL(8,4));
Create your tab delimited file (note there are tabs between the columns):
1 Heart disease kills 1.2
2 one out of every two 2.3
3 people in America. 4.5
Use the load data command:
LOAD DATA LOCAL INFILE '/tmp/foo.txt'
INTO TABLE foo COLUMNS TERMINATED BY '\t';
If you get a warning that this command can't be run, then you have to enable the --local-infile=1 parameter described here: How can I correct MySQL Load Error
The rows get inserted:
Query OK, 3 rows affected (0.00 sec)
Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
Check if it worked:
mysql> select * from foo;
+------+----------------------+-----------+
| myid | mymessage            | mydecimal |
+------+----------------------+-----------+
|    1 | Heart disease kills  |    1.2000 |
|    2 | one out of every two |    2.3000 |
|    3 | people in America.   |    4.5000 |
+------+----------------------+-----------+
3 rows in set (0.00 sec)
How to specify which columns to load your text file columns into:
Like this:
LOAD DATA LOCAL INFILE '/tmp/foo.txt' INTO TABLE foo
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1,@col2,@col3) set myid=@col1,mydecimal=@col3;
The file contents get put into variables @col1, @col2, @col3. myid gets column 1, and mydecimal gets column 3. If this were run, it would leave out the second column (mymessage):
mysql> select * from foo;
+------+-----------+-----------+
| myid | mymessage | mydecimal |
+------+-----------+-----------+
|    1 | NULL      |    1.2000 |
|    2 | NULL      |    2.3000 |
|    3 | NULL      |    4.5000 |
+------+-----------+-----------+
3 rows in set (0.00 sec)
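Conversely, here is a sketch that loads all three columns and transforms one on the way in (the UPPER() call is only an illustration):
LOAD DATA LOCAL INFILE '/tmp/foo.txt' INTO TABLE foo
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1,@col2,@col3) set myid=@col1, mymessage=upper(@col2), mydecimal=@col3;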
If your file is delimited by something other than tabs, you should specify it like...
LOAD DATA LOCAL
INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport
COLUMNS TERMINATED BY '\t' ## This should be your delimiter
OPTIONALLY ENCLOSED BY '"'; ## ...and if text is enclosed, specify here
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed.
LOAD DATA INFILE '/tmp/test.txt'
INTO TABLE test
FIELDS TERMINATED BY ','
LINES STARTING BY 'xxx';
If the data file looks like this:
xxx"abc",1
something xxx"def",2
"ghi",3
The resulting rows will be ("abc",1) and ("def",2). The third row in the file is skipped because it does not contain the prefix.
LOAD DATA INFILE 'data.txt'
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
You can also load data files by using the mysqlimport utility; it operates by sending a LOAD DATA INFILE statement to the server:
mysqlimport -u root -ptmppassword --local test employee.txt
test.employee: Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
You should set the option:
local-infile=1
in the [mysql] section of your my.cnf file, or call the mysql client with the --local-infile option:
mysql --local-infile -uroot -pyourpwd yourdbname
You have to be sure that the same parameter is defined in your [mysqld] section too, to enable the "local infile" feature server side.
It's a security restriction.
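Putting the two pieces together, the relevant my.cnf entries would look something like this:
[mysql]
local-infile=1

[mysqld]
local-infile=1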
LOAD DATA LOCAL INFILE '/softwares/data/data.csv' INTO TABLE tableName;
LOAD DATA INFILE '/home/userlap/data2/worldcitiespop.txt' INTO TABLE cc FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
IGNORE 1 LINES skips over the initial header line containing column names.
FIELDS TERMINATED BY ',' reads the comma-delimited file.
If you have generated the text file on a Windows system, you might have to use LINES TERMINATED BY '\r\n' to read the file properly, because Windows programs typically use two characters as a line terminator. Some programs, such as WordPad, might use \r as a line terminator when writing files. To read such files, use LINES TERMINATED BY '\r'.
For me, just adding the LOCAL keyword did the trick: the same LOAD DATA statement that raised the error ran fine once it was written as LOAD DATA LOCAL INFILE.
Make sure your local_infile variable is set to true (ON):
mysql> show global variables like 'local_infile';
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| local_infile  | OFF   |
+---------------+-------+
1 row in set (0.04 sec)
mysql> set global local_infile=true;
Query OK, 0 rows affected (0.01 sec)
Find the correct path in which to store the txt files for loading into SQL tables:
mysql> SELECT @@GLOBAL.secure_file_priv;
+------------------------------------------------+
| @@GLOBAL.secure_file_priv                      |
+------------------------------------------------+
| C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\ |
+------------------------------------------------+
1 row in set (0.00 sec)
Load using LOAD DATA INFILE from that path (use forward slashes in the path):
mysql> load data infile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/text_file.txt'
    -> into table TABLE_NAME fields terminated by '\t' lines terminated by '\n';
1. If it's a tab-delimited txt file:
LOAD DATA LOCAL INFILE 'D:/MySQL/event.txt' INTO TABLE event
LINES TERMINATED BY '\r\n';
2. Otherwise:
LOAD DATA LOCAL INFILE 'D:/MySQL/event.txt' INTO TABLE event
FIELDS TERMINATED BY 'x'  -- here x could be a comma ',', tab '\t', semicolon ';', or space ' '
LINES TERMINATED BY '\r\n';

MySQL - Load Data In File Returning Errors For Date Fields

I am trying to load data from a .CSV file into a MySQL database using the following command.
LOAD DATA LOCAL INFILE "FILEPATH/EQUIPMENT.CSV"
INTO TABLE EQUIPMENT
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\R\N'
IGNORE 1 LINES;
However the following error is returned by MySQL when attempting to load information into the database:
Error Code: 1292. Incorrect date value: '0000-00-00' for column 'END_OF_LIFE' at row 1
This error doesn't make any sense, as the value '0000-00-00' does not appear in row one of the data set. See the example data set below.
+--------------+-------------+
| EQUIPMENT_ID | END_OF_LIFE |
+--------------+-------------+
| B1010-V003   | 1800-01-01  |
| B1010-V001   | 1800-01-01  |
| B1010-V005   | 1800-01-01  |
+--------------+-------------+
Does anyone know why the system may be returning this error? (I have tried changing the dates to more reasonable values, but that does not work either.)
try this:
/usr/bin/mysql --local-infile=1 -u$DB_user -p$DB_pw --database=$DB_database < $working_dir/load_ic2_loc_batch.sh 2>> $logfile
I actually figured out what was wrong. It was due to the line endings being incorrectly specified as capitals in my script (escape sequences in MySQL string literals are case sensitive, so '\R\N' does not match the MS-DOS '\r\n' line endings in the input CSV file). See the corrected script below.
LOAD DATA LOCAL INFILE "FILEPATH/EQUIPMENT.CSV"
INTO TABLE EQUIPMENT
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;

MySQL CSV import: datetime value

I have been banging my head against a wall trying to import datetime values from a .csv file.
Here's the import statement.
LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,@last_date)
SET last_date = STR_TO_DATE(@last_date, '%Y-%m-%d %H:%i:%s');
Here's a sample of the data:
4108,48.74,"2013-09-16 16:15:04"
4249,8.1,"2013-09-16 16:15:04"
4197,3.81,"2013-09-16 17:20:00"
4139,26.81,"2013-09-16 16:15:04"
4218,24.83,"2013-09-16 17:20:00"
4260,79.72,"2013-09-16 16:15:04"
4270,450.12,"2013-09-16 17:20:00"
4242,30.38,"2013-09-16 16:15:04"
4193,1.42,"2013-09-16 16:15:04"
4134,3.77,"2013-09-16 16:15:04"
I am able to import date values using STR_TO_DATE(), but I am not able to get datetime values to import. I have tried several date formats other than '%Y-%m-%d %H:%i:%s' and I always get a null datetime [0000-00-00 00:00:00]. I have also tried not using STR_TO_DATE() at all, since the string is in the default MySQL datetime format.
Any help will be appreciated.
I ran into the same problem. I fixed it by changing the format for the date column in my CSV file to match the MySQL datetime format.
Open CSV in Excel.
Highlight the column.
Right-click on the column.
Click on Format Cells.
Pick Custom.
Use yyyy/mm/dd hh:mm:ss in the Type field.
Click OK.
My CSV successfully imported after I changed the datetime format as above.
The date in your data file is already in a format MySQL should natively understand. It's just enclosed in double quotes. You need to tell LOAD DATA INFILE how to deal with the quotes. Try something like this:
LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,last_date)
Update:
Since you've said it doesn't work, I created a test table and verified that it does work. Here's the proof:
I copied your csv data from the question and pasted it into a new file called myData.csv in my system's /tmp folder. Then I connected to the mysql console, switched to the test database, and ran the following:
mysql> create table equity_last_import (equity int, last decimal(10,2), last_date datetime) engine=innodb;
Query OK, 0 rows affected (0.02 sec)
mysql> LOAD DATA LOCAL INFILE '/tmp/myData.csv'
-> INTO TABLE equity_last_import
-> FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
-> LINES TERMINATED BY '\n'
-> (equity,last,last_date);
Query OK, 10 rows affected (0.00 sec)
Records: 10 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select * from equity_last_import;
+--------+--------+---------------------+
| equity | last   | last_date           |
+--------+--------+---------------------+
|   4108 |  48.74 | 2013-09-16 16:15:04 |
|   4249 |   8.10 | 2013-09-16 16:15:04 |
|   4197 |   3.81 | 2013-09-16 17:20:00 |
|   4139 |  26.81 | 2013-09-16 16:15:04 |
|   4218 |  24.83 | 2013-09-16 17:20:00 |
|   4260 |  79.72 | 2013-09-16 16:15:04 |
|   4270 | 450.12 | 2013-09-16 17:20:00 |
|   4242 |  30.38 | 2013-09-16 16:15:04 |
|   4193 |   1.42 | 2013-09-16 16:15:04 |
|   4134 |   3.77 | 2013-09-16 16:15:04 |
+--------+--------+---------------------+
10 rows in set (0.00 sec)
See? It works perfectly.
Another Update:
You've specified that you're getting the following error now:
Out of range value for column 'last_date' at row 1
Does your CSV file have a header? If so, you may want to add IGNORE 1 LINES to your LOAD DATA INFILE command to tell MySQL to skip over the header.
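That is, something along these lines (the same statement as above, with the header line skipped):
LOAD DATA LOCAL INFILE '/tmp/myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(equity,last,last_date);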
I pulled my hair out over this too, because I'm not importing in the way suggested above.
Workaround: I created a temporary column temp_date of type VARCHAR on the import table and had no problems loading the data. I was then able to run an update on my date column, which is of type date:
update table
set date = temp_date;
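Spelled out against the equity_last_import table from this question, that workaround might look like the following sketch (the VARCHAR length is arbitrary, and the TRIM call assumes the double quotes come through as literal characters, which is what happens when no ENCLOSED BY clause is given):
ALTER TABLE equity_last_import ADD COLUMN temp_date VARCHAR(32);
LOAD DATA LOCAL INFILE '/tmp/myData.csv'
INTO TABLE equity_last_import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity, last, temp_date);
-- convert the staged strings into the real datetime column, then drop the helper
UPDATE equity_last_import
SET last_date = STR_TO_DATE(TRIM(BOTH '"' FROM temp_date), '%Y-%m-%d %H:%i:%s');
ALTER TABLE equity_last_import DROP COLUMN temp_date;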
I was having the same trouble, and here's what I discovered; I think it might help you.
The problem was a GMT conflict:
The database I extracted the .csv file from was on GMT 00:00, so there was no daylight saving time.
My local server (the one I was trying to insert the .csv file into) was running on my computer's (system's) GMT by default. In my case it was GMT -3, which is affected by daylight saving time.
MySQL treats the clock times that are skipped when daylight saving time begins as invalid. In my case that was October 20, and all the records between 0:00 and 1:00 AM of that day simply weren't recognized by the server, even though the format was correct, because those times simply 'didn't exist' in the local time zone (see more here). This ended up affecting all my timestamp records, not only the ones around the daylight saving change.
My solution was to set the local server to GMT +00:00 before creating the new table and importing the .csv, using:
SET time_zone='+00:00';
Then, when I imported the .csv file, all the records were read properly. I think that if you set the time zone equal to that of the server that generated the .csv file, it should work!
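For example, a minimal sketch reusing the statement from the accepted answer (the '+00:00' offset stands in for whatever the exporting server used):
SET time_zone = '+00:00';
LOAD DATA LOCAL INFILE '/tmp/myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,last_date);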
Open CSV in Excel.
Highlight the column.
Right-click on the column.
Click on Format Cells.
Pick Custom.
Use the same format as the MySQL date format: yyyy-mm-dd
Click OK and save.
It works fine for me.
