MySQL Data Import with secure_file_priv issue

I've got a newbie question about MySQL data import.
I am trying to import a .csv file into my MySQL database, but I get the "secure_file_priv" error. I spent a whole day googling and experimenting, and found that the value of secure_file_priv is '/var/lib/mysql-files/' (I am on CentOS 8). I placed the CSV file in that directory, but it keeps showing the same error.
My code:
mysql> show variables like "secure_file_priv";
+------------------+-----------------------+
| Variable_name    | Value                 |
+------------------+-----------------------+
| secure_file_priv | /var/lib/mysql-files/ |
+------------------+-----------------------+
1 row in set (0.01 sec)
mysql> load data infile '/var/lib/mysql-files/master.clientlist.csv'
-> into table agentcompany_compare
-> fields terminated by ','
-> lines terminated by '\n'
-> ignore 1 rows;
ERROR 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement

New progress:
I have no clue why, but now it "works", except that no data is actually written into MySQL. Please see my code:
mysql> load data infile '/var/lib/mysql-files/clientlist.csv' into table agentcompany_compare
    -> fields terminated by ','
    -> lines terminated by ';\n'
    -> ignore 1 rows;
Query OK, 0 rows affected (0.01 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
I've tried a lot; for your reference:
mysql> load data infile '/var/lib/mysql-files/clientlist.csv' into table agentcompany_compare
    -> fields terminated by ','
    -> lines terminated by ';\n';
ERROR 1300 (HY000): Invalid utf8mb4 character string:
'idagentcompany acid acname actel acaddress
logo
1 1 "
'
mysql> load data infile '/var/lib/mysql-files/clientlist.csv' into table agentcompany_compare
    -> fields terminated by ','
    -> lines terminated by '\r\n';
ERROR 1366 (HY000): Incorrect integer value:
'idagentcompany acid acname actel acaddress
logo' for column 'idagentcompany' at row 1
Tough problem.

Why isn't all the data getting loaded into my MySQL table?

So I have a file of Twitter data that looks like this
Robert_Aderholt^&^&^2013-06-12 18:32:02^&^&^RT #financialcmte: In 2012, the Obama Admin published 1,172 new regulations totaling 79,000 pages. 57 were expected to have costs of at...
Robert_Aderholt^&^&^2013-06-12 13:42:09^&^&^The Administration's idea of a 'recovery' is 4 million fewer private sector jobs than the average post WWII recovery http://t.co/gSVW0Q8MYK
Robert_Aderholt^&^&^2013-06-11 13:51:17^&^&^As manufacturing jobs continue to decrease, its time to open new markets #4Jobs http://t.co/X2Mswr1i43
(The ^&^&^ words are separators, and I chose that separator because it's unlikely to occur in any of the tweets.)
This file is 90663 lines long (I checked by typing "wc -l tweets_parsed-6-12.csv").
However, when I load them into the table, I only get a table with 40456 entries:
mysql> source ../code/tweets2tables.sql;
Query OK, 0 rows affected (0.03 sec)
Query OK, 0 rows affected (0.08 sec)
Query OK, 40456 rows affected, 2962 warnings (0.81 sec)
Records: 40456 Deleted: 0 Skipped: 0 Warnings: 2962
mysql> SELECT COUNT(*) FROM tweets;
+----------+
| COUNT(*) |
+----------+
|    40456 |
+----------+
1 row in set (0.02 sec)
Why is that? I deleted all lines that didn't contain ^&^&^, so I didn't think there was any funny business going on with the data, but I could be wrong.
My loading code is
DROP TABLE IF EXISTS tweets;
CREATE TABLE tweets (
twitter_id VARCHAR(20),
post_date DATETIME,
body VARCHAR(140)
);
LOAD DATA
LOCAL INFILE 'tweets_parsed-6-12.csv'
INTO TABLE tweets
FIELDS TERMINATED BY '^&^&^'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(twitter_id, post_date, body);
The lines that weren't loaded probably contained the " character. If you specify that your fields are enclosed by ", quotes inside a field must be escaped like this: "" (doubled quotes).
The OPTIONALLY keyword before ENCLOSED may help.
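A sketch of the statement with that suggestion applied, using the same table, file, and column list as the question:
LOAD DATA
LOCAL INFILE 'tweets_parsed-6-12.csv'
INTO TABLE tweets
FIELDS TERMINATED BY '^&^&^'
OPTIONALLY ENCLOSED BY '"'  -- the suggested change
LINES TERMINATED BY '\n'
(twitter_id, post_date, body);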

How to import data from text file to mysql database

I have a 350MB file named text_file.txt containing this tab delimited data:
345868230 1646198120 1531283146 Keyword_1531283146 1.55 252910000
745345566 1646198120 1539847239 another_1531276364 2.75 987831000
...
MySQL Database name: Xml_Date
Database table: PerformanceReport
I have already created the table with all the destination fields.
I want to import this text file's data into MySQL. I googled and found some commands like LOAD DATA INFILE, but I'm quite confused about how to use them.
How can I import this text file data?
It should be as simple as...
LOAD DATA INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport;
By default, LOAD DATA INFILE expects tab-delimited input, one row per line, so it should take the file in just fine.
Walkthrough on using MySQL's LOAD DATA command:
Create your table:
CREATE TABLE foo(myid INT, mymessage VARCHAR(255), mydecimal DECIMAL(8,4));
Create your tab delimited file (note there are tabs between the columns):
1 Heart disease kills 1.2
2 one out of every two 2.3
3 people in America. 4.5
Use the load data command:
LOAD DATA LOCAL INFILE '/tmp/foo.txt'
INTO TABLE foo COLUMNS TERMINATED BY '\t';
If you get a warning that this command can't be run, then you have to enable the --local-infile=1 parameter described here: How can I correct MySQL Load Error
The rows get inserted:
Query OK, 3 rows affected (0.00 sec)
Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
Check if it worked:
mysql> select * from foo;
+------+----------------------+-----------+
| myid | mymessage            | mydecimal |
+------+----------------------+-----------+
|    1 | Heart disease kills  |    1.2000 |
|    2 | one out of every two |    2.3000 |
|    3 | people in America.   |    4.5000 |
+------+----------------------+-----------+
3 rows in set (0.00 sec)
How to specify which table columns your file's columns are loaded into:
Like this:
LOAD DATA LOCAL INFILE '/tmp/foo.txt' INTO TABLE foo
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1, @col2, @col3) SET myid = @col1, mydecimal = @col3;
The file contents get put into the user variables @col1, @col2, and @col3; myid gets column 1, and mydecimal gets column 3. If this were run, the second file column would be discarded and mymessage left NULL:
mysql> select * from foo;
+------+-----------+-----------+
| myid | mymessage | mydecimal |
+------+-----------+-----------+
|    1 | NULL      |    1.2000 |
|    2 | NULL      |    2.3000 |
|    3 | NULL      |    4.5000 |
+------+-----------+-----------+
3 rows in set (0.00 sec)
If your file is separated by something other than tabs, specify the delimiter explicitly:
LOAD DATA LOCAL
INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport
COLUMNS TERMINATED BY '\t'   -- this should be your delimiter
OPTIONALLY ENCLOSED BY '"';  -- ...and if text is enclosed, specify here
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed.
LOAD DATA INFILE '/tmp/test.txt'
INTO TABLE test
FIELDS TERMINATED BY ','
LINES STARTING BY 'xxx';
If the data file looks like this:
xxx"abc",1
something xxx"def",2
"ghi",3
The resulting rows will be ("abc",1) and ("def",2). The third row in the file is skipped because it does not contain the prefix.
For a CSV with comma-separated, quote-enclosed fields and Windows line endings:
LOAD DATA INFILE 'data.txt'
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
You can also load data files by using the mysqlimport utility; it operates by sending a LOAD DATA INFILE statement to the server, deriving the table name from the data file's name (here, employee.txt loads table employee):
mysqlimport -u root -ptmppassword --local test employee.txt
test.employee: Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
You should set the option:
local-infile=1
in the [mysql] section of your my.cnf file, or call the mysql client with the --local-infile option:
mysql --local-infile -uroot -pyourpwd yourdbname
You also have to be sure that the same parameter is defined in your [mysqld] section, to enable the "local infile" feature server side.
It's a security restriction.
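For reference, a minimal sketch of the relevant my.cnf entries (the file's location varies by platform; /etc/my.cnf is typical on Linux):
[mysql]
local-infile=1

[mysqld]
local-infile=1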
LOAD DATA LOCAL INFILE '/softwares/data/data.csv' INTO TABLE tableName;
LOAD DATA INFILE '/home/userlap/data2/worldcitiespop.txt' INTO TABLE cc FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
IGNORE 1 LINES skips over an initial header line containing column names.
FIELDS TERMINATED BY ',' reads the comma-delimited file.
If you have generated the text file on a Windows system, you might have to use LINES TERMINATED BY '\r\n' to read the file properly, because Windows programs typically use two characters as a line terminator. Some programs, such as WordPad, might use \r as a line terminator when writing files. To read such files, use LINES TERMINATED BY '\r'.
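If you are not sure which terminator a file actually uses, dumping its first bytes makes \n, \r\n, or \r visible. For example, for a hypothetical file whose lines are id,name / 1,abc / 2,def with CRLF endings:
$ od -c file.csv | head -2
0000000   i   d   ,   n   a   m   e  \r  \n   1   ,   a   b   c  \r  \n
0000020   2   ,   d   e   f  \r  \n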
For me, just adding the LOCAL keyword did the trick: the statement that raised this error ran fine once written as LOAD DATA LOCAL INFILE.
Make sure your local_infile variable is set to ON:
mysql> show global variables like 'local_infile';
+---------------+-------+
| Variable_name | Value |
+---------------+-------+
| local_infile  | OFF   |
+---------------+-------+
1 row in set (0.04 sec)
mysql> set global local_infile=true;
Query OK, 0 rows affected (0.01 sec)
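Note that SET GLOBAL does not survive a server restart. On MySQL 8.0 you can persist the setting instead:
mysql> SET PERSIST local_infile = 1;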
Find the correct path under which to store the files for loading into SQL tables:
mysql> SELECT @@GLOBAL.secure_file_priv;
+------------------------------------------------+
| @@GLOBAL.secure_file_priv                      |
+------------------------------------------------+
| C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\ |
+------------------------------------------------+
1 row in set (0.00 sec)
Load using data infile from that path (use forward slashes in the path, even on Windows):
mysql> load data infile 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/text_file.txt'
    -> into table TABLE_NAME fields terminated by '\t' lines terminated by '\n';
1. If it's a tab-delimited txt file:
LOAD DATA LOCAL INFILE 'D:/MySQL/event.txt' INTO TABLE event
LINES TERMINATED BY '\r\n';
2. Otherwise, name the delimiter explicitly; here x could be a comma ',', tab '\t', semicolon ';', or space ' ':
LOAD DATA LOCAL INFILE 'D:/MySQL/event.txt' INTO TABLE event
FIELDS TERMINATED BY 'x'
LINES TERMINATED BY '\r\n';

Importing CSV using LOAD DATA INFILE quote problem

I'm trying to get this CSV file that I exported from Excel loaded into my database, and I can't seem to get the formatting correct no matter what I try.
Here is the SQL:
LOAD DATA INFILE 'path/file.csv'
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(column1, column2, column3);
This works fine, but I run into trouble when the last value on a line (column 3) ends in a quote. For example:
Actual value: These are "quotes"
Value in CSV: "These are ""quotes"""
What happens is that the value gets an extra quote in the database and also swallows the following lines, until another quote is reached in the CSV. Any ideas on how to solve this?
Hmm. I tried to duplicate this problem but can't. Where does my data differ from yours? Can you provide sample data to duplicate this? Here's what I did:
> cat /tmp/data.csv
"aaaa","bbb ""ccc"" ddd",xxx
xxx,yyy,"zzz ""ooo"""
foo,bar,baz
mysql> CREATE TABLE t2 (a varchar(20), b varchar(20), c varchar(20));
Query OK, 0 rows affected (0.01 sec)
mysql> LOAD DATA INFILE '/tmp/data.csv' INTO TABLE t2 FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' (a, b, c);
Query OK, 3 rows affected (0.00 sec)
Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select * from t2;
+------+---------------+-----------+
| a    | b             | c         |
+------+---------------+-----------+
| aaaa | bbb "ccc" ddd | xxx       |
| xxx  | yyy           | zzz "ooo" |
| foo  | bar           | baz       |
+------+---------------+-----------+
3 rows in set (0.00 sec)
Looks ok to me(?)
Also note that if you're working on a Windows platform you might need to use
LINES TERMINATED BY '\r\n' instead.

MySql import CSV space problem

I have a data file which looks like this:
001,000,D,Bla bla bla
I import it into a MySQL database with the following code:
LOAD DATA LOCAL
INFILE 'D:\test.dat'
INTO TABLE typen
FIELDS TERMINATED BY ','
IGNORE 1 LINES;
I get warnings for every line:
Warning | 1265 | Data truncated for column 'typ1' at row 1
and when I look at the content of the table, there is a space between every character. It looks like this:
0 0 1 | 0 0 0 | D | B l a b l a b l a
Am I missing something?
The problem is the encoding of the file. I did not find the correct encoding for this file, so I opened it in Notepad++, changed the encoding to UTF-8, and used the following code:
LOAD DATA LOCAL
INFILE 'D:/test.dat'  -- forward slashes keep '\t' from being parsed as a tab
INTO TABLE typen
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ','
IGNORE 1 LINES;
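If you would rather convert on the command line than in Notepad++, iconv does the same re-encoding. This assumes the source file really was UTF-16LE, which the space between every character suggests (each ASCII byte followed by a null byte):
$ iconv -f UTF-16LE -t UTF-8 test.dat > test_utf8.dat
Then point the LOAD DATA statement above at the converted file.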