LOAD DATA INFILE statement throws error - MySQL

I have a table into which I am trying to insert data from a text file, but the command throws an error.
The command is:
mysql -u user -p<pwd> -h <server> --local-infile bsm -sse LOAD DATA LOCAL INFILE '/tmp/file.txt' INTO table test_jan2 FIELDS terminated by '|' LINES terminated by '\n' (value1,value2,value3) set id = NULL;
The error it throws is:
bash: syntax error near unexpected token `('
The table structure is:
+---------------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| value1 | varchar(50) | YES | | NULL | |
| value2 | varchar(50) | YES | | NULL | |
| value3 | varchar(50) | YES | | NULL | |
| date_created | varchar(50) | YES | | NULL | |
+---------------+-------------+------+-----+---------+----------------+

The SQL statement is unquoted, so bash itself tries to parse it and trips over the parentheses; quote the whole statement and pass it with -e. Try:
$ mysql -u user -p<pwd> -h <server> --local-infile bsm -e \
"LOAD DATA LOCAL INFILE '/tmp/file.txt'
INTO TABLE test_jan2
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(value1,value2,value3)"

Related

Export Impala Table from HDFS to MySQL

I'm trying to use Sqoop to export an Impala table from HDFS to MySQL. The table has already been created in MySQL, and the schemas of the two tables should match.
Impala table information:
1 start_date string
2 start_station_code string
3 end_date string
4 end_station_code string
5 duration_sec int
6 is_member int
7 cnt bigint
Impala table sample:
2019-05-05 14:07:42100022019-05-05 14:31:087143140611322
2019-05-08 17:51:57100022019-05-08 17:55:29705221101322
2019-05-05 14:07:40100022019-05-05 14:31:087143140711322
2019-05-07 09:55:48100022019-05-07 10:02:28672439911322
2019-05-03 06:54:38100022019-05-03 06:59:51705231201322
2019-05-07 09:56:33100022019-05-07 10:02:17705234311322
2019-05-05 14:06:40100022019-05-05 14:18:04642768411322
2019-05-01 08:54:36100022019-05-01 08:58:20705222301322
2019-05-02 09:17:22100022019-05-02 09:22:16692129401322
2019-05-02 09:16:37100022019-05-02 09:19:30705217201322
2019-05-06 07:09:54100022019-05-06 07:18:45608453111322
MySQL Table information:
+--------------------+-------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+--------------------+-------------+------+-----+---------+-------+
| start_date | varchar(10) | YES | | NULL | |
| start_station_code | varchar(20) | YES | | NULL | |
| end_date | varchar(20) | YES | | NULL | |
| end_station_code | varchar(20) | YES | | NULL | |
| duration_sec | int(11) | YES | | NULL | |
| is_member | int(11) | YES | | NULL | |
| cnt | bigint(20) | YES | | NULL | |
+--------------------+-------------+------+-----+---------+-------+
Export code:
sqoop export --connect jdbc:mysql://localhost/oozie --username root --password root --table bixirides_export --export-dir /user/hive/warehouse/impala_out/6* -m 1 --input-fields-terminated-by "|";
The Sqoop export fails as soon as the map task reaches 100%. The schemas should match, but the export still fails.
Error Message:
ERROR tool.ExportTool: Error during export:
Export job failed!
I see a couple of issues based on your question:
start_date is varchar(10), but the date values are longer than that (a widening sketch follows after this answer), e.g.
2019-05-05 14:07:42
You specify | as the delimiter, but I don't see it in the Hive table data.
Did you create the table with:
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
STORED AS textfile
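If the column width is indeed the problem, widening it on the MySQL side is a small fix. A sketch against the bixirides_export table shown above (this assumes you are happy keeping the dates as strings rather than converting them to DATETIME):
-- '2019-05-05 14:07:42' is 19 characters, so varchar(10) truncates or rejects it
ALTER TABLE bixirides_export MODIFY start_date VARCHAR(20);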

Add rows into a mysql table from a .sql file

My question is: how can I insert a .sql file into a MySQL table which already contains a lot of data?
My .sql file looks like this (1200 rows):
--
-- Descriptif plan comptable SYSCOHADA (utf-8)
--
INSERT INTO llx_accounting_system (rowid, pcg_version, fk_pays, label, active) VALUES (10,'SYSCOHADA', 49, 'Plan comptable Ouest-Africain', 1);
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15000,'SYSCOHADA','CAPITAUX','XXXXXX','1',0,"Capital",'1');
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15001,'SYSCOHADA','CAPITAUX','XXXXXX','101',15000,"Capital social",'1');
INSERT INTO llx_accounting_account (rowid, fk_pcg_version, pcg_type, pcg_subtype, account_number, account_parent, label, active) VALUES (15002,'SYSCOHADA','CAPITAUX','XXXXXX','1011',15001,"Capital souscrit, non appele",'1');
My MySQL table looks like:
mysql> describe llx_accounting_account ;
+----------------+--------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+----------------+--------------+------+-----+-------------------+-----------------------------+
| rowid | int(11) | NO | PRI | NULL | auto_increment |
| entity | int(11) | NO | | 1 | |
| datec | datetime | YES | | NULL | |
| tms | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| fk_pcg_version | varchar(32) | NO | MUL | NULL | |
| pcg_type | varchar(20) | NO | | NULL | |
| pcg_subtype | varchar(20) | NO | | NULL | |
| account_number | varchar(32) | NO | MUL | NULL | |
| account_parent | varchar(32) | YES | | NULL | |
| label | varchar(255) | NO | | NULL | |
| fk_user_author | int(11) | YES | | NULL | |
| fk_user_modif | int(11) | YES | | NULL | |
| active | tinyint(4) | NO | | 1 | |
+----------------+--------------+------+-----+-------------------+-----------------------------+
13 rows in set (0.00 sec)
My MySQL table is not empty; it already contains data, and I want to append the rows from my .sql file to it.
I didn't execute this command because I think it's wrong:
LOAD DATA LOCAL INFILE 'data_3.9.sql'
INTO TABLE llx_accounting_account
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
Do you have a solution?
Thank you :)
----------------------------------------------------------------------------------
Solution:
With the comments from @RakeshKumar and @PaulF, I found a way to solve my problem:
1) I deleted all rows where fk_pcg_version = 'SYSCOHADA':
delete from llx_accounting_account where fk_pcg_version = 'SYSCOHADA' ;
2) I imported the .sql file:
mysql -u root -p****** dolibarr < data_3.9.sql
3) I fixed one value, because account_number was 1 instead of 10 where rowid = 15000:
UPDATE llx_accounting_account SET account_number = 10 WHERE rowid=15000 ;
Seems good :)
Thank you ;)
Use the following to import the file:
mysql -u username -p'password' dbname < filename.sql
Your import didn't work because your table already contained the same old SYSCOHADA rows.
You can delete all rows where fk_pcg_version = 'SYSCOHADA' and import your corrected file again.
I had the same error importing data from a SQL file that was created with a mysqldump table-to-file command. The reason was that the SQL file had DROP TABLE IF EXISTS and CREATE TABLE statements at the start of the file.
Once those are removed, the import effectively appends the rows to the existing records.
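As an alternative to hand-editing the dump, mysqldump can be told not to emit the table definition in the first place. A sketch using the database and table names from this thread (the output file name is just an example):
mysqldump --no-create-info -u root -p dolibarr llx_accounting_account > inserts_only.sql
With --no-create-info the dump carries no DROP TABLE or CREATE TABLE statements, so piping it back in with mysql dbname < inserts_only.sql appends rows instead of recreating the table.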
Maybe this will help others.

mySQL Command Line Import CSV Gives me NULL

I am used to using phpMyAdmin to manage my MySQL databases, but I am starting to use the command line a lot more. I am trying to import a CSV file into a table called source_data that looks like this:
+-----------+-------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------+-------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| code | varchar(10) | YES | | NULL | |
| result | char(1) | YES | | NULL | |
| source | char(1) | YES | | NULL | |
| timestamp | varchar(30) | YES | | NULL | |
+-----------+-------------+------+-----+---------+----------------+
And my CSV file looks like this...
code,result,source,timestamp
123 ABC,,,
456 DEF,,,
789 GHI,,,
234 JKL,,,
567 MNO,,,
890 PQR,,,
I am using this command:
LOAD DATA INFILE '/home/user1/data.csv' INTO TABLE source_data FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 ROWS;
This inserts the correct number of rows, but every value is just NULL. Where am I going wrong?
Since the CSV file doesn't contain all of the table's columns (it's missing the id column), you need to specify explicitly which columns the fields should be written into:
LOAD DATA INFILE '/home/user1/data.csv'
INTO TABLE source_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(code, result, source, timestamp);
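If some columns still come out NULL after that, MySQL usually records why for each affected row; a quick check from the same session, run right after the LOAD DATA:
mysql> SHOW WARNINGS;                      -- data-truncation / missing-field notes, if any
mysql> SELECT * FROM source_data LIMIT 5;  -- eyeball the imported rows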
I hope this is no longer an issue, but you said the table name is source_data while the name used in the command is data.

mysql insert statements with '(' fail

I have a file with more than 86,000 INSERT statements. Some of the statements contain characters such as '(', ',', and '\' in the column data. MySQL throws an error and does not recognize the data as column data.
Is there any setting in MySQL like SET DEFINE OFF in Oracle?
The error is:
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that
corresponds to your MySQL server version for the right syntax to use near 'guntur
district, ap','08644 - 285237','chiluvuru','guntur','and pradesh'); inser' at line 1
+--------------+---------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+--------------+---------------+------+-----+---------+-------+
| bank_name | varchar(200) | YES | | NULL | |
| ifsc_code | varchar(200) | YES | | NULL | |
| micr_code | varchar(200) | YES | | NULL | |
| branch_name | varchar(200) | YES | | NULL | |
| address | varchar(200) | YES | | NULL | |
| phone_number | varchar(200) | YES | | NULL | |
| city | varchar(200) | YES | | NULL | |
| district | varchar(200) | YES | | NULL | |
| state | varchar(200) | YES | | NULL | |
+--------------+---------------+------+-----+---------+-------+
When I try to add '\', ')', '(', or ',' to my table, I am able to insert them.
Below is how I created the table; I tried it both ways:
create table mytab4 (myChar char) and create table mytab4 (myChar varchar(20))
Could you please provide your table structure and the query you are trying to execute?
The table structure can be seen by executing DESCRIBE myTable.
Good Luck...
You might try:
shell> mysqlimport --fields-enclosed-by="'" db_name textfile1.sql
Check out: http://dev.mysql.com/doc/refman/5.6/en/mysqlimport.html
This works if all your fields are enclosed in single quotes and any single quotes within the values are escaped (e.g. VALUES ('you\'re', 'no you\'re')).
There is also a --fields-optionally-enclosed-by=string option.
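For a comma-separated file where only some fields are quoted, a hedged mysqlimport sketch (mysqlimport derives the table name from the file name, so the file would have to be named after the target table; the names here are only examples):
shell> mysqlimport --local --fields-terminated-by=',' --fields-optionally-enclosed-by='"' db_name bank_branches.csv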
If you want to insert a '\' character, you should use '\\'; otherwise MySQL treats \' as an escape sequence and the query fails with a syntax error.
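A minimal illustration of that rule, using the varchar(20) version of the hypothetical mytab4 table from the answer above:
-- '(' ')' ',' need no escaping inside a quoted string; the backslash and the inner quote do
INSERT INTO mytab4 (myChar) VALUES ('a\\b (c), d\'e');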

MySql file import (LOAD DATA LOCAL INFILE)

I have a table called city:
+------------+--------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------+--------------+------+-----+---------+----------------+
| id | bigint(20) | NO | PRI | NULL | auto_increment |
| country_id | mediumint(9) | NO | MUL | NULL | |
| region_id | bigint(20) | NO | MUL | NULL | |
| city | varchar(45) | NO | | NULL | |
| latitude | float(18,2) | NO | | NULL | |
| longitude | float(18,2) | NO | | NULL | |
| timezone | varchar(10) | NO | | NULL | |
| dma_id | mediumint(9) | YES | | NULL | |
| code | varchar(4) | YES | | NULL | |
+------------+--------------+------+-----+---------+----------------+
I have a simple file (just a test file) to import:
"id","country_id","region_id","city","latitude","longitude","timezone","dma_id","code"
42231,1,833,"Herat","34.333","62.2","+04:30",0,"HERA"
5976,1,835,"Kabul","34.517","69.183","+04:50",0,"KABU"
42230,1,852,"Mazar-e Sharif","36.7","67.1","+4:30",0,"MSHA"
42412,2,983,"Korce","40.6162","20.7779","+01:00",0,"KORC"
5977,2,1011,"Tirane","41.333","19.833","+01:00",0,"TIRA"
5978,3,856,"Algiers","36.763","3.051","+01:00",0,"ALGI"
5981,3,858,"Skikda","36.879","6.907","+01:00",0,"SKIK"
5980,3,861,"Oran","35.691","-0.642","+01:00",0,"ORAN"
I run this command:
LOAD DATA LOCAL INFILE 'cities_test.txt' INTO TABLE city FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;
Output:
Query OK, 0 rows affected (0.00 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
No records are inserted and I don't know why.
Any ideas?
Thanks!
Jamie
Worked it out. Silly mistake.
Had to change this:
LINES TERMINATED BY '\r\n'
To this:
LINES TERMINATED BY '\n'
:-)
I had the same problem. Try this: erase the first row
`"id","country_id","region_id","city","latitude","longitude","timezone","dma_id","code"`
from the file you are importing.
Now, when you run the command, write it like this:
mysql> LOAD DATA LOCAL
INFILE 'cities_test.txt'
INTO TABLE city FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
And that is all.
It worked for me :D
I had the same issue on a Mac.
Try this if you are using a Mac:
LOAD DATA INFILE 'sqlScript1.txt' INTO TABLE USER
FIELDS TERMINATED BY ',' LINES STARTING BY '\r';
For me, what worked on a Mac was:
LOAD DATA LOCAL
INFILE 'cities_test.txt'
INTO TABLE city FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r';
Since classic Mac line endings use a carriage return, you must use '\r'.
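If you are not sure which line ending your export actually uses, it is worth checking before guessing; one way from a Unix-like shell, assuming the file name from the question:
$ head -3 cities_test.txt | od -c
In the output, \r \n at the end of each line means Windows endings (use '\r\n'), \n alone means Unix ('\n'), and \r alone means classic Mac ('\r').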