Sqoop export to MySQL fails

My Sqoop export from a CentOS VM to MySQL on Windows is failing with
java.io.IOException: Cannot run program "mysqlimport": error=2, No such file or directory.
My sqoop export command is
sqoop export --verbose --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table marksheet --direct --export-dir /user/hive/warehouse/marksheet --input-fields-terminated-by '\001'
My import from MySQL (Windows) to CentOS succeeds, though.
sqoop import --verbose --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table student --hive-import --create-hive-table --hive-home /home/training/hive --warehouse-dir /user/hive/warehouse --fields-terminated-by ',' --hive-table studentmysql
I have searched the internet and did not find much help. Any help, please?

I got the solution myself from the error message. With --direct, Sqoop shells out to the mysqlimport utility, so the MySQL client tools must be installed on the machine running the export. I had to install them in my VM using yum install mysql. The export works after that.
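A minimal sketch of the fix on CentOS, assuming the stock yum repositories (where the mysql package ships the client utilities, including mysqlimport):
sudo yum install mysql
# confirm the utility that Sqoop's --direct export mode shells out to is now on the PATH
which mysqlimport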

Related

Sqoop unable to import data from MySQL to HBase

Hi, I am new to big data and I am trying to import data from MySQL to HBase using Sqoop.
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB --table FBI_CRIME --hbase-table H_FBI_CRIME --column-family cd --hbase-row-key ID -m 1 --username root -P
ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter.
I also tried adding --driver com.mysql.jdbc.Driver, but still did not get it to work.
Please help; what is wrong?
The problem is that you have to specify the names of the columns that you want to import, like this:
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB \
--table FBI_CRIME \
--hbase-table H_FBI_CRIME \
--columns "columnA,columnB" \
--column-family cd \
--hbase-row-key ID \
-m 1 \
--username root -P

Hive import using Sqoop from MySQL taking too long

I'm using Hive and Sqoop on top of Hadoop on Ubuntu 18.04.
Hadoop, Sqoop, and Hive are working as expected, but whenever I try to import data into the Hive database I created, the job hangs for a very long time.
Sqoop command used:
sqoop import \
--connect jdbc:mysql://localhost/project? \
--zeroDateTimeBehavior=CONVERT_TO_NULL \
--username hiveuser \
-P \
--table rooms \
-- hive-import \
--hive-database sqoop \
--hive-table room_info
You can expedite the process by using multiple mappers. To do that, find a column whose values are evenly distributed, pass it with --split-by <column_name>, and increase the number of mappers with the -m <count> option, as in the corrected command below.
sqoop import \
--connect "jdbc:mysql://localhost/project?zeroDateTimeBehavior=CONVERT_TO_NULL" \
--username hiveuser \
-P \
--table rooms \
--hive-import \
--hive-database sqoop \
--hive-table room_info \
--split-by <column_name> \
-m 5
For more detail, please read the Sqoop User Guide, particularly section 7.2.4, "Controlling Parallelism":
https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html
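Before settling on a split column, it can help to eyeball its spread, since Sqoop divides the [MIN, MAX] range of that column evenly across mappers. A quick check from the shell, with hypothetical names (hiveuser, project, rooms, id) that you should adjust to your schema:
mysql -u hiveuser -p -e "SELECT MIN(id), MAX(id), COUNT(*) FROM rooms;" project
# a heavily skewed range (a few outlier ids) leaves most mappers with little work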

Error during export: Mixed update/insert is not supported against the target database yet

sqoop export --connect "jdbc:mysql://localhost:3306/retail_db" \
--driver com.mysql.jdbc.Driver \
--username root \
--table departments \
--export-dir /user/root/departments_export \
--batch \
--outdir java_files \
-m 1 \
--update-key department_id \
--update-mode allowinsert
The upsert function is not working with the MySQL database. I found the reason for this issue (link), but I need a solution to resolve it.
You can try getting rid of --driver. Explicitly passing --driver makes Sqoop fall back to its generic JDBC connection manager, which does not support mixed update/insert; without the flag, Sqoop picks the MySQL-specific connection manager, which does. My command worked that way.
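For reference, a sketch of the same export with --driver dropped, so Sqoop should select its MySQL-specific connection manager from the jdbc:mysql:// URL on its own:
sqoop export --connect "jdbc:mysql://localhost:3306/retail_db" \
--username root \
--table departments \
--export-dir /user/root/departments_export \
--batch \
--outdir java_files \
-m 1 \
--update-key department_id \
--update-mode allowinsert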

Data import from MySQL with Sqoop - Error: No manager for connect string

[training@localhost ~]$ sqoop import-all-tables --connect "jbdc:mysql://localhost/training" --username training -P -m 1
Enter password:
16/07/10 08:01:45 ERROR tool.BaseSqoopTool: Got error creating database manager: java.io.IOException: No manager for connect string: jbdc:mysql://localhost/training
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:119)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:200)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:83)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:48)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Your syntax is wrong; you have to use:
sqoop import-all-tables --connect "jdbc:mysql://localhost/training" --username training -P -m 1
You wrote jbdc instead of jdbc.

Import data from MySQL to a specific database table in Hive using Sqoop fails

Installed Ubuntu 14.04
Installed Hadoop 2.7.1
Installed Sqoop 1.4.6
Installed XAMPP 7.0.1 (for MySQL)
Installed Hive 1.2.1
Trying to import data into Hive using the Sqoop command below.
Sqoop command example
sqoop import --connect jdbc:mysql://localhost/world \
--username root \
--table CountryLanguage \
--hive-import \
--hive-table world.countrylanguage -m 1
Error Message
database does not exists: world
Question
Is there anything wrong in the above command? Please correct me.
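The error message points at the Hive side: Sqoop creates the Hive table as part of --hive-import, but it does not create the Hive database for you. A likely fix, assuming the world database is simply missing in Hive, is to create it before re-running the import:
hive -e "CREATE DATABASE IF NOT EXISTS world;"
# then re-run the sqoop import with --hive-table world.countrylanguage unchanged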