Sqoop unable to import data from MySQL to HBase

Hi, I am new to big data and I am trying to import data from MySQL to HBase using Sqoop.
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB --table FBI_CRIME --hbase-table H_FBI_CRIME --column-family cd --hbase-row-key ID -m 1 --username root -P
ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter.
I also tried adding --driver com.mysql.jdbc.Driver, but that did not help either.
Please help; what is wrong?

The problem is that you have to specify the names of the columns that you want to import, like this:
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB \
--table FBI_CRIME \
--hbase-table H_FBI_CRIME \
--columns "columnA,columnB" \
--column-family cd \
--hbase-row-key ID \
-m 1 \
--username root -P
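If you don't know the column names offhand, one way to list them is with the mysql command-line client (assuming you have a client that can reach the server; the host and credentials below are placeholders matching the question):
# Placeholder host/credentials; lists the columns of FBI_CRIME to plug into --columns
mysql -h xxx.xxx.xx.xx -u root -p -e "DESCRIBE FBI_CRIME;" FBI_DB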

Related

Hive import using Sqoop from MySQL taking too long

I'm using Hive and Sqoop on top of Hadoop on Ubuntu 18.04.
Hadoop, Sqoop, and Hive are working as expected, but whenever I try to import data into the Hive database I created, the job hangs for too long.
Sqoop command used:
sqoop import \
--connect "jdbc:mysql://localhost/project?zeroDateTimeBehavior=CONVERT_TO_NULL" \
--username hiveuser \
-P \
--table rooms \
--hive-import \
--hive-database sqoop \
--hive-table room_info
You can expedite the process by using multiple mappers. For that, you need to find a column whose data is evenly distributed, pass it via --split-by <column_name>, and increase the number of mappers with the -m <count> option.
sqoop import \
--connect "jdbc:mysql://localhost/project?zeroDateTimeBehavior=CONVERT_TO_NULL" \
--username hiveuser \
-P \
--table rooms \
--hive-import \
--hive-database sqoop \
--hive-table room_info \
--split-by <column_name> \
-m 5
Please read the following page to understand this in more detail:
https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html
particularly section 7.2.4, Controlling Parallelism.
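Sqoop parallelizes by dividing the range between the split column's minimum and maximum values across mappers, so a heavily skewed column produces unbalanced splits. A quick, hypothetical way to eyeball a candidate integer column (substitute your own column and credentials):
# Hypothetical check of a candidate split column's range and row count
mysql -u hiveuser -p -e "SELECT MIN(id), MAX(id), COUNT(*) FROM rooms;" project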

Sqoop Import: Command --password: command not found, --table: command not found

I have to write a Sqoop import script which pulls certain columns from a table la_crime, only for the year 2016.
My script is below:
sqoop import \
--connect jdbc:mysql://XXXXXXXXXXX \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--table la_crime \
--query "SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_N AME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM _GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE\$YEAR=2016\
--target-dir /user/sqoop_script \
-m 1
Could you tell me if my code is wrong somewhere? What changes do I have to make?
You may use the following syntax to import:
sqoop import \
--connect jdbc:mysql://localhost/yourdatabase \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--table la_crime \
--query 'SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_NAME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM_GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE $YEAR=2016' \
--target-dir /user/sqoop_script \
-m 1
For more details, refer to the Sqoop User Guide.
The command you're trying has a few things wrong in the query option. First, you need to close the double quotes at the end. Second, it seems odd that you're using a variable to specify the column to filter the year on.
Third, if you use the query option it is mandatory to include the $CONDITIONS token, and since you're issuing the query in double quotes you need to write \$CONDITIONS instead of just $CONDITIONS to keep your shell from treating it as a shell variable.
Also, if you're using the query option, you shouldn't use the table option.
I think this would be the command you're looking for:
sqoop import \
--connect jdbc:mysql://XXXXXXXXXXX \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--query "SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_N AME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM _GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE YEAR = 2016 AND $CONDITIONS" \
--target-dir /user/sqoop_script \
-m 1
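To make the quoting point above concrete, these two forms hand Sqoop the same query; the only difference is how the shell treats $CONDITIONS:
# Double quotes: escape the token so the shell does not expand it
--query "SELECT ... FROM la_crime WHERE YEAR = 2016 AND \$CONDITIONS"
# Single quotes: the shell passes $CONDITIONS through untouched
--query 'SELECT ... FROM la_crime WHERE YEAR = 2016 AND $CONDITIONS'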

Error during export: Mixed update/insert is not supported against the target database yet

sqoop export --connect "jdbc:mysql://localhost:3306/retail_db" \
--driver com.mysql.jdbc.Driver \
--username root \
--table departments \
--export-dir /user/root/departments_export \
--batch \
--outdir java_files \
-m 1 \
--update-key department_id \
--update-mode allowinsert
The upsert function is not working with the MySQL database. I found the reason for this issue (link), but I need a solution to resolve it.
You can try getting rid of --driver; my code worked that way. Specifying --driver explicitly makes Sqoop fall back to its generic JDBC connection manager, which does not support mixed update/insert, while the MySQL-specific connection manager that Sqoop picks automatically does.
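For reference, here is the same export with --driver removed and everything else unchanged:
sqoop export --connect "jdbc:mysql://localhost:3306/retail_db" \
--username root \
--table departments \
--export-dir /user/root/departments_export \
--batch \
--outdir java_files \
-m 1 \
--update-key department_id \
--update-mode allowinsert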

Import data from MySQL to a specific database table in Hive using Sqoop fails

Installed Ubuntu 14.04
Installed Hadoop 2.7.1
Installed Sqoop 1.4.6
Installed XAMPP 7.0.1 (for MySQL)
Installed Hive 1.2.1
Trying to import data into Hive using the Sqoop command below.
Sqoop command example
sqoop import --connect jdbc:mysql://localhost/world \
--username root \
--table CountryLanguage \
--hive-import \
--hive-table world.countrylanguage -m 1
Error Message
database does not exists: world
Question
1. Is there anything wrong in the above command? Please correct me.
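The error means Hive itself has no database named world: Sqoop's --hive-table world.countrylanguage expects that database to already exist, since Sqoop creates Hive tables but not Hive databases. A likely fix, assuming that is the cause (it is not confirmed in the thread), is to create the database first:
# Assumption: create the missing Hive database before re-running the import
hive -e "CREATE DATABASE IF NOT EXISTS world;"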

Sqoop export to MySQL fails

My Sqoop export from a VM with CentOS to MySQL on Windows is failing with
java.io.IOException: Cannot run program "mysqlimport": error=2, No such file or directory.
My Sqoop export command is:
sqoop export --verbose \
--connect jdbc:mysql://192.168.168.1/test \
--username root \
--password root123 \
--table marksheet \
--direct \
--export-dir /user/hive/warehouse/marksheet \
--input-fields-terminated-by '\001'
My import from MySQL (Windows) to CentOS is successful, though:
sqoop import --verbose \
--connect jdbc:mysql://192.168.168.1/test \
--username root \
--password root123 \
--table student \
--hive-import \
--create-hive-table \
--hive-home /home/training/hive \
--warehouse-dir /user/hive/warehouse \
--fields-terminated-by ',' \
--hive-table studentmysql
I have checked on the internet but did not get much help. Any help, please...
I got the solution myself from the error...
I had to install the MySQL client in my VM using yum install mysql; the --direct flag makes Sqoop shell out to the mysqlimport utility, so that utility has to be present on the machine running the export.
The export works now.
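If you hit the same error, a quick sanity check (assuming a yum-based distro, as in this answer) is:
# Confirm the utility Sqoop's --direct export shells out to is on the PATH
which mysqlimport
# If it is missing, install the MySQL client package that provides it
sudo yum install -y mysql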