Hi, I am new to big data and I am trying to import data from MySQL to HBase using Sqoop.
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB --table FBI_CRIME --hbase-table H_FBI_CRIME --column-family cd --hbase-row-key ID -m 1 --username root -P
ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter.
I also tried adding --driver com.mysql.jdbc.Driver, but that didn't help either.
Please help, what is wrong?
The problem is that you have to specify the names of the columns that you want to import, like this:
sqoop import --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB \
--table FBI_CRIME \
--hbase-table H_FBI_CRIME \
--columns "columnA,columnB" \
--column-family cd \
--hbase-row-key ID \
-m 1 \
--username root -P
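If the column names in FBI_CRIME are not at hand, one way to list them is sqoop eval, which runs an arbitrary SQL statement against the source database and prints the result. This is a sketch reusing the connection settings from the question, not verified against your database:

```shell
sqoop eval \
  --connect jdbc:mysql://xxx.xxx.xx.xx:3306/FBI_DB \
  --username root -P \
  --query "DESCRIBE FBI_CRIME"
```

The names it prints are what you would pass to --columns.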
I have a MySQL Inventory table which doesn't have an auto-increment ID, but it does have a composite key and a last-modified date column (YYYY-mm-DD HH:MM:SS), and it is updated very frequently. It holds the last 3 years of data, around 10 million records.
I want to move this data to HDFS using Sqoop or some other way. Please suggest an approach.
Check the Sqoop command below (which I use in similar tasks), based on --incremental lastmodified. I am assuming here that you have a date-like column to use with the --check-column argument:
sqoop import \
--connect jdbc:mysql://<server>:3306/db \
--username=your_username \
-P \
--table=your_table \
--append \
--incremental lastmodified \
--check-column creation_date \
--last-value "YYYY-mm-DD HH:MM:SS.x" \
--split-by some_numeric_id_column \
--target-dir /user/dir \
--num-mappers <MAPPER#>
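Since the table is updated very frequently, re-running the command above means tracking --last-value by hand between runs. A Sqoop saved job stores the last-value in its metastore and advances it automatically after each successful run. A sketch, where the job name, column name, and initial timestamp are assumptions:

```shell
# Create a saved job; Sqoop records the new --last-value after each run
sqoop job --create inventory_incr -- import \
  --connect jdbc:mysql://<server>:3306/db \
  --username your_username -P \
  --table your_table \
  --incremental lastmodified \
  --check-column creation_date \
  --last-value "2015-01-01 00:00:00" \
  --append \
  --target-dir /user/dir

# Each execution imports only rows modified since the previous run
sqoop job --exec inventory_incr
```

Note that --merge-key (the alternative to --append for lastmodified imports) takes a single column, so with a composite key you would either append and deduplicate downstream, or merge on a derived unique column.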
I have to write a Sqoop import script which pulls certain columns from a table la_crime, only for the year 2016.
My script is below:
sqoop import \
--connect jdbc:mysql://XXXXXXXXXXX \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--table la_crime \
--query "SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_NAME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM_GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE\$YEAR=2016\
--target-dir /user/sqoop_script \
-m 1
Could you tell me if my code is wrong somewhere? What changes do I have to make?
You may use the following syntax to import:
sqoop import \
--connect jdbc:mysql://localhost/yourdatabase \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--query 'SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_NAME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM_GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE YEAR = 2016 AND $CONDITIONS' \
--target-dir /user/sqoop_script \
-m 1
For more details, refer to the Sqoop User Guide.
The command you're trying has several problems in the query option. First, you need to close the double quotes at the end. Second, it seems odd that you're using a shell variable ($YEAR) to specify the column to filter the year on.
Third, when you use the --query option it is mandatory to include the $CONDITIONS token, and since you're issuing the query in double quotes you need to write \$CONDITIONS instead of just $CONDITIONS, to keep your shell from treating it as a shell variable.
Also, if you're using the --query option you shouldn't use the --table option.
I think this would be the command you're looking for:
sqoop import \
--connect jdbc:mysql://XXXXXXXXXXX \
--driver com.mysql.jdbc.Driver \
--username XXXX \
--password XXXX \
--query "SELECT DR_NUMBER,DATE_REPORTED,DATE_OCCURED,TIME_OCCURED,AREA_ID,AREA_NAME,REPORTING_DISTRICT,CRIME_CODE,CRIME_CODE_DESC,VICTIM_AGE,VICTIM_GENDER,VICTIM_DESCENT,ADDRESS,CROSS_STREET,AUTO_ID FROM la_crime WHERE YEAR = 2016 AND \$CONDITIONS" \
--target-dir /user/sqoop_script \
-m 1
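The quoting rule above is easy to verify with plain echo, independent of Sqoop:

```shell
# With double quotes the shell expands $CONDITIONS before Sqoop ever sees it
# (to an empty string if the variable is unset), breaking the query:
echo "WHERE YEAR = 2016 AND $CONDITIONS"

# Escaping with \$ (or using single quotes) passes the literal token through,
# so Sqoop can substitute its own split predicate at run time:
echo "WHERE YEAR = 2016 AND \$CONDITIONS"
echo 'WHERE YEAR = 2016 AND $CONDITIONS'
```

The last two lines print the literal token $CONDITIONS; the first does not.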
sqoop export --connect "jdbc:mysql://localhost:3306/retail_db" \
--driver com.mysql.jdbc.Driver \
--username root \
--table departments \
--export-dir /user/root/departments_export \
--batch \
--outdir java_files \
-m 1 \
--update-key department_id \
--update-mode allowinsert
The upsert (--update-mode allowinsert) is not working with the MySQL database. I found the reason for this issue (link), but I need a solution to resolve it.
You can try getting rid of --driver. Specifying --driver forces Sqoop to fall back to the generic JDBC connector, which does not support upserts; without it, Sqoop picks its MySQL-specific connector. My code worked that way.
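For reference, a minimal version of the export without --driver might look like this (same table and paths as above; an untested sketch):

```shell
sqoop export \
  --connect "jdbc:mysql://localhost:3306/retail_db" \
  --username root \
  --table departments \
  --export-dir /user/root/departments_export \
  --update-key department_id \
  --update-mode allowinsert \
  -m 1
```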
My Sqoop export from a VM running CentOS to MySQL on Windows is failing with
java.io.IOException: Cannot run program "mysqlimport": error=2, No such file or directory.
My Sqoop export command is:
sqoop export --verbose --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table marksheet --direct --export-dir /user/hive/warehouse/marksheet --input-fields-terminated-by '\001'
My Import from mysql (windows) to centOS is successful though.
sqoop import --verbose --fields-terminated-by ',' --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table student --hive-import --create-hive-table --hive-home /home/training/hive --warehouse-dir /user/hive/warehouse --fields-terminated-by ',' --hive-table studentmysql
I have checked on the internet but did not get much help. Any help, please?
I found the solution myself from the error: the --direct option makes Sqoop shell out to the mysqlimport utility, which was not installed on the VM.
I had to install the MySQL client in my VM using yum install mysql.
The export works now.