I have copied the jar file into the sqoop/lib folder, and the command I was using in Sqoop is:
bin/sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password admin --table cities
Error Message :
14/06/21 08:44:44 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:772)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:660)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:683)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:223)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:347)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1277)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1089)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
You need to add the mysql-connector jar file to the sqoop/lib folder to access the MySQL database, and then execute your Sqoop command.
Download link: http://mvnrepository.com/artifact/mysql/mysql-connector-java/5.1.6
Download the mysql-connector jar from this link.
Extract and copy the mysql-connector-java-5.1.46.jar file to the $SQOOP_HOME/lib/ directory.
Copy the same jar to the lib directory in HDFS as well, if Sqoop is used with HDFS.
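For example, a minimal sketch of the copy step (assuming the connector tarball was downloaded into the current directory; adjust the paths to your installation):
tar -xzf mysql-connector-java-5.1.46.tar.gz
cp mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar $SQOOP_HOME/lib/
ls $SQOOP_HOME/lib/ | grep mysql-connector   # verify the jar is in place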
Try adding this option:
--driver com.mysql.jdbc.Driver
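For example, the import from the question with the driver class stated explicitly (same connection details as above):
bin/sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password admin --table cities --driver com.mysql.jdbc.Driver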
Also make sure you have run the commands below:
% mysql -u root -p
Enter password:
mysql> GRANT ALL PRIVILEGES ON sqoop.* TO ''@'localhost';
Query OK, 0 rows affected (0.00 sec)
mysql> quit;
Bye
Related
Importing data from MySQL sqoopdb to HDFS from table employee
Commands
sqoop import --connect jdbc:mysql://localhost:3306/sqoopdb --username sqoop -P --table employee
sqoop import --connect jdbc:mysql://localhost:3306/sqoopdb --username sqoop -P --table employee --m 1
Both generate the below error:
Warning: /usr/local/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
2019-03-24 16:08:19,779 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/lang/StringUtils
at org.apache.sqoop.tool.BaseSqoopTool.validateHiveOptions(BaseSqoopTool.java:1583)
at org.apache.sqoop.tool.ImportTool.validateOptions(ImportTool.java:1178)
at org.apache.sqoop.Sqoop.run(Sqoop.java:137)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.lang.StringUtils
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
Download and copy commons-lang-2.6.jar to the /usr/local/sqoop/lib folder, and the import will work fine.
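A minimal sketch of that step (assuming the jar was downloaded from Maven Central into the current directory):
cp commons-lang-2.6.jar /usr/local/sqoop/lib/
ls /usr/local/sqoop/lib/ | grep commons-lang   # confirm the jar is on Sqoop's classpath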
I am using Hadoop - 3.2.0 and Sqoop 1.4.7
Try the below command:
sqoop import --connect jdbc:mysql://localhost:3306/sqoopdb --driver com.mysql.jdbc.Driver --username sqoop -P --table employee --m 1
Also make sure you have granted access to this database using:
% mysql -u root -p
Enter password:
mysql> GRANT ALL PRIVILEGES ON sqoopdb.* TO ''@'localhost';
Query OK, 0 rows affected (0.00 sec)
mysql> quit;
Bye
I am using Hadoop-1.2.1 and Sqoop-1.4.6. I am using sqoop to import the table test from the database meshtree into HDFS using this command:
`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test`
But, it shows this error:
17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Is there any way to fix this problem?
It’s important that you do not use the URL localhost if you intend to use Sqoop with a distributed Hadoop cluster. The connect string you supply will be used on TaskTracker nodes throughout your MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or more likely, no database at all). Instead, you should use the full hostname or IP address of the database host that can be seen by all your remote nodes.
Please visit Sqoop Document Connecting to a Database Server section for more information.
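For example, replacing localhost in the connect string from the question with a hostname that every node can resolve (dbhost.example.com is a placeholder):
sqoop import --connect jdbc:mysql://dbhost.example.com:3306/meshtree --username user --password password --table test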
You don't have permissions, so contact your MySQL DBA to grant them to you.
Or you can do it yourself if you have admin access to MySQL.
grant all privileges on databasename.* to 'username'@'%' identified by 'password';
* - for all tables
% - allow from any host
The above syntax grants permissions to a user on the MySQL server. In your case it will be:
grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword';
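To double-check that the grant took effect, you can flush and list the privileges afterwards (standard MySQL statements):
flush privileges;
show grants for 'root'@'localhost';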
You are importing without providing an HDFS target directory. When no target directory is provided, Sqoop runs the import only once and creates a directory in HDFS named after your MySQL table.
So your query
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test
this creates a directory named test in HDFS (the same name as the table).
Just add the --target-dir option, as in the following:
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1
Hopefully it works fine; also refer to the Sqoop import documentation and related Sqoop docs.
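Alternatively, if you want to re-run the same import into the default directory, recent Sqoop 1.4.x releases support --delete-target-dir to remove the existing output directory first (a sketch using the same command as above):
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --delete-target-dir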
I was using the export command in Sqoop and facing this error while exporting from HDFS to MySQL.
The command is:
sqoop export \
  --connect jdbc:mysql://localhost/property \
  --username root \
  --password root \
  --table xyz \
  --m 1 \
  --export-dir abc.csv
The error is:
16/08/30 23:11:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/30 23:11:34 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/30 23:11:34 INFO tool.CodeGenTool: Beginning code generation
16/08/30 23:11:34 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:848)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:736)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1773)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1578)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Add mysql-connector.jar in $SQOOP_HOME/lib.
As per Sqoop docs,
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine
Also,
Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL’s Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
So, add --driver com.mysql.jdbc.Driver in your command.
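Applied to the export command from the question, that would look like this (a sketch; the connector jar must still be present in $SQOOP_HOME/lib):
sqoop export --connect jdbc:mysql://localhost/property --driver com.mysql.jdbc.Driver --username root --password root --table xyz --m 1 --export-dir abc.csv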
This is the command I am executing:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
The error:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:08:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:08:06 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/20 14:08:06 INFO tool.CodeGenTool: Beginning code generation
16/05/20 14:08:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
16/05/20 14:08:06 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2518)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1748)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1961)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2537)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2466)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1383)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:2939)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:576)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:440)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
16/05/20 14:08:06 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
I've read somewhere that the warnings are harmless.
Also, when I execute this command, I get the output:
$ sqoop list-tables --connect jdbc:mysql://localhost/hadoopguide
The output:
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:10:32 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:10:32 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
widgets
Please provide some help/suggestions.
Make sure to specify the following arguments:
The HDFS directory to import your data into:
--target-dir $hdfs_path
If your MySQL database needs a username and a password:
--username $user --password $pass
If you haven't done so yet, provide a port number in your MySQL connection string:
jdbc:mysql://$server:$port/$database
This should help!
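Putting these together with the command from the question, a sketch might look like this (the username, password, and target directory are placeholders):
sqoop import --connect jdbc:mysql://localhost:3306/hadoopguide --username myuser --password mypass --table widgets -m 1 --target-dir /user/hadoop/widgets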
@akshay naidu: It seems that the MySQL connector version you are using is not the latest. Check the JIRA link below:
https://issues.apache.org/jira/browse/SQOOP-1400
Upgrade your connector and then try.
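A minimal sketch of the swap (the jar names here are only examples; use whichever recent Connector/J release you downloaded):
rm $SQOOP_HOME/lib/mysql-connector-java-5.1.*.jar        # remove the old connector
cp mysql-connector-java-5.1.41-bin.jar $SQOOP_HOME/lib/  # drop in the newer one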
I installed Sqoop on my local machine. Following is the configuration information.
.bashrc:
export HADOOP_HOME=/home/hduser/hadoop
export HBASE_HOME=/home/hduser/hbase
export HIVE_HOME=/home/hduser/hive
export HCAT_HOME=/home/hduser/hive/hcatalog
export SQOOP_HOME=/home/hduser/sqoop
export PATH=$PATH:$HIVE_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HBASE_HOME/bin
export PATH=$PATH:$SQOOP_HOME/bin
export PATH=$PATH:$HCAT_HOME/bin
Hadoop:
Version: Hadoop 1.0.3
Hive:
Version: hive 0.11.0
Mysql Connector driver
version: mysql-connector-java-5.1.29
"The driver is copied to the lib folder of sqoop"
Sqoop :
version: sqoop 1.4.4
After completing the installation, I created a table in MySQL named practice_1. But when I run the load command to load data from MySQL into HDFS, the command throws an exception:
ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
Could anyone please guide me on what the possible problem could be?
You need the database driver on the Sqoop classpath; check this.
It has a wonderful explanation of Sqoop.
Sqoop has other options, like:
Ex: --driver com.microsoft.jdbc.sqlserver.SQLServerDriver -libjars=".*jar"
From here:
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine. (This will be /usr/lib/sqoop/lib if you installed from an RPM or Debian package.) Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL's Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
You may be interested in understanding the difference between a connector and a driver; here is the article.
Another solution, which avoids using a shared library, is adding the driver jar to Sqoop's classpath via HADOOP_CLASSPATH. I haven't gotten the -libjars option to work. This solution also works on a secure cluster using Kerberos.
export HADOOP_CLASSPATH=/use.case/lib/postgresql-9.2-1003-jdbc4.jar
sqoop export --connect jdbc:postgresql://db:5432/user \
--driver org.postgresql.Driver \
--connection-manager org.apache.sqoop.manager.GenericJdbcManager \
--username user \
-P \
--export-dir /user/hive/warehouse/db1/table1 \
--table table2
This works at least with Sqoop 1.4.3-cdh4.4.0.
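The same approach should work for the MySQL connector (a sketch; the exact jar filename and the test database are assumptions based on the question's setup):
export HADOOP_CLASSPATH=/home/hduser/sqoop/lib/mysql-connector-java-5.1.29-bin.jar
sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username root -P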
You need to add the MySQL connector to /usr/lib/sqoop/lib.
MySQL JDBC Driver by default is not present in Sqoop distribution in order to ensure that the default distribution is fully Apache license compliant.
Hope this helps...!!!
Copy the mysql-connector-java-5.1.41-bin.jar into the sqoop/lib folder and execute the sqoop import statements.
If you have copied the MySQL driver to the Sqoop lib folder, it will work. Make sure your Sqoop command is correct:
/home/hduser/sqoop/bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password root --table practice_1 -m 1
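As a quick sanity check that the driver is being picked up before running the import, you could list the databases first (a sketch using the same credentials):
/home/hduser/sqoop/bin/sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password root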
It's an Oozie ShareLib problem. The script below works for me:
At the shell:
sudo -u hdfs hadoop fs -chown cloudera:cloudera /user/oozie/share/lib/lib_20170719053712/sqoop
hdfs dfs -put /var/lib/sqoop/mysql-connector-java.jar /user/oozie/share/lib/lib_20170719053712/sqoop
sudo -u hdfs hadoop fs -chown oozie:oozie /user/oozie/share/lib/lib_20170719053712/sqoop
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
oozie admin -oozie http://localhost:11000/oozie -shareliblist sqoop
At the Hue Sqoop client:
sqoop list-tables --connect jdbc:mysql://localhost/retail_db --username root --password cloudera
More detail at:
https://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/
You need to grant privileges on the tables as below:
grant all privileges on marksheet.* to 'root'@'192.168.168.1'
identified by 'root123';
flush privileges;
Here is a sample command that I have successfully executed:
sqoop import --verbose --fields-terminated-by ',' \
  --connect jdbc:mysql://192.168.168.1/test --username root --password root123 \
  --table student --hive-import --create-hive-table --hive-home /home/training/hive \
  --warehouse-dir /user/hive/warehouse --fields-terminated-by ',' --hive-table studentmysql