Sqoop: Could not load mysql driver exception - mysql

I installed Sqoop on my local machine. The configuration is as follows.
Bash.bashrc:
export HADOOP_HOME=/home/hduser/hadoop
export HBASE_HOME=/home/hduser/hbase
export HIVE_HOME=/home/hduser/hive
export HCAT_HOME=/home/hduser/hive/hcatalog
export SQOOP_HOME=/home/hduser/sqoop
export PATH=$PATH:$HIVE_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HBASE_HOME/bin
export PATH=$PATH:$SQOOP_HOME/bin
export PATH=$PATH:$HCAT_HOME/bin
Hadoop:
version: Hadoop 1.0.3
Hive:
version: Hive 0.11.0
MySQL Connector/J driver:
version: mysql-connector-java-5.1.29
(The driver is copied to the lib folder of Sqoop.)
Sqoop:
version: Sqoop 1.4.4
After completing the installation I created a table in MySQL named practice_1. But when I run the command to load data from MySQL into HDFS, it throws an exception:
ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
Could anyone please guide me on what the possible problem might be?

You need the database driver on the Sqoop classpath; check this.
It has a wonderful explanation of Sqoop.
Sqoop also has other options, for example:
--driver com.microsoft.jdbc.sqlserver.SQLServerDriver -libjars=".*jar"
From here:
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine. (This will be /usr/lib/sqoop/lib if you installed from an RPM or Debian package.) Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL's Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
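For example, a minimal sketch against the asker's setup (the database name, table, credentials, and paths are assumed from the question above; the driver class comes from the quoted passage):
cp mysql-connector-java-5.1.29-bin.jar /home/hduser/sqoop/lib/
sqoop import --connect jdbc:mysql://localhost:3306/test \
  --driver com.mysql.jdbc.Driver \
  --username root -P \
  --table practice_1 -m 1 \
  --target-dir /user/hduser/practice_1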
You may be interested in understanding the difference between a connector and a driver; here is the article.

Another solution, which avoids using a shared library, is to add the driver jar to Sqoop's classpath via HADOOP_CLASSPATH. I couldn't get the -libjars option to work. This solution also works on a secure cluster using Kerberos.
# export so the variable is visible to the sqoop process
export HADOOP_CLASSPATH=/use.case/lib/postgresql-9.2-1003-jdbc4.jar
sqoop export --connect jdbc:postgresql://db:5432/user \
--driver org.postgresql.Driver \
--connection-manager org.apache.sqoop.manager.GenericJdbcManager \
--username user \
-P \
--export-dir /user/hive/warehouse/db1/table1 \
--table table2
This works at least with Sqoop 1.4.3-cdh4.4.0.

You need to add the MySQL connector to /usr/lib/sqoop/lib.
The MySQL JDBC driver is not shipped in the Sqoop distribution by default, in order to keep the default distribution fully Apache license compliant.
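For example (the version and download URL are illustrative; any recent Connector/J jar should work):
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/5.1.29/mysql-connector-java-5.1.29.jar
cp mysql-connector-java-5.1.29.jar /usr/lib/sqoop/lib/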
Hope this helps...!!!

Copy mysql-connector-java-5.1.41-bin.jar into the sqoop/lib folder and then execute your sqoop import statements.

If you have copied the MySQL driver into the sqoop lib folder, it will work for sure. Also make sure your sqoop command is correct:
/home/hduser/sqoop/bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password root --table practice_1 -m 1

It's an Oozie ShareLib problem. The script below works for me:
In the shell:
# temporarily take ownership so the current user can upload the jar
sudo -u hdfs hadoop fs -chown cloudera:cloudera /user/oozie/share/lib/lib_20170719053712/sqoop
# upload the MySQL driver into the Sqoop sharelib
hdfs dfs -put /var/lib/sqoop/mysql-connector-java.jar /user/oozie/share/lib/lib_20170719053712/sqoop
# restore ownership to the oozie user
sudo -u hdfs hadoop fs -chown oozie:oozie /user/oozie/share/lib/lib_20170719053712/sqoop
# refresh and verify the sharelib
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate
oozie admin -oozie http://localhost:11000/oozie -shareliblist sqoop
In the Hue Sqoop client:
sqoop list-tables --connect jdbc:mysql://localhost/retail_db --username root --password cloudera
More detail at:
https://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/

You need to grant priveleges to the tables as below:
grant all privileges on marksheet.* to 'root'@'192.168.168.1'
identified by 'root123';
flush privileges;
Here is a sample command that I have successfully executed:
sqoop import --verbose --fields-terminated-by ',' \
  --connect jdbc:mysql://192.168.168.1/test \
  --username root --password root123 \
  --table student --hive-import --create-hive-table \
  --hive-home /home/training/hive \
  --warehouse-dir /user/hive/warehouse \
  --hive-table studentmysql

Related

Sqoop error for MYSQL-HDFS single table import - Windows 10

I am trying to import a table from MySQL to HDFS.
Below is the command I am trying to run:
sqoop import --connect jdbc:mysql://localhost:3306/demodb --table Categories --username root --target-dir /user/msingh/demodb -P
I am getting an error:
Exception message: '/tmp/hadoop-Martand' is not recognized as an internal or external command,
The installation itself is fine; I verified it with the following command:
sqoop list-databases --connect jdbc:mysql://localhost/ --username root -P
It returns the list of available databases.
Any idea what the mistake is?
I found the issue, so I am answering my own question.
It happens because of the space in the user profile folder name under C:/Users/{foldername}. Hadoop does not support spaces in folder names, and when you execute a Hadoop job it creates temporary folders under that path during backend jobs, which causes the issue.
So I changed the user folder name. You can follow the link below to rename the folder:
https://superuser.com/questions/890812/how-to-rename-the-user-folder-in-windows-10#:~:text=Go%20to%20the%20C%3A%5C,to%20the%20new%20path%20name.
After that, my issue was resolved.
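A hedged alternative (my note, not part of the original answer): instead of renaming the user folder, you can point Hadoop's scratch directory at a space-free path with the generic -D option; the path here is illustrative:
sqoop import -D hadoop.tmp.dir=C:/hadoop-tmp --connect jdbc:mysql://localhost:3306/demodb --table Categories --username root --target-dir /user/msingh/demodb -P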

Sqoop Import error while loading data from MySQL to HDFS

I am trying to load data from MySQL to HDFS using the command below:
$ sqoop import --connect jdbc:mysql://127.0.0.1:3306/userdb \
--username root --password pwd \
--driver com.mysql.jdbc.Driver --table import-all-tables \
--m 1 --target-dir /user/cloudera/foldername
I get a "communications link failure" error.
MySQL source: installed on my local computer
HDFS destination: Cloudera in a virtual machine
Please help me load data from my local computer to Cloudera HDFS.
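One thing worth checking (my note; the thread does not include an answer): inside the Cloudera VM, 127.0.0.1 refers to the VM itself, not to the MySQL server on your host, so the connect string should use the host machine's IP as seen from the VM. Also, import-all-tables is a Sqoop tool, not a table name. A hedged sketch, with an illustrative host IP and table name:
sqoop import --connect jdbc:mysql://192.168.56.1:3306/userdb \
  --username root --password pwd \
  --driver com.mysql.jdbc.Driver \
  --table some_table -m 1 \
  --target-dir /user/cloudera/foldername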

Sqoop functionality has been removed from DSE

I am new to Cassandra. I am trying to transfer my whole MySQL database to Cassandra using Sqoop, but after all the setup, when I execute the following command:
bin/dse sqoop import-all-tables -m 1 --connect jdbc:mysql://127.0.0.1:3306/ABCDatabase --username root --password root --cassandra-thrift-host localhost --cassandra-create-schema --direct
I received the following error:
Sqoop functionality has been removed from DSE.
It says that Sqoop functionality has been removed from DataStax Enterprise. If it has been removed, is there another way to do this?
Thanks
You can use Spark to transfer the data; it should be easy, something like:
// read the MySQL table over JDBC (jdbcUrl would be e.g. jdbc:mysql://127.0.0.1:3306/ABCDatabase)
val table = spark.read.jdbc(jdbcUrl, "table", connectionProperties)
// write it out through the Spark Cassandra Connector
table.write.format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "TBL", "keyspace" -> "KS"))
  .save()
Examples of JDBC URLs, options, etc. are described in Databricks' documentation, as they can differ between databases.

Sqoop command for import table

I'm using DSE 5.0.3, which ships Sqoop version 1.4.5.15.1. While importing my data from MySQL to CQL I get this error:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123
ERROR 13:20:53,886 Imported Failed: Parameter 'directory' is not a directory.
Please help me solve it!
It's difficult to be exact about the problem, but I'd suggest that since you aren't including any Cassandra parameters, the cql-import command is attempting to import into an HDFS directory that also isn't declared. Try including a cassandra-keyspace and cassandra-table on the command, like:
./dse sqoop cql-import --table npa_nxx --connect jdbc:mysql://localhost/npa_nxx_demo --username root --password 123 --cassandra-keyspace npa_nxx --cassandra-table npa_nxx_data
This assumes that the Cassandra keyspace and table are correctly set up.
Since this import looks like it comes from the DSE Sqoop demo, I'd suggest following the demo's README.txt more closely, as it contains the correct options for this import.

How to import MySQL data to Hadoop file system?

On my system I have a database in MySQL, and I want to import it into the Hadoop file system. I found something about Sqoop, but I can't figure out the command to do it.
sqoop import --connect jdbc:mysql://mysql-server-name/db_name --username user --password password --table table_name --target-dir target_directory_name -m 1
Hope it helps..
You need to install the MySQL JDBC/Java connector and then run the sqoop command:
sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
You can then run your sqoop command.
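A quick smoke test (my suggestion; credentials are placeholders): if the driver is picked up, this lists databases instead of failing with "Could not load db driver class":
sqoop list-databases --connect jdbc:mysql://localhost/ --username root -P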
Save data into mysql from hive hadoop through sqoop?
1) Install & configure MySQL first, and create the database in MySQL.
2) The command sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename -m 1 will import the data.
e.g.
sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student --m 1
In the above command the database is 'testDb', the username is 'root', the password is 'hadoop123', and the table is 'student'.
Have a look at article 1 and article 2 for a better step-by-step understanding.