Unable to import all tables - sql-server-2008

I am trying to import all the tables from SQL Server using Sqoop:
sqoop import-all-tables --connect "jdbc:sqlserver://40.112.254.xxx;database=IDSTC" --username DbReader --password Plexus123! --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --target-dir '/landing/IDSTC
It shows me the following error. Are there any modifications I have to make to my command?
16/04/10 01:12:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5.2.2.9.1-8
16/04/10 01:12:17 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
16/04/10 01:12:17 ERROR tool.BaseSqoopTool: Unrecognized argument: all-tables
16/04/10 01:12:17 ERROR tool.BaseSqoopTool: Unrecognized argument: --connect
16/04/10 01:12:17 ERROR tool.BaseSqoopTool: Unrecognized argument: jdbc:sqlserver://40.112.254.104;database=IDSTC

Try this. The "Unrecognized argument: all-tables" line suggests the tool name was entered with a space; it must be the single token import-all-tables. Also, import-all-tables does not support --target-dir (each table is imported into its own subdirectory), so use --warehouse-dir instead, and note that your original command has an unterminated quote around '/landing/IDSTC:
sqoop import-all-tables --connect "jdbc:sqlserver://40.112.254.xxx;database=IDSTC" --username DbReader --password Plexus123! --driver com.microsoft.sqlserver.jdbc.SQLServerDriver --warehouse-dir /landing/IDSTC

Related

Unable to load JDBC driver when running Sqoop import

I created a SQL table and I am trying to import its data into HDFS using Sqoop. But when I run the sqoop import command, I get the following error.
This is the command:
hduser@ubuntu:~$ sqoop import --connect jdbc:mysql://192.168.94.255/Cause_of_Death --table Death2014 --username root -P --target-dir /sqoop_out -m 1
Result:
Warning: /home/hduser/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hduser/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hduser/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hduser/sqoop-1.4.7.bin__hadoop-2.6.0/bin/../../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
19/12/20 02:54:21 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
19/12/20 02:54:27 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/12/20 02:54:27 INFO tool.CodeGenTool: Beginning code generation
19/12/20 02:54:27 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:875)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:59)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1872)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1671)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
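No answer is recorded for this question in the thread, but the fix is the same one given below for the Cloudera question: the MySQL JDBC connector jar (which provides com.mysql.jdbc.Driver) is not on Sqoop's classpath and must be copied into Sqoop's lib directory. A minimal check, as a sketch: check_sqoop_driver is an illustrative helper name, not a Sqoop tool, and it assumes the standalone Sqoop layout shown in the warnings above.

```shell
# Sqoop loads JDBC driver jars from $SQOOP_HOME/lib. The error
# "Could not load db driver class: com.mysql.jdbc.Driver" means no
# mysql-connector-java jar is there. This helper only reports whether
# one is present; it is an illustrative sketch, not part of Sqoop.
check_sqoop_driver() {
  if ls "${SQOOP_HOME:?SQOOP_HOME not set}"/lib/mysql-connector-java-*.jar >/dev/null 2>&1; then
    echo "driver present"
  else
    echo "driver missing: copy the mysql-connector-java jar into ${SQOOP_HOME}/lib"
  fi
}
```

If the jar is missing, download mysql-connector-java (from the MySQL site or Maven Central), copy it into e.g. /home/hduser/sqoop-1.4.7.bin__hadoop-2.6.0/lib/, and rerun the import.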

Cloudera link failure error when running the sqoop list-databases command

I am trying to run the command below in Cloudera and I am getting a communications link failure error. I have tried restarting the mysqld service too, with no luck. Could someone please help?
Code and error:
[cloudera@quickstart ~]$ sqoop list-databases --connect "jdbc:mysql://quickstart.cloudera:3306" --username=retail_dba --password=cloudera
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/09/22 09:45:59 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.10.0
17/09/22 09:45:59 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/09/22 09:45:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/09/22 09:46:16 ERROR manager.CatalogQueryManager: Failed to list databases
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
Download mysql-connector-java-5.1.21.jar and copy it into the Sqoop lib folder, then try running sqoop list-databases as follows:
sqoop list-databases \
--connect "jdbc:mysql://localhost:3306" \
--username=retail_dba \
--password=cloudera

Data import from MySQL with Sqoop - Error : No manager for connect string

[training@localhost ~]$ sqoop import-all-tables --connect "jbdc:mysql://localhost/training" --username training -P -m 1
Enter password:
16/07/10 08:01:45 ERROR tool.BaseSqoopTool: Got error creating database manager: java.io.IOException: No manager for connect string: jbdc:mysql://localhost/training
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:119)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:200)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:83)
at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:48)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Your syntax is wrong; you have to use:
sqoop import-all-tables --connect "jdbc:mysql://localhost/training" --username training -P -m 1
You wrote jbdc instead of jdbc, which is why Sqoop reports "No manager for connect string": it selects a connection manager from the scheme of the JDBC URL, and jbdc matches none.

Apache Sqoop connectivity error

I am getting the following error while trying to list databases from MySQL using Sqoop. I am using the Cloudera CDH4 VM, which does not come with MySQL preinstalled by default. I installed MySQL as per the Cloudera tutorial. Now when I try to list databases from MySQL, it fails. Is there a JDBC connectivity issue?
[cloudera@localhost ~]$ sqoop list-databases --connect "jdbc:mysql://localhost.localdomain" --user root --password aaaaaaaa
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/17 11:52:07 INFO sqoop.Sqoop: Running Sqoop version: 1.4.3-cdh4.7.0
14/12/17 11:52:07 ERROR tool.BaseSqoopTool: Error parsing arguments for list-databases:
14/12/17 11:52:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --user
14/12/17 11:52:07 ERROR tool.BaseSqoopTool: Unrecognized argument: root
14/12/17 11:52:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --password
14/12/17 11:52:07 ERROR tool.BaseSqoopTool: Unrecognized argument: aaaaaaaa
Try --help for usage instructions.
usage: sqoop list-databases [GENERIC-ARGS] [TOOL-ARGS]
Common arguments:
   --connect <jdbc-uri>                       Specify JDBC connect string
   --connection-manager <class-name>          Specify connection manager class name
   --connection-param-file <properties-file>  Specify connection parameters file
   --driver <class-name>                      Manually specify JDBC driver class to use
   --hadoop-home <hdir>                       Override $HADOOP_MAPRED_HOME_ARG
   --hadoop-mapred-home <dir>                 Override $HADOOP_MAPRED_HOME_ARG
   --help                                     Print usage instructions
   -P                                         Read password from console
   --password <password>                      Set authentication password
   --password-file <password-file>            Set authentication password file path
   --skip-dist-cache                          Skip copying jars to distributed cache
   --username <username>                      Set authentication username
   --verbose                                  Print more information while working
Generic Hadoop command-line arguments:
(must preceed any tool-specific arguments)
Generic options supported are
-conf <configuration file> specify an application configuration file
-D <property=value> use value for given property
-fs <local|namenode:port> specify a namenode
-jt <local|jobtracker:port> specify a job tracker
-files <comma separated list of files> specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars> specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives> specify comma separated archives to be unarchived on the compute machines.
The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]
Try this instead:
sqoop list-databases --connect jdbc:mysql://localhost \
--username root \
--password aaaaaaaa
The problem was that Sqoop has no --user option; you have to use --username.

sqoop export to mysql fails

My sqoop export from a CentOS VM to MySQL on Windows is failing with
java.io.IOException: Cannot run program "mysqlimport": error=2, No such file or directory.
My sqoop export command is
sqoop export --verbose --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table marksheet --direct --export-dir /user/hive/warehouse/marksheet --input-fields-terminated-by '\001'
My import from MySQL (Windows) to CentOS is successful, though.
sqoop import --verbose --fields-terminated-by ',' --connect jdbc:mysql://192.168.168.1/test --username root --password root123 --table student --hive-import --create-hive-table --hive-home /home/training/hive --warehouse-dir /user/hive/warehouse --fields-terminated-by ',' --hive-table studentmysql
I have searched the internet but did not find much help. Any help please?
I got the solution myself from the error: the mysqlimport program was not installed in the VM.
I had to install the MySQL client in my VM using yum install mysql.
The export works after that.
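For anyone hitting the same error: Sqoop's --direct mode for MySQL shells out to the external mysqlimport client instead of going through JDBC, so that binary must exist on the machine running the export; error=2 is ENOENT, i.e. the program was not found. A minimal pre-flight check, as a sketch (check_mysqlimport is an illustrative name, not part of Sqoop):

```shell
# Sqoop's --direct MySQL export invokes the external mysqlimport program;
# "error=2, No such file or directory" means it is not on the PATH.
# This is an illustrative helper, not a Sqoop tool.
check_mysqlimport() {
  if command -v mysqlimport >/dev/null 2>&1; then
    echo "mysqlimport found"
  else
    echo "mysqlimport missing: install the MySQL client (e.g. yum install mysql) or drop --direct"
  fi
}
```

Dropping --direct also works, at the cost of exporting through JDBC batch inserts instead of the faster mysqlimport path.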