Sqoop error while importing to HDFS - MySQL

I'm trying to import a small table from MySQL into HDFS using Sqoop. The table has two columns: id (the primary key) and name. I'm able to list databases and tables via Sqoop, but I get an exception while importing the table into HDFS. Kindly help. Below is the error log.
13/12/04 02:05:38 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
13/12/04 02:05:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/12/04 02:05:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-hadoop/mapred/staging/hadoop1439217057/.staging/job_local1439217057_0001
13/12/04 02:05:39 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
13/12/04 02:05:39 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@35a16869
13/12/04 02:05:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:824)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:254)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:290)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:361)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:226)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:555)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:111)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

You are getting security.UserGroupInformation: PriviledgedActionException because you do not have write permissions.
Try to log in as the hdfs user and then run the Sqoop command:
su root
(type the root password)
su hdfs
Then run the Sqoop import command.

Give the /usr/lib/sqoop/lib/<mysqljavaconnector>-bin.jar file 777 permissions, and also make sure that the files in /usr/lib/sqoop/* and /usr/local/hadoop/* are owned by the same user.
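For example (a sketch only; the connector jar name and the owning user vary by installation, so mysql-connector-java-5.1.31-bin.jar and the user hadoop below are assumptions):
sudo chmod 777 /usr/lib/sqoop/lib/mysql-connector-java-5.1.31-bin.jar
sudo chown -R hadoop:hadoop /usr/lib/sqoop /usr/local/hadoop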

Related

Import data from mysql into HDFS using Sqoop

I am using Hadoop 1.2.1 and Sqoop 1.4.6, and I am importing the table test from the database meshtree into HDFS with this command:
`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test`
But, it shows this error:
17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Is there any way to fix this problem?
It’s important that you do not use the URL localhost if you intend to use Sqoop with a distributed Hadoop cluster. The connect string you supply will be used on TaskTracker nodes throughout your MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or more likely, no database at all). Instead, you should use the full hostname or IP address of the database host that can be seen by all your remote nodes.
Please see the "Connecting to a Database Server" section of the Sqoop documentation for more information.
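For example, with a hypothetical database host db1.example.com that every node in the cluster can resolve (and using -P, per the warning in the log, instead of a plaintext password):
sqoop import --connect jdbc:mysql://db1.example.com:3306/meshtree --username user -P --table test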
You don't have permissions, so contact your MySQL DBA to grant them to you.
Or you may do it yourself if you have admin access to MySQL:
grant all privileges on databasename.* to 'username'@'%' identified by 'password';
* - for all tables
% - allow from any host
The above syntax grants permissions to a user in the MySQL server. In your case it will be:
grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword';
You are importing without providing an HDFS target directory. When no target directory is provided, Sqoop runs the import only once and creates a directory in HDFS named after your MySQL table.
So your query
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test
creates a directory with the name test in HDFS, which is why the next run fails with "Output directory test already exists".
Just add a target directory, as in the command below:
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1
Hopefully this works fine; see the Sqoop import documentation for the related options.
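Alternatively (a sketch; the paths are taken from the log above), you can clear the leftovers from the earlier run before re-importing:
hadoop fs -rmr test    # remove the existing HDFS output directory
rm /home/student/Installations/hadoop-1.2.1/test.java    # remove the stale generated class file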

Error in Sqoop export command

I was using the export command in Sqoop and faced this error while exporting from HDFS to MySQL.
The command is:
sqoop export \
  --connect jdbc:mysql://localhost/property \
  --username root \
  --password root \
  --table xyz \
  --m 1 \
  --export-dir abc.csv
The error is:
16/08/30 23:11:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/30 23:11:34 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/30 23:11:34 INFO tool.CodeGenTool: Beginning code generation
16/08/30 23:11:34 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:848)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:736)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1773)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1578)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Add mysql-connector.jar in $SQOOP_HOME/lib.
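For example (a sketch; the jar file name depends on the Connector/J version you downloaded, and $SQOOP_HOME must point at your Sqoop installation):
cp mysql-connector-java-5.1.31-bin.jar $SQOOP_HOME/lib/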
As per Sqoop docs,
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine
Also,
Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL’s Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
So, add --driver com.mysql.jdbc.Driver in your command.
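With the original command, that would look like (same values as above, with only the driver argument added):
sqoop export \
  --connect jdbc:mysql://localhost/property \
  --driver com.mysql.jdbc.Driver \
  --username root \
  --password root \
  --table xyz \
  --m 1 \
  --export-dir abc.csv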

Error on my first Sqoop job

This is the command I am executing:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
The error:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:08:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:08:06 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/20 14:08:06 INFO tool.CodeGenTool: Beginning code generation
16/05/20 14:08:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
16/05/20 14:08:06 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2518)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1748)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1961)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2537)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2466)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1383)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:2939)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:576)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:440)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
16/05/20 14:08:06 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
I've read somewhere that the warnings are harmless.
Also, when I execute this command I get output:
$ sqoop list-tables --connect jdbc:mysql://localhost/hadoopguide
The output:
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:10:32 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:10:32 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
widgets
Please provide some help or suggestions.
Make sure to specify the following arguments:
The HDFS directory where to import your data:
--target-dir $hdfs_path
If your MySQL database needs a username and a password:
--username $user --password $pass
If you haven't done this yet, provide a port number in your MySQL path:
jdbc:mysql://$server:$port/$database
This should help!
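Put together (a sketch; the username, password, and target directory below are placeholders, and 3306 is MySQL's default port):
sqoop import \
  --connect jdbc:mysql://localhost:3306/hadoopguide \
  --username $user --password $pass \
  --table widgets \
  --target-dir /user/$USER/widgets \
  -m 1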
@akshay naidu: It seems that the MySQL connector version you are using is not the latest. Check the JIRA link below:
https://issues.apache.org/jira/browse/SQOOP-1400
Upgrade your connector and then try again.
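For example (a sketch that assumes you have already downloaded a newer Connector/J jar; the version number here is arbitrary):
rm $SQOOP_HOME/lib/mysql-connector-java-*-bin.jar
cp mysql-connector-java-5.1.40-bin.jar $SQOOP_HOME/lib/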

Caused by: java.sql.SQLException: The database is already in use by another process: org.hsqldb.persist.NIOLockFile

I have replaced the Sqoop metastore with a MySQL db.
When I try to run a saved Sqoop job from the command line, it asks me for the password and runs the job when I supply it.
However, I need to run this job through Oozie.
--password is not recognized as a valid argument on the command line when executing a saved Sqoop job, and in any case the job keeps asking for the password through the prompt.
Now, when I try to run it through oozie,
I get
Caused by: java.sql.SQLException: The database is already in use by another process: org.hsqldb.persist.NIOLockFile@950abfc6[file=/home/yarn/.sqoop/metastore.db.lck, exists=false, locked=false, valid=false, fl=null]: java.io.FileNotFoundException: /home/yarn/.sqoop/metastore.db.lck (No such file or directory)
What is this and why is it happening? I am connecting directly to my MySQL metastore and have configured sqoop-site.xml accordingly.
Why is this trying to connect to HSQLDB?
Why is it locked?
How do I fix this?
Also, how do I supply the password for the execution of a Sqoop job in Oozie?

java.sql.SQLException: exception being thrown for Sqoop import

When I'm trying to import a table stored in a MySQL database into HDFS using
sqoop import --connect jdbc:mysql://hostname1.com/mydb --username user1 --password pwd1 --table emp1;
I'm getting the following exception:
Warning: /opt/cloudera/parcels/CDH-5.4.3-1.cdh5.4.3.p0.6/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/07/24 22:48:54 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
15/07/24 22:48:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/07/24 22:48:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/07/24 22:48:55 INFO tool.CodeGenTool: Beginning code generation
15/07/24 22:48:55 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp1` AS t LIMIT 1
15/07/24 22:48:55 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@2d749418 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@2d749418 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/07/24 22:48:55 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
My Sqoop version is 1.4.5-cdh5.4.3 and my mysql-connector version is 5.1.31.
Any help in fixing my exception?
Thanks in advance.
In this version of Sqoop there is a problem with the fetch size, which is set to Integer.MIN_VALUE (-2147483648) in the MySQLManager class in /src/java/org/apache/sqoop/manager/MySQLManager.java. The default value of 0 is overridden in this class. Because of this, the result set associated with the rows does not release its lock, hence the error message. See the code below from the Sqoop source:
@Override
protected void initOptionDefaults() {
  if (options.getFetchSize() == null) {
    LOG.info("Preparing to use a MySQL streaming resultset.");
    options.setFetchSize(Integer.MIN_VALUE); // this line is causing the error
  } else if (
      !options.getFetchSize().equals(Integer.MIN_VALUE)
      && !options.getFetchSize().equals(0)) {
    LOG.info("Argument '--fetch-size " + options.getFetchSize()
        + "' will probably get ignored by MySQL JDBC driver.");
    // see also
    // http://dev.mysql.com/doc/refman/5.5/en
    // /connector-j-reference-implementation-notes.html
  }
}
To solve this error, remove the above initOptionDefaults method from the MySQLManager class in the source code of your Sqoop version.
REF: https://issues.apache.org/jira/browse/SQOOP-1400
If you are using the Sqoop client Java API, you can specify the fetch size explicitly in the SqoopOptions:
SqoopOptions options = new SqoopOptions();
options.setFetchSize(1000);
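On the command line, the corresponding option is the --fetch-size import argument referenced in the source above; a sketch using the command from the question (1000 mirrors the Java snippet and is otherwise arbitrary):
sqoop import --connect jdbc:mysql://hostname1.com/mydb --username user1 --password pwd1 --table emp1 --fetch-size 1000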