I am using Hadoop 1.2.1 and Sqoop 1.4.6, and I am importing the table test from the database meshtree into HDFS using this command:
`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test`
But it shows this error:
17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Is there any way to fix this problem?
It’s important that you do not use the URL localhost if you intend to use Sqoop with a distributed Hadoop cluster. The connect string you supply will be used on TaskTracker nodes throughout your MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or more likely, no database at all). Instead, you should use the full hostname or IP address of the database host that can be seen by all your remote nodes.
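For example, on a multi-node cluster the connect string should use the database host's real name; a sketch with a placeholder hostname and the default MySQL port:
sqoop import --connect jdbc:mysql://dbhost.example.com:3306/meshtree --username user --password password --table test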
See the "Connecting to a Database Server" section of the Sqoop documentation for more information.
You don't have the required permissions, so ask your MySQL DBA to grant them to you.
Or you can do it yourself if you have admin access to MySQL:
grant all privileges on databasename.* to 'username'@'%' identified by 'password';
*: all tables in the database
%: allow connections from any host
The above syntax grants permissions to a user on the MySQL server. In your case it will be:
grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword';
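To verify that the grant took effect, you can check from the mysql client:
mysql> SHOW GRANTS FOR 'root'@'localhost';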
You are importing without providing an HDFS target directory. When no target directory is given, Sqoop creates a directory in HDFS named after your MySQL table, so the import only succeeds once; a second run fails because that directory already exists.
So your command
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test
creates a directory named test in HDFS.
Just add the following option:
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1
Hopefully it works fine; refer to the Sqoop import documentation for all related options.
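Alternatively, if you just want to re-run the same import and overwrite the old output, Sqoop's --delete-target-dir option removes the existing directory before importing:
sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --delete-target-dir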
I was using the export command in Sqoop and got this error while exporting from HDFS to MySQL.
The command is:
sqoop export \
--connect jdbc:mysql://localhost/property \
--username root \
--password root \
--table xyz \
--m 1 \
--export-dir abc.csv
The error is:
16/08/30 23:11:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/08/30 23:11:34 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/08/30 23:11:34 INFO tool.CodeGenTool: Beginning code generation
16/08/30 23:11:34 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:848)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:736)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:759)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:269)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:226)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1773)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1578)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Add mysql-connector.jar to $SQOOP_HOME/lib.
As per Sqoop docs,
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine
Also,
Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL’s Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
So, add --driver com.mysql.jdbc.Driver to your command.
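With the driver class added, the export command from the question would look like this (unchanged apart from the new flag):
sqoop export --connect jdbc:mysql://localhost/property --username root --password root --table xyz --m 1 --export-dir abc.csv --driver com.mysql.jdbc.Driver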
This is the command I am executing:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
The error:
$ sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:08:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:08:06 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/05/20 14:08:06 INFO tool.CodeGenTool: Beginning code generation
16/05/20 14:08:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
16/05/20 14:08:06 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1d296da is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2518)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1748)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1961)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2537)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2466)
at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1383)
at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:2939)
at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:576)
at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:440)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
16/05/20 14:08:06 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
I've read somewhere that the warnings are harmless.
Also, when I execute this command, I get output:
$ sqoop list-tables --connect jdbc:mysql://localhost/hadoopguide
The output:
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /media/akshay/MyMedia/sqoop-1.4.6.bin__hadoop-1.0.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/05/20 14:10:32 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/20 14:10:32 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
widgets
Please provide some help or suggestions.
Make sure to specify the following arguments (a combined example follows this list):
The HDFS directory where to import your data:
--target-dir $hdfs_path
If your MySQL database needs a username and a password:
--username $user --password $pass
If you haven't done so yet, include the port number in your MySQL connect string:
jdbc:mysql://$server:$port/$database
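Putting all three together, the command might look like this (the $-prefixed names and the HDFS path are placeholders for your own values):
sqoop import --connect jdbc:mysql://$server:$port/$database --username $user --password $pass --table widgets --target-dir $hdfs_path -m 1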
This should help!
@akshay naidu: It seems that the MySQL connector version you are using is not the latest. Check the JIRA link below:
https://issues.apache.org/jira/browse/SQOOP-1400
Upgrade your connector and then try again.
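A minimal sketch of the swap, assuming the old connector lives in $SQOOP_HOME/lib and a newer jar has already been downloaded (file names and versions are placeholders):
rm $SQOOP_HOME/lib/mysql-connector-java-5.1.6.jar
cp ~/Downloads/mysql-connector-java-5.1.38.jar $SQOOP_HOME/lib/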
I am using Hadoop 2.6.0 with Kerberos security. I have installed HBase with Kerberos security and am able to create a table and scan it.
I can also run a Sqoop job to import data from MySQL into HDFS, but the Sqoop job fails when trying to import from MySQL into HBase.
Sqoop Command
sqoop import --hbase-create-table --hbase-table newtable --column-family ck --hbase-row-key id --connect jdbc:mysql://localhost/sample --username root --password root --table newtable -m 1
Exception
15/01/21 16:30:24 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=localhost:2181 sessionTimeout=90000 watcher=hconnection-0x734c0647, quorum=localhost:2181, baseZNode=/hbase
15/01/21 16:30:24 INFO zookeeper.ClientCnxn: Opening socket connection to server 127.0.0.1/127.0.0.1:2181. Will not attempt to authenticate using SASL (unknownerror)
15/01/21 16:30:24 INFO zookeeper.ClientCnxn: Socket connection established to 127.0.0.1/127.0.0.1:2181, initiating session
15/01/21 16:30:24 INFO zookeeper.ClientCnxn: Session establishment complete on server 127.0.0.1/127.0.0.1:2181, sessionid = 0x14b0ac124600016, negotiated timeout = 40000
15/01/21 16:30:25 ERROR tool.ImportTool: Error during import: Can't get authentication token
Could you please try the following:
In the connection string, add the port number:
jdbc:mysql://localhost:3306/sample
Remove --hbase-create-table and instead create the required table in HBase first with the column family.
Mention --split-by id.
Finally, mention a specific --fetch-size, as the Sqoop client for MySQL has an internal bug that tries to set the default MIN fetch size, which runs into an exception.
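Taken together, a sketch of the revised command (the port, fetch size, and a pre-created HBase table are assumptions):
sqoop import --connect jdbc:mysql://localhost:3306/sample --username root --password root --table newtable --hbase-table newtable --column-family ck --hbase-row-key id --split-by id --fetch-size 1000 -m 1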
Could you attempt the import again and let us know ?
I have copied the jar file into the sqoop/lib folder, and the command I am using in Sqoop is:
bin/sqoop import --connect jdbc:mysql://localhost:3306/sqoop --username root --password admin --table cities
Error message:
14/06/21 08:44:44 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:772)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:660)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:683)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:240)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:223)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:347)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1277)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1089)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:96)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:396)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
You need to add the mysql-connector jar file to the sqoop/lib folder to access the MySQL database, and then execute your Sqoop command.
Download link: http://mvnrepository.com/artifact/mysql/mysql-connector-java/5.1.6
Download the mysql-connector jar from this link.
Extract it and copy the mysql-connector-java-5.1.46.jar file to the $SQOOP_HOME/lib/ directory.
Copy the same jar to the lib directory in HDFS, if Sqoop is used with HDFS.
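For example (the jar version and the HDFS lib path are assumptions; adjust to your installation):
cp mysql-connector-java-5.1.46.jar $SQOOP_HOME/lib/
hdfs dfs -put $SQOOP_HOME/lib/mysql-connector-java-5.1.46.jar /user/$USER/lib/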
Try adding this option:
--driver com.mysql.jdbc.Driver
Also make sure you have run the commands below:
% mysql -u root -p
Enter password:
mysql> GRANT ALL PRIVILEGES ON sqoop.* TO ''@'localhost';
Query OK, 0 rows affected (0.00 sec)
mysql> quit;
Bye
I'm trying to import a small table from MySQL to HDFS using Sqoop. The table has 2 columns, id (primary key) and name. I'm able to list databases and tables via Sqoop, but I get an exception while importing the table to HDFS. Kindly help. Below is the error log.
13/12/04 02:05:38 WARN conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
13/12/04 02:05:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/12/04 02:05:39 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-hadoop/mapred/staging/hadoop1439217057/.staging/job_local1439217057_0001
13/12/04 02:05:39 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
13/12/04 02:05:39 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@35a16869
13/12/04 02:05:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://prat1:9000/home/hadoop/usr/sqoop-1.4.3-cdh4.3.0/lib/commons-compress-1.4.1.jar
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:824)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:254)
at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:290)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:361)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:226)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:555)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:111)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
As you do not have write permissions, you are getting security.UserGroupInformation: PriviledgedActionException.
Try logging in as the hdfs user and then run the Sqoop command:
su root
type password
su hdfs
And then run the sqoop export command.
Give the /usr/lib/sqoop/lib/<mysqljavaconnector>-bin.jar file 777 permissions, and also make sure that the files in /usr/lib/sqoop/* and /usr/local/hadoop/* are owned by the same user.
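For example (the jar name and the user/group are assumptions; substitute your own):
sudo chmod 777 /usr/lib/sqoop/lib/mysql-connector-java-bin.jar
sudo chown -R hduser:hadoop /usr/lib/sqoop /usr/local/hadoop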
Try to do this:
su root
Type root password
su hdfs
And then run the sqoop command and it will work like a champ!