Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected

I am using Hadoop 1.0.3 and Sqoop 1.4.6. I am trying to import a table from MySQL to HDFS, and I am getting the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected
at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:65)
at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:125)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Any suggestions on this?

This error means your Sqoop binary was compiled against a different major version of Hadoop than the one it is running on: in Hadoop 1.x, org.apache.hadoop.mapreduce.JobContext is a class, while in Hadoop 2.x it is an interface. Since your error says a class was found but an interface was expected, your Sqoop build targets Hadoop 2.x while your cluster runs Hadoop 1.x.
You can download a Sqoop binary built for your Hadoop version from the Sqoop site,
go to the link - Releases / Sqoop 1.
The tarballs are named for the Hadoop version they were built against, for example:
sqoop-1.4.5.bin__hadoop-1.0.0.tar.gz
sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
Getting the Sqoop build that matches your Hadoop (or upgrading to Hadoop 2.x and using the Hadoop 2 build) should solve the problem.

The issue could be due to mismatched versions. Please also share the exact command you used.
The issue can also come from mixing Sqoop and Sqoop2; you can look here.
Or, in general, from incompatible versions of Sqoop and Hadoop; look here.
General compatibility info for Sqoop and Hadoop versions is here.

Related

Running Hbase mapreduce job gives HBaseConfiguration NoClassDefFoundError exception

I have set the variable in ~/.bashrc
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/Hbase/lib/hbase-client-1.2.4.jar
but when I run the code
java -cp $HADOOP_CLASSPATH:/home/hadoopuser/Downloads/myjar.jar com.bigdata.uniquecoder.WordCountClass
It still gives me this error.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at com.bigdata.uniquecoder.WordCountClass.main(WordCountClass.java:57)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
NOTE: It works fine when I run it in Eclipse, but running it on top of Hadoop gives this error.
Any help will be highly appreciated.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
The missing class is in hbase-common-x.y.z.jar, not in hbase-client. Update $HADOOP_CLASSPATH with
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/Hbase/lib/hbase-common-1.2.4.jar
Or,
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/Hbase/lib/*
This will load all the jars under $HBASE_HOME/lib (note the /* wildcard: a bare directory entry on a Java classpath only picks up .class files, not the jars inside it).
Make sure $HADOOP_CLASSPATH contains the necessary Hadoop libraries. Otherwise, use the below export command in ~/.bashrc:
export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/common/lib/*:$HADOOP_HOME/share/hadoop/hdfs/*:$HADOOP_HOME/share/hadoop/hdfs/lib/*:$HADOOP_HOME/share/hadoop/yarn/*:$HADOOP_HOME/share/hadoop/yarn/lib/*:/usr/local/Hbase/lib/*:$CLASSPATH

My account on Cosmos global instance seems to be running out of space - maybe need to increase quota

Running a simple HDFS query failed with:
[ms#cosmosmaster-gi ~]$ hadoop fs -ls /user/ms/def_serv/def_servpath
Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file:
/tmp/hsperfdata_ms/21066
Try using the -Djava.io.tmpdir= option to select an alternate temp location.
Exception in thread "main" java.lang.NoClassDefFoundError: ___/tmp/hsperfdata_ms/21078
Caused by: java.lang.ClassNotFoundException: ___.tmp.hsperfdata_ms.21078
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: ___/tmp/hsperfdata_ms/21078. Program will exit.
Any idea how to fix that or increase quota?
Thanks!
ms
Your quota has not been exceeded (see command below), but this was a problem with the cluster. It should be fixed now.
$ hadoop fs -dus /user/ms
hdfs://cosmosmaster-gi/user/ms 90731

Hive mysql connector error

I have installed Hive and MySQL successfully and did the configuration for Hive as suggested in the link, but I see the error below:
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
..
..
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
So I added mysql-connector-java.jar to Hive's lib directory. Now Hive just hangs; I don't get the shell at all.
Kindly suggest how I can resolve it.
You need to add the MySQL connector to Hive's classpath. Hive looks for that connector on its classpath and cannot find it. Download the MySQL connector jar and put it in the following path:
/usr/lib/hive/apache-hive-0.13.0-bin/lib

Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected?

I am using Hadoop 1.0 and Sqoop 1.4, which are compatible with each other.
I am trying to import a table from MySQL to HDFS:
sqoop import --connect jdbc:mysql://localhost/mydemo --table wordcount -m 1 --username root --password root123
I am getting the following error:
Exception in thread "main" java.lang.IncompatibleClassChangeError:
Found interface org.apache.hadoop.mapreduce.JobContext, but class was
expected
I have tried running Sqoop 1.4 with both Hadoop 1.0 and Hadoop 2.0 and still get the same error.
I have also tried Sqoop 1.99 with Hadoop 2.0. So please offer suggestions other than checking compatibility.
Sqoop and Sqoop2 have binary distributions that differ based on the Hadoop version. This kind of error can be seen in Sqoop2 if it was compiled against the wrong version of Hadoop, or if the wrong Hadoop libraries are on your system. Things to try to fix this:
Look for old Hadoop jars that may make it into the classpath; find / -name 'hadoop*.jar' should work.
Make sure you've downloaded a binary distribution of Sqoop. If you're using Hadoop 2, download Sqoop2 for Hadoop 2; if you're using Hadoop 1, download Sqoop2 for Hadoop 1.
Explicitly set the classpath via the Tomcat properties in <server configuration directory>/catalina.properties; jars can be explicitly added to common.loader.
If all else fails, reach out to the Sqoop mailing list.
This error usually occurs when you have installed a Sqoop tarball built for Hadoop version 1 on Hadoop version 2. Download and install the Sqoop tarball built for Hadoop version 2 and the issue will be resolved.
The Sqoop file for Hadoop version 2 looks like this:
sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz

Cassandra ColumnFamilyInputformat throwing IncompatibleClassChangeError on Hadoop 2.2

When I try to run a simple MapReduce program talking to Cassandra, I get the following error. I am using Hadoop 2.2 and Cassandra 2.0.2. Can someone who has solved this problem please respond with the solution?
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.cassandra.hadoop.AbstractColumnFamilyInputFormat.getSplits(AbstractColumnFamilyInputFormat.java:116)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:491)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:508)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)