Exception error when I try to initialize hudson.war

I have installed Hudson on an Ubuntu server and then run java -jar hudson.war, which gives me this exception error message:
Status Code: 500
Exception: The error below occurred during context initialisation, so no further requests can be processed:
java.lang.ExceptionInInitializerError
at java.lang.Class.initializeClass(libgcj.so.10)
at hudson.WebAppMain.installLogger(WebAppMain.java:257)
at hudson.WebAppMain.contextInitialized(WebAppMain.java:112)
at winstone.WebAppConfiguration.<init>(WebAppConfiguration.java:889)
at winstone.HostConfiguration.initWebApp(HostConfiguration.java:131)
at winstone.HostConfiguration.<init>(HostConfiguration.java:73)
at winstone.HostGroup.initHost(HostGroup.java:85)
at winstone.HostGroup.<init>(HostGroup.java:45)
at winstone.Launcher.<init>(Launcher.java:196)
at winstone.Launcher.main(Launcher.java:391)
at java.lang.reflect.Method.invoke(libgcj.so.10)
at Main.main(Main.java:200)
Caused by: com.thoughtworks.xstream.XStream$InitializationException: Could not instantiate converter : com.thoughtworks.xstream.converters.extended.DurationConverter : null
at com.thoughtworks.xstream.XStream.dynamicallyRegisterConverter(XStream.java:735)
at com.thoughtworks.xstream.XStream.setupConverters(XStream.java:699)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:445)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:385)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:323)
at hudson.util.XStream2.<init>(XStream2.java:61)
at hudson.model.Hudson.<clinit>(Hudson.java:3571)
at java.lang.Class.initializeClass(libgcj.so.10)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Constructor.newInstance(libgcj.so.10)
at com.thoughtworks.xstream.XStream.dynamicallyRegisterConverter(XStream.java:728)
... 18 more
Caused by: javax.xml.datatype.DatatypeConfigurationException: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
at com.thoughtworks.xstream.converters.extended.DurationConverter.<init>(DurationConverter.java:33)
at java.lang.reflect.Constructor.newInstance(libgcj.so.10)
... 19 more
Caused by: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at java.lang.Class.forName(libgcj.so.10)
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
... 21 more
Generated by Winstone Servlet Engine v0.9.10 at Mon Oct 25 14:55:59 PDT 2010
Do you know what I am missing?
Any suggestions would be very much appreciated.
Regards,
Naoya

You are probably running the wrong Java: the libgcj frames in your trace mean Hudson is being started with GCJ rather than Sun's (now Oracle's) Java. Check which one you are using.
See here for other people who had this problem:
http://ubuntuforums.org/showthread.php?t=1434376

Same as the comment above: you probably have the wrong Java version.
You can specify the correct path to the Java binary in /etc/default/hudson.
If you use Jenkins, edit /etc/default/jenkins instead.
# /etc/default/{hudson,jenkins}
JAVA_HOME=/path/to/jdk_1.6
JAVA=$JAVA_HOME/bin/java
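To confirm which runtime is actually in use, here is a quick diagnostic sketch (the GCJ runtime behind the libgcj frames above typically reports a VM name like "GNU libgcj", while the Sun/Oracle JDK reports a HotSpot VM):

// WhichJvm.java - print identifying properties of the running JVM
public class WhichJvm {
    public static void main(String[] args) {
        System.out.println("java.vm.name = " + System.getProperty("java.vm.name"));
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}

Run it with the same java binary you point Hudson at; if it still prints a GCJ identifier, the JAVA setting above is not taking effect.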

I encountered a similar issue because there was no more space left in /tmp (on the root partition); df -h /tmp will show whether that is the case.

Related

Spark Notebook crashes while exporting a huge data frame as a CSV file in Scala

I'm using Spark Notebook to process a huge number of Tweets. My final result is stored in a dataframe. When I try to export it as a CSV file, it throws an error, although I used the same method before to export a small dataset. Any suggestions, please?
Spark Notebook 0.7.0
Scala 2.10.6
Spark 2.0.1
Hadoop 2.7.2
Scala code for exporting the dataframe:
df.coalesce(1)
.write.format("com.databricks.spark.csv")
.option("header", "true")
.save("/home/shoeb/results")
Error Message
org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelationCommand.scala:149)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:510)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:194)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:98)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:109)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:111)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:113)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:115)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:117)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:119)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:121)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:123)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:125)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:127)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:129)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:131)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:133)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:135)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:137)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:139)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:141)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:143)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:145)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:147)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:149)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:151)
at $iwC$$iwC$$iwC.<init>(<console>:153)
at $iwC$$iwC.<init>(<console>:155)
at $iwC.<init>(<console>:157)
at <init>(<console>:159)
at .<init>(<console>:163)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1046)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1327)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:822)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:853)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:801)
at notebook.kernel.Repl$$anonfun$6.apply(Repl.scala:202)
at notebook.kernel.Repl$$anonfun$6.apply(Repl.scala:202)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.Console$.withOut(Console.scala:126)
at notebook.kernel.Repl.evaluate(Repl.scala:201)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.replEvaluate$1(ReplCalculator.scala:402)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.apply(ReplCalculator.scala:415)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.apply(ReplCalculator.scala:396)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 16 in stage 5.0 failed 1 times, most recent failure: Lost task 16.0 in stage 5.0 (TID 98, localhost): java.io.FileNotFoundException: /tmp/blockmgr-bacb743a-120b-4b06-aecc-606755c69cf8/2a/temp_shuffle_3b5b834c-6477-4849-8915-1f7afaff6f14 (Too many open files)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:88)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:181)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1441)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1667)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1890)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1903)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1923)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelationCommand.scala:143)
... 77 more
Caused by: java.io.FileNotFoundException: /tmp/blockmgr-bacb743a-120b-4b06-aecc-606755c69cf8/2a/temp_shuffle_3b5b834c-6477-4849-8915-1f7afaff6f14 (Too many open files)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:88)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:181)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
TL;DR: Don't use coalesce(1).
While it can be useful in some cases when the data is small enough, coalesce(1) moves all the data to a single worker node and may even propagate that reduced parallelism upstream to earlier stages.
It is therefore inherently not scalable and should be avoided. Almost all tools these days can read multiple files one way or another, and if you really do need a single file, merging the parts with a dedicated utility is a much better choice.
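A minimal sketch of the alternative (using Spark's Java API here; the equivalent Scala calls exist, the output path is the one from the question, and Spark 2.x has a built-in CSV writer, so the Databricks package is not needed):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

public class ExportCsv {
    // Let every partition write its own part-file instead of funnelling
    // all data through a single task with coalesce(1).
    static void export(Dataset<Row> df, String outputDir) {
        df.write()
          .option("header", "true")
          .csv(outputDir);
    }
    // If a single file really is required, merge the part-files afterwards
    // with a dedicated tool, e.g.: hadoop fs -getmerge <outputDir> out.csv
}

This keeps the write parallel and scalable.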

How to avoid WARN messages about ReflectionsException

How can I prevent WARN messages from being displayed in the logs (without setting the log4j level to ERROR) when I launch Confluent?
I have set up my plugin.path variable in the properties file with the value ${CONFLUENT_HOME}/share/java/kafka-connect-jdbc (with a trailing comma).
I tried putting the kafka-connect-jdbc directory on the classpath, without success.
The following is just a small excerpt from the log file:
[2018-07-10 15:40:30,168] INFO Reflections took 1 ms to scan 1 urls, producing 5 keys and 6 values [using 1 cores] (org.reflections.Reflections)
[2018-07-10 15:40:30,170] WARN could not get type for name org.jmock.Mockery from any class loader (org.reflections.Reflections)
org.reflections.ReflectionsException: could not get type for name org.jmock.Mockery
at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:390)
at org.reflections.Reflections.expandSuperTypes(Reflections.java:381)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader$InternalReflections.<init>(DelegatingClassLoader.java:365)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:277)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:216)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:208)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initPluginLoader(DelegatingClassLoader.java:177)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:154)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:56)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:77)
Caused by: java.lang.ClassNotFoundException: org.jmock.Mockery
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:388)
... 10 more
That does not seem to cause any issues, but it can be confusing to read.
Have you got any suggestions about that?
Thanks in advance,
Diego
You can set the log level for a particular package in the log4j config file:
log4j.logger.org.reflections=ERROR
This helped me.
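If you launch Connect embedded in your own JVM rather than via the scripts, the same thing can be done programmatically; this is a sketch against the log4j 1.x API that Kafka shipped with at the time:

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class QuietReflections {
    public static void main(String[] args) {
        // Raise only the org.reflections logger to ERROR; the root logger
        // (and every other package) keeps its configured level.
        Logger.getLogger("org.reflections").setLevel(Level.ERROR);
    }
}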

java.lang.ExceptionInInitializerError while running a JUnit test method

I have been running a simple JUnit test class with a single test method.
I am stuck with the error below.
The stack trace is as follows:
java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.initialize(J9VMInternals.java:263)
at com.ibm.ws.webservices.engine.soap.SAAJMetaFactoryImpl.newMessageFactory(SAAJMetaFactoryImpl.java:56)
at javax.xml.soap.MessageFactory.newInstance(MessageFactory.java:159)
at com.sun.xml.internal.ws.api.SOAPVersion.<init>(SOAPVersion.java:179)
at com.sun.xml.internal.ws.api.SOAPVersion.<clinit>(SOAPVersion.java:84)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:241)
at com.sun.xml.internal.ws.api.BindingID.<clinit>(BindingID.java:336)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:241)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseBinding(RuntimeWSDLParser.java:425)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseWSDL(RuntimeWSDLParser.java:322)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:147)
at com.sun.xml.internal.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:267)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:230)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:178)
at com.sun.xml.internal.ws.spi.ProviderImpl.createServiceDelegate(ProviderImpl.java:93)
at javax.xml.ws.Service.<init>(Service.java:57)
Can anyone help me understand what could be the reason behind it? The rest of the trace shows:
Caused by: org.apache.commons.logging.LogConfigurationException: org.apache.commons.logging.LogConfigurationException: java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j (Caused by java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j) (Caused by org.apache.commons.logging.LogConfigurationException: java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j (Caused by java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j))
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:532)
at org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:272)
at com.ibm.ws.webservices.engine.components.logger.LogFactory$1.run(LogFactory.java:119)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.webservices.engine.components.logger.LogFactory.getLog(LogFactory.java:111)
at com.ibm.ws.webservices.engine.soap.MessageFactoryImpl.<clinit>(MessageFactoryImpl.java:103)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:241)
... 41 more
Caused by: org.apache.commons.logging.LogConfigurationException: java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j (Caused by java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j)
at org.apache.commons.logging.impl.LogFactoryImpl.getLogConstructor(LogFactoryImpl.java:416)
at org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:525)
... 48 more
Caused by: java.lang.UnsupportedOperationException: Operation [getContextClassLoader] is not supported in jcl-over-slf4j. See also http://www.slf4j.org/codes.html#unsupported_operation_in_jcl_over_slf4j
at org.apache.commons.logging.LogFactory.getContextClassLoader(LogFactory.java:366)
at org.apache.commons.logging.impl.LogFactoryImpl.access$000(LogFactoryImpl.java:113)
at org.apache.commons.logging.impl.LogFactoryImpl$1.run(LogFactoryImpl.java:457)
at java.security.AccessController.doPrivileged(AccessController.java:229)
at org.apache.commons.logging.impl.LogFactoryImpl.loadClass(LogFactoryImpl.java:454)
at org.apache.commons.logging.impl.LogFactoryImpl.getLogConstructor(LogFactoryImpl.java:406)
... 49 more
I did not have to add any jar files to the CLASSPATH, though. I solved it by adding the following system property:
-Dorg.apache.commons.logging.LogFactory=org.apache.commons.logging.impl.SLF4JLogFactory
Is this mandatory when running the web services through JUnit?
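It should not be mandatory in general; it is needed here because commons-logging's factory discovery trips over jcl-over-slf4j on the classpath. If passing the -D flag to the JUnit JVM is inconvenient, here is a sketch of setting the same property from the test itself (the class and method names are hypothetical; it must run before commons-logging is first used):

import org.junit.BeforeClass;
import org.junit.Test;

public class WebServiceClientTest {
    @BeforeClass
    public static void routeJclToSlf4j() {
        // Same effect as the -D flag, provided this runs before any code
        // asks commons-logging for a Log instance.
        System.setProperty("org.apache.commons.logging.LogFactory",
                "org.apache.commons.logging.impl.SLF4JLogFactory");
    }

    @Test
    public void callsTheWebService() {
        // ... exercise the web service client here ...
    }
}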

Unable to run Kundera Sample application

I am unable to run the Kundera sample application.
Exception:
Exception in thread "main" javax.persistence.PersistenceException: invalid persistence.xml
at com.impetus.kundera.loader.PersistenceXMLLoader.getDocument(PersistenceXMLLoader.java:107)
at com.impetus.kundera.loader.PersistenceXMLLoader.findPersistenceUnits(PersistenceXMLLoader.java:142)
at com.impetus.kundera.loader.PersistenceUnitLoader.findPersistenceMetadatas(PersistenceUnitLoader.java:130)
at com.impetus.kundera.loader.PersistenceUnitLoader.getPersistenceMetadata(PersistenceUnitLoader.java:78)
at com.impetus.kundera.loader.PersistenceUnitLoader.load(PersistenceUnitLoader.java:63)
at com.impetus.kundera.loader.ApplicationLoader.load(ApplicationLoader.java:43)
at com.impetus.kundera.KunderaPersistence.initializeKundera(KunderaPersistence.java:95)
at com.impetus.kundera.KunderaPersistence.createEntityManagerFactory(KunderaPersistence.java:72)
at javax.persistence.Persistence.createEntityManagerFactory(Unknown Source)
at javax.persistence.Persistence.createEntityManagerFactory(Unknown Source)
at com.tcs.main.KunderaExample.main(KunderaExample.java:19)
Caused by: org.xml.sax.SAXParseException: cvc-elt.1: Cannot find the declaration of element 'persistence'.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.error(ErrorHandlerWrapper.java:131)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:384)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:318)
at com.sun.org.apache.xerces.internal.impl.xs.XMLSchemaValidator.handleStartElement(XMLSchemaValidator.java:1887)
at com.sun.org.apache.xerces.internal.impl.xs.XMLSchemaValidator.startElement(XMLSchemaValidator.java:685)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:400)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3095)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:921)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:807)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:225)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:283)
at com.impetus.kundera.loader.PersistenceXMLLoader.getDocument(PersistenceXMLLoader.java:103)
... 10 more
This is related to loading the remote persistence.xsd from the Sun Java site. Change the schema location in your persistence.xml to the copy hosted by Kundera, so the opening element looks like this:
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
                 https://raw.github.com/impetus-opensource/Kundera/Kundera-2.0.4/kundera-core/src/test/resources/META-INF/persistence_2_0.xsd"
             version="2.0">
The 2.0.6 release will also load it locally.
-Vivek

Where can I set auto-import="false" for hyperjaxb?

Do you know where I can set auto-import="false" when working with hyperjaxb? I get this exception when calling Persistence.createEntityManagerFactory().
Assuming that auto-import="false" solves my problem, I take it I would have to do the importing manually instead. Where would I go looking for such information?
Thank you.
Exception in thread "main" javax.persistence.PersistenceException: [PersistenceUnit: com.sun.java.xml.ns.persistence:org.jvnet.hyperjaxb3.ejb.schemas.customizations:com.sun.java.xml.ns.persistence.orm:generated] Unable to configure EntityManagerFactory
at org.hibernate.ejb.Ejb3Configuration.configure(Ejb3Configuration.java:265)
at org.hibernate.ejb.HibernatePersistence.createEntityManagerFactory(HibernatePersistence.java:125)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:52)
at PropertiesTest.Test1(PropertiesTest.java:68)
at PropertiesTest.main(PropertiesTest.java:121)
Caused by: org.hibernate.AnnotationException: Use of the same entity name twice: Basic
at org.hibernate.cfg.annotations.EntityBinder.bindEntity(EntityBinder.java:304)
at org.hibernate.cfg.AnnotationBinder.bindClass(AnnotationBinder.java:567)
at org.hibernate.cfg.AnnotationConfiguration.processArtifactsOfType(AnnotationConfiguration.java:546)
at org.hibernate.cfg.AnnotationConfiguration.secondPassCompile(AnnotationConfiguration.java:291)
at org.hibernate.cfg.Configuration.buildMappings(Configuration.java:1148)
at org.hibernate.ejb.Ejb3Configuration.buildMappings(Ejb3Configuration.java:1226)
at org.hibernate.ejb.EventListenerConfigurator.configure(EventListenerConfigurator.java:173)
at org.hibernate.ejb.Ejb3Configuration.configure(Ejb3Configuration.java:854)
at org.hibernate.ejb.Ejb3Configuration.configure(Ejb3Configuration.java:191)
at org.hibernate.ejb.Ejb3Configuration.configure(Ejb3Configuration.java:253)
... 4 more
Caused by: org.hibernate.DuplicateMappingException: duplicate import: Basic refers to both org.jvnet.hyperjaxb3.ejb.schemas.customizations.Basic and com.sun.java.xml.ns.persistence.orm.Basic (try using auto-import="false")
at org.hibernate.cfg.Mappings.addImport(Mappings.java:164)
at org.hibernate.cfg.annotations.EntityBinder.bindEntity(EntityBinder.java:297)
... 13 more