Configuring Ruby metrics with Rcov in Hudson gives an error - hudson

When I attempt to save a job that runs code coverage tests and is configured to publish an Rcov report, I get the error message listed below and the changes I made aren't saved. This problem cropped up with Hudson version 1.362 and still exists in 1.363. If I uncheck the "Publish coverage report" checkbox, the job can be saved.
Status Code: 500
Exception:
Stacktrace:
java.lang.InstantiationError: hudson.plugins.rubyMetrics.rcov.model.MetricTarget
at org.kohsuke.stapler.RequestImpl.bindParametersToList(RequestImpl.java:271)
at hudson.plugins.rubyMetrics.rcov.RcovPublisher$DescriptorImpl.newInstance(RcovPublisher.java:143)
at hudson.plugins.rubyMetrics.rcov.RcovPublisher$DescriptorImpl.newInstance(RcovPublisher.java:104)
at hudson.util.DescribableList.rebuild(DescribableList.java:147)
at hudson.model.Project.submit(Project.java:198)
at hudson.model.FreeStyleProject.submit(FreeStyleProject.java:97)
at hudson.model.Job.doConfigSubmit(Job.java:1050)
at hudson.model.AbstractProject.doConfigSubmit(AbstractProject.java:555)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.kohsuke.stapler.Function$InstanceFunction.invoke(Function.java:235)
at org.kohsuke.stapler.Function.bindAndInvoke(Function.java:116)
at org.kohsuke.stapler.Function.bindAndInvokeAndServeResponse(Function.java:57)
at org.kohsuke.stapler.MetaClass$1.doDispatch(MetaClass.java:75)
at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:30)
at org.kohsuke.stapler.Stapler.invoke(Stapler.java:525)
at org.kohsuke.stapler.MetaClass$6.doDispatch(MetaClass.java:181)
at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:30)
at org.kohsuke.stapler.Stapler.invoke(Stapler.java:525)
at org.kohsuke.stapler.Stapler.invoke(Stapler.java:441)
at org.kohsuke.stapler.Stapler.service(Stapler.java:123)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:45)
at winstone.ServletConfiguration.execute(ServletConfiguration.java:249)
at winstone.RequestDispatcher.forward(RequestDispatcher.java:335)
at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:378)
at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:94)
at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:86)
at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:47)
at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:84)
at hudson.security.ChainedServletFilter.doFilter(ChainedServletFilter.java:76)
at hudson.security.HudsonFilter.doFilter(HudsonFilter.java:164)
at winstone.FilterConfiguration.execute(FilterConfiguration.java:195)
at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:368)
at winstone.RequestDispatcher.forward(RequestDispatcher.java:333)
at winstone.RequestHandlerThread.processRequest(RequestHandlerThread.java:244)
at winstone.RequestHandlerThread.run(RequestHandlerThread.java:150)
at java.lang.Thread.run(Thread.java:619)
Does anyone have a good solution? Thanks.

See Hudson bug report 6808.

Related

Spark Notebook crashes while exporting a huge dataframe as a CSV file in Scala

I'm using Spark Notebook to process a huge number of tweets. My final result is stored in a dataframe. When I try to export it as a CSV file, it shows an error, even though I used the same method before to export a smaller dataset. Any suggestions, please?
Spark Notebook 0.7.0
Scala 2.10.6
Spark 2.0.1
Hadoop 2.7.2
Scala code for exporting the dataframe:
df.coalesce(1)
.write.format("com.databricks.spark.csv")
.option("header", "true")
.save("/home/shoeb/results")
Error Message
org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelationCommand.scala:149)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:115)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:510)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:194)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:98)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:109)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:111)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:113)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:115)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:117)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:119)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:121)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:123)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:125)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:127)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:129)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:131)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:133)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:135)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:137)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:139)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:141)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:143)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:145)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:147)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:149)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:151)
at $iwC$$iwC$$iwC.<init>(<console>:153)
at $iwC$$iwC.<init>(<console>:155)
at $iwC.<init>(<console>:157)
at <init>(<console>:159)
at .<init>(<console>:163)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1046)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1327)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:822)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:853)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:801)
at notebook.kernel.Repl$$anonfun$6.apply(Repl.scala:202)
at notebook.kernel.Repl$$anonfun$6.apply(Repl.scala:202)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.Console$.withOut(Console.scala:126)
at notebook.kernel.Repl.evaluate(Repl.scala:201)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.replEvaluate$1(ReplCalculator.scala:402)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.apply(ReplCalculator.scala:415)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$27.apply(ReplCalculator.scala:396)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 16 in stage 5.0 failed 1 times, most recent failure: Lost task 16.0 in stage 5.0 (TID 98, localhost): java.io.FileNotFoundException: /tmp/blockmgr-bacb743a-120b-4b06-aecc-606755c69cf8/2a/temp_shuffle_3b5b834c-6477-4849-8915-1f7afaff6f14 (Too many open files)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:88)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:181)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1454)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1442)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1441)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1441)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1667)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1622)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1611)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1890)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1903)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1923)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1.apply$mcV$sp(InsertIntoHadoopFsRelationCommand.scala:143)
... 77 more
Caused by: java.io.FileNotFoundException: /tmp/blockmgr-bacb743a-120b-4b06-aecc-606755c69cf8/2a/temp_shuffle_3b5b834c-6477-4849-8915-1f7afaff6f14 (Too many open files)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:88)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:181)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:150)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:79)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:47)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
TL;DR: Don't use coalesce(1).
While useful in some cases, when the data is small enough, coalesce(1) moves all the data to a single worker node, and, because it avoids a full shuffle, it can also propagate that single-partition constraint upstream and reduce the parallelism of earlier stages.
Therefore it is inherently not scalable and should be avoided. Almost all tools these days can, one way or another, read multiple files, and if you really need one file, using a dedicated utility (such as hdfs dfs -getmerge) is a much better choice.
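For illustration, a minimal sketch of the fix: simply drop the coalesce(1) and let every partition write its own part file (this reuses the format, option, and path from the question):

// Each partition writes its own part-XXXXX file under the output directory,
// so no single node has to buffer or write the whole dataframe.
df.write
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .save("/home/shoeb/results")

If a single CSV file is genuinely required, the part files can be concatenated afterwards, for example with hdfs dfs -getmerge when the output lives on HDFS, or with cat for a local directory.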

How to avoid WARN messages about ReflectionsException

How can I avoid WARN messages being displayed in the logs (without setting the log4j level to ERROR) when I launch Confluent?
I have set up my plugin.path variable in the properties file with the value ${CONFLUENT_HOME}/share/java/kafka-connect-jdbc (with the trailing comma).
I also tried to put the kafka-connect-jdbc directory on the classpath, without success.
The following is just a small excerpt from the log file:
[2018-07-10 15:40:30,168] INFO Reflections took 1 ms to scan 1 urls, producing 5 keys and 6 values [using 1 cores] (org.reflections.Reflections)
[2018-07-10 15:40:30,170] WARN could not get type for name org.jmock.Mockery from any class loader (org.reflections.Reflections)
org.reflections.ReflectionsException: could not get type for name org.jmock.Mockery
at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:390)
at org.reflections.Reflections.expandSuperTypes(Reflections.java:381)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader$InternalReflections.<init>(DelegatingClassLoader.java:365)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:277)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:216)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:208)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initPluginLoader(DelegatingClassLoader.java:177)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:154)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:56)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:77)
Caused by: java.lang.ClassNotFoundException: org.jmock.Mockery
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.reflections.ReflectionUtils.forName(ReflectionUtils.java:388)
... 10 more
That does not seem to cause any issues, but it can be confusing to read.
Have you got any suggestions about that?
Thanks in advance,
Diego
You can set the log level for a particular package in the log4j config file:
log4j.logger.org.reflections=ERROR
This helped me.
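In case it helps locate the right file: in the Confluent distribution the Connect worker usually reads its log4j settings from etc/kafka/connect-log4j.properties (the exact path is an assumption and may differ per installation), so the line would go there:

# etc/kafka/connect-log4j.properties -- path may vary per installation
log4j.logger.org.reflections=ERROR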

JUnit testing using a containsAll assertion

I'm testing a search function in the booking system I recently coded for my first-semester project. My idea to test this was: I let the original search method return an ArrayList of customers found in the search, fetch the list of all customers, and if the list of all customers contains the ArrayList of customers found in the search, I assert that this works.
Note: beforehand, I create a specific entry which I know I will find (hopefully, right?).
My problem is that when I call -
assertTrue(allCustomers.containsAll(searchResults));
- I get an error:
junit.framework.AssertionFailedError
at TestPackages.newBookingControllerTest.testOnEnter(newBookingControllerTest.java:98)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:86)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:74)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:211)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Can anyone elaborate on why this happens? I can edit in some source code if you need it. Thanks!
assertEquals takes two parameters, so you may see a compilation error.
You should be using
assertTrue(allCustomers.containsAll(searchResults)); // asserts that allCustomers indeed contains every element of searchResults
OR
assertEquals(true, allCustomers.containsAll(searchResults)); // the boolean returned by the containsAll API is compared with true
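As for why the assertion itself fails at runtime: containsAll compares elements with equals(), which defaults to reference identity, so if the search builds new Customer objects instead of returning the stored ones, the check fails even for identical data. This is only an assumption, since the Customer class isn't shown, but a sketch of the kind of override that makes containsAll compare values looks like this (the name field is hypothetical):

import java.util.Objects;

public class Customer {
    private final String name; // hypothetical field; use your real identifying fields

    public Customer(String name) {
        this.name = name;
    }

    // containsAll relies on equals(); without this override it compares references
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Customer)) return false;
        return Objects.equals(name, ((Customer) o).name);
    }

    // keep hashCode consistent with equals
    @Override
    public int hashCode() {
        return Objects.hash(name);
    }
}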

Unable to run Kundera Sample application

I am unable to run a sample application using Kundera.
Exception:
Exception in thread "main" javax.persistence.PersistenceException: invalid persistence.xml
at com.impetus.kundera.loader.PersistenceXMLLoader.getDocument(PersistenceXMLLoader.java:107)
at com.impetus.kundera.loader.PersistenceXMLLoader.findPersistenceUnits(PersistenceXMLLoader.java:142)
at com.impetus.kundera.loader.PersistenceUnitLoader.findPersistenceMetadatas(PersistenceUnitLoader.java:130)
at com.impetus.kundera.loader.PersistenceUnitLoader.getPersistenceMetadata(PersistenceUnitLoader.java:78)
at com.impetus.kundera.loader.PersistenceUnitLoader.load(PersistenceUnitLoader.java:63)
at com.impetus.kundera.loader.ApplicationLoader.load(ApplicationLoader.java:43)
at com.impetus.kundera.KunderaPersistence.initializeKundera(KunderaPersistence.java:95)
at com.impetus.kundera.KunderaPersistence.createEntityManagerFactory(KunderaPersistence.java:72)
at javax.persistence.Persistence.createEntityManagerFactory(Unknown Source)
at javax.persistence.Persistence.createEntityManagerFactory(Unknown Source)
at com.tcs.main.KunderaExample.main(KunderaExample.java:19)
Caused by: org.xml.sax.SAXParseException: cvc-elt.1: Cannot find the declaration of element 'persistence'.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.error(ErrorHandlerWrapper.java:131)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:384)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:318)
at com.sun.org.apache.xerces.internal.impl.xs.XMLSchemaValidator.handleStartElement(XMLSchemaValidator.java:1887)
at com.sun.org.apache.xerces.internal.impl.xs.XMLSchemaValidator.startElement(XMLSchemaValidator.java:685)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:400)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3095)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:921)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:807)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:225)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:283)
at com.impetus.kundera.loader.PersistenceXMLLoader.getDocument(PersistenceXMLLoader.java:103)
... 10 more
This is related to loading the remote persistence.xsd from the Sun Java site. Change the root element in your persistence.xml to point at this schema:
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence https://raw.github.com/impetus-opensource/Kundera/Kundera-2.0.4/kundera-core/src/test/resources/META-INF/persistence_2_0.xsd"
    version="2.0">
The 2.0.6 release will also load it locally.
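To complete the picture, the root element above would be closed by a unit definition such as the following (the unit name "example_pu" is a hypothetical placeholder; the provider class is the one visible in the stack trace above):

    <persistence-unit name="example_pu">
        <provider>com.impetus.kundera.KunderaPersistence</provider>
    </persistence-unit>
</persistence>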
-Vivek

Exception Error when I try to initialize hudson.war

I have installed Hudson on an Ubuntu server and then run java -jar hudson.war, which gives me this exception error message:
Status Code: 500
Exception: The error below occurred during context initialisation, so no further requests can be processed:
java.lang.ExceptionInInitializerError
at java.lang.Class.initializeClass(libgcj.so.10)
at hudson.WebAppMain.installLogger(WebAppMain.java:257)
at hudson.WebAppMain.contextInitialized(WebAppMain.java:112)
at winstone.WebAppConfiguration.<init>(WebAppConfiguration.java:889)
at winstone.HostConfiguration.initWebApp(HostConfiguration.java:131)
at winstone.HostConfiguration.<init>(HostConfiguration.java:73)
at winstone.HostGroup.initHost(HostGroup.java:85)
at winstone.HostGroup.<init>(HostGroup.java:45)
at winstone.Launcher.<init>(Launcher.java:196)
at winstone.Launcher.main(Launcher.java:391)
at java.lang.reflect.Method.invoke(libgcj.so.10)
at Main.main(Main.java:200)
Caused by: com.thoughtworks.xstream.XStream$InitializationException: Could not instantiate converter : com.thoughtworks.xstream.converters.extended.DurationConverter : null
at com.thoughtworks.xstream.XStream.dynamicallyRegisterConverter(XStream.java:735)
at com.thoughtworks.xstream.XStream.setupConverters(XStream.java:699)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:445)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:385)
at com.thoughtworks.xstream.XStream.<init>(XStream.java:323)
at hudson.util.XStream2.<init>(XStream2.java:61)
at hudson.model.Hudson.<clinit>(Hudson.java:3571)
at java.lang.Class.initializeClass(libgcj.so.10)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Constructor.newInstance(libgcj.so.10)
at com.thoughtworks.xstream.XStream.dynamicallyRegisterConverter(XStream.java:728)
... 18 more
Caused by: javax.xml.datatype.DatatypeConfigurationException: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
at com.thoughtworks.xstream.converters.extended.DurationConverter.<init>(DurationConverter.java:33)
at java.lang.reflect.Constructor.newInstance(libgcj.so.10)
... 19 more
Caused by: java.lang.ClassNotFoundException: gnu.xml.datatype.JAXPDatatypeFactory
at java.lang.Class.forName(libgcj.so.10)
at javax.xml.datatype.DatatypeFactory.newInstance(libgcj.so.10)
... 21 more
Generated by Winstone Servlet Engine v0.9.10 at Mon Oct 25 14:55:59 PDT 2010
Do you know what I am missing? Any suggestions would be very appreciated.
Regards,
Naoya
You are probably running the wrong Java. Check that you are using Sun's (now Oracle's) Java; the libgcj.so.10 frames in the trace mean Hudson is being run with GCJ's runtime instead.
See here for other people who had this problem:
http://ubuntuforums.org/showthread.php?t=1434376
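If it helps, a quick way to check and fix this on Ubuntu (a sketch; package and path names vary by release):

# Show which Java the system default resolves to; GCJ identifies itself in the output
java -version
# Switch the system default to an installed Sun/Oracle JDK
sudo update-alternatives --config java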
Same comment: you probably have the wrong Java version.
You can specify the correct path to the java binary in /etc/default/hudson.
If you use Jenkins, then edit /etc/default/jenkins instead.
# /etc/default/{hudson,jenkins}
JAVA_HOME=/path/to/jdk_1.6
JAVA=$JAVA_HOME/bin/java
I encountered a similar issue when /tmp (on the root partition) ran out of space.