NoClassDefFoundError with sbt and scala.swing - swing

I'm new to JVM land so I apologize if this is a common problem. I'm using Scala (2.12) with sbt 0.13.13 on OSX.
I'm working on a tiny app that depends on the GUI library scala.swing (2.10.x). I ran into a runtime issue almost immediately with example code (http://otfried.org/scala/index_28.html).
Specifically, when invoking sbt run I get a stacktrace leading with:
[error] (run-main-0) java.lang.NoClassDefFoundError: scala/Proxy$class
java.lang.NoClassDefFoundError: scala/Proxy$class
at scala.swing.Window.<init>(Window.scala:25)
at scala.swing.Frame.<init>(RichWindow.scala:75)
at scala.swing.MainFrame.<init>(MainFrame.scala:19)
(Proxy appears to be a class/trait in the scala stdlib)
Reading on SO and elsewhere suggests this kind of exception is typically emitted when code present at compile time cannot be located subsequently at runtime. Indeed, the code compiles just fine; it is only when running it that the problem occurs.
All suggestions I've found are to reconcile your classpath to resolve these issues. However, if the sbt console is to be believed, my compile-time and run-time classpaths are identical:
> show compile:fullClasspath
[info] * Attributed(/Users/chris/Projects/thing2/target/scala-2.12/classes)
[info] * Attributed(/Users/chris/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.1.jar)
[info] * Attributed(/Users/chris/.ivy2/cache/org.scala-lang/scala-swing/jars/scala-swing-2.10.6.jar)
[success] Total time: 0 s, completed Dec 24, 2016 7:01:15 PM
> show runtime:fullClasspath
[info] * Attributed(/Users/chris/Projects/thing2/target/scala-2.12/classes)
[info] * Attributed(/Users/chris/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.1.jar)
[info] * Attributed(/Users/chris/.ivy2/cache/org.scala-lang/scala-swing/jars/scala-swing-2.10.6.jar)
[success] Total time: 0 s, completed Dec 24, 2016 7:01:19 PM
So, I find myself at a bit of a forensic impasse. Any suggestions on where to look next would be much appreciated. For clarity, this has only happened with scala.swing thus far. I have a couple other small projects in Scala that haven't had any issues. What's perplexing is the "missing" class seems to be scala standard lib material.

NoClassDefFoundError points to a problem where you mix libraries that were compiled for different major Scala versions. If you use Scala 2.12, you must also use the Swing module with a matching version. Before Scala 2.11, Swing was published with an artifact like this:
"org.scala-lang" % "scala-swing" % scalaVersion.value
It was then moved to the org.scala-lang.modules group. Your build file should contain something like this:
scalaVersion := "2.12.1"
libraryDependencies += "org.scala-lang.modules" %% "scala-swing" % "2.0.0-M2"
(It seems the latest Scala 2.11-compatible version, "1.0.2", has not been published for Scala 2.12, so you need to jump straight to "2.0.0-M2", which should be mostly source-compatible.)
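The reason the old coordinate goes wrong silently is the difference between `%` and `%%`: `%%` appends the Scala binary version to the artifact name, so the dependency tracks `scalaVersion`, while the plain `%` form pins one fixed artifact, which is how a `_2.10` jar can end up on a 2.12 classpath. A sketch of the equivalence:

```scala
// build.sbt -- `%%` derives the artifact suffix from the Scala binary version, so with
scalaVersion := "2.12.1"

// this line:
libraryDependencies += "org.scala-lang.modules" %% "scala-swing" % "2.0.0-M2"

// resolves the same artifact as writing the suffix out by hand:
libraryDependencies += "org.scala-lang.modules" % "scala-swing_2.12" % "2.0.0-M2"
```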

Related

Can the Configuration API in Liferay DXP be used for a Plugin SDK portlet?

I have followed the two tutorials below to use the Configuration API in a Liferay DXP Plugins SDK portlet built using Ant/Ivy:
Configuration API 1
Configuration API 2.
Below is the configuration class used:
package com.preferences.interfaces;

import com.liferay.portal.configuration.metatype.annotations.ExtendedObjectClassDefinition;
import aQute.bnd.annotation.metatype.Meta;

@ExtendedObjectClassDefinition(
    category = "preferences",
    scope = ExtendedObjectClassDefinition.Scope.GROUP
)
@Meta.OCD(
    id = "com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration",
    name = "UnsupportedBrowser.group.service.configuration.name"
)
public interface UnsupportedBrowserGroupServiceConfiguration {
    @Meta.AD(deflt = "", required = false)
    public String displayStyle();

    @Meta.AD(deflt = "0", required = false)
    public long displayStyleGroupId(long defaultDisplayStyleGroupId);
}
After following the steps, I get the error below:
ERROR [CM Configuration Updater (ManagedService Update: pid=[com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration])][org_apache_felix_configadmin:97] [org.osgi.service.cm.ManagedService, id=7082, bundle=297//com.liferay.portal.configuration.settings-2.0.15.jar?lpkgPath=C:\dev\Liferay\osgi\marketplace\Liferay Foundation.lpkg]: Unexpected problem updating configuration com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration {org.osgi.service.cm.ConfigurationAdmin}={service.vendor=Apache Software Foundation, service.pid=org.apache.felix.cm.ConfigurationAdmin, service.description=Configuration Admin Service Specification 1.2 Implementation, service.id=56, service.bundleid=643, service.scope=bundle}
Caused by: java.lang.IllegalArgumentException: wrong number of arguments
So, is an OSGi module mandatory for this process, or can it be done in a Plugins SDK portlet built with Ant as well?
Without dissecting the error message Caused by: java.lang.IllegalArgumentException: wrong number of arguments:
The way you build your plugin (Ant, Maven, Gradle, manually) doesn't make a difference, as long as you build a plugin that will be understood by the runtime. aQute.bnd.annotation.metatype.Meta points firmly into the OSGi world, and makes it almost certain that you'll need an OSGi module. You can build this with Ant, of course. Even in Ant you can embed tools like bnd, or you can write the proper Manifest.mf to include in your module manually (just kidding - you don't want to do it manually, but it would work).
Recommendation: instead of moving everything over, try to reproduce this with a minimal example in Gradle or, better, Liferay Workspace (which is Gradle-based), just to get all the automatic wiring in. Check whether it makes a difference and compare the generated output of your Ant build process with the workspace output. Pay specific attention to the Manifest.
In order to build the proper Manifest, you want to use bnd. If the Manifest turns out to be your issue, find a way to embrace bnd; whether that means saying goodbye to Ant or tweaking your build script remains your decision.
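For what it's worth, one concrete way to get exactly IllegalArgumentException: wrong number of arguments is a configuration getter that declares a parameter: typed configuration getters are invoked reflectively with no arguments, and displayStyleGroupId(long) above takes one. A minimal JVM-only sketch of that mechanism (no Liferay involved; the names are illustrative):

```java
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class WrongArgCountDemo {
    // Hypothetical stand-in for a typed configuration interface.
    interface Config {
        long displayStyleGroupId(long defaultGroupId); // the parameter is the problem
    }

    // Invoke the getter the way a configuration framework would: with no arguments.
    static String invokeAsFrameworkWould() throws Exception {
        Config config = (Config) Proxy.newProxyInstance(
                Config.class.getClassLoader(),
                new Class<?>[] { Config.class },
                (proxy, method, args) -> 0L); // trivial backing implementation

        Method getter = Config.class.getMethod("displayStyleGroupId", long.class);
        try {
            getter.invoke(config); // no arguments supplied for a 1-arg method
            return "no exception";
        } catch (IllegalArgumentException e) {
            return e.getMessage(); // contains "wrong number of arguments"
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(invokeAsFrameworkWould());
    }
}
```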

Why does sbt give "object swing is not a member of package scala" for import scala.swing?

Sbt version: 0.13.8
Scala version: 2.11.2
When compiling my scala swing application with scalac, it simply compiles.
However, when compiling the same files with sbt, I get the following error:
[error] my/file/path.scala:1: object swing is not a member of package scala
[error] import scala.swing._
I added the scala version to my build.sbt. I even configured scalaHome (which I believe should never be in a build.sbt).
The lines in build.sbt:
scalaVersion := "2.11.2"
scalaHome := Some(file("/my/scala/location/opt/scala-2.11.2/"))
The
/my/scala/location/opt/scala-2.11.2/lib
directory contains the scala swing lib: scala-swing_2.11-1.0.1.jar, which is also why scalac simply compiles.
Some might say I should add swing to my libraryDependencies in the build.sbt, but I shouldn't have to, since it is part of the core library and scalaHome is configured.
How to get sbt to notice the swing core library in a natural manner?
Bonus question:
How to configure scalaHome outside of build.sbt (without hacking the sbt jar itself) or better, have it notice the SCALA_HOME environment variable?
As of 2.11, the scala swing package is no longer listed among Scala's standard library API and in fact is described in its own README as "mostly unsupported".
I think you should expect to have to include it as a dependency.
See also What's wrong with my scala.swing?
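If you do give in and declare it as a dependency, the line would look something like this (1.0.1 matching the scala-swing_2.11-1.0.1.jar already sitting in your scalaHome):

```scala
// build.sbt
scalaVersion := "2.11.2"
libraryDependencies += "org.scala-lang.modules" %% "scala-swing" % "1.0.1"
```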

Unable to understand IKVM exception

I have some Java code converted to .NET assemblies using IKVM.
Whenever I run the application, it throws an exception as below:
The type initializer for '1' threw an exception.
{"Method not found: 'System.Exception java.lang.Throwable.__<map>(System.Exception, System.Type, Boolean)'."}
What am I missing?
After a couple of hours, I figured out the problem.
The version of IKVM that produced the .NET assemblies matters: the higher versions I got from NuGet did not work properly. I had to download IKVM version 0.42.0.3.

Spark running error java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import play.api.libs.json._
import java.util.Date
import javax.xml.bind.DatatypeConverter

object Test {
  def main(args: Array[String]): Unit = {
    val logFile = "test.txt"
    val conf = new SparkConf().setAppName("Json Test")
    val sc = new SparkContext(conf)
    try {
      val out = "output/test"
      val logData = sc.textFile(logFile, 2).map(line => Json.parse(cleanTypo(line))).cache()
    } finally {
      sc.stop()
    }
  }
}
Since the Spark/Jackson version conflict is a known problem, I rebuilt Spark using:
mvn versions:use-latest-versions -Dincludes=org.codehaus.jackson:jackson-core-asl
mvn versions:use-latest-versions -Dincludes=org.codehaus.jackson:jackson-mapper-asl
So the jars have been updated to 1.9.x, but I still get the error:
15/03/02 03:12:19 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector.findDeserializationType(JacksonAnnotationIntrospector.java:524)
at org.codehaus.jackson.map.deser.BasicDeserializerFactory.modifyTypeByAnnotation(BasicDeserializerFactory.java:732)
at org.codehaus.jackson.map.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:427)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createDeserializer(StdDeserializerProvider.java:398)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCache2(StdDeserializerProvider.java:307)
at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCacheValueDeserializer(StdDeserializerProvider.java:287)
at org.codehaus.jackson.map.deser.StdDeserializerProvider.findValueDeserializer(StdDeserializerProvider.java:136)
at org.codehaus.jackson.map.deser.StdDeserializerProvider.findTypedValueDeserializer(StdDeserializerProvider.java:157)
at org.codehaus.jackson.map.ObjectMapper._findRootDeserializer(ObjectMapper.java:2468)
at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2383)
at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1094)
at play.api.libs.json.JacksonJson$.parseJsValue(JsValue.scala:477)
at play.api.libs.json.Json$.parse(Json.scala:16)
We hit almost exactly the same issue; we were trying to use 1.9.2 and hit a NoSuchMethodError as well.
Annoyingly, there is not just one version conflict to deal with but two. First, Spark depends on Hadoop (for HDFS), which depends on a 1.8.x build of the Jackson JSON library; this is the conflict you are seeing. Spark (at least 1.2+) then uses the Jackson 2.4.4 core, which actually moved to the com.fasterxml.jackson.core group, so it does not conflict with 1.8.x thanks to the different package names.
So in your case your code should work if you do one of three things:
upgrade to a 2.4.x build that is less than or equal to 2.4.4, since the actual dependency will be replaced by Spark's, which is 2.4.4 (at the time of writing)
downgrade to a 1.8.x build that is less than or equal to the 1.8.x build Hadoop is using
compile Spark against your 1.9.x build. I know you mention this and that it didn't work, but when we tried it was successful; we ran the build with the option -Dcodehaus.jackson.version=1.9.2
Unfortunately, there are going to be a lot more issues like this, due to the nature of Spark: it already has all of its own internal dependencies on the classpath, so any conflicting job dependencies will never win out. Spark already does some dependency shading to avoid this issue with packages like Guava, but this is not currently done with Jackson.
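If you control the job's build rather than Spark's, shading Jackson yourself is also an option. A hedged sketch with sbt-assembly (the plugin version and the `shaded.jackson` rename target are illustrative, not taken from the question):

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt -- rename our Jackson classes inside the fat jar so they
// cannot collide with the 1.8.x copy Hadoop puts on the runtime classpath
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.codehaus.jackson.**" -> "shaded.jackson.@1").inAll
)
```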

play, scala and jerkson noClassDefFound error

I am trying to work with jerkson in Play with Scala 2.10.
I want to load data fixtures based on JSON files. For this procedure I'm trying to load the JSON with the "parse" command from jerkson, which ultimately fails.
I'm doing this in the "override def onStart(app: Application)" function. The error:
NoClassDefFoundError: Could not initialize class com.codahale.jerkson.Json$
Any guesses why this is happening? I have the following libs in my dependencies:
"com.codahale" % "jerkson_2.9.1" % "0.5.0",
"com.cloudphysics" % "jerkson_2.10" % "0.6.3"
my parsing command is:
com.codahale.jerkson.Json.parse[Map[String,Any]](json)
Thanks in advance
A NoClassDefFoundError generally means there is some sort of issue with the classpath. For starters, if you are running on Scala 2.10, I would remove the following line from your sbt file:
"com.codahale" % "jerkson_2.9.1" % "0.5.0"
Then, make sure the com.cloudphysics jerkson jar file is available on your app's classpath and try your test again.
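As background, the "Could not initialize class" flavor of NoClassDefFoundError (as opposed to a plain missing class) means the class was found but its static initializer already failed once; every later use then reports this error instead. A minimal sketch of that JVM mechanism, unrelated to jerkson itself:

```java
public class InitFailureDemo {
    // A class whose static initializer always fails, standing in for a library
    // object whose init blows up (e.g. due to a binary version mismatch).
    static class Broken {
        static int value;
        static {
            if (true) throw new RuntimeException("static init failed");
            value = 42;
        }
    }

    // Touch the class and report which error the JVM raises.
    static String access() {
        try {
            return "value=" + Broken.value;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(access()); // first use: ExceptionInInitializerError
        System.out.println(access()); // later uses: NoClassDefFoundError ("Could not initialize class ...")
    }
}
```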