Using JSON as a source for cxf-wadl2java

I received the specification of a RESTful service in JSON format and need to create a Java API library for the client.
Swagger can do it without a problem, but I would prefer to use the cxf-wadl2java Maven plugin. By default it doesn't accept JSON input; see the exception cause stack trace below.
Is there a way to configure the cxf-wadl2java plugin to read a JSON document?
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected character '{' (code 123) in prolog; expected '<'
at [row,col {unknown-source}]: [1,1]
at com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:653)
at com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2133)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1181)
at org.apache.cxf.staxutils.StaxUtils.readDocElements(StaxUtils.java:1367)
at org.apache.cxf.staxutils.StaxUtils.readDocElements(StaxUtils.java:1261)
at org.apache.cxf.staxutils.StaxUtils.read(StaxUtils.java:1189)
at org.apache.cxf.staxutils.StaxUtils.read(StaxUtils.java:1178)
at org.apache.cxf.staxutils.StaxUtils.read(StaxUtils.java:1168)
at org.apache.cxf.tools.wadlto.jaxrs.SourceGenerator.readXmlDocument(SourceGenerator.java:1757)
... 32 more

Maybe you can do a two-step conversion: swagger.json to a WADL file, and then use the wadl2java plugin.
1. Install npm on your machine.
2. Use the maven exec plugin to run the command defined in this npm package to convert from Swagger to WADL.
3. Use the cxf-wadl2java plugin to generate Java files from the WADL file generated above.
EDIT
There is a Maven plugin provided by swagger.io. Please refer to a usage example here.
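For reference, a minimal sketch of what such a swagger-codegen-maven-plugin configuration could look like, generating a plain Java client straight from the JSON spec (the version, spec path, and package names below are placeholders to adapt; check the plugin documentation for the exact options):

<plugin>
  <groupId>io.swagger</groupId>
  <artifactId>swagger-codegen-maven-plugin</artifactId>
  <version>2.2.3</version>
  <executions>
    <execution>
      <goals>
        <goal>generate</goal>
      </goals>
      <configuration>
        <!-- the JSON spec you received (placeholder path) -->
        <inputSpec>${project.basedir}/src/main/resources/swagger.json</inputSpec>
        <!-- generate a plain Java client -->
        <language>java</language>
        <!-- placeholder packages for the generated client -->
        <apiPackage>com.example.client.api</apiPackage>
        <modelPackage>com.example.client.model</modelPackage>
      </configuration>
    </execution>
  </executions>
</plugin>

As far as I know, the generated sources land under target/generated-sources by default and are added to the build, so the client can be packaged as a library without going through a WADL intermediate step at all.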

Related

Libgdx: The import org.json cannot be resolved

The following error occurs when I try to compile my project to HTML with the gradlew html:superDev command. On Android and Desktop it works fine.
[ERROR] Line 14: The import org.json cannot be resolved
[ERROR] Line 51: JSONObject cannot be resolved to a type
You need the sources of every library you use so that GWT can compile it to JavaScript.
As general advice, if libGDX itself provides a feature, don't pull in third-party libs for it; that avoids exactly this kind of problem. Use the built-in JSON parser, for example as sketched below.
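For illustration, here is a small sketch of what using libGDX's built-in parser could look like; the class name and JSON fields are made up for the example:

import com.badlogic.gdx.utils.JsonReader;
import com.badlogic.gdx.utils.JsonValue;

public class ScoreParser {

    // Parse a JSON string with libGDX's own parser instead of org.json,
    // so the GWT/HTML backend can compile the code to JavaScript.
    public static void printScore(String raw) {
        JsonValue root = new JsonReader().parse(raw);
        String name = root.getString("name"); // hypothetical field
        int score = root.getInt("score");     // hypothetical field
        System.out.println(name + ": " + score);
    }

    public static void main(String[] args) {
        printScore("{\"name\": \"player\", \"score\": 42}");
    }
}

JsonReader and JsonValue live in the libGDX core (com.badlogic.gdx.utils), so they should work on all backends, including the HTML/GWT one.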

Adding Spark CSV dependency to Zeppelin

I'm running an EMR cluster with Spark on AWS.
The Spark version is 1.6.
When running the following command:
proxy = sqlContext.read.load("/user/zeppelin/ProxyRaw.csv",
format="com.databricks.spark.csv",
header="true",
inferSchema="true")
I get the following error:
Py4JJavaError: An error occurred while calling o162.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at
http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
How can I solve this? I assume I should add a package but how do I install it and where?
There are many ways to add packages in Zeppelin:
One of them is to change the conf/zeppelin-env.sh configuration file, adding the package you need (com.databricks:spark-csv_2.10:1.4.0 in your case) to the submit options, since Zeppelin uses the spark-submit command under the hood:
export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-csv_2.10:1.4.0"
But let's say that you don't actually have access to that configuration. You can then use dynamic dependency loading via the %dep interpreter (deprecated):
%dep
z.load("com.databricks:spark-csv_2.10:1.4.0")
This will require that you load the dependencies before launching or restarting the interpreter.
Another way is to add the dependency you need via the interpreter dependency manager, as described in the following link: Dependency Management for Interpreter.
Well,
First you need to download the CSV lib from the Maven repository:
https://mvnrepository.com/artifact/com.databricks/spark-csv_2.10/1.5.0
Check the Scala version that you are using, whether it is 2.10 or 2.11.
When you call spark-shell, spark-submit, or pyspark, or even your Zeppelin, you need to add the --jars option with the path to your lib.
Like this:
pyspark --jars /path/to/jar/spark-csv_2.10-1.5.0.jar
Then you can call it as you did above.
You can see another closely related issue here: How to add third party java jars for use in pyspark

How to setup Robot Framework standalone jar with SwingLibrary?

I'm using Robot Framework with SwingLibrary to test a Java Swing based application. Since I'm not used to Python and also don't want to set up a Python environment, I decided to go with the Robot standalone JAR version (currently version 2.8.4).
My problem is the setup in combination with SwingLibrary (version 1.8.0). I don't know where to put the library such that it gets recognized by Robot.
So far, I have the following test case (mytest.txt):
*** Settings ***
Library    SwingLibrary

*** Test Cases ***
MyTestCase
    Start Application    MyApp
I tried putting the standalone jar together with the test case in a folder, and created one subfolder (called Lib) where I put the SwingLibrary JAR (and later also extracted the JAR).
I added the SwingLibrary as well as my own application to the classpath and tried executing Robot the following way:
java -Xbootclasspath/a:Lib/swinglibrary-1.8.0.jar:Lib/MyApp.jar -jar robotframework-2.8.4.jar mytest.txt
and also with
java -jar robotframework-2.8.4.jar mytest.txt
I always get one of the following errors:
[ WARN ] Imported library 'SwingLibrary' contains no keywords
==============================================================================
Mytest
==============================================================================
MyTestCase | FAIL |
No keyword with name 'Start Application' found.
or
[ ERROR ] Error in file 'mytest.txt': Importing test library 'SwingLibrary' failed: ImportError: No module named SwingLibrary
You can use the standalone jar without the -jar option, allowing you to specify the classpath in the standard manner. The main class for the standalone jar is org.robotframework.RobotFramework, so the syntax would be
java -cp robotframework-2.8.4.jar:Lib/swinglibrary-1.8.0.jar:Lib/MyApp.jar org.robotframework.RobotFramework mytest.txt
Slightly more verbose, but it's standard and so avoids any oddities caused by the non-standard -Xbootclasspath option.

Error while parsing json in J2ME

I am running a Java ME application in Eclipse. I have JSON code in my app and I also have the json.jar lib, but when I run the application, I get an error like this.
Uncaught exception: java.lang.NoClassDefFoundError: org/json/me/JSONObject
- parsing.Parsing$1.run(), bci=88
Looks like your json.jar lib is not bundled with your app code. The contents of json.jar must be available inside the final app jar file.
If you are using Eclipse Pulsar, be sure to check this lib in the Project > Properties > Java Build Path > Order and Export tab.

Jruby log4j integration

I am currently working on integrating a Java application, General Architecture for Text Engineering (GATE), with a Rails application using JRuby. When integrating JRuby with log4j, I get the following error:
0 [main] DEBUG Main.class - Hello world
gate/Gate.java:80:in `<clinit>': java.lang.NoClassDefFoundError: org/apache/log4
j/Logger (NativeException)
from gateapp/Main.java:86:in `main'
from test.rb:12
test.rb is the name of the Ruby program.
I tried importing all the Apache log4j libraries and included the class file in the test.rb file.
When I run the Java program alone, it runs fine. But when I generate the jar files and include them in the Ruby file (test.rb), the
java.lang.NoClassDefFoundError: org/apache/log4j/Logger (NativeException) problem occurs. How do I deal with this problem?
You need to make sure the log4j JAR file is in your classpath. One way to do this is to set the CLASSPATH variable in your environment. Another way would be to call require in your Ruby code, like:
require "/some/path/MyStuff.jar"
Here is my config to set it up with the Couchbase Java SDK:
include Java

def setup_log4j
  # Tell spymemcached (used by the Couchbase SDK) to log through log4j
  java::lang.System.setProperty("net.spy.log.LoggerImpl", "net.spy.memcached.compat.log.Log4JLogger")

  # Build a file appender that writes to the Rails log directory
  fa = Java::OrgApacheLog4j::FileAppender.new
  fa.setName("FileLogger")
  fa.setFile("./log/#{Rails.env}.log")
  fa.setLayout(Java::OrgApacheLog4j::PatternLayout.new("%d %-5p [%c{1}] %m%n"))
  fa.setThreshold(Java::OrgApacheLog4j::Level::INFO)
  fa.setAppend(true)
  fa.activateOptions

  # Attach the appender to the root logger
  Java::OrgApacheLog4j::Logger.getRootLogger().addAppender(fa)
end
Just beware that I required the log4j.jar file earlier.
It's worth mentioning that there is also a project named log4jruby.