Is there a way and/or library to automatically create a Kotlin data class from JSON, like it works in Scala's spray-json?
Something like this:
data class User(val id: Int, val name: String)

class DataClassFactory(val json: String) {
    fun getUser(): User {
        // some reflection
        return User(10, "Kirill")
    }
}

fun main(args: Array<String>) {
    val json = """{"id": 10, "name": "Kirill"}"""
    val usr = DataClassFactory(json).getUser()
    println(usr)
}
You can use the Jackson module for Kotlin to serialize/deserialize easily from any format that Jackson supports (including JSON). This is the easiest way, and it supports Kotlin data classes without annotations. See https://github.com/FasterXML/jackson-module-kotlin for the module, which includes the latest information for using it from Maven and Gradle (you can also use Ivy or download the JARs from the Maven repositories).
Alternatives exist, such as Boon, but it has no specific support for Kotlin (usually a problem with not having a default constructor) and uses some unsafe direct access to internal JVM classes for performance. In doing so, it can crash on some VMs, and in cases where you extend Boon from Kotlin with a custom serializer/deserializer it makes assumptions about the classes that do not hold true in Kotlin (the String class is wrapped, for example), which I have seen core dump. Boon is lightning fast; just be careful of these issues and test first before using it.
(note: I am the creator of the Jackson-Kotlin module)
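For Gradle, the coordinates are the same; a minimal Kotlin DSL sketch (the version matches the Maven snippet further down, newer releases exist):

// build.gradle.kts
dependencies {
    implementation("com.fasterxml.jackson.module:jackson-module-kotlin:2.6.3-4")
}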
This is very clean and easy in Kotlin.
import com.fasterxml.jackson.module.kotlin.*

data class User(val id: Int, val name: String)

fun main(args: Array<String>) {
    val mapper = jacksonObjectMapper()
    val json = """{"id": 10, "name": "Kirill"}"""
    val user = mapper.readValue<User>(json)
    println(user)
}
produces this output:
User(id=10, name=Kirill)
You only have to add this to your pom.xml:
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-kotlin</artifactId>
    <version>2.6.3-4</version>
</dependency>
Why not use Jackson or any other serializer? It should work.
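For example, a minimal sketch with Gson, reusing the User class from the question (note that Gson builds instances reflectively, so it does not run Kotlin's null checks for non-null properties):

import com.google.gson.Gson

data class User(val id: Int, val name: String)

fun main() {
    val json = """{"id": 10, "name": "Kirill"}"""
    // fromJson instantiates User via reflection and fills in the fields
    val user = Gson().fromJson(json, User::class.java)
    println(user) // User(id=10, name=Kirill)
}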
What about this: a translator that converts a JSON string into a Kotlin data class. It works through an IntelliJ plugin; see:
https://plugins.jetbrains.com/plugin/9960-jsontokotlinclass
http://www.json2kotlin.com converts your JSON response to Kotlin data classes online, without the need to install any plugin. Optionally, you can generate Gson annotations too. (Disclosure: I created this utility)
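With the Gson annotations option enabled, the generated class looks roughly like this (a sketch; the actual fields depend on your JSON):

import com.google.gson.annotations.SerializedName

data class User(
    // @SerializedName makes the JSON key explicit for each property
    @SerializedName("id") val id: Int,
    @SerializedName("name") val name: String
)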
Related
I'm very new to Scala and I'm trying to send JSON from a client to a service.
I have an interface ("trait") with a method receiveData that takes a JSONObject as an argument, but apparently JSONObject is deprecated.
trait DataService {
  def receiveData(data: JSONObject, clientId: String): Unit
}
I have this Scala version: ThisBuild / scalaVersion := "3.0.0-RC1"
I have looked around for what I should use instead and found nothing.
It says
@deprecated("Use The Scala Library Index to find alternatives: https://index.scala-lang.org/", "1.0.6")
but I don't know how to search for things there.
Appreciate all help!
EDIT:
ANSWER: there are plenty: Play-Json, zio-json, Circe, Jackson...
I have some model definitions inside an XSD file and I need to reference these models from an OpenAPI definition. Manually remodeling is no option since the file is too large, and I need to put it into a build system, so that if the XSD is changed, I can regenerate the models/schemas for OpenAPI.
What I tried, and what nearly worked, was using xsd2json and then converting the result with the node module json-schema-to-openapi. However, xsd2json drops some of the complexElement models. For example, "$ref": "#/definitions/tns:ContentNode" is used inside one model as the child type, but there is no definition for ContentNode in the generated schema, even though the XSD contains a complexElement definition for ContentNode.
Another approach, which I haven't tried yet but seems a bit excessive to me, is using xjc to generate Java models from the XSD and then using JacksonSchema to generate the JSON schema.
Is there any established library or way to use XSD in OpenAPI?
I ended up implementing the second approach, using JAXB to convert the XSD to Java models and then using Jackson to write the schemas to files.
Gradle:
plugins {
    id 'java'
    id 'application'
}

group 'foo'
version '1.0-SNAPSHOT'
sourceCompatibility = 1.8

repositories {
    mavenCentral()
}

dependencies {
    testCompile group: 'junit', name: 'junit', version: '4.12'
    compile group: 'com.fasterxml.jackson.module', name: 'jackson-module-jsonSchema', version: '2.9.8'
}

// Separate configuration so the XJC tool and its dependencies
// stay off the main compile classpath
configurations {
    jaxb
}

dependencies {
    jaxb (
        'com.sun.xml.bind:jaxb-xjc:2.2.7',
        'com.sun.xml.bind:jaxb-impl:2.2.7'
    )
}

application {
    mainClassName = 'foo.bar.Main'
}

task runConverter(type: JavaExec, group: 'application') {
    classpath = sourceSets.main.runtimeClasspath
    main = 'foo.bar.Main'
}

task jaxb {
    System.setProperty('javax.xml.accessExternalSchema', 'all')
    def jaxbTargetDir = file("src/main/java")
    doLast {
        jaxbTargetDir.mkdirs()
        ant.taskdef(
            name: 'xjc',
            classname: 'com.sun.tools.xjc.XJCTask',
            classpath: configurations.jaxb.asPath
        )
        // Exposed as an Ant property so the single-quoted '${jaxbTargetDir}'
        // below is expanded by Ant rather than by Groovy
        ant.jaxbTargetDir = jaxbTargetDir
        ant.xjc(
            destdir: '${jaxbTargetDir}',
            package: 'foo.bar.model',
            schema: 'src/main/resources/crs.xsd'
        )
    }
}

compileJava.dependsOn jaxb
With a converter main class that does something along these lines:
package foo.bar;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.module.jsonSchema.JsonSchema;
import com.fasterxml.jackson.module.jsonSchema.JsonSchemaGenerator;

import foo.bar.model.Documents;

public class Main {

    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        JsonSchemaGenerator schemaGen = new JsonSchemaGenerator(mapper);
        try {
            JsonSchema schema = schemaGen.generateSchema(Documents.class);
            System.out.print(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(schema));
        } catch (JsonMappingException e) {
            e.printStackTrace();
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
    }
}
It is still not perfect, though: this would need to iterate over all the model classes and write a schema file for each one. Also, it doesn't use references; if a class has a member of another class, the schema is printed inline instead of being referenced. That requires a bit more customization with the SchemaFactoryWrapper, but it can be done.
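For illustration, a rough sketch of that iteration (in Kotlin for brevity; writeSchemas and the class list are placeholders of mine, not part of the build above):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.jsonSchema.JsonSchemaGenerator
import java.io.File

fun writeSchemas(modelClasses: List<Class<*>>, outDir: File) {
    val mapper = ObjectMapper()
    val schemaGen = JsonSchemaGenerator(mapper)
    outDir.mkdirs()
    for (clazz in modelClasses) {
        // one schema file per model class, e.g. Documents.json
        val schema = schemaGen.generateSchema(clazz)
        File(outDir, "${clazz.simpleName}.json")
            .writeText(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(schema))
    }
}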
The problem you have is that you are applying inference tooling over a multi-step conversion. As you have found, inference tooling is inherently fussy and will not work in all situations. It's kind of like playing Chinese whispers - every step of the chain is potentially lossy, so what you get out the other end may be garbled.
Based on the alternative approach you suggest, I would suggest a similar solution:
OpenAPI is, rather obviously, an API definition standard. It should be possible for you to take a code first approach, composing your API operations in code and exposing the types generated from XJB. Then you can use Apiee and its annotations to generate the OpenAPI definition. This assumes you are using JAX-RS for your API.
This is still a two-step process, but one with a higher chance of success. The benefit here is that your first step, inferring your XSD types into java types, will hopefully have very little (if any) impact on the code which defines your API operations. Although there will still be a manual step (updating the models) the OpenAPI definition will update automatically once the code has been rebuilt.
I have an HTTP client written in Scala that uses json4s/jackson to serialize and deserialize HTTP payloads. So far I was using only Scala case classes as the model and everything was working fine, but now I have to communicate with a third-party service. They provided me with their own model, but it's written in Java, so now I need to deserialize JSON into Java classes as well. It seems to work fine with simple classes, but when a class contains collections like Lists or Maps, json4s has problems and sets all such fields to null.
Is there any way to handle such cases? Maybe I should use different formats (I'm using DefaultFormats plus a few custom ones). Example of the problem, as a test:
import org.json4s.DefaultFormats
import org.json4s.jackson.Serialization.read
import org.scalatest.{FlatSpec, Matchers}

class JavaListTest extends FlatSpec with Matchers {

  implicit val formats = DefaultFormats

  "Java List" should "be deserialized properly" in {
    val input = """{"list":["a", "b", "c"]}"""
    val output = read[ObjectWithList](input)
    output.list.size() shouldBe 3
  }
}
And sample Java class:
import java.util.List;

public class ObjectWithList {
    List<String> list;
}
I have also noticed that when I try to deserialize into a Scala case class that contains a field of type java.util.List[String], I get an exception of type org.json4s.package$MappingException: Expected collection but got List[String].
The key to solving your issue is composition of formats: basically, you want to define the JList format as the list format composed with a toJList function.
Unfortunately, json4s Formats are extremely difficult to compose, so I used Readers to give you the idea. I also simplified the example to just a Java list:
import java.util
import org.json4s._
import org.json4s.DefaultReaders._
import org.json4s.jackson.JsonMethods.parse
import scala.collection.JavaConverters._

implicit def javaListReader[A: Reader]: Reader[java.util.List[A]] = new Reader[util.List[A]] {
  override def read(value: JValue) = DefaultReaders.traversableReader[List, A].read(value).asJava
}

val input = """["a", "b", "c"]"""
val output = Formats.read[java.util.List[String]](parse(input))
To my knowledge, json4s Readers will not work with Java classes out of the box, so you might either need to implement a Serializer[JList[_]] the same way, or mirror your Java classes with case classes and use those inside your domain.
P.S.
I highly recommend switching to circe or argonaut; then you will forget about most of these JSON problems.
I have a JUnit test like this:
@Test fun testCategoriesLoading() {
    val subscriber = TestSubscriber<List<ACategory>>()
    service.categories().subscribe(subscriber)
    subscriber.awaitTerminalEvent()
    subscriber.assertNoErrors()
}
service is a Retrofit service that uses GsonConverter to deserialize JSON into
data class ACategory(val id: String, val title: String, val parentId: String?, val hasChildren: Boolean)
instances.
The test passes even if ACategory is filled with id = null, title = null, etc.
So, as far as I know, Gson uses reflection, and Kotlin resolves these nullability constraints lazily, on first access.
Is there any way to force this resolution?
Some good-looking solution without manually accessing the fields? I really don't want to write every assert by hand.
You could use the new Kotlin reflection. If you have an instance of ACategory, call
ACategory::class.memberProperties // needs: import kotlin.reflect.full.memberProperties (Kotlin 1.1+)
    .filter { !it.returnType.isMarkedNullable }
    .forEach {
        assertNotNull(it.get(aCategory))
    }
to access all properties that are marked as not nullable and assert that they're not null. Make sure you have the reflection lib (kotlin-reflect) on the classpath.
Make sure you're using M14.
We ended up with a hack for data classes (that's our only use case, so it's OK for us):
calling gsonConstructedObject.copy() reveals all the exceptions.
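This works because copy() re-invokes the primary constructor, which is where Kotlin's null checks for non-null parameters live. A minimal standalone sketch (assuming Gson on the classpath):

import com.google.gson.Gson

data class ACategory(val id: String, val title: String, val parentId: String?, val hasChildren: Boolean)

fun main() {
    // Gson constructs the instance reflectively, bypassing constructor null checks,
    // so id ends up null here despite its non-null type
    val broken = Gson().fromJson("""{"title": "Books"}""", ACategory::class.java)
    // copy() runs the primary constructor again; the intrinsic null check throws here
    // (the exact exception type depends on the Kotlin version)
    broken.copy()
}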
I have my relevant actors' messages (de)serializable to/from Play! JSON. I'd like to use those JSON (de)serializers for the akka persistence system (if possible).
The akka persistence documentation mentions the possibility of using custom serializers, and there are instructions on how to write them. Since akka.serialization.Serializer expects toBinary and fromBinary, is there any way to use Play JSON serializers with akka persistence?
Thank you!
Best!
Where do you want to serialize the data to?
I'm looking for a MongoDB-based akka persistence store with objects serialized using my own JSON formats. The following driver might be interesting for you as well:
https://github.com/scullxbones/akka-persistence-mongo/issues/16
Integrating Play JSON into akka-persistence is complicated, since Play JSON uses instances of Format that are resolved via implicits. Akka provides just a java.lang.Object for serialization and a java.lang.Class[_] for deserialization, which makes resolving the correct implicit Format impossible.
What you can do is write a custom akka.serialization.Serializer that holds a Map from Class[A] to Format[A]. This map can be used to find the correct format for a java.lang.Object / java.lang.Class[_]:
import java.nio.charset.{Charset, StandardCharsets}
import akka.serialization.Serializer
import play.api.libs.json.{Format, Json}

class JsonSerializer(serializers: Map[Class[_], Format[_]]) extends Serializer {

  val charset: Charset = StandardCharsets.UTF_8
  val identifier: Int = "play-json-serializer".##
  val includeManifest: Boolean = true

  def serializer[A](c: Class[_]): Format[A] = serializers.get(c) match {
    case Some(format) => format.asInstanceOf[Format[A]]
    case None => throw new RuntimeException("No Format available for " + c.getName)
  }

  def toBinary(o: AnyRef): Array[Byte] = jsonSerialize(o).getBytes(charset)

  def fromBinary(bytes: Array[Byte], manifest: Option[Class[_]]): AnyRef =
    jsonDeserialize(bytes, manifest.get)

  def jsonSerialize[A](a: A): String = {
    implicit val format: Format[A] = serializer[A](a.getClass)
    Json.stringify(Json.toJson(a))
  }

  def jsonDeserialize[A](bytes: Array[Byte], manifest: Class[_]): A = {
    implicit val format: Format[A] = serializer[A](manifest)
    Json.fromJson[A](Json.parse(new String(bytes, charset))).get
  }
}
You can now inherit from this class and pass Play formats to the constructor for all types that your akka serializer should be able to (de)serialize. The serializer must be configured in the akka configuration as described in the documentation:
import scala.reflect.ClassTag

class MyJsonSerializer extends JsonSerializer(Map(
  Serializer[Foo], Serializer[...], ...
))

// Just a utility class for the pretty syntax above
object Serializer {
  def apply[A](implicit format: Format[A], ctag: ClassTag[A]): (Class[A], Format[A]) =
    (ctag.runtimeClass.asInstanceOf[Class[A]], format)
}