I'm trying to develop a system which allows serializing/deserializing of JSON for multiple types of classes in Kotlin. For deserialization I'm using Klaxon, but I also want to use it for serialization. I've done some research on that, but didn't get a conclusive answer.
So, can I do that? If so, how can it be done? Or should I use another library for this purpose?
Here's my code:
package com.pineapple.threadio
import com.beust.klaxon.Klaxon
import com.beust.klaxon.TypeAdapter
import com.beust.klaxon.TypeFor
import kotlin.reflect.KClass
// Frame types
@TypeFor(field = "id", adapter = FrameTypeAdapter::class)
open class BasicFrame(val id: String)
class Ping : BasicFrame("0x0000")
class TransactionRequest : BasicFrame("0x0001")
class TransactionAccept : BasicFrame("0x0002")
class TransactionDeny(val deny_reason: String) : BasicFrame("0x0003")
// Frame processing
class Frame(
@TypeFor(field = "id", adapter = FrameTypeAdapter::class)
val id: String,
val frame: BasicFrame
)
class FrameTypeAdapter : TypeAdapter<BasicFrame> {
override fun classFor(id: Any): KClass<out BasicFrame> = when (id as String) {
"0x0000" -> Ping::class
"0x0001" -> TransactionRequest::class
"0x0002" -> TransactionAccept::class
"0x0003" -> TransactionDeny::class
else -> throw IllegalArgumentException("Unknown frame ID: $id")
}
}
// Actual parsing, straight from klaxon's docs
val frame = Klaxon().parseArray<Frame>(json)
The @TypeFor(field = "id", adapter = FrameTypeAdapter::class) annotation should be placed on the BasicFrame class; it's redundant in the other places where you put it. As for the serialization direction, Klaxon can also write objects back to JSON (see its toJsonString function), so you shouldn't need a separate library for that.
Suppose I have the following abstract base class:
package Models
import reactivemongo.bson.BSONObjectID
abstract class RecordObject {
val _id: String = BSONObjectID.generate().stringify
}
Which is extended by the following concrete case class:
package Models
case class PersonRecord(name: String) extends RecordObject
I then try to get a JSON string using some code like the following:
import io.circe.syntax._
import io.circe.generic.auto._
import org.http4s.circe._
// ...
val person = new PersonRecord(name = "Bob")
println(person._id, person.name) // prints some UUID and "Bob"
println(person.asJson) // {"name": "Bob"} -- what happened to "_id"?
As you can see, the property _id: String inherited from RecordObject is missing. I would expect that the built-in Encoder should function just fine for this use case. Do I really need to build my own?
Let's see what happens in encoder generation. Circe uses shapeless to derive its codecs, so it's enough to check what shapeless resolves the class into to answer your question. So in Ammonite:
@ abstract class RecordObject {
val _id: String = java.util.UUID.randomUUID.toString
}
defined class RecordObject
@ case class PersonRecord(name: String) extends RecordObject
defined class PersonRecord
@ import $ivy.`com.chuusai::shapeless:2.3.3`, shapeless._
import $ivy.$ , shapeless._
@ Generic[PersonRecord]
res3: Generic[PersonRecord]{type Repr = String :: shapeless.HNil} = ammonite.$sess.cmd3$anon$macro$2$1@1123d461
OK, so it's String :: HNil. Fair enough: what shapeless does is extract all the fields available in the constructor when converting one way, and put all the fields back through the constructor when converting the other way.
Basically all typeclass derivation works this way, so you should make it possible to pass _id as a constructor parameter:
abstract class RecordObject {
val _id: String
}
case class PersonRecord(
name: String,
_id: String = BSONObjectID.generate().stringify
) extends RecordObject
That would let type class derivation do its work. If you cannot change how PersonRecord looks... then yes, you have to write your own codec. Though I doubt it would be easy, since you made _id immutable and impossible to set from the outside through a constructor, so it would be hard to implement any other way.
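If changing PersonRecord is not an option, a workaround on the encoding side is a hand-written Encoder that appends the inherited field explicitly. A minimal sketch, assuming the PersonRecord from the question (Encoder.forProduct2 is circe API; the key names just mirror the question's model):

import io.circe.Encoder
import io.circe.syntax._

// Hand-written encoder that also emits the inherited _id value.
implicit val personRecordEncoder: Encoder[PersonRecord] =
  Encoder.forProduct2("name", "_id")((p: PersonRecord) => (p.name, p._id))

// person.asJson now yields {"name":"Bob","_id":"..."}

Decoding back into a PersonRecord remains awkward for the same reason: _id cannot be passed through the constructor.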
I need to parse an object that contains a property "triggers", which is a List<Trigger>. This list can contain two types of triggers: Custom and Event.
Here are my Trigger classes:
@JsonClass(generateAdapter = true)
open class Trigger(open val type: String,
open val source: String,
open val tags: Properties? = mutableMapOf())
@JsonClass(generateAdapter = true)
data class CustomTrigger(override val type: String,
override val source: String,
override val tags: Properties?,
//some other fields
) : Trigger(type, source, tags)
@JsonClass(generateAdapter = true)
data class EventTrigger(override val type: String,
override val source: String,
override val tags: Properties?,
//some other fields
) : Trigger(type, source, tags)
The object I receive from the server looks like this:
@JsonClass(generateAdapter = true)
data class Rule(val id: String,
val triggers: MutableList<Trigger>,
//some other fields
)
Using the generated adapter, when parsing I only get the fields from the Trigger class for the items in triggers. I need to implement logic that parses an EventTrigger if type is "event" or a CustomTrigger if type is "custom".
How can I do this with Moshi?
Do I need to write a manual parser for my Rule object?
Any idea is welcome. Thank you
Take a look at the PolymorphicJsonAdapterFactory.
Moshi moshi = new Moshi.Builder()
.add(PolymorphicJsonAdapterFactory.of(HandOfCards.class, "hand_type")
.withSubtype(BlackjackHand.class, "blackjack")
.withSubtype(HoldemHand.class, "holdem"))
.build();
Note that it needs the optional moshi-adapters dependency.
This example from Moshi helped me to solve the parsing problem:
https://github.com/square/moshi#another-example
I have a lot of different external JSON entities that I want to parse into different internal case classes via json4s (Scala). Everything works fine via the extract function from json4s. I have implemented a parse function which takes a type and a JSON string and parses the string into the type / the case class. To map the correct JSON string to the correct case class, I have implemented a pattern matching function, which looks like this:
entityName match {
case "entity1" => JsonParser.parse[Entity1](jsonString)
case "entity2" => JsonParser.parse[Entity2](jsonString)
....
I don't like the repetition here and would like to do this mapping via a map like this:
val mapping = Map(
"entity1" -> Entity1,
"entity2" -> Entity2
....
With this map in place I could implement the JsonParser.parse function only once like this
JsonParser.parse[mapping(entityName)](jsonString)
This does not work, because the map references the companion object and not the class type. I also tried classOf[Entity1], but that is not working either. Is there a way to do this?
Thanks!
The way you want your JsonParser.parse to work is not possible in Scala. Scala is a strongly and statically typed language. That means the compiler has to know the types of values at compile time to be able to verify that you access only valid fields and methods on them and/or pass them as valid parameters to methods. Assuming your classes are
case class Entity1(value:Int, unique1:Int)
case class Entity2(value:String, unique2:String)
and you write
val parsed = JsonParser.parse[mapping("entity1")](jsonString)
how could the compiler know the type of parsed in order to know the type of parsed.value, or to know that parsed.unique1 is a valid field while parsed.unique2 is not? The best type the compiler could assign to such a parsed is something very generic like Any. Of course you can downcast that Any to the specific type later, but this means you still have to specify the type explicitly in the asInstanceOf, which kind of defeats the whole purpose. Still, if somehow returning Any is OK for you, you may try something like this:
import org.json4s.jackson.JsonMethods
implicit val formats = org.json4s.DefaultFormats // or whatever Formats you actually need
val typeMap: Map[String, scala.reflect.Manifest[_]] = Map(
"String" -> implicitly[scala.reflect.Manifest[String]],
"Int" -> implicitly[scala.reflect.Manifest[Int]]
)
def parseAny(typeName: String, jsonString: String): Any = {
val jValue = JsonMethods.parse(jsonString)
jValue.extract(formats, typeMap(typeName))
}
and then do something like this:
def testParseByTypeName(typeName: String, jsonString: String): Unit = {
try {
val parsed = parseAny(typeName, jsonString)
println(s"parsed by name $typeName => ${parsed.getClass} - '$parsed'")
} catch {
case e: Throwable => println(e)
}
}
def test() = {
testParseByTypeName("String", "\"abc\"")
testParseByTypeName("Int", "123")
}
P.S. If your entityName doesn't come from the outside (i.e. you don't analyze the data to find out the actual type), you don't actually need it at all. It is enough to use the type (without any need for match/case), such as:
def parse[T](jsonString: String)(implicit mf: scala.reflect.Manifest[T]): T = {
val jValue = JsonMethods.parse(jsonString)
jValue.extract[T]
}
def testParse[T](prefix: String, jsonString: String)(implicit mf: scala.reflect.Manifest[T]): Unit = {
try {
val parsed = parse[T](jsonString)
println(s"$prefix => ${parsed.getClass} - '$parsed'")
} catch {
case e: Throwable => println(e)
}
}
def test() = {
testParse[String]("parse String", "\"abc\"")
testParse[Int]("parse Int", "123")
}
Following the idea from @SergGr, here's a snippet to paste into the Ammonite REPL:
{
import $ivy.`org.json4s::json4s-native:3.6.0-M2`
import org.json4s.native.JsonMethods.parse
import org.json4s.DefaultFormats
import org.json4s.JValue
case class Entity1(name : String, value : Int)
case class Entity2(name : String, value : Long)
implicit val formats = DefaultFormats
def extract[T](input : JValue)(implicit m : Manifest[T]) = input.extract[T]
val mapping: Map[String, Manifest[_]] = Map(
"entity1" -> implicitly[Manifest[Entity1]],
"entity2" -> implicitly[Manifest[Entity2]]
)
val input = parse(""" { "name" : "abu", "value" : 1 } """)
extract(input)(mapping("entity1")) //Entity1("abu", 1)
extract(input)(mapping("entity2")) //Entity2("abu", 1L)
}
I made a case class to store some of my data. The case class looks like the following:
case class Job(id: Option[Int], title: String, description: Option[String],
start: Date, end: Option[Date], customerId: Int)
I was using the following formatter to write/read my JSON objects:
implicit val jobFormat = jsonFormat6(Job.apply)
I've got some problems with the write/read because I need to add a field to the JSON (but not to the case class), for example: "test": "test". I tried to write a custom read/write with the following code:
implicit object jobFormat extends RootJsonFormat[Job] {
override def read(json: JsValue): Job = ???
override def write(job: Job): JsValue = ???
}
I couldn't get the code working; could somebody help me with this problem?
Thanks in advance!
What jsonFormat6 does is create an autogenerated RootJsonFormat[Job] object for you. You can create your own custom instances by extending RootJsonFormat[Job]. In this case, you need to create a custom instance that decorates the autogenerated one and adds logic to the write method.
The code will look like this:
implicit object JobFormat extends RootJsonFormat[Job] {
// to use autogenerated jobFormat
val jobFormat = jsonFormat6(Job.apply)
// leave read as it is
override def read(json: JsValue): Job =
jobFormat.read(json)
// Change write to add your custom logic
override def write(job: Job): JsValue = {
val json = jobFormat.write(job).asJsonObject
JsObject(json.fields + ("test" -> JsString("test")))
}
}
PS: I haven't compiled the code; however, the overall implementation will look like this.
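For completeness, a rough usage sketch; it assumes the Job case class from the question and an implicit JsonFormat for java.util.Date already being in scope (the jsonFormat6 call needs one anyway):

import java.util.Date
import spray.json._

// JobFormat above must be in implicit scope.
val job = Job(Some(1), "Fix roof", None, new Date(), None, customerId = 7)

val json = job.toJson           // write() appends the extra "test" field
println(json.compactPrint)
val back = json.convertTo[Job]  // read() delegates to the generated format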
I've got the following case:
I'd like to serialize Scala case classes that extend a parent class with a var of type java.util.UUID.
Serialization of these case classes should happen without any configuration of them: no annotations and no definitions of custom formats. Any serialization hints may be situated in the parent class.
I tried sjson, but reflection-based serialization can't serialize UUID types, and type-based serialization forces me to define formats for every case class.
Which json serialization library would best fit this case?
Here's one solution with Lift JSON.
import java.util.UUID
import net.liftweb.json._
import net.liftweb.json.JsonAST._
import net.liftweb.json.JsonDSL._
import net.liftweb.json.Serialization._
sealed abstract class Parent {
def uuid: UUID
}
case class Foo(uuid: UUID, name: String) extends Parent
object UUIDTest extends Application {
implicit val formats = Serialization.formats(NoTypeHints) + new UUIDSerializer
val f = Foo(UUID.randomUUID, "foo")
val ser = write(f)
println(ser)
val f2 = read[Foo](ser)
assert(f == f2)
// Special serializer for UUID type
class UUIDSerializer extends Serializer[UUID] {
private val Class = classOf[UUID]
def deserialize(implicit format: Formats): PartialFunction[(TypeInfo, JValue), UUID] = {
case (TypeInfo(Class, _), json) => json match {
case JObject(JField("mostSig", JInt(m)) :: JField("leastSig", JInt(l)) :: Nil) =>
new UUID(m.longValue, l.longValue)
case x => throw new MappingException("Can't convert " + x + " to UUID")
}
}
def serialize(implicit format: Formats): PartialFunction[Any, JValue] = {
case x: UUID =>
("mostSig" -> x.getMostSignificantBits) ~ ("leastSig" -> x.getLeastSignificantBits)
}
}
}
It prints:
{"uuid":{"mostSig":-8054689529719995935,"leastSig":-5722404370736228056},"name":"foo"}'
Another solution, which uses a custom serializer for the Parent type:
sealed abstract class Parent {
var uuid: UUID = UUID.randomUUID
}
case class Foo(name: String) extends Parent
object UUIDTest extends Application {
implicit val formats =
Serialization.formats(NoTypeHints) + new ParentSerializer
val f = Foo("foo")
val ser = write(f)
println(ser)
val f2 = read[Foo](ser)
assert(f == f2)
// Special serializer for Parent type
class ParentSerializer extends Serializer[Parent] {
def deserialize(implicit format: Formats): PartialFunction[(TypeInfo, JValue), Parent] = {
case (t @ TypeInfo(cl, _), json) if (classOf[Parent].isAssignableFrom(cl)) =>
val x = Extraction.extract(json, t)(DefaultFormats).asInstanceOf[Parent]
x.uuid = (for {
JField("mostSig", JInt(m)) <- json
JField("leastSig", JInt(l)) <- json
} yield new UUID(m.longValue, l.longValue)).head
x
}
def serialize(implicit format: Formats): PartialFunction[Any, JValue] = {
case x: Parent =>
Extraction.decompose(x)(DefaultFormats) ++
JField("mostSig", x.uuid.getMostSignificantBits) ++
JField("leastSig", x.uuid.getLeastSignificantBits)
}
}
}
If the type is important, you should take a look at YAML.
http://www.google.fr/search?q=java+yaml
It's a superset of JSON with additional features, like explicit typing.
You could try jerkson: https://github.com/codahale/jerkson
It's working well for my use, but that is mostly list/map structures. I would not be surprised if it supports your needs though.
Edit: I tried it with the following example (inspired by the Lift example in another answer). It seems to work fine.
import java.util.UUID
import com.codahale.jerkson.Json
import org.scalatest.FunSuite
sealed abstract class Parent {
def uuid: UUID
}
case class Foo(uuid: UUID, name: String) extends Parent
class TmpJsonTest extends FunSuite {
test("Json case class serialize") {
val f = Foo(UUID.randomUUID, "foo")
val ser = Json.generate(f)
println(ser)
val f2 = Json.parse[Foo](ser)
assert(f === f2)
}
}
Try the XStream library which includes JSON support. I have used this successfully in a few projects. It has a number of default converters, including one for java.util.UUID. A full list of default converters is located here: http://x-stream.github.io/converters.html.
A brief tutorial on using XStream for JSON reading and writing is located here: http://x-stream.github.io/json-tutorial.html. The tutorial code is written for Java but it should work just the same for Scala since reflection is being used behind the scenes.
Keep in mind that serializing and then deserializing arbitrary graphs of objects is not always possible with this library. In particular, loops in your data cannot be handled, i.e. your data must be a purely hierarchical tree. This is a reasonable limitation given the intentions of the JSON format.
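As a rough sketch of what this can look like from Scala (the case class is made up for illustration; JettisonMappedXmlDriver, alias, toXML and fromXML are standard XStream API):

import java.util.UUID
import com.thoughtworks.xstream.XStream
import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver

case class Foo(uuid: UUID, name: String)

// The Jettison driver supports both writing and reading JSON.
val xstream = new XStream(new JettisonMappedXmlDriver())
xstream.alias("foo", classOf[Foo]) // short element name instead of the fully qualified class name

val json = xstream.toXML(Foo(UUID.randomUUID, "foo")) // produces JSON despite the method name
val back = xstream.fromXML(json).asInstanceOf[Foo]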
Referenced links:
XStream JSON tutorial: http://x-stream.github.io/json-tutorial.html
XStream default converters: http://x-stream.github.io/converters.html