Simple Scala JSON library for non-recursive case classes

I have case classes that only contain Strings or collections of Strings and want to convert them to JSON objects with corresponding field names plus an additional field to denote the type.
sealed trait Item
case class ProductX(a: String, b: String) extends Item
case class ProductY(a: String, b: String) extends Item
case class CollX(els: List[String]) extends Item
case class CollY(els: List[String]) extends Item
Most libraries have special support for case classes, but there never seems to be a way to use it in conjunction with an additional type field to disambiguate the isomorphic cases, so I'd have to fall back to the more low-level descriptions.
So far I've tried spray-json and argonaut and I wind up with way more boilerplate than my simple usage scenario would justify:
// spray-json
implicit object CollXJsonFormat extends RootJsonFormat[CollX] {
  def write(ss: CollX) = JsObject(
    "type" -> JsString("CollX"),
    "els" -> JsArray(ss.els.map(JsString(_)): _*)
  )
  def read(value: JsValue) = {
    value.asJsObject.getFields("type", "els") match {
      case Seq(JsString("CollX"), JsArray(els)) =>
        CollX(els.toList.map({
          case JsString(s) => s
          case _ => throw new DeserializationException("JsString expected")
        }))
      case _ => throw new DeserializationException("CollX expected")
    }
  }
}
For argonaut I couldn't even figure out how to match the type field since DecodeResult has no filter method:
// argonaut
implicit def CollXCodec: CodecJson[CollX] =
  CodecJson(
    (px: CollX) =>
      ("type" := "CollX") ->:
      ("els" := px.els) ->:
      jEmptyObject,
    c => for {
      t <- (c --\ "type").as[String]
      els <- (c --\ "els").as[List[String]]
    } yield CollX(els))
Is there another lib that can handle this better or is there some feature in one of these libs I've overlooked that would significantly reduce the boilerplate?

I use net.liftweb with very good results.
You can create the JSON this way. For example, if you have:
case class ABjsontemplate(a: String, b: String)
you can create the JSON like this:
implicit val formats = net.liftweb.json.DefaultFormats // Serialization.write needs an implicit Formats in scope
net.liftweb.json.Serialization.write(ABjsontemplate("a", "b"))
in SBT:
libraryDependencies += "net.liftweb" %% "lift-json" % "3.0-M1"
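If you also need the extra field that denotes the type, lift-json's type hints can add a discriminator automatically. A minimal sketch, assuming the Item hierarchy from the question; note that lift-json names the field jsonClass by default rather than type:
import net.liftweb.json._
import net.liftweb.json.Serialization.write

// List the classes that should carry a type hint; the hint field is "jsonClass" by default.
implicit val formats: Formats = Serialization.formats(
  ShortTypeHints(List(classOf[ProductX], classOf[ProductY], classOf[CollX], classOf[CollY])))

write(CollX(List("a", "b")))
// produces something like {"jsonClass":"CollX","els":["a","b"]}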
I hope it can help.

Related

Differentiate between Map[String,String] and Map[String,Object]

I have two different maps that are created in my project. I have used type aliases for both.
type alias1 = Map[String, ScalaObject]
type alias2 = Map[String, String]
I have a match-case situation where I want to differentiate between the two because different operations need to happen on both.
val obj: T = fromJson[T](jacksonMapper, json)
obj match {
  case _: alias1 => operation1()
  case _: alias2 => operation2()
  case _ => obj
}
Any idea how to differentiate the two?
The ADT approach would be to create a sealed trait with a case class for each case, one wrapping a Map[String, String] and another wrapping a Map[String, ScalaObject], so you can pattern match on the case classes.
So like this:
sealed trait MyType extends Product with Serializable
final case class ObjMap(data: Map[String, ScalaObject]) extends MyType
final case class StrMap(data: Map[String, String]) extends MyType

val obj: MyType = ???
obj match {
  case ObjMap(_) => operation1()
  case StrMap(_) => operation2()
}
The Typeclass approach may be too complex for this.
The two key concepts for understanding why the two maps cannot be differentiated at runtime are:
type erasure, where the compile-time parameterised type Map[String, String] becomes the runtime class Map
pattern matching, which translates to runtime isInstanceOf and asInstanceOf calls
Consider the following simplified example
case class Foo[T](i: T)
val fooInt = Foo[Int](42)
val fooStr = Foo[String]("")
fooInt.isInstanceOf[Foo[String]]
// val res0: Boolean = true
Note how isInstanceOf cannot check at runtime what the compile-time type parameter T was, so it cannot differentiate between fooInt and fooStr. Effectively, the best we can do is something like
fooInt.isInstanceOf[Foo[_]]
where the underscore _ serves to communicate the fact of type erasure.
Next consider how the following pattern match
(fooInt: Any) match { case str: Foo[String] => "oops :(" }
// val res1: String = oops :(
effectively becomes something like
if (fooInt.isInstanceOf[Foo[String]]) "oops :("
which again due to type erasure incorrectly evaluates to "oops :(".
An alternative approach is to try to do as much as possible at compile-time before type arguments are discarded. Typeclasses can be thought of as kind of compile-time pattern matching that happens before type erasure.
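For illustration, a minimal sketch of that idea with made-up names (MapHandler, process), using a plain type alias as a stand-in for the question's ScalaObject:
object TypeclassSketch {
  type ScalaObject = AnyRef // stand-in for whatever the question's alias refers to

  // The typeclass: which operation applies to a given map type.
  trait MapHandler[A] {
    def handle(a: A): String
  }

  // Instances are selected by the compiler, before any type arguments are erased.
  implicit val objMapHandler: MapHandler[Map[String, ScalaObject]] =
    new MapHandler[Map[String, ScalaObject]] {
      def handle(a: Map[String, ScalaObject]): String = "operation1"
    }
  implicit val strMapHandler: MapHandler[Map[String, String]] =
    new MapHandler[Map[String, String]] {
      def handle(a: Map[String, String]): String = "operation2"
    }

  def process[A](a: A)(implicit h: MapHandler[A]): String = h.handle(a)

  val r1 = process(Map("k" -> "v"))                             // "operation2"
  val r2 = process(Map[String, ScalaObject]("k" -> new Object)) // "operation1"
}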

Configure spray-json for non strict parsing deserialization

How can I configure spray-json's parsing options?
Something similar to Jackson's parsing features.
For example, I am parsing JSON that is missing a field my case class declares, and it breaks:
spray.json.DeserializationException: Object is missing required member 'myfield'
UPDATE:
A simple example:
case class MyClass(a: String, b: Long)
and try to parse an incomplete JSON like
val data = "{\"a\": \"hi\"}"
with a spray-json format like:
jsonFormat2(MyClass.apply)
// ...
data.parseJson.convertTo[MyClass]
(simplified code).
But the question goes further: I want to ask about configuration options like in other parsers. More examples:
Be able to ignore fields that exist in the JSON but not in the case class.
Ways of managing nulls or nonexistent values.
etc.
SprayJson allows you to define custom parsers like so:
import spray.json._
import DefaultJsonProtocol._

case class Foo(a: String, b: Int)

implicit object FooJsonFormat extends RootJsonFormat[Foo] {
  override def read(json: JsValue): Foo = {
    json.asJsObject.getFields("name", "id") match {
      case Seq(JsString(name), id) =>
        Foo(name, id.convertTo[Int])
    }
  }
  // Write the JSON explicitly; calling obj.toJson here would recurse into this same format.
  override def write(obj: Foo): JsValue =
    JsObject("name" -> JsString(obj.a), "id" -> JsNumber(obj.b))
}
This allows you to parse any arbitrary payload and pull out the fields "name" and "id" - other fields are ignored. If those fields are not guaranteed you can add something like:
case Seq(JsString(name), JsNull) =>
  Foo(name, 0)
You should look at what's available in JsValue.scala - in particular JsArray may come in handy if you're getting payloads with anonymous arrays (i.e. the root is [{...}] instead of {"field":"value"...})
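For example, here is a small sketch of reading a root-level array with spray-json, reusing the FooJsonFormat above (the payload is made up):
import spray.json._

// Parse a root-level array and convert each element with the format defined above.
val foos: List[Foo] =
  """[{"name":"a","id":1},{"name":"b","id":2}]""".parseJson match {
    case JsArray(elements) => elements.map(_.convertTo[Foo]).toList
    case other             => deserializationError(s"Expected a JSON array, got $other")
  }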
spray-json doesn't support default parameters, so you cannot have a case class like
case class MyClass(a: String, b: Int = 0)
and then parse JSON like {"a":"foo"}.
However, if you make the second parameter an Option, it works:
import spray.json._

case class MyClass(a: String, b: Option[Int] = None)

object MyProtocol extends DefaultJsonProtocol {
  implicit val f = jsonFormat2(MyClass)
}

import MyProtocol.f

val mc1 = MyClass("foo", Some(10))
val strJson = mc1.toJson.toString
val strJson2 = """{"a": "foo"}"""
val mc2 = strJson2.parseJson.convertTo[MyClass]
println(mc2)

How can I define a dynamic base json object?

I would like to design a base trait/class in Scala that can produce the following json:
trait GenericResource {
  val singularName: String
  val pluralName: String
}
I would inherit this trait in a case class:
case class Product(name: String) extends GenericResource {
  override val singularName = "product"
  override val pluralName = "products"
}
val car = Product("car")
val jsonString = serialize(car)
The output should look like: {"product":{"name":"car"}}
A Seq[Product] should produce {"products":[{"name":"car"},{"name":"truck"}]} etc...
I'm struggling with the proper abstractions to accomplish this. I am open to solutions using any JSON library (available in Scala).
Here's about the simplest way I can think of to do the singular part generically with circe:
import io.circe.{ Decoder, Encoder, Json }
import io.circe.generic.encoding.DerivedObjectEncoder

trait GenericResource {
  val singularName: String
  val pluralName: String
}

object GenericResource {
  implicit def encodeResource[A <: GenericResource](implicit
    derived: DerivedObjectEncoder[A]
  ): Encoder[A] = Encoder.instance { a =>
    Json.obj(a.singularName -> derived(a))
  }
}
And then if you have some case class extending GenericResource like this:
case class Product(name: String) extends GenericResource {
  val singularName = "product"
  val pluralName = "products"
}
You can do this (assuming all the members of the case class are encodeable):
scala> import io.circe.syntax._
import io.circe.syntax._
scala> Product("car").asJson.noSpaces
res0: String = {"product":{"name":"car"}}
No boilerplate, no extra imports, etc.
The Seq case is a little trickier, since circe automatically provides a Seq[A] encoder for any A that has an Encoder, but it doesn't do what you want—it just encodes the items and sticks them in a JSON array. You can write something like this:
implicit def encodeResources[A <: GenericResource](implicit
  derived: DerivedObjectEncoder[A]
): Encoder[Seq[A]] = Encoder.instance {
  case values @ (head +: _) =>
    Json.obj(head.pluralName -> Encoder.encodeList(derived)(values.toList))
  case Nil => Json.obj()
}
And use it like this:
scala> Seq(Product("car"), Product("truck")).asJson.noSpaces
res1: String = {"products":[{"name":"car"},{"name":"truck"}]}
But you can't just stick it in the companion object and expect everything to work—you have to put it somewhere and import it when you need it (otherwise it has the same priority as the default Seq[A] instances).
Another issue with this encodeResources implementation is that it just returns an empty object if the Seq is empty:
scala> Seq.empty[Product].asJson.noSpaces
res2: String = {}
This is because the plural name is attached to the resource at the instance level, and if you don't have an instance there's no way to get it (short of reflection). You could of course conjure up a fake instance by passing nulls to the constructor or whatever, but that seems out of the scope of this question.
This issue (the resource names being attached to instances) is also going to be trouble if you need to decode the JSON you've encoded. If that is the case, I'd suggest considering a slightly different approach where you have something like a GenericResourceCompanion trait that you mix into the companion object for the specific resource type and indicate the names there. If that's not an option, you're probably stuck with reflection or fake instances, or both (but again, probably not in scope for this question).
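For what it's worth, a rough sketch of that companion-based shape (GenericResourceCompanion is a made-up name, as in the suggestion above), which makes the names available without an instance:
trait GenericResourceCompanion {
  def singularName: String
  def pluralName: String
}

case class Product(name: String)

// The names live on the companion, so an empty Seq (or a decoder) can still
// look them up without a Product instance in hand.
object Product extends GenericResourceCompanion {
  val singularName = "product"
  val pluralName = "products"
}
An Encoder[Seq[Product]] or a decoder could then take the companion (for example as an extra implicit parameter) instead of reading pluralName off an element.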

Overcoming type erasure in Scala when pattern matching on Objects which may be different Sets or any type of Object

Is there any way of pattern matching objects where the objects may be Set[Foo] or Set[Bar] and the matched object can be any Object?
Given the below code, trying to pattern match on Set[Bar] will result in a match of Set[Foo] because of type erasure.
import play.api.libs.json._
import scala.collection.immutable.HashMap

case class Foo(valOne: Int, valTwo: Double)
object Foo {
  implicit val writesFoo = Json.writes[Foo]
}

case class Bar(valOne: String)
object Bar {
  implicit val writesBar = Json.writes[Bar]
}

case class TestRequest(params: Map[String, Object])
object TestRequest {
  import play.api.libs.json.Json.JsValueWrapper

  implicit val writeAnyMapFormat = new Writes[Map[String, Object]] {
    def writes(map: Map[String, Object]): JsValue = {
      Json.obj(map.map {
        case (s, a) => {
          val ret: (String, JsValueWrapper) = a match {
            case _: String => s -> JsString(a.asInstanceOf[String])
            case _: java.util.Date => s -> JsString(a.asInstanceOf[String])
            case _: Integer => s -> JsString(a.toString)
            case _: java.lang.Double => s -> JsString(a.toString)
            case None => s -> JsNull
            case foo: Set[Foo] => s -> Json.toJson(a.asInstanceOf[Set[Foo]])
            case bar: Set[Bar] => s -> Json.toJson(a.asInstanceOf[Set[Bar]])
            case str: Set[String] => s -> Json.toJson(a.asInstanceOf[Set[String]])
          }
          ret
        }
      }.toSeq: _*)
    }
  }

  implicit val writesTestRequest = Json.writes[TestRequest]
}

object MakeTestRequest extends App {
  val params = HashMap[String, Object]("name" -> "NAME", "fooSet" -> Set(Foo(1, 2.0)), "barSet" -> Set(Bar("val1")))
  val testRequest = new TestRequest(params)
  println(Json.toJson(testRequest))
}
Trying to serialise the TestRequest will result in:
Exception in thread "main" java.lang.ClassCastException: Bar cannot be cast to Foo
Delegating the pattern matching of Sets to another method in an attempt to get the TypeTag,
case _ => s -> matchSet(a)
results in the type, unsurprisingly, of Object.
def matchSet[A: TypeTag](set: A): JsValue = typeOf[A] match {
  case fooSet: Set[Foo] if typeOf[A] =:= typeOf[Foo] => Json.toJson(set.asInstanceOf[Set[Foo]])
  case barSet: Set[Bar] if typeOf[A] =:= typeOf[Bar] => Json.toJson(set.asInstanceOf[Set[Bar]])
}
The runtime error being:
Exception in thread "main" scala.MatchError: java.lang.Object (of class scala.reflect.internal.Types$ClassNoArgsTypeRef)
A workaround could be to check the instance of the first element in the Set but this seems inefficient and ugly. Could also match on the key eg fooSet or barSet but if the keys are the same name eg both called set, then this wouldn't work.
In Scala 2.11, is there any way to get at the type/class the Set has been created with?
You could use Shapeless typeable. Note that this is still not 100% safe (e.g. empty lists of different types cannot be distinguished at runtime, because that information literally doesn't exist); under the hood it's doing things like checking the types of the elements using reflection, just with a nicer interface on top.
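A rough sketch of what the Typeable route could look like against the question's Foo/Bar sets (shapeless 2.x assumed, with the Writes instances from the question in scope):
import play.api.libs.json._
import shapeless.syntax.typeable._

// cast[T] returns Some only if the runtime value (including its elements)
// conforms to T, so non-empty Set[Foo] and Set[Bar] can be told apart.
def setToJson(a: Any): JsValue =
  a.cast[Set[Foo]].map(Json.toJson(_))
    .orElse(a.cast[Set[Bar]].map(Json.toJson(_)))
    .orElse(a.cast[Set[String]].map(Json.toJson(_)))
    .getOrElse(JsNull)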
In general it's better to carry the type information around explicitly, e.g. by using a case class (or a shapeless HMap) rather than an untyped Map. Less good, but still better than nothing, is using wrapper case classes for the different types of Set that are possible, so that each one is a different type at runtime.
(Also half the point of the pattern match is to avoid the asInstanceOf; you should use e.g. foo rather than a.asInstanceOf[Set[Foo]])
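And a sketch of the wrapper-class idea: each kind of Set gets its own runtime class, so an ordinary pattern match with a binder is enough (SetParam, FooSet and BarSet are hypothetical names):
import play.api.libs.json._

sealed trait SetParam
final case class FooSet(values: Set[Foo]) extends SetParam
final case class BarSet(values: Set[Bar]) extends SetParam

def toJsonValue(p: SetParam): JsValue = p match {
  case FooSet(foos) => Json.toJson(foos) // no asInstanceOf needed
  case BarSet(bars) => Json.toJson(bars)
}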

Json Serialization for Trait with Multiple Case Classes (Sum Types) in Scala's Play

I often have to serialize/deserialize sum types (like Either[S,T]), and I haven't yet found a general or elegant way to do it. Here's an example type (essentially equivalent to Either)
sealed trait OutcomeType
case class NumericOutcome(units: String) extends OutcomeType
case class QualitativeOutcome(outcomes: List[String]) extends OutcomeType
Here's my best effort at a companion object that implements serialization. It works, but it's very tiresome to write these sorts of things over and over for every sum type. Are there any suggestions for making it nicer and/or more general?
import play.api.libs.json._
import play.api.libs.functional.syntax._
object OutcomeType {
  val fmtNumeric = Json.format[NumericOutcome]
  val fmtQualitative = Json.format[QualitativeOutcome]

  implicit object FormatOutcomeType extends Format[OutcomeType] {
    def writes(o: OutcomeType) = o match {
      case n @ NumericOutcome(_) => Json.obj("NumericOutcome" -> Json.toJson(n)(fmtNumeric))
      case q @ QualitativeOutcome(_) => Json.obj("QualitativeOutcome" -> Json.toJson(q)(fmtQualitative))
    }
    def reads(json: JsValue) = (
      Json.fromJson(json \ "NumericOutcome")(fmtNumeric) orElse
      Json.fromJson(json \ "QualitativeOutcome")(fmtQualitative)
    )
  }
}
I think that is about as simple as you can make it. If you want to avoid writing the code for each explicit subtype, maybe you could do it with reflection, use Jackson directly or some other JSON library with reflection support, or write your own macro to generate the Format from a list of subtypes.
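For what it's worth, here is a rough sketch of the "generate the Format from a list of subtypes" idea done as a plain runtime helper rather than a macro; Alt and sumFormat are made-up names, not a Play API:
import play.api.libs.json._
import scala.reflect.ClassTag

// One entry per subtype: the wrapper tag, the runtime class, and its Format.
final case class Alt[T](tag: String, clazz: Class[_], fmt: Format[T])
object Alt {
  def of[T, A <: T](tag: String, fmt: Format[A])(implicit ct: ClassTag[A]): Alt[T] =
    Alt(tag, ct.runtimeClass, fmt.asInstanceOf[Format[T]]) // only ever applied to values of class A
}

def sumFormat[T](alts: Alt[T]*): Format[T] = new Format[T] {
  def writes(t: T): JsValue = alts.find(_.clazz.isInstance(t)) match {
    case Some(alt) => Json.obj(alt.tag -> alt.fmt.writes(t))
    case None      => throw new IllegalArgumentException(s"No Format registered for $t")
  }
  def reads(json: JsValue): JsResult[T] =
    alts.foldLeft[JsResult[T]](JsError("no matching tag")) { (acc, alt) =>
      acc orElse (json \ alt.tag).validate(alt.fmt)
    }
}

// Used in place of the hand-written FormatOutcomeType, the remaining
// boilerplate shrinks to one line per subtype:
implicit val outcomeTypeFormat: Format[OutcomeType] = sumFormat(
  Alt.of[OutcomeType, NumericOutcome]("NumericOutcome", Json.format[NumericOutcome]),
  Alt.of[OutcomeType, QualitativeOutcome]("QualitativeOutcome", Json.format[QualitativeOutcome])
)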
I have a systematic solution for the problem of serializing sum types in my JSON pickling library Prickle. Similar ideas could be employed with Play. There is still some config code required, but it's high signal/noise, e.g. final code like:
implicit val fruitPickler = CompositePickler[Fruit].concreteType[Apple].concreteType[Lemon]
CompositePicklers associated with a supertype are configured with one PicklerPair for each known subtype (i.e. sum type option). The associations are set up at configure time.
During pickling a descriptor is emitted into the json stream describing which subtype the record is.
During unpickling, the descriptor is read out of the json and then used to locate the appropriate Unpickler for the subtype
An example updated for play 2.5:
import play.api.libs.json._

object TestContact extends App {
  sealed trait Shape

  object Shape {
    val rectFormat = Json.format[Rect]
    val circleFormat = Json.format[Circle]

    implicit object ShapeFormat extends Format[Shape] {
      override def writes(shape: Shape): JsValue = shape match {
        case rect: Rect =>
          Json.obj("Shape" ->
            Json.obj("Rect" ->
              Json.toJson(rect)(rectFormat)))
        case circle: Circle =>
          Json.obj("Shape" ->
            Json.obj("Circle" ->
              Json.toJson(circle)(circleFormat)))
      }

      override def reads(json: JsValue): JsResult[Shape] = {
        json \ "Shape" \ "Rect" match {
          case JsDefined(rectJson) => rectJson.validate[Rect](rectFormat)
          case _ => json \ "Shape" \ "Circle" match {
            case JsDefined(circleJson) => circleJson.validate[Circle](circleFormat)
            case _ => JsError("Not a valid Shape object.")
          }
        }
      }
    }
  }

  case class Rect(width: Double, height: Double) extends Shape
  case class Circle(radius: Double) extends Shape

  val circle = Circle(2.1)
  println(Json.toJson(circle))
  val rect = Rect(1.3, 8.9)
  println(Json.toJson(rect))

  var json = Json.obj("Shape" -> Json.obj("Circle" -> Json.obj("radius" -> 4.13)))
  println(json.validate[Shape])

  json =
    Json.obj("Shape" ->
      Json.obj("Rect" ->
        Json.obj("width" -> 23.1, "height" -> 34.7)))
  println(json.validate[Shape])
}