Scala - play json trait serialisation - json

In Scala, I have the following trait and classes defined (the names are only for illustration):
trait Entity {
  def x(): Collection
}
case class X(x: Int, y: Int) extends Entity {
  def x(): Collection = XCollection()
}
case class Y(x: Int, y: Int) extends Entity {
  def x(): Collection = YCollection()
}
The class instances are created by parsing the response from a web service's REST API.
Although this approach with the play-json library works for parsing the response and returning a class representation of it, I've been struggling with the following: writing a generic function that takes a type parameter T, where T <: Entity, and returns an instance of type T.
For example, consider the following:
def parse[T <: Entity](json : String) : Option[T] = Json.parse(json).asOpt[T](Variants.format[T])
Given a type T, I would like to parse the JSON string and produce an instance of type T, where T is a subtype of the trait Entity. However, I keep getting a compile error related to the reflection API:
Error:(25, 96) exception during macro expansion:
scala.ScalaReflectionException: type T is not a class
at scala.reflect.api.Symbols$SymbolApi$class.asClass(Symbols.scala:323)
at scala.reflect.internal.Symbols$SymbolContextApiImpl.asClass(Symbols.scala:73)
at julienrf.variants.Variants$Impl$.baseAndVariants(Variants.scala:132)
at julienrf.variants.Variants$Impl$.formatDiscriminator(Variants.scala:99)
at julienrf.variants.Variants$Impl$.format(Variants.scala:94)
def parse[T <: Entity](json : String) : Option[T] = Json.parse(json).asOpt[T](Variants.format[T])
^
I would appreciate some help!
Thanks

You're better off letting implicits solve your format requirements than using some kind of factory:
def parse[T <: Entity](json: String)(implicit r: Reads[T]): Option[T] =
  Json.parse(json).asOpt[T]
Then, if a format is implicitly defined in the current context, parse will work:
implicit val XFormat = Json.format[X]
parse[X](Json.stringify(Json.toJson(X(1, 2)))) // returns Some(X(1, 2))
Update
You can make it a factory if you really want to. I would question whether it's worth doing, but in theory I can imagine some situations where you would not want to use the implicit mechanics. Still, I think that if implicits don't work for you, you might have an architectural problem in your code:
import play.api.libs.json.{Json, Reads}
import scala.reflect.runtime.universe._
trait Entity
case class X(x: Int, y: Int) extends Entity
case class Y(x: Int, y: Int) extends Entity
val mapping = Map[Type, Reads[_]](typeOf[X] -> Json.format[X], typeOf[Y] -> Json.format[Y])
def getFormat[T](tpe: Type): Reads[T] =
  mapping(tpe).asInstanceOf[Reads[T]]
def parse[T: TypeTag](json: String): Option[T] =
  Json.parse(json).asOpt[T](getFormat(implicitly[TypeTag[T]].tpe))
println(parse[X]("""{"x": 5, "y": 6}"""))
println(parse[Y]("""{"x": 5, "y": 6}"""))

Related

Configure spray-json for non strict parsing deserialization

How can I configure spray-json's parsing options?
Similar to Jackson's parsing features.
For example, I am parsing a JSON object that has a field my case class does not have, and it breaks:
spray.json.DeserializationException: Object is missing required member 'myfield'
UPDATE:
A simple example:
case class MyClass(a: String, b: Long)
and try to parse an incomplete JSON like
val data = "{\"a\": \"hi\"}"
with a spray-json format like:
jsonFormat2(MyClass.apply)
// ...
data.parseJson.convertTo[MyClass]
(simplified code).
But the question goes further: I want to ask about configuration options like those in other parsers. More examples:
Be able to ignore fields that exist in the JSON but not in the case class.
Ways of managing nulls or nonexistent values.
etc.
spray-json allows you to define custom parsers like so:
import spray.json._
import DefaultJsonProtocol._
case class Foo(a: String, b: Int)
implicit object FooJsonFormat extends RootJsonFormat[Foo] {
  override def read(json: JsValue): Foo = {
    json.asJsObject.getFields("name", "id") match {
      case Seq(JsString(name), id) =>
        Foo(name, id.convertTo[Int])
    }
  }
  // build the JSON explicitly; delegating to obj.toJson here would recurse back into this format
  override def write(obj: Foo): JsValue = JsObject("name" -> JsString(obj.a), "id" -> JsNumber(obj.b))
}
This allows you to parse any arbitrary payload and pull out the fields "name" and "id" - other fields are ignored. If those fields are not guaranteed you can add something like:
case Seq(JsString(name), JsNull) =>
  Foo(name, 0)
You should look at what's available in JsValue.scala - in particular JsArray may come in handy if you're getting payloads with anonymous arrays (i.e. the root is [{...}] instead of {"field":"value"...})
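For example, here is a minimal sketch of unwrapping a top-level JsArray before converting each element (parseFoos is a made-up helper name, and it reuses the imports and FooJsonFormat above):
// hypothetical helper: handles both a root array of Foo objects and a single Foo object
def parseFoos(payload: String): List[Foo] =
  payload.parseJson match {
    case JsArray(elements) => elements.map(_.convertTo[Foo]).toList
    case single            => List(single.convertTo[Foo])
  }
A payload like [{"name":"a","id":1},{"name":"b","id":2}] would then yield List(Foo(a,1), Foo(b,2)).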
spray-json doesn't support default parameters, so you cannot have a case class like
case class MyClass(a: String, b: Int = 0)
and then parse JSON like {"a":"foo"}.
However, if you make the second parameter an Option, then it works.
import spray.json._
case class MyClass(a: String, b: Option[Int] = None)
object MyProtocol extends DefaultJsonProtocol {
  implicit val f = jsonFormat2(MyClass)
}
import MyProtocol.f
val mc1 = MyClass("foo", Some(10))
val strJson = mc1.toJson.toString
val strJson2 = """{"a": "foo"}"""
val mc2 = strJson2.parseJson.convertTo[MyClass]
println(mc2)
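For reference, the round trip above should print roughly the following (the exact field order in strJson may vary):
println(strJson) // {"a":"foo","b":10}
println(mc2)     // MyClass(foo,None)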

How can I define a dynamic base json object?

I would like to design a base trait/class in Scala whose subclasses serialize to the JSON shown below. I have the following trait:
trait GenericResource {
  val singularName: String
  val pluralName: String
}
I would inherit this trait in a case class:
case class Product(name: String) extends GenericResource {
  override val singularName = "product"
  override val pluralName = "products"
}
val car = Product("car")
val jsonString = serialize(car)
the output should look like: {"product":{"name":"car"}}
A Seq[Product] should produce {"products":[{"name":"car"},{"name":"truck"}]} etc...
I'm struggling with the proper abstractions to accomplish this. I am open to solutions using any JSON library (available in Scala).
Here's about the simplest way I can think of to do the singular part generically with circe:
import io.circe.{ Decoder, Encoder, Json }
import io.circe.generic.encoding.DerivedObjectEncoder
trait GenericResource {
  val singularName: String
  val pluralName: String
}
object GenericResource {
  implicit def encodeResource[A <: GenericResource](implicit
    derived: DerivedObjectEncoder[A]
  ): Encoder[A] = Encoder.instance { a =>
    Json.obj(a.singularName -> derived(a))
  }
}
And then if you have some case class extending GenericResource like this:
case class Product(name: String) extends GenericResource {
  val singularName = "product"
  val pluralName = "products"
}
You can do this (assuming all the members of the case class are encodeable):
scala> import io.circe.syntax._
import io.circe.syntax._
scala> Product("car").asJson.noSpaces
res0: String = {"product":{"name":"car"}}
No boilerplate, no extra imports, etc.
The Seq case is a little trickier, since circe automatically provides a Seq[A] encoder for any A that has an Encoder, but it doesn't do what you want—it just encodes the items and sticks them in a JSON array. You can write something like this:
implicit def encodeResources[A <: GenericResource](implicit
  derived: DerivedObjectEncoder[A]
): Encoder[Seq[A]] = Encoder.instance {
  case values @ (head +: _) =>
    Json.obj(head.pluralName -> Encoder.encodeList(derived)(values.toList))
  case Nil => Json.obj()
}
And use it like this:
scala> Seq(Product("car"), Product("truck")).asJson.noSpaces
res1: String = {"products":[{"name":"car"},{"name":"truck"}]}
But you can't just stick it in the companion object and expect everything to work—you have to put it somewhere and import it when you need it (otherwise it has the same priority as the default Seq[A] instances).
Another issue with this encodeResources implementation is that it just returns an empty object if the Seq is empty:
scala> Seq.empty[Product].asJson.noSpaces
res2: String = {}
This is because the plural name is attached to the resource at the instance level, and if you don't have an instance there's no way to get it (short of reflection). You could of course conjure up a fake instance by passing nulls to the constructor or whatever, but that seems out of the scope of this question.
This issue (the resource names being attached to instances) is also going to be trouble if you need to decode this JSON you've encoded. If that is the case, I'd suggest considering a slightly different approach where you have something like a GenericResourceCompanion trait that you mix into the companion object for the specific resource type, and to indicate the names there. If that's not an option, you're probably stuck with reflection or fake instances, or both (but again, probably not in scope for this question).
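For what it's worth, here is a rough sketch of that companion-based idea with circe (GenericResourceCompanion and encodeProducts are made-up names, and the inner object encoding is hand-rolled for brevity):
import io.circe.{ Encoder, Json }
// hypothetical trait for companions: the resource names live on the companion object, not on instances
trait GenericResourceCompanion {
  def singularName: String
  def pluralName: String
}
case class Product(name: String)
object Product extends GenericResourceCompanion {
  val singularName = "product"
  val pluralName = "products"
}
// because the names come from the companion, even an empty Seq can be labelled correctly
implicit val encodeProducts: Encoder[Seq[Product]] = Encoder.instance { values =>
  Json.obj(Product.pluralName -> Json.arr(values.map(p => Json.obj("name" -> Json.fromString(p.name))): _*))
}
Because the names are available statically, both an encoder and a decoder can consult the companion rather than an instance.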

How to make Spray Json throw exception when it sees extra fields?

For example suppose I have
case class Test(a: String, b: String)
...
implicit val testFormat = jsonFormat2(Test.apply)
and a json with an extra c field:
val test = "{\"a\": \"A\", \"b\": \"B\", \"c\": \"C\"}"
then I want to find a way (config/param/whatever) to make the following line throw an exception:
test.parseJson.convertTo[Test]
It's very hard to work this out from reading the source code and the GitHub documentation.
I didn't see anything in the library that provides that functionality, so I created a wrapper that does a quick check on the number of fields present before delegating the read call to the supplied formatter.
case class StrictRootFormat[T](format: RootJsonFormat[T], size: Int) extends RootJsonFormat[T] {
  def read(json: JsValue): T = {
    if (json.asJsObject.fields.size > size) deserializationError("JSON has too many fields: \n " + json.toString())
    else format.read(json)
  }
  def write(obj: T): JsValue = format.write(obj)
}
Usage:
implicit val testFormat = StrictRootFormat(jsonFormat2(Test.apply), 2)
You could enhance the read implementation so that you don't need to supply the "size" argument.
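If you'd rather not pass the size argument, one possible enhancement (a sketch only, not something spray-json provides; StrictFieldsFormat is a made-up name) is to round-trip through the delegate format and compare field names:
// sketch: reject any field that the delegate format does not write back out
case class StrictFieldsFormat[T](format: RootJsonFormat[T]) extends RootJsonFormat[T] {
  def read(json: JsValue): T = {
    val value = format.read(json)
    val knownFields = format.write(value).asJsObject.fields.keySet
    val extraFields = json.asJsObject.fields.keySet -- knownFields
    if (extraFields.nonEmpty) deserializationError("JSON has unknown fields: " + extraFields.mkString(", "))
    value
  }
  def write(obj: T): JsValue = format.write(obj)
}
implicit val testFormat = StrictFieldsFormat(jsonFormat2(Test.apply))
Note that this assumes the delegate writes out every field it reads; Option members that come back as None are omitted on write and would be reported as unknown, so treat it as a starting point.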

Create a generic Json serialization function

Is it possible to create a generic function in Scala, using Play Framework 2.2, that will serialize an arbitrary object to JSON, without having to be supplied a writer or formatter?
For instance, this non-generic code will create a JSON response given a Customer:
import play.api.libs.json._
import play.api.libs.functional.syntax._
case class Customer(id: Int, name: String)
object scratch {
  val p = Customer(1, "n")
  //> p : Customer = Customer(1,n)
  def createJsonResponseCustomer(data: Customer) = {
    implicit val formatter = Json.format[Customer]
    Json.obj("success" -> true, "data" -> Json.toJson[Customer](data))
  }
  createJsonResponseCustomer(p)
  //> res0: play.api.libs.json.JsObject = {"success":true,"data":{"id":1,"name":"n"}}
}
To avoid having to define the formatter for each different object, I'd like to create a generic function like this:
def createJsonResponse[T](data: T) = {
  implicit val formatter = Json.format[T]
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
}
But this attempt produces the error No unapply function found at Json.format[T].
In other words, this works:
def getFormatter(c: Customer) = Json.format[Customer]
but this doesn't:
def getFormatterGeneric[T](c: T) = Json.format[T]
Is there any way around this?
You need to define the formatter somewhere, for each type you wish to read or write. This is because the formatter instances are resolved at compile time, not at runtime. This is a good thing, because it means trying to serialize a type that does not have a serializer becomes a compile-time error, not a runtime one.
Instead of defining the formatters on the fly, define them in a module that you can reuse, e.g.
object JsonFormatters {
  implicit val customerWrites: Format[Customer] = Json.format[Customer]
}
Then import JsonFormatters._ in the scope that you want to write some JSON.
Now, you can write a generic method similar to what you wanted: you just have to specify the requirement for a formatter in the signature of your method. In practice, this is an implicit parameter of type Writes[T].
def createJsonResponse[T](data: T)(implicit writes: Writes[T]) =
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
You can also write this method signature using context bound syntax, i.e.
def createJsonResponse[T : Writes](data: T) = ...
This requires that there is an instance of Writes[T] in scope; but the compiler will choose the correct instance for you based on the type T, rather than you resolving it explicitly.
Note that Writes[T] is a supertype of Format[T]; since you are only writing JSON in this method, there's no need to specify a requirement for Format[T], which would also give you Reads[T].
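Putting it together, usage would look roughly like this (reusing the Customer case class and the JsonFormatters object from above):
import JsonFormatters._
val response = createJsonResponse(Customer(1, "n"))
// response: play.api.libs.json.JsObject = {"success":true,"data":{"id":1,"name":"n"}}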

Convert polymorphic case classes to json and back

I am trying to use spray-json in Scala to recognize the choice between Ec2Provider and OpenstackProvider when converting to JSON and back.
I would like to be able to offer a choice of Provider, and if the input doesn't match one of the available providers, it should not validate.
My attempt at this can be seen in the following code:
import spray.json._
import DefaultJsonProtocol._
case class Credentials(username: String, password: String)
abstract class Provider
case class Ec2Provider(endpoint: String,credentials: Credentials) extends Provider
case class OpenstackProvider(credentials: Credentials) extends Provider
case class Infrastructure(name: String, provider: Provider, availableInstanceTypes: List[String])
case class InfrastructuresList(infrastructures: List[Infrastructure])
object Infrastructures extends App with DefaultJsonProtocol {
  implicit val credFormat = jsonFormat2(Credentials)
  implicit val ec2Provider = jsonFormat2(Ec2Provider)
  implicit val novaProvider = jsonFormat1(OpenstackProvider)
  implicit val infraFormat = jsonFormat3(Infrastructure)
  implicit val infrasFormat = jsonFormat1(InfrastructuresList)
  println(
    InfrastructuresList(
      List(
        Infrastructure("test", Ec2Provider("nova", Credentials("user", "pass")), List("1", "2"))
      )
    ).toJson
  )
}
Unfortunately, it fails because it cannot find a formatter for the abstract class Provider.
test.scala:19: could not find implicit value for evidence parameter of type Infrastructures.JF[Provider]
Anyone have any solution for this?
What you want to do is not available out of the box (i.e. via something like type hints that allow the deserializer to know what concrete class to instantiate), but it's certainly possible with a little leg work. First, the example, using a simplified version of the code you posted above:
case class Credentials(user: String, password: String)
abstract class Provider
case class Ec2Provider(endpoint: String, creds: Credentials) extends Provider
case class OpenstackProvider(creds: Credentials) extends Provider
case class Infrastructure(name: String, provider: Provider)
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit object ProviderJsonFormat extends RootJsonFormat[Provider] {
    def write(p: Provider) = p match {
      case ec2: Ec2Provider => ec2.toJson
      case os: OpenstackProvider => os.toJson
    }
    def read(value: JsValue) = value match {
      case obj: JsObject if (obj.fields.size == 2) => value.convertTo[Ec2Provider]
      case obj: JsObject => value.convertTo[OpenstackProvider]
    }
  }
  implicit val credFmt = jsonFormat2(Credentials)
  implicit val ec2Fmt = jsonFormat2(Ec2Provider)
  implicit val openStackFmt = jsonFormat1(OpenstackProvider)
  implicit val infraFmt = jsonFormat2(Infrastructure)
}
object PolyTest {
  import MyJsonProtocol._
  def main(args: Array[String]) {
    val infra = List(
      Infrastructure("ec2", Ec2Provider("foo", Credentials("me", "pass"))),
      Infrastructure("openstack", OpenstackProvider(Credentials("me2", "pass2")))
    )
    val json = infra.toJson.toString
    val infra2 = JsonParser(json).convertTo[List[Infrastructure]]
    println(infra == infra2)
  }
}
In order to be able to serialize/deserialize instances of the abstract class Provider, I've created a custom formatter where I am supplying operations for reading and writing Provider instances. All I'm doing in these functions though is checking a simple condition (binary here as there are only 2 impls of Provider) to see what type it is and then delegating to logic to handle that type.
For writing, I just need to know which instance type it is, which is easy. Reading is a little trickier: I'm checking how many properties the object has, and since the two implementations have different numbers of properties, I can differentiate which is which this way. The check I'm making here is very rudimentary, but it shows the point: if you can look at the JSON AST and differentiate the types, you can then pick which one to deserialize to. Your actual check can be as simple or as complicated as you like, as long as it is deterministic in differentiating the types.
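If counting fields feels too fragile, a common alternative (sketched below as a drop-in replacement for the ProviderJsonFormat above, not something spray-json provides out of the box) is to write an explicit discriminator field and dispatch on it when reading:
implicit object ProviderJsonFormat extends RootJsonFormat[Provider] {
  // add a "type" discriminator on write
  def write(p: Provider) = p match {
    case ec2: Ec2Provider => JsObject(ec2.toJson.asJsObject.fields + ("type" -> JsString("ec2")))
    case os: OpenstackProvider => JsObject(os.toJson.asJsObject.fields + ("type" -> JsString("openstack")))
  }
  // dispatch on the discriminator on read; the generated jsonFormatN formats ignore the extra "type" field
  def read(value: JsValue) = value.asJsObject.fields.get("type") match {
    case Some(JsString("ec2")) => value.convertTo[Ec2Provider]
    case Some(JsString("openstack")) => value.convertTo[OpenstackProvider]
    case other => deserializationError("Unknown provider type: " + other)
  }
}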