Parsing a JSON collection using Play Scala Reads

Assume that I have a case class:
case class Element(name: String)
and a companion object which has a Reads:
object Element {
implicit val elementReads = (
(JsPath \ "name").read[String]
)
}
I need a map of these objects, so:
object ElementMap {
def apply(elements: Iterable[Element]) = {
val builder = Map.newBuilder[String, Element]
elements.foreach { x => builder += ((x.name, x)) }
builder.result()
}
}
Finally, I have to create a Reads that reads this map from JSON:
implicit val elementMapReads = (
(JsPath \ "elements").read[Seq[Element]]
)(ElementMap(_))
However, I get an error message here:
overloaded method value read with alternatives: (t: Seq[Element])play.api.libs.json.Reads[Seq[Element]] <and> (implicit r: play.api.libs.json.Reads[Seq[Element]])play.api.libs.json.Reads[Seq[Element]] cannot be applied to (Iterable[Element] ⇒ Map[String,Element])

This is a working version:
case class Element(name: String)
object Element {
implicit val elementReads: Reads[Element] =
(JsPath \ "name").read[String].map(Element.apply)
implicit val elementMapReads: Reads[Map[String, Element]] =
(JsPath \ "elements").read[Seq[Element]].map(ElementMap(_))
}
object ElementMap {
def apply(elements: Iterable[Element]) = {
val builder = Map.newBuilder[String, Element]
elements.foreach { x => builder += ((x.name, x)) }
builder.result()
}
}
In your code, the elementReads implicit as defined is a Reads[String], not a Reads[Element]; in general it's better to use explicit type annotations with play-json, as there are implicit conversions going on that need them.
Also, Reads[_] has a map method that is convenient for single-field wrappers, and it also solves the problem of turning a Reads[Seq[Element]] into a Reads[Map[String, Element]].
Finally, moving elementMapReads into the Element companion object should make it automatically available (i.e. no import required when using it); I didn't test it, but it should work as far as I know.
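As an aside, the keyed-map construction in ElementMap can be written without an explicit builder; a minimal sketch in plain Scala (no play-json involved):

```scala
case class Element(name: String)

// Key each element by its name. As with the Map.newBuilder version,
// a later element silently overwrites an earlier one with the same name.
def elementMap(elements: Iterable[Element]): Map[String, Element] =
  elements.map(e => e.name -> e).toMap
```

Both variants are O(n); the builder version merely avoids an intermediate collection, which only matters for very large inputs.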

Related

Parsing an object-sequence with Play-JSON library

As a disclaimer: I'm very new to Scala and functional programming in general.
I have the following classes:
case class A(
field1: String,
field2: DateTime
)
case class B(
listOfStuff: Seq[A]
)
object A{
def create(field1: String, field2: DateTime): A = A(field1, field2)
}
object B{
def create(listOfStuff: Seq[A]): B = B(listOfStuff)
}
(The create() functions exist because I sometimes had issues in my code when using apply(). Let's ignore this, I doubt it's relevant.)
I get these objects in JSON format and I try to parse them using the Play-JSON library. An important aspect is that the list (Seq) can be missing from the JSON text, it's an optional field.
With that in mind, this is how I wrote that particular line:
private implicit val messageReader = (
//...
(__ \ "stuff").readNullable[Seq[A]].map(_.getOrElse(Seq()))
//...
)(B.create _)
When compiling, I get the following error:
Error:(!line!, !column!) No Json deserializer found for type Seq[A].
Try to implement an implicit Reads or Format for this type.
From what I saw in this question, apparently you need to have an implicit instance of Reads[T] for every type T that is not part of the language (or something like that, did I mention I'm new to this?)
So I also added a secondary Reads[T] in the same scope, my code now looked like this:
private implicit val messageReader = (
(__ \ "stuff").readNullable[Seq[A]].map(_.getOrElse(Seq()))
)(B.create _)
private implicit val anotherReader = (
(__ \ "field1").read[String] and
(__ \ "field2").read[String].map(org.joda.time.DateTime.parse)
)(A.create _)
However I still get that same error on the same line.
I feel like there's some very easy and obvious problem here, but I can't figure out what it is.
How do I fix this?
If B.create accepts a single argument, then its Reads can look like this:
private implicit val messageReader : Reads[B] =
(__ \ "stuff").readNullable[Seq[A]].map(_.getOrElse(Seq())).map(B.create)
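The `.map(_.getOrElse(Seq()))` step only manipulates the Option layer that readNullable introduces; its defaulting behaviour, taken in isolation, is just:

```scala
// readNullable[Seq[A]] produces an Option[Seq[A]]: None when the field
// is absent from the JSON. getOrElse collapses it back to a plain Seq.
def defaultToEmpty[A](parsed: Option[Seq[A]]): Seq[A] =
  parsed.getOrElse(Seq.empty)
```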

Making Reads and Writes in Scala Play for lists of custom classes

So I have two classes in my project
case class Item(id: Int, name: String)
and
case class Order(id: Int, items: List[Item])
I'm trying to make reads and writes properties for Order but I get a compiler error saying:
"No unapply or unapplySeq function found"
In my controller I have the following:
implicit val itemReads = Json.reads[Item]
implicit val itemWrites = Json.writes[Item]
implicit val listItemReads = Json.reads[List[Item]]
implicit val listItemWrites = Json.writes[List[Item]]
The code works for itemReads and itemWrites but not for the bottom two. Can anyone tell me where I'm going wrong? I'm new to the Play framework.
Thank you for your time.
The "No unapply or unapplySeq function found" error is caused by these two:
implicit val listItemReads = Json.reads[List[Item]]
implicit val listItemWrites = Json.writes[List[Item]]
Just throw them away. As Ende said, Play knows how to deal with lists.
But you need Reads and Writes for Order too! And since you do both reading and writing, it's simplest to define a Format, a mix of the Reads and Writes traits. This should work:
import play.api.libs.json._

case class Item(id: Int, name: String)
object Item {
  implicit val format: Format[Item] = Json.format[Item]
}
case class Order(id: Int, items: List[Item])
object Order {
  implicit val format: Format[Order] = Json.format[Order]
}
Above, the ordering is significant; Item and the companion object must come before Order.
So, once you have all the implicit converters needed, the key is to make them properly visible in the controllers. The above is one solution, but there are other ways, as I learned after trying to do something similar.
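To illustrate, with the companion-object formats in scope, an Order round-trips without any extra imports at the call site; a minimal sketch, assuming play-json is on the classpath:

```scala
import play.api.libs.json._

case class Item(id: Int, name: String)
object Item {
  implicit val format: Format[Item] = Json.format[Item]
}

case class Order(id: Int, items: List[Item])
object Order {
  implicit val format: Format[Order] = Json.format[Order]
}

val order = Order(10, List(Item(1, "a"), Item(2, "b")))
val js: JsValue = Json.toJson(order)   // serialize via Order.format
val back: Order = js.as[Order]         // deserialize via the same Format
```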
You don't actually need to define those two implicits; Play already knows how to deal with a list:
scala> import play.api.libs.json._
import play.api.libs.json._
scala> case class Item(id: Int, name: String)
defined class Item
scala> case class Order(id: Int, items: List[Item])
defined class Order
scala> implicit val itemReads = Json.reads[Item]
itemReads: play.api.libs.json.Reads[Item] = play.api.libs.json.Reads$$anon$8#478fdbc9
scala> implicit val itemWrites = Json.writes[Item]
itemWrites: play.api.libs.json.OWrites[Item] = play.api.libs.json.OWrites$$anon$2#26de09b8
scala> Json.toJson(List(Item(1, ""), Item(2, "")))
res0: play.api.libs.json.JsValue = [{"id":1,"name":""},{"id":2,"name":""}]
scala> Json.toJson(Order(10, List(Item(1, ""), Item(2, ""))))
res1: play.api.libs.json.JsValue = {"id":10,"items":[{"id":1,"name":""},{"id":2,"name":""}]}
The error you see probably happens because Play's Json.reads/Json.writes macros are built for case classes: they look for an apply/unapply pair of the right shape on the companion, which List does not provide.
This works:
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Item(id: Int, name: String)
case class Order(id: Int, items: List[Item])

implicit val itemFormat: Format[Item] = Json.format[Item]
implicit val orderFormat: Format[Order] = (
  (JsPath \ "id").format[Int] and
  (JsPath \ "items").format[JsArray].inmap(
    (arr: JsArray) => arr.value.map(_.as[Item]).toList,
    (items: List[Item]) => JsArray(items.map(Json.toJson(_)))
  )
)(Order.apply, unlift(Order.unapply))
This also allows you to customize the naming for your JSON object. Below is an example of the serialization in action.
Json.toJson(Order(1, List(Item(2, "Item 2"))))
res0: play.api.libs.json.JsValue = {"id":1,"items":[{"id":2,"name":"Item 2"}]}
Json.parse(
"""
|{"id":1,"items":[{"id":2,"name":"Item 2"}]}
""".stripMargin).as[Order]
res1: Order = Order(1,List(Item(2,Item 2)))
I'd also recommend using format instead of read and write if you are doing symmetrical serialization / deserialization.

How to convert a json to an entity in Play Framework REST API

I am just starting up with scala and play framework and stuck on this seemingly simple problem.
I have a single REST end point to which I will post some Json Object. That Json Object needs to be converted to an entity.
The entities are declared as case classes, and the received JSON can correspond to any one of them.
My problem is that I am unable to convert the JSON to the corresponding entity type, because (as per the tutorial) I need to write an implicit Reads for the validation, with each of the fields defined.
For example
implicit val emailReads: Reads[Email] = (
(JsPath \ "from").read[String] and
(JsPath \ "subject").read[String]
)(Email.apply _)
works fine for a sample case class email. But when I have case classes like this :
abstract class Event
case class OneEventType(eventType: String) extends Event
case class TwoEventType(eventType: String, attribute: SomeType) extends Event
And the controller method works on Event basis :
def events = Action(BodyParsers.parse.json) { request =>
val eventReceived = request.body.validate[Event]
//do something
Ok(Json.obj("status" ->"OK"))
}
How would I validate the event and construct the correct Event object, given that in the Reads I need to specify each of the fields?
This should work:
implicit val st: Reads[Event] = new Reads[Event] {
  def reads(json: JsValue): JsResult[Event] = json match {
    // An "attribute" field being present means a TwoEventType...
    case o: JsObject if o.value.contains("attribute") =>
      for {
        t <- (o \ "type").validate[String]
        a <- (o \ "attribute").validate[String]
      } yield TwoEventType(t, a)
    // ...otherwise a bare "type" field means a OneEventType.
    case o: JsObject if o.value.contains("type") =>
      (o \ "type").validate[String].map(OneEventType(_))
    case other => JsError(other.toString)
  }
}
I guess you can, by adding custom processing to the reading. Suppose that attribute is of type Int:
abstract class Event
case class OneEventType(eventType: String) extends Event
case class TwoEventType(eventType: String, attribute: Int) extends Event
implicit val eventReader: Reads[Event] = (
  (JsPath \ "type").read[String] and
  (JsPath \ "attribute").readNullable[Int]
)((eventType, attributeOp) => {
  if (attributeOp.isEmpty) OneEventType(eventType)
  else TwoEventType(eventType, attributeOp.get)
})
(can't test now, so I am not sure if it works out of the box).
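The branching this Reads performs can be factored out and checked on its own; a sketch in plain Scala (with the field renamed to eventType, since `type` is a reserved word, and attribute assumed to be Int as above):

```scala
sealed trait Event
case class OneEventType(eventType: String) extends Event
case class TwoEventType(eventType: String, attribute: Int) extends Event

// The function handed to the Reads combinator: a present attribute
// selects TwoEventType, an absent one falls back to OneEventType.
def toEvent(eventType: String, attribute: Option[Int]): Event =
  attribute.fold(OneEventType(eventType): Event)(a => TwoEventType(eventType, a))
```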

Json Serialization for Trait with Multiple Case Classes (Sum Types) in Scala's Play

I often have to serialize/deserialize sum types (like Either[S,T]), and I haven't yet found a general or elegant way to do it. Here's an example type (essentially equivalent to Either)
sealed trait OutcomeType
case class NumericOutcome(units: String) extends OutcomeType
case class QualitativeOutcome(outcomes: List[String]) extends OutcomeType
Here's my best effort at a companion object that implements serialization. It works, but it's very tiresome to write these sorts of things over and over for every sum type. Are there any suggestions for making it nicer and/or more general?
import play.api.libs.json._
import play.api.libs.functional.syntax._
object OutcomeType {
val fmtNumeric = Json.format[NumericOutcome]
val fmtQualitative = Json.format[QualitativeOutcome]
implicit object FormatOutcomeType extends Format[OutcomeType] {
def writes(o: OutcomeType) = o match {
case n @ NumericOutcome(_) => Json.obj("NumericOutcome" -> Json.toJson(n)(fmtNumeric))
case q @ QualitativeOutcome(_) => Json.obj("QualitativeOutcome" -> Json.toJson(q)(fmtQualitative))
}
def reads(json: JsValue) = (
Json.fromJson(json \ "NumericOutcome")(fmtNumeric) orElse
Json.fromJson(json \ "QualitativeOutcome")(fmtQualitative)
)
}
}
I think that is about as simple as you can make it. If you want to avoid writing the code for each explicit subtype, maybe you could do it with reflection, use Jackson directly or some other JSON library with reflection support, or write your own macro to generate the Format from a list of subtypes.
I have a systematic solution for the problem of serializing sum types in my JSON pickling library Prickle. Similar ideas could be employed with Play. There is still some config code required, but it's high signal/noise, e.g. final code like:
implicit val fruitPickler = CompositePickler[Fruit].concreteType[Apple].concreteType[Lemon]
CompositePicklers associated with a supertype are configured with one PicklerPair for each known subtype (i.e. sum-type alternative). The associations are set up at configuration time.
During pickling a descriptor is emitted into the json stream describing which subtype the record is.
During unpickling, the descriptor is read out of the json and then used to locate the appropriate Unpickler for the subtype
An example updated for play 2.5:
object TestContact extends App {
sealed trait Shape
object Shape {
val rectFormat = Json.format[Rect]
val circleFormat = Json.format[Circle]
implicit object ShapeFormat extends Format[Shape] {
override def writes(shape: Shape): JsValue = shape match {
case rect: Rect =>
Json.obj("Shape" ->
Json.obj("Rect" ->
Json.toJson(rect)(rectFormat)))
case circle: Circle =>
Json.obj("Shape" ->
Json.obj("Circle" ->
Json.toJson(circle)(circleFormat)))
}
override def reads(json: JsValue): JsResult[Shape] = {
json \ "Shape" \ "Rect" match {
case JsDefined(rectJson) => rectJson.validate[Rect](rectFormat)
case _ => json \ "Shape" \ "Circle" match {
case JsDefined(circleJson) => circleJson.validate[Circle](circleFormat)
case _ => JsError("Not a valid Shape object.")
}
}
}
}
}
case class Rect(width: Double, height: Double) extends Shape
case class Circle(radius: Double) extends Shape
val circle = Circle(2.1)
println(Json.toJson(circle))
val rect = Rect(1.3, 8.9)
println(Json.toJson(rect))
var json = Json.obj("Shape" -> Json.obj("Circle" -> Json.obj("radius" -> 4.13)))
println(json.validate[Shape])
json =
Json.obj("Shape" ->
Json.obj("Rect" ->
Json.obj("width" -> 23.1, "height" -> 34.7)))
println(json.validate[Shape])
}

Custom JodaTime serializer using Play Framework's JSON library?

How do I implement a custom JodaTime's DateTime serializer/deserializer for JSON? I'm inclined to use the Play Framework's JSON library (2.1.1). There is a default DateTime serializer, but it uses dt.getMillis instead of .toString which would return an ISO compliant String.
Writing Reads[T] and Writes[T] for case classes seems fairly straightforward, but I can't figure out how to do the same for DateTime.
There is a default DateTime serializer, but it uses dt.getMillis instead of .toString which would return an ISO compliant String.
If you look at the source, Reads.jodaDateReads already handles both numbers and strings using DateTimeFormat.forPattern. If you want to handle ISO 8601 strings, just replace it with ISODateTimeFormat:
implicit val jodaISODateReads: Reads[org.joda.time.DateTime] = new Reads[org.joda.time.DateTime] {
import org.joda.time.DateTime
val df = org.joda.time.format.ISODateTimeFormat.dateTime()
def reads(json: JsValue): JsResult[DateTime] = json match {
case JsNumber(d) => JsSuccess(new DateTime(d.toLong))
case JsString(s) => parseDate(s) match {
case Some(d) => JsSuccess(d)
case None => JsError(Seq(JsPath() -> Seq(ValidationError("validate.error.expected.date.isoformat", "ISO8601"))))
}
case _ => JsError(Seq(JsPath() -> Seq(ValidationError("validate.error.expected.date"))))
}
private def parseDate(input: String): Option[DateTime] =
scala.util.control.Exception.allCatch[DateTime] opt (DateTime.parse(input, df))
}
(simplify as desired, e.g. remove number handling)
implicit val jodaDateWrites: Writes[org.joda.time.DateTime] = new Writes[org.joda.time.DateTime] {
def writes(d: org.joda.time.DateTime): JsValue = JsString(d.toString())
}
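The parseDate helper leans on scala.util.control.Exception.allCatch to turn a throwing parse into an Option; the same pattern in isolation:

```scala
import scala.util.control.Exception.allCatch

// allCatch.opt evaluates the expression and yields Some(result),
// or None if any exception is thrown during evaluation.
def parseIntOpt(s: String): Option[Int] =
  allCatch.opt(s.toInt)
```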
I use Play 2.3.7 and define implicit reads/writes with a string pattern in the companion object:
case class User(username:String, birthday:org.joda.time.DateTime)
object User {
implicit val yourJodaDateReads = Reads.jodaDateReads("yyyy-MM-dd'T'HH:mm:ss'Z'")
implicit val yourJodaDateWrites = Writes.jodaDateWrites("yyyy-MM-dd'T'HH:mm:ss'Z'")
implicit val userFormat = Json.format[User]
}
Another, perhaps simpler, solution would be to do a map, for example:
case class GoogleDoc(id: String, etag: String, created: LocalDateTime)
object GoogleDoc {
import org.joda.time.LocalDateTime
import org.joda.time.format.ISODateTimeFormat
import play.api.libs.json._
import play.api.libs.functional.syntax._

implicit val googleDocReads: Reads[GoogleDoc] = (
  (__ \ "id").read[String] ~
  (__ \ "etag").read[String] ~
  (__ \ "createdDate").read[String].map[LocalDateTime](x => LocalDateTime.parse(x, ISODateTimeFormat.basicDateTime()))
)(GoogleDoc.apply _)
}
UPDATE
If you had a recurring need for this conversion, then you could create your own implicit conversion, it is only a couple of lines of code:
import org.joda.time.LocalDateTime
import org.joda.time.format.ISODateTimeFormat
implicit val readsJodaLocalDateTime = Reads[LocalDateTime](js =>
js.validate[String].map[LocalDateTime](dtString =>
LocalDateTime.parse(dtString, ISODateTimeFormat.basicDateTime())
)
)