Scala Play 2.6: Parse json with constraint between fields

I'd like to validate a field depending on the value of another field when parsing a json.
For example when I'm reading a range, I'd like to verify min < max:
import org.scalatest.{FlatSpec, Matchers}
import play.api.libs.json.{JsError, Json, Reads}

class JsonReadsTest extends FlatSpec with Matchers {
  "Json" should "be reads" in {
    val reads: Reads[Range] = ???
    val json = Json.obj("min" -> 3, "max" -> 2)
    reads.reads(json) shouldBe JsError("max should be superior to min")
  }
}

case class Range(min: Int, max: Int)

Straightforward OOP solution
You can create an object that extends Reads[T] and implement the reads method directly, without the functional builders (this approach is not fully covered in the docs, but you can find many examples in the source code):
import play.api.libs.json._

val reads1 = new Reads[Range] {
  def reads(json: JsValue): JsResult[Range] = {
    (for {
      min <- (json \ "min").validate[Int]
      max <- (json \ "max").validate[Int]
    } yield (min, max)).flatMap {
      case (min, max) if max > min =>
        JsSuccess(Range(min, max))
      case _ =>
        JsError(Seq(JsPath ->
          Seq(JsonValidationError("error.expected.range"))))
    }
  }
}
Functional-builder solution
import play.api.libs.json._
import play.api.libs.functional.syntax._

val reads2 = (
  (__ \ "min").read[Int] and
  (__ \ "max").read[Int]
).tupled
  .filter(JsonValidationError("error.expected.range")) { case (min, max) => max > min }
  .map { case (min, max) => Range(min, max) }
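For a quick check against the test JSON from the question (a rough sketch; the exact JsError contents may differ slightly from what the comments show):

  val json = Json.obj("min" -> 3, "max" -> 2)

  reads1.reads(json)  // => JsError carrying "error.expected.range" at the root path
  reads2.reads(json)  // => JsError carrying "error.expected.range" as well
  Json.obj("min" -> 1, "max" -> 5).validate[Range](reads2)  // => JsSuccess(Range(1, 5))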
Best practice
You are trying to mix reading and validation, which can be a problem when you have different validation rules in different modules. A common practice is to parse the json object into a presentational model class like
case class RangeInput(min: Int, max: Int)
and then convert it to a business model class, performing the validation there:
def validate(input: RangeInput): Option[Range] =
  Some(input).filter(i => i.max > i.min).map(i => Range(i.min, i.max))
If you need to aggregate validation errors, something like Cats' Validated can help you with that.
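For example, a minimal sketch with Cats (assuming cats-core is on the classpath; the rule names below are illustrative, not from the original answer):

  import cats.data.ValidatedNel
  import cats.implicits._

  // Reusing RangeInput and Range from above.
  def checkOrder(i: RangeInput): ValidatedNel[String, RangeInput] =
    if (i.max > i.min) i.validNel else "max should be superior to min".invalidNel

  def checkNonNegative(i: RangeInput): ValidatedNel[String, RangeInput] =
    if (i.min >= 0) i.validNel else "min should not be negative".invalidNel

  // Both rules run; all failures are accumulated in a NonEmptyList of messages.
  def validateAll(input: RangeInput): ValidatedNel[String, Range] =
    (checkOrder(input), checkNonNegative(input)).mapN((_, _) => Range(input.min, input.max))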

Related

Deserialize JSON distinguishing missing and null values

I have a requirement to parse a JSON object, using play-json and distinguish between a missing value, a string value and a null value.
So for example I might want to deserialize into the following case class:
case class MyCaseClass(
a: Option[Option[String]]
)
Where the values of 'a' mean:
None - "a" was missing - normal play-json behavipr
Some(Some(String)) - "a" had a string value
Some(None) - "a" had a null value
So examples of the expected behavior are:
{}
should deserialize to MyCaseClass(None)
{
"a": null
}
should deserialize as MyCaseClass(Some(None))
{
"a": "a"
}
should deserialize as MyCaseClass(Some(Some("a")))
I've tried writing custom formatters, but the formatNullable and formatNullableWithDefault methods don't distinguish between a missing and a null value, so the code I've written below cannot generate the Some(None) result:
object MyCaseClass {
  implicit val aFormat: Format[Option[String]] = new Format[Option[String]] {
    override def reads(json: JsValue): JsResult[Option[String]] = {
      json match {
        case JsNull => JsSuccess(None) // this is never reached
        case JsString(value) => JsSuccess(Some(value))
        case _ => throw new RuntimeException("unexpected type")
      }
    }
    override def writes(codename: Option[String]): JsValue = {
      codename match {
        case None => JsNull
        case Some(value) => JsString(value)
      }
    }
  }
  implicit val format = (
    (__ \ "a").formatNullableWithDefault[Option[String]](None)
  )(MyCaseClass.apply, unlift(MyCaseClass.unapply))
}
Am I missing a trick here? How should I go about this? I am very much willing to encode the final value in some other way than an Option[Option[String]], for example some sort of case class that encapsulates this:
case class MyContainer(newValue: Option[String], wasProvided: Boolean)
I recently found a reasonable way to do this. I'm using Play 2.6.11 but I'm guessing the approach will transfer to other recent versions.
The following snippet adds three extension methods to JsPath, to read/write/format fields of type Option[Option[A]]. In each case a missing field maps to a None, a null to a Some(None), and a non-null value to a Some(Some(a)) as the original poster requested:
import play.api.libs.json._

object tristate {
  implicit class TriStateNullableJsPathOps(path: JsPath) {

    def readTriStateNullable[A: Reads]: Reads[Option[Option[A]]] =
      Reads[Option[Option[A]]] { value =>
        value.validate[JsObject].flatMap { obj =>
          path.asSingleJsResult(obj) match {
            case JsError(_)           => JsSuccess(Option.empty[Option[A]])
            case JsSuccess(JsNull, _) => JsSuccess(Option(Option.empty[A]))
            case JsSuccess(json, _)   => json.validate[A]
                                             .repath(path)
                                             .map(a => Option(Option(a)))
          }
        }
      }

    def writeTriStateNullable[A: Writes]: OWrites[Option[Option[A]]] =
      path.writeNullable(Writes.optionWithNull[A])

    def formatTriStateNullable[A: Format]: OFormat[Option[Option[A]]] =
      OFormat(readTriStateNullable[A], writeTriStateNullable[A])
  }
}
Like previous suggestions in this thread, this method requires you to write out a JSON format in full using the applicative DSL. It's unfortunately incompatible with the Json.format macro, but it gets you close to what you want. Here's a use case:
import play.api.libs.json._
import play.api.libs.functional.syntax._
import tristate._

case class Coord(col: Option[Option[String]], row: Option[Option[Int]])

implicit val format: OFormat[Coord] = (
  (__ \ "col").formatTriStateNullable[String] ~
  (__ \ "row").formatTriStateNullable[Int]
)(Coord.apply, unlift(Coord.unapply))
Some examples of writing:
format.writes(Coord(None, None))
// => {}
format.writes(Coord(Some(None), Some(None)))
// => { "col": null, "row": null }
format.writes(Coord(Some(Some("A")), Some(Some(1))))
// => { "col": "A", "row": 1 }
And some examples of reading:
Json.obj().as[Coord]
// => Coord(None, None)
Json.obj(
"col" -> JsNull,
"row" -> JsNull
).as[Coord]
// => Coord(Some(None), Some(None))
Json.obj(
"col" -> "A",
"row" -> 1
).as[Coord]
// => Coord(Some(Some("A")), Some(Some(1)))
As a bonus exercise for the reader, you could probably combine this with a little shapeless to automatically derive codecs and replace the Json.format macro with a different one-liner (albeit one that takes longer to compile).
Following @kflorence's suggestion about OptionHandlers, I was able to get the desired behavior.
implicit def optionFormat[T](implicit tf: Format[T]): Format[Option[T]] = Format(
  tf.reads(_).map(r => Some(r)),
  Writes(v => v.map(tf.writes).getOrElse(JsNull))
)

object InvertedDefaultHandler extends OptionHandlers {

  def readHandler[T](jsPath: JsPath)(implicit r: Reads[T]): Reads[Option[T]] = jsPath.readNullable

  override def readHandlerWithDefault[T](jsPath: JsPath, defaultValue: => Option[T])(implicit r: Reads[T]): Reads[Option[T]] = Reads[Option[T]] { json =>
    jsPath.asSingleJson(json) match {
      case JsDefined(JsNull) => JsSuccess(defaultValue)
      case JsDefined(value)  => r.reads(value).repath(jsPath).map(Some(_))
      case JsUndefined()     => JsSuccess(None)
    }
  }

  def writeHandler[T](jsPath: JsPath)(implicit writes: Writes[T]): OWrites[Option[T]] = jsPath.writeNullable
}

val configuration = JsonConfiguration[Json.WithDefaultValues](optionHandlers = InvertedDefaultHandler)

case class RequestObject(payload: Option[Option[String]] = Some(None))

implicit val requestObjectFormat: OFormat[RequestObject] = Json.configured(configuration).format[RequestObject]

Json.parse(""" {} """).as[RequestObject]                    // RequestObject(None)
Json.parse(""" {"payload": null } """).as[RequestObject]    // RequestObject(Some(None))
Json.parse(""" {"payload": "hello" } """).as[RequestObject] // RequestObject(Some(Some(hello)))
So the important parts are:
- The readHandlerWithDefault basically flips how JsDefined(JsNull) and JsUndefined() handle explicit nulls and absent fields, compared to the original implementation in OptionHandlers.Default.
- The JsonConfiguration takes both Json.WithDefaultValues and the custom optionHandlers.
- The way the default value is set: note RequestObject.payload's default value of Some(None).
Unfortunately I don't know how to achieve what you want automatically. For now it seems to me that you can't do that with the standard macro. However, surprisingly, you might achieve a similar result if you are OK with swapping the null and "absent" cases (which I agree is a bit confusing).
Assume class Xxx is defined as (default value is important - this will be the result for the null case)
case class Xxx(a: Option[Option[String]] = Some(None))
and you provide following implicit Reads:
implicit val optionStringReads:Reads[Option[String]] = new Reads[Option[String]] {
override def reads(json: JsValue) = json match {
case JsNull => JsSuccess(None) // this is never reached
case JsString(value) => JsSuccess(Some(value))
case _ => throw new RuntimeException("unexpected type")
}
}
implicit val xxxReads = Json.using[Json.WithDefaultValues].reads[Xxx]
Then for a test data:
val jsonNone = "{}"
val jsonNull = """{"a":null}"""
val jsonVal = """{"a":"abc"}"""
val jsonValues = List(jsonNone, jsonNull, jsonVal)
jsonValues.foreach(jsonString => {
val jsonAst = Json.parse(jsonString)
val obj = Json.fromJson[Xxx](jsonAst)
println(s"'$jsonString' => $obj")
})
the output is
'{}' => JsSuccess(Xxx(Some(None)),)
'{"a":null}' => JsSuccess(Xxx(None),)
'{"a":"abc"}' => JsSuccess(Xxx(Some(Some(abc))),)
So:
- an absent attribute is mapped onto Some(None)
- null is mapped onto None
- a value is mapped onto Some(Some(value))
This is clumsy and a bit unexpected for a developer, but at least it distinguishes all 3 cases. The reason why the null and "absent" cases are swapped is that the only way I found to distinguish them is to declare the value in the target class as an Option with a default value at the same time; in that case the default value is what the "absent" case is mapped to, and unfortunately you can't control the value that null is mapped onto - it is always None.

Play Json API: Convert a JsArray to a JsResult[Seq[Element]]

I have a JsArray which contains JsValue objects representing two different types of entities - some of them represent nodes, the other part represents edges.
On the Scala side, there are already case classes named Node and Edge whose supertype is Element. The goal is to transform the JsArray (or Seq[JsValue]) to a collection that contains the Scala types, e.g. Seq[Element] (=> contains objects of type Node and Edge).
I have defined Reads for the case classes:
implicit val nodeReads: Reads[Node] = // ...
implicit val edgeReads: Reads[Edge] = // ...
Apart from that, there is the first step of a Reads for the JsArray itself:
implicit val elementSeqReads = Reads[Seq[Element]](json => json match {
case JsArray(elements) => ???
case _ => JsError("Invalid JSON data (not a json array)")
})
The part with the question marks is responsible for creating a JsSuccess(Seq(node1, edge1, ...)) if all elements of the JsArray are valid nodes and edges, or a JsError if this is not the case.
However, I'm not sure how to do this in an elegant way.
The logic to distinguish between nodes and edges could look like this:
def hasType(item: JsValue, elemType: String) =
(item \ "elemType").asOpt[String] == Some(elemType)
val result = elements.map {
case n if hasType(n, "node") => // use nodeReads
case e if hasType(e, "edge") => // use edgeReads
case _ => JsError("Invalid element type")
}
The thing is that I don't know how to deal with nodeReads / edgeReads at this point. Of course I could call their validate method directly, but then result would have the type Seq[JsResult[Element]]. So eventually I would have to check if there are any JsError objects and delegate them somehow to the top (remember: one invalid array element should lead to a JsError overall). If there are no errors, I still have to produce a JsSuccess[Seq[Element]] based on result.
Maybe it would be a better idea to avoid the calls to validate and work temporarily with Reads instances instead. But I'm not sure how to "merge" all of the Reads instances at the end (e.g. in simple case class mappings, you have a bunch of calls to JsPath.read (which returns a Reads) and in the end, validate produces one single result based on all those Reads instances that were combined using the and combinator).
edit: A little bit more information.
First of all, I should have mentioned that the case classes Node and Edge basically have the same structure, at least for now. At the moment, the only reason for separate classes is to gain more type safety.
A JsValue of an element has the following JSON-representation:
{
"id" : "aet864t884srtv87ae",
"type" : "node", // <-- type can be 'node' or 'edge'
"name" : "rectangle",
"attributes": [],
...
}
The corresponding case class looks like this (note that the type attribute we've seen above is not an attribute of the class - instead it's represented by the type of the class -> Node).
case class Node(
id: String,
name: String,
attributes: Seq[Attribute],
...) extends Element
The Reads is as follows:
implicit val nodeReads: Reads[Node] = (
(__ \ "id").read[String] and
(__ \ "name").read[String] and
(__ \ "attributes").read[Seq[Attribute]] and
....
) (Node.apply _)
everything looks the same for Edge, at least for now.
Try defining elementReads as
implicit val elementReads = new Reads[Element]{
override def reads(json: JsValue): JsResult[Element] =
json.validate(
Node.nodeReads.map(_.asInstanceOf[Element]) orElse
Edge.edgeReads.map(_.asInstanceOf[Element])
)
}
and import that into scope. Then you should be able to write
json.validate[Seq[Element]]
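If you prefer to keep the manual elementSeqReads from the question, you can also fold the per-element results yourself. A minimal sketch (not part of the original answers) that collapses a Seq[JsResult[Element]] into a JsResult[Seq[Element]], accumulating errors:

  import play.api.libs.json._

  // Succeed only if every element succeeded, otherwise collect all validation errors.
  def sequence[A](results: Seq[JsResult[A]]): JsResult[Seq[A]] =
    results.foldLeft[JsResult[Seq[A]]](JsSuccess(Seq.empty[A])) {
      case (JsSuccess(acc, _), JsSuccess(a, _)) => JsSuccess(acc :+ a)
      case (JsError(e1), JsError(e2))           => JsError(e1 ++ e2)
      case (err: JsError, _)                    => err
      case (_, err: JsError)                    => err
    }

  // In elementSeqReads:
  // case JsArray(elements) => sequence(elements.map(_.validate[Element]))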
If the structure of your json is not enough to differentiate between Node and Edge, you could enforce it in the reads for each type.
Based on a simplified Node and Edge case class (only to avoid any unrelated code confusing the answer)
case class Edge(name: String) extends Element
case class Node(name: String) extends Element
The default reads for these case classes would be derived by
Json.reads[Edge]
Json.reads[Node]
respectively. Unfortunately since both case classes have the same structure these reads would ignore the type attribute in the json and happily translate a node json into an Edge instance or the opposite.
Let's have a look at how we could express the constraint on type all by itself:
def typeRead(`type`: String): Reads[String] = {
val isNotOfType = ValidationError(s"is not of expected type ${`type`}")
(__ \ "type").read[String].filter(isNotOfType)(_ == `type`)
}
This method builds a Reads[String] instance which will attempt to find a type string attribute in the provided json. It will then filter the JsResult using the custom validation error isNotOfType if the string parsed out of the json doesn't match the expected type passed as an argument to the method. Of course, if the type attribute is not a string in the json, the Reads[String] will return an error saying that it expected a String.
Now that we have a reads which can enforce the value of the type attribute in the json, all we have to do is build a reads for each value of type we expect and compose it with the associated case class reads. We can use Reads#flatMap for that, ignoring the input, since the parsed string is not useful for our case classes.
object Edge {
val edgeReads: Reads[Edge] =
Element.typeRead("edge").flatMap(_ => Json.reads[Edge])
}
object Node {
val nodeReads: Reads[Node] =
Element.typeRead("node").flatMap(_ => Json.reads[Node])
}
Note that if the constraint on type fails the flatMap call will be bypassed.
The question remains of where to put the typeRead method. In this answer I initially put it in the Element companion object, along with the elementReads instance, as in the code below.
import play.api.libs.json._
import play.api.data.validation.ValidationError

trait Element

object Element {

  implicit val elementReads = new Reads[Element] {
    override def reads(json: JsValue): JsResult[Element] =
      json.validate(
        Node.nodeReads.map(_.asInstanceOf[Element]) orElse
        Edge.edgeReads.map(_.asInstanceOf[Element])
      )
  }

  def typeRead(`type`: String): Reads[String] = {
    val isNotOfType = ValidationError(s"is not of expected type ${`type`}")
    (__ \ "type").read[String].filter(isNotOfType)(_ == `type`)
  }
}
This is actually a pretty bad place to define typeRead:
- it has nothing specific to Element
- it introduces a circular dependency between the Element companion object and both the Node and Edge companion objects
I'll let you think up the correct location though :)
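For instance, one possible relocation (a sketch; the JsonHelpers name is just an illustration) is a small helper object that knows nothing about Element, which removes the circular dependency:

  import play.api.libs.json._
  import play.api.data.validation.ValidationError

  object JsonHelpers {
    def typeRead(`type`: String): Reads[String] = {
      val isNotOfType = ValidationError(s"is not of expected type ${`type`}")
      (__ \ "type").read[String].filter(isNotOfType)(_ == `type`)
    }
  }

  // Node and Edge companions then depend only on JsonHelpers:
  // val nodeReads: Reads[Node] = JsonHelpers.typeRead("node").flatMap(_ => Json.reads[Node])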
The specification proving it all works together:
import org.specs2.mutable.Specification
import play.api.libs.json._
import play.api.data.validation.ValidationError
class ElementSpec extends Specification {

  "Element reads" should {
    "read an edge json as an edge" in {
      val result: JsResult[Element] = edgeJson.validate[Element]
      result.isSuccess should beTrue
      result.get should beEqualTo(Edge("myEdge"))
    }
    "read a node json as an node" in {
      val result: JsResult[Element] = nodeJson.validate[Element]
      result.isSuccess should beTrue
      result.get should beEqualTo(Node("myNode"))
    }
  }

  "Node reads" should {
    "read a node json as an node" in {
      val result: JsResult[Node] = nodeJson.validate[Node](Node.nodeReads)
      result.isSuccess should beTrue
      result.get should beEqualTo(Node("myNode"))
    }
    "fail to read an edge json as a node" in {
      val result: JsResult[Node] = edgeJson.validate[Node](Node.nodeReads)
      result.isError should beTrue
      val JsError(errors) = result
      val invalidNode = JsError.toJson(Seq(
        (__ \ "type") -> Seq(ValidationError("is not of expected type node"))
      ))
      JsError.toJson(errors) should beEqualTo(invalidNode)
    }
  }

  "Edge reads" should {
    "read a edge json as an edge" in {
      val result: JsResult[Edge] = edgeJson.validate[Edge](Edge.edgeReads)
      result.isSuccess should beTrue
      result.get should beEqualTo(Edge("myEdge"))
    }
    "fail to read a node json as an edge" in {
      val result: JsResult[Edge] = nodeJson.validate[Edge](Edge.edgeReads)
      result.isError should beTrue
      val JsError(errors) = result
      val invalidEdge = JsError.toJson(Seq(
        (__ \ "type") -> Seq(ValidationError("is not of expected type edge"))
      ))
      JsError.toJson(errors) should beEqualTo(invalidEdge)
    }
  }

  val edgeJson = Json.parse(
    """
      |{
      | "type":"edge",
      | "name":"myEdge"
      |}
    """.stripMargin)

  val nodeJson = Json.parse(
    """
      |{
      | "type":"node",
      | "name":"myNode"
      |}
    """.stripMargin)
}
If you don't want to use an asInstanceOf cast, you can write the elementReads instance like so:
implicit val elementReads = new Reads[Element] {
override def reads(json: JsValue): JsResult[Element] =
json.validate(
Node.nodeReads.map(e => e: Element) orElse
Edge.edgeReads.map(e => e: Element)
)
}
unfortunately, you can't use _ in this case.

Play 2.x Json transform json keys to camelCase from underscore case

I want to transform a json with underscore case keys to camel case keys.
"{\"first_key\": \"first_value\", \"second_key\": {\"second_first_key\":\"second_first_value\"}}"
to
"{\"firstKey\": \"first_value\", \"secondKey\": {\"secondFirstKey\":\"second_first_value\"}}"
This is partial code:
val CamelCaseRegex = new Regex("(_.)")
val jsonTransformer = (__).json.update(
  // converts json underscore_case field names to camelCase field names
)
val jsonRet = Json.parse(jsonStr).transform(jsonTransformer)
I have tried several ways in the update method without success.
While it would be nice to do this with just the native Play library, it's a good use-case for Mandubian's Play Json Zipper extension libraries.
Here's a quick go at this (not exhaustively tested). First you need to add the resolver and library to your build:
resolvers += "mandubian maven bintray" at "http://dl.bintray.com/mandubian/maven"
libraryDependencies ++= Seq(
"com.mandubian" %% "play-json-zipper" % "1.2"
)
Then you could try something like this:
import play.api.libs.json._
import play.api.libs.json.extensions._

// conversion function borrowed from here:
// https://gist.github.com/sidharthkuruvila/3154845
def underscoreToCamel(name: String) = "_([a-z\\d])".r.replaceAllIn(name, { m =>
  m.group(1).toUpperCase
})

// Update the key, otherwise ignore them...
// FIXME: The None case shouldn't happen here so maybe we
// don't need it...
def underscoreToCamelCaseJs(json: JsValue) = json.updateAllKeyNodes {
  case (path, js) => JsPathExtension.hasKey(path) match {
    case Some(key) => underscoreToCamel(key) -> js
    case None => path.toJsonString -> js
  }
}
Which on this input:
val testJson = Json.obj(
"some_str" -> JsString("foo_bar"),
"some_obj" -> Json.obj(
"some_field" -> Json.arr("foo", "bar")
),
"an_int" -> JsNumber(1)
)
...produces:
{
"someStr" : "foo_bar",
"someObj" : {
"someField" : [ "foo", "bar" ]
},
"anInt" : 1
}
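If you also need the reverse direction (camelCase back to underscores), the same play-json-zipper API should work with an inverted conversion function (a quick sketch, not exhaustively tested either):

  def camelToUnderscore(name: String) =
    "[A-Z\\d]".r.replaceAllIn(name, m => "_" + m.group(0).toLowerCase)

  def camelToUnderscoreJs(json: JsValue) = json.updateAllKeyNodes {
    case (path, js) => JsPathExtension.hasKey(path) match {
      case Some(key) => camelToUnderscore(key) -> js
      case None => path.toJsonString -> js
    }
  }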
For your requirement, you can use the library play-json-naming. It easily converts snake_case (underscore case) json to camelCase and vice versa.
https://github.com/tototoshi/play-json-naming
If you're using Play >= 2.6:
final case class Blah(foo: String)

object Blah {
  import play.api.libs.json._
  import play.api.libs.json.JsonNaming.SnakeCase

  implicit val config: JsonConfiguration = JsonConfiguration(SnakeCase)
  implicit val blahFormat: OFormat[Blah] = Json.format[Blah]
}
https://www.playframework.com/documentation/2.6.x/ScalaJsonAutomated#Custom-Naming-Strategies
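As a rough sketch of how that maps onto the JSON from the question (the Payload and SecondKey case class names are made up for illustration):

  import play.api.libs.json._
  import play.api.libs.json.JsonNaming.SnakeCase

  case class SecondKey(secondFirstKey: String)
  case class Payload(firstKey: String, secondKey: SecondKey)

  object Payload {
    implicit val config: JsonConfiguration = JsonConfiguration(SnakeCase)
    implicit val secondKeyFormat: OFormat[SecondKey] = Json.format[SecondKey]
    implicit val payloadFormat: OFormat[Payload] = Json.format[Payload]
  }

  // Json.parse("""{"first_key": "first_value", "second_key": {"second_first_key": "second_first_value"}}""")
  //   .as[Payload] // => Payload("first_value", SecondKey("second_first_value"))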

PlayFramework obj to json

From: https://www.playframework.com/documentation/2.3.x/ScalaJsonHttp
Playframework Scala way:
case class Location(lat: Double, long: Double)
case class Place(name: String, location: Location)

object Place {
  var list: List[Place] = {
    List(
      Place(
        "Sandleford",
        Location(51.377797, -1.318965)
      ),
      Place(
        "Watership Down",
        Location(51.235685, -1.309197)
      )
    )
  }
}

implicit val locationWrites: Writes[Location] = (
  (JsPath \ "lat").write[Double] and
  (JsPath \ "long").write[Double]
)(unlift(Location.unapply))

implicit val placeWrites: Writes[Place] = (
  (JsPath \ "name").write[String] and
  (JsPath \ "location").write[Location]
)(unlift(Place.unapply))
Next we write our Action:
def listPlaces = Action {
val json = Json.toJson(Place.list)
Ok(json)
}
Is there a simpler way to do it? Without all these obvious implicits? I just want it to convert the whole object structure as it is, like in JS when working with JSON.
Assuming I'm OK with default field names and default object structure, can I make use of that "default advantage-sugar"?
UPDATE:
As simple as in NodeJS for example:
var list = {"places":[
{"name":"Sandleford", "location":{"lat": "1", "long": "222"}},
{"name":"Watership Down", "location":{"lat": "2", "long": "333"}},
]}
router.get('/', function(req, res) {
res.json(list);
});
There is no question of what should be serialized and how.
Or like in java (pure jackson):
List list = Arrays.asList(place1, place2);
ObjectWriter ow = new ObjectMapper().writer().withDefaultPrettyPrinter();
String json = ow.writeValueAsString(list);
Where class Place is defined with no annotations (but with getters and setters, which I don't like though).
I could use Jackson in Scala, but I wonder about the 'official defaults' in Play Framework for Scala, to go with less code (like it is supposed to be).
Json.writes[T]
If all you want is literal serialization to Json, and you don't want to write the Writes instances yourself, you can use the Json.writes[T] macro to generate the Writes[T] for you.
import play.api.libs.json._

case class Location(lat: Double, long: Double)
case class Place(name: String, location: Location)

object Place {
  val list: List[Place] = {
    List(
      Place(
        "Sandleford",
        Location(51.377797, -1.318965)
      ),
      Place(
        "Watership Down",
        Location(51.235685, -1.309197)
      )
    )
  }
}

object JsonStuff {
  implicit val locationWrites: Writes[Location] = Json.writes[Location]
  implicit val placeWrites: Writes[Place] = Json.writes[Place]
}
Then on the repl,
scala> import JsonStuff._
import JsonStuff._
scala> import play.api.libs.json._
import play.api.libs.json._
scala> Json.toJson(Place.list)
res0: play.api.libs.json.JsValue = [{"name":"Sandleford","location":{"lat":51.377797,"long":-1.318965}},{"name":"Watership Down","location":{"lat":51.235685,"long":-1.309197}}]
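As an aside (not part of the original answer), if you also need to read the same JSON back into the case classes, the Json.format[T] macro generates a Reads and a Writes in one go (the JsonFormats name is just for illustration):

  object JsonFormats {
    implicit val locationFormat: OFormat[Location] = Json.format[Location]
    implicit val placeFormat: OFormat[Place] = Json.format[Place]
  }

  // import JsonFormats._
  // Json.parse("""{"name":"Sandleford","location":{"lat":51.377797,"long":-1.318965}}""").as[Place]
  // => Place("Sandleford", Location(51.377797, -1.318965))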
Json Literals
If you really want to you can of course just create and manipulate the JsValue directly. For instance
object Place {
  val JsList =
    JsArray(
      Seq(
        Json.obj(
          "name" -> "Sandleford",
          "location" -> Json.obj(
            "lat" -> 51.377797,
            "long" -> -1.318965
          )
        ),
        Json.obj(
          "name" -> "Watership Down",
          "location" -> Json.obj(
            "lat" -> 51.235685,
            "long" -> -1.309197
          )
        )
      )
    )
}
On the repl,
scala> Place.JsList
res0: play.api.libs.json.JsArray = [{"name":"Sandleford","location":{"lat":51.377797,"long":-1.318965}},{"name":"Watership Down","location":{"lat":51.235685,"long":-1.309197}}]
Though this approach can be quite difficult to work with for all but the most trivial applications and I would highly recommend against it.
As isomarcte already answered, I had to do this (and it almost satisfied me as a noise-free solution):
import play.api.libs.json._
import play.api.mvc._

case class Location(lat: Double, long: Double)
case class Place(name: String, location: Location)

// a little noise (just two lines, but that could even be a single @Json annotation... if one existed)
object JsonStuff {
  implicit val locationWrites: Writes[Location] = Json.writes[Location]
  implicit val placeWrites: Writes[Place] = Json.writes[Place]
}

object Application extends Controller {
  def places = Action {
    val list: List[Place] = {
      List(
        Place(
          "Sandleford",
          Location(51.377797, -1.318965)
        ),
        Place(
          "Watership Down",
          Location(51.235685, -1.309197)
        )
      )
    }
    import JsonStuff._
    Ok(Json.toJson(list))
  }
}
And I had to:
- move the case classes out, to make them stand alone
- delete the Place object, so the list stands by itself; otherwise I got a "No unapply function found" error (a known limitation)
- keep the order of the implicit val ... writes in mind - Place should know about Location, since those writes are macros (like in JS :) heh); otherwise you will get a NullPointerException
(a little bit overcomplicated, if you ask me)

Modifying JSON reads and writes in playframework 2.1

I'm a newbie and scala/play and need help with playframework's JSON reads/writes.
I use the Json.reads[T] and Json.writes[T] macros to define json reads and writes for my classes. However, I'd like one property name to (always) be mapped differently. Namely, I have a property named id in my classes and I want it to be represented as _id when the object is converted to json, and vice versa.
Is there a way to modify the reads/writes objects generated by the Json.reads and Json.writes macros to achieve this, or do I have to write the reads and writes manually just to have one property named differently?
EDIT
Let me try to explain the problem better. Consider the model object User:
case class User (id: BigInt, email: String, name: String)
When serializing User to json for the purposes of serving json in the context of a REST api, the json should look like this:
{
"id": 23432,
"name": "Joe",
"email": "joe@example.com"
}
When serializing User to json for the purposes of storing/updating/reading from MongoDB, the json should look like:
{
"_id": 23432,
"name": "Joe",
"email": "joe@example.com"
}
In other words, everything is the same, except that when communicating with Mongo, id should be represented as _id.
I know I could manually write two sets of reads and writes for each model object (one to be used for the web and another for communication with Mongo), as suggested by Darcy Qiu in the answer. However, maintaining two sets of reads and writes that are nearly identical except for the id property seems like a lot of code duplication, so I'm wondering if there is a better approach.
First you define transformations that rename id/_id back and forth:
import play.api.libs.json._
import play.modules.reactivemongo.json._
val into: Reads[JsObject] = __.json.update( // copies the full JSON
(__ \ 'id).json.copyFrom( (__ \ '_id).json.pick ) // adds id
) andThen (__ \ '_id).json.prune // and after removes _id
val from: Reads[JsObject] = __.json.update( // copies the full JSON
(__ \ '_id).json.copyFrom( (__ \ 'id).json.pick ) // adds _id
) andThen (__ \ 'id).json.prune // and after removes id
(To understand why Reads is a transformation please read: https://www.playframework.com/documentation/2.4.x/ScalaJsonTransformers)
Assuming we have macro-generated Writes and Reads for our entity class:
def entityReads: Reads[T] // eg Json.reads[Person]
def entityWrites: Writes[T] // eg Json.writes[Person]
Then we mix transformations with macro-generated code:
private[this] def readsWithMongoId: Reads[T] =
into.andThen(entityReads)
private[this] def writesWithMongoId: Writes[T] =
entityWrites.transform(jsValue => jsValue.transform(from).get)
One last thing: the Mongo driver wants to be sure (i.e. typesafe-sure) that the json it inserts is a JsObject. That is why we need an OWrites. I haven't found a better way than:
private[this] def oWritesWithMongoId = new OWrites[T] {
  override def writes(o: T): JsObject = writesWithMongoId.writes(o) match {
    case obj: JsObject => obj
    case notObj: JsValue =>
      throw new InternalError("MongoRepo has to be " +
        "defined for entities which serialize to JsObject")
  }
}
The last step is to provide an implicit OFormat.
implicit val routeFormat: OFormat[T] = OFormat(
readsWithMongoId,
oWritesWithMongoId
)
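A quick way to sanity-check the two transformers on their own (a sketch reusing the into and from values defined above; the sample values are illustrative):

  // A Json coming from the web side:
  val webJson = Json.obj("id" -> 23432, "name" -> "Joe")

  // Rename id -> _id before talking to Mongo (key order may vary):
  webJson.transform(from)  // => JsSuccess({"name":"Joe","_id":23432})

  // And back again when reading from Mongo:
  webJson.transform(from).flatMap(_.transform(into))  // => JsSuccess({"name":"Joe","id":23432})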
Let's say your case class, which is T in your question, is named User and has the definition below:
case class User(_id: String, age: Int)
Your reads can be defined as
implicit val userReads = new Reads[User] {
  def reads(js: JsValue): JsResult[User] = {
    JsSuccess(User(
      (js \ "id").as[String],
      (js \ "age").as[Int]
    ))
  }
}
Your Writes[User] should follow the same logic.
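For completeness, a sketch of what "the same logic" could look like on the writes side (mapping the _id field back to an "id" key):

  implicit val userWrites = new Writes[User] {
    def writes(u: User): JsValue = Json.obj(
      "id"  -> u._id,
      "age" -> u.age
    )
  }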
If you put in enough code you can achieve this with transformers:
val idToUnderscore = (JsPath).json.update((JsPath).read[JsObject].map { o:JsObject =>
o ++ o.transform((JsPath\"_id").json.put((JsPath\"id").asSingleJson(o))).get
}) andThen (JsPath\"id").json.prune
val underscoreWrite = normalWrites.transform( jsVal => jsVal.transform(idToUnderscore).get )
Here's the full test:
import play.api.libs.functional.syntax._
import play.api.libs.json._
val u = User("overlord", "Hansi Meier", "evil.overlord#hansi-meier.de")
val userToProperties = {u:User => (u.id, u.name, u.email)}
val normalWrites = (
(JsPath\"id").write[String] and
(JsPath\"name").write[String] and
(JsPath\"email").write[String]
)(userToProperties)
val idToUnderscore = (JsPath).json.update((JsPath).read[JsObject].map { o:JsObject =>
o ++ o.transform((JsPath\"_id").json.put((JsPath\"id").asSingleJson(o))).get
}) andThen (JsPath\"id").json.prune
val underscoreWrite = normalWrites.transform( jsVal => jsVal.transform(idToUnderscore).get )
info(Json.stringify(Json.toJson(u)(normalWrites)))
info(Json.stringify(Json.toJson(u)(underscoreWrite)))
Now, if you modify the normalWrites (say by adding additional properties), the underscoreWrite will still do what you want.