Play Scala JSON - conditionally add field to JSON object in Writes - json

In our app we have a fairly complex structure of objects that gets converted to JSON and back. Until now most of the formatters have been symmetrical (except for some very specific cases, and even those for security reasons).
Now we are facing a more complex case where the conversion of an object into JSON (the Writes) needs to create an additional field at conversion time, while the case class does not have that field.
For example, here is one of our existing formatters:
case class ChecklistColumn(kind: ColumnKind.Value, descriptor: Descriptor.Value, data: JsValue) extends Column

implicit val checklistResultChecklistDataFormat: Format[ChecklistColumn] = (
  (__ \ "kind").format[ColumnKind.Value] and
  (__ \ "descriptor").format[Descriptor.Value] and
  (__ \ "data").format[JsValue]
)(ChecklistColumn.apply, unlift(ChecklistColumn.unapply))
This one creates JSON that looks like:
{
  "kind": <String>,
  "descriptor": <String>,
  "data": <JsValue>
}
What we need to achieve is:
{
  "kind": <String>,
  "descriptor": <String>,
  "data": <JsValue>,
  "normalized_data": <JsString>
}
But only in the case where data is a JsString (in any other case normalized_data can be left empty, but ideally should not exist at all).
I do understand that we have to create separate Reads & Writes for that.
But I am not sure how to implement logic that reacts differently to different types of data.
Of course, there is always the option of creating a fully custom Writes:
override def writes(column: ChecklistColumn): JsValue = {...}
But this will add a lot of complexity to the code and make it hard to maintain.
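(For reference, such a hand-rolled Writes for this one class might look roughly like the untested sketch below, assuming our implicit enum formats are in scope; multiply that by every formatter we have.)
val customChecklistWrites: OWrites[ChecklistColumn] = OWrites { column =>
  val base = Json.obj(
    "kind"       -> column.kind,
    "descriptor" -> column.descriptor,
    "data"       -> column.data
  )
  column.data match {
    case JsString(s) => base + ("normalized_data" -> JsString(s)) // plus whatever normalization we apply
    case _           => base
  }
}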
What is the cleanest way to implement something like that?

Have a look at ScalaJsonTransformers. You can create a transformer that creates the normalised field from a string data value and use it to, erm, transform your original Format to a new Writes. Here's a slightly simplified example that could doubtless be improved (you'll want to check various edge cases):
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class ChecklistColumn(kind: String, descriptor: String, data: JsValue)

// The original format.
val checklistFormat: Format[ChecklistColumn] = (
  (__ \ "kind").format[String] and
  (__ \ "descriptor").format[String] and
  (__ \ "data").format[JsValue]
)(ChecklistColumn.apply, unlift(ChecklistColumn.unapply))

// A transformer that looks for a "data" field with a string
// value and adds the normalized_data field if it finds one.
val checklistTransformer: Reads[JsObject] = JsPath.json.update(
  (__ \ "data").read[String].flatMap(
    str => (__ \ "normalized_data").json.put(JsString(str + "!!!"))))

// A new derived Writes which writes the transformed value if
// the transformer succeeds (a data string), otherwise the
// original value.
implicit val checklistWrites: Writes[ChecklistColumn] = checklistFormat
  .transform(js => js.transform(checklistTransformer).getOrElse(js))
That gives me:
Json.prettyPrint(Json.toJson(ChecklistColumn("a", "b", JsNumber(1))))
// {
//   "kind" : "a",
//   "descriptor" : "b",
//   "data" : 1
// }

Json.prettyPrint(Json.toJson(ChecklistColumn("a", "b", JsString("c"))))
// {
//   "kind" : "a",
//   "descriptor" : "b",
//   "data" : "c",
//   "normalized_data" : "c!!!"
// }

Related

Play framework (Scala) - Getting subset of json that contains arrays

I have many very large json-objects that I return from Play Framework with Scala.
In most cases the user doesn't need all the data in the objects, only a few fields. So I want to pass in the paths I need (as query parameters), and return a subset of the json object.
I have looked at using JSON Transformers for this task.
Filter code
def filterByPaths(paths: List[JsPath], inputObject: JsObject): JsObject = {
  paths
    .map(_.json.pick)
    .map(inputObject.transform)
    .filter(_.isSuccess)
    .map { case JsSuccess(value, path) => (value, path) }
    .foldLeft(Json.obj()) { (obj, jsValueAndPath) =>
      val (jsValue, path) = jsValueAndPath
      val transformer = __.json.update(path.json.put(jsValue))
      obj.transform(transformer).get
    }
}
Usage:
val input = Json.obj(
  "field1" -> Json.obj(
    "field2" -> "right result"
  ),
  "field4" -> Json.obj(
    "field5" -> "not included"
  )
)

val result = filterByPaths(List(JsPath \ "field1" \ "field2"), input)
// {"field1":{"field2":"right result"}}
Problem
This code works fine for JsObjects, but I can't make it work if there are JsArrays in the structure. I had hoped that my JsPath could contain an index to look up the field, but that's not the case. (I don't know why I assumed that; maybe my head was too far in the JavaScript world.)
So this fails to return the first entry in the array:
val input: JsObject = Json.parse("""
  {
    "arr1" : [{
      "field1" : "value1"
    }]
  }
  """).as[JsObject]

val result = filterByPaths(List(JsPath \ "arr1" \ "0"), input)
// {}
Question
My question is: How can I return a subset of a json structure that contains arrays?
Alternative solution
I have the data as a case class first, I serialize it to JSON, and then run filterByPaths on it. Having a Reads that only creates the JSON I need in the first place might be a better solution, but creating a Reads on the fly, configured from query parameters, seemed a more difficult task than just stripping down the JSON afterwards.
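Something like the following rough, untested sketch is what I had in mind (pickBranches is my own name), and it presumably shares the same problem with array indices:
def pickBranches(paths: List[JsPath]): Reads[JsObject] =
  paths
    .map(_.json.pickBranch.orElse(Reads.pure(Json.obj()))) // skip paths that are missing
    .reduceOption((ra, rb) => for (a <- ra; b <- rb) yield a deepMerge b)
    .getOrElse(Reads.pure(Json.obj()))

// input.transform(pickBranches(List(__ \ "field1" \ "field2")))
// --> JsSuccess({"field1":{"field2":"right result"}})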
Here is an example of returning an array element:
val input: JsValue = Json.parse("""
  {
    "arr1" : [{
      "field1" : "value1"
    }]
  }
  """)

val firstElement = (input \ "arr1" \ 0).get
val firstElementAnotherWay = input("arr1")(0)
More about this in the Play Framework documentation: https://www.playframework.com/documentation/2.6.x/ScalaJson
Update
It looks like you have run into the old issue RuntimeException: expected KeyPathNode. JsPath.json.put and JsPath.json.update can't put an object into a nested array.
https://github.com/playframework/playframework/issues/943
https://github.com/playframework/play-json/issues/82
What you can do:
Use the JSZipper: https://github.com/mandubian/play-json-zipper
Create a script to update arrays "manually" (a rough sketch follows below)
If you can afford it, strip arrays in the resulting object
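For point 2, a minimal manual sketch (my own code, using the input with arr1 from the question): reading through a real integer index works, it is only put/update that fails, so you can pick the element and rebuild the array nesting by hand.
val picked: JsResult[JsValue] = input.transform((JsPath \ "arr1")(0).json.pick)
val rebuilt: JsResult[JsObject] = picked.map(v => Json.obj("arr1" -> Json.arr(v)))
// rebuilt: JsSuccess({"arr1":[{"field1":"value1"}]})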
Example of stripping array (point 3):
def filterByPaths(paths: List[JsPath], inputObject: JsObject): JsObject = {
  paths
    .map(_.json.pick)
    .map(inputObject.transform)
    .filter(_.isSuccess)
    .map { case JsSuccess(value, path) => (value, path) }
    .foldLeft(Json.obj()) { (obj, jsValueAndPath) =>
      val (jsValue, path) = jsValueAndPath
      val arrayStrippedPath = JsPath(path.path.filter(n => !(n.toJsonString matches """\[\d+\]""")))
      val transformer = __.json.update(arrayStrippedPath.json.put(jsValue))
      obj.transform(transformer).get
    }
}

val result = filterByPaths(List(JsPath \ "arr1" \ "0"), input)
// {"arr1":{"field1":"value1"}}
The best way to handle JSON objects is to use case classes and create implicit Reads and Writes for them; that way you can handle errors for every field directly. Don't make it more complicated than it needs to be.
Don't use .get; it is much better to use .getOrElse, because Scala is a type-safe language.
Don't just use any library unless you know how it works behind the scenes; it is often better to create your own simplified parsing method to save memory.
I hope this helps.
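For example, applied to the fold in filterByPaths above, that advice would mean writing something like:
obj.transform(transformer).getOrElse(obj) // fall back to the accumulator instead of throwing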

Play Json Recursive Reads

I'm working on consuming an API which exposes an object at multiple layers within its response. For example, for some responses we get back:
{
  "error": {
    "code": "123",
    "description": "Description"
  }
}
But in other situations it responds with:
{
  "data": [
    {
      "message_id": "123",
      "error": {
        "code": "123",
        "description": "Description"
      }
    }
  ]
}
In both cases the error object is identical, and in both cases I don't actually care about the rest of the payload. I was hoping to use the \\ recursive JsPath operator, but the implementation below fails:
case class ErrorMessage(code: String, description: String)
implicit val errorMessageFormat = Json.format[ErrorMessage]

case class ErrorResponse(errors: Seq[ErrorMessage])
implicit val errorResponseFormat: Format[ErrorResponse] = Format(
  (__ \\ "error").read[Seq[ErrorMessage]].map(ErrorResponse),
  (__ \ "errors").write[Seq[ErrorMessage]].contramap((r: ErrorResponse) => r.errors)
)
This gives an error:
JsError(List((//error,List(ValidationError(List(error.expected.jsarray),WrappedArray())))))
I understand why: (__ \\ "error") returns a Seq[JsValue], whereas my read call is expecting a JsArray.
Is there a nice way around this?
Since the first piece is already a Seq, just map the internal elements as you would map single objects. I'm not very familiar with the framework (I mostly use Json4s myself), but from your description something like
ErrorResponse((__ \\ "error").map(_.read[Seq[ErrorMessage]]))
should be closer to what you want. (__ \\ "error") gives you a Seq of JsValues, map does something for every JsValue, read converts a single JsValue to an ErrorMessage, and the resulting Seq[ErrorMessage] you pass to the ErrorResponse constructor.
So for anyone trying to do something similar, the below works.
val errorResponseReads = Reads[ErrorResponse] { value: JsValue =>
  val errorsJsArray: JsArray = JsArray(value \\ "error")
  errorsJsArray.validate[Seq[ErrorMessage]].map(ErrorResponse)
}
The format then becomes:
implicit val errorResponseFormat: Format[ErrorResponse] = Format(
  errorResponseReads,
  (__ \ "errors").write[Seq[ErrorMessage]].contramap((r: ErrorResponse) => r.errors)
)
Basically you need to define a Reads that you use explicitly. In that Reads you can use the recursive search to return a Seq[JsValue] and then create a JsArray that can be validated as normal.
This works fine; it would be nice if we didn't have to define a separate Reads[T] though.
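One small variation (an untested sketch of mine) that avoids the separate val is to inline the Reads into the Format:
implicit val errorResponseFormat: Format[ErrorResponse] = Format(
  Reads(js => JsArray(js \\ "error").validate[Seq[ErrorMessage]].map(ErrorResponse)),
  (__ \ "errors").write[Seq[ErrorMessage]].contramap((r: ErrorResponse) => r.errors)
)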

Play ScalaJSON Reads[T] parsing

I am writing a JSON parser for a REST web service response. I have JSON that looks like this:
{"program": {
"name": "myname",
"#id": "12345",
"$": "text text text"
}, etc. etc.
I wrote a case class for the Reads object:
case class program(name: String)
implicit val programFormat = Json.format[program]
And this pseudocode to get the data:
val x = (jobj \ "program").validate[program]
x match {
  case JsSuccess(pr, _) =>
    println("JsSuccess: " + pr)
    println(pr.name)
  case error: JsError => // ...
}
For the name field there is no problem, the code works well, but I don't understand how to capture the "#id" and "$" fields, because I cannot create a parameter in a case class named #id or $.
Thank you for your help.
The more correct solution, in my opinion, is creating your own Reads, that is:
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Program(name: String, id: String, dollar: String)

implicit val programReads: Reads[Program] = (
  (__ \ "name").read[String] ~
  (__ \ "#id").read[String] ~
  (__ \ "$").read[String]
)(Program.apply _)
Docs: https://www.playframework.com/documentation/2.4.x/ScalaJsonCombinators#Reads
Another solution, which I think is much worse, is using backticks:
case class Program(name: String, `#id`: String, `$`: String)
implicit val programFormat = Json.format[Program]
Backticks allow special characters in method names, field names and so on.
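For instance (an illustrative snippet, not from the original answer), the backticks are then needed at every use site as well:
val p = Json.parse("""{"name": "myname", "#id": "12345", "$": "text"}""").as[Program]
println(p.`#id`) // prints 12345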
More about it: Need clarification on Scala literal identifiers (backticks)

Modifying JSON reads and writes in playframework 2.1

I'm a newbie to Scala/Play and need help with Play Framework's JSON reads/writes.
I use the Json.reads[T] and Json.writes[T] macros to define JSON reads and writes for my classes. However, I'd like to have one property name (always) mapped differently. Namely, I have a property named id in my classes and I want it to be represented as _id when the object is converted to JSON, and vice versa.
Is there a way to modify the Reads/Writes objects generated by the Json.reads and Json.writes macros to achieve this, or do I have to rewrite the Reads and Writes manually just to have one property named differently?
EDIT
Let me try to explain the problem better. Consider the model object User:
case class User (id: BigInt, email: String, name: String)
When serializing User to JSON for the purpose of serving it from a REST API, the JSON should look like this:
{
  "id": 23432,
  "name": "Joe",
  "email": "joe@example.com"
}
When serializing User to JSON for storing/updating/reading from MongoDB, the JSON should look like:
{
  "_id": 23432,
  "name": "Joe",
  "email": "joe@example.com"
}
In other words, everything is the same except that when communicating with Mongo, id should be represented as _id.
I know I could manually write two sets of Reads and Writes for each model object (one for the web and another for communicating with Mongo), as suggested by Darcy Qiu in the answer. However, maintaining two sets of Reads and Writes that are nearly identical except for the id property seems like a lot of code duplication, so I'm wondering if there is a better approach.
First you define transformations that rename id/_id back and forth:
import play.api.libs.json._
import play.modules.reactivemongo.json._

val into: Reads[JsObject] = __.json.update(        // copies the full JSON
  (__ \ 'id).json.copyFrom((__ \ '_id).json.pick)  // adds id
) andThen (__ \ '_id).json.prune                   // and afterwards removes _id

val from: Reads[JsObject] = __.json.update(        // copies the full JSON
  (__ \ '_id).json.copyFrom((__ \ 'id).json.pick)  // adds _id
) andThen (__ \ 'id).json.prune                    // and afterwards removes id
(To understand why Reads is a transformation please read: https://www.playframework.com/documentation/2.4.x/ScalaJsonTransformers)
Assuming we have macro generated Writes and Reads for our entity class:
def entityReads: Reads[T] // eg Json.reads[Person]
def entityWrites: Writes[T] // eg Json.writes[Person]
Then we mix transformations with macro-generated code:
private[this] def readsWithMongoId: Reads[T] =
  into.andThen(entityReads)

private[this] def writesWithMongoId: Writes[T] =
  entityWrites.transform(jsValue => jsValue.transform(from).get)
One last thing: the Mongo driver wants to be sure (i.e. typesafe-sure) that the JSON it inserts is a JsObject. That is why we need an OWrites. I haven't found a better way than:
private[this] def oWritesWithMongoId = new OWrites[T] {
  override def writes(o: T): JsObject = writesWithMongoId.writes(o) match {
    case obj: JsObject => obj
    case notObj: JsValue =>
      throw new InternalError("MongoRepo has to be " +
        "defined for entities which serialize to JsObject")
  }
}
The last step is to provide an implicit OFormat.
implicit val routeFormat: OFormat[T] = OFormat(
  readsWithMongoId,
  oWritesWithMongoId
)
Let's say your case class, which is T in your question, is named User and is defined as below:
case class User(_id: String, age: Int)
Your reads can be defined as
implicit val userReads = new Reads[User] {
  def reads(js: JsValue): JsResult[User] = JsSuccess(
    User(
      (js \ "id").as[String],
      (js \ "age").as[Int]
    )
  )
}
Your Writes[User] should follow the same logic.
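Spelled out, that might look something like this (mirroring the Reads above):
implicit val userWrites = new Writes[User] {
  def writes(u: User): JsValue = Json.obj(
    "id"  -> u._id,
    "age" -> u.age
  )
}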
If you put in enough code you can achieve this with transformers:
val idToUnderscore = JsPath.json.update(JsPath.read[JsObject].map { o: JsObject =>
  o ++ o.transform((JsPath \ "_id").json.put((JsPath \ "id").asSingleJson(o))).get
}) andThen (JsPath \ "id").json.prune

val underscoreWrite = normalWrites.transform(jsVal => jsVal.transform(idToUnderscore).get)
Here's the full test:
import play.api.libs.functional.syntax._
import play.api.libs.json._

val u = User("overlord", "Hansi Meier", "evil.overlord@hansi-meier.de")
val userToProperties = { u: User => (u.id, u.name, u.email) }

val normalWrites = (
  (JsPath \ "id").write[String] and
  (JsPath \ "name").write[String] and
  (JsPath \ "email").write[String]
)(userToProperties)

val idToUnderscore = JsPath.json.update(JsPath.read[JsObject].map { o: JsObject =>
  o ++ o.transform((JsPath \ "_id").json.put((JsPath \ "id").asSingleJson(o))).get
}) andThen (JsPath \ "id").json.prune

val underscoreWrite = normalWrites.transform(jsVal => jsVal.transform(idToUnderscore).get)

info(Json.stringify(Json.toJson(u)(normalWrites)))
info(Json.stringify(Json.toJson(u)(underscoreWrite)))
Now, if you modify the normalWrites (say by adding additional properties), the underscoreWrite will still do what you want.

Handle multidimensional JSON with scala Play framework

I am trying to send data from the client to the server using a JSON request. The body of the JSON request looks like this:
[
  [
    {"x": "0", "y": "0", "player": 0},
    {"x": "0", "y": "1", "player": 0},
    {"x": "0", "y": "2", "player": 1}
  ],
  [
    {"x": "1", "y": "0", "player": 0},
    {"x": "1", "y": "1", "player": 2},
    {"x": "1", "y": "2", "player": 0}
  ],
  [
    {"x": "2", "y": "0", "player": 0},
    {"x": "2", "y": "1", "player": 1},
    {"x": "2", "y": "2", "player": 2}
  ]
]
On the server side I would like to transform the data with the Play 2 framework into a Scala 2D list like this:
List(
  List(0, 0, 1),
  List(0, 2, 0),
  List(0, 1, 2)
)
This is 3x3, but it can be variable, like 50x50 or so.
Thanks for any help.
It might be incomplete (I don't know if you want to model the square matrix constraint as well), but something like this could be a good start.
First, here is what the controller (and model) part can define:
import play.api.libs.json.Json._
import play.api.libs.json._
import play.api.libs.functional.syntax._

type PlayerT = (String, String, Int)

implicit val playerTripleReads: Reads[PlayerT] = (
  (__ \ "x").read[String] and
  (__ \ "y").read[String] and
  (__ \ "player").read[Int]
).tupled

def getJson = Action(parse.json) { request =>
  request.body.validate[List[List[PlayerT]]].map {
    case xs => Ok(xs.mkString("\n"))
  }.recoverTotal {
    e => BadRequest("Detected error: " + JsError.toFlatJson(e))
  }
}
In this version, you'll get a list of lists holding validated tuples of the form (String, String, Int), which has been aliased to the PlayerT type to save some typing.
As you may have seen, the reader has been created "by hand" by composing (using the and combinator) three basic blocks, and the result is flattened using the tupled operator.
With this solution you're now on track to play with those tuples, but IMO the code will suffer from poor readability because of the use of _1, _2 and _3 along the way.
So here is a different approach (which is in fact even easier...) that tackles this readability problem: simply define a case class that models your atomic data.
case class Player(x: String, y: String, player: Int)

implicit val playerReads = Json.reads[Player]

def getJson = Action(parse.json) { request =>
  request.body.validate[List[List[Player]]].map {
    case xs => Ok(xs.mkString("\n"))
  }.recoverTotal {
    e => BadRequest("Detected error: " + JsError.toFlatJson(e))
  }
}
Note that the reader will always follow further changes in your data representation (that is, the case class's fields), thanks to the implicit creation of the reader at compile time.
Now, you'll be able to use x, y and player fields rather than _1, _2 and _3.
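If you then want the List[List[Int]] of player numbers from the question, a last mapping step (my addition, not part of the original answer) could be:
def toPlayerGrid(rows: List[List[Player]]): List[List[Int]] =
  rows.map(_.map(_.player))

// For the example input: List(List(0, 0, 1), List(0, 2, 0), List(0, 1, 2))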