Read JSON Tree structure in Scala Play Framework - json

I am trying to process an Ajax POST request in Play Framework 2.1.3. The post data is a JSON object and has a tree structure like:
{ "id": "a", "name": "myname", "kids": [
    { "id": "a1", "name": "kid1", "kids": [] },
    { "id": "a2", "name": "kid2", "kids": [
        { "id": "aa1", "name": "grandkid", "kids": [] }
    ] }
] }
I would like to nest the children ('kids') arbitrarily deep. The class I have in mind is something like this (I realize the recursiveness can be problematic):
case class Person(
  id: String,
  name: String,
  kids: Array[Person]
)
The format I would have in mind:
implicit val personFormat:Format[Person] = Json.format[Person]
Play is throwing errors on my Format that I wrote:
type mismatch; found : controllers.Resources.Person required: Array[controllers.Resources.Person]
I am aware that Play has a Tree structure. I couldn't find examples/documentation on how to tie that to JSON reads.
Any help is highly appreciated, thanks

You will need a recursive val, declared lazy and using lazyRead for the nested field, something like:
import play.api.libs.json._
import play.api.libs.functional.syntax._

implicit lazy val jsonReads: Reads[Person] = (
  (__ \ "id").read[String] and
  (__ \ "name").read[String] and
  (__ \ "kids").lazyRead(Reads.seq[Person](jsonReads))
)(Person.apply _)
(I've changed the collection type from Array to Seq, so the case class would use kids: Seq[Person], because Seq is more general and lets you change your implementation without affecting downstream code.)
This is using the syntax documented here.
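For completeness, a minimal usage sketch, assuming the case class uses kids: Seq[Person] and the Reads above is in scope:
import play.api.libs.json._

val js = Json.parse(
  """{ "id": "a", "name": "myname",
       "kids": [ { "id": "a1", "name": "kid1", "kids": [] } ] }""")

js.validate[Person] match {
  case JsSuccess(person, _) => println(person.kids.map(_.name)) // the kid names
  case JsError(errors)      => println(errors)
}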

The only way I see this working is to use either JsArray or Array[String] instead of Array[Person] in your Person case class. The JSON macro inception can only generate Reads/Writes code for primitives, and for lists, maps, and arrays of types that already have implicit JSON Reads/Writes in scope. Essentially, you can't have a case class that references itself.
package models

import play.api.libs.json._

case class Person(
  id: String,
  name: String,
  kids: JsArray
)

object Person extends ((String, String, JsArray) => Person) {
  implicit val jsonFormat = Json.format[Person]
}
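Under this workaround the kids stay as raw JSON, so nesting is handled on demand. As a hedged sketch (collectNames is a hypothetical helper, not part of the answer above), each element of the JsArray can be re-validated as a Person when you need it:
def collectNames(p: Person): Seq[String] =
  p.name +: p.kids.value.flatMap(_.asOpt[Person]).flatMap(collectNames) // recurses until kids is empty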

Related

PlayJSON in Scala

I am trying to familiarize myself with the PlayJSON library. I have a JSON formatted file like this:
{
  "Person": [
    {
      "name": "Jonathon",
      "age": 24,
      "job": "Accountant"
    }
  ]
}
However, I'm having difficulty parsing it properly because the file mixes value types (name is a String but age is an Int). I could technically make age a String and call .toInt on it later, but for my purposes it is an integer by default.
I know how to parse some of it:
import play.api.libs.json.{JsValue, Json}
val parsed: JsValue = Json.parse(jsonFile) //assuming jsonFile is that entire JSON String as shown above
val person: List[Map[String, String]] = (parsed \ "Person").as[List[Map[String, String]]]
Creating that person value throws an error. I know Scala is a strongly-typed language but I'm sure there is something I am missing here. I feel like this is an obvious fix too but I'm not quite sure.
The error produced is:
JsResultException(errors:List(((0)/age,List(JsonValidationError(List(error.expected.jsstring),WrappedArray())))
The error, as the message says, comes from casting to a map of String to String: the data does not fit that type, because age is a number, not a string. If you want to keep this approach, you need to parse the values into a type that can hold both strings and numbers, for example JsValue:
(parsed \ "Person").validate[List[Map[String, JsValue]]]
Having said that, as @Luis wrote in a comment, you can just use case classes to parse it. Let's declare two case classes:
case class JsonParsingExample(Person: Seq[Person])
case class Person(name: String, age: Int, job: String)
Now we will create a formatter for each of them on their corresponding companion object:
object Person {
  implicit val format: OFormat[Person] = Json.format[Person]
}

object JsonParsingExample {
  implicit val format: OFormat[JsonParsingExample] = Json.format[JsonParsingExample]
}
Now we can just do:
Json.parse(jsonFile).validate[JsonParsingExample]
Code run at Scastie.
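For reference, a short sketch of consuming the JsResult (class and formatter names as above, jsonFile as in the question):
import play.api.libs.json._

Json.parse(jsonFile).validate[JsonParsingExample] match {
  case JsSuccess(result, _) =>
    result.Person.foreach(p => println(s"${p.name}, ${p.age}, ${p.job}"))
  case JsError(errors) =>
    println(s"Parsing failed: $errors")
}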

play framework json reads from empty string to empty list

Hi everyone, I recently faced an issue converting JSON into my own data model.
I have a JSON message which may contain an empty string:
{
"name" : "John Doe",
"hobbies": ""
}
or a list of hobby types:
{
"name" : "John Doe",
"hobbies": [{"name":"basketball"}]
}
And the following is my case class data model in the Scala Play framework:
case class Person(name: String, hobbies: List[Hobby])
case class Hobby(name: String)
Right now I'm using the default JSON formatter, but of course it doesn't work well when hobbies is an empty string.
implicit val HobbyJson = Json.format[Hobby]
implicit val PersonJson = Json.format[Person]
It throws an exception when hobbies is an empty string; I want to convert that case into an empty list. I searched the documentation Play provides but couldn't find information on this. Can anyone give some suggestions?
Thanks in advance.
As you mentioned, the default Format macros won't work for you here because of the inconsistent treatment of hobbies. So you need to implement your own Reads[Person] - here's how I'd do it:
object PersonJson {
  implicit val hobbyConverter = Json.format[Hobby]

  val personReads = new Reads[Person] {
    override def reads(json: JsValue): JsResult[Person] = {
      for {
        personName <- (json \ "name").validate[String]
        hobbies    <- (json \ "hobbies").validate[JsValue]
      } yield {
        val maybeHobbyList = hobbies.validate[List[Hobby]].asOpt
        Person(personName, maybeHobbyList.getOrElse(Nil))
      }
    }
  }

  implicit val personConverter = Format(personReads, Json.writes[Person])
}
The key thing to note here is surrounding the whole thing in a JsResult courtesy of the for-comprehension and the yield. This gives us all the necessary checking (like the name field being there and being a String, and the hobbies field being there).
The code within the yield block only runs if we've got something that looks pretty close to a Person. Then we can safely try validating the hobbies as a List[Hobby], and convert the result to an Option[List[Hobby]]. It'll be a None if it didn't work (thus it must have been a string) and so we default it to the empty list as required.
Thanks to @millhouse's answer, it definitely works. Like he said, we need a custom Reads[Person] to properly convert it.
I also post my code as a reference.
implicit val personJsonReads: Reads[Person] = (
  (__ \ "name").read[String] and
  (__ \ "hobbies").read[List[Hobby]].orElse(Reads.pure(List()))
)(Person.apply _)
read[List[Hobby]].orElse(Reads.pure(List())) will produce the empty list whenever the value cannot be converted to a List[Hobby].
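As a quick check (assuming the Reads above plus a plain Reads[Hobby], e.g. Json.reads[Hobby], are in scope), both shapes now parse:
val withHobbies = Json.parse("""{ "name": "John Doe", "hobbies": [ { "name": "basketball" } ] }""")
val withEmpty   = Json.parse("""{ "name": "John Doe", "hobbies": "" }""")

withHobbies.as[Person] // Person("John Doe", List(Hobby("basketball")))
withEmpty.as[Person]   // Person("John Doe", List())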

Play ScalaJSON Reads[T] parsing

I am writing a JSON parser for a REST web service response. The JSON looks like this:
{"program": {
"name": "myname",
"#id": "12345",
"$": "text text text"
}, etc. etc.
I wrote a case class for the Reads object:
case class program(name: String)
implicit val programFormat = Json.format[program]
And this pseudo-code to get the data:
val x = (jobj \ "program").validate[program]
x match {
  case JsSuccess(pr, _) =>
    println("JsSuccess: " + pr)
    println(pr.name)
  case error: JsError => // ...
}
For the field name there is no problem, the code works well, but I don't understand how to capture the fields "#id" and "$", because I cannot create a parameter in a case class named #id or $.
Thank you for your help.
A more correct solution, in my opinion, is to create your own Reads, that is:
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Program(name: String, id: String, dollar: String)

implicit val programReads: Reads[Program] = (
  (__ \ "name").read[String] ~
  (__ \ "#id").read[String] ~
  (__ \ "$").read[String]
)(Program.apply _)
Docs: https://www.playframework.com/documentation/2.4.x/ScalaJsonCombinators#Reads
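For reference, a quick usage sketch against the JSON from the question (assuming the implicit Reads above is in scope):
val js = Json.parse("""{ "program": { "name": "myname", "#id": "12345", "$": "text text text" } }""")

(js \ "program").validate[Program] match {
  case JsSuccess(program, _) => println(program.id) // 12345
  case JsError(errors)       => println(errors)
}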
Another solution, which I think is much worse, is to use backticks:
case class Program(name: String, `#id`: String, `$`: String)
implicit val programFormat = Json.format[Program]
Backticks allow special characters in method names, field names, and so on.
More about it: Need clarification on Scala literal identifiers (backticks)

Abstraction to extract data from JSON in Scala

I am looking for a good abstraction to extract data from JSON (I am using json4s now).
Suppose I have a case class A and data in JSON format.
case class A(a1: String, a2: String, a3: String)
{"a1":"xxx", "a2": "yyy", "a3": "zzz"}
I need a function to extract the JSON data and return A with these data as follows:
val a: JValue => A = ...
I do not want to write the function a from scratch. I would rather compose it from primitive functions.
For example, I can write a primitive function to extract string by field name:
val str: (String, JValue) => String = {(fieldName, jval) => ... }
Now I would like to compose the function a: JValue => A from str. Does that make sense?
Consider use of Play-JSON, which has a composable "Reads" object. If you've ever used ReactiveMongo, it can be used in much the same way. Contrary to some older posts here, it can be used stand-alone, without most of the rest of Play.
It uses the common "implicit translator" (my term) idiom. I found that my favorite deserializing pattern for using it is not highlighted in the docs, though - the pattern they espouse is a lot harder to get right, IMHO. I make heavy use of .as and .asOpt, which are documented on the first linked page above, in the small section "Using JsValue.as/asOpt". When deserializing a JSON object, you can say something like
val person:Person = (someParsedJsonObject \ "aPerson").as[Person]
and as long as you have an implicit Reads[Person] in scope, all just works. There are built-in Reads for all primitive types and many collection types. In many cases, it makes sense to put the Reads and Writes implicit objects in the companion object for, e.g., Person.
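For illustration, a minimal sketch of that pattern; the Person model and the field names here are assumptions, not from the question:
import play.api.libs.json._

case class Person(name: String, age: Int)

object Person {
  // Putting the Reads in the companion object makes it available implicitly.
  implicit val personReads: Reads[Person] = Json.reads[Person]
}

val parsed = Json.parse("""{ "aPerson": { "name": "Ann", "age": 30 } }""")

val person: Person        = (parsed \ "aPerson").as[Person]
val maybe: Option[Person] = (parsed \ "missing").asOpt[Person] // None when the path is absent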
I thought json4s had a similar feature, but I could be wrong.
Argonaut is a purely functional Scala JSON library.
It lets you encode and decode case classes (JSON codecs).
import argonaut._, Argonaut._

case class Person(name: String, age: Int)

implicit def PersonDecodeJson: DecodeJson[Person] =
  jdecode2L(Person.apply)("name", "age")

// Codec for the Person case class from JSON of the form
// { "name": "string", "age": 1 }
It also provides JSON cursor (lenses/monocle) for custom parsing.
implicit def PersonDecodeJson: DecodeJson[Person] =
  DecodeJson(c => for {
    name <- (c --\ "_name").as[String]
    age  <- (c --\ "_age").as[String].map(_.toInt)
  } yield Person(name, age))

// Decodes Person from JSON whose property names differ from those of the
// case class, with age passed as a string:
// { "_name": "string", "_age": "10" }
The parsing result is represented by the DecodeResult type, which can be composed (.map, .flatMap) and handles error cases.
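A brief usage sketch, assuming one of the DecodeJson instances above is in scope:
import argonaut._, Argonaut._

// Some(Person("Ann", 30)) on success, None on any parse or decode failure.
val decoded: Option[Person] = Parse.decodeOption[Person]("""{ "name": "Ann", "age": 30 }""")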

Modifying JSON reads and writes in playframework 2.1

I'm a newbie to Scala/Play and need help with Play Framework's JSON reads/writes.
I use the Json.reads[T] and Json.writes[T] macros to define JSON reads and writes for my classes. However, I'd like one property name to (always) be mapped differently. Namely, I have a property named id in my classes and I want it to be represented as _id when the object is converted to JSON, and vice versa.
Is there a way to modify the reads/writes objects generated by the Json.reads and Json.writes macros to achieve this, or do I have to write the reads and writes manually just to have one property named differently?
EDIT
Let me try to explain the problem better. Consider the model object User:
case class User (id: BigInt, email: String, name: String)
When serializing User to json for purposes of serving json in context of a REST api the json should look like this:
{
  "id": 23432,
  "name": "Joe",
  "email": "joe@example.com"
}
When serializing User to json for purposes of storing/updating/reading form MongoDB json should look like:
{
  "_id": 23432,
  "name": "Joe",
  "email": "joe@example.com"
}
In other words, everything is the same, except that when communicating with Mongo, id should be represented as _id.
I know I could manually write two sets of reads and writes for each model object (one to be used for the web and another for communication with Mongo), as suggested by Darcy Qiu in the answer below, but maintaining two sets of reads and writes that are nearly identical except for the id property seems like a lot of code duplication, so I'm wondering if there is a better approach.
First, define transformations that rename id/_id back and forth:
import play.api.libs.json._
import play.modules.reactivemongo.json._

val into: Reads[JsObject] = __.json.update(         // copies the full JSON
  (__ \ 'id).json.copyFrom((__ \ '_id).json.pick)   // and adds id
) andThen (__ \ '_id).json.prune                    // then removes _id

val from: Reads[JsObject] = __.json.update(         // copies the full JSON
  (__ \ '_id).json.copyFrom((__ \ 'id).json.pick)   // and adds _id
) andThen (__ \ 'id).json.prune                     // then removes id
(To understand why Reads is a transformation please read: https://www.playframework.com/documentation/2.4.x/ScalaJsonTransformers)
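To make the direction of each transformation concrete, here is a small sketch of into and from applied to raw JsObjects (the values are made up):
val fromMongo = Json.obj("_id" -> 23432, "name" -> "Joe")
fromMongo.transform(into) // JsSuccess: "_id" renamed to "id", other fields kept

val fromApi = Json.obj("id" -> 23432, "name" -> "Joe")
fromApi.transform(from)   // JsSuccess: "id" renamed to "_id", other fields kept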
Assuming we have macro generated Writes and Reads for our entity class:
def entityReads: Reads[T] // eg Json.reads[Person]
def entityWrites: Writes[T] // eg Json.writes[Person]
Then we mix transformations with macro-generated code:
private[this] def readsWithMongoId: Reads[T] =
into.andThen(entityReads)
private[this] def writesWithMongoId: Writes[T] =
entityWrites.transform(jsValue => jsValue.transform(from).get)
The last thing: the Mongo driver wants to be sure (i.e. typesafe-sure) that the JSON it inserts is a JsObject. That is why we need an OWrites. I haven't found a better way than:
private[this] def oWritesWithMongoId = new OWrites[T] {
  override def writes(o: T): JsObject = writesWithMongoId.writes(o) match {
    case obj: JsObject => obj
    case notObj: JsValue =>
      throw new InternalError("MongoRepo has to be " +
        "defined for entities which serialize to JsObject")
  }
}
The last step is to provide an implicit OFormat.
implicit val routeFormat: OFormat[T] = OFormat(
readsWithMongoId,
oWritesWithMongoId
)
Let's say your case class, which is T in your question, is named User and is defined as below:
case class User(_id: String, age: Int)
Your Reads can be defined as:
implicit val userReads = new Reads[User] {
  def reads(js: JsValue): JsResult[User] = JsSuccess(
    User(
      (js \ "id").as[String],
      (js \ "age").as[Int]
    )
  )
}
Your Writes[User] should follow the same logic.
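For completeness, a hedged sketch of what that Writes could look like under the same assumptions (mapping the _id field back to the "id" key):
implicit val userWrites = new Writes[User] {
  def writes(u: User): JsValue = Json.obj(
    "id"  -> u._id,
    "age" -> u.age
  )
}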
If you put in enough code you can achieve this with transformers:
val idToUnderscore = JsPath.json.update(JsPath.read[JsObject].map { o: JsObject =>
  o ++ o.transform((JsPath \ "_id").json.put((JsPath \ "id").asSingleJson(o))).get
}) andThen (JsPath \ "id").json.prune

val underscoreWrite = normalWrites.transform(jsVal => jsVal.transform(idToUnderscore).get)
Here's the full test:
import play.api.libs.functional.syntax._
import play.api.libs.json._

val u = User("overlord", "Hansi Meier", "evil.overlord@hansi-meier.de")
val userToProperties = { u: User => (u.id, u.name, u.email) }

val normalWrites = (
  (JsPath \ "id").write[String] and
  (JsPath \ "name").write[String] and
  (JsPath \ "email").write[String]
)(userToProperties)

val idToUnderscore = JsPath.json.update(JsPath.read[JsObject].map { o: JsObject =>
  o ++ o.transform((JsPath \ "_id").json.put((JsPath \ "id").asSingleJson(o))).get
}) andThen (JsPath \ "id").json.prune

val underscoreWrite = normalWrites.transform(jsVal => jsVal.transform(idToUnderscore).get)

info(Json.stringify(Json.toJson(u)(normalWrites)))
info(Json.stringify(Json.toJson(u)(underscoreWrite)))
Now, if you modify the normalWrites (say by adding additional properties), the underscoreWrite will still do what you want.
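If you also need the reverse direction, the same trick works for a Reads: rename "_id" back to "id" first, then apply an ordinary Reads[User]. In this sketch, normalReads is assumed to be the counterpart of normalWrites (e.g. built with the same combinators or Json.reads[User]):
val underscoreToId = JsPath.json.update(JsPath.read[JsObject].map { o: JsObject =>
  o ++ o.transform((JsPath \ "id").json.put((JsPath \ "_id").asSingleJson(o))).get
}) andThen (JsPath \ "_id").json.prune

val underscoreRead: Reads[User] = underscoreToId andThen normalReads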