Unable to understand Reads[T] / JsPath example in documentation - json

I am unable to understand the example code for JsPath and Reads in the documentation:
https://www.playframework.com/documentation/2.2.1/ScalaJsonCombinators
import play.api.libs.json._
import play.api.libs.functional.syntax._
Question 1 - We create a custom reader. Reads should be able to read a structure of data consisting of a String, a Float and a List. But in the example below, we pass it JSON! How is the JSON getting converted to (String, Float, List)?
Question 2 - we use JsPath \ "key1" but where have we passed the JSON?
val customReads: Reads[(String, Float, List[String])] =
  (JsPath \ "key1").read[String](email keepAnd minLength(5)) and
  (JsPath \ "key2").read[Float](min(45)) and
  (JsPath \ "key3").read[List[String]]
  tupled
import play.api.libs.json.Json

val js = Json.obj(
  "key1" -> "alpha",
  "key2" -> 123.345F,
  "key3" -> Json.arr("alpha", "beta")
)
scala> customReads.reads(js)
res5: JsSuccess(("alpha", 123.345F, List("alpha", "beta")))
customReads.reads(js).fold(
  invalid = { errors => ... },
  valid = { res =>
    val (s, f, l): (String, Float, List[String]) = res
    ...
  }
)

Question 1
We create a custom reader.
Yes
Reads should be able to read a structure of data consisting of String, Float and a List.
No
val customReads: Reads[(String, Float, List[String])] = ...
This means that customReads is a value of type Reads, and that its reads method must return a tuple of type (String, Float, List[String]).
But in the example below, we pass it JSON! How is the JSON getting converted to (String, Float, List)?
Reads.reads is a function that takes JSON as a parameter and returns some value, extracted from the JSON by defined rules. In our case, the rules are:
(JsPath \ "key1").read[String](email keepAnd minLength(5))
and
(JsPath \ "key2").read[Float](min(45))
and
(JsPath \ "key3").read[List[String]]
and all of these values must be tupled, so our Reads.reads function returns a (String, Float, List[String]) tuple.
Question 2
we use JsPath \ "key1" but where have we passed the JSON?
JsPath \ "key1" is not the actual code to process the JSON, it's a rule to process a JSON. i.e it is like XPath expression /key1

Related

How do you add validation to a Reads in Play Json?

Let's say I've got a reads that creates an object from JSON with two optional fields:
implicit val rd: Reads[MyObject] = (
  (__ \ "field1").readNullable[String] and
  (__ \ "field2").readNullable[String]
)(MyObject.apply _)
I want to check to make sure that the value of field1 is one of the values in the list:
List("foo", "bar")
I can do that after the fact, by creating a new MyObject and mapping the values through a function to transform them, but I feel like there should be a way to do this more elegantly using JSON transformers or something.
Ideally, I want the Reads to read the nullable value of field1 and transform it if it is defined, without the need to post-process it. Is there some way of sneaking a transform in there?
You can use this approach:
case class MyObject(a: Option[String], b: Option[String])

val allowedValues = Seq("foo", "bar")

implicit val reads: Reads[MyObject] = new Reads[MyObject] {
  override def reads(json: JsValue): JsResult[MyObject] = {
    val a = (json \ "a").asOpt[String].filter(allowedValues.contains)
    val b = (json \ "b").asOpt[String]
    JsSuccess(MyObject(a, b))
  }
}
Usage examples:
scala> Json.parse(""" { "a": "bar", "b": "whatever"} """).validate[MyObject]
res2: play.api.libs.json.JsResult[MyObject] = JsSuccess(MyObject(Some(bar),Some(whatever)),)
scala> Json.parse(""" { "a": "other", "b": "whatever"} """).validate[MyObject]
res3: play.api.libs.json.JsResult[MyObject] = JsSuccess(MyObject(None,Some(whatever)),)
scala> Json.parse(""" {} """).validate[MyObject]
res4: play.api.libs.json.JsResult[MyObject] = JsSuccess(MyObject(None,None),)
Okay, after doing some more research, I came up with the following:
In play.api.libs.json.ConstraintReads there is a function called verifying(cond: A => Boolean) that returns a Reads[A]. This can be passed as a parameter to JsPath.readNullable[A] like so:
implicit val rd: Reads[MyObject] = (
  (__ \ "field1").readNullable[String](verifying(allowedValues.contains)) and
  (__ \ "field2").readNullable[String]
)(MyObject.apply _)
This will return a JsResult: either a JsSuccess if "field1" validates, or a JsError if it doesn't. It actually fails on invalid input, rather than just ignoring it. That's more like the behaviour I wanted.
There are a number of other constraint functions that perform similar tests on the read value, as well.
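A self-contained sketch of this verifying-based Reads plus a usage check (the case class name, field names and allowed values are assumptions carried over from the question; the Reads._ import is one possible way to bring verifying into scope):

import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._

case class MyObject(field1: Option[String], field2: Option[String])

val allowedValues = List("foo", "bar")

implicit val rd: Reads[MyObject] = (
  // verifying wraps the String Reads with the extra constraint
  (__ \ "field1").readNullable[String](verifying(allowedValues.contains)) and
  (__ \ "field2").readNullable[String]
)(MyObject.apply _)

Json.parse("""{"field1": "foo"}""").validate[MyObject]
// JsSuccess(MyObject(Some(foo),None),...)
Json.parse("""{"field1": "baz"}""").validate[MyObject]
// JsError: fails instead of silently dropping the invalid value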

Play Scala JSON: combine properties

I have the following case class:
case class User(name: String).
I am trying to implement a JSON Reads converter for it, so I can do the following:
val user = userJson.validate[User]
… but the incoming JSON has slightly different structure:
{ "firstName": "Bob", "lastName": "Dylan" }.
How can I implement my JSON Reads converter to combine the JSON fields firstName and lastName into a name property on my class?
This should do the trick:
implicit val userReads: Reads[User] =
  for {
    first <- (__ \ "firstName").read[String]
    last <- (__ \ "lastName").read[String]
  } yield User(s"$first $last")
EDIT
Without using a for comprehension
implicit val userReads = {
  (__ \ "firstName").read[String] and
  (__ \ "lastName").read[String]
}.tupled.map(t => User(s"${t._1} ${t._2}"))
Bringing userReads in scope where you want to use it will let you parse the JSON you provided.
Reads is essentially a function from JsValue to JsResult, so userReads represents a function from JsValue to JsResult[User]. Within that function, it first inspects the provided JSON and tries to read out a property named "firstName" from the current JSON path (__ is shorthand for this). \ indicates that the field it's looking for is one level beneath the root, and read[String] means the value associated with the "firstName" key should be read as a string. The same follows for "lastName".
Edit
In the version without the for comprehension, it first creates an intermediate object, FunctionalBuilder[Reads]#CanBuild[String, String], which is a complicated way of saying it reads two distinct strings from the JSON. Next, it converts that complex object into a Reads[(String, String)] by way of tupled. Finally, it maps the pair of strings into a User.
Were you to try validating some JSON without "firstName" and "lastName", this would fail with a validation error for a missing path.
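For completeness, a runnable sketch of the for-comprehension version together with the success and failure cases just described (the User case class is the one from the question):

import play.api.libs.json._

case class User(name: String)

// Combine two JSON fields into the single "name" property
implicit val userReads: Reads[User] =
  for {
    first <- (__ \ "firstName").read[String]
    last  <- (__ \ "lastName").read[String]
  } yield User(s"$first $last")

Json.parse("""{"firstName": "Bob", "lastName": "Dylan"}""").validate[User]
// JsSuccess(User(Bob Dylan),...)
Json.parse("""{"firstName": "Bob"}""").validate[User]
// JsError: error.path.missing at /lastName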

Serialize only specific attributes using Writes trait with unapply

Let's imagine I have a case class like this:
case class Product(ean: Long, name: String, description: String)
and I want to serialize objects of this class to JSON. I can implement the Writes trait like this:
implicit val productWrites: Writes[Product] = (
  (JsPath \ "ean").write[Long] and
  (JsPath \ "name").write[String] and
  (JsPath \ "description").write[String]
)(unlift(Product.unapply))
This works fine if I want to serialize all the attributes of the object. Now let's say I don't want to serialize the ean. I tried something like this:
implicit val productWrites: Writes[Product] = (
  (JsPath \ "name").write[String] and
  (JsPath \ "description").write[String]
)(unlift(Product.unapply))
This doesn't seem to work since one needs to use all the fields/attributes that the unapply method returns.
Is there a way to make the second serialization method work with only the attributes that I want to serialize or do I have to use something like this:
implicit object ProductWrites extends Writes[Product] {
  def writes(p: Product) = Json.obj(
    "name" -> Json.toJson(p.name),
    "description" -> Json.toJson(p.description)
  )
}
Is this the only way?
unlift(Product.unapply) has the type Product => (Long, String, String).
In this case, the argument should have the type Product => (String, String). You can write a function literal like the following:
implicit val productWrites: Writes[Product] = (
  (JsPath \ "name").write[String] and
  (JsPath \ "description").write[String]
)(p => (p.name, p.description))
I think your last example is the way to go. Here's another way of doing the same thing using an implicit val instead of an implicit object:
implicit val productWrites: Writes[Product] = Writes { p =>
  Json.obj(
    "name" -> Json.toJson(p.name),
    "description" -> Json.toJson(p.description)
  )
}
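A quick usage sketch of the Writes-based variant above, assuming the same Product case class, just to show that ean is simply left out of the output (the example EAN value is made up):

import play.api.libs.json._

case class Product(ean: Long, name: String, description: String)

// Only name and description are written; ean never appears in the JSON
implicit val productWrites: Writes[Product] = Writes { p =>
  Json.obj(
    "name" -> p.name,
    "description" -> p.description
  )
}

Json.toJson(Product(5901234123457L, "Widget", "A small widget"))
// {"name":"Widget","description":"A small widget"} -- no "ean" field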

If statements within Play/Scala JSON parsing?

Is there a way to perform conditional logic while parsing json using Scala/Play?
For example, I would like to do something like the following:
implicit val playlistItemInfo: Reads[PlaylistItemInfo] = (
  (if (((JsPath \ "type1").readNullable[String]) != null) {
    (JsPath \ "type1" \ "id").read[String]
  } else {
    (JsPath \ "type2" \ "id").read[String]
  }) and
  (JsPath \ "name").readNullable[String]
)(PlaylistItemInfo.apply _)
In my hypothetical JSON parsing example, there are two possible ways to parse the JSON. If the item is of "type1", then there will be a value for "type1" in the JSON. If this is not present in the JSON or its value is null/empty, then I would like to read the JSON node "type2" instead.
The above example does not work, but it gives you the idea of what I am trying to do.
Is this possible?
The proper way to do this with JSON combinators is to use orElse. Each piece of the combinator must be a Reads[YourType], so if/else doesn't quite work: your if clause doesn't return a Boolean, it returns a Reads[PlaylistItemInfo] checked against null, which will always be true. orElse lets us combine one Reads that looks for the type1 field with a second one that looks for the type2 field as a fallback.
This might not follow your exact structure, but here's the idea:
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class PlaylistItemInfo(id: Option[String], tpe: String)

object PlaylistItemInfo {
  implicit val reads: Reads[PlaylistItemInfo] = (
    (__ \ "id").readNullable[String] and
    (__ \ "type1").read[String].orElse((__ \ "type2").read[String])
  )(PlaylistItemInfo.apply _)
}
// Read type 1 over type 2
val js = Json.parse("""{"id": "test", "type1": "111", "type2": "2222"}""")
scala> js.validate[PlaylistItemInfo]
res1: play.api.libs.json.JsResult[PlaylistItemInfo] = JsSuccess(PlaylistItemInfo(Some(test),111),)
// Read type 2 when type 1 is unavailable
val js = Json.parse("""{"id": "test", "type2": "22222"}""")
scala> js.validate[PlaylistItemInfo]
res2: play.api.libs.json.JsResult[PlaylistItemInfo] = JsSuccess(PlaylistItemInfo(Some(test),22222),)
// Error from neither
val js = Json.parse("""{"id": "test", "type100": "fake"}""")
scala> js.validate[PlaylistItemInfo]
res3: play.api.libs.json.JsResult[PlaylistItemInfo] = JsError(List((/type2,List(ValidationError(error.path.missing,WrappedArray())))))

Defaults for missing properties in play 2 JSON formats

I have an equivalent of the following model in play scala :
case class Foo(id: Int, value: String)

object Foo {
  import play.api.libs.json.Json
  implicit val fooFormats = Json.format[Foo]
}
For the following Foo instance
Foo(1, "foo")
I would get the following JSON document:
{"id":1, "value": "foo"}
This JSON is persisted and read from a datastore. Now my requirements have changed and I need to add a property to Foo. The property has a default value:
case class Foo(id: String, value: String, status: String = "pending")
Writing to JSON is not a problem:
{"id":1, "value": "foo", "status":"pending"}
Reading from it, however, yields a JsError for the missing "/status" path.
How can I provide a default with the least possible noise?
(ps: I have an answer which I will post below but I am not really satisfied with it and would upvote and accept any better option)
Play 2.6+
As per @CanardMoussant's answer, starting with Play 2.6 the play-json macro has been improved and offers multiple new features, including using the default values as placeholders when deserializing:
implicit def jsonFormat = Json.using[Json.WithDefaultValues].format[Foo]
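To see the effect, a quick usage sketch (assuming Play JSON 2.6+ and the Foo with a default status from the question):

import play.api.libs.json._

case class Foo(id: String, value: String, status: String = "pending")

implicit val fooFormat: OFormat[Foo] = Json.using[Json.WithDefaultValues].format[Foo]

// The missing "status" field is filled in from the case-class default:
Json.parse("""{"id": "1", "value": "foo"}""").validate[Foo]
// JsSuccess(Foo(1,foo,pending),...)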
For Play versions below 2.6, the best option remains one of the approaches below:
play-json-extra
I found out about a much better solution to most of the shortcomings I had with play-json, including the one in the question: play-json-extra, which uses play-json-extensions internally to solve the particular issue in this question.
It includes a macro which will automatically include the missing defaults in the serializer/deserializer, making refactors much less error-prone!
import play.json.extra.Jsonx

implicit def jsonFormat = Jsonx.formatCaseClass[Foo]
There is more to the library that you may want to check out: play-json-extra
Json transformers
My current solution is to create a JSON Transformer and combine it with the Reads generated by the macro. The transformer is generated by the following method:
object JsonExtensions {
  def withDefault[A](key: String, default: A)(implicit writes: Writes[A]) =
    __.json.update((__ \ key).json.copyFrom((__ \ key).json.pick orElse Reads.pure(Json.toJson(default))))
}
The format definition then becomes:
implicit val fooformats: Format[Foo] = new Format[Foo] {
  import JsonExtensions._
  val base = Json.format[Foo]
  def reads(json: JsValue): JsResult[Foo] = base.compose(withDefault("status", "bidon")).reads(json)
  def writes(o: Foo): JsValue = base.writes(o)
}
and
Json.parse("""{"id":"1", "value":"foo"}""").validate[Foo]
will indeed generate an instance of Foo with the default value applied.
This has 2 major flaws in my opinion:
The defaulted key name is a string and won't get picked up by a refactoring
The value of the default is duplicated and, if changed in one place, will need to be changed manually in the other
The cleanest approach that I've found is to use "or pure", e.g.,
...
((JsPath \ "notes").read[String] or Reads.pure("")) and
((JsPath \ "title").read[String] or Reads.pure("")) and
...
This can be used in the normal implicit way when the default is a constant. When it's dynamic, you need to write a method that creates the Reads and then bring it into scope, à la
implicit val packageReader = makeJsonReads(jobId, url)
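For the constant-default case, here is a self-contained sketch of the "or pure" pattern (the Foo case class, its field names and the "pending" default are assumptions carried over from the question):

import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Foo(id: Int, value: String, status: String = "pending")

// Fall back to the constant default whenever the "status" path is missing
implicit val fooReads: Reads[Foo] = (
  (__ \ "id").read[Int] and
  (__ \ "value").read[String] and
  ((__ \ "status").read[String] or Reads.pure("pending"))
)(Foo.apply _)

Json.parse("""{"id": 1, "value": "foo"}""").validate[Foo]
// JsSuccess(Foo(1,foo,pending),...)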
An alternative solution is to use formatNullable[T] combined with inmap from InvariantFunctor.
import play.api.libs.functional.syntax._
import play.api.libs.json._
implicit val fooFormats = (
  (__ \ "id").format[Int] ~
  (__ \ "value").format[String] ~
  (__ \ "status").formatNullable[String].inmap[String](_.getOrElse("pending"), Some(_))
)(Foo.apply, unlift(Foo.unapply))
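A brief usage sketch of this inmap-based Format, adding only a case class and a round-trip check (the Int id is an assumption chosen to match the combinator above):

import play.api.libs.functional.syntax._
import play.api.libs.json._

case class Foo(id: Int, value: String, status: String = "pending")

implicit val fooFormats: Format[Foo] = (
  (__ \ "id").format[Int] ~
  (__ \ "value").format[String] ~
  // Missing/null status reads as "pending"; writing always emits Some(status)
  (__ \ "status").formatNullable[String].inmap[String](_.getOrElse("pending"), Some(_))
)(Foo.apply, unlift(Foo.unapply))

Json.parse("""{"id": 1, "value": "foo"}""").validate[Foo]
// JsSuccess(Foo(1,foo,pending),...)
Json.toJson(Foo(1, "foo"))
// {"id":1,"value":"foo","status":"pending"}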
I think the official answer should now be to use the WithDefaultValues option that comes with Play JSON 2.6:
implicit def jsonFormat = Json.using[Json.WithDefaultValues].format[Foo]
Edit:
It is important to note that the behaviour differs from the play-json-extra library. For instance, if you have a DateTime parameter with a default value of DateTime.Now, you will now get the startup time of the process, which is probably not what you want, whereas with play-json-extra you got the time of creation from the JSON.
I was just faced with a case where I wanted all JSON fields to be optional (i.e. optional on the user's side), but internally I wanted all fields to be non-optional, with precisely defined default values in case the user does not specify a certain field. This should be similar to your use case.
I'm currently considering an approach which simply wraps the construction of Foo with fully optional arguments:
case class Foo(id: Int, value: String, status: String)

object FooBuilder {
  def apply(id: Option[Int], value: Option[String], status: Option[String]) = Foo(
    id getOrElse 0,
    value getOrElse "nothing",
    status getOrElse "pending"
  )

  val fooReader: Reads[Foo] = (
    (__ \ "id").readNullable[Int] and
    (__ \ "value").readNullable[String] and
    (__ \ "status").readNullable[String]
  )(FooBuilder.apply _)
}
implicit val fooReader = FooBuilder.fooReader

val foo = Json.parse("""{"id": 1, "value": "foo"}""")
  .validate[Foo]
  .get // returns Foo(1, "foo", "pending")
Unfortunately, it requires writing an explicit Reads[Foo] and Writes[Foo], which is probably what you wanted to avoid. One further drawback is that the default value will only be used if the key is missing or the value is null; if the key contains a value of the wrong type, the whole validation again returns a ValidationError.
Nesting such optional JSON structures is not a problem, for instance:
case class Bar(id1: Int, id2: Int)

object BarBuilder {
  def apply(id1: Option[Int], id2: Option[Int]) = Bar(
    id1 getOrElse 0,
    id2 getOrElse 0
  )

  val reader: Reads[Bar] = (
    (__ \ "id1").readNullable[Int] and
    (__ \ "id2").readNullable[Int]
  )(BarBuilder.apply _)

  val writer: Writes[Bar] = (
    (__ \ "id1").write[Int] and
    (__ \ "id2").write[Int]
  )(unlift(Bar.unapply))
}

case class Foo(id: Int, value: String, status: String, bar: Bar)

object FooBuilder {
  implicit val barReader = BarBuilder.reader
  implicit val barWriter = BarBuilder.writer

  def apply(id: Option[Int], value: Option[String], status: Option[String], bar: Option[Bar]) = Foo(
    id getOrElse 0,
    value getOrElse "nothing",
    status getOrElse "pending",
    bar getOrElse BarBuilder.apply(None, None)
  )

  val reader: Reads[Foo] = (
    (__ \ "id").readNullable[Int] and
    (__ \ "value").readNullable[String] and
    (__ \ "status").readNullable[String] and
    (__ \ "bar").readNullable[Bar]
  )(FooBuilder.apply _)

  val writer: Writes[Foo] = (
    (__ \ "id").write[Int] and
    (__ \ "value").write[String] and
    (__ \ "status").write[String] and
    (__ \ "bar").write[Bar]
  )(unlift(Foo.unapply))
}
This probably won't satisfy the "least possible noise" requirement, but why not introduce the new parameter as an Option[String]?
case class Foo(id: String, value: String, status: Option[String] = Some("pending"))
When reading a Foo from an old client, you'll get a None, which I'd then handle (with a getOrElse) in your consumer code.
Or, if you don't like this, introduce an BackwardsCompatibleFoo:
case class BackwardsCompatibleFoo(id: String, value: String, status: Option[String] = Some("pending"))
case class Foo(id: String, value: String, status: String = "pending")
and then turn that one into a Foo to work with further on, avoiding having to deal with this kind of data gymnastics throughout the code.
You may define status as an Option
case class Foo(id: String, value: String, status: Option[String])
use JsPath like so:
(JsPath \ "gender").readNullable[String]