Json "Validate" for Play - json

The validate method on request.body matches the attribute names and value types of the JSON object against those defined in the model. However, if I add an extra attribute to the JSON object and try to validate it, it still passes as a JsSuccess when it shouldn't.
{
  "Name": "Bob",
  "Age": 20,
  "Random_Field_Not_Defined_in_Models": "Test"
}
My Person class is defined as follows:
case class Person(name: String, age: Int)

I'm assuming you've been using the built-in Reads[T] or Format[T] converters that Play gives you via Json.reads[T], e.g.:
import play.api.libs.json._
val standardReads = Json.reads[Person]
While these are super-handy, if you need additional validation, you'll have to define a custom Reads[Person] class; but fortunately we can still leverage the built-in JSON-to-case-class macro to do the basic checking and conversion, and then add an extra layer of custom checks if things seem OK:
val standardReads = Json.reads[Person]

val strictReads = new Reads[Person] {
  val expectedKeys = Set("name", "age")

  def reads(jsv: JsValue): JsResult[Person] = {
    standardReads.reads(jsv).flatMap { person =>
      checkUnwantedKeys(jsv, person)
    }
  }

  private def checkUnwantedKeys(jsv: JsValue, p: Person): JsResult[Person] = {
    val obj = jsv.asInstanceOf[JsObject]
    val keys = obj.keys
    val unwanted = keys.diff(expectedKeys)
    if (unwanted.isEmpty) {
      JsSuccess(p)
    } else {
      JsError(s"Keys: ${unwanted.mkString(",")} found in the incoming JSON")
    }
  }
}
Note how we utilize standardReads first, to make sure we're dealing with something that can be converted to a Person. No need to reinvent the wheel here.
We use flatMap to effectively short-circuit the conversion if we get a JsError from standardReads - i.e. we only call checkUnwantedKeys if needed.
checkUnwantedKeys just uses the fact that a JsObject is really just a wrapper around a Map, so we can easily check the names of the keys against a whitelist.
Note that you could also write that flatMap using a for-comprehension, which starts to look a lot cleaner if you need even more checking stages:
for {
  p  <- standardReads.reads(jsv)
  r1 <- checkUnexpectedFields(jsv, p)
  r2 <- checkSomeOtherStuff(jsv, r1)
  r3 <- checkEvenMoreStuff(jsv, r2)
} yield r3
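For illustration, here is a minimal usage sketch of strictReads (the JSON literal is mine, with lowercase keys so that the standard Reads succeeds and only the extra field trips the check):

import play.api.libs.json._

// Extra field present: the strict check rejects the payload.
Json.parse("""{ "name": "Bob", "age": 20, "random_field": "Test" }""").validate(strictReads)
// expected: JsError, because "random_field" is not in expectedKeys

// Only the expected keys: validation succeeds.
Json.parse("""{ "name": "Bob", "age": 20 }""").validate(strictReads)
// expected: JsSuccess(Person("Bob", 20))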

If you want to avoid too much boilerplate, it is possible to make a more generic solution using a little bit of Scala reflection:
import play.api.libs.json._
import scala.reflect.runtime.universe._
def checkedReads[T](underlyingReads: Reads[T])(implicit typeTag: TypeTag[T]): Reads[T] = new Reads[T] {
  def classFields[T: TypeTag]: Set[String] = typeOf[T].members.collect {
    case m: MethodSymbol if m.isCaseAccessor => m.name.decodedName.toString
  }.toSet

  def reads(json: JsValue): JsResult[T] = {
    val caseClassFields = classFields[T]
    json match {
      case JsObject(fields) if (fields.keySet -- caseClassFields).nonEmpty =>
        JsError(s"Unexpected fields provided: ${(fields.keySet -- caseClassFields).mkString(", ")}")
      case _ => underlyingReads.reads(json)
    }
  }
}
Then you can specify your reads instances as:
implicit val reads = checkedReads(Json.reads[Person])
This leverages a fair bit of Scala type magic and also the reflection library (that lets you look at fields on classes).
Rather than relying on a fixed set of fields the classFields method gets all of the fields dynamically for the case class (type param T). It looks at all of the members and collects only the case class accessors (otherwise we'd pick up methods like toString). It returns a Set[String] of field names.
You'll notice that the checkedReads takes an implicit TypeTag[T]. This is supplied by the compiler at compile time and used by the typeOf method.
The remaining code is fairly self explanatory. If the incoming json matches our first case (it is a JsObject and there are fields not on the case class) then we return a JsError. Otherwise we pass it on to the underlying reader.
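For illustration, a minimal usage sketch (the JSON literal is mine), relying on the implicit checked reads instance defined above; the extra field is rejected before the underlying Reads ever runs:

Json.parse("""{ "name": "Bob", "age": 20, "extra": "Test" }""").validate[Person]
// expected: JsError("Unexpected fields provided: extra")

Json.parse("""{ "name": "Bob", "age": 20 }""").validate[Person]
// expected: JsSuccess(Person("Bob", 20))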

Related

Scala/Play: modify Json.format macro behaviour for single field

I have a big case class with 10+ fields that represents a JSON payload coming from the user. Most of the fields are optional, so I'm using Option in those cases (Option[String] for string fields). That is a nice approach, until I need an optional sequence. I think that writing Option[Seq[String]] is weird, because an empty sequence is enough to show that there is no data (for whatever reason).
I could handle it manually, though:
implicit val reads = new Reads[MyCaseClass] {
  def reads(js: JsValue): JsResult[MyCaseClass] = {
    JsSuccess(MyCaseClass(
      (js \ "unit_code").as[String],
      // other fields omitted
      (js \ "positions").asOpt[Seq[String]] match {
        case Some(seq: Seq[String]) => seq
        case None => Seq.empty[String]
      }
    ))
  }
}
But I don't want to write all this by hand. There could be mistakes, I'd need to test it separately, and it certainly takes much more time than writing implicit val f = Json.format[MyCaseClass].
Is there any way to handle just one field separately, while leaving the handling of the other fields to the default macro?
Thanks to user 'cchantep' for pointing me to JSON transformers.
So this is how I solved the issue:
Case class:
case class MyCaseClass(unit_code: String, positions: Seq[String] = Seq.empty)
Companion object:
object MyCaseClass {
  private val readsTransformer: Reads[JsObject] = __.json.update(
    __.read[JsObject].map { o =>
      if (o.keys.exists(p => p.equals("positions"))) {
        o
      } else {
        o ++ Json.obj("positions" -> JsArray())
      }
    }
  )

  implicit val readsImplicit: Reads[MyCaseClass] = readsTransformer.andThen(Json.reads[MyCaseClass])
  implicit val writesImplicit: OWrites[MyCaseClass] = Json.writes[MyCaseClass]
}
That looks a bit cumbersome, but it is possible to write a common function that builds such a transformer for a given field, so that each companion object wouldn't be so verbose.
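For illustration, a minimal sketch of such a helper; the name withDefault is mine, not from the answer:

import play.api.libs.json._

// Produces a transformer that adds `field -> default` when the key is absent.
def withDefault(field: String, default: JsValue): Reads[JsObject] =
  __.json.update(
    __.read[JsObject].map { o =>
      if (o.keys.contains(field)) o else o + (field -> default)
    }
  )

// Usage in a companion object:
implicit val readsImplicit: Reads[MyCaseClass] =
  withDefault("positions", JsArray()).andThen(Json.reads[MyCaseClass])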

Explicitly output JSON null in case of missing optional value

Consider this example using Play's JSON API (play.api.libs.json):
case class FooJson(
  // lots of other fields omitted
  location: Option[LocationJson]
)

object FooJson {
  implicit val writes = Json.writes[FooJson]
}
and
case class LocationJson(latitude: Double, longitude: Double)

object LocationJson {
  implicit val writes = Json.writes[LocationJson]
}
If location is None, the resulting JSON won't have a location field at all. This is fine and understandable. But if, for some reason (say, to make my API more self-documenting), I wanted to explicitly output null in JSON, how could I do that?
{
  "location": null
}
I also tried defining the field as location: LocationJson and passing option.orNull to it, but it does not work (scala.MatchError: null at play.api.libs.json.OWrites$$anon$2.writes). For non-custom types such as String or Double, this approach would produce null in JSON output.
So, while using Json.writes[FooJson] as shown above (or something equally simple, i.e. not having to write a custom Writes implementation), is there a clean way to include nulls in JSON?
What I'm asking is analogous to JsonInclude.Include.ALWAYS in the Jackson library (also Jackson's default behaviour). Similarly in Gson this would be trivial
(new GsonBuilder().serializeNulls().create()).
Play 2.4.4
Greg Methvin, a Play committer, wrote this answer to me in a related GitHub issue:
The JSON macros only support one way of encoding optional values, which is to omit None values from the JSON. This is not a bug but rather a limitation of the implementation. If you want to include nulls you're unfortunately going to have to implement your own Writes. I do think we should try to provide more configurability for the macros though.
In this case, I'll let Play exclude this field when the value is null, even if it slightly sacrifices API consistency and self-documentability. It's still such a minor thing (in this particular API) that it doesn't warrant uglifying the code as much as a custom Writes would take for a case class with a dozen values.
I'm hoping they do make this more configurable in future Play versions.
Hello from the future.
As of Play 2.7, a fairly simple solution was introduced for automated JSON codecs. We can introduce the appropriate implicit value for JsonConfiguration in the scope for the Format/Reads/Writes. The following configuration will write nulls for empty Options instead of omitting the fields entirely.
import play.api.libs.json._
implicit val config = JsonConfiguration(optionHandlers = OptionHandlers.WritesNull)
implicit val residentWrites = Json.writes[Resident]
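For illustration, a minimal sketch of the effect (the Resident case class here is my own stand-in, not from the answer):

import play.api.libs.json._

case class Resident(name: String, email: Option[String])

implicit val config = JsonConfiguration(optionHandlers = OptionHandlers.WritesNull)
implicit val residentWrites = Json.writes[Resident]

Json.toJson(Resident("Bob", None))
// expected: {"name":"Bob","email":null} rather than the default {"name":"Bob"}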
Here's a way to do it:
object MyWrites extends DefaultWrites {
  override def OptionWrites[T](implicit fmt: Writes[T]): Writes[Option[T]] = new Writes[Option[T]] {
    override def writes(o: Option[T]): JsValue = {
      o match {
        case Some(a) => Json.toJson(a)(fmt)
        case None    => JsNull
      }
    }
  }
}
This overrides the default implementation, which does not emit an element at all for None. I used it with your sample code:
case class FooJson(
  // ...
  location: Option[LocationJson]
)

case class LocationJson(latitude: Double, longitude: Double)

object LocationJson {
  implicit val writes = Json.writes[LocationJson]
}

implicit val fooJsonWriter: Writes[FooJson] = new Writes[FooJson] {
  override def writes(o: FooJson): JsValue = {
    JsObject(Seq(
      "location" -> Json.toJson(o.location)
      // Additional fields go here.
    ))
  }
}

Json.toJson(FooJson(None))
And got this result: res0: play.api.libs.json.JsValue = {"location":null}.
If we expect null values, we have to wrap the corresponding members of the case class in Option, which resolves the issue:
case class Response(
  name: String,
  age: Option[Int]
)

object Response {
  implicit val format = Json.format[Response]
}
Here Option is the answer for us: if the JSON response comes in with age as null, this handles it for us.
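For illustration, a minimal sketch of the behaviour described above, assuming the Response definition from the answer:

import play.api.libs.json._

Json.parse("""{ "name": "Bob", "age": null }""").validate[Response]
// expected: JsSuccess(Response("Bob", None)) - the explicit null is read as None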

Handling Pk[Int] values in spray-json

[edit]
So, I got a quick and dirty solution, thanks to Edmondo1984; I don't know if it's the best one. I don't handle null values with pattern matching in the write function. You can read more details about my problem after this edit. Here is my code now:
object DBNames extends DefaultJsonProtocol {
  implicit val pkFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
    def write(obj: Pk[Int]): JsValue = JsNumber(obj.get)
    def read(json: JsValue): Pk[Int] = json.asJsObject.getFields("id") match {
      case Seq(JsNumber(id)) => Id(id.toInt) // wrap the assigned key in anorm's Id
      case _ => throw new DeserializationException("Int expected")
    }
  }

  implicit val nameFormat = jsonFormat2(Name)
}
jsonFormat2 will implicitly use pkFormat to parse Pk[Int] values.
In my controller class I have this:
def listNames() = Action {
  val names = DBNames.findAll()
  implicit val writer = DBNames.nameFormat
  val json = names.toJson
  Ok(json.toString()).as("application/json")
}
I had to get the nameFormat from my model and make it implicit, so names.toJson could use it to serialize the Seq[Name].
[/edit]
I'm trying to use the Play Framework with Scala. I'm new to Scala and to Play, and everything seems nice, but I've been working on this problem for several hours and haven't found a solution.
I have a case class:
case class Name (id: Pk[Int], name: String)
And an object to deal with MySQL. I created an implicit val nameFormat = jsonFormat2(Name) to deal with JSON.
object DBNames extends DefaultJsonProtocol {
  implicit val nameFormat = jsonFormat2(Name)

  var parser = {
    get[Pk[Int]]("id") ~
    get[String]("name") map {
      case id ~ name => Name(id, name)
    }
  }

  def findAll(): Seq[Name] = {
    DB.withConnection { implicit connection =>
      SQL("select * from names").as(DBNames.parser *)
    }
  }

  def create(name: Name) {
    DB.withConnection { implicit connection =>
      SQL("insert into names (name) values ({name})").on(
        'name -> name.name
      ).executeUpdate()
    }
  }
}
But when I try to compile it, Play! gives me this result:
[error] D:\ProjetosJVM\TaskList\app\models\Names.scala:20: could not find implicit value for evidence parameter of type models.DBNames.JF[anorm.Pk[Int]]
It seems like it couldn't find a way to parse the id value, since it is a Pk[Int].
So, after reading this: https://github.com/spray/spray-json, I didn't find a way to parse it without creating a complete object parser like they show in the documentation:
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit object ColorJsonFormat extends RootJsonFormat[Color] {
    def write(c: Color) = JsObject(
      "name" -> JsString(c.name),
      "red" -> JsNumber(c.red),
      "green" -> JsNumber(c.green),
      "blue" -> JsNumber(c.blue)
    )
    def read(value: JsValue) = {
      value.asJsObject.getFields("name", "red", "green", "blue") match {
        case Seq(JsString(name), JsNumber(red), JsNumber(green), JsNumber(blue)) =>
          new Color(name, red.toInt, green.toInt, blue.toInt)
        case _ => throw new DeserializationException("Color expected")
      }
    }
  }
}
I have a "big" (actually small) project where I want to make most of things work with Ajax, so I think this is not a good way to do it.
How can I deal with JSON objects in this project, where almost all case classes will have a "JSON parser", without creating large ammounts of code, like the snippet above? And also, how can I make it work with an Seq[Name]?
You don't need to write a complete parser. The compiler says:
[error] D:\ProjetosJVM\TaskList\app\models\Names.scala:20: could not find implicit
value for evidence parameter of type models.DBNames.JF[anorm.Pk[Int]]
The Scala compiler is looking for an implicit parameter of type JF[anorm.Pk[Int]] and there is no such implicit in scope. What is JF[anorm.Pk[Int]]? Well, you need to know the library, and I didn't, so I browsed the spray-json source and found out:
trait StandardFormats {
  this: AdditionalFormats =>

  private[json] type JF[T] = JsonFormat[T] // simple alias for reduced verbosity
So JF[T] is just an alias for JsonFormat[T]. It all makes sense: Pk[Int] is a class coming from Anorm, and spray-json provides out-of-the-box JSON support for standard types but does not even know Anorm exists. So you have to write your own support for Pk[Int] and make it implicit in scope.
You will have code like the following:
object DBNames extends DefaultJsonProtocol {
  implicit val pkFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
    // implementation
  }
  // rest of your code
}
If you have just started with Scala, you will probably have to read more about implicits and their resolution. I am providing you with a minimal answer: once you have provided the right implementation, your code will compile. I suggest you refer to the documentation of anorm.Pk and of JsonFormat to understand how to implement it correctly for your type.
Pk looks like scala.Option, and in the StandardFormats source inside spray-json you will find the JsonFormat implementation for Option, which you can copy from.
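For illustration, a minimal sketch of such a format, modelled on spray-json's Option support and assuming anorm's Id / NotAssigned constructors (this is not the poster's actual code):

import anorm.{Pk, Id, NotAssigned}
import spray.json._

implicit val pkIntFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
  def write(pk: Pk[Int]): JsValue = pk match {
    case Id(value)   => JsNumber(value) // assigned key -> plain number
    case NotAssigned => JsNull          // unassigned key -> null
  }
  def read(json: JsValue): Pk[Int] = json match {
    case JsNumber(value) => Id(value.toInt)
    case JsNull          => NotAssigned
    case _               => throw new DeserializationException("Number or null expected")
  }
}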

Create a generic Json serialization function

Is it possible to create a generic function in Scala, using Play Framework 2.2, that will serialize an arbitrary object to JSON, without having to be supplied a writer or formatter?
For instance, this non-generic code will create a JSON response given a Customer:
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Customer(id: Int, name: String)

object scratch {
  val p = Customer(1, "n")
  //> p : Customer = Customer(1,n)

  def createJsonResponseCustomer(data: Customer) = {
    implicit val formatter = Json.format[Customer]
    Json.obj("success" -> true, "data" -> Json.toJson[Customer](data))
  }

  createJsonResponseCustomer(p)
  //> res0: play.api.libs.json.JsObject = {"success":true,"data":{"id":1,"name":"n"}}
}
To avoid having to define the formatter for each different object, I'd like to create a generic function like this:
def createJsonResponse[T](data: T) = {
  implicit val formatter = Json.format[T]
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
}
But this attempt produces the error No unapply function found at Json.format[T].
In other words, this works:
def getFormatter(c: Customer) = Json.format[Customer]
but this doesn't:
def getFormatterGeneric[T](c: T) = Json.format[T]
Is there any way around this?
You need to define the formatter somewhere, for each type you wish to read or write. This is because the formatter instances are resolved at compile time, not at runtime. This is a good thing, because it means trying to serialize a type that does not have a serializer becomes a compile-time error, not a runtime one.
Instead of defining the formatters on the fly, define them in a module that you can reuse, e.g.
object JsonFormatters {
  implicit val customerWrites: Format[Customer] = Json.format[Customer]
}
Then import JsonFormatters._ in the scope where you want to write some JSON.
Now, you can write a generic method similar to what you wanted: you just have to specify the requirement for a formatter in the signature of your method. In practice, this is an implicit parameter of type Writes[T].
def createJsonResponse[T](data: T)(implicit writes: Writes[T]) =
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
You can also write this method signature using context bound syntax, i.e.
def createJsonResponse[T : Writes](data: T) = ...
This requires that there is an instance of Writes[T] in scope; but the compiler will choose the correct instance for you based on the type T, rather than you resolving it explicitly.
Note that Writes[T] is a supertype of Format[T]; since you are only writing JSON in this method, there's no need to specify a requirement for Format[T], which would also give you Reads[T].
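For illustration, a minimal usage sketch assuming the JsonFormatters object above:

import JsonFormatters._

createJsonResponse(Customer(1, "n"))
// expected: {"success":true,"data":{"id":1,"name":"n"}}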