I am using jsoniter-scala to map JSON files to Scala objects (case classes). To keep it simple, assume I have two types of JSON files, and thus two case classes, for instance:
case class JsonTypeOne(name: String, id: Long)
case class JsonTypeTwo(name: String, notes: Seq[String])
When I use
val codec: JsonValueCodec[JsonTypeOne] = JsonCodecMaker.make
or
val codec: JsonValueCodec[JsonTypeTwo] = JsonCodecMaker.make
everything works perfectly and as expected. Now, I'd like to create a function that takes the class as an input parameter and passes it on to jsoniter. In pseudo-code this would be something like this:
def getJsonWithClass(aClass: SomeType, jsonString: String) {
  val codec: JsonValueCodec[aClass] = JsonCodecMaker.make
  readFromString(jsonString)(codec)
}
and would then be called as follows:
getJsonWithClass(JsonTypeOne, json1String)
getJsonWithClass(JsonTypeTwo, json2String)
I tried a number of variations, including defining a trait for the case classes and using it as "SomeType", or using generics, but so far without success.
Any suggestions on how to resolve this issue would be highly appreciated.
As it is not possible to parameterize the codec generation as originally intended (thanks for the clarifications Luis and Andriy), I ended up with a workaround. It's not as cleanly decoupled as if parameterized typing were possible, but at least the coupling is localized in a pattern match and can easily be extended:
def getJsonWithClass[T](aJsonClass: T, jsonString: String): T = {
  val jsonCodec = aJsonClass match {
    case JsonTypeOne => JsonCodecMaker.make[JsonTypeOne]
    case JsonTypeTwo => JsonCodecMaker.make[JsonTypeTwo]
  }
  val jsonObj = readFromString(jsonString)(jsonCodec)
  jsonObj.asInstanceOf[T]
}
This can then be called as in
getJsonWithClass(JsonTypeOne, aJsonTypeOneString)
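For completeness, one route that avoids the pattern match entirely is to let the caller supply the codec implicitly, typically by declaring it once in each companion object. A minimal sketch of that alternative, using the same case classes as above:

import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._

case class JsonTypeOne(name: String, id: Long)
object JsonTypeOne {
  // generated once, picked up implicitly wherever JsonTypeOne is parsed
  implicit val codec: JsonValueCodec[JsonTypeOne] = JsonCodecMaker.make
}

case class JsonTypeTwo(name: String, notes: Seq[String])
object JsonTypeTwo {
  implicit val codec: JsonValueCodec[JsonTypeTwo] = JsonCodecMaker.make
}

def getJsonWithClass[T](jsonString: String)(implicit codec: JsonValueCodec[T]): T =
  readFromString[T](jsonString)

// Called with the target type instead of a class value:
// val one = getJsonWithClass[JsonTypeOne](json1String)
// val two = getJsonWithClass[JsonTypeTwo](json2String)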
Related
The validate method on request.body matches the attribute names and value types of the JSON object against those defined in the model. But if I add an extra attribute to the JSON object and try to validate it, it still passes as a JsSuccess when it shouldn't:
{
  "Name": "Bob",
  "Age": 20,
  "Random_Field_Not_Defined_in_Models": "Test"
}
My Person class is defined as follows:
case class Person(name: String, age: Int)
I'm assuming you've been using the built-in Reads[T] or Format[T] converters that Play gives you via Json.reads[T], e.g.:
import play.api.libs.json._
val standardReads = Json.reads[Person]
While these are super-handy, if you need additional validation, you'll have to define a custom Reads[Person] class; but fortunately we can still leverage the built-in JSON-to-case-class macro to do the basic checking and conversion, and then add an extra layer of custom checks if things seem OK:
val standardReads = Json.reads[Person]

val strictReads = new Reads[Person] {
  val expectedKeys = Set("name", "age")

  def reads(jsv: JsValue): JsResult[Person] = {
    standardReads.reads(jsv).flatMap { person =>
      checkUnwantedKeys(jsv, person)
    }
  }

  private def checkUnwantedKeys(jsv: JsValue, p: Person): JsResult[Person] = {
    val obj = jsv.asInstanceOf[JsObject]
    val keys = obj.keys
    val unwanted = keys.diff(expectedKeys)
    if (unwanted.isEmpty) {
      JsSuccess(p)
    } else {
      JsError(s"Keys: ${unwanted.mkString(",")} found in the incoming JSON")
    }
  }
}
Note how we utilize standardReads first, to make sure we're dealing with something that can be converted to a Person. No need to reinvent the wheel here.
We use flatMap to effectively short-circuit the conversion if we get a JsError from standardReads - i.e. we only call checkUnwantedKeys if needed.
checkUnwantedKeys just uses the fact that a JsObject is really just a wrapper around a Map, so we can easily check the names of the keys against a whitelist.
Note that you could also write that flatMap using a for-comprehension, which starts to look a lot cleaner if you need even more checking stages:
for {
  p  <- standardReads.reads(jsv)
  r1 <- checkUnexpectedFields(jsv, p)
  r2 <- checkSomeOtherStuff(jsv, r1)
  r3 <- checkEvenMoreStuff(jsv, r2)
} yield r3
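To see the strict reader in action, here is a rough usage sketch against JSON like that in the question (field names lower-cased to match the Person case class; the controller snippet assumes a JSON body parser):

val okJson  = Json.parse("""{"name": "Bob", "age": 20}""")
val badJson = Json.parse("""{"name": "Bob", "age": 20, "Random_Field_Not_Defined_in_Models": "Test"}""")

strictReads.reads(okJson)   // JsSuccess(Person(Bob,20))
strictReads.reads(badJson)  // JsError: Keys: Random_Field_Not_Defined_in_Models found in the incoming JSON

// Inside a controller action, e.g. Action(parse.json) { request => ... }:
// request.body.validate[Person](strictReads) match {
//   case JsSuccess(person, _) => Ok(s"Got $person")
//   case err: JsError         => BadRequest(JsError.toJson(err))
// }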
If you want to avoid too much boilerplate, it is possible to build a more generic solution using a little bit of Scala reflection:
import play.api.libs.json._
import scala.reflect.runtime.universe._

def checkedReads[T](underlyingReads: Reads[T])(implicit typeTag: TypeTag[T]): Reads[T] = new Reads[T] {
  def classFields[T: TypeTag]: Set[String] = typeOf[T].members.collect {
    case m: MethodSymbol if m.isCaseAccessor => m.name.decodedName.toString
  }.toSet

  def reads(json: JsValue): JsResult[T] = {
    val caseClassFields = classFields[T]
    json match {
      case JsObject(fields) if (fields.keySet -- caseClassFields).nonEmpty =>
        JsError(s"Unexpected fields provided: ${(fields.keySet -- caseClassFields).mkString(", ")}")
      case _ => underlyingReads.reads(json)
    }
  }
}
Then you can specify your reads instances as:
implicit val reads = checkedReads(Json.reads[Person])
This leverages a fair bit of Scala type magic and also the reflection library (that lets you look at fields on classes).
Rather than relying on a fixed set of fields, the classFields method gets all of the fields for the case class (type param T) dynamically. It looks at all of the members and collects only the case class accessors (otherwise we'd pick up methods like toString). It returns a Set[String] of field names.
You'll notice that checkedReads takes an implicit TypeTag[T]. This is supplied by the compiler at compile time and used by the typeOf method.
The remaining code is fairly self-explanatory. If the incoming JSON matches our first case (it is a JsObject and there are fields not on the case class) then we return a JsError. Otherwise we pass it on to the underlying reader.
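A quick sketch of how the generic version behaves, using the same hypothetical Person and an extra field:

implicit val personReads: Reads[Person] = checkedReads(Json.reads[Person])

Json.parse("""{"name": "Bob", "age": 20}""").validate[Person]
// JsSuccess(Person(Bob,20))

Json.parse("""{"name": "Bob", "age": 20, "extra": "Test"}""").validate[Person]
// JsError: Unexpected fields provided: extra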
Consider this example using Play's JSON API (play.api.libs.json):
case class FooJson(
  // lots of other fields omitted
  location: Option[LocationJson]
)

object FooJson {
  implicit val writes = Json.writes[FooJson]
}
and
case class LocationJson(latitude: Double, longitude: Double)

object LocationJson {
  implicit val writes = Json.writes[LocationJson]
}
If location is None, the resulting JSON won't have a location field at all. This is fine and understandable. But if I wanted to, for some reason (say, to make my API more self-documenting), how could I explicitly output null in the JSON?
{
  "location": null
}
I also tried defining the field as location: LocationJson and passing option.orNull to it, but it does not work (scala.MatchError: null at play.api.libs.json.OWrites$$anon$2.writes). For non-custom types such as String or Double, this approach would produce null in JSON output.
So, while using Json.writes[FooJson] as shown above (or something equally simple, i.e. not having to write a custom Writes implementation), is there a clean way to include nulls in JSON?
What I'm asking is analogous to JsonInclude.Include.ALWAYS in the Jackson library (also Jackson's default behaviour). Similarly in Gson this would be trivial
(new GsonBuilder().serializeNulls().create()).
Play 2.4.4
Greg Methvin, a Play committer, wrote this answer to me in a related GitHub issue:
The JSON macros only support one way of encoding optional values, which is to omit None values from the JSON. This is not a bug but rather a limitation of the implementation. If you want to include nulls you're unfortunately going to have to implement your own Writes. I do think we should try to provide more configurability for the macros though.
In this case, I'll let Play exclude the field when the value is null, even if it slightly sacrifices API consistency and self-documentation. It's such a minor thing (in this particular API) that it doesn't warrant the amount of boilerplate a custom Writes would require for a case class with a dozen values.
I'm hoping they do make this more configurable in future Play versions.
Hello from the future.
As of Play 2.7, a fairly simple solution was introduced for the automated JSON codecs: bring an appropriate implicit JsonConfiguration into scope for the Format/Reads/Writes. The following configuration writes null for empty Options instead of omitting the fields entirely.
import play.api.libs.json._
implicit val config = JsonConfiguration(optionHandlers = OptionHandlers.WritesNull)
implicit val residentWrites = Json.writes[Resident]
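For example, with a hypothetical Resident case class the None value now comes out as an explicit null instead of the field being dropped:

case class Resident(name: String, officeId: Option[Long])

// with the config and residentWrites above in scope:
Json.toJson(Resident("Alice", officeId = None))
// {"name":"Alice","officeId":null}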
Here's a way to do it:
object MyWrites extends DefaultWrites {
  override def OptionWrites[T](implicit fmt: Writes[T]): Writes[Option[T]] = new Writes[Option[T]] {
    override def writes(o: Option[T]): JsValue = {
      o match {
        case Some(a) => Json.toJson(a)(fmt)
        case None    => JsNull
      }
    }
  }
}
This overrides the default implementation, which does not emit an element at all. I used it with your sample code:
case class FooJson(
  // ...
  location: Option[LocationJson]
)

case class LocationJson(latitude: Double, longitude: Double)

object LocationJson {
  implicit val writes = Json.writes[LocationJson]
}

implicit val fooJsonWriter: Writes[FooJson] = new Writes[FooJson] {
  override def writes(o: FooJson): JsValue = {
    JsObject(Seq(
      "location" -> Json.toJson(o.location)
      // Additional fields go here.
    ))
  }
}
Json.toJson(FooJson(None))
And I got this result: res0: play.api.libs.json.JsValue = {"location":null}.
If we have null values, we have to declare the members as Option in the case class, which resolves the issue:
case class Response(
  name: String,
  age: Option[Int]
)

object Response {
  implicit val format = Json.format[Response]
}
Here Option does the work for us: if the JSON value for age comes back as null, it is simply read as None.
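A rough sketch of the behaviour with the format above in scope:

Json.parse("""{"name": "Bob", "age": null}""").validate[Response]
// JsSuccess(Response(Bob,None))

Json.parse("""{"name": "Bob"}""").validate[Response]
// JsSuccess(Response(Bob,None)) -- a missing field is handled the same way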
I'm new to Scala and the Play framework.
Why does Scala not have something like this?
class Customer(idx: Int, emailx: String) {
  val id: Int = idx
  val email: String = emailx
}

....

def customers = Action {
  val customer = new Customer(1, "Customer1")
  Ok(Json.toJson(customer))
}
I like the Play framework (with Scala, its productivity). But why should I map each field of my object manually to a JSON field? Was it so hard for Scala to implement this feature, as in Java or C#? Even PHP has json_encode.
Is there any way to achieve this simple goal (returning an object as JSON) without any additional manipulation?
Macros are slick and perfect for generating formats for simple case classes:
implicit val jsonFormat = Json.format[Customer]
Typically you put this declaration in the companion object of the type you are generating a format for. That way it is implicitly in scope in any file where you import your type (Customer). Like this:
case class Customer(...)

object Customer {
  implicit val jsonFormat = Json.format[Customer]
}
Then in your controller you can do
Json.toJson(customer)
which will produce the JsValue type expected by Play.
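Putting the pieces together, with the Customer from the question rewritten as a case class, a rough end-to-end sketch:

import play.api.libs.json._
import play.api.mvc._

case class Customer(id: Int, email: String)

object Customer {
  // one line of boilerplate; gives both Reads and Writes
  implicit val jsonFormat: OFormat[Customer] = Json.format[Customer]
}

// in a controller:
def customers = Action {
  val customer = Customer(1, "Customer1")
  Ok(Json.toJson(customer)) // responds with {"id":1,"email":"Customer1"}
}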
For my classes I define a converter, so that I can write exactly what you have written, e.g. Json.toJson(customer); the converter, though simple, does currently have to be written once per class. E.g.:
implicit val customerWrites = new Writes[Customer] {
  def writes(customer: Customer) = Json.obj(
    "id"    -> customer.id,
    "email" -> customer.email
  )
}
Perhaps macros, into which I have not delved, could do this more automatically...
I need to serialize/deserialize a Scala class with a structure something like the following:
@JsonIgnoreProperties(ignoreUnknown = true, value = Array("body"))
case class Example(body: Array[Byte]) {
  lazy val isNativeText = bodyIsNativeText
  lazy val textEncodedBody = if (isNativeText) new String(body, "UTF-8") else Base64.encode(body)

  def this(isNativeText: Boolean, textEncodedBody: String) =
    this(if (isNativeText) textEncodedBody.getBytes("UTF-8") else Base64.decode(textEncodedBody))

  def bodyIsNativeText: Boolean = // determine if the body was natively a string or not
}
Its main member is an array of bytes, which MIGHT represent a UTF-8 encoded text string, but might not. The primary constructor accepts an array of bytes, but there is an alternate constructor that accepts a string, plus a flag indicating whether this string is base64-encoded binary data or the actual native text we want to store.
For serializing to a JSON object, I want to store the body as a native string rather than a base64-encoded string if it is native text. That's why I use @JsonIgnoreProperties to not include the body property, and instead have a textEncodedBody that gets echoed out in the JSON.
The problem comes when I try to deserialize it like so:
val e = Json.parse[Example]("""{'isNativeText': true, 'textEncodedBody': 'hello'}""")
I receive the following error:
com.codahale.jerkson.ParsingException: Invalid JSON. Needed [body], but found [isNativeText, textEncodedBody].
Clearly, I have a constructor that will work...it just is not the default one. How can I force Jerkson to use this non-default constructor?
EDIT: I've attempted to use both the @JsonProperty and @JsonCreator annotations, but Jerkson appears to disregard both of them.
EDIT2: Looking over the Jerkson case class serialization source code, it looks like a case class method with the same name as one of its fields will be used in the way that a @JsonProperty would function, that is, as a JSON getter. If I could do that, it would solve my problem. Not being super familiar with Scala, I have no idea how to do that; is it possible for a case class to have a user-defined method with the same name as one of its fields?
For reference, here is the code that leads me to this conclusion:
private val methods = klass.getDeclaredMethods
  .filter { _.getParameterTypes.isEmpty }
  .map { m => m.getName -> m }.toMap

def serialize(value: A, json: JsonGenerator, provider: SerializerProvider) {
  json.writeStartObject()
  for (field <- nonIgnoredFields) {
    val methodOpt = methods.get(field.getName)
    val fieldValue: Object = methodOpt.map { _.invoke(value) }.getOrElse(field.get(value))
    if (fieldValue != None) {
      val fieldName = methodOpt.map { _.getName }.getOrElse(field.getName)
      provider.defaultSerializeField(if (isSnakeCase) snakeCase(fieldName) else fieldName, fieldValue, json)
    }
  }
  json.writeEndObject()
}
Correct me if I'm wrong, but it looks like Jackson/Jerkson will not support arbitrarily nested JSON. There's an example on the wiki that uses nesting, but it looks like the target class must have nested classes corresponding to the nested JSON.
Anyway, if you're not using nesting with your case classes, then simply declaring a second case class and a couple of implicit conversions should work just fine:
case class Example(body: Array[Byte]) {
  // Note that you can just inline the body of bodyIsNativeText here
  lazy val isNativeText: Boolean = // determine if the body was natively a string or not
}

case class ExampleRaw(isNativeText: Boolean, textEncodedBody: String)

implicit def exampleToExampleRaw(ex: Example) = ExampleRaw(
  ex.isNativeText,
  if (ex.isNativeText) new String(ex.body, "UTF-8")
  else Base64.encode(ex.body)
)

implicit def exampleRawToExample(raw: ExampleRaw) = Example(
  if (raw.isNativeText) raw.textEncodedBody.getBytes("UTF-8")
  else Base64.decode(raw.textEncodedBody)
)
Now you should be able to do this:
val e: Example = Json.parse[ExampleRaw](
  """{'isNativeText': true, 'textEncodedBody': 'hello'}"""
)
You could keep the original methods and annotations you added so that JSON generation continues to work with the Example type, or you could just convert it with a type ascription (which triggers the implicit conversion):
generate(Example(data): ExampleRaw)
Update:
To help catch errors you might want to do something like this too:
case class Example(body: Array[Byte]) {
  // Note that you can just inline the body of bodyIsNativeText here
  lazy val isNativeText: Boolean = // determine if the body was natively a string or not
  lazy val doNotSerialize: String = throw new Exception("Need to convert Example to ExampleRaw before serializing!")
}
That should cause an exception to be thrown if you accidentally pass an instance of Example instead of ExampleRaw to a generate call.