Argonaut, custom JSON format mapping

Model:
case class DateValue(year: Option[Int] = None, month: Option[Int] = None)
Argonaut-based decoder:
implicit val dateValueDecode = casecodec2(DateValue.apply, DateValue.unapply)("year", "month")
This allows parsing a format like this:
{
  "year": "2013",
  "month": "10"
}
Now I want to simplify the JSON format and use
"2013/10"
instead, but leave my model unchanged. How can I accomplish this with Argonaut?

The following is off the cuff but it should work. Note that I'm assuming that you just want an empty string on either side of the divider when that value is empty, and I'm not validating that the month value is between 1 and 12.
import argonaut._, Argonaut._
import scalaz._, Scalaz._
case class DateValue(year: Option[Int] = None, month: Option[Int] = None)
object YearMonth {
  def unapplySeq(s: String) =
    """((?:\d\d\d\d)?)/((?:\d?\d)?)""".r.unapplySeq(s).map {
      case List("", "") => List(None, None)
      case List(y, "") => List(Some(y.toInt), None)
      case List("", m) => List(None, Some(m.toInt))
      case List(y, m) => List(Some(y.toInt), Some(m.toInt))
    }
}
implicit val DateValueCodecJson: CodecJson[DateValue] = CodecJson(
  // Encode as "year/month", leaving either side empty when the value is None.
  { case DateValue(year, month) =>
      jString(~year.map(_.toString) + "/" + ~month.map(_.toString)) },
  c => c.as[String].flatMap {
    case YearMonth(y, m) => DecodeResult.ok(DateValue(y, m))
    case _ => DecodeResult.fail("Not a valid date value!", c.history)
  }
)
And then:
val there = Parse.decodeValidation[DateValue](""""2013/12"""")
val back = there.map(DateValueCodecJson.encode)
Which gives us:
scala> println(there)
Success(DateValue(Some(2013),Some(12)))
scala> println(back)
Success("2013/12")
As expected.
The trick is to provide your own encoding and decoding functions to CodecJson.apply. The encoding function is very straightforward—it just takes something of the encoded type and returns a Json value. The decoding method is a little more complicated, since it takes an HCursor and returns a DecodeResult, but these are also pretty easy to work with.
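For reference, the same pattern works for any type: hand CodecJson.apply an encoding function and a decoding function. Here is a minimal, hypothetical sketch (the Tag type is made up purely for illustration):
import argonaut._, Argonaut._

// Hypothetical wrapper type, encoded as a "#"-prefixed JSON string.
case class Tag(name: String)

implicit val TagCodecJson: CodecJson[Tag] = CodecJson(
  // encoding: take a Tag, return a Json value
  tag => jString("#" + tag.name),
  // decoding: read the cursor as a String and check the prefix
  c => c.as[String].flatMap {
    case s if s.startsWith("#") => DecodeResult.ok(Tag(s.drop(1)))
    case _ => DecodeResult.fail("Not a valid tag!", c.history)
  }
)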

Related

Scala Play: List to Json-Array

I've got a List holding some Personalization objects. The latter is defined like this:
sealed case class Personalization(firstname: String, lastname: String, keycardId: String)
I need to map this list to a Json-Array structure which has to look like this:
"personalization": [
{
"id": "firstname",
"value": "John"
},
{
"id": "lastname",
"value": "Doe"
}...
I am struggling with the part of mapping the field information to id/value pairs. Normally, I would create a play.api.libs.json.Format for the Personalization class and let it map automatically (Json.format[Personalization]), but this time I need to create an array where an entry can hold n attributes.
Is there a way to accomplish this with the Play Framework's JSON library?
Any input is much appreciated, thank you!
Writing such a JSON representation is not that complex, using Writes.transform.
import play.api.libs.json._

case class Personalization(firstname: String, lastname: String, keycardId: String) // No need to seal a case class

implicit def writes: Writes[Personalization] = {
  val tx: JsValue => JsValue = {
    case JsObject(fields) => Json.toJson(fields.map {
      case (k, v) => Json.obj("id" -> k, "value" -> v)
    })
    case jsUnexpected => jsUnexpected // doesn't happen with OWrites
  }

  Json.writes[Personalization].transform(tx)
}
Which can be tested as below.
val personalization = Personalization(
  firstname = "First",
  lastname = "Last",
  keycardId = "Foo")

val jsonRepr = Json.toJson(personalization)
// => [{"id":"firstname","value":"First"},{"id":"lastname","value":"Last"},{"id":"keycardId","value":"Foo"}]
Reading is a little bit tricky:
implicit def reads: Reads[Personalization] = {
  type Field = (String, Json.JsValueWrapper)

  val fieldReads = Reads.seq(Reads[Field] { js =>
    for {
      id <- (js \ "id").validate[String]
      v <- (js \ "value").validate[JsValue]
    } yield id -> v
  })

  val underlying = Json.reads[Personalization]

  Reads[Personalization] { js =>
    js.validate(fieldReads).flatMap { fields =>
      Json.obj(fields: _*).validate(underlying)
    }
  }
}
Which can be tested as below.
Json.parse("""[
  {"id":"firstname","value":"First"},
  {"id":"lastname","value":"Last"},
  {"id":"keycardId","value":"Foo"}
]""").validate[Personalization]
// => JsSuccess(Personalization(First,Last,Foo),)
Note that this approach can be used for any case class format.
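As a side note, the two definitions above can also be bundled into a single Format (a small sketch; if you do this, drop the implicit modifier from the separate reads and writes to avoid ambiguous implicits):
implicit def personalizationFormat: Format[Personalization] =
  Format(reads, writes)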
There is probably a more elegant way than the one I used, but you can use the following snippet:
import play.api.libs.json._
import scala.util.Try

case class Field(id: String, value: String)

object Field {
  implicit val fieldFormatter: Format[Field] = Json.format[Field]
}

sealed case class Personalization(firstname: String, lastname: String, keycardId: String)

object Personalization {
  implicit val personalizationFormatter: Format[Personalization] = new Format[Personalization] {
    override def reads(json: JsValue): JsResult[Personalization] =
      Try {
        val data = (json \ "personalization").as[JsValue]
        data match {
          case JsArray(value) =>
            val fields = value.map(_.as[Field]).map(f => f.id -> f.value).toMap
            val firstname = fields.getOrElse("firstname", throw new IllegalArgumentException("Mandatory field firstname is absent."))
            val lastname = fields.getOrElse("lastname", throw new IllegalArgumentException("Mandatory field lastname is absent."))
            val keycardId = fields.getOrElse("keycardId", throw new IllegalArgumentException("Mandatory field keycardId is absent."))
            Personalization(firstname, lastname, keycardId)
          case _ => throw new IllegalArgumentException("Incorrect json format for Personalization.")
        }
      }.toEither.fold(e => JsError(e.getMessage), JsSuccess(_))

    override def writes(o: Personalization): JsValue = {
      val fields = List(Field("firstname", o.firstname), Field("lastname", o.lastname), Field("keycardId", o.keycardId))
      JsObject(List("personalization" -> Json.toJson(fields)))
    }
  }
}
It converts {"personalization":[{"id":"firstname","value":"John"},{"id":"lastname","value":"Doe"},{"id":"keycardId","value":"1234"}]} to Personalization(John,Doe,1234) and vice versa.
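A quick round-trip check, assuming the definitions above are in scope, might look like this:
val parsed = Json.parse(
  """{"personalization":[
       {"id":"firstname","value":"John"},
       {"id":"lastname","value":"Doe"},
       {"id":"keycardId","value":"1234"}
     ]}""").validate[Personalization]
// => JsSuccess(Personalization(John,Doe,1234),)

val written = Json.toJson(Personalization("John", "Doe", "1234"))
// => {"personalization":[{"id":"firstname","value":"John"},...]}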

In Json4s why does an integer field in a JSON object get automatically converted to a String?

If I have a JSON object like:
{
  "test": 3
}
Then I would expect that extracting the "test" field as a String would fail because the types don't line up:
import org.json4s._
import org.json4s.jackson.JsonMethods
import org.json4s.JsonAST.JValue

implicit val formats: Formats = DefaultFormats // needed for extract

def getVal[T: Manifest](json: JValue, fieldName: String): Option[T] = {
  val field = json findField {
    case JField(name, _) if name == fieldName => true
    case _ => false
  }

  field.map {
    case (_, value) => value.extract[T]
  }
}
val json = JsonMethods.parse("""{"test":3}""")
val value: Option[String] = getVal[String](json, "test") // Was Some(3) but expected None
Is this automatic conversion from a JSON numeric to a String expected in Json4s? If so, are there any workarounds for this where the extracted field has to be of the same type that is specified in the type parameter to the extract method?
This is the default behaviour of most, if not all, of these parsers: if you request a value of type T and the value can safely be converted to that type, the library will convert it for you. For instance, Typesafe Config behaves the same way when reading a numeric field as a String:
import com.typesafe.config._

val config = ConfigFactory parseString """{ test = 3 }"""
val res1 = config.getString("test")
// res1: String = 3
If you don't want Int/Boolean values to be automatically cast to String, you can manually check whether the value really is an Int or a Boolean, as shown below.
import scala.util.Try

// inside getVal, in place of value.extract[T]:
if (Try(value.extract[Int]).isSuccess || Try(value.extract[Boolean]).isSuccess) {
  throw new RuntimeException("not a String field; it looks like an Int or Boolean")
} else {
  value.extract[T]
}
One simple workaround is to create a custom serializer for cases where you want "strict" behavior. For example:
import org.json4s._

val stringSerializer = new CustomSerializer[String](_ => (
  {
    case JString(s) => s
    case JNull => null
    case x => throw new MappingException("Can't convert %s to String." format x)
  },
  {
    case s: String => JString(s)
  }
))
Adding this serializer to your implicit formats ensures the strict behavior:
implicit val formats = DefaultFormats + stringSerializer
val js = JInt(123)
val str = js.extract[String] // throws MappingException
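If you would rather get a None than an exception when the types don't line up, json4s's extractOpt works with the strict serializer as well (a small sketch, assuming the formats defined above are in scope):
val strict = JInt(123).extractOpt[String]      // None: the strict serializer rejects numbers
val ok = JString("abc").extractOpt[String]     // Some("abc")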

How can I make json4s extract fail if json doesn't contain field

I have a case class that only is a wrapper of a collection like this:
case class MyClass(list: List[String])
If I now try to deserialize some arbitrary JSON into this case class, it doesn't fail if the list field is missing.
Is it possible to force it to fail during extraction?
Code example:
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

implicit val formats = DefaultFormats
val json = parse("""{ "name": "joe" }""")
case class MyClass(list: List[String])
val myClass = json.extract[MyClass] // Works!
assert(myClass.list.isEmpty)
case class MyClass2(test: String)
val myClass2 = json.extract[MyClass2] // Fails!
I need it to fail for the missing list field as it does for the string field.
Please help. Thx!
Solution:
You can create a custom deserializer for MyClass that requires the list field, like this:
class MyClassSerializer extends Serializer[MyClass] {
  private val MyClassClass = classOf[MyClass]

  override def deserialize(implicit format: Formats): PartialFunction[(TypeInfo, JValue), MyClass] = {
    case (TypeInfo(MyClassClass, _), json) => json match {
      case JObject(JField("list", JArray(l)) :: _) =>
        MyClass(l.map(i => i.extract[String]))
      case x => throw new MappingException("Can't convert " + x + " to MyClass")
    }
  }

  override def serialize(implicit format: Formats): PartialFunction[Any, JValue] = ???
}
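To use it, register the serializer with your formats; extraction then fails when the list field is missing (a minimal sketch, assuming the MyClassSerializer above is in scope):
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

implicit val formats: Formats = DefaultFormats + new MyClassSerializer

parse("""{ "list": ["a", "b"] }""").extract[MyClass] // MyClass(List(a, b))
parse("""{ "name": "joe" }""").extract[MyClass]      // throws MappingException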
The reason a missing List field doesn't fail with an exception can be found in Extraction.scala:
private class CollectionBuilder(json: JValue, tpe: ScalaType)(implicit formats: Formats) {
  private[this] val typeArg = tpe.typeArgs.head

  private[this] def mkCollection(constructor: Array[_] => Any) = {
    val array: Array[_] = json match {
      case JArray(arr) => arr.map(extract(_, typeArg)).toArray
      case JNothing | JNull => Array[AnyRef]()
      case x => fail("Expected collection but got " + x + " for root " + json + " and mapping " + tpe)
    }

    constructor(array)
  }
As the snippet above shows, when Json4s meets a collection field in a case class it looks the field up in the JSON AST by the constructor parameter name (list). If that name can't be found, the lookup returns JNothing, and for JNothing the collection builder simply creates an empty collection (case JNothing | JNull => Array[AnyRef]()) instead of throwing an error.
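You can observe that lookup step directly: asking the parsed JSON for a field that isn't there yields JNothing, which is exactly what the collection builder above silently turns into an empty collection.
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

parse("""{ "name": "joe" }""") \ "list" // JNothing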

ScalaJson implicit Write, found: Any required: play.api.libs.json.Json.JsValueWrapper

I am building a web app using Scala / Play Framework and Reactive Mongo and I want the models to be defined in the database instead of having them hardcoded.
To do so, I am writing a class EntityInstance taking a sequence of FieldInstance:
case class EntityInstance(fields: Seq[FieldInstance])
I am trying to accept fields of any type and convert them to JSON. For example:
new FieldInstance("name", "John") | json: { "name": "John" }
new FieldInstance("age", 18) | json: { "age": 18 }
At the moment I am trying to accept Strings, Booleans and Integers, and if the type is not supported I emit an error value:
new FieldInstance("profilePicture", new Picture("john.jpg")) | json: { "profilePicture": "Unsupported type" }
I wrote a FieldInstance class taking a fieldName as a String and a value as any type. As soon as that class is instantiated I cast the value to a known type or to the String describing the error.
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => v
    case v: String => v
    case v: Boolean => v
    case _ => "Unrecognized type"
  }
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) = Json.obj(
      fieldInstance.fieldName -> fieldInstance.value
    )
  }
}
I created a companion object with an implicit Writes so I can call Json.toJson() on an instance of FieldInstance and get JSON as described in my examples above.
I get an error: found: Any, required: play.api.libs.json.Json.JsValueWrapper
I understand that it comes from the fact that my value is of type Any, but I thought the match would narrow that Any to String, Boolean or Int before hitting the Writes.
PS: Ignore the bad naming of the classes; I could not name EntityInstance and FieldInstance simply Entity and Field because those are the classes I use to describe my models.
I found a fix to my problem: the type matching that I was doing in the class should be done in the implicit Writes instead!
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec
  override def toString(): String = "(" + fieldName + "," + value + ")"
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) =
      fieldInstance.value match {
        case v: Int => Json.obj(fieldInstance.fieldName -> v)
        case v: String => Json.obj(fieldInstance.fieldName -> v)
        case v: Boolean => Json.obj(fieldInstance.fieldName -> v)
        case _ => Json.obj(fieldInstance.fieldName -> "Unsupported type")
      }
  }
}
This code now allows a user to create an EntityInstance with fields of any type:
val ei = new EntityInstance(Seq[FieldInstance](
  new FieldInstance("name", "George"),
  new FieldInstance("age", 25),
  new FieldInstance("married", true)))

println("-- TEST ENTITY INSTANCE TO JSON --")
println(Json.toJson(ei))
prints: {"entity":[{"name":"George"},{"age":25},{"married":true}]}
Here is my EntityInstance code if you want to test it:
case class EntityInstance(fields: Seq[FieldInstance])

object EntityInstance {
  implicit val EntityInstanceWrites = new Writes[EntityInstance] {
    def writes(entityInstance: EntityInstance) =
      Json.obj("entity" -> entityInstance.fields)
  }
}
It is returning a String, Int or Boolean, but Json.obj expects its value parameter to be of type (String, JsValueWrapper):
def obj(fields: (String, JsValueWrapper)*): JsObject = JsObject(fields.map(f => (f._1, f._2.asInstanceOf[JsValueWrapperImpl].field)))
A quick fix could be to convert the matched value v with Json.toJson, provided the implicit Writes[T] for type T is available (which it is for String, Int and Boolean):
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => Json.toJson(v)
    case v: String => Json.toJson(v)
    case v: Boolean => Json.toJson(v)
    case _ => Json.toJson("Unrecognized type")
  }
}
If you'd like to see which default Writes instances are available, you can browse them in the play.api.libs.json package, in trait DefaultWrites. For example:
/**
 * Serializer for Boolean types.
 */
implicit object BooleanWrites extends Writes[Boolean] {
  def writes(o: Boolean) = JsBoolean(o)
}

Play: validate JSON with a field with possible multiple types

I use Scala case classes and Play "format" to validate JSON messages. For example:
case class Person(name: String)
implicit val formatPerson = Json.format[Person]
The following JSON:
{
  "name": "Alice"
}
would be validated through the method
Json.validate[Person](json)
Now I would like to validate a JSON message with a field "x" that could be either a String or an Integer.
For example, the two following messages would both be validated by the same method:
{
  "x": "hello"
}

{
  "x": 8
}
I tried with the following trick but it does not work:
case class Foo(x: Either[String,Int])
implicit val formatFoo = Json.format[Foo]
When I try to define the format for the Foo class, the compiler says: "No apply function found matching unapply parameters". Thanks in advance.
You'll have to define a custom Format[Foo]. This should work for Play 2.4:
implicit val formatFoo = new Format[Foo] {
  override def writes(o: Foo): JsValue = o.x match {
    case Left(str) => Json.obj("x" -> str)
    case Right(n) => Json.obj("x" -> n)
  }

  override def reads(json: JsValue): JsResult[Foo] = json \ "x" match {
    case JsDefined(JsString(str)) => JsSuccess(Foo(Left(str)))
    case JsDefined(JsNumber(n)) if n.isValidInt => JsSuccess(Foo(Right(n.toInt)))
    case _ => JsError("error")
  }
}
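A quick check, assuming the Format above is in scope, that both shapes validate and write back as expected:
Json.parse("""{ "x": "hello" }""").validate[Foo] // JsSuccess(Foo(Left(hello)),)
Json.parse("""{ "x": 8 }""").validate[Foo]       // JsSuccess(Foo(Right(8)),)
Json.toJson(Foo(Right(8)))                       // {"x":8}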