I'm trying to make all keys in a json object formatted in PascalCase when serializing a case class. It looks like the right way to do this is to define a CustomKeySerializer from the org.json4s package and reformat keys as I wish. However, while I am able to get a CustomSerializer to work, I'm not able to get a CustomKeySerializer to actually get used when serializing a case class (with nested case classes of unknown types). My code looks like the following:
case object PascalCaseSerializer extends CustomKeySerializer[String](format => (
{ case _ => "this is the deserializer and I don't need it" },
{ case _ => "this does nothing" }
))
implicit val formats: Formats = DefaultFormats + PascalCaseSerializer
case class Foo(thingId: Int, eventData: Any)
case class Bar(numThings: Int)
val event = Foo(1, Bar(2))
val payloadJson = write(event) // """{"thingId":1,"eventData":{"numThings":2}}"""
What am I missing here?
It looks like you will have to use CustomSerializer. If you look at the Extraction.scala source at internalDecomposeWithBuilder, you may notice a piece of code that looks like:
while(iter.hasNext) {
  iter.next() match {
    case (k: String, v) => addField(k, v, obj)
    case (k: Symbol, v) => addField(k.name, v, obj)
    ...
    case (k, v) => {
      val customKeySerializer = formats.customKeySerializer(formats)
      if(customKeySerializer.isDefinedAt(k)) {
        addField(customKeySerializer(k), v, obj)
      } else {
        fail("Do not know how to serialize key of type " + k.getClass + ". " +
          "Consider implementing a CustomKeySerializer.")
      }
    }
  }
}
This means you can't use a CustomKeySerializer[String] to override the default behavior for String keys; a CustomKeySerializer can only add behavior for key types that are not already handled explicitly in this pattern match.
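If the goal is just PascalCase keys, one possible workaround (a sketch, not part of the answer above) is to decompose the case class to a JValue and rename the keys with transformField before rendering; Foo and Bar are the case classes from the question:
import org.json4s._
import org.json4s.jackson.JsonMethods

implicit val formats: Formats = DefaultFormats

case class Foo(thingId: Int, eventData: Any)
case class Bar(numThings: Int)

// Decompose to a JValue first, then capitalize every field name before rendering.
val pascalized = Extraction.decompose(Foo(1, Bar(2))).transformField {
  case JField(name, value) => JField(name.capitalize, value)
}
val payloadJson = JsonMethods.compact(JsonMethods.render(pascalized))
// {"ThingId":1,"EventData":{"NumThings":2}}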
I have some JSON; when I parse it, it returns the structure Some(List(Map...)).
How do I match on it and get the values? Below is the code I tried; I need to get all the map values.
import scala.util.parsing.json._
val result = JSON.parseFull("[{\"start\":\"starting\",\"test\":123,\"test2\":324,\"end\":\"ending\"}]")
result match {
  case Some(map: Map[String, Any]) => println(map)
  case None => println("Parsing failed")
  case other => println("Unknown data structure: " + other)
}
but it prints the non-matching case:
Unknown data structure: Some(List(Map(start -> starting, test -> 123, test2 -> 324, end -> ending)))
Because of type erasure, you cannot pattern match on generic types. List, Map and Option are generic containers, and at runtime their type parameters are erased; e.g. for List[String], the String is erased and the runtime type is just List[_].
case Some(map: List[Map[String, Any]]) => println(map)
In the case above, if result is val result: Option[Any] = Some(List(12)), i.e. the element 12 has type Int and not Map[String, Any], the compiler will still match result against that case, even though the case names Map[String, Any] as the element type of the List.
So, what's going on?
It's all because of type erasure. The compiler erases the type parameters, so no type information is available at runtime unless you use reflection. That means:
case Some(map: List[Map[String, Any]]) => println(map) is essentially case Some(map: List[_]) => println(map), and therefore the match will succeed for any type parameter of List, e.g. List[Map[String, Any]], List[Map[String, Int]], List[String], List[Int], etc.
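A minimal sketch of the Some(List(12)) example above (the erased value name is just for illustration):
val erased: Option[Any] = Some(List(12)) // element type is Int, not Map[String, Any]

erased match {
  // Compiles with an "unchecked" warning: at runtime only Some and List are checked,
  // so this case matches even though the element is an Int.
  case Some(map: List[Map[String, Any]]) => println("matched: " + map)
  case _ => println("no match")
}
// prints: matched: List(12)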
Therefore, if you need to match on such a generic container, you have to resolve each container and its nested element types explicitly.
def resolve(result: Any): Unit = result match {
  case Some(res) if res.isInstanceOf[List[_]] && res.asInstanceOf[List[_]].isEmpty =>
    println("Empty List.") // Some(List())
  case Some(res) if res.isInstanceOf[List[_]] &&
      !res.asInstanceOf[List[_]].exists(p => p.isInstanceOf[Map[_, _]] && p.asInstanceOf[Map[_, _]].nonEmpty) =>
    println("List is not empty but each item of List is empty Map.") // Some(List(Map(), Map()))
  case Some(res) if res.isInstanceOf[List[_]] &&
      res.asInstanceOf[List[_]].filter(p => p.isInstanceOf[Map[_, _]] && p.asInstanceOf[Map[_, _]].nonEmpty).map(_.asInstanceOf[Map[_, _]]).exists(p => p.head match { case e if e._1.isInstanceOf[String] && e._2.isInstanceOf[Any] => true; case _ => false }) =>
    println("Correct data.") // Some(List(Map("key1" -> 1), Map("key2" -> 2)))
  case None => println("none")
  case other => println("other")
}
val a: Option[Any] = Some(List())
val b: Option[Any] = Some(List(Map(), Map()))
val c: Option[Any] = Some(List(Map("key1"-> 1), Map("key2" -> 2)))
val d: Option[Any] = None
val e: Option[Any] = Some("apple")
resolve(a) // match first case
resolve(b) // match second case
resolve(c) // match third case
resolve(d) // match fourth case
resolve(e) // match fifth case
As has been pointed out, the return type is actually Option[List[Map[String, Any]]], so you need to unpick this. However, you cannot do it with a single match because of type erasure; you need nested matches to ensure you have the correct type. This is really tedious, so I thoroughly recommend using something like the Extraction.extract function in json4s, which will attempt to map your JSON onto a specific Scala type:
type ResultType = List[Map[String, Any]]
def extract(json: JValue)(implicit formats: Formats, mf: Manifest[ResultType]): ResultType =
Extraction.extract[ResultType](json)
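A rough sketch of wiring this up with the json4s jackson backend (the exact runtime types of the numeric values depend on the formats; with DefaultFormats they may come back as BigInt):
import org.json4s._
import org.json4s.jackson.JsonMethods

implicit val formats: Formats = DefaultFormats

val json = JsonMethods.parse("""[{"start":"starting","test":123,"test2":324,"end":"ending"}]""")
val maps = extract(json) // List(Map(start -> starting, test -> 123, test2 -> 324, end -> ending))
maps.foreach(println)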
If you must do it by hand, it looks something like this:
result match {
  case Some(l: List[_]) =>
    l.headOption match {
      case Some(m) =>
        m match {
          case m: Map[_, _] =>
            m.headOption match {
              case Some(p) =>
                p match {
                  case (_: String, _) => m.foreach(println(_))
                  case _ => println("Map key was not String")
                }
              case _ => println("Map was empty")
            }
          case _ => println("List did not contain a Map")
        }
      case _ => println("Result List was empty")
    }
  case _ => println("Parsing failed")
}
Your output is Option[List[Map[String, Any]]], not Option[Map[String, Any]]. match on the List and you'll be fine:
import scala.util.parsing.json._
val result = JSON.parseFull("[{\"start\":\"starting\",\"test\":123,\"test2\":324,\"end\":\"ending\"}]")
val l: List[Map[String, Any]] = result match {
case Some(list: List[Map[String, Any]]) => list
case _ => throw new Exception("I shouldn't be here") // whatever for a non-match
}
Then you can map (if you want a non-Unit return type) or foreach (if you don't care about the return type) over that List and do whatever you want with it:
l.foreach(println)
l.map(_.toString) // or whatever you want to do with the Map
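If you'd rather avoid the unchecked match on the element type, one alternative (a sketch, not from the answer) is to narrow with collect and cast the elements explicitly:
val maps: List[Map[String, Any]] = result match {
  case Some(list: List[_]) =>
    // Keep only the Map elements; the cast is still unchecked, but it is now explicit.
    list.collect { case m: Map[_, _] => m.asInstanceOf[Map[String, Any]] }
  case _ => Nil
}
maps.foreach(println)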
If I have a JSON object like:
{
"test": 3
}
Then I would expect that extracting the "test" field as a String would fail because the types don't line up:
import org.json4s._
import org.json4s.jackson.JsonMethods
import org.json4s.JsonAST.JValue
implicit val formats: Formats = DefaultFormats

def getVal[T: Manifest](json: JValue, fieldName: String): Option[T] = {
  val field = json findField {
    case JField(name, _) if name == fieldName => true
    case _ => false
  }
  field.map {
    case (_, value) => value.extract[T]
  }
}
val json = JsonMethods.parse("""{"test":3}""")
val value: Option[String] = getVal[String](json, "test") // Was Some(3) but expected None
Is this automatic conversion from a JSON numeric to a String expected in Json4s? If so, are there any workarounds for this where the extracted field has to be of the same type that is specified in the type parameter to the extract method?
This is the default behavior of most, if not all, parsers: if you request a value of type T and the value can be safely converted to that specific type, the library will convert it for you. For instance, Typesafe Config does the same kind of casting from a numeric field to a String:
import com.typesafe.config._
val config = ConfigFactory parseString """{ test = 3 }"""
val res1 = config.getString("test") // res1: String = 3
If you don't want an Integer/Boolean to be automatically cast to String, you could manually check for the Int/Boolean types, as shown below.
import scala.util.Try

if (Try(value.extract[Int]).isSuccess || Try(value.extract[Boolean]).isSuccess) {
  throw new RuntimeException("not a String field; try Int or Boolean")
} else {
  value.extract[T]
}
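Folded into the shape of the question's getVal, that check might look roughly like this (a hypothetical getStringVal helper, assuming the json4s imports from the question and scala.util.Try):
import scala.util.Try

def getStringVal(json: JValue, fieldName: String)(implicit formats: Formats): Option[String] =
  json \ fieldName match {
    case JNothing => None
    case value if Try(value.extract[Int]).isSuccess || Try(value.extract[Boolean]).isSuccess =>
      throw new RuntimeException(s"'$fieldName' is not a String field; try Int or Boolean")
    case value => Some(value.extract[String])
  }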
One simple workaround is to create a custom serializer for cases where you want "strict" behavior. For example:
import org.json4s._
val stringSerializer = new CustomSerializer[String](_ => (
  {
    case JString(s) => s
    case JNull => null
    case x => throw new MappingException("Can't convert %s to String." format x)
  },
  {
    case s: String => JString(s)
  }
))
Adding this serializer to your implicit formats ensures the strict behavior:
implicit val formats = DefaultFormats + stringSerializer
val js = JInt(123)
val str = js.extract[String] // throws MappingException
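Combined with extractOpt, which returns None instead of throwing when extraction fails, the question's lookup then behaves as hoped. A sketch, assuming the strict formats above and the JsonMethods import from the question are in scope:
def getValStrict[T: Manifest](json: JValue, fieldName: String): Option[T] =
  json \ fieldName match {
    case JNothing => None
    case value => value.extractOpt[T] // None when the strict serializer rejects the value
  }

val strictValue: Option[String] = getValStrict[String](JsonMethods.parse("""{"test":3}"""), "test") // None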
I have a case class that is just a wrapper around a collection, like this:
case class MyClass(list: List[String])
If I now try to deserialize some arbitrary JSON into this case class, it doesn't fail when the list field is missing.
Is it possible to force it to fail during extraction?
Code example:
import org.json4s.jackson.JsonMethods.parse
implicit val formats = org.json4s.DefaultFormats
val json = parse("""{ "name": "joe" }""")
case class MyClass(list: List[String])
val myClass = json.extract[MyClass] // Works!
assert(myClass.list.isEmpty)
case class MyClass2(test: String)
val myClass2 = json.extract[MyClass2] // Fails!
I need it to fail for the missing list field as it does for the string field.
Please help. Thx!
Solution:
You can create a custom MyClass serializer whose deserializer requires the list field, like this:
import org.json4s._

class MyClassSerializer extends Serializer[MyClass] {
  private val MyClassClass = classOf[MyClass]

  override def deserialize(implicit format: Formats): PartialFunction[(TypeInfo, JValue), MyClass] = {
    case (TypeInfo(MyClassClass, _), json) => json match {
      case JObject(JField("list", JArray(l)) :: _) =>
        MyClass(l.map(i => i.extract[String]))
      case x => throw new MappingException("Can't convert " + x + " to MyClass")
    }
  }

  override def serialize(implicit format: Formats): PartialFunction[Any, JValue] = ???
}
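Registering the serializer and extracting again might then look like this (a sketch; note that the pattern above only matches when list is the first field of the JObject):
implicit val formats: Formats = DefaultFormats + new MyClassSerializer

parse("""{ "list": ["a", "b"] }""").extract[MyClass] // MyClass(List(a, b))
parse("""{ "name": "joe" }""").extract[MyClass]      // now throws a MappingException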
The reason a missing List field doesn't fail with an exception can be found in Extraction.scala:
private class CollectionBuilder(json: JValue, tpe: ScalaType)(implicit formats: Formats) {
  private[this] val typeArg = tpe.typeArgs.head
  private[this] def mkCollection(constructor: Array[_] => Any) = {
    val array: Array[_] = json match {
      case JArray(arr) => arr.map(extract(_, typeArg)).toArray
      case JNothing | JNull => Array[AnyRef]()
      case x => fail("Expected collection but got " + x + " for root " + json + " and mapping " + tpe)
    }
    constructor(array)
  }
As the snippet above shows, when Json4s meets a collection in the case class, it looks up the field in the JSON AST by the constructor parameter name (list). If that name can't be found it gets JNothing, and for JNothing the code simply creates an empty collection (case JNothing | JNull => Array[AnyRef]()) instead of throwing an error.
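An alternative sketch (not from the answer) is to check up front that the required fields are present before extracting, e.g. with a hypothetical extractStrict helper:
def extractStrict[T](json: JValue, requiredFields: String*)(implicit formats: Formats, mf: Manifest[T]): T = {
  requiredFields.foreach { field =>
    if ((json \ field) == JNothing)
      throw new MappingException(s"Missing required field '$field'")
  }
  json.extract[T]
}

extractStrict[MyClass](parse("""{ "name": "joe" }"""), "list") // throws MappingException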
I am building a web app using Scala / Play Framework and Reactive Mongo and I want the models to be defined in the database instead of having them hardcoded.
To do so, I am writing a class EntityInstance taking a Sequence of FieldInstance:
case class EntityInstance(fields: Seq[FieldInstance])
I am trying to accept fields of any type and convert them to JSON. Example:
new FieldInstance("name", "John") | json: { "name": "John" }
new FieldInstance("age", 18) | json: { "age": 18 }
At the moment I am trying to accept Strings, Booleans and Integers, and if the type is not supported I write an error instead:
new FieldInstance("profilePicture", new Picture("john.jpg")) | json: { "profilePicture": "Unsupported type" }
I wrote a FieldInstance class taking a fieldName as a String and a value of any type. As soon as the class is instantiated, I match the value against the known types or fall back to a String describing the error.
import play.api.libs.json._

class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => v
    case v: String => v
    case v: Boolean => v
    case _ => "Unrecognized type"
  }
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) = Json.obj(
      fieldInstance.fieldName -> fieldInstance.value
    )
  }
}
I created a companion object with an implicit Writes so I can call Json.toJson() on an instance of FieldInstance and get JSON as described in my examples above.
Instead I get an error: found: Any, required: play.api.libs.json.Json.JsValueWrapper.
I understand that this comes from the fact that my value is of type Any, but I thought the match would narrow that Any to String, Boolean or Int before hitting the Writes.
PS: Ignore the bad naming of the classes; I could not name EntityInstance and FieldInstance simply Entity and Field because those are the classes I use to describe my models.
I found a fix to my problem: the type matching that I was doing in the class should be done in the implicit Writes instead!
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec
  override def toString(): String = "(" + fieldName + "," + value + ")"
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) =
      fieldInstance.value match {
        case v: Int => Json.obj(fieldInstance.fieldName -> v)
        case v: String => Json.obj(fieldInstance.fieldName -> v)
        case v: Boolean => Json.obj(fieldInstance.fieldName -> v)
        case _ => Json.obj(fieldInstance.fieldName -> "Unsupported type")
      }
  }
}
This code now allows a user to create an EntityInstance with fields of any type:
val ei = new EntityInstance(Seq[FieldInstance](new FieldInstance("name", "George"), new FieldInstance("age", 25), new FieldInstance("married", true)))
println("-- TEST ENTITY INSTANCE TO JSON --")
println(Json.toJson(ei))
prints: {"entity":[{"name":"George"},{"age":25},{"married":true}]}
Here is my EntityInstance code if you want to test it:
case class EntityInstance(fields: Seq[FieldInstance])
object EntityInstance {
implicit val EntityInstanceWrites = new Writes[EntityInstance] {
def writes(entityInstance: EntityInstance) =
Json.obj("entity" -> entityInstance.fields)
}
}
Your match returns a String, Int or Boolean, but Json.obj expects its value parameters to be of type (String, JsValueWrapper):
def obj(fields: (String, JsValueWrapper)*): JsObject = JsObject(fields.map(f => (f._1, f._2.asInstanceOf[JsValueWrapperImpl].field)))
A quick fix is to convert the matched value v with Json.toJson, provided an implicit Writes[T] for the type T is available (which it is for String, Int and Boolean):
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => Json.toJson(v)
    case v: String => Json.toJson(v)
    case v: Boolean => Json.toJson(v)
    case _ => Json.toJson("Unrecognized type")
  }
}
If you'd like to see which default Writes are available, you can browse the trait DefaultWrites in the play.api.libs.json package. For example:
/**
 * Serializer for Boolean types.
 */
implicit object BooleanWrites extends Writes[Boolean] {
  def writes(o: Boolean) = JsBoolean(o)
}
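For completeness, a sketch of the fixed FieldInstance used together with the Writes from the question's companion object, which now compiles because value is already a JsValue:
val name = new FieldInstance("name", "John")
val age = new FieldInstance("age", 18)
println(Json.toJson(name)) // {"name":"John"}
println(Json.toJson(age))  // {"age":18}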
I use Scala case classes and Play "format" to validate JSON messages. For example:
case class Person(name: String)
implicit val formatPerson = Json.format[Person]
The following JSON:
{
"name": "Alice"
}
would be validated through the method
Json.validate[Person](json)
Now I would like to validate a JSON message with a field "x" that could be either a String or an Integer.
For example the two following messages would be both validated with the same method:
{
"x": "hello"
}
{
"x": 8
}
I tried with the following trick but it does not work:
case class Foo(x: Either[String,Int])
implicit val formatFoo = Json.format[Foo]
When I try to define the format for the Foo class, the compiler says: "No apply function found matching unapply parameters". Thanks in advance.
You'll have to define a custom Format[Foo]. This should work for Play 2.4:
implicit val formatFoo = new Format[Foo] {
  override def writes(o: Foo): JsValue = o.x match {
    case Left(str) => Json.obj("x" -> str)
    case Right(n) => Json.obj("x" -> n)
  }

  override def reads(json: JsValue): JsResult[Foo] = json \ "x" match {
    case JsDefined(JsString(str)) => JsSuccess(Foo(Left(str)))
    case JsDefined(JsNumber(n)) if n.isValidInt => JsSuccess(Foo(Right(n.toInt)))
    case _ => JsError("error")
  }
}
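A quick sanity check of this Format (sketched; the result comments are approximate):
Json.toJson(Foo(Left("hello")))               // {"x":"hello"}
Json.toJson(Foo(Right(8)))                    // {"x":8}
Json.parse("""{"x":"hello"}""").validate[Foo] // JsSuccess(Foo(Left(hello)))
Json.parse("""{"x":8}""").validate[Foo]       // JsSuccess(Foo(Right(8)))
Json.parse("""{"x":true}""").validate[Foo]    // JsError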