scala spray json optional id parameter in entity class

I'm using the Salat library to serialize case classes as MongoDB objects. My Item.scala file looks like this:
case class Item(_id: String = (new ObjectId).toString, itemId: Int, var name: String, var active: Boolean) extends WithId {
  override def id: ObjectId = new ObjectId(_id)
}

object Item extends MongoDb[Item] with MongoDao[Item] {
  override def collectionName: String = "items"
}

object ItemJsonProtocol extends DefaultJsonProtocol {
  implicit val itemFormat = jsonFormat4(Item.apply)
}
Now I'm using it to submit Item entities as JSON via Spray HTTP. I'd like to invoke it as follows:
curl.exe -H "Content-Type: application/json" -X PUT -d "{\"itemId\":
1, \"active\":true, \"name\" : \"test\"}" http://localhost:8080/items/
hoping it would use the generated id if I don't provide one.
However, after invoking the curl command I'm getting an error:
The request content was malformed:
Object is missing required member '_id'
Is there any way to mark the _id field as optional without making it an Option (this field will always be set) and without defining a custom JsonFormat, i.e. (de)serializing the object myself?
I've read this post: https://stackoverflow.com/a/10820293/1042869, but I was wondering if there is another way to do it, as I have many cases of such _id fields. There was also a comment saying "but you can give that field a default value in the case class definition, so if the field is not in the json, it will assign the default value to it", but as you can see here that doesn't seem to work.
Best,
Marcin

So I solved the problem by writing a custom RootJsonFormat:
implicit object ItemJsonFormat extends RootJsonFormat[Item] {
  override def read(json: JsValue): Item = json.asJsObject.getFields("_id", "itemId", "name", "active") match {
    case Seq(JsString(_id), JsNumber(itemId), JsString(name), JsBoolean(active)) =>
      Item(_id = _id, itemId = itemId.toInt, name = name, active = active)
    case Seq(JsNumber(itemId), JsString(name), JsBoolean(active)) =>
      Item(itemId = itemId.toInt, name = name, active = active)
    case _ => throw new DeserializationException("Item expected")
  }

  override def write(obj: Item): JsValue = JsObject(
    "_id" -> JsString(obj._id),
    "itemId" -> JsNumber(obj.itemId),
    "name" -> JsString(obj.name),
    "active" -> JsBoolean(obj.active)
  )
}
Basically, it checks whether the _id was present in the JSON: if it was, it is used to construct the object; otherwise the auto-generated default id from the case class is kept.
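For illustration, a minimal usage sketch (it assumes ItemJsonFormat is in implicit scope; the hex id is just an arbitrary example value):

import spray.json._

// _id supplied: it is taken from the JSON
val a = """{"_id":"507f191e810c19729de860ea","itemId":1,"name":"test","active":true}""".parseJson.convertTo[Item]

// _id absent: the case class default, (new ObjectId).toString, is kept
val b = """{"itemId":1,"name":"test","active":true}""".parseJson.convertTo[Item]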
One other thing that might cause trouble but deserves mentioning somewhere: if anyone has a problem with nested objects ("non-primitive" types), I advise using .toJson in the write method (like obj.time.toJson, where obj.time is a Joda-Time DateTime) and JsValue's .convertTo[T] in read (calling it on the matched JsValue, e.g. time.convertTo[DateTime]). For this to work, implicit JSON formats for those "non-primitive" types have to be in scope.
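As a minimal sketch of that advice, assuming a hypothetical case class Event(name: String, time: DateTime) and an implicit JsonFormat[DateTime] already defined somewhere in scope:

import org.joda.time.DateTime
import spray.json._

case class Event(name: String, time: DateTime) // hypothetical example class

implicit object EventJsonFormat extends RootJsonFormat[Event] {
  override def write(obj: Event): JsValue = JsObject(
    "name" -> JsString(obj.name),
    "time" -> obj.time.toJson // delegates to the implicit DateTime format
  )
  override def read(json: JsValue): Event = json.asJsObject.getFields("name", "time") match {
    case Seq(JsString(name), time) => Event(name, time.convertTo[DateTime]) // same format, other direction
    case _ => throw new DeserializationException("Event expected")
  }
}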
Best,
Marcin

I would use this solution:
case class Item(_id: Option[String], itemId: Int, var name: String, var active: Boolean)

implicit object ItemJsonFormat extends RootJsonFormat[Item] {
  override def read(value: JsValue): Item = {
    val _id = fromField[Option[String]](value, "_id")
    val itemId = fromField[Int](value, "itemId")
    val name = fromField[String](value, "name")
    val active = fromField[Boolean](value, "active")
    Item(_id, itemId, name, active)
  }

  override def write(obj: Item): JsValue = JsObject(
    "_id" -> obj._id.map(JsString(_)).getOrElse(JsNull),
    "itemId" -> JsNumber(obj.itemId),
    "name" -> JsString(obj.name),
    "active" -> JsBoolean(obj.active)
  )
}
The advantage over the json.asJsObject.getFields solution is that you have better control over what gets accepted in the case of an undefined id. Examples where the getFields match would go wrong:
itemId is a string, same as the id
the id is defined but itemId is not
In these cases the match could interpret the supplied id as the itemId instead of reporting the error.
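A hedged usage sketch of that difference (it assumes the fromField-based format above is in implicit scope; the exact exception message comes from spray-json and may vary by version):

import spray.json._

// "_id" missing: still deserializes, _id becomes None
val ok = """{"itemId":1,"name":"test","active":true}""".parseJson.convertTo[Item]

// "itemId" has the wrong type: fails with a field-specific DeserializationException
// (roughly: Expected Int as JsNumber, but got "1") instead of silently matching another case
val bad = """{"_id":"abc","itemId":"1","name":"test","active":true}"""
// bad.parseJson.convertTo[Item] throws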

Related

serialize Map with key and value as case class using circe

Below is simplified code that has a Map whose key and value are case classes.
The key and value objects are successfully serialized with circe on their own.
But I am facing a challenge serializing the Map[CaseClassKey, CaseClassValue] with circe.
//////case classes and common vals starts here//////
case class Person(personId : Int, phoneNumber : Int)
case class Item(name : String)
case class Basket(items : List[Item], bills : Map[Int, Int])
def createBasketInstance(): Basket = {
  val milk = Item("milk")
  val coffee = Item("coffee")
  val bills = Map(1 -> 20, 2 -> 75)
  Basket(List(milk, coffee), bills)
}
val basket = createBasketInstance()
val person = Person(1, 987654)
//////case classes and common vals ends here//////
import io.circe._
import io.circe.generic.semiauto._
import io.circe.syntax._
import io.circe.parser._
//Serializing Person instance that is used as Key in Map
{
implicit val personCodec :Codec[Person] = deriveCodec[Person]
val jsonString = person.asJson.spaces2
println(jsonString)
}
println("-" * 50)
//Serializing Basket instance that is used as Value in Map
{
implicit val itemCodec :Codec[Item] = deriveCodec[Item]
implicit val basketCodec :Codec[Basket] = deriveCodec[Basket]
val jsonString = basket.asJson.spaces2
println(jsonString)
}
println("-" * 50)
//Serializing Map[Person, Basket]
//TODO : not able to make it work
{
implicit val itemCodec :Codec[Item] = deriveCodec[Item]
implicit val basketCodec :Codec[Basket] = deriveCodec[Basket]
val map = Map(person -> basket)
//TODO : How to make below lines work
//val jsonString = map.asJson.spaces2
//println(jsonString)
}
scalafiddle link : https://scalafiddle.io/sf/SkZNa1L/2
Note : Looking to just serialize and deserialize data (Map[Person, Basket]) correctly. How json looks is not really important in this particular case.
Strictly speaking, you are trying to create invalid JSON.
A JSON map structure isn't an arbitrary map; it's an object structure whose keys are property names (i.e. strings).
https://www.json.org/json-en.html
See also Can we make object as key in map when using JSON?
Update:
I suggest a slight change to your model to get the job done.
Instead of a Map, use an array of objects, each having two properties: key and value.
Something like this:
case class Entry(key: Person, value: Basket)
So you can replace Map[Person, Basket] with Seq[Entry] and convert it back to a Map when needed.
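A hedged sketch of that approach, reusing the case classes and the person/basket values from the question (the Entry class and the resulting JSON shape are illustrative):

import io.circe._
import io.circe.generic.semiauto._
import io.circe.syntax._
import io.circe.parser.decode

case class Entry(key: Person, value: Basket)

implicit val personCodec: Codec[Person] = deriveCodec[Person]
implicit val itemCodec: Codec[Item] = deriveCodec[Item]
implicit val basketCodec: Codec[Basket] = deriveCodec[Basket]
implicit val entryCodec: Codec[Entry] = deriveCodec[Entry]

// Map -> Seq[Entry] -> JSON
val entries = Map(person -> basket).toSeq.map { case (k, v) => Entry(k, v) }
val jsonString = entries.asJson.spaces2

// JSON -> Seq[Entry] -> Map
val restored: Either[Error, Map[Person, Basket]] =
  decode[Seq[Entry]](jsonString).map(_.map(e => e.key -> e.value).toMap)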

Configure spray-json for non strict parsing deserialization

How can spray-json's parsing be configured with parsing options, similar to Jackson's parsing features?
For example, I am parsing JSON whose fields do not line up exactly with my case class, and it breaks with:
spray.json.DeserializationException: Object is missing required member 'myfield'
UPDATE:
A simple example:
case class MyClass(a: String, b: Long)
and try to parse an incomplete JSON like
val data = """{"a": "hi"}"""
with a spray-json format like:
jsonFormat2(MyClass.apply)
// ...
data.parseJson.convertTo[MyClass]
(simplified code).
But the question goes further: I want to ask about configuration options like in other parsers. More examples:
Be able to ignore fields that exist in the JSON but not in the case class.
Ways of managing nulls or nonexistent values.
etc.
spray-json allows you to define custom parsers like so:
case class Foo(a: String, b: Int)
implicit object FooJsonFormat extends RootJsonFormat[Foo] {
  override def read(json: JsValue): Foo =
    json.asJsObject.getFields("name", "id") match {
      case Seq(JsString(name), id) => Foo(name, id.convertTo[Int])
    }

  override def write(obj: Foo): JsValue =
    JsObject("name" -> JsString(obj.a), "id" -> JsNumber(obj.b))
}
This allows you to parse any arbitrary payload and pull out the fields "name" and "id" - other fields are ignored. If those fields are not guaranteed to be present, you can add something like:
case Seq(JsString(name), JsNull) =>
Foo(name, 0)
You should look at what's available in JsValue.scala - in particular JsArray may come in handy if you're getting payloads with anonymous arrays (i.e. the root is [{...}] instead of {"field":"value"...})
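For illustration, a hedged sketch of the JsArray case (it assumes the FooJsonFormat above is in implicit scope):

import spray.json._

val payload = """[{"name":"a","id":1},{"name":"b","id":2}]""".parseJson
val foos: Seq[Foo] = payload match {
  case JsArray(elements) => elements.map(_.convertTo[Foo]) // each element goes through FooJsonFormat
  case other             => deserializationError(s"Expected a JSON array but got $other")
}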
spray-json doesn't support default parameters, so you cannot have a case class like
case class MyClass(a: String, b: Int = 0)
and then parse JSON like {"a":"foo"}.
However, if you make the second parameter an Option, it works:
import spray.json._
case class MyClass(a: String, b: Option[Int] = None)
object MyProtocol extends DefaultJsonProtocol {
  implicit val f = jsonFormat2(MyClass)
}
import MyProtocol.f
val mc1 = MyClass("foo", Some(10))
val strJson = mc1.toJson.toString
val strJson2 = """{"a": "foo"}"""
val mc2 = strJson2.parseJson.convertTo[MyClass]
println(mc2)

Create mapping from string to class type in scala

I have a lot of different external JSON entities that I want to parse into different internal case classes via json4s (Scala). Everything works fine via json4s's extract function. I have implemented a parse function that takes a type and a JSON string and parses the string into that type / case class. To map the correct JSON string to the correct case class, I have implemented a pattern-matching function, which looks like this:
entityName match {
case "entity1" => JsonParser.parse[Entity1](jsonString)
case "entity2" => JsonParser.parse[Entity2](jsonString)
....
I don't like the repetition here and would like to do this mapping via a map like this:
val mapping = Map(
"entity1" -> Entity1,
"entity2" -> Entity2
....
With this map in place I could implement the JsonParser.parse function only once like this
JsonParser.parse[mapping(entityName)](jsonString)
This does not work, because the map references the companion objects, not the class types. I also tried classOf[Entity1], but that does not work either. Is there a way to do this?
Thanks!
The way you want your JsonParser.parse to work is not possible in Scala. Scala is a strongly and statically typed language, which means the compiler has to know the types of values at compile time to verify that you access only valid fields and methods on them and/or pass them as valid parameters to methods. Assuming your classes are
case class Entity1(value:Int, unique1:Int)
case class Entity2(value:String, unique2:String)
and you write
val parsed = JsonParser.parse[mapping("entity1")](jsonString)
how could the compiler know the type of parsed, and thus the type of parsed.value, or know that parsed.unique1 is a valid field while parsed.unique2 is not? The best type the compiler could assign to such a parsed value is something very generic like Any. Of course you can downcast that Any to the specific type later, but that means you still have to specify the type explicitly in the asInstanceOf, which rather defeats the whole purpose. Still, if returning Any is OK for you, you may try something like this:
import org.json4s.jackson.JsonMethods
implicit val formats = org.json4s.DefaultFormats // or whatever Formats you actually need
val typeMap: Map[String, scala.reflect.Manifest[_]] = Map(
  "String" -> implicitly[scala.reflect.Manifest[String]],
  "Int" -> implicitly[scala.reflect.Manifest[Int]]
)

def parseAny(typeName: String, jsonString: String): Any = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract(formats, typeMap(typeName))
}
and then do something like this:
def testParseByTypeName(typeName: String, jsonString: String): Unit = {
  try {
    val parsed = parseAny(typeName, jsonString)
    println(s"parsed by name $typeName => ${parsed.getClass} - '$parsed'")
  } catch {
    case e => println(e)
  }
}

def test() = {
  testParseByTypeName("String", "\"abc\"")
  testParseByTypeName("Int", "123")
}
P.S. If your entityName doesn't come from the outside (i.e. you don't have to analyze the data to find out the actual type), you don't actually need it at all. It is enough to use the type parameter (without any match/case), such as:
def parse[T](jsonString: String)(implicit mf: scala.reflect.Manifest[T]): T = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract[T]
}

def testParse[T](prefix: String, jsonString: String)(implicit mf: scala.reflect.Manifest[T]): Unit = {
  try {
    val parsed = parse[T](jsonString)
    println(s"$prefix => ${parsed.getClass} - '$parsed'")
  } catch {
    case e => println(e)
  }
}

def test() = {
  testParse[String]("parse String", "\"abc\"")
  testParse[Int]("parse Int", "123")
}
Following the idea from @SergGr, here is a snippet to paste into an Ammonite REPL:
{
  import $ivy.`org.json4s::json4s-native:3.6.0-M2`
  import org.json4s.native.JsonMethods.parse
  import org.json4s.DefaultFormats
  import org.json4s.JValue

  case class Entity1(name: String, value: Int)
  case class Entity2(name: String, value: Long)

  implicit val formats = DefaultFormats

  def extract[T](input: JValue)(implicit m: Manifest[T]) = input.extract[T]

  val mapping: Map[String, Manifest[_]] = Map(
    "entity1" -> implicitly[Manifest[Entity1]],
    "entity2" -> implicitly[Manifest[Entity2]]
  )

  val input = parse(""" { "name" : "abu", "value" : 1 } """)

  extract(input)(mapping("entity1")) // Entity1("abu", 1)
  extract(input)(mapping("entity2")) // Entity2("abu", 1L)
}

How to set up spray-json to set null when a JSON element is not present?

Here is the spray-json example. Here is the NullOptions trait.
The problem is that when I declare a case class, say
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit val some: RootJsonFormat[Some] = jsonFormat2(Some)
}

case class Some(
  name: String,
  age: Int
)
and the JSON does not contain a field, for example:
{
"name":"John"
}
I get: java.util.NoSuchElementException: key not found: age
So I have to add an Option and the NullOptions trait, like this:
object MyJsonProtocol extends DefaultJsonProtocol with NullOptions {
  implicit val some: RootJsonFormat[Some] = jsonFormat2(Some)
}

case class Some(
  name: String,
  age: Option[Int]
)
Everything works. But I do not want case classes where every member is an Option. Is there a way to configure spray-json unmarshalling to just set nulls, without the additional Option type?
P.S.
I understand that in general Option is better than a null check, but in my case it is just monkey code.
Also, a complete example of marshalling during response processing is here.
The only way I can think of is to implement your own protocol via read/write, which might be cumbersome. Below is a simplified example. Note that I changed age to be an Integer instead of an Int, since Int is an AnyVal and is not nullable by default. Furthermore, I only consider the age field to be nullable, so you might need to adapt as necessary. Hope it helps.
case class Foo(name: String, age: Integer)

object MyJsonProtocol extends DefaultJsonProtocol {
  implicit object FooJsonFormat extends RootJsonFormat[Foo] {
    def write(foo: Foo) = JsObject(
      "name" -> JsString(foo.name),
      "age" -> Option(foo.age).map(JsNumber(_)).getOrElse(JsNull)
    )

    def read(value: JsValue) = value match {
      case JsObject(fields) =>
        val ageOpt: Option[Integer] = fields.get("age").map(_.convertTo[Int]) // implicit conversion from Int to Integer
        val age: Integer = ageOpt.orNull[Integer]
        Foo(fields("name").convertTo[String], age)
      case _ => deserializationError("Foo expected")
    }
  }
}
import MyJsonProtocol._
import spray.json._
val json = """{ "name": "Meh" }""".parseJson
println(json.convertTo[Foo]) // prints Foo(Meh,null)
It seems you're out of luck.
From the doc you linked:
spray-json will always read missing optional members as well as null optional members as None
You can customize the json writing, but not the reading.
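A minimal sketch of that distinction (the User class is illustrative; NullOptions only changes the writing side):

import spray.json._

case class User(name: String, age: Option[Int])

object Plain extends DefaultJsonProtocol { implicit val f = jsonFormat2(User) }
object WithNulls extends DefaultJsonProtocol with NullOptions { implicit val f = jsonFormat2(User) }

println(User("John", None).toJson(Plain.f))     // "age" is omitted from the output
println(User("John", None).toJson(WithNulls.f)) // "age" is written as null

// Reading is the same for both: a missing or null "age" becomes None
println("""{"name":"John"}""".parseJson.convertTo[User](Plain.f)) // User(John,None)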

Scala - Instantiate classes dynamically from a String

I am trying to create a dynamic parser which allows me to parse JSON content into different classes depending on a class name.
I will get the JSON and the class name (as a String) and I would like to do something like this:
val theCaseClassName = "com.ardlema.JDBCDataProviderProperties"
val myCaseClass = Class.forName(theCaseClassName)
val jsonJdbcProperties = """{"url":"myUrl","userName":"theUser","password":"thePassword"}"""
val json = Json.parse(jsonJdbcProperties)
val value = Try(json.as[myClass])
The above code obviously does not compile because the json.as[] method tries to convert the node into a "T" (I have an implicit Reads[T] defined for my case class)
What would be the best way to get a proper "T" to pass in to the json.as[] method from the original String?
A great solution that might work is polymorphic deserialization. This lets you add a field (like "type") to your JSON and have Jackson (assuming you're using an awesome JSON parser like Jackson) figure out the proper type on your behalf. It looks like you might not be using Jackson; I promise it's worth using.
This post gives a great introduction to polymorphic types. It covers many useful cases including the case where you can't modify 3rd party code (here you add a Mixin to annotate the type hierarchy).
The simplest case ends up looking like this (and all of this works great with Scala objects too - Jackson even has a great Scala module):
import com.fasterxml.jackson.annotation.{JsonSubTypes, JsonTypeInfo}
import com.fasterxml.jackson.annotation.JsonSubTypes.Type
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.{DefaultScalaModule, ScalaObjectMapper} // older jackson-module-scala versions: ...module.scala.experimental.ScalaObjectMapper

object Test {
  @JsonTypeInfo(
    use = JsonTypeInfo.Id.NAME,
    include = JsonTypeInfo.As.PROPERTY,
    property = "type"
  )
  @JsonSubTypes(Array(
    new Type(value = classOf[Cat], name = "cat"),
    new Type(value = classOf[Dog], name = "dog")
  ))
  trait Animal

  case class Dog(name: String, breed: String, leash_color: String) extends Animal
  case class Cat(name: String, favorite_toy: String) extends Animal

  def main(args: Array[String]): Unit = {
    val objectMapper = new ObjectMapper with ScalaObjectMapper
    objectMapper.registerModule(DefaultScalaModule)

    val dogStr = """{"type": "dog", "name": "Spike", "breed": "mutt", "leash_color": "red"}"""
    val catStr = """{"type": "cat", "name": "Fluffy", "favorite_toy": "spider ring"}"""

    val animal1 = objectMapper.readValue[Animal](dogStr)
    val animal2 = objectMapper.readValue[Animal](catStr)

    println(animal1)
    println(animal2)
  }
}
This generates this output:
// Dog(Spike,mutt,red)
// Cat(Fluffy,spider ring)
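For completeness, a hedged sketch of the writing direction with the same mapper (the exact field order may differ):

val dogJson = objectMapper.writeValueAsString(Dog("Spike", "mutt", "red"))
// roughly: {"type":"dog","name":"Spike","breed":"mutt","leash_color":"red"}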
You can also avoid listing the subtype mapping, but it requires that the JSON "type" field be a bit more complex. Experiment with it; you might like it. Define Animal like this:
@JsonTypeInfo(
  use = JsonTypeInfo.Id.CLASS,
  include = JsonTypeInfo.As.PROPERTY,
  property = "type"
)
trait Animal
And it produces (and consumes) json like this:
/*
{
"breed": "mutt",
"leash_color": "red",
"name": "Spike",
"type": "classpath.to.Test$Dog"
}
{
"favorite_toy": "spider ring",
"name": "Fluffy",
"type": "classpath.to.Test$Cat"
}
*/
You should select your Reads[T] based on the class name. Unfortunately, this will probably have to be a manual pattern match:
val r: Reads[_] = theCaseClassName match {
  case "com.ardlema.JDBCDataProviderProperties" => JDBCReads
  case ... => ...
}
val value = json.as(r).asInstanceOf[...]
Alternatively, look at the implementation of json.as; at some point it probably requires a ClassTag and calls .runtimeClass on it. Assuming that's so, you can do the same thing yourself and pass your own myCaseClass there.
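A hedged sketch of the manual-match idea expressed as a map, using Play JSON's Json.parse / Reads API from the question (jdbcPropertiesReads is a hypothetical name for the asker's existing implicit Reads[JDBCDataProviderProperties]):

import play.api.libs.json._

val readers: Map[String, Reads[_]] = Map(
  "com.ardlema.JDBCDataProviderProperties" -> jdbcPropertiesReads
  // ...one entry per supported entity
)

// Returns Any; the caller still has to cast, which is the limitation discussed above
def parseDynamic(className: String, jsonString: String): Any =
  Json.parse(jsonString).as(readers(className))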