serialize Map with key and value as case class using circe - json

Below is simplified code with a Map whose key and value are case classes.
The key and value objects serialize successfully with circe on their own.
But I'm stuck serializing the Map[CaseClassKey, CaseClassValue] itself with circe.
//////case classes and common vals starts here//////
case class Person(personId: Int, phoneNumber: Int)
case class Item(name: String)
case class Basket(items: List[Item], bills: Map[Int, Int])

def createBasketInstance(): Basket = {
  val milk = Item("milk")
  val coffee = Item("coffee")
  val bills = Map(1 -> 20, 2 -> 75)
  Basket(List(milk, coffee), bills)
}
val basket = createBasketInstance()
val person = Person(1, 987654)
//////case classes and common vals ends here//////
import io.circe._
import io.circe.generic.semiauto._
import io.circe.syntax._
import io.circe.parser._
//Serializing Person instance that is used as Key in Map
{
  implicit val personCodec: Codec[Person] = deriveCodec[Person]
  val jsonString = person.asJson.spaces2
  println(jsonString)
}
println("-" * 50)
//Serializing Basket instance that is used as Value in Map
{
  implicit val itemCodec: Codec[Item] = deriveCodec[Item]
  implicit val basketCodec: Codec[Basket] = deriveCodec[Basket]
  val jsonString = basket.asJson.spaces2
  println(jsonString)
}
println("-" * 50)
//Serializing Map[Person, Basket]
//TODO : not able to make it work
{
  implicit val itemCodec: Codec[Item] = deriveCodec[Item]
  implicit val basketCodec: Codec[Basket] = deriveCodec[Basket]
  val map = Map(person -> basket)
  //TODO : How to make below lines work
  //val jsonString = map.asJson.spaces2
  //println(jsonString)
}
scalafiddle link : https://scalafiddle.io/sf/SkZNa1L/2
Note : I'm looking to just serialize and deserialize the data (Map[Person, Basket]) correctly. How the JSON looks is not really important in this particular case.

Strictly speaking, you are trying to create invalid JSON.
A JSON map structure isn't an arbitrary map; it's an object structure whose keys are property names (strings).
https://www.json.org/json-en.html
See also Can we make object as key in map when using JSON?
Update:
I suggest a slight change to your model to get the job done.
Instead of a Map, use an array of objects, each having two properties: key and value.
Something like this:
case class Entry(key: Person, value: Basket)
So you can replace Map[Person, Basket] with Seq[Entry], and convert it back to Map if needed.
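A sketch of that Entry-based round trip with circe (the codec wiring and helper conversions here are my own, assuming circe-generic's semiauto derivation):

```scala
import io.circe.Codec
import io.circe.generic.semiauto.deriveCodec
import io.circe.parser.decode
import io.circe.syntax._

case class Person(personId: Int, phoneNumber: Int)
case class Item(name: String)
case class Basket(items: List[Item], bills: Map[Int, Int])
// One key/value pair of the original Map[Person, Basket]
case class Entry(key: Person, value: Basket)

implicit val personCodec: Codec[Person] = deriveCodec[Person]
implicit val itemCodec: Codec[Item] = deriveCodec[Item]
implicit val basketCodec: Codec[Basket] = deriveCodec[Basket]
implicit val entryCodec: Codec[Entry] = deriveCodec[Entry]

val map: Map[Person, Basket] =
  Map(Person(1, 987654) -> Basket(List(Item("milk")), Map(1 -> 20)))

// serialize: Map -> Seq[Entry] -> JSON
val json = map.map { case (k, v) => Entry(k, v) }.toSeq.asJson.noSpaces

// deserialize: JSON -> Seq[Entry] -> Map
val roundTripped: Either[io.circe.Error, Map[Person, Basket]] =
  decode[Seq[Entry]](json).map(_.map(e => e.key -> e.value).toMap)
// roundTripped == Right(map)
```

The JSON is a plain array of {"key": ..., "value": ...} objects, so it stays valid JSON regardless of the key type.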


Configure spray-json for non strict parsing deserialization

How can I configure spray-json's parsing options, similarly to Jackson's parsing features?
For example, I am parsing JSON that is missing a field my case class requires, and it breaks:
spray.json.DeserializationException: Object is missing required member 'myfield'
UPDATE :
A simple example:
case class MyClass(a: String, b: Long);
and try to parse an incomplete json like
val data = "{a: \"hi\"}"
with a spray-json format like:
jsonFormat2(MyClass.apply)
// ...
data.parseJson.convertTo[MyClass]
(simplified code).
But the question goes further: I want to ask about configuration options like in other parsers. More examples:
Be able to ignore fields that exist in the JSON but not in the case class.
Ways of managing nulls or nonexistent values.
etc.
SprayJson allows you to define custom parsers like so:
case class Foo(a: String, b: Int)
implicit object FooJsonFormat extends RootJsonFormat[Foo] {
  override def read(json: JsValue): Foo = {
    json.asJsObject.getFields("name", "id") match {
      case Seq(JsString(name), id) =>
        Foo(name, id.convertTo[Int])
    }
  }
  // Note: calling obj.toJson here would resolve back to this same format
  // and recurse forever, so write the fields out explicitly.
  override def write(obj: Foo): JsValue = JsObject(
    "name" -> JsString(obj.a),
    "id" -> JsNumber(obj.b)
  )
}
This allows you to parse any arbitrary payload and pull out the fields "name" and "id" - other fields are ignored. If those fields are not guaranteed you can add something like:
case Seq(JsString(name), JsNull) =>
  Foo(name, 0)
You should look at what's available in JsValue.scala - in particular JsArray may come in handy if you're getting payloads with anonymous arrays (i.e. the root is [{...}] instead of {"field":"value"...})
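For instance, a hedged sketch of pulling fields out of an anonymous root array via JsArray (the payload and field names here are made up):

```scala
import spray.json._
import spray.json.DefaultJsonProtocol._

// The root is an anonymous array rather than an object.
val payload = """[{"name":"a","id":1},{"name":"b","id":2}]"""

val names: Vector[String] = payload.parseJson match {
  case JsArray(elements) =>
    // Pull just the "name" field from each element; other fields are ignored.
    elements.map(_.asJsObject.fields("name").convertTo[String])
  case other => deserializationError(s"Expected a JSON array, got $other")
}
// names == Vector("a", "b")
```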
Spray-json doesn't support default parameters, so you cannot have a case class like
case class MyClass(a: String, b: Int = 0)
and then parse json like {"a":"foo"}
However, if you make the second parameter an Option, it works.
import spray.json._
case class MyClass(a: String, b: Option[Int] = None)
object MyProtocol extends DefaultJsonProtocol {
  implicit val f = jsonFormat2(MyClass)
}
import MyProtocol.f
val mc1 = MyClass("foo", Some(10))
val strJson = mc1.toJson.toString
val strJson2 = """{"a": "foo"}"""
val mc2 = strJson2.parseJson.convertTo[MyClass]
println(mc2)

Create mapping from string to class type in scala

I have a lot of different external JSON entities that I want to parse into different internal case classes via json4s (Scala). Everything works fine via json4s's extract function. I have implemented a parse function that takes a type and a JSON string and parses the string into that type / case class. To map the correct JSON string to the correct case class, I have implemented a pattern-matching function that looks like this:
entityName match {
case "entity1" => JsonParser.parse[Entity1](jsonString)
case "entity2" => JsonParser.parse[Entity2](jsonString)
....
I don't like the repetition here and would like to do this mapping via a map like this:
val mapping = Map(
"entity1" -> Entity1,
"entity2" -> Entity2
....
With this map in place I could implement the JsonParser.parse function only once like this
JsonParser.parse[mapping(entityName)](jsonString)
This is not working, because the map references the companion objects rather than the class types. I also tried classOf[Entity1], but that does not work either. Is there a way to do this?
Thanks!
The way you want your JsonParser.parse to work is not possible in Scala. Scala is a strongly and statically typed language: the compiler must know the types of values at compile time to verify that you access only valid fields and methods on them and/or pass them as valid parameters to methods. Assuming your classes are
case class Entity1(value:Int, unique1:Int)
case class Entity2(value:String, unique2:String)
and you write
val parsed = JsonParser.parse[mapping("entity1")](jsonString)
how could the compiler know the type of parsed, so as to know that parsed.unique1 is a valid field while parsed.unique2 is not? The best type the compiler could assign to such a parsed is something very generic like Any. Of course you can downcast that Any to a specific type later, but that means you still have to spell out the type explicitly in asInstanceOf, which rather defeats the purpose. Still, if returning Any is OK for you, you may try something like this:
import org.json4s.jackson.JsonMethods
implicit val formats = org.json4s.DefaultFormats // or whatever Formats you actually need
val typeMap: Map[String, scala.reflect.Manifest[_]] = Map(
  "String" -> implicitly[scala.reflect.Manifest[String]],
  "Int" -> implicitly[scala.reflect.Manifest[Int]]
)

def parseAny(typeName: String, jsonString: String): Any = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract(formats, typeMap(typeName))
}
and then do something like this:
def testParseByTypeName(typeName: String, jsonString: String): Unit = {
  try {
    val parsed = parseAny(typeName, jsonString)
    println(s"parsed by name $typeName => ${parsed.getClass} - '$parsed'")
  } catch {
    case e: Exception => println(e)
  }
}
def test() = {
  testParseByTypeName("String", "\"abc\"")
  testParseByTypeName("Int", "123")
}
P.S. If your entityName doesn't come from the outside (i.e. you don't have to analyze data to find out the actual type), you don't actually need it at all. It is enough to use a type parameter (with no need for match/case), such as:
def parse[T](jsonString: String)(implicit mf: scala.reflect.Manifest[T]): T = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract[T]
}

def testParse[T](prefix: String, jsonString: String)(implicit mf: scala.reflect.Manifest[T]): Unit = {
  try {
    val parsed = parse[T](jsonString)
    println(s"$prefix => ${parsed.getClass} - '$parsed'")
  } catch {
    case e: Exception => println(e)
  }
}

def test() = {
  testParse[String]("parse String", "\"abc\"")
  testParse[Int]("parse Int", "123")
}
Following idea from #SergGr, as a snippet to paste on Ammonite REPL:
{
  import $ivy.`org.json4s::json4s-native:3.6.0-M2`
  import org.json4s.native.JsonMethods.parse
  import org.json4s.DefaultFormats
  import org.json4s.JValue

  case class Entity1(name: String, value: Int)
  case class Entity2(name: String, value: Long)

  implicit val formats = DefaultFormats

  def extract[T](input: JValue)(implicit m: Manifest[T]) = input.extract[T]

  val mapping: Map[String, Manifest[_]] = Map(
    "entity1" -> implicitly[Manifest[Entity1]],
    "entity2" -> implicitly[Manifest[Entity2]]
  )

  val input = parse(""" { "name" : "abu", "value" : 1 } """)
  extract(input)(mapping("entity1")) //Entity1("abu", 1)
  extract(input)(mapping("entity2")) //Entity2("abu", 1L)
}

How can I define a dynamic base json object?

I would like to design a base trait/class in Scala that can produce the following json:
trait GenericResource {
val singularName: String
val pluralName: String
}
I would inherit this trait in a case class:
case class Product(name: String) extends GenericResource {
override val singularName = "product"
override val pluralName = "products"
}
val car = Product("car")
val jsonString = serialize(car)
the output should look like: {"product":{"name":"car"}}
A Seq[Product] should produce {"products":[{"name":"car"},{"name":"truck"}]} etc...
I'm struggling with the proper abstractions to accomplish this. I am open to solutions using any JSON library (available in Scala).
Here's about the simplest way I can think of to do the singular part generically with circe:
import io.circe.{ Decoder, Encoder, Json }
import io.circe.generic.encoding.DerivedObjectEncoder
trait GenericResource {
  val singularName: String
  val pluralName: String
}

object GenericResource {
  implicit def encodeResource[A <: GenericResource](implicit
    derived: DerivedObjectEncoder[A]
  ): Encoder[A] = Encoder.instance { a =>
    Json.obj(a.singularName -> derived(a))
  }
}
And then if you have some case class extending GenericResource like this:
case class Product(name: String) extends GenericResource {
  val singularName = "product"
  val pluralName = "products"
}
You can do this (assuming all the members of the case class are encodeable):
scala> import io.circe.syntax._
import io.circe.syntax._
scala> Product("car").asJson.noSpaces
res0: String = {"product":{"name":"car"}}
No boilerplate, no extra imports, etc.
The Seq case is a little trickier, since circe automatically provides a Seq[A] encoder for any A that has an Encoder, but it doesn't do what you want—it just encodes the items and sticks them in a JSON array. You can write something like this:
implicit def encodeResources[A <: GenericResource](implicit
  derived: DerivedObjectEncoder[A]
): Encoder[Seq[A]] = Encoder.instance {
  case values @ (head +: _) =>
    Json.obj(head.pluralName -> Encoder.encodeList(derived)(values.toList))
  case Nil => Json.obj()
}
And use it like this:
scala> Seq(Product("car"), Product("truck")).asJson.noSpaces
res1: String = {"products":[{"name":"car"},{"name":"truck"}]}
But you can't just stick it in the companion object and expect everything to work—you have to put it somewhere and import it when you need it (otherwise it has the same priority as the default Seq[A] instances).
Another issue with this encodeResources implementation is that it just returns an empty object if the Seq is empty:
scala> Seq.empty[Product].asJson.noSpaces
res2: String = {}
This is because the plural name is attached to the resource at the instance level, and if you don't have an instance there's no way to get it (short of reflection). You could of course conjure up a fake instance by passing nulls to the constructor or whatever, but that seems out of the scope of this question.
This issue (the resource names being attached to instances) is also going to be trouble if you need to decode this JSON you've encoded. If that is the case, I'd suggest considering a slightly different approach where you have something like a GenericResourceCompanion trait that you mix into the companion object for the specific resource type, and to indicate the names there. If that's not an option, you're probably stuck with reflection or fake instances, or both (but again, probably not in scope for this question).
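A minimal sketch of that companion-level idea (the ResourceNames typeclass and its wiring are my invention, using a plain Encoder rather than generic derivation):

```scala
import io.circe.{Encoder, Json}
import io.circe.syntax._

// Names live on a typeclass instance, not on the resource itself,
// so they are available even with no instance at hand (e.g. an empty Seq).
trait ResourceNames[A] {
  def singularName: String
  def pluralName: String
}

case class Product(name: String)

object Product {
  implicit val names: ResourceNames[Product] = new ResourceNames[Product] {
    val singularName = "product"
    val pluralName = "products"
  }
  implicit val plainEncoder: Encoder[Product] =
    Encoder.forProduct1("name")((p: Product) => p.name)
}

// Lexical scope gives this priority over circe's default Seq encoder.
implicit def encodeResources[A](implicit
  names: ResourceNames[A],
  enc: Encoder[A]
): Encoder[Seq[A]] = Encoder.instance { as =>
  Json.obj(names.pluralName -> Json.fromValues(as.map(_.asJson)))
}

// Seq(Product("car"), Product("truck")).asJson.noSpaces
//   == {"products":[{"name":"car"},{"name":"truck"}]}
// Seq.empty[Product].asJson.noSpaces == {"products":[]}  (no instance needed)
```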

trouble extracting recursive data structures in scala with json4s

I have a json format that consists of map -> map -> ... -> int for an arbitrary number of maps per key. The keys are always strings and the leaf types are always ints. The depth of the map structure varies per key in the map. For example, key "A" could be type Map[String, Int] and key "B" could be type Map[String, Map[String, Int]]. I know I can parse this format successfully as Map[String, Any], but I'm trying to preserve the types to make these structures easier to merge later in the code.
I can't seem to define my nested structure in such a way as to not throw an error on the json4s extract. I'm not quite sure if the problem is in my structure definition or if I'm not doing the json extract correctly.
Here is the code
sealed trait NestedMap[A]
case class Elem[A](e: A) extends NestedMap[A]
case class NMap[A](e: Map[String, NestedMap[A]]) extends NestedMap[A]
// XXX this next line doesn't seem to help or hurt
case class Empty() extends NestedMap[Nothing]

implicit val formats = DefaultFormats
val s = "{\"1\": 1, \"2\": 1, \"3\": {\"4\": 1}}"
val b = parse(s).extract[NMap[Int]]
Here is the error that always comes up
org.json4s.package$MappingException: No usable value for e
Expected object but got JNothing
Do I need to add another value that extends NestedMap? Am I going about this entirely wrong? Any help is much appreciated.
By default, a tree like yours is serialized to a different JSON than the one you are parsing.
import org.json4s._
import org.json4s.native.Serialization
import org.json4s.native.Serialization.{read, write}
import org.json4s.native.JsonMethods._
implicit val formats = Serialization.formats(NoTypeHints)
sealed trait Elem
case class Leaf(e: Int) extends Elem
case class Tree(e: Map[String, Elem]) extends Elem
scala> val t = Tree(Map("1"->Leaf(1),"2"->Leaf(2),"3"->Tree(Map("4"->Leaf(4)))))
t: Tree = Tree(Map(1 -> Leaf(1), 2 -> Leaf(2), 3 -> Tree(Map(4 -> Leaf(4)))))
scala> write(t)
res0: String = {"e":{"1":{"e":1},"2":{"e":2},"3":{"e":{"4":{"e":4}}}}}
scala> val jt = parse(res0)
jt: org.json4s.JValue = JObject(List((e,JObject(List((1,JObject(List((e,JInt(1))))), (2,JObject(List((e,JInt(2))))), (3,JObject(List((e,JObject(List((4,JObject(List((e,JInt(4))))))))))))))))
scala> val s = """{"1":1,"2":2, "3":{"4":1}}"""
s: String = {"1":1,"2":2, "3":{"4":1}}
scala> val st = parse(s)
st: org.json4s.JValue = JObject(List((1,JInt(1)), (2,JInt(2)), (3,JObject(List((4,JInt(1)))))))
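To actually get from the compact input format to the recursive structure, one option (my own sketch, not part of the answer above) is to skip reflective extract and fold the parsed JValue by hand:

```scala
import org.json4s._
import org.json4s.native.JsonMethods.parse

sealed trait NestedMap[A]
case class Elem[A](e: A) extends NestedMap[A]
case class NMap[A](e: Map[String, NestedMap[A]]) extends NestedMap[A]

// Recursively convert the parsed JSON: ints become Elem leaves,
// objects become NMap nodes, anything else is rejected.
def toNested(jv: JValue): NestedMap[Int] = jv match {
  case JInt(n)         => Elem(n.toInt)
  case JObject(fields) => NMap(fields.map { case (k, v) => k -> toNested(v) }.toMap)
  case other           => sys.error(s"unexpected JSON: $other")
}

val nested = toNested(parse("""{"1": 1, "2": 1, "3": {"4": 1}}"""))
// nested == NMap(Map("1" -> Elem(1), "2" -> Elem(1), "3" -> NMap(Map("4" -> Elem(1)))))
```

This keeps the typed NestedMap structure for later merging without requiring the JSON to carry the wrapper objects that json4s's default serialization produces.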

scala spray json optional id parameter in entity class

I'm using Salat library for serializing case classes as mongoDb objects. My Item.scala file looks like this:
case class Item(_id: String = (new ObjectId).toString, itemId: Int, var name: String, var active: Boolean) extends WithId {
  override def id: ObjectId = new ObjectId(_id)
}

object Item extends MongoDb[Item] with MongoDao[Item] {
  override def collectionName: String = "items"
}

object ItemJsonProtocol extends DefaultJsonProtocol {
  implicit val itemFormat = jsonFormat4(Item.apply)
}
Now, I'm using it to post the Item entities as Json via Spray HTTP. I'd want to invoke it as follows:
curl.exe -H "Content-Type: application/json" -X PUT -d "{\"itemId\":
1, \"active\":true, \"name\" : \"test\"}" http://localhost:8080/items/
hoping it would fall back to the generated id when I don't provide one.
However, after invoking curl command I'm getting an error:
The request content was malformed:
Object is missing required member '_id'
Is there any way to mark the _id field as optional without making it an Option (the field will always end up set) and without defining a custom JsonFormat and thus (de)serializing the object myself?
I've read this post: https://stackoverflow.com/a/10820293/1042869, but I was wondering if there's another way, since I have many cases of _id fields. There was also a comment saying "you can give that field a default value in the case class definition, so if the field is not in the json, it will assign the default value to it", but as you can see here that doesn't seem to work.
Best,
Marcin
So I solved the problem by writing a custom RootJsonFormat:
implicit object ItemJsonFormat extends RootJsonFormat[Item] {
  override def read(json: JsValue): Item = json.asJsObject.getFields("_id", "itemId", "name", "active") match {
    case Seq(JsString(_id), JsNumber(itemId), JsString(name), JsBoolean(active)) => Item(_id = _id, itemId = itemId.toInt, name = name, active = active)
    case Seq(JsNumber(itemId), JsString(name), JsBoolean(active)) => Item(itemId = itemId.toInt, name = name, active = active)
    case _ => throw new DeserializationException("Item expected")
  }

  override def write(obj: Item): JsValue = JsObject(
    "_id" -> JsString(obj._id),
    "itemId" -> JsNumber(obj.itemId),
    "name" -> JsString(obj.name),
    "active" -> JsBoolean(obj.active)
  )
}
Basically, it checks whether we received an _id in the JSON; if we did, it is used to construct the object, and otherwise the auto-generated id field is kept.
One other thing that might cause trouble and deserves mentioning somewhere: if anyone has a problem with nested ("non-primitive") objects, I advise using .toJson in the write def (like obj.time.toJson, where obj.time is a joda-time DateTime) and JsValue's .convertTo[T] in read (like time = JsString(time).convertTo[DateTime]). For this to work, implicit JSON formats have to be defined for those "non-primitive" objects.
Best,
Marcin
I would use this solution:
case class Item(_id: Option[String], itemId: Int, var name: String, var active: Boolean)

implicit object ItemJsonFormat extends RootJsonFormat[Item] {
  override def read(value: JsValue) = {
    val _id = fromField[Option[String]](value, "_id")
    val itemId = fromField[Int](value, "itemId")
    val name = fromField[String](value, "name")
    val active = fromField[Boolean](value, "active")
    Item(_id, itemId, name, active)
  }

  override def write(obj: Item): JsValue = JsObject(
    // _id is an Option here, so write JsNull when it is absent
    "_id" -> obj._id.map(JsString(_)).getOrElse(JsNull),
    "itemId" -> JsNumber(obj.itemId),
    "name" -> JsString(obj.name),
    "active" -> JsBoolean(obj.active)
  )
}
The advantage over the json.asJsObject.getFields solution is that you have better control over what gets accepted when the id is undefined. Examples where the getFields-based match would go wrong:
itemId is a string, the same type as the id
the id is defined but itemId is not
In such cases the match could interpret the supplied id as an itemId instead of flagging the error.