Converting "recursive" object to JSON (Play Framework 2.4 with Scala) - json

I have reached a point where my code compiles successfully, yet I have doubts about my solution and am posting this question for that reason.
I have a Node class defined as:
case class Node(id: Long, label: String, parent_id: Option[Long])
The reason I put "recursive" in quotes is that, technically, I do not store a Node within a Node. Rather, each node holds a pointer to its parent, and I can ask: give me all children of the node with id=X.
Here is an example tree, for the sake of visualization. I would like to supply the root node's ID and obtain the tree converted to a JSON string:
root_node
|_ node_1
| |_ node_11
| |_ node_111
|_ node_2
|_ node_3
The JSON would look like:
{"title": "root_node", "children": [...]}
with the children array containing node_1, node_2, and node_3, and so on recursively.
Here is the Writes Converter for Node:
/** JSON converter of Node to JSON */
implicit val NodeWrites = new Writes[Node] {
  def writes(node: Node) = Json.obj(
    "title" -> node.label,
    "children" -> Node.getChildrenOf(node.id)
  )
}
Quoting Play docs:
The Play JSON API provides implicit Writes for most basic types, such
as Int, Double, String, and Boolean. It also supports Writes for
collections of any type T that a Writes[T] exists.
I need to point out that Node.getChildrenOf(node.id) returns a List of Nodes from the DB. So, according to Play's docs, I should be able to convert a List[Node] to JSON. It seems that doing this within the Writes converter itself is a bit more troublesome.
Here is the resulting error from running this code:
type mismatch;
found : List[models.Node]
required: play.api.libs.json.Json.JsValueWrapper
Note: implicit value NodeWrites is not applicable here because it comes after the application point and it lacks an explicit result type
I added the "explicit result type" to my Writes converter; here is the result:
/** JSON converter of Node to JSON */
implicit val NodeWrites: Writes[Node] = new Writes[Node] {
  def writes(node: Node) = Json.obj(
    "title" -> node.label,
    "children" -> Node.getChildrenOf(node.id)
  )
}
The code now executes properly, and I can visualize the tree in the browser.
Even though this looks to me like the cleanest working solution, IntelliJ still complains about the line:
"children" -> Node.getChildrenOf(node.id)
saying:
Type mismatch: found (String, List[Node]), required (String, Json.JsValueWrapper)
Could it be that IntelliJ's error reporting is not based on the Scala compiler?
Finally, is the overall approach of the JSON converter terrible?
Thanks and sorry for the long post.

The problem lies in "children" -> Node.getChildrenOf(node.id). Node.getChildrenOf(node.id) returns a List[Node], whereas every value in a Json.obj must be a JsValueWrapper; in this case, a JsArray.
Something like this should work:
implicit val writes = new Writes[Node] {
  def writes(node: Node) = Json.obj(
    "title" -> node.label,
    // Note: we pass `this` explicitly, since the implicit Writes hasn't been fully defined yet.
    "children" -> JsArray(Node.getChildrenOf(node.id).map(child => Json.toJson(child)(this)))
  )
}
This at least compiles, I haven't tested it with any data though.
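For completeness, a quick usage sketch (hedged: Node.getChildrenOf and the sample root node are placeholders standing in for the question's DB-backed code):
// With the Writes above in scope, serializing the whole tree is one call.
val root = Node(1L, "root_node", None)
val jsonString: String = Json.stringify(Json.toJson(root))
println(jsonString) // e.g. {"title":"root_node","children":[...]}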

Related

Error Accumulation with Scalaz Validation

I have a complex JSON document that is persisted in a database. Its complexity is "segregated" into "blocks", as follows:
Entire JSON:
{
  "block1" : {
    "param1" : "val1",
    "param2" : "val2"
  },
  "block2" : {
    "param3" : "val3",
    "param4" : "val4"
  },
  ...
}
In the database, every block is stored and treated individually:
Persisted Blocks
"block1" : {
"param1" : "val1",
"param2" : "val2"
}
"block2" : {
"param3" : "val3",
"param4" : "val4"
}
Every block has a business meaning, so each one is mapped to a case class.
I'm building a Play API that stores, updates, and retrieves this JSON structure, and I want to validate whether someone has altered its data, for the sake of integrity.
I'm doing the retrieval (parsing and validation) of every block as follows:
val block1 = Json.parse(block1).validate[Block1].get
val block2 = Json.parse(block2).validate[Block2].get
...
The case-classes:
trait Block
sealed case class Block1 (param1: String, param2: String, ...) extends Block
sealed case class Block2 (param3: String, param4: String, ...) extends Block
sealed case class Request (block1: Block1, block2: Block2, ...)
With the current structure, if some field is altered and doesn't match its defined type, Play throws this exception:
[NoSuchElementException: JsError.get]
So, I want to build an accumulative error structure with Scalaz and Validation that catches all possible parsing and validation errors. I have seen this and this, so I've coded the validation this way:
def build(block1: String, block2: String, ...): Validation[NonEmptyList[String], Request] = {
  val block1 = Option(Json.parse(block1).validate[Block1].get).toSuccess("Error").toValidationNel
  val block2 = Option(Json.parse(block2).validate[Block2].get).toSuccess("Error").toValidationNel
  ...
  val request = (Request.apply _).curried
  blockn <*> (... <*> (... <*> (... <*> (block2 <*> (block1 map request)))))
}
Note that I'm using the applicative functor <*> because Request has 20 fields (building that is a parenthesis mess with this syntax); the |#| applicative builder only applies to case classes with up to 12 parameters.
That code works for the happy path, but when I modify some field, Play throws the exception shown above.
Question: I want to accumulate all possible structure faults that Play can detect while parsing every block. How can I do that?
Note: If Shapeless can help with this in some way, I'm open to using it (I'm already using it).
If multi-validation is the only reason for adding Scalaz to your project, then why not consider an alternative that will not require you to adapt your Play project to the monadic way in which Scalaz forces you to solve problems (which might be a good thing, but not necessarily if the only reason for it is multi-validation)?
Instead, consider using a standard Scala approach with scala.util.Try:
val block1 = Try(Json.parse(block1).validate[Block1].get)
val block2 = Try(Json.parse(block2).validate[Block2].get)
...
val allBlocks : List[Try[Block]] = List(block1, block2, ...)
val failures : List[Failure[Block]] = allBlocks.collect { case f : Failure[Block] => f }
This way you can still operate on standard Scala collections to retrieve the list of failures for further processing. This approach also does not constrain you in terms of the number of blocks.
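As a side note, Play's JsResult already accumulates per-path validation errors, and calling .get throws that detail away. Here is a minimal sketch of surfacing them without Scalaz (the parseBlock helper and the error formatting are illustrative, not from the original post):
import play.api.libs.json._

// Keep the JsResult instead of calling .get, and turn Play's accumulated
// (path, errors) pairs into readable messages. Json.parse itself still
// throws on malformed input, so wrap it in a Try if that can happen.
def parseBlock[A: Reads](name: String, raw: String): Either[List[String], A] =
  Json.parse(raw).validate[A] match {
    case JsSuccess(a, _) => Right(a)
    case JsError(errors) =>
      Left(errors.toList.map { case (path, es) =>
        s"$name$path: ${es.map(_.message).mkString(", ")}"
      })
  }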

Abstraction to extract data from JSON in Scala

I am looking for a good abstraction for extracting data from JSON (I am using json4s now).
Suppose I have a case class A and data in JSON format.
case class A(a1: String, a2: String, a3: String)
{"a1":"xxx", "a2": "yyy", "a3": "zzz"}
I need a function that extracts the JSON data and returns an A built from it, as follows:
val a: JValue => A = ...
I do not want to write the function a from scratch. I would rather compose it from primitive functions.
For example, I can write a primitive function to extract string by field name:
val str: (String, JValue) => String = {(fieldName, jval) => ... }
Now I would like to compose the function a: JValue => A from str. Does that make sense?
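For concreteness, the composition being described might look roughly like this, a sketch against json4s using the A and str signatures above (the empty-string fallback is a placeholder, not real error handling):
import org.json4s._

// Primitive: extract a string field by name, with a placeholder fallback.
val str: (String, JValue) => String = (fieldName, jval) =>
  jval \ fieldName match {
    case JString(s) => s
    case _          => ""
  }

// Composite: build an A from three applications of the primitive.
val a: JValue => A = jval => A(str("a1", jval), str("a2", jval), str("a3", jval))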
Consider using Play JSON, which has a composable Reads object. If you've ever used ReactiveMongo, it can be used in much the same way. Contrary to some older posts here, it can be used stand-alone, without most of the rest of Play.
It uses the common "implicit translator" (my term) idiom. I found that my favorite deserialization pattern is not highlighted in the docs, though; the pattern they espouse is a lot harder to get right, IMHO. I make heavy use of .as and .asOpt, which are documented on the first linked page above, in the small section "Using JsValue.as/asOpt". When deserializing a JSON object, you can say something like:
val person: Person = (someParsedJsonObject \ "aPerson").as[Person]
and as long as you have an implicit Reads[Person] in scope, it all just works. There are built-in Reads for all primitive types and many collection types. In many cases, it makes sense to put the implicit Reads and Writes objects in the companion object of, e.g., Person.
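A minimal sketch of that pattern (the Person fields and the sample JSON are invented for illustration):
import play.api.libs.json._
import play.api.libs.functional.syntax._

case class Person(name: String, age: Int)

object Person {
  // Keeping the Reads in the companion object puts it in implicit scope everywhere.
  implicit val personReads: Reads[Person] = (
    (__ \ "name").read[String] and
    (__ \ "age").read[Int]
  )(Person.apply _)
}

val parsed = Json.parse("""{"aPerson": {"name": "Ann", "age": 41}}""")
val person: Person = (parsed \ "aPerson").as[Person]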
I thought json4s had a similar feature, but I could be wrong.
Argonaut is a purely functional Scala JSON library.
It allows you to encode and decode case classes (JSON codecs).
import argonaut._, Argonaut._

case class Person(name: String, age: Int)

implicit def PersonDecodeJson: DecodeJson[Person] =
  jdecode2L(Person.apply)("name", "age")

// Codec for the Person case class, from JSON of the form
// { "name": "string", "age": 1 }
It also provides a JSON cursor (lenses/Monocle) for custom parsing.
implicit def PersonDecodeJson: DecodeJson[Person] =
  DecodeJson(c => for {
    name <- (c --\ "_name").as[String]
    age  <- (c --\ "_age").as[String].map(_.toInt)
  } yield Person(name, age))

// Decodes a Person from JSON whose property names differ from those of the
// case class, and whose age is passed as a string:
// { "_name": "string", "_age": "10" }
The parsing result is represented by the DecodeResult type, which can be composed (.map, .flatMap) and handles error cases.
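A quick usage sketch (assuming the cursor-based decoder above is in scope):
import argonaut._, Argonaut._

// decodeOption parses the raw string and runs the implicit DecodeJson,
// returning None on any parse or decode failure.
val maybePerson: Option[Person] = """{"_name": "Ann", "_age": "41"}""".decodeOption[Person]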

Suggestions for Writing Map as JSON file in Scala

I have a simple key-value Map[K, V], myDictionary, that is populated by my program; at the end, I want to write it out as a JSON-formatted string to a text file, since I will need to parse it again later.
I was using this code earlier,
Some(new PrintWriter(outputDir + "/myDictionary.json")).foreach { p =>
  p.write(compact(render(decompose(myDictionary))))
  p.close()
}
I found it to be slower as the input size increased. Later, I used this:
var out = new PrintWriter(outputDir + "/myDictionary.json")
out.println(scala.util.parsing.json.JSONObject(myDictionary.toMap).toString())
This is proving to be a bit faster.
I have run it on sample input and found it faster than my earlier approach. I expect my input map to reach at least a million (K, V) entries (a >1 GB text file), so I want to make sure I follow a fast and memory-efficient approach to serializing the map. What other approaches would you recommend I look into to optimize this?
The JSON support in the standard Scala library is probably not the best choice. Unfortunately, the situation with JSON libraries for Scala is a bit confusing; there are many alternatives (Lift JSON, Play JSON, Spray JSON, Twitter JSON, Argonaut, ...), basically one library for each day of the week... I suggest you have a look at these, at least to see if any of them is easier to use and more performant.
Here is an example using Play JSON which I have chosen for particular reasons (being able to generate formats with macros):
object JsonTest extends App {
  import play.api.libs.json._

  type MyDict = Map[String, Int]

  implicit object MyDictFormat extends Format[MyDict] {
    def reads(json: JsValue): JsResult[MyDict] = json match {
      case JsObject(fields) =>
        val b = Map.newBuilder[String, Int]
        fields.foreach {
          case (k, JsNumber(v)) => b += k -> v.toInt
          case other => return JsError(s"Not a (string, number) pair: $other")
        }
        JsSuccess(b.result())
      case _ => JsError(s"Not an object: $json")
    }

    def writes(m: MyDict): JsValue = {
      val fields: Seq[(String, JsValue)] = m.map {
        case (k, v) => k -> JsNumber(v)
      } (collection.breakOut)
      JsObject(fields)
    }
  }

  val m = Map("hallo" -> 12, "gallo" -> 34)
  val serial = Json.toJson(m)
  val text = Json.stringify(serial)
  println(text)

  val back = Json.fromJson[MyDict](serial)
  assert(back == JsSuccess(m), s"Failed: $back")
}
While you can construct and deconstruct JsValues directly, the main idea is to use a Format[A], where A is the type of your data structure. This puts more emphasis on type safety than the standard Scala library's JSON support. It looks more verbose, but in the end I think it's the better approach.
There are utility methods Json.toJson and Json.fromJson which look for an implicit format of the type you want.
On the other hand, it does construct everything in memory and it does duplicate your data structure (because for each entry in your map you will have another tuple (String, JsValue)), so this isn't necessarily the most memory-efficient solution, given that you are operating at GB magnitude...
Jerkson is a Scala wrapper for the Java JSON library Jackson. The latter has the ability to stream data. I found this project which says it adds streaming support. Play JSON in turn is based on Jerkson, so perhaps you can even figure out how to stream your object with that. See also this question.
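For the GB-scale case, here is a sketch of writing the map with Jackson's streaming JsonGenerator directly, so the full JSON text is never held in memory (assumes a jackson-core dependency; the sample map and output path are placeholders):
import com.fasterxml.jackson.core.{JsonEncoding, JsonFactory}
import java.io.File

val myDictionary: Map[String, Int] = Map("hallo" -> 12, "gallo" -> 34)

val gen = new JsonFactory().createGenerator(
  new File("/tmp/myDictionary.json"), JsonEncoding.UTF8)
try {
  gen.writeStartObject()
  myDictionary.foreach { case (k, v) =>
    gen.writeNumberField(k, v) // emits "k": v; Jackson flushes incrementally
  }
  gen.writeEndObject()
} finally gen.close()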

Dynamic JSON Creation in Play

I'm trying to dynamically create a JSON object in the following way. Note that the code below will not compile; I'm asking for your help on how to write it.
val favoriteFoods: JsArray = getArray() // gets an Array of Array of JSON objects

val json: JsObject = Json.obj(
  "name" -> JsString("Kevin"),
  "FavoriteFood1" -> favoriteFoods.get(0), // note that I made up get()
  "FavoriteFood2" -> favoriteFoods.get(1)
)
Looking at the JsArray docs, I didn't see any way to get the i-th element of a JsArray.
I tried to add an if statement to check whether a new FavoriteFood could be added, but it would not compile.
You didn't specify the version of Play Framework you're using. I'm familiar with 1.x, so this answer is from that perspective. Play provides a renderJSON() method. Here are the docs:
http://www.playframework.com/documentation/1.2.4/controllers
Scroll down to the section titled 'Return a JSON String'.
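For Play 2.x (which the Json.obj/JsArray syntax in the question suggests), here is a sketch of both indexing into a JsArray and conditionally adding fields; the sample data stands in for the question's getArray():
import play.api.libs.json._

val favoriteFoods: JsArray = Json.arr(Json.obj("food" -> "pizza"), Json.obj("food" -> "sushi"))

// JsArray exposes its elements as a Seq[JsValue] via .value, so ordinary
// indexing and bounds checks apply.
val json: JsObject = Json.obj(
  "name" -> "Kevin",
  "FavoriteFood1" -> favoriteFoods.value(0),
  "FavoriteFood2" -> favoriteFoods.value(1)
)

// Conditionally adding a field: build the extra JsObject and merge with ++.
val withThird: JsObject =
  if (favoriteFoods.value.size > 2)
    json ++ Json.obj("FavoriteFood3" -> favoriteFoods.value(2))
  else json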

Inserting JsNumber into Mongo

When trying to insert a MongoDBObject that contains a JsNumber
val obj: DBObject = getDbObj // contains a "JsNumber()"
collection.insert(obj)
the following error occurs:
[error] play - Cannot invoke the action, eventually got an error: java.lang.IllegalArgumentException: can't serialize class scala.math.BigDecimal
I tried to replace the JsNumber with an Int, but I got the same error.
EDIT
The error can be reproduced with this test code. The full ScalaTest code is in this gist: https://gist.github.com/kman007us/6617735
val collection = MongoConnection()("test")("test")
val obj: JsValue = Json.obj("age" -> JsNumber(100))
val q = MongoDBObject("name" -> obj)
collection.insert(q)
There are no registered handlers for Play's JSON implementation. You could add handlers to automatically translate Play's Js types to BSON types. However, that won't handle MongoDB Extended JSON, which has a special structure for non-native JSON types, e.g. date and ObjectId translations.
An example of using this is:
import com.mongodb.util.JSON
val obj: JsValue = Json.obj("age" -> JsNumber(100))
val doc: DBObject = JSON.parse(obj.toString).asInstanceOf[DBObject]
For an example of a BSON transformer, see the Joda Time transformer.
It seems that the Casbah driver isn't compatible with Play's JSON implementation. Looking through the Casbah code, it seems that you must use MongoDBObject instances to build your query. The following snippet should work.
val collection = MongoConnection()("test")("test")
val obj = MongoDBObject("age" -> 100)
val q = MongoDBObject("name" -> obj)
collection.insert(q)
If you need the compatibility with Play's JSON implementation then use ReactiveMongo and Play-ReactiveMongo.
Edit
Maybe this gist can help with converting JsValue objects into MongoDBObject objects.
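A sketch of the ReactiveMongo route (assuming the Play-ReactiveMongo plugin is set up; import paths varied across plugin versions, so treat these as illustrative):
import play.api.libs.json._
import play.modules.reactivemongo.json.collection.JSONCollection
import scala.concurrent.ExecutionContext.Implicits.global

// With a JSONCollection, Play JsObjects are inserted directly, so JsNumber
// needs no manual translation to BSON.
def insertAge(collection: JSONCollection) = {
  val doc: JsObject = Json.obj("name" -> Json.obj("age" -> JsNumber(100)))
  collection.insert(doc) // returns a Future
}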