JSON to Map[String,JsObject] with Play 2.0?

I'm new to both Play! & Scala, but I'm trying to make a service that will map the JSON request to a Map[String,JsObject] (or Map[String,JsValue], I'm not sure about the distinction), and then output a list of the keys recursively through the map (preferably as a tree).
But I'm having trouble getting started:
def genericJSONResponse = Action(parse.json) {
  request => request.body
  var keys = request.keys
  Ok("OK")
}
What I would expect here is for keys to be filled with the keys from the request, but of course it doesn't compile. How should I approach this, given the description above?
Thanks in advance for helping out a Scala noob :-)
Nik

JsValue is the base class for all JSON values. JsObject is a subtype of JsValue (along with JsNull, JsUndefined, JsBoolean, JsNumber, JsString, and JsArray). Take a look at JSON spec if it's unclear: http://json.org/
If you know that the JSON in the body request is a JSON object (as opposed to other types listed above) you can pattern-match it:
def genericJSONResponse = Action(parse.json) { request =>
  request.body match {
    case JsObject(fields) => Ok("received object: " + fields.toMap + '\n')
    case _ => Ok("received something else: " + request.body + '\n')
  }
}
fields.toMap is of the type you wanted, Map[String, JsValue], so you can use map or collect to process the object's keys recursively. (By the way, you can use fields directly, since it's a Seq[(String, JsValue)] and supports map and collect as well.)
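If you need the key listing recursively, as the question asks, here is a minimal sketch assuming Play JSON; allKeys is a hypothetical helper, not part of Play, that walks a JsValue and collects every key path:

import play.api.libs.json._

def allKeys(js: JsValue, prefix: String = ""): Seq[String] = js match {
  case JsObject(fields) =>
    fields.toSeq.flatMap { case (k, v) =>
      val path = if (prefix.isEmpty) k else s"$prefix.$k"
      path +: allKeys(v, path)
    }
  case JsArray(values) =>
    values.zipWithIndex.flatMap { case (v, i) => allKeys(v, s"$prefix[$i]") }
  case _ => Seq.empty
}

Calling allKeys(request.body) inside the action would then give every key, with nested keys joined by dots.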

Related

How can I use HTTP request headers for content negotiation in a Marshaller?

My app supports protobuf and JSON serialization. For JSON serialization I use com.trueaccord.scalapb.json.JsonFormat; my DTOs are generated from proto definitions.
The com.trueaccord serializer wraps option types in JSON objects, which is causing issues for some clients, so I want to be able to support org.json4s without breaking the existing clients.
I would like to be able to pick a serializer based on a custom HTTP header called JFORMAT. The idea is that if this header is sent I will use json4s, otherwise I will use the trueaccord serializer.
I managed to create an Unmarshaller which can pick a request serializer based on a header value:
Unmarshaller.withMaterializer[HttpRequest, T](_ => implicit mat => {
  case request: HttpRequest =>
    val entity = request.entity
    entity.dataBytes.runFold(ByteString.empty)(_ ++ _).map(data => {
      entity.contentType match {
        case `applicationJsonContentType` =>
          val jsFormat = {
            val header = request.headers.find(h => h.name() == jsonFormatHeaderName)
            if (header.isEmpty) "1.0" else header.get.value()
          }
          val charBuffer = Unmarshaller.bestUnmarshallingCharsetFor(entity)
          val jsonText = data.decodeString(charBuffer.nioCharset().name())
          val dto = if (jsFormat == "2.0") {
            write[T](value)(formats) // New Formatter
          } else {
            JsonFormat.fromJsonString[T](jsonText) // Old Formatter
          }
          dto
        case `protobufContentType` =>
          companion.parseFrom(CodedInputStream.newInstance(data.asByteBuffer)) // Proto Formatter
        case _ =>
          throw UnsupportedContentTypeException(applicationJsonContentType, protobufContentType)
      }
    })
})
I want to do the same with my Marshaller, which I use with Marshaller.oneOf. The JSON-handling one looks like:
Marshaller.withFixedContentType(contentType) { value =>
  val jsonText = JsonSerializer.toJsonString[T](value)
  HttpEntity(contentType, jsonText)
}
Is there a way to construct a Marshaller which is aware of the request's HTTP headers? The Akka HTTP docs don't have any examples, and I cannot make sense of the PredefinedToRequestMarshallers.
Do I need to combine multiple marshallers somehow, or can I attach some metadata to a context during request handling that I can use later in the Marshaller? I want to avoid appending meta to my DTO if possible, or using a custom content type like application/vnd.api+json.
There is lots of other useful information I could use from the request when I format the response, such as Accept-Encoding, or custom headers like a unique request id to build a correlation id; I could also add JSONP support by reading the callback query parameter, etc.
To clarify: I need a solution that uses the Marshaller, a subclass of it, a custom version created by a factory method, or maybe multiple Marshallers chained together. Marshaller.withFixedContentType already works with the Accept header, so there must be a way. I added a bounty to reward a solution to this specific challenge. I am aware of hacks and workarounds; I asked the question because I need a clean solution to a specific scenario.
The Custom Marshallers section of the docs mentions the overloaded Marshaller.oneOf methods, which seem to be what you want:
Helper for creating a "super-marshaller" from a number of
"sub-marshallers". Content-negotiation determines, which
"sub-marshaller" eventually gets to do the job.
The Marshaller companion object has many methods that receive a Seq[HttpHeader]. You can look into their implementations as well.
I don't have the time to look into the source code myself, but if this is not enough to put you on the right path, let me know.
Edit:
How about?
get {
  optionalHeaderValueByName("JFORMAT") { format =>
    complete {
      format match {
        case Some(f) => "Complete with json4s"
        case _ => "Complete with trueaccord"
      }
    }
  }
}
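Building on that directive-based idea, here is a rough sketch of picking the response serializer per request. Dto, toJson4s and toTrueaccord below are hypothetical placeholders standing in for your generated DTO and the two serializers:

import akka.http.scaladsl.model.{ContentTypes, HttpEntity}
import akka.http.scaladsl.server.Directives._

// Hypothetical stand-ins; replace with your generated DTO and real serializers.
case class Dto(name: String)
def toJson4s(d: Dto): String = s"""{"name":"${d.name}"}"""
def toTrueaccord(d: Dto): String = s"""{"name":{"value":"${d.name}"}}"""
val dto = Dto("example")

val route =
  get {
    optionalHeaderValueByName("JFORMAT") { format =>
      val jsonText = format match {
        case Some("2.0") => toJson4s(dto)      // new json4s-style output
        case _           => toTrueaccord(dto)  // existing trueaccord output
      }
      complete(HttpEntity(ContentTypes.`application/json`, jsonText))
    }
  }

This keeps the header decision in the route rather than inside a Marshaller, since a plain Marshaller only takes part in Accept-based content negotiation and never sees other request headers.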

Play Framework 2.2.1 / How should controller handle a queryString in JSON format

My client side executes a server call encompassing data (queryString) in a JSON object like this:
?q={"title":"Hello"} //non encoded for the sample but using JSON.stringify actually
What is an efficient way to retrieve the title and Hello String?
I tried this:
val params = request.queryString.map {case(k,v) => k->v.headOption}
that returns the Tuple: (q,Some({"title":"hello"}))
I could further extract to retrieve the values (although I would need to manually map the JSON object to a Scala object), but I wonder whether there is an easier and shorter way.
Any idea?
First, if you intend to pluck only the q parameter from a request and don't intend to do so via a route you could simply grab it directly:
val q: Option[String] = request.getQueryString("q")
Next you'd have to parse it as a JSON Object:
import play.api.libs.json._
val jsonObject: Option[JsValue] = q.map { raw: String => Json.parse(raw) }
with that you should be able to check for the components the jsonObject contains:
val title: Option[String] = jsonObject.flatMap { json: JsValue =>
  (json \ "title").asOpt[String]
}
In short, omitting the types you could use a for comprehension for the whole thing like so:
val title = for {
  q <- request.getQueryString("q")
  json <- Try(Json.parse(q)).toOption
  titleValue <- (json \ "title").asOpt[String]
} yield titleValue
Try is defined in scala.util and basically catches exceptions and wraps them in a processable form.
I must admit that the last version simply ignores exceptions during the parsing of the raw JSON string and treats them the same as "no title query has been set".
That's not the best way to know what actually went wrong.
In our production code we're using implicit shortcuts that wrap a None or a JsError as a Failure:
val title: Try[String] = for {
  q <- request.getQueryString("q") orFailWithMessage "Query not defined"
  json <- Try(Json.parse(q))
  titleValue <- (json \ "title").validate[String].asTry
} yield titleValue
Staying in the Try monad, we gain information about where it went wrong and can provide that to the user.
orFailWithMessage is basically an implicit wrapper for an Option that will transform it into a Success or a Failure with the specified message.
JsResult.asTry is likewise simply a pimped JsResult that becomes a Success or a Failure as well.
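For context, here is a hypothetical reconstruction of what such helpers could look like; neither orFailWithMessage nor asTry is part of Play itself, this is only a sketch of the idea described above:

import play.api.libs.json.{JsResult, JsSuccess, JsError}
import scala.util.{Try, Success, Failure}

object JsonTryHelpers {
  implicit class OptionOps[A](opt: Option[A]) {
    // Turn a missing Option into a Failure carrying the given message.
    def orFailWithMessage(message: String): Try[A] =
      opt.fold[Try[A]](Failure(new NoSuchElementException(message)))(Success(_))
  }

  implicit class JsResultOps[A](res: JsResult[A]) {
    // Bridge a JsResult into the Try monad used in the for comprehension.
    def asTry: Try[A] = res match {
      case JsSuccess(value, _) => Success(value)
      case JsError(errors)     => Failure(new IllegalArgumentException(errors.toString))
    }
  }
}

With import JsonTryHelpers._ in scope, the for comprehension above should work against these definitions.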

Suggestions for Writing Map as JSON file in Scala

I have a simple key-valued Map[K,V] myDictionary that is populated by my program, and at the end I want to write it out as a JSON-formatted string in a text file, as I will need to parse it later.
I was using this code earlier:
Some(new PrintWriter(outputDir+"/myDictionary.json")).foreach{p => p.write(compact(render(decompose(myDictionary)))); p.close}
I found it to be slower as the input size increased. Later, I used this:
var out = new PrintWriter(outputDir+"/myDictionary.json");
out.println(scala.util.parsing.json.JSONObject(myDictionary.toMap).toString())
This is proving to be a bit faster.
I have run this on sample input and found that it is faster than my earlier approach. I'm assuming my input map size would reach at least a million (K,V) entries (a >1GB text file), hence I want to make sure that I follow the faster and more memory-efficient approach to the Map serialization process. What other approaches would you recommend that I can look into to optimize this?
The JSON support in the standard Scala library is probably not the best choice. Unfortunately the situation with JSON libraries for Scala is a bit confusing; there are many alternatives (Lift JSON, Play JSON, Spray JSON, Twitter JSON, Argonaut, ...), basically one library for each day of the week... I suggest you have a look at these, at least to see if any of them is easier to use and more performant.
Here is an example using Play JSON which I have chosen for particular reasons (being able to generate formats with macros):
object JsonTest extends App {
  import play.api.libs.json._

  type MyDict = Map[String, Int]

  implicit object MyDictFormat extends Format[MyDict] {
    def reads(json: JsValue): JsResult[MyDict] = json match {
      case JsObject(fields) =>
        val b = Map.newBuilder[String, Int]
        fields.foreach {
          case (k, JsNumber(v)) => b += k -> v.toInt
          case other => return JsError(s"Not a (string, number) pair: $other")
        }
        JsSuccess(b.result())
      case _ => JsError(s"Not an object: $json")
    }

    def writes(m: MyDict): JsValue = {
      val fields: Seq[(String, JsValue)] = m.map {
        case (k, v) => k -> JsNumber(v)
      } (collection.breakOut)
      JsObject(fields)
    }
  }

  val m = Map("hallo" -> 12, "gallo" -> 34)
  val serial = Json.toJson(m)
  val text = Json.stringify(serial)
  println(text)

  val back = Json.fromJson[MyDict](serial)
  assert(back == JsSuccess(m), s"Failed: $back")
}
While you can construct and deconstruct JsValues directly, the main idea is to use a Format[A] where A is the type of your data structure. This puts more emphasis on type safety than the standard Scala library JSON. It looks more verbose, but in the end I think it's the better approach.
There are utility methods Json.toJson and Json.fromJson which look for an implicit format of the type you want.
On the other hand, it does construct everything in memory and it does duplicate your data structure (because for each entry in your map you will have another tuple (String, JsValue)), so this isn't necessarily the most memory-efficient solution, given that you are operating at GB magnitude...
Jerkson is a Scala wrapper for the Java JSON library Jackson. The latter apparently has the feature to stream data. I found this project which says it adds streaming support. Play JSON in turn is based on Jerkson, so perhaps you can even figure out how to stream your object with that. See also this question.
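If memory is the main concern, one option hinted at above is Jackson's streaming API, which writes entries one by one instead of materialising the whole JSON string first. A rough sketch, assuming jackson-core is on the classpath and a Map[String, Int] as in the example above:

import com.fasterxml.jackson.core.{JsonEncoding, JsonFactory}
import java.io.File

def writeDictionary(myDictionary: Map[String, Int], outputDir: String): Unit = {
  val gen = new JsonFactory().createGenerator(
    new File(outputDir + "/myDictionary.json"), JsonEncoding.UTF8)
  try {
    gen.writeStartObject()                                              // {
    myDictionary.foreach { case (k, v) => gen.writeNumberField(k, v) }  // "key": 123, ...
    gen.writeEndObject()                                                // }
  } finally gen.close()
}

The JSON text is flushed to disk as it is produced, so nothing close to the full >1GB string is ever held in memory (the map itself of course still is).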

Optimal way to read out JSON from MongoDB into a Scalatra API

I have a pre-formatted JSON blob stored as a string in MongoDB as a field in one of collections. Currently in my Scalatra based API, I have a before filter that renders all of my responses with a JSON content type. An example of how I return the content looks like the following:
get ("/boxscore", operation(getBoxscore)) {
val game_id:Int = params.getOrElse("game_id", "3145").toInt
val mongoColl = mongoDb.apply("boxscores")
val q: DBObject = MongoDBObject("game_id" -> game_id)
val res = mongoColl.findOne(q)
res match {
case Some(j) => JSON.parseFull(j("json_body").toString)
case None => NotFound("Requested document could not be found.")
}
}
Now this certainly does work, but it doesn't seem like the "Scala" way of doing things and I feel like it can be optimized. The worrisome part to me is that, when I add a caching layer and the cache doesn't hit, I am spending additional CPU time on re-parsing a String I already formatted as JSON in MongoDB:
JSON.parseFull(j("json_body").toString)
I have to take the result from findOne(), run .toString on it, then re-parse it into JSON afterwards. Is there a more optimal route? Since the JSON is already stored as a String in MongoDB, I'm guessing a serializer / case class isn't the right solution here. Of course I can just leave what's here - but I'd like to learn if there's a way that would be more Scala-like and CPU friendly going forward.
There is the option to extend Scalatra's render pipeline with handling for MongoDB classes. The following two routes act as an example. They return a MongoCursor and a DBObject as result. We are going to convert those to a string.
get("/") {
mongoColl.find
}
get("/:key/:value") {
val q = MongoDBObject(params("key") -> params("value"))
mongoColl.findOne(q) match {
case Some(x) => x
case None => halt(404)
}
}
In order to handle the types we need to define a partial function which takes care of the conversion and sets the appropriate content type.
There are two cases, the first one handles a DBObject. The content type is set to "application/json" and the object is converted to a string by calling the toString method. The second case handles a MongoCursor. Since it implements TraversableOnce the map function can be used.
def renderMongo = {
  case dbo: DBObject =>
    contentType = "application/json"
    dbo.toString

  case xs: TraversableOnce[_] => // handles a MongoCursor, be aware of type erasure here
    contentType = "application/json"
    val ls = xs map (x => x.toString) mkString(",")
    "[" + ls + "]"
}: RenderPipeline
(Note the following type definition: type RenderPipeline = PartialFunction[Any, Any])
Now the method needs to get hooked in. After an HTTP call has been handled, the result is forwarded to the render pipeline for further conversion. Custom handling can be added by overriding the renderPipeline method from ScalatraBase. With the following definition the renderMongo function is called first:
override protected def renderPipeline = renderMongo orElse super.renderPipeline
This is a basic approach to handle MongoDB types. There are other options as well, for example by making use of json4s-mongo.
Here is the previous code in a working sample project.
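If the only concern in the original route is the re-parsing, another option in the same spirit (set the content type, return a string) is to return the stored json_body text verbatim, since it is already valid JSON; this is only a sketch based on the question's own route:

get("/boxscore", operation(getBoxscore)) {
  val game_id: Int = params.getOrElse("game_id", "3145").toInt
  val q: DBObject = MongoDBObject("game_id" -> game_id)
  mongoDb("boxscores").findOne(q) match {
    case Some(doc) =>
      contentType = "application/json"
      doc("json_body").toString // already serialized JSON, no parsing needed
    case None => NotFound("Requested document could not be found.")
  }
}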

How do I deserialize a JSON array using the Play API

I have a string that is a JSON array of two objects.
> val ss = """[ {"key1" :"value1"}, {"key2":"value2"}]"""
I want to use the Play Json libraries to deserialize it and create a map from the key values to the objects.
def deserializeJsonArray(ss:String):Map[String, JsValue] = ???
// Returns Map("value1" -> {"key1" :"value1"}, "value2" -> {"key2" :"value2"})
How do I write the deserializeJsonArray function? This seems like it should be easy, but I can't figure it out from either the Play documentation or the REPL.
I'm a bit rusty, so please forgive the mess. Perhaps another overflower can come in here and clean it up for me.
This solution assumes that the JSON is an array of objects, and each of the objects contains exactly one key-value pair. I would highly recommend spicing it up with some error handling and/or pattern matching to validate the parsed JSON string.
def deserializeJsonArray(ss: String): Map[String, JsValue] = {
  val objects: Seq[JsObject] = Json.parse(ss).as[Seq[JsObject]]
  // Each object holds exactly one key-value pair; its (string) value becomes
  // the map key, matching the expected output in the question.
  val keys: Seq[String] = objects.map(obj => obj.values.head.as[String])
  (keys zip objects).toMap
}
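Following that suggestion, here is a sketch with basic validation; Json.parse will still throw on syntactically invalid input, and the single value of each object is assumed to be a string, as in the example:

import play.api.libs.json._

def deserializeJsonArraySafe(ss: String): Map[String, JsValue] =
  Json.parse(ss).validate[Seq[JsObject]] match {
    case JsSuccess(objects, _) =>
      objects.flatMap { obj =>
        // Skip elements that are empty or whose single value is not a string.
        obj.values.headOption.flatMap(_.asOpt[String]).map(key => key -> (obj: JsValue))
      }.toMap
    case JsError(_) => Map.empty
  }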