How to get all data I have inserted? - json

I have made a small app using JSON and ReactiveMongo which inserts students' information.
object Applications extends Controller {
  val studentDao = StudentDaoAndEntity
  val studentqueryReader: Reads[JsObject] = implicitly[Reads[JsObject]]

  def saveStudent = Action.async(parse.json) { request =>
    request.body.validate[StudentInfo].map { k =>
      studentDao.insertStudent(k).map { l =>
        Ok("Successfully inserted")
      }
    }.getOrElse(Future.successful(BadRequest("Invalid Json")))
  }
}
In the database:
object StudentDaoAndEntity {
  val sreader: Reads[StudentInfo] = Json.reads[StudentInfo]
  val swriter: Writes[StudentInfo] = Json.writes[StudentInfo]
  val studentqueryReader: Reads[JsObject] = implicitly[Reads[JsObject]]

  def db = ReactiveMongoPlugin.db
  def collection: JSONCollection = db[JSONCollection]("student")

  def insertStudent(student: StudentInfo): Future[JsObject] = {
    val modelToJsObj = swriter.writes(student).as[JsObject]
    collection.insert(modelToJsObj) map (_ => modelToJsObj)
  }
}
This works fine. Now I need to get all the data I have inserted. How can I do that? I am not asking for code but for the idea.

First of all: it seems that you are using Play-ReactiveMongo (as far as I know, JSONCollection is not part of ReactiveMongo itself). If this is the case, then your code is unnecessarily complex. Instead of doing the JSON conversions manually, you can just pass your StudentInfo objects directly to insert. Minimal example:
val studentInfo: StudentInfo = ...
def collection: JSONCollection = db[JSONCollection]("student")
collection.insert(studentInfo)
That's the elegant part of the Play plugin. Yes, MongoDB persists data as JSON (or BSON, to be more precise), but you don't have to deal with it. Just make sure the implicit Writes (or Reads, in case of querying) is in scope, as well as other necessary imports (e.g. play.modules.reactivemongo.json._).
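For instance, a minimal sketch of what that setup could look like (the field names here are invented for illustration; Json.format derives both the Reads and Writes that insert and find need):
import play.api.libs.json.{Format, Json}
import play.modules.reactivemongo.json._

// Hypothetical fields, for illustration only
case class StudentInfo(firstName: String, lastName: String)

object StudentInfo {
  // One macro-generated Format provides both Reads and Writes
  implicit val studentInfoFormat: Format[StudentInfo] = Json.format[StudentInfo]
}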
Now I need to get all the data I have inserted. How can I do that? I am not asking for code but for the idea.
Well, you want to have a look at the documentation (scroll down for examples); it's quite simple and there is not much more to it. In your case, it could look like this:
// perform query via cursor
val cursor: Cursor[StudentInfo] =
  collection.find(Json.obj("lastName" -> "Regmi")).cursor[StudentInfo]

// gather results as list
val futureStudents: Future[List[StudentInfo]] = cursor.collect[List]()
In this case, you get all students with the last name Regmi. If you really want to retrieve all students, then you probably need to pass an empty JsObject as your query. Again, it's not necessary to deal with JSON conversions, as long as the implicit Reads is in scope.
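For the retrieve-everything case, a minimal sketch (same collection and implicit Reads as above) could be:
// an empty JsObject as the query selector matches every document
val allStudents: Future[List[StudentInfo]] =
  collection.find(Json.obj()).cursor[StudentInfo].collect[List]()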

Here is the complete answer to my own question:
package controllers

def findAll = Action.async {
  val query = Json.obj() // empty query: match all documents
  StudentDaoAndEntity.findAllStudent(query) map {
    case Nil => Ok("Student Not Found")
    case students => Ok(Json.toJson(students))
  }
}
def findAllStudent(query: JsObject): Future[Seq[JsObject]] = {
  // gather all the JsObjects in a list
  collection.find(query).cursor[JsObject].collect[List]()
}

Related

Scala Slick: how to serialize the result of a query to a variable table

I would like to provide a select method to a controller that takes a table name as a parameter and returns the result of the SELECT query as JSON. So far I have done this:
def specialSelect(tableName: String) = Action.async {
  val tq: TableQuery[_] = tableObjectFactory(tableName)
  val res: Future[Seq[_]] = db.run(tq.result)
  res.map { p: Seq[_] => Ok(p.toJson) }
}
where tableObjectFactory takes my table name and returns a TableQuery of the specific type:
def tableObjectFactory(tableName: String): TableQuery[_] = {
  tableName match {
    case "users" => TableQuery[Users]
    case "projects" => TableQuery[Projects]
  }
}
It fails because there is no JSON serializer defined for a generic Seq[_] (actually _ should be a Product with Serializable; not sure if that helps).
The table name is not known in advance (found in the URL, such as in "/special_select/<tableName>"), and I have 120 such tables so I can't just implement it for each table.
Is there a way to serialize any Seq[_], knowing that the _ is always a Slick row result (e.g. a case class UsersRow), whatever the table is?
I have read about generic DAOs and ActiveSlick, but I am not sure if I should go that far.
Gson worked, thanks #pamu. Add
libraryDependencies += "com.google.code.gson" % "gson" % "2.8.0"
to build.sbt, and then this will convert any Seq[_] of rows to a JsArray that can be returned as the HTTP response:
import com.google.gson.Gson
import play.api.libs.json._

val gson: Gson = new Gson()

def slickToJson(res: Seq[_]): JsArray = {
  // serialize each row to a JSON string with Gson, then re-parse it with Play
  JsArray(res.map(r => Json.parse(gson.toJson(r))))
}

// in a controller Action.async:
db.run(...).map(p => Ok(slickToJson(p)))
Rows are serialized to Strings by Gson, then parsed back to JsObjects by the Play API, then put in a JsArray. I couldn't find anything better.

type mismatch; found : scala.collection.immutable.Stream[String] required: String in Play Scala?

I am trying to display JSON data from a database using Scala/Anorm in Play (2.2.x). With the code below I get the error: type mismatch; found: scala.collection.immutable.Stream[String], required: String. If I instead write sql().map(row => row[String]("name")).toString, I get the JSON data wrapped in a Stream's string representation, which I don't want. How can I get my plain JSON data out? Please help; thanks in advance.
Controller:
class Test extends Controller {
  def getTest = Action {
    val sql: SqlQuery = SQL("select name::TEXT from test")
    def values: String = DB.withConnection { implicit connection =>
      // error: type mismatch; found: scala.collection.immutable.Stream[String], required: String
      sql().map(row => row[String]("name"))
    }
    Ok(values)
  }
}
It looks like you are using Anorm for database access. As noted in the comments above, executing the Anorm query returns a Stream rather than a single value. When you map over it, the result is again a Stream of Strings, so that the results can be processed as a stream. This is very useful for large result sets, where it makes more sense to process the stream incrementally and stream the result to the client. It looks like you are just getting started, so you probably don't want to worry about that right now.
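To make the error concrete, here is a small illustration of the types involved (a sketch, assuming the Anorm version bundled with Play 2.2):
// sql() executes the query and yields a lazy Stream of rows
val rows: Stream[Row] = sql()
// mapping keeps it a Stream, hence "found: Stream[String], required: String"
val names: Stream[String] = rows.map(row => row[String]("name"))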
You can convert the result from a stream into a conventional list by simply calling toList:
val resList = sql().map(row => row[String]("name")).toList
Now, you also need to unescape the string and return it as a JSON result. That can be done with string interpolation: surround the String in a StringContext and call the s method with no arguments.
val newResList = resList.map(s => StringContext(s).s())
Finally, you should convert the String into a Play JSON JsValue so that your controller is actually returning the right type.
Json.parse(newResList.mkString("[", ",", "]"))
The mkString method is used to convert the list into a valid JSON string. You will need to import play.api.libs.json.Json for this to work. Passing a JsValue to Ok ensures that the mime type of the response is set to "application/json". You can learn more about using JSON with Play here. This will be important if you intend to build JSON services.
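As a quick worked example of that mkString call (the element strings here are made up):
List("""{"name":"a"}""", """{"name":"b"}""").mkString("[", ",", "]")
// returns the string: [{"name":"a"},{"name":"b"}]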
Putting it together, you get the following:
class Test extends Controller {
  def getTest = Action {
    val sql: SqlQuery = SQL("select name::TEXT from test")
    def values: JsValue = DB.withConnection { implicit connection =>
      val resList = sql().map(row => row[String]("name")).toList
      val newResList = resList.map(s => StringContext(s).s())
      Json.parse(newResList.mkString("[", ",", "]"))
    }
    Ok(values)
  }
}
You could also use the following alternate structure to simplify and get rid of the rather sloppy mkString call.
class Test extends Controller {
  def getTest = Action {
    val sql: SqlQuery = SQL("select name::TEXT from test")
    def values: JsValue = DB.withConnection { implicit connection =>
      val resList = sql().map(row => row[String]("name")).toList
      Json.toJson(resList.map(s => Json.parse(StringContext(s).s())))
    }
    Ok(values)
  }
}
This parses each of the JSON strings in the list and then converts the list into a JsArray, again using the capabilities of Play's built-in JSON library.

Suggestions for Writing Map as JSON file in Scala

I have a simple single key-valued Map[K, V] called myDictionary that is populated by my program, and at the end I want to write it out as a JSON-formatted string to a text file, as I will need to parse it later.
I was using this code earlier:
Some(new PrintWriter(outputDir + "/myDictionary.json")).foreach { p =>
  p.write(compact(render(decompose(myDictionary))))
  p.close()
}
I found it to be slower as the input size increased. Later, I used this instead:
var out = new PrintWriter(outputDir + "/myDictionary.json")
out.println(scala.util.parsing.json.JSONObject(myDictionary.toMap).toString())
This is proving to be a bit faster.
I have run this on sample input and found it to be faster than my earlier approach. Since I expect my input map to reach at least a million (K, V) entries (a >1 GB text file), I want to make sure I follow the fastest and most memory-efficient approach for serializing the map. What other approaches would you recommend that I look into to optimize this?
The JSON support in the standard Scala library is probably not the best choice. Unfortunately the situation with JSON libraries for Scala is a bit confusing; there are many alternatives (Lift JSON, Play JSON, Spray JSON, Twitter JSON, Argonaut, ...), basically one library for each day of the week... I suggest you have a look at these, at least to see if any of them is easier to use and more performant.
Here is an example using Play JSON, which I have chosen for a particular reason (being able to generate formats with macros):
object JsonTest extends App {
  import play.api.libs.json._

  type MyDict = Map[String, Int]

  implicit object MyDictFormat extends Format[MyDict] {
    def reads(json: JsValue): JsResult[MyDict] = json match {
      case JsObject(fields) =>
        val b = Map.newBuilder[String, Int]
        fields.foreach {
          case (k, JsNumber(v)) => b += k -> v.toInt
          case other => return JsError(s"Not a (string, number) pair: $other")
        }
        JsSuccess(b.result())
      case _ => JsError(s"Not an object: $json")
    }

    def writes(m: MyDict): JsValue = {
      val fields: Seq[(String, JsValue)] = m.map {
        case (k, v) => k -> JsNumber(v)
      } (collection.breakOut)
      JsObject(fields)
    }
  }

  val m = Map("hallo" -> 12, "gallo" -> 34)
  val serial = Json.toJson(m)
  val text = Json.stringify(serial)
  println(text)

  val back = Json.fromJson[MyDict](serial)
  assert(back == JsSuccess(m), s"Failed: $back")
}
While you can construct and deconstruct JsValues directly, the main idea is to use a Format[A] where A is the type of your data structure. This puts more emphasis on type safety than the standard Scala library's JSON support. It looks more verbose, but in the end I think it's the better approach.
There are utility methods Json.toJson and Json.fromJson which look for an implicit format of the type you want.
On the other hand, it does construct everything in-memory and it does duplicate your data structure (because for each entry in your map you will have another tuple (String, JsValue)), so this isn't necessarily the most memory-efficient solution, given that you are operating in the GB range...
Jerkson is a Scala wrapper for the Java JSON library Jackson. The latter can stream data. I found this project which says it adds streaming support. Play JSON in turn is based on Jerkson, so perhaps you can even figure out how to stream your object with that. See also this question.
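To give an idea of what streaming buys you, here is a hedged sketch using Jackson's streaming API directly (writeDictionary and the file handling are made up for illustration; the point is that fields are written one at a time, so the whole JSON tree is never built in memory):
import com.fasterxml.jackson.core.{JsonEncoding, JsonFactory}
import java.io.File

def writeDictionary(myDictionary: Map[String, Int], outFile: File): Unit = {
  val gen = new JsonFactory().createGenerator(outFile, JsonEncoding.UTF8)
  try {
    gen.writeStartObject()               // emits {
    myDictionary.foreach { case (k, v) =>
      gen.writeNumberField(k, v)         // emits "key": value, one entry at a time
    }
    gen.writeEndObject()                 // emits }
  } finally gen.close()
}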

Play 2.1 Reading JSON Objects in order

JSON to Parse: http://www.dota2.com/jsfeed/heropickerdata?v=18874723138974056&l=english
Hero Class and JSON Serialization
case class Hero(
  var id: Option[Int],
  name: String,
  bio: String,
  var trueName: Option[String]
)

implicit val modelReader: Reads[Hero] = Json.reads[Hero]
Reading Data
val future: Future[play.api.libs.ws.Response] =
  WS.url("http://www.dota2.com/jsfeed/heropickerdata?v=18874723138974056&l=english").get()
val json = Json.parse(Await.result(future, 5.seconds).body).as[Map[String, Hero]]

var i = 1
json.foreach { p =>
  p._2.trueName = Some(p._1)
  p._2.id = Some(i)
  p._2.commitToDatabase
  i += 1
}
I need to get the id of each hero. The order of heroes in the JSON matches their id. Obviously a Map is unordered and won't work. Does anyone have any other ideas?
I have tried to use a LinkedHashMap. I even tried to write an implicit Reads for LinkedHashMap, but I failed. If anyone thinks that this is the answer, would you please give me some guidance?
It just keeps saying "No Json deserializer found for type scala.collection.mutable.LinkedHashMap[String,models.Hero]. Try to implement an implicit Reads or Format for this type.". I have the trait imported into the file I'm trying to read from. I have a funny feeling that the last line in my Reads is the problem; I think I can't just do the asInstanceOf, but I have no other ideas of how to write this Reads.
LinkedHashMap Implicit Reads Code: http://pastebin.com/cf5NpSCX
You can try extracting data in order from the JsObject returned by Json.parse directly, possibly like this:
val json = Json.parse(Await.result(future, 5.seconds).body)

val heroes: Map[String, Hero] = json match {
  case obj: JsObject =>
    // obj.fields preserves the order of the parsed JSON object; ids here are zero-based
    obj.fields.zipWithIndex.flatMap { case ((name, heroJson), id) =>
      heroJson.asOpt[Hero].map(hero => name -> hero.copy(id = Some(id)))
    }.toMap
  case _ => Map.empty
}
I don't believe you'll need an order-preserving map anymore since the ids are generated and fixed.
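For completeness, if you did still want the order-preserving Reads the question attempted, a minimal sketch could look like this (assuming JsObject.fields preserves parse order; the error handling simply fails fast on the first bad field):
import play.api.libs.json._
import scala.collection.mutable.LinkedHashMap

implicit def linkedHashMapReads[V](implicit vr: Reads[V]): Reads[LinkedHashMap[String, V]] =
  new Reads[LinkedHashMap[String, V]] {
    def reads(json: JsValue): JsResult[LinkedHashMap[String, V]] = json match {
      case JsObject(fields) =>
        val out = LinkedHashMap.empty[String, V]
        fields.foreach { case (key, value) =>
          vr.reads(value) match {
            case JsSuccess(v, _) => out += key -> v
            case err: JsError    => return err // fail fast on the first bad field
          }
        }
        JsSuccess(out)
      case _ => JsError("Expected a JSON object")
    }
  }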

Optimal way to read out JSON from MongoDB into a Scalatra API

I have a pre-formatted JSON blob stored as a string in MongoDB as a field in one of my collections. Currently, in my Scalatra-based API, I have a before filter that renders all of my responses with a JSON content type. An example of how I return the content looks like the following:
get ("/boxscore", operation(getBoxscore)) {
val game_id:Int = params.getOrElse("game_id", "3145").toInt
val mongoColl = mongoDb.apply("boxscores")
val q: DBObject = MongoDBObject("game_id" -> game_id)
val res = mongoColl.findOne(q)
res match {
case Some(j) => JSON.parseFull(j("json_body").toString)
case None => NotFound("Requested document could not be found.")
}
}
Now this certainly does work, but it doesn't seem like the "Scala" way of doing things, and I feel like this can be optimized. The worrisome part to me is that when I add a caching layer and the cache does not hit, I am spending additional CPU time on re-parsing a String I already formatted as JSON in MongoDB:
JSON.parseFull(j("json_body").toString)
I have to take the result from findOne(), run .toString on it, and then re-parse it into JSON. Is there a more optimal route? Since the JSON is already stored as a String in MongoDB, I'm guessing a serializer / case class isn't the right solution here. Of course I can just leave what's here, but I'd like to learn if there's a way that would be more Scala-like and CPU-friendly going forward.
There is the option to extend Scalatra's render pipeline with handling for MongoDB classes. The following two routes act as an example. They return a MongoCursor and a DBObject as results. We are going to convert those to a string.
get("/") {
mongoColl.find
}
get("/:key/:value") {
val q = MongoDBObject(params("key") -> params("value"))
mongoColl.findOne(q) match {
case Some(x) => x
case None => halt(404)
}
}
In order to handle the types we need to define a partial function which takes care of the conversion and sets the appropriate content type.
There are two cases, the first one handles a DBObject. The content type is set to "application/json" and the object is converted to a string by calling the toString method. The second case handles a MongoCursor. Since it implements TraversableOnce the map function can be used.
def renderMongo = {
  case dbo: DBObject =>
    contentType = "application/json"
    dbo.toString

  case xs: TraversableOnce[_] => // handles a MongoCursor; be aware of type erasure here
    contentType = "application/json"
    val ls = xs map (x => x.toString) mkString ","
    "[" + ls + "]"
}: RenderPipeline
(Note the following type definition: type RenderPipeline = PartialFunction[Any, Any])
Now the method needs to get hooked in. After an HTTP call has been handled, the result is forwarded to the render pipeline for further conversion. Custom handling can be added by overriding the renderPipeline method from ScalatraBase. With the following definition, the renderMongo function is tried first:
override protected def renderPipeline = renderMongo orElse super.renderPipeline
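Put together, a minimal sketch of where this could live (the servlet name, route, and mongoColl parameter are made up for illustration):
import org.scalatra._
import com.mongodb.casbah.Imports._

// Hypothetical servlet wiring the custom render pipeline in
class BoxscoreServlet(mongoColl: MongoCollection) extends ScalatraServlet {

  def renderMongo: RenderPipeline = {
    case dbo: DBObject =>
      contentType = "application/json"
      dbo.toString
  }

  // renderMongo is consulted before Scalatra's default render pipeline
  override protected def renderPipeline = renderMongo orElse super.renderPipeline

  get("/:key/:value") {
    val q = MongoDBObject(params("key") -> params("value"))
    mongoColl.findOne(q) getOrElse halt(404) // the DBObject is rendered by renderMongo
  }
}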
This is a basic approach to handle MongoDB types. There are other options as well, for example by making use of json4s-mongo.
Here is the previous code in a working sample project.