Mapping a sequence of results from Slick monadic join to Json - json

I'm using Play 2.4 with Slick 3.1.x, specifically the Slick-Play plugin v1.1.1. Firstly, some context... I have the following search/filter method in a DAO, which joins together 4 models:
def search(
  departureCity: Option[String],
  arrivalCity: Option[String],
  departureDate: Option[Date]
) = {
  val monadicJoin = for {
    sf <- slickScheduledFlights.filter(a =>
      departureDate.map(d => a.date === d).getOrElse(slick.lifted.LiteralColumn(true))
    )
    fl <- slickFlights if sf.flightId === fl.id
    al <- slickAirlines if fl.airlineId === al.id
    da <- slickAirports.filter(a =>
      fl.departureAirportId === a.id &&
      departureCity.map(c => a.cityCode === c).getOrElse(slick.lifted.LiteralColumn(true))
    )
    aa <- slickAirports.filter(a =>
      fl.arrivalAirportId === a.id &&
      arrivalCity.map(c => a.cityCode === c).getOrElse(slick.lifted.LiteralColumn(true))
    )
  } yield (fl, sf, al, da, aa)

  db.run(monadicJoin.result)
}
The output from this is a Vector of tuples, e.g.:
Vector(
  (
    Flight(Some(1),123,216,2013,3,1455,2540,3,905,500,1150),
    ScheduledFlight(Some(1),1,2016-04-13,90,10),
    Airline(Some(216),BA,BAW,British Airways,United Kingdom),
    Airport(Some(2013),LHR,Heathrow,LON,...),
    Airport(Some(2540),JFK,John F Kennedy Intl,NYC...)
  ),
  (
    etc ...
  )
)
I'm currently rendering the JSON in the controller by calling .toJson on a Map and inserting this Vector (the results param below), like so:
flightService.search(departureCity, arrivalCity, departureDate).map(results => {
  Ok(
    Map[String, Any](
      "status" -> "OK",
      "data" -> results
    ).toJson
  ).as("application/json")
})
While this sort of works, it produces output in an unusual format: an array of result objects (the rows), and within each result object the joined models are nested under the keys "_1", "_2" and so on.
So the question is: How should I go about restructuring this?
There doesn't appear to be anything which specifically covers this sort of scenario in the Slick docs, so I would be grateful for some input on the best way to refactor this Vector of tuples, with a view to renaming each of the joined models, or even flattening the structure and keeping only certain fields.
Is this best done in the DAO search method before it's returned (by mapping it somehow?) or in the controller after I get back the Future results Vector from the search method?
Or would it be preferable to abstract this sort of transformation out somewhere else entirely, using a transformer perhaps?

You need JSON Reads/Writes/Format Combinators
First of all, you need a Writes[T] for each of your classes (Flight, ScheduledFlight, Airline, Airport).
The simplest way is to use the Json macros:
implicit val flightWrites: Writes[Flight] = Json.writes[Flight]
implicit val scheduledFlightWrites: Writes[ScheduledFlight] = Json.writes[ScheduledFlight]
implicit val airlineWrites: Writes[Airline] = Json.writes[Airline]
implicit val airportWrites: Writes[Airport] = Json.writes[Airport]
You also need an OWrites[(Flight, ScheduledFlight, Airline, Airport, Airport)] for the Vector items. For example:
val itemWrites: OWrites[(Flight, ScheduledFlight, Airline, Airport, Airport)] = (
  (__ \ "flight").write[Flight] and
  (__ \ "scheduledFlight").write[ScheduledFlight] and
  (__ \ "airline").write[Airline] and
  (__ \ "airport1").write[Airport] and
  (__ \ "airport2").write[Airport]
).tupled
To write the whole Vector as a JsArray, use Writes.seq[T]:
val resultWrites: Writes[Seq[(Flight, ScheduledFlight, Airline, Airport, Airport)]] = Writes.seq(itemWrites)
Now we have everything we need to build the response:
flightService.search(departureCity, arrivalCity, departureDate).map(results =>
  Ok(
    Json.obj(
      "status" -> "OK",
      "data" -> resultWrites.writes(results)
    )
  )
)
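If you prefer to avoid calling writes explicitly in the controller, a small variation (a sketch reusing the itemWrites defined above; the searchResultWrites name is just for illustration) is to expose the tuple writer as an implicit, so Play's built-in collection Writes picks it up and Json.toJson can serialise the results directly:

implicit val searchResultWrites: OWrites[(Flight, ScheduledFlight, Airline, Airport, Airport)] = itemWrites

flightService.search(departureCity, arrivalCity, departureDate).map(results =>
  Ok(Json.obj("status" -> "OK", "data" -> Json.toJson(results)))
)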

Related

Play framework 2.5 reads optional filter

I have the following reads function for parsing JSON files.
case class tables(
  col1: Option[List[another case class]],
  col2: Option[List[another case class]],
  col3: Option[List[another case class]],
  col4: Option[List[another case class]]
)

implicit val tablesRead: Reads[tables] = (
  (JsPath \ "col1").read(Reads.optionWithNull[List[data1]]).filterNot(_.get.isEmpty) and
  (JsPath \ "col2").read(Reads.optionWithNull[List[data2]]).filterNot(_.get.isEmpty) and
  (JsPath \ "col3").read(Reads.optionWithNull[List[data3]]).filterNot(_.get.isEmpty) and
  (JsPath \ "col4").read(Reads.optionWithNull[List[data4]]).filterNot(_.get.isEmpty)
)(tables.apply _)
I want to then insert the JSON into a database after having validated it. I have therefore declared the following function.
def createFromJson = Action.async(parse.json) { request =>
  request.body.validate[jsonWrapper] match {
    case JsSuccess(data, _) =>
      for {
        dbFuture <- dataFuture(data.userID)
        lastError <- dbFuture.insert(data.tables)
      } yield {
        Ok("Success\n")
      }
    case JsError(errors) => Future.successful(BadRequest("Failed :" + Error.show(errors)))
  }
}
This works and correctly rejects JSONs looking like this:
{"tables":{"col1":[],"col2":[],"col3":[],"col4":[]}, "userID":"irrelavent"}
and accepts JSONs with actual data in, like so:
{"tables":{"col1":[{data1}],"col2":[{data2}],"col3":[{data3}],"col4":[{data4}]}, "userID":"irrelavent"}
But what I need is something that does this but also accepts a JSON with missing fields:
{"tables":{"col1":[{data1}],"col2":[],"col3":[{data3}],"col4":[{data4}]}, "userID":"irrelavent"}
And preferably ignores them (i.e. returns something like:
{"tables":{"col1":[{data1}],"col3":[{data3}],"col4":[{data4}]}, "userID":"irrelavent"})
Is this possible to do?
Many thanks,
Peter M.
You can automatically generate a Reads[tables] using Json.reads macro with the behavior you want:
implicit val tablesRead: Reads[tables] = Json.reads[tables]
If a field is missing from the JSON, the corresponding column will be None.
On a minor note, the convention in Scala is to start class names with a capital letter, so you should rename tables to Tables.
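A minimal sketch of that behaviour, using a hypothetical Data1 payload class in place of the real column types:

import play.api.libs.json._

case class Data1(value: String)
case class Tables(col1: Option[List[Data1]], col2: Option[List[Data1]])

implicit val data1Reads: Reads[Data1] = Json.reads[Data1]
implicit val tablesReads: Reads[Tables] = Json.reads[Tables]

// "col2" is absent from the JSON, so it is read as None instead of failing validation.
val parsed = Json.parse("""{"col1": [{"value": "a"}]}""").validate[Tables]
// parsed == JsSuccess(Tables(Some(List(Data1("a"))), None))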

Play JSON: Reading and validating a JsObject with unknown keys

I'm reading a nested JSON document using several Reads[T] implementations, however, I'm stuck with the following sub-object:
{
  ...,
  "attributes": {
    "keyA": [1.68, 5.47, 3.57],
    "KeyB": [true],
    "keyC": ["Lorem", "Ipsum"]
  },
  ...
}
The keys ("keyA", "keyB"...) as well as the amount of keys are not known at compile time and can vary. The values of the keys are always JsArray instances, but of different size and type (however, all elements of a particular array must have the same JsValue type).
The Scala representation of one single attribute:
case class Attribute[A](name: String, values: Seq[A])
// 'A' can only be String, Boolean or Double
The goal is to create a Reads[Seq[Attribute]] that can be used for the "attributes"-field when transforming the whole document (remember, "attributes" is just a sub-document).
Then there is a simple map that contains allowed combinations of keys and array types that should be used to validate attributes. Edit: This map is specific for each request (or rather specific for every type of json document). But you can assume that it is always available in the scope.
val required = Map(
  "KeyA" -> "Double",
  "KeyB" -> "String",
  "KeyD" -> "String"
)
So in the case of the JSON shown above, the Reads should create two errors:
"keyB" does exist, but has the wrong type (expected String, was boolean).
"keyD" is missing (whereas keyC is not needed and can be ignored).
I'm having trouble creating the necessary Reads. The first thing I tried as a first step, from the perspective of the outer Reads:
...
(__ \ "attributes").read[Map[String, JsArray]]...
...
I thought this was a nice first step because if the JSON structure is not an object containing Strings and JsArrays as key-value pairs, the Reads fails with proper error messages. It works, but I don't know how to go on from there. Of course I could just create a method that transforms the Map into a Seq[Attribute], but this method should somehow return a JsResult, since there is further validation to do.
The second thing I tried:
val attributeSeqReads = new Reads[Seq[Attribute]] {
  def reads(json: JsValue) = json match {
    case JsObject(fields) => processAttributes(fields)
    case _ => JsError("attributes not an object")
  }

  def processAttributes(fields: Map[String, JsValue]): JsResult[Seq[Attribute]] = {
    // ...
  }
}
The idea was to validate each element of the map manually within processAttributes. But I think this is too complicated. Any help is appreciated.
edit for clarification:
At the beginning of the post I said that the keys (keyA, keyB...) are unknown at compile time. Later on I said that those keys are part of the map required which is used for validation. This sounds like a contradiction, but the thing is: required is specific for each document/request and is also not known at compile time. But you don't need to worry about that, just assume that for every request the correct required is already available in the scope.
You seem to be overcomplicating the task.
The keys ("keyA", "keyB"...) as well as the amount of keys are not known at compile time and can vary
So the number of keys and their types are known in advance and are final?
So in the case of the JSON shown above, the Reads should create two errors:
"keyB" does exist, but has the wrong type (expected String, was boolean).
"keyD" is missing (whereas keyC is not needed and can be ignored).
So your main task is just to check availability and type compliance?
You may implement a Reads[Attribute] for each of your keys with Reads.list(Reads.of[A]) (this Reads checks both the type and that the key is present) and handle omitted, non-required keys with Reads.pure(Attribute[A]). Then convert the tuple to a list (_.productIterator.toList) and you get a Seq[Attribute]:
val r = (
  (__ \ "attributes" \ "keyA").read[Attribute[Double]](list(of[Double]).map(Attribute("keyA", _))) and
  (__ \ "attributes" \ "keyB").read[Attribute[Boolean]](list(of[Boolean]).map(Attribute("keyB", _))) and
  ((__ \ "attributes" \ "keyC").read[Attribute[String]](list(of[String]).map(Attribute("keyC", _))) or Reads.pure(Attribute[String]("keyC", List()))) and
  (__ \ "attributes" \ "keyD").read[Attribute[String]](list(of[String]).map(Attribute("keyD", _)))
).tupled.map(_.productIterator.toList)
scala>json1: play.api.libs.json.JsValue = {"attributes":{"keyA":[1.68,5.47,3.57],"keyB":[true],"keyD":["Lorem","Ipsum"]}}
scala>res37: play.api.libs.json.JsResult[List[Any]] = JsSuccess(List(Attribute(keyA,List(1.68, 5.47, 3.57)), Attribute(KeyB,List(true)), Attribute(keyC,List()), Attribute(KeyD,List(Lorem, Ipsum))),)
scala>json2: play.api.libs.json.JsValue = {"attributes":{"keyA":[1.68,5.47,3.57],"keyB":[true],"keyC":["Lorem","Ipsum"]}}
scala>res38: play.api.libs.json.JsResult[List[Any]] = JsError(List((/attributes/keyD,List(ValidationError(List(error.path.missing),WrappedArray())))))
scala>json3: play.api.libs.json.JsValue = {"attributes":{"keyA":[1.68,5.47,3.57],"keyB":["Lorem"],"keyC":["Lorem","Ipsum"]}}
scala>res42: play.api.libs.json.JsResult[List[Any]] = JsError(List((/attributes/keyD,List(ValidationError(List(error.path.missing),WrappedArray()))), (/attributes/keyB(0),List(ValidationError(List(error.expected.jsboolean),WrappedArray())))))
If you have more than 22 attributes, you will run into another problem: tuples cannot have more than 22 elements.
For dynamic properties at runtime (inspired by Reads.traversableReads[F[_], A]):
def attributesReads(required: Map[String, String]) = Reads { json =>
  type Errors = Seq[(JsPath, Seq[ValidationError])]

  def locate(e: Errors, idx: Int) = e.map { case (p, valerr) => (JsPath(idx)) ++ p -> valerr }

  required.map {
    case (key, "Double") => (__ \ key).read[Attribute[Double]](list(of[Double]).map(Attribute(key, _))).reads(json)
    case (key, "String") => (__ \ key).read[Attribute[String]](list(of[String]).map(Attribute(key, _))).reads(json)
    case (key, "Boolean") => (__ \ key).read[Attribute[Boolean]](list(of[Boolean]).map(Attribute(key, _))).reads(json)
    case _ => JsError("")
  }.iterator.zipWithIndex.foldLeft(Right(Vector.empty): Either[Errors, Vector[Attribute[_ >: Double with String with Boolean]]]) {
    case (Right(vs), (JsSuccess(v, _), _)) => Right(vs :+ v)
    case (Right(_), (JsError(e), idx)) => Left(locate(e, idx))
    case (Left(e), (_: JsSuccess[_], _)) => Left(e)
    case (Left(e1), (JsError(e2), idx)) => Left(e1 ++ locate(e2, idx))
  }.fold(JsError.apply, { res =>
    JsSuccess(res.toList)
  })
}
(__ \ "attributes").read(attributesReads(Map("keyA" -> "Double"))).reads(json)
scala> json: play.api.libs.json.JsValue = {"attributes":{"keyA":[1.68,5.47,3.57],"keyB":[true],"keyD":["Lorem","Ipsum"]}}
scala> res0: play.api.libs.json.JsResult[List[Attribute[_ >: Double with String with Boolean]]] = JsSuccess(List(Attribute(keyA,List(1.68, 5.47, 3.57))),/attributes)
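As a usage sketch (same attributesReads as above, assuming the keys in required use the same casing as the JSON), the two errors described in the question accumulate in a single JsError:

val required = Map("keyA" -> "Double", "keyB" -> "String", "keyD" -> "String")

val json = Json.parse(
  """{"attributes": {"keyA": [1.68, 5.47, 3.57], "keyB": [true], "keyC": ["Lorem", "Ipsum"]}}""")

// keyB has the wrong element type and keyD is missing, so both errors are reported;
// keyC is not listed in `required` and is simply ignored.
val result = (__ \ "attributes").read(attributesReads(required)).reads(json)
// result is a JsError containing a type error for keyB and a missing-path error for keyD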

Simple CRUD with Scala and Playframework

Technology stack: Scala, PlayFramework 2.3.8, MongoDB ( Salat + casbah ).
I am developing some restful APIs, with simple CRUD.
The entity:
case class Company(
  _id: ObjectId = new ObjectId,
  user_id: Any = "",
  cinema_id: Int = 0,
  name: String = "",
  status: String = "",
  releaseId: Int,
  fileId: Int,
  date: CompanyDate,
  showTime: List[Int] = List())

case class CompanyDate(start: DateTime, end: DateTime)
I created a Reads for building the object from JSON:
def createFromJson(json: JsValue, user: User): Reads[Company] = {
  val user_id = (for {
      cinema_id <- (json \ "cinema_id").asOpt[Int]
      cinema <- user.cinema.find(_.id == cinema_id)
    } yield cinema.getUser(user._id.toString)
  ).getOrElse(0)

  (
    Reads.pure(new ObjectId) and
    Reads.pure[Any](user_id).filter(ValidationError("You have no access to this cinema"))(_ != 0) and
    (__ \ "cinema_id").read[Int] and
    (__ \ "name").read[String](minLength[String](1) keepAnd maxLength[String](100)) and
    Reads.pure[String]("new") and
    (__ \ "release_id").read[Int] and
    (__ \ "file_id").read[Int] and
    (__ \ "date").read[CompanyDate] and
    (__ \ "show_time").read[List[Int]](minLength[List[Int]](1))
  )(Company.apply _).flatMap(company => Reads { _ =>
    if (company.date.start isAfter company.date.end) {
      JsError(JsPath \ "date" \ "start" -> ValidationError("Start < end"))
    } else {
      JsSuccess(company)
    }
  })
}
I had to create a wrapper function to pass the user into the Reads, because some fields (user_id) are calculated from the current user; the client doesn't have this information.
It looks a little ugly, but works ok.
The controller for the create action is very simple: it takes the JSON, converts it to the class and saves it to the DB.
Next thing: update. In the update method I want to reuse some of the validation (e.g. checking that date.start < date.end), but I don't receive the full object. My client only sends me the changed fields.
Some fields (cinema_id and user_id) can only be set when creating the object; they can't be updated.
Example request (UPDATED):
PUT /api/company/1
{
  "name": "Name",
  "status": "delete"
}
How can I validate this JSON and update the existing object?
What is the best approach to solve this task?
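One possible approach (a sketch only; the CompanyUpdate, companyUpdateReads and applyUpdate names are introduced here for illustration, and a Reads[CompanyDate] is assumed to be in scope) is a dedicated Reads that accepts only the updatable fields, re-applies the per-field validation, and checks the date range only when a new date is supplied:

// Hypothetical partial-update model: every updatable field is optional.
case class CompanyUpdate(
  name: Option[String],
  status: Option[String],
  date: Option[CompanyDate],
  showTime: Option[List[Int]])

val companyUpdateReads: Reads[CompanyUpdate] = (
  (__ \ "name").readNullable[String](minLength[String](1) keepAnd maxLength[String](100)) and
  (__ \ "status").readNullable[String] and
  (__ \ "date").readNullable[CompanyDate] and
  (__ \ "show_time").readNullable[List[Int]](minLength[List[Int]](1))
)(CompanyUpdate.apply _).filter(ValidationError("Start < end")) { u =>
  u.date.forall(d => !(d.start isAfter d.end))
}

// Apply the patch to the stored entity; cinema_id and user_id are never touched.
def applyUpdate(existing: Company, u: CompanyUpdate): Company = existing.copy(
  name     = u.name.getOrElse(existing.name),
  status   = u.status.getOrElse(existing.status),
  date     = u.date.getOrElse(existing.date),
  showTime = u.showTime.getOrElse(existing.showTime))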

How to send Json from client with missing fields for its corresponding Case Class after using Json.format function

I have a case class and its companion object like below. Now, when I send JSON without the id, createdAt and deletedAt fields (because I set them elsewhere), I get a [NoSuchElementException: JsError.get] error. It's because I do not set the above properties.
How could I achieve this and avoid getting the error?
case class Plan(
  id: String,
  companyId: String,
  name: String,
  status: Boolean = true,
  @EnumAs planType: PlanType.Value,
  brochureId: Option[UUID],
  lifePolicy: Seq[LifePolicy] = Nil,
  createdAt: DateTime,
  updatedAt: DateTime,
  deletedAt: Option[DateTime]
)

object Plan {
  implicit val planFormat = Json.format[Plan]

  def fromJson(str: JsValue): Plan = Json.fromJson[Plan](str).get
  def toJson(plan: Plan): JsValue = Json.toJson(plan)
  def toJsonSeq(plan: Seq[Plan]): JsValue = Json.toJson(plan)
}
The JSON I send from the client:
{
  "companyId": "e8c67345-7f59-466d-a958-7c722ad0dcb7",
  "name": "Creating First Plan with enum Content",
  "status": true,
  "planType": "Health",
  "lifePolicy": []
}
You can introduce another case class just to handle deserialization of the request, like this:
case class NewPlan(
  name: String,
  status: Boolean = true,
  @EnumAs planType: PlanType.Value,
  brochureId: Option[UUID],
  lifePolicy: Seq[LifePolicy] = Nil
)
and then use this class to populate your Plan class.
The fundamental issue is that by the time a case class is instantiated to represent your data, it must be well-typed. Trying to shoehorn your example data into your example class fails because some fields are missing: it's literally trying to call the constructor without enough arguments.
You've got a couple options:
You can make a model that represents the incomplete data (as grotrianster suggested).
You can make the possible missing fields Option types.
You can custom-write the Reads part of your Format to introduce intelligent values or dummy values for the missing ones.
Option 3 might look something like:
// Untested for compilation, might need some corrections
val now: DateTime = ...

val autoId = Reads[JsObject] {
  case obj: JsObject =>
    (obj \ "id").asOpt[String] match {
      // Leave the object alone if it already has an id
      case Some(_) => JsSuccess(obj)
      // Otherwise insert a dummy id and the timestamps
      case None => obj.transform(
        __.json.update((__ \ "id").json.put(JsString(""))) andThen
        __.json.update((__ \ "createdAt").json.put(Json.toJson(now))) andThen
        __.json.update((__ \ "updatedAt").json.put(Json.toJson(now)))
      )
    }
  case _ => JsError("JsObject expected")
}

implicit val planFormat = Format[Plan](
  autoId andThen Json.reads[Plan],
  Json.writes[Plan])
Once you do this once, if the issue is the same for all your other models, you can probably abstract it into some Format factory utility function.
This may be slightly cleaner for autoId:
val autoId = Reads[JsObject] {
  // Leave it alone if we have an ID already
  case obj: JsObject if (obj \ "id").asOpt[String].isDefined => JsSuccess(obj)
  // Insert dummy values if we don't have an `id`
  case obj: JsObject => obj.transform(
    __.json.update((__ \ "id").json.put(JsString(""))) andThen
    __.json.update((__ \ "createdAt").json.put(Json.toJson(now))) andThen
    __.json.update((__ \ "updatedAt").json.put(Json.toJson(now)))
  )
  case _ => JsError("JsObject expected")
}
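A usage sketch (assuming the Plan model from the question, the planFormat above, and that Reads for PlanType.Value and LifePolicy are in scope):

val incoming = Json.parse(
  """{
    "companyId": "e8c67345-7f59-466d-a958-7c722ad0dcb7",
    "name": "Creating First Plan with enum Content",
    "status": true,
    "planType": "Health",
    "lifePolicy": []
  }""")

// autoId runs first and injects "id", "createdAt" and "updatedAt",
// so Json.reads[Plan] no longer fails with missing-path errors.
incoming.validate[Plan] match {
  case JsSuccess(plan, _) => println(plan)
  case JsError(errors)    => println(errors)
}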

Why JSArray parsing behaves differently, depending on the code structure, while logic remained?

I'm doing a small refactoring, trying to keep the logical outcome intact.
From (before the refactoring):
val mapped: Seq[Option[String]] =
  (mr.getNormalizedValue(1) \ "getapworkflowinfo1").as[JsArray].value.map(v =>
    (v \ "Description").as[String] match {
      case value if List("referral to electrophysiology").exists(value.toLowerCase.equals) =>
        Some("true")
      case _ =>
        None
    }
  )

mapped.flatten.lastOption
To:
val referralIndicators: Seq[Boolean] =
  (mr.getNormalizedValue(1) \ "getapworkflowinfo1").as[JsArray].value
    // Step 1.1 Extracting and checking description
    .map(d => (d \ "Description").as[String].toLowerCase().equals("referral to electrophysiology"))

// Step 2. Returning if at least once there was referral to electrophysiology
Some(referralIndicators.exists(v => v)).map(v => v.toString)
These should be logically equivalent (and therefore should generate the same outputs for the same inputs).
In practice, the refactored version effectively improves the parsing, and the results it returns are better than before.
Can someone explain what the difference is between these two?
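A minimal comparison sketch (with hypothetical description strings standing in for the parsed JSON) shows where the two versions diverge: the original yields None when nothing matches, whereas the refactored version always yields Some("true") or Some("false"):

// Hypothetical stand-in for the extracted descriptions, with no matching entry.
val descriptions = Seq("initial consult", "follow up")

// Original logic: Some("true") only if a match exists, otherwise None.
val original: Option[String] =
  descriptions.map {
    case d if d.toLowerCase == "referral to electrophysiology" => Some("true")
    case _ => None
  }.flatten.lastOption
// original == None

// Refactored logic: always defined, either "true" or "false".
val refactored: Option[String] =
  Some(descriptions.exists(_.toLowerCase == "referral to electrophysiology")).map(_.toString)
// refactored == Some("false")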