I defined a user which has optional "video" information:
case class User(name: String, video: Option[Video])
case class Video(title: String, url: String)
And we have JSON like this:
{
  "name": "somename",
  "video": {
    "title": "my-video",
    "url": "https://youtube.com/watch?v=123123"
  }
}
I can use code like this to parse it:
implicit def DecodeUser: DecodeJson[User] = for {
  name <- as[String]("name")
  video <- as[Option[Video]]("video")
} yield User(name, video)

implicit def DecodeVideo: DecodeJson[Option[Video]] = for {
  titleOpt <- as[Option[String]]("title")
  urlOpt <- as[Option[String]]("url")
} yield (titleOpt, urlOpt) match {
  case (Some(title), Some(url)) => Video(title, url)
  case _ => None
}
From DecodeVideo, you can see I only want to provide the video if both "title" and "url" are provided.
It works fine if the JSON contains a "video" section. But if it doesn't, Argonaut reports that the "video" section is missing.
How to make "video" optional?
I can't seem to figure out how your code integrates with Argonaut. None of the variants of as[T] seem to match the signature you're using. Anyway, here's a similar problem and its solution:
object Test {
  case class Video(title: String, url: String)

  def test(titleOptIn: List[Option[String]], urlOptIn: List[Option[String]]): List[Option[Video]] = {
    for {
      titleOpt <- titleOptIn
      urlOpt <- urlOptIn
    } yield (titleOpt, urlOpt) match {
      case (Some(title), Some(url)) => Some(Video(title, url))
      case _ => None.asInstanceOf[Option[Video]]
    }
  }

  def main(args: Array[String]): Unit = {
    test(List(Some("t")), List(Some("u"), None)).foreach(println)
  }
}
// Has Output:
// Some(Video(t,u))
// None
Specifically, note that the yield should produce an Option[Video], since your yield is likely being wrapped in a DecodeJson just like my example's is wrapped in a List. Note that the asInstanceOf on None is optional; IntelliJ complains if it's not there, but it actually compiles just fine.
The specific thing that I believe you're missing is wrapping Video in Some.
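For completeness, here's roughly what the corrected decoder could look like written directly against Argonaut's cursor API (a sketch on my part, assuming Argonaut 6.x's --\ / as cursor operations rather than your as[T](field) helper; adapt as needed):

import argonaut._, Argonaut._

// Sketch only: yields Some(Video(...)) when both fields are present, None otherwise.
implicit def DecodeVideo: DecodeJson[Option[Video]] = DecodeJson { c =>
  for {
    titleOpt <- (c --\ "title").as[Option[String]]
    urlOpt   <- (c --\ "url").as[Option[String]]
  } yield (titleOpt, urlOpt) match {
    case (Some(title), Some(url)) => Some(Video(title, url)) // the missing Some
    case _                        => None
  }
}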
I am currently building a very simple JSON parser in Scala that has to deal with two (slightly) different schemas. My objective is to parse one value in the json, and based on that value, I would like to dispatch it to the relevant decoder. I have used circe for my implementation, but other implementations and/or suggestions are also welcome.
I have formulated a simplified version for my example to help clarifying the question.
There are two types of JSONs that I can receive, either a stock:
"data": {
"name": "XYZ"
},
"type": "STOCK"
}
Or a quote (which is similar to the stock but includes a price).
"data": {
"name": "ABC",
"price": 1151.6214,
},
"type": "QUOTE"
}
On my side, I have developed a simple decoder that looks like this (for the stock):
implicit private val dataDecoder: Decoder[Instrument] = (hCursor: HCursor) => {
  for {
    name <- hCursor.downField("data").downField("name").as[String]
    typ  <- hCursor.downField("type").as[StockType]
  } yield Instrument(name, typ, LocalDateTime.now())
}
I could also develop a parser that only parses the "type" part of the JSON, and then sends the data to be handled by the relevant parser (quote or stock). However, I am wondering what is the:
- efficient way to do this
- idiomatic/clean way to do this
To rephrase my question if needed: what is the right and efficient way to handle slightly different JSON schemas and forward them to the right parser?
I usually encounter this situation when I need to serialize ADTs. As somebody mentioned in the comments to your question, circe has support for auto-generating ADT codecs; however, I usually prefer to write the codecs manually.
In any case, in a situation like yours I would do something along these lines:
sealed trait Data
case class StockData(name: String) extends Data
case class QuoteData(name: String, quote: Double) extends Data

implicit val stockDataEncoder: Encoder[StockData] = ???
implicit val stockDataDecoder: Decoder[StockData] = ???
implicit val quoteDataEncoder: Encoder[QuoteData] = ???
implicit val quoteDataDecoder: Decoder[QuoteData] = ???

implicit val dataEncoder: Encoder[Data] = Encoder.instance {
  case s: StockData => stockDataEncoder(s).mapObject(_.add("type", Json.fromString("stock")))
  case q: QuoteData => quoteDataEncoder(q).mapObject(_.add("type", Json.fromString("quote")))
}

implicit val dataDecoder: Decoder[Data] = Decoder.instance { c =>
  for {
    stype <- c.get[String]("type")
    res <- stype match {
      case "stock" => stockDataDecoder(c)
      case "quote" => quoteDataDecoder(c)
      case unk     => Left(DecodingFailure(s"Unsupported data type: ${unk}", c.history))
    }
  } yield res
}
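To make that concrete, the ??? placeholder decoders could be filled in with circe's forProductN helpers. This is my own sketch, not part of the original answer, and it assumes the flat shape {"type": "stock", "name": "XYZ"} that the Data decoder above expects, since it hands the same cursor to the per-type decoders:

// Hypothetical implementations of the placeholder decoders above.
implicit val stockDataDecoder: Decoder[StockData] =
  Decoder.forProduct1("name")(StockData.apply)

implicit val quoteDataDecoder: Decoder[QuoteData] =
  Decoder.forProduct2("name", "quote")(QuoteData.apply)

If the data fields really live under a nested "data" object as in the question's JSON, the per-type decoders (or the dispatching one) would need to navigate into that field first, e.g. via downField("data").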
I've seen many examples covering pretty much everything one needs to understand how to validate the JSON representation of a given simple model.
The official Play documentation is very specific on the subject, covering everything from simple validations to transformers and combinators.
I'm just wondering how I can check JSON against a given data type structure, like:
{
  "title": "title1",
  "tags": ["funny", "others"],
  "props": [
    { "prop1": "val1" },
    { "prop2": "val2" }
  ]
}
I'd like to check the previous json example, to validate if it has this structure:
title: String
tags: Array[String]
props: Array[(String->String] // ???
This is of course a simplification, since a case class would be as simple as:
case class Example1(title: String, tags: Array[String], props: Array[???])
As you can see, my question has two parts:
- first, I want to properly use validation/transformation/Reads or whatever API Play 2.3.x is meant to provide for that kind of JSON validation without a model
- second, how to specify an array of simple key/value objects
Validation with Reads does not require a model class.
val json = { ... }

val titleMustBeString: Reads[String] = (JsPath \ "title").read[String]

json.validate[String](titleMustBeString) match {
  case JsSuccess(title, _) => println(s"Title: $title")
  case err: JsError        => println("Invalid json")
}
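The same model-free approach covers the second part of the question. Here's a rough sketch (my own addition) that validates the tags array and reads each props entry as a Map[String, String], using Play's built-in Reads for maps and sequences:

// Sketch only: "tags" must be an array of strings, and "props" an array of
// objects whose values are all strings (each object is read as a Map[String, String]).
val tagsMustBeStrings: Reads[Seq[String]] =
  (JsPath \ "tags").read[Seq[String]]

val propsMustBeKeyValues: Reads[Seq[Map[String, String]]] =
  (JsPath \ "props").read[Seq[Map[String, String]]]

json.validate(propsMustBeKeyValues) match {
  case JsSuccess(props, _) => println(s"Props: $props")
  case err: JsError        => println("Invalid props")
}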
I'm using Playframework 2.3.X. I'm trying to construct an action function that validates fields in a JSON request.body. The use case is to build "blocks" of validation that I can then chain together.
However, I can't seem to access the request.body as a JSON inside the action builder. The following doesn't compile. The compiler can't resolve "asJson":
def ValidateJsonBodyAction = new ActionBuilder[Request] {
  def invokeBlock[A](request: Request[A], block: (Request[A]) => Future[Result]): Future[Result] = {
    val body = request.body.asJson
  }
}
UPDATE: It could also be that I'm approaching this the wrong way. I'm new to play, so alternative approaches are also welcome.
UPDATE: Yes, it appears that I may be doing things the wrong way. I'm not the only one with this issue. See https://github.com/playframework/playframework/issues/3387
I think you might have to pattern match on the body type (which I admit is not a beautiful solution):
import scala.concurrent.Future.{successful => immediate}
def JsonValidator(validator: Reads[JsValue]) = new ActionBuilder[Request] {

  def jsonBody[A](body: A): Option[JsValue] = body match {
    case js: JsValue => Some(js)
    case any: AnyContent => any.asJson
    case _ => None
  }

  override def invokeBlock[A](request: Request[A], block: (Request[A]) => Future[Result]): Future[Result] = {
    jsonBody(request.body) match {
      case Some(js) => js.validate(validator) match {
        case JsSuccess(_, _) => block(request)
        case JsError(e) => immediate(BadRequest(s"Bad Json: $e"))
      }
      case None => immediate(UnsupportedMediaType("Bad content type: expected Json"))
    }
  }
}
This is basically what happens behind the scenes when you, for example, bind a form on a request with an arbitrary and unknown body.
On the other hand you might find it better to just write a specific BodyParser to validate your data, e.g:
def jsonFormatParser(validator: Reads[JsValue]): BodyParser[JsValue] = parse.json.validate { js =>
  js.validate(validator) match {
    case JsSuccess(_, _) => Right(js)
    case JsError(e) => Left(BadRequest(s"Json didn't have the right format: $e"))
  }
}
// A validator which checks the JSON contains a "hello" path which
// then contains a "world" path
val helloWorldValidator: Reads[JsValue] = Reads.required(__ \ "hello" \ "world")
def helloWorldAction = Action(jsonFormatParser(helloWorldValidator)) { implicit request =>
  Ok("Got the right type of JSON")
}
It depends on your use case. Since JSON validators (Reads[JsValue]) are nicely composable via andThen, that might be your best bet.
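As a rough illustration of that composability (my own sketch, not part of the original answer), the helloWorldValidator above could equally be built by chaining two smaller validators; andThen feeds the value produced by the first Reads into the second:

// Sketch only: the first Reads extracts the value at "hello"; the second then
// requires a "world" key inside it, so the combined Reads checks "hello" \ "world".
val helloThenWorld: Reads[JsValue] =
  (__ \ "hello").read[JsValue] andThen (__ \ "world").read[JsValue]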
Using Spray Routing, I would like have a single directive that merges the query string parameters with a JSON entity, with both being optional. I would want to have this happen before any marshalling happens.
Something like this:
val myRoute = mergedParametersAndEntity(as[DomainSpecificClass]) { myobj =>
// ... code code code ...
complete(OK, myobj.someMethod)
}
Basically what I was hoping for was the following behavior:
When someone does a request like:
POST /v1/resource?a=helloQS&b=helloQS
Content-Type: application/json
{"a":"helloFromJson","c":"helloFromJson"}
Then the object above (myobj) could contain the keys:
a -> helloFromJson
b -> helloQS
c -> helloFromJson
In other words, items specified in the request body would override things in the query string. I know this must be possible somehow, but I simply cannot figure out how to do it. Can anyone help?
Thank you!
My suggestion: don't take this approach. It feels dirty. Go back to the drawing board if you can.
With that in mind, you won't be able to interject a merge like you want with the existing JSON marshalling support in Spray. You'll need to stitch it together yourself. Something like this should at least point you in the right direction (provided that's the direction you must go):
import org.json4s._
import org.json4s.JsonAST._
import org.json4s.JsonDSL._

// extract[T] needs an implicit Formats in scope
implicit val formats: Formats = DefaultFormats

def mergedParametersAndEntity[T](implicit m: Manifest[T]): Directive1[T] = {
  entity(as[JObject]).flatMap { jObject =>
    parameterMap flatMap { params =>
      val left = JObject(params.mapValues { v => JString(v) }.toSeq: _*)
      provide(left.merge(jObject).extract[T])
    }
  }
}
If anyone is still wondering how to do this, here's a small Spray directive that copies param values into the JSON body before unmarshalling. It uses json-lenses (https://github.com/jrudolph/json-lenses) to modify the request entity.
def updateJson(update: Update): Directive0 = {
  mapRequest((request: HttpRequest) => {
    request.entity match {
      case Empty => request
      case NonEmpty(contentType, data) =>
        try {
          request.copy(entity = HttpEntity(`application/json`, JsonParser(data.asString).update(update).toString))
        } catch {
          case e: ParsingException => request
        }
    }
  })
}
And here's how you use it. Say you want to update a resource with a PUT request and pass the ID from the URL to the unmarshaller (a very common scenario btw):
// Model
case class Thing(id: Long, title: String, description: String, createdAt: DateTime)

// Actor message for the update operation
case class UpdateThing(id: Long, title: String)

// Spray routing (LongNumber extracts the ID as a Long so it can be set in the JSON)
def thingsRoute =
  path("things" / LongNumber) { id =>
    put {
      updateJson(field("id") ! set[Long](id)) {
        entity(as[UpdateThing]) { updateThing =>
          complete {
            // updateThing now has the ID set
            (someActor ? updateThing).mapTo[Thing]
          }
        }
      }
    }
  }
You can also combine it with the parameters directive to set arbitrary GET params.
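For instance, here's a rough sketch of that combination (my own addition; the "title" parameter is purely illustrative): nesting the directive lets you copy both the URL ID and a query parameter into the body before unmarshalling.

// Sketch only: overwrite both "id" (from the path) and "title" (from the query
// string) in the JSON body, then unmarshal as before.
def thingsRouteWithParam =
  path("things" / LongNumber) { id =>
    put {
      parameters('title) { title =>
        updateJson(field("id") ! set[Long](id)) {
          updateJson(field("title") ! set[String](title)) {
            entity(as[UpdateThing]) { updateThing =>
              complete((someActor ? updateThing).mapTo[Thing])
            }
          }
        }
      }
    }
  }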
Say I have JSON as such:
{
  "field": {
    "nested": {
      "foo": "foo val",
      "bar": "bar val"
    },
    "toignore1": {
    },
    "toignore2": {
    }
  }
}
I can't seem to parse this correctly, and since it's possible I don't know all the fields to ignore, e.g. toignore3..., I don't want to call them out in the models. I just need a few values from the whole response. If JSON_STRING represents the JSON above, why can't I do this when parsing with Jerkson?
case class JsonModel(val field: FieldModel)
case class FieldModel(val nested: NestedModel) // ignoring other stuff here
case class NestedModel(val foo: String, bar: String)
val parsed = parse[JsonModel](JSON_STRING)
You can do this one of two ways:
case class CaseClassWithIgnoredField(id: Long) {
  @JsonIgnore
  val uncomfortable = "Bad Touch"
}

@JsonIgnoreProperties(Array("uncomfortable", "unpleasant"))
case class CaseClassWithIgnoredFields(id: Long) {
  val uncomfortable = "Bad Touch"
  val unpleasant = "The Creeps"
}