I am using Play Framework (Scala) for a micro-service and am using Kafka as the event bus. I have an event consumer that maps to an event class that looks like:
case class MovieEvent[T](
  mediaId: String,
  config: T
)

object MovieEvent {
  implicit def movieEventFormat[T: Format]: Format[MovieEvent[T]] =
    ((__ \ "mediaId").format[String] ~
      (__ \ "config").format[T]
    )(MovieEvent.apply _, unlift(MovieEvent.unapply))
}

object MovieProvider extends SerializableEnumeration {
  implicit val providerReads: Reads[MovieProvider.Value] = SerializableEnumeration.jsonReader(MovieProvider)
  implicit val providerWrites: Writes[MovieProvider.Value] = SerializableEnumeration.jsonWrites
  val Dreamworks, Disney, Paramount = Value
}
The consumer looks like:
class MovieEventConsumer @Inject()(movieService: MovieService)
  extends ConsumerRecordProcessor with LazyLogging {

  override def process(record: IncomingRecord): Unit = {
    val movieEventJson = Json.parse(record.valueString).validate[MovieEvent[DreamworksConfiguration]]
    movieEventJson match {
      case event: JsSuccess[MovieEvent[DreamworksConfiguration]] => processMovieEvent(event.get)
      case er: JsError =>
        logger.error("Unrecognized MovieEvent, attempting to parse as MovieUploadEvent: " + JsError.toJson(er).toString())
        try {
          val data = (Json.parse(record.valueString) \ "upload").as[MovieUploadEvent]
          processUploadEvent(data)
        } catch {
          case er: Exception => logger.error("Unrecognized kafka event", er)
        }
    }
  }

  def processMovieEvent[T](event: MovieEvent[T]): Unit = {
    logger.debug(s"Received movie event: ${event}")
    movieService.createMovieJob(event)
  }

  def processUploadEvent(event: MovieUploadEvent): Unit = {
    logger.debug(s"Received upload event: ${event}")
    movieService.addToCollection(event)
  }
}
Right now, I can only validate one of the three different MovieEvent configurations (Dreamworks, Disney, and Paramount). I can swap out which one I validate in the code, but that's not the point. I would like to validate any of the three without having to create additional consumers. I've tried a few different ideas, but none of them compile. I'm pretty new to Play and Kafka and am wondering if there is a good way to do this.
Thanks in advance!
I am going to assume that the number of possible configurations is finite and all of them are known at compile time (in your example, 3).
One possibility is to make MovieEvent a sealed trait with a generic type T. Here's a minimal example:
case class DreamWorksJobOptions(anOption: String, anotherOption: String)
case class DisneyJobOptions(anOption: String)

sealed trait MovieEvent[T] {
  def mediaId: String
  def config: T
}

case class DreamWorksEvent(mediaId: String, config: DreamWorksJobOptions) extends MovieEvent[DreamWorksJobOptions]
case class DisneyEvent(mediaId: String, config: DisneyJobOptions) extends MovieEvent[DisneyJobOptions]

def tryParse(jsonString: String): MovieEvent[_] = {
  // ... parsing logic goes here
  DreamWorksEvent("dw", DreamWorksJobOptions("some option", "another option"))
}

val parseResult = tryParse("asdfasdf")

parseResult match {
  case DreamWorksEvent(mediaId, config) => println(mediaId + " : " + config.anOption + " : " + config.anotherOption)
  case DisneyEvent(mediaId, config) => println(mediaId + config)
}
which prints out
dw : some option : another option
I omitted the parsing part because I don't have access to Play Json at the moment. But since you have a sealed hierarchy, you can try each of your options one by one. (And you pretty much have to, since we cannot guarantee statically that DreamWorksEvent doesn't have the same JSON structure as DisneyEvent - you need to decide which gets tried first and fall back to parsing the JSON as another type when the first fails to parse.)
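Here is a minimal sketch of what that parsing fallback could look like with play-json, assuming Reads can be derived for each config and event type (your project may derive them differently, for example via the SerializableEnumeration helpers from the question):

import play.api.libs.json._

object MovieEventReads {
  implicit val dreamWorksOptionsReads: Reads[DreamWorksJobOptions] = Json.reads[DreamWorksJobOptions]
  implicit val disneyOptionsReads: Reads[DisneyJobOptions] = Json.reads[DisneyJobOptions]

  implicit val dreamWorksEventReads: Reads[DreamWorksEvent] = Json.reads[DreamWorksEvent]
  implicit val disneyEventReads: Reads[DisneyEvent] = Json.reads[DisneyEvent]

  // Try the DreamWorks shape first; orElse falls back to the Disney shape on failure.
  val movieEventReads: Reads[MovieEvent[_]] =
    dreamWorksEventReads.map(e => (e: MovieEvent[_])) orElse
      disneyEventReads.map(e => (e: MovieEvent[_]))
}

def tryParse(jsonString: String): JsResult[MovieEvent[_]] =
  Json.parse(jsonString).validate(MovieEventReads.movieEventReads)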
Now the rest of your code is very generic. To add a new event type you just need to add another subclass of MovieEvent, and make sure your parsing logic handles that new case. The magic here is that you don't have to specify T when referring to MovieEvent, since you know you have a sealed hierarchy and can thus recover T through pattern matching.
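For example, wiring in the third provider from the original question would just mean adding something like the following (ParamountJobOptions is a placeholder for whatever shape that configuration has), plus one more fallback branch in the parsing logic and one more case in the pattern match:

case class ParamountJobOptions(anOption: String)
case class ParamountEvent(mediaId: String, config: ParamountJobOptions) extends MovieEvent[ParamountJobOptions]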
I apologize in advance if this is an XY problem.
tl;dr:
I'd like to have a compile-time map of type [Request.type, Response.type] so I can effectively say: if I send message Request, the CLI should know at compile time how to deserialize its expected Response, irrespective of the fact that it won't know what type of request is sent until runtime.
too long; still read:
I have a CLI which communicates with an HTTP server, and depending on the type of message sent to the HTTP server, I'd like to validate the JSON response against a case class.
For instance, if I send the HTTP server an AddFoo message, I might want to validate that the JSON response can be deserialized into an AddedFoo, etc.
My current solution is quite hacky. Using play-json, I'm attempting to parse the JSON response using a mapping from config.mode (i.e., command issued to the CLI) to the expected responses' implicit Reads.
My code looks something like this:
val modeToResponseReads: Map[String, Reads[_]] = Map(
  Modes.ADD_FOO -> AddedFoo.addedFooReads,
  Modes.ADD_BOO -> AddedBoo.addedBooReads,
  Modes.GET_WOO -> GetWooResponse.getWooReads
)

parser.parse(args, MyConfig()) match {
  case Some(config) => try {
    val exec = new MyHttpExecutor(remoteUri, config)
    val res = Await.result(exec.getResponse, 100.seconds)

    // passing `Reads` to `as` because JsValue#as[T] cannot be
    // applied at runtime -- only compile-time.
    val _ = Json.parse(res.json.toString)
      .as(modeToResponseReads(config.mode))

    exec.actorSystem.terminate()
    exec.wsClient.close()
  } catch {
    case t: Throwable => logger.error(t.getMessage)
  }
  case None => {
    logger.error("Bad arguments.")
    sys.exit(1)
  }
}
While this works, it's an incredible kludge that becomes increasingly unmaintainable as the number of messages grows. Further, I've found that this pattern needs to be replicated anywhere some kind of validation or conversion has to happen (e.g., a Future[Any] being converted to a Future[AddedFoo]).
Surely my approach isn't the right way... how is this traditionally done? If it is the right way (please no), are there optimizations that can be made?
I managed to accomplish this by encoding the contract directly into the child Request classes. Namely, each child Request class holds a ResponseType type member, with the base class enforcing the response bound through a covariant type parameter.
So I can do something like this:
abstract class Response
abstract class Request[+A <: Response]

case class Foo(id: String)

object Foo {
  implicit val fooReads = Json.reads[Foo]
  implicit val fooFormat = Json.format[Foo]
}

case class FooResponse(foo: Foo) extends Response {
  def greet = println("woo hoo!")
}

object FooResponse {
  implicit val fooRespReads = Json.reads[FooResponse]
  implicit val fooRespFormat = Json.format[FooResponse]
}

case class FooRequest() extends Request[FooResponse] {
  type ResponseType = FooResponse
}

object Main extends App {
  val req: FooRequest = new FooRequest()
  val foo = Foo("12345")
  val resp = new FooResponse(foo)

  val respJsonString = Json.toJson(resp).toString
  println(Json.parse(respJsonString).as[req.ResponseType])
}
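Building on this, a small helper keeps the calling code generic: given any Request[A], the compiler resolves the matching Reads[A] at the call site, so no runtime Map of Reads is needed. (The helper below is not from the original post, just a sketch of how the pieces fit together.)

import play.api.libs.json._

def parseResponse[A <: Response](req: Request[A], raw: String)(implicit reads: Reads[A]): JsResult[A] =
  Json.parse(raw).validate[A] // req is only used to pin down A so the right Reads is picked

// Usage sketch: the result is statically typed as JsResult[FooResponse].
// parseResponse(FooRequest(), respJsonString)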
The validate method on request.body matches the attribute names and value types of the JSON object against those defined in the model. However, if I add an extra attribute to the JSON object and try to validate it, it passes as a JsSuccess when it shouldn't:
{
  "Name": "Bob",
  "Age": 20,
  "Random_Field_Not_Defined_in_Models": "Test"
}
My Person class is defined as follows:
case class Person(name: String, age: Int)
I'm assuming you've been using the built-in Reads[T] or Format[T] converters that Play gives you via Json.reads[T], e.g.:
import play.api.libs.json._
val standardReads = Json.reads[Person]
While these are super-handy, if you need additional validation, you'll have to define a custom Reads[Person] class; but fortunately we can still leverage the built-in JSON-to-case-class macro to do the basic checking and conversion, and then add an extra layer of custom checks if things seem OK:
val standardReads = Json.reads[Person]

val strictReads = new Reads[Person] {
  val expectedKeys = Set("name", "age")

  def reads(jsv: JsValue): JsResult[Person] = {
    standardReads.reads(jsv).flatMap { person =>
      checkUnwantedKeys(jsv, person)
    }
  }

  private def checkUnwantedKeys(jsv: JsValue, p: Person): JsResult[Person] = {
    val obj = jsv.asInstanceOf[JsObject]
    val keys = obj.keys
    val unwanted = keys.diff(expectedKeys)

    if (unwanted.isEmpty) {
      JsSuccess(p)
    } else {
      JsError(s"Keys: ${unwanted.mkString(",")} found in the incoming JSON")
    }
  }
}
Note how we utilize standardReads first, to make sure we're dealing with something that can be converted to a Person. No need to reinvent the wheel here.
We use flatMap to effectively short-circuit the conversion if we get a JsError from standardReads - i.e. we only call checkUnwantedKeys if needed.
checkUnwantedKeys just uses the fact that a JsObject is really just a wrapper around a Map, so we can easily check the names of the keys against a whitelist.
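As a quick sanity check (a sketch; the real JsError payload is a bit more structured than the string shown in the comment):

val good = Json.parse("""{"name": "Bob", "age": 20}""").validate[Person](strictReads)
// good  => JsSuccess(Person(Bob,20))

val extra = Json.parse("""{"name": "Bob", "age": 20, "unexpected": "Test"}""").validate[Person](strictReads)
// extra => JsError mentioning "Keys: unexpected found in the incoming JSON"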
Note that you could also write that flatMap using a for-comprehension, which starts to look a lot cleaner if you need even more checking stages:
for {
  p  <- standardReads.reads(jsv)
  r1 <- checkUnexpectedFields(jsv, p)
  r2 <- checkSomeOtherStuff(jsv, r1)
  r3 <- checkEvenMoreStuff(jsv, r2)
} yield r3
If you want to avoid too much boilerplate, it is possible to make a more generic solution using a little bit of Scala reflection:
import play.api.libs.json._
import scala.reflect.runtime.universe._

def checkedReads[T](underlyingReads: Reads[T])(implicit typeTag: TypeTag[T]): Reads[T] = new Reads[T] {
  def classFields[T: TypeTag]: Set[String] = typeOf[T].members.collect {
    case m: MethodSymbol if m.isCaseAccessor => m.name.decodedName.toString
  }.toSet

  def reads(json: JsValue): JsResult[T] = {
    val caseClassFields = classFields[T]
    json match {
      case JsObject(fields) if (fields.keySet -- caseClassFields).nonEmpty =>
        JsError(s"Unexpected fields provided: ${(fields.keySet -- caseClassFields).mkString(", ")}")
      case _ => underlyingReads.reads(json)
    }
  }
}
Then you can specify your reads instances as:
implicit val reads = checkedReads(Json.reads[Person])
This leverages a fair bit of Scala type magic and also the reflection library (that lets you look at fields on classes).
Rather than relying on a fixed set of fields, the classFields method gets all of the fields dynamically for the case class (type param T). It looks at all of the members and collects only the case class accessors (otherwise we'd pick up methods like toString). It returns a Set[String] of field names.
You'll notice that checkedReads takes an implicit TypeTag[T]. This is supplied by the compiler at compile time and used by the typeOf method.
The remaining code is fairly self-explanatory. If the incoming JSON matches our first case (it is a JsObject and there are fields not on the case class), we return a JsError. Otherwise we pass it on to the underlying reader.
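A quick usage sketch against JSON shaped like the question's example (the error text is illustrative, mirroring the message built above):

implicit val personReads: Reads[Person] = checkedReads(Json.reads[Person])

Json.parse("""{"name": "Bob", "age": 20, "Random_Field_Not_Defined_in_Models": "Test"}""").validate[Person]
// => JsError mentioning "Unexpected fields provided: Random_Field_Not_Defined_in_Models"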
I've found some questions that are very close to mine (e.g. Play Framework / Scala: abstract repository and Json de/serialization) but they didn't solve my problem.
What I'm trying to achieve is an abstraction of my CRUD DAO for the common CRUD operations.
I built a GenericMongoDaoActor for that:
abstract class GenericMongoDaoActor[T <: Entity: Writes](implicit inj: Injector, implicit val f: Format[T])
  extends Actor with Injectable {

  protected val db = inject[DefaultDB]
  protected val collectionName: String
  protected val collection: JSONCollection

  // set to None to use fallback
  def defaultSelector(obj: T): Option[JsObject] = None
  def fallbackSelector(obj: T) = Json.obj("_id" -> Json.obj("$elemMatch" -> obj._id))

  protected def find(jsObject: JsObject) = {
    val currentSender = sender
    val futureOption = collection
      .find(jsObject)
      .cursor[T](ReadPreference.primaryPreferred)
      .headOption

    futureOption.mapTo[Option[T]].flatMap {
      case Some(pobj) =>
        currentSender ! Some(pobj)
        Future { pobj }
      case None =>
        currentSender ! None
        Future { None }
    }
  }

  protected def save(obj: T): Unit = {
    update(obj, true)
  }

  protected def update(obj: T): Unit = {
    update(obj, false)
  }

  private def update(obj: T, upsert: Boolean): Unit = {
    val selector = defaultSelector(obj) match {
      case None => fallbackSelector(obj)
      case Some(sel) => sel
    }

    collection.update(
      selector,
      obj,
      GetLastError.Default,
      upsert).onComplete {
        case Failure(e) => Logger.error("[" + this.getClass.getName + "] Couldn't update " + obj.getClass.getName + ": " + Json.prettyPrint(Json.toJson(obj)), e)
        case Success(lastError) => //currentSender ! lastError todo: do something with lastError
      }
  }

  def findAll() = {
    collection.find(Json.obj()).cursor[T](ReadPreference.primaryPreferred).collect[List]()
  }
}
The DAOActor handles entities that inherit the abstract class Entity:
abstract class Entity {
  val _id: BSONObjectID
}
Currently there are two classes that inherit Entity.
As you can see, my DAOActor is already context-bound to look for a Writes[T] in scope:
abstract class GenericMongoDaoActor[T <: Entity: Writes]
When I try to build my project like that, it complains that there is no OWrites available to serialize obj of type T in the update method:
No Json serializer as JsObject found for type T. Try to implement an implicit OWrites or OFormat for this type.
collection.update( <-------------
I couldn't find a way to solve this issue. Please let me know if you can.
I had similar problems when I migrated from an earlier version of ReactiveMongo. What worked for me was sprinkling around some .as[JsObject] conversions in the various calls to the ReactiveMongo API.
So if before I had:
collection.update(
  selector,
  obj,
  ...
)
I replaced it with:
collection.update(
  selector,
  obj.as[JsObject],
  ...
)
This seemed to be sufficient, although I am supplying the necessary JSON converter(s) in a slightly different way than you: subclasses of my abstract class have to implement an implicit val fmt: Format[T] member. I doubt whether that is important, but it is an approach that seems to be working :-)
You need to use OWrites and OFormat instead of Writes and Format.
I know OWrites extends Writes and OFormat extends Format, but the ReactiveMongo version you are using expects OWrites and OFormat, not their supertypes.
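A minimal sketch of that change against the class from the question (only the bounds move from Writes/Format to OWrites/OFormat; the body stays as it was):

// Before (from the question):
abstract class GenericMongoDaoActor[T <: Entity: Writes](implicit inj: Injector, implicit val f: Format[T])
  extends Actor with Injectable

// After: OFormat[T] extends both OWrites[T] and Reads[T], which is what the
// JSON collection's update and find calls are looking for.
abstract class GenericMongoDaoActor[T <: Entity: OWrites](implicit inj: Injector, implicit val f: OFormat[T])
  extends Actor with Injectable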
[edit]
So, I got a quick and dirty solution, thanks to Edmondo1984. I don't know if it's the best solution, and I don't handle null values with pattern matching in the write function. You can read more details about my problem after this edit. Here is my code now:
object DBNames extends DefaultJsonProtocol {
  implicit val pkFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
    def write(obj: Pk[Int]): JsValue = JsNumber(obj.get)
    def read(json: JsValue): Pk[Int] = json.asJsObject.getFields("id") match {
      case Seq(JsNumber(id)) => Id(id.toInt)
      case _ => throw new DeserializationException("Int expected")
    }
  }

  implicit val nameFormat = jsonFormat2(Name)
}
jsonFormat2 will implicitly use pkFormat to parse Pk[Int] values.
In my controller class I have this:
def listNames() = Action {
  val names = DBNames.findAll()
  implicit val writer = DBNames.nameFormat
  val json = names.toJson
  Ok(json.toString()).as("application/json")
}
I had to get the nameFormat from my model and make it implicit so that names.toJson could use it to serialize the Seq[Name].
[/edit]
I'm trying to use Play! Framework with Scala. I'm new to Scala programming and Play Framework, and everything seems nice, but I've been working on this problem for several hours and haven't found a solution.
I have a Case Class:
case class Name (id: Pk[Int], name: String)
And an object to deal with MySQL. I created an implicit val nameFormat = jsonFormat2(Name) to deal with JSON.
object DBNames extends DefaultJsonProtocol {
  implicit val nameFormat = jsonFormat2(Name)

  val parser = {
    get[Pk[Int]]("id") ~
      get[String]("name") map {
        case id ~ name => Name(id, name)
      }
  }

  def findAll(): Seq[Name] = {
    DB.withConnection { implicit connection =>
      SQL("select * from names").as(DBNames.parser *)
    }
  }

  def create(name: Name) {
    DB.withConnection { implicit connection =>
      SQL("insert into names (name) values ({name})").on(
        'name -> name.name
      ).executeUpdate()
    }
  }
}
But when I try to compile it, Play! gives me this result:
[error] D:\ProjetosJVM\TaskList\app\models\Names.scala:20: could not find implicit value for evidence parameter of type models.DBNames.JF[anorm.Pk[Int]]
It seems like it couldn't find a way to parse the id value, since it is a Pk[Int].
So, by reading this: https://github.com/spray/spray-json I didn't find a way to parse it without creating a complete object parser like the one they show in the documentation:
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit object ColorJsonFormat extends RootJsonFormat[Color] {
    def write(c: Color) = JsObject(
      "name" -> JsString(c.name),
      "red" -> JsNumber(c.red),
      "green" -> JsNumber(c.green),
      "blue" -> JsNumber(c.blue)
    )
    def read(value: JsValue) = {
      value.asJsObject.getFields("name", "red", "green", "blue") match {
        case Seq(JsString(name), JsNumber(red), JsNumber(green), JsNumber(blue)) =>
          new Color(name, red.toInt, green.toInt, blue.toInt)
        case _ => throw new DeserializationException("Color expected")
      }
    }
  }
}
I have a "big" (actually small) project where I want to make most of things work with Ajax, so I think this is not a good way to do it.
How can I deal with JSON objects in this project, where almost all case classes will have a "JSON parser", without creating large ammounts of code, like the snippet above? And also, how can I make it work with an Seq[Name]?
You don't need to write a complete parser. The compiler says:
[error] D:\ProjetosJVM\TaskList\app\models\Names.scala:20: could not find implicit
value for evidence parameter of type models.DBNames.JF[anorm.Pk[Int]]
The Scala compiler is looking for an implicit value of type JF[anorm.Pk[Int]] and there is no such implicit in scope. What is JF[anorm.Pk[Int]]? Well, you need to know the library, and I didn't, so I browsed the spray-json source and found out:
trait StandardFormats {
  this: AdditionalFormats =>

  private[json] type JF[T] = JsonFormat[T] // simple alias for reduced verbosity
so JF[T] is just an alias for JsonFormat[T]. It all makes sense: Pk[Int] is a class coming from Anorm, and spray-json provides out-of-the-box JSON support for standard types but does not even know Anorm exists. So you have to code your own support for Pk[Int] and make it implicitly available in scope.
You will have code like the following:
object DBNames extends DefaultJsonProtocol {
  implicit val pkFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
    // implementation
  }
  // rest of your code
}
If you have just started with Scala, you will probably have to read more about implicits and their resolution. I am providing you with a minimal answer: once you have provided the right implementation, your code will compile. I suggest you refer to the documentation of anorm.Pk and of JsonFormat to understand how to implement it correctly for your type.
Pk looks a lot like scala.Option, and in the StandardFormats source inside spray-json you will find the JsonFormat implementation for Option, which you can use as a starting point.
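For example, here is a minimal sketch modeled on that Option format, assuming the old Anorm Pk type with Id and NotAssigned (adjust for the Anorm version you actually use):

import anorm.{Id, NotAssigned, Pk}
import spray.json._

object PkJsonProtocol extends DefaultJsonProtocol {
  implicit val pkIntFormat: JsonFormat[Pk[Int]] = new JsonFormat[Pk[Int]] {
    def write(pk: Pk[Int]): JsValue = pk match {
      case Id(value)   => JsNumber(value)
      case NotAssigned => JsNull
    }
    def read(json: JsValue): Pk[Int] = json match {
      case JsNumber(n) => Id(n.toInt)
      case JsNull      => NotAssigned
      case other       => throw new DeserializationException("Pk[Int] expected, got " + other)
    }
  }
}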