I've got a List which holds some Personalization objects. The latter is defined like this:
sealed case class Personalization(firstname: String, lastname: String, keycardId: String)
I need to map this list to a JSON array structure which has to look like this:
"personalization": [
{
"id": "firstname",
"value": "John"
},
{
"id": "lastname",
"value": "Doe"
}...
I am struggling with the part of mapping the field information to id/value pairs. Normally I would create a play.api.libs.json.Format out of the Personalization class and let it map automatically (Json.format[Personalization]), but this time I need to create an array where an entry can hold n attributes.
Is there a way to do this with the Scala Play Framework's JSON library?
Any input is much appreciated, thank you!
Writing such a JSON representation is not that complex, using Writes.transform.
import play.api.libs.json._
case class Personalization(firstname: String, lastname: String, keycardId: String) // No need to seal a case class
implicit def writes: Writes[Personalization] = {
  val tx: JsValue => JsValue = {
    case JsObject(fields) =>
      Json.toJson(fields.map {
        case (k, v) => Json.obj("id" -> k, "value" -> v)
      })

    case jsUnexpected => jsUnexpected // doesn't happen with OWrites
  }

  Json.writes[Personalization].transform(tx)
}
Which can be tested as below.
val personalization = Personalization(
  firstname = "First",
  lastname = "Last",
  keycardId = "Foo")
val jsonRepr = Json.toJson(personalization)
// => [{"id":"firstname","value":"First"},{"id":"lastname","value":"Last"},{"id":"keycardId","value":"Foo"}]
Reading is a little bit tricky:
implicit def reads: Reads[Personalization] = {
  type Field = (String, Json.JsValueWrapper)

  val fieldReads = Reads.seq(Reads[Field] { js =>
    for {
      id <- (js \ "id").validate[String]
      v <- (js \ "value").validate[JsValue]
    } yield id -> v
  })

  val underlying = Json.reads[Personalization]

  Reads[Personalization] { js =>
    js.validate(fieldReads).flatMap { fields =>
      Json.obj(fields: _*).validate(underlying)
    }
  }
}
Which can be tested as below.
Json.parse("""[
{"id":"firstname","value":"First"},
{"id":"lastname","value":"Last"},
{"id":"keycardId","value":"Foo"}
]""").validate[Personalization]
// => JsSuccess(Personalization(First,Last,Foo),)
Note that this approach can be used for any case class format.
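For illustration, here is the same transformation applied to a hypothetical Address case class (a sketch only; Address is not part of the original question), showing that the transform does not depend on Personalization's field names:
case class Address(street: String, city: String)

implicit val addressWrites: Writes[Address] = {
  // Same transform as above: turn each field of the object into an {"id", "value"} entry.
  val tx: JsValue => JsValue = {
    case JsObject(fields) =>
      Json.toJson(fields.map { case (k, v) => Json.obj("id" -> k, "value" -> v) })
    case other => other
  }
  Json.writes[Address].transform(tx)
}

// Json.toJson(Address("Main St", "Springfield"))
// => [{"id":"street","value":"Main St"},{"id":"city","value":"Springfield"}]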
Probably it is possible to do it in a more elegant way than I did, but you can use the following snippet:
import scala.util.Try
import play.api.libs.json._

case class Field(id: String, value: String)

object Field {
  implicit val fieldFormatter: Format[Field] = Json.format[Field]
}

sealed case class Personalization(firstname: String, lastname: String, keycardId: String)

object Personalization {
  implicit val personalizationFormatter: Format[Personalization] = new Format[Personalization] {
    override def reads(json: JsValue): JsResult[Personalization] =
      Try {
        val data = (json \ "personalization").as[JsValue]
        data match {
          case JsArray(value) =>
            val fields = value.map(_.as[Field]).map(f => f.id -> f.value).toMap
            val firstname = fields.getOrElse("firstname", throw new IllegalArgumentException("Mandatory field firstname is absent."))
            val lastname = fields.getOrElse("lastname", throw new IllegalArgumentException("Mandatory field lastname is absent."))
            val keycardId = fields.getOrElse("keycardId", throw new IllegalArgumentException("Mandatory field keycardId is absent."))
            Personalization(firstname, lastname, keycardId)
          case _ =>
            throw new IllegalArgumentException("Incorrect json format for Personalization.")
        }
      }.toEither.fold(e => JsError(e.getMessage), JsSuccess(_))

    override def writes(o: Personalization): JsValue = {
      val fields = List(Field("firstname", o.firstname), Field("lastname", o.lastname), Field("keycardId", o.keycardId))
      JsObject(List("personalization" -> Json.toJson(fields)))
    }
  }
}
It converts {"personalization":[{"id":"firstname","value":"John"},{"id":"lastname","value":"Doe"},{"id":"keycardId","value":"1234"}]} to Personalization(John,Doe,1234) and vice versa.
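For a quick check (a sketch, assuming the Field and Personalization companions above are in scope):
val json = Json.parse("""{"personalization":[
  {"id":"firstname","value":"John"},
  {"id":"lastname","value":"Doe"},
  {"id":"keycardId","value":"1234"}]}""")

json.validate[Personalization]
// => JsSuccess(Personalization(John,Doe,1234),)

Json.toJson(Personalization("John", "Doe", "1234"))
// => {"personalization":[{"id":"firstname","value":"John"}, ...]}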
Here is my json:
{
"stringField" : "whatever",
"nestedObject": { "someProperty": "someValue"}
}
I want to map it to
case class MyClass(stringField: String, nestedObject:String)
nestedObject should not be deserialized; I want json4s to leave it as a string.
The resulting instance should be:
val instance = MyClass(stringField="whatever", nestedObject= """ { "someProperty": "someValue"} """)
I don't understand how to do this in json4s.
You can define a custom serializer:
import org.json4s._
import org.json4s.native.JsonMethods._ // or org.json4s.jackson.JsonMethods._
import org.json4s.JsonDSL._

case object MyClassSerializer extends CustomSerializer[MyClass](f => ( {
  case jsonObj =>
    implicit val format = org.json4s.DefaultFormats
    val stringField = (jsonObj \ "stringField").extract[String]
    val nestedObject = compact(render(jsonObj \ "nestedObject"))
    MyClass(stringField, nestedObject)
}, {
  case myClass: MyClass =>
    ("stringField" -> myClass.stringField) ~
      ("nestedObject" -> myClass.nestedObject)
}))
Then add it to the default formats:
implicit val format = org.json4s.DefaultFormats + MyClassSerializer
println(parse(jsonString).extract[MyClass])
will output:
MyClass(whatever,{"someProperty":"someValue"})
Code run at Scastie
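For reference, the snippet assumes roughly the following definitions, taken from the question:
case class MyClass(stringField: String, nestedObject: String)

val jsonString =
  """{"stringField": "whatever", "nestedObject": {"someProperty": "someValue"}}"""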
I am using the Play Framework and trying to build a JSON validator for a class with abstract members. Shown below, DataSource is the base class whose format I am trying to validate.
// SourceTypeConfig Trait.
trait SourceTypeConfig
final case class RDBMSConfig(...) extends SourceTypeConfig
object RDBMSConfig { implicit val fmt = Json.format[RDBMSConfig] }
final case class DirectoryConfig(
path: String,
pathType: String // Local, gcloud, azure, aws, etc.
) extends SourceTypeConfig
object DirectoryConfig { implicit val fmt = Json.format[DirectoryConfig] }
// FormatConfig trait.
trait FormatConfig
final case class SQLConfig(...) extends FormatConfig
object SQLConfig { implicit val fmt = Json.format[SQLConfig]}
final case class CSVConfig(
header: String,
inferSchema: String,
delimiter: String
) extends FormatConfig
object CSVConfig { implicit val fmt = Json.format[CSVConfig]}
// DataSource base class.
case class DataSource(
name: String,
sourceType: String,
sourceTypeConfig: SourceTypeConfig,
format: String,
formatConfig: FormatConfig
)
What I am hoping to accomplish:
val input: JsValue = Json.parse(
"""
{
"name" : "test1",
"sourceType" : "directory",
"sourceTypeConfig" : {"path" : "gs://test/path", "pathType" "google"},
"format" : "csv",
"formatConfig" : {"header" : "yes", "inferSchema" : "yes", "delimiter" : "|"}
}
"""
)
val inputResult = input.validate[DataSource]
What I am struggling with is building the DataSource object and defining its reads/writes/format. I would like it to match on the sourceType and format values and dispatch to the associated sourceTypeConfig and formatConfig formats so it can parse the JSON.
Instead of building a parser at the DataSource level, I defined parsers at the SourceConfig and FormatConfig levels, similar to what is shown below.
sealed trait SourceConfig { val sourceType: String }

object SourceConfig {
  implicit val fmt = new Format[SourceConfig] {
    def reads(json: JsValue): JsResult[SourceConfig] = {
      def from(sourceType: String, data: JsObject): JsResult[SourceConfig] = sourceType match {
        case "RDBMS" => Json.fromJson[RDBMSConfig](data)(RDBMSConfig.fmt)
        case "directory" => Json.fromJson[DirectoryConfig](data)(DirectoryConfig.fmt)
        case _ => JsError(s"Unknown source type: '$sourceType'")
      }

      for {
        sourceType <- (json \ "sourceType").validate[String]
        data <- json.validate[JsObject]
        result <- from(sourceType, data)
      } yield result
    }

    def writes(source: SourceConfig): JsValue = source match {
      case b: RDBMSConfig => Json.toJson(b)(RDBMSConfig.fmt)
      case b: DirectoryConfig => Json.toJson(b)(DirectoryConfig.fmt)
    }
  }
}
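The FormatConfig side can follow the same pattern. The sketch below mirrors the code above and is only an assumption; the discriminator values ("SQL", "csv") and the format member on the trait are not from the original answer:
sealed trait FormatConfig { val format: String }

object FormatConfig {
  implicit val fmt = new Format[FormatConfig] {
    def reads(json: JsValue): JsResult[FormatConfig] =
      for {
        format <- (json \ "format").validate[String]
        data <- json.validate[JsObject]
        result <- format match {
          case "SQL" => Json.fromJson[SQLConfig](data)(SQLConfig.fmt)
          case "csv" => Json.fromJson[CSVConfig](data)(CSVConfig.fmt)
          case other => JsError(s"Unknown format: '$other'")
        }
      } yield result

    def writes(config: FormatConfig): JsValue = config match {
      case c: SQLConfig => Json.toJson(c)(SQLConfig.fmt)
      case c: CSVConfig => Json.toJson(c)(CSVConfig.fmt)
    }
  }
}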
Then, DataSource could be simply defined as:
object DataSource { implicit val fmt = Json.format[DataSource] }
Another option is to use play-json-derived-codecs library:
libraryDependencies += "org.julienrf" %% "play-json-derived-codecs" % "4.0.0"
import play.api.libs.json._
import julienrf.json.derived.flat

implicit val format1: OFormat[RDBMSConfig] = Json.format[RDBMSConfig]
implicit val format2: OFormat[DirectoryConfig] = Json.format[DirectoryConfig]
implicit val format3: OFormat[SourceTypeConfig] = flat.oformat((__ \ "sourceType").format[String])
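By way of illustration (not verified here; the exact output depends on the library version and the default NameAdapter, which uses the concrete class name as the discriminator value):
val cfg: SourceTypeConfig = DirectoryConfig("gs://test/path", "google")

Json.toJson(cfg)
// => {"path":"gs://test/path","pathType":"google","sourceType":"DirectoryConfig"}

Json.parse("""{"sourceType":"DirectoryConfig","path":"/tmp","pathType":"local"}""").validate[SourceTypeConfig]
// => JsSuccess(DirectoryConfig(/tmp,local),)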
I am building a web app using Scala / Play Framework and Reactive Mongo and I want the models to be defined in the database instead of having them hardcoded.
To do so, I am writing a class EntityInstance taking a Sequence of FieldInstance:
case class EntityInstance(fields: Seq[FieldInstance])
I am trying to accept fields of any type and to convert them to JSON. Example:
new FieldInstance("name", "John") | json: { "name": "John" }
new FieldInstance("age", 18) | json: { "age": 18 }
At the moment I am trying to accept Strings, Booleans and Integers, and if the type is not supported I write an error:
new FieldInstance("profilePicture", new Picture("john.jpg")) | json: { "profilePicture": "Unsupported type" }
I wrote a FieldInstance class taking a fieldName as a String and a value of any type. As soon as that class is instantiated, I match the value against the known types or fall back to a String describing the error.
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => v
    case v: String => v
    case v: Boolean => v
    case _ => "Unrecognized type"
  }
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) = Json.obj(
      fieldInstance.fieldName -> fieldInstance.value
    )
  }
}
I created a companion object with an implicit Writes to JSON so I can call Json.toJson() on an instance of FieldInstance and get JSON as described in my examples above.
I get an error: found: Any, required: play.api.libs.json.Json.JsValueWrapper
I understand that it comes from the fact that my value is of type Any, but I thought the match would narrow that Any to String / Boolean / Int before hitting the Writes.
PS: Ignore the bad naming of the classes; I could not name EntityInstance and FieldInstance simply Entity and Field because those are the classes I use to describe my models.
I found a fix to my problem: the type matching that I was doing in the class should be done in the implicit Writes!
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec
  override def toString(): String = "(" + fieldName + "," + value + ")"
}

object FieldInstance {
  implicit val fieldInstanceWrites = new Writes[FieldInstance] {
    def writes(fieldInstance: FieldInstance) =
      fieldInstance.value match {
        case v: Int => Json.obj(fieldInstance.fieldName -> v)
        case v: String => Json.obj(fieldInstance.fieldName -> v)
        case v: Boolean => Json.obj(fieldInstance.fieldName -> v)
        case _ => Json.obj(fieldInstance.fieldName -> "Unsupported type")
      }
  }
}
This code now allows a user to create an EntityInstance with fields of any type:
val ei = new EntityInstance(Seq[FieldInstance](new FieldInstance("name", "George"), new FieldInstance("age", 25), new FieldInstance("married", true)))
println("-- TEST ENTITY INSTANCE TO JSON --")
println(Json.toJson(ei))
prints: {"entity":[{"name":"George"},{"age":25},{"married":true}]}
Here is my EntityInstance code if you are trying to test it:
case class EntityInstance(fields: Seq[FieldInstance])

object EntityInstance {
  implicit val EntityInstanceWrites = new Writes[EntityInstance] {
    def writes(entityInstance: EntityInstance) =
      Json.obj("entity" -> entityInstance.fields)
  }
}
Your match returns a String, Int or Boolean, but Json.obj expects its value parameters to be of type (String, JsValueWrapper):
def obj(fields: (String, JsValueWrapper)*): JsObject = JsObject(fields.map(f => (f._1, f._2.asInstanceOf[JsValueWrapperImpl].field)))
A quick fix could be to convert the matched value v with toJson, provided an implicit Writes[T] for type T is available (which it is for String, Int and Boolean):
class FieldInstance(fieldNamec: String, valuec: Any) {
  val fieldName = fieldNamec
  val value = valuec match {
    case v: Int => Json.toJson(v)
    case v: String => Json.toJson(v)
    case v: Boolean => Json.toJson(v)
    case _ => Json.toJson("Unrecognized type")
  }
}
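With this change value is a JsValue, so the companion Writes from the question compiles unchanged. A quick, untested check:
Json.toJson(new FieldInstance("age", 18))      // {"age":18}
Json.toJson(new FieldInstance("name", "John")) // {"name":"John"}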
If you'd like to see which DefaultWrites are available, you can browse them in the play.api.libs.json package, in trait DefaultWrites. For example:
/**
* Serializer for Boolean types.
*/
implicit object BooleanWrites extends Writes[Boolean] {
def writes(o: Boolean) = JsBoolean(o)
}
I am using anorm to query and save elements into my postgres database.
I have a JSON column which I want to read as a class of my own.
So, for example, if I have the following classes:
case class Table(id: Long, name:String, myJsonColumn:Option[MyClass])
case class MyClass(site: Option[String], user:Option[String])
I am trying to write the following update:
DB.withConnection { implicit conn =>
  val updated = SQL(
    """UPDATE employee
      |SET name = {name}, my_json_column = {myClass}
      |WHERE id = {id}
    """.stripMargin)
    .on(
      'name -> name,
      'myClass -> myClass,
      'id -> id
    ).executeUpdate()
}
I also defined an implicit converter from JSON to my object:
implicit def columnToSocialData: Column[MyClass] = anorm.Column.nonNull[MyClass] { (value, meta) =>
  val MetaDataItem(qualified, nullable, clazz) = meta
  value match {
    case json: org.postgresql.util.PGobject => {
      val result = Json.fromJson[MyClass](Json.parse(json.getValue))
      result.fold(
        errors => Left(TypeDoesNotMatch(s"Cannot convert $value: ${value.asInstanceOf[AnyRef].getClass} to Json for column $qualified")),
        valid => Right(valid)
      )
    }
    case _ => Left(TypeDoesNotMatch(s"Cannot convert $value: ${value.asInstanceOf[AnyRef].getClass} to Json for column $qualified"))
  }
}
And the error I get is:
type mismatch;
found : (Symbol, Option[com.MyClass])
required: anorm.NamedParameter
'myClass -> myClass,
^
The solution is just to add the following:
import java.sql.{PreparedStatement, Types}
import anorm.{ParameterMetaData, ToStatement}
import play.api.libs.json.Json

implicit val socialDataToStatement = new ToStatement[MyClass] {
  def set(s: PreparedStatement, i: Int, myClass: MyClass): Unit = {
    val jsonObject = new org.postgresql.util.PGobject()
    jsonObject.setType("json")
    jsonObject.setValue(Json.stringify(Json.toJson(myClass)))
    s.setObject(i, jsonObject)
  }
}
and:
implicit object MyClassMetaData extends ParameterMetaData[MyClass] {
  val sqlType = "OTHER"
  val jdbcType = Types.OTHER
}
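With these in scope, the original .on('myClass -> myClass, ...) binding compiles. Reading the column back can go through the Column converter defined above, for example (a sketch, untested, assuming an id value and the usual anorm imports):
import anorm.SqlParser

val socialData: Option[MyClass] = DB.withConnection { implicit conn =>
  SQL("SELECT my_json_column FROM employee WHERE id = {id}")
    .on('id -> id)
    .as(SqlParser.get[MyClass]("my_json_column").singleOpt)
}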
object Users {
  implicit object UserReads extends Reads[User] {
    def reads(json: JsValue) = JsSuccess(User(
      Id((json \ "id").as[String].toLong),
      (json \ "name").as[String],
      (json \ "email").as[String]))
  }

  implicit object UserWrites extends Writes[User] {
    def writes(user: User) = JsObject(Seq(
      "id" -> JsNumber(user.id.get),
      "name" -> JsString(user.name),
      "email" -> Json.toJson(user.email)))
  }

  def view(id: Long) = Action { implicit request =>
    Ok(Json.toJson(User.find(id)))
  }

  def all() = Action { implicit request =>
    Ok(Json.toJson(User.findAll()))
  }

  def save = Action(parse.json) { request =>
    val userJson = request.body
    val user = userJson.as[User]
    try {
      User.create(user)
      Ok("Saved")
    } catch {
      case e: IllegalArgumentException => BadRequest("Error")
    }
  }
}
case class User(
  id: Pk[Long] = NotAssigned,
  name: String = "",
  email: String = "")
The above two are my controller and model. When I send the User data [id, name, email] as JSON using AngularJS, it creates the User object in the database. But it should also be able to create one when I send only [name, email] or just [name], as email could be null. If I'm not wrong, I should adjust this in the reads and writes methods of User, right?
Also, can we have two reads/writes for different purposes? If so, how can that be achieved? Please shed some light. Thanks.
One issue is fixed with the following:
case class User(
  id: Pk[Long] = NotAssigned,
  name: String = "",
  email: Option[String])

implicit object UserReads extends Reads[User] {
  def reads(json: JsValue) = JsSuccess(User(
    Id((json \ "id").as[String].toLong),
    (json \ "name").as[String],
    (json \ "email").asOpt[String]))
}

implicit object UserWrites extends Writes[User] {
  def writes(user: User) = JsObject(Seq(
    "id" -> JsNumber(user.id.get),
    "name" -> JsString(user.name),
    "email" -> Json.toJson(user.email)))
}
Now email can be null, but what about id: Pk[Long]?
I did not compile this, but it should work:
case class User(
  id: Pk[Long] = NotAssigned,
  name: String = "",
  email: Option[String])

implicit object UserReads extends Reads[User] {
  def reads(json: JsValue) = JsSuccess(User(
    (json \ "id").asOpt[Long].map(id => Id[Long](id)).getOrElse(NotAssigned),
    (json \ "name").as[String],
    (json \ "email").asOpt[String]))
}

implicit object UserWrites extends Writes[User] {
  def writes(user: User) = JsObject(Seq(
    "id" -> JsNumber(user.id.get),
    "name" -> JsString(user.name),
    "email" -> Json.toJson(user.email)))
}
Basically you take the id as an Option[Long], like you did with the email. Then you check whether it is set: if yes, you create the Pk instance with Id[Long](id); if not, you provide the NotAssigned singleton instance.
Additional Tip
Alternatively you can try to use
implicit val jsonFormatter = Json.format[User]
It will create the Reads and the Writes for you directly at compile time.
There are two problems with this if you are using an anorm object directly.
First, you need a Format for Pk[Long]:
implicit object PkLongFormat extends Format[Pk[Long]] {
  def reads(json: JsValue): JsResult[Pk[Long]] = {
    json.asOpt[Long].map {
      id => JsSuccess(Id[Long](id))
    }.getOrElse(JsSuccess(NotAssigned))
  }

  def writes(id: Pk[Long]): JsNumber = JsNumber(id.get)
}
Second, it does not work if you do not send an id at all, not even with value null: your client needs to send {"id": null, "name": "Tim"}, because otherwise the macro does not even try to call the Pk formatter (which could handle the JsUndefined) and simply gives you a "validate.error.missing-path" error.
If you don't want to send null ids, you cannot use the macro Json.format[User].
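To illustrate the point (a hedged sketch; the exact error message depends on the Play version):
implicit val jsonFormatter = Json.format[User]

Json.parse("""{"name": "Tim"}""").validate[User]
// => JsError with a missing-path error at /id

Json.parse("""{"id": null, "name": "Tim"}""").validate[User]
// => JsSuccess(User(NotAssigned,Tim,None),)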
Type anorm.Pk is almost exactly like scala.Option in form and function. Avoid writing concrete Reads and Writes for all types it might possibly contain. An example that follows the OptionReads and OptionWrites implementations is as follows:
implicit def PkWrites[T](implicit w: Writes[T]): Writes[Pk[T]] = new Writes[Pk[T]] {
  def writes(o: Pk[T]) = o match {
    case Id(value) => w.writes(value)
    case NotAssigned => JsNull
  }
}

implicit def PkReads[T](implicit r: Reads[T]): Reads[Pk[T]] = new Reads[Pk[T]] {
  def reads(js: JsValue): JsResult[Pk[T]] =
    r.reads(js).fold(e => JsSuccess(NotAssigned), v => JsSuccess(Id(v)))
}
This way, you support every T for which there's a respective Reads[T] or Writes[T], and it's more reliable in handling Id and NotAssigned.
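A hedged usage sketch: with these generic instances in scope, Play can combine the Reads and Writes pair into a Format[Pk[Long]], so the macro format works, subject to the null-id caveat from the previous answer:
implicit val userFormat: Format[User] = Json.format[User]

Json.parse("""{"id": null, "name": "Tim", "email": "tim@example.com"}""").validate[User]
// => JsSuccess(User(NotAssigned,Tim,Some(tim@example.com)),)

Json.toJson(User(Id(1L), "Tim", Some("tim@example.com")))
// => {"id":1,"name":"Tim","email":"tim@example.com"}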