Scala - Custom Encoder for Upickle/Ujson Library

I am working with the Upickle/Ujson library and want to write a custom encoder to get the hang of things.
Suppose I have the following hierarchy (from the tutorial here: Link)
import upickle.default._

object TestDrive extends App {
  sealed trait TypedFoo
  object TypedFoo {
    import upickle.default._
    implicit val readWriter: ReadWriter[TypedFoo] = ReadWriter.merge(
      macroRW[Bar], macroRW[Baz], macroRW[Quz]
    )

    case class Bar(i: Int) extends TypedFoo
    case class Baz(s: String) extends TypedFoo
    case class Quz(b: Boolean) extends TypedFoo
  }
  import TypedFoo._
  println(writeJs(Bar(100)))
}
Firstly, this fails to compile. Why is this the case? Have I misunderstood the page?
Secondly, what if I want to serialize Bar and also add a field, "parent":"TypedFoo", at the same time? So Bar looks like:
{"parent":"TypedFoo", "$type":"package.TestDrive.TypedFoo.Bar","i":100}
How can I do this too?
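For the second part, one way uPickle lets you shape the JSON by hand is readwriter[ujson.Value].bimap, roughly as sketched below. Note that this produces a plain ReadWriter rather than a tagged one, so it would likely not slot into ReadWriter.merge as-is, and the "$type" string is hard-coded purely for illustration:
// Sketch: a hand-rolled ReadWriter for Bar that writes the extra "parent" field.
// The "$type" value below is hard-coded for illustration only.
implicit val barRW: ReadWriter[Bar] = readwriter[ujson.Value].bimap[Bar](
  bar => ujson.Obj(
    "parent" -> "TypedFoo",
    "$type"  -> "TestDrive.TypedFoo.Bar",
    "i"      -> bar.i
  ),
  json => Bar(json("i").num.toInt)
)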

Related

Circe asJson not encoding properties from abstract base class

Suppose I have the following abstract base class:
package Models

import reactivemongo.bson.BSONObjectID

abstract class RecordObject {
  val _id: String = BSONObjectID.generate().stringify
}
Which is extended by the following concrete case class:
package Models
case class PersonRecord(name: String) extends RecordObject
I then try to get a JSON string using some code like the following:
import io.circe.syntax._
import io.circe.generic.auto._
import org.http4s.circe._
// ...
val person = new PersonRecord(name = "Bob")
println(person._id, person.name) // prints some UUID and "Bob"
println(person.asJson) // {"name": "Bob"} -- what happened to "_id"?
As you can see, the property _id: String inherited from RecordObject is missing. I would expect that the built-in Encoder should function just fine for this use case. Do I really need to build my own?
Let's see what happens in encoder generation. Circe uses shapeless to derive its codecs, so it's enough to check what shapeless resolves the class into to answer your question. So, in Ammonite:
@ abstract class RecordObject {
    val _id: String = java.util.UUID.randomUUID.toString
  }
defined class RecordObject

@ case class PersonRecord(name: String) extends RecordObject
defined class PersonRecord

@ import $ivy.`com.chuusai::shapeless:2.3.3`, shapeless._
import $ivy.$, shapeless._

@ Generic[PersonRecord]
res3: Generic[PersonRecord]{type Repr = String :: shapeless.HNil} = ammonite.$sess.cmd3$anon$macro$2$1@1123d461
OK, so it's String :: HNil. Fair enough: what shapeless does is extract all the fields available in the constructor when transforming one way, and put all the fields back through the constructor when converting the other way.
Basically all type class derivation works this way, so you should make it possible to pass _id as a constructor parameter:
abstract class RecordObject {
  val _id: String
}

case class PersonRecord(
  name: String,
  _id: String = BSONObjectID.generate().stringify
) extends RecordObject
That would help the type class derivation do its work. If you cannot change what PersonRecord looks like... then yes, you have to write your own codec. Though I doubt it would be easy: you made _id immutable and impossible to set from the outside through a constructor, so it would also be hard to implement in any other way.
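If you do end up writing your own, the encoding half at least stays simple; a minimal sketch, assuming PersonRecord is left unchanged:
import io.circe.{Encoder, Json}
import io.circe.syntax._

// Sketch: explicitly include the inherited _id field when encoding.
// Decoding is the hard part, since _id cannot be set from the outside.
implicit val personRecordEncoder: Encoder[PersonRecord] = Encoder.instance { p =>
  Json.obj(
    "_id"  -> p._id.asJson,
    "name" -> p.name.asJson
  )
}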

Serialize extended class to JSON in Scala with Play Framework serializer

I would like to serialize an extended class in Scala, and I have some test code:
import org.specs2.mutable._
import org.specs2.runner._
import org.junit.runner._
import play.api.libs.json.Json

@RunWith(classOf[JUnitRunner])
class JsonSerializerTest extends Specification {

  class A(val s1: String)
  case class B(s2: String) extends A("a")

  "Application" should {
    "serialize class to JSON" in {
      implicit val bWrites = Json.writes[B]
      implicit val bReads = Json.reads[B]
      val bClass = B("b")
      println(bClass.s1 + " " + bClass.s2)
      val serialized = Json.toJson[B](bClass)
      val s1 = (serialized \ "s1").asOpt[String]
      s1 should beSome[String]
    }
  }
}
In this case the test prints:
a b
Application should
'None' is not Some
java.lang.Exception: 'None' is not Some
This means that the s1 field from the parent class was not serialized.
The solution
class A(val s1: String)
case class B(override val s1: String, s2: String) extends A(s1)
is mostly unacceptable, because in a real application classes have a lot of fields, and specifying them explicitly every time I extend a class complicates the code.
Is there any other solution for this case?
You can manually create the JSON serializers (described here: https://www.playframework.com/documentation/2.5.x/ScalaJsonCombinators).
The problem with your version is that Json.writes and Json.reads are macros that look specifically at the constructor of the case class and build the serializer from that (so superclass arguments aren't captured). You could copy and roll your own version of the macro: https://github.com/playframework/playframework/blob/d6c2673d91d85fd37de424951ee5ad9f4f4cce98/framework/src/play-json/src/main/scala/play/api/libs/json/JsMacroImpl.scala
Lastly, you can make a function that takes the result of Json.writes and Json.reads and adds on the shared fields you want. Something like:
object A {
  // T must be a subtype of A so that t.s1 is accessible here;
  // Writes and JsObject come from play.api.libs.json._
  def writesSubclass[T <: A](writer: Writes[T]): Writes[T] = new Writes[T] {
    def writes(t: T) = Json.obj("s1" -> t.s1) ++ writer.writes(t).as[JsObject]
  }
}

implicit val bWrites = A.writesSubclass(Json.writes[B])
Depending on how often your A gets extended, this might be your best bet.
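If you do go the manual-combinators route mentioned first, a minimal sketch for B (spelling out the inherited s1 by hand) could look like this:
import play.api.libs.json._
import play.api.libs.functional.syntax._

// Sketch: write every field explicitly, including the inherited s1.
implicit val bWrites: Writes[B] = (
  (__ \ "s1").write[String] and
  (__ \ "s2").write[String]
)((b: B) => (b.s1, b.s2))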

Deriving circe Codec for a sealed case class family where base trait has a (sealed) type member

I can easily generically derive a codec for a sealed case class family like this:
import io.circe._
import io.circe.generic.auto._

sealed trait Base
case class X(x: Int) extends Base
case class Y(y: Int) extends Base

object Test extends App {
  val encoded = Encoder[Base].apply(Y(1))
  val decoded = Decoder[Base].apply(encoded.hcursor)
  println(decoded) // Right(Y(1))
}
However, if I add a type member to the base class I can't do it anymore, even if it's bounded by a sealed trait:
import io.circe._
import io.circe.generic.auto._

sealed trait Inner
case class I1(i: Int) extends Inner
case class I2(s: String) extends Inner

sealed trait Base { type T <: Inner }
case class X[S <: Inner](x: S) extends Base { final type T = S }
case class Y[S <: Inner](y: S) extends Base { final type T = S }

object Test extends App {
  val encodedInner = Encoder[Inner].apply(I1(1))
  val decodedInner = Decoder[Inner].apply(encodedInner.hcursor) // Ok
  println(decodedInner) // Right(I1(1))

  // Doesn't work: could not find implicit for Encoder[Base] etc.
  // val encoded = Encoder[Base].apply(Y(I1(1)))
  // val decoded = Decoder[Base].apply(encoded.hcursor)
  // println(decoded)
}
Is there a way I can achieve what I want? If not, what can I change to get something similar?
The main reason this doesn't work is that you are essentially trying to do
Encoder[Base { type T }]
without saying what type T is. This is analogous to expecting this function to compile:
def foo[A] = implicitly[Encoder[List[A]]]
You need to explicitly refine your type.
One way to approach this is with the Aux pattern. You can't use the typical type Aux[S] = Base { type T = S } since that won't give you the coproduct when trying to derive the instance (the X and Y classes can't extend from a type alias). Instead, we could hack around it by creating another sealed trait as Aux and have our case classes extend from that.
So long as all of your case classes extend from Base.Aux instead of directly from Base, you can use the following which abuses .asInstanceOf to appease the type system.
import io.circe.generic.semiauto

sealed trait Inner
case class I1(i: Int) extends Inner
case class I2(s: String) extends Inner

sealed trait Base { type T <: Inner }

// The case classes now extend Base.Aux instead of Base directly:
case class X[S <: Inner](x: S) extends Base.Aux[S]
case class Y[S <: Inner](y: S) extends Base.Aux[S]

object Base {
  sealed trait Aux[S <: Inner] extends Base { type T = S }

  implicit val encoder: Encoder[Base] = {
    semiauto.deriveEncoder[Base.Aux[Inner]].asInstanceOf[Encoder[Base]]
  }
  implicit val decoder: Decoder[Base] = {
    semiauto.deriveDecoder[Base.Aux[Inner]].asInstanceOf[Decoder[Base]]
  }
}
val encoded = Encoder[Base].apply(Y(I1(1)))
val decoded = Decoder[Base].apply(encoded.hcursor)
Note that a lot of this depends on how you are actually using your types. I would imagine that you wouldn't rely on calling Encoder[Base] directly and would instead use import io.circe.syntax._ and call the .asJson extension method. In that case, you may be able to rely on an Encoder[Base.Aux[S]] instance which would be inferred depending on the value being encoded/decoded. Something like the following may suffice for your use case without resorting to .asInstanceOf hacks.
implicit def encoder[S <: Inner : Encoder]: Encoder[Base.Aux[S]] = {
  semiauto.deriveEncoder
}
Again, it all depends on how you are using the instances. I'm skeptical that you actually need a type member in Base; things would be simpler if you moved it to a generic parameter so the deriver could figure out the coproduct for you, along the lines of the sketch below.
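A rough sketch of that alternative, assuming circe-generic's semiauto derivation and an Encoder for the inner type:
import io.circe.Encoder
import io.circe.generic.semiauto

sealed trait Inner
case class I1(i: Int) extends Inner
case class I2(s: String) extends Inner

// Type member replaced by a type parameter, so the coproduct is derivable directly.
sealed trait Base[S <: Inner]
case class X[S <: Inner](x: S) extends Base[S]
case class Y[S <: Inner](y: S) extends Base[S]

object Base {
  // Derivation only needs an Encoder for S, e.g. one derived for Inner.
  implicit def encoder[S <: Inner: Encoder]: Encoder[Base[S]] =
    semiauto.deriveEncoder
}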

Serializing sequences of AnyVal with json4s

I have a problem when trying to serialize sequences of AnyVal using json4s in Scala.
Here is a test using FunSuite that reproduces the problem:
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.Serialization._
import org.scalatest.{FunSuite, Matchers}

case class MyId(id: String) extends AnyVal
case class MyModel(ids: Seq[MyId])

class AnyValTest extends FunSuite with Matchers {
  test("should serialize correctly") {
    implicit val formats = DefaultFormats
    val model = MyModel(Seq(MyId("1"), MyId("2")))
    val text = write(model)
    parse(text).extract[MyModel] shouldBe model
  }
}
The test fails when trying to extract MyModel from the JValue, because it cannot find a suitable value for the ids field.
I notice that AnyVals do work fine when used directly, though, with something like:
case class AnotherModel(id: MyId)
Then I am able to serialise and deserialise correctly.
I know this question is one year old but I ran into the same issue. Writing what I did in case it helps someone else. You will need a custom serializer.
case class Id(asString: String) extends AnyVal

class NotificationSerializer extends CustomSerializer[Id](format => (
  { case JString(s) => Id(s) },
  { case Id(s) => JString(s) }
))
Without the above serializer, your JSON will look something like:
{"ids":[[{"asString":"testId1"},{"asString":"testId2"}]]}
I am not entirely sure why AnyVal case class serialization works fine when it is part of another case class but not standalone. My best guess is that the behavior is due to the JVM's allocation behavior for arrays containing value classes. See the 'When allocation is necessary' section of http://docs.scala-lang.org/overviews/core/value-classes.html.
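To wire this into the test above, the serializer has to be added to the implicit formats; a small sketch, using a serializer written for MyId (MyIdSerializer is just an illustrative name):
// Sketch: the same custom-serializer idea applied to the MyId class from the test.
class MyIdSerializer extends CustomSerializer[MyId](_ => (
  { case JString(s) => MyId(s) },
  { case MyId(s) => JString(s) }
))

implicit val formats: Formats = DefaultFormats + new MyIdSerializer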

Scala activerecord with implicit json format

I have a scala-activerecord model:
case class Person(name: String) extends ActiveRecord with Timestamps
object Person extends ActiveRecordCompanion[Person]
Everything works ok.
Now I want to provide an API and respond with a JSON representation of the entity, so I modified the code:
case class Person(name: String) extends ActiveRecord with Timestamps

object Person extends ActiveRecordCompanion[Person] with DefaultJsonProtocol {
  implicit val jsonFormat = jsonFormat1(Person.apply)
}
Now it causes an exception:
com.github.aselab.activerecord.SchemaSettingException: Cannot find table definition of class Person
at com.github.aselab.activerecord.ActiveRecordException$.tableNotFound(ActiveRecordException.scala:48)
at com.github.aselab.activerecord.Config$$anonfun$schema$1.apply(ActiveRecordConfig.scala:29)
at com.github.aselab.activerecord.Config$$anonfun$schema$1.apply(ActiveRecordConfig.scala:29)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:59)
at com.github.aselab.activerecord.Config$.schema(ActiveRecordConfig.scala:29)
at com.github.aselab.activerecord.ActiveRecordBaseCompanion$class.schema(ActiveRecord.scala:116)
at Person$.schema$lzycompute(Request.scala:12)
at Person$.schema(Request.scala:12)
at com.github.aselab.activerecord.ActiveRecordBaseCompanion$class.table(ActiveRecord.scala:123)
at Person$.table$lzycompute(Request.scala:12)
at Person$.table(Request.scala:12)
at com.github.aselab.activerecord.ActiveRecordBaseCompanion$class.all(ActiveRecord.scala:133)
at Person$.all(Request.scala:12)
at com.github.aselab.activerecord.inner.CompanionIterable$class.companionToIterable(Implicits.scala:15)
at Person$.companionToIterable(Request.scala:12)
at Person$.<init>(Request.scala:13)
at Person$.<clinit>(Request.scala)
... 34 more
EDIT:
I put two breakpoints in ActiveRecordConfig.scala:
Breakpoint A here:
def schema(companion: ActiveRecordBaseCompanion[_, _]): ActiveRecordTables = {
  val clazz = companion.classInfo.clazz
  tables.getOrElse(clazz, throw ActiveRecordException.tableNotFound(clazz.toString))
}
Breakpoint B here:
def registerSchema(s: ActiveRecordTables) = {
  conf = s.config
  s.all.foreach(t => _tables.update(t.posoMetaData.clasz, s))
}
With the first code (without the JSON implicit), execution hits breakpoint B.
With the second code (including the JSON implicit), execution hits breakpoint A first, causing the exception.
JSON support works in version 0.3.1 of scala-activerecord; see the wiki and this issue. For now, with the latest version 0.3.0, you can use the first code together with the form-values conversion:
case class Person(name: String) extends ActiveRecord with Timestamps
object Person extends ActiveRecordCompanion[Person]
And in, for example, your spray controller:
import spray.httpx.SprayJsonSupport._
import spray.json.DefaultJsonProtocol._
requestContext.complete(Person.find(id).toFormValues)
The method toFormValues returns a Map[String, String], which spray-json can implicitly convert to JSON.