How do you convert a case class to a class? - json

I have a case class Person(name: String, id: Int). For serialization reasons (MessagePack), I need to convert this to a plain class. Is there an easy / elegant way to do this which does not require too much boilerplate?
case class Person(name:String, id:Int) -> class Person(name:String, id:Int)

A case class can be serialized like a "normal" class because it is, de facto, a normal class.
Since it makes all constructor arguments accessible by default, it is an even better fit for serialization than a plain class.
The case keyword just tells the compiler to automatically add some methods, such as equals and hashCode, to the class. A companion object with an apply method is also generated, but this doesn't affect the class itself at all.
So if problems arise with serialization, the chance is quite high that the source of the problem lies elsewhere.
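To make that concrete, here is a minimal sketch of roughly what the compiler generates for case class Person(name: String, id: Int). The name PersonPlain and the exact method bodies are illustrative, not the compiler's actual output:

```scala
// Roughly what `case class Person(name: String, id: Int)` expands to:
// a plain class with val fields plus equals/hashCode/toString...
class PersonPlain(val name: String, val id: Int) {
  override def equals(other: Any): Boolean = other match {
    case p: PersonPlain => p.name == name && p.id == id
    case _              => false
  }
  override def hashCode: Int = (name, id).hashCode
  override def toString: String = s"Person($name,$id)"
}

// ...and a companion object providing apply, so `PersonPlain("Andi", 42)` works:
object PersonPlain {
  def apply(name: String, id: Int): PersonPlain = new PersonPlain(name, id)
}
```

Any serializer that can handle the hand-written version can handle the case class, which is why the case keyword itself is rarely the culprit.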
You could try json4s https://github.com/json4s/json4s to convert your case class to JSON and then convert it to the MessagePack format.
import org.json4s._
import org.json4s.native.Serialization
import org.json4s.native.Serialization.{read, write}
implicit val formats = Serialization.formats(ShortTypeHints(List(classOf[Person])))
val jsonString : String = write(Person("Andi", 42))
// convert with MessagePack as you would do with normal JSON

Related

Using snake case without explicit Configuration dependency in Circe

As specified in the documentation, it is possible to convert snake case to the camel case that is idiomatic in Scala. I tried it and it worked fine. Here it is:
implicit lazy val configuration: Configuration = Configuration.default.withSnakeCaseMemberNames
@ConfiguredJsonCodec final case class ModelClass(someField1: String, someField2: Int, someField3: String)
I want to keep my model clean without adding dependencies on external frameworks so it comprises only business-specific case classes.
Is it possible to avoid adding the annotation @ConfiguredJsonCodec and bringing implicit lazy val configuration: Configuration into scope? Maybe it could be configured at the Decoder level?
It's perfectly possible. It's a trade-off:
if you have the implicits in your companion objects, you don't have to import them
if you don't want your models coupled to libraries, you have to put all the implicits in a trait/object and then mix them in or import them every single time you need them
If you are developing an application with a fixed stack, chosen libraries for each task, and so on, having all the implicits in the companion is just cleaner and easier to maintain.
package com.example

package object domain {
  private[domain] implicit lazy val configuration: Configuration = ...
}
package com.example.domain
import io.circe.generic.extras._
@ConfiguredJsonCodec
final case class ModelClass(...)
Many utilities are optimized for this, e.g. enumeratum-circe uses a mixin to add a codec for an enumeration into the companion object.
If you don't want to have them there, because e.g. your models live in one module that should be dependency-free, then you have to put these implicits somewhere else. And that requires writing code manually; there is no way around it:
package com.example.domain
final case class ModelClass(...)
package com.example.domain.circe
import io.circe._
import io.circe.generic.extras.semiauto._

// if I want a mixin:
// class SomeClass extends Codecs { ... }
trait Codecs {
  protected implicit lazy val configuration: Configuration = ...
  implicit val modelClassDecoder: Decoder[ModelClass] = deriveConfiguredDecoder[ModelClass]
  implicit val modelClassEncoder: Encoder[ModelClass] = deriveConfiguredEncoder[ModelClass]
}

// if I want an import:
// import com.example.domain.circe.Codecs._
object Codecs extends Codecs
If you pick this way, you are giving up on e.g. enumeratum-circe's ability to provide codecs, and you will have to write them manually.
You have to pick one of these depending on your use case, but you cannot have the benefits of both at once: either you give up on boilerplate reduction or on dependency reduction.
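The two wiring styles can be sketched without any JSON library at all. Here a made-up Show typeclass stands in for circe's Encoder/Decoder, and CleanModel/CoupledModel are hypothetical names:

```scala
// A hypothetical minimal typeclass standing in for circe's Encoder/Decoder.
trait Show[A] { def show(a: A): String }

// Style 1: dependency-free model; instances live in a separate trait/object.
final case class CleanModel(someField: String)

trait CleanModelInstances {
  implicit val cleanModelShow: Show[CleanModel] = new Show[CleanModel] {
    def show(m: CleanModel): String = s"CleanModel(${m.someField})"
  }
}
object CleanModelInstances extends CleanModelInstances

// Style 2: instance in the companion object; resolved with no import at all.
final case class CoupledModel(someField: String)
object CoupledModel {
  implicit val coupledModelShow: Show[CoupledModel] = new Show[CoupledModel] {
    def show(m: CoupledModel): String = s"CoupledModel(${m.someField})"
  }
}

object Demo {
  def render[A](a: A)(implicit s: Show[A]): String = s.show(a)

  // The companion instance is found via implicit scope -- no import needed.
  val viaCompanion: String = render(CoupledModel("x"))

  // The decoupled instance must be imported at every use site.
  val viaImport: String = {
    import CleanModelInstances._
    render(CleanModel("x"))
  }
}
```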

Deserializing to java objects with Scala and json4s

I have an HTTP client written in Scala that uses json4s/jackson to serialize and deserialize HTTP payloads. So far I was using only Scala case classes as the model and everything was working fine, but now I have to communicate with a third-party service. They provided me with their own model, but it is written in Java, so now I need to deserialize JSON into Java classes as well. It seems to work fine with simple classes, but when a class contains collections like Lists or Maps, json4s has problems and sets all such fields to null.
Is there any way to handle such cases? Maybe I should use different formats (I'm using DefaultFormats plus a few custom ones). Example of the problem, with a test:
import org.json4s.DefaultFormats
import org.json4s.jackson.Serialization.read
import org.scalatest.{FlatSpec, Matchers}
class JavaListTest extends FlatSpec with Matchers {
  implicit val formats = DefaultFormats

  "Java List" should "be deserialized properly" in {
    val input = """{"list":["a", "b", "c"]}"""
    val output = read[ObjectWithList](input)
    output.list.size() shouldBe 3
  }
}
And sample Java class:
import java.util.List;
public class ObjectWithList {
    List<String> list;
}
I have also noticed that when I try to deserialize into a Scala case class that contains a field of type java.util.List[String], I get an exception of type: org.json4s.package$MappingException: Expected collection but got List[String]
The key to solving your issue is composition of formatters. Basically, you want to define the JList formatter as the list formatter composed with a toJList function.
Unfortunately, json4s Formats are extremely difficult to compose, so I used Readers to give you the idea. I also simplified the example to just a Java list:
import org.json4s._
import org.json4s.DefaultReaders._
import org.json4s.jackson.JsonMethods.parse
import scala.collection.JavaConverters._

implicit def javaListReader[A: Reader]: Reader[java.util.List[A]] = new Reader[java.util.List[A]] {
  override def read(value: JValue) = DefaultReaders.traversableReader[List, A].read(value).asJava
}

val input = """["a", "b", "c"]"""
val output = javaListReader[String].read(parse(input))
To my knowledge, json4s Readers will not work with Java classes out of the box, so you might either need to implement a Serializer[JList[_]] the same way, or mirror your Java classes with case classes and use those inside your domain.
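The mirroring route from the last sentence can look like this. A sketch: ObjectWithListMirror and toJavaList are made-up names, and the conversion uses only the standard library:

```scala
import scala.collection.JavaConverters._ // use scala.jdk.CollectionConverters._ on Scala 2.13+

// Scala-side mirror of the Java ObjectWithList; json4s deserializes into this
// case class without trouble, since it only ever sees Scala collections...
case class ObjectWithListMirror(list: List[String]) {
  // ...and this (hypothetical) helper hands a java.util.List to the Java model afterwards.
  def toJavaList: java.util.List[String] = list.asJava
}
```

So instead of teaching json4s about java.util.List, you read into the mirror and convert at the boundary, e.g. read[ObjectWithListMirror](input).toJavaList.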
P.S.
I highly recommend switching to circe or argonaut; then you will forget about most of these JSON problems.

spray / akka http json marshalling case classes as values

In spray / akka http I can marshal/unmarshal a case class like so:
case class Latitude(value: Double)

object Latitude extends DefaultJsonProtocol with SprayJsonSupport {
  implicit val LatitudeFormat = jsonFormat1(Latitude.apply)
}
However, this will marshal a Latitude(42) to the object {"value":42}. I rather want it to be marshalled to just a JsNumber 42. To do so I did the following:
case class Latitude(value: Double)

object Latitude extends DefaultJsonProtocol with SprayJsonSupport {
  implicit object LatitudeFormat extends RootJsonFormat[Latitude] {
    def write(lat: Latitude) = lat.value.toJson
    def read(value: JsValue) = ??? // too much code with decent error handling, but working
  }
}
However, I don't want to do this for every "simple value case class".
My goal is to e.g. create a function (or maybe a macro) that works exactly like spray's jsonFormat1, except that it does not write/read objects but simple values, depending on the case class I use it on.
Unfortunately there does not seem to be any way to extend or compose the RootJsonFormat object that is returned from the jsonFormat1 function. The function itself uses already-deprecated machinery (like ClassManifest), so I'm not sure I want to copy and adjust it for my needs. What is my best option in this situation to get/create such a simple case-class-to-JsValue function?
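One shape such a helper could take, sketched with stripped-down stand-ins (Js, JsNum, Fmt) instead of spray-json's real JsValue/JsonFormat so that the example stays self-contained; the same generic factory shape should translate to spray-json using toJson and convertTo:

```scala
object ValueFormatSketch {
  // Minimal stand-ins for spray-json's JsValue/JsonFormat, just to show the shape.
  sealed trait Js
  final case class JsNum(n: Double) extends Js

  trait Fmt[A] {
    def write(a: A): Js
    def read(js: Js): A
  }

  implicit val doubleFmt: Fmt[Double] = new Fmt[Double] {
    def write(a: Double): Js = JsNum(a)
    def read(js: Js): Double = js match {
      case JsNum(n) => n
      case other    => sys.error(s"expected number, got $other")
    }
  }

  // The generic factory: a format for any single-value wrapper T around V,
  // writing/reading the bare value instead of an object.
  def valueFormat[T, V](construct: V => T)(extract: T => V)(implicit vf: Fmt[V]): Fmt[T] =
    new Fmt[T] {
      def write(t: T): Js = vf.write(extract(t))
      def read(js: Js): T = construct(vf.read(js))
    }

  case class Latitude(value: Double)
  implicit val latitudeFmt: Fmt[Latitude] = valueFormat[Latitude, Double](Latitude.apply)(_.value)
}
```

With this, Latitude(42) round-trips through the bare number representation rather than an object, and each new wrapper class costs one valueFormat line instead of a hand-written format.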

Serializing sequences of AnyVal with json4s

I have a problem when trying to serialize sequences of AnyVal using json4s in scala.
Here is a test using FunSuite that reproduces the problem:
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.Serialization._
import org.scalatest.{FunSuite, Matchers}
case class MyId(id: String) extends AnyVal
case class MyModel(ids: Seq[MyId])
class AnyValTest extends FunSuite with Matchers {
  test("should serialize correctly") {
    implicit val formats = DefaultFormats
    val model = MyModel(Seq(MyId("1"), MyId("2")))
    val text = write(model)
    parse(text).extract[MyModel] shouldBe model
  }
}
The test fails when trying to extract MyModel from the JValue because it cannot find a suitable value for the ids field.
I notice that AnyVal classes work fine when used directly, though, with something like:
case class AnotherModel(id: MyId)
Then I am able to serialise and deserialise correctly.
I know this question is a year old, but I ran into the same issue. I'm writing down what I did in case it helps someone else. You will need a custom serializer.
case class Id(asString: String) extends AnyVal
class NotificationSerializer extends CustomSerializer[Id](format => (
  { case JString(s) => Id(s) },
  { case Id(s) => JString(s) }))
Without the above serializer, your JSON will look something like:
{"ids":[[{"asString":"testId1"},{"asString":"testId2"}]]}
I am not entirely sure why AnyVal case class serialization works fine when the value class is a field of another case class but not inside a collection. My best guess is that the behavior is due to the JVM's allocation behavior for arrays containing value classes; see the 'When allocation is necessary' section of http://docs.scala-lang.org/overviews/core/value-classes.html.
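That guess can be probed with plain reflection. A sketch, restating the question's MyId/MyModel definitions so it stands alone: a value class stored as a direct field is erased to its underlying type, while elements inside a Seq stay boxed, which is consistent with json4s handling the two positions differently:

```scala
// Same shapes as in the question above.
case class MyId(id: String) extends AnyVal
case class AnotherModel(id: MyId)
case class MyModel(ids: Seq[MyId])

object ErasureCheck {
  // Direct field position: the value class is unboxed, so the underlying
  // JVM field of AnotherModel is a plain String.
  val directFieldType: Class[_] = classOf[AnotherModel].getDeclaredField("id").getType

  // Collection position: the element type is erased, so the Seq holds
  // boxed MyId instances at runtime.
  val boxedElement: AnyRef = MyModel(Seq(MyId("1"))).ids.head.asInstanceOf[AnyRef]
}
```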

Serializing and unserializing case classes with lift-json

I'm attempting basic serialization/hydration with lift-json, but without success. As near as I can tell from the package readme, this should work. Help?
I'm using Scala 2.8.0 and Lift 2.2 cross-built for 2.8 with sbt ("net.liftweb" %% "lift-json" % "2.2").
import net.liftweb.json._
import net.liftweb.json.Serialization.{read, write}
implicit val formats = Serialization.formats(NoTypeHints)
case class Route(title: String)
val rt = new Route("x277a1")
val ser = write(rt)
// ser: String = {} ...
val deser = read[Route]("""{"title":"Some Title"}""")
// net.liftweb.json.MappingException: Parsed JSON values do not match with class constructor
Lift-JSON's serialization does not work for case classes defined in the REPL (paranamer can't find the bytecode to read the type metadata). Compile Route with scalac and then the above example works.
The same problem applies whenever the (de)serialized class is not on the classpath; in that case, paranamer can't read the parameter names, and it is necessary to provide a custom ParameterNameReader.
This applies, for example, to:
REPL (as mentioned) - unless you define the class outside the REPL and add it via the classpath.
Play Framework - unless you provide a simple custom ParameterNameReader (see below) or load the (de)serialized class as a Maven/Play/... dependency.
Feel free to add other situations (you can edit this post).
The PlayParameterNameReader:
import net.liftweb.json.ParameterNameReader
import java.lang.reflect.Constructor
import play.classloading.enhancers.LocalvariablesNamesEnhancer
import scala.collection.JavaConversions._
object PlayParameterReader extends ParameterNameReader {
  def lookupParameterNames(constructor: Constructor[_]) = LocalvariablesNamesEnhancer.lookupParameterNames(constructor)
}