I am looking to provide JSON encoders for the following case class:
import io.circe.generic.extras.Configuration
final case class Hello[T](
  source: String,
  version: Int = 1,
  data: T
)

object Hello {
  implicit val configuration: Configuration = Configuration.default.withDefaults
}
I would ordinarily call deriveEncoder[A] in the companion object, but that doesn't work here because there is no Encoder for T available.
The Hello type will be provided to clients as a library, so I would like to do as much of the boilerplate as possible within this type rather than depend on client code providing the encoder and decoder. Is there an idiomatic solution to this with circe so that clients provide an encoder/decoder for T and this gets used to derive the encoder/decoder for Hello[T]?
Yes, you need to add a context bound requiring an implicit encoder to be present for any type T:
import io.circe.Encoder
import io.circe.generic.semiauto._
final case class Hello[T](
  source: String,
  version: Int = 1,
  data: T
)

object Hello {
  implicit def helloEncoder[T: Encoder]: Encoder[Hello[T]] = deriveEncoder
}
This way, when a user creates their own Hello[Foo] type, they'll have to make sure that Foo has its own encoder in scope.
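For illustration, a minimal client-side sketch of how this gets picked up (Foo is a hypothetical client type, not part of the library):

import io.circe.Encoder
import io.circe.generic.semiauto._
import io.circe.syntax._

// Hypothetical client-side payload type
final case class Foo(message: String)

object Foo {
  // The client supplies an Encoder for its own type...
  implicit val fooEncoder: Encoder[Foo] = deriveEncoder
}

// ...and helloEncoder from the Hello companion picks it up automatically:
val json = Hello(source = "service-a", data = Foo("hi")).asJson
// {"source":"service-a","version":1,"data":{"message":"hi"}}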
I'm trying to write a method that will allow Jackson's ObjectMapper readValue to parse a JSON string into a parameterized object type. Something like this:
case class MyObj(field1: String, field2: String)

val objectMapper: ObjectMapper = new ObjectMapper().registerModule(new DefaultScalaModule)

def fromJson[T](jsonString: String, objTyp: T): T = {
  objectMapper.readValue(jsonString, classOf[T])
}

val x = fromJson("""{"field1": "something", "field2": "something"}""", MyObj)
This of course returns an error of
class type required but T found
I've looked at this issue, Scala classOf for type parameter, but it doesn't seem to help. It seems like this should be possible to do somehow. Looking for any help.
You have to give it the actual runtime class to parse into, not just a type parameter.
One way to do it is passing the class directly:
def fromJson[T](json: String, clazz: Class[T]) = objectMapper.readValue[T](json, clazz)
val x = fromJson("""...""", classOf[MyObj])
Alternatively, you can use ClassTag, which looks a bit messier in implementation, but kinda prettier at call site:
import scala.reflect.ClassTag

def fromJson[T: ClassTag](json: String): T = objectMapper.readValue[T](
  json,
  implicitly[ClassTag[T]].runtimeClass.asInstanceOf[Class[T]]
)
val x = fromJson[MyObj]("""{"field1": "something", "field2": "something"}""")
I've looked at this issue Scala classOf for type parameter but it doesn't seem to help.
In the very first answer there, classTag[T].runtimeClass is suggested as a replacement for classOf[T]. This should help.
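For example, a sketch of the same idea using classTag directly (equivalent to the ClassTag version above):

import scala.reflect.{classTag, ClassTag}

def fromJson[T: ClassTag](json: String): T =
  objectMapper.readValue(json, classTag[T].runtimeClass.asInstanceOf[Class[T]])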
Regarding the signature
def fromJson[T](jsonString: String, objTyp: T): T
Note that the value MyObj you're passing has type MyObj.type (the companion-object type), not MyObj (the case-class type).
Class companion object vs. case class itself
So if you call fromJson("""...""", MyObj) then the types in these two places

def fromJson[...](jsonString: String, objTyp: ???): ???
                                              ^^^   ^^^  <--- HERE

can't be the same.
If it's enough for you to call
fromJson("""...""", classOf[MyObj])
or
fromJson[MyObj]("""...""")
(normally it should be enough), then please see @Dima's answer; you should prefer those options, they're easier.
Just in case, if you really do want to write the call as fromJson("""...""", MyObj), then you can, for example, use the type class ToCompanion (this is more complicated) from
Invoke constructor based on passed parameter
Get companion object of class by given generic type Scala (answer)
// ToCompanion should be defined in a different subproject
def fromJson[C, T](jsonString: String, objTyp: C)(implicit
  toCompanion: ToCompanion.Aux[C, T],
  classTag: ClassTag[T]
): T =
  objectMapper.readValue(jsonString, classTag.runtimeClass.asInstanceOf[Class[T]])
val x = fromJson("""{"field1": "something", "field2": "something"}""", MyObj)
// MyObj(something,something)
I'm a Java developer and pretty new to Scala.
I'm implementing a REST API that uses Spray and Akka.
The API should expose some kind of user CRUD. I'll use only "create user" in this question...
trait DefaultJsonFormats extends DefaultJsonProtocol with SprayJsonSupport with MetaMarshallers {}
class RegistrationService(registration: ActorRef)
                         (implicit executionContext: ExecutionContext)
  extends Directives with DefaultJsonFormats {

  import RegistrationActor._

  implicit val timeout = Timeout(2.seconds)
  implicit val userFormat = jsonFormat2(User)
  implicit val registerFormat = jsonFormat1(Register)
  implicit val registeredFormat = jsonFormat1(Registered)

  val route =
    path("register") {
      post { handleWith { ru: Register => (registration ? ru).mapTo[Registered] } }
    }
}
//------ Actor
object RegistrationActor {
  case class User(id: String, name: String)
  case class Register(user: User)
  case class Registered(status: String)
  case object NotRegistered
}

class RegistrationActor(implDef: String) extends Actor {
  import RegistrationActor._

  def receive: Receive = {
    case Register(user) =>
      val status = ??? // create user: real code would compute and return the status
      sender ! Registered(status)
  }
}
In this approach the JSON serialization and deserialization is pretty annoying. For every object the API deals with, I must define the appropriate format:
implicit val userFormat = jsonFormat2(User)
implicit val registerFormat = jsonFormat1(Register)
implicit val registeredFormat = jsonFormat1(Registered)
I would like to avoid such definitions and use some general JSON converter that works directly with plain objects, so the conversion happens under the hood.
The question is: how can I change this code to use a default Gson/Jackson/Spray converter and avoid defining the implicit ... jsonFormats?
For every object the API deals with, I must define the appropriate format
It is normal to do this once, in a "JsonProtocol" class and import that where needed, rather than defining new formats each time:
import MyJsonProtocol._

val route =
  path("register") {
    post { handleWith { ru: Register => (registration ? ru).mapTo[Registered] } }
  }
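A sketch of what such a shared protocol object might look like (the name MyJsonProtocol is just illustrative):

import spray.json.{DefaultJsonProtocol, RootJsonFormat}
import RegistrationActor._

// Define each format once; import MyJsonProtocol._ wherever JSON (un)marshalling is needed
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit val userFormat: RootJsonFormat[User] = jsonFormat2(User)
  implicit val registerFormat: RootJsonFormat[Register] = jsonFormat1(Register)
  implicit val registeredFormat: RootJsonFormat[Registered] = jsonFormat1(Registered)
}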
how can I change this code to use a default Gson/Jackson/Spray converter and avoid defining the implicit ... jsonFormats?
You would need to declare an implicit marshaller from Registered to HttpResponse (or an intermediate value like String) which was backed by Jackson instead of spray-json, then import that marshaller instead of SprayJsonSupport.
Have a look at the implementation of SprayJsonSupport to see how to do this. It's fairly straightforward, if you're comfortable with implicit conversions.
You can also see how this is done in Json4sSupport in Spray: that trait implements a Marshaller[T] for all types T. Then, at runtime, the Json4s library will try to serialize the object to JSON.
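As a very rough sketch of that idea, assuming spray 1.x's Marshaller.of helper and a Jackson ObjectMapper (untested; check the exact names and signatures against your spray version):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import spray.http.{ContentTypes, HttpEntity}
import spray.httpx.marshalling.Marshaller

val mapper = new ObjectMapper().registerModule(new DefaultScalaModule)

// Marshal any value to JSON via Jackson (reflection-based) instead of spray-json
implicit def jacksonMarshaller[T <: AnyRef]: Marshaller[T] =
  Marshaller.of[T](ContentTypes.`application/json`) { (value, contentType, ctx) =>
    ctx.marshalTo(HttpEntity(contentType, mapper.writeValueAsString(value)))
  }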
In this approach the JSON serialization and deserialization is pretty annoying
There are two main advantages of spray-json's approach over Jackson's:
There is no reflection, so it is faster at runtime
There is no runtime determination of JSON formats, so any issues are caught at compile time
I'm new to Scala and the Play Framework.
Why does Scala not have something like this?
class Customer(idx: Int, emailx: String) {
  val id: Int = idx
  val email: String = emailx
}
....
def customers = Action {
  val customer = new Customer(1, "Customer1")
  Ok(Json.toJson(customer))
}
I like the Play Framework (with Scala, its productivity).
But why should I map each field of my object manually to a JSON field? Was it so hard for Scala to implement this feature, like in Java or C#? Even PHP has json_encode.
Is there any way to achieve this simple goal (return an object as JSON) without any additional manipulations?
Macros are slick and perfect for generating simple case class formats
implicit val jsonFormat = Json.format[Customer]
Typically you put this declaration in the companion object of the type you are generating a format for. This way it is implicitly in scope in any file where you import your type (Customer). Like this:
case class Customer(...)

object Customer {
  implicit val jsonFormat = Json.format[Customer]
}
Then in your controller you can do
Json.toJson(customer)
which will produce the JsValue type expected by Play.
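Putting it together, a minimal sketch of the controller action from the question (inside a controller; field names taken from the question's Customer):

import play.api.libs.json._
import play.api.mvc._

case class Customer(id: Int, email: String)

object Customer {
  // Macro-generated Reads and Writes for Customer
  implicit val jsonFormat = Json.format[Customer]
}

def customers = Action {
  Ok(Json.toJson(Customer(1, "Customer1")))
}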
For my classes I define a converter, so that I can write exactly what you have written, e.g. Json.toJson(customer), but the converter, though simple, does currently have to be written once. E.g.
implicit val customerWrites = new Writes[Customer] {
  def writes(customer: Customer) = Json.obj(
    "id" -> customer.id,
    "email" -> customer.email
  )
}
Perhaps macros, into which I have not delved, could do this more automatically...
Is it possible to create a generic function in Scala, using Play Framework 2.2, that will serialize an arbitrary object to JSON, without having to be supplied a writer or formatter?
For instance, this non-generic code will create a JSON response given a Customer:
import play.api.libs.json._
import play.api.libs.functional.syntax._
case class Customer(id: Int, name: String)

object scratch {
  val p = Customer(1, "n")
  //> p : Customer = Customer(1,n)

  def createJsonResponseCustomer(data: Customer) = {
    implicit val formatter = Json.format[Customer]
    Json.obj("success" -> true, "data" -> Json.toJson[Customer](data))
  }

  createJsonResponseCustomer(p)
  //> res0: play.api.libs.json.JsObject = {"success":true,"data":{"id":1,"name":"n"}}
}
To avoid having to define the formatter for each different object, I'd like to create a generic function like this:
def createJsonResponse[T](data: T) = {
  implicit val formatter = Json.format[T]
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
}
But this attempt produces the error No unapply function found at Json.format[T].
In other words, this works:
def getFormatter(c: Customer) = Json.format[Customer]
but this doesn't:
def getFormatterGeneric[T](c: T) = Json.format[T]
Is there any way around this?
You need to define the formatter somewhere, for each type you wish to read or write. This is because the formatter instances are resolved at compile time, not at runtime. This is a good thing, because it means trying to serialize a type that does not have a serializer becomes a compile-time error, not a runtime one.
Instead of defining the formatters on the fly, define them in a module that you can reuse, e.g.
object JsonFormatters {
  implicit val customerFormat: Format[Customer] = Json.format[Customer]
}
Then import JsonFormatters._ in the scope where you want to write some JSON.
Now, you can write a generic method similar to what you wanted: you just have to specify the requirement for a formatter in the signature of your method. In practice, this is an implicit parameter of type Writes[T].
def createJsonResponse[T](data: T)(implicit writes: Writes[T]) =
  Json.obj("success" -> true, "data" -> Json.toJson[T](data))
You can also write this method signature using context bound syntax, i.e.
def createJsonResponse[T : Writes](data: T) = ...
This requires that there is an instance of Writes[T] in scope; but the compiler will choose the correct instance for you based on the type T, rather than you resolving it explicitly.
Note that Writes[T] is a supertype of Format[T]; since you are only writing JSON in this method, there's no need to specify a requirement for Format[T], which would also give you Reads[T].
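Putting the pieces together, a short usage sketch (reusing the Customer and JsonFormatters definitions above):

import JsonFormatters._

// The compiler finds Writes[Customer] via the Format[Customer] in JsonFormatters
val response = createJsonResponse(Customer(1, "n"))
// {"success":true,"data":{"id":1,"name":"n"}}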
OK, so I am writing implicit conversions for case classes in Scala, using SJSON, to send messages to remote actors using the Akka framework. One of the case classes looks like this:
case class Example(id: String, actr: ActorRef)
How would I go about writing the implicit for this case class?
I have seen that ActorRefs do have a toBinary method, but I need to send it as JSON.
See http://doc.akkasource.org/serialization-scala. Explicit [deep] serialization may be required only for stateful actors, when the underlying actor instance (under ActorRef / RemoteActorRef) holds some important runtime data. For this case, you should implement the following type class for your actor:
/**
 * Type class definition for Actor Serialization
 */
trait FromBinary[T <: Actor] {
  def fromBinary(bytes: Array[Byte], act: T): T
}

trait ToBinary[T <: Actor] {
  def toBinary(t: T): Array[Byte]
}

// client needs to implement Format[] for the respective actor
trait Format[T <: Actor] extends FromBinary[T] with ToBinary[T]
If you want ScalaJSON serialization instead of the default one, you should use the SerializerBasedActorFormat trait
trait SerializerBasedActorFormat[T <: Actor] extends Format[T] {
  val serializer: Serializer

  def fromBinary(bytes: Array[Byte], act: T) = serializer.fromBinary(bytes, Some(act.self.actorClass)).asInstanceOf[T]
  def toBinary(ac: T) = serializer.toBinary(ac)
}
with the ScalaJSON serializer.
The SJSON library supports serialization of plain Scala objects out of the box, without additional configuration (which is enough in most cases). If you need to ignore some properties, or define the serialization policy of embedded objects, read this.
In your case, you would need something like
@BeanInfo
case class Example(id: String,
  @(JSONTypeHint @field)(value = classOf[MyActor])
  actr: ActorRef)

implicit object MyActorFormat extends SerializerBasedActorFormat[MyActor] {
  val serializer = Serializer.ScalaJSON
}
In general, you don't need to serialize your case classes explicitly when you're sending messages to remote actors in Akka: Akka itself serializes all data with protobufs before sending it over TCP.
Why would you need to serialize a reference to the actor? If it's just needed so that the receiving actor can reply to the sender, you can simply use self.sender, if the message was sent with !, or self.senderFuture, when the message was sent with !! or !!!. ActorRef (or RemoteActorRef) is in itself just an abstract interface to an actor, used to encapsulate the internal actor implementation and to let externals communicate with the actor only via messages (in contrast to stdlib Actors, much like it's done with Erlang processes), and it holds a very small amount of data that makes sense to serialize and send over the wire.