Converting DateTime to a JSON string - json

I want to convert a case class with an Option[DateTime] parameter to a spray-json object which can be served by an API. Using spray-json, I have a custom JsonFormat like this:
object JsonImplicits extends DefaultJsonProtocol {
  implicit object PostJsonFormat extends RootJsonFormat[Post] {
    def write(p: Post) = JsObject(
      "title" -> JsString(p.title),
      "content" -> JsString(p.content),
      "author" -> JsString(p.author),
      "creationDate" -> JsString(p.creationDate.getOrElse(DateTime.now))
    )
  }
}
But I get:
overloaded method value apply with alternatives:
(value: String)spray.json.JsString <and>
(value: Symbol)spray.json.JsString
cannot be applied to (com.github.nscala_time.time.Imports.DateTime)
"creationDate" -> JsString(p.creationDate.getOrElse(DateTime.now))
when I try to compile it. No matter what I try, I can't seem to convert the DateTime object to a string. For instance, when I call toString I get
ambiguous reference to overloaded definition,
both method toString in class AbstractDateTime of type (x$1: String, x$2: java.util.Locale)String
and method toString in class AbstractDateTime of type (x$1: String)String
match expected type ?
"creationDate" -> JsString(p.creationDate.getOrElse(DateTime.now.toString)))

You have several problems here.
First, the toString method in AbstractDateTime has overloads taking one or two arguments (see here), which is why the call is ambiguous.
But I would advise against this path and recommend using spray-json properly.
Spray-json does not know how to serialize DateTime, so you have to provide a RootJsonFormat for it; once that format is in scope, Option[DateTime] is handled by the standard option support.
This is what I do:
import org.joda.time.DateTime // what nscala-time's Imports.DateTime aliases
import org.joda.time.format.{DateTimeFormatter, ISODateTimeFormat}
import spray.json._

implicit object DateJsonFormat extends RootJsonFormat[DateTime] {
  private val parserISO: DateTimeFormatter = ISODateTimeFormat.dateTimeNoMillis()
  override def write(obj: DateTime) = JsString(parserISO.print(obj))
  override def read(json: JsValue): DateTime = json match {
    case JsString(s) => parserISO.parseDateTime(s)
    case _ => throw new DeserializationException("Error info you want here ...")
  }
}
Adapt it as you want if you do not want to use ISO formatting.
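With that DateJsonFormat in scope, the Post format from the question no longer needs a hand-written write. A minimal sketch, assuming Post is declared with exactly the four fields implied by the question:

// assumed: case class Post(title: String, content: String, author: String,
//                          creationDate: Option[DateTime])
object JsonImplicits extends DefaultJsonProtocol {
  // the DateJsonFormat from above goes here; Option[DateTime] is then covered
  // by spray-json's built-in option support
  implicit val postFormat: RootJsonFormat[Post] = jsonFormat4(Post.apply)
}

import JsonImplicits._
Post("title", "content", "author", Some(DateTime.now)).toJson // creationDate as an ISO-8601 string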

Related

Json.asString returns None even though Json.toString returns the correct value

Given the following case class LogMessage:
import io.circe.{Decoder, Encoder}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}
import enumeratum.{CirceEnum, Enum, EnumEntry}
import io.circe.syntax._

sealed trait LogLevel extends EnumEntry

object LogLevel extends Enum[LogLevel] with CirceEnum[LogLevel] {
  val values = findValues
  case object Warning extends LogLevel
  case object Error extends LogLevel
  case object Info extends LogLevel
  case object Success extends LogLevel
}

object LogMessage {
  implicit val logMessageDecoder: Decoder[LogMessage] = deriveDecoder[LogMessage]
  implicit val logMessageEncoder: Encoder[LogMessage] = deriveEncoder[LogMessage]
}

case class LogMessage(level: LogLevel, text: String, args: List[String], date: Long)

case class MyClass[A](obj: A)(implicit encoder: Encoder[A]) {
  def message1: String = obj.asJson.toString
  def message2: Option[String] = obj.asJson.asString
}
Why does this work:
val x = MyClass(LogMessage(LogLevel.Info, "test notification", Nil, 1550218866571L))
x.message1 // {\n "level" : "Info",\n "text" : "test notification",\n "args" : [\n ],\n "date" : 1550218866571\n}
But this does not:
x.message2 // None
Here is a link to Scastie with this problem: link.
In circe Json has six asX methods that correspond to the six data types in JSON. For example, if a Json instance x represents a JSON boolean, x.asBoolean will return a Some containing the value as a Boolean, but if x is a JSON string, array, object, number, or null, x.asBoolean will be empty.
You're seeing .asString return None in this case because you're calling it on a Json value that represents a JSON object, not a JSON string.
The toString method on Json is completely different: it's the universal Scala / Java toString, which in the case of Json is implemented as .spaces2. I'm not sure what you're trying to do here, but in general I'd recommend avoiding toString—if you want to serialize an io.circe.Json value, it's better to use a printer or the printing methods that make the formatting options more explicit (e.g. noSpaces, spaces2, etc.).
(For what it's worth, I'm not entirely happy with the naming of asString, asNull, etc. methods on Json. In general in circe "as" is used in method names for encoding or decoding, which isn't exactly what's happening in these cases, but it's close enough that I've never bothered to come up with a better alternative.)
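To make the distinction concrete, here is a quick sketch using the LogMessage from the question (hcursor.get and the printer methods are standard circe API; the result comments are what I would expect, not quoted output):

import io.circe.Json
import io.circe.syntax._

val json: Json = LogMessage(LogLevel.Info, "test notification", Nil, 1550218866571L).asJson

json.asString                      // None: this Json is a JSON object, not a JSON string
Json.fromString("hello").asString  // Some("hello"): asString only succeeds on JSON strings
json.hcursor.get[String]("text")   // Right("test notification"): decode a field instead

json.noSpaces                      // compact printing with explicit formatting
json.spaces2                       // what toString delegates to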

Create mapping from string to class type in scala

I have a lot of different external JSON entities that I want to parse into different internal case classes via json4s (Scala). Everything works fine via the extract function from json4s. I have implemented a parse function which takes a type and a JSON string and parses the string into that type / case class. To map the correct JSON string to the correct case class, I have implemented a pattern matching function which looks like this:
entityName match {
  case "entity1" => JsonParser.parse[Entity1](jsonString)
  case "entity2" => JsonParser.parse[Entity2](jsonString)
  ....
I don't like the repetition here and would like to do this mapping via a map like this:
val mapping = Map(
  "entity1" -> Entity1,
  "entity2" -> Entity2
  ....
With this map in place I could implement the JsonParser.parse function only once like this
JsonParser.parse[mapping(entityName)](jsonString)
This does not work, because the map references the companion objects and not the class types. I also tried classOf[Entity1], but that does not work either. Is there a way to do this?
Thanks!
The way you want your JsonParser.parse to work is not possible in Scala. Scala is a strongly and statically typed language, which means the compiler has to know the types of values at compile time in order to verify that you access only valid fields and methods on them and/or pass them as valid parameters to methods. Assuming your classes are
case class Entity1(value:Int, unique1:Int)
case class Entity2(value:String, unique2:String)
and you write
val parsed = JsonParser.parse[mapping("entity1")](jsonString)
how could the compiler know the type of parsed, and therefore the type of parsed.value, or know that parsed.unique1 is a valid field while parsed.unique2 is not? The best type the compiler could assign to such a parsed is something very generic like Any. Of course you can downcast that Any to the specific type later, but then you still have to spell out that type explicitly in the asInstanceOf, which rather defeats the whole purpose. Still, if returning Any is acceptable for you, you can try something like this:
import org.json4s.jackson.JsonMethods

implicit val formats = org.json4s.DefaultFormats // or whatever Formats you actually need

val typeMap: Map[String, scala.reflect.Manifest[_]] = Map(
  "String" -> implicitly[scala.reflect.Manifest[String]],
  "Int" -> implicitly[scala.reflect.Manifest[Int]]
)

def parseAny(typeName: String, jsonString: String): Any = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract(formats, typeMap(typeName))
}
and then do something like this:
def testParseByTypeName(typeName: String, jsonString: String): Unit = {
  try {
    val parsed = parseAny(typeName, jsonString)
    println(s"parsed by name $typeName => ${parsed.getClass} - '$parsed'")
  } catch {
    case e: Exception => println(e)
  }
}

def test() = {
  testParseByTypeName("String", "\"abc\"")
  testParseByTypeName("Int", "123")
}
P.S. If your entityName doesn't come from the outside (i.e. you don't have to analyze the data to find out the actual type), you don't actually need it at all. It is enough to use the type parameter (no match/case needed), such as:
def parse[T](jsonString: String)(implicit mf: scala.reflect.Manifest[T]): T = {
  val jValue = JsonMethods.parse(jsonString)
  jValue.extract[T]
}

def testParse[T](prefix: String, jsonString: String)(implicit mf: scala.reflect.Manifest[T]): Unit = {
  try {
    val parsed = parse[T](jsonString)
    println(s"$prefix => ${parsed.getClass} - '$parsed'")
  } catch {
    case e: Exception => println(e)
  }
}

def test() = {
  testParse[String]("parse String", "\"abc\"")
  testParse[Int]("parse Int", "123")
}
Following the idea from @SergGr, here is a snippet to paste into an Ammonite REPL:
{
  import $ivy.`org.json4s::json4s-native:3.6.0-M2`
  import org.json4s.native.JsonMethods.parse
  import org.json4s.DefaultFormats
  import org.json4s.JValue

  case class Entity1(name: String, value: Int)
  case class Entity2(name: String, value: Long)

  implicit val formats = DefaultFormats

  def extract[T](input: JValue)(implicit m: Manifest[T]) = input.extract[T]

  val mapping: Map[String, Manifest[_]] = Map(
    "entity1" -> implicitly[Manifest[Entity1]],
    "entity2" -> implicitly[Manifest[Entity2]]
  )

  val input = parse(""" { "name" : "abu", "value" : 1 } """)

  extract(input)(mapping("entity1")) // Entity1("abu", 1)
  extract(input)(mapping("entity2")) // Entity2("abu", 1L)
}

JSON4S deserialization without parameter name

I have the following use-case:
Each class that I serialize/deserialize using JSON4S has a field named ID. This ID can be of any type T <: Stringifiable, where Stringifiable requires the ID type to be hashed to a string. Stringifiables also have constructors that rebuild them from a string.
I'd like to serialize/deserialize any Stringifiable, for example ComplexIdentifier, to a JSON of the shape ID: stringified_identifier. Serialization works nicely, but unfortunately during deserialization JSON4S will not use the constructor that takes a single string. It finds the constructor, but if the identifier has the signature case class ComplexIdentifier(whatever: String), it tries to extract a field named whatever from the JString(stringified_identifier). That fails, so a MappingException is thrown internally.
Is there any way to teach JSON4S to use that constructor without extracting values like this? It would be so obvious to just take the value from the JString and construct the Stringifiable from it.
Thanks!
Use the apply method in a companion object to overload the constructor for the ID classes with a String parameter. Then just use a custom serializer for all of your ID types:
import org.json4s._ // JString, JNull, DefaultFormats, CustomSerializer
import org.json4s.jackson.JsonMethods.parse // assuming the jackson backend; use org.json4s.native otherwise
import org.json4s.jackson.Serialization.write

sealed abstract class Stringifiable {}
case class ComplexIdentifier(whatever: List[Long]) extends Stringifiable
case class SimpleIdentifier(whatever: Int) extends Stringifiable

// Overload the default constructor
object ComplexIdentifier {
  def apply(s: String): ComplexIdentifier = {
    ComplexIdentifier(s.split(",").map(_.toLong).toList)
  }
}

case class MyClass(id: ComplexIdentifier, value: String)
Then use a custom serializer:
case object ComplexIdentifierSerializer extends CustomSerializer[ComplexIdentifier](formats =>
  (
    {
      case JString(id) => ComplexIdentifier(id)
      case JNull => null
    },
    {
      case x: ComplexIdentifier => JString(x.whatever.mkString(","))
    }
  )
)
Finally, make sure to include the serializer in the implicit formats:
implicit val formats = DefaultFormats ++ List(ComplexIdentifierSerializer)

println(parse("""
  {
    "id": "1",
    "value": "big value"
  }
""").extract[MyClass])

val c = MyClass(ComplexIdentifier("123,456"), "super value")
println(write(c))

How to model finite set of values of enum-like type for (de)serialization?

I am using Spray-json 1.3.1. I have the following JSON message:
{
  "results": [{
    ... NOT IMPORTANT PART HERE ...
  }],
  "status": "OK"
}
Trivially, this can be deserialized with status as a String field via
case class Message[T](results: List[T], status: String)
with a custom protocol
object MessageProtocol extends DefaultJsonProtocol {
  implicit def messageFormat[T: JsonFormat] = jsonFormat2(Message.apply[T])
}
Since the status field can only be one of OK, ZERO_RESULTS, OVER_QUERY_LIMIT, etc., having it as a String makes no sense. Coming from a Java background, I tried an enum in Scala, implemented as follows:
case class Message[T](results: List[T], status: Status)

object Status extends Enumeration {
  type Status = Value
  val OK, ZERO_RESULTS, OVER_QUERY_LIMIT, REQUEST_DENIED, INVALID_REQUEST, UNKNOWN_ERROR = Value
}

object MessageProtocol extends DefaultJsonProtocol {
  implicit val statusFormat = jsonFormat(Status)
  implicit def messageFormat[T: JsonFormat] = jsonFormat2(Message.apply[T])
}
What is the best practice/approach to solve this?
You can simply implement your own RootJsonFormat (as an implicit in the Message companion object) and override the read and write functions. There you will have a JsObject, and you can convert it to your own case class however you want, e.g. converting the string to the desired enumeration value. You can see a sample here.
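For reference, a minimal sketch of a slightly more direct variant: give the Status enumeration its own format, so the derived jsonFormat2 keeps working. Names follow the question; this is not the linked sample, just one common way to write it:

import spray.json._

object MessageProtocol extends DefaultJsonProtocol {
  implicit object StatusFormat extends RootJsonFormat[Status.Status] {
    def write(status: Status.Status): JsValue = JsString(status.toString)
    def read(json: JsValue): Status.Status = json match {
      case JsString(s) => Status.withName(s) // throws NoSuchElementException for unknown values
      case other       => deserializationError(s"Expected status as JsString, got $other")
    }
  }
  implicit def messageFormat[T: JsonFormat]: RootJsonFormat[Message[T]] = jsonFormat2(Message.apply[T])
}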

Overcoming type erasure in Scala when pattern matching on Objects which may be different Sets or any type of Object

Is there any way of pattern matching objects where the objects may be Set[Foo] or Set[Bar], when the matched object can be any Object?
Given the below code, trying to pattern match on Set[Bar] will result in a match of Set[Foo] because of type erasure.
import play.api.libs.json._
import scala.collection.immutable.HashMap

case class Foo(valOne: Int, valTwo: Double)
object Foo {
  implicit val writesFoo = Json.writes[Foo]
}

case class Bar(valOne: String)
object Bar {
  implicit val writesBar = Json.writes[Bar]
}

case class TestRequest(params: Map[String, Object])
object TestRequest {
  import play.api.libs.json.Json.JsValueWrapper

  implicit val writeAnyMapFormat = new Writes[Map[String, Object]] {
    def writes(map: Map[String, Object]): JsValue = {
      Json.obj(map.map {
        case (s, a) => {
          val ret: (String, JsValueWrapper) = a match {
            case _: String => s -> JsString(a.asInstanceOf[String])
            case _: java.util.Date => s -> JsString(a.asInstanceOf[String])
            case _: Integer => s -> JsString(a.toString)
            case _: java.lang.Double => s -> JsString(a.toString)
            case None => s -> JsNull
            case foo: Set[Foo] => s -> Json.toJson(a.asInstanceOf[Set[Foo]])
            case bar: Set[Bar] => s -> Json.toJson(a.asInstanceOf[Set[Bar]])
            case str: Set[String] => s -> Json.toJson(a.asInstanceOf[Set[String]])
          }
          ret
        }
      }.toSeq: _*)
    }
  }

  implicit val writesTestRequest = Json.writes[TestRequest]
}

object MakeTestRequest extends App {
  val params = HashMap[String, Object]("name" -> "NAME", "fooSet" -> Set(Foo(1, 2.0)), "barSet" -> Set(Bar("val1")))
  val testRequest = new TestRequest(params)
  println(Json.toJson(testRequest))
}
Trying to serialise the TestRequest will result in:
Exception in thread "main" java.lang.ClassCastException: Bar cannot be cast to Foo
Delegating the pattern matching of Sets to another method in an attempt to get the TypeTag,
case _ => s -> matchSet(a)
results in the type, unsurprisingly, of Object.
def matchSet[A: TypeTag](set: A): JsValue = typeOf[A] match {
  case fooSet: Set[Foo] if typeOf[A] =:= typeOf[Foo] => Json.toJson(set.asInstanceOf[Set[Foo]])
  case barSet: Set[Bar] if typeOf[A] =:= typeOf[Bar] => Json.toJson(set.asInstanceOf[Set[Bar]])
}
The runtime error being:
Exception in thread "main" scala.MatchError: java.lang.Object (of class scala.reflect.internal.Types$ClassNoArgsTypeRef)
A workaround could be to check the instance of the first element in the Set, but this seems inefficient and ugly. I could also match on the key, e.g. fooSet or barSet, but if the keys have the same name, e.g. both called set, then this wouldn't work.
In 2.11, is there any way to get at the type/class the Set was created with?
You could use shapeless Typeable. Note that this is still not 100% safe (e.g. empty collections of different element types cannot be distinguished at runtime, because that information literally doesn't exist); under the hood it does things like checking the types of the elements using reflection, just with a nicer interface on top.
In general it's better to carry the type information around explicitly, e.g. by using a case class (or a shapeless HMap) rather than an untyped Map. Less good, but still better than nothing, is using wrapper case classes for the different types of Set that are possible, so that each one is a different type at runtime.
(Also, half the point of the pattern match is to avoid the asInstanceOf; you should use e.g. foo rather than a.asInstanceOf[Set[Foo]].)
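A minimal sketch of the Typeable approach, assuming shapeless is on the classpath and the Writes instances from the question are in scope (matchSet here is just an illustrative name, not part of any library):

import play.api.libs.json.{JsNull, JsValue, Json}
import shapeless.Typeable

// Typeable's collection instances inspect the elements at runtime, so they can
// tell Set[Foo] from Set[Bar]; an empty Set will match the first case, as noted above.
def matchSet(a: Any): JsValue =
  Typeable[Set[Foo]].cast(a).map(Json.toJson(_))
    .orElse(Typeable[Set[Bar]].cast(a).map(Json.toJson(_)))
    .orElse(Typeable[Set[String]].cast(a).map(Json.toJson(_)))
    .getOrElse(JsNull)

// Inside writes, the erased Set cases then collapse to:
// case set: Set[_] => s -> matchSet(set)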