I was under the impression that Validation could be used as a Monoid/Semigroup.
I tried the following code under Scala 2.9.2 and a Scalaz 7 snapshot:
import scalaz._
import Scalaz._
val success1 = 1.success
val success2 = 2.success
val failureA = "A".fail
val failureB = "B".fail
success1 |+| success2
<console>:16: error: diverging implicit expansion for type scalaz.Semigroup[scalaz.Validation[Nothing,Int]]
starting with method validationSemigroup in trait ValidationInstances
success1 |+| success2
^
<console>:16: error: value |+| is not a member of scalaz.Validation[Nothing,Int]
success1 |+| success2
I was expecting a Success(3)
then
failureA |+| failureB gives
res1: scalaz.Validation[java.lang.String,Nothing] = Failure(AB)
as expected
and
success1 |+| failureA fails as expected with
<console>:16: error: diverging implicit expansion for type scalaz.Semigroup[scalaz.Validation[Nothing,Int]]
starting with method validationSemigroup in trait ValidationInstances
success1 |+| failureA
^
<console>:16: error: value |+| is not a member of scalaz.Validation[Nothing,Int]
success1 |+| failureA
Why does |+| on Success not work? Is it a bug, or did I miss something here?
It works like this:
scala> import scalaz._, Scalaz._
import scalaz._
import Scalaz._
scala> val success1 = 1.success[String]
success1: scalaz.Validation[String,Int] = Success(1)
scala> val success2 = 2.success[String]
success2: scalaz.Validation[String,Int] = Success(2)
scala> val failureA = "A".fail[Int]
failureA: scalaz.Validation[java.lang.String,Int] = Failure(A)
scala> val failureB = "B".fail[Int]
failureB: scalaz.Validation[java.lang.String,Int] = Failure(B)
scala> success1 |+| success2
res0: scalaz.Validation[String,Int] = Success(1)
scala> failureA |+| failureB
res1: scalaz.Validation[java.lang.String,Int] = Failure(AB)
scala> success1 |+| failureA
res2: scalaz.Validation[String,Int] = Success(1)
You did not specify the type of the left element (the Failure side) in your first two vals, which is why Nothing was inferred. Validation only has a Semigroup instance when the left element type is itself a Semigroup (Nothing is not; String is); that is the validationSemigroup instance from ValidationInstances mentioned in your error.
UPD: Also, as you can see, the Semigroup instance keeps the first Success instead of combining the values with the Success type's own Semigroup instance. There is, however, a method append on Validation that requires both the left and right element types to be Semigroups and uses both instances:
scala> success1 append success2
res6: scalaz.Validation[String,Int] = Success(3)
I don't know why it is not used in Validation's Semigroup instance. I have created a pull request to change this behavior.
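In the meantime, a local instance defined in terms of append will give Success(3) for success1 |+| success2. This is a rough sketch against the scalaz 7 snapshot API (a local implicit takes lexical-scope precedence over the instance from ValidationInstances; it assumes the explicit type annotations shown above, so the failure type is String rather than Nothing):

// Semigroup for Validation that combines Successes as well as Failures,
// simply delegating to Validation#append (needs Semigroup[E] and Semigroup[A]).
implicit def appendingValidationSemigroup[E: Semigroup, A: Semigroup]: Semigroup[Validation[E, A]] =
  new Semigroup[Validation[E, A]] {
    def append(v1: Validation[E, A], v2: => Validation[E, A]): Validation[E, A] =
      v1 append v2
  }

// With this in scope, success1 |+| success2 yields Success(3), matching append above.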
Related
I am implementing Spark Structured Streaming, and for my use case I have to specify the starting offsets.
I have the offset values in the form of an Array[String]:
{"topic":"test","partition":0,"starting_offset":123}
{"topic":"test","partition":1,"starting_offset":456}
I would like to convert it to the below programmatically, so that I can pass it to Spark.
{"test":{"0":123,"1":456}}
Note: this is just a sample; I keep getting different offset ranges, so I cannot hardcode it.
If array is the variable containing the list you describe, then:
>>> [{d['topic']: [d['partition'], d['starting_offset']]} for d in array]
[{'test': [0, 123]}, {'test': [1, 456]}]
scala> import org.json4s._
scala> import org.json4s.jackson.JsonMethods._
scala> val topicAsRawStr: Array[String] = Array(
"""{"topic":"test","partition":0,"starting_offset":123}""",
"""{"topic":"test","partition":1,"starting_offset":456}""")
scala> val topicAsJSONs = topicAsRawStr.map(rawText => {
val json = parse(rawText)
val topicName = json \ "topic" // Extract topic value
val offsetForTopic = json \ "starting_offset" // Extract starting_offset
topicName -> offsetForTopic
})
scala> // Aggregate offsets for each topic
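For example, one way to finish that aggregation and produce exactly the shape from the question is sketched below (assumptions: json4s DefaultFormats for extract, and that the target is the {"topic":{"partition":offset}} JSON string expected by the Kafka source's startingOffsets option):

import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.Serialization.write

implicit val formats = DefaultFormats

// Build {"test":{"0":123,"1":456}} from the raw strings above.
val startingOffsets: String = write(
  topicAsRawStr
    .map(parse(_))
    .map(j => ((j \ "topic").extract[String],
               (j \ "partition").extract[Int],
               (j \ "starting_offset").extract[Long]))
    .groupBy { case (topic, _, _) => topic }
    .map { case (topic, rows) =>
      topic -> rows.map { case (_, partition, offset) => partition.toString -> offset }.toMap
    }
)
// startingOffsets == """{"test":{"0":123,"1":456}}"""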
You can also use the spark.sparkContext.parallelize API:
scala> case class KafkaTopic(topic: String, partition: Int, starting_offset: Int)
scala> val spark: SparkSession = ???
scala> implicit val formats = DefaultFormats // needed by json4s extract
scala> val topicAsRawStr: Array[String] = Array(
"""{"topic":"test","partition":0,"starting_offset":123}""",
"""{"topic":"test","partition":1,"starting_offset":456}""")
scala> val topicAsJSONs = topicAsRawStr.map(line => parse(line).extract[KafkaTopic])
scala> // Aggregate offsets for each topic (pair RDD: topic -> offsets)
scala> val aggregatedOffsetsByTopic = spark.sparkContext
         .parallelize(topicAsJSONs)
         .map(kT => kT.topic -> kT.starting_offset)
         .groupByKey()
         .mapValues(_.toSet)
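Whichever way you aggregate, what Spark ultimately needs is that JSON passed as the startingOffsets option of the Kafka source. A sketch (the bootstrap servers value is a placeholder; substitute the JSON string you built above):

val df = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "host:9092") // placeholder
  .option("subscribe", "test")
  .option("startingOffsets", """{"test":{"0":123,"1":456}}""")
  .load()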
I use json4s to parse JSON in Scala. When I parse tes, it throws an exception. The code is below:
implicit val formats = DefaultFormats
val pos = Array[(Int, Int)]((1,3),(2,4))
val tes = compact(render("pos" -> Extraction.decompose(pos)))
val dec = (parse(tes) \ "pos").extract[(Int, Int)]
and the exception is below:
Exception in thread "main" org.json4s.package$MappingException: No usable value for _1
Did not find value which can be converted into int
at org.json4s.reflect.package$.fail(package.scala:96)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$buildCtorArg(Extraction.scala:443)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$14.apply(Extraction.scala:463)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:451)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:491)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:488)
at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:500)
at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:488)
at org.json4s.Extraction$.extract(Extraction.scala:332)
at org.json4s.Extraction$.extract(Extraction.scala:42)
at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
You may try this:
import org.json4s._
import org.json4s.jackson.Serialization.write
import org.json4s.jackson.JsonMethods._
implicit val formats = DefaultFormats
val json = write(pos.toMap.map { case (k, v) => k.toString -> v })
// extract into a typed Map first, then convert the keys back to Int;
// a bare asInstanceOf[Map[Int, Int]] would leave String keys and BigInt values behind the cast
val arrayTuple2 = parse(json).extract[Map[String, Int]].map { case (k, v) => (k.toInt, v) }.toArray
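If you need the data back as an Array[(Int, Int)], another option is to avoid tuples in the JSON altogether, since case classes round-trip through json4s more predictably. A sketch (Pair is a helper class introduced here; it reuses the implicit formats above and the same JSON DSL imports the original snippet compiles with):

case class Pair(a: Int, b: Int)

// serialize through the case class instead of the raw tuples
val tes2 = compact(render("pos" -> Extraction.decompose(pos.map { case (x, y) => Pair(x, y) })))
// tes2 == {"pos":[{"a":1,"b":3},{"a":2,"b":4}]}

// ...and extract it back, rebuilding the tuples
val dec2: Array[(Int, Int)] =
  (parse(tes2) \ "pos").extract[List[Pair]].map(p => (p.a, p.b)).toArray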
We use Scala 2.11.8 and Play Framework 2.5.8.
The data to work with can be as simple as this:
object EnumA extends Enumeration {
type EnumA = Value
val ONE, TWO, THREE = Value
}
case class NoWork(data: Map[EnumA.Value, String] = Map.empty)
And what I want to achieve is to be able to serialize the NoWork class to JSON. I know that it requires providing an implicit formatter for the Enumeration.
I have found this solution: https://stackoverflow.com/a/15489179/1549135 and applied it.
The companion object providing those implicits looks as follows:
object NoWork {
implicit val enumAFormat = EnumUtils.enumFormat(EnumA)
implicit val jsonModelFormat = Json.format[NoWork]
}
And it always fails with an error:
error: No implicit format for Map[EnumA.Value,String] available.
implicit val jsonModelFormat = Json.format[NoWork]
^
What is the issue?
I have tested that changing the data type to Map[String, String] allows serialization. The Enum on its own is serializable too, so how do I fix the Map with an Enum key type?
Thanks!
Edit
While Pamu's answer
implicit val writes = new Writes[Map[EnumA.Value, String]] {
override def writes(o: Map[EnumA.Value, String]): JsValue = Json.toJson(o.map { case (a, b) => Json.parse(s"""{${Json.toJson(a)}:${Json.toJson(b)}}""")}.toList)
}
would clearly work for this situation, I actually need a generic solution for any Map[Enum, T] that I can use across the whole application.
Note that it is mandatory for JSON keys to be strings.
The following code works:
Json.toJson(Map("mon" -> EnumA.MON))
The following code does not work, because keys in valid JSON must always be strings. Here the key is EnumA.Value, which is not a String:
scala> Json.toJson(Map(EnumA.MON -> "mon"))
<console>:19: error: No Json serializer found for type scala.collection.immutable.Map[EnumA.Value,String]. Try to implement an implicit Writes or Format for this type.
Json.toJson(Map(EnumA.MON -> "mon"))
But if you want it to work as expected, provide a Writes:
implicit val writes = new Writes[Map[EnumA.Value, String]] {
override def writes(o: Map[EnumA.Value, String]): JsValue = Json.toJson(o.map { case (a, b) => Json.parse(s"""{${Json.toJson(a)}:${Json.toJson(b)}}""")}.toList)
}
Now the following code works:
Json.toJson(Map(EnumA.MON -> "hello"))
You can declare a Format for EnumA as follows:
object EnumA extends Enumeration {
val MON = Value("monday")
val TUE = Value("Tuesday")
implicit val format = new Format[EnumA.Value] {
override def writes(o: EnumA.Value): JsValue = Json.toJson(o.toString)
override def reads(json: JsValue): JsResult[EnumA.Value] = json.validate[String].map(EnumA.withName(_))
}
}
Scala REPL output
scala> object EnumA extends Enumeration {
| val MON = Value("monday")
| val TUE = Value("Tuesday")
|
| implicit val format = new Format[EnumA.Value] {
| override def writes(o: EnumA.Value): JsValue = Json.toJson(o.toString)
| override def reads(json: JsValue): JsResult[EnumA.Value] = json.validate[String].map(EnumA.withName(_))
| }
| }
defined object EnumA
scala> Json.toJson(EnumA.MON)
res0: play.api.libs.json.JsValue = "monday"
scala> (Json.parse("""{"a": "monday"}""") \ "a").validate[EnumA.Value]
res7: play.api.libs.json.JsResult[EnumA.Value] = JsSuccess(monday,)
scala> (Json.parse("""{"a": "monday"}""") \ "a").validate[EnumA.Value].get
res10: EnumA.Value = monday
scala> Json.toJson(Map("mon" -> EnumA.MON))
res2: play.api.libs.json.JsValue = {"mon":"monday"}
scala> Json.toJson(Map(EnumA.MON -> "mon"))
<console>:19: error: No Json serializer found for type scala.collection.immutable.Map[EnumA.Value,String]. Try to implement an implicit Writes or Format for this type.
Json.toJson(Map(EnumA.MON -> "mon"))
scala> implicit val writes = new Writes[Map[EnumA.Value, String]] {
| override def writes(o: Map[EnumA.Value, String]): JsValue = Json.toJson(o.map { case (a, b) => Json.parse(s"""{${Json.toJson(a)}:${Json.toJson(b)}}""")}.toList)
| }
writes: play.api.libs.json.Writes[Map[EnumA.Value,String]] = $anon$1#65aebb67
scala> Json.toJson(Map(EnumA.MON -> "hello"))
res2: play.api.libs.json.JsValue = [{"monday":"hello"}]
Together with a colleague we have prepared a generic class that provides JSON serialization for the Map[E <: Enum[E], T] type.
The enum type is always converted to a String, as that is required for a JsObject key. The other type parameter is generic and is converted using the implicit format: Format[T].
import play.api.data.validation.ValidationError
import play.api.libs.json._
import scala.util.{Failure, Success, Try}
class MapEnumFormat[E <: Enum[E], T](valueOf: (String => E))(implicit format: Format[T]) extends Format[Map[E, T]] {
override def writes(o: Map[E, T]): JsValue = {
JsObject(o.map { case (a, b) => (a.name, Json.toJson(b)) })
}
override def reads(json: JsValue): JsResult[Map[E, T]] = {
val result = Try(json.as[Map[String, T]].map {
case (key, value) =>
valueOf(key) -> value
})
result match {
case Success(status) =>
JsSuccess(status)
case Failure(th) =>
JsError(ValidationError(s"Error while serializing $json: $th"))
}
}
}
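For completeness, wiring it up would look roughly like this (Color is a hypothetical Java enum, e.g. enum Color { RED, GREEN }, since MapEnumFormat is constrained to E <: Enum[E]):

// register a Format for Map[Color, String] by passing the enum's valueOf function
implicit val colorMapFormat: Format[Map[Color, String]] =
  new MapEnumFormat[Color, String](Color.valueOf)

Json.toJson(Map(Color.RED -> "warm"))                       // {"RED":"warm"}
Json.parse("""{"GREEN":"cool"}""").as[Map[Color, String]]   // Map(GREEN -> cool)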
I'm using my own implicit implementation of a JSON serializer and deserializer for my case object.
My trait and case object look like this (it's just a code snippet):
sealed trait MyTrait
case object MyCaseClass extends MyTrait
I want to write my own JSON serializer and deserializer for MyTrait:
implicit val myTraitFormat = new JsonFormat[MyTrait] {
override def read(json: JsValue): MyTrait = json.asJsObject.getFields("value") match {
case Seq(JsString("TEST")) ⇒ MyCaseClass
case _ ⇒ throw new DeserializationException(s"$json is not a valid extension of my trait")
}
override def write(myTrait: MyTrait): JsValue = {
myTrait match {
case MyCaseClass => JsObject("value" -> JsString("TEST"))
}
}
}
Now my test fails, throwing a DeserializationException:
"The my JSON format" when {
"deserializing a JSON" must {
"return correct object" in {
val json = """{"value": "TEST"}""".asJson
json.convertTo[MyTrait] must equal (MyCaseClass)
}
}
}
Obviously json.asJsObject.getFields("value") cannot be matched to Seq(JsString("TEST")). Maybe this is related to using traits?
But I have found this example on the official spray-json site: https://github.com/spray/spray-json#providing-jsonformats-for-other-types
Any ideas how to properly match on field in JsObject?
Thanks!
Best
Write the variable name instead of a string:
override def read(json: JsValue): MyTrait = json.asJsObject.getFields("value") match {
case Seq(JsString(value)) ⇒ MyCaseClass
case _ ⇒ throw new DeserializationException(s"$json is not a valid case class")
}
Does that work for you?
Update:
Yes, your test is wrong. I tried it with the following (note parseJson instead of asJson) and it worked:
scala> val json = """{"value": "TEST"}""".parseJson
json: spray.json.JsValue = {"value":"TEST"}
scala> json.convertTo[MyCaseClass]
res2: MyCaseClass = MyCaseClass()
Update 2: I tried it with the trait:
scala> import spray.json._
import spray.json._
scala> import spray.json.DefaultJsonProtocol._
import spray.json.DefaultJsonProtocol._
[Your code]
scala> val json = """{"value": "TEST"}""".parseJson
json: spray.json.JsValue = {"value":"TEST"}
scala> json.convertTo[MyTrait]
res1: MyTrait = MyCaseClass
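So, applied back to your test, the only change needed should be parseJson in place of asJson. A sketch of the corrected test body:

"return correct object" in {
  val json = """{"value": "TEST"}""".parseJson   // parseJson comes from import spray.json._
  json.convertTo[MyTrait] must equal (MyCaseClass)
}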
I'm trying to deserialize a JsArray into a List[T] in a Play Framework application using Scala. After some research I found this method, which is supposed to do the needed work:
/**
* Deserializer for List[T] types.
*/
implicit def listReads[T](implicit fmt: Reads[T]): Reads[List[T]] = new Reads[List[T]] {
def reads(json: JsValue) = json match {
case JsArray(ts) => ts.map(t => fromJson(t)(fmt)).toList
case _ => throw new RuntimeException("List expected")
}
}
The problem is that I don't know how to use it. Any help is welcome.
Here's a quick example:
scala> import play.api.libs.json._
import play.api.libs.json._
scala> Json.toJson(List(1, 2, 3)).as[List[Int]]
res0: List[Int] = List(1, 2, 3)
And if you have a custom type with a Format instance:
case class Foo(i: Int, x: String)
implicit object fooFormat extends Format[Foo] {
def reads(json: JsValue) = Foo(
(json \ "i").as[Int],
(json \ "x").as[String]
)
def writes(foo: Foo) = JsObject(Seq(
"i" -> JsNumber(foo.i),
"x" -> JsString(foo.x)
))
}
It still works:
scala> val foos = Foo(1, "a") :: Foo(2, "bb") :: Nil
foos: List[Foo] = List(Foo(1,a), Foo(2,bb))
scala> val json = Json.toJson(foos)
json: play.api.libs.json.JsValue = [{"i":1,"x":"a"},{"i":2,"x":"bb"}]
scala> json.as[List[Foo]]
res1: List[Foo] = List(Foo(1,a), Foo(2,bb))
This approach would also work if your custom type had a xs: List[String] member, for example: you'd just use (json \ "xs").as[List[String]] in your reads method and Json.toJson(foo.xs) in your writes.
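To make that last point concrete, here is a small sketch (Bar is a made-up type, written in the same style as fooFormat above) with an xs: List[String] member:

case class Bar(xs: List[String])

implicit object barFormat extends Format[Bar] {
  def reads(json: JsValue) = Bar((json \ "xs").as[List[String]])
  def writes(bar: Bar) = JsObject(Seq("xs" -> Json.toJson(bar.xs)))
}

// Json.toJson(Bar(List("a", "b")))            gives {"xs":["a","b"]}
// Json.parse("""{"xs":["a","b"]}""").as[Bar]  gives Bar(List(a, b))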