With Jerkson, I was able to parse a String containing a JSON array, like this:
com.codahale.jerkson.Json.parse[Array[Credentials]](contents)
where contents was a String containing the following:
[{"awsAccountName":"mslinn","accessKey":"blahblah","secretKey":"blahblah"}]
... and I would get the array of Credentials.
(Brief diversion) I tried to do something similar with the new JSON parser in Play 2.1 and Scala, using different data. For a simple parse, the following works fine; a case class (S3File) defines the unapply method necessary for this to work:
case class S3File(accountName: String,
bucketName: String,
endpoint: String = ".s3.amazonaws.com")
implicit val s3FileFormat = Json.format[S3File]
val jsValue = Json.parse(stringContainingJson)
Json.fromJson[S3File](jsValue).get
Let's reconsider the original string called contents containing JSON. As with all collections, an array of objects has no unapply method, so the technique I showed in the diversion above won't work. I tried to create a throwaway case class for this purpose:
case class ArrayCreds(payload: Array[Credentials])
implicit val credsFormat = Json.format[ArrayCreds]
val jsValue = Json.parse(contents)
val credArray = Json.fromJson(jsValue).get.payload
... unfortunately, this fails:
No unapply function found
[error] implicit val credsFormat = Json.format[ArrayCreds]
[error] ^
[error]
/blah.scala:177: diverging implicit expansion for type play.api.libs.json.Reads[T]
[error] starting with method ArrayReads in trait DefaultReads
[error] val credArray = Json.fromJson(jsValue).get
[error] ^
Is there a simple way of parsing a JSON array using Play 2.1's new JSON parser? I expect the throwaway case class is the wrong approach, and that the implicit will instead need to be:
implicit val credsFormat = Json.format[Credentials]
But I don't understand how to write the rest of the deserialization in a simple manner. All of the code examples I have seen are rather verbose, which seems contrary to the spirit of Scala. The ideal incantation would be as simple as Jerkson's incantation.
Thanks,
Mike
I think this is what you're looking for:
scala> import play.api.libs.json._
import play.api.libs.json._
scala> case class Credentials(awsAccountName: String, accessKey: String, secretKey: String)
defined class Credentials
scala> implicit val credentialsFmt = Json.format[Credentials]
credentialsFmt: play.api.libs.json.OFormat[Credentials] = play.api.libs.json.OFormat$$anon$1@1da9be95
scala> val js = """[{"awsAccountName":"mslinn","accessKey":"blahblah","secretKey":"blahblah"}]"""
js: String = [{"awsAccountName":"mslinn","accessKey":"blahblah","secretKey":"blahblah"}]
scala> Json.fromJson[Seq[Credentials]](Json.parse(js))
res3: play.api.libs.json.JsResult[Seq[Credentials]] = JsSuccess(List(Credentials(mslinn,blahblah,blahblah)),)
HTH,
Julien
Related
I have something like:
sealed trait Foo
case class Bar(field: ...) extends Foo
case class Baz(otherField: ...) extends Foo
trait JsonFormat {
implicit val barWrites = Json.writes[Bar]
implicit val barReads = Json.reads[Bar]
implicit val bazWrites = Json.writes[Baz]
implicit val bazReads = Json.reads[Baz]
implicit val fooWrites = Json.writes[Foo]
implicit val fooReads = Json.reads[Foo]
// other vals that depend on Foo
}
When I compile, I get an error like:
[error] /file/path/JsonFormat.scala:68:41: unreachable code
[error] implicit val fooWrites = Json.writes[Foo]
[error] ^
[error] one error found
I'm pretty new to Scala, and I understand an "unreachable code" error in the context of pattern matching, but I can't figure this one out.
I'm using Play 2.8.
This may not be the exact solution, but here is some advice on changing your approach.
First, this error can happen if you add your own apply() method to the class's companion object; check for that.
Second, the best practice seems to be to put an implicit converter in the companion object of each class, and I'd use Json.format rather than separate reads and writes so that both conversion directions are supported:
case class SomeClass(someString: String)
object SomeClass {
implicit val jsonFormatter: OFormat[SomeClass] = Json.format[SomeClass]
}
If you approach your JSON implicit converters this way, any code that uses your DTOs automatically picks up the converter, in both directions, wherever the class appears.
Another place to assemble "shared" implicits is a package object. I keep, for example, an ISO DateTime string <-> Date converter there.
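Applied to the Foo/Bar/Baz hierarchy from the question, that advice looks roughly like the sketch below. This is a sketch, not a drop-in fix: the field types are placeholders (the question elides them), and it assumes play-json 2.8, whose macro can derive a format for a sealed trait once the subtype formats are in scope.

```scala
import play.api.libs.json._

sealed trait Foo
case class Bar(field: String) extends Foo    // placeholder field type
case class Baz(otherField: Int) extends Foo  // placeholder field type

object Bar { implicit val format: OFormat[Bar] = Json.format[Bar] }
object Baz { implicit val format: OFormat[Baz] = Json.format[Baz] }

object Foo {
  // With the subtype formats in scope, the play-json 2.8 macro can derive
  // a format for the sealed trait itself (it adds a "_type" discriminator).
  implicit val format: OFormat[Foo] = Json.format[Foo]
}
```

Each format lives in its class's companion object, so code that handles these types picks them up without any imports.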
I am using Json4s classes inside a Spark 2.2.0 closure. The "workaround" for a failure to serialize DefaultFormats is to include their definition inside every closure executed by Spark that needs them. I believe I have done more than I needed to below, but I still get the serialization failure.
Using Spark 2.2.0, Scala 2.11, Json4s 3.2.x (whatever is in Spark) and also tried using Json4s 3.5.3 by pulling it into my job using sbt. In all cases I used the workaround shown below.
Does anyone know what I'm doing wrong?
logger.info(s"Creating an RDD for $actionName")
implicit val formats = DefaultFormats
val itemProps = df.rdd.map[(ItemID, ItemProps)](row => { // <-- error points to this line
implicit val formats = DefaultFormats
val itemId = row.getString(0)
val correlators = row.getSeq[String](1).toList
(itemId, Map(actionName -> JArray(correlators.map { t =>
implicit val formats = DefaultFormats
JsonAST.JString(t)
})))
})
I have also tried another suggestion, which is to set the DefaultFormats implicit in the class constructor area and not in the closure, no luck anywhere.
The JVM error trace is from Spark complaining that the task is not serializable, pointing to the line above (the last line of my code), and then the root cause is explained with:
Serialization stack:
- object not serializable (class: org.json4s.DefaultFormats$, value: org.json4s.DefaultFormats$@7fdd29f3)
- field (class: com.actionml.URAlgorithm, name: formats, type: class org.json4s.DefaultFormats$)
- object (class com.actionml.URAlgorithm, com.actionml.URAlgorithm@2dbfa972)
- field (class: com.actionml.URAlgorithm$$anonfun$udfLLR$1, name: $outer, type: class com.actionml.URAlgorithm)
- object (class com.actionml.URAlgorithm$$anonfun$udfLLR$1, <function3>)
- field (class: org.apache.spark.sql.catalyst.expressions.ScalaUDF$$anonfun$4, name: func$4, type: interface scala.Function3)
- object (class org.apache.spark.sql.catalyst.expressions.ScalaUDF$$anonfun$4, <function1>)
- field (class: org.apache.spark.sql.catalyst.expressions.ScalaUDF, name: f, type: interface scala.Function1)
- object (class org.apache.spark.sql.catalyst.expressions.ScalaUDF, UDF(input[2, bigint, false], input[3, bigint, false], input[5, bigint, false]))
- element of array (index: 1)
- array (class [Ljava.lang.Object;, size 3)
- field (class: org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10, name: references$1, type: class [Ljava.lang.Object;)
- object (class org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10, <function2>)
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
... 128 more
I have another example, which you can try in spark-shell. I hope it helps.
import org.json4s._
import org.json4s.jackson.JsonMethods._
def getValue(x: String): (Int, String) = {
implicit val formats: DefaultFormats.type = DefaultFormats
val obj = parse(x).asInstanceOf[JObject]
val id = (obj \ "id").extract[Int]
val name = (obj \ "name").extract[String]
(id, name)
}
val rdd = sc.parallelize(Array("{\"id\":0, \"name\":\"g\"}", "{\"id\":1, \"name\":\"u\"}", "{\"id\":2, \"name\":\"c\"}", "{\"id\":3, \"name\":\"h\"}", "{\"id\":4, \"name\":\"a\"}", "{\"id\":5, \"name\":\"0\"}"))
rdd.map(x => getValue(x)).collect
Interesting. One typical problem is running into serialization issues with the implicit val formats, but since you define them inside your closure, this should be OK.
I know that this is a bit hacky, but you could try the following:
using @transient implicit val
maybe do a minimal test whether JsonAST.JString(t) is serializable
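To see why @transient helps, here is a minimal, library-free sketch of the pattern. The class names are hypothetical; NotSerializableConfig stands in for json4s's DefaultFormats, which does not implement Serializable. Marking the field @transient keeps it out of the serialized closure, and making it lazy re-creates it on first access after deserialization, which is what happens on a Spark executor.

```scala
import java.io._

// Hypothetical stand-in for a non-serializable dependency like DefaultFormats.
class NotSerializableConfig { val name = "formats" }

class Task extends Serializable {
  // @transient excludes the field from Java serialization; lazy val
  // re-initializes it on first access after deserialization.
  @transient implicit lazy val config: NotSerializableConfig = new NotSerializableConfig
  def run(): String = config.name
}

// Round-trip through Java serialization, as Spark does when shipping closures.
def roundTrip[T <: Serializable](t: T): T = {
  val bos = new ByteArrayOutputStream()
  new ObjectOutputStream(bos).writeObject(t)
  new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
    .readObject().asInstanceOf[T]
}

println(roundTrip(new Task).run())
```

Without @transient, serializing Task would fail exactly as in the stack trace above, because the field drags the non-serializable object into the closure.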
I have some scala code that requires the use of implicits for serializing and deserializing json.
We previously had something that worked by putting these implicit statements (simplified with dummies):
(in some class SomeClass1)
implicit val some1format = Json.format[SomeItem1]
implicit val some2format = Json.format[SomeItem2]
...
All as class-level variables. Any method within the class was then able to convert from Json just fine.
However, we are trying to move the implicit definitions of these formats to a separate object.
So we created an object (for example: SomeFormatters), which only contains these implicits:
object SomeFormatters {
implicit val some1format = Json.format[SomeItem1]
implicit val some2format = Json.format[SomeItem2]
}
When I try to import this object into SomeClass1, I get a compilation error saying that no deserializer was found for SomeItem1 or SomeItem2, even though I am importing SomeFormatters. (The IDE says the import of SomeFormatters is unused though, so I already knew something was off.)
What's the proper way to get SomeClass1 to know about the implicit definitions in SomeFormatters?
The issue was that the implicit values had no type annotations.
Instead of:
implicit val some1format = Json.format[SomeItem1]
I needed to put:
implicit val some1format: Format[SomeItem1] = Json.format[SomeItem1]
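Putting it together, the shared object would look like this minimal sketch (the item fields are invented for illustration, since the question uses dummies); the explicit Format result types are what make the implicits findable from other files:

```scala
import play.api.libs.json._

// Hypothetical fields; the originals are elided in the question.
case class SomeItem1(a: String)
case class SomeItem2(b: Int)

object SomeFormatters {
  // Explicit result types matter: without them, the macro-generated
  // implicits may not be found at the import site.
  implicit val some1Format: Format[SomeItem1] = Json.format[SomeItem1]
  implicit val some2Format: Format[SomeItem2] = Json.format[SomeItem2]
}
```

Then `import SomeFormatters._` in SomeClass1 brings both formats into implicit scope for Json.toJson and Json.parse(...).as[...].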
I'm looking for suggestions or libraries that can help me convert JSON (with nested structure) from one format to another in Scala.
I saw there are a few JavaScript- and Java-based solutions. Anything in Scala?
I really like the Play JSON library. Its API is very clean and it's very fast, even if some parts have a slightly steeper learning curve. You can also use the Play JSON library even if you aren't using the rest of Play.
https://playframework.com/documentation/2.3.x/ScalaJson
To convert JSON to Scala objects (and vice versa), Play uses implicits. There is a Reads type, which specifies how to convert JSON to a Scala type, and a Writes type, which specifies how to convert a Scala object to JSON.
For example:
case class Foo(a: Int, b: String)
There are a few different routes you can take to convert Foo to JSON. If your object is simple (like Foo), Play JSON can create a conversion function for you:
implicit val fooReads = Json.reads[Foo]
or you can create a custom conversion function if you want more control or if your type is more complex. The example below uses the names id and name for the properties a and b in Foo:
implicit val fooReads = (
(__ \ "id").read[Int] ~
(__ \ "name").read[String]
)(Foo)
The Writes type has similar capabilities:
implicit val fooWrites = Json.writes[Foo]
or
implicit val fooWrites = (
(JsPath \ "id").write[Int] and
(JsPath \ "name").write[String]
)(unlift(Foo.unapply))
You can read more about Reads/Writes (and all the imports you will need) here: https://playframework.com/documentation/2.3.x/ScalaJsonCombinators
You can also transform your JSON without mapping JSON to/from scala types. This is fast and often requires less boilerplate. A simple example:
import play.api.libs.json._
// Only take a single branch from the input JSON.
// This transformer takes the entire JSON subtree pointed to by
// key foo (no matter what it is)
val pickFoo = (__ \ 'foo).json.pickBranch
// Parse JSON from a string and apply the transformer
val input = """{"foo": {"id": 10, "name": "x"}, "foobar": 100}"""
val baz: JsValue = Json.parse(input)
val foo: JsResult[JsObject] = baz.transform(pickFoo)
You can read more about transforming JSON directly here: https://playframework.com/documentation/2.3.x/ScalaJsonTransformers
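Since transform returns a JsResult rather than a bare JsValue, a small sketch of unpacking the picked branch (reusing the input above) might look like:

```scala
import play.api.libs.json._

val input = """{"foo": {"id": 10, "name": "x"}, "foobar": 100}"""
val pickFoo = (__ \ "foo").json.pickBranch

// Pattern match on the JsResult to get at the transformed value.
Json.parse(input).transform(pickFoo) match {
  case JsSuccess(picked, _) => println(picked)  // the {"foo": ...} branch only
  case JsError(errors)      => println(errors)
}
```

pickBranch keeps the branch structure, so the success value still has the foo key wrapping its subtree.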
You can use Json4s Jackson. With Play JSON, you have to write implicit conversions for all the case classes. If the number of case classes is small and they won't change frequently during development, Play JSON seems fine. But if there are many case classes, I recommend json4s.
You need to add implicit conversions for custom types so that json4s understands them when converting to JSON.
You can add the below dependency to your project to get json4s-jackson
"org.json4s" %% "json4s-jackson" % "3.2.11"
A sample code is given below (with both serialization and deserialization):
import java.util.Date
import java.text.SimpleDateFormat
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.Serialization
case class Parent(id:Long, name:String, children:List[Child])
case class Child(id:Long, name:String, simple: Simple)
case class Simple(id:Long, name:String, date:Date)
object MainClass extends App {
implicit val formats = (new DefaultFormats {
override def dateFormatter = new SimpleDateFormat("yyyy-MM-dd")
}.preservingEmptyValues)
val d = new Date()
val simple = Simple(1L, "Simple", d)
val child1 = Child(1L, "Child1", simple)
val child2 = Child(2L, "Child2", simple)
val parent = Parent(1L, "Parent", List(child1, child2))
//Conversion from Case Class to Json
val json = Serialization.write(parent)
println(json)
//Conversion from Json to Case Class
val parentFromJson = parse(json).extract[Parent]
println(parentFromJson)
}
I need to parse the following json string:
{"type": 1}
The case class I am using looks like:
case class MyJsonObj(
val type: Int
)
However, this confuses Scala since 'type' is a keyword. So, I tried using the @JsonProperty annotation from Jackson/Jerkson as follows:
case class MyJsonObj(
@JsonProperty("type") val myType: Int
)
However, the Json parser still refuses to look for 'type' string in json instead of 'myType'. Following sample code illustrates the problem:
import com.codahale.jerkson.Json._
import org.codehaus.jackson.annotate._
case class MyJsonObj(
@JsonProperty("type") val myType: Int
)
object SimpleExample {
def main(args: Array[String]) {
val jsonLine = """{"type":1}"""
val JsonObj = parse[MyJsonObj](jsonLine)
  }
}
I get the following error:
[error] (run-main-a) com.codahale.jerkson.ParsingException: Invalid JSON. Needed [myType], but found [type].
P.S: As seen above, I am using jerkson/jackson, but wouldn't mind switching to some other json parsing library if that makes life easier.
Use backquotes to prevent the Scala compiler from interpreting type as the keyword:
case class MyJsonObj(
val `type`: Int
)
I suspect you aren't enabling Scala support in Jackson properly.
I've tried this:
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object Test extends App {
  val mapper = new ObjectMapper
  mapper.registerModule(DefaultScalaModule)
  println(mapper.writeValueAsString(MyJsonObj(1)))
  val obj = mapper.readValue("""{"type":1}""", classOf[MyJsonObj])
  println(obj.myType)
}
case class MyJsonObj(@JsonProperty("type") myType: Int)
And I get:
{"type":1}
1
Note that I've added Scala support to the object mapper by calling registerModule.
As @wingedsubmariner implied, the answer lies with Scala meta annotations.
This worked for me:
import scala.annotation.meta.field
case class MyJsonObj(
@(JsonProperty @field)("type") val myType: Int
)
This is in addition to mapper.registerModule(DefaultScalaModule), which you'll probably need if you're deserializing into a Scala class.
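Putting the pieces together, a complete sketch might look like the following. This mirrors the answer above rather than being a verified recipe: it assumes the jackson-module-scala dependency is on the classpath, and that the @field meta-annotation is enough for your Jackson version to see the renamed property.

```scala
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import scala.annotation.meta.field

// @field pushes the annotation onto the underlying field, so Jackson
// sees "type" instead of the Scala-side name myType.
case class MyJsonObj(@(JsonProperty @field)("type") val myType: Int)

val mapper = new ObjectMapper
mapper.registerModule(DefaultScalaModule) // enable Scala case-class support
val obj = mapper.readValue("""{"type":1}""", classOf[MyJsonObj])
println(obj.myType)
```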