I need to build a JSON string, something like this:
[
{ "id": 1, "name": "John" },
{ "id": 2, "name": "Dani" }
]
val jArray = JsArray();
jArray += (("id", "1"), ("name", "John"))
jArray += (("id", "2"), ("name", "Dani"))
println(jArray.dump)
I need to be able to add rows to the jArray, something like jArray += ...
What is the closest library/solution to this?
Unfortunately writing a JSON library is the Scala community's version of coding a todo list app.
There are quite a variety of alternatives. I list them in no particular order, with notes:
scala.util.parsing.json.JSON - Warning: this library is available only up to Scala version 2.9.x (removed in newer versions)
spray-json - Extracted from the Spray project
Jerkson ± - Warning a nice library (built on top of Java Jackson) but now abandonware. If you are going to use this, probably follow the Scalding project's example and use the backchat.io fork
sjson - By Debasish Ghosh
lift-json - Can be used separately from the Lift project
json4s 💣 § ± - An extraction from lift-json, which is attempting to create a standard JSON AST which other JSON libraries can use. Includes a Jackson-backed implementation
Argonaut 💣 § - An FP-oriented JSON library for Scala, from the people behind Scalaz
play-json ± - Now available standalone, see this answer for details
dijon - A handy, safe and efficient JSON library, uses jsoniter-scala under the hood
sonofjson - JSON library aiming for a super-simple API
Jawn - JSON library by Erik Osheim aiming for Jackson-or-faster speed
Rapture JSON ± - a JSON front-end which can use spray-json, sjson, lift-json, json4s, Argonaut, Jawn or Jackson as back-ends
circe 💣 - fork of Argonaut built on top of cats instead of scalaz
jsoniter-scala - Scala macros for compile-time generation of ultra-fast JSON codecs
jackson-module-scala - Add-on module for Jackson to support Scala-specific datatypes
borer - Efficient CBOR and JSON (de)serialization in Scala
💣 = has not fixed security vulnerabilities, § = has Scalaz integration, ± = supports interop with Jackson JsonNode
In Snowplow we use json4s with the Jackson back-end; we've had good experiences with Argonaut too.
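For the array from the question, a minimal json4s sketch (assuming the json4s-jackson artifact is on the classpath; the third row is invented for illustration) could look like this. The DSL's ~ operator merges fields into a JObject, and a List of JObjects renders as a JSON array:

import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods._

// build the rows from the question; ~ combines (key -> value) pairs into a JObject
val rows = List(
  ("id" -> 1) ~ ("name" -> "John"),
  ("id" -> 2) ~ ("name" -> "Dani")
)

// the AST is immutable, so "adding a row" means building a new list
val withNewRow = rows :+ (("id" -> 3) ~ ("name" -> "Ana"))

println(compact(render(withNewRow)))
// [{"id":1,"name":"John"},{"id":2,"name":"Dani"},{"id":3,"name":"Ana"}]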
Lift-json is at version 2.6 and it works really well (and is also very well supported; the maintainer is always ready to fix any bugs users may find).
You can find examples using it in the GitHub repository.
The maintainer (Joni Freeman) is always reachable on the Lift mailing list. There are also other users on the mailing list who are very helpful as well.
As @Alexey points out, if you want to use the library with another Scala version, say 2.11.x, change scalaVersion and use %% as follows:
scalaVersion := "2.11.5"
"net.liftweb" %% "lift-json" % "2.6"
You can check the liftweb.net site to find out the latest version as time goes by.
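The lift-json DSL is close to what the question asks for. A minimal sketch for the same array, untested and written in the style of the lift-json README:

import net.liftweb.json._
import net.liftweb.json.JsonDSL._

// the JSON AST is immutable, so "adding a row" means building a new list
val rows = List(
  ("id" -> 1) ~ ("name" -> "John"),
  ("id" -> 2) ~ ("name" -> "Dani")
)

println(compact(render(rows)))
// [{"id":1,"name":"John"},{"id":2,"name":"Dani"}]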
I suggest using Jerkson; it supports most basic type conversions:
scala> import com.codahale.jerkson.Json._
scala> val l = List(
Map( "id" -> 1, "name" -> "John" ),
Map( "id" -> 2, "name" -> "Dani")
)
scala> generate( l )
res1: String = [{"id":1,"name":"John"},{"id":2,"name":"Dani"}]
The Jackson entry in the list above (jackson-module-scala) is Jackson itself, not Jerkson; it has support for Scala objects (case classes etc.).
Below is an example of how I use it.
object MyJacksonMapper extends JacksonMapper
val jsonString = MyJacksonMapper.serializeJson(myObject)
val myNewObject = MyJacksonMapper.deserializeJson[MyCaseClass](jsonString)
This makes it very simple. In addition, the XmlSerializer and the support for JAXB annotations are very handy.
This blog post describes its use with JAXB annotations and the Play Framework.
http://krasserm.blogspot.co.uk/2012/02/using-jaxb-for-xml-and-json-apis-in.html
Here is my current JacksonMapper.
import java.lang.reflect.{ParameterizedType, Type}
import com.fasterxml.jackson.core.`type`.TypeReference
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.dataformat.xml.XmlMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

trait JacksonMapper {

  def jsonSerializer = {
    val m = new ObjectMapper()
    m.registerModule(DefaultScalaModule)
    m
  }

  def xmlSerializer = {
    val m = new XmlMapper()
    m.registerModule(DefaultScalaModule)
    m
  }

  def deserializeJson[T: Manifest](value: String): T = jsonSerializer.readValue(value, typeReference[T])

  def serializeJson(value: Any) = jsonSerializer.writerWithDefaultPrettyPrinter().writeValueAsString(value)

  def deserializeXml[T: Manifest](value: String): T = xmlSerializer.readValue(value, typeReference[T])

  def serializeXml(value: Any) = xmlSerializer.writeValueAsString(value)

  // Build a Jackson TypeReference from a Scala Manifest so generic types survive erasure
  private[this] def typeReference[T: Manifest] = new TypeReference[T] {
    override def getType = typeFromManifest(manifest[T])
  }

  private[this] def typeFromManifest(m: Manifest[_]): Type = {
    if (m.typeArguments.isEmpty) { m.erasure }
    else new ParameterizedType {
      def getRawType = m.erasure
      def getActualTypeArguments = m.typeArguments.map(typeFromManifest).toArray
      def getOwnerType = null
    }
  }
}
Maybe I'm a bit late, but you really should try the JSON library from the Play Framework.
You could look at the documentation. In the current 2.1.1 release you cannot use it separately from the whole of Play 2, so the dependency will look like this:
val typesaferepo = "TypeSafe Repo" at "http://repo.typesafe.com/typesafe/releases"
val play2 = "play" %% "play" % "2.1.1"
It will bring in the whole Play framework with everything on board.
But as far as I know, the folks at Typesafe plan to separate it out in the 2.2 release. So, there is a standalone play-json from 2.2-SNAPSHOT.
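Once the dependency is in place, a rough sketch of building the array from the question with Play's JSON AST (the third row is invented for illustration):

import play.api.libs.json._

val jArray: JsArray = Json.arr(
  Json.obj("id" -> 1, "name" -> "John"),
  Json.obj("id" -> 2, "name" -> "Dani")
)

// JsArray is immutable, so appending a row produces a new array
val extended = JsArray(jArray.value :+ Json.obj("id" -> 3, "name" -> "Ana"))

println(Json.stringify(extended))
// [{"id":1,"name":"John"},{"id":2,"name":"Dani"},{"id":3,"name":"Ana"}]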
You should check Genson.
It just works and is much easier to use than most of the existing alternatives in Scala. It is fast, has many features and integrations with some other libs (jodatime, json4s DOM api...).
All that without any fancy unnecessary code like implicits, custom readers/writers for basic cases, or an illegible API due to operator overloading...
Using it is as easy as:
import com.owlike.genson.defaultGenson_
val json = toJson(Person(Some("foo"), 99))
val person = fromJson[Person]("""{"name": "foo", "age": 99}""")
case class Person(name: Option[String], age: Int)
Disclaimer: I am Genson's author, but that doesn't mean I am not objective :)
Here is a basic implementation of writing and then reading a JSON file using json4s.
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.json4s.JsonDSL._
import java.io._
import scala.io.Source
object MyObject {
  def main(args: Array[String]) {
    val myMap = Map("a" -> List(3, 4), "b" -> List(7, 8))

    // writing a file
    val jsonString = pretty(render(myMap))
    val pw = new PrintWriter(new File("my_json.json"))
    pw.write(jsonString)
    pw.close()

    // reading a file
    val myString = Source.fromFile("my_json.json").mkString
    println(myString)
    val myJSON = parse(myString)
    println(myJSON)

    // Converting from JObject to a plain Scala object
    implicit val formats = DefaultFormats
    val myOldMap = myJSON.extract[Map[String, List[Int]]]
    println(myOldMap)
  }
}
Jawn is a very flexible JSON parser library in Scala. It also allows generation of custom ASTs; you just need to supply it with a small trait to map to the AST.
Worked great for a recent project that needed a little bit of JSON parsing.
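A tiny parsing sketch with the jawn-ast module; note the package was plain jawn.ast in pre-1.0 releases and org.typelevel.jawn.ast afterwards, so treat the import and the renderer name as assumptions to check against the version you use:

import org.typelevel.jawn.ast._   // `jawn.ast` in pre-1.0 releases

// parseFromString returns a scala.util.Try[JValue]
val parsed = JParser.parseFromString("""[{"id": 1, "name": "John"}, {"id": 2, "name": "Dani"}]""")

// render the AST back to a JSON string
parsed.foreach(jvalue => println(CanonicalRenderer.render(jvalue)))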
Rapture seems to be missing in the list of the answers.
It can be obtained from http://rapture.io/ and allows you (among other things) to:
select JSON back-end, which is very useful if you already use one (in import)
decide if you work with Try, Future, Option, Either, etc. (also in import)
do a lot of work in a single line of code.
I don't want to copy/paste Rapture examples from its page. A nice presentation about Rapture's features was given by Jon Pretty at SBTB 2014: https://www.youtube.com/watch?v=ka5-OLJgybI
Of the libraries in @AlaxDean's answer, Argonaut is the only one that I was able to get working quickly with sbt and IntelliJ. Actually json4s also took little time, but dealing with a raw AST is not what I wanted. I got Argonaut to work by putting a single line into my build.sbt:
libraryDependencies += "io.argonaut" %% "argonaut" % "6.0.1"
And then a simple test to see if I could get JSON:
package mytest
import scalaz._, Scalaz._
import argonaut._, Argonaut._
object Mytest extends App {
  val requestJson =
    """
      {
        "userid": "1"
      }
    """.stripMargin

  val updatedJson: Option[Json] = for {
    parsed <- requestJson.parseOption
  } yield ("name", jString("testuser")) ->: parsed

  val obj = updatedJson.get.obj
  printf("Updated user: %s\n", updatedJson.toString())
  printf("obj : %s\n", obj.toString())
  printf("userid: %s\n", obj.get.toMap("userid"))
}
And then
$ sbt
> run
Updated user: Some({"userid":"1","name":"testuser"})
obj : Some(object[("userid","1"),("name","testuser")])
userid: "1"
Make sure you are familiar with Option, which is a container for a value that may be absent (a null-safe alternative to null). Argonaut makes use of Scalaz, so if you see a symbol you don't understand, like \/ (a disjunction, i.e. "or"), it's probably Scalaz.
You can try this:
https://github.com/momodi/Json4Scala
It's simple and has only one Scala file with fewer than 300 lines of code.
Here are some samples:
test("base") {
assert(Json.parse("123").asInt == 123)
assert(Json.parse("-123").asInt == -123)
assert(Json.parse("111111111111111").asLong == 111111111111111l)
assert(Json.parse("true").asBoolean == true)
assert(Json.parse("false").asBoolean == false)
assert(Json.parse("123.123").asDouble == 123.123)
assert(Json.parse("\"aaa\"").asString == "aaa")
assert(Json.parse("\"aaa\"").write() == "\"aaa\"")
val json = Json.Value(Map("a" -> Array(1,2,3), "b" -> Array(4, 5, 6)))
assert(json("a")(0).asInt == 1)
assert(json("b")(1).asInt == 5)
}
test("parse base") {
val str =
"""
{"int":-123, "long": 111111111111111, "string":"asdf", "bool_true": true, "foo":"foo", "bool_false": false}
"""
val json = Json.parse(str)
assert(json.asMap("int").asInt == -123)
assert(json.asMap("long").asLong == 111111111111111l)
assert(json.asMap("string").asString == "asdf")
assert(json.asMap("bool_true").asBoolean == true)
assert(json.asMap("bool_false").asBoolean == false)
println(json.write())
assert(json.write().length > 0)
}
test("parse obj") {
val str =
"""
{"asdf":[1,2,4,{"bbb":"ttt"},432]}
"""
val json = Json.parse(str)
assert(json.asMap("asdf").asArray(0).asInt == 1)
assert(json.asMap("asdf").asArray(3).asMap("bbb").asString == "ttt")
}
test("parse array") {
val str =
"""
[1,2,3,4,{"a":[1,2,3]}]
"""
val json = Json.parse(str)
assert(json.asArray(0).asInt == 1)
assert(json(4)("a")(2).asInt == 3)
assert(json(4)("a")(2).isInt)
assert(json(4)("a").isArray)
assert(json(4)("a").isMap == false)
}
test("real") {
val str = "{\"styles\":[214776380871671808,214783111085424640,214851869216866304,214829406537908224],\"group\":100,\"name\":\"AO4614【金宏达电子】现货库存 质量保证 欢迎购买#\",\"shopgrade\":8,\"price\":0.59,\"shop_id\":60095469,\"C3\":50018869,\"C2\":50024099,\"C1\":50008090,\"imguri\":\"http://img.geilicdn.com/taobao10000177139_425x360.jpg\",\"cag\":50006523,\"soldout\":0,\"C4\":50006523}"
val json = Json.parse(str)
println(json.write())
assert(json.asMap.size > 0)
}
I use uPickle which has the big advantage that it will handle nested case classes automatically:
object SerializingApp extends App {

  case class Person(name: String, address: Address)
  case class Address(street: String, town: String, zipCode: String)

  import upickle.default._

  val john = Person("John Doe", Address("Elm Street 1", "Springfield", "ABC123"))

  val johnAsJson = write(john)

  // Prints {"name":"John Doe","address":{"street":"Elm Street 1","town":"Springfield","zipCode":"ABC123"}}
  Console.println(johnAsJson)

  // Parse the JSON back into a Scala object
  Console.println(read[Person](johnAsJson))
}
Add this to your build.sbt to use uPickle:
libraryDependencies += "com.lihaoyi" %% "upickle" % "0.4.3"
I use the Play JSON library.
You can find the Maven repo for only the JSON library (not the whole framework) here:
val json = "com.typesafe.play" %% "play-json" % version
val typesafe = "typesafe.com" at "http://repo.typesafe.com/typesafe/releases/"
Very good tutorials about how to use it are available here (a small Reads-combinator sketch follows the links):
http://mandubian.com/2012/09/08/unveiling-play-2-dot-1-json-api-part1-jspath-reads-combinators/
http://mandubian.com/2012/10/01/unveiling-play-2-dot-1-json-api-part2-writes-format-combinators/
http://mandubian.com/2012/10/29/unveiling-play-2-dot-1-json-api-part3-json-transformers/
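A minimal, hedged example of the combinator style those tutorials teach; the User class and its fields are made up for illustration:

import play.api.libs.json._
import play.api.libs.functional.syntax._

case class User(id: Int, name: String)

// build a Reads[User] by combining per-field readers
implicit val userReads: Reads[User] = (
  (JsPath \ "id").read[Int] and
  (JsPath \ "name").read[String]
)(User.apply _)

val result: JsResult[User] = Json.parse("""{"id": 1, "name": "John"}""").validate[User]
// result == JsSuccess(User(1, "John"), ...)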
Let me also give you the SON of JSON version:
import nl.typeset.sonofjson._
arr(
obj(id = 1, name = "John"),
obj(id = 2, name = "Dani")
)
Play released its module for dealing with JSON independently from the Play Framework, Play WS.
I made a blog post about that; check it out at http://pedrorijo.com/blog/scala-json/
Using case classes and Play WS (already included in the Play Framework) you can convert between JSON and case classes with a simple one-line implicit:
case class User(username: String, friends: Int, enemies: Int, isAlive: Boolean)
object User {
implicit val userJsonFormat = Json.format[User]
}
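With that implicit in scope, the round trip is roughly this (a sketch; the sample values are invented):

import play.api.libs.json._

val json: JsValue = Json.toJson(User("jdoe", friends = 5, enemies = 0, isAlive = true))
// {"username":"jdoe","friends":5,"enemies":0,"isAlive":true}

val user: User = Json.parse("""{"username":"jdoe","friends":5,"enemies":0,"isAlive":true}""").as[User]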
Related
I am trying to implement a generic pattern with which to generate marshallers and unmarshallers for an Akka HTTP REST service using Argonaut, handling both entity-level and collection-level requests and responses. I have no issues implementing the entity level:
case class Foo(foo: String)
object Foo {
implicit val FooJsonCodec = CodecJson.derive[Foo]
implicit val EntityEncodeJson = FooJsonCodec.Encoder
implicit val EntityDecodeJson = FooJsonCodec.Decoder
}
I am running into issues attempting to provide encoders and decoders for the following:
[
{ "foo": "1" },
{ "foo": "2" }
]
I have attempted adding the following to my companion:
object Foo {
implicit val FooCollectionJsonCodec = CodecJson.derive[HashSet[Foo]]
}
However, I am receiving the following error:
Error:(33, 90) value jencode0L is not a member of object argonaut.EncodeJson
I see this method truly does not exist, but is there any other generic way to generate my expected result? I'm strongly avoiding an additional case class to describe the collection, since I am using reflection heavily in my use case.
At this point I'd even be fine with a manually constructed Encoder and Decoder; however, I've found no documentation on how to construct one with the expected structure.
Argonaut has predefined encoders and decoders for Scala's immutable lists, sets, streams and vectors. If your type is not supported explicitly, as in the case of java.util.HashSet, you can easily add EncodeJson and DecodeJson instances for the type:
import argonaut._, Argonaut._
import scala.collection.JavaConverters._
implicit def hashSetEncode[A](
implicit element: EncodeJson[A]
): EncodeJson[java.util.HashSet[A]] =
EncodeJson(set => EncodeJson.SetEncodeJson[A].apply(set.asScala.toSet))
implicit def hashSetDecode[A](
implicit element: DecodeJson[A]
): DecodeJson[java.util.HashSet[A]] =
DecodeJson(cursor => DecodeJson.SetDecodeJson[A]
.apply(cursor)
.map(set => new java.util.HashSet(set.asJava)))
// Usage:
val set = new java.util.HashSet[Int]
set.add(1)
set.add(3)
val jsonSet = set.asJson // [1, 3]
jsonSet.jdecode[java.util.HashSet[Int]] // DecodeResult(Right([1, 3]))
case class A(set: java.util.HashSet[Int])
implicit val codec = CodecJson.derive[A]
val a = A(set)
val jsonA = a.asJson // { "set": [1, 3] }
jsonA.jdecode[A] // DecodeResult(Right(A([1, 3])))
The sample was checked on Scala 2.12.1 and Argonaut 6.2-RC2, but as far as I know it shouldn't depend on any recent changes.
An approach like this works with any linear or unordered homogeneous data structure that you want to represent as a JSON array. Also, this is preferable to creating a CodecJson: the latter can be inferred automatically from EncodeJson and DecodeJson, but not vice versa. This way, your set will serialize and deserialize both when used independently and when nested within another data type, as shown in the example.
I don't use Argonaut but use spray-json, and I suspect the solution can be similar.
Have you tried something like this?
implicit def HashSetJsonCodec[T : CodecJson] = CodecJson.derive[Set[T]]
If it doesn't work, I'd probably try creating a more verbose implicit function like:
implicit def SetJsonCodec[T: CodecJson](implicit codec: CodecJson[T]): CodecJson[Set[T]] = {
  CodecJson(
    {
      case value => JArray(value.map(codec.encode).toList)
    },
    c => c.as[JsonArray].flatMap {
      case arr: Json.JsonArray =>
        val items = arr.map(codec.Decoder.decodeJson)
        items.find(_.isError) match {
          case Some(error) => DecodeResult.fail[Set[T]](error.toString(), c.history)
          case None => DecodeResult.ok[Set[T]](items.flatMap(_.value).toSet[T])
        }
    }
  )
}
PS: I didn't test this, but hopefully it leads you in the right direction :)
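Since this answer suggests spray-json would be similar, here is a hedged sketch (untested, written against the standard spray-json API) of a format for java.util.HashSet that delegates to the built-in immutable Set format:

import spray.json._
import DefaultJsonProtocol._
import scala.collection.JavaConverters._

implicit def javaHashSetFormat[A: JsonFormat]: RootJsonFormat[java.util.HashSet[A]] =
  new RootJsonFormat[java.util.HashSet[A]] {
    // delegate to the built-in Set[A] format in both directions
    def write(set: java.util.HashSet[A]): JsValue = set.asScala.toSet.toJson
    def read(json: JsValue): java.util.HashSet[A] =
      new java.util.HashSet[A](json.convertTo[Set[A]].asJava)
  }

// usage
val set = new java.util.HashSet[Int]()
set.add(1)
set.add(3)
println(set.toJson.compactPrint)   // [1,3]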
First, I searched a lot on Google and StackOverflow for questions like that, but I didn't find any useful answers (to my big surprise).
I saw something about the Play Framework, how to create a JSON array in Java, and how to create JSON objects in Java, but I don't want to use the Play Framework, and I don't know whether creating JSON objects differs between Scala and Java.
Following is the JSON I want to create. Later I'll convert the object into a string to send it via a POST request (through an API call).
{
"start_relative": {
"value": "5",
"unit": "years"
},
"metrics": [
{
"name": "DP_391366" # S-Temperature - Celsius
},
{
"name": "DP_812682" # Sensor-A4 Luminosity
}
]
}
How can I do something like that in Scala?
You should use a library that handles serialization/deserialization.
I would consider choosing between Spray Json and Play Json.
I will explain to you how the process works with Play first, and it's very similar to that in Spray.
Let's say you have a class, and an object with an instance and a json as string:
case class MyClass(id: Int,
name: String,
description: String)
object Data {
val obj: MyClass = MyClass(1, "me", "awesome")
val str: String =
"""
|{
| "id": 1,
| "name": "me",
| "description": "awesome"
|}
""".stripMargin
}
For MyClass to be serialized/deserialized, you will need an implicit formatter specific to it, so you will create an object that contains this formatter. Using Play:
trait MyClassPlayProtocol {
  implicit val formatMyClass = Json.format[MyClass]
}
object MyClassPlayProtocol extends MyClassPlayProtocol
The serialization/deserialization will look something like this:
object PlayData {
  import play.api.libs.json.JsValue
  import play.api.libs.json.Json
  import MyClassPlayProtocol._
  import Data._

  val str2Json: JsValue = Json.parse(str)
  val obj2Json: JsValue = Json.toJson(obj)

  val json2Str: String = Json.stringify(str2Json)
  val json2Obj: MyClass = obj2Json.as[MyClass]
}
In Spray, the protocol will look like this:
trait MyClassSprayProtocol extends DefaultJsonProtocol {
  implicit val myClassFormat = jsonFormat3(MyClass)
}
object MyClassSprayProtocol extends MyClassSprayProtocol
and the serialization/deserialization:
object SprayData {
  import spray.json._
  import MyClassSprayProtocol._
  import Data._

  val str2Json: JsValue = str.parseJson
  val obj2Json: JsValue = obj.toJson

  val json2Str: String = str2Json.compactPrint
  val json2Obj: MyClass = obj2Json.convertTo[MyClass]
}
As you can see, it's mostly a matter of choice between these two. Both are still being improved and probably will be for the near future.
Depending on the benchmark, you will find that one is better than the other by a few milliseconds (usually Spray).
I for one am using Spray at work and Play in some personal projects, and I can't say I have found anything fundamentally different between them.
EDIT:
And to finally answer your question, to go from MyClass to String (serialization), you will do something like this:
PLAY: Json.stringify(Json.toJson(myClass))
SPRAY: myClass.toJson.compactPrint
And the deserialization:
PLAY: Json.parse(string).as[MyClass]
SPRAY: string.parseJson.convertTo[MyClass]
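Applied to the JSON from the question, one possible Play-based sketch (the case class names and modelling below are invented, and the # comments from the question cannot be kept, since JSON has no comment syntax):

import play.api.libs.json._

case class StartRelative(value: String, unit: String)
case class Metric(name: String)
case class Query(start_relative: StartRelative, metrics: Seq[Metric])

implicit val startRelativeFormat: OFormat[StartRelative] = Json.format[StartRelative]
implicit val metricFormat: OFormat[Metric] = Json.format[Metric]
implicit val queryFormat: OFormat[Query] = Json.format[Query]

// serialize to a string, ready to be sent as a POST body
val body: String = Json.stringify(Json.toJson(
  Query(
    StartRelative("5", "years"),
    Seq(Metric("DP_391366"), Metric("DP_812682"))
  )
))
// {"start_relative":{"value":"5","unit":"years"},"metrics":[{"name":"DP_391366"},{"name":"DP_812682"}]}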
You need to use a library if you don't want to do it yourself; there are several:
Spray Json - https://github.com/spray/spray-json
Lift Json - https://github.com/lift/lift/tree/master/framework/lift-base/lift-json/
Jerkson - https://github.com/codahale/jerkson
Jackson - You can use Jackson with the Scala module https://github.com/FasterXML/jackson-module-scala
Note: The cool Java Gson library loses a lot of its magic if you want to use it with Scala, because it doesn't know about Scala collections. So if you want to use this library you have to convert the Scala List to a java.util.List and after that use Gson.
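A rough sketch of that Gson workaround, converting the Scala collection to a Java one with JavaConverters first:

import scala.collection.JavaConverters._
import com.google.gson.Gson

val gson = new Gson()

// Gson does not understand Scala's List, so convert to java.util.List first
val names: java.util.List[String] = List("John", "Dani").asJava

println(gson.toJson(names))   // ["John","Dani"]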
I have a simple single key-valued Map[K,V] myDictionary that is populated by my program, and at the end I want to write it out as a JSON-formatted string to a text file, as I will need to parse it later.
I was using this code earlier,
Some(new PrintWriter(outputDir+"/myDictionary.json")).foreach{p => p.write(compact(render(decompose(myDictionary)))); p.close}
I found it to be slower as the input size increased. Later, I used this:
var out = new PrintWriter(outputDir+"/myDictionary.json");
out.println(scala.util.parsing.json.JSONObject(myDictionary.toMap).toString())
This is proving to be a bit faster.
I have run this on sample input and found that it is faster than my earlier approach. I'm assuming my input map size would reach at least a million (K, V) entries (a >1 GB text file), hence I want to make sure that I follow a fast and memory-efficient approach for the map serialization. What other approaches would you recommend that I look into to optimize this?
The JSON support in the standard Scala library is probably not the best choice. Unfortunately the situation with JSON libraries for Scala is a bit confusing; there are many alternatives (Lift JSON, Play JSON, Spray JSON, Twitter JSON, Argonaut, ...), basically one library for each day of the week... I suggest you have a look at these at least to see if any of them is easier to use and more performant.
Here is an example using Play JSON which I have chosen for particular reasons (being able to generate formats with macros):
object JsonTest extends App {
  import play.api.libs.json._

  type MyDict = Map[String, Int]

  implicit object MyDictFormat extends Format[MyDict] {
    def reads(json: JsValue): JsResult[MyDict] = json match {
      case JsObject(fields) =>
        val b = Map.newBuilder[String, Int]
        fields.foreach {
          case (k, JsNumber(v)) => b += k -> v.toInt
          case other => return JsError(s"Not a (string, number) pair: $other")
        }
        JsSuccess(b.result())
      case _ => JsError(s"Not an object: $json")
    }

    def writes(m: MyDict): JsValue = {
      val fields: Seq[(String, JsValue)] = m.map {
        case (k, v) => k -> JsNumber(v)
      } (collection.breakOut)
      JsObject(fields)
    }
  }

  val m = Map("hallo" -> 12, "gallo" -> 34)
  val serial = Json.toJson(m)
  val text = Json.stringify(serial)
  println(text)

  val back = Json.fromJson[MyDict](serial)
  assert(back == JsSuccess(m), s"Failed: $back")
}
While you can construct and deconstruct JsValues directly, the main idea is to use a Format[A] where A is the type of your data structure. This puts more emphasis on type safety than the standard Scala library's JSON support. It looks more verbose, but in the end I think it's the better approach.
There are utility methods Json.toJson and Json.fromJson which look for an implicit format of the type you want.
On the other hand, it does construct everything in memory and it does duplicate your data structure (because for each entry in your map you will have another tuple (String, JsValue)), so this isn't necessarily the most memory-efficient solution, given that you are operating in the GB magnitude...
Jerkson is a Scala wrapper for the Java JSON library Jackson. The latter apparently has the feature to stream data. I found this project which says it adds streaming support. Play JSON in turn is based on Jerkson, so perhaps you can even figure out how to stream your object with that. See also this question.
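If the map really is in the gigabyte range, Jackson's streaming API lets you write it without building an in-memory JSON tree at all. A hedged sketch, assuming String keys and Int values (adapt the write call to your value type):

import java.io.File
import com.fasterxml.jackson.core.{JsonEncoding, JsonFactory}

val myDictionary: Map[String, Int] = Map("hallo" -> 12, "gallo" -> 34)

val gen = new JsonFactory().createGenerator(new File("myDictionary.json"), JsonEncoding.UTF8)
gen.writeStartObject()
// stream one field at a time; only the current entry is held in memory
myDictionary.foreach { case (k, v) => gen.writeNumberField(k, v) }
gen.writeEndObject()
gen.close()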
When trying to insert a MongoDBObject that contains a JsNumber
val obj: DBObject = getDbObj // contains a "JsNumber()"
collection.insert(obj)
the following error occurs:
[error] play - Cannot invoke the action, eventually got an error: java.lang.IllegalArgumentException: can't serialize class scala.math.BigDecimal
I tried to replace the JsNumber with an Int, but I got the same error.
EDIT
Error can be reproduced via this test code. Full code in scalatest (https://gist.github.com/kman007us/6617735)
val collection = MongoConnection()("test")("test")
val obj: JsValue = Json.obj("age" -> JsNumber(100))
val q = MongoDBObject("name" -> obj)
collection.insert(q)
There are no registered handlers for Play's JSON implementation - you could add handlers to automatically translate Play's Js types to BSON types. However, that won't handle MongoDB Extended JSON, which has a special structure for dealing with non-native JSON types, e.g. date and ObjectId translations.
An example of using this is:
import com.mongodb.util.JSON
val obj: JsValue = Json.obj("age" -> JsNumber(100))
val doc: DBObject = JSON.parse(obj.toString).asInstanceOf[DBObject]
For an example of a BSON transformer, see the Joda Time transformer.
It seems that the Casbah driver isn't compatible with Play's JSON implementation. If I look through the Casbah code, then it seems that you must use a set of MongoDBObject objects to build your query. The following snippet should work.
val collection = MongoConnection()("test")("test")
val obj = MongoDBObject("age" -> 100)
val q = MongoDBObject("name" -> obj)
collection.insert(q)
If you need the compatibility with Play's JSON implementation then use ReactiveMongo and Play-ReactiveMongo.
Edit
Maybe this Gist can help to convert JsValue objects into MongoDBObject objects.
I am working on Play Framework 2.0 and it uses Jerkson to parse JSON strings. I was using it successfully to parse Immutable Lists of strings like so:
Json.parse( jsonStr ).as[ List[String] ]
But this code doesn't work for me when I try
Json.parse( jsonStr ).as[ MutableList[String] ]
Does anyone know how I can do this easily?
Your second line will work as it is in a future version of Play 2.0, thanks to the replacement of seqReads by traversableReads in the current trunk:
implicit def traversableReads[F[_], A](implicit bf: generic.CanBuildFrom[F[_], A, F[A]], ra: Reads[A]) = new Reads[F[A]] {
  def reads(json: JsValue) = json match {
    case JsArray(ts) => {
      val builder = bf()
      for (a <- ts.map(fromJson[A](_))) {
        builder += a
      }
      builder.result()
    }
    case _ => throw new RuntimeException("Collection expected")
  }
}
So if you're willing to build Play from source, or to wait, you're fine. Otherwise you should be able to drop the method above somewhere in your own code to get an appropriate Reads instance in scope, or—even better—just use Alexey Romanov's solution, or—best of all—don't use MutableList.
E.g. new MutableList[String]() ++= Json.parse( jsonStr ).as[ List[String] ] (assuming @DanSimon is correct about which MutableList you mean). But the most used mutable list-like collection in Scala is a Buffer, which could be obtained as Buffer(Json.parse( jsonStr ).as[ List[String] ]: _*) or Json.parse( jsonStr ).as[ List[String] ].toBuffer.