Kotlinx.Serialization - Create a quick JSON to send

I've been playing with kotlinx.serialization, trying to find a quick way to create a plain, simple JSON (mostly to send it away) with minimum code clutter.
For a simple string such as:
{"Album": "Foxtrot", "Year": 1972}
what I've been doing is something like:
val str: String = Json.stringify(mapOf(
    "Album" to JsonPrimitive("Foxtrot"),
    "Year" to JsonPrimitive(1972)))
which is far from nice. My elements are mostly primitives, so I wish I had something like:
val str: String = Json.stringify(mapOf(
    "Album" to "Sergeant Pepper",
    "Year" to 1967))
Furthermore, I'd be glad to have a solution with a nested JSON. Something like:
Json.stringify(JsonObject("Movies", JsonArray(
    JsonObject("Name" to "Johnny English 3", "Rate" to 8),
    JsonObject("Name" to "Grease", "Rate" to 1))))
That would produce:
{
    "Movies": [
        {
            "Name":"Johnny English 3",
            "Rate":8
        },
        {
            "Name":"Grease",
            "Rate":1
        }
    ]
}
(not necessarily prettified; in fact, preferably not)
Is there anything like that?
Note: It's important to use a serialiser, and not a direct string such as
"""{"Name":$name, "Val": $year}"""
because it's unsafe to concatenate strings - any illegal char might disintegrate the JSON! I don't want to deal with escaping illegal chars :-(
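A minimal sketch of the hazard (the `naiveJson` helper and sample values are made up for illustration): a single quote character inside a value is enough to break naively interpolated JSON.

```kotlin
// Hypothetical helper that interpolates values directly into a JSON template.
fun naiveJson(name: String, year: Int): String =
    """{"Name":"$name", "Year":$year}"""

fun main() {
    // The quote characters inside the value are not escaped,
    // so the result is not valid JSON:
    println(naiveJson("Johnny \"English\"", 2018))
    // {"Name":"Johnny "English"", "Year":2018}
}
```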
Thanks

Does this set of extension methods give you what you want?
@ImplicitReflectionSerializer
fun Map<*, *>.toJson() = Json.stringify(toJsonObject())

@ImplicitReflectionSerializer
fun Map<*, *>.toJsonObject(): JsonObject = JsonObject(map {
    it.key.toString() to it.value.toJsonElement()
}.toMap())

@ImplicitReflectionSerializer
fun Any?.toJsonElement(): JsonElement = when (this) {
    null -> JsonNull
    is Number -> JsonPrimitive(this)
    is String -> JsonPrimitive(this)
    is Boolean -> JsonPrimitive(this)
    is Map<*, *> -> this.toJsonObject()
    is Iterable<*> -> JsonArray(this.map { it.toJsonElement() })
    is Array<*> -> JsonArray(this.map { it.toJsonElement() })
    else -> JsonPrimitive(this.toString()) // Or throw some "unsupported" exception?
}
This allows you to pass in a Map with various types of keys/values in it, and get back a JSON representation of it. In the map, each value can be a primitive (string, number or boolean), null, another map (representing a child node in the JSON), or an array or collection of any of the above.
You can call it as follows:
val json = mapOf(
    "Album" to "Sergeant Pepper",
    "Year" to 1967,
    "TestNullValue" to null,
    "Musicians" to mapOf(
        "John" to arrayOf("Guitar", "Vocals"),
        "Paul" to arrayOf("Bass", "Guitar", "Vocals"),
        "George" to arrayOf("Guitar", "Sitar", "Vocals"),
        "Ringo" to arrayOf("Drums")
    )
).toJson()
This returns the following JSON, not prettified, as you wanted:
{"Album":"Sergeant Pepper","Year":1967,"TestNullValue":null,"Musicians":{"John":["Guitar","Vocals"],"Paul":["Bass","Guitar","Vocals"],"George":["Guitar","Sitar","Vocals"],"Ringo":["Drums"]}}
You probably also want to add handling for some other types, e.g. dates.
But can I just check: do you want to manually build up JSON in code this way, rather than creating data classes for all your JSON structures and serializing them that way? I think that is generally the more standard way of handling this kind of thing. Though maybe your use case does not allow that.
It's also worth noting that the code has to use the @ImplicitReflectionSerializer annotation, as it's using reflection to figure out which serializer to use for each bit. This is still experimental functionality which might change in the future.

Related

Scala: How to create a simple nested JSON object

In Scala I want to create a JSON Object or JSON String where the result would be something similar to this (i.e. a relatively basic nested JSON):
{
    "info": {
        "info1": "1234",
        "info2": "123456",
        "info3": false
    },
    "x": [
        {
            "x1_info": 10,
            "x2_info": "abcde"
        }
    ],
    "y": [
        {
            "y1_info": "28732",
            "y2_info": "defghi",
            "y3_info": "1261",
            "y_extra_info": [
                {
                    "Id": "543890",
                    "ye1": "asdfg",
                    "ye2": "BOOLEAN",
                    "ye3": false,
                    "ye4": true,
                    "ye5": true,
                    "ye6": 0,
                    "ye7": 396387
                }
            ]
        }
    ]
}
I plan to use this object in tests so ideally I can set the information inside the JSON object to whatever I want (I don't plan on sourcing the info for the JSON object from a separate file). I tried the following where I created a Map of the information I want in my JSON object and then tried converting it to json using gson:
val myInfo = Map(
    "info" -> Map(
        "info1" -> "1234",
        "info2" -> "123456",
        "info3" -> "false"),
    "x" -> Map(
        "x1_info" -> 10,
        "x2_info" -> "abcde"),
    "y" -> Map(
        "y1_info" -> "28732",
        "y2_info" -> "defghi",
        "y3_info" -> "1261",
        "y_extra_info" -> Map(
            "Id" -> "543890",
            "ye1" -> "asdfg",
            "ye2" -> "BOOLEAN",
            "ye3" -> false,
            "ye4" -> true,
            "ye5" -> true,
            "ye6" -> 0,
            "ye7" -> 396387
        )
    )
)
val gson = new GsonBuilder().setPrettyPrinting.create
print(gson.toJson(myInfo))
However, though it produces valid JSON, it does not produce the JSON in the structure I have above, but instead structures it like so:
{
    "key1": "info",
    "value1": {
        "key1": "info1",
        "value1": "1234",
        "key2": "info2",
        "value2": "123456",
        ...(etc)...
I have figured out an inelegant way of achieving what I want by creating a string containing the JSON I expect and then parsing it as JSON (but I assume there is a better way):
val jsonString = "{\"info\":{\"info1\":\"1234\", \"info2\":\"123456\", " +
    "\"info3\":false}," +
    "\"x\":[{ \"x1_info\":10,\"x2_info\":\"abcde\"}]," +
    "\"y\":[{\"y1_info\":\"28732\",\"y2_info\":\"defghi\",\"y3_info\":\"1261\", " +
    "\"y_extra_info\":[{" +
    "\"Id\":\"543890\",\"ye1\":\"asdfg\"," +
    "\"ye2\":\"BOOLEAN\",\"ye3\":false,\"ye4\":true," +
    "\"ye5\":true,\"ye6\":0,\"ye7\":396387}]}]}"
I'm relatively new to Scala so perhaps using a Map or writing it all out as a string is not the best or most "Scala way"?
Is there an easy way to create such a JSON object using preferable either gson or spray or otherwise by some other means?
In general, there's no good reason to use GSON with Scala, as GSON is Java-based and doesn't really know how to deal with Scala collections. In this case, it doesn't know how to deal with the key-value-ness of a Scala Map and instead (most likely due to reflection) just leaks the implementation detail of the default Scala Map being customized for the cases where the size is less than 4.
JSON libraries which are native to Scala (e.g. circe, play-json, spray-json, etc.) do what you expect for String-keyed Maps. They also have built-in support for directly handling case classes in Scala, e.g. for
case class Foo(id: String, code: Int, notes: String)
val foo = Foo("fooId", 42, "this is a very important foo!")
will serialize as something like (ignoring possible differences around pretty-printing and field ordering)
{ "id": "fooId", "code": 42, "notes": "this is a very important foo!" }
If there's a hard project requirement that you must use GSON, I believe it does understand Java's collections, so you could convert a Scala Map to a java.util.Map by using the conversions in CollectionConverters (in 2.13, it's in the scala.jdk package; in earlier versions, it should be in scala.collection). I'm not going to elaborate on this, because my very strong opinion is that in a Scala codebase, it's best to use a native-to-Scala JSON library.

How to properly use JSON.parse in kotlinjs with enums?

During my fresh adventures with kotlin-react I hit a hard stop when trying to parse some data from my backend which contains enum values.
Spring-Boot sends the object in JSON form like this:
{
    "id": 1,
    "username": "Johnny",
    "role": "CLIENT"
}
role in this case is the enum value and can have the two values CLIENT and LECTURER. If I were to parse this with a java library or let this be handled by Spring-Boot, role would be parsed to the corresponding enum value.
With kotlin-js' JSON.parse, that wouldn't work and I would have a simple string value in there.
After some testing, I came up with this snippet
val json = """{
    "id": 1,
    "username": "Johnny",
    "role": "CLIENT"
}"""
val member: Member = JSON.parse(json) { key: String, value: Any? ->
    if (key == "role") Member.Role.valueOf(value.toString())
    else value
}
in which I manually have to define the conversion from the string value to the enum.
Is there something I am missing that would simplify this behaviour?
(I am not referring to using ids for the JSON and then looking those up, etc. I am curious about some method in Kotlin-JS.)
I assume there is not, because the "original" JSON.parse in JS doesn't do this and Kotlin does not add any additional stuff there, but I still have hope!
As far as I know, no.
The problem
Kotlin/JS produces an incredibly weird type situation when deserializing using the embedded JSON class, which is actually a mirror of JavaScript's JSON class. JavaScript's type handling is near non-existent: only manual throws can enforce types, so JSON.parse doesn't care whether it returns a SomeCustomObject or a newly created object with the exact same fields.
As an example of that, if you have two different classes with the same field names (no inheritance), and have a function that accepts a variable, it doesn't care which of those (or a third for that matter) it receives as long as the variables it tries accessing on the class exists.
These type issues carry over into Kotlin. Wrapping this back into Kotlin, consider the following code:
val json = """{
    "x": 1, "y": "yes", "z": {
        "x": 42, "y": 314159, "z": 444
    }
}""".trimIndent()

data class SomeClass(val x: Int, val y: String, val z: Struct)
data class Struct(val x: Int, val y: Int, val z: Int)

fun main(args: Array<String>) {
    val someInstance = JSON.parse<SomeClass>(json)
    if (someInstance.z::class != Struct::class) {
        println("Incompatible types: Required ${Struct::class}, found ${someInstance.z::class}")
    }
}
What would you expect this to print? The natural expectation would be a Struct; after all, the type is explicitly declared.
Unfortunately, that is not the case. Instead, it prints:
Incompatible types: Required class Struct, found class Any
The point
The embedded JSON de/serializer isn't good with types. You might be able to fix this by using a different serializing library, but I'll avoid turning this into a "use [this] library".
Essentially, JSON.parse fails to parse objects as expected. If you entirely remove the arguments and try a raw JSON.parse(json) on the JSON in your question, you'll get a role that is a String and not the Role you might expect. And with JSON.parse doing no type conversion whatsoever, that means you have two options: using a library, or using your approach.
Your approach will unfortunately get complicated if you have nested objects, but with the types being changed, the only option you appear to have left is explicitly parsing the objects manually.
TL;DR: your approach is fine.
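For completeness, the string-to-enum step itself is plain Kotlin and works the same on any target. A minimal sketch (Role mirrors the two values from the question; roleOrNull is a made-up helper name):

```kotlin
// Mirrors the enum values mentioned in the question.
enum class Role { CLIENT, LECTURER }

// Role.valueOf(...) throws IllegalArgumentException for unknown strings,
// so a tolerant variant that returns null is often safer for external data.
fun roleOrNull(value: String): Role? =
    Role.values().firstOrNull { it.name == value }

fun main() {
    println(roleOrNull("CLIENT")) // CLIENT
    println(roleOrNull("ADMIN"))  // null
}
```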

Json4s: keep unknown fields in a map when deserialising

I am trying to parse the response given by an HTTP endpoint using json4s in Scala. The JSON returned could have any number of fields (they are all documented and defined, but there are a lot of them, and they are subject to change). I don't need to reference many of these fields, only pass them on to some other service to deal with.
I want to take the fields I need, and deserialise the rest to a map. The class needs to be serialised correctly also.
e.g. JSON response from endpoint:
{
"name": "value",
"unknown_field": "unknown",
"unknown_array": ["one", "two", "three"],
...
...
}
e.g. case class used in code:
case class TestResponse(name: String, otherFields: Map[String, Any])
Is there a simple solution for this?
I have made an attempt to implement a custom Serialiser for this, but have not had much luck as yet. Seems like this would be a common enough requirement. Is there a way to do this OOTB with json4s?
Cheers
Current attempt at a custom serialiser:
private object TestResponseDeserializer extends CustomSerializer[TestResponse](_ => ( {
    case JObject(JField("name_one", JString(name)) :: rest) => TestType1Response(name, rest.toMap)
    case JObject(JField("name_two", JString(name)) :: rest) => TestType2Response(name, rest.toMap)
}, {
    case testType1: TestType1Response => JObject(JField("name_one", JString(testType1.name)))
    case testType2: TestType2Response => JObject(JField("name_two", JString(testType2.name)))
}))
I was able to solve this using the custom serialiser in the original question; it initially didn't work for me due to an unrelated issue.

How to intercept map getProperty and list getAt?

I'm scraping external sources, mostly JSON. I'm using new JsonSlurper().parse(body) to parse them, and I operate on them using constructs like def name = json.user[0].name. These being external, they can change without notice, so I want to be able to detect this and log it somehow.
After reading a bit about the MOP, I thought I could change the appropriate methods of the maps and lists to log if a property is missing. I only want to do that on the json object and on its properties, recursively. The thing is, I don't know how to do that.
Or, is there a better way to accomplish all this?
[EDIT] For example if I get this JSON:
def json = '''{
    "owners": [
        {
            "firstName": "John",
            "lastName": "Smith"
        },
        {
            "firstName": "Jane",
            "lastName": "Smith"
        }
    ]
}'''
def data = new groovy.json.JsonSlurper().parse(json.bytes)
assert data.owners[0].firstName == 'John'
However, if they change "owners" to "ownerInfo" the above access would throw NPE. What I want is intercept the access and do something (like log it in a special log, or whatever). I can also decide to throw a more specialized exception.
I don't want to catch NullPointerException, because it may be caused by some bug in my code instead of the changed data format. Besides, if they changed "firstName" to "givenName", but kept the "owners" name, I'd just get a null value, not NPE. Ideally I want to detect this case as well.
I also don't want to scatter a lot of if or elvis operators everywhere, if possible.
I actually managed to intercept that for maps:
data.getMetaClass().getProperty = {name -> println ">>> $name"; delegate.get(name)}
assert data.owners // this prints ">>> owners"
I still can't find out how to do that for the list:
def owners = data.owners
owners.getMetaClass().getAt = { o -> println "]]] $o"; delegate.getAt(o) }
assert owners[0] // this doesn't print anything
Try this
owners.getMetaClass().getAt = { Integer o -> println "]]] $o"; delegate.get(o)}
I'm only guessing that it got lost because of the multiple getAt() methods, so you have to define the type. I also delegated to ArrayList's Java get() method, since getAt() resulted in recursive calls.
If you want to more control over all methods calls, you could always do this
owners.getMetaClass().invokeMethod = { String methodName, Object[] args ->
    if (methodName == "getAt") {
        println "]]] $args"
    }
    return ArrayList.getMetaClass().getMetaMethod(methodName, args).invoke(delegate, args)
}
The short answer is that you can't do this with the given example. The reason is that the owners object is a java.util.ArrayList, and you are calling the get(int index) method on it. The metaClass object is specific to Groovy, and if a Java object makes a method call to a Java method, it has no knowledge of the metaClass. Here's a somewhat related question.
The good news is that there is an option, although I'm not sure if it works for your use case. You can create a Groovy wrapper object for this list, so that you can capture method calls.
For example, you could change your code from this
def owners = data.owners
to this
def owners = new GroovyList(data.owners)
and then create this new class
class GroovyList {
    private List list

    public GroovyList(List list) {
        this.list = list
    }

    public Object getAt(int index) {
        println "]]] $index"
        list.getAt(index)
    }
}
Now when you call
owners[0]
you'll get the output
]]] 0
[firstName:John, lastName:Smith]

Abstraction to extract data from JSON in Scala

I am looking for a good abstraction to extract data from JSON (I am using json4s now).
Suppose I have a case class A and data in JSON format.
case class A(a1: String, a2: String, a3: String)
{"a1":"xxx", "a2": "yyy", "a3": "zzz"}
I need a function to extract the JSON data and return A with these data as follows:
val a: JValue => A = ...
I do not want to write the function a from scratch. I would rather compose it from primitive functions.
For example, I can write a primitive function to extract string by field name:
val str: (String, JValue) => String = {(fieldName, jval) => ... }
Now I would like to compose the function a: JValue => A from str. Does it make sense ?
Consider use of Play-JSON, which has a composable "Reads" object. If you've ever used ReactiveMongo, it can be used in much the same way. Contrary to some older posts here, it can be used stand-alone, without most of the rest of Play.
It uses the common "implicit translator" (my term) idiom. I found that my favorite deserializing pattern for using it is not highlighted in the docs, though - the pattern they espouse is a lot harder to get right, IMHO. I make heavy use of .as and .asOpt, which are documented in the small section "Using JsValue.as/asOpt". When deserializing a JSON object, you can say something like
val person:Person = (someParsedJsonObject \ "aPerson").as[Person]
and as long as you have an implicit Reads[Person] in scope, all just works. There are built-in Reads for all primitive types and many collection types. In many cases, it makes sense to put the Reads and Writes implicit objects in the companion object for, e.g., Person.
I thought json4s had a similar feature, but I could be wrong.
Argonaut is a purely functional Scala library. It allows you to encode/decode case classes (JSON codecs).
import argonaut._, Argonaut._

case class Person(name: String, age: Int)

implicit def PersonDecodeJson: DecodeJson[Person] =
    jdecode2L(Person.apply)("name", "age")
// Codec for Person case class from JSON of form
// { "name": "string", "age": 1 }
It also provides a JSON cursor (lenses/Monocle) for custom parsing.
implicit def PersonDecodeJson: DecodeJson[Person] =
    DecodeJson(c => for {
        name <- (c --\ "_name").as[String]
        age <- (c --\ "_age").as[String].map(_.toInt)
    } yield Person(name, age))
// Decode Person from JSON with property names different
// from those of the case class, and age passed as a string:
// { "_name": "string", "_age": "10" }
The parsing result is represented by the DecodeResult type, which can be composed (.map, .flatMap) and handles error cases.