Want Groovy MarkupBuilder() Equivalent to JSONBuilder() for Objects - json

Goal:
Given the myInfoMap definition below, I wish to be able to do this:
println new groovy.xml.MarkupBuilder(myInfoMap).toPrettyString()
Premise:
The following is one of the most amazing and convenient features of Groovy for my use cases: Brilliant dynamic serializing of complex nested objects into sensible JSON. Just pass the object, and get the JSON.
Example - A simple Map within a Map
import groovy.json.*
def myInfoMap = [
    firstname : 'firstname',
    lastname  : 'lastname',
    relatives : [
        mother : "mom",
        father : "dad"
    ]
]
myInfoJson = new JsonBuilder(myInfoMap)
//One line, straight to JSON object, no string writer/parser conversions
//Works on any object, extremely elegant, even handles deep nesting
//Alternatively, add .toPrettyString() for the string representation
Returns:
{
    "firstname": "firstname",
    "lastname": "lastname",
    "relatives": {
        "mother": "mom",
        "father": "dad"
    }
}
I have read through all the MarkupBuilder examples and docs I could find, and there does not seem to be any equivalent for XML. Here is the closest I could find, it is not nearly the same.
http://www.leveluplunch.com/groovy/examples/build-xml-from-map-with-markupbuilder/
XML and JSON are fundamentally different, but it's still common for objects to be represented by XML in a similar fashion. An XML equivalent would require at least one optional parameter specifying how the data should be represented, but I think a sensible default would be something like:
<myInfoMap>
    <firstname>firstname</firstname>
    <lastname>lastname</lastname>
    <relatives>
        <relative>
            <mother>mom</mother>
        </relative>
        <relative>
            <father>dad</father>
        </relative>
    </relatives>
</myInfoMap>
...Which has to be built manually with intimate knowledge of the structure like so...
def writer = new StringWriter()
def builder = new groovy.xml.MarkupBuilder(writer)
builder.myInfoMap {
    myInfoMap.each{ key, value ->
        if (value instanceof Map){
            "${key}"{
                value.each{ key2, value2 ->
                    "${key[0..key.size()-2]}"{
                        "${key2}" "${value2}"
                    }
                }
            }
        }else{
            "${key}" "${value}"
        }
    }
}
println writer.toString()
I even tried to be clever and make it a bit dynamic, but you can see how far it is from the JsonBuilder example, even in a simple case.
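For what it's worth, a fully recursive version of that idea is possible; this untested sketch handles arbitrary nesting, though it only emits plain nested elements and not the singular <relative> wrappers from my example above:

import groovy.xml.MarkupBuilder

def myInfoMap = [
    firstname : 'firstname',
    lastname  : 'lastname',
    relatives : [mother: 'mom', father: 'dad']
]

def writer = new StringWriter()
def builder = new MarkupBuilder(writer)

// Recursive helper: nested Maps become nested elements, anything else a text node
def emit
emit = { Map map ->
    map.each { k, v ->
        if (v instanceof Map) {
            builder."$k" { emit(v) }
        } else {
            builder."$k"(v)
        }
    }
}

builder.myInfoMap {
    emit(myInfoMap)
}
println writer.toString()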
If this is currently impossible and not on anybody's radar, I will submit my first JIRA ticket to the Groovy project as a feature request. Just want to be sure before I do. Please just comment if you think this is the next step.

Try grails.converters.XML. In your case:
def myInfoMap = [
    firstname: 'firstname',
    lastname : 'lastname',
    relatives: [
        mother: "mom",
        father: "dad"
    ]
]
println new grails.converters.XML(myInfoMap)
would result in:
<?xml version="1.0" encoding="UTF-8"?>
<map>
    <entry key="firstname">firstname</entry>
    <entry key="lastname">lastname</entry>
    <entry key="relatives">
        <entry key="mother">mom</entry>
        <entry key="father">dad</entry>
    </entry>
</map>
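For completeness, inside a Grails controller the same converter is more commonly used through render; a minimal sketch (assuming a standard Grails app where grails.converters is available):

import grails.converters.XML

class InfoController {
    def show() {
        def myInfoMap = [
            firstname: 'firstname',
            lastname : 'lastname',
            relatives: [mother: 'mom', father: 'dad']
        ]
        render myInfoMap as XML   // writes the <map>/<entry> XML above and sets an XML content type
    }
}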

I may be wrong, but I think XmlUtil.serialize covers what you want.
MrHaki: groovy-goodness-pretty-print-xml
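A minimal sketch of that approach, assuming the XML already exists as a string: XmlUtil.serialize pretty-prints it and adds the XML declaration, but the markup itself still has to be produced by something else (e.g. a builder):

import groovy.xml.XmlUtil

// Pretty-print an existing XML string; serialize() does not build XML from a Map
def compact = '<myInfoMap><firstname>firstname</firstname><relatives><mother>mom</mother><father>dad</father></relatives></myInfoMap>'
println XmlUtil.serialize(compact)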
I did some work a while back that I was going to release as a plugin, but I didn't release it in the end; I got sidetracked with an on-the-fly DNS update that didn't end up working. Anyhow, it was a controller that processed the XML, with a GSP view to go with it. But I think MrHaki has put it a lot more elegantly and has some really good examples.

Related

Kotlinx.Serializer - Create a quick JSON to send

I've been playing with Kotlinx.serialisation. I've been trying to find a quick way to use Kotlinx.serialisation to create a plain simple JSON (mostly to send it away), with minimum code clutter.
For a simple string such as:
{"Album": "Foxtrot", "Year": 1972}
What I've been doing is something like:
val str: String = Json.stringify(mapOf(
    "Album" to JsonPrimitive("Foxtrot"),
    "Year" to JsonPrimitive(1972)))
Which is far from being nice. My elements are mostly primitive, so I wish I had something like:
val str: String = Json.stringify(mapOf(
    "Album" to "Sergeant Pepper",
    "Year" to 1967))
Furthermore, I'd be glad to have a solution with a nested JSON. Something like:
Json.stringify(JsonObject("Movies", JsonArray(
    JsonObject("Name" to "Johnny English 3", "Rate" to 8),
    JsonObject("Name" to "Grease", "Rate" to 1))))
That would produce:
{
    "Movies": [
        {
            "Name": "Johnny English 3",
            "Rate": 8
        },
        {
            "Name": "Grease",
            "Rate": 1
        }
    ]
}
(not necessarily prettified; even better if not)
Is there anything like that?
Note: It's important to use a serialiser, and not a direct string such as
"""{"Name":$name, "Val": $year}"""
because it's unsafe to concat strings. Any illegal char might disintegrate the JSON! I don't want to deal with escaping illegal chars :-(
Thanks
Does this set of extension methods give you what you want?
@ImplicitReflectionSerializer
fun Map<*, *>.toJson() = Json.stringify(toJsonObject())

@ImplicitReflectionSerializer
fun Map<*, *>.toJsonObject(): JsonObject = JsonObject(map {
    it.key.toString() to it.value.toJsonElement()
}.toMap())

@ImplicitReflectionSerializer
fun Any?.toJsonElement(): JsonElement = when (this) {
    null -> JsonNull
    is Number -> JsonPrimitive(this)
    is String -> JsonPrimitive(this)
    is Boolean -> JsonPrimitive(this)
    is Map<*, *> -> this.toJsonObject()
    is Iterable<*> -> JsonArray(this.map { it.toJsonElement() })
    is Array<*> -> JsonArray(this.map { it.toJsonElement() })
    else -> JsonPrimitive(this.toString()) // Or throw some "unsupported" exception?
}
This allows you to pass in a Map with various types of keys/values in it, and get back a JSON representation of it. In the map, each value can be a primitive (string, number or boolean), null, another map (representing a child node in the JSON), or an array or collection of any of the above.
You can call it as follows:
val json = mapOf(
    "Album" to "Sergeant Pepper",
    "Year" to 1967,
    "TestNullValue" to null,
    "Musicians" to mapOf(
        "John" to arrayOf("Guitar", "Vocals"),
        "Paul" to arrayOf("Bass", "Guitar", "Vocals"),
        "George" to arrayOf("Guitar", "Sitar", "Vocals"),
        "Ringo" to arrayOf("Drums")
    )
).toJson()
This returns the following JSON, not prettified, as you wanted:
{"Album":"Sergeant Pepper","Year":1967,"TestNullValue":null,"Musicians":{"John":["Guitar","Vocals"],"Paul":["Bass","Guitar","Vocals"],"George":["Guitar","Sitar","Vocals"],"Ringo":["Drums"]}}
You probably also want to add handling for some other types, e.g. dates.
But can I just check that you want to manually build up JSON in code this way rather than creating data classes for all your JSON structures and serializing them that way? I think that is generally the more standard way of handling this kind of stuff. Though maybe your use case does not allow that.
It's also worth noting that the code has to use the ImplicitReflectionSerializer annotation, as it's using reflection to figure out which serializer to use for each bit. This is still experimental functionality which might change in future.

Return nested JSON in AWS AppSync query

I'm quite new to AppSync (and GraphQL), in general, but I'm running into a strange issue when hooking up resolvers to our DynamoDB tables. Specifically, we have a nested Map structure for one of our item's attributes that is arbitrarily constructed (its complexity and form depends on the type of parent item) — a little something like this:
"item" : {
"name": "something",
"country": "somewhere",
"data" : {
"nest-level-1a": {
"attr1a" : "foo",
"attr1b" : "bar",
"nest-level-2" : {
"attr2a": "something else",
"attr2b": [
"some list element",
"and another, for good measure"
]
}
}
},
"cardType": "someType"
}
Our accompanying GraphQL type is the following:
type Item {
    name: String!
    country: String!
    cardType: String!
    data: AWSJSON! ## note: it was originally String!
}
When we query the item we get the following response:
{
    "data": {
        "genericItemQuery": {
            "name": "info/en/usa/bra/visa",
            "country": "USA:BRA",
            "cardType": "visa",
            "data": "{\"tourist\":{\"reqs\":{\"sourceURL\":\"https://travel.state.gov/content/passports/en/country/brazil.html\",\"visaFree\":false,\"type\":\"eVisa required\",\"stayLimit\":\"30 days from date of entry\"},\"pages\":\"One page per stamp required\"}}"
        }
    }
}
The problem is we can't seem to get the Item.data field resolver to return a JSON object (even when we attach a separate field-level resolver to it on top of the general Query resolver). It always returns a String and, weirdly, if we change the expected field type to String!, the response will replace all : in data with =. We've tried everything with our response resolvers, including suggestions like How return JSON object from DynamoDB with appsync?, but we're completely stuck at this point.
Our current response resolver for our query has been reverted back to the standard response after none of the suggestions in the aforementioned post worked:
## 'Before' response mapping template on genericItemQuery query; same result as the 'After' listed below
#set($result = $ctx.result)
#set($result.data = $util.parseJson($ctx.result.data))
$util.toJson($result)

## 'After' response mapping template
$util.toJson($ctx.result)
We're trying to avoid a situation where we need to include supporting types for each nest level in data (since it changes based on parent Item type and in cases like the example I gave it can have three or four tiers), and we thought changing the schema type to AWSJSON! would do the trick. I'm beginning to worry there's no way to get around rebuilding our base schema, though. Any suggestions to the contrary would be helpful!
P.S. I've noticed in the CloudWatch logs that the appropriate JSON response exists under the context.result.data response field, but somehow there's the following transformedTemplate (which, again, I find very unusual considering we're not applying any mapping template except to transform the result into valid JSON):
"arn": ...
"transformedTemplate": "{data={tourist={reqs={sourceURL=https://travel.state.gov/content/passports/en/country/brazil.html, visaFree=false, type=eVisa required, stayLimit=30 days from date of entry}, pages=One page per stamp required}}, resIds=USA:BRA, cardType=visa, id=info/en/usa/bra/visa}",
"context": ...
Apologies for the lengthy question, but I'm stumped.
AWSJSON is a JSON string type so you will always get back a string value (this is what your type definition must adhere to).
You could try to make a type for the data field which contains all possible fields and then resolve fields according to the parent type, or alternatively you could try to implement GraphQL interfaces.

Json4s: keep unknown fields in a map when deserialising

I am trying to parse the response given by an HTTP endpoint using json4s in Scala. The JSON returned can have any number of fields (they are all documented and defined, but there are a lot of them, and they are subject to change). I don't need to reference many of these fields, only pass them on to some other service to deal with.
I want to take the fields I need, and deserialise the rest to a map. The class needs to be serialised correctly also.
e.g. JSON response from endpoint:
{
    "name": "value",
    "unknown_field": "unknown",
    "unknown_array": ["one", "two", "three"],
    ...
    ...
}
e.g. case class used in code:
case class TestResponse(name: String, otherFields: Map[String, Any])
Is there a simple solution for this?
I have made an attempt to implement a custom Serialiser for this, but have not had much luck as yet. Seems like this would be a common enough requirement. Is there a way to do this OOTB with json4s?
Cheers
Current attempt at a custom serialiser:
private object TestResponseDeserializer extends CustomSerializer[TestResponse](_ => ( {
    case JObject(JField("name_one", JString(name)) :: rest) => TestType1Response(name, rest.toMap)
    case JObject(JField("name_two", JString(name)) :: rest) => TestType2Response(name, rest.toMap)
}, {
    case testType1: TestType1Response =>
        JObject(JField("name_one", JString(testType1.name)))
    case testType2: TestType2Response =>
        JObject(JField("name_two", JString(testType2.name)))
}))
I was able to solve this using the custom serialiser in the original question. It initially didn't work for me due to an unrelated issue.

Grails LinkedHashMap to JSON Preserve Order on RoundTrip

tl;dr
I need to send the data from a LinkedHashMap in a GSP template to a Controller and preserve the order of the elements.
I'm assuming a structured data format like JSON is the ideal way to do this, but Grails' JSON converter doesn't create an ordered JSON object from a LinkedHashMap.
What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data?
Long version
I'm developing a taglib to render search results in a table.
In the taglib, I construct a LinkedHashMap that specifies the data columns and the labels that the user wants to show for the column names. For example:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
That map gets sent to a view, which will then send it back to a controller to retrieve the search results from the database. I need to preserve the order of the elements (hence the use of a LinkedHashMap).
My first thought was to turn the LinkedHashMap into a JSON string, and then send it to the controller via a hidden form element. So,
import grails.converters.JSON
//taglib class and other code
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"] as JSON
However, that creates a JSON Object like this in the HTML. I'm putting this in a hidden field's value attribute.
<input type="hidden" name="columns" value="{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}" id="columns">
Here's the JSON object by itself.
{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}
You can see that the JSON string's properties are in the same order as the LinkedHashMap. However, JSON objects aren't really supposed to preserve the order of their properties. Thus, when my controller receives the columns parameter and I use the JSON.parse() method on it, it creates a plain ol' unordered HashMap instead of a LinkedHashMap. As a result, the columns in my search results display in the wrong order when I render them into an HTML table.
At least one fellow has had a similar problem. Adding as LinkedHashMap after running JSON.parse() doesn't cut it, since the .parse() method screws up the order from the get go.
Daniel Woods, in his response to the above post, noted:
If it's a matter of the grails data binder not working for you, you should be able to override the implicit property setter to cast the object to your favorite Map implementation.
I assume that he's saying I could write my own parser, which would honor the order of the JSON elements (even though it technically shouldn't). I imagine I could also write my own converter so that the resulting JSON element would be something like:
{[{firstName: "First Name"}, {lastName: "Surname"}, {unique_id: "Your Whizbang ID"}]}
I'm just about terrified of how the JSON parser would handle that, though. Would I get back a list of HashMaps?
Again, my real question is What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data? I'm assuming that's JSON, but I'm more than happy to be told, "Why not just..."
I think the issue is a mismatch between the nature of Java/Groovy collections and the simple "it's a list or it's a map" nature of JSON. Without getting into custom parsing, I'd suggest shifting what you're sending a bit. Instead of trying to force Groovy notions of a LinkedHashMap into Javascriptland, maybe stick to an idiom Javascript understands, such as a list of maps.
In code, instead of:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
how about:
List tableFields = [
    [ name: 'firstName', label: 'First Name' ],
    [ name: 'lastName', label: 'Surname' ],
    [ name: 'unique_id', label: 'Your Whizbang ID' ],
]
This shifts you to JSON that'd maintain the data (I think) you need while giving JSON something it understands is ordered (a list):
<input type="hidden" name="columns" id="columns" value="[
{ "name": "firstName", "label": "First Name" },
{ "name": "lastName", "label": "Surname" },
{ "name": "unique_id", "label": "Your Whizbang ID" }
]" />
Whatever handles this will be a slightly deeper iterator, but that's the price of going from a land of good collections to simpler types...
What I'm doing for now is passing both the current JSON object and a list that I can iterate through. In the GSP template, this looks like:
<g:hiddenField name="columns" value="${colJson}"/>
<g:hiddenField name="columnOrder" value="${columns.collect{it.key}}"/>
where columns is the LinkedHashMap.
Then, in the controller that gets those params, I do this:
def columnTitles = params.columnOrder.tokenize(",[] ")
def unorderedColumns = JSON.parse(params.columns)
def columns = columnTitles.collectEntries{ [(it): unorderedColumns[it]] }
Not elegant, but it does work, and it requires a bit less refactoring than Joe Rinehart's suggestion.
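Another option, sketched here but not tested in a real Grails controller: parse the columns parameter with groovy.json.JsonSlurper instead of grails.converters.JSON. Its parsed maps keep insertion order in practice (they are backed by a LinkedHashMap), even though JSON itself makes no ordering guarantee.

import groovy.json.JsonSlurper

// In the controller this string would be params.columns
def json = '{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}'
def columns = new JsonSlurper().parseText(json)
assert columns.keySet().toList() == ['firstName', 'lastName', 'unique_id']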

Parsing JSON with Erlang (erlang-rfc4627 lib)

I am attempting to get some JSON parsed with erlang-rfc4627 and struggling with the returned results
This is the JSON:
{
    "people": [
        {"name": "Toby"}
    ]
}
Using the erlang-rfc4627 library:
{ok, Json, []} = rfc4627:decode("...")
I can decode fine into Erlang as:
{obj,[
    {"people",[
        {obj,[
            {"name",<<"Toby">>}
        ]},
        {obj,[
            {"name",<<"Blah">>}
        ]}
    ]}
]}
But then what happens?
How do I get an array of people out of this structure in an easy way?
(This is a very simplified model of the overall JSON).
Is there a better library I should be using for this?
Updated
I noticed that when pulling out arrays, each element of the arrays has the awful obj structure wrapped into it, which makes the process of manipulating arrays very clumsy.
Why on earth is this so complex in Erlang?
Reference: http://www.lshift.net/blog/2007/02/17/json-and-json-rpc-for-erlang
How about:
lookup(K, {obj, PL}) -> proplists:get_value(K, PL).
And then
People = lookup("people", JSON),
Names = [lookup("name", Obj) || Obj <- People].
The better way is to generalize this idea into a query compiler, which can compile any query to a function that can then be applied to a JSON document. It will be way easier if you want to go rummaging inside JSON documents all the time.
It is also important to note that you should probably not be operating directly on the JSON structure, but embed it in something else inside the Erlang world.
Is there a better library I should be using for this?
Please try a library like jiffy; it is easy to use:
1> Doc = {[{foo, [<<"bing">>, 2.3, true]}]}.
{[{foo,[<<"bing">>,2.3,true]}]}
2> BinJSON = jiffy:encode(Doc).
<<"{\"foo\":[\"bing\",2.3,true]}">>
3> JsonMap = jiffy:decode(BinJSON, [return_maps]).
#{<<"foo">> => [<<"bing">>,2.3,true]}
4> maps:get(<<"foo">>, JsonMap, undefined).
[<<"bing">>,2.3,true]