Grails LinkedHashMap to JSON: Preserve Order on Round Trip

tl;dr
I need to send the data from a LinkedHashMap in a GSP template to a Controller and preserve the order of the elements.
I'm assuming a structured data format like JSON is the ideal way to do this, but Grails' JSON converter doesn't create an ordered JSON object from a LinkedHashMap.
What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data?
Long version
I'm developing a taglib to render search results in a table.
In the taglib, I construct a LinkedHashMap that specifies the data columns and the labels that the user wants to show for the column names. For example:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
That map gets sent to a view, which will then send it back to a controller to retrieve the search results from the database. I need to preserve the order of the elements (hence the use of a LinkedHashMap).
My first thought was to turn the LinkedHashMap into a JSON string, and then send it to the controller via a hidden form element. So,
import grails.converters.JSON
//taglib class and other code
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"] as JSON
However, that creates a JSON Object like this in the HTML. I'm putting this in a hidden field's value attribute.
<input type="hidden" name="columns" value="{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}" id="columns">
Here's the JSON object by itself.
{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}
You can see that the JSON string's properties are in the same order as the LinkedHashMap. However, JSON objects aren't really supposed to preserve the order of their properties. Thus, when my controller receives the columns parameter and I use the JSON.parse() method on it, it creates a plain ol' unordered HashMap instead of a LinkedHashMap. As a result, the columns in my search results display in the wrong order when I render them into an HTML table.
At least one fellow has had a similar problem. Adding as LinkedHashMap after running JSON.parse() doesn't cut it, since the .parse() method screws up the order from the get-go.
Daniel Woods, in his response to the above post, noted:
If it's a matter of the grails data binder not working for you, you should be able to override the implicit property setter to cast the object to your favorite Map implementation.
I assume that he's saying I could write my own parser, which would honor the order of the JSON elements (even though it technically shouldn't). I imagine I could also write my own converter so that the resulting JSON element would be something like:
[{"firstName": "First Name"}, {"lastName": "Surname"}, {"unique_id": "Your Whizbang ID"}]
I'm just about terrified of how the JSON parser would handle that, though. Would I get back a list of HashMaps?
Again, my real question is: What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data? I'm assuming that's JSON, but I'm more than happy to be told, "Why not just..."

I think the issue is a mismatch between the nature of Java/Groovy collections and the simple "it's a list or it's a map" nature of JSON. Without getting into custom parsing, I'd suggest shifting what you're sending a bit. Instead of trying to force Groovy notions of a LinkedHashMap into Javascriptland, maybe stick to an idiom Javascript understands, such as a list of maps.
In code, instead of:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
how about:
List tableFields = [
    [ name: 'firstName', label: 'First Name' ],
    [ name: 'lastName', label: 'Surname' ],
    [ name: 'unique_id', label: 'Your Whizbang ID' ],
]
This shifts you to JSON that'd maintain the data (I think) you need while giving JSON something it understands is ordered (a list):
<input type="hidden" name="columns" id="columns" value="[
{ "name": "firstName", "label": "First Name" },
{ "name": "lastName", "label": "Surname" },
{ "name": "unique_id", "label": "Your Whizbang ID" }
]" />
Whatever handles this will be a slightly deeper iterator, but that's the price of going from a land of good collections to simpler types...
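For completeness, a minimal sketch of the controller side under that shape (assuming the hidden field is still named columns; the search() action name is just for illustration): grails.converters.JSON parses a JSON array into an ordered JSONArray, and Groovy's collectEntries collects into a LinkedHashMap, so the column order survives the round trip.
import grails.converters.JSON

def search() {
    // params.columns holds the JSON array of [name: ..., label: ...] maps from the hidden field
    def parsed = JSON.parse(params.columns)                       // JSONArray, so element order is preserved
    def columns = parsed.collectEntries { [(it.name): it.label] } // collectEntries builds a LinkedHashMap
    // columns now iterates in the original column order
}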

What I'm doing for now is passing both the current JSON object and a list that I can iterate through. In the GSP template, this looks like:
<g:hiddenField name="columns" value="${colJson}"/>
<g:hiddenField name="columnOrder" value="${columns.collect{it.key}}"/>
where columns is the LinkedHashMap.
Then, in the controller that gets those params, I do this:
def columnTitles = params.columnOrder.tokenize(",[] ")
def unorderedColumns = JSON.parse(params.columns)
def columns = columnTitles.collectEntries{ [(it): unorderedColumns[it]] }
Not elegant, but it does work, and it requires a bit less refactoring than Joe Rinehart's suggestion.
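Incidentally, a slightly tidier variant of the same workaround, sketched under the same assumptions about the two hidden fields, joins the key order explicitly instead of tokenizing the rendered list literal:
<%-- GSP: emit the key order explicitly, comma-separated --%>
<g:hiddenField name="columns" value="${colJson}"/>
<g:hiddenField name="columnOrder" value="${columns.keySet().join(',')}"/>

// Controller: rebuild the ordered map by walking the explicit key order
def unordered = JSON.parse(params.columns)
def orderedColumns = params.columnOrder.tokenize(',').collectEntries { [(it): unordered[it]] }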

Related

Convert dict or JSON formats with nested objects into string

Now I have a string in dict format, but as far as I can tell it's JSON. It looks like this:
{
    "gid": "1201400250397201",
    "memberships": [
        "can be nested objects",
        ...
    ],
    "name": "Name of task",
    "parent": {
        "gid": "1201400250397199",
        "name": "name of parent task"
    },
    "permalink_url": "https://url...."
}
So, first question: am I right? I used dumps() from the json library but got Unicode escape sequences; loads() didn't work for me, I got the error "the JSON object must be str, bytes or bytearray, not dict".
Second question: if it's not JSON format, how can I get a more readable view? Here is what I did:
first of all I get the dict, then I print each of the dictionary's values by key:
for key in task:
    print(task[key])
output:
1201400250397201
[]
Name of task
{'gid': '1201400250397199', 'name': 'name of parent task'}
https://url....
Actually, it would be great if I could get something like this:
gid: 1201400250397201
name: Name of task
parent_name: name of parent task
etc. But I don't know how to get it :(
Next question: as you can see, for the "parent" part (penultimate line) I also get a dictionary; how can I extract it and get it into a convenient format?
Or maybe you have more convenient methods of your own?
As the error states, the object you are working with is already a dictionary. You can print it directly as JSON with json.dumps:
import json

task = {'gid': '1201400250397201', 'memberships': [{}], 'name': 'Name of task', 'parent': {'gid': '1201400250397199', 'name': 'name of parent task'}, 'permalink_url': 'https://url....'}
print(json.dumps(task, indent=4))
Setting indent=4 makes it readable and you'll get:
{
    "gid": "1201400250397201",
    "memberships": [
        {}
    ],
    "name": "Name of task",
    "parent": {
        "gid": "1201400250397199",
        "name": "name of parent task"
    },
    "permalink_url": "https://url...."
}
If you don't want unicode characters to be escaped, add the argument ensure_ascii=False:
print(json.dumps(task, indent=4, ensure_ascii=False))

Return nested JSON in AWS AppSync query

I'm quite new to AppSync (and GraphQL) in general, but I'm running into a strange issue when hooking up resolvers to our DynamoDB tables. Specifically, we have a nested Map structure for one of our item's attributes that is arbitrarily constructed (its complexity and form depend on the type of parent item), a little something like this:
"item" : {
"name": "something",
"country": "somewhere",
"data" : {
"nest-level-1a": {
"attr1a" : "foo",
"attr1b" : "bar",
"nest-level-2" : {
"attr2a": "something else",
"attr2b": [
"some list element",
"and another, for good measure"
]
}
}
},
"cardType": "someType"
}
Our accompanying GraphQL type is the following:
type Item {
    name: String!
    country: String!
    cardType: String!
    data: AWSJSON! ## note: it was originally String!
}
When we query the item we get the following response:
{
    "data": {
        "genericItemQuery": {
            "name": "info/en/usa/bra/visa",
            "country": "USA:BRA",
            "cardType": "visa",
            "data": "{\"tourist\":{\"reqs\":{\"sourceURL\":\"https://travel.state.gov/content/passports/en/country/brazil.html\",\"visaFree\":false,\"type\":\"eVisa required\",\"stayLimit\":\"30 days from date of entry\"},\"pages\":\"One page per stamp required\"}}"
        }
    }
}
The problem is we can't seem to get the Item.data field resolver to return a JSON object (even when we attach a separate field-level resolver to it on top of the general Query resolver). It always returns a String and, weirdly, if we change the expected field type to String!, the response will replace all : in data with =. We've tried everything with our response resolvers, including suggestions like How return JSON object from DynamoDB with appsync?, but we're completely stuck at this point.
Our current response resolver for our query has been reverted back to the standard response after none of the suggestions in the aforementioned post worked:
## 'Before' response mapping template on genericItemQuery query; same result as the 'After' listed below
#set($result = $ctx.result)
#set($result.data = $util.parseJson($ctx.result.data))
$util.toJson($result)

## 'After' response mapping template
$util.toJson($ctx.result)
We're trying to avoid a situation where we need to include supporting types for each nest level in data (since it changes based on parent Item type and in cases like the example I gave it can have three or four tiers), and we thought changing the schema type to AWSJSON! would do the trick. I'm beginning to worry there's no way to get around rebuilding our base schema, though. Any suggestions to the contrary would be helpful!
P.S. I've noticed in the CloudWatch logs that the appropriate JSON response exists under the context.result.data response field, but somehow there's the following transformedTemplate (which, again, I find very unusual considering we're not applying any mapping template except to transform the result into valid JSON):
"arn": ...
"transformedTemplate": "{data={tourist={reqs={sourceURL=https://travel.state.gov/content/passports/en/country/brazil.html, visaFree=false, type=eVisa required, stayLimit=30 days from date of entry}, pages=One page per stamp required}}, resIds=USA:BRA, cardType=visa, id=info/en/usa/bra/visa}",
"context": ...
Apologies for the lengthy question, but I'm stumped.
AWSJSON is a JSON string type, so you will always get back a string value (this is what your type definition must adhere to).
You could try to make a type for the data field which contains all possible fields and then resolve fields according to the parent type, or alternatively you could try to implement GraphQL interfaces.

Order of performing sorting on a big JSON object

I have a big JSON object with a list of "tickets". The schema looks like this:
{
    "Artist": "Artist1",
    "Tickets": [
        {
            "Id": 1,
            "Attr2Array": [
                {
                    "Att41": 1,
                    "Att42": "A",
                    "Att43": null
                },
                {
                    "Att41": 1,
                    "Att42": "A",
                    "Att43": null
                }
            ],
            ...
            (more properties)
            "Price": "20",
            "Description": "I m a ticket"
        },
        {
            "Id": 4,
            "Attr2Array": [
                {
                    "Att41": 1,
                    "Att42": "A",
                    "Att43": null
                },
                {
                    "Att41": 1,
                    "Att42": "A",
                    "Att43": null
                }
            ],
            ...
            (more properties)
            "Price": "30",
            "Description": "I m a ticket"
        }
    ]
}
Each item in the list has around 25-30 properties (some simple types, others complex arrays or nested objects).
I have to read the object from an API endpoint and extract only "Id" and "Description", but they need to be sorted by "Price", which is an int, for example.
In what order should I proceed with this data manipulation?
Should I take the JSON object, deserialise it into another object with just those 2 properties (which I need), and THEN perform an ascending sort on "Price"?
Please note that after I have the sorted list I will have to convert it back to a JSON list, because the front end consumes JSON after all.
What I don't like about this approach is the cycle of serialisation and deserialisation that happens.
Or
I could perform a sort on the JSON object first (using, for example, a binary/bubble sort) and then use the object to create a strongly typed (deserialised) object with just those 2 properties, and then serialise it back to pass to the front end.
I don't know how performant the bubble sort will be, and whether I will get any gain in performance at all for large chunks of data processing.
I also need to keep in mind that this implementation may need to take into account other properties like "availabilitydate", because at a later date this front end could add one more filter, like "availabilitydate" ascending.
Any help is much appreciated.
Thanks.
You can deserialize your JSON string (or file) using the Microsoft System.Web.Extensions assembly and its JavaScriptSerializer class.
First, you must have classes associated with your JSON. To create the classes, select your JSON sample data and, in Visual Studio, go to Edit / Paste Special / Paste JSON As Classes.
Next, use this sample to deserialize the JSON string into typed objects and sort all Tickets by the Price property using LINQ.
string json = System.IO.File.ReadAllText(@"C:\Data.json");
var root = new System.Web.Script.Serialization.JavaScriptSerializer().Deserialize<Rootobject>(json);
var sortedTickets = root.Tickets.OrderBy(t => int.Parse(t.Price));

Create dynamic class to accept dynamic JSON

I have a web api endpoint that receives JSON and deserializes it into objects. Very basic and common stuff. However, I have a requirement to accept custom user-defined fields. For example, a developer may want to add a custom field for "account #" and pass that along via the API. I'm stuck on how I could define a field on my class if I don't know the name. I need to support unlimited fields, so I cannot simply create fields for custom1, custom2, custom3, etc.
I would think that my JSON could look something like this, where custom_label_xxx identifies the field label:
...
"custom_fields": {
"custom_label_90": 49,
"custom_label_83": [ 28, 29, 30 ],
"custom_label_89": "2012/05/21"
},
...
How in the world can I setup a dynamic class to accept this dynamic JSON?
I have Googled forever and cannot find any examples using custom fields.
Your post method can accept dynamic as a param:
public HttpResponseMessage Post(dynamic obj)
This will accept any JSON that you send.
Saykor's reply is probably correct. However, I already have tons of business objects to work with and I only wanted the custom fields to be dynamic. Also, I wanted the Web API Help pages to be generated with sample JSON. So making the method dynamic would not work well for me.
What I finally did was probably obvious to some. I created a property on the class that was a dictionary. When I run the Help Pages, it generates sample JSON that looks exactly like what I needed. I have not tested this yet, but I assume it will deserialize the JSON into a dictionary.
Here is the property in VB, but it could easily use dynamic in C# as well.
Private _customfields As Dictionary(Of String, String)

<DataMember(EmitDefaultValue:=False, IsRequired:=False, Order:=11)> _
Public Property customfields() As Dictionary(Of String, String)
    Get
        Return Me._customfields
    End Get
    Set(ByVal value As Dictionary(Of String, String))
        Me._customfields = value
    End Set
End Property
This resulted in the following JSON:
"customfields": {
"sample string 1": "sample string 2",
"sample string 3": "sample string 4",
"sample string 5": "sample string 6"
}

JSON file manipulation with JavaScript

I am getting JSON as a response from the server, like below:
{"data":"<div align=\"left\"><select id =\"test\"><option id=\"1\" value=\"one\"><option id=\"2\" value=\"two\" selected></select></div>"};
I want to manipulate the above JSON using JavaScript so that option one is selected instead of option two.
Any hints please.
Regards,
Raj
Your life would be easier if your JSON was actually data represented as JSON, instead of serialized DOM fragments embedded in a JSON value, e.g.,
[
    { "value": "one" },
    { "value": "two", "selected": true }
]
Then when you turn that into an object, you could just do something like this (assume for the sake of the example that you named the result myArray):
myArray[0].selected = true; // Select the first element
myArray[1].selected = false; // Deselect the other element; in many cases, you'd probably need some sort of loop.