I'm quite new to AppSync (and GraphQL) in general, but I'm running into a strange issue when hooking up resolvers to our DynamoDB tables. Specifically, one of our item attributes is a nested Map structure that is arbitrarily constructed (its complexity and form depend on the type of the parent item), a little something like this:
"item" : {
"name": "something",
"country": "somewhere",
"data" : {
"nest-level-1a": {
"attr1a" : "foo",
"attr1b" : "bar",
"nest-level-2" : {
"attr2a": "something else",
"attr2b": [
"some list element",
"and another, for good measure"
]
}
}
},
"cardType": "someType"
}
Our accompanying GraphQL type is the following:
type Item {
    name: String!
    country: String!
    cardType: String!
    data: AWSJSON! ## note: it was originally String!
}
When we query the item we get the following response:
{
    "data": {
        "genericItemQuery": {
            "name": "info/en/usa/bra/visa",
            "country": "USA:BRA",
            "cardType": "visa",
            "data": "{\"tourist\":{\"reqs\":{\"sourceURL\":\"https://travel.state.gov/content/passports/en/country/brazil.html\",\"visaFree\":false,\"type\":\"eVisa required\",\"stayLimit\":\"30 days from date of entry\"},\"pages\":\"One page per stamp required\"}}"
        }
    }
}
The problem is we can't seem to get the Item.data field resolver to return a JSON object (even when we attach a separate field-level resolver to it on top of the general Query resolver). It always returns a String and, weirdly, if we change the expected field type to String!, the response will replace all : in data with =. We've tried everything with our response resolvers, including suggestions like How return JSON object from DynamoDB with appsync?, but we're completely stuck at this point.
Our current response resolver for the query has been reverted to the standard response after none of the suggestions in the aforementioned post worked:
## 'Before' response mapping template on genericItemQuery query; same result as the 'After' listed below
#set($result = $ctx.result)
#set($result.data = $util.parseJson($ctx.result.data))
$util.toJson($result)

## 'After' response mapping template
$util.toJson($ctx.result)
We're trying to avoid a situation where we need to include supporting types for each nest level in data (since it changes based on parent Item type and in cases like the example I gave it can have three or four tiers), and we thought changing the schema type to AWSJSON! would do the trick. I'm beginning to worry there's no way to get around rebuilding our base schema, though. Any suggestions to the contrary would be helpful!
P.S. I've noticed in the CloudWatch logs that the appropriate JSON response exists under the context.result.data response field, but somehow there's the following transformedTemplate (which, again, I find very unusual considering we're not applying any mapping template except to transform the result into valid JSON):
"arn": ...
"transformedTemplate": "{data={tourist={reqs={sourceURL=https://travel.state.gov/content/passports/en/country/brazil.html, visaFree=false, type=eVisa required, stayLimit=30 days from date of entry}, pages=One page per stamp required}}, resIds=USA:BRA, cardType=visa, id=info/en/usa/bra/visa}",
"context": ...
Apologies for the lengthy question, but I'm stumped.
AWSJSON is a JSON string type, so you will always get back a string value (this is what your type definition must adhere to).
You could try to make a type for the data field that contains all possible fields and then resolve the fields according to the parent type, or alternatively you could try to implement GraphQL interfaces.
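In the meantime, if the field stays AWSJSON!, the client can simply parse the delivered string; a minimal JavaScript sketch, mirroring the response shown in the question:

// The response object below mirrors the query response from the question.
const response = {
    data: {
        genericItemQuery: {
            name: "info/en/usa/bra/visa",
            country: "USA:BRA",
            cardType: "visa",
            data: "{\"tourist\":{\"reqs\":{\"visaFree\":false,\"type\":\"eVisa required\"}}}"
        }
    }
};

// AWSJSON arrives serialized, so one JSON.parse turns it into a real object.
const item = response.data.genericItemQuery;
const parsed = JSON.parse(item.data);
console.log(parsed.tourist.reqs.visaFree); // false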
I'm trying to create a JSON Schema for something very dynamic. Say I have two pieces of data, and I want one (the source) to determine the validity of the other (the target). Both can change over time, but both will always be an array of objects with known properties. For example:
source.json
[
{ "id": 23, "active": true },
{ "id": 9, "active": false },
{ "id": 6, "active": true }
]
target.json
[
{ "identifier": 6 }
]
The schema I'm trying to create is this: For each active object in the source array, there should be an equivalent object in the target array. A little more formally, given an object in the source array where "active" equals true and "id" equals x, there should be an object in the target array where "identifier" equals x.
In the example above, the target would be invalid because it's missing an object like { "identifier": 23 }.
However, I want to statically define this schema (or something capable of generating it) in a JSON file ahead of time, and this feels pretty tough since the source array can change. I'm using Ajv, and I'm aware that it supports the $data reference, but I'm not sure that's enough to help me here. The other option I could see is creating some kind of schema-generator definition? In concept, it too would be a JSON object I define ahead of time, but at runtime it would be used to safely generate arbitrary schemas based on runtime data such as the source array. However, if a mechanism like this doesn't already exist, trying to implement it myself sounds like a great way to give myself a code-injection vulnerability.
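To make that idea concrete, here's a minimal sketch of such a generator (the name schemaFromSource is mine, not an Ajv feature). It builds the schema out of plain data values, so there's no eval and no injection surface:

const Ajv = require("ajv");

// Hypothetical generator: one "contains" subschema per active source item.
function schemaFromSource(source) {
    const required = source
        .filter(item => item.active)
        .map(item => ({
            contains: {
                type: "object",
                properties: { identifier: { const: item.id } },
                required: ["identifier"]
            }
        }));
    // allOf must be a non-empty array, so fall back to a bare array schema
    return required.length ? { type: "array", allOf: required } : { type: "array" };
}

const source = [
    { id: 23, active: true },
    { id: 9, active: false },
    { id: 6, active: true }
];
const target = [{ identifier: 6 }];

const ajv = new Ajv();
console.log(ajv.validate(schemaFromSource(source), target)); // false: no { "identifier": 23 }

Because the schema is assembled from data values rather than interpolated strings, a malicious source item can at worst make validation fail, not execute code.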
Thanks for your time!
My previous problem was that I was unable to arrange the JSON structure the way I wanted. I found some answers that look like they almost satisfy my needs, but unfortunately I don't know whether they work, because another problem has occurred.
Below, I arranged my own JSON data based on the JSON structure suggested by someone named Programmer.
{
"dialog_type": {"human": {"inner": "He is so scary"}}
}
Here I have a key called "human". There are two keys in my data: the first is "human" and the second is "non_human". Now, if I have two entries in my JSON file, it becomes like this:
{
"dialog_type": {"human": {"inner": "He is so scary"}}
},
{
"dialog_type": {"non_human": "Once upon a time..."}
}
This case may be similar to one someone asked about here. But unfortunately I have no idea whether it's possible to do that in Unity. I want to make a method like the one in this answer, so I can determine what action to take by comparing those keys.
Now the question is: how do I get the key name as a string from my JSON data using C#?
To access the property names of a Unity javascript object, you can use:
for(var property in obj) {}
For instance, this will log all keys (i.e. property names) of all the property key-value pairs in a Unity javascript object (e.g. "key1" and "key2"):
function Start () {
    var testObject = {
        "key1": "value 1",
        "key2": "value 2"
    };
    for (var property in testObject) {
        Debug.Log(property.Key);
    }
}
That should give you a way to check objects for any matching property names you are interested in.
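Outside Unity, the same branching idea in plain JavaScript, using the structures from the question, would look like this:

// Inspect which key is present under "dialog_type" and branch on it.
var entry = { "dialog_type": { "non_human": "Once upon a time..." } };

for (var key in entry["dialog_type"]) {
    if (key === "human") {
        console.log("human dialog: " + entry["dialog_type"][key]["inner"]);
    } else if (key === "non_human") {
        console.log("narration: " + entry["dialog_type"][key]);
    }
}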
Is there another way to track what gets unmarshalled than writing my own reverter for each field?
I'm updating my local data based on a JSON message, and my (simplified) problem is this:
I'm expecting JSON like
{ "items": [ { "id":1, "name":"foobar", "price":"12.34" } ] }
which is then unmarshaled to class TItems by
UnMarshaller.TryCreateObject( TItems, TJsonObject( OneJsonElement ), TargetItem )
My problem is that I can't tell the difference between
{ "items": [ { "id":1, "name":"", "price":"12.34" } ] }
and
{ "items": [ { "id":1, "price":"12.34" } ] }
In both cases name is blank, and I'd like to update only those fields that are passed in the JSON message. Of course I could create a reverter for each field, but there are plenty of fields and messages, so that would get quite huge.
I tried to look at the REST.Jsonreflect.pas source, but couldn't make sense of it.
I'm using Delphi 10.
In the Rest.Json unit there is a TJson class that offers several convenience methods, like converting objects to JSON and vice versa. Specifically, it has a class function JsonToObject where you can specify options, for example to ignore empty strings or to ignore empty arrays. I think the TJson class can serve you well. For unmarshalling complex business objects you will have to write custom converters, though.
Actually, my problem turned out to be simple to solve.
Instead of using TJSONUnMarshal.TryCreateObject, I now use TJSONUnMarshal.CreateObject. The former declares its object parameter with the out modifier, while CreateObject declares it with var, so I was able to create the object, initialize it from the database, and pass it to CreateObject, which then modifies only the fields present in the JSON message.
tl;dr
I need to send the data from a LinkedHashMap in a GSP template to a Controller and preserve the order of the elements.
I'm assuming a structured data format like JSON is the ideal way to do this, but Grails' JSON converter doesn't create an ordered JSON object from a LinkedHashMap.
What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data?
Long version
I'm developing a taglib to render search results in a table.
In the taglib, I construct a LinkedHashMap that specifies the data columns and the labels that the user wants to show for the column names. For example:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
That map gets sent to a view, which will then send it back to a controller to retrieve the search results from the database. I need to preserve the order of the elements (hence the use of a LinkedHashMap).
My first thought was to turn the LinkedHashMap into a JSON string, and then send it to the controller via a hidden form element. So,
import grails.converters.JSON
//taglib class and other code
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"] as JSON
However, that creates a JSON Object like this in the HTML. I'm putting this in a hidden field's value attribute.
<input type="hidden" name="columns" value="{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}" id="columns">
Here's the JSON object by itself.
{"firstName": "First Name", "lastName": "Surname", "unique_id": "Your Whizbang ID"}
You can see that the JSON string's properties are in the same order as the LinkedHashMap. However, JSON objects aren't really supposed to preserve the order of their properties. Thus, when my controller receives the columns parameter and I use the JSON.parse() method on it, it creates a plain ol' unordered HashMap instead of a LinkedHashMap. As a result, the columns in my search results display in the wrong order when I render them into an HTML table.
At least one fellow has had a similar problem. Adding as LinkedHashMap after running JSON.parse() doesn't cut it, since the .parse() method screws up the order from the get-go.
Daniel Woods, in his response to the above post, noted:
If it's a matter of the grails data binder not working for you, you should be able to override the implicit property setter to cast the object to your favorite Map implementation.
I assume that he's saying I could write my own parser, which would honor the order of the JSON elements (even though it technically shouldn't). I imagine I could also write my own converter so that the resulting JSON element would be something like:
[{firstName: "First Name"}, {lastName: "Surname"}, {unique_id: "Your Whizbang ID"}]
I'm just about terrified of how the JSON parser would handle that, though. Would I get back a list of HashMaps?
Again, my real question is What is the best way to send a LinkedHashMap data structure from a GSP to a Controller so that I can preserve order, but do minimal work in parsing the data? I'm assuming that's JSON, but I'm more than happy to be told, "Why not just..."
I think the issue is a mismatch between the nature of Java/Groovy collections and the simple "it's a list or it's a map" nature of JSON. Without getting into custom parsing, I'd suggest shifting what you're sending a bit. Instead of trying to force Groovy notions of a LinkedHashMap into Javascriptland, maybe stick to an idiom Javascript understands, such as a list of maps.
In code, instead of:
def tableFields = [firstName: "First Name", lastName: "Surname", unique_id: "Your Whizbang ID"]
how about:
List tableFields = [
[ name: 'firstName', label: 'First Name' ],
[ name: 'lastName', label: 'Surname' ],
[ name: 'unique_id', label: 'Your Whizbang ID' ],
]
This shifts you to JSON that'd maintain the data (I think) you need while giving JSON something it understands is ordered (a list):
<input type="hidden" name="columns" id="columns" value="[
{ "name": "firstName", "label": "First Name" },
{ "name": "lastName", "label": "Surname" },
{ "name": "unique_id", "label": "Your Whizbang ID" }
]" />
Whatever handles this will be a slightly deeper iterator, but that's the price of going from a land of good collections to simpler types...
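The property being relied on here is that a JSON array is ordered by definition, so it survives the round trip intact; a quick check in any JavaScript console:

var cols = JSON.parse('[{"name":"firstName","label":"First Name"},{"name":"lastName","label":"Surname"}]');
console.log(cols.map(function(c){ return c.name; })); // ["firstName", "lastName"], order preserved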
What I'm doing for now is passing both the current JSON object and a list that I can iterate through. In the GSP template, this looks like:
<g:hiddenField name="columns" value="${colJson}"/>
<g:hiddenField name="columnOrder" value="${columns.collect{it.key}}"/>
where columns is the LinkedHashMap.
Then, in the controller that gets those params, I do this:
// recover the key order from the stringified list, e.g. "[firstName, lastName, unique_id]"
def columnTitles = params.columnOrder.tokenize(",[] ")
// parse the JSON map (order not guaranteed)
def unorderedColumns = JSON.parse(params.columns)
// rebuild an ordered map by walking the keys in their original order
def columns = columnTitles.collectEntries{ [(it): unorderedColumns[it]] }
Not elegant, but it does work, and it requires a bit less refactoring than Joe Rinehart's suggestion.
Consider that you get this JSON object:
{ id: 3, name: 'C' }
How can you tell whether it's a Vitamin object or a Programming Language object? Am I clear?
In typed languages, we simply understand the nature of an object (its type) from its very name. Each object has a name. How might we achieve something similar with JSON? How can we give names to JSON objects? Is there any established pattern?
If you have this problem it means that your serialization format is not properly defined. You should always be able to deserialize from a JSON object without any trouble.
There are two options:
Add a type field: You can't create the typed object because there are multiple types with the same set of field names. By adding a type field, there is no question about the underlying type of the object. With this option, you will have to go through the process of handling the type field when creating the real object (see the sketch after these options).
Use context to decide what type the object is: If the query you submitted implies that only one type of object should be returned, then there is no problem. Alternately, if multiple types can be returned, the response format should group the objects by type.
{
    "states": [ { "id": 3, "name": "New York" } ],
    "cities": [ { "id": 4, "name": "New York" } ]
}
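And a sketch of the first option, branching on a hypothetical type field before building the real object:

var fromServer = { "type": "vitamin", "id": 3, "name": "C" };

// The discriminator settles the question the plain payload couldn't answer.
if (fromServer.type === "vitamin") {
    console.log("Vitamin " + fromServer.name);
} else if (fromServer.type === "language") {
    console.log("Programming language " + fromServer.name);
}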
You have to add a field describing the type of the object. That way you can never have doubts about the type. Look, for example, at Google's Calendar API: Google's calendar resources all have a "kind" field describing the type.
A Calendar Event looks like:
{
"kind": "calendar#event",
"etag": etag,
"id": string,
"created": datetime,
"updated": datetime,
"summary": string,
...
}
A Calendar List Entry like:
{
"kind": "calendar#calendarListEntry",
"etag": etag,
"id": string,
"summary": string,
"description": string,
"location": string,
...
}
etc.
There's no way to do it for JSON fetched from somewhere else.
If you have control over the JSON, then you can do this:
Add a "type" field to each object.
Tailor-make a JSON function to handle this. This can be done in two ways, one secure, one insecure.
Secure method
Create a function stringifyJSONType(). This one stringifies as usual, but adds a type parameter on-the-fly.
function stringifyJSONType(o){
    o.type = o.constructor.name;
    var s = JSON.stringify(o);
    delete o.type; // To be clean and not modify the object.
    return s;
}
Now, in the "secure" method, we have to create a switch-case for every type we expect for parsing. This only allows certain types (those which have been kept in the switch-case).
function parseJSONType(s){
    var o = JSON.parse(s);
    switch(o.type){
        case "String":
            o.__proto__ = String.prototype;
            break;
        case "Date":
            o.__proto__ = Date.prototype;
            break;
        case "City": // Your custom object
            o.__proto__ = City.prototype;
            break;
        case "State": // Your custom object
            o.__proto__ = State.prototype;
            break;
        case "Country": // Your custom object
            o.__proto__ = Country.prototype;
            break;
        /* .... more stuff ... */
        case "Object":
        default:
            o.__proto__ = Object.prototype;
            break;
    }
    delete o.type;
    return o;
}
Now, use these two methods just like JSON.parse() and JSON.stringify(), and it'll work. But for every new type you want to support, you'll have to add an extra case.
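For instance, with a hypothetical City constructor the round trip looks like this:

function City(name){ this.name = name; }

var s = stringifyJSONType(new City("Oslo")); // '{"name":"Oslo","type":"City"}'
var c = parseJSONType(s);

console.log(c instanceof City); // true, the prototype was restored
console.log(c.name);            // "Oslo"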
Insecure method
Not too insecure, just that it uses the nefarious eval() method, which isn't great. As long as nobody else has the ability to add a custom type parameter to your JSON, though, it's OK.
Here, you use the same stringifyJSONType() as above, but use a different parse method.
function stringifyJSONType(o){
    o.type = o.constructor.name;
    var s = JSON.stringify(o);
    delete o.type; // To be clean and not modify the object.
    return s;
}
function parseJSONType(s){
    var o = JSON.parse(s);
    o.__proto__ = eval(o.type).prototype; // resolve the constructor by name
    delete o.type;
    return o;
}
This has the advantage of not requiring switch-case and being easily extended to new types (no code changes required).