Say I have an array of structures in JSON like this:
[{
"name": "Amrit",
"second_name": "Valentine"
},
{
"name": "Beatriz",
"second_name": "Carty"
}// And so on...
]
It is human readable, but the issue is that "name" and "second_name" are repeated for every entity in the array. And if the array is big, that creates about 40% size overhead.
On the other hand, I could translate it into this structure:
{
"names": ["Amrit", "Beatriz" /* ... */],
"second_names": ["Valentine", "Carty" /* ... */]
}
This is no longer human readable, but the size is optimal. It is also error prone, since nothing guarantees that "names" and "second_names" have the same length.
Is there a trade-off in JSON that lets me use an array of structures without repeating the field names, while keeping the size optimal?
Not as part of JSON itself, no. What I've done on projects is a generic system where the JSON would look like this:
{
"__keys__": ["name", "second_name"],
"values": [
["Amrit", "Valentine"],
["Beatriz", "Carty"]
]
}
...where once I've parsed the JSON, I throw a utility function at it to consume that and turn it into an array of objects. Along these lines:
const json = `{
"__keys__": ["name", "second_name"],
"values": [
["Amrit", "Valentine"],
["Beatriz", "Carty"]
]
}`;
const parsed = JSON.parse(json);
const expanded = expand(parsed);
console.log(expanded);
function expand(data) {
  const keys = data.__keys__;
  // Build one object per row, pairing each key with the value at the same index
  return data.values.map(entry => {
    const obj = {};
    keys.forEach((key, index) => {
      obj[key] = entry[index];
    });
    return obj;
  });
}
Or, of course, you just leave off __keys__ and assume your endpoint knows what the keys should be, but it impairs readability/debugging even more than the above does. :-)
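For illustration only (this is an assumption, not part of the original scheme), that keys-less variant might look like this, with the key order agreed out of band:
// Hypothetical: the key order is agreed between producer and consumer
const KNOWN_KEYS = ["name", "second_name"];

const expandedWithoutKeys = parsed.values.map(row =>
  Object.fromEntries(KNOWN_KEYS.map((key, index) => [key, row[index]]))
);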
(You can shoehorn that forEach into a reduce, because you can shoehorn just about any array operation into a reduce, but it doesn't buy you anything.)
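For completeness, here is a rough sketch of the reverse operation, packing an array of objects back into the compact form; it assumes every object shares the keys of the first one:
function pack(objects) {
  // Take the key set from the first object; assumes all objects share it
  const keys = Object.keys(objects[0]);
  return {
    __keys__: keys,
    values: objects.map(obj => keys.map(key => obj[key]))
  };
}

console.log(pack(expanded)); // round-trips back to the { __keys__, values } shape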
So I have two JSON files with different fields.
{
  "password": { "length": 5, "reset": true },
  "dataSettings": {
    "enabled": true,
    "algorithm": { "djikstra": true },
    "country": { "states": { "USA": true, "Romania": false } }
  }
}
I want to be able to use the same code to print out all the nested fields and their values in the JSON.
I tried using a [JSON to Dart converter package](https://javiercbk.github.io/json_to_dart/).
However, it seems like using this would force me to hardcode all the field access, since I would retrieve a value by doing
item.dataSettings.country.states.USA
which is hardcoded. Instead, I want a way to loop through all the nested values and print them without having to write each access out myself.
You can use `dart:convert` to convert the JSON string into an object of type Map<String, dynamic>:
import 'dart:convert';

final jsonData = '{"someKey": "someValue"}';
final parsedJson = jsonDecode(jsonData);
You can then iterate over this map like any other:
parsedJson.forEach((key, value) {
  // ... Do something with the key and value
});
For your use case of listing all keys and values, a recursive implementation is probably the easiest:
void printMapContent(Map<String, dynamic> map) {
  map.forEach((key, value) {
    print("Key: $key");
    if (value is Map<String, dynamic>) {
      // Recursive call for nested objects
      printMapContent(value);
    } else {
      // Leaf value (String, num, bool, ...)
      print("Value: $value");
    }
  });
}
But be aware that this type of recursive JSON parsing is generally not recommended, because it is very unstructured and prone to errors. You should know what the data structure coming from the backend looks like and parse that data into well-structured objects.
There you can also perform input validation and verify the data is reasonable.
You can read up on the topic of "JSON parsing in Dart", e.g. in this blog article.
I am trying to import JSON data that might include present or absent mappings within one of the properties, and I figured the correct data type to represent this was Map<string, number>, but I'm getting an error when I try to do this.
My JSON file, data.json, looks like this:
{
  "datas": [
    {
      "name": "test1",
      "config": "origin",
      "entries": {
        "red": 1,
        "green": 2
      }
    },
    {
      "name": "test2",
      "config": "remote",
      "entries": {
        "red": 1,
        "blue": 3
      }
    },
    {
      "name": "test3",
      "entries": {
        "red": 1,
        "blue": 3,
        "purple": 3
      }
    }
  ]
}
My TypeScript code, Data.ts, which attempts to read it, looks like this:
import data from './data.json';
export class Data {
public name:string;
public config:string;
public entries:Map<string, number>;
constructor(
name:string,
entries:Map<string, number>,
config?:string
) {
this.name = name;
this.entries = entries;
this.config = config ?? "origin";
}
}
export class DataManager {
public datas:Data[] = data.datas;
}
But that last line (public datas:Data[] = data.datas;) is throwing an error.
Is there a proper way to import data like this?
The goal, ultimately, is to achieve three things:
Any time entries is present, it should receive some validation that it only contains properties of type number; what those properties are is unknown to the programmer, but will be relevant to the end-user.
If config is absent in the JSON file, the construction of Data objects should supply a default value (here, it's "origin")
This assignment of the data should occur with as little boilerplate code as possible. If, down the line Data is updated to have a new property (and Data.json might or might not receive updates to its data to correspond), I don't want to have to change how DataManager.data receives the values
Is this possible, and if so, what's the correct way to write code that will do this?
The lightest weight approach to this would not be to create or use classes for your data. You can instead use plain JavaScript objects, and just describe their types strongly enough for your use cases. So instead of a Data class, you can have an interface, and instead of using instances of the Map class with string-valued keys, you can just use a plain object with a string index signature to represent the type of data you already have:
interface Data {
name: string;
config: string;
entries: { [k: string]: number }
}
To make a valid Data, you don't need to use new anywhere; just make an object literal with name, config, and entries properties of the right types. The entries property is { [k: string]: number }, which means that you don't know or care what the keys are (other than the fact that they are strings as opposed to symbols), but the property values at those keys should be numbers.
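For example (reusing values from the JSON above), this object literal is already a valid Data:
const example: Data = {
  name: "test1",
  config: "origin",
  entries: { red: 1, green: 2 }
};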
Armed with that definition, let's convert data.datas to Data[] in a way that meets your three criteria:
const datas: Data[] = data.datas.map(d => ({
config: "origin", // any default values you want
...d, // the object
entries: onlyNumberValues(d.entries ?? {}) // filter out non-numeric entries
}));
function onlyNumberValues(x: { [k: string]: unknown }): { [k: string]: number } {
return Object.fromEntries(
Object.entries(x).filter(
(kv): kv is [string, number] => typeof kv[1] === "number"
)
);
}
The above sets the entries property to be a filtered version of the entries property in the incoming data, if it exists. (If entries does not exist, we use an empty object {}). The filter function is onlyNumberValues(), which breaks the object into its entries via the Object.entries() method, filters these entries with a user-defined type guard function, and packages them back into an object via the Object.fromEntries() method. The details of this function's implementation can be changed, but the idea is that you perform whatever validation/transformation you need here.
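As a quick illustration with a made-up input, any non-numeric entry is simply dropped:
// Hypothetical input; "note" has a string value, so it is filtered out
console.log(onlyNumberValues({ red: 1, note: "hello" }));
// logs { red: 1 }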
Any required property that may be absent in the JSON file should be given a default value. We do this by creating an object literal that starts with these default properties, after which we spread in the properties from the JSON object. We do this with the config property above. If the JSON object has a config property, it will overwrite the default when spread in. (At the very end we add in the entries property explicitly, to overwrite the value in the object with the filtered version).
Because we've spread the JSON object in, any properties added to the JSON object will automatically be added. Just remember to specify any defaults for these new properties, if they are required.
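For instance, if Data later gained a hypothetical required label property, you would add label: string to the interface and a default for it here, and nothing else would need to change:
const datas: Data[] = data.datas.map(d => ({
  config: "origin",
  label: "unlabeled",                        // default for the hypothetical new property
  ...d,                                      // properties present in data.json win
  entries: onlyNumberValues(d.entries ?? {}) // still filtered as before
}));
The output below uses the original shape, without this hypothetical addition.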
Let's make sure this works as desired:
console.log(datas)
/* [{
"config": "origin",
"name": "test1",
"entries": {
"red": 1,
"green": 2
}
}, {
"config": "remote",
"name": "test2",
"entries": {
"red": 1,
"blue": 3
}
}, {
"config": "origin",
"name": "test3",
"entries": {
"red": 1,
"blue": 3,
"purple": 3
}
}] */
Looks good.
Playground link to code
An XML document is converted to JSON, sent to an Event Hub, and then processed by a Stream Analytics job.
The problem is that when the XML repeats a tag name, it is converted to a list on the JSON side, but when there is only one occurrence of the tag it is not converted to a list. So the same tag can arrive either as an array or as a single object.
Ex:
I can receive either:
{
"k1": 123,
"k2": {
"l1": 2,
"l2": 12
}
}
or:
{
"k1": 123,
"k2": [
{
"l1": 2,
"l2": 12
},
{
"l1": 3,
"l2": 34
}
]
}
I can easily deal with the first scenario and the second scenario independently, but I don't know how to deal with both at the same time. Is this possible?
Yes, it is. Since you already know how to deal with each case individually, I will just suggest how you can distinguish between the two cases before you treat them separately.
Essentially, the idea is to check whether the field is an array. I wrote a JavaScript UDF that returns "true"/"false" depending on whether the passed object is an array:
function UDFSample(arg1) {
'use strict';
var isArray = Array.isArray(arg1);
return isArray.toString();
}
Here is how you can use it in the query, assuming the UDF is registered under the alias IsArray:
with test as (SELECT Document from input where UDF.IsArray(k2) = 'true')
Now "test" contains the items where k2 is an array, and you can treat them accordingly. You can do the same for the case where k2 is just an object.
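A variation on the same idea, sketched here as an assumption rather than something from the original setup, is a UDF that always returns an array, so that both shapes of k2 can be handled by a single downstream query:
// Hypothetical UDF, registered e.g. under the alias ToArray
function UDFToArray(arg1) {
  'use strict';
  if (arg1 === null || arg1 === undefined) {
    return [];
  }
  // Wrap a lone object so both shapes look the same downstream
  return Array.isArray(arg1) ? arg1 : [arg1];
}
The query could then, for example, CROSS APPLY GetArrayElements() over UDF.ToArray(k2) to flatten every event the same way.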
I have a REST API to test and I have to compare two JSON responses. Below you can find the structure of the file. Both files to compare should contain the same elements, but the order might be different. Unfortunately, the names, the types (simple, array), and the number of keys (root, nodeXYZ) are not known in advance either.
{"root": [{
  "node1": "value1",
  "node2": "value1",
  "node3": [
    {
      "node311": "value311",
      "node312": "value312"
    },
    {
      "node321": "value321",
      "node322": "value322"
    }
  ],
  "node4": [
    {
      "node411": "value411",
      "node412": "value413",
      "node413": [{
        "node4131": "value4131",
        "node4132": "value4131"
      }],
      "node414": []
    },
    {
      "node421": "value421",
      "node422": "value422",
      "node423": [{
        "node4231": "value4231",
        "node4232": "value4231"
      }],
      "node424": []
    }
  ],
  "node5": [
    {"node51": "value51"},
    {"node52": "value52"}
  ]
}]}
I have found some useful information in
Groovy - compare two JSON objects (same structure) and return ArrayList containing differences
Getting node from Json Response
Groovy : how do i search json with key's value and find its children in groovy
but I could not combine them into a solution.
I thought the solution might look like this:
take root
get root children names
check if child has children and get their names
do it down to the lowest-level child
With all the names in place, comparing should be easy (I guess).
Unfortunately, I did not manage to get the keys under root.
Just compare the slurped maps:
import groovy.json.JsonSlurper

def map1 = new JsonSlurper().parseText(document1)
def map2 = new JsonSlurper().parseText(document2)
assert map1 == map2
Try the JSONassert library: https://github.com/skyscreamer/JSONassert. Then you can use:
JSONAssert.assertEquals(expectedJson, actualJson, JSONCompareMode.STRICT)
And you will get nicely formatted deltas like:
java.lang.AssertionError: Resources.DbRdsLiferayInstance.Properties.KmsKeyId
Expected: kms-key-2
got: kms-key
I know I can customize the JSON response by registering JSON marshallers for domain entities, and I can even create named configurations for different responses.
This is done by filling a map that will later be marshalled, like:
JSON.registerObjectMarshaller(myDomain) {
def returnArray = [:]
returnArray['id'] = it.id
returnArray['name'] = it.name
returnArray['price'] = it.price
return returnArray
}
What I want is to alter the way it gets marshalled so the response has two sections, like:
{
  "paging": {
    "total": 100
  },
  "data": [
    {
      "id": 1,
      "description": "description 1"
    },
    ...
  ]
}
I assume I have to implement a custom JSON marshaller, but I don't know how to use it for a specific response instead of application-wide.
EDIT: I assume I'll need a custom renderer apart from the marshaller. It is the renderer that I don't know how to use for a specific response.
What about a simple:
def json = new JSON([ paging: [ total: myArray.totalCount ], data: myArray ])
Your domain objects will be converted with the marshaller you have set up, while your paging data will simply be transformed into JSON.