I'm having a problem iterating over my JSON in TypeScript, specifically over one field: tribe. For some reason I can't iterate over that one. In the debugger I expect "Orc" to show up, but instead I get 0. Why is this, and how do I iterate correctly over my tribe data?
// Maps a profession or tribe group name to a bucket of characters
let professionMap = new Map<string, Character[]>()
let tribeMap = new Map<string, Character[]>()
let herolistJson = require('./data/HeroList.json')

for (let hero of herolistJson) {
    // Certain characters can have more than one tribe
    // !!!!! The trouble begins here, tribe is 0???
    for (let tribe in hero.tribe) {
        let tribeBucket = tribeMap.get(tribe) as Character[]
        // If the hero does not already exist in this tribe bucket, add it
        if (tribeBucket.find(x => x.name == hero.name) === undefined) {
            tribeBucket.push(new Character(hero.name, hero.tribe, hero.profession, hero.cost))
        }
    }
}
My JSON file looks like this:
[
    {
        "name": "Axe",
        "tribe": ["Orc"],
        "profession": "Warrior",
        "cost": 1
    },
    {
        "name": "Enchantress",
        "tribe": ["Beast"],
        "profession": "Druid",
        "cost": 1
    }
]
for...in iterates over the keys of an object, not the values. The keys of an array are its indices, which is why you see 0 instead of "Orc". If you use for...of instead, you'll use the newer iterator protocol, and an array's iterator provides values instead of keys.
for (let tribe of /* in */ hero.tribe) {
Note that this won't work in IE 11, but it will work in most other browsers as well as in many JS environments that are ES2015-compatible. kangax/compat has a partial list.
Change the "in" to "of" in the second loop.
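As a minimal sketch of the difference, using the same data shape as HeroList.json:

// for...in yields the array's keys (its indices), for...of yields its values
const hero = { name: "Axe", tribe: ["Orc"], profession: "Warrior", cost: 1 }

for (const tribe in hero.tribe) {
    console.log(tribe)   // "0" - the index, as a string
}

for (const tribe of hero.tribe) {
    console.log(tribe)   // "Orc" - the value
}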
I'm trying to pull the first value from a parsed JSON string using the JsonConverter library.
What I currently have is:
result = objHTTP.responseText
Set parsedResult = JsonConverter.ParseJson(result)

i = 3
For Each Item In parsedResult("From")
    wsSheet0.Cells(i, 12) = parsedResult("From")(Item)("Price")
    i = i + 1
Next

With parsedResult("From")
    wsSheet0.Cells(11, 12) = parsedResult("From")("Chocolate")("Price")("AsAtDate")
End With
The last line of this gets a type mismatch error, so I'm still trying to figure out how to just pull that one line item.
For the sake of clarity, the parsed JSON string looks like:
{
    "From": {
        "Chocolate": {
            "Price": 1.0,
            "AsAtDate": "2018-05-04T00:00:00"
        },
        "Lime": {
            "Price": 1.35415115,
            "AsAtDate": "2018-05-04T00:00:00"
        },
        "Strawberry": {
            "Price": 1.19517151,
            "AsAtDate": "2018-05-04T00:00:00"
        },
        "Vanilla": {
            "Price": 0.77522986,
            "AsAtDate": "2018-05-04T00:00:00"
        },
        "Blueberry": {
            "Price": 1.00084071,
            "AsAtDate": "2018-05-04T00:00:00"
        },
        "Lemon": {
            "Price": 0.75030012,
            "AsAtDate": "2018-05-04T00:00:00"
        }
    },
    "To": "Chocolate",
    "RequestedDate": "2018-05-22T08:26:16"
}
"Price" is a Double, not a nested object, so appending ("AsAtDate") to it is what causes the type mismatch; "AsAtDate" is a sibling of "Price". Use
parsedResult("From")("Chocolate")("AsAtDate")
Or, more generally, to get all of them inside your loop:
parsedResult("From")(Item)("AsAtDate")
Ok I've got a list of objects, pretty standard.
const list = Immutable.List([{type:'thang',data:{id:'pants'}}]);
Now I want to change pants to shorts... so I'm thinking
list.setIn([0,'data','id'],'shorts');
Alas
Error: invalid keyPath
How is this done?
I can't even get this far despite messing around with this for a while :/ Once I know how to do this I'd like to know how to add a new element at a position
list.setIn([0,'data','length'],'short');
To add a new length attribute to the data object at position 0 in the list.
My bad. I was going wrong with the creation of the Immutable structure. If I change
const list = Immutable.List([{type:'thang',data:{id:'pants'}}]);
To
const list = Immutable.fromJS([{type:'thang',data:{id:'pants'}}]);
Then I can
list.setIn([0,'data','id'],'shorts');
Our nested structured data:
const state = {
    Persons: [
        {
            fname: 'J.R.R',
            lname: 'Tolkien',
        },
        {
            fname: 'jack',
            lname: 'London',
        }
    ]
};
Requiring Immutable:
const { fromJS } = require('immutable')
Turning the plain object into an Immutable structure:
const stateMapped = fromJS(state);
Getting data from the nested structure:
console.log(stateMapped.getIn(['Persons', 0, 'fname'])) // output: J.R.R
Setting data in the nested structure:
var objClone = stateMapped.setIn(['Persons', 0, 'fname'], 'John Ronald Reuel');
console.log('' + objClone.getIn(['Persons', 0, 'fname'])); // output: John Ronald Reuel
I'm trying to take data from a JSON file and link it to my GeoJSON file to create a choropleth map, with the county colours bound to the "amount" value. I would also like the corresponding "comment" value to be bound to a div when I mouse over that county.
My code at http://bl.ocks.org/eoiny/6244102 will work to generate a choropleth map when my counties.json data is in the form:
"Carlow":3,"Cavan":4,"Clare":5,"Cork":3,
But things get tricky when I try to use the following form:
{
    "id": "Carlow",
    "amount": 11,
    "comment": "The figures for Carlow show a something."
},
I can't get my head around how to join the "id": "Carlow" from counties.json with the "id": "Carlow" path created from ireland.json, while at the same time having access to the other values in counties.json, i.e. "amount" and "comment".
Apologies for my inarticulate question but if anyone could point me to an example or reference I could look up that would be great.
I would preprocess the data when it's loaded to make lookup easier in your quantize function. Basically, replace this: data = json; with this:
data = json.reduce(function(result, county) {
    result[county.id] = county;
    return result;
}, {});
and then in your quantize function, you get at the amounts like this:
function quantize(d) {
    return "q" + Math.min(8, ~~(data[d.id].amount * 9 / 12)) + "-9";
}
What the preprocessing does is turn this array (easily accessed by index):
[{id: 'xyz', ...}, {id: 'pdq', ...}, ...]
into this object with county keys (easily accessed by county id):
{'xyz': {id: 'xyz', ...}, 'pdq': {id: 'pdq', ...}, ...}
Here's the working gist: http://bl.ocks.org/rwaldin/6244803
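For the mouseover part of the question, a rough sketch under the same assumptions (a div with id "comment" exists, `svg` and `data` are the path selection and the id-keyed object built by the reduce() above, and each path's bound datum d carries the county id used in quantize):

// Illustrative only: show the county's comment in a div on mouseover, clear it on mouseout
svg.selectAll("path")
    .on("mouseover", function(d) {
        d3.select("#comment").text(data[d.id].comment);
    })
    .on("mouseout", function() {
        d3.select("#comment").text("");
    });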
I see a lot of references to "compressed JSON" when it comes to different serialization formats. What exactly is it? Is it just gzipped JSON or something else?
Compressed JSON removes the repeated key:value pairing of JSON's encoding and stores the keys and values in separate, parallel arrays:
// uncompressed
JSON = {
    data : [
        { field1 : 'data1', field2 : 'data2', field3 : 'data3' },
        { field1 : 'data4', field2 : 'data5', field3 : 'data6' },
        .....
    ]
};

// compressed
JSON = {
    data : [ 'data1', 'data2', 'data3', 'data4', 'data5', 'data6' ],
    keys : [ 'field1', 'field2', 'field3' ]
};
I found this method of usage here.
Content from the link (http://www.nwhite.net/?p=242):
I rarely find myself in a place where I am writing JavaScript applications that use AJAX in its pure form. I have long abandoned the ‘X’ and replaced it with ‘J’ (JSON). When working with JavaScript, it just makes sense to return JSON. A smaller footprint, easier parsing and a simpler structure are all advantages I have gained since using JSON.
In a recent project I found myself unhappy with the large size of my result sets. The data I was returning was tabular data, in the form of objects for each row. I was returning a result set of 50, with 19 fields each. What I realized is that if I augmented my result set I could get a form of compression.
// uncompressed
JSON = {
    data : [
        { field1 : 'data1', field2 : 'data2', field3 : 'data3' },
        { field1 : 'data4', field2 : 'data5', field3 : 'data6' },
        .....
    ]
};

// compressed
JSON = {
    data : [ 'data1', 'data2', 'data3', 'data4', 'data5', 'data6' ],
    keys : [ 'field1', 'field2', 'field3' ]
};
I merged all my values into a single array and stored all my fields in a separate array. Returning a key/value pair for each result cost me 8,800 bytes (8.6 KB). Ripping the fields out and putting them in a separate array cost me 186 bytes. Total savings: 8.4 KB.
Now I have a much more compressed JSON file, but the structure is different and harder to work with. So I implemented a solution in MooTools to make the decompression transparent.
Request.JSON.extend({
    options : {
        inflate : []
    }
});
Request.JSON.implement({
    success : function(text){
        this.response.json = JSON.decode(text, this.options.secure);
        if(this.options.inflate.length){
            this.options.inflate.each(function(rule){
                // Store the expanded rows under rule.store if given, otherwise replace rule.data
                var key = ($defined(rule.store)) ? rule.store : rule.data;
                this.response.json[key] = this.expandData(this.response.json[rule.data], this.response.json[rule.keys]);
            }, this);
        }
        this.onSuccess(this.response.json, text);
    },
    expandData : function(data, keys){
        var arr = [];
        var len = data.length, klen = keys.length;
        var start = 0, stop = klen;
        // Walk the flat value array one row (klen values) at a time
        while(stop <= len){
            arr.push( data.slice(start, stop).associate(keys) );
            start = stop; stop += klen;
        }
        return arr;
    }
});
Request.JSON now has an inflate option. You can inflate multiple segments of your JSON object if you so desire.
Usage:
new Request.JSON({
    url : 'url',
    inflate : [{ 'keys' : 'fields', 'data' : 'data' }],
    onComplete : function(json){}
});
Pass as many inflate objects as you like to the inflate option array. Each has an optional property called ‘store’. If set, the inflated data set will be stored under that key instead.
The ‘keys’ and ‘data’ entries expect strings that match a location in the root of your JSON object.
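The inflate step above is MooTools-specific; as a rough, framework-free sketch of the same expansion in TypeScript (the names here are illustrative, not from the original post):

// Rebuild row objects from the parallel 'keys' and 'data' arrays
type Row = Record<string, unknown>;

function expandData(data: unknown[], keys: string[]): Row[] {
    const rows: Row[] = [];
    // Consume the flat value array one row (keys.length values) at a time
    for (let start = 0; start + keys.length <= data.length; start += keys.length) {
        const row: Row = {};
        keys.forEach((key, i) => { row[key] = data[start + i]; });
        rows.push(row);
    }
    return rows;
}

// The compressed payload from above expands back into two objects
const payload = {
    keys: ['field1', 'field2', 'field3'],
    data: ['data1', 'data2', 'data3', 'data4', 'data5', 'data6'],
};
console.log(expandData(payload.data, payload.keys));
// [ { field1: 'data1', field2: 'data2', field3: 'data3' },
//   { field1: 'data4', field2: 'data5', field3: 'data6' } ]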
Based on Paniyar's answer, we can convert a list of objects into "compressed" JSON format using C# like this:
var JsonString = serializer.Serialize(
    new
    {
        cols = new[] { "field1", "field2", "field3" },
        items = data.Select(x => new object[] { x.field1, x.field2, x.field3 })
    });
I used an array of objects because the fields can be int, bool, string...
More reduction:
If a field is repeated very often and it is a string type, you can compress a little more by adding a distinct list of that field's values... for instance, fields like job position or city are excellent candidates for this. Add a distinct list of those values and, in each item, replace the value with a reference number. That will make your JSON lighter.
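A rough sketch of that distinct-list trick (the data and names here are illustrative, not from the original answer):

// Replace frequently repeated string values with indices into a distinct lookup list
const rows = [
    { name: 'Ann', city: 'Dublin' },
    { name: 'Bob', city: 'Dublin' },
    { name: 'Cat', city: 'Cork' },
];

// Distinct list of city values
const cities = Array.from(new Set(rows.map(r => r.city)));    // ['Dublin', 'Cork']

// Each item now carries a small reference number instead of the full string
const compressed = {
    cities,
    items: rows.map(r => [r.name, cities.indexOf(r.city)]),   // [['Ann', 0], ['Bob', 0], ['Cat', 1]]
};
console.log(JSON.stringify(compressed));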
Compressed:
[["KeyA", "KeyB", "KeyC", "KeyD", "KeyE", "KeyF"],
["ValA1", "ValB1", "ValC1", "ValD1", "ValE1", "ValF1"],
["ValA2", "ValB2", "ValC2", "ValD2", "ValE2", "ValF2"],
["ValA3", "ValB3", "ValC3", "ValD3", "ValE3", "ValF3"],
["ValA4", "ValB4", "ValC4", "ValD4", "ValE4", "ValF4"]]
Uncompressed:
[{KeyA: "ValA1", KeyB: "ValB1", KeyC: "ValC1", KeyD: "ValD1", KeyE: "ValE1", KeyF: "ValF1"},
{KeyA: "ValA2", KeyB: "ValB2", KeyC: "ValC2", KeyD: "ValD2", KeyE: "ValE2", KeyF: "ValF2"},
{KeyA: "ValA3", KeyB: "ValB3", KeyC: "ValC3", KeyD: "ValD3", KeyE: "ValE3", KeyF: "ValF3"},
{KeyA: "ValA4", KeyB: "ValB4", KeyC: "ValC4", KeyD: "ValD4", KeyE: "ValE4", KeyF: "ValF4"}]
The most likely answer is that it really is just gzipped JSON. There is no other standard meaning to this phrase.
Re-organizing a homogeneous array of JSON objects into a pair of arrays is a very useful technique for making the payload smaller and speeding up encoding and decoding, but it is not commonly called "compressed JSON". I haven't ever run across it in open source or any open API, but we use this technique internally and call it "jsontable".
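For the gzip reading of the phrase, a minimal Node sketch (the payload is just an example) showing the size difference and the round trip:

import { gzipSync, gunzipSync } from 'zlib';

// Gzip the serialized JSON and compare sizes
const payload = JSON.stringify({ data: Array(50).fill({ field1: 'data1', field2: 'data2' }) });
const compressed = gzipSync(Buffer.from(payload, 'utf8'));
console.log(`raw: ${Buffer.byteLength(payload)} bytes, gzipped: ${compressed.length} bytes`);

// Round-trip back to the original structure
const restored = JSON.parse(gunzipSync(compressed).toString('utf8'));
console.log(restored.data.length);  // 50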