best practice to standardize data in JSON objects in a flexible way

I need to convert information in various structured formats to other formats (e.g. HTML or text) using NodeJS. So I took the approach of converting the source format to JSON based on JSON-Schemas. I can also convert the resulting JSON to text based on Pug templates.
What I'm now looking for is a flexible way to standardise the data and simplify the structures in the JSON so there are fewer variations in e.g. the date and time format.
An example of such an object would be:
{
  header: {
    sender: {
      // more properties
      id: '12345'
    },
    receiver: {
      // more properties
      id: '987654'
    },
    date: {
      date: '170910',
      time: '0922'
    }
    // more properties
  },
  // more properties
  someMore: {
    birthdate: {
      year: 2016,
      month: 5,
      day: 11
    },
    otherProperty: {
      // more properties
      date: '20170205'
    }
  }
}
I'd like to convert this to
{
  header: {
    senderId: '12345',
    receiverId: '987654',
    date: '20170910T0922'
  },
  // more properties
  someMore: {
    birthdate: '20160511',
    otherProperty: {
      // more properties
      date: '20170205'
    }
  }
}
The idea is to recursively loop over all properties in the object and use a Map that has an entry for every property that should be acted on, e.g.
var map = new Map();
map.set('sender', getSender);
map.set('date', normalizeDate);
map.set('birthdate', normalizeDate);
For each property key the map is checked; if it returns a function, that function is executed, and if not, the property is simply created in the result and the loop goes on.
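A minimal sketch of such a recursive walker (assuming the Map stores the functions themselves, each taking the original property value and returning its normalized replacement; the name walk is only illustrative) could look like this:
function walk(obj, map) {
  const result = {};
  for (const [key, value] of Object.entries(obj)) {
    const fn = map.get(key);
    if (typeof fn === 'function') {
      // a normalizer is registered for this key: replace the value
      result[key] = fn(value);
    } else if (value !== null && typeof value === 'object') {
      // no rule for this key: recurse into the nested object
      result[key] = walk(value, map);
    } else {
      // plain value without a rule: copy as-is
      result[key] = value;
    }
  }
  return result;
}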
However, I get the distinct impression this problem has been solved before, so I wonder if there are npm packages I could use instead?

After searching some more and trying out different JSON transformers I settled on dot-object because it provides a simple way to transform part of the structure with multiple 'rules' in one go.
I gave up on the idea of writing a lot of general normalizing functions and just settled for the most obvious ones (gender and date/time), handling them by transforming the various structures into a single standardized structure and then applying the appropriate function.
This might not be the most elegant approach, but I'm time constrained and this works.
To convert the source to the result as described in the question, this code was used:
const dot = require('dot-object');
const rules = {
  'header.sender.id': 'header.senderId',
  'header.receiver.id': 'header.receiverId'
};
const target = {};
dot.transform(rules, src, target);
// normalizing
target.header.date = normalizeDate(target.header.date);
target.someMore.birthdate = normalizeDate(target.someMore.birthdate);
target.someMore.otherProperty.date = normalizeDate(target.someMore.otherProperty.date);
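normalizeDate itself is not shown above; a minimal sketch, assuming it only has to cover the two nested date shapes from the example plus already-flat strings (the '20' century prefix is an assumption inferred from the example data), could be:
function normalizeDate(value) {
  if (typeof value === 'string') {
    return value; // already flat, e.g. '20170205'
  }
  const pad = (n) => String(n).padStart(2, '0');
  if (value.year !== undefined) {
    // { year: 2016, month: 5, day: 11 } -> '20160511'
    return `${value.year}${pad(value.month)}${pad(value.day)}`;
  }
  if (value.date !== undefined) {
    // { date: '170910', time: '0922' } -> '20170910T0922'
    // assumes two-digit years are all 20xx, as in the example
    return `20${value.date}` + (value.time ? `T${value.time}` : '');
  }
  return value;
}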

Related

Google Apps Script: How to get values from all dynamic keys in deeply nested object

Trying to retrieve all the dividend values from the object at this URL. I'm getting an error, "TypeError: obj.chart.result[0].events.dividends.map is not a function". I'm trying to build basic coding skills in handling nested objects. What should be changed in this code? Some explanation would be greatly helpful. Thank you!
function test() {
  var url = "https://query1.finance.yahoo.com/v8/finance/chart/VZ?formatted=true&lang=en-US&region=US&interval=1d&period1=1451624400&period2=1672963200&events=div&useYfid=true&corsDomain=finance.yahoo.com";
  var obj = UrlFetchApp.fetch(url, { muteHttpExceptions: true }).getContentText();
  var obj = JSON.parse(obj);
  var dividend = obj.chart.result[0].events.dividends.map(o => (({ o: { amount } }) => amount));
  console.log(dividend)
}
Your dividends is not an array; it's an object. In other programming contexts it might be called a hashmap, a set of key-value pairs, or a map. Since this is JavaScript, you can also think of it as a plain object parsed from JSON.
The way you're trying to use it doesn't work because .map() is a method on arrays, which is something entirely different from an object (even though an object is sometimes also referred to as a map).
The .map() array method loops over the array and applies a callback to transform each element. For example,
[1,2,3,4,5].map((n) => {return n * 2})
// returns: [2,4,6,8,10]
Since dividends is some object like...
{
  12345: {amount: 1, date: 12345678},
  12346: {amount: 1, date: 12345678},
  // etc
}
Then you might do something like...
Object.keys(obj.chart.result[0].events.dividends).map((dividend_id) => {
  Logger.log(obj.chart.result[0].events.dividends[dividend_id])
})
In this example we put the dividends object into Object.keys() which would give back the ids of those dividends like [12345, 12346, 12347, ...].
In the .map() callback, (dividend_id) => { /** do stuff like console.log */ }, we take each id and use it to look up its matching key in dividends and log the value stored under that key.
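If what you actually want is an array of just the dividend amounts, a short sketch along the same lines (assuming the response really has the structure shown above) could be:
var dividends = obj.chart.result[0].events.dividends;
var amounts = Object.keys(dividends).map(function (id) {
  return dividends[id].amount; // pick only the amount from each entry
});
console.log(amounts); // an array of the dividend amounts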

How do I generate a serde_json object from a "." separated text format?

The Problem
I am trying to generate a json object (with serde) by parsing a custom macro format that looks like this:
Plot.Polar.max: 20
Plot.Polar.min: 0
Plot.Polar.numberlabel: 0101
Plot.Polar.chartname: small-chart
Plot.Polar.Var.1:
Plot.Polar.Var.2: A label: with T+ES[T] #Data
What I'm stuck on is how to set the keys for the object. In my old JavaScript code I split on \n, ., and :, used a couple of nested loops, and finished with a reduceRight to create the object like this:
// rowObject equals one row in the old macro format
let rowObject = keys.reduceRight(
  (allKeys, item) => ({ [item]: allKeys }),
  val,
);
My Goal
My goal is to use that json object to generate a highcharts config (json) depending on the keys and values from the custom macro. I also want to be able to print just the macro in json format, which is why I want to convert the macro to json first rather than use a separate data structure (though that might be a good idea?). The json I want to produce from the macro is this:
{
  "Plot": {
    "Polar": {
      "max": 20,
      "min": 0
    }
  }
}
What I Have Tried
I have tried Map::insert, though I am not sure how to structure the key string. How do I manage the Map objects in this case?
Another solution I see is creating the object from a raw string and merging each rowObject with the main object, though this approach feels a bit hacky.
The current loop I have:
// pseudo
// let mut json_macro = new Map();
for row in macro_rows.iter() {
    let row_key_value: Vec<&str> = row.split(':').collect();
    let keys = row_key_value[0];
    let value = row_key_value[1];
    let keys_split: Vec<&str> = keys.split('.').collect();
    for key in keys_split.iter() {
        // TODO: accumulate objects into row_object
    }
    // TODO: insert row_object into json_macro
}
The Question
Is it possible to do something like JavaScript's reduceRight, or something similar, in Rust?
Update
I realized that I will have to treat all values as strings, because it is impossible to tell whether a numeric-looking value is meant to be a string or a number. What worked in the end was the solution #gizmo provided.
To insert your row into json_macro you can fold over all but the last key in keys_split from the left, creating (or descending into) a nested object for each key, and then insert the value under the last key:
let row_key_value: Vec<&str> = row.split(':').collect();
let keys = row_key_value[0];
let value: Value = serde_json::from_str(row_key_value[1]).unwrap();
let keys_split: Vec<&str> = keys.split('.').collect();

keys_split[..keys_split.len() - 1]
    .iter()
    .fold(&mut json_macro, |object, &key| {
        object
            .entry(key)
            .or_insert(Map::new().into())
            .as_object_mut()
            .unwrap()
    })
    .insert(keys_split.last().unwrap().to_string(), value);
A couple things to note here about unwrap()s:
from_str(...).unwrap(): I parse the value as JSON here. This might not be what you want. Maybe instead you want str::parse::<i32> or something else. In any case, this parsing might fail.
.as_object_mut().unwrap(): This will explode if the input redefines a key like
Plot.Polar: 0
Plot.Polar.max: 20
The other way around, you probably want to handle the case where the key is already defined as an object.
keys_split.last().unwrap() won't fail, but you might want to check whether it is an empty string.

parsing json with different schema with Play json

I've got to parse a list of JSON messages. I am using Play JSON.
All messages have similar structure, and at high level may be represented as
case class JMessage(
  event: String,
  messageType: String,
  data: JsValue // variable data structure
)
data may hold entries of different types - Double, String, Int - so I can't go with a Map.
Currently there are at least three different types of data, and the structure of data can be identified by messageType.
So far I've created three case classes, each representing one structure of data, along with implicit Reads for them, plus a fourth case class for the result with some Option-al fields. So basically I need to map the various JSON messages to some output format.
The approach I'm currently using is:
messages.map(Json.parse(_)).map(_.as[JMessage]).map { elem =>
  if (elem.messageType == "event") {
    Some(parseMessageOfTypeEvent(elem.data))
  } else if (...) { // similar checks for the other message types
    Some(...)
  } else {
    None
  }
}.filter(_.nonEmpty)
The parseMessageOfType%type% functions are basically (v: type) => JsValue.
So after all I have 4 case classes and 3 functions for parsing. It works, but it is ugly.
Is there a more beautiful Scala way to do it?

Manually parse json data according to kendo model

Is there any built-in, ready-to-use solution in Kendo UI to parse JSON data according to schema.model?
Maybe something like kendo.parseData(json, model), which would return an array of objects?
I was searching for something like that and couldn't find anything built-in. However, Model.set apparently uses each field's parse logic, so I ended up writing this function, which works pretty well:
function parse(model, json) {
  // I initialize the model with the json data as a quick fix since
  // setting the id field doesn't seem to work.
  var parsed = new model(json);
  var fields = Object.keys(model.fields);
  for (var i = 0; i < fields.length; i++) {
    parsed.set(fields[i], json[fields[i]]);
  }
  return parsed;
}
Where model is the kendo.data.Model definition (or simply datasource.schema.model), and json is the raw object. Using or modifying it to accept and return arrays shouldn't be too hard, but for my use case I only needed a single object to be parsed at a time.
I actually saw your post the day you posted it but did not have the answer. I just needed to solve this problem myself as part of a refactoring. My solution is for DataSources, not for models directly.
kendo.data.DataSource.prototype.parse = function (data) {
  return this.reader.data(data);
  // Note that the original data will be modified. If that is not what you want, change to the following commented line
  // return this.reader.data($.extend({}, data));
}
// ...
someGrid.dataSource.parse(myData);
If you want to do it directly with a model, you will need to look at the DataReader class in kendo.data.js and use similar logic. Unfortunately, the DataReader takes a schema instead of a model, and the part dealing with the model is not extracted into its own method.

Formatting date field in KendoUI Grid coming as object

I am trying to format a date field in KendoUI Grid where the field is coming as an object:
"callStart":{"date":"2014-01-24 12:04:36","timezone_type":3,"timezone":"Europe\/Berlin"}
I have tried:
{ field: "callStart", title: "Fecha", width: "100px", format: "{0:yyyy-MM-dd}" }
but still showing:
[object Object]
Any idea?
Thanks!
Don't know if you already solved it, but I'll answer late anyway. The reason it's not working is that you are probably binding the entire callStart object to the column; the column expects a date object, but you are not giving it one.
Also, your object still seems to be a JSON string, so you need to parse it as a first step (if it really is just raw JSON). Then, as the next step, you can either:
Parse callStart on the columns themselves (with kendo Templates)
Parse callStart on the datasource itself through its schema
Option 1: Parse on the column field itself using a template
{
  field: "callStart",
  title: "Fecha",
  width: "100px",
  template: "#= kendo.toString(kendo.parseDate(callStart.date, 'yyyy-MM-dd HH:mm:ss'), 'MM/dd/yyyy') #"
}
The advantage of this option is that your data source object still maintains its original form, but filtering and sorting may get a bit tricky.
Option 2: Parse the object through the dataSource schema
var gridDS = new kendo.data.DataSource({
  data: result,
  schema: {
    parse: function (result) {
      for (var i = 0; i < result.length; i++) {
        var resultItem = result[i];
        resultItem.callStart = kendo.parseDate(result[i].callStart.date, 'yyyy-MM-dd HH:mm:ss');
      }
      return result;
    }
  },
  //etc...
});
Each data source object goes through the parse function, and you can do whatever processing you need to turn it into a JS Date or Kendo date object. The advantage is that you control the exact type for that column, which makes filtering/sorting easier.
You'll probably have to do some tweaking to get your desired output, but these are the general options you need to pick from.