JSON parse when top node is a number

I connect to various public crypto APIs and get values using three steps:
var response = UrlFetchApp.fetch("API URL",{muteHttpExceptions: true});
var json = JSON.parse(response.getContentText());
And then depending on the output format, I do one of the following:
Tickers:
  0:
    last_trade:
will result in me using:
var rate1 = json.tickers[0].last_trade;
result:
  XXRPXXBT:
    a:
      0:
will result in me using:
var rate1 = json.result.XXRPXXBT.a[0];
All of the methods I use work fine except when I get this format:
0:
  price:
1:
  price:
When I try to use one of these, it does not work:
var rate1 = json[0].price;
var rate1 = json.[0].price;
var rate1 = json.0.price;
How do I read it when the top node is a number?

When you need to access a key that is a number, use bracket notation and enclose the key in quotes like this (although even without the quotes it should work):
json["0"].price
The other two ways you tried are not valid JS syntax.
Also check this link from MDN: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Property_accessors#property_names
var json = {
  0: {
    price: 100
  }
};
console.log(json["0"].price);
// Update for your endpoint in my comments
// Assuming that the response has been parsed and stored in a variable like the following one
let priceData = [
  {
    "symbol": "ETHBTC",
    "price": "0.06643200"
  },
  {
    "symbol": "LTCBTC",
    "price": "0.00461600"
  }
]
// The above is an array
// You access it via numeric indexes like this:
console.log(priceData[0].price);
// However, make sure that you have actually parsed your API response as a JSON object first
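For completeness, here is a minimal sketch of the whole flow in Apps Script; the URL is only a placeholder for whatever endpoint returns an array-topped response, not a specific recommendation:
// Fetch an endpoint whose top-level JSON node is an array (URL is a placeholder).
function getFirstPrice() {
  var response = UrlFetchApp.fetch("https://example.com/api/ticker/price", {muteHttpExceptions: true});
  // Parse the raw text into a JavaScript value; without this step, indexing into it will fail.
  var json = JSON.parse(response.getContentText());
  // The top node is an array, so use a numeric index first, then the property name.
  var rate1 = json[0].price;
  Logger.log(rate1);
  return rate1;
}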

Related

Display comma separated data on Thingsboard in time series charts

I get comma-separated temperature data from a device, where the last entry is the most recent and the first one is from an hour before. The provided data is the temperature for each minute of the past hour.
I get json data like this:
temperature 19.2,23.4,18.3 ...... 23.0, 18.2
How do I show it in Thingsboard in a time series chart with proper timing?
Thanks!
You might use the Rule Engine to convert your JSON into the format you need.
Supposing you are able to send device data to the platform via MQTT API or HTTP API with the following payload:
{
  "temp": [22, 3, 45]
}
then with the Script Transformation Node you can convert the payload (the msg field of the POST_TELEMETRY event) to a format like the one below, which can be stored on database and directly shown from timeseries widgets:
[{
  "ts": 1618296229874,
  "values": {
    "temp": 45
  }
}, {
  "ts": 1618296169874,
  "values": {
    "temp": 3
  }
}, {
  "ts": 1618296109874,
  "values": {
    "temp": 22
  }
}]
Also, you might need an upstream Switch Node if you have to distinguish between different kinds of telemetry formats; the resulting rule chain then routes matching messages through the Switch Node into the Script Transformation Node.
Let's say your particular device is provisioned in Thingsboard with device type MULTIVALUE THERMOSTAT; then you can configure the Switch Node's function as follows:
function nextRelation(metadata, msg) {
  return ['other'];
}
if (metadata.deviceType === 'MULTIVALUE THERMOSTAT') {
  return ['multivalue'];
}
return nextRelation(metadata, msg);
This is the Script Transformation Node's function:
var tempArray = msg.temp;
var lastTs = Date.now();
var tsCounter = 0;
var MS_IN_ONE_MINUTE = 60000;
var newMsg = [];
for (var i = tempArray.length - 1; i >= 0; i--) {
  let ts = {};
  ts.ts = lastTs - tsCounter;
  tsCounter += MS_IN_ONE_MINUTE;
  let values = {};
  values.temp = tempArray[i];
  ts.values = values;
  newMsg.push(ts);
}
return {msg: newMsg, metadata: metadata, msgType: msgType};
The transformation function is just a starting point; you can improve it or make it more accurate for your real needs.
In this example I assumed the input payload doesn't contain the base hour, so I got it dynamically with Date.now(). Starting from the last telemetry value, I calculated the corresponding timestamp for each earlier one by going backward in time, one minute per step.
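If your device did send a base timestamp with the payload, the same loop could anchor to it instead of Date.now(). A minimal sketch of that variant, assuming a hypothetical baseTs field (milliseconds of the most recent sample) in the incoming message:
// Assumes a payload like {"temp": [22, 3, 45], "baseTs": 1618296229874},
// where baseTs is a hypothetical field holding the timestamp of the last sample.
var tempArray = msg.temp;
var lastTs = msg.baseTs;
var MS_IN_ONE_MINUTE = 60000;
var newMsg = [];
for (var i = tempArray.length - 1; i >= 0; i--) {
  newMsg.push({
    ts: lastTs - (tempArray.length - 1 - i) * MS_IN_ONE_MINUTE,
    values: { temp: tempArray[i] }
  });
}
return {msg: newMsg, metadata: metadata, msgType: msgType};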

How to make Json.stringify ignore certain class members?

I'm using the latest Haxe and HaxeFlixel to make a simple game prototype.
I have the following class...
class GameData
{
    public var playerHealth: Int;
    public var playerScore: Int;
    public var levelName: String;

    public function new(playerHealth: Int = 0, playerScore: Int = 0, levelName: String = "")
    {
        this.playerHealth = playerHealth;
        this.playerScore = playerScore;
        this.levelName = levelName;
    }
}
I convert it to JSON as follows...
Json.stringify(new GameData(64, 512, "Level 1"));
Is there a way I can make it so that stringify ignores certain members?
haxe.Json has no mechanism to exclude fields, so I would recommend using a third-party library such as json2object that does. Here you can simply annotate fields that should be ignored with @:jignored:
@:jignored
public var levelName:String;
var data = new GameData(100, 10, "Level 1");
var json = new json2object.JsonWriter<GameData>().write(data);
trace(json); // {"playerHealth": 100,"playerScore": 10}
There are some possible workarounds that don't involve adding a library to your project, but they don't seem very nice:
Don't serialize the object directly, but a structure that only includes the desired fields:
var data = new GameData(100, 10, "Level 1");
var json = Json.stringify({
    playerHealth: data.playerHealth,
    playerScore: data.playerScore
});
trace(json); // {"playerHealth":100,"playerScore":10}
Remove the unwanted fields after serialization - this seems rather hacky as it involves a lot of unnecessary overhead due to an additional Json.parse() and Json.stringify() call:
var json = Json.stringify(new GameData(100, 10, "Level 1"));
var data:haxe.DynamicAccess<String> = Json.parse(json);
data.remove("levelName");
json = Json.stringify(data);
trace(json); // {"playerHealth":100,"playerScore":10}
Depending on your exact situation, it can be desirable to make a slightly modified version of the standard library's JsonPrinter - for example, in GMEdit I allow JSON objects to have an hxOrder: Array<String> field which, if provided, determines the field order for printing and is initialized to a static array. You can make a similar scheme for field inclusion/exclusion.

Google Slides: newly inserted table not found

I'm wondering what is going on. I have two functions which both work fine when called one after the other:
function createTable() {
  var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
  var table = slidesPage.insertTable(7, 4);
}

function changeColumnWidth() {
  var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
  var tableId = slidesPage.getTables()[0].getObjectId();
  var requests = [{
    updateTableColumnProperties: {
      objectId: tableId,
      "columnIndices": [1, 3],
      "tableColumnProperties": {
        "columnWidth": {
          "magnitude": 80,
          "unit": "PT"
        }
      },
      "fields": "columnWidth"
    }
  }];
  var createSlideResponse = Slides.Presentations.batchUpdate({
    requests: requests
  }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
But trying to combine these two functions like:
function combined() {
  createTable();
  changeColumnWidth();
}
I'm getting this error:
Invalid requests[0].updateTableColumnProperties: The object (SLIDES_API456304911_0) could not be found.
I'm wondering if the insertTable method is asynchronous and therefore the created table is not ready yet?
Thanks for any help.
How about this modification? Please think of this as just one of several possible workarounds. In my workaround, I used saveAndClose() to separate the SlidesApp processing from the Slides API call.
Modification points:
Save and close the presentation using saveAndClose() after the table has been inserted.
Return the object ID of the inserted table so it can be used in changeColumnWidth().
In changeColumnWidth(), modify the table via the Slides API using the received object ID.
Modified script:
function combined() {
  var tableId = createTable(); // Modified
  changeColumnWidth(tableId); // Modified
}

function createTable() {
  var slide = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI'); // Modified
  var slidesPage = slide.getSlides()[9]; // Modified
  var table = slidesPage.insertTable(7, 4);
  slide.saveAndClose(); // Added
  return table.getObjectId();
}

function changeColumnWidth(tableId) { // Modified
  // var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0]; // This line is not used.
  // var tableId = slidesPage.getTables()[0].getObjectId(); // This line is not used because slidesPage.getTables().length becomes 0.
  var requests = [{
    updateTableColumnProperties: {
      objectId: tableId,
      "columnIndices": [1, 3],
      "tableColumnProperties": {
        "columnWidth": {
          "magnitude": 80,
          "unit": "PT"
        }
      },
      "fields": "columnWidth"
    }
  }];
  var createSlideResponse = Slides.Presentations.batchUpdate({
    requests: requests
  }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
Note:
For a presentation saved and closed with saveAndClose(), the inserted table cannot be retrieved after the presentation is reopened: getTables() returns an array of length 0. But the Slides API can still retrieve the table's object ID. So I thought the issue might be solved by returning the table's object ID right after inserting the table.
I couldn't yet figure out why getTables() on the reopened presentation returns 0 tables. I'm sorry.
Reference:
saveAndClose()
If this workaround was not what you want, I'm sorry.
To achieve your goal - creating a table with a specified layout and specific column sizes in one function - you should use the Slides API for the entire task. The Slides API lets you both create and modify the same element in the same batch request, provided you supply a unique object ID for it. Otherwise, you have to first create the element, then send the modification request using the objectId found in the response to the first request. This second approach is essentially the behavior you were experiencing when the function calls were done separately.
There are restrictions on user-supplied IDs, naturally:
objectId string: A user-supplied object ID. If you specify an ID, it must be unique among all pages and page elements in the presentation. The ID must start with an alphanumeric character or an underscore (matches regex [a-zA-Z0-9_]); remaining characters may include those as well as a hyphen or colon (matches regex [a-zA-Z0-9_-:]). The length of the ID must not be less than 5 or greater than 50. If you don't specify an ID, a unique one is generated.
Given that hyphens are allowed, we can use the Utilities.getUuid() method to help supply our own unique object IDs.
When mixing SlidesApp and Slides, it is very likely that internal Google optimizations (e.g. write-caching) change the operation order. By restricting to a single service for related task operations, we can ensure that the objects we need are available when needed.
This example uses two methods that make Request objects for batchUpdate and ultimately creates a presentation, adds a blank slide, adds a table and modifies it, and then creates another blank slide.
function makeCreateTableRequest_(slideId, rows, columns, shouldSupplyID) {
  const tablerq = {
    rows: rows,
    columns: columns,
    elementProperties: {
      pageObjectId: slideId,
      /** size: {
            height: {...},
            width: {...}
          },
          transform: { ... } */
    }
  };
  // If asked to use a custom ID (e.g. also going to modify this table), use a unique one.
  if (shouldSupplyID)
    tablerq.objectId = ("table" + Utilities.getUuid()).slice(0, 50);
  return {createTable: tablerq};
}

function makeModifyTableColumnPropsRequest_(tableId, newWidthDimension, indicesArray) {
  const rq = {
    objectId: tableId,
    fields: "columnWidth" // There are no other fields for this request as of 2018-07
  };
  if (newWidthDimension && newWidthDimension.magnitude !== undefined && newWidthDimension.unit)
    rq.tableColumnProperties = { columnWidth: newWidthDimension };
  if (indicesArray && indicesArray.length)
    rq.columnIndices = indicesArray;
  return {updateTableColumnProperties: rq};
}

function createPresentation_() {
  const newPres = { title: "API-created Presentation" };
  // Presentations are huge... limit the metadata sent back to us.
  const fields = "presentationId,pageSize,title"
    + ",slides(objectId,pageType,pageElements(objectId,size,title,description))"
    + ",masters(objectId,pageType,pageElements(objectId,size,title,description))"
    + ",layouts(objectId,pageType,pageElements(objectId,size,title,description))";
  const createdMetadata = Slides.Presentations.create(newPres, {fields: fields});
  console.log({message: "Created a Presentation", response: createdMetadata});
  return createdMetadata;
}

function addSlide_(pId) {
  const response = Slides.Presentations.batchUpdate({ requests: [{ createSlide: {} }] }, pId);
  return response.replies[0].createSlide.objectId;
}

function foo() {
  const pres = createPresentation_();
  const newSlideId = addSlide_(pres.presentationId);
  // Get requests to add and to modify tables.
  const openingTableRq = makeCreateTableRequest_(pres.slides[0].objectId, 2, 4);
  const newTableRq = makeCreateTableRequest_(newSlideId, 7, 4, true);
  const changeWidthRq = makeModifyTableColumnPropsRequest_(newTableRq.createTable.objectId, {magnitude: 80, unit: "PT"}, [0]);
  // Add and update the desired table, then create a new slide.
  var response = Slides.Presentations.batchUpdate({
    requests: [
      openingTableRq,     // will have reply
      newTableRq,         // will have reply
      changeWidthRq,      // no reply
      { createSlide: {} } // will have reply
    ]
  }, pres.presentationId);
  console.log({message: "Performed updates to the created presentation", response: response});
}

Transforming JSON

I am trying to transform JSON data provided by Quandl into a custom JSON format so that I can load it into my database.
The JSON array is stock market data with Date, High, Low, Open, Close values. I need flat JSON objects instead of an array.
I tried the following, but it returns the full array instead of individual elements. If I use [0][1], [0][2], it returns empty values.
Here is my code
var DataTransform = require("node-json-transform").DataTransform;
var myData = {"dataset":{"data":[["2016-01-15",292.5,294.4,267.1,279.9,273.0,64104.0,182.09],["2016-01-14",288.0,302.0,265.0,287.6,288.2,68271.0,199.82],["2016-01-13",303.95,307.65,275.0,290.1,292.75,99921.0,293.08]]}};
var map = {
  list: 'dataset.data',
  item: {
    date: [0],
    high: [0][1],
    low: [0][0][1]
  }
};
var dataTransform = DataTransform(myData, map);
var result = dataTransform.transform();
console.log(result);
Output:
[{"date":[["2016-01-15",292.5,294.4,267.1,279.9,273,64104,182.09]],"high":"","low":""},{"date":[["2016-01-14",288,302,265,287.6,288.2,68271,199.82]],"high":"","low":""},{"date":[["2016-01-13",303.95,307.65,275,290.1,292.75,99921,293.08]],"high":"","low":""}]
You should use the built-in Array functions for transforming data, e.g. map, reduce, filter, sort, etc. In this case, map will do the job perfectly:
var ds = {"dataset":{"data":[["2016-01-15",292.5,294.4,267.1,279.9,273.0,64104.0,182.09],["2016-01-14",288.0,302.0,265.0,287.6,288.2,68271.0,199.82],["2016-01-13",303.95,307.65,275.0,290.1,292.75,99921.0,293.08]]}};
var transformed = ds.dataset.data
  .map(function (d) {
    return {
      date: d[0],
      high: d[2],
      low: d[3]
      // pick out any other columns you need here
    };
  });
// output in format: [{date:"2016-01-15",high:294.4,low:267.1},...]
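If you also need the remaining columns, the same map callback can name them explicitly. The column order below (Date, Open, High, Low, Last, Close, Volume, Turnover) is an assumption about this particular Quandl dataset, so verify it against the column names Quandl returns alongside the data before relying on it:
// Column order is assumed (Date, Open, High, Low, Last, Close, Volume, Turnover);
// confirm it against the dataset's column listing before loading into your database.
var rows = ds.dataset.data.map(function (d) {
  return {
    date: d[0],
    open: d[1],
    high: d[2],
    low: d[3],
    last: d[4],
    close: d[5],
    volume: d[6],
    turnover: d[7]
  };
});
console.log(JSON.stringify(rows[0])); // one flat JSON object per row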

Difference between json.js and json2.js

Can someone tell me what the difference is between the 2 JSON parsers?
https://github.com/douglascrockford/JSON-js/blob/master/json.js
https://github.com/douglascrockford/JSON-js/blob/master/json2.js
I have a version of the JSON library from 2007-04-13 (it has methods such as parseJSON). I don't see these methods in any of the newer versions.
From their code:
// Augment the basic prototypes if they have not already been augmented.
// These forms are obsolete. It is recommended that JSON.stringify and
// JSON.parse be used instead.
if (!Object.prototype.toJSONString) {
  Object.prototype.toJSONString = function (filter) {
    return JSON.stringify(this, filter);
  };
  Object.prototype.parseJSON = function (filter) {
    return JSON.parse(this, filter);
  };
}
I guess parseJSON is obsolete, which is why the new version (json2) doesn't even include it anymore. However, if your code uses parseJSON a lot, you could just add this piece of code somewhere to make it work again:
Object.prototype.parseJSON = function (filter) {
  return JSON.parse(this, filter);
};
Quoting here:
"JSON2.js - Late last year Crockford quietly released a new version of his JSON API that replaced his existing API. The important difference was that it used a single base object."
I also noticed that json2 stringifies arrays differently than the 2007 version.
In json2007:
var array = [];
array[1] = "apple";
array[2] = "orange";
alert(array.toJSONString()); // Output: ["apple", "orange"].
In json2:
var array = [];
array[1] = "apple";
array[2] = "orange";
alert(JSON.stringify(array)); // Output: [null, "apple", "orange"].