Display comma separated data on Thingsboard in time series charts - csv

I get comma-separated temperature data from a device, where the last entry is the most recent and the first one is from one hour earlier. The data contains one temperature reading per minute for the past hour.
I get json data like this:
temperature 19.2,23.4,18.3 ...... 23.0, 18.2
How can I show it in Thingsboard in a time series chart with the proper timestamps?
Thanks!

You might use the Rule Engine to convert your JSON into the format you need.
Supposing you are able to send device data to the platform via the MQTT API or HTTP API with the following payload:
{
    "temp": [22, 3, 45]
}
then with the Script Transformation Node you can convert the payload (the msg field of the POST_TELEMETRY event) into a format like the one below, which can be stored in the database and shown directly by time series widgets:
[{
    "ts": 1618296229874,
    "values": {
        "temp": 45
    }
}, {
    "ts": 1618296169874,
    "values": {
        "temp": 3
    }
}, {
    "ts": 1618296109874,
    "values": {
        "temp": 22
    }
}]
Also, you might need an upstream Switch Node if you have to distinguish between different kinds of telemetry formats; the full working rule chain then routes messages through the Switch Node into the Script Transformation Node.
Let's say your particular device is provisioned in Thingsboard with device type MULTIVALUE THERMOSTAT; you can then configure the Switch Node's function as follows:
function nextRelation(metadata, msg) {
    return ['other'];
}
if (metadata.deviceType === 'MULTIVALUE THERMOSTAT') {
    return ['multivalue'];
}
return nextRelation(metadata, msg);
This is the Script Transformation Node's function:
var tempArray = msg.temp;
var lastTs = Date.now();
var tsCounter = 0;
var MS_IN_ONE_MINUTE = 60000;
var newMsg = [];
// The last array element is the most recent reading, so walk backwards,
// subtracting one minute per step.
for (var i = tempArray.length - 1; i >= 0; i--) {
    let entry = {};
    entry.ts = lastTs - tsCounter;
    tsCounter += MS_IN_ONE_MINUTE;
    let values = {};
    values.temp = tempArray[i];
    entry.values = values;
    newMsg.push(entry);
}
return {msg: newMsg, metadata: metadata, msgType: msgType};
The transformation function is just a starting point; you can improve it or make it more accurate for your real needs.
In this example I assumed the input payload doesn't contain the base hour, so I obtained it dynamically with Date.now(). Then, starting from the last telemetry value, I calculated the corresponding timestamps for all previous ones going backward in time.
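The backward-timestamp logic can be checked outside Thingsboard in plain Node.js. This is a minimal stand-alone sketch; the fixed lastTs value replaces Date.now() only to make the output deterministic:

```javascript
// 'msg' mimics the POST_TELEMETRY payload from the question.
var msg = { temp: [22, 3, 45] };

var lastTs = 1618296229874;   // fixed instead of Date.now() for a reproducible check
var MS_IN_ONE_MINUTE = 60000;
var tsCounter = 0;
var newMsg = [];

// The last array element is the newest reading, so iterate backwards.
for (var i = msg.temp.length - 1; i >= 0; i--) {
    newMsg.push({ ts: lastTs - tsCounter, values: { temp: msg.temp[i] } });
    tsCounter += MS_IN_ONE_MINUTE;
}

console.log(JSON.stringify(newMsg, null, 2));
```

Running this reproduces the three-entry array shown above: temp 45 at the base timestamp, then temp 3 and temp 22 at one-minute steps back in time.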


JSON parse when top node is a number

I connect to various public crypto APIs and get values using three steps:
var response = UrlFetchApp.fetch("API URL", {muteHttpExceptions: true});
var json = JSON.parse(response.getContentText());
Then, depending on the output format, I do one of the following:
Tickers:
    0:
        last_trade:
will result in me using:
var rate1 = json.tickers[0].last_trade;
result:
    XXRPXXBT:
        a:
            0:
will result in me using:
var rate1 = json.result.XXRPXXBT.a[0];
All of the methods I use work fine except when I get this format:
0:
    price:
1:
    price:
When I try to use one of these, it does not work:
var rate1 = json[0].price;
var rate1 = json.[0].price;
var rate1 = json.0.price;
How do I read it when the top node is a number?
When you need to access a key that is a number, use bracket notation and enclose the key in quotes like this (a bare number inside the brackets works too):
json["0"].price
The other two ways you tried are not valid JS syntax.
Also check this page on MDN: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Property_accessors#property_names
var json = {
    0: {
        price: 100
    }
};
console.log(json["0"].price);
// Update for the endpoint in my comments.
// Assuming the response has been parsed and stored in a variable like the following one:
let priceData = [
    {
        "symbol": "ETHBTC",
        "price": "0.06643200"
    },
    {
        "symbol": "LTCBTC",
        "price": "0.00461600"
    }
];
// The above is an array, so you access its elements via numeric indexes:
console.log(priceData[0].price);
// However, make sure that you have actually parsed your API response as JSON first.
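For completeness, here is a minimal sketch of the difference between an object root with numeric string keys and an array root after JSON.parse (the sample payloads are made up):

```javascript
// Object root with numeric string keys: bracket notation is required,
// because objRoot.0.price is a syntax error.
var objRoot = JSON.parse('{"0": {"price": 100}, "1": {"price": 200}}');
console.log(objRoot["0"].price);     // 100

// Array root: plain numeric indexing works.
var arrRoot = JSON.parse('[{"price": 100}, {"price": 200}]');
console.log(arrRoot[0].price);       // 100
console.log(Array.isArray(arrRoot)); // true (objRoot is a plain object, not an array)
```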

Google Slides: newly inserted table not found

I'm wondering what is going on. I have two functions which both work fine when called one after the other:
function createTable() {
    var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
    var table = slidesPage.insertTable(7, 4);
}

function changeColumnWidth() {
    var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
    var tableId = slidesPage.getTables()[0].getObjectId();
    var requests = [{
        updateTableColumnProperties: {
            objectId: tableId,
            "columnIndices": [1, 3],
            "tableColumnProperties": {
                "columnWidth": {
                    "magnitude": 80,
                    "unit": "PT"
                }
            },
            "fields": "columnWidth"
        }
    }];
    var createSlideResponse = Slides.Presentations.batchUpdate({
        requests: requests
    }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
But trying to combine these two functions like:
function combined() {
    createTable();
    changeColumnWidth();
}
I'm getting this error:
Invalid requests[0].updateTableColumnProperties: The object (SLIDES_API456304911_0) could not be found.
Is the insertTable method asynchronous, so that the created table is not ready yet?
Thanks for any help.
How about this modification? Please think of this as one of several workarounds. In my workaround, I used saveAndClose() for your situation, in order to separate the processing done by SlidesApp from that done by the Slides API.
Modification points:
Save and close the slide using saveAndClose() after the table was inserted.
Return an object ID of inserted table to use at changeColumnWidth().
At changeColumnWidth(), the table is modified by Slides API using the received object ID.
Modified script:
function combined() {
    var tableId = createTable(); // Modified
    changeColumnWidth(tableId); // Modified
}

function createTable() {
    var slide = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI'); // Modified
    var slidesPage = slide.getSlides()[9]; // Modified
    var table = slidesPage.insertTable(7, 4);
    slide.saveAndClose(); // Added
    return table.getObjectId();
}

function changeColumnWidth(tableId) { // Modified
    // var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0]; // This line is not used.
    // var tableId = slidesPage.getTables()[0].getObjectId(); // This line is not used because slidesPage.getTables().length becomes 0.
    var requests = [{
        updateTableColumnProperties: {
            objectId: tableId,
            "columnIndices": [1, 3],
            "tableColumnProperties": {
                "columnWidth": {
                    "magnitude": 80,
                    "unit": "PT"
                }
            },
            "fields": "columnWidth"
        }
    }];
    var createSlideResponse = Slides.Presentations.batchUpdate({
        requests: requests
    }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
Note:
When a slide is saved and closed with saveAndClose() and then reopened, the inserted table cannot be retrieved: getTables() now returns a length of 0. But through the Slides API, the table's object ID can still be used. So I thought the issue could be solved by returning the table's object ID right after the table is inserted.
However, I haven't yet been able to figure out why getTables() on the reopened slide returns 0 entries. I'm sorry.
Reference:
saveAndClose()
If this workaround was not what you want, I'm sorry.
To achieve your goal (creating a table with a specified layout and specific column sizes in one function) you should use the Slides API for the entire task. The Slides API lets you both create and modify the same element in the same batch request, provided you supply a unique object ID for it. Otherwise, you have to first create the element, then send the modification request using the objectId found in the response to the first request. This second approach is essentially the behavior you were experiencing when the function calls were done separately.
There are restrictions on user-supplied IDs, naturally:
objectId string: A user-supplied object ID. If you specify an ID, it must be unique among all pages and page elements in the presentation. The ID must start with an alphanumeric character or an underscore (matches regex [a-zA-Z0-9_]); remaining characters may include those as well as a hyphen or colon (matches regex [a-zA-Z0-9_-:]). The length of the ID must not be less than 5 or greater than 50. If you don't specify an ID, a unique one is generated.
Given that hyphens are allowed, we can use the Utilities.getUuid() method to help supply our own unique object IDs.
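The quoted constraints can be expressed as a quick validation helper. This is a sketch: the regex is adapted from the documented restrictions, and since Utilities.getUuid() exists only inside Apps Script, a fixed UUID-shaped string stands in for it here:

```javascript
// Checks a candidate object ID against the documented Slides API rules:
// first char alphanumeric or underscore; remaining chars may add '-' and ':';
// total length between 5 and 50.
function isValidSlidesObjectId(id) {
    return /^[a-zA-Z0-9_][a-zA-Z0-9_\-:]{4,49}$/.test(id);
}

// Stand-in for ("table" + Utilities.getUuid()).slice(0, 50):
var candidate = ("table" + "xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx").slice(0, 50);

console.log(isValidSlidesObjectId(candidate));        // true
console.log(isValidSlidesObjectId("ab"));             // false: too short
console.log(isValidSlidesObjectId("-starts-badly"));  // false: bad first character
```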
When mixing SlidesApp and Slides, it is very likely that internal Google optimizations (e.g. write-caching) change the operation order. By restricting to a single service for related task operations, we can ensure that the objects we need are available when needed.
This example uses two methods that make Request objects for batchUpdate and ultimately creates a presentation, adds a blank slide, adds a table and modifies it, and then creates another blank slide.
function makeCreateTableRequest_(slideId, rows, columns, shouldSupplyID) {
    const tablerq = {
        rows: rows,
        columns: columns,
        elementProperties: {
            pageObjectId: slideId,
            /** size: {
                    height: {...},
                    width: {...}
                },
                transform: { ... } */
        }
    };
    // If asked to use a custom ID (e.g. also going to modify this table), use a unique one.
    if (shouldSupplyID)
        tablerq.objectId = ("table" + Utilities.getUuid()).slice(0, 50);
    return {createTable: tablerq};
}

function makeModifyTableColumnPropsRequest_(tableId, newWidthDimension, indicesArray) {
    const rq = {
        objectId: tableId,
        fields: "columnWidth" // There are no other fields for this request as of 2018-07
    };
    if (newWidthDimension && newWidthDimension.magnitude !== undefined && newWidthDimension.unit)
        rq.tableColumnProperties = { columnWidth: newWidthDimension };
    if (indicesArray && indicesArray.length)
        rq.columnIndices = indicesArray;
    return {updateTableColumnProperties: rq};
}

function createPresentation_() {
    const newPres = { title: "API-created Presentation" };
    // Presentations are huge... limit the metadata sent back to us.
    const fields = "presentationId,pageSize,title"
        + ",slides(objectId,pageType,pageElements(objectId,size,title,description))"
        + ",masters(objectId,pageType,pageElements(objectId,size,title,description))"
        + ",layouts(objectId,pageType,pageElements(objectId,size,title,description))";
    const createdMetadata = Slides.Presentations.create(newPres, {fields: fields});
    console.log({message: "Created a Presentation", response: createdMetadata});
    return createdMetadata;
}

function addSlide_(pId) {
    const response = Slides.Presentations.batchUpdate({ requests: [{ createSlide: {} }] }, pId);
    return response.replies[0].createSlide.objectId;
}

function foo() {
    const pres = createPresentation_();
    const newSlideId = addSlide_(pres.presentationId);
    // Get requests to add and to modify tables.
    const openingTableRq = makeCreateTableRequest_(pres.slides[0].objectId, 2, 4);
    const newTableRq = makeCreateTableRequest_(newSlideId, 7, 4, true);
    const changeWidthRq = makeModifyTableColumnPropsRequest_(newTableRq.createTable.objectId, {magnitude: 80, unit: "PT"}, [0]);
    // Add and update the desired table, then create a new slide.
    var response = Slides.Presentations.batchUpdate({
        requests: [
            openingTableRq,     // will have reply
            newTableRq,         // will have reply
            changeWidthRq,      // no reply
            { createSlide: {} } // will have reply
        ]
    }, pres.presentationId);
    console.log({message: "Performed updates to the created presentation", response: response});
}

MongoDB SSIS with $unwind

I recently started using MongoDB as a source in SSIS (using the C# driver). I am very new to MongoDB and C#.
When I did not have nested documents, statements like the one below worked for me:
var query = Query.And(
    Query.Or(Query.GT("CreatedOn", maxUpdatedOnBSON), Query.GT("UpdatedOn", maxUpdatedOnBSON)),
    Query.Or(Query.LT("CreatedOn", cutoffDate), Query.LT("UpdatedOn", cutoffDate)),
    Query.In("TestType", testTypes));
MongoCursor<BsonDocument> toReturn = collection.Find(query);
Now I have nested documents. I was able to write the aggregation in JavaScript, and it works in the MongoDB shell itself:
db.Test.aggregate([
    { $unwind: { path: "$Items", includeArrayIndex: "arrayIndex" } },
    { $match: { $and: [
        { $or: [ { CreatedOn: { $gt: ISODate("2015-11-22T00:00:00Z") } }, { UpdatedOn: { $gt: ISODate("2015-11-22T00:00:00Z") } } ] },
        { $or: [ { CreatedOn: { $lt: ISODate("2016-05-09T00:00:00Z") } }, { UpdatedOn: { $lt: ISODate("2016-05-09T00:00:00Z") } } ] }
    ] } }
])
As I understand it, in C# I have to use Aggregate instead of Find, but I cannot translate this code to C#. I still need my selection criteria and the $unwind stage.
Can you please help?
Because there is no collection template posted, I'm attaching a snippet similar to what you would be looking for. Does this help?
var builder = Builders<BsonDocument>.Filter;
// The AND operator can be applied with the "&" operator or builder.And.
var filter = builder.Eq("state", "nj") | builder.Eq("state", "CO");
var filter2 = builder.Eq("pop", 6033) | builder.Eq("city", "nyc");
filter = builder.And(filter, filter2);

// 'grades' here is your IMongoCollection<BsonDocument>.
var pipeline = grades.Aggregate()
    .Unwind(x => x["Items"])
    .Match(filter);
var list = pipeline.ToList();
foreach (var item in list)
{
    // do something
}
I got help and am sharing the solution:
// Create matching criteria used in the aggregation pipeline to bring back
// only the documents in the specified date range.
var match = new BsonDocument("$match",
    new BsonDocument("$and",
        new BsonArray()
            .Add(new BsonDocument("$or", new BsonArray()
                .Add(new BsonDocument("CreatedOn", new BsonDocument("$gt", maxUpdatedOnBSON)))
                .Add(new BsonDocument("UpdatedOn", new BsonDocument("$gt", maxUpdatedOnBSON)))))
            .Add(new BsonDocument("$or", new BsonArray()
                .Add(new BsonDocument("CreatedOn", new BsonDocument("$lt", cutoffDate)))
                .Add(new BsonDocument("UpdatedOn", new BsonDocument("$lt", cutoffDate)))))));

// Create the arguments to pass to the $unwind stage of the aggregation.
var unwindargs = new BsonDocument("path", "$LineItems");
unwindargs.Add("includeArrayIndex", "arrayIndex");

// Create the unwind stage and add the arguments.
var unwind = new BsonDocument("$unwind", unwindargs);

// Create a new pipeline and gather the results.
var pipeline = new[] { match, unwind };
var mongoArgs = new AggregateArgs { Pipeline = pipeline };
var toReturn = collection.Aggregate(mongoArgs).ToList();

Transforming JSON

I am trying to transform JSON data provided by Quandl into a custom JSON format so that I can load it into my database.
The JSON array is stock market data with date, high, low, open, and close values. I need flat JSON objects instead of an array of arrays.
I tried the following, but it returns the full array instead of individual elements. If I use [0][1] or [0][2], it returns empty values.
Here is my code
var DataTransform = require("node-json-transform").DataTransform
var myData = {"dataset":{"data":[["2016-01-15",292.5,294.4,267.1,279.9,273.0,64104.0,182.09],["2016-01-14",288.0,302.0,265.0,287.6,288.2,68271.0,199.82],["2016-01-13",303.95,307.65,275.0,290.1,292.75,99921.0,293.08]]}}
var map = {
    list: 'dataset.data',
    item: {
        date: [0],
        high: [0][1],
        low: [0][0][1]
    }
};
var dataTransform = DataTransform(myData, map);
var result = dataTransform.transform();
console.log(result);
Output:
[{"date":[["2016-01-15",292.5,294.4,267.1,279.9,273,64104,182.09]],"high":"","low":""},{"date":[["2016-01-14",288,302,265,287.6,288.2,68271,199.82]],"high":"","low":""},{"date":[["2016-01-13",303.95,307.65,275,290.1,292.75,99921,293.08]],"high":"","low":""}]
You should use the built-in Array functions for transforming data, e.g. map, reduce, filter, sort, etc. In this case map does the job perfectly, e.g.:
var ds = {"dataset":{"data":[["2016-01-15",292.5,294.4,267.1,279.9,273.0,64104.0,182.09],["2016-01-14",288.0,302.0,265.0,287.6,288.2,68271.0,199.82],["2016-01-13",303.95,307.65,275.0,290.1,292.75,99921.0,293.08]]}}
var transformed = ds.dataset.data
    .map(function (d) {
        // Column order in each row: date, open, high, low, close, ...
        return {
            date: d[0],
            high: d[2],
            low: d[3]
            // pick out open, close, etc. from the remaining indexes as needed
        };
    });
// output in format: [{date: "2016-01-15", high: 294.4, low: 267.1}, ...]
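The same transform in runnable form, keeping the column meaning implied by the sample output (date, open, high, low, close; treat the index-to-field mapping as an assumption to verify against your actual Quandl feed):

```javascript
var ds = {"dataset": {"data": [
    ["2016-01-15", 292.5, 294.4, 267.1, 279.9, 273.0, 64104.0, 182.09],
    ["2016-01-14", 288.0, 302.0, 265.0, 287.6, 288.2, 68271.0, 199.82],
    ["2016-01-13", 303.95, 307.65, 275.0, 290.1, 292.75, 99921.0, 293.08]
]}};

// One flat object per row; the index meaning is assumed, not documented here.
var transformed = ds.dataset.data.map(function (d) {
    return { date: d[0], open: d[1], high: d[2], low: d[3], close: d[4] };
});

console.log(transformed[0]);
```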

Iterating through data structure

I'm kinda racking my brain on how to iterate through my data object. I've created a recursive function to iterate through XML elements and essentially output the same structure as an object, using the attribute in each element as the key, where each key holds its own object. Conceptually, the object looks like the structure in the example picture. It's sort of a tree, but I'm not using Flex Trees, just plain AS3.
The thing I want to get across is that each "branch" has any number of children, so I can't hard code the depth of each branch.
What I'm trying to accomplish
Each node in my data structure is a folder name, and I need to append all of its children into a single string and store that. To generate an "asset", I need a string from each of these initial nodes (in the example picture, they'd be 1, 2, 4, 5, 6, and 17). So I need to iterate through each branch and return a different set than previously, such that every possible asset combination is found.
Conceptually, I know that I need to grab the "1st index" of each branch, output all of those strings into an asset, and then move the first branch up an "index" until it reaches its limit, which triggers the next branch to move its "index". But how I actually do that is a bit of a mystery to me.
Do I need to restructure my data tree into an array that can be referenced by index, or is there some simpler way of iterating through every possible combination that I am missing?
I'm using actionscript 3, but I'm not specifically looking for a code example, just pseudo-code is fine.
Because you're trying to pull the Nth item of every tree into a separate array of strings, it'll be easier to first distill these trees into their sequential sets (with their concatenated paths) in a first pass. A second pass then constructs your intended "asset" arrays. Because the nature of your document may vary wildly, I've created a demonstration data source below; you can run the following code and inspect the resulting arrays:
// Assume the following structure.
var data:Array = [
    {
        "path": "one",
        "sub": [
            {
                "path": "seven",
                "sub": [
                    {"path": "nine.png"},
                    {"path": "ten.png"}
                ]
            },
            {
                "path": "eight",
                "sub": [
                    {"path": "eleven.png"},
                    {"path": "twelve.png"}
                ]
            }
        ]
    },
    {
        "path": "two",
        "sub": [
            {"path": "thirteen.png"},
            {"path": "fourteen.png"},
            {"path": "fifteen.png"},
            {"path": "sixteen.png"}
        ]
    },
    {
        "path": "four",
        "sub": [
            {
                "path": "seventeen",
                "sub": [
                    {"path": "nineteen.png"},
                    {"path": "twenty.png"}
                ]
            },
            {
                "path": "twenty-two",
                "sub": [
                    {"path": "twenty-one.png"},
                    {"path": "twenty-two.png"}
                ]
            }
        ]
    },
    {
        "path": "five",
        "sub": [
            {"path": "twenty-three.png"}
        ]
    },
    {
        "path": "six",
        "sub": [
            {
                "path": "twenty-four",
                "sub": [
                    {"path": "twenty-six.png"},
                    {"path": "twenty-seven.png"}
                ]
            },
            {
                "path": "twenty-nine",
                "sub": [
                    {"path": "twenty-eight.png"},
                    {"path": "twenty-nine.png"}
                ]
            }
        ]
    },
    {
        "path": "seventeen",
        "sub": [
            {"path": "thirtee.png"}
        ]
    }
];
function init():void {
    var sets:Array = [];
    // First we'll create a complete sequence of concatenated strings per set.
    for (var i:int = 0; i < data.length; i++) {
        sets[i] = [];
        scan(data[i], sets[i]);
    }
    // Find the max length.
    var max:int = 0, a:Array;
    for each (a in sets) {
        max = (a.length > max) ? a.length : max;
    }
    // Now we'll create our ordered assets, pulling out the firsts, then the seconds, and so on...
    var assets:Array = [];
    for (i = 0; i < max; i++) {
        assets[i] = [];
        for each (a in sets) {
            if (i < a.length) {
                assets[i].push(a[i]);
            }
        }
    }
}
function scan(node:Object, a:Array, prefix:String = ""):void {
    var subNode:Object;
    // This is a recursive function which digs till it finds no more sub properties.
    if (node.hasOwnProperty("sub")) {
        // On every sub node, it passes the path concatenated so far.
        for each (subNode in node.sub) {
            scan(subNode, a, prefix + "/" + node.path);
        }
    } else {
        // When we reach the final depth, we can begin populating our array with paths.
        a.push(prefix + "/" + node.path);
    }
}

init();
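Since AS3 and JavaScript share most of their syntax, the two-pass approach can be sanity-checked in plain JavaScript with a reduced tree (the data below is a made-up subset of the demonstration structure above):

```javascript
// Reduced demonstration tree: each node has a 'path' and optionally 'sub' children.
var data = [
    { path: "one", sub: [
        { path: "seven", sub: [{ path: "nine.png" }, { path: "ten.png" }] }
    ]},
    { path: "two", sub: [{ path: "thirteen.png" }] }
];

// Pass 1: flatten each top-level branch into its list of concatenated leaf paths.
function scan(node, out, prefix) {
    if (node.sub) {
        node.sub.forEach(function (child) {
            scan(child, out, prefix + "/" + node.path);
        });
    } else {
        out.push(prefix + "/" + node.path);
    }
}

var sets = data.map(function (branch) {
    var out = [];
    scan(branch, out, "");
    return out;
});

// Pass 2: group the i-th entry of every branch into asset i.
var max = Math.max.apply(null, sets.map(function (s) { return s.length; }));
var assets = [];
for (var i = 0; i < max; i++) {
    assets.push(sets.filter(function (s) { return i < s.length; })
                    .map(function (s) { return s[i]; }));
}

console.log(sets);
console.log(assets);
```

With this input, sets becomes [["/one/seven/nine.png", "/one/seven/ten.png"], ["/two/thirteen.png"]], and assets groups the firsts, then the seconds, across branches.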