Iterating through data structure - actionscript-3

I'm kind of racking my brain over how to iterate through my data object. I've created a recursive function that iterates through XML elements and outputs essentially the same structure as an object, using each element's attribute as the key, which holds its own object. Conceptually, the object looks something like this. It's sort of a tree, but I'm not using Flex Trees, just plain AS3.
The thing I want to get across is that each "branch" has any number of children, so I can't hard code the depth of each branch.
What I'm trying to accomplish
Each node in my data structure is a folder name, and I need to append all of the children into a single string and store that. To generate an "asset", I need a string from each of these initial nodes (in the example picture, they'd be 1, 2, 4, 5, 6, and 17). So I need to iterate through each branch and return a different set each time, such that every possible asset combination is found.
Conceptually, I know that I need to grab the "1st index" of each branch, output all of those strings into an asset, and then move the first branch up an "index" until it reaches its limit, which triggers the next branch to move its "index". But how I actually do that is a bit of a mystery to me.
Do I need to restructure my data tree into an array that can be referenced by index, or is there some simpler way of iterating through every possible combination that I'm missing?
I'm using ActionScript 3, but I'm not specifically looking for a code example; pseudo-code is fine.

Because you're trying to pull the Nth entry of every tree into a separate array of strings, it'll be easier to first distill these trees into their sequential sets (with their concatenated paths) in a first pass. A second pass then constructs your intended "asset" arrays. Because the nature of your document may vary wildly, I've created a demonstration data source and the code to process it below:
//Assume the following structure.
var data:Array = [
    {
        "path":"one",
        "sub":[
            {
                "path":"seven",
                "sub":[
                    {"path":"nine.png"},
                    {"path":"ten.png"}
                ]
            },
            {
                "path":"eight",
                "sub":[
                    {"path":"eleven.png"},
                    {"path":"twelve.png"}
                ]
            }
        ]
    },
    {
        "path":"two",
        "sub":[
            {"path":"thirteen.png"},
            {"path":"fourteen.png"},
            {"path":"fifteen.png"},
            {"path":"sixteen.png"}
        ]
    },
    {
        "path":"four",
        "sub":[
            {
                "path":"seventeen",
                "sub":[
                    {"path":"nineteen.png"},
                    {"path":"twenty.png"}
                ]
            },
            {
                "path":"twenty-two",
                "sub":[
                    {"path":"twenty-one.png"},
                    {"path":"twenty-two.png"}
                ]
            }
        ]
    },
    {
        "path":"five",
        "sub":[
            {"path":"twenty-three.png"}
        ]
    },
    {
        "path":"six",
        "sub":[
            {
                "path":"twenty-four",
                "sub":[
                    {"path":"twenty-six.png"},
                    {"path":"twenty-seven.png"}
                ]
            },
            {
                "path":"twenty-nine",
                "sub":[
                    {"path":"twenty-eight.png"},
                    {"path":"twenty-nine.png"}
                ]
            }
        ]
    },
    {
        "path":"seventeen",
        "sub":[
            {"path":"thirtee.png"}
        ]
    }
];
function init():void {
    var sets:Array = [];
    // First we'll create a complete sequence of concatenated strings per set
    for (var i:int = 0; i < data.length; i++) {
        sets[i] = [];
        scan(data[i], sets[i]);
    }
    // Find the max length
    var max:int = 0, a:Array;
    for each (a in sets) {
        max = (a.length > max) ? a.length : max;
    }
    // Now we'll create our ordered assets, pulling out the firsts, then the seconds, and so on...
    var assets:Array = [];
    for (i = 0; i < max; i++) {
        assets[i] = [];
        for each (a in sets) {
            if (i < a.length) {
                assets[i].push(a[i]);
            }
        }
    }
}

function scan(node:Object, a:Array, prefix:String = ""):void {
    var subNode:Object;
    // This is a recursive function which digs till it finds no more sub properties.
    if (node.hasOwnProperty("sub")) {
        // On every sub node, it passes the currently concatenated path so far
        for each (subNode in node.sub) {
            scan(subNode, a, prefix + "/" + node.path);
        }
    } else {
        // When we reach the final depth, we can begin populating our array with paths.
        a.push(prefix + "/" + node.path);
    }
}

init();
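If it helps to test the two-pass idea outside of Flash, here is the same logic sketched in plain JavaScript. The data is a trimmed-down stand-in for the demo source above, and the names (scan, sets, assets) mirror the AS3:

```javascript
// Demo data shaped like the source above: each node has a "path" and
// optionally a "sub" array of child nodes.
var data = [
  { path: "one", sub: [{ path: "seven", sub: [{ path: "nine.png" }] }] },
  { path: "two", sub: [{ path: "thirteen.png" }, { path: "fourteen.png" }] }
];

// Recursive scan: walk down until a node has no "sub", then record the
// concatenated path built up so far.
function scan(node, out, prefix) {
  prefix = prefix || "";
  if (node.sub) {
    for (var i = 0; i < node.sub.length; i++) {
      scan(node.sub[i], out, prefix + "/" + node.path);
    }
  } else {
    out.push(prefix + "/" + node.path);
  }
}

// Pass 1: one flat list of concatenated paths per top-level branch.
var sets = data.map(function (branch) {
  var list = [];
  scan(branch, list);
  return list;
});

// Pass 2: asset i takes the i-th entry of every branch that still has one.
var max = Math.max.apply(null, sets.map(function (s) { return s.length; }));
var assets = [];
for (var i = 0; i < max; i++) {
  assets[i] = [];
  for (var j = 0; j < sets.length; j++) {
    if (i < sets[j].length) assets[i].push(sets[j][i]);
  }
}
```

With this input, the first asset combines the first path of each branch, and the second asset contains only the second branch's leftover entry.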

Related

Display comma separated data on Thingsboard in time series charts

I get comma-separated temperature data from a device, where the last entry is the most recent and the first one is from an hour before. The data is the past hour's temperatures, by minute.
I get JSON data like this:
temperature 19.2,23.4,18.3 ...... 23.0, 18.2
How can I show it in Thingsboard in a time series chart with proper timing?
Thanks!
You might use the Rule Engine to convert your JSON into the format you need.
Supposing you are able to send device data to the platform via the MQTT API or HTTP API with the following payload:
{
    "temp": [22, 3, 45]
}
then with a Script Transformation Node you can convert the payload (the msg field of the POST_TELEMETRY event) into a format like the one below, which can be stored in the database and shown directly by time series widgets:
[{
    "ts": 1618296229874,
    "values": {
        "temp": 45
    }
}, {
    "ts": 1618296169874,
    "values": {
        "temp": 3
    }
}, {
    "ts": 1618296109874,
    "values": {
        "temp": 22
    }
}]
You might also need an upstream Switch Node if you have to distinguish between different kinds of telemetry formats, so that only multi-value messages reach the transformation.
Say your device is provisioned in Thingsboard with device type MULTIVALUE THERMOSTAT; you can configure the Switch Node's function as follows:
function nextRelation(metadata, msg) {
    return ['other'];
}
if (metadata.deviceType === 'MULTIVALUE THERMOSTAT') {
    return ['multivalue'];
}
return nextRelation(metadata, msg);
This is the Script Transformation Node's function:
var tempArray = msg.temp;
var lastTs = Date.now();
var tsCounter = 0;
var MS_IN_ONE_MINUTE = 60000;
var newMsg = [];
for (var i = tempArray.length - 1; i >= 0; i--) {
    let ts = {};
    ts.ts = lastTs - tsCounter;
    tsCounter += MS_IN_ONE_MINUTE;
    let values = {};
    values.temp = tempArray[i];
    ts.values = values;
    newMsg.push(ts);
}
return {msg: newMsg, metadata: metadata, msgType: msgType};
The transformation function is just a starting point; you can improve it or make it more accurate for your real needs.
In this example I assumed the input payload doesn't contain the base hour, so I got it dynamically with Date.now(). Then, starting from the last telemetry value, I calculated the corresponding timestamp for each previous one by going backward in time.
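The back-filling logic can be tried outside the Rule Engine too. Here is a plain JavaScript sketch of the same transformation; the fixed lastTs value is used here only to make the example deterministic (in the rule node it would come from Date.now()):

```javascript
// Convert an array of per-minute readings (oldest first) into
// Thingsboard-style {ts, values} entries, newest first.
function toTimeseries(tempArray, lastTs) {
  var MS_IN_ONE_MINUTE = 60000;
  var tsCounter = 0;
  var newMsg = [];
  // Start at the newest reading (end of the array) and step back one
  // minute per earlier reading.
  for (var i = tempArray.length - 1; i >= 0; i--) {
    newMsg.push({ ts: lastTs - tsCounter, values: { temp: tempArray[i] } });
    tsCounter += MS_IN_ONE_MINUTE;
  }
  return newMsg;
}

var result = toTimeseries([22, 3, 45], 1618296229874);
```

Running this on the [22, 3, 45] payload reproduces the three {ts, values} entries shown earlier in the answer.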

Google Slides: newly inserted table not found

I'm wondering what is going on. I have two functions which both work fine when called one after the other:
function createTable() {
    var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
    var table = slidesPage.insertTable(7, 4);
}

function changeColumnWidth() {
    var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0];
    var tableId = slidesPage.getTables()[0].getObjectId();
    var requests = [{
        updateTableColumnProperties: {
            objectId: tableId,
            "columnIndices": [1, 3],
            "tableColumnProperties": {
                "columnWidth": {
                    "magnitude": 80,
                    "unit": "PT"
                }
            },
            "fields": "columnWidth"
        }
    }];
    var createSlideResponse = Slides.Presentations.batchUpdate({
        requests: requests
    }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
But trying to combine these two functions like:
function combined() {
    createTable();
    changeColumnWidth();
}
I'm getting this error:
Invalid requests[0].updateTableColumnProperties: The object (SLIDES_API456304911_0) could not be found.
I'm wondering if the insertTable method is asynchronous, and the created table is therefore not ready yet?
Thanks for any help.
How about this modification? Please think of this as one of several workarounds. In this workaround I used saveAndClose() for your situation, to separate the processing done by SlidesApp from that done by the Slides API.
Modification points:
Save and close the presentation using saveAndClose() after the table is inserted.
Return the object ID of the inserted table for use in changeColumnWidth().
In changeColumnWidth(), modify the table via the Slides API using the received object ID.
Modified script:
function combined() {
    var tableId = createTable(); // Modified
    changeColumnWidth(tableId); // Modified
}

function createTable() {
    var slide = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI'); // Modified
    var slidesPage = slide.getSlides()[0]; // Modified
    var table = slidesPage.insertTable(7, 4);
    slide.saveAndClose(); // Added
    return table.getObjectId();
}

function changeColumnWidth(tableId) { // Modified
    // var slidesPage = SlidesApp.openById('1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI').getSlides()[0]; // This line is not used.
    // var tableId = slidesPage.getTables()[0].getObjectId(); // This line is not used because slidesPage.getTables().length becomes 0.
    var requests = [{
        updateTableColumnProperties: {
            objectId: tableId,
            "columnIndices": [1, 3],
            "tableColumnProperties": {
                "columnWidth": {
                    "magnitude": 80,
                    "unit": "PT"
                }
            },
            "fields": "columnWidth"
        }
    }];
    var createSlideResponse = Slides.Presentations.batchUpdate({
        requests: requests
    }, '1QWRV4eQzGNNBz4SkR3WPurTL3O60oGYxQpBu63KrUoI');
}
Note:
For a presentation saved and closed with saveAndClose(), the inserted table cannot be retrieved when the presentation is reopened; when getTables() is tried again, the length becomes 0. But the Slides API can still retrieve the table's object ID, so I thought the issue could be solved by returning the object ID right after the table is inserted.
I couldn't work out why getTables() on the reopened slide returns 0 tables, though. I'm sorry.
Reference:
saveAndClose()
If this workaround was not what you want, I'm sorry.
To achieve your goal - creating a table with a specified layout and specific column sizes in one function - you should use the Slides API for the entire task. The Slides API lets you both create and modify the same element in the same batch request, provided you supply a unique object ID for it. Otherwise, you have to first create the element, then send the modification request using the objectId found in the response to the first request. This second approach is essentially the behavior you were seeing when the function calls were made separately.
There are restrictions on user-supplied IDs, naturally:
objectId string: A user-supplied object ID. If you specify an ID, it must be unique among all pages and page elements in the presentation. The ID must start with an alphanumeric character or an underscore (matches regex [a-zA-Z0-9_]); remaining characters may include those as well as a hyphen or colon (matches regex [a-zA-Z0-9_-:]). The length of the ID must not be less than 5 or greater than 50. If you don't specify an ID, a unique one is generated.
Given that hyphens are allowed, we can use the Utilities.getUuid() method to help supply our own unique object IDs.
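As a sketch, an ID built this way can be checked against the documented constraints. This is plain JavaScript standing in for Apps Script, with a hypothetical uuid() function in place of Utilities.getUuid():

```javascript
// Hypothetical stand-in for Utilities.getUuid(): any RFC 4122-style
// UUID source works, since its characters are all hex digits and hyphens.
function uuid() {
  return "xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g, function (c) {
    var r = Math.floor(Math.random() * 16);
    return (c === "x" ? r : (r % 4) + 8).toString(16);
  });
}

// Build a table ID the way the answer suggests, trimmed to the 50-char cap.
function makeTableId() {
  return ("table" + uuid()).slice(0, 50);
}

// Check the documented constraints: 5-50 chars, first char [a-zA-Z0-9_],
// remaining chars may also include "-" and ":".
function isValidObjectId(id) {
  return /^[a-zA-Z0-9_][a-zA-Z0-9_:-]{4,49}$/.test(id);
}

var id = makeTableId();
```

Because the prefix "table" plus a 36-character UUID is only 41 characters, the slice is just a safety net against longer prefixes.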
When mixing SlidesApp and Slides, it is very likely that internal Google optimizations (e.g. write-caching) change the operation order. By restricting to a single service for related task operations, we can ensure that the objects we need are available when needed.
This example uses two methods that make Request objects for batchUpdate and ultimately creates a presentation, adds a blank slide, adds a table and modifies it, and then creates another blank slide.
function makeCreateTableRequest_(slideId, rows, columns, shouldSupplyID) {
    const tablerq = {
        rows: rows,
        columns: columns,
        elementProperties: {
            pageObjectId: slideId,
            /** size: {
                    height: {...},
                    width: {...}
                },
                transform: { ... } */
        }
    };
    // If asked to use a custom ID (e.g. also going to modify this table), use a unique one.
    if (shouldSupplyID)
        tablerq.objectId = ("table" + Utilities.getUuid()).slice(0, 50);
    return {createTable: tablerq};
}

function makeModifyTableColumnPropsRequest_(tableId, newWidthDimension, indicesArray) {
    const rq = {
        objectId: tableId,
        fields: "columnWidth" // There are no other fields for this request as of 2018-07
    };
    if (newWidthDimension && newWidthDimension.magnitude !== undefined && newWidthDimension.unit)
        rq.tableColumnProperties = { columnWidth: newWidthDimension };
    if (indicesArray && indicesArray.length)
        rq.columnIndices = indicesArray;
    return {updateTableColumnProperties: rq};
}

function createPresentation_() {
    const newPres = { title: "API-created Presentation" };
    // Presentations are huge... limit the metadata sent back to us.
    const fields = "presentationId,pageSize,title"
        + ",slides(objectId,pageType,pageElements(objectId,size,title,description))"
        + ",masters(objectId,pageType,pageElements(objectId,size,title,description))"
        + ",layouts(objectId,pageType,pageElements(objectId,size,title,description))";
    const createdMetadata = Slides.Presentations.create(newPres, {fields: fields});
    console.log({message: "Created a Presentation", response: createdMetadata});
    return createdMetadata;
}

function addSlide_(pId) {
    const response = Slides.Presentations.batchUpdate({ requests: [{ createSlide: {} }] }, pId);
    return response.replies[0].createSlide.objectId;
}

function foo() {
    const pres = createPresentation_();
    const newSlideId = addSlide_(pres.presentationId);
    // Get requests to add and to modify tables.
    const openingTableRq = makeCreateTableRequest_(pres.slides[0].objectId, 2, 4);
    const newTableRq = makeCreateTableRequest_(newSlideId, 7, 4, true);
    const changeWidthRq = makeModifyTableColumnPropsRequest_(newTableRq.createTable.objectId, {magnitude: 80, unit: "PT"}, [0]);
    // Add and update the desired table, then create a new slide.
    var response = Slides.Presentations.batchUpdate({
        requests: [
            openingTableRq,     // will have reply
            newTableRq,         // will have reply
            changeWidthRq,      // no reply
            { createSlide: {} } // will have reply
        ]
    }, pres.presentationId);
    console.log({message: "Performed updates to the created presentation", response: response});
}

getBulkProperties() Hangs and Errors Out

Our application needs to pull a set of properties from all objects in the model. Our application will concatenate properties from leaf nodes with properties from the parent nodes.
We are calling the getBulkProperties() method with around 20K nodes and around 5 properties. This runs for quite some time and then we receive server errors and the callbacks are never invoked.
Is there a limit we should use? Should we split these calls with a max number X of nodes?
Any help would be appreciated as this is causing our application to hang.
Thanks!
I don't think there is a limit, but you may consider listing properties for one group of nodes at a time, or just for leaf nodes.
This blog post shows how to optimize search performance, and the code below (from that post) shows how to integrate it with .getBulkProperties():
viewer.search('Steel',
    function (dbIds) {
        viewer.model.getBulkProperties(dbIds, ['Mass'],
            function (elements) {
                var totalMass = 0;
                for (var i = 0; i < elements.length; i++) {
                    totalMass += elements[i].properties[0].displayValue;
                }
                console.log(totalMass);
            })
    }, null, ['Material'])
You may also consider enumerating only the leaf nodes of the model, as shown in this post and below:
function getAllLeafComponents(viewer, callback) {
    var cbCount = 0; // count pending callbacks
    var components = []; // store the results
    var tree; // the instance tree
    function getLeafComponentsRec(parent) {
        cbCount++;
        if (tree.getChildCount(parent) != 0) {
            tree.enumNodeChildren(parent, function (children) {
                getLeafComponentsRec(children);
            }, false);
        } else {
            components.push(parent);
        }
        if (--cbCount == 0) callback(components);
    }
    viewer.getObjectTree(function (objectTree) {
        tree = objectTree;
        // Results arrive via the callback, not a return value.
        getLeafComponentsRec(tree.getRootId());
    });
}
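If the server errors persist, splitting the request is a reasonable guard. Below is a sketch of batching the call; the batch size of 1000 is a guess to tune for your model, and viewer.model.getBulkProperties is assumed to have the callback signature shown above. The chunk helper itself is plain JavaScript:

```javascript
// Split a large dbId list into fixed-size batches.
function chunk(ids, size) {
  var batches = [];
  for (var i = 0; i < ids.length; i += size) {
    batches.push(ids.slice(i, i + size));
  }
  return batches;
}

// Sketch: issue one getBulkProperties call per batch and merge the results.
// "viewer" and the property names are placeholders for your own setup.
function getBulkPropertiesInBatches(viewer, dbIds, propNames, callback) {
  var batches = chunk(dbIds, 1000); // batch size is a guess; tune as needed
  var results = [];
  var pending = batches.length;
  batches.forEach(function (batch) {
    viewer.model.getBulkProperties(batch, propNames, function (elements) {
      results = results.concat(elements);
      // Invoke the caller's callback once every batch has answered.
      if (--pending === 0) callback(results);
    });
  });
}
```

This keeps each request small enough that a single failure or timeout doesn't take down the whole 20K-node query.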

How to find the column field name when setting template in kendo grid column

I have a list of dynamically generated columns in a Kendo grid. I am using colsList here just as an example.
var colsList = ["A", "B", "C"];
for (var j = 0; j < colsList.length; j++) {
    var columnSchema = {
        "field": colsList[j],
        template: function (dataItem) {
            return getTemplate(dataItem, colsList[j]);
        }
    };
}

var getTemplate = function (dataItem, field) {
    // return template format;
};
When getTemplate is called, the second parameter, field, is always passed as the last item of colsList.
I need to prepare a column template that includes information about the column field associated with it.
How can this be achieved? I have tried a number of ways without success.
I am new to Kendo and not very familiar with templates.
Is there any other way of preparing a template that will help me achieve what I want? dataItem and the associated column field are the two main requirements while preparing the template, since on refreshing the grid datasource some conditions need to be checked in the template and the column data filled accordingly.
I have finally found a way to get the column field: use jQuery's each function instead of a for loop, as follows:
var colsList = ["A", "B", "C"];
$.each(colsList, function (index, item) {
    var columnSchema = {
        "field": item,
        template: function (dataItem) {
            return getTemplate(dataItem, item);
        }
    };
});

var getTemplate = function (dataItem, item) {
    /* item gives the column field for which template will be set */
    // return template format;
};
But I still have no idea why it behaves differently with a for loop, where field comes back as undefined in the template.
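For what it's worth, the difference can be reproduced without Kendo at all: a var declared in a for loop is a single variable shared by every closure created in that loop, while $.each (or any per-item callback) gives each closure its own parameter. A sketch, with Array.prototype.forEach standing in for $.each:

```javascript
var colsList = ["A", "B", "C"];

// for loop: every template closes over the same "j". By the time the
// templates run, the loop has finished and j === 3, so colsList[j] is
// undefined.
var forTemplates = [];
for (var j = 0; j < colsList.length; j++) {
  forTemplates.push(function () { return colsList[j]; });
}

// Callback style: each closure captures its own "item" parameter.
var eachTemplates = [];
colsList.forEach(function (item) {
  eachTemplates.push(function () { return item; });
});

var fromFor = forTemplates.map(function (f) { return f(); });   // [undefined, undefined, undefined]
var fromEach = eachTemplates.map(function (f) { return f(); }); // ["A", "B", "C"]
```

In modern JavaScript, declaring the loop variable with let instead of var would also fix the for-loop version, since let gives each iteration its own binding.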

MongoDB SSIS with $unwind

I recently started using MongoDB as a source in SSIS (using the C# driver). I am very new to MongoDB and C#.
When I did not have nested documents, statements like the one below worked for me:
var query = Query.And(
    Query.Or(Query.GT("CreatedOn", maxUpdatedOnBSON), Query.GT("UpdatedOn", maxUpdatedOnBSON)),
    Query.Or(Query.LT("CreatedOn", cutoffDate), Query.LT("UpdatedOn", cutoffDate)),
    Query.In("TestType", testTypes));
MongoCursor<BsonDocument> toReturn = collection.Find(query);
Now I have nested documents. I was able to write the JavaScript below, and it works in the MongoDB shell itself:
db.Test.aggregate([
    { $unwind: { path: "$Items", includeArrayIndex: "arrayIndex" } },
    { $match: { $and: [
        { $or: [ { CreatedOn: { $gt: ISODate("2015-11-22T00:00:00Z") } }, { UpdatedOn: { $gt: ISODate("2015-11-22T00:00:00Z") } } ] },
        { $or: [ { CreatedOn: { $lt: ISODate("2016-05-09T00:00:00Z") } }, { UpdatedOn: { $lt: ISODate("2016-05-09T00:00:00Z") } } ] }
    ] } }
])
As I understand it, in C# I have to use Aggregate instead of Find, but I cannot translate this code to C#. I still have the selection criteria plus the unwind.
Can you please help?
Because there is no collection template posted, I'm attaching a snippet similar to what you'd be looking for. Does this help?
var builder = Builders<BsonDocument>.Filter;
// An "and" can be expressed with the "&" operator or builder.And.
var filter = builder.Eq("state", "nj") | builder.Eq("state", "CO");
var filter2 = builder.Eq("pop", 6033) | builder.Eq("city", "nyc");
filter = builder.And(filter, filter2);
var pipeline = grades.Aggregate()
    .Unwind(x => x["Items"])
    .Match(filter);
var list = pipeline.ToList();
foreach (var item in list)
{
    // do something
}
I got help and am sharing the solution:
// Create matching criteria used in the aggregation pipeline to bring back only the specified documents based on date range
var match = new BsonDocument("$match",
    new BsonDocument("$and",
        new BsonArray()
            .Add(new BsonDocument("$or", new BsonArray()
                .Add(new BsonDocument("CreatedOn", new BsonDocument("$gt", maxUpdatedOnBSON)))
                .Add(new BsonDocument("UpdatedOn", new BsonDocument("$gt", maxUpdatedOnBSON)))))
            .Add(new BsonDocument("$or", new BsonArray()
                .Add(new BsonDocument("CreatedOn", new BsonDocument("$lt", cutoffDate)))
                .Add(new BsonDocument("UpdatedOn", new BsonDocument("$lt", cutoffDate)))))));

// Create the arguments to pass to the $unwind stage of the aggregation
var unwindargs = new BsonDocument("path", "$LineItems");
unwindargs.Add("includeArrayIndex", "arrayIndex");

// Create the unwind stage and add the arguments
var unwind = new BsonDocument("$unwind", unwindargs);

// Create a new pipeline and gather the results
var pipeline = new[] { match, unwind };
var mongoArgs = new AggregateArgs { Pipeline = pipeline };
var toReturn = collection.Aggregate(mongoArgs).ToList();
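For reference, the nested BsonDocument construction above is just building the same pipeline that the shell version expresses directly. Sketched as plain JavaScript objects (with hypothetical Date values standing in for maxUpdatedOnBSON and cutoffDate, and field names as in the solution):

```javascript
// Hypothetical date bounds standing in for maxUpdatedOnBSON / cutoffDate.
var maxUpdatedOn = new Date("2015-11-22T00:00:00Z");
var cutoffDate = new Date("2016-05-09T00:00:00Z");

// The $match stage: documents created or updated inside the window.
var match = {
  $match: {
    $and: [
      { $or: [{ CreatedOn: { $gt: maxUpdatedOn } }, { UpdatedOn: { $gt: maxUpdatedOn } }] },
      { $or: [{ CreatedOn: { $lt: cutoffDate } }, { UpdatedOn: { $lt: cutoffDate } }] }
    ]
  }
};

// The $unwind stage: one output document per entry in LineItems,
// keeping the original array position in "arrayIndex".
var unwind = { $unwind: { path: "$LineItems", includeArrayIndex: "arrayIndex" } };

var pipeline = [match, unwind];
```

Seeing the target shape this way can make it easier to verify that each BsonDocument/BsonArray nesting level in the C# version matches the intended JSON.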