Our application needs to pull a set of properties from all objects in the model, concatenating properties from leaf nodes with properties from their parent nodes.
We are calling the getBulkProperties() method with around 20K nodes and around 5 properties. This runs for quite some time and then we receive server errors and the callbacks are never invoked.
Is there a limit we should use? Should we split these calls with a max number X of nodes?
Any help would be appreciated as this is causing our application to hang.
Thanks!
I don't think there is a limit, but you may consider listing properties for a specific group at a time, or just leaf nodes.
This blog post shows how to optimize search performance, and the code below (from that post) shows how to integrate it with .getBulkProperties():
viewer.search('Steel', function (dbIds) {
    // for each element whose 'Material' matches 'Steel', fetch only the 'Mass' property
    viewer.model.getBulkProperties(dbIds, ['Mass'], function (elements) {
        var totalMass = 0;
        for (var i = 0; i < elements.length; i++) {
            totalMass += elements[i].properties[0].displayValue;
        }
        console.log(totalMass);
    });
}, null, ['Material']);
You may also consider enumerating only the leaf nodes of the model, as shown in this post and below:
function getAllLeafComponents(viewer, callback) {
    var cbCount = 0; // count pending callbacks
    var components = []; // store the results
    var tree; // the instance tree

    function getLeafComponentsRec(parent) {
        cbCount++;
        if (tree.getChildCount(parent) != 0) {
            tree.enumNodeChildren(parent, function (child) {
                // invoked once per direct child dbId
                getLeafComponentsRec(child);
            }, false);
        } else {
            components.push(parent);
        }
        if (--cbCount == 0) callback(components);
    }

    viewer.getObjectTree(function (objectTree) {
        tree = objectTree;
        getLeafComponentsRec(tree.getRootId());
    });
}
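If you do decide to split the call yourself, as you suggested, a simple batching wrapper is one way to go. Below is a minimal sketch of that idea; the chunk size of 1000 and the helper name getBulkPropertiesChunked are arbitrary assumptions, not documented limits or official API:
// Hypothetical helper: splits dbIds into chunks and runs one getBulkProperties call per chunk.
// The chunk size (1000) is an assumption to tune, not a documented limit.
function getBulkPropertiesChunked(model, dbIds, propNames, onDone) {
    var chunkSize = 1000;
    var totalChunks = Math.ceil(dbIds.length / chunkSize);
    var finished = 0;
    var results = [];
    for (var start = 0; start < dbIds.length; start += chunkSize) {
        var chunk = dbIds.slice(start, start + chunkSize);
        model.getBulkProperties(chunk, propNames, function (elements) {
            results = results.concat(elements);
            if (++finished === totalChunks) onDone(results); // all chunks have returned
        });
    }
}
// usage sketch: getBulkPropertiesChunked(viewer.model, leafIds, ['Mass'], function (all) { console.log(all.length); });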
I am running an event loop of the following form:
var i;
var j = 10;
for (i = 0; i < j; i++) {
    asynchronousProcess(function callbackFunction() {
        alert(i);
    });
}
I am trying to display a series of alerts showing the numbers 0 through 10. The problem is that by the time the callback function is triggered, the loop has already gone through a few iterations and it displays a higher value of i. Any recommendations on how to fix this?
The for loop runs immediately to completion while all your asynchronous operations are started. When they complete some time in the future and call their callbacks, the value of your loop index variable i will be at its last value for all the callbacks.
This is because the for loop does not wait for an asynchronous operation to complete before continuing on to the next iteration of the loop and because the async callbacks are called some time in the future. Thus, the loop completes its iterations and THEN the callbacks get called when those async operations finish. As such, the loop index is "done" and sitting at its final value for all the callbacks.
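For illustration, here is a minimal sketch of that timing, using setTimeout as a stand-in for any asynchronous operation; every callback ends up logging the same final value:
// setTimeout stands in for any async operation; the loop finishes first, so all callbacks see i === 10
for (var i = 0; i < 10; i++) {
    setTimeout(function () {
        console.log(i); // logs 10, ten times
    }, 0);
}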
To work around this, you have to save the loop index separately for each callback. In Javascript, the way to do that is to capture it in a function closure. That can either be done by creating an inline function closure specifically for this purpose (first example shown below), or you can create an external function that you pass the index to and let it maintain the index uniquely for you (second example shown below).
As of 2016, if you have a fully up-to-spec ES6 implementation of Javascript, you can also use let to define the for loop variable and it will be uniquely defined for each iteration of the for loop (third example below). But note that let was a late addition in ES6 implementations, so make sure your execution environment supports it.
Use .forEach() to iterate since it creates its own function closure
someArray.forEach(function (item, i) {
    asynchronousProcess(function () {
        console.log(i); // each callback closes over its own i
    });
});
Create Your Own Function Closure Using an IIFE
var j = 10;
for (var i = 0; i < j; i++) {
    (function (cntr) {
        // here the value of i is passed in as the argument cntr
        // and will be captured in this function closure so each
        // iteration of the loop can have its own value
        asynchronousProcess(function () {
            console.log(cntr);
        });
    })(i);
}
Create or Modify External Function and Pass it the Variable
If you can modify the asynchronousProcess() function, then you could just pass the value in there and have the asynchronousProcess() function pass the cntr back to the callback like this:
var j = 10;
for (var i = 0; i < j; i++) {
    asynchronousProcess(i, function (cntr) {
        console.log(cntr);
    });
}
Use ES6 let
If you have a Javascript execution environment that fully supports ES6, you can use let in your for loop like this:
const j = 10;
for (let i = 0; i < j; i++) {
    asynchronousProcess(function () {
        console.log(i);
    });
}
let declared in a for loop declaration like this will create a unique value of i for each invocation of the loop (which is what you want).
Serializing with promises and async/await
If your async function returns a promise, and you want to serialize your async operations to run one after another instead of in parallel and you're running in a modern environment that supports async and await, then you have more options.
async function someFunction() {
    const j = 10;
    for (let i = 0; i < j; i++) {
        // wait for the promise to resolve before advancing the for loop
        await asynchronousProcess();
        console.log(i);
    }
}
This will make sure that only one call to asynchronousProcess() is in flight at a time and the for loop won't even advance until each one is done. This is different than the previous schemes that all ran your asynchronous operations in parallel so it depends entirely upon which design you want. Note: await works with a promise so your function has to return a promise that is resolved/rejected when the asynchronous operation is complete. Also, note that in order to use await, the containing function must be declared async.
Run asynchronous operations in parallel and use Promise.all() to collect results in order
function someFunction() {
    let promises = [];
    for (let i = 0; i < 10; i++) {
        promises.push(asynchronousProcessThatReturnsPromise());
    }
    return Promise.all(promises);
}

someFunction().then(results => {
    // array of results in order here
    console.log(results);
}).catch(err => {
    console.log(err);
});
async/await is here (ES2017), so you can do this kind of thing very easily now.
// note: await is only valid inside an async function (or a module with top-level await)
var i;
var j = 10;
for (i = 0; i < j; i++) {
    await asynchronousProcess();
    alert(i);
}
Remember, this works only if asynchronousProcess() returns a Promise.
If asynchronousProcess is not under your control, then you can make it return a Promise yourself, like this:
function asyncProcess() {
    return new Promise((resolve, reject) => {
        asynchronousProcess(() => {
            resolve();
        });
    });
}
Then replace the line await asynchronousProcess(); with await asyncProcess();
Understanding Promises before even looking into async/await is a must.
(Also read about runtime support for async/await.)
Any recommendation on how to fix this?
Several. You can use bind:
for (i = 0; i < j; i++) {
    asynchronousProcess(function (i) {
        alert(i);
    }.bind(null, i));
}
Or, if your browser supports let (it is part of ES2015; Firefox has supported it for a while), you could have:
for (i = 0; i < j; i++) {
    let k = i;
    asynchronousProcess(function () {
        alert(k);
    });
}
Or, you could do the job of bind manually (in case the browser doesn't support it; in that case you could implement a shim, which should be in the link above):
for (i = 0; i < j; i++) {
    asynchronousProcess(function (i) {
        return function () {
            alert(i);
        };
    }(i));
}
I usually prefer let when I can use it (e.g. for a Firefox add-on); otherwise bind or a custom currying function (one that doesn't need a context object).
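For reference, a minimal sketch of such a currying (partial application) helper that needs no context object; the name partial is only illustrative:
// Hypothetical helper: returns a new function with the given arguments pre-filled
function partial(fn /*, ...presetArgs */) {
    var preset = Array.prototype.slice.call(arguments, 1);
    return function () {
        return fn.apply(null, preset.concat(Array.prototype.slice.call(arguments)));
    };
}

for (i = 0; i < j; i++) {
    asynchronousProcess(partial(function (n) { alert(n); }, i)); // each callback keeps its own copy of i
}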
// prints 0..9, one value every 500 ms, by chaining the next "iteration" from a setTimeout callback
var i = 0;
var length = 10;

function for1() {
    console.log(i);
    for2();
}

function for2() {
    if (i == length) {
        return false;
    }
    setTimeout(function () {
        i++;
        for1();
    }, 500);
}

for1();
The above is a sample recursive, callback-driven approach to what is expected here.
ES2017: you can wrap the async code inside a function (say, XHRPost) that returns a promise (the async code lives inside the promise).
Then call the function (XHRPost) inside the for loop, but with the magical await keyword. :)
let url = 'http://sumersin/forum.social.json';

function XHRpost(i) {
    return new Promise(function (resolve) {
        // create a fresh request for each call so requests don't share state
        let http = new XMLHttpRequest();
        let params = 'id=nobot&%3Aoperation=social%3AcreateForumPost&subject=Demo' + i + '&message=Here%20is%20the%20Demo&_charset_=UTF-8';
        http.open('POST', url, true);
        http.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
        http.onreadystatechange = function () {
            console.log("Done " + i + "<<<<>>>>>" + http.readyState);
            if (http.readyState == 4) {
                console.log('SUCCESS :', i);
                resolve();
            }
        };
        http.send(params);
    });
}
(async () => {
    for (let i = 1; i < 5; i++) {
        await XHRpost(i);
    }
})();
JavaScript code runs on a single thread, so you cannot, as a rule, block and wait for the first loop iteration to complete before beginning the next without seriously impacting page usability.
The solution depends on what you really need. If the example is close to exactly what you need, #Simon's suggestion to pass i to your async process is a good one.
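For completeness, a minimal sketch of that suggestion, assuming you are free to add a parameter to asynchronousProcess() so it can hand the value back to its callback (the setTimeout body is just a stand-in for the real async work):
// assumed signature: asynchronousProcess(value, callback) echoes the value back when it finishes
function asynchronousProcess(value, callback) {
    setTimeout(function () {
        callback(value);
    }, 100);
}

for (var i = 0; i < 10; i++) {
    asynchronousProcess(i, function (n) {
        alert(n); // each alert shows the value captured for that call
    });
}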
I am missing something fundamental in terms of callbacks/async in the code below: why do I get:
[,,'[ {JSON1} ]']
[,,'[ {JSON2} ]']
(= 2 console outputs) instead of only one console output with one proper array, which is what I want and would look like:
[,'[ {JSON1} ]','[ {JSON2} ]']
or ideally:
[{JSON1},{JSON2}]
See my code below; getPTdata is a function I created to retrieve some JSON via a REST API (an HTTPS request). I cannot get everything at once since the API I'm talking to has a limit, hence the limit and offset parameters of my calls.
offsets = [0,1]
res = []

function goGetData(callback) {
    for (var a = 0; a < offsets.length; a++) {
        getPTdata('stories',
            '?limit=1&offset=' + offsets[a] + '&date_format=millis',
            function (result) {
                //called once getPTdata is done
                res[a] = result
                callback(res)
            });
    }
}

goGetData(function (notgoingtowork) {
    //called once goGetData is done
    console.log(res)
})
Solved like this:
offsets = [0,1]
res = []

function goGetData(callback) {
    var nb_returns = 0
    for (var a = 0; a < offsets.length; a++) {
        getPTdata('stories', '?limit=1&offset=' + offsets[a] + '&date_format=millis', function (result) {
            //note: because of "loop closure" I cannot use a here anymore
            //called once getPTdata is done, therefore we know result and can store it
            nb_returns++
            res.push(JSON.parse(result))
            if (nb_returns == offsets.length) {
                callback(res)
            }
        });
    }
}

goGetData(function (consolidated) {
    //called once goGetData is done
    console.log(consolidated)
})
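As a side note, if getPTdata were wrapped to return a promise, the same consolidation falls out of Promise.all, and the results keep the order of offsets (unlike the push-on-arrival version above). This is only a sketch; getPTdataAsync is a hypothetical wrapper, not part of the original code:
// hypothetical promise wrapper around the callback-based getPTdata
function getPTdataAsync(kind, query) {
    return new Promise(function (resolve) {
        getPTdata(kind, query, function (result) {
            resolve(JSON.parse(result));
        });
    });
}

Promise.all(offsets.map(function (offset) {
    return getPTdataAsync('stories', '?limit=1&offset=' + offset + '&date_format=millis');
})).then(function (consolidated) {
    console.log(consolidated); // results in the same order as offsets
});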
New to node. As I am cycling through a roster of students, I need to check whether a teacher has requested them for tutoring.
I realized I can't just do this:
var checkRequest = function (id) {
    var value = '';
    roster.query('SELECT * FROM teacher_request WHERE student_id = ' + id, function (err, row) {
        value = row.length;
    });
    return value; // still '' here; the query callback has not run yet
};
After a bit of digging around, promises looked like a great solution, but if I simply return the deferred.promise from the checkRequest function, all I get is an object that prints as [deferred promise], and I can't access the actual data from it (or have not figured out how yet).
If I follow along with their API and use .then (as illustrated in the getRow function), I am back in the same problem I was in before.
function checkRequest(id) {
    console.log(id);
    var deferred = Q.defer();
    connection.query('SELECT * FROM teacher_request WHERE student_id = ' + id, function (err, row) {
        deferred.resolve(row.length);
    });
    return deferred.promise;
}

var getRow = function (id) {
    checkRequest(id).then(function (val) {
        console.log(val); // works great
        return val; // back to the same problem
    });
};
The roster needs to be able to be pulled from an external API which is why I am not bundling the request check with the original roster query.
Thanks in advance
From the stuff you posted, I assume you have not really understood the concept of promises. They allow you to queue up callbacks that get executed when the asynchronous operation has finished (by succeeding or failing).
So instead of somehow getting the results back into your synchronous workflow, you should convert that workflow to work asynchronously as well. Here is a small example for your current problem:
// your students' ids in here
var studentsArray = [ 1, 2, 5, 6, 9 ];

for( var i = 0; i < studentsArray.length; i++ ) {
    checkRequest( studentsArray[i] )
        .then( function( data ){
            console.log( data ); // whatever checkRequest() resolved with (row.length above)
            // any other code related to a specific student in here
        });
}
or another option, if you need all students' data at the same time:
// your students' ids in here
var studentsArray = [ 1, 2, 5, 6, 9 ];

// collect all promises
var reqs = [];
for( var i = 0; i < studentsArray.length; i++ ) {
    reqs.push( checkRequest( studentsArray[i] ) );
}

Q.all( reqs )
    .then( function( results ){
        // `results` is an array of all resolved values, in the same order as studentsArray
    });
I'm attempting to gradually refactor existing code. I have a set of functions that are defined, and only differ by one of the internal arguments:
function loadGame1():void
{
    loadGame("save1");
}

function loadGame2():void
{
    loadGame("save2");
}

function loadGame3():void
{
    loadGame("save3");
}

//... snip many, many lines
// Note: I cannot pass function arguments at this time!
picker(loadGame1, loadGame2, loadGame3 ...);
I'm trying to refactor at least part of this (I can't completely replace the whole thing yet, too many interdependencies).
Basically, I want to be able to generate a big set of functions with the difference between the functions being an internal parameter:
var fNames:Array = new Array("save1", "save2", "save3");
var funcs:Array = new Array();

for (var i = 0; i < fNames.length; i += 1)
{
    trace("Creating function with indice = ", i);
    funcs.push(
        function():void
        {
            saveGame(fNames[i]);
        }
    );
}

picker(funcs[0], funcs[1], funcs[2] ...);
However, as I understand it, the closure is causing the state of i to be maintained beyond the scope of the for loop, and any attempt to call any of the generated functions fails with an out-of-bounds error, which is what you would expect given that i will have reached fNames.length by the time i < fNames.length evaluates to false.
So, basically, given that I need to generate functions that are passed as arguments to a pre-existing function that I cannot change currently. How can I dynamically generate these functions?
Try using an IIFE (immediately invoked function expression):
for (var i = 0; i < fNames.length; i += 1)
{
    (function(i){
        trace("Creating function with indice = ", i);
        funcs.push(
            function():void
            {
                saveGame(fNames[i]);
            }
        );
    })(i);
}
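Another option, just a sketch, is a small factory function that captures the save name instead of the loop index; makeSaveGame is a name introduced here for illustration, and the type annotations are omitted so it reads the same in JavaScript and ActionScript:
// Hypothetical factory: returns a new zero-argument function bound to one save name
function makeSaveGame(name)
{
    return function()
    {
        saveGame(name);
    };
}

for (var i = 0; i < fNames.length; i += 1)
{
    funcs.push(makeSaveGame(fNames[i]));
}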
I want to write a function in JS where I loop through the tables in my IndexedDB database, get the maximum "last modified" value of a table, and return it.
function readData() {
    var trans = '';
    trans = idb.transaction(["tableName"], 'readonly'); // Create the transaction
    var request = trans.objectStore("tableName").openCursor();
    request.onsuccess = function (e) {
        var cursor = request.result || e.result;
        if (cursor) {
            // logic to find the maximum
        } else {
            return // max last modified
        }
        cursor.continue();
    }
}
Important: since the onsuccess handler is asynchronous, how can I make it synchronous, so that my method readData() returns only when the max last-modified record has been found? Then I could call readData() synchronously to get the last-modified record of 2-3 tables if I want.
The sync API is only available in a web worker, so that would be the first requirement. (As far as I know, only IE10 supports this at the moment.)
Another shot you can give it is working with JS 1.7 and the yield keyword. For more information about it, look here.
I would suggest working with a callback method that you call when you have reached the latest value.
function readData(callback) {
    var trans = idb.transaction(["tableName"], 'readonly'); // Create the transaction
    var request = trans.objectStore("tableName").openCursor();
    var maxKey;
    request.onsuccess = function (e) {
        var cursor = request.result || e.result;
        if (cursor) {
            // logic to find the maximum
            maxKey = cursor.primaryKey;
            cursor.continue();
        }
    };
    trans.oncomplete = function (e) {
        callback(maxKey);
    };
}
The IndexedDB API in the top frame is async, and async cannot be made synchronous. But you can read all the tables in a single transaction.
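A minimal sketch of that idea, assuming object stores named 'table1' and 'table2' and a lastModified field on each record (all names here are illustrative, not from the original code):
// open one read-only transaction over several stores and collect the max lastModified per store
function readMaxLastModified(idb, storeNames, callback) {
    var trans = idb.transaction(storeNames, 'readonly');
    var maxByStore = {};
    storeNames.forEach(function (name) {
        var request = trans.objectStore(name).openCursor();
        request.onsuccess = function (e) {
            var cursor = e.target.result;
            if (cursor) {
                var value = cursor.value.lastModified;
                if (maxByStore[name] === undefined || value > maxByStore[name]) {
                    maxByStore[name] = value;
                }
                cursor.continue();
            }
        };
    });
    trans.oncomplete = function () {
        callback(maxByStore); // e.g. { table1: <max>, table2: <max> }
    };
}

// usage sketch: readMaxLastModified(idb, ['table1', 'table2'], function (result) { console.log(result); });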