Delay between promises when using Promise.all - ecmascript-6

Is there a way to delay the evaluation of an array of promises using Promise.all()?
Does it make sense to manually add a delay function to the end of each promise before adding them to the array?
Promise.all([p1,p2,p3]).then(res => console.log(res))
I would like to add a delay because my server can't handle too many requests at once.

Promise.all resolves when all of the given promises are fulfilled, but it has no control over how those promises run; by the time they are passed in, they are already running. To introduce a delay, the promises should be created with the delay built in from the start:
const delayIncrement = 500;
let delay = 0;
const p1 = new Promise(resolve => setTimeout(resolve, delay)).then(() => fetch(...));
delay += delayIncrement;
const p2 = new Promise(resolve => setTimeout(resolve, delay)).then(() => fetch(...));
delay += delayIncrement;
...
Promise.all([p1,p2,p3]).then(...);
The same solution can be used for creating request promises in batch inside a loop.
The recipes for delayed promises can be found in this answer.

I needed to create the calls dynamically, so based on the answer from estus-flask I managed to come up with:
let delay = 0;
const delayIncrement = 1000;
const promises = items.map(item => {
  delay += delayIncrement;
  return new Promise(resolve => setTimeout(resolve, delay))
    .then(() => fetch(...));
});
let results = await Promise.all(promises);

Yes, you can stagger the promises you pass to Promise.all, and it's quite easy to do:
// Promise.all() with delays for each promise
let tasks = [];
for (let i = 0; i < 10; i++) {
  const delay = 500 * i;
  tasks.push(new Promise(async function(resolve) {
    // the timer/delay
    await new Promise(res => setTimeout(res, delay));
    // the promise you want delayed
    // (for example):
    // let result = await axios.get(...);
    let result = await new Promise(r => {
      console.log("I'm the delayed promise...maybe an API call!");
      r(delay); // result is delay ms, for demo purposes
    });
    // resolve outer/original promise with result
    resolve(result);
  }));
}
Promise.all(tasks).then(results => {
  console.log('results: ' + results);
});
Rather than a delay between the chain, which can be done with .then() as shown in other answers, this is a delay that differs for each Promise so that when you call Promise.all() they will be staggered. This is useful when, say, you are calling an API with a rate limit that you'd breach by firing all the calls in parallel.
Peace

The simplest solution for me seemed to be to just take the current index of the map function that produces the promises, and use that index to base a delay on:
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))
await Promise.all(
  dataPoints.map(async (dataPoint, index) => {
    await sleep(index * 1000)
    ...
This makes each of the operations wait index * 1 second to be fired, effectively placing a 1s delay between each operation.
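Put together, this pattern might look like the following self-contained sketch; `runStaggered`, the `gap` parameter and the toy task are illustrative names rather than part of the original answer.

```javascript
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

// Stagger calls to an async task by `gap` ms per item: item 0 starts
// immediately, item 1 after `gap` ms, item 2 after 2 * `gap` ms, etc.
// `task` stands in for whatever operation needs rate-limiting.
async function runStaggered(items, task, gap = 1000) {
  return Promise.all(
    items.map(async (item, index) => {
      await sleep(index * gap);
      return task(item);
    })
  );
}
```

Promise.all still resolves with the results in the original array order, regardless of when each staggered call actually finished.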

Is there a way to delay the evaluation of an array of promises using
Promise.all()?
No. Promises are not "evaluated", they just resolve. When this happens is determined by their creator and nothing else. When Promise.all is called, the promises p1, p2 and p3 have already been created (and their asynchronous tasks probably already have been started).

Another way you can do this is by hijacking the way loops are transpiled:
async function doABatchOfAsyncWork(workItems) {
  for (const item of workItems) {
    await workTask(item)
    await delay(1000) // not built-in but easily implemented with setTimeout + promise
  }
}
You can also save the values of course and return them at the end, exactly as you usually can in for-loops. You can't do this with map since the await would have to be in the async context of the map-functor passed in. If you used map it would execute everything at ~the same time, with a delay of 1s at the end.
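A sketch of the collect-and-return variant mentioned above; `workTask` is a placeholder for the real async operation, and the pause length is made a parameter purely for illustration.

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

// Process items strictly one after another, pausing between each,
// and hand back all results at the end.
async function doABatchAndCollect(workItems, workTask, pauseMs = 1000) {
  const results = [];
  for (const item of workItems) {
    results.push(await workTask(item));
    await delay(pauseMs); // pause before the next item
  }
  return results;
}
```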


Use CLS with Sequelize Unmanaged transactions

I am writing unit tests for my code and wish to use transactions to prevent any stray data between tests.
The code uses Sequelize ORM for all interactions with the database. Since changing the actual code is not an option, I would be using cls-hooked to maintain transaction context instead of passing transaction to all the queries. There is a problem, however. On reading the official documentation and trying to go about it, the above approach seems to only work for managed transactions.
So far, the test code looks somewhat like:
test("Test description", async () => {
  try {
    await sequelize.transaction(async (t) => {
      // Actual test code
    });
  } catch (error) {
    // Do nothing if query rolled back
  }
});
What I intend to achieve (for obvious reasons):
let t;

beforeEach(async () => {
  t = await sequelize.transaction();
});

test("Test description", async () => {
  // Actual test code
});

afterEach(async () => {
  await t.rollback();
});
Is this possible? If yes, any help in implementing this would be appreciated.
I'm having the same problem -- after much Googling, I found this closed issue where they indicate that unmanaged transactions aren't supported by design 🥲
It's true that Sequelize doesn't automatically pass transactions to queries when you're using unmanaged transactions. But you can manually set the transaction property on the CLS namespace, just like Sequelize does on a managed transaction:
https://github.com/sequelize/sequelize/blob/v6.9.0/lib/transaction.js#L135
namespace.run(() => {
  namespace.set('transaction', transaction);
  /* run code that does DB operations */
});
This is tricky for tests because describe() calls can be nested and each can have their own beforeAll()/afterAll() and beforeEach()/afterEach() hooks. To do this right, each before hook needs to set up a nested transaction and the corresponding after hook should roll it back. In addition, the test case itself needs to run in a nested transaction so that its DB operations don't leak into other tests.
For anyone from the future:
I was facing the above problem and found a way to fix it with a helper function and cls-hooked.
const transaction = async (namespace: string, fn: (transaction: Transaction) => unknown) => {
  const nameSpace = createNamespace(namespace);
  db.Sequelize.useCLS(nameSpace);
  const sequelize = db.sequelize;
  const promise = sequelize.transaction(async (transaction: Transaction) => {
    try {
      await fn(transaction);
    } catch (error) {
      console.error(error);
      throw error;
    }
    throw new TransactionError();
  });
  await expect(promise).rejects.toThrow(TransactionError);
  destroyNamespace(namespace);
};
The above code creates a cls namespace and a transaction that are discarded after the test run; the TransactionError is thrown to ensure the entire transaction is always rolled back on each test run.
Usage on tests would be:
describe('Transaction', () => {
  test('Test', async () => {
    await transaction('test', async () => {
      // test logic here
    });
  });
});

calling store procedures within fast-csv asynchronously

I am writing a backend API in Node.js and need users to be able to upload files with data, which then triggers stored procedure calls to insert the data into MySQL. I'm thinking of using fast-csv as the parser, but I am struggling with how to call the stored procedure from within the CSV stream. The idea is something like this:
var fs = require("fs");
var csv = require("fast-csv");
var stream1 = fs.createReadStream("files/testCsvFile.csv");
csv
  .fromStream(stream1, { headers: true })
  .on("data", function(data) {
    // CALL TO SP with params from "data"
    numlines++;
  })
  .on("end", function() {
    console.log("done");
  });
In other parts of application I have set up routes as follows:
auth.post("/verified", async (req, res) => {
  var user = req.session.passwordless;
  if (user) {
    const rawCredentials = await admin.raw(getUserRoleCredentials(user));
    const { user_end, role } = await normalizeCredentials(rawCredentials);
    const user_data = { user_end, role };
    res.send(user_data);
  } else {
    res.sendStatus(401);
  }
});
...that is, routes are written in the async/await style, with queries (all of them Stored Procedure calls) defined as promises. I would like to follow this pattern in the upload / parse CSV / call SP for every line function.
This is doing the job for me. Can you please describe how to achieve that with your framework? I believe it should be doable somehow, I just need to configure it correctly:
// use fast-csv to stream data from a file
csv
  .fromPath(form.FileName, { headers: true })
  .on("data", async data => {
    const query = await queryBuilder({
      schema,
      routine,
      parameters,
      request
    }); // here we prepare the query for calling the SP with parameters from data
    winston.info(query + JSON.stringify(data));
    const rawResponse = await session.raw(query); // here the query gets executed
    fileRows.push(data); // push each row - for testing only
  })
  .on("end", function() {
    console.log(fileRows);
    fs.unlinkSync(form.FileName); // remove temp file
    // process "fileRows" and respond
    res.end(JSON.stringify(fileRows)); // - for testing
  });
As mentioned in the comment, I made my scramjet framework to handle such use cases with ease... Please correct me if I understood it wrong, but I understand you want to call the two await lines for every CSV row in the test.
If so, your code would look like this (updated to match your comment/answer):
var fs = require("fs");
var csv = require("fast-csv");
var stream1 = fs.createReadStream("files/testCsvFile.csv");
var {DataStream} = require("scramjet");

DataStream
  // the following line will convert any stream to scramjet.DataStream
  .from(csv.fromStream(stream1, { headers: true }))
  // the next line controls how many simultaneous operations are made
  // I assumed 16, but if you're fine with 40 or you want 1 - go for it.
  .setOptions({maxParallel: 16})
  // the next line will call your async function and wait until it's completed
  // and control the back-pressure of the stream
  .do(async (data) => {
    const query = await queryBuilder({
      schema,
      routine,
      parameters,
      request
    }); // here we prepare the query for calling the SP with parameters from data
    winston.info(query + JSON.stringify(data));
    const rawResponse = await session.raw(query); // here the query gets executed
    return data; // push each row - for testing only
  })
  // next line will run the stream until end and return a promise
  .toArray()
  .then(fileRows => {
    console.log(fileRows);
    fs.unlinkSync(form.FileName); // remove temp file
    // process "fileRows" and respond
    res.end(JSON.stringify(fileRows)); // - for testing
  })
  .catch(e => {
    res.writeHead(500); // some error handling
    res.end(e.message);
  });
// you may want to put an await statement before this, or call then to check
// for errors, which I assume is your use case.
To answer your comment question: if you were to use an async function in the on("data") handler, you would need to collect an array of promises and await Promise.all of that array when the stream ends. The collecting has to happen synchronously, so an async function as the event handler alone won't do it.
In scramjet this happens under the hood, so you can use the function.
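For comparison, the collect-promises approach described above might be sketched like this; `processRow` is a stand-in for the stored-procedure call. Note there is no concurrency limit here, which is exactly what scramjet's maxParallel adds.

```javascript
// Push a promise for each row synchronously as rows arrive, then wait
// for all of them once the stream signals "end".
function parseAndProcess(stream, processRow) {
  return new Promise((resolve, reject) => {
    const pending = [];
    stream
      .on("data", row => {
        pending.push(processRow(row)); // no await here: just collect
      })
      .on("end", () => {
        Promise.all(pending).then(resolve, reject);
      })
      .on("error", reject);
  });
}
```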

How do I use promises in a Chrome extension?

What I am trying to do is create a chrome extension that creates new, nested, bookmark folders, using promises.
The function to do this is chrome.bookmarks.create(). However I cannot just
loop this function, because chrome.bookmarks.create is asynchronous. I need to wait until the folder is created, and get its new ID, before going on to its children.
Promises seem to be the way to go. Unfortunately I cannot find a minimal working example using an asynchronous call with its own callback like chrome.bookmarks.create.
I have read some tutorials and searched Stack Overflow, but none of the questions seem to be about plain vanilla promises with the Chrome extension API.
I do not want to use a plugin or library: no node.js or jquery or Q or whatever.
I have tried following the examples in the tutorials but many things do not make sense. For example, the tutorial states:
The promise constructor takes one argument—a callback with two
parameters: resolve and reject.
But then I see examples like this:
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));
How this works is a mystery to me.
Also, how can you call resolve() when it's never been defined? No example in the tutorials seems to match real-life code. Another example is:
function isUserTooYoung(id) {
  return openDatabase() // returns a promise
    .then(function(col) {return find(col, {'id': id});})
How do I pass in col, or get any results?
So if anyone can give me a minimal working example of promises with an asynchronous function with its own callback, it would be greatly appreciated.
SO wants code, so here is my non-working attempt:
// loop through all
function createBookmarks(nodes, parentid){
  var jlen = nodes.length;
  var i;
  var node;
  for(var i = 0; i < nodes.length; i++){
    var node = nodes[i];
    createBookmark(node, parentid);
  }
}

// singular create
function createBookmark(node, parentid){
  var bookmark = {
    parentId : parentid,
    index : node['index'],
    title : node['title'],
    url : node['url']
  }
  var callback = function(result){
    console.log("creation callback happened.");
    return result.id; // pass ID to the callback, too
  }
  var promise = new Promise(function(resolve, reject) {
    var newid = chrome.bookmarks.create(bookmark, callback)
    if (newid){
      console.log("Creating children with new id: " + newid);
      resolve( createBookmarks(bookmark.children, newid));
    }
  });
}

// allnodes already exists
createBookmarks(allnodes[0],"0");
Just doesn't work. The result from the callback is always undefined, which it should be, and I do not see how a promise object changes anything. I am equally mystified when I try to use promise.then().
var newid = promise.then( // wait for a response?
  function(result){
    return chrome.bookmarks.create(bookmark, callback);
  }
).catch(function(error){
  console.log("error " + error);
});
if (node.children) createBookmarks(node.children, newid);
if (node.children) createBookmarks(node.children, newid);
Again, newid is always undefined, because of course bookmarks.create() is asynchronous.
Thank you for any help you can offer.
Honestly, you should just use the web extension polyfill. Manually promisifying the chrome APIs is a waste of time and error prone.
If you're absolutely insistent, this is an example of how you'd promisify chrome.bookmarks.create. For other chrome.* APIs, you also have to reject the callback's error argument.
function createBookmark(bookmark) {
  return new Promise(function(resolve, reject) {
    try {
      chrome.bookmarks.create(bookmark, function (result) {
        if (chrome.runtime.lastError) reject(chrome.runtime.lastError)
        else resolve(result)
      })
    } catch (error) {
      reject(error)
    }
  })
}

createBookmark({})
  .then(function (result) {
    console.log(result)
  }).catch(function (error) {
    console.log(error)
  })
To create multiple bookmarks, you could then:
function createBookmarks(bookmarks) {
  return Promise.all(
    bookmarks.map(function (bookmark) {
      return createBookmark(bookmark)
    })
  )
}

createBookmarks([{}, {}, {}, {}])
  .catch(function (error) {
    console.log(error)
  })
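To get back to the original nested-folder problem, the promisified createBookmark can also be chained recursively. This is only a sketch: it assumes nodes shaped like { title, url, children } and a createBookmark(bookmark) helper, like the one above, that resolves with the created node (including its id).

```javascript
// Create a bookmark, wait for its new id, then create its children
// under that id. Each level only starts after its parent folder
// exists, which is what the original question needed.
function createTree(node, parentId) {
  return createBookmark({
    parentId: parentId,
    title: node.title,
    url: node.url
  }).then(function (created) {
    if (!node.children) return created;
    return Promise.all(
      node.children.map(function (child) {
        return createTree(child, created.id);
      })
    ).then(function () { return created; });
  });
}
```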
Taking advantage of the convention that the callback is always the last argument, I use a simple helper function to promisify the chrome API:
function toPromise(api) {
  return (...args) => {
    return new Promise((resolve) => {
      api(...args, resolve);
    });
  };
}
and use it like:
toPromise(chrome.bookmarks.create)(bookmark).then(...);
In my use case, it just works most of the time.

NightmareJS asynchronous operations in a loop, wait for everything to be done

I have a gulp task that uses Nightmare to visit a series of URLs, extract SVGs from them, process them and output them.
gulp.task('export', done => {
  const path = require('path');
  const Nightmare = require('nightmare');
  const nightmare = new Nightmare();
  const urls = ['http://one.com', 'http://two.org', 'http://three.net'];

  async function exportPDFs (items) {
    for (url of items) {
      const filename = path.parse(url).name;
      const selector = 'svg';
      await nightmare
        .goto(url)
        .wait(selector)
        .evaluate(selector => {
          let content;
          // Extract SVG from the page
          return content;
        }, selector)
        .then(
          svg => {
            // Heavy operation that takes long
            // How do I wait for this properly?
            processThing(filename);
            outputThing(filename);
          },
          err => console.error('Page evaluation failed', err)
        );
    }
    await nightmare.end().then(() => done()); // ???
  }

  exportPDFs(urls);
});
How can I make it wait for the processing and outputting on each iteration, and at the end of all of them end the gulp task with done()?
Currently it ends before saving the last PDF:
Starting 'export'...
one.pdf saved
two.pdf saved
Finished 'export' after 3.2 s
three.pdf saved
Convert processThing and outputThing into promise-returning functions, then chain them like this:
.evaluate(() => /* the code */)
.then(processThing)
.then(outputThing)
.catch(e => /* deal with errors */)
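Put into the original loop, that chain might look like the following sketch; the evaluate body and both helpers are stand-ins for the real extraction, processing and output code.

```javascript
// Once processThing and outputThing return promises, awaiting the
// whole chain stops the loop from advancing until the current URL's
// output has actually been written; end() then runs last.
async function exportPDFs(urls, nightmare, processThing, outputThing) {
  for (const url of urls) {
    await nightmare
      .goto(url)
      .wait('svg')
      .evaluate(() => document.querySelector('svg').outerHTML)
      .then(processThing)
      .then(outputThing)
      .catch(err => console.error('Export failed for', url, err));
  }
  await nightmare.end();
}
```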

Yielding streams in generator functions

I have recently started using the Q.spawn function to run generators that yield promises. This works well in browsers, where support for streams has yet to land, but in Node we do have streams. If you use streams inside a generator function and want to yield once the writer stream is done, the code becomes not so clean:
Q.spawn(function* () {
  yield new Promise(resolve => {
    let fromStream = fs.createReadStream('x.txt');
    let toStream = fs.createWriteStream('y.txt');
    toStream.on('end', () => resolve());
    fromStream.pipe(toStream);
  });
});
It works, but as soon as I start dealing with a lot of streams the code becomes really ugly. Can this be made as simple as the following snippet?
someGeneratorFuncRunner(function* () {
  yield fs.createReadStream('x.txt')
    .pipe(fs.createWriteStream('y.txt'));
});
You don't need to put most of that code inside the Promise constructor:
Q.spawn(function* () {
  let fromStream = fs.createReadStream('x.txt');
  let toStream = fs.createWriteStream('y.txt');
  fromStream.pipe(toStream);
  yield new Promise(resolve => {
    // note: writable streams emit 'finish' (not 'end') once all data is flushed
    toStream.on('finish', resolve);
  });
});
And of course if you're waiting for lots of streams, it would make sense to factor out the promise constructor call into a helper function:
function endOf(stream) {
  return new Promise(resolve => {
    // readable streams signal completion with 'end', writables with 'finish'
    stream.on('end', resolve);
    stream.on('finish', resolve);
  });
}