Implement polling mechanism within a JavaScript promise chain - es6-promise

I am writing a node.js (v12.13.0) script that involves a REST API. Imagine the following sequence of tasks, to be performed in the same order:
Task 1
|
User Action
|
Task 2
|
Task 3
Task 1 => obtain a 5-digit alphanumeric code (valid for 10 mins) by calling the REST API
User Action => user enters aforementioned alphanumeric code in a web browser
Task 2 => obtain an OAuth2 authorization code by calling the REST API; if the User Action step is not completed, the REST API will return an error response
Task 3 => obtain OAuth2 tokens by calling the REST API after the authorization code is received as a result of completing Task 2
User Action and Task 2 must be completed within 10 minutes of the generation of the 5-digit alphanumeric code in Task 1. Since Tasks 1, 2 and 3 are performed within a script, there is no way of knowing exactly when a user completes User Action. So, the only option is for Task 2 to repeatedly call the REST API until an OAuth2 authorization code is generated. If 10 minutes have elapsed and the REST API continues to yield an error, then the script must stop since it cannot perform Task 3.
I tried implementing this using JavaScript's Promise (ES 6) and setInterval along the following lines:
const fnPerformTask1 = () => {
  return new Promise((resolve, reject) => {
    /*
      code goes here, to call REST API and obtain a 5-digit alphanumeric code
    */
    resolve({
      "code": "<5_digit_alpha_code>",
      "genTime": new Date()
    });
  });
}
const fnPerformTask2 = (task1Result) => {
  return new Promise((resolve, reject) => {
    /*
      (a) code goes here, to call REST API and obtain an OAuth2 authorization code
      (b) if REST API returns error _and_ 10 minutes have not yet elapsed, repeat step (a)
    */
    resolve(<authorization_code>);
  });
}
const fnPerformTask3 = (task2Result) => {
  return new Promise((resolve, reject) => {
    /*
      code goes here, to call REST API and obtain OAuth2 tokens
    */
    resolve(<tokens>);
  });
}
fnPerformTask1()
  .then(fnPerformTask2)
  .catch(console.error) // if Task 2 fails, break out of this promise chain and don't perform Task 3
  .then(fnPerformTask3)
  .then(console.info)
  .catch(console.error);
Can I implement some kind of a poll-like mechanism in Task 2, using Promise and setInterval, and clear the interval when either the REST API returns an authorization code, or 10 minutes have elapsed without receiving an authorization code... presumably because the user did not complete User Action within 10 minutes?
A second, related question is how do I break out of a promise chain if Task 2 cannot be completed?

how do I break out of a promise chain if Task 2 cannot be completed?
fnPerformTask1()
  .then(fnPerformTask2)
  .catch(err => { throw err; }) // re-throwing breaks the promise chain; control jumps to the final .catch()
  .then(fnPerformTask3)
  .then(console.info)
  .catch((err) => console.log("potential error thrown from fnPerformTask2"));
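As for the first question: yes, you can wrap setInterval in a Promise, resolve on the first successful API response, and reject (after clearing the interval) once the deadline passes. Below is a minimal sketch under those assumptions; the names pollUntil and mockApi are illustrative, and mockApi stands in for the real REST call:

```javascript
// pollUntil calls pollFn every intervalMs; it resolves with pollFn's
// first successful result and rejects once timeoutMs has elapsed
// without a success. (In a real script, guard against overlapping
// in-flight calls if the API is slow.)
function pollUntil(pollFn, intervalMs, timeoutMs) {
  return new Promise((resolve, reject) => {
    const deadline = Date.now() + timeoutMs;
    const timer = setInterval(() => {
      pollFn()
        .then((result) => {          // API returned an authorization code
          clearInterval(timer);
          resolve(result);
        })
        .catch(() => {               // API still returns an error response
          if (Date.now() >= deadline) {
            clearInterval(timer);
            reject(new Error("Timed out waiting for User Action"));
          }
          // otherwise: keep polling on the next tick
        });
    }, intervalMs);
  });
}

// Mock REST call standing in for Task 2: errors twice, then yields a code.
let attempts = 0;
const mockApi = () =>
  ++attempts < 3
    ? Promise.reject(new Error("authorization_pending"))
    : Promise.resolve("AUTH_CODE_123");

pollUntil(mockApi, 50, 10 * 60 * 1000) // real script: poll for up to 10 minutes
  .then((code) => console.log(code)); // logs "AUTH_CODE_123"
```

The rejection propagates down the chain exactly like the re-thrown error above, so fnPerformTask3 is skipped when the 10-minute window closes.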

How to fetch data from MySQL after POST completes

When the user clicks a button, I want the app to
read text from an input field
post it to MySQL to add as a new row
pull all rows from MySQL (including the new row)
update the react component state with the new data it just pulled
and rerender a list of all that data for the user
The problem I'm seeing is that pulling down all rows doesn't include the one that was just posted unless I include a manual delay (via setTimeout).
This is what my 2 functions look like
// gets all data, updates state, thus triggering react to rerender
listWords() {
  setTimeout(() => fetch("http://localhost:9000/listWords")
    .then(res => res.text())
    .then(res => this.setState({ wordList: JSON.parse(res) })), 2000)
}
// sends new word to the MySQL database, then kicks off the listWords process to refetch all data
addWord() {
  fetch("http://localhost:9000/addWord", {
    method: 'POST',
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      word: this.state.newWord
    })
  })
  .then(res => res.json())
  .then(this.listWords())
}
I shouldn't have to have that setTimeout in there, but when I remove it, the listWords update doesn't have the newly posted row.
My best guess is that either
I'm messing up the promise .then() paradigm, or
MySQL responds to my POST request but doesn't actually add the row until the GET has already completed, thereby missing out on the new row.
How can I ensure the POST has successfully completed before I pull the rows again?
I am using react, express, sequelize, and for the DB it's MySQL in a docker container.
You have indeed made a mistake, a common trap that I have fallen into a few times myself :)
When you call addWord(), the expression then(this.listWords()) is evaluated, which calls listWords() immediately, before the fetch completes.
Instead, pass a function to then(...) (rather than the result of calling one), for example like this:
.then(() => this.listWords())
or, provided this.listWords is bound to the component (e.g. in the constructor), even like this:
.then(this.listWords)
Now, instead of passing the result of listWords() to then(), you pass a function that calls listWords(), so listWords() only executes when the promise chain reaches that then().
Example to make this behavior even clearer:
When calling a function declared as function foo(arg1) {}, JS needs to know the value of arg1 before calling foo. Let's take the following piece of code:
foo(bar())
In this case, bar() must be called before foo(...) because arg1 will be the returned value of bar(). In contrast to the following case:
foo(() => bar())
Now, arg1 will be a function instead of the returned value of bar(). This is equivalent to:
var arg = () => bar();
foo(arg);
In contrast to:
var arg = bar();
foo(arg);
Where it is obvious what will happen.
In chained calls like foo(arg1).bar(arg2).baz(arg3), each argument expression is evaluated before the function it is passed to: arg1 before foo(...), then arg2 before .bar(...), and so on.
Some help to debug such problems: Use the Network Inspector in your browser. It will show the order of requests performed and in this example you would have seen that the GET request was actually performed before the POST request. This might not explain why it happens but you can understand the problem faster.
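The eager-call trap can be reproduced in a few self-contained lines; this sketch (names illustrative) records the order of execution to show why then(bar()) misfires:

```javascript
const order = [];
function bar() {
  order.push("bar");
  return "bar-result";
}

// Eager: bar() runs immediately, and .then() receives its RETURN VALUE
// ("bar-result", which is not a function, so .then() silently ignores it).
Promise.resolve().then(bar());
order.push("after-then");

// Deferred: .then() receives a function, invoked only when the
// promise resolves (on a later microtask).
Promise.resolve().then(() => order.push("deferred"));

// At this point in the script, order is ["bar", "after-then"]:
// bar() already ran synchronously, while "deferred" has not happened yet.
```

Run it in Node and you will see "bar" logged before the promise has settled, which is exactly what happened with listWords().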

Cloud Functions for Firebase Could not handle the request after a successful request

TLDR: After writing a JSON (successfully) to my Firestore, the next request will give me Internal Server Error (500). I have a suspicion that the problem is that inserting is not yet complete.
So basically, I have this code:
const jsonToDb = express();
exports.jsondb = functions.region('europe-west1').https.onRequest(jsonToDb);
jsonToDb.post('', (req, res) => {
  let doc;
  try {
    doc = JSON.parse(req.body);
  } catch (error) {
    res.status(400).send(error.toString()).end();
    return;
  }
  myDbFuncs.saveMyDoc(doc);
  res.status(201).send("OK").end();
});
The database functions are in another JS file.
module.exports.saveMyDoc = function (myDoc) {
  let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
  newDoc.set(myDoc).then().catch();
  return;
};
So I have several theories; maybe one of them is right, but please help me with this. (Also, if I made some mistakes in this little snippet, just tell me.)
Reproduction:
I send the first request => everything is OK, the JSON is in the database.
I send a second request after the first one has returned its OK status => it does not do anything for a few seconds, then 500: Internal Server Error.
Logs: Function execution took 4345 ms, finished with status: 'connection error'.
I just don't understand. Let's imagine I'm using this as an API, with several requests arriving simultaneously. Can't it handle that? (I suppose it can; I'm just doing something stupid.) Deliberately, I'm sending the second request only after the first has finished, and this still occurs.
Should I make the saveMyDoc async?
saveMyDoc isn't returning a promise that resolves when all the async work is complete. If you lose track of a promise, Cloud Functions will shut down the work and clean up before the work is complete, making it look like it simply doesn't work. You should only send a response from an HTTP type function after all the work is fully complete.
Minimally, it should look more like this:
module.exports.saveMyDoc = function (myDoc) {
  let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
  return newDoc.set(myDoc);
};
Then you would use the promise in your main function:
myDbFuncs.saveMyDoc(doc).then(() => {
  res.status(201).send("OK").end();
});
See how the response is only sent after the data is saved.
Read more about async programming in Cloud Functions in the documentation. Also watch this video series that talks about working with promises in Cloud Functions.
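The pitfall is reproducible without Firebase at all. In the sketch below, fakeDb is a hypothetical stand-in for Firestore, with a timeout simulating write latency; "saved" records only writes that have actually completed:

```javascript
const saved = [];
const fakeDb = {
  set(doc) {
    return new Promise((resolve) =>
      setTimeout(() => {
        saved.push(doc);
        resolve();
      }, 10)
    );
  },
};

// Fire-and-forget (the original bug): the caller has no way to know
// when the write finishes, so it may send the response (and the runtime
// may shut the function down) before the write lands.
function saveFireAndForget(doc) {
  fakeDb.set(doc);
}

// Fixed: return the promise so the caller can wait for completion.
function saveMyDoc(doc) {
  return fakeDb.set(doc);
}

saveMyDoc({ id: 1 }).then(() => {
  // only now is it safe to send the 201 response
  console.log("writes completed:", saved.length); // logs "writes completed: 1"
});
```

The same shape applies to any async resource, not just Firestore: whoever sends the HTTP response must hold the promise for every piece of pending work.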

Angular 6 HTTPClient: Request fired once, receives 2 responses

I've refactored an old project (Angular 2) to Angular 6. All works well, apart from a problem I have with API calls.
On the sign-in component, submitting the form fires a POST request with the data, and the interceptor adds certain headers (for now only Content-Type).
Code on submitting the form:
this.authService.signIn(this.account)
  .subscribe(res => {
    console.log('RES -> ', res);
    this.router.navigate([this.returnUrl]);
  },
  err => console.log(err));
AuthService methods:
signIn(account: Account) {
  const req = new HttpRequest(HttpMethods.Post, AuthService.signInUrl, {account: account});
  return this.makeRequest(req);
}

private makeRequest(req: HttpRequest<any>): Observable<any> {
  this.progressBarService.availableProgress(true);
  return this.http.request(req)
    .finally(() => this.progressBarService.availableProgress(false));
}
The console.log I've added is fired twice for some reason: the first time is {type: 0}, and second time returns the data I needed.
I've removed everything from the interceptor, leaving only next.handle(req), and it does the same.
Any idea why I receive 2 responses, the first being just {type: 0}?
That's because you are using this.http.request() with an HttpRequest object: in that mode HttpClient emits the full stream of HttpEvents, and the first emission, {type: 0}, is the HttpEventType.Sent event (fired when the request goes out), not a second HTTP response.
If you still want to use this.http.request(), for example because you use it to upload files and want progress events, you can use RxJS takeLast(1), or filter for HttpEventType.Response, to get only the response you need.
Here's the reference.
https://angular.io/api/common/http/HttpClient#request

How to parse or Stringify in asycnhronous way in javascript

I see that JSON.stringify and JSON.parse are both synchronous.
I would like to know if there is a simple npm library that does this in an asynchronous way.
Thank you
You can make anything "asynchronous" by using Promises:
function asyncStringify(str) {
  return new Promise((resolve, reject) => {
    resolve(JSON.stringify(str));
  });
}
Then you can use it like any other promise:
asyncStringify(str).then(ajaxSubmit);
Note that wrapping the call in a promise does not make it non-blocking: JSON.stringify still runs synchronously, and the promise resolves right away (there is no I/O or system call involved that could be deferred).
You can also use the async/await API if your platform supports it:
async function asyncStringify(str) {
  return JSON.stringify(str);
}
Then you can use it the same way:
asyncStringify(str).then(ajaxSubmit);
// or, inside an async function, use "await"
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);
Edited: One way of adding truly asynchronous parsing/stringifying (perhaps because we're parsing something too complex) is to hand the job to another process (or service) and wait for the response.
You can do this in many ways (such as creating a separate service that exposes a REST API); here I will demonstrate message passing between processes:
First create a file that will take care of doing the parsing/stringifying. Call it async-json.js for the sake of the example:
// async-json.js
function stringify(value) {
  return JSON.stringify(value);
}

function parse(value) {
  return JSON.parse(value);
}

process.on('message', function (message) {
  let result;
  if (message.method === 'stringify') {
    result = stringify(message.value);
  } else if (message.method === 'parse') {
    result = parse(message.value);
  }
  process.send({ callerId: message.callerId, returnValue: result });
});
All this process does is wait for a message asking it to stringify or parse some JSON, and then respond with the result.
Now, in your code, you can fork this script and send messages back and forth. Whenever a request is sent you create a new promise; whenever a response for that request comes back, you resolve the promise:
const fork = require('child_process').fork;
const asyncJson = fork(__dirname + '/async-json.js');
const callers = {};

asyncJson.on('message', function (response) {
  callers[response.callerId].resolve(response.returnValue);
});

function callAsyncJson(method, value) {
  const callerId = parseInt(Math.random() * 1000000);
  const callPromise = new Promise((resolve, reject) => {
    callers[callerId] = { resolve: resolve, reject: reject };
    asyncJson.send({ callerId: callerId, method: method, value: value });
  });
  return callPromise;
}

function JsonStringify(value) {
  return callAsyncJson('stringify', value);
}

function JsonParse(value) {
  return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));
Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.
Another npm package worth checking out:
async-json is a library that provides an asynchronous version of the standard JSON.stringify.
Install:
npm install async-json
Example:
var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
  if (err) {
    throw err;
  }
  jsonValue === '{"some":"data"}';
});
Note: I didn't test it; you'll need to check its dependencies and required packages yourself.
By asynchronous I assume you actually mean non-blocking asynchronous - i.e., if you have a large (megabytes large) JSON string, and you stringify, you don't want your web server to hard freeze and block newly incoming web requests for 500+ milliseconds while it processes the object.
Option 1
The generic answer is to iterate through your object piece by piece and call setImmediate whenever a threshold is reached. This allows other functions in the event queue to run for a bit.
For JSON (de)serialization, the yieldable-json library does this very well. It does, however, drastically sacrifice JSON processing speed (which is somewhat intentional).
Usage example from the yieldable-json readme:
const yj = require('yieldable-json')

yj.stringifyAsync({key:"value"}, (err, data) => {
  if (!err)
    console.log(data)
})
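The chunking idea can be sketched without the library. The following is illustrative only (it is not yieldable-json's implementation, and stringifyArrayAsync is a made-up name): it serializes an array one slice at a time, handing control back to the event loop with setImmediate between slices.

```javascript
function stringifyArrayAsync(items, chunkSize = 100) {
  return new Promise((resolve) => {
    const parts = [];
    let i = 0;
    (function work() {
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) {
        parts.push(JSON.stringify(items[i]));
      }
      if (i < items.length) {
        setImmediate(work); // let queued callbacks run before the next slice
      } else {
        resolve("[" + parts.join(",") + "]");
      }
    })();
  });
}

stringifyArrayAsync([{ a: 1 }, { a: 2 }], 1).then((json) =>
  console.log(json) // logs [{"a":1},{"a":2}]
);
```

Between slices, other requests get a turn on the event loop, which is exactly the "don't hard freeze the server" property asked about; a real implementation would also recurse into nested objects rather than only splitting top-level arrays.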
Option 2
If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 process manager with great success, although the initial setup was quite daunting. Once it works, though, the final result is magic, and it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and does not require a special client on the server.

Run multiple functions based on a SINGLE form submission (method="post") using Node-express

I am looking to perform multiple actions upon receiving HTML (or EJS) form content using the POST method. I am using Node Express, Mongoose & MongoDB. Each of the POST handlers below works individually, but I am unsure how to proceed in updating multiple databases based on ONE SINGLE form submission.
// insert into passport db
app.post('/signup', passport.authenticate('local-signup',
  {
    successRedirect : '/index', // redirect to the secure profile section
    failureRedirect : '/signup', // redirect back to the signup page if there is an error
    failureFlash : true // allow flash messages
  }));
//insert into my database here
[The content of the second function is unimportant, as it is working fine and has been stripped down for simplification.]
app.post('/signup', function (req, res) {
  new UserDB({
    user_id : req.body.content,
    first_name : req.body.fname,
  }).save(function (err, mySite, count) {
    res.redirect('/index');
  });
});
I have tried redirecting, but the form content is not accessible after the redirect, so only the first function stores the data (i.e. only one database is filled).
How would I run both functions within
app.post('/signup',.....
{
...
});
?
Thanks in advance!
You can do this by making one function the callback of the other. This is easy because each function maintains the same Connect middleware signature, function(req, res, next), where req and res are the request and response objects created and manipulated by the application, and next is the next function to call at the end of the current function's execution.
According to the official documentation, passport.authenticate() is a normal piece of middleware. All you need to do is specify the middleware you want to be called next. Express queues middleware functions in the order in which you pass them into app.post. You can do something like this:
app.post('/signup', passport.authenticate('local-signup', {
    failureRedirect : '/signup',
    failureFlash : true
  }),
  function(req, res) {
    new UserDB({
      user_id : req.body.content,
      first_name : req.body.fname,
    }).save(function(err, mySite, count) {
      res.redirect('/index');
    });
  });
Middleware is an extremely powerful feature of the Express framework and possibly the single most important one to master. This guide would be a great next step if you want to learn more.
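How the queue hands control along can be seen in a toy model. Everything below is illustrative (runMiddleware and the sample middlewares are not Express internals): each middleware receives next() and decides whether to pass control to the one behind it.

```javascript
function runMiddleware(middlewares, req, res) {
  let i = 0;
  function next(err) {
    if (err) return res.end("error: " + err.message); // short-circuit on error
    const mw = middlewares[i++];
    if (mw) mw(req, res, next); // hand control to the next middleware
  }
  next();
}

const trace = [];
const authenticate = (req, res, next) => {
  trace.push("authenticate"); // e.g. passport.authenticate(...)
  next();                     // success: fall through to the next handler
};
const saveUser = (req, res, next) => {
  trace.push("saveUser");     // e.g. new UserDB({...}).save(...)
  res.end("redirect /index"); // terminate the chain with a response
};

runMiddleware([authenticate, saveUser], {}, { end: (m) => trace.push(m) });
console.log(trace); // logs [ 'authenticate', 'saveUser', 'redirect /index' ]
```

Note that saveUser never calls next(): sending the response ends the chain, which mirrors how the second handler in the answer above finishes with res.redirect.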