Catch MySQL errors before sending HTTP response

I am coming from a PHP background, where SQL queries are blocking.
My usual setup is to wrap table updates in a try/catch block to handle any errors that might occur. I wanted to carry this logic over to Node.js, but its asynchronous nature makes that troublesome. Take a look at this example code:
var httpServer = Https.createServer(httpsOptions, function (request, response) {
    try {
        // many queries in if-else, switch statements
        if (something) {
            mysqlConnection.query(query, [], function (err, result) {});
        } else {
            // many queries will follow
            if (somethingelse) {
                mysqlConnection.query(query2, [], function (err, result) {});
            } else {
                mysqlConnection.query(query3, [], function (err, result) {});
                if (andsoon) {}
            }
        }
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
        console.log("Response successful");
    } catch (error) {
        console.error(error);
        response.writeHead(400, { "Content-Type": "text/plain" });
        response.end("Error.");
    }
}).listen(8000);
So response.end() fires before the queries execute, which means I send the message before I know whether they succeeded. I could wrap response.end() in a callback, but there are MANY queries executing, so now I have to track how many have completed so far. Things get worse: one if-else branch requires just one query, another requires ten, so there is crazy overhead in tracking when the code has finished.
This can't be the optimal workflow. What do you do when you have a complex query system and you want to send the response only after all queries have executed (or failed)?

You've passed empty callbacks to your .query invocations. That's your problem.
mysqlConnection.query(query,[],function(err, result){});
JavaScript immediately runs the statement after the one shown, without waiting for the query to finish. You only know the query is complete (or hit an error) when the .query method invokes your callback function.
So, you need to do something like this instead.
mysqlConnection.query(query, [], function (err, result) {
    if (err) throw Error(err);
    // many queries will follow
    if (somethingelse) {
        ...
    }
});
A sequence of queries like yours ends up with an absurdly deeply nested set of callback functions. But don't despair: JavaScript's Promise pattern works around this for you and gives you clean code. Use npm's promise-mysql package and write code like this.
mysqlConnection.query(query, [])
    .then(function (rows) {
        if (somethingelse)
            return mysqlConnection.query(query2, []);
        else
            return mysqlConnection.query(query3, []);
    })
    .then(function (rows) {
        return mysqlConnection.query(query4, []);
    })
    .then(function (rows) {
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
    })
    .catch(function (err) {
        /* failure somewhere! */
    });
You have a series of then functions, with a catch at the end. The Promise subsystem hides all the callback plumbing for you, organizing it this way instead. (Notice that I did not debug this sample code.)
You must figure out this Promise stuff to program SQL / Node.js successfully. It's counterintuitive at first (at least it was for me when I was learning it), but it's worth your effort. Pro tip: if you're stepping through this kind of code in a debugger, use Step Into liberally.
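For what it's worth, the same flow reads almost like the blocking PHP version if you use async/await. A minimal sketch, assuming the mysql2/promise package (promise-mysql behaves similarly) and keeping the question's something / query placeholders:
// Sketch only: `something`, `somethingelse`, and the query strings stand in
// for the question's real logic; mysql2/promise is one promise-based driver.
const mysql = require("mysql2/promise");

async function handleRequest(request, response) {
    const connection = await mysql.createConnection({ /* your config */ });
    try {
        if (something) {
            await connection.query(query, []);
        } else if (somethingelse) {
            await connection.query(query2, []);
        } else {
            await connection.query(query3, []);
        }
        // we only get here after every awaited query has succeeded
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
    } catch (error) {
        // a failed query rejects, so a plain try/catch works again
        console.error(error);
        response.writeHead(400, { "Content-Type": "text/plain" });
        response.end("Error.");
    } finally {
        await connection.end();
    }
}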

Related

Is there a way for KNEX ERRORS to also log WHERE in the code they take place?

Some Knex errors log the file and line in which they occur, but many DO NOT. This makes debugging unnecessarily tedious. Is .catch((err) => { console.log(err) }) supposed to take care of this?
And what about the fact that the code retries around 4 times? I want it to try once and stop; there is absolutely no need for more attempts, ever, since retries only mess things up when further entries are made to the database.
Some Knex errors log the file and line in which they occur, but many DO NOT
Can you give us some examples of queries that silence the error?
I'm a heavy Knex user, and during development almost all errors show the file and line where they occurred, except in two kinds of situations:
1. A query in a transaction that may complete early.
In this situation, we have to customize Knex's inner catch logic and do some injection, such as overriding Runner.prototype.query, identifying the transaction-early-completed error, and logging more info (the sql and bindings) in the catch clause.
2. A pool connection error,
such as this mysql error: Knex:Error Pool2 - Error: Pool.release(): Resource not member of pool
That is a separate question, and it depends on your database environment and connection package.
The fact that code tries to repeat around 4 times
If your repeated code is written as a Promise chain, I don't think it will throw 4 times; it should blow up at the first throw:
query1
    .then(query2)
    .then(query3)
    .then(query4)
    .catch(err => {});
For concurrently executed queries (Promise.map below comes from Bluebird, with a concurrency option): if any promise in the array is rejected, or any promise returned by the mapper function is rejected, the returned promise is rejected as well.
// Promise.map is Bluebird's, not native Promise
Promise.map(queries, (query) => {
    return query.execute()
        .then()
        .catch((err) => {
            return err; // note: returning the error here swallows the rejection
        });
}, { concurrency: 4 })
    .catch((err) => {
        // handle error here
    });
If you use try/catch with async/await, it still won't repeat 4 times when you already know the error type. Meanwhile, if you don't know what error will be thrown, why not execute it only once to find out?
async function repeatInsert(retryTimes = 0) {
    try {
        await knex.insert().into();
    } catch (err) {
        // rethrow known errors instead of retrying
        if (err.isKnown) {
            throw err;
        }
        // otherwise retry, up to 4 attempts in total
        if (retryTimes < 4) {
            return await repeatInsert(retryTimes + 1);
        }
    }
}

Cloud Functions for Firebase Could not handle the request after a successful request

TL;DR: After (successfully) writing a JSON document to my Firestore, the next request gives me an Internal Server Error (500). I suspect the problem is that the insert is not yet complete.
So basically, I have this code:
const jsonToDb = express();

exports.jsondb = functions.region('europe-west1').https.onRequest(jsonToDb);

jsonToDb.post('', (req, res) => {
    let doc;
    try {
        doc = JSON.parse(req.body);
    } catch (error) {
        res.status(400).send(error.toString()).end();
        return;
    }
    myDbFuncs.saveMyDoc(doc);
    res.status(201).send("OK").end();
});
The database functions are in another JS file.
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    newDoc.set(myDoc).then().catch();
    return;
};
So I have several theories; maybe one of them is right. Please help me with this. (Also, if I made some mistakes in this little snippet, just tell me.)
Reproduction:
I send the first request => everything is OK, JSON in the database.
I send a second request after the first one returned OK => it does nothing for a few seconds, then 500: Internal Server Error.
Logs: Function execution took 4345 ms, finished with status: 'connection error'.
I just don't understand. Imagine I'm using this as an API, with several simultaneous requests. Can't it handle that? (I suppose it can; I'm just doing something stupid.) I'm deliberately sending the second request after the first has finished, and this still occurs.
Should I make the saveMyDoc async?
saveMyDoc isn't returning a promise that resolves when all the async work is complete. If you lose track of a promise, Cloud Functions will shut down the work and clean up before the work is complete, making it look like it simply doesn't work. You should only send a response from an HTTP type function after all the work is fully complete.
Minimally, it should look more like this:
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    return newDoc.set(myDoc);
};
Then you would use the promise in your main function:
myDbFuncs.saveMyDoc(doc).then(() => {
    res.status(201).send("OK").end();
});
See how the response is only sent after the data is saved.
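If you prefer async/await over .then(), the same handler could look like this. A sketch assuming the corrected saveMyDoc above; the 500 status for a failed save is an illustrative choice:
jsonToDb.post('', async (req, res) => {
    let doc;
    try {
        doc = JSON.parse(req.body);
    } catch (error) {
        res.status(400).send(error.toString());
        return;
    }
    try {
        await myDbFuncs.saveMyDoc(doc); // resolves once Firestore has stored the doc
        res.status(201).send("OK");
    } catch (error) {
        res.status(500).send("Could not save the document");
    }
});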
Read more about async programming in Cloud Functions in the documentation. Also watch this video series that talks about working with promises in Cloud Functions.

How to put the results of MySQL database in a variable in Node.js?

I have a problem in my Node.js application. I'm connecting to the database and getting the data:
let users = {};
let currentUser = "example"; // this variable changes every time a user connects

connection.query('SELECT * FROM users WHERE name="' + currentUser + '"', function (err, result) {
    if (err) console.log(err);
    users[currentUser] = result[0];
});

console.log(users[currentUser]);
When I try to console.log the result[0] from inside the function, it returns this:
RowDataPacket {
  n: 11,
  id: 'VKhDKmXF1s',
  name: 'user3',
  status: 'online',
  socketID: 'JbZLNjKQK15ZkzTXAAAB',
  level: 0,
  xp: 0,
  reg_date: 2018-07-16T20:37:45.000Z }
I want to put that result from MySQL into users.example or something like that, but when I run the code it returns undefined. So I tried console.log(users[currentUser].id) as an example and it shows an error:
TypeError: Cannot read property 'id' of undefined
So how do I put the data from the result inside my variable users[currentUser]?
So how do I put the data from the result inside my variable users[currentUser]?
That is happening. The problem is how it is being tested.
The correct test is to put the console.log inside the callback function (err, result):
connection.query('SELECT * FROM users WHERE name="' + currentUser + '"', function (err, result) {
    if (err) console.log(err);
    users[currentUser] = result[0];
    console.log(users[currentUser]); // defined here, after the query has returned
});
Issue: users[currentUser] is still undefined outside that function.
Well, yes and no. It is undefined in code that executes before the callback has fired.
And how do I fix that?
Anything that needs the result of the query must be executed from within the callback function, because that is the only code location where you know for certain that the data exists.
Well, I need that data outside, in a global variable.
You can stick the query data in a global, but that doesn't solve the timing issue of only accessing that global when it is defined and contains current data. That will cause lots of frustration.
If you don't want to call one or more specific functions to process the query data within the callback, an alternative is to use a Node.js EventEmitter to coordinate data production and data consumption.
Yet another alternative is to drop the callback function and use Promises and/or async/await, both of which are supported by modern Node.js. This alternative doesn't involve global variables, but provides different ways to express the fact that some operations need to wait for the results of others; a sketch follows below.
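For illustration, a minimal sketch of the Promise route, wrapping the callback-style query with Node's util.promisify (the loadUser helper is hypothetical):
const util = require('util');

// Promisified query; .bind keeps `this` pointing at the connection.
const queryAsync = util.promisify(connection.query).bind(connection);

async function loadUser(name) {
    const result = await queryAsync('SELECT * FROM users WHERE name = ?', [name]);
    return result[0]; // undefined if no row matched
}

loadUser("example").then((user) => {
    console.log(user); // runs only after the query has finished
});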
connection.query is an async call. The console.log is called before the query fetches the data from the db.
Because you're using a callback, your console.log happens before the result comes back from the database. Additionally, you had a typo in your code: user -> users.
let users = {};
let currentUser = "example"; // this variable changes every time a user connects

connection.query('SELECT * FROM users WHERE name="' + currentUser + '"', function (err, result) {
    if (err) console.log(err);
    users[currentUser] = result[0];
    console.log(users[currentUser]);
});
Side note: research "SQL injection" before you use this code. The way you're building your query lets anyone access your database. Using a library like squel.js or knex.js will help you avoid this; so will the driver's own placeholder syntax, shown below.
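For example, the mysql driver escapes values passed through ? placeholders, so the same query can be written safely as:
connection.query('SELECT * FROM users WHERE name = ?', [currentUser], function (err, result) {
    // the driver escapes currentUser, so quotes in the input cannot break out
    if (err) return console.log(err);
    users[currentUser] = result[0];
});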
For an explanation of why things happen in the order they do, take a look at the JavaScript event loop.

How to parse or stringify JSON asynchronously in JavaScript

I see that JSON.stringify and JSON.parse are both synchronous.
I would like to know if there is a simple npm library that does this in an asynchronous way.
Thank you
You can make anything "asynchronous" by using Promises:
function asyncStringify(str) {
    return new Promise((resolve, reject) => {
        resolve(JSON.stringify(str));
    });
}
Then you can use it like any other promise:
asyncStringify(str).then(ajaxSubmit);
Note that because the code is not actually asynchronous, the promise resolves right away (stringifying JSON involves no I/O; it never makes a system call that could be waited on), so the work still happens synchronously on the calling thread.
You can also use the async/await API if your platform supports it:
async function asyncStringify(str) {
    return JSON.stringify(str);
}
Then you can use it the same way:
asyncStringify(str).then(ajaxSubmit);
// or use the "await" API (note: await must itself be inside an async function)
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);
Edit: One way of adding truly asynchronous parsing/stringifying (perhaps because we're parsing something too complex) is to pass the job to another process (or service) and wait for the response.
You can do this in many ways (such as creating a separate service that exposes a REST API); here I'll demonstrate message passing between processes:
First, create a file that will take care of the parsing/stringifying. Call it async-json.js for the sake of the example:
// async-json.js
function stringify(value) {
    return JSON.stringify(value);
}

function parse(value) {
    return JSON.parse(value);
}

process.on('message', function (message) {
    let result;
    if (message.method === 'stringify') {
        result = stringify(message.value);
    } else if (message.method === 'parse') {
        result = parse(message.value);
    }
    process.send({ callerId: message.callerId, returnValue: result });
});
All this process does is wait for a message asking it to stringify or parse a JSON value, then respond with the result.
Now, in your main code, you can fork this script and pass messages back and forth. Whenever a request is sent, you create a new promise; whenever a response to that request comes back, you resolve the promise:
const fork = require('child_process').fork;

const asyncJson = fork(__dirname + '/async-json.js');
const callers = {}; // pending promises, keyed by callerId

asyncJson.on('message', function (response) {
    callers[response.callerId].resolve(response.returnValue);
});

function callAsyncJson(method, value) {
    const callerId = parseInt(Math.random() * 1000000); // naive unique id
    const callPromise = new Promise((resolve, reject) => {
        callers[callerId] = { resolve: resolve, reject: reject };
        asyncJson.send({ callerId: callerId, method: method, value: value });
    });
    return callPromise;
}

function JsonStringify(value) {
    return callAsyncJson('stringify', value);
}

function JsonParse(value) {
    return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));
Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.
Check out another npm package: async-json. It's a library that provides an asynchronous version of the standard JSON.stringify.
Install:
npm install async-json
Example:
var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
    if (err) {
        throw err;
    }
    jsonValue === '{"some":"data"}';
});
Note: I didn't test it; you'll need to check its dependencies and required packages yourself.
By asynchronous I assume you actually mean non-blocking asynchronous - i.e., if you have a large (megabytes large) JSON string, and you stringify, you don't want your web server to hard freeze and block newly incoming web requests for 500+ milliseconds while it processes the object.
Option 1
The generic answer is to iterate through your object piece by piece and call setImmediate whenever a threshold is reached, which lets other functions in the event queue run for a bit. A sketch of the idea follows.
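A minimal sketch of that idea, assuming the input is a large array of small records (the chunk size of 1000 is arbitrary):
// Serialize a large array one slice at a time, yielding to the event loop
// with setImmediate between slices so other queued work can run.
function stringifyArrayAsync(items, chunkSize = 1000) {
    return new Promise((resolve) => {
        const parts = [];
        let index = 0;

        function processChunk() {
            const end = Math.min(index + chunkSize, items.length);
            for (; index < end; index++) {
                parts.push(JSON.stringify(items[index]));
            }
            if (index < items.length) {
                setImmediate(processChunk); // yield, then continue
            } else {
                resolve('[' + parts.join(',') + ']');
            }
        }

        processChunk();
    });
}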
For JSON (de)serialization, the yieldable-json library does this very well. It does, however, drastically sacrifice JSON processing speed (which is somewhat intentional).
Usage example from the yieldable-json readme:
const yj = require('yieldable-json');

yj.stringifyAsync({ key: "value" }, (err, data) => {
    if (!err)
        console.log(data);
});
Option 2
If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 process manager with great success, although the initial setup was quite daunting. Once it works, though, the final result is magic: it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and does not require a special client on the server.

socketstream async call to mysql within rpc actions

First, I need to tell you that I am very new to the wonders of Node.js, SocketStream, AngularJS, and JavaScript in general. I come from a Java background, which might explain my ignorance of the correct way of doing things asynchronously.
To toy around with things, I installed the ss-angular-demo from americanyak. My problem is that the RPC seems to be a synchronous interface, while my call to the MySQL database has an asynchronous interface. How can I return the database results from an RPC call?
Here is what I did so far with SocketStream 0.3:
In app.js I successfully tell ss to expose my MySQL database connection by putting ss.api.add('coolStore', mysqlConn); in the right place (as explained in the SocketStream docs). I use the mysql npm package, so I can call MySQL within the RPC.
server/rpc/coolRpc.js
exports.actions = function (req, res, ss) {
    // use session middleware
    req.use('session');

    return {
        get: function (threshold) {
            var sql = "SELECT cool.id, cool.score, cool.data FROM cool WHERE cool.score > " + threshold;
            if (!ss.arbStore) {
                console.log("connecting to mysql arb data store");
                ss.coolStore = ss.coolStore.connect();
            }
            ss.coolStore.query(sql, function (err, rows, fields) {
                if (err) {
                    console.log("error fetching stuff", err);
                } else {
                    console.log("first row = " + rows[0].id);
                }
            });
            var db_rows = ???
            return res(null, db_rows || []);
        }
    };
};
The console logs the id of my database entry, as expected. However, I am clueless about how to make the RPC's return statement return the rows of my query. What is the right way to address this sort of problem?
Thanks for your help, and please be friendly with me, because this is also my first question on Stack Overflow.
It's not synchronous. When your results are ready, you can send them back:
exports.actions = function (req, res, ss) {
    // use session middleware
    req.use('session');

    return {
        get: function (threshold) {
            ...
            ss.coolStore.query(sql, function (err, rows, fields) {
                res(err, rows || []);
            });
        }
    };
};
You need to make sure that you always call res(...) from an RPC function, even when an error occurs; otherwise you might get dangling requests (where the client code keeps waiting for a response that is never generated). In the code above, the error is forwarded to the client so it can be handled there. One defensive pattern is sketched below.
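For instance, a sketch of one defensive pattern (the ? placeholder and the try/catch are illustrative additions, not something SocketStream requires):
get: function (threshold) {
    try {
        // parameterized query: the driver escapes `threshold` for us
        var sql = "SELECT cool.id, cool.score, cool.data FROM cool WHERE cool.score > ?";
        ss.coolStore.query(sql, [threshold], function (err, rows, fields) {
            res(err, rows || []); // success and failure both answer the client
        });
    } catch (err) {
        res(err, []); // even an unexpected synchronous throw produces a response
    }
}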