How to handle async calls to a MySQL database

I'm pretty new to Node.js, which is probably why I'm asking this question. I recently discovered that calls made from Node.js to any database are async.
As a former C# .NET programmer this is a bit of a surprise to me. I'm used to coding synchronously, where it's fine to wait a little.
Currently I want to make a database call and have the code continue running with the returned result. What is the best way to do this? I found something about promises, but I haven't found the proper solution yet.
What I really want is something like this:
var requestLoop = setInterval(function () {
    console.log('Trading bot (re)started..');
    var wlist = [];
    wlist = db_connection.getWatchList_DB(); // <== database call here
    if (wlist.length > 0) {
        // Perform the rest of the code
    }
}, 5000); // 300000 = five minutes
So, for me it's OK to wait for the database call and then continue with the fetched results. Is there any simple solution for this?

You can try the mysql2 module, which has built-in support for promises. Code snippet from the official documentation:
async function main() {
    // get the client
    const mysql = require('mysql2');
    // create the pool
    const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'test' });
    // now get a Promise-wrapped instance of that pool
    const promisePool = pool.promise();
    // query the database using promises
    const [rows, fields] = await promisePool.query("SELECT 1");
}
Also, since you are very new to Node and async programming, I would suggest you learn about callbacks, promises and, of course, async/await.
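Applied to the requestLoop from the question, a minimal sketch could look like this (my adaptation, not from the mysql2 docs; getWatchList_DB and the watchlist query are placeholders standing in for the original helper):

const mysql = require('mysql2');
const promisePool = mysql.createPool({ host: 'localhost', user: 'root', database: 'test' }).promise();

// Hypothetical helper: resolves with the watch list rows.
async function getWatchList_DB() {
    const [rows] = await promisePool.query('SELECT * FROM watchlist'); // placeholder query
    return rows;
}

setInterval(async function () {
    console.log('Trading bot (re)started..');
    const wlist = await getWatchList_DB(); // waits for the database call
    if (wlist.length > 0) {
        // Perform the rest of the code with the fetched results
    }
}, 5000);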

Related

Knex js stream large data

I have a MySQL table with millions of rows.
For each row I have to apply custom logic and write the modified data to another table.
Using knex.js, I run the query to read the data using the stream() function.
Once I get the stream object, I apply my logic in the data event handler.
Everything works correctly, but at a certain point it stops without giving any errors.
I tried pausing the stream before each update operation on the new table and resuming it after completing the update, but the problem is not solved.
If I put a limit on the query, for example 1000 results, the system works fine.
Sample code:
const readableStream = knex.select('*')
    .from('big_table')
    .stream();

readableStream.on('data', async (data) => {
    readableStream.pause(); // pause stream
    const toUpdate = applyLogic(data); // sync func
    const whereCond = getWhereCondition(data); // sync func
    try {
        await knex('to_update').where(whereCond).update(toUpdate);
        console.log('UPDATED');
    } catch (e) {
        console.log('ERROR', e);
    }
    readableStream.resume(); // resume stream exactly once, on success or failure
}).on('end', () => { // readable streams emit 'end', not 'finish'
    console.log('FINISH');
}).on('error', (err) => {
    console.log('ERROR', err);
});
Thanks!
I solved it.
The problem is not due to knex.js or the streams but to my development environment.
I use k3d to simulate the production environment on GCP, so to test my script locally I did a port-forward of the MySQL service.
It is not clear to me why the system crashes, but when I run my script in a container that connects to the MySQL service directly, the algorithm works as I expect.
Thanks
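As an aside, a pattern that avoids the manual pause()/resume() bookkeeping entirely is to iterate the stream with for await...of, which applies backpressure automatically. A sketch under the same assumptions as the question (Node 10+, with applyLogic and getWhereCondition being the synchronous helpers above):

async function migrate() {
    const stream = knex.select('*').from('big_table').stream();
    // Each iteration, including the awaited update, completes before the
    // next row is pulled from the stream, so no pause/resume is needed.
    for await (const row of stream) {
        const toUpdate = applyLogic(row);
        const whereCond = getWhereCondition(row);
        await knex('to_update').where(whereCond).update(toUpdate);
    }
    console.log('FINISH');
}

migrate().catch((err) => console.log('ERROR', err));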

Cloud Functions for Firebase Could not handle the request after a successful request

TL;DR: After writing a JSON document (successfully) to my Firestore, the next request gives me Internal Server Error (500). My suspicion is that the insert is not yet complete when the next request arrives.
So basically, I have this code:
const jsonToDb = express();

exports.jsondb = functions.region('europe-west1').https.onRequest(jsonToDb);

jsonToDb.post('', (req, res) => {
    let doc;
    try {
        doc = JSON.parse(req.body);
    } catch (error) {
        res.status(400).send(error.toString()).end();
        return;
    }
    myDbFuncs.saveMyDoc(doc);
    res.status(201).send("OK").end();
});
The database functions are in another JS file.
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    newDoc.set(myDoc).then().catch();
    return;
};
So I have several theories, maybe one of them is not wrong, but please help me with this. (Also if I made some mistakes in this little snippet, just tell me.)
Reproduction:
I send the first request => everything is OK, the JSON is in the database.
I send a second request after the first one gives me the OK status => it does nothing for a few seconds, then 500: Internal Server Error.
Logs: Function execution took 4345 ms, finished with status: 'connection error'.
I just don't understand. Let's imagine I'm using this as an API, with several requests arriving simultaneously. Can't it handle that? (I suppose it can; I'm just doing something stupid.) Deliberately, I'm sending the second request after the first has finished, and this still occurs.
Should I make saveMyDoc async?
saveMyDoc isn't returning a promise that resolves when all the async work is complete. If you lose track of a promise, Cloud Functions will shut down the work and clean up before the work is complete, making it look like it simply doesn't work. You should only send a response from an HTTP type function after all the work is fully complete.
Minimally, it should look more like this:
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    return newDoc.set(myDoc);
};
Then you would use the promise in your main function:
myDbFuncs.saveMyDoc(doc).then(() => {
    res.status(201).send("OK").end();
});
See how the response is only sent after the data is saved.
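The same handler with async/await reads a little more naturally; here is a sketch (the 500 branch is my addition for completeness, not part of the original answer):

jsonToDb.post('', async (req, res) => {
    let doc;
    try {
        doc = JSON.parse(req.body);
    } catch (error) {
        res.status(400).send(error.toString());
        return;
    }
    try {
        await myDbFuncs.saveMyDoc(doc); // wait for the write to finish
        res.status(201).send("OK");
    } catch (error) {
        res.status(500).send("Write failed"); // added: surface write errors to the caller
    }
});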
Read more about async programming in Cloud Functions in the documentation. Also watch this video series that talks about working with promises in Cloud Functions.

Handle Nested Queries Callback of MySQL in NodeJS

I am new to Node.js with a MySQL database, and I want to execute code with nested queries.
The scenario is that I have to get the list of incomplete trades and then iterate over the list I receive from the database.
Inside the loop, more queries are executed and a third-party API is called to fetch data, which it returns in a callback. The issue is that the callback executes asynchronously and the loop doesn't wait for the callback to return data; it just moves on.
Kindly guide me, as I am stuck in this situation.
Here is my code
var sql = incompleteTradesQuery.getIncompleteTrades();
sqlConn.query(sql, function (err, data) {
    if (err) {
        console.log(err);
    } else {
        // `let` gives each iteration (and its callback) its own copy of i
        for (let i = 0; i < data.length; i++) {
            bittrexExchange.getOrder(data[i].order_uuid, function (err, order_data) {
                if (order_data.result.IsOpen != true) {
                    var order_sql = tradesQuery.insertTrade(
                        order_data.result.OrderUuid, order_data.result.Exchange,
                        data[i].customer_id, order_data.result.Quantity,
                        order_data.result.QuantityRemaining, order_data.result.Limit,
                        order_data.result.Reserved, order_data.result.ReserveRemaining,
                        order_data.result.CommissionReserved, order_data.result.CommissionReserveRemaining,
                        order_data.result.CommissionPaid, order_data.result.Price,
                        order_data.result.PricePerUnit, order_data.result.Opened,
                        order_data.result.Closed, order_data.result.IsOpen, null,
                        data[i].commission_fee, data[i].total_transfer, new Date());
                    sqlConn.query(order_sql);
                    var incomplete_trades_query = incompleteTradesQuery.deleteIncompleteTradesById(data[i].id);
                    sqlConn.query(incomplete_trades_query);
                }
            });
        }
    }
});
Since Node.js is fundamentally asynchronous, your nested queries will also be asynchronous, and writing a chain of callbacks is a painful task. One simple answer to your question is to use promises.
Additionally, I would recommend handling your multiple queries asynchronously, which will undoubtedly be faster than handling the queries/requests synchronously, one after another.
The async.js module can also solve this problem, and Q and Step are good packages for handling nested callback code.
https://code.tutsplus.com/tutorials/managing-the-asynchronous-nature-of-nodejs--net-36183
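To make this concrete, a minimal sketch of the same flow with promises and async/await (assuming util.promisify over the standard (err, result) callback signatures that sqlConn.query and bittrexExchange.getOrder use in the question):

const util = require('util');
const query = util.promisify(sqlConn.query).bind(sqlConn);
const getOrder = util.promisify(bittrexExchange.getOrder).bind(bittrexExchange);

async function processIncompleteTrades() {
    const trades = await query(incompleteTradesQuery.getIncompleteTrades());
    for (const trade of trades) {
        // The loop now waits for each order before moving on.
        const order_data = await getOrder(trade.order_uuid);
        if (order_data.result.IsOpen != true) {
            const order_sql = tradesQuery.insertTrade(/* same argument list as in the question */);
            await query(order_sql);
            await query(incompleteTradesQuery.deleteIncompleteTradesById(trade.id));
        }
    }
}

processIncompleteTrades().catch(console.log);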

How to implement clustering in Sequelize?

I have two MySQL servers in a master-master setup. How can I implement clustering in Sequelize, so that if one MySQL server stops, all requests go to the other MySQL server without restarting the Node server?
Sequelize doesn't implement a clustering or failover feature. I have found a workaround:
var Sequelize = require("sequelize");
var sequelize = new Sequelize(/* your usual connection options */);

// Override connect to supply your own connection logic
sequelize.connectionManager.connect = function () {
    return new Promise(function (resolve, reject) {
        // create your connection and resolve the promise with its instance
        resolve(connection);
    });
};

// Override disconnect to release the connection yourself
sequelize.connectionManager.disconnect = function (connection) {
    if (!connection._protocol._ended) {
        connection.release();
    }
    return Promise.resolve();
};
I can share code if above code is not enough.
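For illustration, one way the connect override could be fleshed out with a two-host failover (my sketch, not the answerer's code; it assumes the mysql2 driver, and the hostnames are placeholders):

const mysql = require('mysql2');

const hosts = [
    { host: 'primary.example.com', user: 'root', database: 'test' },   // placeholder
    { host: 'secondary.example.com', user: 'root', database: 'test' }, // placeholder
];

// Try each host in order; resolve with the first connection that succeeds.
function tryConnect(index) {
    return new Promise(function (resolve, reject) {
        if (index >= hosts.length) {
            return reject(new Error('All MySQL hosts are down'));
        }
        const connection = mysql.createConnection(hosts[index]);
        connection.connect(function (err) {
            if (err) return resolve(tryConnect(index + 1)); // fall back to the next host
            resolve(connection);
        });
    });
}

sequelize.connectionManager.connect = function () {
    return tryConnect(0);
};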

socketstream async call to mysql within rpc actions

First, I need to tell you that I am very new to the wonders of Node.js, SocketStream, AngularJS and JavaScript in general. I come from a Java background, which might explain my ignorance of the correct way of doing things async.
To toy around with things, I installed the ss-angular-demo from americanyak. My problem is that the RPC seems to be a synchronous interface, while my call to the MySQL database has an asynchronous interface. How can I return the database results from an RPC call?
Here is what I did so far with SocketStream 0.3:
In app.js I successfully tell ss to allow my MySQL database connection to be accessed by putting ss.api.add('coolStore', mysqlConn); in there at the right place (as explained in the SocketStream docs). I use the mysql npm package, so I can call MySQL within the RPC.
server/rpc/coolRpc.js
exports.actions = function (req, res, ss) {
    // use session middleware
    req.use('session');
    return {
        get: function (threshold) {
            var sql = "SELECT cool.id, cool.score, cool.data FROM cool WHERE cool.score > " + threshold;
            if (!ss.coolStore) {
                console.log("connecting to mysql cool data store");
                ss.coolStore = ss.coolStore.connect();
            }
            ss.coolStore.query(sql, function (err, rows, fields) {
                if (err) {
                    console.log("error fetching stuff", err);
                } else {
                    console.log("first row = " + rows[0].id);
                }
            });
            var db_rows = ???
            return res(null, db_rows || []);
        }
    }
};
The console logs the id of my database entry, as expected. However, I am clueless about how I can make the RPC's return statement return the rows of my query. What is the right way of addressing this sort of problem?
Thanks for your help. Please be friendly with me, because this is also my first question on Stack Overflow.
It's not synchronous. When your results are ready, you can send them back:
exports.actions = function (req, res, ss) {
    // use session middleware
    req.use('session');
    return {
        get: function (threshold) {
            ...
            ss.coolStore.query(sql, function (err, rows, fields) {
                res(err, rows || []);
            });
        }
    }
};
You need to make sure that you always call res(...) from an RPC function, even when an error occurs, otherwise you might get dangling requests (where the client code keeps waiting for a response that's never generated). In the code above, the error is forwarded to the client so it can be handled there.
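One more caution, separate from the async question: the SQL string in the question concatenates threshold directly into the query, which invites SQL injection. The mysql package accepts placeholder values, so the get action could be sketched like this:

get: function (threshold) {
    // `?` placeholders let the driver escape the value safely
    var sql = "SELECT cool.id, cool.score, cool.data FROM cool WHERE cool.score > ?";
    ss.coolStore.query(sql, [threshold], function (err, rows, fields) {
        res(err, rows || []);
    });
}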