MySQL Node.js crash upon selecting data from big table

I'm attempting to convert data from one database to another, but I'm running into issues fetching data from a big table, mapping each row to an object, and inserting it into the other database. This is my code:
let index = 0;
let sql;
let resultsToFetch = true;
while (resultsToFetch) {
  sql = `SELECT X FROM Y LIMIT ${index}, 1000`;
  DB1.query(sql, (err, result) => {
    if (err) {
      resultsToFetch = false;
      throw err;
    } else if (result.length == 0) {
      resultsToFetch = false;
    } else {
      result.forEach(res => {
        const obj = {
          id: res.id,
          name: res.name
        };
        sql = "INSERT INTO X SET ?";
        DB2.query(sql, obj, (err, result) => {
          if (err) throw err;
        });
      });
    }
  });
  index += 1000;
}
I'm using LIMIT so I'm not selecting all 6 million entries right away, but I still get a JavaScript heap out of memory error. I think I've misunderstood something related to Node.js, but I'm not quite sure what it is. This is the error:
<--- Last few GCs --->
[11256:000002A5D2CBB600] 22031 ms: Mark-sweep 1418.5 (1482.0) -> 1418.5 (1451.5) MB, 918.3 / 0.0 ms last resort GC in old space requested
[11256:000002A5D2CBB600] 22947 ms: Mark-sweep 1418.5 (1451.5) -> 1418.5 (1451.5) MB, 915.2 / 0.0 ms last resort GC in old space requested
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 000000B356525529 <JSObject>
1: /* anonymous */ [\index.js:~1] [pc=00000042DA416732](this=000000C326B04AD1 <Object map = 0000027D35B023B9>,exports=000000C326B04AD1 <Object map = 0000027D35B023B9>,require=000000C326B04A89 <JSFunction require (sfi = 00000229888651E9)>,module=000000C326B04A39 <Module map = 0000027D35B44F69>,__filename=000002298886B769 <String[52]\
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::DecodeWrite
2: node_module_register
3: v8::internal::FatalProcessOutOfMemory
4: v8::internal::FatalProcessOutOfMemory
5: v8::internal::Factory::NewUninitializedFixedArray
6: v8::internal::WasmDebugInfo::SetupForTesting
7: v8::internal::interpreter::BytecodeArrayRandomIterator::UpdateOffsetFromIndex
8: 00000042DA2843C1
Edit: @Grégory NEUT
let query = DB1.query("SELECT * FROM X");
let index = 0;
query
  .on("error", function (err) {
    // Handle error; an 'end' event will be emitted after this as well
  })
  .on("fields", function (fields) {
    // The field packets for the rows to follow
  })
  .on("result", function (row) {
    // Pausing the connection is useful if your processing involves I/O
    DB1.pause();
    const obj = {
      id: row.id,
    };
    console.log(obj);
    const sql = `INSERT INTO X SET ?`;
    DB2.query(sql, obj, (err, result) => {
      if (err) {
        throw err;
      }
      DB1.resume();
    });
    console.log(index);
    index++;
  })
  .on("end", function () {
    // All rows have been received
  });

I don't know how the mysql driver is implemented in Node.js, but maybe it loads everything first and only then applies the limit. Or maybe 1000 entries are too many.
Either way, the solution is to use streams:
var query = connection.query('SELECT * FROM posts');
query
  .on('error', function (err) {
    // Handle error; an 'end' event will be emitted after this as well
  })
  .on('fields', function (fields) {
    // The field packets for the rows to follow
  })
  .on('result', function (row) {
    // Pausing the connection is useful if your processing involves I/O
    connection.pause();
    processRow(row, function () {
      connection.resume();
    });
  })
  .on('end', function () {
    // All rows have been received
  });
This way, only the row currently being processed is held in memory at any time. Whatever the amount of data you have, you won't hit an allocation failure.
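For what it's worth, the original loop also fails because DB1.query is asynchronous: the while loop never waits, so it queues an unbounded number of queries before any result arrives. If you do want LIMIT/OFFSET paging instead of streaming, the pages have to be awaited in sequence. A minimal sketch, assuming Node's util.promisify wrapped around the same mysql connections:
const util = require('util');
const queryDB1 = util.promisify(DB1.query).bind(DB1);
const queryDB2 = util.promisify(DB2.query).bind(DB2);

async function copyTable() {
  for (let index = 0; ; index += 1000) {
    // Fetch one page and wait for it before asking for the next
    const rows = await queryDB1(`SELECT X FROM Y LIMIT ${index}, 1000`);
    if (rows.length === 0) break; // no more pages
    for (const row of rows) {
      await queryDB2("INSERT INTO X SET ?", { id: row.id, name: row.name });
    }
  }
}

copyTable().catch((err) => console.error(err));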

Related

Catching exception errors when logging in via NodeJS + MySQL

Recently I've been trying to learn Node.js by setting up a login process, so I decided to write out all the errors and make exceptions for them. My question is: how can I make sure each if is responsible for each error code? I haven't worked with try and catch before, so this is new territory for me.
Also, is it better to use multiple try/catch blocks, or should I consider using one block with a switch, for example? (I used else if here as a quick example.)
Table of Status codes:
0 - no connection to the database.
1 - connection OK, but we don't have any privileges to access the database, or something like that.
2 - all OK.
3 - OK, but no data found in the query results.
4 - error getting the results of the query.
5 - other.
module.exports = (username, password, connection) => {
  var data = {
    "Status": null,
    "Data": {} // JSON results inside of the Data JSON
  };
  try {
    connection.query("SELECT id FROM players", function (error, results, fields) {
      if (error) {
        data.Status = 0;
        data.Data = "No connection can be established with the database";
        return data;
      }
      else if (error) {
        data.Status = 1;
        data.Data = results + "Connection OK but no privileges";
        return data;
      }
      else if (error) {
        data.Status = 2;
        data.Data = results + "connection running";
        return data;
      }
      else if (error) {
        data.Status = 3;
        data.Data = results + "No data found in query results";
        return data;
      }
      else if (error) {
        data.Status = 4;
        data.Data = results;
        return data;
      }
      else if (error) {
        data.Status = 5;
        data.Data = results;
        return data;
      }
    });
  }
  catch (e) {
    console.log(e);
    data.Status = 2;
    data.Data = null;
    return data;
  }
};
Welcome to async programming: your try/catch block won't do anything for an I/O operation; all errors are handled via the error object in the callback function (unless you use the newer async/await pattern).
connection.query("SELECT id FROM players", function (error, results, fields) {
if (!error) { // no error, return results
data.status = 2;
data.Data = results;
return data;
}
// for all error code, please check mysql library document
// https://www.npmjs.com/package/mysql#error-handling
if (error.code === 'ER_ACCESS_DENIED_ERROR') {
data.Status = 1;
data.Data=results;
return data
}
// handle any other error codes
// if ....
});
Edit: please note that your exported function in module.exports won't return anything, because you are calling a database query, which is an async I/O operation, and it requires another callback function to hand back the data returned by the database.
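For illustration, a minimal sketch of the exported function reworked to accept a callback, so the caller actually receives the data (the status mapping is abbreviated; extend the error-code branches as needed):
module.exports = (username, password, connection, done) => {
  connection.query("SELECT id FROM players", function (error, results) {
    if (!error) {
      return done({ Status: 2, Data: results }); // all OK
    }
    if (error.code === 'ER_ACCESS_DENIED_ERROR') {
      return done({ Status: 1, Data: "Connection OK but no privileges" });
    }
    done({ Status: 5, Data: null }); // any other error
  });
};
// Usage: login(user, pass, connection, (data) => console.log(data.Status));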
This will never work as expected:
if (error) {
  console.log("I'm the error");
  return;
} else if (error) {
  console.log("I will never be displayed on error because of the return in the first if");
}
It should be:
if (!error) {
  // No error
} else if (error === 'something') {
  // Error something
} else if (...) {
  // Other type of error
} else {
  // Unknown error
}
You can use a switch instead, in a more elegant way:
const data = { Status: 1, Data: results };
if (error) {
  switch (error.code) {
    case 'ER_ACCESS_DENIED_ERROR':
      data.Status = 2;
      return data;
    ...
  }
}
return data;

NodeJs MySql multiple update

I have a method in a Node.js service using the Express framework, in which I iterate over an array and make updates in a MySQL DB.
The code receives a connection object and a POST body. The POST request body is an array of objects with the data to be saved in the DB.
I am trying to loop over the objects one by one and save them in the DB using an UPDATE query.
Now the strange part: the code works only if it gets called twice in immediate succession. That is, on testing I found out I have to make the API request twice in order for the code to save the data.
I get the following error on the first API call:
Error Code: 1205. Lock wait timeout exceeded; try restarting transaction
It's a simple UPDATE call, and I checked the MySQL processes and there was no deadlock:
SHOW FULL PROCESSLIST;
But the same code works on the immediate second API call.
let updateByDonationId = async (conn, requestBody, callback) => {
  let donations = [];
  donations = requestBody.donations;
  //for(let i in donations){
  async.each(donations, function (singleDonation, callback) {
    //let params = donations[i];
    let params = singleDonation;
    let sqlData = [];
    let columns = "";
    if (params.current_location != null) {
      sqlData.push(params.current_location);
      columns += "`current_location` = ?,";
    }
    if (params.destination_location != null) {
      sqlData.push(params.destination_location);
      columns += "`destination_location` = ?,";
    }
    if (columns != '') {
      columns = columns.substring(0, columns.length - 1);
      let sqlQuery = 'UPDATE donation_data SET ' + columns
        + ' WHERE donation_id = "' + params.donation_id + '"';
      conn.query(sqlQuery, sqlData, function (err, result) {
        logger.info(METHOD_TAG, this.sql);
        if (err) {
          logger.error(METHOD_TAG, err);
          return callback(err, false);
        }
      });
    }
    else {
      return callback(null, false);
    }
    columns = "";
    sqlData = [];
  },
  function (err, results) {
    if (err) {
      logger.error(METHOD_TAG, err);
      return callback(err, false);
    }
    else {
      return callback(null, true);
    }
  });
  //return callback(null, true);
} // END
Also, referring to the following, I guess he was getting an ER_LOCK_WAIT_TIMEOUT for a weird reason as well:
NodeJS + mysql: using connection pool leads to deadlock tables
The issue seems to be with the non-blocking, async nature of Node, as rightly pointed out.
Can anyone help with the correct code?
I'd say the asynchronous nature of Node.js is going to be causing you issues here. You can try rewriting your loop. You can either use promises or the async.eachSeries method.
Try changing your loop to use the below:
async.eachSeries(donations, function(singleDonation, callback) {
The .query() method is asynchronous; because of this, your code tries to execute one query after another without waiting for the former to finish. On the database side they just get queued up if they happen to affect the same portion of the data, i.e., one query holds a "lock" on that portion of the database. So one transaction has to wait for another to finish, and if the wait is longer than the threshold value, you get the error you are seeing.
But you said you are not getting the error on the second immediate call. My guess is that during the first call(s) the data was cached, so the second call was faster; fast enough to keep the wait under the threshold value, which is why the error was not raised on the second call.
To avoid this all together and still maintain the asynchronous nature of code you can use Promise along with async-await.
The first step is to create a Promise based wrapper function for our .query() function, like so:
let qPromise = async (conn, q, qData) => new Promise((resolve, reject) => {
  conn.query(q, qData, function (err, result) {
    if (err) {
      reject(err);
      return;
    }
    resolve(result);
  });
});
Now here is your modified function which uses this Promise based function and async-await:
let updateByDonationId = async (conn, requestBody, callback) => {
  let donations = requestBody.donations;
  try {
    for (let i in donations) {
      let params = donations[i];
      let sqlData = [];
      let columns = "";
      if (params.current_location != null) {
        sqlData.push(params.current_location);
        columns += "`current_location` = ?,";
      }
      if (params.destination_location != null) {
        sqlData.push(params.destination_location);
        columns += "`destination_location` = ?,";
      }
      if (columns != '') {
        columns = columns.substring(0, columns.length - 1);
        let sqlQuery = 'UPDATE donation_data SET ' + columns
          + ' WHERE donation_id = "' + params.donation_id + '"';
        let result = await qPromise(conn, sqlQuery, sqlData);
        logger.info(METHOD_TAG, result); // logging result, you might want to modify this
      } else {
        return callback(null, false);
      }
    }
  } catch (e) {
    logger.error(METHOD_TAG, e);
    return callback(e, false); // note: the caught error is `e`, not `err`
  }
  return callback(null, true);
}
I wrote the code on the fly, so there may be some syntax errors.
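For completeness, a hypothetical Express route wiring this up (the route path, the conn variable, and the response shape are made up for illustration):
router.post('/donations/update', function (req, res) {
  updateByDonationId(conn, req.body, function (err, success) {
    if (err) {
      return res.status(500).json({ success: false });
    }
    res.json({ success: success });
  });
});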

Node micro-service continuously querying mysql db

I am new to Node.js, and I am trying to create a simple micro-service that continuously polls records from a database, executes asynchronous jobs on those records, and updates the state of those records in the database once the job is done.
Basically, this is my query and loop:
const con = mysql.createConnection({
  host: 'localhost',
  user: 'user',
  password: 'password',
  database: 'sitepoint'
});

con.query('SELECT * FROM records WHERE flag = 0', (err, rows) => {
  if (err) throw err;
  rows.forEach((row) => {
    someFunction(row, function (result, err) {
      if (err) throw err;
      // Update record in db
    });
  });
});
This is a rough idea of what I'm doing. My issue is that I want to do this continuously on a set interval, say every minute, but I want the interval to start counting after the last row from the query has been processed. In other words, I want to block until all fetched rows are processed. What are my options in Node.js?
I'd suggest using a simple pattern used in plenty of servers, where jobs queue themselves. You get the benefit that the interval can change each time, and that the interval between job completions is fixed rather than the interval between job starts.
function getQueryInterval() {
  // Could read from a DB, Redis, etc.
  return 60000;
}

function processRow(row) {
  /* Do some good stuff with the row. */
  return Promise.resolve('OK');
}

function runQuery() {
  console.log(new Date().toISOString(), 'runQuery: Running..');
  // Return a master promise; this will only resolve when everything is complete.
  // However, as soon as an error is encountered it will reject.
  return new Promise((resolve, reject) => {
    con.query('SELECT * FROM records WHERE flag = 0', (err, rows) => {
      if (err) return reject(err);
      // Wrap each row in a promise that settles once someFunction's callback
      // has fired and processRow has finished. Building the array synchronously
      // matters: pushing from inside the async callbacks would hand Promise.all
      // an empty array.
      const processPromises = rows.map((row) => new Promise((res, rej) => {
        someFunction(row, function (result, err) {
          if (err) return rej(err);
          // Update record in db
          processRow(row).then(res, rej);
        });
      }));
      // Only resolve when all records are processed.
      Promise.all(processPromises).then(resolve).catch(reject);
    });
  });
}
async function runQueryAndQueueNext() {
  try {
    await runQuery();
  }
  catch (err) {
    console.error(new Date().toISOString(), 'runQueryAndQueueNext: Error occurred: ', err);
  }
  console.log(new Date().toISOString(), 'runQueryAndQueueNext: Query complete, queuing next in ' + getQueryInterval() + ' ms');
  setTimeout(runQueryAndQueueNext, getQueryInterval());
}

runQueryAndQueueNext();
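If a promise-based driver is available, the same pattern collapses further. A minimal sketch, assuming the mysql2/promise package and a promise-returning processRow() as above:
const mysql = require('mysql2/promise');

async function pollForever() {
  const con = await mysql.createConnection({
    host: 'localhost',
    user: 'user',
    password: 'password',
    database: 'sitepoint'
  });
  while (true) {
    const [rows] = await con.query('SELECT * FROM records WHERE flag = 0');
    for (const row of rows) {
      await processRow(row); // sequential; waits for each row before the next
    }
    // The wait only starts after the last row has been processed
    await new Promise((resolve) => setTimeout(resolve, getQueryInterval()));
  }
}

pollForever().catch((err) => console.error(err));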

Nodejs, Cloud Firestore Upload Tasks - Auth error:Error: socket hang up

I'm coding a function that runs API calls and requests JSON from a huge database in sequence via offsets. The JSON response is parsed and then the subsequent data within is uploaded to our Cloud Firestore server.
Nodejs (Node 6.11.3) & Latest Firebase Admin SDK
The information is parsed as expected, and prints to the console perfectly. When the data attempts to upload to our Firestore database however, the console is spammed with the error message:
Auth error:Error: socket hang up
(node:846) UnhandledPromiseRejectionWarning: Unhandled promise rejection
(rejection id: -Number-): Error: Getting metadata from plugin failed with
error: socket hang up
and occasionally:
Auth error:Error: read ECONNRESET
The forEach function collects the items from the downloaded JSON and processes the data before uploading it to the Firestore database. Each JSON has up to 1000 items of data (1000 documents' worth) to pass through the forEach function. I suspect this might be a problem if the function repeats before the upload set finishes.
I'm a coding newbie and understand that the control flow of this function isn't the best. However, I can't find any information on the error that the console prints. I can find plenty of information on socket hang ups, but none on the Auth error section.
I'm using a generated service account JSON as a credential to access our database, which uses the firebase-adminsdk account. Our read/write rules for the database are currently open to allow any access (as we're in development with no real users).
Here's my function:
Firebase initialisation & offset zero-ing
const admin = require('firebase-admin');
var serviceAccount = require("JSON");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "URL"
});

var db = admin.firestore();
var offset = 0;
var failed = false;
Running the function & setting HTTP Headers
var runFunction = function runFunction() {
  var https = require('https');
  var options = {
    host: 'website.com',
    path: (path including an offset and 1000 row specifier),
    method: 'GET',
    json: true,
    headers: {
      'content-type': 'application/json',
      'Authorization': 'Basic ' + new Buffer('username' + ':' + 'password').toString('base64')
    }
  };
Running the HTTP Request & Re-running the function if we haven't reached the end of the response from the API
  if (failed === false) {
    var req = https.request(options, function (res) {
      var body = '';
      res.setEncoding('utf8');
      res.on('data', function (chunk) {
        body += chunk;
      });
      res.on('end', () => {
        console.log('Successfully processed HTTPS response');
        body = JSON.parse(body);
        if (body.hasOwnProperty('errors')) {
          console.log('Body ->' + body);
          console.log('API Call failed due to server error');
          console.log('Function failed at ' + offset);
          req.end();
          return;
        } else {
          if (body.hasOwnProperty('result')) {
            let result = body.result;
            if (Object.keys(result).length === 0) {
              console.log('Function has completed');
              failed = true;
              return;
            } else {
              result.forEach(function (item) {
                var docRef = db.collection('collection').doc(name);
                console.log(name);
                var upload = docRef.set({
                  thing: data,
                  thing2: data,
                });
              });
              console.log('Finished offset ' + offset);
              offset = offset + 1000;
              failed = false;
            }
            if (failed === false) {
              console.log('Function will repeat with new offset');
              console.log('offset = ' + offset);
              req.end();
              runFunction();
            } else {
              console.log('Function will terminate');
            }
          }
        }
      });
    });
    req.on('error', (err) => {
      console.log('Error -> ' + err);
      console.log('Function failed at ' + offset);
      console.log('Repeat from the given offset value or diagnose further');
      req.end();
    });
    req.end();
  } else {
    req.end();
  }
};
runFunction();
Any help would be greatly appreciated!
UPDATE
I've just tried changing the rows of JSON that I pull at a time and subsequently upload at a time using the function - from 1000 down to 100. The socket hang up errors are less frequent so it is definitely due to overloading the database.
Ideally it would be perfect if each forEach array iteration waited for the previous iteration to complete before commencing.
UPDATE #2
I've installed the async module and I'm currently using the async.eachSeries function to perform one document upload at a time. All errors mid-upload disappear - however the function will take an insane amount of time to finish (roughly 9 hours for 158,000 documents). My updated loop code is this, with a counter implemented:
async.eachSeries(result, function (item, callback) {
  // result.forEach(function (item) {
  var docRef = db.collection('collection').doc(name);
  console.log(name);
  var upload = docRef.set({
    thing: data,
    thing2: data,
  }, { merge: true }).then(ref => {
    counter = counter + 1;
    if (counter == result.length) {
      console.log('Finished offset ' + offset);
      offset = offset + 1000;
      console.log('Function will repeat with new offset');
      console.log('offset = ' + offset);
      failed = false;
      counter = 0;
      req.end();
      runFunction();
    }
    callback();
  });
});
Also, after a period of time the database returns this error:
(node:16168) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: -Number-): Error: The datastore operation timed out, or the data was temporarily unavailable.
It seems as if now my function is taking too long... instead of not long enough. Does anyone have any advice on how to make this run faster without stated errors?
The write requests in this loop were simply exceeding Firestore's quota, so the server was rejecting the majority of them.
To solve this issue I converted my requests to upload in chunks of 50 or so items at a time, with Promises confirming when to move on to the next chunk upload.
The answer was posted here -> Iterate through an array in blocks of 50 items at a time in node.js, and the template for my working code is below:
async function uploadData(dataArray) {
  try {
    const chunks = chunkArray(dataArray, 50);
    for (const [index, chunk] of chunks.entries()) {
      console.log(`--- Uploading ${index + 1} chunk started ---`);
      await uploadDataChunk(chunk);
      console.log(`--- Uploading ${index + 1} chunk finished ---`);
    }
  } catch (error) {
    console.log(error);
    // Catch an error here
  }
}

function uploadDataChunk(chunk) {
  return Promise.all(
    chunk.map((item) => new Promise((resolve, reject) => {
      setTimeout(
        () => {
          console.log(`Chunk item ${item} uploaded`);
          resolve();
        },
        Math.floor(Math.random() * 500)
      );
    }))
  );
}

function chunkArray(array, chunkSize) {
  return Array.from(
    { length: Math.ceil(array.length / chunkSize) },
    (_, index) => array.slice(index * chunkSize, (index + 1) * chunkSize)
  );
}
Pass the data array through to uploadData using uploadData(data);, and put your upload code for each item into uploadDataChunk, inside the setTimeout block (before the resolve() line) within the chunk.map function.
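For instance, substituting a real Firestore write for the setTimeout stub might look like this (a sketch; 'collection' and the item fields are placeholders carried over from the question):
function uploadDataChunk(chunk) {
  return Promise.all(
    chunk.map((item) =>
      // docRef.set() already returns a promise, so Promise.all
      // resolves once every write in the chunk has completed
      db.collection('collection').doc(item.name).set({
        thing: item.thing,   // placeholder fields from the question
        thing2: item.thing2,
      })
    )
  );
}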
I got around this by chaining the promises in the loop with a wait of 50 milliseconds in between each.
function Wait() {
  return new Promise(r => setTimeout(r, 50));
}

function writeDataToFirestoreParentPhones(data) {
  let chain = Promise.resolve();
  for (let i = 0; i < data.length; ++i) {
    // const (not var) so each link of the chain closes over its own docRef
    const docRef = db.collection('parent_phones').doc(data[i].kp_ID_for_Realm);
    chain = chain.then(() => {
      var setAda = docRef.set({
        parent_id: data[i].kf_ParentID,
        contact_number: data[i].contact_number,
        contact_type: data[i].contact_type
      }).then(ref => {
        console.log(i + ' - Added parent_phones with ID: ', data[i].kp_ID_for_Realm);
      }).catch(function (error) {
        console.error("Error writing document: ", error);
      });
      return setAda; // wait for the write before the 50 ms pause
    })
    .then(Wait);
  }
  return chain;
}
For me this turned out to be a network issue.
Uploading 180,000 documents in batches of 10,000 was no trouble for me before and today having used a public, slower wifi connection, I received that error.
Switching back to my 4G mobile connection sorted the problem for me. Not sure whether it's a speed issue - could have been a security issue - but I'll go with that assumption.

Use promise to process MySQL return value in node.js

I have a Python background and am currently migrating to Node.js. I have trouble adjusting to Node.js due to its asynchronous nature.
For example, I am trying to return a value from a MySQL function.
function getLastRecord(name) {
  var connection = getMySQL_connection();
  var query_str =
    "SELECT name " +
    "FROM records " +
    "WHERE (name = ?) " +
    "LIMIT 1 ";
  var query_var = [name];
  var query = connection.query(query_str, query_var, function (err, rows, fields) {
    //if (err) throw err;
    if (err) {
      //throw err;
      console.log(err);
      logger.info(err);
    }
    else {
      //console.log(rows);
      return rows;
    }
  }); //var query = connection.query(query_str, function (err, rows, fields) {
}

var rows = getLastRecord('name_record');
console.log(rows);
After some reading up, I realize the above code cannot work and that I need to return a promise, due to Node.js's asynchronous nature; I cannot write Node.js code like Python. How do I convert getLastRecord() to return a promise, and how do I handle the returned value?
In fact, what I want to do is something like this:
if (getLastRecord() > 20) {
  console.log("action");
}
How can this be done in node.js in a readable way?
I would like to see how promises can be implemented in this case using bluebird.
This is gonna be a little scattered, forgive me.
First, assuming this code uses the mysql driver API correctly, here's one way you could wrap it to work with a native promise:
function getLastRecord(name) {
  return new Promise(function (resolve, reject) {
    // The Promise constructor should catch any errors thrown on
    // this tick. Alternately, try/catch and reject(err) on catch.
    var connection = getMySQL_connection();
    var query_str =
      "SELECT name " +
      "FROM records " +
      "WHERE (name = ?) " +
      "LIMIT 1 ";
    var query_var = [name];
    connection.query(query_str, query_var, function (err, rows, fields) {
      // Call reject on error states,
      // call resolve with results
      if (err) {
        return reject(err);
      }
      resolve(rows);
    });
  });
}

getLastRecord('name_record').then(function (rows) {
  // now you have your rows, you can see if there are < 20 of them
}).catch((err) => setImmediate(() => { throw err; })); // Throw async to escape the promise chain
So one thing: you still have callbacks. Callbacks are just functions that you hand to something to call at some point in the future, with arguments of its choosing. So the function arguments in xs.map(fn), the (err, result) functions seen in Node, and the promise result and error handlers are all callbacks. This is somewhat confused by people referring to one specific kind of callback as "callbacks": the (err, result) ones used in Node core in what's called "continuation-passing style", sometimes called "nodebacks" by people who don't really like them.
For now, at least (async/await is coming eventually), you're pretty much stuck with callbacks, regardless of whether you adopt promises or not.
Also, I'll note that promises aren't immediately, obviously helpful here, as you still have a callback. Promises only really shine when you combine them with Promise.all and promise accumulators a la Array.prototype.reduce. But they do shine sometimes, and they are worth learning.
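For instance, once getLastRecord returns a promise, running several lookups in parallel and waiting for all of them becomes straightforward (a small sketch reusing the wrapper above; the record names are made up):
Promise.all([
  getLastRecord('record_a'),
  getLastRecord('record_b'),
  getLastRecord('record_c'),
]).then(function ([rowsA, rowsB, rowsC]) {
  // All three queries have completed here
  console.log(rowsA.length, rowsB.length, rowsC.length);
}).catch(function (err) {
  console.error(err);
});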
I have modified your code to use the Q promise library (an npm module).
I assume the getLastRecord() function you specified in the snippet above works correctly.
You can refer to the following link to get hold of the Q module:
Click here: Q documentation
var q = require('q');

function getLastRecord(name) {
  var deferred = q.defer(); // Use Q
  var connection = getMySQL_connection();
  var query_str =
    "SELECT name " +
    "FROM records " +
    "WHERE (name = ?) " +
    "LIMIT 1 ";
  var query_var = [name];
  var query = connection.query(query_str, query_var, function (err, rows, fields) {
    if (err) {
      deferred.reject(err);
    }
    else {
      deferred.resolve(rows);
    }
  });
  return deferred.promise;
}

// Call the method like this
getLastRecord('name_record')
  .then(function (rows) {
    // This function gets called on success
    console.log(rows);
  }, function (error) {
    // This function gets called on error
    console.log(error);
  });
I am new to Node.js and promises. I searched for a while for something that would meet my needs, and this is what I ended up using after combining several examples I found. I wanted the ability to acquire a connection per query and release it right after the query finishes (querySql), or to get a connection from the pool and use it within a Promise.using scope, releasing it whenever I like (getSqlConnection).
Using this method you can chain several queries one after another without nesting them.
db.js
var mysql = require('mysql');
var Promise = require("bluebird");

Promise.promisifyAll(mysql);
Promise.promisifyAll(require("mysql/lib/Connection").prototype);
Promise.promisifyAll(require("mysql/lib/Pool").prototype);

var pool = mysql.createPool({
  host: 'my_aws_host',
  port: '3306',
  user: 'my_user',
  password: 'my_password',
  database: 'db_name'
});

function getSqlConnection() {
  return pool.getConnectionAsync().disposer(function (connection) {
    console.log("Releasing connection back to pool");
    connection.release();
  });
}

function querySql(query, params) {
  return Promise.using(getSqlConnection(), function (connection) {
    console.log("Got connection from pool");
    if (typeof params !== 'undefined') {
      return connection.queryAsync(query, params);
    } else {
      return connection.queryAsync(query);
    }
  });
}

module.exports = {
  getSqlConnection: getSqlConnection,
  querySql: querySql
};
usage_route.js
var express = require('express');
var router = express.Router();
var dateFormat = require('dateformat');
var db = require('../my_modules/db');
var getSqlConnection = db.getSqlConnection;
var querySql = db.querySql;
var Promise = require("bluebird");

function retrieveUser(token) {
  var userQuery = "select id, email from users where token = ?";
  return querySql(userQuery, [token])
    .then(function (rows) {
      if (rows.length == 0) {
        return Promise.reject("did not find user");
      }
      var user = rows[0];
      return user;
    });
}

router.post('/', function (req, res, next) {
  Promise.resolve().then(function () {
    return retrieveUser(req.body.token);
  })
  .then(function (user) {
    email = user.email;
    res.status(200).json({ "code": 0, "message": "success", "email": email });
  })
  .catch(function (err) {
    console.error("got error: " + err);
    if (err instanceof Error) {
      res.status(400).send("General error");
    } else {
      res.status(200).json({ "code": 1000, "message": err });
    }
  });
});

module.exports = router;
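To illustrate the "several queries without nesting" point, here is a short sketch chaining two querySql calls (the orders table and its columns are invented for the example):
function retrieveUserWithOrders(token) {
  return querySql("select id, email from users where token = ?", [token])
    .then(function (rows) {
      if (rows.length == 0) {
        return Promise.reject("did not find user");
      }
      // The second query reuses the result of the first, no nesting required
      return querySql("select * from orders where user_id = ?", [rows[0].id]);
    });
}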
I am still a bit new to Node, so maybe I missed something; let me know how it works out. Instead of you opting in to async behavior, Node just forces it on you, so you have to think ahead and plan for it.
const mysql = require('mysql');

const db = mysql.createConnection({
  host: 'localhost',
  user: 'user',
  password: 'password',
  database: 'database',
});

db.connect((err) => {
  // you should probably add reject instead of throwing an error
  // reject(new Error());
  if (err) { throw err; }
  console.log('Mysql: Connected');
});

db.promise = (sql) => {
  return new Promise((resolve, reject) => {
    db.query(sql, (err, result) => {
      if (err) { reject(new Error()); }
      else { resolve(result); }
    });
  });
};
Here I am using the mysql module as normal, but I created a new function to handle the promise ahead of time, by adding it to the db const (you'll see this as "connection" in a lot of Node examples).
Now let's call a MySQL query using the promise:
db.promise("SELECT * FROM users WHERE username='john doe' LIMIT 1;")
.then((result)=>{
console.log(result);
}).catch((err)=>{
console.log(err);
});
What I have found this useful for is when you need to do a second query based on the first query.
db.promise("SELECT * FROM users WHERE username='john doe' LIMIT 1;")
.then((result)=>{
console.log(result);
var sql = "SELECT * FROM friends WHERE username='";
sql = result[0];
sql = "';"
return db.promise(sql);
}).then((result)=>{
console.log(result);
}).catch((err)=>{
console.log(err);
});
You should really use mysql placeholder variables (?) here instead of string concatenation, but this should at least give you an example of using promises with the mysql module.
Also, with the above you can still continue to use db.query the normal way at any time within these promises; they just work like normal.
Hope this helps with the triangle of death.
You don't need to use promises; you can use a callback function, something like this:
function getLastRecord(name, next) {
  var connection = getMySQL_connection();
  var query_str =
    "SELECT name " +
    "FROM records " +
    "WHERE (name = ?) " +
    "LIMIT 1 ";
  var query_var = [name];
  var query = connection.query(query_str, query_var, function (err, rows, fields) {
    if (err) {
      console.log(err);
      logger.info(err);
      next(err);
    }
    else {
      next(null, rows);
    }
  });
}

getLastRecord('name_record', function (err, data) {
  if (err) {
    // handle the error
  } else {
    // handle your data
  }
});
Using the package promise-mysql, the logic is to chain promises using then(function(response){ your code }) and catch(function(response){ your code }) to catch errors from the "then" blocks preceding the catch block.
Following this logic, you pass query results along in objects or arrays using return at the end of each block. The return helps pass the query results to the next block; the result then shows up in the function argument (here it is test1). Using this logic you can chain several MySQL queries, along with the code required to manipulate the results and do whatever you want.
The connection object is created as a global because every object and variable created in each block is local to that block. Don't forget that you can chain more "then" blocks.
var config = {
  host: 'host',
  user: 'user',
  password: 'pass',
  database: 'database',
};

var mysql = require('promise-mysql');
var connection;

let thename = ""; // which can also be an argument if you embed this code in a function

mysql.createConnection(config).then(function (conn) {
  connection = conn;
  let test = connection.query('select name from records WHERE name=? LIMIT 1', [thename]);
  return test;
}).then(function (test1) {
  console.log("test1" + JSON.stringify(test1)); // result of previous block
  var result = connection.query('select * from users'); // a second query if you want
  connection.end();
  connection = {};
  return result;
}).catch(function (error) {
  if (connection && connection.end) connection.end();
  // logs the error from the previous blocks (if there is any issue, add a second catch after this one)
  console.log(error);
});
To answer your initial question, "How can this be done in node.js in a readable way?":
There is a library called co, which gives you the possibility to write async code in a synchronous-looking workflow. Just have a look and npm install co.
The problem you'll face very often with that approach is that not every library you'd like to use returns promises. So you either have to wrap them yourself (see the answer from @Joshua Holbrook) or look for a wrapper (for example: npm install mysql-promise).
(Btw: native support for this type of workflow with the keywords async/await is on the roadmap for ES7, but it's not yet in Node: node feature list.)
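For example, a minimal co sketch, assuming getLastRecord has already been wrapped to return a promise as in the other answers (the record name is reused from the question):
const co = require('co');

co(function* () {
  // yield suspends until the promise resolves, so this reads top-to-bottom
  const rows = yield getLastRecord('name_record');
  if (rows.length > 0) {
    console.log("action");
  }
}).catch(function (err) {
  console.log(err);
});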
This can be achieved quite simply, for example with bluebird, as you asked:
var Promise = require('bluebird');

function getLastRecord(name) {
  return new Promise(function (resolve, reject) {
    var connection = getMySQL_connection();
    var query_str =
      "SELECT name " +
      "FROM records " +
      "WHERE (name = ?) " +
      "LIMIT 1 ";
    var query_var = [name];
    var query = connection.query(query_str, query_var, function (err, rows, fields) {
      if (err) {
        console.log(err);
        logger.info(err);
        reject(err);
      }
      else {
        resolve(rows);
      }
    });
  });
}

getLastRecord('name_record')
  .then(function (rows) {
    if (rows.length > 20) { // compare the row count, not the array itself
      console.log("action");
    }
  })
  .error(function (e) { console.log("Error handler " + e); })
  .catch(function (e) { console.log("Catch handler " + e); });
May be helpful for others: extending @Dillon Burnett's answer, using async/await and params.
db.promise = (sql, params) => {
  return new Promise((resolve, reject) => {
    db.query(sql, params, (err, result) => {
      if (err) { reject(new Error()); }
      else { resolve(result); }
    });
  });
};

module.exports = db;

// In the consuming module, inside a class or object:
async connection() {
  const result = await db.promise("SELECT * FROM users WHERE username=?", [username]);
  return result;
}