ROLLBACK doesn't roll back the transaction [Node.js, MySQL]

connection.query(`START TRANSACTION;`, async function (err) {
  if (err) {
    req.flash("flash", "Something went wrong while deleting. Try again.");
    return res.redirect("back");
  } else {
    await connection.query(
      `INSERT INTO... ; SELECT LAST_INSERT_ID();`,
      async (error, results1) => {
        if (error) {
          await connection.query(`ROLLBACK;`, function (err) {
            req.flash("flash", "There was an error while posting.");
            return res.redirect("/post");
          });
        } else {
          var post_id = await results1[0].insertId;
          await connection.query(
            `INSERT INTO...`,
            [],
            async (error, results) => {
              if (error) {
                await connection.query(`ROLLBACK;`, function (err) {
                  req.flash("flash", "There was an error while posting.");
                  return res.redirect("/post");
                });
              } else {
                await connection.query(
                  `INSERT INTO...`,
                  async (error, results) => {
                    if (error) {
                      await connection.query(`ROLLBACK;`, function (err) {
                        req.flash("flash", "There was an error while posting.");
                        return res.redirect("/post");
                      });
                    }
                    // deliberate ROLLBACK here, to test that everything since START TRANSACTION is undone
                    await connection.query(`ROLLBACK;`, function (err) {
                      req.flash("flash", "Went through...");
                      return res.redirect("back");
                    })
                    ...
                  });
First I start the transaction. Then the code in the middle inserts data into 3 tables. I wanted to test the rollback, which is supposed to undo everything after START TRANSACTION, but it didn't do anything. What did I do wrong?
(The code works by itself, so please ignore any misplaced brackets.)

The problem was that I was using the pool directly. You need to get a dedicated connection from the pool and then use that connection for all the queries in the transaction. That is also the reason beginTransaction wasn't working for me before. What I should have done is:
connection.getConnection(function (err, con) {
  con.query...
  con.beginTransaction()...
  con.query...
  con.rollback()...
})
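Fleshed out, that pattern looks roughly like this. This is a sketch rather than the exact code from the question: it assumes the pool comes from mysql.createPool, that req/res are in scope, and the table and data names (posts, post) are placeholders.
const mysql = require("mysql");
const pool = mysql.createPool({ host: "localhost", user: "me", database: "mydb" });

// Take a dedicated connection so every statement of the transaction runs on it.
pool.getConnection(function (err, con) {
  if (err) {
    req.flash("flash", "Something went wrong. Try again.");
    return res.redirect("back");
  }
  con.beginTransaction(function (err) {
    if (err) { con.release(); return res.redirect("back"); }
    con.query("INSERT INTO posts SET ?", post, function (err, result) {
      if (err) {
        // Undoes everything done on this connection since beginTransaction.
        return con.rollback(function () {
          con.release();
          req.flash("flash", "There was an error while posting.");
          res.redirect("/post");
        });
      }
      con.commit(function (err) {
        if (err) {
          return con.rollback(function () {
            con.release();
            res.redirect("/post");
          });
        }
        con.release();
        res.redirect("back");
      });
    });
  });
});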

You can check the multiple-transaction-manager library to manage your transactions.
https://www.npmjs.com/package/@multiple-transaction-manager/mysql
Example:
// init manager & context
const txnMngr: MultiTxnMngr = new MultiTxnMngr();
const mysqlContext = new MysqlDBContext(txnMngr, pool);
// Add first step
mysqlContext.addTask("DELETE FROM test_table");
// Add second step
mysqlContext.addTask("INSERT INTO test_table(id, name) VALUES (:id, :name)", { "id": 1, "name": "Dave" });
// Add third step
mysqlContext.addTask("INSERT INTO test_table(id, name) VALUES (:id, :name)", { "id": 2, "name": "Kevin" });
// jest test
await expect(txnMngr.exec()).resolves.not.toBeNull();

Related

Variable doesn't get set in connection query [Node.js, MySQL]

I am having an issue with the session variable not being set. I am assuming the problem has something to do with the functions being asynchronous (async/await).
How do I make sure that session is set before jwt.sign and committing (how should I use async await or promises here)?
connection.getConnection(function(err, conn) {
  try {
    conn.beginTransaction(function(err) {
      if (err) {
        throw err
      }
      conn.query(`INSERT INTO USER SET username = ?, email = ?; SELECT LAST_INSERT_ID() AS id;`, [username, hashedEmail], (error, results) => {
        if (error) {
          throw error;
        } else {
          // this doesn't set
          req.session.verify = results[0].id;
          jwt.sign(
            {},
            process.env.gmail_secret,
            {
              expiresIn: '1d',
            },
            (err, emailToken) => {
              //...
            }
          )
          await conn.commit(async function(error) {
            if (error) {
              throw err;
            }
            await conn.destroy();
            req.flash("flash", "You have successfully created an account. The last step is to confirm your email so we know you are legit =].");
            return res.redirect('/register');
          });
        }
      })
    })
  } catch {
    return conn.rollback(function() {
      conn.destroy();
      req.flash("flash", 'A server error has occurred.');
      res.redirect('/register');
    });
  }
});

Multiple Transactions in mysql for Node

I'm using node's driver for mysql and need to execute 'n' number of transactions one after the other and not simultaneously.
I've tried using a for/forEach loop, but the transactions seem to happen concurrently, and that causes my API to crash. Here's the error:
throw err; // Rethrow non-MySQL errors
^
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
A single transaction seems to work just fine.
Each transaction has 4 queries; req.body is an array of objects:
router.post('/production/add', (req, res) => {
  for (const obj of req.body) {
    pool.getConnection(function (err, connection) {
      connection.beginTransaction(function (err) {
        if (err) throw err;
        const query1 = `select qty from production where prc_id = ${obj.prc_id}`;
        console.log(query1);
        connection.query(query1, function (error, result1, fields) {
          if (error) {
            return connection.rollback(function () {
              res.status(400).send({ query: 1, message: error.sqlMessage, code: error.code, errno: error.errno });
            });
          }
          const new_prod_qty = result1[0].qty - obj.auth_prod_qty;
          const query2 = new_prod_qty > 0 ? `update production set qty = ${new_prod_qty} where prc_id = ${obj.prc_id}` : `delete from production where prc_id = ${obj.prc_id}`;
          console.log(query2);
          connection.query(query2, function (error, results2, fields) {
            if (error) {
              return connection.rollback(function () {
                res.status(400).send({ message: error.sqlMessage, code: error.code, errno: error.errno });
              });
            }
            const query3 = `update prc set auth_prod_qty = ${obj.auth_prod_qty} where prc_id = ${obj.prc_id}`;
            console.log(query3);
            connection.query(query3, function (error, results3, fields) {
              if (error) {
                return connection.rollback(function () {
                  res.status(400).send({ message: error.sqlMessage, code: error.code, errno: error.errno });
                });
              }
              const query4 = "select * from store";
              connection.query(query4, function (error, results4, fields) {
                if (error) {
                  return connection.rollback(function () {
                    res.status(400).send({ message: error.sqlMessage, code: error.code, errno: error.errno });
                  });
                }
                connection.commit(function (err) {
                  if (err) {
                    return connection.rollback(function () {
                      res.status(400).send({ message: error.sqlMessage, code: error.code, errno: error.errno });
                    });
                  }
                  res.status(201).send(results2);
                });
              });
            });
          });
        });
      });
    });
  }
});
Based on some research, Sequelize ORM seems to promisify transactions, but I'm hoping to use it only as a last resort. Any solution, with or without Sequelize, would be appreciated!
Thanks in advance!
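(For reference, the Sequelize route mentioned above would look roughly like the sketch below. It assumes a configured sequelize instance and raw queries; the statements and names are placeholders, and this is not code from the question or the answers that follow.)
// Managed transaction: Sequelize commits when the callback resolves
// and rolls back automatically when it throws.
router.post('/production/add', async (req, res) => {
  try {
    for (const obj of req.body) {
      await sequelize.transaction(async (t) => {
        const [rows] = await sequelize.query(
          'SELECT qty FROM production WHERE prc_id = ?',
          { replacements: [obj.prc_id], transaction: t }
        );
        // ...remaining statements, each passed { transaction: t }
      });
    }
    res.status(201).send('ok');
  } catch (error) {
    res.status(400).send({ message: error.message });
  }
});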
You need to use async / await to run your txs sequentially. How to do this?
Use npm mysql2 in place of npm mysql. That gets you promisified (awaitable) versions of the APIs when you require('mysql2/promise'). Plus, this is much more fun to program and debug than those miserable nested callbacks. Just don't forget the awaits.
Use this basic outline for your code's data processing loop. Everything will go in order sequentially. The way you create your pool is a little different; read the npm page. This is not debugged.
const mysql = require('mysql2/promise');

router.post('/production/add', async (req, res) => {
  const connection = await pool.getConnection()
  for (const obj of req.body) {
    try {
      await connection.beginTransaction()
      const query1 = 'whatever'
      const result1 = await connection.query(query1)
      const query2 = 'something else'
      const result2 = await connection.query(query2)
      /* etcetera etcetera */
      await connection.commit()
    }
    catch (error) {
      await connection.rollback()
      connection.release()
      return res.status(400).send({ something })
    }
  }
  connection.release()
})
mysql2/promise is exactly the package I was looking for: it works with mysql and its promise() method upgrades a callback-based connection to a promise-based mysql2 connection.
router.post('/stock/add', async (req, res) => {
  pool.getConnection(async function (err, connection) {
    if (err) {
      connection.release();
      res.status(400).send(err);
      return;
    }
    else {
      for (const obj of req.body) {
        try {
          await connection.promise().beginTransaction();
          const [result1, fields1] = await connection.promise().query(query1);
          const [result2, fields2] = await connection.promise().query(query2);
          const [result3, fields3] = await connection.promise().query(query3);
          const [result4, fields4] = await connection.promise().query(query4);
          await connection.promise().commit();
        }
        catch (error) {
          await connection.promise().rollback();
          connection.release();
          res.status(400).send(error);
          return;
        }
      }
      res.status(200).send('Transaction Complete');
    }
  });
});

Nodejs async/await for MySQL queries

I'm trying to execute 2 MySQL queries sequentially in Node.js. The MySQL queries work properly by themselves.
I would like to do it with an async/await function to be sure the record is inserted before it's updated.
Here is the code:
router.post('/assign_new_item_id', async (req, res) => {
  .....
  try {
    let qr1 = "INSERT INTO foo1 ........;"
    await pool.query(qr1, (err) => {
      if (err) throw err;
    });
    let qr2 = "UPDATE foo1 .....;"
    await pool.query(qr2, (err) => {
      if (err) throw err;
    });
  } catch (err) {
    console.log(err)
  }
It seems that execution "hangs" inside the first await block. What is the best way to ensure that both queries are executed sequentially?
Thanks in advance for any help.
To await something you need a Promise, not a callback. In your case you are not awaiting a promise, because pool.query with a callback doesn't return one.
router.post('/assign_new_item_id', async (req, res) => {
  // .....
  try {
    let qr1 = "INSERT INTO foo1 ........;"
    await new Promise((res, rej) => {
      pool.query(qr1, (err, row) => {
        if (err) return rej(err);
        res(row);
      });
    });
    let qr2 = "UPDATE foo1 .....;"
    await new Promise((res, rej) => {
      pool.query(qr2, (err, row) => {
        if (err) return rej(err);
        res(row);
      });
    });
  } catch (err) {
    console.log(err)
  }
});
Here I am promisifying the pool.query call by wrapping it in a Promise.
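An alternative, assuming the same callback-style pool, is to promisify pool.query once with Node's built-in util.promisify and reuse it. This is a sketch, not part of the original answer:
const util = require('util');

// Bind so the promisified function still calls query with `this` set to the pool.
const query = util.promisify(pool.query).bind(pool);

router.post('/assign_new_item_id', async (req, res) => {
  try {
    await query("INSERT INTO foo1 ........;");
    await query("UPDATE foo1 .....;");
    res.sendStatus(200);
  } catch (err) {
    console.log(err);
    res.sendStatus(500);
  }
});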

nodejs- unable to return result to controller function

From my Model, I fetch some articles from a MySQL database for a user.
Model
var mysql = require('mysql');
var db = mysql.createPool({
  host: 'localhost',
  user: 'sampleUser',
  password: '',
  database: 'sampleDB'
});

fetchArticles: function (user, callback) {
  var params = [user.userId];
  var query = `SELECT * FROM articles WHERE userId = ? LOCK IN SHARE MODE`;
  db.getConnection(function (err, connection) {
    if (err) {
      throw err;
    }
    connection.beginTransaction(function (err) {
      if (err) {
        throw err;
      }
      return connection.query(query, params, function (err, result) {
        if (err) {
          connection.rollback(function () {
            throw err;
          });
        }
        //console.log(result);
      });
    });
  });
}
This is working and the function fetches the result needed, but it's not returning the result to the controller function (I am returning it, but I'm not able to get it in the controller; I guess I did something wrong there).
When I did console.log(result), this is what I got:
[ RowDataPacket {
status: 'New',
article_code: 13362,
created_date: 2017-10-22T00:30:00.000Z,
type: 'ebook'} ]
My controller function looks like this:
var Articles = require('../models/Articles');

exports.getArticle = function (req, res) {
  var articleId = req.body.articleId;
  var article = {
    userId: userId
  };
  Articles.fetchArticles(article, function (err, rows) {
    if (err) {
      res.json({ success: false, message: 'no data found' });
    }
    else {
      res.json({ success: true, articles: rows });
    }
  });
};
Can anyone help me figure out what mistakes I made here?
I'm pretty new to nodejs. Thanks!
The simple answer is that you're not calling the callback function, anywhere.
Here's the adjusted code:
fetchArticles: function (user, callback) {
  var params = [user.userId];
  var query = `SELECT * FROM articles WHERE userId = ? LOCK IN SHARE MODE`;
  db.getConnection(function (err, connection) {
    if (err) {
      // An error. Ensure `callback` gets called with the error argument.
      return callback(err);
    }
    connection.beginTransaction(function (err) {
      if (err) {
        // An error. Ensure `callback` gets called with the error argument.
        return callback(err);
      }
      return connection.query(query, params, function (err, result) {
        if (err) {
          // An error. Rollback.
          connection.rollback(function () {
            // Once the rollback finished, ensure `callback` gets called
            // with the error argument.
            return callback(err);
          });
        } else {
          // Query success. Call `callback` with results and `null` for error.
          //console.log(result);
          return callback(null, result);
        }
      });
    });
  });
}
There's no point in throwing errors inside the callbacks of the connection methods, since these functions are asynchronous.
Ensure you pass the error to the callback instead, and stop execution (using the return statement).
One more thing, without knowing the full requirements here:
I'm not sure you need a transaction for just fetching data from the database without modifying it, so you can simply do the query() and skip the beginTransaction(), rollback() and commit() calls.
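For example, the read-only version could shrink to something like the sketch below. It just illustrates that suggestion; db is the pool from the question, and LOCK IN SHARE MODE is dropped because it only has an effect inside a transaction.
fetchArticles: function (user, callback) {
  var params = [user.userId];
  var query = `SELECT * FROM articles WHERE userId = ?`;
  // Plain read: query straight through the pool, no transaction needed.
  db.query(query, params, function (err, result) {
    if (err) {
      return callback(err);
    }
    return callback(null, result);
  });
}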

Node JS Inserting array of objects to mysql database when using transactions

I'm using node-mysql to add records to a database, but I'm facing a challenge when the records to be inserted are an array of objects and I need the operation to be a transaction. I have simplified my problem by creating a test project to explain it better.
Let's say I have two tables, users and orders, and the data to be inserted looks like this:
var user = {
  name: "Dennis Wanyonyi",
  email: "example@email.com"
};
var orders = [{
  order_date: new Date(),
  price: 14.99
}, {
  order_date: new Date(),
  price: 39.99
}];
I want to first insert a user into the database and use the insertId to add each of the orders for that user. I'm using a transaction because, in case of an error, I want to roll back the whole process. Here is how I try to insert all the records using node-mysql transactions:
connection.beginTransaction(function(err) {
  if (err) { throw err; }
  connection.query('INSERT INTO users SET ?', user, function(err, result) {
    if (err) {
      return connection.rollback(function() {
        throw err;
      });
    }
    for (var i = 0; i < orders.length; i++) {
      orders[i].user_id = result.insertId;
      connection.query('INSERT INTO orders SET ?', orders[i], function(err, result2) {
        if (err) {
          return connection.rollback(function() {
            throw err;
          });
        }
        connection.commit(function(err) {
          if (err) {
            return connection.rollback(function() {
              throw err;
            });
          }
          console.log('success!');
        });
      });
    }
  });
});
However, I have a problem iterating over the array of orders without having to call connection.commit multiple times inside the for loop.
I would suggest constructing a single multi-row INSERT query string for the orders table inside the for loop, and then executing it outside the loop. Use the for loop only to construct the string; that way you can roll back whenever you want, or on error. By a multi-row insert query string I mean the following:
INSERT INTO your_table_name
(column1,column2,column3)
VALUES
(1,2,3),
(4,5,6),
(7,8,9);
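With node-mysql you don't even have to build that string by hand: when the statement uses VALUES ?, the driver expands a nested array into grouped value lists. Below is a sketch using the question's user and orders objects inside the same transaction; the orders column names are assumed to match the order objects.
connection.query('INSERT INTO users SET ?', user, function (err, result) {
  if (err) {
    return connection.rollback(function () { throw err; });
  }
  // Map each order to an array of values; the nested array expands to
  // (date, price, user_id), (date, price, user_id), ...
  var values = orders.map(function (o) {
    return [o.order_date, o.price, result.insertId];
  });
  connection.query(
    'INSERT INTO orders (order_date, price, user_id) VALUES ?',
    [values],
    function (err) {
      if (err) {
        return connection.rollback(function () { throw err; });
      }
      connection.commit(function (err) {
        if (err) {
          return connection.rollback(function () { throw err; });
        }
        console.log('success!');
      });
    }
  );
});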
You can use Promise.all functionality of Bluebird for this.
var promiseArray = dataArray.map(function(data){
  return new BluebirdPromise(function(resolve, reject){
    connection.insertData(function(error, response){
      if(error) reject(error);
      else resolve(response);
    }); //This is obviously a mock
  });
});
And after this:
BluebirdPromise.all(promiseArray).then(function(result){
  //result will be the array of "response"s from resolve(response);
  database.commit();
});
This way, you can run all the inserts asynchronously and then call database.commit() only once.
Some kinds of tasks in Node.js are asynchronous (like I/O and DB calls), and there are lots of libraries that help handle them.
But if you don't want to use any library, then for iterating over an array and using it with asynchronous code it's better to implement the loop as a recursive function.
connection.beginTransaction(function(err) {
  if (err) {
    throw err;
  }
  connection.query('INSERT INTO users SET ?', user, function(err, result) {
    if (err) {
      return connection.rollback(function() {
        throw err;
      });
    }
    // console.log(result.insertId) --> do any thing if need with inserted ID
    var insertOrder = function(nextId) {
      console.log(nextId);
      if ((orders.length - 1) < nextId) {
        connection.commit(function(err) {
          if (err) {
            return connection.rollback(function() {
              throw err;
            })
          }
          console.log(" ok");
        });
      } else {
        console.log(orders[nextId]);
        connection.query('INSERT INTO orders SET ?', orders[nextId], function(err, result2) {
          if (err) {
            return connection.rollback(function() {
              throw err;
            });
          }
          insertOrder(nextId + 1);
        });
      }
    }
    insertOrder(0);
  });
});
As you can see, I rewrote your for loop as a recursive function inside the user-insert callback.
I would use async.each to do the iteration and fire all the queries in parallel. If one of the queries fails, the asyncCallback is called with an error and async stops processing the remaining items; that is our signal to stop executing queries and roll back. If there is no error we can commit.
I've decoupled the code a bit more and split it into functions:
function rollback(connection, err) {
  connection.rollback(function () {
    throw err;
  });
}

function commit(connection) {
  connection.commit(function (err) {
    if (err) {
      rollback(connection, err);
    }
    console.log('success!');
  });
}

function insertUser(user, callback) {
  connection.query('INSERT INTO users SET ?', user, function (err, result) {
    return callback(err, result);
  });
}

function insertOrders(orders, userId, callback) {
  async.each(orders, function (order, asyncCallback) {
    order.user_id = userId;
    connection.query('INSERT INTO orders SET ?', order, function (err, data) {
      return asyncCallback(err, data);
    });
  }, function (err) {
    if (err) {
      // One of the iterations above produced an error.
      // All processing will stop and we have to rollback.
      return callback(err);
    }
    // Return without errors
    return callback();
  });
}

connection.beginTransaction(function (err) {
  if (err) {
    throw err;
  }
  insertUser(user, function (err, result) {
    if (err) {
      rollback(connection, err);
    }
    insertOrders(orders, result.insertId, function (err, data) {
      if (err) {
        rollback(connection, err);
      } else {
        commit(connection);
      }
    });
  });
});
You need to use the async library for this kind of operation.
connection.beginTransaction(function(err) {
  if (err) { throw err; }
  async.waterfall([
    function(cb){
      createUser(userDetail, function(err, data){
        if (err) return cb(err);
        cb(null, data.userId);
      });
    },
    function(userid, cb){
      createOrderForUser(userid, orders, function(err) {
        if (err) return cb(err);
        cb(null);
      });
    }
  ], function(err){
    if (err)
      return connection.rollback(function() {
        throw err;
      });
    connection.commit(function(err) {
      if (err) {
        return connection.rollback(function() {
          throw err;
        });
      }
      console.log('success!');
    });
  });
});

var createUser = function(userdetail, cb){
  //-- Creation of Users
};

var createOrderForUser = function (userId, orders, cb) {
  async.each(orders, function(order, callback){
    //-- create orders for users
  }, function(err){
    // doing err checking.
    cb();
  });
};
See if you can write a stored procedure to encapsulate the queries, and have START TRANSACTION ... COMMIT inside the stored procedure.
The tricky part is passing a list of things into the stored procedure, since there is no "array" mechanism. One way to achieve this is to pass a comma-separated list (or use some other delimiter), then use a loop inside the procedure to pick the list apart.
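From the Node side that could look something like the following. The procedure name add_user_with_orders and its parameter shape are hypothetical; only the delimited-list idea comes from this answer.
// Pack each order into "date|price" and join with commas; the stored
// procedure would loop over the list, splitting on ',' and then on '|'.
var orderList = orders.map(function (o) {
  return o.order_date.toISOString() + '|' + o.price;
}).join(',');

connection.query(
  'CALL add_user_with_orders(?, ?, ?)',   // hypothetical procedure
  [user.name, user.email, orderList],
  function (err, results) {
    if (err) throw err;
    console.log('inserted via stored procedure');
  }
);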
const currentLogs = [
  { socket_id: 'Server', message: 'Socketio online', data: 'Port 3333', logged: '2014-05-14 14:41:11' },
  { socket_id: 'Server', message: 'Waiting for Pi to connect...', data: 'Port: 8082', logged: '2014-05-14 14:41:11' }
];
// Maps each object to an array of its values, i.e. the nested-array shape
// that a multi-row INSERT ... VALUES ? expects.
console.warn(currentLogs.map(logs => [logs.socket_id, logs.message, logs.data, logs.logged]));