Kill running query with Sequelize - mysql

I am working with Sequelize and a MySQL DB. I have some heavy queries that users can cancel by clicking a 'cancel' button in the GUI.
I tried to do it with a transaction, but when I call t.rollback() the query doesn't get killed in the DB. Is there any way to kill a query using Sequelize?
I would prefer to do it with Sequelize, but even getting the query ID and killing it manually would be fine.
.transaction(async (t) => {
  if (transaction) {
    transaction.rollback();
  }
  transaction = t;
  return db.myTable.findAll(data);
})
.then((data) => {
  transaction = {};
  return data;
})
.catch((error) => {
  transaction = {};
  throw error;
});

This is not natively supported by Sequelize. There is an open but inactive issue to add this ability, which includes a workaround for PostgreSQL:
const killableQuery = async (query, req, options = {}) => {
  const connection = await sequelize.connectionManager.getConnection();
  req.on('close', () => {
    sequelize.query(`SELECT pg_terminate_backend(${connection.processID});`)
      .catch((error) => {
        console.error('Unable to terminate query!', error);
      });
  });
  const result = await sequelize.query(query, {
    ...options,
    transaction: { connection },
  });
  await sequelize.connectionManager.releaseConnection(connection);
  return result;
};
Unfortunately, I could not find details about the processID property or whether the MySQL driver tracks an equivalent. If it does, you could replace the PostgreSQL statement with a MySQL counterpart such as:
SELECT GROUP_CONCAT(CONCAT('KILL ',id,';') SEPARATOR ' ')
FROM information_schema.processlist
WHERE user <> 'system user'
AND -- <condition based on processID>;
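If the connection Sequelize hands back is a plain mysql2 connection, it should expose a threadId property (an assumption I have not verified against Sequelize's internals); in that case the pg_terminate_backend call in the handler above could be swapped for MySQL's KILL QUERY:
req.on('close', () => {
  // threadId is assumed to be the MySQL connection id of the busy connection;
  // KILL QUERY stops the running statement but leaves the connection open.
  sequelize.query(`KILL QUERY ${connection.threadId};`)
    .catch((error) => {
      console.error('Unable to terminate query!', error);
    });
});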

Related

Using MySQL db functions (?) with SQLite (Node.js)

I'm following a tutorial to do JWT/bcryptjs auth and then INSERT into a SQLite table.
The thing is, the tutorial is for MySQL, and I get errors like db.query is not a function and db.escape is not a function.
The db:
const sqlite3 = require('sqlite3').verbose()
const DBSOURCE = "./src/db/db.sqlite"

let db = new sqlite3.Database(DBSOURCE, (err) => {
  if (err) {
    // Cannot open database
    console.error(err.message)
    throw err
  } else {
    console.log('Connected to the SQLite database.')
  }
});

module.exports = db
Example query:
db.query(
  `SELECT * FROM users WHERE LOWER(username) = LOWER(${db.escape(
    req.body.username
  )});`,
  (err, result) => {
    if (result.length) {
      return res.status(409).send({
        msg: 'This username is already in use!'
      });
    } else { .........
My best guess is that the functions are different?
How do I get this right?
There are a lot of proprietary functions in MySQL that will not work with standard SQL in other database systems.
That is just the beginning of the differences between MySQL and SQLite.
Provide some query examples and we may be able to assist you with each one.
-- update after your addition of query code...
Here is an example of sqlite3 with Node.js:
const sqlite3 = require('sqlite3').verbose();

// open the database
let db = new sqlite3.Database('./db/chinook.db');

let sql = `SELECT * FROM users WHERE LOWER(username) = LOWER(?)`;
db.all(sql, [req.body.username], (err, rows) => {
  if (err) {
    throw err;
  }
  rows.forEach((row) => {
    console.log(row.name);
  });
});

// close the database connection
db.close();
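For the original existence check, db.get (which returns at most one row, or undefined) maps more directly; a rough sketch assuming the same users table and request/response objects:
db.get(
  `SELECT * FROM users WHERE LOWER(username) = LOWER(?)`,
  [req.body.username],
  (err, row) => {
    if (err) throw err;
    if (row) {
      // a row came back, so the username is taken
      return res.status(409).send({ msg: 'This username is already in use!' });
    }
    // ...otherwise continue with the INSERT...
  }
);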

How do I disconnect from my database gracefully in a callback/promise-based environment?

I need to know how to disconnect from my MySQL database after lots of individual callbacks have finished. I have a node.js cron script running on AWS EC2 which accesses s3 buckets and MySQL databases on AWS RDS. The cron script looks something like this:
const mysql = require("mysql2"),
AWS = require("aws-sdk"),
s3 = new AWS.S3(),
connection = mysql.connect({...});
connection.connect();
connection.query(`SELECT ... LIMIT 100`, (error, results) => {
if (error) throw new Error(error);
for (let idx in results) {
const row = results[idx],
Key = `my/key/${row.id}`;
s3.getObject({Bucket, Key}, (error, object) => {
// do more things, with more callbacks
});
}
});
setTimeout(() => connection.end(), 10000); // disconnect database in 10 seconds
The script doesn't exit until I disconnect from the database using connection.end(). I can't disconnect as normal e.g. after the for loop, because the various callbacks are still running. I need to know when they're all finished. Currently I just disconnect after 10 seconds because everything should have completed by then. If I don't do that then I end up with lots of never-ending processes running.
Do I need to set flags and counts for each task, and then use setInterval or something until they're all finished and it's safe to disconnect? I could do that, but is it the right approach when using callbacks, promises and thens?
You can do it with counters or flags as you said, or with Promise.all:
const mysql = require("mysql2"),
AWS = require("aws-sdk"),
s3 = new AWS.S3(),
connection = mysql.connect({...});
function doQuery(){
connection.connect();
return new Promise((resolve, reject)=>{
connection.query(`SELECT ... LIMIT 100`, (error, results) => {
if (error) { return reject(new Error(error)); }
resolve(results)
});
})
}
doQuery()
.then(results => {
const jobs = results.map(row => {
const Key = `my/key/${row.id}`;
return new Promise((resolve, reject) => {
s3.getObject({Bucket, Key}, (error, object) => {
// do more things, with more callbacks
resolve('ok')
});
})
})
return Promise.all(jobs)
})
.finally(()=>{
connection.end()
})
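As a side note, with aws-sdk v2 the manual Promise wrapper around the S3 call can usually be dropped, because the SDK's request objects expose a .promise() method; a sketch of the mapping step under that assumption:
const jobs = results.map(row => {
  const Key = `my/key/${row.id}`;
  // .promise() turns the SDK request into a native Promise
  return s3.getObject({ Bucket, Key }).promise().then(object => {
    // do more things with the object here
  });
});
return Promise.all(jobs);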
I just wanted to add that while Promise.all() is definitely a great way to go, it's not the only approach.
In this day & age, where the cost of connecting to & disconnecting from your database can be very cheap, I find it simpler to just connect on every query and disconnect after:
const dbOneQuery = (sql, bindVars, callback) => {
  const dbConnection = getConnection(); // mysql2.createConnection etc
  dbConnection.query(sql, bindVars, (error, result) => {
    dbConnection.end();
    if (callback) callback(error, result);
  });
};
and that way there aren't any connections held open to be closed.
If in the future I move to persistent connections again, I can just change what getConnection() does and use something that overrides .end() etc.
For me this approach has been simpler overall compared to managing a single shared connection to the database, with no real downsides.
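A minimal promise-based sketch of the same connect-per-query helper, assuming mysql2's promise API (the connection config values here are placeholders):
const mysql = require("mysql2/promise");

const dbOneQuery = async (sql, bindVars) => {
  // open a fresh connection just for this one query
  const connection = await mysql.createConnection({ host: "localhost", user: "me", database: "mydb" });
  try {
    const [rows] = await connection.query(sql, bindVars);
    return rows;
  } finally {
    await connection.end(); // always close, even if the query throws
  }
};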

node js script exits prematurely

Promise newbie here.
I'm trying to retrieve the icon_name field from the Equipment table in the asset database (MongoDB)
and update the icon_id field in the equipments table in the equipments database (MySQL).
I have about 12,000 records with an icon_name field in Equipment.
The script runs successfully; however, it doesn't seem to go through all the records.
When I check the equipments table there are only about 3,000 records updated.
I tried running the script several times and it appears to update a few more records each time.
My suspicion is that the database connection is closed before all the queries are finished, but since I use Promise.all I don't know why that would happen.
Here is the script:
const _ = require('lodash'),
  debug = require('debug')('update'),
  Promise = require('bluebird')

const asset = require('../models/asset'),
  equipments = require('../models/equipments')

const Equipment = asset.getEquipment(),
  my_equipments = equipments.get_equipments(),
  icons = equipments.get_icons()

Promise.resolve()
  .then(() => {
    debug('Retrieve asset equipments, icons')
    return Promise.all([
      icons.findAll(),
      Equipment.find({ icon_name: { $ne: null } })
    ])
  })
  .then(([my_icons, asset_equipments]) => {
    debug('Update equipments')
    const updates = []
    console.log(asset_equipments.length)
    asset_equipments.forEach((aeq, i) => {
      const icon_id = my_icons.find(icon => icon.name === aeq.icon_name).id
      const up = my_equipments.update(
        { icon_id },
        { where: { code: aeq.eq_id } }
      )
      updates.push(up)
    })
    return Promise.all(updates)
  })
  .then(() => {
    debug('Success: all done')
    asset.close()
    equipments.close()
  })
  .catch(err => {
    debug('Error:', err)
    asset.close()
    equipments.close()
  })
Thanks in advance.
The code looks fine, but spawning 12,000 promises in parallel might cause trouble at the database connection level. I would suggest batching the concurrent requests and limiting them to, say, 100. You could use batch-promises (https://www.npmjs.com/package/batch-promises).
Basically something like:
return batchPromises(100, asset_equipments, aeq => {
  const icon_id = my_icons.find(icon => icon.name === aeq.icon_name).id;
  return my_equipments.update({ icon_id }, { where: { code: aeq.eq_id } });
});
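Alternatively, since the script already loads bluebird, its Promise.map concurrency option achieves the same batching without an extra dependency (a sketch; the limit of 100 is an arbitrary choice):
return Promise.map(asset_equipments, aeq => {
  const icon_id = my_icons.find(icon => icon.name === aeq.icon_name).id;
  return my_equipments.update({ icon_id }, { where: { code: aeq.eq_id } });
}, { concurrency: 100 }); // at most 100 updates in flight at once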

discord.js/node.js make code wait until sql query returns result

I am working on a discord.js bot, and I'm storing a bunch of information about various servers in a database. The problem is that the code doesn't wait for the database to return the results. In the current situation, I'm trying to check whether the server-specific prefix checks out.
I tried using async and await at various places, but those didn't work. If I could, I'd rather not use .then(), because I don't really want to put all the commands inside a .then().
const { Client, Attachment, RichEmbed } = require('discord.js');
const client = new Client();
const mysql = require("mysql");
const config = require("./config.json")

var con = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'botdb'
})

client.on("ready", () => {
  console.log("I'm ready")
})

client.on("message", message => {
  if (message.author.bot) return;
  if (message.channel.type === 'dm') return;

  let msg = message.content.split(" ");
  let command = msg[0];
  let prefix;

  con.query(`SELECT * FROM serversettings WHERE ServerID = ${message.guild.id}`, (err, rows) => {
    if (err) throw err;
    prefix = rows[0].Prefix;
    console.log(prefix)
  })

  console.log(`Prefix: ${prefix}, Command: ${command}`)
  if (command === `${prefix}examplecommand`) {
    //Do something
  }
  //Other code that uses prefix and command
})
It should log the prefix first, and then the Prefix: ${prefix}, Command: ${command} part, but it does it the other way around, so the examplecommand doesn't work.
Your result is caused by the fact that what's outside your query callback is executed immediately after the call. Keep in mind the mysql module is callback-based.
Possible Solutions
Place the code inside the callback so it's executed when the query is completed.
Wrap the query in a promise and await it.
function getGuild(guildID) {
  return new Promise((resolve, reject) => {
    con.query(`SELECT * FROM serversettings WHERE ServerID = '${guildID}'`, (err, rows) => {
      if (err) return reject(err);
      resolve(rows);
    });
  });
}

const [guild] = await getGuild(message.guild.id) // destructuring 'rows' array
  .catch(console.error);
console.log(guild.Prefix);
Use a Promise-based MySQL wrapper, like promise-mysql. You can use it the same way as the code above, without worrying about coding your own Promises.
const [guild] = await con.query(`SELECT * FROM serversettings WHERE ServerID = '${message.guild.id}'`)
  .catch(console.error);
console.log(guild.Prefix);
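Note that await only works inside an async function, so the message handler itself has to be marked async; a rough sketch based on the question's code and the getGuild helper above:
client.on("message", async message => {
  if (message.author.bot || message.channel.type === 'dm') return;
  const [command] = message.content.split(" ");
  const [guild] = await getGuild(message.guild.id); // rows from serversettings
  const prefix = guild.Prefix;
  if (command === `${prefix}examplecommand`) {
    // Do something
  }
});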

How do you open/close mysql connection multiple times?

I'm using node with mysql and I have a route that does:
const mysql = require("./mysql");
router.post("/register_user", (req, res) => {
mysql.register(req.body).then((result) => {
// stuff
});
});
mysql.js:
const mysql = require("mysql");
const connection = mysql.createConnection("mysql://...");
exports.register = (req) => {
const user = { name: req.name };
return new Promise((resolve, reject) => {
// make sure user doesn't exist already
connection.query('...', [user], (err, data) => {
...
if (isNewUser) {
connection.query('INSERT INTO USER...', user, (insertErr, rows) => {
...
resolve(rows);
connection.end();
}
}
});
});
}
This works perfectly when I register the first user in my app. But immediately after, if I log out (on the web app), then register a new user, I get an error saying:
Error: Cannot enqueue Query after invoking quit.
Why doesn't this create a new connection?
I assume you are using the mysql NPM module.
If that is the case, could you simply use MySQL connection pooling?
Rather than creating and managing connections one by one, this module also provides built-in connection pooling using mysql.createPool(config).
So instead of calling connection.end() you would call connection.release() to return the connection to the pool of open connections.
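A minimal sketch of the register function rewritten against a pool, assuming the mysql module's built-in pooling (the pool config values and the INSERT statement are placeholders):
const mysql = require("mysql");
const pool = mysql.createPool({ connectionLimit: 10, host: "localhost", user: "me", database: "mydb" });

exports.register = (req) => {
  const user = { name: req.name };
  return new Promise((resolve, reject) => {
    pool.getConnection((err, connection) => {
      if (err) return reject(err);
      connection.query('INSERT INTO USER SET ?', user, (insertErr, rows) => {
        connection.release(); // return the connection to the pool instead of ending it
        if (insertErr) return reject(insertErr);
        resolve(rows);
      });
    });
  });
};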