Too many Connections on Node-MySQL - mysql

I get the error Too many connections when there are too many items in a loop doing INSERT IGNORE INTO.
function insertCases(cases) {
    for (var i in cases) {
        var thequery = 'INSERT IGNORE INTO `cases` SET keysused = ' + cases[i].keysused;
        pool.query(thequery, function(ee, rr) {
            if (ee) {
                logger.info(ee);
                throw ee;
            }
        });
    }
}
When there are 100+ cases, i.e. 100+ INSERT IGNORE INTO queries at once, I get Too many connections. I don't know exactly how many it takes to crash it, but with 100 it still works.
What I read is that pool.query runs the query and also releases the connection when done, so I shouldn't need to close it myself afterwards.
If I run 100, wait a short time, run 100 again and so on, I don't get the Too many connections error.
It only happens when that many run at once.
These are my DB settings:
var db_config = {
    connectionLimit: 5000,
    host: 'localhost',
    user: 'userexample',
    password: '*******',
    database: 'example.com'
};
and this is the function that creates the pool:
function database_connection() {
    pool = mysql.createPool(db_config);
    pool.getConnection(function(err, connection) {
        if (err) {
            logger.error('[ERROR] Connecting to database "' + err.toString() + '"');
            setTimeout(function() { database_connection(); }, 2500);
        } else {
            pool.query('SET NAMES utf8');
            pool.query('SET CHARACTER SET utf8');
            logger.trace('[INFO] Connected to database and set utf8!');
            connection.release(); // return the test connection to the pool
        }
    });
}
All queries in the Node app are simply run with pool.query.
database_connection() is called when the Node app starts.

You've set connectionLimit to five thousand! That means the node-mysql pool will keep handling concurrent query requests by opening new connections until it has five thousand. Your MySQL server, it seems, has a connection limit of 100, so your Node.js app blows up when that limit is reached.
Set connectionLimit: 10 instead.
Then you'll use ten pool connections, and when they're all in use your pool.query invocations will wait until one becomes available.
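For illustration, here are the same settings from the question with a saner pool size (a sketch; nothing else needs to change):
var db_config = {
    connectionLimit: 10, // excess pool.query calls queue here instead of opening new connections
    host: 'localhost',
    user: 'userexample',
    password: '*******',
    database: 'example.com'
};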
Why not set the limit to 100? Two reasons:
You want to leave some MySQL connections for other client software.
If you hammer the database with many similar queries (in your case INSERT queries) in parallel, it will spend time avoiding contention. With fewer connections in parallel, each query will finish much faster.

Related

Use 1 single MySQL connection to provide result to multiple Users

I have used one single MySQL connection object in Node.js to serve multiple users.
I mean to say that the MySQL connection is created when the script starts and remains the same for the life of the Node script/server.
Practically, this is possible and I have done it. Please take a look at the NodeJS/MySQL script below.
#################################
var http = require('http');
var mysql = require('mysql');
var con = mysql.createConnection({
    host: "192.168.1.105",
    user: "root",
    password: "XXXXXX",
    database: "mydb"
});
con.connect(function(err) {
    if (err) {
        console.error('error: ' + err.message);
        process.exit(1);
    }
    http.createServer(function (req, res) {
        continueExecution(req, res);
    }).listen(8082);
});
async function continueExecution(req, res) {
    res.write('calledddd\n');
    for (let step = 0; step < 50; step++) {
        // Runs 50 times, with values of step 0 through 49.
        var bar = `Company Inc ${step}`;
        var sql = `INSERT INTO customers (name, address) VALUES ('${bar}', 'Highway 37')`;
        res.write(sql + "\n");
        con.query(sql, function (err, result) {
            if (err) throw err;
            res.write("1 record inserted\n");
        });
    }
    res.write('reached\n');
    for (let ste = 0; ste < 50; ste++) {
        res.write('started Update\n');
        var bar = `Company Inc ${ste}`;
        var sql = `UPDATE customers SET name = 'UPDATE RECORD' WHERE name = '${bar}'`;
        con.query(sql, function (err, result) {
            if (err) throw err;
            res.write(result.affectedRows + " record(s) updated\n");
            if (ste == 49) { // last iteration: finish the response
                // headers were already sent by the first res.write, so no writeHead here
                res.write('Database connected\n');
                res.end();
            }
        });
    }
}
#################################
I have several questions in mind, but I didn't find any resources that answer them. Please help me with this.
Q1. Are there any consequences of using one single MySQL connection to serve responses to multiple users?
Q2. Let's take an example.
100 users want to access the table "users_data" at the same time. 25 users are updating their records in the same table with a unique primary key, 50 users are selecting their records, and another 25 users are deleting their records.
All these operations are done at the same time via parallel Node script calls from remote devices.
To complete all these MySQL transactions, the system is using only 1 database connection.
What will happen in this case?
To answer your questions: one of the consequences of using a single connection is that it can lead to slower request execution.
In fact, even though Node makes the requests asynchronously, your database will execute all those requests sequentially, one after the other in the order they arrived. Because Node issues the requests asynchronously, the order in which they reach your database is not guaranteed, and the issue you are referring to might happen.
One easy way to avoid this is to use a connection pool, which will create a given number of connections using the same DB user. Here are some links that might help you with this:
using a connection pool with node.js
connect a mysql database with node.js
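For reference, a minimal sketch of the pooled version of the script above (same credentials as the question; the customers query is only illustrative):
var mysql = require('mysql');
// A pool replaces the single shared connection; queries from concurrent
// requests are spread across up to 10 connections instead of queueing
// behind one.
var pool = mysql.createPool({
    connectionLimit: 10,
    host: "192.168.1.105",
    user: "root",
    password: "XXXXXX",
    database: "mydb"
});
// pool.query leases a connection, runs the statement with a placeholder,
// and releases the connection back to the pool automatically.
pool.query('SELECT * FROM customers WHERE name = ?', ['Company Inc 1'],
    function (err, rows) {
        if (err) throw err;
        console.log(rows);
    });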

How do we close the database `connection` in sequelize

How do we close the database connection in the app.post below? Will Sequelize automatically take care of it?
server.js
const sequelize = new Sequelize(DB_NAME, DB_USERNAME, DB_PASSWORD, {
    host: DB_HOST,
    dialect: DB_DIALECT,
    pool: DB_POOL,
    port: DB_PORT
});
const Availability = availabilitySchema(sequelize, DataTypes);
app.post('/service/availability', async (req, res) => {
    try {
        const userEmail = req.query.email;
        const dailyStatus = req.body.dailystatus;
        const playerData = { email: userEmail, dailystatus: dailyStatus };
        const playerDailyStatus = await Availability.create(playerData);
        res.status(200).json({ success: true });
    } catch (e) {
        res.status(500).json({ message: e.message });
    }
});
As I understand it (and I only started looking at Sequelize yesterday; comments if I'm wrong, please), Sequelize pools its connections, so there isn't really anything for you to close. It opens and closes connections as necessary, much like any other ORM; mostly those connections live in a pool, are leased from the pool to do some work, then are returned to the pool. You can configure the pool options (it looks like you already have) if you want to limit the number of concurrently open connections to your DB. But if you're looking in your DB manager and seeing "omg, my Sequelize app has 5 open connections.. now it has 10.. now 15!", that's just how it is; it opens as many connections as necessary (up to the max) to service the workload and leaves them open, because it's a huge waste of time to actually open (TCP connect) and close (TCP disconnect) them constantly.
When using an ORM you don't micromanage the connections; you just carry out queries using the modeled objects and let the ORM deal with the low-level stuff (opening and closing a connection is a level below crafting the SQL to run, and you hand that off to the ORM too). Even something you've been used to elsewhere, like C#'s new SqlConnection("connstr").Open(), might not actually be opening a TCP connection to the DB and closing it; it's probably just leasing from and returning to a pool, and the underlying framework manages the actual TCP connections and their state.
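For reference, a minimal sketch of explicit pool options in Sequelize (the values are illustrative, not prescriptive):
const sequelize = new Sequelize(DB_NAME, DB_USERNAME, DB_PASSWORD, {
    host: DB_HOST,
    dialect: DB_DIALECT,
    pool: {
        max: 10,        // most connections the pool will open
        min: 0,         // connections kept open while idle
        acquire: 30000, // ms to wait for a free connection before erroring
        idle: 10000     // ms a connection may sit idle before being closed
    }
});
If you genuinely need to tear everything down (say, at process shutdown), sequelize.close() drains the pool.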

How does mysql connection pooling works with Node microservices?

I have two Node microservices talking to one common MySQL database. Both microservices use the code below to create a connection pool with a connectionLimit of 10:
// Initializing pool
var pool = mysql.createPool({
    connectionLimit: 10,
    host: 'localhost',
    port: '3306',
    user: 'root',
    password: 'root'
});
function addConnection(req, res) {
    pool.getConnection(function (err, connection) {
        if (err) {
            // no connection was handed out, so there is nothing to release
            res.json({ "code": 500, "status": "Error" });
            return;
        }
        connection.query("select * from user", function (err, rows) {
            connection.release();
            if (!err) {
                res.json(rows);
            }
        });
        connection.on('error', function (err) {
            res.json({ "code": 500, "status": "Error" });
            return;
        });
    });
}
For the MySQL database I have max_connections set to 200 (SHOW VARIABLES LIKE 'max_connections'; returns 200).
With the pool connectionLimit set to 10 for each microservice, in which scenarios will the number of connections for either microservice go above 10?
i.e. when and how would the Node services be able to maintain more connections than expected?
If I have 3 instances of the same microservice running, how does the pool connectionLimit work? What would be the connection limit for each instance of the microservice?
In one of the microservices, say I have two APIs that do database transactions, each getting connections through a different function with the same mysql.createPool({}) implementation as above. What would happen if both APIs are called concurrently and each receives 100 or more requests per second?
Would the number of connections made available be 10 or 20 (since there are two mysql pools created with a connectionLimit of 10 each)?
Ideally it would not, but it can go above 10 if some connections become stale, i.e. they are closed on the client end but still open on the server end.
If you have multiple instances of the same microservice deployed in multiple VMs or Docker containers, they are independent services with no connection between them, so each of them will create its own 10 connections.
Firstly, setting the connection pool limit to 10 does NOT mean 10 connections are created the moment the service starts. Some pool implementations also let you specify an initial connection count, say 5, in which case only 5 connections are created at startup and more are created only as needed, with the upper limit defined by the configured maximum. Coming back to your question: the two pools know nothing about each other, so unless you coordinate them yourself, each will open its own connections, and together the two APIs can reach 20.
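A common way to avoid accidentally creating two pools inside one service is to create the pool once in its own module and require it everywhere; Node's module cache means every require returns the same pool. A sketch (the db.js file name is illustrative):
// db.js -- evaluated once, then cached by Node's module system
var mysql = require('mysql');
module.exports = mysql.createPool({
    connectionLimit: 10,
    host: 'localhost',
    port: '3306',
    user: 'root',
    password: 'root'
});
// elsewhere in the service:
// var pool = require('./db');
// pool.query('select * from user', function (err, rows) { /* ... */ });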

Two *identical* SQL statements behaving differently in PhpMyAdmin vs NodeJS

I'm trying to get the smallest ID (by world) that is not used via this SQL query:
"SELECT MAX(`objects`.`id`) as nextID FROM `objects` WHERE `objects`.`world`='1'"
NodeJS:
var mysql = require('mysql');
var pool = mysql.createPool({
    connectionLimit: 10,
    host: '********',
    user: '*********',
    password: '*******',
    database: databaseName
});
function getNextObjectID(worldID, cb) {
    var q = "SELECT MAX(`objects`.`id`) as nextID FROM `objects` WHERE `objects`.`world`='" + worldID + "'";
    console.log(q);
    pool.query(q, function(err, results, fields) {
        console.log(err);
        console.log(results);
        console.log(fields);
    });
}
Previously I had a more in-depth approach that tracked previously used ids, but it had the same issue, so I've reverted to this simpler method.
I run this through Node and phpMyAdmin. When Node runs it, it automatically inserts the world id (and yes, I print out the actual query and confirm it is identical on execution). When phpMyAdmin executes it, it returns 14. When Node executes it, it's rarely 14 and most of the time null. I have no idea why it would change. All other queries behave normally.
It turned out an asynchronous DELETE was being run by someone else; the query was reading the next id after the delete but before a large number of rows got inserted back.
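As an aside, node-mysql supports placeholders, so the query in getNextObjectID can be parameterized instead of concatenated (a sketch):
function getNextObjectID(worldID, cb) {
    // the driver escapes the value bound to '?', so worldID never touches the SQL string
    var q = "SELECT MAX(`objects`.`id`) AS nextID FROM `objects` WHERE `objects`.`world` = ?";
    pool.query(q, [worldID], function(err, results) {
        if (err) return cb(err);
        cb(null, results[0].nextID);
    });
}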

MySQL SELECT from one table and INSERT in another - Performance

The situation is: in one HTTP GET request I need to select the information I need from one table and send it to the client, and at the same time I need to retrieve the user's IP and insert it into a database. I'm using Node.js for this experiment.
The thing is: is there a way to perform the two actions together? Or do I have to connect and make two separate queries? Is there a way to render the page and do the INSERT in the background? What is the fastest option?
app.get('/', function (req, res) {
    connect.query("SELECT column1, column2 FROM table;", function (err, ...
        render("index", ...);
    });
    connect.query("INSERT INTO table2 SET ip=11111111;");
});
The stored-procedure approach suggested by @skv is nice, but you have to wait for the write before doing the read and eventually returning a result to the user.
I would argue for another approach.
Queue the IP address and a timestamp internally in something like an array or list.
Do the read from the database and return a result to the user.
Create a background job that will nibble away at the internal array and do the inserts.
This has several benefits:
The user gets a result faster.
The writes can be done later if the system is being called in bursts.
The writes can be done in batches of tens or hundreds of inserts, reducing the time it takes to write one row.
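A minimal sketch of that idea, assuming an Express app and a node-mysql pool like the one in the answer below (the ip_log table and its (ip, ts) columns are made up for the example):
var ipQueue = [];
app.get('/', function (req, res) {
    // queue the write, answer the read immediately
    ipQueue.push([req.ip, new Date()]);
    pool.query("SELECT column1, column2 FROM `table`", function (err, rows) {
        if (err) return res.sendStatus(500);
        res.render("index", { rows: rows });
    });
});
// background job: flush the queue as one bulk INSERT every few seconds
setInterval(function () {
    if (ipQueue.length === 0) return;
    var batch = ipQueue.splice(0, ipQueue.length);
    // node-mysql expands VALUES ? into a multi-row insert
    pool.query("INSERT INTO ip_log (ip, ts) VALUES ?", [batch], function (err) {
        if (err) console.error(err);
    });
}, 5000);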
You can make a stored procedure do this.
Basically these are two different operations, but doing it in a stored procedure might give you the assurance that both will surely happen. You can pass the IP address as a parameter into the stored procedure; this also saves you any performance worries in your code, as the DB takes care of the insert. Please remember that any SELECT that does not insert into a table or a variable will produce a result set for you to use. Hope this helps.
DELIMITER $
CREATE PROCEDURE AddIPandReturnInfo
(
    IN p_IPAddress VARCHAR(20)
)
BEGIN
    INSERT INTO Yourtable (IPAddress) VALUES (p_IPAddress);
    SELECT * FROM Tablename;
END $
DELIMITER ;
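From node-mysql the procedure could then be invoked with a placeholder (a sketch, using a pool like the one in the next answer; with a CALL, the driver returns an array of result sets):
pool.query('CALL AddIPandReturnInfo(?)', [userIp], function (err, results) {
    if (err) throw err;
    var rows = results[0]; // the rows from "SELECT * FROM Tablename"
});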
Well, I assume you're using this module: https://github.com/felixge/node-mysql
The MySQL protocol is sequential, so to execute parallel queries against MySQL you need multiple connections. You can use a Pool to manage the connections (built into the module).
Example:
var mysql = require('mysql');
var pool = mysql.createPool({
    host: 'example.org',
    user: 'bob',
    password: 'secret',
    connectionLimit: 5 // maximum number of connections to create at once **10 by default**
});
app.get('/', function (req, res) {
    // get a connection from the pool // async
    pool.getConnection(function (err, connection) {
        // Use the connection
        connection.query('SELECT something FROM table1', function (err, rows) {
            // Do something with the mysql response and end the client response
            res.render("index", {...
            });
            connection.release();
            // Don't use the connection here, it has been returned to the pool.
        });
    });
    // async
    var userIp = req.connection.remoteAddress || req.headers['x-forwarded-for'] || null;
    if (userIp) {
        // get a connection from the pool again
        pool.getConnection(function (err, connection) {
            // Use the connection
            connection.query('INSERT INTO table2 SET ip=?', [userIp], function (err, rows) {
                // And done with the insert.
                connection.release(); // connection returned to the pool
            });
        });
    }
});