Node.js, ORM2 and MySQL. Connections are not closing - mysql

I am using Node.js, Express, ORM2 and MySQL. Every time a page loads, a new MySQL connection is opened. The issue is that the connection doesn't close; it stays open. So each request results in a new "Sleep" status connection in MySQL's "show processlist" output.
Thanks,
Radu

Actually, because I am new to Node I did not realize that my application never ends execution, so I have to use a singleton pattern for the MySQL connection.
Using something like:
var orm = require('orm');
// settings.database and setup() are defined elsewhere in the app (connection string and model definitions)
var connection = null; // the single shared connection (singleton)

module.exports = function (cb) {
  if (connection) return cb(null, connection);
  orm.connect(settings.database, function (err, db) {
    if (err) return cb(err);
    connection = db;
    db.settings.set('instance.returnAllErrors', true);
    setup(db, cb);
  });
};
This keeps just one MySQL connection open.
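For completeness, here is a minimal sketch of how such a singleton getter could be used from an Express route; the db.js file name, the app object and the users model are assumptions, not part of the original code:
var getConnection = require('./db'); // the singleton module sketched above (assumed file name)

app.get('/users', function (req, res) {
  getConnection(function (err, db) {
    if (err) return res.status(500).send(err.message);
    // Every request reuses the same underlying MySQL connection.
    db.models.users.find({}, function (err, users) {
      if (err) return res.status(500).send(err.message);
      res.json(users);
    });
  });
});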

Related

Why do we need to release connection when using connection pool in mysql?

I am trying to implement a Node.js MySQL database layer following this tutorial. I know that
pool.query() is a shortcut for pool.getConnection() +
connection.query() + connection.release().
In the article the database is configured as:
var mysql = require('mysql')
var pool = mysql.createPool({
  connectionLimit: 10,
  host: 'localhost',
  user: 'matt',
  password: 'password',
  database: 'my_database'
})
pool.getConnection((err, connection) => {
  if (err) {
    if (err.code === 'PROTOCOL_CONNECTION_LOST') {
      console.error('Database connection was closed.')
    }
    if (err.code === 'ER_CON_COUNT_ERROR') {
      console.error('Database has too many connections.')
    }
    if (err.code === 'ECONNREFUSED') {
      console.error('Database connection was refused.')
    }
  }
  if (connection) connection.release()
  return
})
module.exports = pool
This can then be used as:
pool.query('SELECT * FROM users', function (err, result, fields) {
  if (err) throw new Error(err)
  // Do something with result.
})
However, I really do not understand the point of
if (connection) connection.release()
Why do we need this if using the pool releases the connection automatically?
Once you do pool.getConnection(), you are removing a connection from the pool which you can then use and nobody else can get access to that connection from the pool. When you are done with it, you put it back in the pool so others can use it.
So, when not using pool.query() (which as you know puts it back in the pool automatically), you have to get a connection, do whatever you want with it and then put it back into the pool yourself.
If all you need to do is a single query, then use pool.query() and let it automatically get a connection from the pool, run the query, then release it back to the pool. But, if you have multiple things you want to do with the connection, such as multiple queries or multiple inserts to the database, then get the connection, do your multiple operations with it and then release it back to the pool. Getting a connection manually from the pool also allows you to build up state on that connection and share that state among several operations. Two successive calls to pool.query() may actually use different connections from the pool. They might even run in parallel.
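As an illustration, here is a minimal sketch of that manual pattern with the pool from the question; the WHERE clauses and the orders table are assumptions added for the example:
pool.getConnection(function (err, connection) {
  if (err) throw err
  // Both queries run on the same pooled connection.
  connection.query('SELECT * FROM users WHERE id = ?', [42], function (err, users) {
    if (err) {
      connection.release()
      throw err
    }
    connection.query('SELECT * FROM orders WHERE user_id = ?', [42], function (err, orders) {
      connection.release() // done: put the connection back into the pool
      if (err) throw err
      // Do something with users and orders.
    })
  })
})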
However, I really do not understand the point of
if (connection) connection.release()
Why do we need this if using the pool releases the connection automatically?
If you manually get a connection from the pool, then you have to manually put it back in the pool when you're done with it with connection.release(). Otherwise, the pool will soon be empty of connections and you'll have a bunch of idle connections that can't be used by anyone.
If you use the automatic methods like pool.query(), then it will handle putting it back into the pool after the single query operation.
Think of it like an automatic mode vs. a manual mode. The manual mode gives you finer-grained control over how you do things, but when the automatic mode lines up with your needs, it's easier to use. When the automatic mode (pool.query()) doesn't do exactly what you want, then manually get a connection from the pool, use it and then put it back.

intermittent 500 error coming from node.js websocket server

I have a websocket node.js app (game server) that runs a multiplayer html5 game.
The game has a website also. The game server and the website are on the same Apache VPS.
The game server uses mysql to store and retrieve data using mysql pooling from the node.js mysql package.
It works fine 99% of the time, but intermittently, at a random point, it will all of a sudden stop being able to get a mysql connection.
When this happens the website stops working and shows a 500 HTTP error. I believe that this is what's causing the problem in the game server. Because of the 500 HTTP error, MySQL can no longer be connected to, and thus pool.getConnection no longer works in the game server.
I find it strange that even though Apache is throwing up a 500 error, the game server can still be accessed successfully through a websocket as usual. The only thing that appears to have stopped working inside the game server is mysql. The game client connects to the game server via websocket and the functions work correctly, except for being able to connect to mysql.
If I Ctrl+C the game server to stop the Node.js app (game server), then the 500 error goes away. The website instantly serves up again, and if I then restart the game server, MySQL works again.
Obviously something in the game server is causing this to happen. So far I cannot find what it is. I am stuck now; I've spent a full week trying everything I could think of to debug this.
After running the mysql module in debug mode, I'm seeing this:
<-- ErrorPacket
ErrorPacket {
  fieldCount: 255,
  errno: 1203,
  sqlStateMarker: '#',
  sqlState: '42000',
  message: "User (I've hidden this) already has more than 'max_user_connections' active connections" }
But I have the connection limit set to 100000. There is no way that many are being used. Every time I finish with a connection I use connection.release() to put it back into the pool. What do I need to do to fix this?
Please, any suggestion you have to debug this is greatly appreciated!
Thanks in advance.
Here is the way I'm using mysql in the game server:
const mysql = require('mysql');

const pool = mysql.createPool({
  connectionLimit : 100000,
  host            : '***********',
  user            : '***********',
  password        : '***********',
  database        : '***********',
  debug           : true
});

pool.getConnection(function (err, connection) {
  if (err) {
    console.log(err);
    return false;
  }
  connection.query("select * from aTable", function (err, rows) {
    if (err) {
      console.log(err);
      connection.release();
      return false;
    }
    // do stuff here
    connection.release();
  });
});
One thing I am wondering: if there is an error in the top error callback here ->
pool.getConnection(function (err, connection) {
  if (err) {
    console.log(err);
    return false;
  }
Then the connection is not released, right? So that would keep a connection alive, and this is what's causing the problem? Over time an error happens here and there, and after a random amount of time enough of these build up, and that's what is causing it? Like a build-up of open connections?
This was the mistake I was making:
pool.getConnection(function (err, connection) {
  if (err) {
    console.log(err);
    return false;
  }
  connection.query("select * from aTable", function (err, rows) {
    if (err) {
      console.log(err);
      connection.release();
      return false;
    }
    if (somethingrelevant) {
      // do stuff
      connection.release();
    }
  });
});
And that meant that if somethingrelevant didn't happen, then the connection would stay open.
My pool would continue to open new connections but they weren't always being put back.
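A minimal sketch of the corrected pattern, using the same names as above: the connection is released on every path, before the somethingrelevant check, so it always goes back to the pool:
pool.getConnection(function (err, connection) {
  if (err) {
    console.log(err);
    return false;
  }
  connection.query("select * from aTable", function (err, rows) {
    // Release immediately: the rows are already in memory, and every
    // code path below now returns the connection to the pool.
    connection.release();
    if (err) {
      console.log(err);
      return false;
    }
    if (somethingrelevant) {
      // do stuff with rows
    }
  });
});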

Nodejs with persistent connection (MySQL/Redis)

I wonder what the optimal way is to establish/maintain a connection to MySQL/Redis from Node.js: store it in one single object (conn) or create a new connection on every request? Namely:
1. Should we use a single connection for every Node.js HTTP request? Use a connection pool? Or a new connection on every request (in which case reconnection becomes important, because the connection could be randomly lost)? How is the performance?
2. What is the difference between MySQL and Redis in terms of maintaining such a connection?
I will tell you how I manage this.
1. Should we use a single connection for every Node.js HTTP request? Use a connection pool? Or a new connection on every request (in which case reconnection becomes important, because the connection could be randomly lost)? How is the performance?
You don't want to create connections manually for every Node.js HTTP request. Always use connection pooling if you are using the Node.js mysqljs/mysql module. I use it.
Pools take care of server reconnections automatically.
I don't have benchmarks, but performance should be better because, within a pool, connections can be reused once released. Besides that, believe me, creating and managing connections manually is cumbersome and error-prone.
For example:
Declare your MySQL connection pool in a Db.js file like below and export it.
var mysql = require("mysql");

var pool = mysql.createPool({
  connectionLimit : 5,
  host            : process.env.DB_HOST || 'localhost',
  user            : process.env.DB_USER,
  password        : process.env.DB_PASSWORD,
  database        : 'mydb'
});

module.exports = pool;
And use it inside an endpoint in another file:
var pool = require('./Db.js');

app.get('/endpoint', function (req, res) {
  // ...
  pool.query('<your-query-here>', function (err, result, fields) {
    if (err) throw err;
    // do something with result
  });
});
I prefer using both pool.query and pool.getConnection, depending on the scenario. Using query is safer because you don't need to think about releasing the connection; the library handles it automatically. getConnection is used only where several queries have to be run inside an endpoint, so the same connection can be reused for all of them.
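For example, here is a minimal sketch of the getConnection case, running two dependent statements inside a transaction on one pooled connection; the accounts table and its columns are assumptions added for the example:
pool.getConnection(function (err, connection) {
  if (err) throw err;
  connection.beginTransaction(function (err) {
    if (err) { connection.release(); throw err; }
    connection.query('UPDATE accounts SET balance = balance - ? WHERE id = ?', [10, 1], function (err) {
      if (err) return connection.rollback(function () { connection.release(); throw err; });
      connection.query('UPDATE accounts SET balance = balance + ? WHERE id = ?', [10, 2], function (err) {
        if (err) return connection.rollback(function () { connection.release(); throw err; });
        connection.commit(function (err) {
          if (err) return connection.rollback(function () { connection.release(); throw err; });
          connection.release(); // both statements ran on the same connection
        });
      });
    });
  });
});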
2. What is the difference between MySQL and Redis in terms of maintaining such a connection?
In short, you don't need pooling for Redis. We don't need to think about pooling Redis connections, according to this.
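As an illustration, here is a minimal sketch of sharing one Redis client across all requests, using the classic callback API of the redis package (v3); the redisClient.js file name, the app object and the visits key are assumptions:
// redisClient.js - one client, created once and shared by every request
var redis = require('redis');

var client = redis.createClient(); // defaults to 127.0.0.1:6379
client.on('error', function (err) {
  console.error('Redis error', err);
});

module.exports = client;

// elsewhere, inside an endpoint:
var client = require('./redisClient');

app.get('/visits', function (req, res) {
  client.incr('visits', function (err, visits) { // reuses the single shared connection
    if (err) return res.status(500).send(err.message);
    res.send('Visits: ' + visits);
  });
});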

node.js - Is it necessary to close the connection when a mysql task is done?

I'm trying to learn the mysql npm package on my Node.js server.
I want to know whether my server needs to close the MySQL connection when it has finished a task or not.
This is my code:
// db is the mysql connection (or pool) created elsewhere
let insertUserByEmail = function (data, callback) {
  db.query("INSERT INTO user (username, password) VALUES (?, ?)", [data.email, data.password], callback);
  // then close it
};
And for the next task, I will connect again.
Is this solution good?
According to standard practice and security measures, you should close any open database connection made by your script. If you leave a connection open, it will eventually be timed out automatically by the database server.
Specifically, for Node applications you can close all the connections in a pool with:
db.end(function (err) {
  // all open connections in the pool have ended
});
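Putting it together, here is a minimal sketch using a pool, so individual tasks don't need to close anything and pool.end() is only called when the process shuts down; the connection settings are placeholders:
const mysql = require('mysql');

const pool = mysql.createPool({
  host     : 'localhost', // placeholder settings
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});

// No per-task close: the pool reclaims the connection after each query.
let insertUserByEmail = function (data, callback) {
  pool.query("INSERT INTO user (username, password) VALUES (?, ?)", [data.email, data.password], callback);
};

// Close all pooled connections only when the app is shutting down.
process.on('SIGINT', function () {
  pool.end(function (err) {
    process.exit(err ? 1 : 0);
  });
});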

NodeJS and Mysql. Would not calling connection.end() before terminate be acceptable?

I want to use mysql from AWS Lambda (hosted nodejs).
The Node.js instance will be automatically terminated by Lambda when no new requests show up for a few minutes.
Due to this Lambda behavior, I don't want to call end(), because otherwise every request would become a connect-use-end cycle. I want the connection (or pool) to live across multiple requests.
Would it be a problem if connection.end() is not called and the instance gets terminated? (Could there be a leak or something?)
var mysql = require('mysql');

var connection = mysql.createConnection({
  host     : 'localhost',
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});

connection.connect();

exports.handler = function () {
  connection.query('SELECT x', function (err, rows, fields) {
    // do something here
  });
};

// * cannot call end() here, because a potential incoming request may still need the connection.
// connection.end();
You may end up getting "Too many connections" errors at some point.
Your best option till then is to open and close the connection within the same invocation.
Connection pooling is a use case that needs effective reuse of Lambda containers, and it is a pending feature request on AWS:
https://forums.aws.amazon.com/thread.jspa?threadID=216000
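A minimal sketch of that connect-use-end pattern, creating and ending the connection inside each invocation; the query and the connection settings are placeholders taken from the question:
var mysql = require('mysql');

exports.handler = function (event, context, callback) {
  var connection = mysql.createConnection({
    host     : 'localhost',
    user     : 'me',
    password : 'secret',
    database : 'my_db'
  });

  connection.query('SELECT x', function (err, rows) {
    // end() lets queued queries finish, then closes the connection cleanly.
    connection.end(function () {
      callback(err, rows);
    });
  });
};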