npm mysql cannot acquire a new connection once the connection limit is reached - mysql

I am using the mysql package from npm in my NodeJS project. I am using a connection pool as below -
var pool = mysql.createPool({
    connectionLimit: 50,
    host: host,
    user: user,
    password: password,
    database: database
});
And then I am using the pool as -
pool.query("Select ....", function (err, data) {
});
But sometimes our database server gets stuck on large queries and I think the pool's connection limit gets exceeded. Then, even after the stuck queries have finished, the mysql library cannot acquire new connections. I cannot even see the queries in SHOW PROCESSLIST of MySQL, so there is an issue in acquiring new connections, and there is nothing in the logs either. I work around the issue by restarting the Node server, but that isn't an ideal solution. Please help me identify the cause of the issue. A similar issue occurs with MSSQL connections in NodeJS, and I just cannot identify the reason for it.

After you're done processing your queries, you should call pool.end() to close the pool's connections.
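For reference, a minimal sketch of the pool lifecycle with the mysql package (the host and credentials below are placeholders): pool.query() checks a connection out and releases it for you, pool.getConnection()/connection.release() do the same explicitly, and pool.end() shuts down every connection in the pool, so it is normally reserved for application shutdown.
var mysql = require('mysql');

// Placeholder credentials - adjust for your environment.
var pool = mysql.createPool({
    connectionLimit: 50,
    host: 'localhost',
    user: 'user',
    password: 'password',
    database: 'database'
});

// Shorthand: the pool acquires a connection, runs the query and releases it.
pool.query('SELECT 1', function (err, rows) {
    if (err) return console.error(err);
    console.log(rows);
});

// Explicit form: every getConnection() must be paired with release(),
// otherwise the pool eventually has no free connections left to hand out.
pool.getConnection(function (err, connection) {
    if (err) return console.error(err);
    connection.query('SELECT 1', function (err, rows) {
        connection.release(); // return the connection to the pool
        if (err) return console.error(err);
        console.log(rows);
    });
});

// end() closes ALL pooled connections - call it once, on shutdown.
process.on('SIGINT', function () {
    pool.end(function (err) {
        process.exit(err ? 1 : 0);
    });
});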

Related

Heroku is not connecting to the database - NodeJS and MySQL

Last week I created a Node server with MySQL. Everything went fine. I was using the Heroku config vars to connect and built the whole mobile app with the endpoints.
Suddenly, today, Heroku decided to close the connection to the MySQL database (ClearDB) and throw status 503 with "Connection lost".
const mysql = require('mysql');
const db = mysql.createPool({
    connectionLimit: 120,
    host: 'hostFromHeroku.cleardb.net',
    user: 'user',
    password: 'password',
    database: 'databaseName',
    debug: false
});
db.getConnection((err, connection) => {
    if (err)
        throw err; // THIS LINE THROWS THE ERROR
    console.log('Database connected successfully');
    connection.release();
});
module.exports = db;
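Since throw err inside that callback crashes the whole process whenever the database is briefly unreachable, a gentler startup check could log and retry instead; a rough sketch against the same db pool (the 10-second retry interval is an arbitrary placeholder):
// Retry the startup connectivity check instead of crashing on a transient outage.
function checkDatabase(attempt) {
    db.getConnection((err, connection) => {
        if (err) {
            console.error('Database check failed (attempt ' + attempt + '):', err.code);
            return setTimeout(() => checkDatabase(attempt + 1), 10000);
        }
        console.log('Database connected successfully');
        connection.release();
    });
}
checkDatabase(1);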
This is the Heroku console output (screenshot not shown).
I tried to connect dozens of times, again and again... I don't know what's wrong.
I checked the ClearDB portal and noticed that the connection disappeared from it. I also tried to connect with MySQL Workbench, but the connection still cannot be established...
In the figure above there are still no connections... I checked the credentials a few dozen times...
Any ideas? If anyone wants more pieces of the code, please let me know.
Thanks, Daniel
I am currently having the same issue with ClearDB + Heroku using Python/Django, and it started today as well.
I do believe this is an issue with ClearDB, or perhaps Heroku, and not on your/our application side. See if you can connect to the database using some client like MySQL Workbench, for example. It probably won't work either.
I opened a ticket with both Heroku and ClearDB. I suggest you do the same. They must have messed something up.
ClearDB is currently experiencing some issues.
Having the same issue over here: I'm unable to connect to my database using TablePlus/Sequel, with the following error message:
Lost connection to MySQL server at 'reading initial communication packet', system error: 0
And the Status/Dashboard indicates that the database is up and running.
Created a support ticket and got the following response:
Team is still working on the issue. We will update you once we have
next update.
Support response from 5/17/2022 08:36PM
Our support team continues to work on the issue on priority. they have
identified some issues with the shared MYSQL DB and AWS volumes. The
team is trying to fix the db node and resync the data which may take
some time. We will keep you updated with the service restoration
progress.
After 30h of downtime my database is back online again

After connecting to the MySQL database, I'm getting "Error: Got packets out of order"

Currently I am trying to set up a simple REST API using Deno and MySQL. After successfully creating the database and a table and inserting some values, I'm failing to get those values back from the Deno side. Here is my code:
import { Client } from "https://deno.land/x/mysql/mod.ts";

const client = await new Client().connect({
    hostname: "127.0.0.1",
    username: "root",
    port: 3306,
    db: "testDatabase",
    password: "",
});
await client.execute('use Ponys');
await client.query('SELECT * FROM Students');
After execute/query I always get these messages:
INFO connecting 127.0.0.1:3306
INFO connected to 127.0.0.1
Error: Got packets out of order
I'm running the app with this command:
deno run --allow-all index.ts
My local SQL server is running all the time.
Can you help me find out why I cannot get the values? Thanks!
According to the developer, it's a bug.
https://github.com/manyuanrong/deno_mysql/issues/16
More specifically...
https://github.com/manyuanrong/deno_mysql/issues/16#issuecomment-639344637
Prepare for the current crushing reality of Deno not yet possessing a functional mysql driver. It doesn't support passwords!
But when it does man... but when it does, it will soon be a one stop shop of awesomeness.
Just imagine... Beastly NGINX as the SSL Proxy, Single file Deno as the runtime Gateway and MySQL as the relational database running spectacularly in a Digital Ocean $5.00 Droplet..
I simply cannot wait.
If you make SQL requests after a long period of inactivity, you can get the same issue.
A possible fix is to tune these values in the MySQL server config:
interactive_timeout
wait_timeout
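To see what your server is currently using before changing anything, you can query the values from the same Deno client; a small sketch (the variable names match the ones mentioned above):
// Inspect the current server-side idle timeouts (values are in seconds).
const rows = await client.query(
    "SHOW VARIABLES WHERE Variable_name IN ('wait_timeout', 'interactive_timeout')"
);
console.log(rows);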

How to properly use Knex / Bookshelf with MySQL on RDS

I have a Node.js application using MySQL on AWS RDS with the Bookshelf & Knex libraries. The RDS instance has a max_connections value of 90.
I am using the following as the connection object.
knex: {
    client: 'mysql',
    connection: {
        host: 'xxxxxxx.rds.amazonaws.com',
        user: 'xxx',
        password: 'xxxxx',
        database: 'xxxx',
        charset: 'utf8'
    },
    debug: true,
    pool: {
        min: 2,
        max: 20
    },
    acquireConnectionTimeout: 10000
},
const config = require('./environment');
const knex = require('knex')(config.knex);
module.exports = require('bookshelf')(knex).plugin('registry');
'use strict';
const bookshelf = require('../config/bookshelf');
const config = require('../config/environment');
module.exports = bookshelf.model('TableA', {
    tableName: 'TableA'
}, {});
The application receives many requests and sometimes crashes with the following errors.
Unhandled rejection TimeoutError: Knex: Timeout acquiring a
connection. The pool is probably full. Are you missing a
.transacting(trx) call?
and
Error: ER_CON_COUNT_ERROR: Too many connections
Also, I see a number of connections (40 to 50 on average) in the server PROCESSLIST with the Command column showing Sleep.
I suspect these errors happen when all 90 connections on the server are in use, or when Knex cannot acquire a new connection from the pool when it tries to. What could be a permanent solution for this, and what are the best practices for handling this kind of application?
I don't think it is the RDS max_connections that is causing the issue, assuming you only have one instance of the above application code running at any time.
Your application uses a DB connection pool, which can hold up to 20 connections. If all of those connections are in use, the application waits for up to acquireConnectionTimeout ms (10000 in your case) before the acquisition times out.
So I suspect that your application either has a lot of DB queries to process due to load, or there are some slow queries hogging connections. This causes a backlog of queries waiting for connections, which eventually times out. Investigate which might be the case and do update us.
Things you can try in the meantime:
Increase acquireConnectionTimeout.
Increase connection pool size.
If caused by slow queries then optimize them before trying the above.
Possible methods for logging slow queries:
Enable slow query log on RDS.
Use Knex's query events to log transaction duration, assuming you are using transactions (a sketch follows this list).
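A rough sketch of that second approach, timing each statement via Knex's 'query' and 'query-response'/'query-error' instance events (the 200 ms threshold is an arbitrary placeholder):
// Time each statement via Knex's query events and log the slow ones.
const SLOW_QUERY_MS = 200; // placeholder threshold
const startTimes = new Map();

knex.on('query', (query) => {
    startTimes.set(query.__knexQueryUid, Date.now());
});

function logDuration(query) {
    const startedAt = startTimes.get(query.__knexQueryUid);
    startTimes.delete(query.__knexQueryUid);
    if (startedAt === undefined) return;
    const duration = Date.now() - startedAt;
    if (duration >= SLOW_QUERY_MS) {
        console.warn('Slow query (' + duration + ' ms): ' + query.sql);
    }
}

knex.on('query-response', (response, query) => logDuration(query));
knex.on('query-error', (error, query) => logDuration(query));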
When a client is finished with MySQL, have it disconnect.
Also, check the value of wait_timeout. Lowering it will force disconnections rather than "Sleeping" until you come back.

What is the correct way to handle MySQL connections in Node JS

I built a program with NodeJS where multiple users access it at the same time and perform a lot of operations that query the MySQL database.
My approach is very simple. I only open one connection when the app is started and leave it that way.
const dbConfig = require('./db-config');
const mysql = require('mysql');

// Create mySQL Connection
const db = mysql.createConnection({
    host: dbConfig.host,
    user: dbConfig.user,
    password: dbConfig.password,
    database: dbConfig.database,
    multipleStatements: true
});

// Connect MySQL
db.connect((err) => {
    if (err) {
        throw err;
    } else {
        console.log('MySQL connected!');
    }
});

module.exports = db;
And then, whenever the program needs to query the database, I do it like this:
db.query('query_in_here', (error, result) => {
    // error handling and doing stuff
});
I'm having trouble when no one accesses the app for a long period of time (a few hours).
When this happens, I think the connection is being closed automatically. Then, when a user tries to access the app, I see in the console that the connection timed out.
My first thought was to handle the disconnection and connect again. But it got me thinking whether this is the correct approach.
Should I use connection pools instead? Because if I keep only one connection, does that mean two users can't query the database at the same time?
I tried to understand tutorials about connection pools, but couldn't figure out when to create new connections and when I should end them.
UPDATE 1
Instead of creating one connection when the app is started, I changed it to create a connection pool.
const dbConfig = require('./db-config');
const mysql = require('mysql');

// Create mySQL connection pool
const db = mysql.createPool({
    host: dbConfig.host,
    user: dbConfig.user,
    password: dbConfig.password,
    database: dbConfig.database,
    multipleStatements: true
});

module.exports = db;
It seems that now, when I use "db.query(....)", acquiring the MySQL connection and releasing it afterwards is done automatically.
So it should resolve my issue, but I don't know if this is the correct approach.
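That is indeed what the mysql package documents for pool.query(): it is shorthand for checking a connection out, running the query, and releasing it. If you ever need the explicit form (for example, to run several statements on the same connection), it looks roughly like this:
// Explicit equivalent of db.query() when using a pool:
db.getConnection((err, connection) => {
    if (err) return console.error(err);
    connection.query('query_in_here', (error, result) => {
        connection.release(); // always return the connection to the pool
        if (error) return console.error(error);
        // ...doing stuff with result
    });
});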
Should I use connection pools instead?
Yes you should. Pooling is supported out-of-the-box with the mysql module.
var mysql = require('mysql');

var pool = mysql.createPool({
    connectionLimit: 10,
    host: 'example.org',
    user: 'bob',
    password: 'secret',
    database: 'my_db'
});

pool.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
    // should actually use an error-first callback to propagate the error, but anyway...
    if (error) return console.error(error);
    console.log('The solution is: ', results[0].solution);
});
You're not supposed to know how pooling works. It's abstracted from you. All you need to do is use pool to dispatch queries. How it works internally is not something you're required to understand.
What you should pay attention to is the connectionLimit configuration option. It should stay below your MySQL server's connection limit (leave at least one spare, in case you want to connect to the server yourself while your application is running), otherwise you'll get "too many connections" errors. The default max_connections for MySQL is 151 (older versions used 100), so check your server's value and set connectionLimit just under it.
Because if I keep only one connection, does that mean two users can't query the database at the same time?
Without pooling, you can't serve multiple user requests in parallel. It's a must-have for any non-hobby, data-driven application.
Now, if you really want to know how connection pooling works, this article sums it up pretty nicely.
In software engineering, a connection pool is a cache of database connections maintained so that the connections can be reused when future requests to the database are required. Connection pools are used to enhance the performance of executing commands on a database. Opening and maintaining a database connection for each user, especially requests made to a dynamic database-driven website application, is costly and wastes resources. In connection pooling, after a connection is created, it is placed in the pool and it is used again so that a new connection does not have to be established. If all the connections are being used, a new connection is made and is added to the pool. Connection pooling also cuts down on the amount of time a user must wait to establish a connection to the database.

How do I use Loopback's connection pool with Knex query builder?

I'm trying to utilize Knex with the Loopback framework. Currently, Loopback does not provide a good way to create advanced queries.
I'm using Knex's query builder; however, by default Knex will initialize its own connection pool on top of Loopback's. Instead, I want to use the connection pool already created by Loopback.
I've tried to use Knex's .connection() method to set the connection to the one from Loopback; however, when I monitor the processes on my MySQL server I notice that a new connection is created each time I make a call that uses Knex. Over time this causes my server to run out of connections to the database.
I'm doing something like this:
var knex = require('knex')({
    client: 'mysql',
    connection: {
        host: mysql.host,
        port: mysql.port,
        user: mysql.username,
        password: mysql.password,
        database: mysql.database,
        debug: false
    }
});
app.datasources.mysqldb.client.getConnection(function (err, connection) {
    knex.connection(connection)
    // ...continue with the query building
});
My question is: how do I utilize Loopback's existing connection with Knex so that Knex doesn't burn through all the available connections in my database? I've also tried using knex's "pool" configuration but it doesn't seem to do anything...
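For reference, the "pool" option mentioned there is normally passed alongside the connection settings, as in the sketch below (the limits are placeholders); whether tuning it helps when the goal is to reuse Loopback's own pool is exactly what the question leaves open:
var knex = require('knex')({
    client: 'mysql',
    connection: {
        host: mysql.host,
        port: mysql.port,
        user: mysql.username,
        password: mysql.password,
        database: mysql.database
    },
    // min/max bound how many connections THIS knex instance may open on its own.
    pool: {
        min: 0,
        max: 10
    }
});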