I am trying to do the following in a Node.js application running on Mac OS X (the Docker REST API is used to interact with Docker;
API reference: https://docs.docker.com/engine/reference/api/docker_remote_api_v1.23/):
Create a MySQL Docker container, mapping its port 3306 to one of the host's ports. This is the container configuration POST body:
{
  "Env": ["MYSQL_ROOT_PASSWORD=root"],
  "Image": "test-mysql",
  "ExposedPorts": {
    "3306/tcp": {}
  },
  "HostConfig": {
    "PortBindings": { "3306/tcp": [{}] }
  }
}
Start this newly created MySQL container.
Fetch the host port mapped to container port 3306 by inspecting the container. (Assume the mapping maps the container's port 3306 to host port 32789.)
Create a MySQL connection using the mysql npm module (https://www.npmjs.com/package/mysql).
The connection config is:
{
  host: dockerHostIp,         // 192.168.99.100 -- docker host IP
  port: portMappingFromStep3, // 32789
  user: mySQLUser,
  password: mySQLUserPassword,
  database: 'mysql'
}
Running all these steps sequentially throws an error while setting up the MySQL connection, producing the following output:
{
[Error: connect ECONNREFUSED 192.168.99.100:32789]
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect',
address: '192.168.99.100',
port: 32800,
fatal: true
}
But if I wait around 7-8 seconds before establishing the MySQL connection, the connection is established successfully.
I am not sure what is happening here, or why the MySQL connection fails without the delay. I suspect it is related to the port mapping: the mapping seems to take some time to set up, and a delay of roughly 7 seconds covers that window.
I am trying to set up an end-to-end test environment in which multiple MySQL containers run in parallel, so a waiting time of 7-8 seconds per connection setup is a big overhead.
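One way to avoid a fixed sleep is to keep retrying the connection until the server accepts it, so you only wait as long as actually needed. A minimal sketch, assuming the mysql module and the connection config from the step above (connectionConfig is a placeholder name); the retry count and delay are arbitrary:

// Retry the connection until the container's MySQL server is ready,
// instead of sleeping for a fixed 7-8 seconds.
function connectWithRetry(config, retriesLeft, delayMs, callback) {
  var connection = mysql.createConnection(config);
  connection.connect(function (err) {
    if (!err) return callback(null, connection);
    if (retriesLeft <= 0) return callback(err);
    connection.destroy(); // discard the failed connection object
    setTimeout(function () {
      connectWithRetry(config, retriesLeft - 1, delayMs, callback);
    }, delayMs);
  });
}

connectWithRetry(connectionConfig, 20, 500, function (err, connection) {
  if (err) throw err;
  // connection is ready here
});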
Any help to unravel this mystery will be very helpful.
Thanks
{Solved}. New Bug
Error from Heroku logs:
Error: connect ECONNREFUSED 127.0.0.1:3306
2021-09-23T18:24:12.236657+00:00 app[web.1]: at TCPConnectWrap.afterConnect. [as oncomplete] (node:net:1146:16) {
2021-09-23T18:24:12.236658+00:00 app[web.1]: errno: -111,
2021-09-23T18:24:12.236658+00:00 app[web.1]: code: 'ECONNREFUSED',
2021-09-23T18:24:12.236658+00:00 app[web.1]: syscall: 'connect',
2021-09-23T18:24:12.236659+00:00 app[web.1]: address: '127.0.0.1',
2021-09-23T18:24:12.236659+00:00 app[web.1]: port: 3306,
2021-09-23T18:24:12.236659+00:00 app[web.1]: fatal: true
Information / Background:
React.js frontend (now hosted on Netlify)
JavaScript Node backend using Express and mysql2 (hosted on Heroku)
Goal:
To connect a Netlify frontend POST request to the Heroku backend, fetching the POST payload data and inserting it into a MySQL table.
Update: 09/24/2021
I have done all that was suggested. I created a new database with ClearDB, added it, and tested the connection in MySQL Workbench. I created the table needed and updated the backend code to connect to the new database. I checked the Heroku config variables and made sure they correctly reflected the new database. Then there was an authorization issue. {solved}
New Backend Code with corrections:
const express = require('express');
const app = express();
const port = process.env.Port || 8000
app.listen(port);
console.log(`server is listening on ${port}`);
Time Out
Question:
If this connects locally in Workbench, why wouldn't Heroku connect when the credentials are added as config variables? {Answered}
New question:
Why is it trying to use port 8000 when it should be using the environment variable? Why is it timing out?
Any help on this would be greatly appreciated.
You can't connect to the database because you don't have any database instance running at Heroku. Once you push your code using the Heroku CLI, Heroku sets up a Node.js application for you, but that does not mean it sets up a database as well.
You can add a database to your app through Heroku's interface or CLI. From the CLI (we are going to set up ClearDB, but any MySQL database addon should work):
heroku addons:create cleardb:ignite
Once that is done, you can get your new database URL (which won't be localhost) with:
heroku config | grep CLEARDB_DATABASE_URL
The last command's output will look something like:
mysql://<user>:<password>@<host>/<database>?reconnect=true
Now, with this in hand, modify your code a bit to use that new information. You don't want to expose your database credentials in version control, so use environment variables:
const db = mysql.createConnection({
  connectionLimit: 50, // note: only used by mysql.createPool; ignored for a single connection
  user: process.env.DB_USER,
  host: process.env.DB_HOST,
  password: process.env.DB_PASSWORD,
  database: process.env.DATABASE,
  port: 3306
});
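Alternatively, since the addon already sets CLEARDB_DATABASE_URL for you, you could parse that one variable instead of maintaining four separate ones. A sketch using Node's global URL class (Node 10+); the variable names match the config above:

// CLEARDB_DATABASE_URL looks like mysql://<user>:<password>@<host>/<database>?reconnect=true
const dbUrl = new URL(process.env.CLEARDB_DATABASE_URL);

const db = mysql.createConnection({
  user: dbUrl.username,
  password: dbUrl.password,
  host: dbUrl.hostname,
  database: dbUrl.pathname.slice(1), // strip the leading "/"
  port: 3306
});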
If you go with the separate variables, you also need to set them for your running Heroku app:
heroku config:set DB_USER=<user>
heroku config:set DB_PASSWORD=<password>
heroku config:set DB_HOST=<host>
heroku config:set DATABASE=<database>
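You can double-check that they were saved with:
heroku config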
Now you have a database instance running at Heroku and a Node.js app instance that can connect to it.
For further reading, you may want to take a look at these links:
https://lo-victoria.com/build-a-mysql-nodejs-crud-app-4-deploying-to-heroku-finale
https://www.bezkoder.com/deploy-node-js-app-heroku-cleardb-mysql/
https://raddy.co.uk/blog/how-to-deploy-node-js-express-ejs-mysql-website-on-heroku-cleardb/ (this one uses Heroku's interface)
I'm finished with my project and I'm trying to deploy it to AWS. I have an EC2 instance as my web server with the following configuration:
Node.js using port 5000
PM2 (keeping the server alive at all times)
NGINX as the web server serving my build files
MySQL inside the EC2 instance as my database (using port 3306)
My problem is that I'm having trouble establishing a connection from my local machine to the MySQL database inside the EC2 instance. I opened MySQL Workbench and can connect just fine there, but when I try to establish a connection to the DB from Node.js it gives me an error.
Since I can connect successfully from MySQL Workbench, how can I connect from a Node.js connection string?
What I have already tried:
1) In the AWS security group, opening up a TCP rule for all incoming traffic on port 5000
2) In the AWS security group, opening up a MYSQL/Aurora rule for all incoming traffic on port 3306
3) Granting all privileges on *.* to the user, then flushing privileges and restarting the MySQL server (see the sketch below)
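(For reference, the grant in step 3 would look roughly like this on MySQL 5.x; the user name, host wildcard, and password are placeholders:)
GRANT ALL PRIVILEGES ON *.* TO 'jordan'@'%' IDENTIFIED BY 'yourpassword';
FLUSH PRIVILEGES;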
The error it gives me in the console:
{ Error: connect ECONNREFUSED 14.54.xxx.xx:3306
    at Object._errnoException (util.js:1019:11)
    at _exceptionWithHostPort (util.js:1041:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1175:14)
  --------------------
  code: 'ECONNREFUSED',
  errno: 'ECONNREFUSED',
  syscall: 'connect',
  address: '14.54.xxx.xxx',
  port: 3306,
  fatal: true }
Here is my code trying to establish the connection:
```
var mysql = require("mysql");

// Local connection -- works just fine
// var connection = mysql.createConnection({
//   host: "localhost",
//   user: "root",
//   password: "xxx",
//   database: "devdb",
//   charset: "utf8mb4"
// });

// Production connection to the AWS MySQL instance (stuck on this)
var connection = mysql.createConnection({
  host: "14.54.xxx.xxx",
  port: "3306",
  user: "jordan",
  password: "xxx",
  database: "productiondb",
  charset: "utf8mb4"
});

// Test the db connection
connection.connect(function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log("Connected!");
  }
});

module.exports = connection;
```
I expect to be able to connect successfully to the DB instance from my Node.js app.
Double-check your security groups; I think something is wrong there, or your MySQL server may only be listening on an internal address. Go to your EC2 security group, select Inbound, and add a rule with type=MYSQL/Aurora, protocol=TCP, port range=3306, source=0.0.0.0/0, ::/0 (allow all).
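(If you prefer the CLI over the console, a roughly equivalent rule can be added like this; the security group id is a placeholder:)
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 3306 \
    --cidr 0.0.0.0/0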
There are a couple of reasons this might be happening.
Configure the MySQL database:
# start the MySQL server
sudo service mysqld start
# run the configuration script
sudo mysql_secure_installation
In the prompt, follow these steps:
Enter current password for the root account: press Enter key
Set root password? Y
New password: yourpassword
Re-enter new password: yourpassword
Remove anonymous users? Y
Disallow root login remotely? n
Remove test database and access to it? Y
Reload privilege tables now? Y
If you are using RDS, you will have to provide NAT access to the VPC that holds your database.
Actually, I think I just figured it out.
The default MySQL configuration file has the IP bound to 127.0.0.1. I changed the binding to my EC2 public IP, changed the default "mysql" user to "jordan", saved the configuration file, and restarted the MySQL service, and it now works.
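(For reference, the relevant lines live in the MySQL configuration file, e.g. /etc/mysql/mysql.conf.d/mysqld.cnf or /etc/my.cnf depending on the distro; the values below are a sketch of the change described, with placeholders:)
[mysqld]
# default is 127.0.0.1, which only accepts local connections
bind-address = <ec2-ip>
user         = jordan
(followed by a restart, e.g. sudo service mysql restart)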
Thank you for the suggestions. I'll write this down in my documentation to check for in the future.
I've got two docker services, one running a simple node server and the other a mysql (mariadb actually) database server.
All instances of a socket file mentioned anywhere in /etc/mysql/ say
/var/run/mysqld/mysqld.sock
This will be important soon.
My node server is running some Sequelize code that is trying to connect to the MySQL server.
Whenever I try and connect via Sequelize, I get:
{"statusCode":500,"error":"Internal Server Error","message":"connect ENOENT /var/run/mysqld/mysqld.sock"}
However, if I log into the Node docker container I can successfully connect to MySQL on the other docker container using the mysql CLI client.
I think I understand that the mysql client is using a TCP connection, while Sequelize is using a socket connection. But when Sequelize throws that error, it shows the correct socket path, as far as I can tell. Here is my Sequelize config:
const options = {
  host: "mysql",
  dialect: "mysql",
  dialectOptions: {
    socketPath: "/var/run/mysqld/mysqld.sock"
  }
};

let sequelize = new Sequelize("ibbr_dev", "devuser", "password", options);
The MySQL socket file is not available in your Node container; it only exists in the MySQL container, because a Unix socket is a file on that container's filesystem. Rather than setting up a socket-based connection, you should use a TCP connection (dropping dialectOptions entirely).
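A sketch of the TCP-based config, assuming the two services share a Docker network on which the MySQL container is reachable under the hostname "mysql" (e.g. a docker-compose service name):

const Sequelize = require("sequelize");

const options = {
  host: "mysql", // the MySQL container's service name / network alias
  port: 3306,
  dialect: "mysql"
  // no dialectOptions.socketPath: connect over TCP instead of the Unix socket
};

let sequelize = new Sequelize("ibbr_dev", "devuser", "password", options);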
Starting from scratch, I googled how to connect to a mysql database over ssh using node.js and the mysql library, and I came across this:
Node.js connecting through ssh
So I started a "screen" session, connected with the ssh command, and created the connection in a node script. However, I was getting an error. A comment below the accepted answer had the same issue:
I'm using a mac terminal, I typed 'screen', entered in the information you provided with my domain and password,and succesfully connected into my server via ssh. However, when I run my server.js node file the problem still persists. I'm receiving: { [Error: connect ECONNREFUSED] code: 'ECONNREFUSED', errno: 'ECONNREFUSED', syscall: 'connect', fatal: true } Is there a step here that I missed? I'm able to query successfully with this code on servers that don't require ssh.
And the response led me somewhere but did not completely explain what I need to do:
After you connect via ssh, you need to connect your Node.js app to localhost, because with that ssh command in screen you make port 3306 from the MySQL server available on your local machine.
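(Concretely, the command being described is a local port forward, something along the lines of:)
# forward local port 3306 to port 3306 on the remote host's loopback interface
ssh -N -L 3306:127.0.0.1:3306 user@your-server.com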
How exactly does one "connect your node.js app to localhost"? I saw that on the remote server side, I was getting channel 3: open failed: connect failed: Connection refused. So some sort of request was getting successfully sent to my remote server. However, something was failing. Googling led me to this answer:
SSH -L connection successful, but localhost port forwarding not working "channel 3: open failed: connect failed: Connection refused"
The simplest explanation for the rejection is that, on server.com, there's nothing listening for connections on localhost port 8783. In other words, the server software that you were trying to tunnel to isn't running, or else it is running but it's not listening on that port.
So now I'm stuck. How does one cause a server to "listen" so that mysql can work over ssh?
Thanks!
FWIW tunneling mysql over ssh can be accomplished in-process with the mysql2 and ssh2 modules. For example:
var mysql = require('mysql2');
var Client = require('ssh2').Client;

var ssh = new Client();
ssh.on('ready', function() {
  ssh.forwardOut(
    // source address, this can usually be any valid address
    '127.0.0.1',
    // source port, this can be any valid port number
    12345,
    // destination address (localhost here refers to the SSH server)
    '127.0.0.1',
    // destination port
    3306,
    function (err, stream) {
      if (err) throw err;
      var sql = mysql.createConnection({
        user: 'foo',
        database: 'test',
        stream: stream // <--- this is an important part
      });
      // use `sql` connection as usual
    });
}).connect({
  // ssh connection config ...
});
Also, since there is overhead with creating ssh connections, you might want to create an ssh connection pool for better reuse.