Docker(docker-compose) node + MySQL: ECONNREFUSED - mysql

I have a docker-compose.yml file that runs three containers. My problem is that when I start the containers, my Node (api) container seems to start before my MySQL container, even though I declare depends_on in my docker-compose.yml, giving me the following error:
error connecting: Error: connect ECONNREFUSED xxx.xx.x.x:3306
at TCPConnectWrap.afterConnect [as oncomplete]
After I get this error I can see in my console that the MySQL container is only just starting. My database is fine, and I can access it without any problems. If I make some change in my Node.js code, the server restarts, and once it is up again I don't have any connection problems, because by then the MySQL container is already up.
I even tried solutions such as wait-for-it.sh (https://github.com/vishnubob/wait-for-it/blob/master/wait-for-it.sh), but the result was the same: my Node backend tries to open a MySQL connection while the MySQL container is not ready yet.
This is my docker-compose.yml
version: "3"
services:
mysql:
image: my_mysql
build: ./db
restart: always
container_name: my_mysql
volumes:
- /var/lib/mysql
- ./db:/db
ports:
- "3307:3306"
environment:
- MYSQL_ROOT_PASSWORD=x
- MYSQL_USER=x
- MYSQL_PASSWORD=x
- MYSQL_DATABASE=x
networks:
- my_network
command: --default-authentication-plugin=mysql_native_password
api:
container_name: my_api
build: ./api
restart: always
ports:
- "9000:9000"
environment:
DB_HOSTNAME: mysql
working_dir: /api
volumes:
- ./api:/api
depends_on:
- mysql
networks:
- my_network
client:
container_name: my_client
image: mhart/alpine-node:12
build: ./client
restart: always
ports:
- "3000:3000"
working_dir: /client
volumes:
- ./client:/client
entrypoint: ["npm", "start"]
depends_on:
- api
networks:
- my_network
networks:
my_network:
driver: bridge
Dockerfile for my Node.js backend:
FROM mhart/alpine-node:12
WORKDIR /api
COPY package*.json /api/
RUN npm i -g nodemon
RUN npm install
COPY . /api/
EXPOSE 9000
CMD ["npm", "run", "dev"]
Dockerfile for my React frontend:
FROM mhart/alpine-node:12
WORKDIR /client
COPY package*.json /client/
RUN npm install
COPY . /client/
EXPOSE 3000
CMD ["npm", "start"]
and the Dockerfile for MySQL:
FROM mysql:8.0.19
Opening the MySQL connection in Node.js:
const config = require('config');
const express = require('express');
const router = express.Router();
const mysql = require('mysql');

const connection = mysql.createConnection({
  host : config.get('mysql.config.host'),
  user : config.get('mysql.config.user'),
  password : config.get('mysql.config.password'),
  database : config.get('mysql.config.database'),
  port : config.get('mysql.config.port')
});

connection.connect(function(err) {
  if (err) {
    console.error('error connecting: ' + err.stack);
    return;
  }
  console.log('connected as id ' + connection.threadId);
});

router.get("/", function(req, res, next) {
  connection.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
    if (error) {
      throw error;
    }
    res.send(`MySQL OK: ${results[0].solution}`);
  });
});

module.exports = router;
Thanks for any help.

I could not test it, but the Docker recommendation is to create a script that waits until the other container accepts connections.
Alternatively, write your own wrapper script to perform a more application-specific health check. For example, you might want to wait until Postgres is definitely ready to accept commands:
#!/bin/sh
# wait-for-postgres.sh

set -e

host="$1"
shift
cmd="$@"

until PGPASSWORD=$POSTGRES_PASSWORD psql -h "$host" -U "postgres" -c '\q'; do
  >&2 echo "Postgres is unavailable - sleeping"
  sleep 1
done

>&2 echo "Postgres is up - executing command"
exec $cmd
You could check the documentation here:
Control startup and shutdown order in Compose
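If a wait script alone does not help (as in the question above, where wait-for-it still left the first connection failing), another common approach is to retry the connection from Node itself until MySQL accepts it. This is only a minimal sketch, reusing the mysql package and the config module from the question; the retry count and delay are arbitrary:
const config = require('config');
const mysql = require('mysql');

// Try to connect, retrying a few times while MySQL is still starting up.
function connectWithRetry(retriesLeft, delayMs, callback) {
  const connection = mysql.createConnection({
    host: config.get('mysql.config.host'),
    user: config.get('mysql.config.user'),
    password: config.get('mysql.config.password'),
    database: config.get('mysql.config.database'),
    port: config.get('mysql.config.port')
  });

  connection.connect(function (err) {
    if (err) {
      console.error('MySQL not ready (' + err.code + '), retries left: ' + retriesLeft);
      connection.destroy();
      if (retriesLeft > 0) {
        setTimeout(function () {
          connectWithRetry(retriesLeft - 1, delayMs, callback);
        }, delayMs);
      } else {
        callback(err);
      }
      return;
    }
    callback(null, connection);
  });
}

connectWithRetry(10, 2000, function (err, connection) {
  if (err) {
    console.error('giving up on MySQL: ' + err.stack);
    process.exit(1);
  }
  console.log('connected as id ' + connection.threadId);
});
A related option is to use mysql.createPool instead of createConnection: a pool opens connections lazily, so the startup race usually only shows up if a query is issued before MySQL is ready.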
Hope this helps.

Related

ECONNREFUSED 3306 in Node.js connect to MySQL Container using Docker-Compose [duplicate]

Before you flag this question as a duplicate, please note that I did read other answers, but they didn't solve my problem.
I have a Docker Compose file consisting of two services:
version: "3"
services:
mysql:
image: mysql:5.7
environment:
MYSQL_HOST: localhost
MYSQL_DATABASE: mydb
MYSQL_USER: mysql
MYSQL_PASSWORD: 1234
MYSQL_ROOT_PASSWORD: root
ports:
- "3307:3306"
expose:
- 3307
volumes:
- /var/lib/mysql
- ./mysql/migrations:/docker-entrypoint-initdb.d
restart: unless-stopped
web:
build:
context: .
dockerfile: web/Dockerfile
volumes:
- ./:/web
ports:
- "3000:3000"
environment:
NODE_ENV: development
PORT: 3000
links:
- mysql:mysql
depends_on:
- mysql
expose:
- 3000
command: ["./wait-for-it.sh", "mysql:3307"]
/web/Dockerfile:
FROM node:6.11.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
CMD [ "npm", "start" ]
After docker-compose up --build the services start up; however, the wait-for-it.sh script times out while waiting for MySQL to start (so temporarily I am not using it when testing DB connectivity; I just wait until the console shows that MySQL is ready to accept incoming connections).
When MySQL is running I can log in from the host machine using Sequel Pro, query the DB, and get the sample records from ./mysql/migrations
I can also SSH into the running MySQL container and do the same.
However, my Node.js app yields ECONNREFUSED 127.0.0.1:3307 when connecting
MySQL init:
import * as mysql from 'promise-mysql'

const config = {
  host: 'localhost',
  database: 'mydb',
  port: '3307',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}

export let db = mysql.createPool(config);
MySQL query:
import { db } from '../db/client'

export let get = () => {
  return db.query('SELECT * FROM users', [])
    .then((results) => {
      return results
    })
    .catch((e) => {
      return Promise.reject(e)
    })
}
Route invoked when hitting URL /:
import { Router } from 'express';
import * as repository from '../repository'

export let router = Router();

router.get('/', async (req, res) => {
  let users;
  try {
    users = await repository.users.get();
  } catch(e) {
    // ECONNREFUSED 127.0.0.1:3307
  }
  res.render('index', {
    users: users
  });
});
It's unlikely to be a race condition, because at the same time that Node.js fails I can query using Sequel Pro or SSH into the running Docker container and query. So it's probably a case of Node.js not being able to access the MySQL container?
{
  error: connect ECONNREFUSED 127.0.0.1:3307
  code: 'ECONNREFUSED',
  errno: 'ECONNREFUSED',
  syscall: 'connect',
  address: '127.0.0.1',
  port: 3307,
  fatal: true
}
This:
mysql:
  image: mysql:5.7
  environment:
    ...
  ports:
    - "3307:3306"
Means that Docker will map port 3307 of the host to port 3306 of the container. So from Sequel Pro you can connect to localhost:3307.
However, it does not mean that the container is listening on 3307; the container is in fact still listening on 3306. When other containers try to access the mysql DNS name, it is translated to the internal container IP, so you must connect to 3306.
So your node config should look like:
const config = {
  host: 'mysql',
  database: 'mydb',
  port: '3306',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}
And this in your docker-compose.yml:
command: ["./wait-for-it.sh", "mysql:3306"]
Note: wait-for-it.sh script comes from: https://github.com/vishnubob/wait-for-it
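If you still want to reach the same database from the host (Sequel Pro on localhost:3307) and from the web container (mysql:3306), one option is to read the host and port from the environment and default to the in-network values. This is only a rough sketch in the style of the question's client code; the DB_HOST and DB_PORT variable names are hypothetical and would be set per environment:
import * as mysql from 'promise-mysql'

const config = {
  // Inside the Compose network: the service name and the container port.
  // From the host machine you would instead set DB_HOST=localhost and DB_PORT=3307.
  host: process.env.DB_HOST || 'mysql',
  port: process.env.DB_PORT || '3306',
  database: 'mydb',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}

export let db = mysql.createPool(config)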

Error database connect is not a function Nodejs with Mysql in docker

I have deployed my Docker application to DigitalOcean. Everything works, but I can't connect MySQL with Node.js.
When I run docker-compose up I get the error database.connect is not a function.
My server.js file looks like this:
const mysql = require("mysql");
const database = mysql.createPool({
host: process.env.MYSQL_HOST_IP,
user: "db_user",
password: "db_user_pass",
database: "guess-game",
port: 3306,
});
database.connect((err) => {
if (err) {
console.error("error connecting: " + err.stack);
return;
}
console.log("connected as id " + db.threadId);
});
module.exports = db;
I don't know what I need to write in this line to make it work.
host: process.env.MYSQL_HOST_IP,
I tried to add the droplet IP as the host, but this also doesn't work.
host: "http://46.101.162.111/",
Also, I tried this.
host: "46.101.162.111",
My docker-compose.yml file
version: "3"
networks:
dbnet:
services:
phpmyadmin:
image: phpmyadmin/phpmyadmin
container_name: phpmyadmin1
environment:
- PMA_ARBITRARY=1
- PMA_HOST=db
restart: always
links:
- db
ports:
- 8899:80
depends_on:
- db
networks:
- dbnet
api:
build: ./api
container_name: api1
command: npm run start
restart: unless-stopped
ports:
- "3005:3005"
environment:
- PORT=3005
- MYSQL_HOST_IP=172.18.0.2
depends_on:
- phpmyadmin
networks:
- dbnet
db:
image: mysql:latest
container_name: db
command: --default-authentication-plugin=mysql_native_password
environment:
- MYSQL_ROOT_PASSWORD=my_secret_password
- MYSQL_DATABASE=guess-game
- MYSQL_USER=db_user
- MYSQL_PASSWORD=db_user_pass
restart: always
ports:
- 6033:3306
networks:
- dbnet
I have been struggling with this for almost 3 days. 😣
You just need to use the DB container name instead of an IP address, like this:
MYSQL_HOST_IP=db
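With MYSQL_HOST_IP=db, Compose's internal DNS resolves the db service name to the container's IP, so the hard-coded 172.18.0.2 is no longer needed. Note also that the port stays 3306 inside the network (6033 is only the host-side mapping), and that the mysql package's pool has no connect() method, which is what produces "database.connect is not a function"; running a query is the usual way to verify connectivity. A small sketch of the connection side only, assuming everything else in server.js stays as posted:
const mysql = require("mysql");

const database = mysql.createPool({
  host: process.env.MYSQL_HOST_IP, // now "db", the service name, resolved by Compose DNS
  user: "db_user",
  password: "db_user_pass",
  database: "guess-game",
  port: 3306,                      // container port; 6033 only applies from the host machine
});

// A pool has no connect(); issue a query (or getConnection) to check the link.
database.query("SELECT 1", (err) => {
  if (err) {
    console.error("error connecting: " + err.stack);
    return;
  }
  console.log("MySQL reachable via service name 'db'");
});

module.exports = database;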

Node.js able to connect to MySQL container when app is run locally but unable to connect when app is run in a container

Docker-Compose file:
version: '3.8'
networks:
  app-tier:
    driver: bridge
services:
  mysql_node:
    image: mysql:5.7
    restart: always
    environment:
      MYSQL_DATABASE: 'sample_db'
      MYSQL_USER: 'user'
      MYSQL_PASSWORD: 'password'
      MYSQL_ROOT_PASSWORD: 'password'
    ports:
      - '3306:3306'
    expose:
      - '3306'
    volumes:
      - mysql_db:/var/lib/mysql
  app:
    depends_on:
      - mysql_node
    build: .
    ports:
      - 3000:3000
    environment:
      - MYSQL_HOST=mysql_node
      - MYSQL_USER=root
      - MYSQL_PASSWORD=password
      - MYSQL_NAME=sample_db
      - MYSQL_PORT=3306
      - MYSQL_HOST_IP=mysql_node
    networks:
      - app-tier
    volumes:
      - .:/app
volumes:
  mysql_db:
Dockerfile:
FROM node:latest
RUN mkdir app
WORKDIR /app
COPY . .
RUN npm install
EXPOSE 3000
CMD ["node", "app.js"]
Source code:
const http = require('http')
var mysql = require('mysql');

const hostname = '0.0.0.0'
const port = 3000

console.log(process.env.MYSQL_HOST_IP)
console.log(process.env.MYSQL_PORT)

var con = mysql.createConnection({
  host: "process.env.MYSQL_HOST_IP",
  user: "user",
  password: "password",
  port: 3306
});

con.connect(function(err) {
  if (err) throw err;
  console.log("Connected!");
});
When I run docker-compose up, the MySQL container launches successfully, but the app exits with the error Error: connect ETIMEDOUT.
When I run the app locally via node app.js, I get a successful "Connected!" message in my console.
I'm able to connect to the database via MySQL Workbench as well. It's just that the app is unable to connect to the DB when it is run as a container.
You've put the app on a named network but not the database. The database service therefore ends up on the default network, and the two containers can't talk to each other.
I'd remove the named network from the app, since it's probably not needed:
app:
  depends_on:
    - mysql_node
  build: .
  ports:
    - 3000:3000
  environment:
    - MYSQL_HOST=mysql_node
    - MYSQL_USER=root
    - MYSQL_PASSWORD=password
    - MYSQL_NAME=sample_db
    - MYSQL_PORT=3306
    - MYSQL_HOST_IP=mysql_node
  volumes:
    - .:/app
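Separately from the network issue, note that the question's app.js passes the host as the literal string "process.env.MYSQL_HOST_IP" (in quotes), so the environment variable is never read. A minimal sketch of the corrected connection, assuming the rest of app.js stays the same:
var mysql = require('mysql');

var con = mysql.createConnection({
  host: process.env.MYSQL_HOST_IP,   // the variable itself, not the string "process.env.MYSQL_HOST_IP"
  user: 'user',
  password: 'password',
  port: process.env.MYSQL_PORT || 3306
});

con.connect(function (err) {
  if (err) throw err;
  console.log('Connected!');
});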

How to run Sequelize migrations inside Docker

I'm trying to dockerize my Node.js API together with a MySQL image. Before the initial run, I want to run Sequelize migrations and seeds so the tables are up and ready to be served.
Here's my docker-compose.yaml:
version: '3.8'
services:
  mysqldb:
    image: mysql
    restart: unless-stopped
    environment:
      MYSQL_ROOT_USER: myuser
      MYSQL_ROOT_PASSWORD: mypassword
      MYSQL_DATABASE: mydb
    ports:
      - '3306:3306'
    networks:
      - app-connect
    volumes:
      - db-config:/etc/mysql
      - db-data:/var/lib/mysql
      - ./db/backup/files/:/data_backup/data
  app:
    build:
      context: .
      dockerfile: ./Dockerfile
    image: node-mysql-app
    depends_on:
      - mysqldb
    ports:
      - '3030:3030'
    networks:
      - app-connect
    stdin_open: true
    tty: true
volumes:
  db-config:
  db-data:
networks:
  app-connect:
    driver: bridge
Here's my app's Dockerfile:
FROM node:lts-alpine
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3030
ENV PORT 3030
ENV NODE_ENV docker
RUN npm run db:migrate:up
RUN npm run db:seeds:up
CMD [ "npm", "start" ]
And here's my default.db.json that the Sequelize migration uses (shortened):
{
  "development": {
  },
  "production": {
  },
  "docker": {
    "username": "myuser",
    "password": "mypassword",
    "database": "mydb",
    "host": "mysqldb",
    "port": "3306",
    "dialect": "mysql"
  }
}
Upon running compose up the DB installs fine and the image deploys, but when it reaches the RUN npm run db:migrate:up step (which translates into npx sequelize-cli db:migrate) I get the error:
npx: installed 81 in 13.108s
Sequelize CLI [Node: 14.17.0, CLI: 6.2.0, ORM: 6.6.2]
Loaded configuration file "default.db.json".
Using environment "docker".
ERROR: getaddrinfo EAI_AGAIN mysqldb
npm ERR! code ELIFECYCLE
npm ERR! errno 1
If I change the "host" in the default.db.json to "127.0.0.1", I get ERROR: connect ECONNREFUSED 127.0.0.1:3306 in place of the ERROR: getaddrinfo EAI_AGAIN mysqldb.
What am I doing wrong, and what host should I specify so the app can see the MySQL container? Should I remove the network? Should I change ports? (I tried combinations of both to no avail, so far.)
I solved my issue by using Docker Compose Wait. Essentially, it adds a wait loop that polls the DB container and, only when it's up, runs the migrations and seeds the DB.
My next problem was that those seeds ran every time the container was started. I solved that by instead running a script that runs the seeds and touches a semaphore file; if the file already exists, it skips the seeds.
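The answer does not show the script itself, so the following is only a rough sketch of the "seed once, then touch a semaphore file" idea in Node; the file path, script name, and the exact seed command are hypothetical and should match your own npm scripts:
// run-seeds-once.js (hypothetical name): apply seeds only if the marker file is absent.
const fs = require('fs');
const { execSync } = require('child_process');

// Put the marker somewhere that survives container restarts, e.g. on a mounted volume.
const MARKER = '/app/.seeds-done';

if (fs.existsSync(MARKER)) {
  console.log('Seeds already applied, skipping.');
} else {
  // Whatever your db:seeds:up script runs, e.g. sequelize-cli db:seed:all.
  execSync('npx sequelize-cli db:seed:all', { stdio: 'inherit' });
  fs.writeFileSync(MARKER, new Date().toISOString());
  console.log('Seeds applied, marker file written.');
}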
The following configuration worked for me. I am including the .env file and the Sequelize configuration, along with the MySQL database and the Docker setup. And finally, don't forget to run docker-compose up --build. Cheers 🎁 🎁 🎁
.env
DB_NAME="testdb"
DB_USER="root"
DB_PASS="root"
DB_HOST="mysql"
.sequelizerc (so we can use config.js rather than config.json for Sequelize):
const path = require('path');

module.exports = {
  'config': path.resolve('config', 'config.js')
}
config.js
require("dotenv").config();
module.exports = {
development: {
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
dialect: "mysql"
},
test: {
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
dialect: "mysql"
},
production: {
username: process.env.DB_USER,
password: process.env.DB_PASS,
database: process.env.DB_NAME,
host: process.env.DB_HOST,
dialect: "mysql"
}
}
Database connection with Sequelize:
import Sequelize from 'sequelize';
import dbConfig from './config/config';

const conf = dbConfig.development;

const sequelize = new Sequelize(
  conf.database,
  conf.username,
  conf.password,
  {
    host: conf.host,
    dialect: "mysql",
    operatorsAliases: 0,
    logging: 0
  }
);

sequelize.sync();

(async () => {
  try {
    await sequelize.authenticate();
    console.log("Database connection setup successfully!");
  } catch (error) {
    console.log("Unable to connect to the database", error);
  }
})();

export default sequelize;
global.sequelize = sequelize;
docker-compose.yaml
version: "3.8"
networks:
proxy:
name: proxy
services:
mysql:
image: mysql
networks:
- proxy
ports:
- 3306:3306
environment:
- MYSQL_ROOT_PASSWORD=root
- MYSQL_DATABASE=testdb
healthcheck:
test: "mysql -uroot -p$$MYSQL_ROOT_PASSWORD -e 'SHOW databases'"
interval: 10s
retries: 3
api:
build: ./node-backend
networks:
- proxy
ports:
- 3000:3000
depends_on:
mysql:
condition: service_healthy
Dockerfile
FROM node:16
WORKDIR /api
COPY . /api
RUN npm i
EXPOSE 3000
RUN chmod +x startup.sh
RUN npm i -g sequelize-cli
RUN npm i -g nodemon
ENTRYPOINT [ "./startup.sh" ]
startup.sh
#!/bin/bash
npm run migrate-db
npm run start
I found a really clean solution and wanted to share it. First of all, I used docker-compose, so if you are using only Docker, it might not help.
First things first, I created a Dockerfile which looks like this. I am using TypeScript, so if you are using JS you don't need to install TypeScript and build it!
FROM node:current-alpine
WORKDIR /app
COPY . ./
COPY .env.development ./.env
RUN npm install
RUN npm install -g typescript
RUN npm install -g sequelize-cli
RUN npm install -g nodemon
RUN npm run build
RUN rm -f .npmrc
RUN cp -R res/ dist/
RUN chmod 755 docker/entrypoint.sh
EXPOSE 8000
EXPOSE 3000
EXPOSE 9229
CMD ["sh", "-c","--","echo 'started';while true; do sleep 1000; done"]
Up to here it is standard. In order to do things in the right order, I need a docker-compose file and an entrypoint file. The entrypoint file is a script that runs when your container starts. Here is the docker-compose file:
version: '3'
services:
  app:
    build:
      context: ..
      dockerfile: docker/Dockerfile.development
    entrypoint: docker/development-entrypoint.sh
    ports:
      - 3000:3000
    env_file:
      - ../.env.development
    depends_on:
      - postgres
  postgres:
    image: postgres:alpine
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=test
    volumes:
      - ./docker_postgres_init.sql:/docker-entrypoint-initdb.d/docker_postgres_init.sql
As you can see, I am using PostgreSQL for the DB. My Dockerfile, docker-compose, and entrypoint files are in a folder called docker, which is why the paths start with docker; change this according to your file structure. The last and best part is the entrypoint file. It is really simple.
#!/bin/sh
echo "Starting get ready!!!"
sequelize db:migrate
nodemon ./dist/index.js
Of course, change the path of the index.js file according to your settings.
Hope it helps!
