Redis connection to my-redis:6379 failed - getaddrinfo ENOTFOUND when running seeds - mysql

I am using Docker for the container services.
I created a seed file and ran it with npx sequelize-cli db:seed:all, and then this error occurred:
Sequelize CLI [Node: 13.12.0, CLI: 6.2.0, ORM: 6.5.1]
Loaded configuration file "migrations/config.js".
Using environment "development".
events.js:292
throw er; // Unhandled 'error' event
^
Error: Redis connection to my-redis:6379 failed - getaddrinfo ENOTFOUND my-redis
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:66:26)
Emitted 'error' event on RedisClient instance at:
at RedisClient.on_error (/Users/CCCC/Desktop/Source Tree/my-server/node_modules/redis/index.js:342:14)
at Socket.<anonymous> (/Users/CCCC/Desktop/Source Tree/my-server/node_modules/redis/index.js:223:14)
at Socket.emit (events.js:315:20)
at Socket.EventEmitter.emit (domain.js:485:12)
at emitErrorNT (internal/streams/destroy.js:84:8)
at processTicksAndRejections (internal/process/task_queues.js:84:21) {
errno: -3008,
code: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'my-redis'
}
This seems to say that Redis is not found / not running on port 6379.
But when I run docker ps, it shows my-redis running on port 6379:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
...
f637ee218d03 redis:6 "docker-entrypoint.s…" 18 minutes ago Up 18 minutes 0.0.0.0:6379->6379/tcp my-server_my-redis_1
docker-compose.yml
version: '2.1'
services:
  my-db:
    image: mysql:5.7
    ...
    ports:
      - 3306:3306
  my-redis:
    image: redis:6
    ports:
      - 6379:6379
  my-web:
    restart: always
    environment:
      - NODE_ENV=dev
      - PORT=3030
    build: .
    command: >
      sh -c "npm install && ./wait-for-db-redis.sh my-db my-redis npm run dev"
    ports:
      - "3030:3030"
    volumes:
      - ./:/server
    depends_on:
      - my-db
      - my-redis
.sequelizerc
const path = require('path');

module.exports = {
  'config': path.resolve('migrations/config.js'),
  'seeders-path': path.resolve('migrations/seeders'),
  'models-path': path.resolve('migrations/models.js')
};
migrations/model.js
const Sequelize = require('sequelize');
const app = require('../src/app');

const sequelize = app.get('sequelizeClient');
const models = sequelize.models;

module.exports = Object.assign({
  Sequelize,
  sequelize
}, models);
config.js
const app = require('../src/app');

const env = process.env.NODE_ENV || 'development';
const dialect = 'mysql';

module.exports = {
  [env]: {
    dialect,
    url: app.get(dialect),
    migrationStorageTableName: '_migrations'
  }
};

Are you running the migration inside your app's Docker Compose container, or on the Docker host machine?
From the host machine's point of view, there is no such hostname as my-redis; that name only exists on the Docker network that Compose creates for the containers in the project.
Since you've published Redis's port 6379 to your host (and in fact to the whole wide world), you'd use localhost:6379 on the host machine.
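One hedged way to act on this (a sketch only; it assumes the redis npm package shown in the stack trace and an environment variable name of my choosing, REDIS_HOST) is to make the Redis host configurable, so code running inside the my-web container can use my-redis while sequelize-cli run on the host falls back to localhost:

// redis-client.js (sketch): REDIS_HOST/REDIS_PORT are assumed names; set
// REDIS_HOST=my-redis in docker-compose.yml and leave it unset on the host.
const redis = require('redis');

const client = redis.createClient({
  host: process.env.REDIS_HOST || 'localhost', // 'my-redis' inside Compose, 'localhost' on the host
  port: Number(process.env.REDIS_PORT) || 6379
});

client.on('error', (err) => console.error('Redis error:', err));

module.exports = client;

Alternatively, running the seed command inside the my-web container (docker-compose exec my-web npx sequelize-cli db:seed:all) keeps the my-redis hostname resolvable without any code change.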

Related

ECONNREFUSED 3306 in Node.js connect to MySQL Container using Docker-Compose [duplicate]

Before you flag this question as a duplicate, please note that I did read other answers, but they didn't solve my problem.
I have a Docker compose file consisting of two services:
version: "3"
services:
mysql:
image: mysql:5.7
environment:
MYSQL_HOST: localhost
MYSQL_DATABASE: mydb
MYSQL_USER: mysql
MYSQL_PASSWORD: 1234
MYSQL_ROOT_PASSWORD: root
ports:
- "3307:3306"
expose:
- 3307
volumes:
- /var/lib/mysql
- ./mysql/migrations:/docker-entrypoint-initdb.d
restart: unless-stopped
web:
build:
context: .
dockerfile: web/Dockerfile
volumes:
- ./:/web
ports:
- "3000:3000"
environment:
NODE_ENV: development
PORT: 3000
links:
- mysql:mysql
depends_on:
- mysql
expose:
- 3000
command: ["./wait-for-it.sh", "mysql:3307"]
/web/Dockerfile:
FROM node:6.11.1
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install
COPY . /usr/src/app
CMD [ "npm", "start" ]
After docker-compose up --build the services start up; however, the wait-for-it.sh script times out while waiting for MySQL to start (so, temporarily, I am not using it when testing DB connectivity; I just wait until the console shows that MySQL is ready to accept incoming connections).
When MySQL is running, from the host machine I can log in using Sequel Pro, query the DB, and get the sample records from ./mysql/migrations.
I can also SSH into the running MySQL container and do the same.
However, my Node.js app yields ECONNREFUSED 127.0.0.1:3307 when connecting
MySQL init:
import * as mysql from 'promise-mysql'

const config = {
  host: 'localhost',
  database: 'mydb',
  port: '3307',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}

export let db = mysql.createPool(config);
MySQL query:
import { db } from '../db/client'

export let get = () => {
  return db.query('SELECT * FROM users', [])
    .then((results) => {
      return results
    })
    .catch((e) => {
      return Promise.reject(e)
    })
}
Route invoked when hitting url /
import { Router } from 'express';
import * as repository from '../repository'

export let router = Router();

router.get('/', async (req, res) => {
  let users;
  try {
    users = await repository.users.get();
  } catch (e) {
    // ECONNREFUSED 127.0.0.1:3307
  }
  res.render('index', {
    users: users
  });
});
It's unlikely to be a race condition, because at the same time that Node.js fails I can query using Sequel Pro, or SSH into the running Docker container and query. So it's probably a case of Node.js not being able to access the MySQL container?
{
  error: connect ECONNREFUSED 127.0.0.1:3307
  code: 'ECONNREFUSED',
  errno: 'ECONNREFUSED',
  syscall: 'connect',
  address: '127.0.0.1',
  port: 3307,
  fatal: true
}
This:
mysql:
  image: mysql:5.7
  environment:
    ...
  ports:
    - "3307:3306"
Means that Docker maps port 3307 on the host to port 3306 in the container, so from Sequel Pro you can connect to localhost:3307.
However, it does not mean that the container is listening on 3307; the container is in fact still listening on 3306. When other containers resolve the mysql DNS name, it translates to the container's internal IP, so they must connect to 3306.
So your node config should look like:
const config = {
  host: 'mysql',
  database: 'mydb',
  port: '3306',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}
And this in your docker-compose.yml:
command: ["./wait-for-it.sh", "mysql:3306"]
Note: the wait-for-it.sh script comes from https://github.com/vishnubob/wait-for-it.
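If the same client code also needs to run directly on the host (where the port mapping makes MySQL reachable at localhost:3307), one hedged option, sketched here with assumed variable names MYSQL_HOST and MYSQL_PORT, is to read the host and port from the environment:

// db/client.js (sketch, not the poster's actual file): MYSQL_HOST and MYSQL_PORT
// are assumed env var names, set per environment (e.g. in docker-compose.yml).
import * as mysql from 'promise-mysql'

const config = {
  host: process.env.MYSQL_HOST || 'localhost',  // 'mysql' inside Compose
  port: Number(process.env.MYSQL_PORT) || 3307, // 3306 inside Compose, 3307 from the host
  database: 'mydb',
  user: 'mysql',
  password: '1234',
  connectionLimit: 10
}

export let db = mysql.createPool(config);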

Docker(docker-compose) node + MySQL: ECONNREFUSED

I have a docker-compose.yml file that runs three containers. My problem is that when I start my containers, my Node (api) container seems to start before my MySQL container, even though I declare depends_on in my docker-compose.yml, giving me the following error:
error connecting: Error: connect ECONNREFUSED xxx.xx.x.x:3306
at TCPConnectWrap.afterConnect [as oncomplete]
After I get this error I can see in my console that the MySQL container is only just starting. My database is fine; I can access it without any problems. If I make a change in my Node.js code, my Node server restarts, and once it is up again I don't have any connection problems, because by then the MySQL container is already up.
I even tried solutions such as wait-for-it.sh (https://github.com/vishnubob/wait-for-it/blob/master/wait-for-it.sh), but the result was the same: my Node backend tries to make a MySQL connection, but the MySQL container is not ready.
This is my docker-compose.yml
version: "3"
services:
mysql:
image: my_mysql
build: ./db
restart: always
container_name: my_mysql
volumes:
- /var/lib/mysql
- ./db:/db
ports:
- "3307:3306"
environment:
- MYSQL_ROOT_PASSWORD=x
- MYSQL_USER=x
- MYSQL_PASSWORD=x
- MYSQL_DATABASE=x
networks:
- my_network
command: --default-authentication-plugin=mysql_native_password
api:
container_name: my_api
build: ./api
restart: always
ports:
- "9000:9000"
environment:
DB_HOSTNAME: mysql
working_dir: /api
volumes:
- ./api:/api
depends_on:
- mysql
networks:
- my_network
client:
container_name: my_client
image: mhart/alpine-node:12
build: ./client
restart: always
ports:
- "3000:3000"
working_dir: /client
volumes:
- ./client:/client
entrypoint: ["npm", "start"]
depends_on:
- api
networks:
- my_network
networks:
my_network:
driver: bridge
Dockerfile for my nodejs backend:
FROM mhart/alpine-node:12
WORKDIR /api
COPY package*.json /api/
RUN npm i -g nodemon
RUN npm install
COPY . /api/
EXPOSE 9000
CMD ["npm", "run", "dev"]
Dockerfile for my react front:
FROM mhart/alpine-node:12
WORKDIR /client
COPY package*.json /client/
RUN npm install
COPY . /client/
EXPOSE 3000
CMD ["npm", "start"]
and Dockerfile for mysql:
FROM mysql:8.0.19
Calling mysql connection in nodejs:
const config = require('config');
const express = require('express');
const router = express.Router();
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: config.get('mysql.config.host'),
  user: config.get('mysql.config.user'),
  password: config.get('mysql.config.password'),
  database: config.get('mysql.config.database'),
  port: config.get('mysql.config.port')
});

connection.connect(function(err) {
  if (err) {
    console.error('error connecting: ' + err.stack);
    return;
  }
  console.log('connected as id ' + connection.threadId);
});

router.get("/", function(req, res, next) {
  connection.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
    if (error) {
      throw error;
    }
    res.send(`MySQL OK: ${results[0].solution}`);
  });
});

module.exports = router;
Thanks for any help.
I could not test it, but Docker's recommendation is to write a script that waits until the other container accepts connections. From the documentation:
Alternatively, write your own wrapper script to perform a more application-specific health check. For example, you might want to wait until Postgres is definitely ready to accept commands:
#!/bin/sh
# wait-for-postgres.sh

set -e

host="$1"
shift
cmd="$@"

until PGPASSWORD=$POSTGRES_PASSWORD psql -h "$host" -U "postgres" -c '\q'; do
  >&2 echo "Postgres is unavailable - sleeping"
  sleep 1
done

>&2 echo "Postgres is up - executing command"
exec $cmd
You could check the documentation here:
Control startup and shutdown order in Compose
Hope this helps.
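As an in-application alternative to a shell wrapper (a sketch only, reusing the mysql package from the question; the retry count and delay are arbitrary), the API could retry the initial connection until MySQL is ready:

// db-connect.js (sketch): retry the MySQL connection instead of, or in addition
// to, a wait script. Names and limits here are illustrative, not from the question.
const mysql = require('mysql');

function connectWithRetry(config, onReady, retriesLeft = 30, delayMs = 2000) {
  const connection = mysql.createConnection(config);
  connection.connect((err) => {
    if (err) {
      console.error(`MySQL not ready (${err.code}), retries left: ${retriesLeft}`);
      connection.destroy();
      if (retriesLeft > 0) {
        setTimeout(() => connectWithRetry(config, onReady, retriesLeft - 1, delayMs), delayMs);
      }
      return;
    }
    onReady(connection);
  });
}

module.exports = connectWithRetry;

Usage would be something like connectWithRetry({ host: process.env.DB_HOSTNAME, ... }, (connection) => { /* register routes that use connection */ }), with DB_HOSTNAME coming from the compose environment shown above.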

ECONNREFUSED when trying to connect NodeJS app to MySQL image via docker-compose

I have a project that uses Node.js as a server (with Express) and MySQL for the database, and I use Docker to run them together. The project also includes a React client (I have a client folder for the React app and a server folder for the Node.js app); I have tested communication between the server and the client and it works. Here is the code that pertains to the server and mysql services:
docker-compose.yml
mysql:
  image: mysql:5.7
  environment:
    MYSQL_HOST: localhost
    MYSQL_DATABASE: sampledb
    MYSQL_USER: gfcf14
    MYSQL_PASSWORD: xxxx
    MYSQL_ROOT_PASSWORD: root
  ports:
    - 3307:3306
  restart: unless-stopped
  volumes:
    - /var/lib/mysql
    - ./db/greendream.sql:/docker-entrypoint-initdb.d/greendream.sql
.
.
.
server:
  build: ./server
  depends_on:
    - mysql
  expose:
    - 8000
  environment:
    API_HOST: "http://localhost:3000/"
    APP_SERVER_PORT: 8000
  ports:
    - 8000:8000
  volumes:
    - ./server:/app
  links:
    - mysql
  command: yarn start
Then there is the Dockerfile for the server:
FROM node:10-alpine
RUN mkdir -p /app
WORKDIR /app
COPY package.json /app
COPY yarn.lock /app
RUN yarn install
COPY . /app
CMD ["yarn", "start"]
In the server's package.json, the script start is simply this: "start": "nodemon index.js"
And the file index.js that gets executed is this:
const express = require('express');
const cors = require('cors');
const mysql = require('mysql');

const app = express();

const con = mysql.createConnection({
  host: 'localhost',
  user: 'gfcf14',
  password: 'xxxx',
  database: 'sampledb',
});

app.use(cors());

app.listen(8000, () => {
  console.log('App server now listening on port 8000');
});

app.get('/test', (req, res) => {
  con.connect(err => {
    if (err) {
      res.send(err);
    } else {
      res.send(req.query);
    }
  })
});
So all I want to do for now is confirm that a connection takes place. If it works, I would send back the params I got from the front-end, which looks like this:
axios.get('http://localhost:8000/test', {
  params: {
    test: 'hi',
  },
}).then((response) => {
  console.log(response.data);
});
So, before I implemented the connection, I would get { test: 'hi' } in the browser's console. I expect to get that as soon as the connection is successful, but what I get instead is this:
{
  address: "127.0.0.1"
  code: "ECONNREFUSED"
  errno: "ECONNREFUSED"
  fatal: true
  port: 3306
  syscall: "connect"
  __proto__: Object
}
I thought that maybe I had the wrong privileges, so I also tried using root as both user and password, but I get the same result. Weirdly enough, if I refresh the page I don't get an ECONNREFUSED but a PROTOCOL_ENQUEUE_AFTER_FATAL_ERROR (with fatal: false). Why would this happen if I am using the right credentials? Please let me know if you have spotted something I may have missed.
In your mysql.createConnection call you need to provide the MySQL host, and that host is not localhost, because MySQL runs in its own container with its own IP. The best way to handle this is to externalize the MySQL host and let Docker Compose resolve the mysql service name (in your case, mysql) to the container's internal IP. Your Node.js server will then connect to the internal IP of the mysql container.
Externalize the mysql host in nodejs server:
const con = mysql.createConnection({
  host: process.env.MYSQL_HOST_IP,
  ...
});
Add this to the server service in your docker-compose.yml:
environment:
  MYSQL_HOST_IP: mysql  # the name of the mysql service in your docker-compose.yml, which resolves to the internal IP of the mysql container
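Putting the two pieces together, a minimal sketch (using the credentials from the question; the localhost fallback is my own assumption for running outside Compose):

// server/index.js (sketch): the host comes from the environment so it resolves
// to the mysql container inside Compose and falls back to localhost outside it.
const mysql = require('mysql');

const con = mysql.createConnection({
  host: process.env.MYSQL_HOST_IP || 'localhost',
  user: 'gfcf14',
  password: 'xxxx',
  database: 'sampledb',
});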

docker-compose: nodejs + mysql can't connect mysql

I'm trying to dockerize my own Node application, but it can't connect to the MySQL container. Here is my code:
docker-compose.yml
version: '3.2'
services:
  node:
    build: ./
    ports:
      - "8787:8787"
    depends_on:
      - db
    networks:
      - docker_xxx
    environment:
      - PORT=8787
      - DATABASE_HOST=db
      - DATABASE_PASSWORD=xxx
      - EGG_SERVER_ENV=local
      - NODE_ENV=development
    # command: ["./wait-for-it.sh", "db:3306", "--", "npm run docker"]
  db:
    build: ./db
    networks:
      - docker_xxx
    environment:
      - MYSQL_ROOT_PASSWORD=xxx
      - MYSQL_DATABASE=database
      - MYSQL_USER=user
      - MYSQL_PASSWORD=passwd
networks:
  docker_xxx:
    driver: bridge
./Dockerfile (for nodejs)
FROM node:8.9.4-alpine
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app
COPY package.json /usr/src/app/
RUN npm install --production
COPY . /usr/src/app
# COPY wait-for-it.sh /usr/src/app
EXPOSE 8787
CMD npm run docker
db/Dockerfile (for mysql)
FROM mysql:5.6
ADD honggang.sql /docker-entrypoint-initdb.d
config/config.default.js
config.mysql = {
  // mysql settings
  client: {
    // host
    host: process.env.DATABASE_HOST || '127.0.0.1',
    // port
    port: '3306',
    // username
    user: 'root',
    // password
    password: 'xxx',
    database: 'xxx',
    charset: 'utf8',
    dialectOptions: {
      collate: 'utf8_general_ci',
    },
  },
  app: true,
  agent: false,
};
When I run docker-compose up -d, only the db container keeps running.
I run docker logs <hash> to find the errors, and it shows the following:
2018-04-30 14:43:51,334 ERROR 54 nodejs.ECONNREFUSEDError: connect ECONNREFUSED 172.19.0.2:3306
at Object._errnoException (util.js:1022:11)
at _exceptionWithHostPort (util.js:1044:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1182:14)
--------------------
at Protocol._enqueue (/usr/src/app/node_modules/mysql/lib/protocol/Protocol.js:145:48)
at Protocol.handshake (/usr/src/app/node_modules/mysql/lib/protocol/Protocol.js:52:23)
at PoolConnection.connect (/usr/src/app/node_modules/mysql/lib/Connection.js:130:18)
at Pool.getConnection (/usr/src/app/node_modules/mysql/lib/Pool.js:48:16)
at /usr/src/app/node_modules/ali-rds/node_modules/pify/index.js:29:7
at new Promise (<anonymous>)
at Pool.<anonymous> (/usr/src/app/node_modules/ali-rds/node_modules/pify/index.js:12:10)
at Pool.ret [as getConnection] (/usr/src/app/node_modules/ali-rds/node_modules/pify/index.js:56:34)
at Pool.query (/usr/src/app/node_modules/mysql/lib/Pool.js:202:8)
at /usr/src/app/node_modules/ali-rds/node_modules/pify/index.js:29:7
sql: select now() as currentTime;
code: 'ECONNREFUSED'
errno: 'ECONNREFUSED'
syscall: 'connect'
address: '172.19.0.2'
port: 3306
fatal: true
name: 'ECONNREFUSEDError'
pid: 54
hostname: d9cd95667a5d
I added CMD ping db to check connectivity, and it responded.
I tried to use wait-for-it.sh (the command commented out above), but got this error:
env: can't execute 'bash': No such file or directory
I solved it; see https://github.com/ycjcl868/eggjs-mysql-docker.
These are the key points:
1. apk add --no-cache bash, so that wait-for-it.sh can run and wait until the MySQL server is ready (the Alpine Node image does not ship with bash).
2. The hostname should be 0.0.0.0, not localhost/127.0.0.1 (inside a container, a server bound to 127.0.0.1 is not reachable through a published port).
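A minimal sketch of point 2, assuming a plain Express server rather than the egg.js setup used in the linked repo (the port comes from the PORT variable defined in the compose file):

// server.js (sketch): bind to 0.0.0.0 so the port published by Docker
// (8787:8787) is reachable from outside the container.
const express = require('express');

const app = express();
const port = Number(process.env.PORT) || 8787;

app.get('/health', (req, res) => res.send('ok')); // illustrative route

app.listen(port, '0.0.0.0', () => {
  console.log(`Listening on 0.0.0.0:${port}`);
});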
