I need to create the database before connecting to it and working with it. I'm using Nest.js with TypeORM and have provided all the configuration. When I start my application it says:
"Unable to connect to the database. Error: ER_BAD_DB_ERROR: Unknown database 'test'".
Once again: there is no database 'test' in my MySQL Workbench, so when I start the application
I want the application to create the database itself (not me creating it manually).
Is it possible?
I found a way to achieve this for PostgreSQL (I'm also using Nest.js and TypeORM). First, I created two SQL files (one to check whether the database exists and one to create it), and then a config file for the database. The contents of these files are below.
checkDbIfExists.sql
SELECT 1 FROM pg_database WHERE datname = 'test'
createDB.sql
CREATE DATABASE test
config.ts
import { DynamicModule } from '@nestjs/common';
import { TypeOrmModule, TypeOrmModuleOptions } from '@nestjs/typeorm';
import { createConnection, getManager } from 'typeorm';
import { PostgresConnectionOptions } from 'typeorm/driver/postgres/PostgresConnectionOptions';
import * as path from 'path';
import * as fs from 'fs';

const checkDBScript: string = fs
  .readFileSync(path.join(__dirname, '../script/checkDbIfExists.sql'))
  .toString();
const createDb: string = fs
  .readFileSync(path.join(__dirname, '../script/createDB.sql'))
  .toString();

const CreateDBIfNotExists = async (options: TypeOrmModuleOptions): Promise<void> => {
  const connection = await createConnection(options as PostgresConnectionOptions);
  const manager = getManager();
  const result = await manager.query(checkDBScript);
  if (result.length === 0) await manager.query(createDb);
  await connection.close();
};

const DBConfig = async (): Promise<DynamicModule> => {
  let options: TypeOrmModuleOptions = {
    type: 'postgres',
    host: 'localhost',
    port: 5432,
    username: 'postgres',
    password: 'asd',
    entities: [],
    synchronize: true,
    cli: {
      migrationsDir: 'persistence/migrations'
    }
  };
  await CreateDBIfNotExists(options);
  options = { ...options, database: 'test' };
  return TypeOrmModule.forRoot(options);
};

export default DBConfig;
Then I added these lines
nest-cli.json
"compilerOptions": {
  "assets": ["persistence/script/*"]
}
app.module.ts
@Module({
  imports: [DBConfig()],
  // ...
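For MySQL (the asker's setup), the same idea is simpler because MySQL supports CREATE DATABASE IF NOT EXISTS, so no separate existence-check script is needed. A minimal sketch, assuming the mysql2 driver and placeholder credentials; it would run before TypeOrmModule.forRoot just like CreateDBIfNotExists above:

```javascript
// Builds the create statement; dbName is assumed to come from trusted config,
// not user input, since identifiers cannot be parameterized.
const buildCreateDbSql = (dbName) =>
  'CREATE DATABASE IF NOT EXISTS `' + dbName + '`';

const createMysqlDbIfNotExists = async (dbName) => {
  // require lazily so this module can be loaded without mysql2 installed
  const mysql = require('mysql2/promise');
  // connect to the server only -- no `database` option, so a missing schema
  // cannot trigger ER_BAD_DB_ERROR here
  const connection = await mysql.createConnection({
    host: 'localhost',
    port: 3306,
    user: 'root',
    password: 'password', // placeholder credentials
  });
  await connection.query(buildCreateDbSql(dbName));
  await connection.end();
};
```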
NestJS can't create the database; you need to create it manually before starting your application.
If you want this to be automatic, you can use a Docker service that creates the database when you start your stack with docker-compose up:
version: "3.6"
services:
  db:
    image: mysql:8.0.20
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    ports:
      - 3306:3306
    environment:
      - MYSQL_DATABASE=<database-name>
      - MYSQL_ROOT_PASSWORD=<password>
I am using TypeORM version 0.3.11, so while M. Erim Tuzcuoglu's solution works for now, it's using deprecated methods.
Here is my function that is based on the newer DataSource approach.
export const createDBIfNotExists = async (): Promise<void> => {
  const dbOptions = dbConfig().db;
  const { createDatabase, database } = dbOptions;
  if (!createDatabase) {
    return;
  }
  // connect to the default "postgres" database, which is assumed to exist
  const dataSource = new DataSource({
    type: 'postgres',
    ...dbOptions,
    database: 'postgres',
  });
  await dataSource.initialize();
  const result = await dataSource.query(
    `SELECT 1 FROM pg_database WHERE datname = '${database}'`
  );
  if (!result.length) {
    console.log(`Creating database with name "${database}"`);
    await dataSource.query(`CREATE DATABASE "${database}"`);
  }
  await dataSource.destroy();
};
Specifying postgres as the database for the connection is a bit "hacky" because it assumes its existence; however, in my case that is acceptable, and without it the connection would fail.
I should also mention that the dbConfig function is very similar to the approach in the Nest documentation:
// Use process.env for the values; I've hardcoded them here just for clarity
export const dbConfig = (): { db: IDBConfig } => ({
  db: {
    host: 'localhost',
    port: 5432,
    database: 'your-db-name',
    username: 'username',
    password: 'password',
    migrationsRun: true,
    createDatabase: true,
    logging: true,
    synchronize: false,
  },
});
Then in my main.ts I just call the function:
async function bootstrap() {
  await createDBIfNotExists();
  // ...
}
This allows me to simply change the POSTGRES_DB env variable, and the app will create and use that database automatically on startup (while also checking the RUN_MIGRATION env variable).
It's quite specific to the requirements I had, but I hope the general implementation example helps someone in the future.
Cypress currently supports a MySQL connection without SSH, as seen in the link below:
https://docs.cypress.io/api/commands/task#Allows-a-single-argument-only
But I am trying to connect Cypress to MySQL through an SSH tunnel.
I am using the npm package mysql-ssh to establish the connection.
I am able to achieve this directly in Node.js, but I am facing issues implementing it through Cypress. Here's the snippet I tried in Node.js:
const mysqlssh = require('mysql-ssh');
const fs = require('fs');

mysqlssh.connect(
  {
    host: 'x.x.x.x',
    user: 'xyz',
    privateKey: fs.readFileSync('filePath') // this is the ssh key file path
  },
  {
    host: 'HOST_NAME',
    user: 'USER_NAME',
    password: 'xxxx',
    database: 'DB_NAME'
  }
)
  .then(client => {
    client.query('select * from TABLE_NAME', function (err, results, fields) {
      if (err) {
        console.log(err);
      }
      console.log(results);
      mysqlssh.close();
    });
  })
  .catch(err => {
    console.log(err);
  });
I want to do this either through the cypress/plugins/index.js file or directly in cypress/integration. Is there a simple way to do this?
I have found the solution. Here is my code for cypress/plugins/index.js file:
const dotenvPlugin = require('cypress-dotenv');
const mysqlssh = require('mysql-ssh');
const fs = require('fs');

module.exports = (on, config) => {
  // `config` is the resolved Cypress config
  config = dotenvPlugin(config);

  on('task', {
    executeSql(sql, ...args) {
      return new Promise(async (resolve, reject) => {
        try {
          const connection = await mysqlssh.connect(
            {
              host: process.env.SSH_HOST,
              user: process.env.SSH_USER,
              privateKey: fs.readFileSync(process.env.HOME + '/.ssh/id_rsa_old')
            },
            {
              host: process.env.MYSQL_HOST,
              user: process.env.MYSQL_USER,
              password: process.env.MYSQL_PASSWORD,
              database: process.env.MYSQL_DB
            }
          );
          const result = await connection.promise().query(sql, args);
          mysqlssh.close();
          resolve(result[0][0]);
        } catch (err) {
          reject(err);
        }
      });
    }
  });

  return config;
};
The connection has to be established in this file because Cypress tests don't run in the host's Node process, so we need to use Cypress tasks to run Node code. See the docs here - https://docs.cypress.io/api/commands/task#Examples
And in a test file example, I used it like so:
describe('Db Test', () => {
  it('Query Test', () => {
    cy.task('executeSql', 'SELECT count(id) as cnt FROM table_name').then(result => {
      expect(result.cnt, 'Does not equal to 8').to.equal(2000);
    });
  });
});
P.S. The additional cypress-dotenv package is just used to load env vars from the .env file.
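For reference, the .env file this setup expects would look something like the following (all values are placeholders matching the variable names used above):

```shell
# .env -- loaded by cypress-dotenv; values are placeholders
SSH_HOST=x.x.x.x
SSH_USER=xyz
MYSQL_HOST=HOST_NAME
MYSQL_USER=USER_NAME
MYSQL_PASSWORD=xxxx
MYSQL_DB=DB_NAME
```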
I have the following code in my db.js file:
const mysql2 = require('mysql2/promise');

// Connect to server (locally for development mode) -- new version
const pool = mysql2.createPool({
  host: "ENDPOINT",
  user: "admin",
  password: "PASSWORD",
  port: "3306",
  database: "DATABASE",
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0
});

module.exports = pool;
And in my app.js file I have:
const pool = require('./db');

async function checkUser(username) {
  const result = await pool.query('SELECT * from users WHERE username = ?', [username]);
  if (result[0].length < 1) {
    throw new Error('Row with this username was not found');
  }
  return result[0][0];
}

async function check() {
  let user = await checkUser("username");
  console.log(user);
}

check();
But I'm getting the error:
(node:42037) UnhandledPromiseRejectionWarning: TypeError: pool.query is not a function
This is weird, because when I run all the code inside the db.js file directly it works fine, so I probably messed up the export/require part of it. Please help!
ANSWER: HOW TO EXPORT MULTIPLE FUNCTIONS
In the document you are exporting, write as follows:
module.exports = {FunctionName1, FunctionName2, FunctionName3};
In the document you are importing to, write the following:
const {FunctionName1, FunctionName2, FunctionName3} = require('./whereyouareimportingfrom');
I had done the exports the wrong way: I tried exporting two things with "module.exports = ConnectDb, pool;", and that didn't work because the comma operator means only ConnectDb is actually exported. When I removed the first function and exported only "pool", it worked.
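The failure mode is worth spelling out: assignment binds tighter than the comma operator, so module.exports = ConnectDb, pool; parses as (module.exports = ConnectDb), pool; and pool is silently discarded. A small self-contained demonstration (the fakeModule object just stands in for a real CommonJS module):

```javascript
// Stand-ins for the real exports
const ConnectDb = () => 'connected';
const pool = { query: () => 'rows' };

// A plain object standing in for a real module
const fakeModule = { exports: {} };

// The broken form: comma operator, so only ConnectDb is assigned
fakeModule.exports = ConnectDb, pool;
console.log(fakeModule.exports === ConnectDb); // true -- pool was never exported

// The fix: export an object, then destructure at the require site
fakeModule.exports = { ConnectDb, pool };
const { pool: importedPool } = fakeModule.exports;
console.log(typeof importedPool.query); // 'function'
```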
I've been unable to find a documented way of connecting to multiple MySQL databases in Feathers.js using Sequelize. Is there a way to do this? My use case is to be able to insert and get rows of data into multiple DBs from the same action but the DBs won't necessarily be the same schema.
Thanks!
I did some local testing and it is possible. You need to define two different Sequelize clients.
If you are using the CLI generator and you set up a service based on Sequelize, you should already have (my example is a MySQL db):
a db connection string in the config/default.json
"mysql" : "mysql://user:password@localhost:3306/your_db"
a sequelize.js in the src root folder
In order to create a second sequelize client
create a new connection string in the config/default.json
"mysql2" : "mysql://user:password@localhost:3306/your_db_2"
create a copy of sequelize.js and name it sequelize2.js
const Sequelize = require('sequelize');

module.exports = function (app) {
  const connectionString = app.get('mysql2');
  const sequelize2 = new Sequelize(connectionString, {
    dialect: 'mysql',
    logging: false,
    operatorsAliases: false,
    define: {
      freezeTableName: true
    }
  });
  const oldSetup = app.setup;

  app.set('sequelizeClient2', sequelize2);

  app.setup = function (...args) {
    const result = oldSetup.apply(this, args);

    // Set up data relationships
    const models = sequelize2.models;
    Object.keys(models).forEach(name => {
      if ('associate' in models[name]) {
        models[name].associate(models);
      }
    });

    // Sync to the database
    sequelize2.sync();

    return result;
  };
};
add the new sequelize configuration to your app.js
const sequelize2 = require('./sequelize2');
app.configure(sequelize2);
Then, in your model for the second db:
const Sequelize = require('sequelize');
const DataTypes = Sequelize.DataTypes;

module.exports = function (app) {
  // load the second client you defined above
  const sequelizeClient = app.get('sequelizeClient2');

  // to check that it connects to a different db
  console.log(sequelizeClient);

  // your model
  const tbl = sequelizeClient.define('your_table', {
    text: {
      type: DataTypes.STRING,
      allowNull: false
    }
  }, {
    hooks: {
      beforeCount(options) {
        options.raw = true;
      }
    }
  });

  // eslint-disable-next-line no-unused-vars
  tbl.associate = function (models) {
    // Define associations here
    // See http://docs.sequelizejs.com/en/latest/docs/associations/
  };

  return tbl;
};
For this to work you need two different services, each one working with a different db.
If you want to put or get with a single action, you can create a before/after hook on one of the services and call the second service from inside the hook.
For a get, you need to merge the result of the second service into your hook result.
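As a hedged sketch of that hook idea (the service name 'things-db2' and result field names are assumptions, not Feathers conventions): an after hook on the first service that writes the same payload to the second service on create, and merges the second service's lookup into the result on get.

```javascript
// After hook for the service backed by the first database.
// 'things-db2' is assumed to be a service registered with sequelizeClient2.
const mirrorToSecondDb = () => async context => {
  const second = context.app.service('things-db2');

  if (context.method === 'create') {
    // insert the same data into the second database
    context.result.mirror = await second.create(context.data);
  }

  if (context.method === 'get') {
    // attach the matching row from the second database to the response
    context.result.fromSecondDb = await second.get(context.id);
  }

  return context;
};

module.exports = mirrorToSecondDb;
```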
Here's a working example of AWS Lambda and MySQL, but I'd like it to work with Sequelize. How do I initialize Sequelize to work with AWS Lambda? I have the authenticated IAM role working too.
https://dzone.com/articles/passwordless-database-authentication-for-aws-lambd
'use strict';

const mysql = require('mysql2');
const AWS = require('aws-sdk');

// TODO use the details of your database connection
const region = 'eu-west-1';
const dbPort = 3306;
const dbUsername = 'lambda'; // the name of the database user you created in step 2
const dbName = 'lambda_test'; // the name of the database your database user is granted access to
const dbEndpoint = 'lambdatest-cluster-1.cluster-c8o7oze6xoxs.eu-west-1.rds.amazonaws.com';

module.exports.handler = (event, context, cb) => {
  var signer = new AWS.RDS.Signer();
  signer.getAuthToken({ // uses the IAM role access keys to create an authentication token
    region: region,
    hostname: dbEndpoint,
    port: dbPort,
    username: dbUsername
  }, function (err, token) {
    if (err) {
      console.log(`could not get auth token: ${err}`);
      cb(err);
    } else {
      var connection = mysql.createConnection({
        host: dbEndpoint,
        port: dbPort,
        user: dbUsername,
        password: token,
        database: dbName,
        ssl: 'Amazon RDS',
        authSwitchHandler: function (data, cb) { // modifies the authentication handler
          if (data.pluginName === 'mysql_clear_password') { // authentication token is sent in clear text but connection uses SSL encryption
            cb(null, Buffer.from(token + '\0'));
          }
        }
      });
      connection.connect();
      // TODO replace with your SQL query
      connection.query('SELECT * FROM lambda_test.test', function (err, results, fields) {
        connection.end();
        if (err) {
          console.log(`could not execute query: ${err}`);
          cb(err);
        } else {
          cb(undefined, results);
        }
      });
    }
  });
};
Instead of using mysql.createConnection(), initialize Sequelize with your RDS Signer token:
var sequelize = require('sequelize');

const Sequelize = new sequelize(
  process.env.database_name,
  process.env.database_user,
  token,
  {
    dialect: 'mysql',
    dialectOptions: {
      ssl: 'Amazon RDS',
      authPlugins: { // authSwitchHandler is deprecated
        mysql_clear_password: () => () => {
          return token;
        }
      }
    },
    host: process.env.db_proxy_endpoint,
    port: process.env.db_port,
    pool: {
      min: 0, // default
      max: 5, // default
      idle: 3600000
    },
    define: {
      charset: 'utf8mb4'
    }
  }
);

// then return your models (defined in separate files usually)
await Sequelize.authenticate(); // this just does a SELECT 1+1 AS result;
await Sequelize.sync(); // DO NOT use this in production: it tries to create the tables defined by your models. Consider using sequelize migrations instead of sync()
Also, it's a good idea to keep your database connection parameters in environment variables (process.env) rather than in the code, so no one can see them.
We are working with Sequelize and Lambda, but you will need to reserve more resources: in our case we needed at least 1 GB to run a Lambda with Sequelize, whereas with plain mysql2 it runs with just 128 MB.
But if you really want to use Sequelize, just replace your createConnection with something like what you will find in the Sequelize docs.
You will probably also need context.callbackWaitsForEmptyEventLoop = false, because otherwise you may call the callback and get nothing back: the open connection pool means your event loop will probably never be empty.
I am trying to create a MySQL database connection for my Node app, and I ran sequelize.authenticate().then(function(errors) { console.log(errors) }); to test whether the connection worked. The response logged to my console is:
Executing (default): SELECT 1+1 AS result
undefined
The undefined portion makes me think that the connection either didn't work or that no database was found, but I can't figure it out. I thought that by creating a database through Sequel Pro and connecting to localhost via the socket, I could use the same credentials to connect from my Node app. Do I need to create a file within my app for the database instead of using the Sequel Pro database?
Controller file (where the connection is created):
var Sequelize = require('sequelize');
var sequelize = new Sequelize('synotate', 'root', '', {
  host: 'localhost',
  port: '3306',
  dialect: 'mysql'
});
sequelize.authenticate().then(function(errors) { console.log(errors) });
var db = {}
db.Annotation = sequelize.import(__dirname + "/ann-model");
db.sequelize = sequelize;
db.Sequelize = Sequelize;
module.exports = db;
ann-model.js (where my table is being defined):
module.exports = function(sequelize, DataTypes) {
  var Ann = sequelize.define('annotations', {
    ann_id: {
      type: DataTypes.INTEGER,
      primaryKey: true
    },
    ann_date: DataTypes.DATE,
  }, {
    freezeTableName: true
  });
  return Ann;
}
Try this one. Executing (default): SELECT 1+1 AS result means that everything is okay:
sequelize
  .authenticate()
  .then(function(err) {
    if (!!err) {
      console.log('Unable to connect to the database:', err);
    } else {
      console.log('Connection has been established successfully.');
    }
  });
As for the undefined you logged: authenticate() resolves with no value on success, so console.log(errors) prints undefined. It does not indicate a failure.
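Since authenticate() resolves with no value on success and rejects on failure, an async/await version makes the outcome explicit. A sketch, assuming the sequelize instance is configured as in the question:

```javascript
// Returns true when the connection works, false otherwise.
// authenticate() resolves with undefined on success and rejects on failure,
// which is why logging its resolved value prints "undefined".
async function testConnection(sequelize) {
  try {
    await sequelize.authenticate();
    console.log('Connection has been established successfully.');
    return true;
  } catch (err) {
    console.log('Unable to connect to the database:', err.message);
    return false;
  }
}
```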