Node.js development and production environment config file loading - JSON

Based on this answer here https://stackoverflow.com/a/22524056/777700 I have set exactly the same configuration options, but it doesn't work.
My (partial) app.js file:
console.log('environment: '+process.env.NODE_ENV);
const config = require('./config/db.json')[process.env.NODE_ENV || "development"];
console.log(config);
My ./config/db.json file:
{
  "development": {
    "host": "localhost",
    "port": "3306",
    "username": "root",
    "password": "",
    "database": "dbname"
  },
  "production": {
    "host": "production-host",
    "port": "3306",
    "username": "user",
    "password": "pwd",
    "database": "dbname"
  }
}
Console.log outputs:
environment: development
undefined
and the app crashes. Any idea why? The file is there: if I remove the [...] part of the require(), it does print out the contents of db.json; with it, it prints undefined.
EDIT
I tried adding console.log(typeof config) just after the require() to see what I'm getting, and I noticed that require('./config/db.json')[process.env.NODE_ENV] gives undefined, while require('./config/db.json')["development"] returns the proper object.
Versions:
nodeJS 6.11.4
express 4.16.2

After more debugging and searching online, I finally found the solution. The problem is that I'm on a Windows machine and I was using the npm run dev command, while my "dev" script looked like SET NODE_ENV=development && nodemon server.js.
An experienced eye will notice the space before &&, which appended a trailing space to the variable value, so the value I was comparing against was "development " and not "development" as I was thinking.
So, the original answer from the other question does work, and it does load the proper config!
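For reference, the fixed "dev" script just drops the space before && (on Windows, SET includes everything up to && in the value):
"scripts": {
  "dev": "SET NODE_ENV=development&& nodemon server.js"
}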

You should export configuration as a variable:
const config = {
  "development": {
    "host": "localhost",
    "port": "3306",
    "username": "root",
    "password": "",
    "database": "dbname"
  },
  "production": {
    "host": "production-host",
    "port": "3306",
    "username": "user",
    "password": "pwd",
    "database": "dbname"
  }
};
module.exports = config;
This way it will be found :)
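A minimal usage sketch, assuming the snippet above is saved as ./config/db.js:
const config = require('./config/db')[process.env.NODE_ENV || "development"];
console.log(config.host); // "localhost" when NODE_ENV is unset or "development"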

If you want to do it via JSON:
const fs = require('fs')

let localConfig
try {
  localConfig = JSON.parse(fs.readFileSync('./config/db.json', 'utf-8'))
} catch (e) {
  console.log('Could not parse local config.')
  localConfig = false
}
module.exports = localConfig
You could then add logic for production: if there's no local configuration, localConfig will be false and you can fall back to environment variables injected at that point.
Update:
I see that you're providing the production config yourself; in that case you can just access the key you need based on the environment. Just import localConfig and use the keys you need.
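A minimal sketch of that production fallback, assuming the module above is saved as ./config/db.js and that the injected variable names below are placeholders for whatever your host actually provides:
const localConfig = require('./config/db'); // false when the file could not be parsed

module.exports = localConfig || {
  host: process.env.DB_HOST,         // placeholder variable names,
  port: process.env.DB_PORT,         // adjust to what your host injects
  username: process.env.DB_USERNAME,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME
};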

It's better to use the dotenv package for this:
npm i dotenv
Step 1: In package.json add this
"scripts": {
"start": "nodemon app.js",
"dev": "NODE_ENV=dev nodemon app.js"
"prod": "NODE_ENV=prod nodemon app.js"
},
Step 2: Add .env.prod and .env.dev files
.env.dev
PORT=7200
# Set your database/API connection information here
DB_URI=localhost
DB_USERNAME=root
DB_PASSWORD=password
DB_DEFAULT=dbName
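.env.prod follows the same shape; the values below are placeholders for your production credentials:
.env.prod
PORT=7200
DB_URI=production-host
DB_USERNAME=user
DB_PASSWORD=pwd
DB_DEFAULT=dbName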
Step 3: Add this in config.js
const dotenv = require('dotenv').config({ path: `.env.${process.env.NODE_ENV}` });
const result = dotenv;
if (result.error) {
  throw result.error;
}
const { parsed: envs } = result;
// console.log(envs);
module.exports = envs;
Step 4: Use like this when needed
const {
  DB_URI, DB_USERNAME, DB_PASSWORD, DB_DEFAULT,
} = require('../config');
Now, if you want development, run
npm run dev
For prod, use
npm run prod
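One caveat, echoing the Windows issue above: inline NODE_ENV=dev assignments only work in Unix shells. On Windows you'd use SET, or the cross-env package (an extra dependency, not part of the original answer) to keep the scripts portable:
"scripts": {
  "dev": "cross-env NODE_ENV=dev nodemon app.js",
  "prod": "cross-env NODE_ENV=prod nodemon app.js"
}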


Can we change the cypress.env.json file name

I want to use a different environment variable file for the prod environment and for non-prod environments. Currently I'm maintaining a single file for all environments, and going forward each env file's content will differ according to the environment. Hence, is there a possibility to rename the file according to the environment and pass it at run time, or to define the respective env file in a configuration file (cypress.json)?
Sample env file names:
cypress.env.nonprod.json
cypress.env.prod.json
You can add scripts to your package.json file:
{
  "scripts": {
    "setEnvDev": "cp cypress.env.dev.json cypress.env.json",
    "setEnvStaging": "cp cypress.env.staging.json cypress.env.json",
    "setEnvProduction": "cp cypress.env.production.json cypress.env.json"
  }
}
And run
$ npm run setEnvDev
$ npx cypress run
You can create Cypress configuration files in your project root. For example, if you have three config files:
production.json
staging.json
dev.json
Then depending on what configuration file you want to use, you can directly run the command:
npx cypress run --config-file staging.json
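Each of those files is an ordinary Cypress config file; a minimal staging.json might look like this (the URLs are placeholders):
{
  "baseUrl": "https://staging.example.com",
  "env": {
    "apiUrl": "https://staging-api.example.com"
  }
}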
Set up cypress.env.nonprod.json and cypress.env.prod.json for environment-specific variables, and cypress.env.json for common variables.
In cypress/support/index.js add:
const env = Cypress.env() // configured env from common cypress.env.json + command line
if (env.nonprod) { // has nonprod environment been set?
  // require needs a path relative to cypress/support
  const addEnv = require('../../cypress.env.nonprod.json')
  const merged = { ...env, ...addEnv }
  Cypress.env(merged)
}
if (env.prod) { // has prod environment been set?
  const addEnv = require('../../cypress.env.prod.json')
  const merged = { ...env, ...addEnv }
  Cypress.env(merged)
}
Run from command line (or set up script)
npx cypress run --env nonprod=true
This allows "stacking" where multiple files can be merged
npx cypress run --env nonprod=true,prod=true
Using cypress/plugins/index.js to merge a named environment variable file:
// plugins/index.js
module.exports = (on, config) => {
  if (config.env.environment) {
    // require needs a path relative to cypress/plugins
    const envars = require(`../../cypress.env.${config.env.environment}.json`)
    config.env = {
      ...config.env,
      ...envars
    }
  }
  return config
}
To add cypress.env.nonprod.json to the base config (cypress.json):
npx cypress run --env environment=nonprod
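To avoid typing the flag each time, this could be wrapped in npm scripts (the script names here are my own suggestion):
"scripts": {
  "cy:nonprod": "cypress run --env environment=nonprod",
  "cy:prod": "cypress run --env environment=prod"
}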

composer in a gulp build and deployment

How do you get Composer to install dependencies correctly when using a gulp build?
I have a build process set up that outputs to a set location, either ../sites/www/public_html or ../sites/dev/public_html, depending on whether an environment argument is passed to a gulp task. These locations essentially mirror my remote host.
I want to automate Composer installs, updates and optimisation so that the correct vendor files are output to either ../sites/www/vendor or ../sites/dev/vendor whenever the build is first run, or to just optimise based on any watched PHP files being changed.
My build folder has the following structure:
source/
bower.json
composer.json
composer.lock
gulpfile.js
package.json
My example composer.json has the following:
{
  "name": "mycomposer/mycomposer",
  "version": "1.0.0",
  "autoload": {
    "psr-4": {
      "mycomposer\\": "public_html/app/mycomposer"
    }
  },
  "require": {
    "rollbar/rollbar": "^1.3",
    "vlucas/phpdotenv": "^2.4.0"
  }
}
I have tried a gulp-composer task to install the Composer libraries and to run dumpautoload for local first-party libraries:
gulp.task('composer', function() {
  var dest = argv.live ? 'www' : 'devsite',
      env = '../sites/' + dest + '/public_html';
  $.composer('config vendor-dir ' + env.replace('public_html', 'vendor'));
  $.composer({
    "no-ansi": true,
    "no-dev": true,
    "no-interaction": true,
    "no-progress": true,
    "no-scripts": true,
    "optimize-autoloader": true
  });
  $.composer('dumpautoload', {
    optimize: true
  });
});
When the task is complete, I'm finding that the $baseDir variable references the build directory.
Expected
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname($vendorDir);
Output
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname(dirname(dirname($vendorDir))).'/mybuild';
Is this something I can achieve, or should I really be running composer separately from my build process?
Thanks
I had a similar problem.
I use gulp-composer package.
gulpfile.js
const composer = require('gulp-composer');

gulp.task('composer-deployed', async function() {
  let opts = {
    "working-dir": 'my-path-to-composer.json',
    "self-install": true, // false for my case
    optimize: true,
    "classmap-authoritative": true
  };
  composer("dumpautoload", opts);
});
In my case, I use Composer installed globally, but you can choose the bin path with bin: 'path-to-composer.phar' in opts.
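As an alternative to setting vendor-dir through a Composer command, Composer also honours the COMPOSER_VENDOR_DIR environment variable, so a plain child_process call from a gulp task can target the deploy location directly. A sketch, assuming the same argv.live switch as the question:
const { execSync } = require('child_process');

gulp.task('composer-install', function() {
  const dest = argv.live ? 'www' : 'devsite';
  // COMPOSER_VENDOR_DIR redirects the vendor output without editing composer.json
  execSync('composer install --no-dev --optimize-autoloader', {
    env: { ...process.env, COMPOSER_VENDOR_DIR: '../sites/' + dest + '/vendor' },
    stdio: 'inherit'
  });
});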

LoopBack does not read port property from config.json (or other config files)

So I want to run multiple LoopBacks listening on different ports (makes dev easier). I can achieve this by using PORT=808x node ., but I would prefer a configured alternative.
When I tried to use the configs, I noticed strange behavior. Other configs, such as restApiRoot, match whatever I write in server/config.json, but the port is always 8080 unless I use env variables or such. I checked the documentation for all the configuration files LoopBack reads; none of them sets a new value for port. Where does that port value come from? How can I force it to use the one in server/config.json, or in a similar official configuration file?
UPDATE: my server/server.js, server/config.json and package.json files are below.
When I start this with the node . command, the port variable is 8080 instead of 8082; when I wget, the response (404) comes from 8080, and 8082 gives no response, as there is no server serving that port.
package.json
{
  "name": "external-server",
  "version": "1.0.0",
  "main": "server/server.js",
  "scripts": {
    "pretest": "jshint ."
  },
  "dependencies": {
    "compression": "^1.0.3",
    "cors": "^2.5.2",
    "loopback": "^2.22.0",
    "loopback-boot": "^2.6.5",
    "loopback-component-explorer": "^2.1.0",
    "loopback-connector-mysql": "^2.4.1",
    "loopback-datasource-juggler": "^2.39.0",
    "serve-favicon": "^2.0.1"
  },
  "devDependencies": {
    "jshint": "^2.5.6"
  }
}
server/server.js
var loopback = require('loopback');
var boot = require('loopback-boot');

var app = module.exports = loopback();

app.start = function() {
  // start the web server
  return app.listen(function() {
    app.emit('started');
    console.log(app.get('port'));
    var baseUrl = app.get('url').replace(/\/$/, '');
    console.log('Web server listening at: %s', baseUrl);
    if (app.get('loopback-component-explorer')) {
      var explorerPath = app.get('loopback-component-explorer').mountPath;
      console.log('Browse your REST API at %s%s', baseUrl, explorerPath);
    }
  });
};

// Bootstrap the application, configure models, datasources and middleware.
// Sub-apps like REST API are mounted via boot scripts.
boot(app, __dirname, function(err) {
  if (err) throw err;
  // start the server if `$ node server.js`
  if (require.main === module)
    app.start();
});
server/config.json
{
  "restApiRoot": "/api",
  "host": "0.0.0.0",
  "port": 8082
}
Okay, it seems that when I run LoopBack with sudo, the configuration files are applied. Very confusing.
So the command node . leads to the wrong port, while sudo node . reads the port from server/config.json. Other configuration parameters are read correctly even without sudo; for some reason PORT is a special case. There is a related answer which shows that this is a Node.js + Express issue, not a LoopBack issue.

Spectron testing producing a JScript syntax error

I'm trying out Spectron for Electron testing, but as I'm going through a tutorial I keep getting a JScript syntax error whenever I run npm run test:e2e. My test file is syntactically correct, but I'm not sure why I run into an error during compilation.
Specs:
Node.js 6.10.3
Electron 1.6.1
Here's the package.json file:
{
  "name": "your-app",
  "version": "0.1.0",
  "main": "main.js",
  "scripts": {
    "start": "C:/Users/Livs/Documents/imdc/logger/node_modules/.bin/electron .",
    "test:e2e": "C:/Users/Livs/Documents/imdc/logger/test.js"
  },
  "devDependencies": {
    "electron-chromedriver": "^1.7.1",
    "electron-prebuilt": "^1.4.13",
    "electron-rebuild": "^1.5.11",
    "chai": "^3.5.0",
    "chai-as-promised": "^5.3.0",
    "electron": "^1.3.4",
    "mocha": "^3.0.2",
    "spectron": "^3.4.0"
  }
}
Heres the testing file test.js
const Application = require('spectron').Application;
const path = require('path');
const chai = require('chai');
const chaiAsPromised = require('chai-as-promised');

var electronPath = path.join(__dirname, '..', 'node_modules', '.bin', 'electron');
if (process.platform === 'win32') {
  electronPath += '.cmd';
}
var appPath = path.join(__dirname, '..');
var app = new Application({
  path: electronPath,
  args: [appPath]
});
Your npm run test:e2e just executes the test.js file directly. You'll need a test runner, mocha for instance: then you would run mocha test.js, or change the test:e2e script inside package.json to run that command.
All your file paths for the scripts inside package.json should be relative to the package root, i.e. logger/test.js. Regarding npm bins, you only need to type the bin name, i.e. electron.
To solve your problem, change your package.json test:e2e command to mocha test.js.
(You can also change your start command to electron ., since custom npm commands will always look for binaries in ./node_modules/.bin.)
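Put together, the scripts section would look roughly like this (assuming mocha is a devDependency, as it is in the package.json above):
"scripts": {
  "start": "electron .",
  "test:e2e": "mocha test.js"
}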

can't watch multiple files with json-server

I've read about the fake REST API json-server and I'd like to watch more than one file.
In the instructions it is listed
--watch, -w
Watch file(s)
but I'm not able to make it work if I launch it as
json-server -w one.json two.json more.json
Create the files shown below.
db.js
var firstRoute = require('./jsonfile1.json');
var secondRoute = require('./jsonfile2.json');
var thirdRoute = require('./jsonfile3.json');
var fourthRoute = require('./jsonfile4.json');
// and so on

module.exports = function() {
  return {
    firstRoute: firstRoute,
    secondRoute: secondRoute,
    thirdRoute: thirdRoute,
    fourthRoute: fourthRoute
    // and so on
  }
}
server.js
var jsonServer = require('json-server')
var server = jsonServer.create()
var router = jsonServer.router(require('./db.js')())
var middlewares = jsonServer.defaults()

server.use(middlewares)
server.use(router)
server.listen(3000, function () {
  console.log('JSON Server is running')
})
Now go to the directory where you have created both these files, open a command line, and run:
node server.js
That's it. Now open localhost:3000 in the browser; you'll see the routes created for the different files, and you can use them directly.
You can open multiple ports for different JSON files with json-server. In my case I open multiple cmd windows and launch it as:
json-server --watch one.json -p 4000
json-server --watch two.json -p 5000
json-server --watch more.json -p 6000
One per cmd window; this works for me.
It can only watch one file. You have to put all the info you need into the same file. So if you need cars for one call and clients for another, you would add a few objects from each into one file. It's unfortunate, but it's just supposed to be a very simple server.
1 - Create a database file, e.g. db.json:
{
  "products": [
    {
      "id": 1,
      "name": "Caneta BIC Preta",
      "price": 2500.5
    }
  ],
  "users": [
    {
      "id": 1,
      "name": "Derson Ussuale",
      "password": "test"
    }
  ]
}
2 - In package.json, inside scripts, add:
"scripts": {
"start": "json-server --watch db.json --port 3001"
},
3 - Finally, run the command: npm start
Resources
http://localhost:3001/products
http://localhost:3001/users
Home
http://localhost:3001
You can do this as follows:
Step 1: Install concurrently
npm i concurrently --save-dev
Step 2: Create multiple JSON files, for example db-users.json, db-companies.json, and so on
Step 3: Add a command to your package.json scripts, for example:
"servers": "concurrently --kill-others \"json-server --host 0.0.0.0 --watch db-users.json --port 3000\" \"json-server --host 0.0.0.0 --watch db-companies.json --port 3001\""
Step 4: Now you can run npm run servers to run your multiple JSON files.
After that, you can access your servers at localhost:3000 and localhost:3001, or via your network IP address.
Note: You can add more files and more commands to your package.json scripts.
That's it.
As you can watch only one file at a time (it is the database), you can first read the database file and then merge new data into the database JSON:
// Assumes json-schema-faker (jsf) generates the mock data and that
// mockDataSchema is defined elsewhere
const fs = require('fs');
const path = require('path');
const jsf = require('json-schema-faker');

const mockData = jsf(mockDataSchema);
const dataBaseFilePath = path.resolve(__dirname, {YOUR_DATABASE_FILE});
fs.readFile(dataBaseFilePath, (err, dbData) => {
  const json = JSON.parse(dbData);
  const resultData = JSON.stringify(Object.assign(json, mockData));
  fs.writeFile(dataBaseFilePath, resultData, (err) => {
    if (err) {
      return console.log(err);
    }
    return console.log('Mock data generated.');
  });
});