how to use pm2 to launch strapi with AutoReload on? - pm2

I followed the official tutorial and successfully used pm2 to run a small script that launches Strapi:
const strapi = require('strapi');
strapi().start();
But AutoReload is off, and the admin panel doesn't allow me to edit any content type. How should I launch it with AutoReload on?
Strapi has popped up an alert telling me to launch it with the "yarn develop" command, but that doesn't go through pm2, and it shuts down when I log out of my terminal. So I don't think it's the correct way to launch in real use.

To fix the auto-reload issue, use yarn strapi dev instead of yarn strapi.

If Strapi is asking you to use yarn develop, it means you are not in the development environment, so you are not able to make changes to the API.
You need to use a special config file in your root folder called ecosystem.config.js, and you need to set autorestart: true.
module.exports = {
  apps: [{
    name: 'nameofyourapp',
    script: 'server.js',
    // Options reference: https://pm2.keymetrics.io/docs/usage/application-declaration/
    // args: 'one two',
    instances: 1,
    autorestart: true,
    watch: false,
    max_memory_restart: '1G',
    // not production or staging
    env: {
      NODE_ENV: 'development'
    },
    env_production: {
      NODE_ENV: 'production'
    }
  }],
  // deploy: {
  //   production: {
  //     user: 'node',
  //     host: '212.83.163.1',
  //     ref: 'origin/master',
  //     repo: 'git#github.com:repo.git',
  //     path: '/var/www/production',
  //     'post-deploy': 'npm install && pm2 reload ecosystem.config.js --env production'
  //   }
  // }
};
Then start it with the pm2 start ecosystem.config.js command.
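Note that autorestart only makes pm2 restart the process when it crashes; Strapi's AutoReload comes from running the develop command itself. A minimal sketch of doing that through pm2, assuming a Yarn-based project (the app name and project path here are hypothetical):

module.exports = {
  apps: [{
    name: 'strapi-dev',
    cwd: '/path/to/your/strapi-project', // hypothetical project root
    script: 'yarn',
    args: 'develop',      // the command Strapi asks for; enables AutoReload
    interpreter: 'none',  // let pm2 execute the yarn binary directly
    env: {
      NODE_ENV: 'development'
    }
  }]
};

Because pm2 daemonizes the process, it keeps running after you log out of the terminal.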

Related

composer in a gulp build and deployment

How do you get composer to install dependencies correctly when using a gulp build?
My build process is set up to output to a set location, either ../sites/www/public_html or ../sites/dev/public_html, depending on whether an environment argument is passed to a gulp task. These locations essentially mirror my remote host.
I want to automate composer installs, updates and optimisation so the correct vendor files are output to either ../sites/www/vendor or ../sites/dev/vendor whenever the build is initially run, or to just optimise when any watched PHP files change.
My build folder has the following structure:
source/
  bower.json
  composer.json
  composer.lock
  gulpfile.js
  package.json
My example composer.json has the following:
{
  "name": "mycomposer/mycomposer",
  "version": "1.0.0",
  "autoload": {
    "psr-4": {
      "mycomposer\\": "public_html/app/mycomposer"
    }
  },
  "require": {
    "rollbar/rollbar": "^1.3",
    "vlucas/phpdotenv": "^2.4.0"
  }
}
I have tried a gulp-composer task to install the composer libraries and to run dumpautoload for local first-party libraries:
gulp.task('composer', function() {
  var dest = argv.live ? 'www' : 'devsite',
      env = '../sites/' + dest + '/public_html';
  $.composer('config vendor-dir ' + env.replace('public_html', 'vendor'));
  $.composer({
    "no-ansi": true,
    "no-dev": true,
    "no-interaction": true,
    "no-progress": true,
    "no-scripts": true,
    "optimize-autoloader": true
  });
  $.composer('dumpautoload', {
    optimize: true
  });
});
When the task is complete, what I'm finding is that the $baseDir variable references the build directory.
Expected
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname($vendorDir);
Output
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname(dirname(dirname($vendorDir))).'/mybuild';
Is this something I can achieve, or should I really be running composer separately from my build process?
Thanks
I had a similar problem.
I use the gulp-composer package.
gulpfile.js
const gulp = require('gulp');
const composer = require('gulp-composer');

gulp.task('composer-deployed', async function() {
  let opts = {
    "working-dir": 'my-path-to-composer.json',
    "self-install": true, // false for my case
    optimize: true,
    "classmap-authoritative": true
  };
  composer("dumpautoload", opts);
});
In my case, I use composer installed globally, but you can choose the bin path with bin: 'path-to-composer.phar' in opts {}.
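To mirror the live/dev switch from the question, you could also pick the working directory per environment. A rough, untested sketch, assuming yargs supplies the --live flag as in the question's task (the task name is made up; the paths are the asker's):

const gulp = require('gulp');
const composer = require('gulp-composer');
const argv = require('yargs').argv;

gulp.task('composer-env', async function() {
  // choose the deployed site the same way the question's task does
  const dest = argv.live ? 'www' : 'dev';
  composer('dumpautoload', {
    'working-dir': '../sites/' + dest + '/public_html',
    optimize: true
  });
});

Running composer against the deployed directory, rather than the build directory, should keep $baseDir pointing at the right place.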

nodeJS development and production environment config file loading

Based on this answer here https://stackoverflow.com/a/22524056/777700 I have set exactly the same configuration options, but it doesn't work.
My (partial) app.js file:
console.log('environment: '+process.env.NODE_ENV);
const config = require('./config/db.json')[process.env.NODE_ENV || "development"];
console.log(config);
My ./config/db.json file:
{
  "development": {
    "host": "localhost",
    "port": "3306",
    "username": "root",
    "password": "",
    "database": "dbname"
  },
  "production": {
    "host": "production-host",
    "port": "3306",
    "username": "user",
    "password": "pwd",
    "database": "dbname"
  }
}
Console.log outputs:
environment: development
undefined
and the app crashes. Any idea why? The file is there: if I remove the [...] part of the require(), it does print out the db.json file; with it, it prints out undefined.
EDIT
I tried to add console.log(typeof config) just after the require() to see what I'm getting, and I noticed that with require('./config/db.json')[process.env.NODE_ENV] I get undefined, but with require('./config/db.json')["development"] I get back a proper object.
Versions:
nodeJS 6.11.4
express 4.16.2
After more debugging and searching online, I finally found the solution. The problem is that I'm on a Windows machine and I was using the npm run dev command, while my "dev" command looked like SET NODE_ENV=development && nodemon server.js.
An experienced eye will notice the space before &&, which appended a space to the variable, so the value I was comparing against was "development " and not "development" as I thought.
So the original answer from the other question does work, and it does load the proper config!
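For illustration, the fix is simply dropping that space in package.json (a sketch of the relevant fragment):

"scripts": {
  "dev": "SET NODE_ENV=development&& nodemon server.js"
}

Alternatively, the cross-env package (npm i -D cross-env) sets the variable the same way on Windows and Unix:

"scripts": {
  "dev": "cross-env NODE_ENV=development nodemon server.js"
}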
You should export the configuration as a variable:
const config = {
  "development": {
    "host": "localhost",
    "port": "3306",
    "username": "root",
    "password": "",
    "database": "dbname"
  },
  "production": {
    "host": "production-host",
    "port": "3306",
    "username": "user",
    "password": "pwd",
    "database": "dbname"
  }
};
module.exports = config;
This way it will be found :)
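Requiring it then works the same way as with the JSON file. A minimal sketch, assuming the module above is saved as ./config/db.js:

const config = require('./config/db')[process.env.NODE_ENV || "development"];
console.log(config.host); // "localhost" in development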
If you want to do it via JSON:
const fs = require('fs')
let localConfig
try {
  localConfig = JSON.parse(fs.readFileSync('./config/db.json', 'utf-8'))
} catch (e) {
  console.log('Could not parse local config.')
  localConfig = false
}
module.exports = localConfig
You could then add logic for production: if there's no local configuration, localConfig will return false and you can look for environment variables injected at that point, as in the sketch below.
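A small sketch of that fallback, with hypothetical module path and environment variable names:

const localConfig = require('./localConfig');
// fall back to injected environment variables when no local file was parsed
module.exports = localConfig || {
  host: process.env.DB_HOST,
  port: process.env.DB_PORT,
  username: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME
};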
Update:
I see that you're providing the production config yourself; in that case, you can just access the key you need based on the environment. Just import localConfig and use the keys you need.
It's better to use the dotenv package for this:
npm i dotenv
Step 1: In package.json add this
"scripts": {
"start": "nodemon app.js",
"dev": "NODE_ENV=dev nodemon app.js"
"prod": "NODE_ENV=prod nodemon app.js"
},
Step 2: Add .env.prod and .env.dev files
.env.dev
PORT=7200
# Set your database/API connection information here
DB_URI=localhost
DB_USERNAME=root
DB_PASSWORD=password
DB_DEFAULT=dbName
Step 3: Add this in config.js
const dotenv = require('dotenv').config({ path: `.env.${process.env.NODE_ENV}` });
const result = dotenv;
if (result.error) {
  throw result.error;
}
const { parsed: envs } = result;
// console.log(envs);
module.exports = envs;
Step 4: Use like this when needed
const {
  DB_URI, DB_USERNAME, DB_PASSWORD, DB_DEFAULT,
} = require('../config');
Now if you want development mode, run
npm run dev
For prod, use
npm run prod
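One caveat: the NODE_ENV=dev prefix in those scripts only works in Unix-like shells, and the asker is on Windows. A portable alternative is the cross-env package (npm i -D cross-env), used in the same scripts:

"dev": "cross-env NODE_ENV=dev nodemon app.js",
"prod": "cross-env NODE_ENV=prod nodemon app.js"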

Is there a way to switch cwd by changing environment in PM2 - node.js

I am using PM2 to manage the execution of a couple of micro-apps on node.
Goal:
I would like to be able to automatically switch settings and the cwd value based on the environment the app is executing in.
For example: on my local machine CWD should be ~/user/pm2, while on the server it needs to be E:\Programs\PM2.
Is there any way to do this using JSON config options with PM2? Is there a better way to manage the variables for different environments?
You can save a shell script, say pm2_dev.sh, containing the cd command as the first line:
#!/bin/bash
cd /foo/bar
pm2-dev run my-app.js
OR you can pass the directory to your script as an argument:
# pm2_dev.sh ~/user/pm2
The file should then be:
#!/bin/bash
cd $1
pm2-dev run my-app.js
If you do not want to change the environment via a shell script, you can follow the documentation's way:
{ "apps" : [{
"script" : "worker.js",
"watch" : true,
"env": {
"NODE_ENV": "development",
},
"env_production" : {
"NODE_ENV": "production"
} },{
"name" : "api-app",
"script" : "api.js",
"instances" : 4,
"exec_mode" : "cluster" }] }
When running your application you should use the --env option, as it is written here:
--env  specify environment to get specific env variables (for JSON declaration)
Finally, you can wrap the configuration in a JS object that conditionally returns parameters based on the current environment:
module.exports = (function(env) {
  if (env === 'development')
    return { folder: '~/user/pm2' };
  else if (env === 'production')
    return { folder: 'E:\\Programs\\PM2' }; // backslashes must be escaped in a JS string
}(process.env.NODE_ENV));
Then you can require the config file and access it, being sure that it always returns the correct config.
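For the cwd value specifically, pm2's application declaration also accepts a cwd field, so one ecosystem file can switch it per machine. A minimal sketch; the paths come from the question, and the platform check is only an assumption about how to tell the two machines apart:

module.exports = {
  apps: [{
    name: 'micro-app', // hypothetical name
    script: 'app.js',
    // hypothetical: pick the working directory by OS, server vs. local machine
    cwd: process.platform === 'win32'
      ? 'E:\\Programs\\PM2'
      : require('os').homedir() + '/user/pm2'
  }]
};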

can't watch multiple files with json-server

I've read about the fake json-server and I'd like to watch more than one file.
The instructions list:
--watch, -w
Watch file(s)
but I'm not able to make it work when I launch it as:
json-server -w one.json two.json more.json
Create the files as shown below.
db.js
var firstRoute = require('./jsonfile1.json');
var secondRoute = require('./jsonfile2.json');
var thirdRoute = require('./jsonfile3.json');
var fourthRoute = require('./jsonfile4.json');
// and so on

module.exports = function() {
  return {
    firstRoute: firstRoute,
    secondRoute: secondRoute,
    thirdRoute: thirdRoute,
    fourthRoute: fourthRoute
    // and so on
  };
};
server.js
var jsonServer = require('json-server');
var server = jsonServer.create();
var router = jsonServer.router(require('./db.js')());
var middlewares = jsonServer.defaults();

server.use(middlewares);
server.use(router);
server.listen(3000, function () {
  console.log('JSON Server is running');
});
Now go to the directory where you created both of these files, open a command line, and run:
node server.js
That's it. Now open localhost:3000 in the browser; you will see the routes created for the different files and can use them directly.
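With the hypothetical file names above, json-server creates one route per top-level key returned by db.js, so you would see, e.g.:
http://localhost:3000/firstRoute
http://localhost:3000/secondRoute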
You can open multiple ports for different JSON files with json-server. In my case, I open multiple cmd windows and launch them as:
json-server --watch one.json -p 4000
json-server --watch two.json -p 5000
json-server --watch more.json -p 6000
One per cmd window; this works for me.
It can only watch one file. You have to put all the info you need into the same file. So if you need cars for one call and clients for another, you would add a few objects from each into one file. It's unfortunate, but it's just supposed to be a very simple server.
1 - Create a database file, e.g. db.json:
{
  "products": [
    {
      "id": 1,
      "name": "Caneta BIC Preta",
      "price": 2500.5
    }
  ],
  "users": [
    {
      "id": 1,
      "name": "Derson Ussuale",
      "password": "test"
    }
  ]
}
2 - In package.json, inside scripts:
"scripts": {
  "start": "json-server --watch db.json --port 3001"
},
3 - Finally, run the command npm start
Resources
http://localhost:3001/products
http://localhost:3001/users
Home
http://localhost:3001
You can do it like this:
Step 1: Install concurrently
npm i concurrently --save-dev
Step 2: Create multiple JSON files, for example db-users.json, db-companies.json, and so on
Step 3: Add a command to your package.json scripts, for example:
"servers": "concurrently --kill-others \"json-server --host 0.0.0.0 --watch db-users.json --port 3000\" \"json-server --host 0.0.0.0 --watch db-companies.json --port 3001\""
Step 4: Now you can run npm run servers to start your multiple JSON files.
After that, you can access the servers at localhost:3000 and localhost:3001, or via your network IP address.
Note: You can add more files and more commands to your package.json scripts.
That's it.
Since json-server can watch only one file at a time (it is the database), you can first read the database file and then append new data to the database JSON:
// jsf is assumed to be json-schema-faker; mockDataSchema is defined elsewhere
const fs = require('fs');
const path = require('path');
const jsf = require('json-schema-faker');

const mockData = jsf(mockDataSchema);
const dataBaseFilePath = path.resolve(__dirname, {YOUR_DATABASE_FILE});
fs.readFile(dataBaseFilePath, (err, dbData) => {
  const json = JSON.parse(dbData);
  const resultData = JSON.stringify(Object.assign(json, mockData));
  fs.writeFile(dataBaseFilePath, resultData, (err) => {
    if (err) {
      return console.log(err);
    }
    return console.log('Mock data generated.');
  });
});

gulp and karma, file karma.conf.js does not exist

I have a basic AngularJS app and want all my terminal commands run via gulp tasks, e.g. $ gulp dev for the development server and $ gulp unitTest for testing, etc.
I have installed gulp as per the docs, using $ npm install --save-dev gulp, with my gulpfile.js in the root of the project folder. I have also done the same for karma's install and config file.
It is worth stating now that I want all the npm installs tagged with --save so the project can easily move around the office and servers.
When it comes to adding the task to gulp, I have to use a path relative to the karma module for the configFile option to find the config, but then it does not find the tests.
The following gulpfile.js produces the error ERROR [config]: File karma.conf.js does not exist!
var gulp = require('gulp'),
    // ....
    karma = require('karma').Server;

gulp.task('test', function(done) {
  var karmaServerOptions = {
    configFile: 'karma.conf.js', // works if relative path from ./node_modules/karma/lib/config.js
    singleRun: true
  };
  karma.start(
    karmaServerOptions,
    function(exitStatus) {
      done(exitStatus ? 'There are failing tests' : undefined);
    }
  );
});
karma.conf.js:
// Karma configuration
// Generated on Thu Aug 06 2015 13:38:12 GMT+0100 (BST)
module.exports = function(config) {
  config.set({
    // base path that will be used to resolve all patterns (eg. files, exclude)
    basePath: './',
    // frameworks to use
    // available frameworks: https://npmjs.org/browse/keyword/karma-adapter
    frameworks: ['jasmine'],
    // list of files / patterns to load in the browser
    files: [
      // '**/*js',
      'node_modules/angular/angular.js',
      'app/**/*.js',
      // 'unitTests/**/*Spec.js',
      // 'unitTests/**/*spec.js'
      'unitTests/**/*.js'
    ],
    // list of files to exclude
    exclude: [],
    // preprocess matching files before serving them to the browser
    // available preprocessors: https://npmjs.org/browse/keyword/karma-preprocessor
    preprocessors: {},
    // test results reporter to use
    // possible values: 'dots', 'progress'
    // available reporters: https://npmjs.org/browse/keyword/karma-reporter
    reporters: ['progress'],
    // web server port
    port: 9876,
    // enable / disable colors in the output (reporters and logs)
    colors: true,
    // level of logging
    // possible values: config.LOG_DISABLE || config.LOG_ERROR || config.LOG_WARN || config.LOG_INFO || config.LOG_DEBUG
    logLevel: config.LOG_INFO,
    // enable / disable watching file and executing tests whenever any file changes
    autoWatch: true,
    // start these browsers
    // available browser launchers: https://npmjs.org/browse/keyword/karma-launcher
    browsers: ['Chrome'],
    // Continuous Integration mode
    // if true, Karma captures browsers, runs the tests and exits
    singleRun: false
  })
}
note: The files array is a bit of a mess as it still has some, but not all, of my experiments in it.
See gulp task can't find karma.conf.js for an explanation about __dirname.
Or use:
var Server = require('karma').Server;

gulp.task('test', function (done) {
  new Server({
    configFile: require('path').resolve('karma.conf.js'),
    singleRun: true
  }, done).start();
});
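Equivalently, you can anchor the path to the gulpfile itself with __dirname, so the config is found no matter which directory gulp is invoked from (a sketch, assuming karma.conf.js sits next to gulpfile.js):

var gulp = require('gulp');
var path = require('path');
var Server = require('karma').Server;

gulp.task('test', function (done) {
  new Server({
    // __dirname is this gulpfile's directory, not the process cwd
    configFile: path.join(__dirname, 'karma.conf.js'),
    singleRun: true
  }, done).start();
});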