How do you get composer to install dependencies correctly when using a gulp build?
My build process is set up to output to a set location, either ../sites/www/public_html or ../sites/dev/public_html, depending on whether an environment argument is passed to a gulp task. These locations essentially mirror my remote host.
I want to automate composer installs, updates and optimisation so that the correct vendor files end up in either ../sites/www/vendor or ../sites/dev/vendor whenever the build is initially run, or to just re-optimise when any watched PHP files change.
My build folder has the following structure:
source/
    bower.json
    composer.json
    composer.lock
    gulpfile.js
    package.json
My example composer.json has the following:
{
    "name": "mycomposer/mycomposer",
    "version": "1.0.0",
    "autoload": {
        "psr-4": {
            "mycomposer\\": "public_html/app/mycomposer"
        }
    },
    "require": {
        "rollbar/rollbar": "^1.3",
        "vlucas/phpdotenv": "^2.4.0"
    }
}
I have tried a gulp-composer task to install the composer libraries and to run dumpautoload for local first-party libraries:
gulp.task('composer', function() {
    var dest = argv.live ? 'www' : 'dev',
        env = '../sites/' + dest + '/public_html';

    // point composer's vendor-dir at the target site
    $.composer('config vendor-dir ' + env.replace('public_html', 'vendor'));

    // install with production-friendly flags
    $.composer({
        "no-ansi": true,
        "no-dev": true,
        "no-interaction": true,
        "no-progress": true,
        "no-scripts": true,
        "optimize-autoloader": true
    });

    // regenerate the optimised autoloader for first-party code
    $.composer('dumpautoload', {
        optimize: true
    });
});
When the task is complete, I'm finding that the $baseDir variable in the generated autoload files references the build directory.
Expected
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname($vendorDir);
Output
$vendorDir = dirname(dirname(__FILE__));
$baseDir = dirname(dirname(dirname($vendorDir))).'/mybuild';
Is this something I can achieve, or should I really be running composer separately from my build process?
Thanks
I had a similar problem.
I use the gulp-composer package.
gulpfile.js
const composer = require('gulp-composer');
gulp.task('composer-deployed', async function() {
    let opts = {
        "working-dir": 'my-path-to-composer.json',
        "self-install": false, // false in my case, since composer is installed globally
        optimize: true,
        "classmap-authoritative": true
    };
    composer("dumpautoload", opts);
});
In my case, I use composer installed globally, but you can choose the bin path with bin: 'path-to-composer.phar' in opts.
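For reference, a minimal sketch of that variant with a local phar (the phar path and working-dir values are placeholders):

const composer = require('gulp-composer');

gulp.task('composer-deployed', async function() {
    composer("dumpautoload", {
        bin: 'path-to-composer.phar',              // placeholder: your local composer.phar
        "working-dir": 'my-path-to-composer.json', // placeholder: folder containing composer.json
        optimize: true,
        "classmap-authoritative": true
    });
});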
Hi, I'm trying to run some gulp tasks on Netlify to build a Hugo site. I'd like to know how to run gulp tasks serially on Netlify. This is my gulpfile.js:
var gulp = require('gulp');
var removeEmptyLines = require('gulp-remove-empty-lines');
var prettify = require('gulp-html-prettify');
var rm = require('gulp-rm');
var minifyInline = require('gulp-minify-inline');

gulp.task('tojson', function () {
    return gulp.src('public/**/*.html')
        .pipe(removeEmptyLines())
        .pipe(gulp.dest('public/'));
});

gulp.task('htmlClean', function () {
    return gulp.src('public/**/*.html')
        .pipe(removeEmptyLines({
            removeComments: true
        }))
        .pipe(gulp.dest('public/'));
});

gulp.task('templates', function () {
    return gulp.src('public/**/*.html')
        .pipe(prettify({ indent_char: ' ', indent_size: 2 }))
        .pipe(gulp.dest('public/'));
});

gulp.task('minify-inline', function () {
    return gulp.src('public/**/*.html')
        .pipe(minifyInline())
        .pipe(gulp.dest('public/'));
});
Where should I put the command to run all my gulp tasks in Netlify?
There are two places to set up your build commands in Netlify.
Admin Option
Put your commands in the online admin: under your site's Settings, go to Build & Deploy (Deploy settings) and change the Build command.
Netlify Config file (netlify.toml) Option
Edit/add a netlify.toml file to the root of your repository and put your build commands into the context you want to target.
netlify.toml
# global context
[build]
  publish = "public"
  command = "gulp build"

# build a preview (optional)
[context.deploy-preview]
  command = "gulp build-preview"

# build a branch with debug (optional)
[context.branch-deploy]
  command = "gulp build-debug"
NOTE:
The commands can be any valid command string. Running your gulp tasks in series this way works fine if you do not want to create a gulp sequence for them. For example, gulp htmlClean && hugo && gulp tojson would be a valid command.
Commands in the netlify.toml will override the site admin command.
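For example, serializing the gulp tasks from the question directly in the global context could look like this (the exact task order here is an assumption):

# global context
[build]
  publish = "public"
  command = "gulp htmlClean && hugo && gulp tojson && gulp templates && gulp minify-inline"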
You can string your tasks together like this:
Add another plugin with npm:
https://www.npmjs.com/package/run-sequence
var runSequence = require('run-sequence');

gulp.task('default', function (callback) {
    // pass the tasks as separate arguments so they run in series;
    // wrapping them all in a single array would run them in parallel
    runSequence('tojson', 'htmlClean', 'templates', 'minify-inline', callback);
});
Then run $ gulp
There's a section on run-sequence on this page that will help:
https://css-tricks.com/gulp-for-beginners/
Based on this answer https://stackoverflow.com/a/22524056/777700, I have set exactly the same configuration options, but it doesn't work.
My (partial) app.js file:
console.log('environment: '+process.env.NODE_ENV);
const config = require('./config/db.json')[process.env.NODE_ENV || "development"];
console.log(config);
My ./config/db.json file:
{
    "development": {
        "host": "localhost",
        "port": "3306",
        "username": "root",
        "password": "",
        "database": "dbname"
    },
    "production": {
        "host": "production-host",
        "port": "3306",
        "username": "user",
        "password": "pwd",
        "database": "dbname"
    }
}
Console.log outputs:
environment: development
undefined
and the app crashes. Any idea why? The file is there; if I remove the [...] part of the require(), it prints out the db.json contents, but with it, it prints undefined.
EDIT
I tried adding console.log(typeof config) just after the require() to see what I'm getting, and I noticed that with require('./config/db.json')[process.env.NODE_ENV] I get undefined, but with require('./config/db.json')["development"] I get back the proper object.
Versions:
nodeJS 6.11.4
express 4.16.2
After more debugging and searching online, I finally found the solution. The problem is that I'm on a Windows machine and I was using the npm run dev command while my "dev" script looked like SET NODE_ENV=development && nodemon server.js.
An experienced eye will notice the space before &&, which appended a space to the variable, so the value I was comparing against was "development " and not "development" as I thought.
So the original answer from the other question does work, and it does load the proper config!
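For anyone on Windows hitting the same thing, the fix is just dropping the space before && in the script (a sketch of the relevant package.json section):

"scripts": {
    "dev": "SET NODE_ENV=development&& nodemon server.js"
}

Alternatively, the cross-env package sets environment variables the same way on every platform and sidesteps this class of problem.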
You should export the configuration as a variable:
const config = {
    "development": {
        "host": "localhost",
        "port": "3306",
        "username": "root",
        "password": "",
        "database": "dbname"
    },
    "production": {
        "host": "production-host",
        "port": "3306",
        "username": "user",
        "password": "pwd",
        "database": "dbname"
    }
};
module.exports = config;
This way it will be found :)
If you want to do it via JSON:
const fs = require('fs')

let localConfig
try {
  // readFileSync returns a string; parse it into an object
  localConfig = JSON.parse(fs.readFileSync('./config/db.json', 'utf-8'))
} catch (e) {
  console.log('Could not parse local config.')
  localConfig = false
}

module.exports = localConfig
You could then add logic for production: if there's no local configuration, localConfig will be false and you can fall back to environment variables injected at that point.
Update:
I see that you're providing the production config yourself; in that case you can just access the key you need based on the environment. Import localConfig and use the keys you need, as sketched below.
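A minimal sketch of that selection step (the file path and the 'development' fallback are taken from the question):

// pick the config block for the current environment, defaulting to development
const localConfig = require('./config/db')
const dbConfig = localConfig
  ? localConfig[process.env.NODE_ENV || 'development']
  : undefined // no local file: fall back to injected environment variables here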
It's better to use the dotenv package for this:
npm i dotenv
Step 1: In package.json, add these scripts:

"scripts": {
    "start": "nodemon app.js",
    "dev": "NODE_ENV=dev nodemon app.js",
    "prod": "NODE_ENV=prod nodemon app.js"
},
Step 2: Add .env.prod and .env.dev files
.env.dev
PORT=7200
# Set your database/API connection information here
DB_URI=localhost
DB_USERNAME=root
DB_PASSWORD=password
DB_DEFAULT=dbName
Step 3: Add this in config.js
// load the env file that matches the current NODE_ENV (.env.dev or .env.prod)
const result = require('dotenv').config({ path: `.env.${process.env.NODE_ENV}` });

if (result.error) {
  throw result.error;
}

// expose the parsed variables as a plain object
const { parsed: envs } = result;
module.exports = envs;
Step 4: Use it like this where needed:

const {
  DB_URI, DB_USERNAME, DB_PASSWORD, DB_DEFAULT,
} = require('../config');
Now if you want to run the development config, use
npm run dev
For prod, use
npm run prod
I have my gulp tasks:
var gulp = require('gulp');
var sass = require('gulp-sass');

gulp.task('styles', function() {
    return gulp.src('dev/sass/files/*.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(gulp.dest('./production/css/'));
});

// Watch task
gulp.task('default', function() {
    gulp.watch('dev/sass/files/*.scss', ['styles']);
});
which work when run from the console.
I have my tasks.json:
{
    "version": "0.1.0",
    "command": "gulp",
    "isShellCommand": true,
    "tasks": [
        {
            "taskName": "default",
            "isBuildCommand": true,
            "showOutput": "always",
            "isWatching": true
        }
    ]
}
When I run the build task via F1 and the command palette I get:
"Watching build tasks has finished"
The tasks haven't run and my CSS file hasn't been updated.
I have tried/checked:
the gulpfile is called gulpfile.js and is in the root of my project
gulp works from the console; both tasks (default and styles) are listed via gulp --tasks-simple
both tasks can be successfully run from the console
I have reinstalled VSCode, just in case that made any difference. It didn't.
Turns out that my ComSpec environment variable was set to (note the semicolon):
C:\WINDOWS\system32\cmd.exe;
To check yours, run this from the command line:
node -p process.env
You can change yours via Control Panel by searching for "environment variables".
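To inspect just that one variable rather than dumping the whole environment, this also works:

node -p "process.env.ComSpec"

In my case the fix was simply removing the trailing semicolon so the value is the plain path to cmd.exe.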
I have a basic AngularJS app and want to have all my terminal commands run through gulp tasks, e.g. $ gulp dev for the development server, $ gulp unitTest for testing, etc.
I have installed gulp as per the docs using $ npm install --save-dev gulp, with my gulpfile.js in the root of the project. I have also done the same for karma's install and config file.
It is worth stating now that I want all the npm installs tagged with --save so the project can easily be moved around the office and between servers.
When it comes to adding the task to gulp, I have to use a path relative to the karma module for the configFile option for the config to be found, but then it does not find the tests.
The following gulpfile.js produces the error ERROR [config]: File karma.conf.js does not exist!
var gulp = require('gulp'),
    // ....
    karma = require('karma').Server;

gulp.task('test', function(done) {
    var karmaServerOptions = {
        configFile: 'karma.conf.js', // works if given a path relative to ./node_modules/karma/lib/config.js
        singleRun: true
    };

    karma.start(
        karmaServerOptions,
        function(exitStatus) {
            done(exitStatus ? 'There are failing tests' : undefined);
        }
    );
});
karma.conf.js:
// Karma configuration
// Generated on Thu Aug 06 2015 13:38:12 GMT+0100 (BST)

module.exports = function(config) {
  config.set({

    // base path that will be used to resolve all patterns (eg. files, exclude)
    basePath: './',

    // frameworks to use
    // available frameworks: https://npmjs.org/browse/keyword/karma-adapter
    frameworks: ['jasmine'],

    // list of files / patterns to load in the browser
    files: [
      // '**/*js',
      'node_modules/angular/angular.js',
      'app/**/*.js',
      // 'unitTests/**/*Spec.js',
      // 'unitTests/**/*spec.js'
      'unitTests/**/*.js'
    ],

    // list of files to exclude
    exclude: [],

    // preprocess matching files before serving them to the browser
    // available preprocessors: https://npmjs.org/browse/keyword/karma-preprocessor
    preprocessors: {},

    // test results reporter to use
    // possible values: 'dots', 'progress'
    // available reporters: https://npmjs.org/browse/keyword/karma-reporter
    reporters: ['progress'],

    // web server port
    port: 9876,

    // enable / disable colors in the output (reporters and logs)
    colors: true,

    // level of logging
    // possible values: config.LOG_DISABLE || config.LOG_ERROR || config.LOG_WARN || config.LOG_INFO || config.LOG_DEBUG
    logLevel: config.LOG_INFO,

    // enable / disable watching files and executing tests whenever any file changes
    autoWatch: true,

    // start these browsers
    // available browser launchers: https://npmjs.org/browse/keyword/karma-launcher
    browsers: ['Chrome'],

    // Continuous Integration mode
    // if true, Karma captures browsers, runs the tests and exits
    singleRun: false
  })
}
Note: the files array is a bit of a mess, as it still contains some, but not all, of my experiments.
See gulp task can't find karma.conf.js for an explanation about __dirname.
Or use:
var Server = require('karma').Server;

gulp.task('test', function (done) {
    new Server({
        // path.resolve() turns the filename into an absolute path based on
        // the current working directory, not the karma module's location
        configFile: require('path').resolve('karma.conf.js'),
        singleRun: true
    }, done).start();
});
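Alternatively, building the path from __dirname pins it to the directory containing the gulpfile itself, no matter where gulp is invoked from (a sketch, assuming karma.conf.js sits next to gulpfile.js):

var path = require('path');
var Server = require('karma').Server;

gulp.task('test', function (done) {
    new Server({
        // __dirname is the directory of this gulpfile
        configFile: path.join(__dirname, 'karma.conf.js'),
        singleRun: true
    }, done).start();
});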
I am following this guide to generate JUnit output from my JS tests:
https://github.com/sbrandwoo/grunt-qunit-junit
I have installed grunt-qunit-junit into my local test project:
npm install grunt-qunit-junit --save-dev
And this is my Gruntfile.js:
module.exports = function(grunt) {
    "use strict";

    grunt.initConfig({
        pkg: grunt.file.readJSON('package.json'),
        qunit_junit: {
            options: {
            },
            all: ["all_tests.html"]
        },
    });

    grunt.loadNpmTasks('grunt-qunit-junit');
};
where all_tests.html is located in the same directory and lists all my *test.js files. But when I run:
user#ubuntu:~/Test$ grunt qunit_junit
Running "qunit_junit" task
>> XML reports will be written to _build/test-reports
Done, without errors.
Why are the tests not executed (the folder _build/test-reports is not created)?
The README states that you should execute both the qunit_junit and qunit tasks: http://github.com/sbrandwoo/grunt-qunit-junit#usage-examples
For example: grunt.registerTask('test', ['qunit_junit', 'qunit']);
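A sketch of how the Gruntfile might look with both plugins wired up (moving the all_tests.html target under a separate qunit block is an assumption based on the README's usage example):

module.exports = function(grunt) {
    "use strict";

    grunt.initConfig({
        pkg: grunt.file.readJSON('package.json'),
        qunit_junit: {
            options: {}
        },
        qunit: {
            all: ["all_tests.html"]
        }
    });

    grunt.loadNpmTasks('grunt-contrib-qunit');
    grunt.loadNpmTasks('grunt-qunit-junit');

    // qunit_junit runs first so it can capture qunit's results as XML
    grunt.registerTask('test', ['qunit_junit', 'qunit']);
};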