How do I run a gulp task from two or more other tasks and pass the pipe through - gulp

This must be obvious, but I can't find it. I want to preprocess my stylus/coffee files with a watcher in the dev environment and with a build task in production (isn't that common to all of us?), and also run a few more minification and uglification steps in production, but I want to share the pipe steps common to both dev and production to stay DRY.
The problem is that when I run the task which watches the files, the task which preprocesses does so for all the files, since it has its own gulp.src statement which includes all stylus files.
How do I avoid compiling all files on watching while still keeping the compile task separate? Thanks.
var paths = {
  jade: ['www/**/*.jade']
};

gulp.task('jade', function() {
  return gulp.src(paths.jade)
    .pipe(jade({ pretty: true }))
    .pipe(gulp.dest('www/'))
    .pipe(browserSync.stream());
});
gulp.task('serve', ['jade', 'coffee'], function() {
  browserSync.init({
    server: './www'
  });

  watch(paths.jade, function() {
    return gulp.start(['jade']);
  });

  return gulp.watch('www/**/*.coffee', ['coffee']);
});

One important thing in Gulp is not to duplicate pipelines. If you want to process your Stylus files, there has to be one and only one Stylus pipe. If you want to execute different steps in your pipe, however, you have multiple choices. The one I would suggest is a noop() function in conjunction with a selection function:
var through = require('through2'); // Gulp's stream engine

/** creates an empty pipeline step **/
function noop() {
  return through.obj();
}

/** the isProd variable denotes if we are in
 *  production mode. If so, we execute the task.
 *  If not, we pass it through an empty step
 **/
function prod(task) {
  if (isProd) {
    return task;
  } else {
    return noop();
  }
}
gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(stylus())
    .pipe(prod(minifyCss())) // We just minify in production mode
    .pipe(gulp.dest(path.whatever));
});
As for incremental builds (building just the changed files with every iteration), the best way is to use the gulp-cached plugin:
var cached = require('gulp-cached');

gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(cached('styles')) // we just pass through the files that have changed
    .pipe(stylus())
    .pipe(prod(minifyCss()))
    .pipe(gulp.dest(path.whatever));
});
This plugin checks on every run whether a file's contents have changed and only passes the changed files down the pipe.
I spend a whole chapter on Gulp for different environments in my book, and I found these to be the most suitable approaches. For more information on incremental builds, you can also check my article on that (includes Gulp 4): http://fettblog.eu/gulp-4-incremental-builds/
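To tie this back to the original watch question, here is a minimal sketch (assuming gulp 3, the cached 'stylus' task above and the same path.styles glob) of a watcher that simply reuses the task, so only changed files are recompiled while the compile task stays separate:
// assumes the 'stylus' task above; gulp-cached filters out unchanged files,
// so the watcher can simply retrigger the whole task
gulp.task('watch', function() {
  gulp.watch(path.styles, ['stylus']);
});

// production build: run with isProd = true so prod() switches minification on
gulp.task('build', ['stylus']);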

Related

How can I check to see if a task is run as a dependency of another task in gulp#4?

I use gulp-notify to trigger notifications when tasks complete. If a task is run standalone, a notification for that specific task is triggered. If a task is run as a dependency of another task, a notification for all dependencies is triggered.
In gulp#3, I check if the task is being called as a dependency using gulp.seq, which contains an array of the tasks being run. Let's say I have three tasks: default, styles, and scripts, with the latter two set as dependencies of the first. When running gulp styles, gulp.seq will contain [ 'styles' ]. When running gulp (the default task), gulp.seq will contain [ 'styles', 'scripts', 'default' ]. Knowing that, I then check gulp.seq.indexOf("styles") > gulp.seq.indexOf("default"), which tells me whether or not styles was run as part of the default task.
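For reference, a condensed sketch of that gulp#3 check (the actual pipeline and the gulp-notify wiring are omitted; task names are the ones above):
// gulp#3: gulp.seq holds the ordered task list for the current invocation
gulp.task('styles', function() {
  // ['styles'] when run standalone, ['styles', 'scripts', 'default'] when run via `gulp`
  var standalone = gulp.seq.indexOf('styles') > gulp.seq.indexOf('default');
  console.log(standalone ? 'styles ran standalone' : 'styles ran under default');
  // ...the actual styles pipeline and gulp-notify call would go here
});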
With gulp#4, it appears that gulp.seq no longer exists. I've tried digging through the documentation and source code with no luck. It seems like gulp.tree({ deep: true }) (docs) might be what I'm looking for, but I don't see anything in it that returns anything useful.
Is there an equivalent of gulp.seq in gulp#4?
The API gulp.seq was never an official property exposed by Gulp. With Gulp 4, you cannot do that. gulp.tree({ /* */ }) will not solve this problem for you.
Having said that, if you still need to find whether a task has run during some other task's pipeline, then you will have to decorate every gulp task with your own wrapper, something like this:
let runTasks = [];

function taskWrapper(taskName, tasks, thisTask) {
  let callbackTask;

  function innerCallback(cb) {
    runTasks.push(taskName);
    cb();
  }

  if (thisTask) {
    callbackTask = function(cb) {
      thisTask(function () {
        innerCallback(cb);
      });
    };
  } else {
    callbackTask = innerCallback;
  }

  const newTasks = [ ...tasks, callbackTask ];
  gulp.task(taskName, gulp.series(newTasks));
}
// INSTEAD OF THIS
// gulp.task('default', gulp.series('style', 'script', function () { }));
// DO THIS
taskWrapper('default', ['style', 'script'], function(cb) {
  console.log('default task starting');
  cb();
});
NOTE: The above code snippet has limitations. If you use watch mode, the array tracking executed tasks, i.e. runTasks, will keep growing. It also assumes tasks always run in series; for parallel mode, the logic gets a little more complicated.
Finally, you can also add a predefault task to take this further:
taskWrapper('predefault', [], function(cb) {
  // RESET runTasks
  runTasks = [];
  cb();
});

taskWrapper('default', ['predefault', 'style', 'script'], function(cb) {
  console.log('default task starting');
  cb();
});
Also, I am doubtful whether gulp-notify will work with Gulp 4.
Through a bit of luck, I discovered this was possible via the yargs module, which I already have installed.
When running gulp styles, for example, I can check argv._.indexOf("styles") > -1, as argv._ contains ['styles']. When running gulp (i.e. the default task), it contains []. In my testing, this works perfectly for my use case.
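For anyone else doing the same, a rough sketch of that check (only yargs is assumed; the actual styles pipeline is left out):
var argv = require('yargs').argv;

gulp.task('styles', function(done) {
  // argv._ holds the task names typed on the command line, e.g. ['styles']
  var standalone = argv._.indexOf('styles') > -1;
  console.log(standalone ? 'styles ran standalone' : 'styles ran as part of another task');
  // ...run the usual styles pipeline here and pick the notification accordingly
  done();
});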

How to load and use environment-related values in config phase

I would like to deploy my web application to several environments. Using Continuous Integration I can run a task to generate a config.json for a particular environment. This file will contain, among other things, the particular URLs to use for that environment.
{
  "baseUrl": "http://www.myapp.es/",
  "baseApiUrl": "http://api.myapp.es/",
  "baseAuthUrl": "http://api.myapp.es/auth/"
}
The issue comes up when I try to configure my different services through providers in the config phase. Of course, services are not available yet in that phase, so I cannot use $http to load the JSON file and set up my providers correctly.
Basically I would like to do something like:
function config($authProvider) {
  $authProvider.baseUrl = config.baseAuthUrl;
}
Is there a way to load those values at runtime from a file? The only thing I can think of is having that task alter the file directly. However, I have several modules and would therefore have to do that in all of them, which doesn't seem right.
You can create constants in the config of your main module:
Add $provide as a dependency in your config method.
Use the provider method to add each constant, like this:
$provide.provider('BASE_API_URL', {
  $get: function () {
    return 'https://myexample.net/api/';
  }
});
You can use BASE_API_URL as a dependency in your services.
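For example, a hypothetical service consuming that constant could look like this (the service and endpoint names are just illustrative):
angular.module('app').factory('userApi', ['$http', 'BASE_API_URL',
  function($http, BASE_API_URL) {
    return {
      // BASE_API_URL resolves to whatever the provider's $get returned
      getUser: function(id) {
        return $http.get(BASE_API_URL + 'users/' + id);
      }
    };
  }
]);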
I hope this helps
Optionally, you can set the URL depending on your environment:
$provide.provider('BASE_API_URL', {
  $get: function () {
    if (window.location.hostname.toLowerCase() == 'myapp.myexample.net') {
      return 'https://myexample.net/api/'; // pre-production
    } else {
      return 'http://localhost:61132/'; // local
    }
  }
});
Regards!
Finally, the solution was generating an Angular constants file using templating (gulp-template) through a gulp task. In the end, I am using a YAML file instead of a JSON one (it is the one generated by my CI engine with the proper values for the environment I want to deploy to).
Basically:
config.yml
baseUrl: 'http://www.myapp.es/'
baseApiUrl: 'http://api.myapp.es/'
auth:
  url: 'auth/'
config.module.constants.template
(function () {
  'use strict';

  angular
    .module('app.config')
    .constant('env_variables', {
      baseUrl: '<%=baseUrl%>',
      baseApiUrl: '<%=baseApiUrl%>',
      authUrl: '<%=auth.url%>'
    });
}());
gulpfile.js
gulp.task('splicing', function() {
  var yml = path.join(conf.paths.src, '../config/config.yml');
  var json = yaml.safeLoad(fs.readFileSync(yml, 'utf8'));
  var template = path.join(conf.paths.src, '../config/config.module.constants.template');
  var targetFile = path.join(conf.paths.src, '/app/config');

  return gulp.src(template)
    .pipe($.template(json))
    .pipe($.rename('config.module.constants.js'))
    .pipe(gulp.dest(targetFile)); // returning the stream is enough; no done callback needed
});
Then you just inject it in the config phase you need:
function config($authProvider, env_variables) {
  $authProvider.baseUrl = env_variables.baseApiUrl + env_variables.authUrl;
}
One more benefit of using gulp for this is that you can integrate the generation of these constants with your build, serve or watch tasks and, quite literally, forget about making any manual changes from now on. Hope it helps!
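As a rough sketch of that integration (gulp#3 dependency syntax, with 'build' and 'serve' as placeholders for your existing tasks):
// regenerate the constants file before anything else runs
gulp.task('build', ['splicing'], function() {
  // ...the usual concat/minify/copy steps
});

gulp.task('serve', ['splicing'], function() {
  // ...start the dev server and watchers
});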

Creating multiple output files from a gulp task

I'm learning the gulp way of doing things after using grunt exclusively in the past. I'm struggling to understand how to pass multiple inputs to get multiple outputs with gulp.
Let's say I have a large project that has specialized js on a per page basis:
The Grunt Way:
grunt.initConfig({
  uglify: {
    my_target: {
      files: {
        'dest/everypage.min.js': ['src/jquery.js', 'src/navigation.js'],
        'dest/special-page.min.js': ['src/vendor/handlebars.js', 'src/something-else.js']
      }
    }
  }
});
This may be a poor example as it violates the "do only one thing" principle since grunt-uglify is concatenating and uglifying. In any event I'm interested in learning how to accomplish the same thing using gulp.
Thanks to @AnilNatha I'm starting to think with more of a Gulp mindset.
For my case I have a load of files that need to be concatenated. I offloaded these to a config object that my concat task iterates over:
// Could be moved to another file and `require`d in.
var files = {
  'polyfills.js': ['js/vendor/picturefill.js', 'js/vendor/augment.js'],
  'map.js': [
    'js/vendor/leaflet.js',
    'js/vendor/leaflet.markercluster.min.js',
    'js/vendor/jquery.easyModal.js',
    'js/vendor/jquery-autocomplete.min.js',
    'js/vendor/underscore.1.8.3.js',
    'js/map.js'
  ],
  ...
};
var output = './build/js';

// Using underscore.js, pass each key/value pair to the custom concat function
gulp.task('concat', function (done) {
  _.each(files, concat);
  // bs.reload(); if you're using browsersync
  done(); // tell gulp this asynchronous process is complete
});

// Custom concat function
function concat(files, dest) {
  return gulp.src(files)
    .pipe($.concat(dest))
    .pipe(gulp.dest(output));
}
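One caveat with the snippet above: done() is called before the individual streams have actually finished writing. If you need gulp to wait for them, one common variant (sketched here with the merge-stream package, which you would have to install) is to merge the streams and return the result:
var merge = require('merge-stream');

gulp.task('concat', function () {
  // one stream per output file; returning the merged stream lets gulp
  // (and any browsersync reload) wait until every one of them has ended
  var streams = _.map(files, concat); // same custom concat() as above
  return merge.apply(null, streams);
});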

Infinite watch loop with gulp on auto-updated / auto-generated files

I am attempting to use some gulp plugins (jscs, csscomb) to style my code on the fly during dev time.
I'm having a problem with the gulp process running an infinite loop with the format task.
Here's what I believe is happening:
start a serve task of some kind
an initial run is performed with all tasks to prep files for the staging server
a local staging server is started in parallel with a watch task
myfile.scss is updated by a developer
the gulp watcher starts the csscomb task
csscomb plugin changes the file and replaces it
the watcher task sees the change from the file replacement & starts the format task again...
the csscomb plugin runs again and so on ...
Here is a snippet that causes this loop. (Note: this uses v4 of gulp)
'use strict';

import { task, parallel, series, src, dest, watch, plugins } from './gulp';
import { startStagingServer } from './servers';
import { solution } from './solution.map';

const path = require('path');

task('serve', parallel(startStagingServer, watchStyles));

function watchStyles() {
  watch([ solution.src.styles ], series(formatStyles, compileStyles));
}

function formatStyles() {
  return src([ solution.src.styles ])
    .pipe(plugins.csscomb())
    .pipe(dest(solution.src.mount)); // the root of the solution
}

function compileStyles() {
  return src([ solution.src.styles ])
    .pipe(plugins.sass().on('error', plugins.sass.logError))
    .pipe(dest(path.join(solution.dest.stage, 'serve')));
}
Does anyone know a way to avoid this?
The way to avoid this is not to put the fix in the watcher. Use two separate functions: one that fixes and one that doesn't. Only watch the one that doesn't. Example:
function taskJscsFix() {
  return gulp.src(path.JS)
    .pipe(jscs({
      configPath: './gulp/.jscsrc',
      fix: true
    }))
    .pipe(gulp.dest(path.SRC.JS));
}

function taskScripts() {
  return gulp.src(path.JS)
    .pipe(jscs({
      configPath: './gulp/.jscsrc'
    }))
    .pipe(jscs.reporter())
    .pipe(gulp.dest(path.DEST.JS));
}
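The matching watcher then only references the non-fixing variant, for example (gulp 4 style, with hypothetical task names):
// only the non-fixing task is watched, so writing the fixed files back
// into src can no longer retrigger the watcher
function watchScripts() {
  gulp.watch(path.JS, taskScripts);
}

// run the fixer explicitly when you want it: `gulp jscs-fix`
gulp.task('jscs-fix', taskJscsFix);
gulp.task('watch', watchScripts);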

How should I create a complete build with Gulp?

Just learning Gulp. Looks great, but I can't find any information on how to make a complete distribution with it.
Let's say I want to use Gulp to concatenate and minify my CSS and JS, and optimise my images.
In doing so I change the location of JS scripts in my build directory (e.g. from bower_components/jquery/dist/jquery.js to js/jquery.js).
How do I automatically update my build HTML/PHP documents to reference the correct files? What is the standard way of doing this?
How do I copy over the rest of my project files? These are files that need to be included as part of the distribution, such as HTML, PHP, and various txt, JSON and all sorts of other files. Surely I don't have to copy and paste those from my development directory each time I do a clean build with Gulp?
Sorry for asking what are probably very n00bish questions. It's possible I should be using something other than Gulp to manage these, but I'm not sure where to start.
Many thanks in advance.
Point #1
The way I used to achieve this:
var Stream = require('stream');
var path = require('path');

var scripts = [];

function getScriptStream(dir) { // find it as a gulp module or create it
  var devT = new Stream.Transform({ objectMode: true });
  devT._transform = function(file, unused, done) {
    scripts.push(path.relative(dir, file.path));
    this.push(file);
    done();
  };
  return devT;
}
// Bower
gulp.task('build_bower', function() {
  var jsFilter = g.filter('**/*.js');
  var ngFilter = g.filter(['!**/angular.js', '!**/angular-mocks.js']);

  return g.bowerFiles({
      paths: {
        bowerDirectory: src.vendors
      },
      includeDev: !prod
    })
    .pipe(ngFilter)
    .pipe(jsFilter)
    .pipe(g.cond(prod, g.streamify(g.concat.bind(null, 'libs.js'))))
    .pipe(getScriptStream(src.html))
    .pipe(jsFilter.restore())
    .pipe(ngFilter.restore())
    .pipe(gulp.dest(build.vendors));
});
// JavaScript
gulp.task('build_js', function() {
  return gulp.src(src.js + '/**/*.js', { buffer: buffer })
    .pipe(g.streamify(g.jshint))
    .pipe(g.streamify(g.jshint.reporter.bind(null, 'default')))
    .pipe(g.cond(prod, g.streamify(g.concat.bind(null, 'app.js'))))
    .pipe(g.cond(
      prod,
      g.streamify.bind(null, g.uglify),
      g.livereload.bind(null, server)
    ))
    .pipe(gulp.dest(build.js))
    .pipe(getScriptStream(build.html));
});
// HTML
gulp.task('build_html', ['build_bower', 'build_js', 'build_views',
  'build_templates'], function() {
  fs.writeFile('scripts.json', JSON.stringify(scripts));
  return gulp.src(src.html + '/index.html', { buffer: true })
    .pipe(g.replace(/(^\s+)<!-- SCRIPTS -->\r?\n/m, function($, $1) {
      return $ + scripts.map(function(script) {
        return $1 + '<script type="text/javascript" src="' + script + '"></script>';
      }).join('\n') + '\n';
    }))
    .pipe(gulp.dest(build.html));
});
It has the advantage of concatenating and minifying everything for production while including every file individually for testing purposes, keeping error line numbers coherent.
Point #2
Copying files with gulp is just as simple as doing this:
gulp.src(path).pipe(gulp.dest(buildPath));
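For example, a minimal copy task (the globs are placeholders for whatever static assets your project actually has) could look like this:
gulp.task('copy', function() {
  // pass HTML, PHP and other static files through to the build directory unchanged
  return gulp.src(['src/**/*.html', 'src/**/*.php', 'src/**/*.json'], { base: 'src' })
    .pipe(gulp.dest('build'));
});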
Bonus
I generally proceed to deployment by creating a "build" branch and just cloning it on the production server. I created buildbranch for that purpose:
// Publish task
gulp.task('publish', function(cb) {
  buildBranch({
    branch: 'build',
    ignore: ['.git', '.token', 'www', 'node_modules']
  }, function(err) {
    if (err) {
      throw err;
    }
    cb();
  });
});
To loosely answer my own question, several years later:
How do I automatically update my build HTML/PHP documents to reference the correct files? What is the standard way of doing this?
Always link to the dist version, but ensure sourcemaps are created, so the source is easy to debug. Of course, the watch task is a must.
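A minimal sketch of the sourcemaps part (using gulp-sourcemaps; globs and file names are placeholders):
var sourcemaps = require('gulp-sourcemaps');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

gulp.task('scripts', function() {
  return gulp.src('src/js/**/*.js')
    .pipe(sourcemaps.init())
    .pipe(concat('app.min.js'))
    .pipe(uglify())
    .pipe(sourcemaps.write('.')) // the .map file lands next to app.min.js
    .pipe(gulp.dest('dist/js'));
});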
How do I copy over the rest of my project files?. These are files that need to be included as part of the distribution, such as HTML, PHP, various txt, JSON and all sorts of other files. Surely I don't have to copy and paste those from my development directory each time I do a clean build with Gulp?
This usually isn't a problem, as there aren't that many such files. Large files and configuration are often kept out of the repo, besides.