Creating multiple output files from a gulp task - gulp

I'm learning the gulp way of doing things after using grunt exclusively in the past. I'm struggling to understand how to pass multiple inputs to get multiple outputs with gulp.
Let's say I have a large project that has specialized js on a per page basis:
The Grunt Way:
grunt.initConfig({
  uglify: {
    my_target: {
      files: {
        'dest/everypage.min.js': ['src/jquery.js', 'src/navigation.js'],
        'dest/special-page.min.js': ['src/vendor/handlebars.js', 'src/something-else.js']
      }
    }
  }
});
This may be a poor example as it violates the "do only one thing" principle since grunt-uglify is concatenating and uglifying. In any event I'm interested in learning how to accomplish the same thing using gulp.

Thanks to @AnilNatha I'm starting to think with more of a Gulp mindset.
In my case I have a load of files that need to be concatenated. I offloaded these to a config object that my concat task iterates over:
// Could be moved to another file and `required` in.
var files = {
  'polyfills.js': ['js/vendor/picturefill.js', 'js/vendor/augment.js'],
  'map.js': [
    'js/vendor/leaflet.js',
    'js/vendor/leaflet.markercluster.min.js',
    'js/vendor/jquery.easyModal.js',
    'js/vendor/jquery-autocomplete.min.js',
    'js/vendor/underscore.1.8.3.js',
    'js/map.js'
  ],
  ...
};
var output = './build/js';
// Using underscore.js, pass each key/value pair to the custom concat function
gulp.task('concat', function (done) {
  _.each(files, concat);
  // bs.reload(); if you're using browsersync
  done(); // tell gulp this asynchronous process is complete
});
// Custom concat function
function concat(files, dest) {
  return gulp.src(files)
    .pipe($.concat(dest))
    .pipe(gulp.dest(output));
}
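If you want gulp to actually wait for all of the concatenations to finish instead of calling done() right away, one option (just a sketch, assuming the merge-stream package and underscore's _.map) is to merge the individual streams and return the result:

var merge = require('merge-stream');

gulp.task('concat', function () {
  // one stream per output file; merging them lets gulp know
  // when every concatenation has finished
  var streams = _.map(files, concat);
  return merge.apply(null, streams);
});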

Related

How to load and use environment-related values in config phase

I would like to deploy my web application to several environments. Using Continuous Integration I can run a task to generate a config.json for a particular environment. This file will contain, among other things, the particular URLs to use for that environment.
{
  "baseUrl": "http://www.myapp.es/",
  "baseApiUrl": "http://api.myapp.es/",
  "baseAuthUrl": "http://api.myapp.es/auth/"
}
The issue comes up when I try to set up my different services through providers in the config phase. Of course, services are not available yet in that phase, so I cannot use $http to load that json file and set my providers correctly.
Basically I would like to do something like:
function config($authProvider) {
  $authProvider.baseUrl = config.baseAuthUrl;
}
Is there a way to load those values at runtime from a file? The only thing I can think of is having that CI task alter this file directly. However, I have several modules, so that would have to be done in all of them, which doesn't seem right.
You can create constants in the config of your main module: add $provide as a dependency in your config method, then use the provider method to add all your constants like this:
$provide.provider('BASE_API_URL', {
  $get: function () {
    return 'https://myexample.net/api/';
  }
});
You can use BASE_API_URL as a dependency in your services.
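For instance, a hypothetical service could consume the constant like this (just a sketch; dataService and the endpoint are made up):

angular
  .module('app')
  .factory('dataService', ['$http', 'BASE_API_URL', function ($http, BASE_API_URL) {
    return {
      // hypothetical call built on top of the injected base URL
      getUsers: function () {
        return $http.get(BASE_API_URL + 'users');
      }
    };
  }]);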
I hope this helps
Optionally, you can set the URL depending on your environment:
$provide.provider('BASE_API_URL', {
  $get: function () {
    if (window.location.hostname.toLowerCase() == 'myapp.myexample.net') {
      return 'https://myexample.net/api/'; // pre-production
    } else {
      return 'http://localhost:61132/'; // local
    }
  }
});
Regards!
Finally, the solution was generating an angular constants file using templating (gulp-template) through a gulp task. In the end, I am using a yaml file instead of a json one (it is the file generated by my CI engine with the proper values for the environment I want to deploy to).
Basically:
config.yml
baseUrl: 'http://www.myapp.es/'
baseApiUrl: 'http://api.myapp.es/'
auth:
  url: 'auth/'
config.module.constants.template
(function () {
  'use strict';

  angular
    .module('app.config')
    .constant('env_variables', {
      baseUrl: '<%=baseUrl%>',
      baseApiUrl: '<%=baseApiUrl%>',
      authUrl: '<%=auth.url%>'
    });
}());
gulpfile.js
gulp.task('splicing', function() {
  var yml = path.join(conf.paths.src, '../config/config.yml');
  var json = yaml.safeLoad(fs.readFileSync(yml, 'utf8'));
  var template = path.join(conf.paths.src, '../config/config.module.constants.template');
  var targetFile = path.join(conf.paths.src, '/app/config');
  return gulp.src(template)
    .pipe($.template(json))
    .pipe($.rename("config.module.constants.js"))
    .pipe(gulp.dest(targetFile)); // returning the stream signals completion, no callback needed
});
Then you just inject it in the config phase you need:
function config($authProvider, env_variables) {
  $authProvider.baseUrl = env_variables.baseApiUrl + env_variables.authUrl;
}
One more benefit of using gulp for this is that you can integrate the generation of these constants with your build, serve or watch tasks and, from then on, forget about making any manual changes. Hope it helps!
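For example, with gulp 3 style task dependencies the constants task could simply be listed as a prerequisite of your other tasks (a sketch; 'scripts', 'styles' and the watch glob are placeholders):

// 'splicing' runs first, so the generated config.module.constants.js is always up to date
gulp.task('build', ['splicing', 'scripts', 'styles'], function () {
  // ... rest of the build
});

gulp.task('serve', ['splicing'], function () {
  // regenerate the constants whenever the yaml config changes
  gulp.watch('config/config.yml', ['splicing']);
  // ... start the dev server
});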

How do I run a gulp task from two or more other tasks and pass the pipe through

This must be obvious but I can't find it. I want to preprocess my stylus/coffee files with a watcher in the dev environment and with a build task in production (isn't that common to all of us?), and also run a few extra minification and uglification steps in production, but I want to share the pipe steps common to both dev and production to stay DRY.
The problem is that when I run the watch task, the preprocessing task processes all the files, since it has its own gulp.src statement which includes every stylus file.
How do I avoid compiling all the files on every change while still keeping the compile task separate? Thanks.
paths = {
  jade: ['www/**/*.jade']
};

gulp.task('jade', function() {
  return gulp.src(paths.jade)
    .pipe(jade({
      pretty: true
    }))
    .pipe(gulp.dest('www/'))
    .pipe(browserSync.stream());
});

gulp.task('serve', ['jade', 'coffee'], function() {
  browserSync.init({
    server: './www'
  });
  watch(paths.jade, function() {
    return gulp.start(['jade']);
  });
  return gulp.watch('www/**/*.coffee', ['coffee']);
});
One important thing in Gulp is not to duplicate pipelines. If you want to process your stylus files, there has to be one and only one stylus pipe. If you want to execute different steps in your pipe, however, you have multiple choices. The one I would suggest is a noop() function in conjunction with a selection function:
var through = require('through2'); // Gulp's stream engine

/** creates an empty pipeline step **/
function noop() {
  return through.obj();
}

/** the isProd variable denotes whether we are in
    production mode. If so, we execute the task.
    If not, we pass it through an empty step
**/
function prod(task) {
  if (isProd) {
    return task;
  } else {
    return noop();
  }
}
gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(stylus())
    .pipe(prod(minifyCss())) // we just minify in production mode
    .pipe(gulp.dest(path.whatever));
});
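The snippet assumes an isProd flag defined elsewhere; one common way to set it (just a sketch) is from an environment variable or a command line flag parsed by gulp-util:

var gutil = require('gulp-util');

// true when run as `NODE_ENV=production gulp stylus` or `gulp stylus --production`
var isProd = process.env.NODE_ENV === 'production' || !!gutil.env.production;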
As for incremental builds (building just the changed files with every iteration), the best approach would be the gulp-cached plugin:
var cached = require('gulp-cached');

gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(cached('styles')) // we just pass through the files that have changed
    .pipe(stylus())
    .pipe(prod(minifyCss()))
    .pipe(gulp.dest(path.whatever));
});
This plugin checks on every run whether the contents of each file have changed and only passes on the ones that did.
I spend a whole chapter on Gulp for different environments in my book, and I found these to be the most suitable approaches. For more information on incremental builds, you can also check my article on the topic (it includes Gulp 4): http://fettblog.eu/gulp-4-incremental-builds/
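For reference, Gulp 4 has incremental selection built in: the since option together with gulp.lastRun() only sources files modified since the task last ran (a sketch of the Gulp 4 pattern, reusing the names from above):

// Gulp 4: only files changed since the last run of 'stylus' are sourced
gulp.task('stylus', function () {
  return gulp.src(path.styles, { since: gulp.lastRun('stylus') })
    .pipe(stylus())
    .pipe(prod(minifyCss()))
    .pipe(gulp.dest(path.whatever));
});

gulp.task('watch', function () {
  gulp.watch(path.styles, gulp.series('stylus'));
});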

Gulp: how to pass parameters from watch to tasks

With gulp you often see patterns like this:
gulp.watch('src/*.jade', ['templates']);

gulp.task('templates', function() {
  return gulp.src('src/*.jade')
    .pipe(jade({
      pretty: true
    }))
    .pipe(gulp.dest('dist/'))
    .pipe(livereload(server));
});
Does this actually pass the watch'ed files into the templates task? How do these overwrite/extend/filter the src'ed tasks?
I had the same question some time ago and came to the following conclusion after digging for a bit.
gulp.watch is an eventEmitter that emits a change event, and so you can do this:
var watcher = gulp.watch('src/*.jade',['templates']);
watcher.on('change', function(f) {
console.log('Change Event:', f);
});
and you'll see this:
Change Event: { type: 'changed',
path: '/Users/developer/Sites/stackoverflow/src/touch.jade' }
This information could presumably be passed to the template task either via its task function, or the behavior of gulp.src.
The task function itself can only receive a callback (https://github.com/gulpjs/gulp/blob/master/docs/API.md#fn) and cannot receive any information about vinyl files (https://github.com/wearefractal/vinyl-fs) that are used by gulp.
The source starting a task (.watch in this case, or gulp command line) has no effect on the behavior of gulp.src('src-glob', [options]). 'src-glob' is a string (or array of strings) and options (https://github.com/isaacs/node-glob#options) has nothing about any file changes.
Hence, I don't see any way in which .watch could directly affect the behavior of a task it triggers.
If you want to process only the changed files, you can use gulp-changed (https://www.npmjs.com/package/gulp-changed) together with gulp.watch, or you could use gulp-watch instead.
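A sketch of the gulp-changed approach, using the names from the question (it compares each source file against the destination and drops the ones that are already up to date):

var changed = require('gulp-changed');

gulp.task('templates', function() {
  return gulp.src('src/*.jade')
    .pipe(changed('dist/', { extension: '.html' })) // only files newer than their compiled output pass through
    .pipe(jade({ pretty: true }))
    .pipe(gulp.dest('dist/'));
});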
Alternatively, you could do this as well:
var gulp = require('gulp');
var jade = require('gulp-jade');
var livereload = require('gulp-livereload');

gulp.watch('src/*.jade', function(event) {
  template(event.path);
});

gulp.task('templates', function() {
  return template('src/*.jade');
});

function template(files) {
  return gulp.src(files)
    .pipe(jade({
      pretty: true
    }))
    .pipe(gulp.dest('dist/'));
}
One possible way to pass a parameter or some data from your watcher to a task is through a global variable, or a variable that is in scope for both blocks. Here is an example:
gulp.task('watch', function () {
  // ....
  // json comments
  watch('./app/tempGulp/json/**/*.json', function (evt) {
    jsonCommentWatchEvt = evt; // we set the global variable first
    gulp.start('jsonComment'); // then we start the task
  });
});

// global variable
var jsonCommentWatchEvt = null;

// json comments task
gulp.task('jsonComment', function () {
  jsonComment_Task(jsonCommentWatchEvt);
});
And here is the function that does the actual work, in case it interests anyone. I didn't strictly need to put the work in a separate function; I could have implemented it directly in the task and used the global variable there (here it's jsonCommentWatchEvt). If you don't use a separate function as I did, a good practice is to assign the value of the global variable to a local one at the very top of the task and use only that local from then on. That avoids the problem of the global being changed by another watch trigger while it is still in use by the currently running task.
function jsonComment_Task(evt) {
  console.log('handling : ' + evt.path);
  gulp.src(evt.path, { base: './app/tempGulp/json/' })
    .pipe(stripJsonComments({ whitespace: false })).on('error', console.log)
    .on('data', function (file) { // here we want to manipulate the resulting stream
      var str = file.contents.toString();
      var stream = source(path.basename(file.path));
      stream.end(str.replace(/\n\s*\n/g, '\n\n'));
      stream
        .pipe(gulp.dest('./app/json/')).on('error', console.log);
    });
}
I had a directory of json files in which I use comments. I'm watching them; when a file is modified, the watch handler is triggered and I need to process only the file that was modified. To remove the comments I used the json-comment-strip plugin, plus I needed a bit more processing: removing multiple successive line breaks. In any case, I first needed the path of the changed file, which can be recovered from the event parameter, and I passed it to the task through a global variable that does only that: allow passing the data.
Note: even though it has no relation to the question, in my example I needed to process the stream coming out of the plugin, so I used the on('data') event. It is asynchronous, so the task will be marked as finished before the work has completely ended (the task body reaches its end, but the asynchronous handler keeps processing a little longer). So the time you see in the console at the end of the task isn't the time for the whole processing, just for the task block. Just so you know; for me it doesn't matter.
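If you do want gulp to wait for the whole thing, a possible variant (just a sketch, assuming through2 and the same comment-stripping plugin as above) is to do the string manipulation inside the pipe and return the stream from the task:

var through = require('through2');

function jsonComment_Task(evt) {
  return gulp.src(evt.path, { base: './app/tempGulp/json/' })
    .pipe(stripJsonComments({ whitespace: false }))
    .pipe(through.obj(function (file, enc, cb) {
      // collapse multiple successive blank lines in place
      var str = file.contents.toString();
      file.contents = Buffer.from(str.replace(/\n\s*\n/g, '\n\n'));
      cb(null, file);
    }))
    .pipe(gulp.dest('./app/json/'));
}

gulp.task('jsonComment', function () {
  return jsonComment_Task(jsonCommentWatchEvt); // returning the stream signals completion
});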

How to pass a parameter to gulp-watch invoked task

I am trying to pass a parameter to a task that is being invoked by gulp-watch. I need it because I am trying to build a modular framework.
So if a file changes in module 1, the other modules don't need to be rebuilt.
And I want just one function to create the concatted & uglified files per module.
This is what I got so far:
// here I need the 'module' parameter
gulp.task('script', function(module) { ... });

gulp.task('watch', function() {
  gulp.watch('files/in/module1/*.js', ['script']); // here I want to pass module1
  gulp.watch('files/in/module2/*.js', ['script']); // here I want to pass module2
});
A lot of the documentation/examples seems to be outdated (gulp.run(), gulp.start()).
I hope someone can help me out here.
I had the very same issue, searched for a while, and the "cleanest" way I came up with uses the .on() event handler of gulp.watch() and the .env property of gulp-util:
var gulp = require('gulp');
var $ = { util: require('gulp-util') };

var modules = {
  module1: {}, // awesome module1
  module2: {}  // awesome module2
};

gulp.task('script', function() {
  var moduleName = $.util.env.module;
  // Exit if the value is missing...
  var module = modules[moduleName];
  if (!module) {
    $.util.log($.util.colors.red('Error'), "Wrong module value!");
    return;
  }
  $.util.log("Executing task on module '" + moduleName + "'");
  // Do your task on "module" here.
});

gulp.task('watch', function () {
  gulp.watch(['files/in/module1/*.js'], ['script']).on('change', function () {
    $.util.env.module = 'module1';
  });
  gulp.watch(['files/in/module2/*.js'], ['script']).on('change', function () {
    $.util.env.module = 'module2';
  });
});
gulp-util also comes in handy if you need to pass (global) parameters from the shell:
[emiliano#dev ~]# gulp script --module=module1 --minify
Hope this helps someone else out there!
Regards.
Here I will answer the question directly: "How to pass a parameter to a gulp-watch invoked task".
My way of doing it, and one of the possibilities I see, is to use a global variable to pass the value between the two blocks: you set it in the watcher just before launching the task, and in the task, right at the start, you copy it into a local variable.
See this answer for more details: https://stackoverflow.com/a/49733123/7668448
For what you want to achieve, you can also use just one watcher over the directory that holds all the modules, if that is the structure. When a change happens you can recover the changed file's path, and from that deduce which module it belongs to by getting the module folder. That way you don't need to add a new watcher for each new module, which is nice when there are multiple contributors to the project, for example when working on open source: you set it up once and don't have to care about adding anything, much like the delegation principle in DOM event handling when there are multiple elements. Even if the chosen structure doesn't keep all the modules in one directory, you can still pass multiple globs to the one watcher.
gulp.watch(['glob1/**/*.js', 'glob2/**/*.js',...], function(evt) {/*.....*/});
Following the structure you have, you can then work your way back to which module the file belongs to.
For the watcher, here is how I suggest you do it:
watch('./your/allModulesFolder/**/*.js', function (evt) {
  rebuildModulWatchEvt = evt; // here you update the global var
  gulp.start('rebuildModul'); // you start the task
});
The evt here holds several pieces of information: cwd, base, state, _contents, etc. What interests us is path: evt.path gives you the path of the changed file.
In your task you can either do this:
gulp.task('rebuildModul', function() {
  let evt = rebuildModulWatchEvt; // at the very start, copy the global into a local var
  let filePath = evt.path; // the changed file's path
  // the rest of your code goes here; following your structure, get the path of the module folder
});
or you use a function:
gulp.task('rebuildModul', function() {
  rebuildModulTaskRun(rebuildModulWatchEvt);
});

function rebuildModulTaskRun(evt) {
  let filePath = evt.path;
  // the rest of your code goes here; following your structure, get the path of the module folder
}
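A sketch of how the module folder could be deduced from the changed file's path, assuming every module lives directly under a hypothetical allModulesFolder:

var path = require('path');

var modulesRoot = './your/allModulesFolder/';

function moduleNameFromPath(filePath) {
  // path of the changed file relative to the modules root, e.g. 'module1/sub/file.js'
  var relative = path.relative(path.resolve(modulesRoot), filePath);
  // the first segment is the module folder name, e.g. 'module1'
  return relative.split(path.sep)[0];
}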

How to start Gulp task with params?

I need to apply a build task to specific files. To find them I use the typical glob pattern, but I can't understand how to pass the arguments (the file path) from gulp.src.
Desired solution:
gulp.task('bundles', function() {
  gulp.src('bundles/**/*.js')
    .pipe(gulp.start('build', file.path));
});

gulp.task('build', function (path) {
  // use here
});
The question is a bit stale and I am not sure I totally understand what you're trying to achieve here, but I think what you're looking for is lazypipe.
You might want to clarify your question if that's not what you're looking for.
Example Usage:
var lazypipe = require('lazypipe'),
    g = require('gulp-load-plugins')({lazy: true}),

    jsTransformPipe = lazypipe()
      .pipe(g.jshint) // <-- Notice the notation: g.jshint, not g.jshint()
      .pipe(g.concat, 'bundle.js'), // <-- Notice how the param is passed to g.concat, as a second param to .pipe()

    jsSourcePipe = lazypipe()
      .pipe(gulp.src, './**/*.js');

gulp.task('bundle', function() {
  return jsSourcePipe()
    .pipe(jsTransformPipe()) // <-- You execute the lazypipe by calling it as a function
    .pipe(gulp.dest('../build/'));
});
With lazypipe you basically create a pipe for future use; hope this helps.
(Can't comment because of rep, sorry)
I assume that your sample code doesn't show everything, but why don't you merge those tasks and use your gulp.src() in your build task instead of calling another task?
Maybe there's a reason, but from what you're showing I can't see why you do this instead of simply going with something like:
gulp.task('build', function (path) {
  gulp.src('bundles/**/*.js')
  // Your code for this task
});
Of course, it removes the bundles task, but it's not useful as is.
Don't hesitate to comment if I'm wrong and I'll try to help you as much as I can.
First off, gulp.task('build', function (path) won't ever work. The only valid argument for a gulp task's function is a callback to signal asynchronous task completion. If you tried to run the above, gulp would treat path as that callback and the task would never complete unless it was called. In this example, the 'build' step should be a regular function called from the 'bundles' pipe, not a task.
The better question would be: How do I run a custom function inside a gulp pipe? Plugins like gulp-tap might get you close, but it's not difficult to create what is essentially an inline gulp plugin to call your function.
Gulp pipes carry a through2 object stream of vinyl file objects; each transform function in the pipe receives a file, an encoding and a callback. Here's a basic skeleton for calling any arbitrary function against the files in a gulp pipe:
var gulp = require('gulp');
var through = require('through2');

gulp.task('stack', function() {
  return gulp.src('./src/*.js')
    .pipe(through.obj(function(file, enc, cb) {
      // file.path is the full path to the file
      myBuildFunction(file.path);
      cb(null, file);
    }))
    .pipe(gulp.dest('./build/'));
});
This can be incredibly powerful. To modify the file's contents, just change the file.contents buffer. To rename or relocate the file, change file.path. Everything can be done in gulp's native pipes.
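For example, a minimal sketch of modifying the contents in place inside such a step (the banner text is made up for illustration):

var gulp = require('gulp');
var through = require('through2');

gulp.task('banner', function() {
  return gulp.src('./src/*.js')
    .pipe(through.obj(function(file, enc, cb) {
      // prepend a comment banner by replacing the contents buffer
      var banner = '/* built ' + new Date().toISOString() + ' */\n';
      file.contents = Buffer.concat([Buffer.from(banner), file.contents]);
      cb(null, file);
    }))
    .pipe(gulp.dest('./build/'));
});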