I am trying to use gulp as an installer for a complex system that involves creating folders, copying files around and running compilation scripts.
Presently I have the following gulp tasks:
// Some tasks skipped that set sessionFolder
gulp.task('default', function () {
// Main
runSequence('prepare_comedi', 'compile_comedi');
});
gulp.task('prepare_comedi', function () {
// Copies comedi files into build folder
gulp.src(['../comedi/**/*']).pipe(gulp.dest(sessionFolder));
});
gulp.task('compile_comedi', function () {
var logfile=this.currentTask.name+'.log';
gutil.log(gutil.colors.green(this.currentTask.name), ": building and installing COMEDI, logging to "+logfile);
var cmd= new run.Command('{ ./autogen.sh; ./configure; make; make install; depmod -a ; make dev;} > ../'+logfile+ ' 2>&1', {cwd:sessionFolder+'/comedi', verbosity:3});
cmd.exec();
});
When I run gulp, it becomes obvious that the processes start in the background and the gulp tasks finish immediately. The first task above should copy the source files, and the second one compile them. In practice, the second task hits an error, because the first task has not finished copying when the second task starts (almost immediately).
If I run the second task alone, having previously copied all the files from the first task, it works OK, but I get output like this:
[19:52:47] Starting 'compile_comedi'...
[19:52:47] compile_comedi : building and installing COMEDI, logging to compile_comedi.log
$ { ./autogen.sh; ./configure; make; make install; depmod -a ; make dev;} > ../compile_comedi.log 2>&1
[19:52:47] Finished 'compile_comedi' after 6.68 ms
So it takes 6.68 milliseconds to leave the task, while I want gulp to leave it only after all the compilations specified in the task are finished. I would then run another compile process that uses the binaries built in this step as a dependency.
How can I run external commands in such a way that the next gulp task starts only after the first task completes execution of its external process?
You should make sure that the task prepare_comedi has finished before compile_comedi starts. To do so, since you're using regular streams in the prepare task, simply return the stream:
gulp.task('prepare_comedi', function () {
// !!! returns the stream. Gulp will not consider the task as done
// until the stream ends.
return gulp.src(['../comedi/**/*']).pipe(gulp.dest(sessionFolder));
});
Since these tasks are interdependent and require a certain order, you might also want to consider refactoring your code to actually create two plain functions and call them normally. Take a look at this note.
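Purely as an illustration of that refactoring idea (the function and task names below are made up, not from the question), you could keep the copy and the compile steps as plain functions and sequence them explicitly inside one task:
// Sketch only: plain functions instead of separate interdependent tasks.
function prepareComedi() {
  // Returning the stream lets callers wait for it to finish.
  return gulp.src(['../comedi/**/*']).pipe(gulp.dest(sessionFolder));
}

function compileComedi(done) {
  // ... run the build here, then call done() ...
  done();
}

gulp.task('build_comedi', function (done) {
  prepareComedi().on('end', function () {
    // Start compiling only once copying has finished.
    compileComedi(done);
  });
});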
Update
Addressing your question in the comment below, if you want to hold a task until some asynchronous job has been completed, you have pretty much three choices:
Returning a stream (the case above)
returning a promise and fulfilling it when you're done (using Q in this example):
var Q = require('q');
gulp.task('asyncWithPromise', function() {
var deferred = Q.defer();
// anything asynchronous
setTimeout(function() {
deferred.resolve('nice');
}, 5000);
return deferred.promise;
});
Receiving a callback function and calling it when you're done:
gulp.task('asyncWithCallback', function(done) {
setTimeout(function() {
done();
}, 5000);
});
These approaches are in the docs.
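Applied to the compile_comedi task from the question, the callback approach could look roughly like this (a sketch using Node's built-in child_process.exec instead of gulp-run; the command and working directory are taken from the question):
var exec = require('child_process').exec;

gulp.task('compile_comedi', function (done) {
  var logfile = 'compile_comedi.log';
  exec(
    '{ ./autogen.sh; ./configure; make; make install; depmod -a; make dev; } > ../' + logfile + ' 2>&1',
    { cwd: sessionFolder + '/comedi' },
    function (err) {
      // Calling the callback tells gulp the task is finished;
      // passing an error fails the task.
      done(err);
    }
  );
});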
Related
Basic question, but I just cannot find the answer yet.
var gulp = require('gulp');
gulp.task('one', function(cb) {
// do stuff -- async or otherwise
cb(err);
});
gulp.task('two', function(cb) {
// do something
cb(err)
});
gulp.task('three', function(cb) {
// do something
cb(err)
});
The question is: does task two only run when task one finishes, and does task three only run when task two finishes?
With just this setup, only the task you invoke will be executed, e.g. task two if you run gulp two.
You can create composite tasks with the help of the functions series(...) and parallel(...) provided by Gulp. The new task will run the tasks passed to the function either in sequence or in parallel. Calls to the functions can be nested to create more complex scenarios.
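For instance, with Gulp 4 you could wire the three tasks from the question like this (the composite task names here are made up):
// runs one, then two, then three, each waiting for the previous one to finish
gulp.task('build', gulp.series('one', 'two', 'three'));

// runs one first, then two and three at the same time
gulp.task('build-mixed', gulp.series('one', gulp.parallel('two', 'three')));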
I use gulp-notify to trigger notifications when tasks complete. If a task is run standalone, a notification for that specific task is triggered. If a task is run as a dependency of another task, a notification for all dependencies is triggered.
In gulp#3, I check if the task is being called as a dependency using gulp.seq, which contains an array of the tasks being run. Let's say I have three tasks: default, styles, and scripts, with the latter two set as dependencies of the first. When running gulp styles, gulp.seq will contain [ 'styles' ]. When running gulp (the default task), gulp.seq will contain [ 'styles', 'scripts', 'default' ]. Knowing that, I then check gulp.seq.indexOf("styles") > gulp.seq.indexOf("default"), which tells me whether or not styles was run as part of the default task.
With gulp#4, it appears that gulp.seq no longer exists. I've tried digging through the documentation and source code with no luck. It seems like gulp.tree({ deep:true }) (docs) might be what I'm looking for, but I don't see anything in it that returns anything useful.
Is there an equivalent of gulp.seq in gulp#4?
gulp.seq was never an official API exposed by Gulp. With Gulp 4, you cannot do that. gulp.tree({ /* */ }) will not solve this problem for you.
Having said that, if you still need to find out whether a task has run as part of some other task's pipeline, you will have to decorate every gulp task with your own wrapper, something like this:
let runTasks = [];
function taskWrapper(taskName, tasks, thisTask) {
let callbackTask;
function innerCallback(cb) {
runTasks.push(taskName);
cb();
}
if (thisTask) {
callbackTask = function(cb) {
thisTask(function () {
innerCallback(cb);
});
}
} else {
callbackTask = innerCallback;
}
const newTasks = [ ...tasks, callbackTask ];
gulp.task(taskName, gulp.series(newTasks));
}
// INSTEAD OF THIS
// gulp.task('default', gulp.series('style', 'script', function () { }));
// DO THIS
taskWrapper('default', ['style', 'script'], function(cb) {
console.log('default task starting');
cb();
});
NOTE: The above code snippet has limitations. If you use watch mode, the array maintaining the executed tasks, i.e. runTasks, will keep growing. Also, it assumes tasks always run in series; for parallel mode, the logic gets a little more complicated.
Finally, you can also have a predefault task to help it further:
taskWrapper('predefault', [], function(cb) {
// RESET runTasks
runTasks = [];
cb();
});
taskWrapper('default', ['predefault', 'style', 'script'], function(cb) {
console.log('default task starting');
cb();
});
Also, I am doubtful if gulp-notify will work with Gulp 4.
Through a bit of luck, I discovered this was possible via the module yargs, which I already have installed.
When running gulp styles, for example, I can check argv._.indexOf("styles") > -1, as it contains ['styles']. When running gulp (i.e. the default task), it contains []. In my testing, this works perfectly for my use case.
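For illustration, the check itself is just this (assuming yargs is already installed):
var argv = require('yargs').argv;

// argv._ holds the task names typed on the command line:
//   `gulp styles` -> ['styles']
//   `gulp`        -> []
var stylesRanStandalone = argv._.indexOf('styles') > -1;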
I am new to gulp.
I have written two tasks that need to be performed. When I run them separately, they work fine. But when I combine them, the "replace" does not work.
gulp.task('bundle-source', function () {
return bundler.bundle(config);
});
gulp.task('bundle-config', function(){
return gulp.src(['config.js'])
.pipe(replace('src/*', 'dist/*'))
.pipe(gulp.dest(''));
});
gulp.task('bundle', ['bundle-config', 'bundle-source']);
I think the issue is that they both manipulate config.js, and the second task, when it saves to disk, overwrites the change the first one made. The second task takes about 30 seconds.
Gulp tasks run in parallel by default, so if your tasks are working on the same files, they might indeed step on each other's toes.
You can use gulp's task dependencies to have them run one after the other. So if bundle-config should run before bundle-source:
gulp.task('bundle-source', ['bundle-config'], function () {
return bundler.bundle(config);
});
You can also use a package like run-sequence if you need them to run one after the other :
var seq = require('run-sequence');
gulp.task('bundle', function(cb) {
return seq('bundle-config', 'bundle-source', cb);
});
Finally, you could use gulp 4, which has a built-in mechanism to run tasks in series.
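For completeness, the gulp 4 version is a one-liner (a sketch, assuming the two tasks keep their names):
gulp.task('bundle', gulp.series('bundle-config', 'bundle-source'));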
This must be obvious, but I can't find it. I want to preprocess my stylus/coffee files with a watcher in the dev environment and with a build task in production (isn't that common to all of us?), and also run a few more minification and uglification steps in production, but I want to share the pipe steps common to both dev and production to stay DRY.
The problem is that when I run the task which watches the files, the preprocessing task processes all the files, since it has its own gulp.src statement which includes all the stylus files.
How do I avoid compiling all files on watching while still keeping the compile task separate? Thanks.
paths = {
jade: ['www/**/*.jade']
};
gulp.task('jade', function() {
return gulp.src(paths.jade).pipe(jade({
pretty: true
})).pipe(gulp.dest('www/')).pipe(browserSync.stream());
});
gulp.task('serve', ['jade', 'coffee'], function() {
browserSync.init({
server: './www'
});
watch(paths.jade, function() {
return gulp.start(['jade']);
});
return gulp.watch('www/**/*.coffee', ['coffee']);
});
One important thing in Gulp is not to duplicate pipelines. If you want to process your stylus files, there has to be one and only one stylus pipe. If you want to execute different steps in the pipe, however, you have multiple choices. One that I would suggest is a noop() function in conjunction with a selection function:
var through = require('through2'); // Gulp's stream engine
/** creates an empty pipeline step **/
function noop() {
return through.obj();
}
/** the isProd variable denotes if we are in
production mode. If so, we execute the task.
If not, we pass it through an empty step
**/
function prod(task) {
if(isProd) {
return task;
} else {
return noop();
}
}
gulp.task('stylus', function() {
return gulp.src(path.styles)
.pipe(stylus())
.pipe(prod(minifyCss())) // We just minify in production mode
.pipe(gulp.dest(path.whatever))
})
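The isProd flag is never defined in the snippet above; one common way to set it (an assumption, not part of the original answer) is from an environment variable or a CLI flag:
// e.g. run as `NODE_ENV=production gulp stylus`
var isProd = process.env.NODE_ENV === 'production';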
As for incremental builds (building just the changed files with every iteration), the best way is to use the gulp-cached plugin:
var cached = require('gulp-cached');
gulp.task('stylus', function() {
return gulp.src(path.styles)
.pipe(cached('styles')) // we just pass through the files that have changed
.pipe(stylus())
.pipe(prod(minifyCss()))
.pipe(gulp.dest(path.whatever))
})
This plugin checks with each iteration whether a file's contents have changed, and only passes on the files that did change.
I spent a whole chapter on Gulp for different environments in my book, and I found these to be the most suitable approaches. For more information on incremental builds, you can also check my article on that topic (includes Gulp 4): http://fettblog.eu/gulp-4-incremental-builds/
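Hooked up to a watcher, this is what keeps the single shared pipeline from recompiling everything on every change (a sketch in the same gulp 3 style as the question; the watch task name is made up):
gulp.task('watch', ['stylus'], function() {
  // Re-runs the one stylus pipeline; gulp-cached lets only the files
  // whose contents actually changed continue past the cached() step.
  gulp.watch(path.styles, ['stylus']);
});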
With gulp you often see patterns like this:
gulp.watch('src/*.jade',['templates']);
gulp.task('templates', function() {
return gulp.src('src/*.jade')
.pipe(jade({
pretty: true
}))
.pipe(gulp.dest('dist/'))
.pipe( livereload( server ));
});
Does this actually pass the watch'ed files into the templates task? How do these overwrite/extend/filter the src'ed tasks?
I had the same question some time ago and came to the following conclusion after digging for a bit.
gulp.watch is an eventEmitter that emits a change event, and so you can do this:
var watcher = gulp.watch('src/*.jade',['templates']);
watcher.on('change', function(f) {
console.log('Change Event:', f);
});
and you'll see this:
Change Event: { type: 'changed',
path: '/Users/developer/Sites/stackoverflow/src/touch.jade' }
This information could presumably be passed to the template task either via its task function, or the behavior of gulp.src.
The task function itself can only receive a callback (https://github.com/gulpjs/gulp/blob/master/docs/API.md#fn) and cannot receive any information about vinyl files (https://github.com/wearefractal/vinyl-fs) that are used by gulp.
The source starting a task (.watch in this case, or gulp command line) has no effect on the behavior of gulp.src('src-glob', [options]). 'src-glob' is a string (or array of strings) and options (https://github.com/isaacs/node-glob#options) has nothing about any file changes.
Hence, I don't see any way in which .watch could directly affect the behavior of a task it triggers.
If you want to process only the changed files, you can use gulp-changed (https://www.npmjs.com/package/gulp-changed) if you want to stick with gulp.watch, or you could use gulp-watch instead.
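A minimal gulp-changed sketch (it compares each source file against the destination and only passes through the ones that are newer):
var changed = require('gulp-changed');
var jade = require('gulp-jade');

gulp.task('templates', function() {
  return gulp.src('src/*.jade')
    .pipe(changed('dist/')) // drops files that are already up to date in dist/
    .pipe(jade({ pretty: true }))
    .pipe(gulp.dest('dist/'));
});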
Alternatively, you could do this as well:
var gulp = require('gulp');
var jade = require('gulp-jade');
var livereload = require('gulp-livereload');
gulp.watch('src/*.jade', function(event){
template(event.path);
});
gulp.task('templates', function() {
return template('src/*.jade');
});
function template(files) {
return gulp.src(files)
.pipe(jade({
pretty: true
}))
.pipe(gulp.dest('dist/'))
}
One possible way to pass a parameter or some data from your watcher to a task is to use a global variable, or a variable that is in scope for both blocks. Here is an example:
gulp.task('watch', function () {
//....
//json comments
watch('./app/tempGulp/json/**/*.json', function (evt) {
jsonCommentWatchEvt = evt; // we set the global variable first
gulp.start('jsonComment'); // then we start the task
})
})
//global variable
var jsonCommentWatchEvt = null
//json comments task
gulp.task('jsonComment', function () {
jsonComment_Task(jsonCommentWatchEvt)
})
And here is the function doing the task's work, in case it interests anyone. Strictly speaking I didn't need to put the work in a separate function; I could have implemented it directly in the task. The file path comes from the global variable, here jsonCommentWatchEvt. If you don't use a separate function as I did, a good practice is to assign the value of the global variable to a local one at the very top of the task and use only the local one; that avoids the problem of the global being changed by another watch trigger while the current task is still running.
function jsonComment_Task(evt) {
console.log('handling : ' + evt.path);
gulp.src(evt.path, {
base: './app/tempGulp/json/'
}).
pipe(stripJsonComments({whitespace: false})).on('error', console.log).
on('data', function (file) { // here we want to manipulate the resulting stream
var str = file.contents.toString()
var stream = source(path.basename(file.path))
stream.end(str.replace(/\n\s*\n/g, '\n\n'))
stream.
pipe(gulp.dest('./app/json/')).on('error', console.log)
})
}
I had a directory of JSON files in which I use comments. I'm watching them; when a file is modified, the watch handler is triggered, and I then need to process only the file that was modified. To remove the comments I used the json-comment-strip plugin, plus one more treatment to remove multiple successive line breaks. In any case, first I needed the path of the modified file, which can be recovered from the event parameter; I passed it to the task through a global variable that does only that: allow passing the data.
Note: even though this doesn't relate directly to the question, in my example I needed to process the stream coming out of the plugin, so I used the on('data') event. It's asynchronous, so the task is marked as finished before the work has completely ended (the task body reaches its end, but the asynchronous handler it launched keeps processing a little longer). So the time you see in the console at task end isn't the time for the whole processing, just for the task block. Just so you know; for me it doesn't matter.