I'm a fan of the files object format:
files: {
    'dest/a.js': ['src/aa.js', 'src/aaa.js'],   // key: value
    'dest/a1.js': ['src/aa1.js', 'src/aaa1.js'],
}
I have a gulp task that concatenates source files, like this:
gulp.task('cat', function() {
    gulp.src( <value-goes-here> )
        .
        <many pipeline steps>
        .
        .pipe(concat(<key-goes-here>))
        .pipe(gulp.dest('target/'))
        .
        <more pipeline steps to be run on 'dest/a.js' and 'dest/a1.js'>
        .
});
Is there a streaming way to extend this task so that I get one bundle file for each key-value pair in files?
I would like NOT to create one task per key-value pair, because I want to keep piping more steps even after the last .pipe(gulp.dest('target/')).
If I'm approaching this problem the wrong way, is there a better way?
Thank you in advance!
Rob Rich's answer works. Here's a working version:
var Q = require('q');
var gulp = require('gulp');
var concat = require('gulp-concat');

var files = {
    'a.js': ['src/aa.js', 'src/aaa.js'],
    'a1.js': ['src/aa1.js', 'src/aaa1.js'],
};

gulp.task('cat', function() {
    var promises = Object.keys(files).map(function (key) {
        var deferred = Q.defer();
        var val = files[key];
        console.log(val);
        gulp.src(val)
            .pipe(concat(key))
            .pipe(gulp.dest('dest/'))
            .on('end', function () {
                deferred.resolve();
            });
        return deferred.promise;
    });
    return Q.all(promises);
});
Try this:
var Q = require('q');

gulp.task('cat', function() {
    var promises = Object.keys(files).map(function (key) {
        var deferred = Q.defer();
        var val = files[key];
        gulp.src(val)
            .
            <many pipeline steps>
            .
            .pipe(concat(key))
            .pipe(gulp.dest('target/'))
            .
            <more pipeline steps to be run on 'dest/a.js' and 'dest/a1.js'>
            .
            .on('end', function () {
                deferred.resolve();
            });
        return deferred.promise;
    });
    return Q.all(promises);
});
You can also accomplish a similar result with streams instead of promises, using the combined-stream or stream-combiner packages.
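For example, here is a minimal sketch of the stream-based variant. It uses the merge-stream package (a related package substituted here; the combined-stream / stream-combiner APIs differ slightly) and assumes the same files object as above:
var gulp = require('gulp');
var concat = require('gulp-concat');
var merge = require('merge-stream'); // assumption: merge-stream instead of the packages named above

var files = {
    'a.js': ['src/aa.js', 'src/aaa.js'],
    'a1.js': ['src/aa1.js', 'src/aaa1.js'],
};

gulp.task('cat', function () {
    // One concat pipeline per key, merged into a single stream so gulp
    // can track completion without promises.
    var merged = merge();
    Object.keys(files).forEach(function (key) {
        merged.add(gulp.src(files[key])
            .pipe(concat(key))
            .pipe(gulp.dest('dest/')));
    });
    return merged;
});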
I am kind of lacking imagination on this one.
My goal is to retrieve a JSON object so I can run a string replace on all the files I want to translate. I have looked into a lot of translation libraries, but this is the best approach I can think of for my use case.
Anyway, my issue is this: once I have my JSON object, I have to run the replacement on all the files and, when that is done, finish the task 'trad'.
I have done some research and tried a lot of things, but there is something I am missing, something I haven't understood about the right way to do this.
Please help!
gulp.task('trad', gulp.series( 'createTradFile', 'copyBeforeTrad', function( done ) {
    var data = require('gulp-data');
    var path = require('path');
    var fs = require('fs');
    var replace2 = require('gulp-string-replace');
    var chmod = require('gulp-chmod');
    var transObj = null;

    var translateAll = function()
    {
        var files = gulp.src(['fr/**/*.html', 'fr/**/*.js']);
        for (var k in transObj)
        {
            if (transObj[k].ID)
            {
                console.log("TRAD " + transObj[k].ID + " TO " + transObj[k].LANG1);
                files.pipe(replace2(new RegExp('\\+' + transObj[k].ID + '\\+', 'g'),
                        transObj[k].LANG1,
                        {'logs': {'enabled': true}}))
                    .pipe(chmod(755));
            }
        }
        files.pipe(gulp.dest("fr"))
            .on('end', done);
    };

    gulp.src('distTemp/wording.json')
        .pipe(data(function(file) {
            transObj = JSON.parse( fs.readFileSync('distTemp/' + path.basename(file.path)));
            console.log("TRAD first part OK");
            translateAll();
        }));
}));
So this code translates like I want it to, but the task does not end:
[16:38:34] The following tasks did not complete: trad, <anonymous>
[16:38:34] Did you forget to signal async completion?
So, after a bit of research I found this (almost crappy) solution, which does the trick (please answer if you have a better solution):
var transObj = null;

gulp.task("retrieveTradObject", function(){
    var data = require('gulp-data');
    var path = require('path');
    var fs = require('fs');
    return gulp.src('distTemp/wording.json')
        .pipe(data(function(file) {
            transObj = JSON.parse( fs.readFileSync('distTemp/' + path.basename(file.path)));
            console.log("TRAD first part OK");
        }));
});

gulp.task('trad', gulp.series( 'createTradFile', 'copyBeforeTrad', 'retrieveTradObject', function( done ) {
    var replace2 = require('gulp-string-replace');
    var chmod = require('gulp-chmod');
    var files = gulp.src(['fr/**/*.html', 'fr/**/*.js']);
    for (var k in transObj)
    {
        if (transObj[k].ID)
        {
            console.log("TRAD " + transObj[k].ID + " TO " + transObj[k].LANG1);
            files = files.pipe(replace2(new RegExp('\\+' + transObj[k].ID + '\\+', 'g'),
                    transObj[k].LANG1,
                    {'logs': {'enabled': true}}))
                .pipe(chmod(755));
        }
    }
    return files.pipe(gulp.dest("fr"));
}));
So the main idea here was to separate the two steps into their own tasks (mainly so the code is easier to understand later) and then to do files = files.pipe( ... ), which is explained here: How to create repeating pipe in gulp?
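For reference, the same reassignment idea can also be written as a reduce over the keys. This is just a sketch, reusing the transObj, replace2 and chmod names from the task above, inside the 'trad' task body:
// same effect as repeatedly doing files = files.pipe(...)
var translated = Object.keys(transObj)
    .filter(function (k) { return transObj[k].ID; })
    .reduce(function (stream, k) {
        return stream
            .pipe(replace2(new RegExp('\\+' + transObj[k].ID + '\\+', 'g'),
                transObj[k].LANG1,
                {'logs': {'enabled': true}}))
            .pipe(chmod(755));
    }, gulp.src(['fr/**/*.html', 'fr/**/*.js']));

// returning the final stream lets gulp know when the task is done
return translated.pipe(gulp.dest('fr'));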
Hope this can help!
I'm not sure I understand the question 100%, so I'll take the downvotes, but are you talking about something like run-sequence?
You can do all sorts of task sequencing like this:
var gulp = require('gulp');
//webp images for optimization on some browsers
const webp = require('gulp-webp');
//responsive images!
var responsive = require('gulp-responsive-images');
//gulp delete for cleaning
var del = require('del');
//run sequence to make sure each gulp command completes in the right order.
var runSequence = require('run-sequence');

// =======================================================================//
// !                    Default and bulk tasks                            //
// =======================================================================//
//default runs when the user types 'gulp' into the CLI.
//first clean is run, then webp, then the rest run async.
//If you want something run after, you can add something like 'example'.
gulp.task('default', function(callback){
    runSequence('clean', 'webp', ['responsive-jpg', 'responsive-webp', 'copy-data', 'copy-sw'], 'example', callback);
});

// =======================================================================//
//                         Images and fonts                               //
// =======================================================================//
gulp.task('responsive-jpg', function(){
    return gulp.src('src/images/*')
        .pipe(responsive({
            '*.jpg': [
                {width: 1600, suffix: '_large_1x', quality: 40},
                {width: 800, suffix: '_medium_1x', quality: 70},
                {width: 550, suffix: '_small_1x', quality: 100}
            ]
        }))
        .pipe(gulp.dest('build/images'));
});

gulp.task('responsive-webp', function(){
    return gulp.src('src/images/*')
        .pipe(responsive({
            '*.webp': [
                {width: 1600, suffix: '_large_1x', quality: 40},
                {width: 800, suffix: '_medium_1x', quality: 70},
                {width: 550, suffix: '_small_1x', quality: 80}
            ]
        }))
        .pipe(gulp.dest('build/images'));
});

gulp.task('webp', () =>
    gulp.src('src/images/*.jpg')
        .pipe(webp())
        .pipe(gulp.dest('src/images'))
);

gulp.task('copy-data', function () {
    return gulp.src('./src/data/*.json')
        .pipe(gulp.dest('./build/data'));
});

gulp.task('copy-sw', function () {
    return gulp.src('./src/sw.js')
        .pipe(gulp.dest('./build/'));
});
In my example here, I clear out old files, then convert any images that need to be converted to webp, then run the tasks that can run together in parallel. You can do this in any arrangement you need. You could even create a gulp task that points to two run-sequence-based tasks to double down on the effectiveness.
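For completeness, the 'clean' and 'example' tasks the sequence refers to could look roughly like this (only the task names come from the snippet above; the bodies are assumptions):
//clean: wipe the previous build output before anything else runs
gulp.task('clean', function () {
    // del returns a promise, which gulp/run-sequence will wait for
    return del(['build/*']);
});

//example: stand-in for whatever should run last in the sequence
gulp.task('example', function (callback) {
    console.log('post-build step goes here');
    callback();
});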
At first I thought this was related to task dependencies, so I went with run-sequence and even tried defining dependencies within the tasks themselves. But I cannot get the compress task to run after copy. Or, even if it says it finished the compress task, the compression only works if I run compress by itself in the task runner inside Visual Studio. What else can I try to get it to compress after copy?
/// <binding BeforeBuild='default' />
/*
This file is the main entry point for defining Gulp tasks and using Gulp plugins.
Click here to learn more. https://go.microsoft.com/fwlink/?LinkId=518007
*/
var gulp = require("gulp");
var debug = require("gulp-debug");
var del = require("del");
var uglify = require("gulp-uglify");
var pump = require("pump");
var runSequence = require("run-sequence");

var paths = {
    bower: "./bower_components/",
    lib: "./Lib/"
};

var modules = {
    "store-js": ["store-js/dist/store.legacy.js"],
    "bootstrap-select": [
        "bootstrap-select/dist/css/bootstrap-select.css",
        "bootstrap-select/dist/js/bootstrap-select.js",
        "bootstrap-select/dist/js/i18n/*.min.js"
    ]
}

gulp.task("default", function (cb) {
    runSequence("clean", ["copy", "compress"], cb);
});

gulp.task("clean", function () {
    return del.sync(["Lib/**", "!Lib", "!Lib/ReadMe.md"]);
});

gulp.task("compress", function (cb) {
    pump([
        gulp.src(paths.lib + "**/*.js"),
        uglify(),
        gulp.dest(paths.lib)
    ], cb);
});

gulp.task("copy", function (cb) {
    prefixPathToModules();
    copyModules();
    cb();
});

function prefixPathToModules() {
    for (var moduleIndex in modules) {
        for (var fileIndex in modules[moduleIndex]) {
            modules[moduleIndex][fileIndex] = paths.bower + modules[moduleIndex][fileIndex];
        }
    }
}

function copyModules() {
    for (var files in modules) {
        gulp.src(modules[files], { base: paths.bower })
            .pipe(gulp.dest(paths.lib));
    }
}
You use run-sequence, and your code
runSequence("clean", ["copy", "compress"], cb);
runs in this order:
clean
copy and compress in parallel // that's why your code compresses nothing: the files have not been copied yet
cb
Write it like this and compress will run after copy:
runSequence("clean", "copy", "compress", cb);
I am not familiar with runSequence, but why don't you try the following? This way your default task depends on compress, and compress depends on copy, so 'copy' will run first and then 'compress' (see the fuller sketch after the snippet).
gulp.task('default', ['copy','compress'], function(cb){});
gulp.task('compress',['copy'], function(cb){});
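Spelled out against the tasks from the question, that dependency chain would look roughly like this (a sketch reusing prefixPathToModules, copyModules, pump and paths from the question):
// gulp 3 dependency syntax: 'compress' waits for 'copy', 'default' waits for 'compress'
gulp.task("copy", function (cb) {
    prefixPathToModules();
    copyModules();
    cb();
});
gulp.task("compress", ["copy"], function (cb) {
    pump([gulp.src(paths.lib + "**/*.js"), uglify(), gulp.dest(paths.lib)], cb);
});
gulp.task("default", ["compress"]);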
gulp.src() returns a stream, but since you are calling it in a for loop, each iteration produces its own stream and the task has no single stream to return, so gulp cannot tell when the copying is finished.
Update your copyModules to the following, and then either use runSequence as posted by Kirill or follow my approach (the updated copy task is shown after this snippet):
function copyModules() {
    var inputFileArr = [];
    for (var files in modules) {
        inputFileArr = inputFileArr.concat(modules[files]);
    }
    return gulp.src(inputFileArr, { base: paths.bower })
        .pipe(gulp.dest(paths.lib));
}
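The copy task can then simply return that stream, so gulp (and run-sequence) can tell when the copying has actually finished:
gulp.task("copy", function () {
    prefixPathToModules();
    // returning the stream lets gulp wait for the copy to complete
    return copyModules();
});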
I am writing my own gulp plugin which looks like this...
var through2 = require('through2');
var order = require('gulp-order');

module.exports = function() {
    return through2.obj(function(file, encoding, callback) {
        callback(null, transform(file));
    });
};

function transform(file) {
    // I will modify file.contents here - it's OK
    return file;
}
and I would like to apply some other gulp plugin to the buffer that came from gulp.src. Is that possible using through2? For example, before calling through2.obj() I would like to apply the gulp-order plugin - how can I do this?
If you want to chain different gulp plugins together, lazypipe is generally a good option (a usage example follows the snippet):
var through2 = require('through2');
var order = require('gulp-order');
var lazypipe = require('lazypipe');

function yourPlugin() {
    return through2.obj(function(file, encoding, callback) {
        callback(null, transform(file));
    });
}

function transform(file) {
    // I will modify file.contents here - it's OK
    return file;
}

function orderPlugin() {
    return order(['someFolder/*.js', 'someOtherFolder/*.js']);
}

module.exports = function() {
    return lazypipe().pipe(orderPlugin).pipe(yourPlugin)();
};
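Usage is then the same as for any other plugin. Assuming the module above is saved as my-plugin.js (a hypothetical file name), it would look something like:
var gulp = require('gulp');
var myPlugin = require('./my-plugin'); // hypothetical path to the module exported above

gulp.task('scripts', function () {
    return gulp.src(['someFolder/*.js', 'someOtherFolder/*.js'])
        .pipe(myPlugin())          // ordering + your transform run as one combined step
        .pipe(gulp.dest('dist/'));
});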
I am fairly new to Laravel 5.2 and Elixir/gulp, but I have an issue with queueTask being undefined when I run gulp from the command line.
What I want to do is to extend elixir to delete some files (according to all the documentation I can find, that's what I need to do), so I have this:
var gulp = require('gulp');
var elixir = require("laravel-elixir");
var del = require('del');

elixir.extend("remove", function(path) {
    gulp.task("removeFiles", function() {
        return del(path);
    });
    return this.queueTask("removeFiles");
});
and then in my mix I have:
.remove([
    "path/to/file1/filename1",
    "path/to/file2/filename2"
])
When I run gulp in the command line, I get:
return this.queueTask("removeFiles");
^
TypeError: undefined is not a function
Can anyone throw some light on what I am doing wrong, please?
The API has changed again since Elixir v3.0.0, so for v4.0.0 you must do this:
var elixir = require('laravel-elixir');
var del = require('del');
var Task = elixir.Task;

elixir.extend('remove', function (path) {
    new Task('remove', function () {
        return del(path);
    });
});
And then you can call it within your pipeline like this:
mix.remove([
    "path/to/file1/filename1",
    "path/to/file2/filename2"
]);
The difference seems to be calling elixir.extend as opposed to elixir.Task.extend, and then creating a new elixir.Task.
The API was changed in Elixir v3.0.0.
You no longer need to call gulp.task(); Elixir will handle that. Instead, you have to create a new Task.
var Elixir = require('laravel-elixir');
var del = require('del');
var Task = Elixir.Task;

Elixir.Task.extend('remove', function (path) {
    new Task('remove', function () {
        return del(path);
    });
});
I am using Gulp with gulp-minify-html and gulp-html-replace:
var minifyhtml = require('gulp-minify-html');
var htmlreplace = require('gulp-html-replace');

var dev_paths = {
    HTML: dev + '/**/*.html'
};

var prod_paths = {
    RELATIVE_CSS: ['css/bootstrap.css', 'css/font-awesome.css', 'css/c3.css', 'css/main.css'],
};

//Compress HTML
gulp.task('minify-html', function () {
    var opts = {
        empty: true,
        comments: true
    };
    return gulp.src(dev_paths.HTML)
        .pipe(minifyhtml(opts))
        .pipe(gulp.dest(prod + '/'));
});

//Add call to the JS and CSS in the HTML files
gulp.task('replace-files', function() {
    gulp.src(dev_paths.HTML)
        .pipe(htmlreplace({
            'css': prod_paths.RELATIVE_CSS,
            'js': 'js/script.js'
        }))
        .pipe(gulp.dest('public/prod/'));
});

gulp.task('prod', ['replace-files', 'minify-html'], function(){
});
However, the HTML doesn't get the CSS and JS references I specified in the replace-files task. When I run gulp without the minify-html task, it works fine though.
Does anyone know why using both the replace-files and minify-html tasks together is not working?
Thank you.
Since the tasks run in parallel, it is likely that the 'minify-html' task is running before the 'replace-files' task is complete.
Try using run-sequence to ensure the tasks run in the required order.
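A rough sketch of what that could look like, assuming prod points at public/prod and that the minify step should read the already-replaced files rather than the dev sources (otherwise the two tasks keep overwriting each other); replace-files also needs to return its stream so run-sequence can tell when it has finished:
var runSequence = require('run-sequence');

//replace-files must return its stream so run-sequence knows when it is done
gulp.task('replace-files', function () {
    return gulp.src(dev_paths.HTML)
        .pipe(htmlreplace({
            'css': prod_paths.RELATIVE_CSS,
            'js': 'js/script.js'
        }))
        .pipe(gulp.dest('public/prod/'));
});

//minify the already-replaced HTML in prod rather than the dev sources,
//so the minified output does not overwrite the replaced one
gulp.task('minify-html', function () {
    return gulp.src(prod + '/**/*.html')
        .pipe(minifyhtml({ empty: true, comments: true }))
        .pipe(gulp.dest(prod + '/'));
});

gulp.task('prod', function (cb) {
    runSequence('replace-files', 'minify-html', cb);
});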