Making Grunt automatically remove html files whose source files are deleted

I use Grunt to automate converting my jade files. For that I use this script:
jade: {
  compile: {
    options: {
      client: false,
      pretty: true
    },
    files: [{
      cwd: "_/components/jade",
      src: "**/*.jade",
      dest: "_/html",
      expand: true,
      ext: ".html"
    }]
  }
}
I also have this watch script running:
watch: {
  jade: {
    files: ['_/components/jade/**/*.jade'],
    tasks: ['jade']
  }
}
This works fine. However, when I delete a jade file, the html file remains. Is there a way to make grunt delete the corresponding html files when I delete a jade file?

If I understood you correctly, if you delete foo.jade you also want to delete foo.html, correct? Here's a complete example using grunt-contrib-clean and grunt-contrib-watch:
You start by watching all the files with a .jade extension using grunt watch. When a watched file is modified in some way, a watch event is emitted. If the event is 'deleted', we take the file path, change the extension to .html, set it as the src value of the clean:jade task, and run the task.
module.exports = function(grunt) {
  grunt.initConfig({
    clean: {
      jade: {
        // Filled in at runtime by the watch event handler below.
        src: null
      }
    },
    watch: {
      jade: {
        files: ['*.jade'],
        options: {
          // Required so the event handler can modify the running config.
          spawn: false
        }
      }
    }
  });

  grunt.loadNpmTasks("grunt-contrib-watch");
  grunt.loadNpmTasks("grunt-contrib-clean");

  // When a watched .jade file is deleted, clean the matching .html file.
  grunt.event.on('watch', function(action, filepath) {
    if (action === "deleted") {
      var file = filepath.slice(0, -5) + ".html";
      grunt.config.set('clean.jade.src', [file]);
      grunt.task.run("clean:jade");
    }
  });
};
For more information, see "Using the watch event" in the grunt-contrib-watch documentation. Note that the spawn option must be set to false:
If you need to dynamically modify your config, the spawn option must be disabled to keep the watch running under the same context.
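As a side note, the filepath.slice(0, -5) trick assumes the ".jade" extension; a slightly more explicit way to build the target path is to use Node's path module. This is my own variation on the handler above, not part of the original answer:

var path = require('path');

grunt.event.on('watch', function(action, filepath) {
  if (action === "deleted") {
    // Swap the .jade extension for .html while keeping the directory.
    var file = path.join(path.dirname(filepath),
                         path.basename(filepath, '.jade') + '.html');
    grunt.config.set('clean.jade.src', [file]);
    grunt.task.run("clean:jade");
  }
});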

You need grunt-contrib-clean. Note, though, that the approach above clears all files of the same type, slows grunt down, and requires a specific config for every task, so often it is enough to just run clean once when grunt starts:
module.exports = function (grunt) {
  grunt.initConfig({
    pckg: grunt.file.readJSON('package.json'),
    clean: { // grunt-contrib-clean tasks
      jade: ["dist/*.html"],
      all: ["dist"]
    },
    jade: {
      dist: {
        files: [{
          expand: true,
          cwd: 'src/templates',
          src: ['**/*.jade'],
          dest: 'dist',
          filter: 'isFile',
          ext: '.html'
        }]
      }
    },
    watch: {
      jade: {
        files: ['src/templates/**/*.jade'],
        tasks: ['clean:jade', 'jade']
      }
    }
  });

  require('load-grunt-tasks')(grunt);
  grunt.registerTask('default', ['clean:all', 'jade', 'watch']);
};

Related

Adding additional module export from external file to webpack bundle

I have a project that requires an additional "config" file that I would like compiled into the final webpack bundle as an additional export of the bundled library.
The condition is that this config file shouldn't need to be added to the entry file, but simply added as an additional export to the bundle.
I'm still relatively new to webpack, but have been looking into how I might accomplish this for a while now, to no avail. Any help pointing me in the right direction would be greatly appreciated!
Entry File (ts, using ts-loader):
export default class TestPlugin {
  name: string;

  constructor(name: string) {
    this.name = name;
  }
}
Plugin "config".
{
"name": "test plugin"
}
Plugin "loader" logic (separate project).
const plugin = require(path.resolve(pluginDirectory, fileName)
const config = plugin.config
const newPlugin = new plugin(config.name)
Webpack config:
entry: ['./src/index.ts'],
module: {
  rules: [
    {
      test: /\.tsx?$/,
      use: 'ts-loader',
      exclude: /node_modules/,
    },
  ],
},
output: {
  filename: 'example.plugin.js',
  path: path.resolve(__dirname, 'build'),
  library: 'plugin',
  libraryTarget: 'umd',
  libraryExport: 'default',
  globalObject: 'this',
},
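One possible direction (a sketch of my own, not from the original post, and it does touch the entry file, which the question hoped to avoid): re-export the config from the entry, for example with export { default as config } from './plugin.config.json'; (resolveJsonModule enabled, and the file name is an assumption), and remove libraryExport: 'default' so the UMD bundle exposes both named exports. The loader side would then look roughly like this:

// Hypothetical loader-side sketch; pluginDirectory and fileName are placeholders
// for however the loader discovers plugin bundles.
const path = require('path');

const pluginDirectory = './plugins';
const fileName = 'example.plugin.js';

// With libraryExport removed, the UMD bundle exposes every named export of the entry.
const bundle = require(path.resolve(pluginDirectory, fileName));
const config = bundle.config;                       // the re-exported JSON config
const newPlugin = new bundle.default(config.name);  // the TestPlugin class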

How to stop gulp from encoding images as base64?

After gulp runs, the images end up base64-encoded inside my CSS file, which is now 2.8 MB.
Here is my gulpfile:
// Plugins used by the task below.
const { src, dest } = require('gulp');
const plumber = require('gulp-plumber');
const stylus = require('gulp-stylus');
const nib = require('nib');

const path = {
  stylus: {
    src: './src/stylus/**/*.styl',
    dest: './build/styles',
  },
  build: {
    dest: 'build/**'
  }
}

function stylusTask() {
  return src(path.stylus.src)
    .pipe(plumber())
    .pipe(stylus({
      use: nib(),
      import: ['nib'],
      compress: true
    }))
    .pipe(dest(path.stylus.dest))
}
You can configure stylus to only encode images smaller than a specified limit. The urls for images which exceed that limit will not be modified.
In this example, only images smaller than 2000 bytes are encoded:
function stylusTask() {
  return src(path.stylus.src)
    .pipe(plumber())
    .pipe(stylus({
      use: nib(),
      import: ['nib'],
      compress: true,
      define: {
        url: require('stylus').url({
          limit: 2000
        })
      }
    }))
    .pipe(dest(path.stylus.dest))
}
For more information on the url function, see the following documentation:
https://stylus-lang.com/docs/functions.url.html

Where can I mention the source js files that need to be tested with gulp-jasmine?

Here is what I am using as of now.
gulp.task('test', function () {
  return gulp.src(['./app/spec/*.js'])
    .pipe(jasmine());
});
But in the case of grunt, you can mention source files like this:
jasmine: {
  src: 'js/**/*.js',
  options: {
    specs: 'spec/**/*.js'
  }
}
Where can I mention the location of the source files in the case of gulp?
Another issue is that gulp only runs the test cases from the first file; as of now all of them fail, and it doesn't go on to the second file in the spec folder.
For issue 2 I see this gulp-jasmine option:
errorOnFail
Default: true
Stops the stream on failed tests.
.pipe(jasmine({errorOnFail: false}))
should run the rest of the tests.
Issue 1: There are a number of ways to indicate multiple sources for gulp.src such as
return gulp.src([ './app/spec/*.js', 'spec/**/*.js'])
or set up a variable:
var testSources = ['./app/spec/*.js', 'spec/**/*.js'];
return gulp.src(testSources)
or I use something like this:
var paths = {
  html: {
    src: "home.html",
    temp: "./temp",
    deploy: "./deploy/html"
  },
  sass: {
    src: "./src/styles/scss/**/*.scss",
    stylesFile: "./src/styles/scss/styles.scss"
  },
  css: {
    src: "./temp/css/styles.css",
    temp: "./temp/css",
    deploy: "./deploy/css"
  },
  js: {
    src: "./src/js/**/*.js",
    temp: "./temp/js",
    deploy: "./deploy/js"
  }
};
and then to use:
function reloadJS() {
  return gulp.src(paths.js.src)
    .pipe(sourcemaps.init())
    .pipe(sourcemaps.write("../sourcemaps", { includeContent: false, sourceRoot: "/js" }))
    .pipe(gulp.dest(paths.js.temp))
    .pipe(reload({ stream: true }));
}
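Putting the two parts together for the original question, a minimal sketch (the source glob ./app/js/**/*.js is an assumption about where the code under test lives):

var gulp = require('gulp');
var jasmine = require('gulp-jasmine');

gulp.task('test', function () {
  // Sources first, then specs: gulp-jasmine executes every file it receives,
  // so the code under test is loaded before the spec files run, and
  // errorOnFail: false keeps the stream alive when one spec file fails.
  return gulp.src(['./app/js/**/*.js', './app/spec/*.js'])
    .pipe(jasmine({ errorOnFail: false }));
});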
BTW, putting two questions into one post is generally frowned upon on SO.

foundation-sites 6: replacing Panini with Jekyll

I'm looking to extend a ZURB Foundation for Sites 6 site and use Jekyll instead of Panini to render the HTML. I'm using the out-of-the-box ZURB Foundation prototype template that includes ES6 + webpack. I want Foundation to handle all the Sass and JS compiling, and I also want to retain the BrowserSync functionality. I just want to know the best approach for modifying the gulpfile to integrate Jekyll, so that I can work with GitHub Pages.
Here is what the default gulpfile.babel.js file looks like:
'use strict';

import plugins from 'gulp-load-plugins';
import yargs from 'yargs';
import browser from 'browser-sync';
import gulp from 'gulp';
import panini from 'panini';
import rimraf from 'rimraf';
import sherpa from 'style-sherpa';
import yaml from 'js-yaml';
import fs from 'fs';
import webpackStream from 'webpack-stream';
import webpack2 from 'webpack';
import named from 'vinyl-named';

// Load all Gulp plugins into one variable
const $ = plugins();

// Check for --production flag
const PRODUCTION = !!(yargs.argv.production);

// Load settings from settings.yml
const { COMPATIBILITY, PORT, UNCSS_OPTIONS, PATHS } = loadConfig();

function loadConfig() {
  let ymlFile = fs.readFileSync('config.yml', 'utf8');
  return yaml.load(ymlFile);
}

// Build the "dist" folder by running all of the below tasks
gulp.task('build',
  gulp.series(clean, gulp.parallel(pages, sass, javascript, images, copy), styleGuide));

// Build the site, run the server, and watch for file changes
gulp.task('default',
  gulp.series('build', server, watch));

// Delete the "dist" folder
// This happens every time a build starts
function clean(done) {
  rimraf(PATHS.dist, done);
}

// Copy files out of the assets folder
// This task skips over the "img", "js", and "scss" folders, which are parsed separately
function copy() {
  return gulp.src(PATHS.assets)
    .pipe(gulp.dest(PATHS.dist + '/assets'));
}

// Copy page templates into finished HTML files
function pages() {
  return gulp.src('src/pages/**/*.{html,hbs,handlebars}')
    .pipe(panini({
      root: 'src/pages/',
      layouts: 'src/layouts/',
      partials: 'src/partials/',
      data: 'src/data/',
      helpers: 'src/helpers/'
    }))
    .pipe(gulp.dest(PATHS.dist));
}

// Load updated HTML templates and partials into Panini
function resetPages(done) {
  panini.refresh();
  done();
}

// Generate a style guide from the Markdown content and HTML template in styleguide/
function styleGuide(done) {
  sherpa('src/styleguide/index.md', {
    output: PATHS.dist + '/styleguide.html',
    template: 'src/styleguide/template.html'
  }, done);
}

// Compile Sass into CSS
// In production, the CSS is compressed
function sass() {
  return gulp.src('src/assets/scss/app.scss')
    .pipe($.sourcemaps.init())
    .pipe($.sass({
      includePaths: PATHS.sass
    })
      .on('error', $.sass.logError))
    .pipe($.autoprefixer({
      browsers: COMPATIBILITY
    }))
    // Comment in the pipe below to run UnCSS in production
    //.pipe($.if(PRODUCTION, $.uncss(UNCSS_OPTIONS)))
    .pipe($.if(PRODUCTION, $.cleanCss({ compatibility: 'ie9' })))
    .pipe($.if(!PRODUCTION, $.sourcemaps.write()))
    .pipe(gulp.dest(PATHS.dist + '/assets/css'))
    .pipe(browser.reload({ stream: true }));
}

let webpackConfig = {
  rules: [
    {
      test: /.js$/,
      use: [
        {
          loader: 'babel-loader'
        }
      ]
    }
  ]
}

// Combine JavaScript into one file
// In production, the file is minified
function javascript() {
  return gulp.src(PATHS.entries)
    .pipe(named())
    .pipe($.sourcemaps.init())
    .pipe(webpackStream({ module: webpackConfig }, webpack2))
    .pipe($.if(PRODUCTION, $.uglify()
      .on('error', e => { console.log(e); })
    ))
    .pipe($.if(!PRODUCTION, $.sourcemaps.write()))
    .pipe(gulp.dest(PATHS.dist + '/assets/js'));
}

// Copy images to the "dist" folder
// In production, the images are compressed
function images() {
  return gulp.src('src/assets/img/**/*')
    .pipe($.if(PRODUCTION, $.imagemin({
      progressive: true
    })))
    .pipe(gulp.dest(PATHS.dist + '/assets/img'));
}

// Start a server with BrowserSync to preview the site in
function server(done) {
  browser.init({
    server: PATHS.dist, port: PORT
  });
  done();
}

// Reload the browser with BrowserSync
function reload(done) {
  browser.reload();
  done();
}

// Watch for changes to static assets, pages, Sass, and JavaScript
function watch() {
  gulp.watch(PATHS.assets, copy);
  gulp.watch('src/pages/**/*.html').on('all', gulp.series(pages, browser.reload));
  gulp.watch('src/{layouts,partials}/**/*.html').on('all', gulp.series(resetPages, pages, browser.reload));
  gulp.watch('src/assets/scss/**/*.scss').on('all', sass);
  gulp.watch('src/assets/js/**/*.js').on('all', gulp.series(javascript, browser.reload));
  gulp.watch('src/assets/img/**/*').on('all', gulp.series(images, browser.reload));
  gulp.watch('src/styleguide/**').on('all', gulp.series(styleGuide, browser.reload));
}
I assume that as part of the Jekyll build/rebuild I would need to use the keep_files config setting, so that files don't get overwritten when Jekyll clobbers the output directory.
Appreciate any help, thanks.
Jekyll is a different solution than Panini and uses Ruby
https://jekyllrb.com/docs/installation/
In general you might need some git hook or Travis config.
https://github.com/DanielRuf/testblog/blob/source/.travis.yml
https://github.com/DanielRuf/testblog/blob/source/Rakefile
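Building on that, here is a minimal sketch of how the gulpfile above could call Jekyll in place of the Panini pages task. This is my own illustration, not from the original answer; it assumes the jekyll CLI is installed and that src/pages holds the Jekyll source:

import { spawn } from 'child_process';

// Run "jekyll build" into the same dist folder the other gulp tasks write to.
// keep_files in _config.yml can then stop Jekyll from deleting the assets
// that gulp copies there.
function jekyllBuild(done) {
  spawn('jekyll', ['build', '--source', 'src/pages', '--destination', PATHS.dist],
    { stdio: 'inherit' })
    .on('close', (code) => done(code ? new Error('jekyll build failed') : null));
}

// Swap it into the build chain in place of pages():
// gulp.task('build',
//   gulp.series(clean, gulp.parallel(jekyllBuild, sass, javascript, images, copy), styleGuide));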

Using gruntjs, how would I get data from "nested" json files?

Let's say in my Gruntfile I have pkg: grunt.file.readJSON('package.json'), and inside the package.json is the following object:
{
  "file": "data.json"
}
How would I access the data from data.json? It might look something like this:
{
  "name": "Jon Schlinkert",
  "company": "Sellside"
}
Just load the first file, then use the result of that to load the second file and add it to the grunt config. Like this:
module.exports = function (grunt) {
  var pkg = grunt.file.readJSON('package.json');

  grunt.initConfig({
    pkg: pkg,
    data: grunt.file.readJSON(pkg.file),
    task: {
      target: {
        files: {
          'dest': '<%- data.name %>'
        }
      }
    }
  });

  grunt.registerMultiTask('task', function() {});
  console.log('name', grunt.config('data.name'));
};
Maybe I don't understand the problem, but what about:
var pkg = grunt.file.readJSON('package.json');
var data = grunt.file.readJSON(pkg.file);