Short Question Version
Changes to files happen below a target directory. I have BrowserSync set up like this:
var bs = require("browser-sync").create();
// Start the browsersync server
bs.init({
    server: './target'
});
bs.reload("*.html");
However, this is not detecting changes that occur in target subdirectories and refreshing the browser. It seems the above lines are not enough?
Long Question Version
I have built a CLI. It watches for CSS changes in src/main/css and compiles the CSS (using PostCSS) to target/main/css. The same is enabled for HTML templates in src/main/html.
Gaze watches for file changes and runs the functions that perform the compiling, and this part works fine.
The full source code can be seen here.
I was hoping BrowserSync would pick up on the file changes in the target directory and refresh the browser when edits are performed; however, I'm not seeing any refreshes. I have BrowserSync set up like this within the serve command:
var bs = require("browser-sync").create();
// Start the browsersync server
bs.init({
    server: './target'
});
bs.reload("*.html");
The CLI can be tested by doing:
git clone https://github.com/superflycss/cli
cd cli
npm i -g
Or just install from NPM:
npm i -g @superflycss/cli
Then run:
sfc new project
cd project
sfc serve
The target folder will open up in the browser. Change the URL to http://localhost:3000/test/html/. Edit the HTML in src/test/html/index.html. The changes compile to target/test/html/index.html and BrowserSync should pick up on the changes, IIUC...but it's not...
Thoughts?
It's pretty obvious, but bs.reload("*.html"); has to be called from within the watcher's change event. In other words, call bs.reload("*.html"); whenever there is a file change.
Since I'm using gaze to watch for file changes, I ended up doing this:
gaze(PLI.SRC_MAIN_CSS, (err, watcher) => {
    if (err) {
        log('error', 'Error building src/main/css/ content.');
        throw new Error(err);
    }

    /**
     * Triggered both when new files are added and when files are changed.
     */
    watcher.on('changed', function (filepath) {
        buildMainCSS();
        bs.reload("*.html");
    });
});
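An alternative worth noting (not what my CLI does, just a sketch under the same ./target layout) is to let BrowserSync watch the compiled output itself via its files option, so the reload fires whenever anything under target changes, subdirectories included:

var bs = require("browser-sync").create();

// Serve ./target and let BrowserSync itself watch the compiled output.
// The globs cover subdirectories, so target/test/html/index.html is included.
bs.init({
    server: './target',
    files: ['target/**/*.html', 'target/**/*.css']
});

With files set, CSS changes are injected and HTML changes trigger a full reload, so an explicit bs.reload() call is only needed when you want to control the timing yourself, e.g. after a compile step finishes.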
Related
I have developed a gulp file with both browser-sync and live reload. This gulp file works perfectly for my requirements. For the basic idea: I have a root folder on my Apache server, and inside it I have my project folders.
--public
---proj1
---proj2
I'm using my gulp file to watch and build the project folders and live-reload.
Currently I'm using browser-sync:
browserSync.init({
    proxy: {
        target: "localhost/newTest/public", // can be [virtual host, sub-directory, localhost with port]
        ws: true // enables websockets
    }
});
Every change done inside the 'public' folder triggers the live reload. If I have both projects open in two separate windows and make changes to one project, both windows refresh (live-reload). I don't want that to happen; only one window should live-reload. How can I do that?
Reason:
I want to implement this at the server-side level for many users. How can I do it? A change from one user should not affect other users.
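One possible direction (a sketch only, not an accepted answer; the project names, paths, and ports below are just taken from the example above): run a separate BrowserSync instance per project, each serving on its own port and watching only its own folder, so a change in proj1 only reloads windows connected to proj1's port.

var browserSync = require('browser-sync');

// One named BrowserSync instance per project. Each instance watches only its
// own folder, so editing proj1 never reloads a window opened on proj2's port.
['proj1', 'proj2'].forEach(function (project, i) {
    browserSync.create(project).init({
        port: 3000 + i,
        proxy: 'localhost/newTest/public/' + project,
        files: ['public/' + project + '/**/*']
    });
});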
Problem: I'm learning ES6 by playing around with the code. I find it quite annoying to rebuild and restart the server every time I make a change.
Goal: I want the changes I save to be reflected in the browser without having to manually rebuild and restart the server. What's the simplest way to do that?
Background:
The current script configuration in the package.json file is as below.
"scripts": {
"babel": "babel --presets es2015 js/main.js -o build/main.bundle.js",
"start": "http-server -p 9000"
},
I hope this is clear. Thank you!
I believe you must be using gulp tasks to run your project. If so, browser-sync + gulp.watch() is the best option for this. Below is what's working for me; add something like this to your gulp task .js file. Whenever you change and save your ES6 source code, it will automatically build and refresh the browser.
var gulp = require('gulp');
var browser = require('browser-sync').create();

// your default task goes here; it should list "watch-changes" as a dependency

// watch for changes in js and html files
gulp.task('watch-changes', function() {
    browser.init({
        // initiate your browser here; refer to the browser-sync website
    });
    gulp.watch(
        ['build/main.bundle.js', 'webapp/**/*.html'],
        browser.reload);
});
Check here for a neat example.
Refer to the browser-sync website and the npm gulp-watch task.
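To also cover the "rebuild on save" part inside gulp, the Babel step can be moved into a task of its own and the source file watched as well. This is a rough sketch, assuming gulp-babel and gulp-rename are installed and using the file names from the question (js/main.js, build/main.bundle.js):

var gulp = require('gulp');
var babel = require('gulp-babel');
var rename = require('gulp-rename');
var browser = require('browser-sync').create();

// Mirror the npm "babel" script: compile js/main.js with the es2015 preset
// and write it to build/main.bundle.js.
gulp.task('babel', function () {
    return gulp.src('js/main.js')
        .pipe(babel({ presets: ['es2015'] }))
        .pipe(rename('main.bundle.js'))
        .pipe(gulp.dest('build'));
});

gulp.task('watch-changes', ['babel'], function () {
    // Serve the project root on port 9000, like the original http-server setup.
    browser.init({ server: '.', port: 9000 });
    gulp.watch('js/**/*.js', ['babel']);                             // rebuild on source changes
    gulp.watch(['build/main.bundle.js', '*.html'], browser.reload);  // reload on output changes
});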
I currently have a file structure like this
SASS
  gulpfile.js
  node_modules
  sites
    example-site
      scss
      css
    example-site-two
      scss
      css
    example-site-three
      scss
      css
I have gulp installed in the main parent SASS folder, with a task 'sass-all' that can go through every single site's scss folder and compile it into css.
I'm trying to write a new task called 'sass-single' which can be run from any of the example-site folders. So let's say I'm in the folder "example-site-two": I want to be able to open a command prompt, run 'gulp sass-single', and ONLY have it compile the SASS in this site. The same goes for a 'watch-single' task I'd like to set up.
The problem is that whenever I run this task from a site folder, it changes my working directory up to the parent SASS folder. I don't want 100 different tasks, one for every site; I'd prefer to have a single 'sass-single' task that's smart enough to only compile the files from the folder I was in when I ran the script.
Current Gulp task attempt:
gulp.task('sass-single', function () {
    process.chdir('./');
    // Where are the SCSS files?
    var input = './scss/*.scss';
    // Where do you want to save the compiled CSS files?
    var output = './css';
    return gulp
        .src(input)
        .pipe(sourcemaps.init())
        .pipe(sass(sassOptions).on('error', sass.logError))
        .pipe(postcss(processors))
        .pipe(sourcemaps.write('./maps'))
        .pipe(gulp.dest(output));
});
However, this goes back to the main SASS folder and then just does nothing.
How would I go about modifying this to be able to run from any site folder and have it only do it for that site?
If you want to change the current working directory (CWD) back to the directory where you invoked gulp, then this won't work:
process.chdir('./');
That's a relative path. Relative paths are relative to the CWD. But by the time you execute process.chdir('./') Gulp has already changed the CWD to the directory where your Gulpfile.js is located. So you're just changing the CWD to ... the CWD.
You could explicitly pass a CWD to gulp on the command line:
SASS/sites/example-site> gulp --cwd .
But that would get annoying pretty quickly.
Luckily for you Gulp stores the original CWD in process.env.INIT_CWD before changing it. So you can use the following in your task to change the CWD back to the original:
process.chdir(process.env.INIT_CWD);
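Applied to the task from the question, that looks roughly like this (same plugins and variables as above; only the chdir line changes):

gulp.task('sass-single', function () {
    // Jump back to the folder gulp was invoked from (e.g. sites/example-site-two).
    // Gulp records that folder in process.env.INIT_CWD before switching to the
    // directory that contains the gulpfile.
    process.chdir(process.env.INIT_CWD);

    var input = './scss/*.scss';   // SCSS files of the current site only
    var output = './css';          // compiled CSS for the current site only

    return gulp
        .src(input)
        .pipe(sourcemaps.init())
        .pipe(sass(sassOptions).on('error', sass.logError))
        .pipe(postcss(processors))
        .pipe(sourcemaps.write('./maps'))
        .pipe(gulp.dest(output));
});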
I have the following file structure:
/src
  app.less
/gulp
  index.js
gulpfile.js
This file structure is mounted in a Vagrant box at /vagrant, which means the path to app.less becomes /vagrant/src/app.less. Yes, I've checked this.
gulpfile.js
require('./gulp');
index.js
var gulp = require('gulp');
var less = require('gulp-less');

var paths = {
    less: '/vagrant/src/app.less'
};

gulp.task('less', function () {
    console.log('less function running');
    return gulp.src(paths.less)
        .pipe(less());
});

gulp.task('watch:styles', function () {
    console.log('watch function running');
    gulp.watch(paths.less, gulp.series('less'));
});

gulp.task('watch', gulp.parallel('watch:styles'));
gulp -v returns:
[10:02:05] CLI version 0.4.0
[10:02:05] Local version 4.0.0-alpha.1
gulp watch returns:
[09:45:20] Using gulpfile /vagrant/gulpfile.js
[09:45:20] Starting 'watch'...
[09:45:20] Starting 'watch:styles'...
watch function running
I've been using Gulp 4 for over two months without problems with the watcher. Since last week the watcher has not been responding to files that are being changed. I've tried several editors, and I've tried multiple paths like '/vagrant/**/*.less' and '../src/*.less', and even the absolute path to app.less, '/vagrant/src/app.less'; none of them worked.
After some research I found several issues about the watcher on the GitHub repo of Gulp 4. Yet I can't figure out what the problem is. Maybe I'm overlooking an error in my code or something new in the docs, but I've been trying to solve this since yesterday morning without any luck.
It appears you're using Vagrant. If you have Gulp running on your Vagrant machine instead of on the host, it won't detect any changes to files that you make on the host. This is because the events that notify the OS about filesystem changes don't propagate into the VM.
If this is the case, the solution is simply to run Gulp wherever you actually make changes to the files (i.e. if you make the changes on the VM, run Gulp on the VM; if you make changes on the host, run Gulp on the host).
Also, maybe make the path relative instead of tying your implementation to your Vagrant box, i.e. less: './src/app.less'.
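If the watcher has to stay inside the VM, a workaround that sometimes helps on shared folders is polling, since it doesn't depend on the filesystem events that never reach the guest. This is only a sketch for Gulp 4 final, where gulp.watch() forwards its options object to chokidar; I'm not sure the alpha build shown above supports it:

gulp.task('watch:styles', function () {
    // usePolling makes the watcher stat files on an interval instead of waiting
    // for inotify events, which don't arrive for edits made on the host.
    gulp.watch(paths.less, { usePolling: true, interval: 500 }, gulp.series('less'));
});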
I'm trying to use Polymer with a Jekyll site, but I can't figure out how to get things set up. I downloaded and can run the Polymer Starter Kit. Polymer has the page contents in the app directory, but if I try to set up and run Jekyll from this folder, I get a load of errors because the Polymer index.html can't find the resources (because the root directory is different).
What is the correct way to set up and structure Jekyll and Polymer so that they work together?
Reading the Polymer Starter Kit readme.md paragraph on the development workflow, you learn that:
gulp serve is made for the development phase, and gulp makes a build of your application, ready to be deployed on a web server.
Just copying what you've downloaded from GitHub onto a web server will not work as is, because gulp serve is more complex than that. Read the gulpfile.js and you will see everything the gulp serve command does.
You need to run gulp, and you can then deploy what is generated in the dist folder. This will work in a Jekyll site.
You can integrate gulp-jekyll into your gulp build process. I'd also consider watching for changes with browser-sync to automatically regenerate HTML files on change. The vulcanization process should be done only when you are deploying.
I just came back to this, and things are much improved since last summer. I made a gulpfile based on the one for the Polymer Starter Kit (1.2.3), but I changed the behavior of the default and serve tasks to run Jekyll serve and build in the shell:
var spawn = require('child_process').spawn;
var argv = require('yargs').argv;

gulp.task('jekyllbuild', function(done) {
    return spawn('bundle', ['exec', 'jekyll', 'build'], { stdio: 'inherit' })
        .on('close', done);
});

// Build production files, the default task
gulp.task('default', ['clean'], function(cb) {
    // Uncomment 'cache-config' if you are going to use service workers.
    runSequence(
        'jekyllbuild',
        ['ensureFiles', 'copy', 'styles'],
        'elements',
        ['images', 'fonts', 'html'],
        'vulcanize', // 'cache-config',
        cb);
});

gulp.task('serve', function(done) {
    if (argv.port) {
        return spawn('bundle', ['exec', 'jekyll', 'serve', '--port=' + argv.port], { stdio: 'inherit' })
            .on('close', done);
    } else {
        return spawn('bundle', ['exec', 'jekyll', 'serve'], { stdio: 'inherit' })
            .on('close', done);
    }
});
Using BrowserSync would have a much cleaner effect, but this is a simple way to get Jekyll functionality and the benefit of vulcanization for production. (Note that you also have to install the yargs package to handle port specification.) My whole gulpfile is here.
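For reference, the BrowserSync variant could look something like the sketch below: let Jekyll rebuild on changes and have BrowserSync serve and reload from _site (Jekyll's default output directory). This isn't part of my gulpfile, just an illustration:

var browserSync = require('browser-sync').create();
var spawn = require('child_process').spawn;

gulp.task('serve:bs', function () {
    // Rebuild the site whenever a source file changes.
    spawn('bundle', ['exec', 'jekyll', 'build', '--watch'], { stdio: 'inherit' });

    // Serve the generated output and reload the browser when it is rewritten.
    browserSync.init({
        server: '_site',
        files: ['_site/**/*']
    });
});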