I'm trying to use Polymer with a Jekyll site, but I can't figure out how to get things set up. I downloaded and can run the Polymer Starter Kit. Polymer keeps the page contents in the app directory, but if I try to set up and run Jekyll from that folder, I get a load of errors because the Polymer index.html can't find its resources (the root directory is different).
What is the correct way to set up and structure Jekyll and Polymer so they work together?
Reading the Polymer Starter Kit README.md, in the "Development workflow" section, you learn that:
gulp serve is meant for the development phase, while gulp makes a build of your application that is ready to be deployed to a web server.
Just copying what you've downloaded from GitHub onto a web server will not work as is, because gulp serve does more than serve static files. Read the gulpfile.js and you will see everything the gulp serve command does.
You need to run gulp, and you can then deploy what is generated in the dist folder. This will work in a Jekyll site.
You can integrate gulp-jekyll into your gulp build process. I'd also consider having BrowserSync watch for changes so the HTML files are regenerated and the browser is reloaded automatically. The vulcanization step should only be done when you are deploying.
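As a rough sketch of that approach (the folder names and the use of child_process instead of a dedicated plugin are my assumptions, not something prescribed by the Starter Kit):

var gulp = require('gulp');
var spawn = require('child_process').spawn;
var bs = require('browser-sync').create();

// Build the Jekyll site (into _site by default) by shelling out to Jekyll
gulp.task('jekyll', function (done) {
  spawn('bundle', ['exec', 'jekyll', 'build'], { stdio: 'inherit' })
    .on('close', done);
});

// Serve the built site, rebuild on source changes, reload the browser on output changes
gulp.task('serve', ['jekyll'], function () {
  bs.init({ server: '_site' });
  gulp.watch(['app/**/*', '_posts/**/*', 'index.html'], ['jekyll']);
  gulp.watch('_site/**/*.html').on('change', bs.reload);
});

Vulcanization would then be a separate task that you only run as part of the deployment build.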
I just came back to this, and things are much improved since last summer. I made a gulpfile based on the one for the Polymer Starter Kit (1.2.3), but I changed the behavior of the default and serve tasks to run Jekyll's build and serve in the shell:
var spawn = require('child_process').spawn;
var argv = require('yargs').argv;
// (gulp, runSequence, and the other tasks referenced below already exist in the
// Polymer Starter Kit gulpfile this is based on)

// Run "bundle exec jekyll build" and signal completion back to gulp
gulp.task('jekyllbuild', function(done) {
return spawn('bundle', ['exec', 'jekyll', 'build'], { stdio: 'inherit' })
.on('close', done);
});
// Build production files, the default task
gulp.task('default', ['clean'], function(cb) {
// Uncomment 'cache-config' if you are going to use service workers.
runSequence(
'jekyllbuild',
['ensureFiles', 'copy', 'styles'],
'elements',
['images', 'fonts', 'html'],
'vulcanize', // 'cache-config',
cb);
});
// Serve the site with Jekyll; run "gulp serve --port=4000" to override the default port
gulp.task('serve', function(done) {
if (argv.port) {
return spawn('bundle', ['exec', 'jekyll', 'serve', '--port=' + argv.port], { stdio: 'inherit' })
.on('close', done);
} else {
return spawn('bundle', ['exec', 'jekyll', 'serve'], { stdio: 'inherit' })
.on('close', done);
}
});
Using BrowserSync would give a much cleaner workflow, but this is a simple way to get Jekyll functionality plus the benefit of vulcanization for production. (Note that you also have to install the yargs package to handle the port option.) My whole gulpfile is here.
Related
Short Question Version
Changes to files happen below a target directory. I have BrowserSync set up like this:
var bs = require("browser-sync").create();
// Start the browsersync server
bs.init({
server: './target'
});
bs.reload("*.html");
However, this is not detecting changes that occur in target subdirectories and refreshing the browser. It seems the lines above are not enough?
Long Question Version
I have built a CLI. It watches for CSS changes in src/main/css and compiles the CSS (using PostCSS) to target/main/css. The same is done for HTML templates in src/main/html.
Gaze watches for file changes and runs the functions that perform the compiling, and this part works fine.
The full source code can be seen here.
I was hoping BrowserSync would pick up the file changes in the target directory and refresh the browser when edits are made, but I'm not seeing any refreshes. I have BrowserSync set up like this within the serve command:
var bs = require("browser-sync").create();
// Start the browsersync server
bs.init({
server: './target'
});
bs.reload("*.html");
The CLI can be tested by doing:
git clone https://github.com/superflycss/cli
cd cli
npm i -g
Or just install from NPM:
npm i -g @superflycss/cli
Then run:
sfc new project
cd project
sfc serve
The target folder will open up in the browser. Change the URL to http://localhost:3000/test/html/. Edit the HTML in src/test/html/index.html. The changes compile to target/test/html/index.html, and BrowserSync should pick up the changes IIUC... but it's not...
Thoughts?
It's pretty obvious in hindsight, but bs.reload("*.html"); has to be called from within the watcher's on event handler. In other words, whenever there is a file change, call bs.reload("*.html");.
Since I'm using gaze to watch for file changes, I ended up doing this:
gaze(PLI.SRC_MAIN_CSS, (err, watcher) => {
if (err) {
log('error', 'Error building src/main/css/ content.');
throw new Error(err);
}
/**
* Triggered both when new files are added and when files are changed.
*/
watcher.on('changed', function (filepath) {
buildMainCSS();
bs.reload("*.html");
});
});
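Another option (just a sketch, not what the CLI currently does) is to let BrowserSync watch the generated output itself, so the reload no longer depends on the gaze callbacks:

// Reload when compiled HTML in the target directory changes
bs.watch("target/**/*.html").on("change", bs.reload);

// Inject compiled CSS without a full page reload
bs.watch("target/**/*.css", function (event, file) {
  if (event === "change") {
    bs.reload("*.css");
  }
});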
Problem: I'm learning ES6 by playing around with the code. I've found it quite annoying to rebuild and restart the server every time I make a change.
Goal: I want the changes I save to be reflected in the browser without having to manually rebuild and restart the server. What's the simplest way to do that?
Background:
The current script configuration in the package.json file is as below.
"scripts": {
"babel": "babel --presets es2015 js/main.js -o build/main.bundle.js",
"start": "http-server -p 9000"
},
I hope this is clear. Thank you!
I believe you must be using gulp tasks to run your project. If so, browser-sync + gulp.watch() is the best option for this. Below is what works for me; add something like this to your gulp task .js file. Whenever you change and save your ES6 source code, it will automatically build and refresh the browser.
var gulp = require('gulp');
var browser = require('browser-sync').create();
// your default task goes here that should add "watch-changes" as dependency
// watch changes in js and html files
gulp.task('watch-changes', function() {
browser.init({
// initiate your browser here, refer browser-sync website
});
gulp.watch(
['build/main.bundle.js', 'webapp/**/*.html'],
browser.reload);
});
Check here for a neat example.
Refer to the browser-sync website and the npm gulp-watch package.
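If you are not using gulp yet, a minimal gulpfile for the setup in the question could look like the sketch below; reusing the existing npm run babel script via child_process and the server settings are assumptions on my part:

var gulp = require('gulp');
var browser = require('browser-sync').create();
var spawn = require('child_process').spawn;

// Re-run the existing "npm run babel" script from package.json
gulp.task('babel', function (done) {
  spawn('npm', ['run', 'babel'], { stdio: 'inherit' }).on('close', done);
});

// Build once, start BrowserSync, rebuild on source changes, reload on output changes
gulp.task('watch-changes', ['babel'], function () {
  browser.init({
    server: './',  // serve the project root, as http-server did
    port: 9000
  });
  gulp.watch('js/main.js', ['babel']);
  gulp.watch(['build/main.bundle.js', 'index.html'])
    .on('change', browser.reload);
});

Running gulp watch-changes then replaces both npm run babel and npm start.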
I am having difficulties with livereload in my gulp file.
If I run livereload.listen() as part of my watch task it works.
gulp.task('watch', function() {
livereload.listen();
gulp.watch('app/styles/**/*.less', ['less']);
});
// run gulp watch
However, if I launch livereload as a separate task in another terminal window before watch, it does not work.
gulp.task('lr', function() {
livereload.listen();
});
gulp.task('watch', function() {
gulp.watch('app/styles/**/*.less', ['less']);
});
// run gulp lr in one terminal and gulp watch in another - after.
I've tried using the gulp-connect plugin with the same result. Something about the watch task being run separately breaks it. I can see livereload loaded in the browser in all cases, and watch runs correctly.
What is happening, and is it possible to run these as two tasks? (Note that I am not interested in joining the tasks into one; that's not the goal of this question.)
I have the following file structure:
/src
-- app.less
/gulp
-- index.js
gulpfile.js
This file structure is mounted in a Vagrant box at /vagrant, which means the path to app.less becomes /vagrant/src/app.less. Yes, I've checked this.
gulpfile.js
require('./gulp');
index.js
var gulp = require('gulp');
var less = require('gulp-less'); // assumption: the gulp-less plugin provides less()

var paths = {
less: '/vagrant/src/app.less'
};
gulp.task('less', function () {
console.log('less function running');
return gulp.src(paths.less)
.pipe(less());
});
gulp.task('watch:styles', function () {
console.log('watch function running');
gulp.watch(paths.less, gulp.series('less'));
});
gulp.task('watch', gulp.parallel('watch:styles'));
gulp -v returns:
[10:02:05] CLI version 0.4.0
[10:02:05] Local version 4.0.0-alpha.1
gulp watch returns:
[09:45:20] Using gulpfile /vagrant/gulpfile.js
[09:45:20] Starting 'watch'...
[09:45:20] Starting 'watch:styles'...
watch function running
I've been using Gulp 4 for over two months without problems with the watcher. Since last week the watcher has not been responding to file changes. I've tried several editors, and I've tried multiple paths like '/vagrant/**/*.less' and '../src/*.less', and even the absolute path to app.less, '/vagrant/src/app.less'; none of them worked.
After some research I found several issues about the watcher on the Gulp 4 GitHub repo. Still, I can't figure out what the problem is. Maybe I'm overlooking an error in my code or something new in the docs, but I've been trying to solve this since yesterday morning without any luck.
It appears you're using Vagrant. If you have Gulp running inside your Vagrant machine instead of on the host, it won't detect changes to files that you make on the host. This is because the events that notify the OS about filesystem changes don't propagate into the VM.
If that is the case, the solution is simply to run Gulp wherever you actually make the changes to the files (i.e. if you make the changes in the VM, run Gulp in the VM; if you make them on the host, run Gulp on the host).
Also consider making the path relative instead of tying your implementation to your Vagrant box, i.e. less: './src/app.less'.
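For example, the relevant part of index.js with the path made relative (everything else unchanged):

var paths = {
  // relative to the gulpfile, so the same glob works on the host and inside the VM
  less: './src/app.less'
};

gulp.task('watch:styles', function () {
  gulp.watch(paths.less, gulp.series('less'));
});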
I'm trying to create a gulp task that executes a Yeoman generator I'm developing. I've got this working using the following task, but I'm trying to find a way to avoid passing in the fully qualified path to my globally installed npm modules.
The gulp plugins I've seen (gulp-shell and gulp-run) can execute a command (such as npm root -g), but I can't figure out how to read its output into a variable, or whether there's another / easier way to get this value.
var spawn = require('child_process').spawn;

gulp.task('run-yo', function () {
spawn('node', [
'--debug',
'/Users/ac/.npm-packages/lib/node_modules/yo/lib/cli.js',
'nodehttps'], { stdio: 'inherit' });
});
You can use node-which:
var which = require('which');
which.sync('yo');
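Put together with the task from the question, that might look like this (a sketch; note that which.sync('yo') resolves the yo executable found on your PATH, which on Unix is the JS entry point so node can run it directly, while on Windows it would resolve to yo.cmd):

var gulp = require('gulp');
var spawn = require('child_process').spawn;
var which = require('which');

gulp.task('run-yo', function () {
  // No hard-coded global node_modules path: which resolves it from PATH
  spawn('node', ['--debug', which.sync('yo'), 'nodehttps'], { stdio: 'inherit' });
});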