I have a Hugo site and I'm using Netlify CMS to manage the content. Every time the site builds, it runs a Gulp task that optimizes images from the src folder into the static folder. The problem is that when I upload an image through the CMS, it is stored in the static folder.
So, in the admin config.yml, should I set media_folder to src/images instead?
My thinking is that the task will then run and store the new minified image in the static folder, but is that right? Or is there another way to do this?
Gulp task:
gulp.task('images', () => {
  return gulp.src('src/images/**/*.{png,jpg,jpeg,gif,svg,webp,ico}')
    .pipe($.newer('static/images'))
    .pipe($.print())
    .pipe($.imagemin([
      $.imagemin.jpegtran({progressive: true}),
      $.imagemin.optipng({optimizationLevel: 5}),
    ]))
    .pipe(gulp.dest('static/images'));
});
Admin config.yml:
media_folder: "static/images"
public_folder: "images"
Just configure Netlify CMS to upload files to a different location, e.g. a page bundle; then Hugo can take care of image optimisation natively.
In your content repository, you can add a build script (build & deploy if hosted on Netlify) that resizes and optimises images and puts them into a new folder whenever it detects new content. Most importantly, it can remove EXIF data such as geolocation.
// Build script: resize and optimise any new images from ./media into ./images,
// stripping EXIF metadata in the process.
const path = require('path');
const gm = require('gm');
const fs = require('fs-extra');
const klaw = require('klaw');

const mediaDir = path.resolve(__dirname, 'media');
const imagesDir = path.resolve(__dirname, 'images');

// Output widths; when `rename` is true the size is appended to the file name.
const sizes = [
  {size: 1280, rename: false},
  {size: 640, rename: true},
  {size: 320, rename: true},
];

const imagesToProcess = [];

(async () => {
  await fs.ensureDir(imagesDir);

  klaw(mediaDir)
    .on('data', (item) => {
      // Queue every file that has not been copied into the images dir yet.
      const stat = fs.lstatSync(item.path);
      const copyPath = path.resolve(imagesDir, path.basename(item.path));
      if (stat.isFile() && !fs.pathExistsSync(copyPath)) {
        imagesToProcess.push([item.path, copyPath]);
      }
    })
    .on('end', () => {
      // Process the queued images sequentially, one size at a time.
      imagesToProcess.reduce((promise, [originalImage, copyPath]) => {
        return sizes.reduce((promise, sizeObject) => {
          return promise.then(() => new Promise((resolve) => {
            const target = sizeObject.rename
              ? copyPath.replace('.jpg', `-${sizeObject.size}.jpg`)
              : copyPath;
            gm(originalImage)
              .noProfile() // drop EXIF/ICC profiles (e.g. geolocation)
              .resizeExact(sizeObject.size, sizeObject.size)
              .quality(75)
              .write(target, () => resolve());
          }));
        }, promise);
      }, Promise.resolve());
    });
})();
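If this runs on Netlify, one way to wire it up (an assumption, not stated in the original answer) is to save the script as, say, process-images.js and chain it before Hugo in the build command, for example node process-images.js && hugo, so newly uploaded media is optimised on every deploy.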
I have an Electron.js application:
const path = require('path');
const url = require('url');
const {app, BrowserWindow, ipcMain, nativeTheme, globalShortcut} = require('electron');

let win;

function createWindow() {
  win = new BrowserWindow({
    width: 1024,
    height: 1024,
    icon: path.join(__dirname, "web/img/app.png"),
    fullscreen: true,
    autoHideMenuBar: true,
    webPreferences: {
      preload: path.join(__dirname, "web/js/preload.js"),
    }
  });
  win.loadURL(url.format({
    pathname: path.join(__dirname, "web/html/index.html"),
    protocol: 'file',
    slashes: true
  }));
  win.webContents.openDevTools();
  win.on('closed', () => {
    win = null;
  });
  ipcMain.on('sys-shutdown', () => {
    app.quit();
  });
}

app.on('ready', createWindow);

app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') {
    app.quit()
  }
})
render.js:
window.onload = function() {
  console.log("LOAD!");
  const sys_shutdown = document.getElementById("SysShutdown");
  sys_shutdown.addEventListener('click', async () => {
    console.log('hello');
    await window.sys.shutdown();
  });
}
preload.js:
const { contextBridge, ipcRenderer } = require('electron');

contextBridge.exposeInMainWorld('sys', {
  shutdown: () => ipcRenderer.send('sys-shutdown')
});
When I start the project, I get this warning in the console:
Electron Security Warning (Insecure Content-Security-Policy): This renderer process has either no Content Security Policy set or a policy with "unsafe-eval" enabled. This exposes users of this app to unnecessary security risks. For more information and help, consult https://electronjs.org/docs/tutorial/security. This warning will not show up once the app is packaged.
I tried googling, but the solutions didn't work for me. I also tried allowing everything, but that didn't help either; I don't know what to do anymore.
P.S. The console.log("LOAD!") line works fine, but the code doesn't run any further.
Please help.
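One way to deal with the warning itself (a sketch under assumptions, not part of the original post) is to define a Content-Security-Policy. The Electron security tutorial shows setting it from the main process with onHeadersReceived for pages served over HTTP; for a page loaded from file:// like index.html above, an equivalent <meta http-equiv="Content-Security-Policy"> tag in the HTML does the same job. The policy string below is only a placeholder.
const { app, session } = require('electron');

app.on('ready', () => {
  // Attach a Content-Security-Policy header to every response.
  session.defaultSession.webRequest.onHeadersReceived((details, callback) => {
    callback({
      responseHeaders: {
        ...details.responseHeaders,
        'Content-Security-Policy': ["default-src 'self'; script-src 'self'"],
      },
    });
  });
});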
I have a Node.js application that creates dynamic content which I want users to be able to download.
static async downloadPDF(res, html, filename) {
  const puppeteer = require('puppeteer');
  const browser = await puppeteer.launch({
    headless: true
  });
  const page = await browser.newPage();
  await page.setContent(html, {
    waitUntil: 'domcontentloaded'
  });
  const pdfBuffer = await page.pdf({
    format: 'A4'
  });
  res.set("Content-Disposition", "attachment;filename=" + filename + ".pdf");
  res.setHeader("Content-Type", "application/pdf");
  res.send(pdfBuffer);
  await browser.close();
}
Is there a way to speed up the whole process? It takes about 10 seconds to create a PDF file of about 100 kB.
I read somewhere that I can launch the headless browser once and then only create a new page, instead of launching a browser every time the file is requested, but I can't find the correct way of doing it.
You could move page creation into a small utility and hoist the instance so it is reused.
// getPage.js
const puppeteer = require('puppeteer');

let page;

const getPage = async () => {
  if (page) return page; // reuse the already created page
  const browser = await puppeteer.launch({
    headless: true,
  });
  page = await browser.newPage();
  return page;
};

module.exports = getPage;
Then use it wherever the PDF is generated:
const getPage = require('./getPage');

static async downloadPDF(res, html, filename) {
  const page = await getPage();
  // ... generate the PDF with `page` as before
}
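A variation on the same idea (just a sketch, not from the original answer): reuse a single browser instance but open a fresh page per request, so concurrent downloads don't end up sharing one page. getBrowser.js is a hypothetical helper name.
// getBrowser.js -- launch the browser once and cache the promise
const puppeteer = require('puppeteer');

let browserPromise;

const getBrowser = () => {
  if (!browserPromise) {
    browserPromise = puppeteer.launch({ headless: true });
  }
  return browserPromise;
};

module.exports = getBrowser;
Usage in the handler:
const getBrowser = require('./getBrowser');

static async downloadPDF(res, html, filename) {
  const browser = await getBrowser();
  const page = await browser.newPage(); // one page per request
  try {
    await page.setContent(html, { waitUntil: 'domcontentloaded' });
    const pdfBuffer = await page.pdf({ format: 'A4' });
    res.set("Content-Disposition", "attachment;filename=" + filename + ".pdf");
    res.setHeader("Content-Type", "application/pdf");
    res.send(pdfBuffer);
  } finally {
    await page.close(); // close the page, keep the browser alive
  }
}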
Yes, there is no reason to launch the browser every time. You can have Puppeteer navigate to a new URL and fetch the content; without relaunching each time, it will be much faster.
How do you implement this? Split your function into three steps:
Create a browser instance. It doesn't matter whether it is headless or not; if you run the app in an X environment you can launch a visible window to see what Puppeteer does.
Write the code that does the main task in a loop.
After each iteration is done, call await page.goto(url) (where page is the instance returned by browser.newPage()) and run your function again.
This is one possible solution in a functional style:
Create the instances:
const browser = await puppeteer.launch({ 'headless': false });
const page = await browser.newPage();
await page.setViewport({ 'width': 1280, 'height': 1024 });
I put it inside an immediately invoked async function, like (async () => { ... })();
Get the data.
In my case, a set of URLs was stored in MongoDB; after fetching them, I ran a loop:
for (const entrie of entries) {
  const url = entrie[1];
  const id = entrie[0];
  await get_aplicants_data(page, url, id, collection);
}
In get_aplicants_data() I implemented the logic for the loaded page:
await page.goto(url); // navigate to the url
// ... code to process the page data
You can also load the URLs in a loop and then apply your own logic.
Hope this helps!
Apify can crawl links from sitemap.xml
const Apify = require('apify');

Apify.main(async () => {
  const requestList = new Apify.RequestList({
    sources: [{ requestsFromUrl: 'https://edition.cnn.com/sitemaps/cnn/news.xml' }],
  });
  await requestList.initialize();
  const crawler = new Apify.PuppeteerCrawler({
    requestList,
    handlePageFunction: async ({ page, request }) => {
      console.log(`Processing ${request.url}...`);
      await Apify.pushData({
        url: request.url,
        title: await page.title(),
        html: await page.content(),
      });
    },
  });
  await crawler.run();
  console.log('Done.');
});
https://sdk.apify.com/docs/examples/puppeteersitemap#docsNav
However, I am not sure how to crawl links from sitemap.xml if I am using a requestQueue. For example:
const requestQueue = await Apify.openRequestQueue();
await requestQueue.addRequest({ url: "https://google.com" });

// This is not working. Apify is simply crawling sitemap.xml
// and not adding the urls from sitemap.xml to the requestQueue.
await requestQueue.addRequest({ url: `https://google.com/sitemap.xml` });

const crawler = new Apify.PuppeteerCrawler({
  requestQueue,
  // This function is called for every page the crawler visits
  handlePageFunction: async (context) => {
    const { request, page } = context;
    const title = await page.title();
    let page_url = request.url;
    console.log(`Title of ${page_url}: ${title}`);
    await Apify.utils.enqueueLinks({
      page, selector: 'a', pseudoUrls, requestQueue });
  },
});

await crawler.run();
The great thing about Apify is that you can use both a RequestList and a RequestQueue together. In that case, items are taken from the list into the queue as you scrape (so the queue is not overloaded), and you get the best of both worlds.
Apify.main(async () => {
  const requestList = new Apify.RequestList({
    sources: [{ requestsFromUrl: 'https://edition.cnn.com/sitemaps/cnn/news.xml' }],
  });
  await requestList.initialize();
  const requestQueue = await Apify.openRequestQueue();
  const crawler = new Apify.PuppeteerCrawler({
    requestList,
    requestQueue,
    handlePageFunction: async ({ page, request }) => {
      console.log(`Processing ${request.url}...`);
      // This is just an example, define your own logic
      await Apify.utils.enqueueLinks({
        page, selector: 'a', pseudoUrls: null, requestQueue,
      });
      await Apify.pushData({
        url: request.url,
        title: await page.title(),
        html: await page.content(),
      });
    },
  });
  await crawler.run();
  console.log('Done.');
});
If you want to use just the queue, you will need to parse the XML yourself. Of course, that is not a big issue; you can parse it easily with Cheerio, either before starting the crawler or by using Apify.CheerioCrawler.
Anyway, we recommend using a RequestList for bulk URLs because it is created almost instantly in memory, whereas the queue is actually a database (or JSON files locally).
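A rough sketch of the "parse it yourself" approach (the fetch helper and exact options are assumptions; adjust to your SDK version): download the sitemap, pull the <loc> URLs out with Cheerio in XML mode, and add each one to the queue before the crawler starts.
const Apify = require('apify');
const cheerio = require('cheerio');

Apify.main(async () => {
  const requestQueue = await Apify.openRequestQueue();

  // Fetch the sitemap XML (requestAsBrowser comes from the Apify SDK utils;
  // any HTTP client would work here).
  const { body } = await Apify.utils.requestAsBrowser({
    url: 'https://edition.cnn.com/sitemaps/cnn/news.xml',
  });

  // Parse as XML and enqueue every <loc> entry.
  const $ = cheerio.load(body, { xmlMode: true });
  const urls = [];
  $('loc').each((i, el) => urls.push($(el).text().trim()));
  for (const url of urls) {
    await requestQueue.addRequest({ url });
  }

  const crawler = new Apify.PuppeteerCrawler({
    requestQueue,
    handlePageFunction: async ({ page, request }) => {
      console.log(`Title of ${request.url}: ${await page.title()}`);
    },
  });

  await crawler.run();
});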
I'm working on an app built with Electron, and a config file is saved after login.
When the app is reopened in development it works, but when it's reopened after being distributed with electron-builder, I have to log in again.
Code
const {writeFile} = require('fs');
const {join} = require('path');

function write(config, cb) {
  writeFile(join(__dirname, './config.json'), JSON.stringify(config, null, 4), (err) => {
    cb(err);
  });
}

write(someJsonObject, (err) => {
  if (err) console.error(err);
});
Project:
Electron 1.7.11
Electron-builder 19.55.2
Node 8.8.1
First, check that you are not packaging your production app into an asar file, since asar archives are read-only and you are trying to write directly to __dirname.
My suggestion is to use a writable directory for your configuration file, for example:
const fs = require('fs');
const _HOME_ = require('os').homedir();
const _SEP_ = require('path').sep;
const _APPHOME_ = `${_HOME_}${_SEP_}.myapp${_SEP_}`;

// Check that the dir exists, or create it
if (!fs.existsSync(_APPHOME_)) {
  fs.mkdirSync(_APPHOME_, 0o777);
  console.log('Created app home dir :)');
}

fs.writeFile(_APPHOME_ + 'config.json', JSON.stringify(config, null, 4), (err) => {
  cb(err);
});
Of course, you may want to adapt this solution to your needs. Hope it works for you!
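As a further option (a sketch, not part of the original answer), Electron already exposes a per-user writable directory through app.getPath('userData'), which avoids hard-coding a dot-folder in the home directory:
const { app } = require('electron');
const fs = require('fs');
const path = require('path');

// userData points to a writable, per-user application data directory.
const configPath = path.join(app.getPath('userData'), 'config.json');

function writeConfig(config, cb) {
  fs.writeFile(configPath, JSON.stringify(config, null, 4), cb);
}

function readConfig() {
  if (!fs.existsSync(configPath)) return null;
  return JSON.parse(fs.readFileSync(configPath, 'utf8'));
}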
I have a gulpfile that runs Sass and injects my CSS, app, and Bower JS files in separate tasks. At the end, I would like to watch my SCSS files and run Sass on change, and watch my application code and inject new files (with a view to adding livereload once I can get this working).
Unfortunately, I've run into a problem where my watch tasks end early, so no changes are being watched for.
I've read in another question that the watch tasks have to be returned; however, this did not help.
Here is my gulpfile (with the prod tasks removed):
const gulp = require('gulp')
// plugins
const concat = require('gulp-concat')
const uglify = require('gulp-uglify')
const rename = require('gulp-rename')
const closure = require('gulp-jsclosure')
const rev = require('gulp-rev')
const sass = require('gulp-sass')
const cssmin = require('gulp-cssmin')
const bower = require('main-bower-files')
const strip = require('gulp-strip-comments')
const inject = require('gulp-inject')
const ngannotate = require('gulp-ng-annotate')
const mainBowerFiles = require('main-bower-files')
const templateCache = require('gulp-angular-templatecache')
const minifyHtml = require('gulp-minify-html')
const path = require('path')
const babel = require('gulp-babel')
const es = require('event-stream')
// configuration variables
const config = {
  dir: {
    root: 'client/',
    app: './Dist/',
    vendor: './Dist/',
    css: 'client/',
    cssMin: './Dist/',
    bower: 'client/wwwroot/lib',
    index: './Views/Shared/_Layout.cshtml',
    indexDir: './Views/Shared',
    modules: 'client/app/modules'
  },
  filters: {
    app: 'client/app.js',
    appModules: 'client/app/modules/**/*.module.js',
    appModuleFiles: 'client/app/modules/**/*.js',
    vendors: 'client/vendors/**/*.js',
    templates: 'client/app/modules/**/*.html',
    client: 'client/app/css/**/*.scss'
  }
}
//----------CSS---------
gulp.task('sass', () => {
  return gulp.src(['client/app/css/bootstrap.scss', 'client/app/css/app.scss', 'client/app/css/themes/theme-f.scss'])
    .pipe(sass().on('error', sass.logError))
    .pipe(gulp.dest('build/css'));
})

gulp.task('inject-css', ['sass'], () => {
  var sources = gulp.src(['build/css/bootstrap.css', 'build/css/app.css', 'build/css/theme-f.css'], { read: false }, { name: 'css' });
  return gulp.src(config.dir.index)
    .pipe(inject(sources))
    .pipe(gulp.dest(config.dir.indexDir));
})
//--------End CSS-------
//----------App---------
gulp.task('inject-js', ['inject-bower'], () => {
  var sources = gulp.src([config.filters.app, config.filters.appModules, config.filters.appModuleFiles], { read: false });
  return gulp.src(config.dir.index)
    .pipe(inject(sources))
    .pipe(gulp.dest(config.dir.indexDir));
})
//--------End App-------
//---------Vendor-------
gulp.task('inject-bower', ['inject-css'], () => {
  let bower = gulp.src(mainBowerFiles(), { read: false }, { relative: true })
  return gulp.src(config.dir.index)
    // .pipe(inject(es.merge(bower, vendors), { name: 'vendor' }))
    .pipe(inject(bower, { name: 'vendor' }))
    .pipe(gulp.dest(config.dir.indexDir));
})
//-------End Vendor-----
//----------Prod--------
//---------Watches-------
gulp.task('scripts:watch', () => {
  return gulp.watch([config.filters.app, config.filters.appModuleFiles], ['inject-js']);
})

gulp.task('sass:watch', () => {
  return gulp.watch(config.filters.scss, ['sass']);
})

gulp.task('watch', ['scripts:watch', 'sass:watch'], () => {
  console.log(" ");
  console.log("Watching for changes...");
  console.log(" ");
})
//-------End Watches-----
//----------Tasks--------
gulp.task('default', ['inject-js', 'inject-bower', 'inject-css', 'watch'], () => {
  // gulp.watch([config.filters.app, config.filters.appModuleFiles], ['inject-js'])
  // gulp.watch(config.filters.scss, ['sass'])
})
As you can probably see from the commented-out code, I have also tried running these watches directly inside the default task.
This is the console output from running gulp:
[12:48:12] Using gulpfile C:\source\Icon.Admin\src\UI\Icon.Admin\gulpfile.js
[12:48:12] Starting 'scripts:watch'...
[12:48:12] Finished 'scripts:watch' after 145 ms
[12:48:12] Starting 'sass:watch'...
[12:48:12] Finished 'sass:watch' after 180 μs
[12:48:12] Starting 'watch'...
Watching for changes...
[12:48:12] Finished 'watch' after 159 μs
Any help with this is greatly appreciated!
Fixed this by running my watch tasks inside of the main watch task:
gulp.task('watch', [], () => {
  console.log(" ");
  console.log("Watching for changes...");
  console.log(" ");
  gulp.watch([config.filters.app, config.filters.appModuleFiles], ['inject-js']);
  gulp.watch(config.filters.client, ['sass']);
})
Well, inside your sass:watch you refer to config.filters.scss, but that key is nowhere to be found in the config object. That is probably why your SCSS isn't compiling.
...
gulp.task('scripts:watch', () => {
  return gulp.watch([config.filters.app, config.filters.appModuleFiles], ['inject-js']);
})

gulp.task('sass:watch', () => {
  return gulp.watch(config.filters.client, ['sass']);
})

gulp.task('default', ['inject-js', 'inject-bower', 'inject-css', 'scripts:watch', 'sass:watch']);
This should be all you need.