I am using FeathersJS for my REST API and Primus for the WebSocket connection. Below is the code I use to generate the primus.js file:
app.configure(primus({
  transformer: 'websockets',
  timeout: false
}, (primus) => {
  primus.library();
  primus.save(path.join(__dirname, '../public/dist/primus.js'));
}));
To let my client use the generated primus.js file, I have to serve it from my server. The client then loads it like this:
<script src='http://xxxxxx/public/dist/primus.js'></script>
But my client uses webpack to bundle every dependency into a few big JS files. How can I bundle primus.js on the client side if it is an auto-generated file?
I don't believe you can out of the box, but it looks like there is a primus-webpack-plugin:
This plugin allows you to pass in your Primus options and then adds the client library to your Webpack build assets.
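A sketch of how that might look in webpack.config.js (the option names here are my recollection of the plugin's README, so treat them as assumptions and check the plugin docs):

```javascript
// webpack.config.js (sketch; assumes `npm install primus-webpack-plugin`)
const PrimusWebpackPlugin = require('primus-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  plugins: [
    new PrimusWebpackPlugin({
      // Assumed option names; mirror the server-side Primus options here
      // so the emitted client matches the server configuration.
      filename: 'primus-client.js',
      primusOptions: {
        transformer: 'websockets'
      }
    })
  ]
};
```

The plugin then emits the Primus client library as part of the webpack build instead of you serving the server-generated file separately.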
The JavaScript used by Google Apps Script does not include the URL API for parsing URLs. It also does not support complex (Perl-like lookbehind) regular expressions. As far as I know you can't import public libraries. This makes it hard, verbose and unreliable to parse out URL elements.
However, it does support web calls through URLFetchApp and the REST API. Is there a parsing server on the internet that hosts the URL API and can be called via URLFetchApp or the built-in REST API? I cannot find one easily. Other solutions are welcome.
I have a working solution only for US-based URLs; international URLs break my regex. I would prefer a robust solution that does not depend on regex.
If you want to know the problem I'm dealing with:
I need to compare two URLs and see if the second URL is on a subdomain, in a directory, or the same as the home page.
function scoreURL(urlOne, urlTwo) {
  // Capture the host (group 1) and the path (group 2).
  const regexSubdomain = /(?:http[s]?:\/\/)?([^\/\s]+)(\/.*)?/;
  const urlOneArray = urlOne.split(regexSubdomain);
  const urlTwoArray = urlTwo.split(regexSubdomain);
  // Escape the dot and anchor the pattern so only a literal "www." prefix is stripped.
  const subdomainOne = urlOneArray[1].replace(new RegExp('^www\\.', 'i'), '');
  const subdomainTwo = urlTwoArray[1].replace(new RegExp('^www\\.', 'i'), '');
  // Return -1 if the landing page is on a subdomain, 0 if it is a separate page,
  // 1 if it is the home page.
  if (subdomainOne === subdomainTwo) {
    if (urlOneArray[2] === urlTwoArray[2]) {
      return 1;
    } else {
      return 0;
    }
  } else {
    return -1;
  }
}
The basic URL API links to a polyfill in the core-js module.
The URL polyfill uses multiple require statements that are not directly supported in Apps Script.
You can manually copy-paste all the required files from the parent directory and remove all the require dependencies, OR
use webpack locally in Node.js to bundle the polyfill.
Install webpack and core-js:
mkdir webpack
cd webpack
npm init -y
npm install webpack webpack-cli --save-dev
npm install --save core-js@3.18.3
src/index.js:
import 'core-js/web/url'
Bundle with webpack
npx webpack
Copy the resulting bundled JS (in dist/main.js) to a file (url.gs) in Apps Script.
You'll now be able to use URL and URLSearchParams in the global scope.
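With the polyfill in place, the URL-comparison problem from the question can be solved without the fragile regex. A sketch (this mirrors the question's scoreURL but is not the original code; note that, unlike the regex, `new URL(...)` requires an absolute URL including the scheme):

```javascript
// Regex-free version of scoreURL using the (polyfilled) URL class.
function scoreURL(urlOne, urlTwo) {
  const a = new URL(urlOne);
  const b = new URL(urlTwo);
  // Strip a literal leading "www." so www.example.com and example.com compare equal.
  const hostA = a.hostname.replace(/^www\./i, '');
  const hostB = b.hostname.replace(/^www\./i, '');
  if (hostA !== hostB) return -1;            // landing page is on a (sub)domain mismatch
  return a.pathname === b.pathname ? 1 : 0;  // 1 = same/home page, 0 = separate page
}

console.log(scoreURL('https://www.example.com/', 'https://example.com/')); // 1
console.log(scoreURL('https://example.com/a', 'https://example.com/b'));   // 0
console.log(scoreURL('https://sub.example.com/', 'https://example.com/')); // -1
```

This handles international hosts the same way as US ones, since no character-class assumptions are made about the hostname.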
Inspired by this sample repository, I'm generating swagger output in JSON with protoc and serving it. However, for certain reasons I'm hosting the swagger content on a different port (:10000) than my REST API service (:8081).
I'm using the Go library statik to bundle up the swagger assets and serve them. It works, and a webpage is served when going to localhost:10000.
However, every request Swagger UI makes seems to be confined to just that: localhost:10000. The REST API lives on localhost:8081.
Serving swagger-ui with static content, how do I change the host/port of the REST API server?
I've tried going into the index.html of the swagger-ui content to add basePath, as here, but with no luck; every request is still made to :10000.
window.onload = function() {
  // Begin Swagger UI call region
  const ui = SwaggerUIBundle({
    url: "./service.swagger.json",
    dom_id: '#swagger-ui',
    deepLinking: true,
    presets: [
      SwaggerUIBundle.presets.apis,
      SwaggerUIStandalonePreset
    ],
    plugins: [
      SwaggerUIBundle.plugins.DownloadUrl
    ],
    layout: "StandaloneLayout",
    // I added this, but it did not change anything.
    basePath: "localhost:8081"
  })
  // End Swagger UI call region
  window.ui = ui
}
Add host with the value localhost:8081.
Remove basePath; basePath is used to change the relative path after the host, i.e. if your web server is hosted at /v1/ etc., then you can use basePath to change that.
I am still trying to find out how to pass the host value dynamically for production, staging and dev environments.
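For example, the top level of service.swagger.json would gain a host field (field names per the OpenAPI 2.0 specification; the info and paths values below are illustrative placeholders):

```json
{
  "swagger": "2.0",
  "info": { "title": "example service", "version": "1.0" },
  "host": "localhost:8081",
  "schemes": ["http"],
  "paths": {}
}
```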
TL;DR:
Any suggestions in Node.js for converting HTML to PDF or PNG without any headless browser instances?
Also, does anyone use Puppeteer in a production environment? I would like to know about the resource utilisation and performance of running a headless browser in prod.
Longer version:
In a Node.js server we need to convert an HTML string to a PDF or PNG based on the request params. We are using Puppeteer, deployed in a Google Cloud Function, to generate the PDF and PNG (screenshot). Locally, running this application in Docker with memory restricted to 100 MB seems to work. But the Cloud Function throws a memory limit exception when set to 250 MB of memory. As a temporary solution we upgraded the Cloud Function to 1 GB.
We would like to try alternatives to Puppeteer that avoid a headless browser. Another library, PDFKit, looks good, but its input is a canvas-style API; we can't feed it HTML directly.
Any thoughts or input on this?
Any suggestions in Node.js for converting HTML to PDF or PNG without any headless browser instances?
Yes, you can try jsPDF. I have never used it before.
The syntax is simple.
Under the hood it looks like no headless browser libraries are used; it seems to be a 100% pure JavaScript implementation.
You can feed the library directly with an HTML string.
BUT there is no PNG option. For images there are anyway a lot of solutions that could be combined with jsPDF (so, HTML to PDF to PNG), as well as direct HTML-to-PNG solutions. Take a look here.
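A minimal sketch of the HTML-string input (based on jsPDF's documented .html() method, which itself depends on html2canvas; a browser context is assumed, so this is an illustration rather than something to run server-side as-is):

```javascript
import { jsPDF } from 'jspdf';

const doc = new jsPDF();
// .html() accepts an HTML string or element; html2canvas must be installed alongside.
doc.html('<h1>Invoice</h1><p>Total: 10 USD</p>', {
  callback: (pdf) => pdf.save('invoice.pdf'),
});
```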
Also, does anyone use Puppeteer in a production environment? I would like to know about the resource utilisation and performance of running a headless browser in prod.
If you want to use Puppeteer, I suggest splitting services: a simple HTTP server that just handles the HTTP communication with your clients, and a separate Puppeteer service. Both services must be scalable but, of course, the second will require more resources to run. To optimize resources, I suggest using puppeteer-cluster to create a cluster of Puppeteer workers. You can better handle errors, flow and concurrency, and at the same time you can save memory by using a single instance of Chromium (with the CONCURRENCY_PAGE or CONCURRENCY_CONTEXT model).
If you can use Docker, then a great solution for you may be Gotenberg.
It's an incredible service that can convert many formats (HTML, Markdown, Word, Excel, etc.) into PDF.
If your page's rendering depends on JavaScript, no problem: it will run it and wait (you can even configure the max wait time) for the page to be completely rendered before generating your PDF.
We are using it for an application that generates 3000 PDFs per day and have never had any issue with it.
Demo:
Take a look at this sample HTML invoice: https://sparksuite.github.io/simple-html-invoice-template/
Now let's convert it to PDF:
Boom, done!
1: The Gotenberg URL (here using a demo endpoint provided by the Gotenberg team, with some limitations such as 2 requests per second per IP and a 5 MB body limit)
2: Pass a url form parameter with the URL of the webpage you want to convert
3: You get the PDF as the HTTP response, with Content-Type application/pdf
Curl version:
curl --location --request POST 'https://demo.gotenberg.dev/forms/chromium/convert/url' \
--form 'url="https://sparksuite.github.io/simple-html-invoice-template/"' \
-o myfile.pdf
Node.JS version:
const fetch = require('node-fetch');
const FormData = require('form-data');
const fs = require('fs');

async function main() {
  const formData = new FormData();
  formData.append('url', 'https://sparksuite.github.io/simple-html-invoice-template/');
  const res = await fetch('https://demo.gotenberg.dev/forms/chromium/convert/url', {
    method: 'POST',
    body: formData
  });
  const pdfBuffer = await res.buffer();
  // You can do whatever you like with the pdfBuffer, such as writing it to disk:
  fs.writeFileSync('/home/myfile.pdf', pdfBuffer);
}

main();
Using your own Docker instance instead of the demo endpoint, here is what you need to do:
1. Create the Gotenberg Docker container:
docker run -p 3333:3000 gotenberg/gotenberg:7 gotenberg
2. Call the http://localhost:3333/forms/chromium/convert/url endpoint:
Curl version:
curl --location --request POST 'http://localhost:3333/forms/chromium/convert/url' \
--form 'url="https://sparksuite.github.io/simple-html-invoice-template/"' \
-o myfile.pdf
Node.JS version:
const fetch = require('node-fetch');
const FormData = require('form-data');
const fs = require('fs');

async function main() {
  const formData = new FormData();
  formData.append('url', 'https://sparksuite.github.io/simple-html-invoice-template/');
  const res = await fetch('http://localhost:3333/forms/chromium/convert/url', {
    method: 'POST',
    body: formData
  });
  const pdfBuffer = await res.buffer();
  // You can do whatever you like with the pdfBuffer, such as writing it to disk:
  fs.writeFileSync('/home/myfile.pdf', pdfBuffer);
}

main();
Gotenberg homepage: https://gotenberg.dev/
If you have access to the wkhtmltopdf command, I recommend it.
We use it with success on our production website to generate PDFs.
First generate the HTML file ({file_name} below), then:
wkhtmltopdf --encoding utf8 --disable-smart-shrinking --dpi 100 -s {paper_size} -O {orientation} '{file_name}'
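With the placeholders filled in, a concrete invocation might look like this (A4, Portrait, and the file names are illustrative values; the output path is the final argument):

```shell
# Render invoice.html to invoice.pdf on A4 portrait
wkhtmltopdf --encoding utf8 --disable-smart-shrinking --dpi 100 -s A4 -O Portrait 'invoice.html' 'invoice.pdf'
```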
I'm new to Svelte and would like to create an ordering website with it. I know that I will need a database to keep track of the orders, customer names, prices, etc. I have used MySQL before, but I haven't learned how to connect a database to a website.
Is there a specific database that you can use if you are using Svelte?
Or is there a way to connect MySQL to Svelte?
I have searched for this on YouTube and Google, but I'm not sure whether it's different when you are using Svelte, so I wanted to make sure.
Note: I have not started this project yet, so I do not have any code to show; I just want to know how you can connect a database when using Svelte.
Svelte is a front-end JavaScript framework that runs in the browser.
Traditionally, to use a database like MySQL from a front-end project such as Svelte (which contains only HTML, CSS and JS), you need a separate backend project. The Svelte app and the backend then communicate over a REST API. The same applies to other front-end libraries/frameworks like React, Angular, Vue, etc.
There are still many ways to achieve this. Since you are focusing on Svelte, here are a few options:
1. Sapper
Sapper is an application framework powered by Svelte. You can also write backend code using Express or Polka, so you can connect to the database of your choice (MySQL/MongoDB).
2. Use a serverless database
If you want to keep your app simple and just focus on the Svelte app, you can use a cloud-based database such as Firebase. Svelte can talk to it directly via its JavaScript SDK.
3. Monolithic architecture
To connect to MySQL in the backend, you need a server-side programming language such as Node.js (Express), PHP, Python or whatever you are familiar with. You can then embed the Svelte app, or use an API to pass data to it.
I can give an example with MongoDB.
You have to install the library:
npm install mongodb
(or add it in package.json).
Then you have to create a connection file that you call every time you need to use the db:
const mongo = require("mongodb");

let client = null;
let db = null;

export async function init() {
  if (!client) {
    client = await mongo.MongoClient.connect("mongodb://localhost");
    db = client.db("name-of-your-db");
  }
  return { client, db };
}
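Once the connection helper exists, using it from another module might look like this (a sketch; the file name db.js, the collection name orders, and a locally running mongod are all assumptions):

```javascript
import { init } from './db.js'; // the connection file above, name assumed

async function addOrder(order) {
  const { db } = await init(); // reuses the cached client on subsequent calls
  return db.collection('orders').insertOne(order);
}

addOrder({ item: 'coffee', price: 3.5 }).then(() => console.log('saved'));
```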
For a complete example including inserts, see this video:
https://www.youtube.com/watch?v=Mey2KZDog_A
You can use PouchDB, which gives you direct access to IndexedDB in the browser. No backend is needed for this.
The client-side PouchDB can then be replicated/synced with a remote CouchDB. This can all be done from the client side, inside your Svelte app.
It is pretty easy to set up.
var db = new PouchDB('dbname');

db.put({
  _id: 'dave@gmail.com',
  name: 'David',
  age: 69
});

db.changes().on('change', function() {
  console.log('Ch-Ch-Changes');
});

db.replicate.to('http://example.com/mydb');
More on pouchdb.com.
The client can also save the data offline first and connect to a remote database later.
As I read it, the question is mostly about connecting to a backend, not a database. It is a pity, but the Svelte app template has no way to connect to a backend out of the box.
As for me, I use an Express middleware in front of the Rollup dev server. This way you are able to proxy some requests to the backend server. Check the code below:
const proxy = require('express-http-proxy');
const app = require('express')();

app.use('/data/', proxy(
  'http://backend/data',
  {
    proxyReqPathResolver: req => {
      return '/data' + req.url;
    }
  }
));

app.use('/', proxy('http://127.0.0.1:5000'));
app.listen(5001);
This script opens port 5001, where any /data/ URL is proxied to the backend server, while port 5000 is still served by the Rollup dev server. So at http://localhost:5001/ you have the Svelte instance connected to the backend via the /data/ URL; from there you can send requests to fetch data from the database.
Is there any known and consolidated alternative for defining a new Angular scope that reads its data from outside?
I am working on a demo that should produce a standalone HTML page which reads its data from the same location as the HTML file, on client machines without any web server.
This is because the HTML is generated on the fly from a PDF.
Do you have any idea?
In my working code below I should change $http.get('data.json'... to avoid the Chrome restriction (on Firefox my sample works fine).
<script>
var isisApp = angular.module('isisApp', []);

isisApp.controller('ISISListCtrl', function($scope, $http) {
  $http.get('data.json').success(function(data) {
    $scope.IsisDocument = data;
etc.....
and this is the error I get from Chrome:
XMLHttpRequest cannot load file:///C:/temp/data.json. Cross origin requests are only supported for HTTP. angular.js:8081
Error: A network error occurred.
Thanks in advance
Fabio
If you want to test your code while developing, you have two options:
Use a local web server. You could use the Node.js platform, with Express.
Start Chrome from the terminal with the --allow-file-access-from-files option.
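For the first option, a minimal static server could look like this (a sketch; it assumes npm install express and that index.html and data.json sit next to server.js):

```javascript
// server.js: serve the current folder over HTTP so $http.get('data.json')
// is a same-origin request instead of a blocked file:// request.
const express = require('express');
const app = express();

app.use(express.static(__dirname)); // exposes index.html and data.json

app.listen(8080, () => {
  console.log('Open http://localhost:8080/ instead of the file:/// path');
});
```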