I have several instances of a react-slick carousel. Each of them requires a different set of config options.
Currently, I have the carousel component bundled up via webpack and then deployed to multiple locations. Unfortunately, this means the bundle is slightly different in each case, because the config file changes the overall bundle!
So far I can think of the following solutions:
1) Load the config file asynchronously. This seems like a lazy solution, and the extra round trip feels like overkill.
2) Try to use require.ensure to split the config file out into its own chunk.
What's the right approach for this solution?
Thanks!
To reply to point 1, I've managed to accomplish runtime loading of the config this way:
import xhr from 'xhr'

class Config {
  // Synchronously loads config.json and copies every entry onto the instance.
  // The optional callback fires once the config has been applied.
  load_external_config = (cb) => {
    xhr.get("config.json", {
      sync: true,
      timeout: 3000
    }, (error, response, body) => {
      if (!error && response.statusCode === 200) {
        try {
          const conf = JSON.parse(body);
          for (const key in conf) {
            this[key] = conf[key];
          }
          if (cb) cb(null, this);
        } catch (e) {
          /* Manage error */
        }
      } else {
        /* Manage error */
      }
    })
  }
}

export let config = new Config();
The class above does two basic things. On the one hand, it is a "singleton": every time you import it in a file of your project, the instance remains the same and is not duplicated. On the other hand, through an XHR package it loads (synchronously) an external JSON file and copies every config entry onto the instance as a first-level attribute. Later, you will be able to do this:
import { config } from './config'
config.load_external_config();
config.MY_VAR
For point 2 I would like to see some examples, and I will stay tuned to this post for someone more skilled than me.
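In the meantime, here is a minimal, untested sketch of what that split might look like (the ./carousel-config path and the done callback are assumptions, not code from the question):

// A hypothetical split point: webpack treats require.ensure as an async
// boundary and emits ./carousel-config into its own chunk, fetched on demand.
function loadCarouselConfig(done) {
  require.ensure(['./carousel-config'], function (require) {
    var config = require('./carousel-config');
    done(config); // hand the config to whatever initializes the carousel
  });
}

This way the shared carousel bundle stays identical across deployments; only the emitted config chunk differs.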
I need to be able to add a bunch of files to a common directory using the mutable file system from ipfs.files.* without revealing this content to the IPFS network until it is needed.
This is the code I have been using to test:
const onDrop = useCallback(async (acceptedFiles) => {
  const test = JSON.stringify({ test: "test" });
  const testFile = new File([test], "test.json");
  try {
    for (let i = 0; i < acceptedFiles.length; i++) {
      console.log(acceptedFiles[i].path);
      await ipfs.files.write(acceptedFiles[i].path, acceptedFiles[i], { parents: true });
    }
    await ipfs.files.write("/test.json", [testFile], { create: true });
  } catch (e) {
    console.log(e);
  }
  console.log((await ipfs.files.stat("/")).cid.toV1().toString());
}, []);
This throws me the error:
ReferenceError: global is not defined
at toAsyncIterator (to-async-iterator.js?6924:44:1)
at eval (write.js?db08:99:37)
at eval (create-lock.js?5096:33:1)
at async mfsWrite (write.js?db08:98:1)
caused by the ipfs.files.write() statement in the loop.
I initially added content through the ipfs.addAll() function, but this did not seem to work for adding it to the global MFS directory.
I think I am using the documentation right, but I might be trying to use MFS for something it was not intended for. If anyone knows the best way to achieve combining many files into a directory without revealing the content to the IPFS network, please let me know.
Is it possible to reference a React App that is running on another server using
<img src="https://www.react_app.com">
The idea is that the React App returns an image string (or similar) like this:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgA ...
So that it can be read in an <img src=""> tag?
The main question is: what React code would simply send back a response containing the string so that it can be read in src=""?
Also, is there a timeout on how long an <img src=""> attempts to fetch an image?
React component imports
import React, { useCallback, useEffect, useState, useRef } from 'react'
import classNames from 'classnames'
import { fabric } from 'fabric'
import fabricConfig from './fabricConfig'
import FileUploader from './components/FileUploader'
import ColorPicker from './components/ColorPicker'
import Checkbox from './components/Checkbox'
import Button from './components/Button'
import getRatio from './utils/getRatio'
import getInitialCanvasSize from './utils/getInitialCanvasSize'
import getImageFromURL from './utils/getImageFromURL'
import resizeCanvas from './utils/resizeCanvas'
import removeSelectedElements from './utils/removeSelectedElements'
import getCanvasObjectFilterRGB from './utils/getCanvasObjectFilterRGB'
import setAttributes from './utils/setAttributes'
import { Z, Y, DELETE } from './utils/constants'
Fetch image from URL and automatically make changes to it on load
const imageUrl = "www.something.com/image"

if (imageUrl) {
  new Promise(resolve => fabric.loadSVGFromURL(imageUrl, (objects, options) => {
    const group = new fabric.Group(objects)
    resolve(getRatio(group, canvas))
  }))
    .then(({ ratio, width, height }) => {
      fabric.loadSVGFromURL(imageUrl, (objects, options) => {
        try {
          objects.forEach(obj => {
            setAttributes(obj, {
              left: (obj.left * ratio) + ((canvas.width / 2) - ((width * ratio) / 2)),
              top: (obj.top * ratio) + ((canvas.height / 2) - ((height * ratio) / 2)),
            })
            obj.scale(ratio)
            // MAKE EDITS TO THE SVG OBJECT HERE
            canvas.add(obj)
          })
          canvas.renderAll()
          // HERE I AM TRYING TO SAVE THE CANVAS STATE AND SEND IT BACK TO THE
          // THIRD PARTY WEBSITE USING GET PARAMETERS
          var canvasImg = ''
          if (urlParams.get("export") === "png") {
            canvasImg = canvas.toDataURL("image/png")
          } else if (urlParams.get("export") === "pdf") {
            // note: toDataURL has no "image/pdf" type; browsers fall back to PNG
            canvasImg = canvas.toDataURL("image/pdf")
          } else {
            onCanvasModified(canvas)
          }
        } catch (err) {
          console.log('Could not retrieve that image')
        }
      })
    })
}
What you want is a CDN, which serves image assets via a GET request (an img src accepts a string which it uses to fetch (GET) content). In short, a CDN serves the application with assets -- be it images, JavaScript, CSS or HTML. A React application is designed to update content in place by manipulating a virtual DOM; therefore, expecting it to serve assets across domains is an anti-pattern. Instead, you would use a custom server (like express) or a web server (like nginx) to serve static assets.
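For instance, a minimal express setup for serving static images might look like this (a sketch; the directory, port and route are assumptions):

const express = require('express');
const path = require('path');

const app = express();

// Anything in ./public/images is now reachable via GET /images/<name>,
// which is exactly the kind of URL an <img src="..."> will request.
app.use('/images', express.static(path.join(__dirname, 'public', 'images')));

app.listen(4000, () => console.log('image server listening on :4000'));

The frontend then simply points at it: <img src="http://localhost:4000/images/logo.png" />.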
As a simple example, imgur.com would be the React application, while i.imgur.com would be their CDN to serve images like this and s.imgur.com would be their CDN to serve CSS/JS assets like this.
This answer goes into more detail on how to do it; HOWEVER, this is only one of many, many ways to accomplish the above, but the concept is still the same: making a request to retrieve an image via an img src string.
I hesitate to provide full example code since I have no idea what stack you're working with and what your knowledge/comfort-level is regarding backend services. As such, if you want practice consuming a request that serves images on the frontend, then I'd recommend you start with this API service.
Example
Here's one of many ways to do it: Example Repo
To run the example...
1.) Clone the repo: git clone git@github.com:mattcarlotta/save-canvas-example.git
2.) Install dependencies: yarn or npm i
3.) Run yarn dev or npm run dev
4.) Click one of the buttons to either save an image as PNG or as a PDF
The example includes quite a few notes, but in brief:
1.) User clicks button. File Ref
2.) Depending on the clicked button, the canvas is converted to a Blob, which is then converted to a File. File Ref
3.) This file is then sent (via POST) to an image microservice running at http://localhost:4000 listening for requests to /upload-file. File Ref
4.) The microservice sees the request and directs it to our middleware functions. File Ref
5.) Then it directs it to the /upload-file controller. File Ref
6.) The controller determines if the file upload was valid (in the middleware); if not, it throws an error. File Ref
7.) When valid, the file details are generated from req.file (this comes from our multer middleware function), a directory is created, and a file is saved to that directory (see the sketch after this list). File Ref
8.) A filepath is then sent back to the client. File Ref
9.) Client receives the filepath from the image microservice and sets it to state. File Ref
10.) Client then renders a shareable link, a link to view the file, and a preview. File Ref
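As a rough sketch of steps 6 through 8 (assumed names and field name 'file'; not the exact code from the example repo):

const express = require('express');
const multer = require('multer');
const fs = require('fs');
const path = require('path');

const app = express();
// multer parses the multipart body and exposes the upload as req.file
const upload = multer({ storage: multer.memoryStorage() });

app.post('/upload-file', upload.single('file'), (req, res) => {
  if (!req.file) {
    return res.status(400).json({ err: 'No file was uploaded' }); // invalid upload
  }
  const dir = path.join(__dirname, 'uploads');
  if (!fs.existsSync(dir)) fs.mkdirSync(dir); // create the directory
  const filepath = path.join(dir, req.file.originalname);
  fs.writeFileSync(filepath, req.file.buffer); // save the file to that directory
  res.json({ filepath: '/uploads/' + req.file.originalname }); // back to the client
});

app.listen(4000);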
Results: screenshots of the PNG save and the PDF save, plus a flow diagram of the whole round trip.
I've tried to reproduce the project with minimal features: the user can add and interact with a rectangle and save the image. Upon saving, it goes to the server and the image data is stored in a JSON file.
Here's the link to frontend: https://codesandbox.io/s/so-fabric-client-5bjsf
As you have mentioned, there are two different React apps; I've created two routes: /draw, where the user can draw the image, and /images, where I fetch the images. You can consider these two routes as different React projects, since the logic remains the same regardless of their origin.
On the backend side, for demonstration purposes and simplicity, I've used a JSON file and send all of its content in the response when the application wants to display the images. This could become problematic once there are hundreds of images, or when you want to search them by user, so consider using a database or some other method.
here's the backend code:
const express = require("express");
const path = require("path");
const fs = require("fs");

const app = express();
const PORT = process.env.PORT || 3000;

app.use(express.static(path.join(__dirname, "build")));
app.use(express.urlencoded({ extended: true }));
app.use(express.json());

app.post("/save-image", (req, res) => {
  const image = req.body.image;
  fs.readFile("images.json", function (err, data) {
    if (err) {
      console.log(err);
      return res.sendStatus(500);
    }
    const json = JSON.parse(data);
    json.push({ id: json.length, image });
    fs.writeFile("images.json", JSON.stringify(json), (err) => {
      if (err) {
        console.log(err);
        return res.sendStatus(500);
      }
      // only respond once the file has actually been written
      res.sendStatus(200);
    });
  });
});

app.get("/get-images", (req, res) => {
  res.json(JSON.parse(fs.readFileSync("./images.json")));
});

app.listen(PORT, () => {
  console.log(`up and running on port ${PORT}`);
});
images.json is just a file with [] as its content. The code is pretty much self-explanatory; I've uploaded all of it to GitHub as well -- so-fabric-demo -- and you can check the demo on Heroku.
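For reference, the client-side save is roughly this (a sketch; the canvas variable comes from the fabric setup in the sandbox):

// Serialize the fabric canvas to a data URL and POST it to /save-image
const saveImage = async () => {
  const response = await fetch('/save-image', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ image: canvas.toDataURL({ format: 'png' }) })
  });
  if (!response.ok) console.log('save failed');
};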
TL;DR: Could you please explain when bundle-loader is needed for code splitting using Webpack?
When I started migrating a Backbone-based app from Require.js to Webpack, I remember that this kind of require statement in the router:
someMatchedRoute: function () {
  require(['path-to-file'], function(module) {
    // doing something with the loaded module
    module();
  });
}
would put the required code in the same bundle as the rest of the code, and in order to generate a separate file that would be required dynamically when switching to a particular route, I needed to use bundle-loader, like so:
// a function executed when the user’s profile route is matched
someMatchedRoute: function () {
  require('bundle!path-to-file')(function(module) {
    // doing something with the loaded module
    module();
  });
}
Now, when I am migrating my codebase to ES6 modules and using the require.ensure syntax as described in Webpack documentation:
someMatchedRoute: function () {
  require.ensure(['path-to-file'], function(require) {
    var loadedModule = require('path-to-file');
    // doing something with the loaded module
    loadedModule();
  });
}
I am unsure whether I need bundle-loader at all in order to generate multiple chunks and load them dynamically. And if I do, which require call does it go in: the require.ensure, the require in the callback, or both? It's all so confusing.
I'm working on an AngularJs/MVC app with Web API etc. which is using a CDN. I have managed to whitelist two URLs for Angular to use, a local CDN and a live CDN (web app hosted in Azure).
I can successfully ng-include a template from my local CDN domain, but the problem arises when I push the site to a UAT / live environment: I can't keep pointing at a template on localhost.
I need a way to be able to dynamically get the base url for the templates. The location on the server will always be the same, eg: rooturl/html/templates. I just need to be able to change the rooturl depending on the environment.
I was thinking there might be some way to store a global variable, possibly on the $rootScope somewhere, that I could get to when using the templates, and then set that to the URL via Web API, which would return a config setting.
For example on my dev machine the var could be http://Localhost:52920/ but on my uat server it could be https://uat-cdn.com/
Any help would be greatly appreciated, as storing JS, CSS, fonts etc. on the CDN but not the HTML feels nasty.
Thanks in advance!
I think it's good practice to keep environment and global config stuff outside of Angular altogether, so it's not part of the normal build process and is harder to accidentally blow away during a deploy. One way is to include a script file containing just a single global variable:
var config = {
  myBaseUrl: '/templates/',
  otherStuff: 'whatever'
};
...and expose it to Angular via a service:
angular.module('myApp')
  .factory('config', function () {
    var config = window.config ? window.config : {}; // (or throw an error if it's not found)
    // set defaults here if useful
    config.myBaseUrl = config.myBaseUrl || 'defaultBaseUrlValue';
    // etc
    return config;
  });
...so it's now injectable as a dependency anywhere you need it:
.controller('fooController', function (config, $scope) {
  $scope.myBaseUrl = config.myBaseUrl;
});
Functionally speaking, this is not terribly different from dumping a global variable into $rootScope but I feel like it's a cleaner separation of app from environment.
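For completeness, the config script just needs to load ahead of the app bundle so window.config exists before Angular bootstraps (file names here are assumptions):

<!-- index.html -->
<script src="config.js"></script> <!-- defines window.config; swapped per environment -->
<script src="app.js"></script>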
If you decide to create a factory then it would look like this:
angular.module('myModule', [])
  .factory('baseUrl', ['$location', function ($location) {
    return {
      getBaseUrl: function () {
        return $location.host(); // $location exposes the host via the host() getter
      }
    };
  }]);
A provider could be handy if you want to make any kind of customization during the config phase.
Maybe you want to build the base URL manually instead of just using the host.
If you want to use it on the templates then you need to create a filter that reuses it:
angular.module('myModule').filter('anchorBuilder', ['baseUrl', function (baseUrl) {
  return function (path) {
    return baseUrl.getBaseUrl() + path;
  };
}]);
And on the template:
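For example (the path is illustrative):

<a ng-href="{{ '/products' | anchorBuilder }}">Products</a>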
EDIT
The above example was for creating links, but if you want to use it with the ng-include directive, then you will have a function on your controller that uses the factory and returns the URL.
// Template
<div ng-include src="urlBuilder('path')"></div>

// Controller
$scope.urlBuilder = function (path) {
  return baseUrl.getBaseUrl() + path;
};
Make sure to inject the factory into the controller.
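That injection might look like this (the controller name is an assumption):

angular.module('myModule').controller('MainCtrl', ['$scope', 'baseUrl',
  function ($scope, baseUrl) {
    // expose the URL builder so templates can resolve CDN paths
    $scope.urlBuilder = function (path) {
      return baseUrl.getBaseUrl() + path;
    };
  }
]);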
I would like to deploy my web application to several environments. Using Continuous Integration I can run a task to generate a config.json for a particular environment. This file will contain, among others, the particular URLs to use for it.
{
  "baseUrl": "http://www.myapp.es/",
  "baseApiUrl": "http://api.myapp.es/",
  "baseAuthUrl": "http://api.myapp.es/auth/"
}
The issue comes up when I try to set up my different services through providers in the config phase. Of course, services are not available yet in that phase, so I cannot use $http to load that JSON file and set my providers correctly.
Basically I would like to do something like:
function config($authProvider) {
  $authProvider.baseUrl = config.baseAuthUrl;
}
Is there a way to load those values at runtime from a file? The only thing I can think of is having that mentioned task alter this file directly. However, I have several modules, so I would have to do that in all of them, which doesn't seem right.
You can create constants in the config of your main module:
Add $provide as a dependency in your config method.
Use the provider method to add each constant, like this:
$provide.provider('BASE_API_URL', {
  $get: function () {
    return 'https://myexample.net/api/';
  }
});
You can use BASE_API_URL as a dependency in your services.
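For instance (the service and endpoint names are illustrative):

angular.module('myModule').factory('userService', ['$http', 'BASE_API_URL',
  function ($http, BASE_API_URL) {
    return {
      // BASE_API_URL resolves to whatever the provider returned above
      getUsers: function () {
        return $http.get(BASE_API_URL + 'users');
      }
    };
  }
]);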
I hope this helps
Optionally, you can set the URL depending on your environment:
$provide.provider('BASE_API_URL', {
  $get: function () {
    if (window.location.hostname.toLowerCase() === 'myapp.myexample.net') {
      return 'https://myexample.net/api/'; // pre-production
    } else {
      return 'http://localhost:61132/'; // local
    }
  }
});
Regards!
Finally, the solution was generating an Angular constants file using templating (gulp-template) through a gulp task. In the end, I am using a YAML file instead of a JSON one (it is generated by my CI engine with the proper values for the environment I want to deploy to).
Basically:
config.yml
baseUrl: 'http://www.myapp.es/'
baseApiUrl: 'http://api.myapp.es/'
auth:
  url: 'auth/'
config.module.constants.template
(function () {
  'use strict';

  angular
    .module('app.config')
    .constant('env_variables', {
      baseUrl: '<%=baseUrl%>',
      baseApiUrl: '<%=baseApiUrl%>',
      authUrl: '<%=auth.url%>'
    });
}());
gulpfile.js
var gulp = require('gulp');
var path = require('path');
var fs = require('fs');
var yaml = require('js-yaml');
var $ = require('gulp-load-plugins')(); // provides $.template and $.rename
// `conf` is assumed to come from the project's existing gulp configuration

gulp.task('splicing', function () {
  var yml = path.join(conf.paths.src, '../config/config.yml');
  var json = yaml.safeLoad(fs.readFileSync(yml, 'utf8'));
  var template = path.join(conf.paths.src, '../config/config.module.constants.template');
  var targetDir = path.join(conf.paths.src, '/app/config');
  // returning the stream tells gulp when the task is done, so no callback is needed
  return gulp.src(template)
    .pipe($.template(json))
    .pipe($.rename('config.module.constants.js'))
    .pipe(gulp.dest(targetDir));
});
Then you just inject it in the config phase you need:
function config($authProvider, env_variables) {
  $authProvider.baseUrl = env_variables.baseApiUrl + env_variables.authUrl;
}
One more benefit of using gulp for this is that you can integrate the generation of these constants with your build, serve or watch tasks and literally forget about making any changes from now on. Hope it helps!
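For example, with gulp 3-style task dependencies (the serve task body is assumed), the constants file is regenerated before the dev server starts:

// 'splicing' runs first, so config.module.constants.js is fresh on every serve
gulp.task('serve', ['splicing'], function () {
  // start the dev server here
});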