I would like to deploy my web application to several environments. Using Continuous Integration I can run a task to generate a config.json for a particular environment. This file will contain, among other things, the URLs to use for that environment.
{
  "baseUrl": "http://www.myapp.es/",
  "baseApiUrl": "http://api.myapp.es/",
  "baseAuthUrl": "http://api.myapp.es/auth/"
}
The issue comes up when I try to set up my different services through providers in the config phase. Of course, services are not available yet in that phase, so I cannot use $http to load that JSON file and configure my providers correctly.
Basically I would like to do something like:
function config($authProvider) {
  $authProvider.baseUrl = config.baseAuthUrl;
}
Is there a way to load those values at runtime from a file? The only thing I can think of is having the aforementioned task alter this file directly. However, I have several modules, so it would have to do that in all of them, which doesn't seem right.
You can create constants in the config of your main module:
Add $provide as a dependency in your config method, then use its provider method to add each constant, like this:
$provide.provider('BASE_API_URL', {
  $get: function () {
    return 'https://myexample.net/api/';
  }
});
You can use BASE_API_URL as a dependency in your services.
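For example, a minimal sketch of injecting it into a service (the module, service and endpoint names here are made up for illustration):

angular.module('app')
  .factory('userService', ['$http', 'BASE_API_URL', function ($http, BASE_API_URL) {
    return {
      // hypothetical endpoint, just to show the injected constant in use
      getUsers: function () {
        return $http.get(BASE_API_URL + 'users');
      }
    };
  }]);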
I hope this helps
Optionally, you can set the URL depending on your environment:
$provide.provider('BASE_API_URL', {
  $get: function () {
    if (window.location.hostname.toLowerCase() === 'myapp.myexample.net') {
      return 'https://myexample.net/api/'; // pre-production
    } else {
      return 'http://localhost:61132/'; // local
    }
  }
});
Regards!
Finally, the solution was to generate an Angular constants file using templating (gulp-template) through a gulp task. In the end, I am using a YAML file instead of a JSON one (it is the one generated by my CI engine with the proper values for the environment I want to deploy to).
Basically:
config.yml
baseUrl: 'http://www.myapp.es/'
baseApiUrl: 'http://api.myapp.es/'
auth:
  url: 'auth/'
config.module.constants.template
(function () {
  'use strict';

  angular
    .module('app.config')
    .constant('env_variables', {
      baseUrl: '<%=baseUrl%>',
      baseApiUrl: '<%=baseApiUrl%>',
      authUrl: '<%=auth.url%>'
    });
}());
gulpfile.js
var fs = require('fs');
var path = require('path');
var yaml = require('js-yaml');
var $ = require('gulp-load-plugins')(); // exposes gulp-template and gulp-rename as $.template / $.rename

gulp.task('splicing', function() {
  var yml = path.join(conf.paths.src, '../config/config.yml');
  var json = yaml.safeLoad(fs.readFileSync(yml, 'utf8'));
  var template = path.join(conf.paths.src, '../config/config.module.constants.template');
  var targetFile = path.join(conf.paths.src, '/app/config');
  return gulp.src(template)
    .pipe($.template(json))
    .pipe($.rename('config.module.constants.js'))
    .pipe(gulp.dest(targetFile)); // returning the stream signals completion, so no done callback is needed
});
Then you just inject it in whatever config function needs it; constants, unlike values and services, are available during the config phase:
function config($authProvider, env_variables) {
  $authProvider.baseUrl = env_variables.baseApiUrl + env_variables.authUrl;
}
One more benefit of using gulp for this is that you can integrate the generation of these constants with your build, serve or watch tasks and, literally, forget about making any manual change from now on. Hope it helps!
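For instance, with the gulp 3 dependency syntax used above, the wiring could look roughly like this (the build and serve task bodies are placeholders):

// run 'splicing' first so the constants file exists before anything consumes it
gulp.task('build', ['splicing'], function () {
  // ...the rest of your build pipeline
});

gulp.task('serve', ['splicing'], function () {
  // ...start your dev server and watchers
});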
TL;DR: Could you please explain when bundle-loader is needed for code splitting using Webpack?
When I started migrating a Backbone-based app from Require.js to Webpack, I remember that this kind of require statement in the router:
someMatchedRoute: function () {
  require(['path-to-file'], function(module) {
    // doing something with the loaded module
    module();
  });
}
would put the required code in the same bundle as the rest of the code, and in order to generate a separate file that would be required dynamically when switching to a particular route, I needed to use bundle-loader, like so:
// a function executed when the user’s profile route is matched
someMatchedRoute: function () {
  require('bundle!path-to-file')(function(module) {
    // doing something with the loaded module
    module();
  });
}
Now, when I am migrating my codebase to ES6 modules and using the require.ensure syntax as described in the Webpack documentation:
someMatchedRoute: function () {
  require.ensure(['path-to-file'], function(require) {
    var loadedModule = require('path-to-file');
    // doing something with the loaded module
    loadedModule();
  });
}
I am unsure whether I need bundle-loader at all in order to generate multiple chunks and load them dynamically. And if I do, in which require call does it go: the require.ensure, or the require in the callback? Or maybe both? It's all so confusing.
I'm working on an AngularJs/MVC app with Web API etc. which is using a CDN. I have managed to whitelist two URLs for Angular to use, a local CDN and a live CDN (web app hosted in Azure).
I can successfully ng-include a template from my local CDN domain, but the problem arises when I push the site to a UAT / live environment: I can't be using a template on localhost.
I need a way to be able to dynamically get the base url for the templates. The location on the server will always be the same, eg: rooturl/html/templates. I just need to be able to change the rooturl depending on the environment.
I was thinking there might be some way to store a global variable, possibly on the $rootScope somewhere, that I can get to when using the templates, and then set that to the URL via Web API, which will return a config setting.
For example on my dev machine the var could be http://Localhost:52920/ but on my uat server it could be https://uat-cdn.com/
Any help would be greatly appreciated, as I don't want to end up storing JS, CSS, fonts etc. on the CDN but not the HTML; that feels nasty.
Thanks in advance!
I think it's good practice to keep environment and global config stuff outside of Angular altogether, so it's not part of the normal build process and is harder to accidentally blow away during a deploy. One way is to include a script file containing just a single global variable:
var config = {
  myBaseUrl: '/templates/',
  otherStuff: 'whatever'
};
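The file is then loaded with a plain script tag ahead of your Angular code, so window.config exists before the app bootstraps (the file names here are assumptions):

<!-- index.html: environment config first, app bundle second -->
<script src="config.js"></script>
<script src="app.js"></script>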
...and expose it to Angular via a service:
angular.module('myApp')
  .factory('config', function () {
    var config = window.config ? window.config : {}; // (or throw an error if it's not found)
    // set defaults here if useful
    config.myBaseUrl = config.myBaseUrl || 'defaultBaseUrlValue';
    // etc.
    return config;
  });
...so it's now injectable as a dependency anywhere you need it:
.controller('fooController', function (config, $scope) {
  $scope.myBaseUrl = config.myBaseUrl;
});
Functionally speaking, this is not terribly different from dumping a global variable into $rootScope but I feel like it's a cleaner separation of app from environment.
If you decide to create a factory then it would look like this:
angular.module('myModule', [])
  .factory('baseUrl', ['$location', function ($location) {
    return {
      getBaseUrl: function () {
        return $location.host();
      }
    };
  }]);
A provider could be handy if you want to make any kind of customization during the config phase.
Maybe you want to build the base URL manually instead of taking it from $location.
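A sketch of what that provider variant might look like (the setBase method is my naming, not an established API):

angular.module('myModule')
  .provider('baseUrl', function () {
    var base = '/'; // default, can be overridden during the config phase
    this.setBase = function (value) { base = value; };
    // $get returns the same shape as the factory above
    this.$get = function () {
      return {
        getBaseUrl: function () { return base; }
      };
    };
  });

// during config, before any service is instantiated:
angular.module('myModule').config(['baseUrlProvider', function (baseUrlProvider) {
  baseUrlProvider.setBase('https://uat-cdn.com/');
}]);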
If you want to use it on the templates then you need to create a filter that reuses it:
angular.module('myModule').filter('anchorBuilder', ['baseUrl', function (baseUrl) {
  return function (path) {
    return baseUrl.getBaseUrl() + path;
  };
}]);
And on the template:
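A minimal sketch of such a link, with a placeholder path:

<a ng-href="{{ 'some/path' | anchorBuilder }}">Link text</a>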
EDIT
The above example was for creating links, but if you want to use it in an ng-include directive then you can add a function to your controller that uses the factory and returns the URL.
// Template
<div ng-include src="urlBuilder('path')"></div>

// Controller
$scope.urlBuilder = function (path) {
  return baseUrl.getBaseUrl() + path;
};
Make sure to inject the factory into the controller.
I'm learning the gulp way of doing things after using grunt exclusively in the past. I'm struggling to understand how to pass multiple inputs and get multiple outputs with gulp.
Let's say I have a large project with specialized JS on a per-page basis:
The Grunt Way:
grunt.initConfig({
  uglify: {
    my_target: {
      files: {
        'dest/everypage.min.js': ['src/jquery.js', 'src/navigation.js'],
        'dest/special-page.min.js': ['src/vendor/handlebars.js', 'src/something-else.js']
      }
    }
  }
});
This may be a poor example, as it violates the "do only one thing" principle, since the uglify task is concatenating and uglifying. In any event, I'm interested in learning how to accomplish the same thing using gulp.
Thanks to @AnilNatha I'm starting to think with more of a Gulp mindset.
For my case I have a load of files that need to be concatenated. I offloaded these to a config object that my concat task iterates over:
// Could be moved to another file and `required` in.
var files = {
  'polyfills.js': ['js/vendor/picturefill.js', 'js/vendor/augment.js'],
  'map.js': [
    'js/vendor/leaflet.js',
    'js/vendor/leaflet.markercluster.min.js',
    'js/vendor/jquery.easyModal.js',
    'js/vendor/jquery-autocomplete.min.js',
    'js/vendor/underscore.1.8.3.js',
    'js/map.js'
  ],
  ...
};
var _ = require('underscore');
var output = './build/js';

// Using underscore.js, pass each key/value pair to the custom concat function
gulp.task('concat', function (done) {
  _.each(files, concat); // underscore calls concat(value, key), i.e. concat(files, dest)
  // bs.reload(); if you're using browsersync
  done(); // tell gulp this asynchronous process is complete
});
// Custom concat function
function concat(files, dest) {
  return gulp.src(files)
    .pipe($.concat(dest))
    .pipe(gulp.dest(output));
}
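One caveat: done() in the task above fires before those streams have actually finished. If you need gulp to wait for them, one option is the merge-stream package (my suggestion, not part of the original setup):

var merge = require('merge-stream');

gulp.task('concat', function () {
  // merge every per-bundle stream so gulp resolves the task
  // only once all of them have finished
  return merge(_.map(files, function (srcFiles, dest) {
    return concat(srcFiles, dest);
  }));
});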
This must be obvious, but I can't find it. I want to preprocess my Stylus/Coffee files with a watcher in the dev environment and with a build task in production (isn't that common to all of us?), and also run a few extra minification and uglification steps in production, but I want to share the pipe steps common to both dev and production to stay DRY.
The problem is that when I run the watching task, the preprocessing task compiles all the files, since it has its own gulp.src statement which includes all Stylus files.
How do I avoid compiling all files on watch while still keeping the compile task separate? Thanks
var paths = {
  jade: ['www/**/*.jade']
};

gulp.task('jade', function() {
  return gulp.src(paths.jade)
    .pipe(jade({ pretty: true }))
    .pipe(gulp.dest('www/'))
    .pipe(browserSync.stream());
});

gulp.task('serve', ['jade', 'coffee'], function() {
  browserSync.init({
    server: './www'
  });
  watch(paths.jade, function() {
    return gulp.start(['jade']);
  });
  return gulp.watch('www/**/*.coffee', ['coffee']);
});
One important thing in Gulp is not to duplicate pipelines. If you want to process your Stylus files, it has to be the one and only Stylus pipe. If you want to execute different steps in your pipe, however, you have multiple choices. One that I would suggest is a noop() function in conjunction with a selection function:
var through = require('through2'); // Gulp's stream engine

/** creates an empty pipeline step **/
function noop() {
  return through.obj();
}

/**
 * The isProd variable denotes if we are in production mode.
 * If so, we execute the task. If not, we pass it through an empty step.
 **/
function prod(task) {
  if (isProd) {
    return task;
  } else {
    return noop();
  }
}
gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(stylus())
    .pipe(prod(minifyCss())) // we just minify in production mode
    .pipe(gulp.dest(path.whatever));
});
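Note that written this way, minifyCss() is still constructed in dev mode even though its output is discarded. If that construction is ever expensive, a lazy variant of prod (my variation, not from the original) could take a factory instead:

function prodLazy(taskFactory) {
  // only build the stream when we actually are in production
  return isProd ? taskFactory() : noop();
}

// usage: .pipe(prodLazy(minifyCss)), passing the function itself, not its result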
As for incremental builds (building just the changed files with every iteration), the best way is to use the gulp-cached plugin:
var cached = require('gulp-cached');

gulp.task('stylus', function() {
  return gulp.src(path.styles)
    .pipe(cached('styles')) // we just pass through the files that have changed
    .pipe(stylus())
    .pipe(prod(minifyCss()))
    .pipe(gulp.dest(path.whatever));
});
This plugin checks with each run whether the contents have changed.
I spend a whole chapter on Gulp for different environments in my book, and I found those to be the most suitable approaches. For more information on incremental builds, you can also check my article on that (it includes Gulp 4): http://fettblog.eu/gulp-4-incremental-builds/
I am trying to pass a parameter to a task that is being invoked by gulp-watch. I need it because I am trying to build a modular framework.
So if a file changes in module 1, the other modules don't need to be rebuilt.
And I want just one function to create the concatenated & uglified files per module.
This is what I've got so far:
// here I need the 'module' parameter
gulp.task('script', function(module) { ... });

gulp.task('watch', function() {
  gulp.watch('files/in/module1/*.js', ['script']); // here I want to pass module1
  gulp.watch('files/in/module2/*.js', ['script']); // here I want to pass module2
});
A lot of the documentation and examples seem to be outdated (gulp.run(), gulp.start()).
I hope someone can help me out here.
I had the very same issue, searched for a while, and the "cleanest" way I came up with uses the .on() event handler of gulp.watch() and the .env property of gulp-util:
var gulp = require('gulp');
var $ = {};
$.util = require('gulp-util');

var modules = {
  module1: {}, // awesome module1
  module2: {}  // awesome module2
};
gulp.task('script', function () {
  var moduleName = $.util.env.module;
  // Exit if the value is missing...
  var module = modules[moduleName];
  if (!module) {
    $.util.log($.util.colors.red('Error'), "Wrong module value!");
    return;
  }
  $.util.log("Executing task on module '" + moduleName + "'");
  // Do your task on "module" here.
});
gulp.task('watch', function () {
  gulp.watch(['files/in/module1/*.js'], ['script']).on('change', function () {
    $.util.env.module = 'module1';
  });
  gulp.watch(['files/in/module2/*.js'], ['script']).on('change', function () {
    $.util.env.module = 'module2';
  });
});
gulp-util also comes in handy if you need to pass (global) parameters from the shell:
[emiliano#dev ~]# gulp script --module=module1 --minify
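gulp-util parses those flags (via minimist) onto its env object, so inside a task they are read roughly like this:

var moduleName = $.util.env.module; // 'module1'
var shouldMinify = !!$.util.env.minify; // true when --minify is passed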
Hope this helps someone else out there!
Regards.
Here I will answer the question directly: how to pass a parameter to a task invoked by gulp-watch.
My way of doing it, and one of the possibilities I see, is to use a global variable to pass the value between the two blocks: you set it just before launching the task in the watcher, and at the start of the task you copy it into a local variable.
See this answer for more details: https://stackoverflow.com/a/49733123/7668448
For what you want to achieve, you can also use just one watcher over the directory that holds all modules, if that is the structure. When a change happens, you can recover the changed file's path and deduce which module it belongs to by getting the module folder. That way you will not need to add a new watcher for each new module, which is nice when there are multiple contributors to the project, for example when working on open source: you set it up once and don't have to care about adding anything, just like with the delegation principle in DOM event handling when there are multiple elements. Even if the chosen structure doesn't keep all the modules in one directory, you can still pass multiple globs to the one watcher:
gulp.watch(['glob1/**/*.js', 'glob2/**/*.js',...], function(evt) {/*.....*/});
Following the structure you have, you can then work out which module it is.
For the watcher, here is how I suggest you do it:
var rebuildModulWatchEvt = null; // global used to hand the event over to the task

watch('./your/allModulesFolder/**/*.js', function (evt) {
  rebuildModulWatchEvt = evt; // here you update the global var
  gulp.start('rebuildModul'); // you start the task
});
The evt object here holds multiple pieces of info: cwd, base, stat, _contents, etc. What interests us is path: evt.path gives you the path of the changed file.
In your task, either you do this:
gulp.task('rebuildModul', function() {
  let evt = rebuildModulWatchEvt; // at the very start, copy it to a local var
  let filePath = evt.path; // this is how you get the changed file's path
  // the rest of your code goes here: following your structure, get the path of the module folder
});
or you use a function:
gulp.task('rebuildModul', function() {
  rebuildModulTaskRun(rebuildModulWatchEvt);
});

function rebuildModulTaskRun(evt) {
  let filePath = evt.path;
  // the rest of your code goes here: following your structure, get the path of the module folder
}
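To get the module folder out of the changed file's path, a sketch (assuming modules live directly under allModulesFolder, as in the watcher glob above, and that gulp runs from the project root):

var path = require('path');

function moduleFromPath(filePath) {
  // e.g. '/project/your/allModulesFolder/module1/sub/file.js' -> 'module1'
  var rel = path.relative('./your/allModulesFolder', filePath);
  return rel.split(path.sep)[0];
}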