Is there a way to lazily load static JSON during Angular 6 routing?

I've got an Angular 6 app that is being built as more of a framework for internal apps. The header / footer / major navigation will be shared, but each app (feature module) should have internal flows separate from the others. For example, one feature might be purchasing, another might be onboarding, another might be a sales dashboard. They're very different from each other.
What I'm trying to do is come up with a declarative way to define known configurations for each feature: for example, the minor navigation (page to page within a feature), the top-level header title, and various other context-related data points. My initial thought was to have them defined as JSON within each feature, which works great except that I now have to import every feature's config regardless of whether a user ever navigates to, or even has access to, that feature.
I've already got a context service set up that checks the URL on navigation and sets some items, but again, it has to import all possible configs with lines like this at the top of the service:
import { fooConfig } from '@foo/foo.config';
import { barConfig } from '@bar/bar.config';
import { bazConfig } from '@baz/baz.config';
So the question is: Is there a way for me to check the URL on navigation and, within that subscription, pick up the config from the correct file without prematurely importing them all? Or maybe I can use the module to express / declare those options?

Using TypeScript's Dynamic Import Expressions might be a viable option in your case:
let config;
switch (val) {
  case 'foo': config = await import('@foo/foo.config'); break;
  case 'bar': config = await import('@bar/bar.config'); break;
  case 'baz': config = await import('@baz/baz.config'); break;
}
Though, as far as I know, there's no way at the time of writing to use a variable for the entire import string (e.g. await import(path)).

updateConfig(feature: string) {
  import(`../configs/${feature}.config.json`).then(cfg => {
    this._currentConfig$.next(cfg);
  });
}
The above snippet shows what I came up with. It turns out webpack can't digest the @ paths you normally use on imports, and you also can't use fully variable paths: all possible targets must share some static, common part of the path. I ended up moving my configs from @foo/foo.config.ts to @core/configs/foo.config.json. That makes them less modular, because core now holds each feature's config, but it makes the module lazy.
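For completeness, here is a rough sketch of how such a context service can react to navigation and lazily pick up the matching config. It assumes the configs live under a shared folder as described above; the ContextService name, the URL-to-feature mapping, and the BehaviorSubject are illustrative, not part of the original setup:
// context.service.ts -- a minimal sketch, not the exact service from the question
import { Injectable } from '@angular/core';
import { Router, NavigationEnd } from '@angular/router';
import { BehaviorSubject } from 'rxjs';
import { filter } from 'rxjs/operators';

@Injectable({ providedIn: 'root' })
export class ContextService {
  private _currentConfig$ = new BehaviorSubject<any>(null);
  readonly currentConfig$ = this._currentConfig$.asObservable();

  constructor(router: Router) {
    router.events
      .pipe(filter((e): e is NavigationEnd => e instanceof NavigationEnd))
      .subscribe(e => {
        // Assume the first URL segment identifies the feature, e.g. /foo/...
        const feature = e.urlAfterRedirects.split('/')[1];
        if (feature) {
          this.updateConfig(feature);
        }
      });
  }

  updateConfig(feature: string) {
    // webpack needs a static prefix/suffix; only the middle segment may vary
    import(`../configs/${feature}.config.json`).then(cfg => {
      this._currentConfig$.next(cfg);
    });
  }
}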

Related

Can you preview ASP.NET Core's appsettings.json environment overrides?

In ASP.NET Core, the JsonConfigurationProvider will load configuration from appsettings.json, and then will read in the environment version, appsettings.{Environment}.json, based on what IHostingEnvironment.EnvironmentName is. The environment version can override the values of the base appsettings.json.
Is there any reasonable way to preview what the resulting overridden configuration looks like?
Obviously, you could write unit tests that explicitly check that elements are overridden to your expectations, but that would be a very laborious workaround, with upkeep needed every time you change a setting. It's not a good solution if you just wanted to validate that you didn't misplace a bracket or misspell an element name.
Back in ASP.NET's web.config transforms, you could simply right-click on a transform in Visual Studio and choose "Preview Transform". There are also many other ways to preview an XSLT transform outside of Visual Studio. Even for web.config parameterization with Parameters.xml, you could at least execute Web Deploy and review the resulting web.config to make sure it came out right.
There does not seem to be any built-in way to preview appsettings.{Environment}.json's effects on the base file in Visual Studio. I haven't been able to find anything outside of VS to help with this either. JSON overriding doesn't appear to be all that commonplace, even though it is now an integral part of ASP.NET Core.
I've figured out you can achieve a preview with Json.NET's Merge function after loading the appsettings files into JObjects.
Here's a simple console app demonstrating this. Provide it the path to where your appsettings files are and it will emit previews of how they'll look in each environment.
static void Main(string[] args)
{
    string targetPath = @"C:\path\to\my\app";

    // Parse appsettings.json
    var baseConfig = ParseAppSettings($@"{targetPath}\appsettings.json");

    // Find all appsettings.{env}.json's
    var regex = new Regex(@"appsettings\..+\.json");
    var environmentConfigs = Directory.GetFiles(targetPath, "*.json")
        .Where(path => regex.IsMatch(path));

    foreach (var env in environmentConfigs)
    {
        // Parse appsettings.{env}.json
        var transform = ParseAppSettings(env);

        // Clone baseConfig since Merge is a void operation
        var result = (JObject)baseConfig.DeepClone();

        // Merge the two, making sure to overwrite arrays
        result.Merge(transform, new JsonMergeSettings
        {
            MergeArrayHandling = MergeArrayHandling.Replace
        });

        // Write the preview to file
        string dest = $@"{targetPath}\preview-{Path.GetFileName(env)}";
        File.WriteAllText(dest, result.ToString());
    }
}

private static JObject ParseAppSettings(string path)
    => JObject.Load(new JsonTextReader(new StreamReader(path)));
While this is no guarantee that some other config source won't override these values once deployed, it will at least let you validate that the interactions between these two files are handled correctly.
There's not really a way to do that, but I think a little background on how this actually works will help you understand why.
With config transforms, there was literal file modification, so it's easy enough to "preview" that, showing the resulting file. The config system in ASP.NET Core is completely different.
It's basically just a dictionary. During startup, each registered configuration provider is run in the order it was registered. The provider reads its configuration source, whether that be a JSON file, system environment variables, command line arguments, etc., and builds key-value pairs, which are then added to the main configuration "dictionary". An "override", such as appsettings.{environment}.json, is really just another JSON provider registered after the appsettings.json provider, which obviously uses a different source (JSON file). Since it's registered after, when an existing key is encountered, its value is overwritten, as is typical for anything being added to a dictionary.
In other words, the "preview" would be the completed configuration object (dictionary), which is composed of a number of different sources, not just these JSON files. Things like environment variables or command line arguments will override even the environment-specific JSON (since they're registered after it), so you still wouldn't technically know whether the environment-specific JSON applied or not, because the value could be coming from another source that overrode it.
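If it helps to visualize that "later registration wins" behavior, here is a toy model in TypeScript (purely illustrative of the dictionary idea described above, not the ASP.NET Core API):
// Each "provider" contributes key/value pairs; providers registered later
// overwrite earlier values for the same key.
type ConfigSource = Record<string, string>;

function buildConfiguration(sources: ConfigSource[]): Map<string, string> {
  const config = new Map<string, string>();
  for (const source of sources) {
    for (const [key, value] of Object.entries(source)) {
      config.set(key, value); // later sources win
    }
  }
  return config;
}

// appsettings.json, then appsettings.{Environment}.json, then (say) an env var
const config = buildConfiguration([
  { 'Logging:LogLevel:Default': 'Information', 'ConnectionStrings:Db': 'base' },
  { 'ConnectionStrings:Db': 'staging-override' },
  { 'ConnectionStrings:Db': 'value-from-env-var' },
]);

console.log(config.get('ConnectionStrings:Db')); // "value-from-env-var"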
You can use the GetDebugView extension method on the IConfigurationRoot with something like:
app.UseEndpoints(endpoints =>
{
    if (env.IsDevelopment())
    {
        endpoints.MapGet("/config", ctx =>
        {
            var config = (Configuration as IConfigurationRoot).GetDebugView();
            return ctx.Response.WriteAsync(config);
        });
    }
});
However, doing this can introduce security risks, as it will expose all of your configuration, including things like connection strings, so you should enable this only in development.
You can refer to this article by Andrew Lock to understand how it works: https://andrewlock.net/debugging-configuration-values-in-aspnetcore/

Lazy load web-components from other services in Polymer 3

I have a microservice-based application, and each service has a set of Polymer-based web components. I want to load these at runtime into the application served by one of the services, so that I can run, maintain and deploy the services separately.
I would like to avoid having an npm repo that serves the components for a central build, where each new web component version would make it necessary to rebuild and redeploy the application.
Existing lazy loading examples all lazy load components of the same application, so it's built as a whole and just packaged in chunks.
The application is available under /app/ and the modules are under /mod/...
I can do this to react to a route:
import('/mod/admindashboard/kw-admindashboard.js').then((module) => {
// Put code in here that you want to run every time when
// navigating to view1 after my-view1.js is loaded.
console.log("Loaded admindashboard");
});
and then I can use the corresponding web component, but for this to work I need to hack the component like this:
import { PolymerElement, html } from '/app/node_modules/@polymer/polymer/polymer-element.js';
import '/app/node_modules/@polymer/iron-icon/iron-icon.js';
import '/app/node_modules/@polymer/iron-icons/iron-icons.js';
class KwAdmindashboard extends PolymerElement {
static get template() {
...
But this approach completely prevents me from running tests or creating static builds, and IDE support is unavailable in many areas, since the tooling can't see what will only be available at runtime. So, as an absolute fallback, this would be possible. Isn't there a way to utilize service workers to handle the mapping?
Below is, I think, a good example of what you need. Modules are loaded based on the page property. The page property is bound to iron-pages via selected="{{page}}"; when the page value changes to match one of the iron-pages name values, its observer lazily loads that page's module:
static get properties() { return {
  page: {
    type: String,
    reflectToAttribute: true,
    observer: '_pageChanged'
  },
  .......

_pageChanged(page, oldPage) {
  if (page != null) {
    let cb = this._pageLoaded.bind(this, Boolean(oldPage));
    // HERE BELOW YOUR PAGE NAMES
    switch (page) {
      case 'list':
        import('./shop-list.js').then(cb);
        break;
      case 'detail':
        import('./shop-detail.js').then(cb);
        break;
      case 'cart':
        import('./shop-cart.js').then(cb);
        break;
      case 'checkout':
        import('./shop-checkout.js').then(cb);
        break;
      default:
        this._pageLoaded(Boolean(oldPage));
    }
  }
}
In the snippet above, cb is the callback that runs once the lazily imported module resolves; it calls _pageLoaded, which then loads the remaining lazy resources needed right after the first render (the minimum required files).
_pageLoaded(shouldResetLayout) {
  this._ensureLazyLoaded();
}
Here is the link to the full code for the above. Hope this helps; in case of any questions I will try to reply. https://github.com/Polymer/shop/blob/master/src/shop-app.js
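For context, the _ensureLazyLoaded call above is where the shop demo pulls in its remaining, non-critical resources once after the first render. A simplified sketch of that step (not the exact code from the linked file; the file name is illustrative):
_ensureLazyLoaded() {
  // Only load the deferred resources once
  if (!this.loadComplete) {
    import('./lazy-resources.js').then(() => {
      this.loadComplete = true;
    });
  }
}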
It seems like Polymer 3 is not yet ready for distributed locations of web components.
There are GitHub issues at the W3C which may address these problems:
https://github.com/w3c/webcomponents/issues/716
Web component registries would allow truly distributed component development without clashes, thanks to namespaced component registration.
https://github.com/domenic/import-maps
This introduces a mapping from npm-style module names to locations, which makes runtime binding much easier.
For now I will switch my development model: the microservices publish one or more web components to my npm repo in Nexus, the admin app has build-time dependencies on them, and the whole app is built in one go. There I can use the lazy-loading approach that the shop demo promotes/shows.
For a decent development experience with web components from multiple sources, you should have a look at "npm link".
Feel free to add another solution for the problem, or a real solution as soon as the technology and standards have caught up.

JavaScript import statements running code

I'm trying to diagnose a problem that's recently arisen for us after upgrading our supported browser (~40 -> ~60).
We have, in effect, this code in an external (now unsupported) library that sits in an IIFE:
(function(window, undefined) {
  // external library
  if (!window.terribleIdea) {
    window.terribleIdea = {};
  }

  <some code>

  if (!window.terribleIdea.config.url) {
    window.terribleIdea.config.url = 'the wrong url';
  }

  localStorage.setItem('somethingImportant', getStuff(window.terribleIdea.config.url));
})(window);
Now we did have a bootstrap-type file that looked like this:
// appBootstrapper.js
import applyConfig from './app/configApplier';
import ALL_ANGULAR_MODULES from './app'; // contains angular.module set up for
                                         // the app and every dependency

fetchConfig()
  .then(applyConfig)
  .then(() => angular.bootstrap(document, [ALL_ANGULAR_MODULES]))
  .catch(error => {
    alert(`It borked: ${error.message}`);
  });
Amongst other things applyConfig does:
window.terribleIdea = {
  config: {
    url: 'not terrible'
  }
}
What now happens is that the import statement for ALL_ANGULAR_MODULES ends up running the code in the external library, and setting local storage. What we think used to happen is that it was only run when angular.bootstrap ran.
Now what I need to find out is: has the functionality of the import statement changed in a later version of Chrome, or has it always run this code and it simply went unnoticed?
All I can find are references to dynamic imports and the order in which scripts run in <script></script> tags.
It is hard to debug without access to the project (see the discussion in the comments above). Here are some possibilities worth exploring when encountering such issues. It is of course possible that it was like this all along.
Change in the bundler configuration: webpack accepts entries as arrays, and order matters in them.
Change in the way the bundler/dependency manager reacts to dynamic imports.
Change in the way your application loads its dependencies during its bootstrap phase. This is not (imo) Angular-specific, as many apps use some kind of "componentization", which translates into files being executed in a different order than they are imported (since they may only export constructors or whatnot).
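For what it's worth, a static import declaration has always evaluated the imported module, including any side effects in its body, before the importing module's own code runs, whether or not the imported binding is actually used. A minimal sketch of that behavior (file names illustrative):
// side-effecting-library.js -- runs its body at import time, like the IIFE library above
console.log('library evaluated');
window.terribleIdea = window.terribleIdea || { config: {} };

// appBootstrapper.js
import './side-effecting-library.js'; // evaluated before the line below, even though nothing is used from it
console.log('bootstrapper body runs');
// Output order: "library evaluated", then "bootstrapper body runs"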

Vue/Vuex - Load state from JSON file

I am trying to build a simple app using Vue / Vuex, starting from the vue-cli webpack template.
The app works fine, but I would like to add the possibility to load and save the state in a JSON file.
Is there a best practice for doing that?
My first idea was to read the data in store.js:
import Vue from 'vue'
import Vuex from 'vuex'
import fs from 'fs'

// Read file
let loaded = null
fs.readFile('./data.json', 'utf8', (err, data) => {
  if (err) throw err;
  loaded = data;
})

Vue.use(Vuex)

const state = {
  notes: loaded,
  activeNote: {}
}
...
...
But I am getting an error when I try to import the fs module.
There are great plugins available for exactly what you're trying to do.
Basically, having Vuex and defining a state is the way to go; however, you should do it a little differently.
Take a look at this plugin:
https://github.com/robinvdvleuten/vuex-persistedstate
Since you're using webpack it is pretty easy to install; use yarn add vuex-persistedstate, for example.
Then you import the plugin using import createPersistedState from 'vuex-persistedstate'.
Now you change up your store a little bit, doing something like this:
export const store = new Vuex.Store({
  state: {
    yourVar: 0
  },
  mutations: {
    changeValue (state, number) {
      state.yourVar += number
    }
  },
  plugins: [createPersistedState()]
})
That's basically it. All you need to do is add the plugin line to your Vuex store, and all variable data inside your state will be saved in the browser's localStorage by default. Of course, you can read through the GitHub repository to see how you can use sessions, cookies, etc., but that should work just fine.
The important part here is your mutations, since everything you want to do with your store variables has to be declared as a mutation function.
If you try to modify your stored variables using ordinary functions, you'll get warnings. The reason behind this is to ensure that no unexpected mutation of your data takes place, so you have to explicitly define what your program should accept in order to change your variables.
Also, using export const store before your new Vuex.Store allows you to import that store into any of your Vue components and call the mutation functions from there as well, using store.commit('changeValue', number).
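For example, a component could then read the value and commit that mutation like this (a minimal sketch; the file path, component and method names are made up):
// Counter.vue, <script> section -- illustrative only
import { store } from '../store';

export default {
  computed: {
    yourVar() {
      return store.state.yourVar;
    }
  },
  methods: {
    addFive() {
      // Goes through the mutation, so the change is tracked and persisted by the plugin
      store.commit('changeValue', 5);
    }
  }
};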
I hope this answer helps you out a little bit, I was struggling with the exact same problem about 2 days ago and this is working like a charm ;)

Using ES6 `import` with CSS/HTML files in Meteor project: bug or feature?

I am currently learning Meteor and I found out something that intrigued me.
I can load HTML and CSS assets from a JS file using the import statement.
import '../imports/hello/myapp.html';
import '../imports/hello/myapp.css';
import * as myApp from '../imports/hello/myapp.js';
This was a surprise to me, so I ran to Google but could not find this behavior documented in the specification for ES6 import or in Meteor's docs.
So my questions are:
Can I rely on this behavior to build my apps?
Will my app break when Meteor gets around to fixing it (if it's a bug)?
Notes
I am using Meteor v1.3; I'm not sure if this also works with previous versions.
You can download the app from GitHub to see this behavior.
After going through the implementation of the built files for my app, I found out why this works.
HTML
Files are read from the file system and their contents added to the global Template object, e.g.,
== myapp.html ==
<body>
<h1>Welcome to Meteor!</h1>
{{> hello}}
</body>
results in the following JS code:
Template.body.addContent((function () {
  var view = this;
  return [
    HTML.Raw("<h1>Welcome to Meteor!</h1>\n\n "),
    Spacebars.include(view.lookupTemplate("hello"))
  ];
}));
Which is wrapped in a function with the name of the file as its key:
"myapp.html": function (require, exports, module) {
Template.body.addContent((function () {
var view = this;
return [
HTML.Raw("<h1>Welcome to Meteor!</h1>\n\n "),
Spacebars.include(view.lookupTemplate("hello"))];
}));
Meteor.startup(Template.body.renderToDocument);
Template.__checkName("hello");
Template["hello"] = new Template("Template.hello", (
function () {
var view = this;
return [
HTML.Raw("<button>Click Me</button>\n "),
HTML.P("You've pressed the button ",
Blaze.View("lookup:counter",
function () {
return Spacebars.mustache(view.lookup("counter"));
}), " times.")
];
}));
},
So all of our HTML is now pure JS code which will be included by using require like any other module.
CSS
The files are also read from the file system, and their contents are likewise embedded in JS functions, e.g.
== myapp.css ==
/* CSS declarations go here */
body {
background-color: lightblue;
}
Gets transformed into:
"myapp.css": ["meteor/modules", function (require, exports, module) {
module.exports = require("meteor/modules").addStyles("/* CSS declarations go here */\n\nbody {\n background-color: lightblue;\n}\n");
}]
So all of our CSS is also now a JS module that's again imported later on by using require.
Conclusion
All files are in one way or another converted to JS modules that follow similar rules for inclusion as AMD/CommonJS modules. They will be included/bundled if another module refers to them. And since all of them are transformed to JS code, there's no magic behind the deceitful syntax:
import '../imports/hello/myapp.html';
import '../imports/hello/myapp.css';
They both are transpiled to their equivalent forms with require once the assets have been transformed to JS modules.
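In practice that means the two import lines above end up behaving roughly like plain requires of the generated modules (a conceptual sketch, not Meteor's literal output):
require('../imports/hello/myapp.html'); // registers the templates as a side effect
require('../imports/hello/myapp.css');  // injects the styles as a side effect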
While the approach of placing static assets in the imports directory is not mentioned in the official documentation, this way of importing static assets does work.
This seems to be at the core of how Meteor works, so I'd bet this functionality is going to be there for a long while.
I don't know whether to call this a feature; maybe a more appropriate description is "unexpected consequence", but that would only be true from the user's perspective. I assume the people who wrote the code understood this would happen and perhaps even designed it this way on purpose.
One of the features in Meteor 1.3 is lazy loading: you place your files in the /imports folder and they will not be evaluated eagerly.
Quote from Meteor Guide:
To fully use the module system and ensure that our code only runs when
we ask it to, we recommend that all of your application code should be
placed inside the imports/ directory. This means that the Meteor build
system will only bundle and include that file if it is referenced from
another file using an import.
So you can lazy load your css files by importing them from the /imports folder. I would say it's a feature.
ES6 export and import functionality is available in Meteor 1.3. You should not be importing HTML and CSS files if you are using Blaze, the current default templating engine. The import/export functionality is there, but you may be using the wrong approach for building your views.