Using the async plugin with requirejs to load googlemaps

I am trying to use the async plugin from https://github.com/millermedeiros/requirejs-plugins to load the Google Maps API. My index.html file contains the following requirejs configuration:
<script data-main="scripts/myscript" src="scripts/require.js"></script>
<script>
    requirejs.config({
        "baseUrl": "scripts",
        "paths": {
            "async": "require_plugins/async",
            "gmaps": "gmaps",
            "infobox": "http://google-maps-utility-library-v3.googlecode.com/svn/trunk/infobox/src/infobox",
            "jquery": "//code.jquery.com/jquery-2.1.1.min",
            "jquery_mob": "//code.jquery.com/mobile/1.4.3/jquery.mobile-1.4.3.min"
        },
        waitSeconds: 15
    });
</script>
All my JavaScript files are stored in a scripts folder (relative to index.html), e.g. scripts/myscript.js and scripts/require.js, and the async plugin is stored in a subfolder of scripts called require_plugins, e.g. scripts/require_plugins/async.js.
The JavaScript file where I define the Google Maps module is called gmaps.js (stored in the scripts folder) and looks as follows:
define("GMAP",['async!https://maps.googleapis.com/maps/api/js? &key=xxxxxx&region=uk&libraries=places,geometry'], function () {
return window.google.maps;
});
I have obfuscated the key parameter intentionally here. According to the documentation, I should be able to use the gmaps module anywhere in other javascript files just by invoking it like so:
require(["gmaps"],function(GMAP)
{
map= new GMAP.Map("#map-div");
//and then some code to display the map
}
Unfortunately, it does not work at all; it seems the Google Maps library never loads. I use absolute URLs for jQuery and that works fine, but Google Maps fails miserably. My question is: is there something wrong with my requirejs config? I can't think of anything else causing this fault :(

My understanding is that the name you set in define() is what you need to use when writing the dependencies.
e.g.:
define('GMAP', ['async!<path>'], function() { return window.google.maps; });
require(['GMAP'], function(GMaps) { ... });
This is how I get GMaps to load for me. I now have a secondary problem where other libraries that depend on Maps no longer load.
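As an aside, the opposite fix should also work: leave the define() in gmaps.js anonymous, and RequireJS will give the module the id "gmaps" from the paths config, so the original require(["gmaps"], ...) call works unchanged. A rough sketch, reusing the async plugin URL and key placeholder from the question:
// gmaps.js -- anonymous define; the module id comes from the "gmaps" paths entry
define(['async!https://maps.googleapis.com/maps/api/js?key=xxxxxx&region=uk&libraries=places,geometry'], function () {
    return window.google.maps;
});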

Related

Unable to load tensorflowJS model in chrome extension

I am trying to load a trained Keras model into the web browser using TensorFlow.js.
I was able to convert the Keras model to a TensorFlow.js model, but I am unable to load the model in a Chrome extension.
My background.js code to load the model:
async function app() {
    alert('Loading model..');
    model = await loadModel("model.json");
    alert('Successfully loaded model');
}
chrome.runtime.onInstalled.addListener(function (details) {
    alert("extension loaded");
    chrome.tabs.executeScript(null,
        { file: "https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.0.0/dist/tf.min.js" });
    app();
});
The url "https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.0.0/dist/tf.min.js" is added to the permissions key in the manifest file.
When I try to load the extension it fails with the message loadModel is not defined.
Any suggestions on fixing this issue?
loadModel is not defined.
1 - Make sure that the script is loaded in the background process of the tabs.
2 - You need to use tf.loadModel() instead of loadModel().
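A minimal sketch of the corrected call, assuming tf.min.js has already been loaded so the global tf object is available (note that from tfjs 1.x onward the method is named tf.loadLayersModel; tf.loadModel is the pre-1.0 name):
async function app() {
    alert('Loading model..');
    // assumes the global `tf` exists; on tfjs 1.x replace tf.loadModel with tf.loadLayersModel
    const model = await tf.loadModel("model.json");
    alert('Successfully loaded model');
}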

Webpack: ES6 modules, code splitting, and bundle-loader

TL;DR: Could you please explain when bundle-loader is needed for code splitting using Webpack?
When I started migrating a Backbone-based app from Require.js to Webpack, I remember that this kind of require statement in the router:
someMatchedRoute: function () {
    require(['path-to-file'], function (module) {
        // doing something with the loaded module
        module();
    });
}
would put the required code in the same bundle as the rest of the code, and in order to generate a separate file that would be required dynamically when switching to a particular route, I needed to use bundle-loader, like so:
// a function executed when the user’s profile route is matched
someMatchedRoute: function () {
    require('bundle!path-to-file')(function (module) {
        // doing something with the loaded module
        module();
    });
}
Now, when I am migrating my codebase to ES6 modules and using the require.ensure syntax as described in Webpack documentation:
someMatchedRoute: function () {
    require.ensure(['path-to-file'], function (require) {
        var loadedModule = require('path-to-file');
        // doing something with the loaded module
        loadedModule();
    });
}
I am unsure whether I need bundle-loader at all in order to generate multiple chunks and load them dynamically. And if I do, in which require call does it go: the require.ensure, or the require in the callback? Or maybe both? It's all so confusing.

Programmatically loading a ES6 module with Traceur in web page

I have been using Traceur to develop some projects in ES6. In my HTML page, I include local Traceur sources:
<script src="traceur.js"></script>
<script src="bootstrap.js"></script>
and if I have a module in the HTML afterwards like:
<script type="module" src="foo.js"></script>
Then Traceur loads in that module, compiles it and everything works great.
I now want to programmatically add an ES6 module to the page from within another ES6 module (reasons are somewhat complicated). Here was my first attempt:
var module = document.createElement('script');
module.setAttribute('type', 'module');
module.textContent = `
console.log('Inside the module now!');
`;
document.body.appendChild(module);
Unfortunately this doesn't work, as Traceur does not monitor the page for script tags added later, I guess.
How can I get Traceur to compile and execute the script? I guess I need to invoke something on either 'traceur' or '$traceurRuntime' but I haven't found a good online source of documentation for that.
You can load other modules using ES6 import statements, or the TraceurLoader API for dynamic dependencies.
Example from the Traceur documentation:
function getLoader() {
    var LoaderHooks = traceur.runtime.LoaderHooks;
    var loaderHooks = new LoaderHooks(new traceur.util.ErrorReporter(), './');
    return new traceur.runtime.TraceurLoader(loaderHooks);
}
getLoader().import('../src/traceur.js',
    function (mod) {
        console.log('DONE');
    },
    function (error) {
        console.error(error);
    }
);
The System.js loader seems to be supported as well:
window.System = new traceur.runtime.BrowserTraceurLoader();
System.import('./Greeter.js');
Dynamic module loading is a (not-yet-standardized) feature of System:
System.import('./repl-module.js').catch(function(ex) {
console.error('Internal Error ', ex.stack || ex);
});
To make this work you need to run npm test and then include BrowserSystem:
<script src="../bin/BrowserSystem.js"></script>
You might also like to look into https://github.com/systemjs/systemjs as it has great support for browser loading.
BTW the System object may eventually be standardized (perhaps under a different name) in the WHATWG: http://whatwg.github.io/loader/#system-loader-instance

AngularJs Dynamic/Multiple HTML Templates

I'm working on an AngularJs/MVC app with Web API etc. which is using a CDN. I have managed to whitelist two URLs for Angular to use, a local CDN and a live CDN (web app hosted in Azure).
I can successfully ng-include a template from my local CDN domain, but the problem arises when I push the site to a UAT/Live environment: I can't be using a template on localhost.
I need a way to be able to dynamically get the base url for the templates. The location on the server will always be the same, e.g. rooturl/html/templates. I just need to be able to change the rooturl depending on the environment.
I was wondering if there is some way to store a global variable, possibly on the $rootScope, that I can get to when using the templates, and then set it to the url via Web API, which would return a config setting.
For example, on my dev machine the var could be http://Localhost:52920/ but on my uat server it could be https://uat-cdn.com/
Any help would be greatly appreciated, as storing JS, CSS, fonts etc. on the CDN but not the HTML feels nasty.
Thanks in advance!
I think it's good practice to keep environment and global config stuff outside of Angular altogether, so it's not part of the normal build process and is harder to accidentally blow away during a deploy. One way is to include a script file containing just a single global variable:
var config = {
    myBaseUrl: '/templates/',
    otherStuff: 'whatever'
};
...and expose it to Angular via a service:
angular.module('myApp')
    .factory('config', function () {
        var config = window.config ? window.config : {}; // (or throw an error if it's not found)
        // set defaults here if useful
        config.myBaseUrl = config.myBaseUrl || 'defaultBaseUrlValue';
        // etc
        return config;
    });
...so it's now injectable as a dependency anywhere you need it:
.controller('fooController', function (config, $scope) {
    $scope.myBaseUrl = config.myBaseUrl;
});
Functionally speaking, this is not terribly different from dumping a global variable into $rootScope but I feel like it's a cleaner separation of app from environment.
If you decide to create a factory then it would look like this:
angular.module('myModule', [])
    .factory('baseUrl', ['$location', function ($location) {
        return {
            getBaseUrl: function () {
                return $location.host();
            }
        };
    }]);
A provider could be handy if you want to make any kind of customization during the config phase.
Maybe you want to build the base url manually instead of taking the host from $location, as sketched below.
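Swapping the factory above for a provider, a rough sketch (the setBase method and the $location fallback are illustrative assumptions, not a prescribed API):
angular.module('myModule')
    .provider('baseUrl', function () {
        var base = '';
        // called during the config phase to set the root manually
        this.setBase = function (value) {
            base = value;
        };
        this.$get = ['$location', function ($location) {
            return {
                getBaseUrl: function () {
                    // fall back to the current host if no base was configured
                    return base || $location.host();
                }
            };
        }];
    });

angular.module('myModule').config(['baseUrlProvider', function (baseUrlProvider) {
    baseUrlProvider.setBase('https://uat-cdn.com/');
}]);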
If you want to use it on the templates then you need to create a filter that reuses it:
angular.module('myModule').filter('anchorBuilder', ['baseUrl', function (baseUrl) {
    return function (path) {
        return baseUrl.getBaseUrl() + path;
    };
}]);
And on the template:
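For example, something along these lines (just a sketch; the path is illustrative):
<a ng-href="{{ '/profile' | anchorBuilder }}">Profile</a>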
EDIT
The above example was to create links, but if you want to use it with an ng-include directive, then you will have a function on your controller that uses the factory and returns the url.
// Template
<div ng-include src="urlBuilder('path')"></div>

// Controller
$scope.urlBuilder = function (path) {
    return baseUrl.getBaseUrl() + path;
};
Make sure to inject the factory into the controller.
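For example, a quick sketch of that injection (the controller name is illustrative):
angular.module('myModule').controller('templateCtrl', ['$scope', 'baseUrl', function ($scope, baseUrl) {
    $scope.urlBuilder = function (path) {
        return baseUrl.getBaseUrl() + path;
    };
}]);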

Referencing resources in a global way either from a virtual directory or the web root?

Let's say I have an MVC/WebAPI/AngularJS site that I'm running locally, e.g.:
localhost/Test/
which I then want to move to
www.test.com
While running locally, I have a lot of references to various directories (JS files, etc.) of the following format (in either JS or HTML files):
app.directive('rpdbSpinner', function () {
    return {
        restrict: 'E',
        templateUrl: '/Test/templates/directives/spinner.html',
        scope: {
            isLoading: '='
        }
    };
});
When updating/web publishing, I'd have to change everything to:
app.directive('rpdbSpinner', function () {
    return {
        restrict: 'E',
        templateUrl: '/templates/directives/spinner.html',
        scope: {
            isLoading: '='
        }
    };
});
I can do this manually (which is what I've been doing), but the larger the project grows, the harder it becomes. I could, of course, only change it once and then exclude the files during the publishing phase (web.config/rest), but it still feels like I am going about it the wrong way. Using "~/" wouldn't work in plain HTML/JS files as far as I'm aware, and thus I can't really use it...
Any suggestions on how to map paths globally, regardless of whether the app is in a virtual directory or the root of a project?
Thanks :)
If you only care about getting the root/base url of the site so you can append to it to get the url you are after, you may simply use / as the first character of your url.
var getUsersUrl = "/api/users";
Here is an alternate approach if you want more than just the app root (e.g. specific urls built using MVC helper methods such as Url.RouteUrl).
You should not hard code your app base path like that. You may use the Url.Content or Url.RouteUrl helper methods in your Razor view to generate the url to the app base. It will take care of correctly building the url regardless of your current page/path. Once you get this value, assign it to a JavaScript variable and use that in your other JS code to build your other urls. Always make sure to use JavaScript namespacing when doing so, to avoid possible issues with global JavaScript variables.
So in your razor view (Layout file or specific view), you may do this.
<script>
    var myApp = myApp || {};
    myApp.Urls = myApp.Urls || {};
    myApp.Urls.baseUrl = '@Url.Content("~")';
    myApp.Urls.userListUrl = '@Url.Action("Index","User")';
</script>
<script src="~/Scripts/NonAngularJavaScript.js"></script>
<script src="~/Scripts/AngularControllerForPage.js"></script>
<script>
    var a = angular.module("app").value("appSettings", myApp);
</script>
In your Angular controller, you can access it like this:
var app = angular.module("app", []);
var ctrl = function (appSettings) {
    var vm = this;
    console.log(appSettings.Urls.userListUrl);
    vm.baseUrl = appSettings.Urls.baseUrl;
    // build other urls using the base url now
    var getUsersUrl = vm.baseUrl + "api/users";
    console.log(getUsersUrl);
};
app.controller("ctrl", ctrl);
You can also access this in your data services, directives etc.
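For instance, a quick sketch of a data service using the same injected value (the service name and endpoint are illustrative):
app.factory("userService", ["$http", "appSettings", function ($http, appSettings) {
    return {
        getUsers: function () {
            return $http.get(appSettings.Urls.baseUrl + "api/users");
        }
    };
}]);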
In your non-Angular JavaScript files:
// With the base url, you may safely add the remaining url route.
var urlToJobIndex2 = myApp.Urls.baseUrl + "jobs/GetIndex";
Using "~/" wouldn't work on plain HTML/JS files as far as I'm aware,
and this I can't really use it...
Yes, but you could inject it in your main server-side served webpage as a variable:
<script>
    // get the base url from the server using ~/, e.g. via a Razor helper:
    var baseUrl = '@Url.Content("~/")';
</script>
and then in your external scripts simply concatenate the relative urls with it. As far as static html files are concerned, it could be a little more problematic. You could serve them through some special server-side handler that takes care of injecting this logic.
You can use module.constant to create an injectable constant.
app.constant("URL_BASE", "/Test");
app.directive('rpdbSpinner', function(URL_BASE) {
return {
restrict: 'E',
**templateUrl: URL_BASE + '/templates/directives/spinner.html',**
scope: {
isLoading:'='
}
}
})
You can also use module.value if you register it before you register your directive.
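For example, the value variant would look something like this (a sketch, registered before the directive above):
app.value("URL_BASE", "/Test");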
For more information see AngularJS Module Guide -- configuration.