Usually, in a plain JavaScript site, I can use the following script tag to reference the Google Maps API and set the callback function to initMap.
<script async defer src="https://maps.googleapis.com/maps/api/js?callback=initMap"></script>
What I observed is that in the plain JavaScript site the initMap function lives in the window scope, so it can be referenced in the script's query string (?callback=initMap). But once I write a component in Angular 2 with a component method called initMap, that method is scoped to my component, and the async loading script I set up in the index can no longer reach it.
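For reference, in the plain JavaScript site the callback is simply a global, something like this (the element id and map options are just examples):
// Global callback, reachable by the Maps loader because it lives on window.
window.initMap = function () {
  new google.maps.Map(document.getElementById('map'), {
    center: { lat: 0, lng: 0 },
    zoom: 2
  });
};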
Specifically, I'd like to know how to achieve the same thing in Angular 2.
PS: I know there is an angular2-google-maps component available in alpha via npm, but it currently ships with limited capability, so I'd like to know an easier way to load the API without using another component, so I can implement my project with the Google Maps API directly.
I see you don't want another component, but Polymer has components that work well with Google APIs. I have Angular 2 code that uses the Polymer YouTube data API, and I had help getting it set up. Here is the plunker that got me started. I think the hard part is getting set up for that callback; I'm sure you can do it without Polymer. The example shows the tricky part: an Angular service is used to hook everything up.
const url = 'https://apis.google.com/js/client.js?onload=__onGoogleLoaded';

export class GoogleAPI {
  loadAPI: Promise<any>;

  constructor() {
    this.loadAPI = new Promise((resolve) => {
      window['__onGoogleLoaded'] = (ev) => {
        console.log('gapi loaded');
        resolve(window.gapi);
      };
      this.loadScript();
    });
  }

  doSomethingGoogley() {
    return this.loadAPI.then((gapi) => {
      console.log(gapi);
    });
  }

  loadScript() {
    console.log('loading..');
    let node = document.createElement('script');
    node.src = url;
    node.type = 'text/javascript';
    document.getElementsByTagName('head')[0].appendChild(node);
  }
}
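A rough sketch of how this service might be injected into a component and used (component name, selector, template, and import path are placeholders; older Angular 2 betas used angular2/core instead of @angular/core):
import { Component } from '@angular/core';
import { GoogleAPI } from './google-api.service';

@Component({
  selector: 'gapi-demo',
  template: '<p>Check the console for the loaded gapi object.</p>',
  providers: [GoogleAPI]
})
export class GapiDemoComponent {
  constructor(googleAPI: GoogleAPI) {
    // Resolves only after client.js has loaded and __onGoogleLoaded has fired.
    googleAPI.doSomethingGoogley();
  }
}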
I came across this while trying to develop a progressive web app, i.e. one where there is a possibility of not being online. There was an error in the code examples: onload in the Google Maps script URL should be callback. So my modification of user2467174's answer led to this:
map-loader.service.ts
import { Injectable } from '@angular/core';

const url = 'http://maps.googleapis.com/maps/api/js?key=xxxxx&callback=__onGoogleLoaded';

@Injectable()
export class GoogleMapsLoader {
  private static promise;

  public static load() {
    // First time 'load' is called?
    if (!GoogleMapsLoader.promise) {
      // Make promise to load
      GoogleMapsLoader.promise = new Promise(resolve => {
        // Set callback for when google maps is loaded.
        window['__onGoogleLoaded'] = (ev) => {
          resolve('google maps api loaded');
        };
        let node = document.createElement('script');
        node.src = url;
        node.type = 'text/javascript';
        document.getElementsByTagName('head')[0].appendChild(node);
      });
    }

    // Always return promise. When 'load' is called many times, the promise is already resolved.
    return GoogleMapsLoader.promise;
  }
}
And then I have a component with
import { GoogleMapsLoader } from './map/map-loader.service';

constructor() {
  GoogleMapsLoader.load()
    .then(res => {
      console.log('GoogleMapsLoader.load.then', res);
      this.mapReady = true;
    });
}
And a template
<app-map *ngIf='mapReady'></app-map>
This way the map div is only put into the DOM if we are online.
And then in the map.component.ts we can wait until the component is placed into the DOM before loading the map itself.
ngOnInit() {
  if (typeof google !== 'undefined') {
    console.log('MapComponent.ngOnInit');
    this.loadMap();
  }
}
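loadMap() itself is not shown in the answer; a minimal sketch of what it might do inside the component (the #mapEl template reference and the map options are assumptions; ViewChild and ElementRef come from @angular/core):
@ViewChild('mapEl') mapEl: ElementRef;
map: any; // could be typed as google.maps.Map if the Maps typings are installed

loadMap() {
  // Assumes the template contains <div #mapEl class="map"></div>.
  this.map = new google.maps.Map(this.mapEl.nativeElement, {
    center: { lat: 52.52, lng: 13.405 },
    zoom: 12
  });
}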
Just in case you'd like to make it a static function, which always returns a promise, but only gets the api once.
const url = 'https://maps.googleapis.com/maps/api/js?callback=__onGoogleMapsLoaded&key=YOUR_API_KEY';

export class GoogleMapsLoader {
  private static promise;

  public static load() {
    // First time 'load' is called?
    if (!GoogleMapsLoader.promise) {
      // Make promise to load
      GoogleMapsLoader.promise = new Promise((resolve) => {
        // Set callback for when google maps is loaded.
        window['__onGoogleMapsLoaded'] = (ev) => {
          console.log('google maps api loaded');
          resolve(window['google']['maps']);
        };

        // Add script tag to load google maps, which then triggers the callback,
        // which resolves the promise with window.google.maps.
        console.log('loading..');
        let node = document.createElement('script');
        node.src = url;
        node.type = 'text/javascript';
        document.getElementsByTagName('head')[0].appendChild(node);
      });
    }

    // Always return promise. When 'load' is called many times, the promise is already resolved.
    return GoogleMapsLoader.promise;
  }
}
This is how you can get the api in other scripts:
GoogleMapsLoader.load()
  .then((_mapsApi) => {
    debugger;
    this.geocoder = new _mapsApi.Geocoder();
    this.geocoderStatus = _mapsApi.GeocoderStatus;
  });
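And a quick sketch of using the geocoder obtained above (the address is just a placeholder):
this.geocoder.geocode({ address: '1600 Amphitheatre Parkway, Mountain View, CA' }, (results, status) => {
  if (status === this.geocoderStatus.OK) {
    // Each result carries a geometry.location LatLng.
    console.log(results[0].geometry.location.toString());
  } else {
    console.warn('Geocoding failed:', status);
  }
});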
This is what I'm currently using:
loadMapsScript(): Promise<void> {
  return new Promise(resolve => {
    if (document.querySelectorAll(`[src="${mapsScriptUrl}"]`).length) {
      // The script tag is already on the page, nothing more to do.
      resolve();
    } else {
      document.body.appendChild(Object.assign(document.createElement('script'), {
        type: 'text/javascript',
        src: mapsScriptUrl,
        // Pass a handler (not the result of calling one) and resolve once the script has loaded.
        onload: () => {
          doMapInitLogic();
          resolve();
        }
      }));
    }
  });
}
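As a sketch of how this might be wired up (mapsScriptUrl and doMapInitLogic are not defined in the original answer, so these definitions are assumptions):
// Hypothetical wiring for the names used above.
const mapsScriptUrl = 'https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY';

const doMapInitLogic = () => {
  // e.g. create the map once the script is available
  new google.maps.Map(document.getElementById('map'), { center: { lat: 0, lng: 0 }, zoom: 2 });
};

// Somewhere in the component, e.g. ngOnInit():
this.loadMapsScript().then(() => console.log('maps script ready'));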
See my more comprehensive instructions here
Related
The Apify Puppeteer Scraper does not expose jQuery in the context object. I need to access an external JSON data source within the Puppeteer Scraper pageFunction and then loop over one of its nodes. Here is what I would do if jQuery were available:
$.get(urlAPI, function(data) {
    $.each(data.feed.entry, function(index, value) {
        var url = value.URL;
        // ... do something with each url
    });
});
As the pageFunction runs in the Node.js context, there is no jQuery there. You can easily include jQuery inside the page.evaluate function using the Apify SDK.
async function pageFunction(context) {
    const { page, request, log, Apify } = context;
    // Inject jQuery into the page so it can be used inside page.evaluate.
    await Apify.utils.puppeteer.injectJQuery(page);
    const title = await page.evaluate(() => {
        // jQuery is available here because we injected it with injectJQuery above.
        return $('title').text();
    });
    return {
        title,
    };
}
EDIT: Using requestAsBrowser.
async function pageFunction(context) {
    const { page, request, log, Apify } = context;
    const response = await Apify.utils.requestAsBrowser({ url: "http://example.com" });
    const data = JSON.parse(response.body);
    return {
        data,
    };
}
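To tie this back to the question, a sketch of fetching the external JSON and looping over the entries without jQuery might look like this (the feed.entry shape is taken from the question; the URL is a placeholder for the question's urlAPI):
async function pageFunction(context) {
    const { request, log, Apify } = context;
    // Substitute the real urlAPI from the question here.
    const response = await Apify.utils.requestAsBrowser({ url: 'https://example.com/feed.json' });
    const data = JSON.parse(response.body);
    // Plain JavaScript replacement for the $.each loop from the question.
    const urls = data.feed.entry.map((value) => value.URL);
    log.info(`Found ${urls.length} URLs`);
    return { urls };
}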
You don't need jQuery to access an external resource (you can use it if you are familiar with it, though).
Usually, we extract external data via common libraries like request or Apify's own httpRequest from a standalone actor. Unfortunately, Puppeteer Scraper doesn't allow the use of extra libraries (only ones downloaded dynamically, which is probably overkill here).
I would just use a modern fetch call in the browser. It is nicer than jQuery's AJAX and doesn't require injecting anything.
async function pageFunction(context) {
    const { page, request, log, Apify } = context;
    const json = await page.evaluate(async () => {
        // fetch runs in the browser context, so no injection is needed.
        return await fetch('http://my-json-url.com').then((resp) => resp.json());
    });
    // Process the JSON
}
This is my controller which is calling the login service
mod.controller("loginCtrl",function($scope,loginService,$http)
{
$scope.Userlogin = function()
{
var User = {
userid :$scope.uname,
pass:$scope.pass
};
var res = UserloginService(User);
console.log(res);
alert("login_succ");
}
});
And this is the login service code which takes the User variable and checks for username & password
mod.service("loginService",function($http,$q) {
UserloginService = function(User) {
var deffered = $q.defer();
$http({
method:'POST',
url:'http://localhost:8080/WebApplication4_1/login.htm',
data:User
}).then(function(data) {
deffered.resolve(data);
}).error(function(status) {
deffered.reject({
status:status
});
});
return deffered.promise;
// var response = $http({
//
// method:"post",
// url:"http://localhost:8080/WebApplication4_1/login.htm",
// data:JSON.stringify(User),
// dataType:"json"
// });
// return "Name";
}
});
I have created a REST API using Spring which, upon being passed JSON, returns the username and password as JSON, like this:
The console shows me this error for Angular:
You need to enable CORS for your application. For guidance, see this link:
https://htet101.wordpress.com/2014/01/22/cors-with-angularjs-and-spring-rest/
I prefer to use a factory to do what you're trying to do, which would look something like this:
MyApp.factory('MyService', ["$http", function($http) {
    var urlBase = "http://localhost:3000";
    return {
        getRecent: function(numberOfItems) {
            return $http.get(urlBase + "/things/recent?limit=" + numberOfItems);
        },
        getSomethingElse: function(url) {
            return $http.get(urlBase + "/other/things");
        },
        search: function(searchTerms) {
            return $http.get(urlBase + "/search?q=" + searchTerms);
        }
    };
}]);
And then in your controller you can inject MyService and use it this way:
MyService.getRecent(10).then(function(res) {
    $scope.things = res.data;
});
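For completeness, a sketch of wiring the factory into a controller via dependency injection (the controller name is a placeholder):
MyApp.controller('ThingsCtrl', ['$scope', 'MyService', function($scope, MyService) {
    // MyService is injected by AngularJS; each method returns the $http promise.
    MyService.getRecent(10).then(function(res) {
        $scope.things = res.data;
    });
}]);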
This is a great way to handle it, because the .then lives in your controller, so you can control the state of the UI while data is loading if you'd like, like this:
// initialize the loading var, set to false
$scope.loading = false;

// create a reusable update function; inside, use the promise from the ajax call,
// which runs inside the `Factory`
$scope.updateList = function() {
    $scope.loading = true;
    MyService.getRecent(10).then(function(res) {
        $scope.loading = false;
        $scope.things = res.data;
    });
};

$scope.updateList();
The error in the console shows two issues with your code:
CORS is not enabled on your API. To fix this you need to enable CORS by adding the Access-Control-Allow-Origin header to your REST API's responses.
An unhandled rejection error, because the way you are handling errors with the '.error()' method is deprecated.
The 'Promise.error()' method is deprecated according to this and this commit in the AngularJS GitHub repo.
Hence you need to change the way you are handling errors, as shown below:
$http().then(successCallback, errorCallback);
function successCallback (res) {
    return res;
}

function errorCallback (err) {
    return err;
}
One more thing in your code that can be avoided: you have defined a new promise and resolved it using $q, which is not required. $http itself returns a promise by default, so you do not need to wrap it in another one; you can directly use $http(...).then().
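For example, a sketch of the login service with the extra $q deferred removed (URL kept from the question; the controller attaches its own success and error callbacks):
mod.service("loginService", ["$http", function($http) {
    // Expose a method on the service instead of a global function.
    this.login = function(User) {
        // $http already returns a promise, so just return it.
        return $http({
            method: 'POST',
            url: 'http://localhost:8080/WebApplication4_1/login.htm',
            data: User
        });
    };
}]);

// In the controller:
loginService.login(User).then(function(response) {
    console.log(response.data);
}, function(err) {
    console.log('login failed, status: ' + err.status);
});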
I'm pretty new to Angular so maybe I'm asking the impossible but anyway, here is my challenge.
As our server cannot paginate the JSON data, I would like to stream the JSON and add it page by page to the controller's model. The user doesn't have to wait for the entire stream to load, so I refresh the view for every X (pagesize) records.
I found oboe.js for parsing the JSON stream and added it using bower to my project. (bower install oboe --save).
I want to update the controller's model during streaming. I did not use the $q implementation of promises, because only one .resolve(...) is possible and I want multiple pages of data loaded via the stream, so $digest needs to be called with every page. The RESTful service that is called is /service/tasks/search.
I created a factory with a search function which I call from within the controller:
'use strict';

angular.module('myStreamingApp')
    .factory('Stream', function() {
        return {
            search: function(schema, scope) {
                var loaded = 0;
                var pagesize = 100;
                // JSON streaming parser oboe.js
                oboe({
                    url: '/service/' + schema + '/search'
                })
                    // process every node which has a schema
                    .node('{schema}', function(rec) {
                        // push the record to the model data
                        scope.data.push(rec);
                        loaded++;
                        // if there is another page received then refresh the view
                        if (loaded % pagesize === 0) {
                            scope.$digest();
                        }
                    })
                    .fail(function(err) {
                        console.log('streaming error' + (err.thrown ? err.thrown.message : ''));
                    })
                    .done(function() {
                        scope.$digest();
                    });
            }
        };
    });
My controller:
'use strict';

angular.module('myStreamingApp')
    .controller('MyCtrl', function($scope, Stream) {
        $scope.data = [];
        Stream.search('tasks', $scope);
    });
It all seems to work. After a while, however, the system gets slow and the HTTP call doesn't terminate after refreshing the browser. The browser (Chrome) also crashes when too many records are loaded.
Maybe I'm on the wrong track, because passing the scope to the factory's search function doesn't "feel" right, and I suspect that calling $digest on that scope is giving me trouble. Any ideas on this subject are welcome, especially if you have an idea for an implementation where the factory (or service) could return a promise and I could use
$scope.data = Stream.search('tasks');
in the controller.
I dug in a little further and came up with the following solution. It might help someone:
The factory (named Stream) has a search function which is passed the parameters for the Ajax request and a callback function. The callback is called for every page of data loaded by the stream. The callback is invoked via a deferred promise, so the scope is updated automatically with every page. To access the search function I use a service (named Search), which initially returns an empty array of data. As the stream progresses, the factory calls the callback passed by the service and each page is added to the data.
I can now call the Search service from within a controller and assign the return value to the scope's data array.
The service and the factory:
'use strict';

angular.module('myStreamingApp')
    .service('Search', function(Stream) {
        return function(params) {
            // initialize the data
            var data = [];
            // add the data page by page using a stream
            Stream.search(params, function(page) {
                // a page of records is received.
                // add each record to the data
                _.each(page, function(record) {
                    data.push(record);
                });
            });
            return data;
        };
    })
    .factory('Stream', function($q) {
        return {
            // the search function calls the oboe module to get the JSON data in a stream
            search: function(params, callback) {
                // the defer will be resolved immediately
                var defer = $q.defer();
                var promise = defer.promise;
                // counter for the received records
                var counter = 0;
                // I use an arbitrary page size.
                var pagesize = 100;
                // initialize the page of records
                var page = [];
                // call the oboe function to start the stream
                oboe({
                    url: '/api/' + params.schema + '/search',
                    method: 'GET'
                })
                    // once the stream starts we can resolve the defer.
                    .start(function() {
                        defer.resolve();
                    })
                    // for every node containing an _id
                    .node('{_id}', function(node) {
                        // we push the node to the page
                        page.push(node);
                        counter++;
                        // if the pagesize is reached return the page using the promise
                        if (counter % pagesize === 0) {
                            promise.then(callback(page));
                            // initialize the page
                            page = [];
                        }
                    })
                    .done(function() {
                        // when the stream is done make sure the last page of nodes is returned
                        promise.then(callback(page));
                    });
                return promise;
            }
        };
    });
Now I can call the service from within a controller and assign the response of the service to the scope:
$scope.mydata = Search({schema: 'tasks'});
Update, August 30, 2014
I have created an angular-oboe module with the above solution in a slightly more structured form.
https://github.com/RonB/angular-oboe
I want to take advantage of the Google Maps loader callback as demonstrated here:
https://developers.google.com/maps/documentation/javascript/examples/map-simple-async
I have a working example of doing this using AMD and promises. To load and consume the API:
require(["path/to/google-maps-api-v3"], function (api) {
api.then(function (googleMaps) {
// consume the api
});
});
Here's my module definition, which I'd prefer to return google.maps after it's fully loaded instead of a deferred:
define(["dojo/Deferred"], function (Deferred) {
var d = new Deferred();
dojoConfig["googleMapsReady"] = function () {
delete dojoConfig["googleMapsReady"];
d.resolve(google.maps);
}
require(["http://maps.google.com/maps/api/js?v=3&sensor=false&callback=dojoConfig.ipsx.config.googleMapsReady&"]);
return d;
});
But this solution returns a promise instead of the fully initialized google.maps. I'd prefer it to look like a regular AMD module but can't see how.
Create an AMD plugin. Here's the one I created based on JanMisker's example:
define(function () {
    var cb = "_asyncApiLoaderCallback";
    return {
        load: function (param, req, loadCallback) {
            if (!cb) return;
            dojoConfig[cb] = function () {
                delete dojoConfig[cb];
                cb = null;
                loadCallback(google.maps);
            };
            require([param + "&callback=dojoConfig." + cb]);
        }
    };
});
Usage example:
require(["plugins/async!//maps.google.com/maps/api/js?v=3&sensor=false"]);
I tried to follow the tutorial here, and when I used their data service, it worked just fine.
I modified the source to point to my data service (WCF Data Services v5.6, OData V2), and the list just shows the Loading sign and nothing happens.
The code should load any data type; it just has to be mapped accordingly. My service is available through the browser, I checked.
Here is the code:
DevExTestApp.home = function (params) {
    var viewModel = {
        dataSource: DevExpress.data.createDataSource({
            load: function (loadOptions) {
                if (loadOptions.refresh) {
                    try {
                        var deferred = new $.Deferred();
                        $.get("http://192.168.1.101/dataservice/dataservice.svc/People")
                            .done(function (result) {
                                var mapped = $.map(result, function (data) {
                                    return {
                                        name: data.Name
                                    };
                                });
                                deferred.resolve(mapped);
                            });
                    }
                    catch (err) {
                        alert(err.message);
                    }
                    return deferred;
                }
            }
        })
    };
    return viewModel;
}
What else should I set?
The try-catch block would not help in this case, because the data loading is async. Instead, subscribe to the fail callback:
$.get(url)
.done(doneFunc)
.fail(failFunc);
Another common problem with accessing a web service from JavaScript is the Same-Origin Policy. Your OData service has to support either CORS or JSONP. Refer to this discussion.
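For the load function from the question, a sketch of wiring the fail callback in (the mapping is kept from the question) might look like this:
load: function (loadOptions) {
    if (loadOptions.refresh) {
        var deferred = new $.Deferred();
        $.get("http://192.168.1.101/dataservice/dataservice.svc/People")
            .done(function (result) {
                deferred.resolve($.map(result, function (data) {
                    return { name: data.Name };
                }));
            })
            .fail(function (jqXHR, textStatus, errorThrown) {
                // Surfaces the real reason nothing shows up, e.g. a blocked cross-origin request.
                alert(textStatus + ' ' + (errorThrown || ''));
                deferred.reject(textStatus);
            });
        return deferred.promise();
    }
}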