I want to draw a marker for each zip code in zipCode, but I can see only a few markers.
I thought it was because of async and await, but I don't know where to add them.
Somebody please help me.
var zipCode = [...]; // zipCode is an array of zip codes.

function func1() {
    zipCode.forEach((item, index) => {
        drawZipCodeMarker(item.zip);
    });
}

function drawZipCodeMarker(zip) {
    geocoder.geocode({'address': zip}, (results, status) => {
        console.log(zip);
        console.log(results);
        if (results != null) {
            var temp = new google.maps.Marker({position: results[0].geometry.location, map: map, title: zip});
        }
    });
}
You are using the Geocoding service of the Maps JavaScript API. The services in the Google Maps JavaScript API have a per-session rate limit, described in the documentation as:
Note: The additional rate limit is applied per user session, regardless of how many users share the same project. When you first load the API, you are allocated an initial quota of requests. Once you use this quota, the API enforces rate limits on additional requests on a per-second basis. If too many requests are made within a certain time period, the API returns an OVER_QUERY_LIMIT response code.
The per-session rate limit prevents the use of client-side services for batch requests, such as batch geocoding. For batch requests, use the Geocoding API web service.
source: https://developers.google.com/maps/documentation/javascript/geocoding
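For batch geocoding, the web service mentioned in the quote is a plain HTTPS endpoint you call from your server rather than from the browser. A minimal example request (the address and key are placeholders):

https://maps.googleapis.com/maps/api/geocode/json?address=10001&key=YOUR_API_KEY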
As far as I know, initially you have a bucket of 10 requests. Once the bucket is empty, further requests are denied. The bucket is refilled at a rate of 1 request per second, so you have to throttle your geocoding requests in order to stay within the allowed per-session limits.
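A minimal throttling sketch, reusing the function names from the question and assuming the roughly 1 request/second refill rate described above, would simply stagger the calls instead of firing them all at once:

function func1() {
    zipCode.forEach((item, index) => {
        // stagger the requests: roughly one geocode call per second
        setTimeout(() => drawZipCodeMarker(item.zip), index * 1000);
    });
}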
Even with throttling, you should check the status of the response. If the status is OVER_QUERY_LIMIT, you have exhausted your bucket and need to retry the request. You can use the exponential backoff approach for the retry logic (https://en.wikipedia.org/wiki/Exponential_backoff).
var zipCode = [...]; // zipCode is an array of zip codes.
var delayFactor = 0;

function func1() {
    zipCode.forEach((item, index) => {
        drawZipCodeMarker(item.zip);
    });
}

function drawZipCodeMarker(zip) {
    geocoder.geocode({'address': zip}, (results, status) => {
        if (status === google.maps.GeocoderStatus.OK) {
            console.log(zip);
            console.log(results);
            if (results != null) {
                var temp = new google.maps.Marker({position: results[0].geometry.location, map: map, title: zip});
            }
        } else if (status === google.maps.GeocoderStatus.OVER_QUERY_LIMIT) {
            // bucket exhausted: retry this zip code later, waiting a bit
            // longer for every request that has been rejected so far
            delayFactor++;
            setTimeout(function () {
                drawZipCodeMarker(zip);
            }, delayFactor * 1100);
        } else {
            console.log("Error: " + status);
        }
    });
}
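Note that delayFactor above grows the wait linearly and is shared across all zip codes. For true exponential backoff per request, a sketch along the same lines (drawZipCodeMarkerWithBackoff is a hypothetical variant of the function above):

function drawZipCodeMarkerWithBackoff(zip, attempt) {
    attempt = attempt || 0;
    geocoder.geocode({'address': zip}, (results, status) => {
        if (status === google.maps.GeocoderStatus.OK && results != null) {
            new google.maps.Marker({position: results[0].geometry.location, map: map, title: zip});
        } else if (status === google.maps.GeocoderStatus.OVER_QUERY_LIMIT) {
            // double the wait on every consecutive rejection: 1.1 s, 2.2 s, 4.4 s, ...
            setTimeout(() => drawZipCodeMarkerWithBackoff(zip, attempt + 1), Math.pow(2, attempt) * 1100);
        } else {
            console.log("Error: " + status);
        }
    });
}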
I hope this helps!
I'm building a minting site that requires me to check the number of NFTs minted and display that number in real time to the user.
At first I was just making a request every few seconds to retrieve the number, but then I figured I could use an event listener to cut down on the requests, as people would only be minting in short bursts.
However, after using the event listener, the volume of requests has gone way up. It looks like it is constantly calling blockNumber, chainId, and getLogs. Is this just how an event listener works under the hood? Or am I doing something wrong here?
This is a next js API route and here is the code:
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import { ethers } from 'ethers'
import { contractAddress } from '../../helpers'
import type { NextApiRequest, NextApiResponse } from 'next'
import abi from '../../data/abi.json'

const NEXT_PUBLIC_ALCHEMY_KEY_GOERLI =
  process.env.NEXT_PUBLIC_ALCHEMY_KEY_GOERLI

let count = 0
let lastUpdate = 0

const provider = new ethers.providers.JsonRpcProvider(
  NEXT_PUBLIC_ALCHEMY_KEY_GOERLI,
  'goerli'
)

const getNumberMinted = async () => {
  console.log('RUNNING NUMBER MINTED - MAKING REQUEST', Date.now())
  const provider = new ethers.providers.JsonRpcProvider(
    NEXT_PUBLIC_ALCHEMY_KEY_GOERLI,
    'goerli'
  )
  const contract = new ethers.Contract(contractAddress, abi.abi, provider)
  const numberMinted = await contract.functions.totalSupply()
  count = Number(numberMinted)
  lastUpdate = Date.now()
}

const contract = new ethers.Contract(contractAddress, abi.abi, provider)

contract.on('Transfer', (to, amount, from) => {
  console.log('running event listener')
  if (lastUpdate < Date.now() - 5000) {
    getNumberMinted()
  }
})

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  try {
    res.setHeader('Content-Type', 'application/json')
    res.status(200).json({ count })
  } catch (err) {
    res
      .status(500)
      .json({ error: 'There was an error from the server, please try again' })
  }
}
If you use the AlchemyProvider, or directly the StaticJsonRpcProvider (which AlchemyProvider inherits from), you will eliminate the chainId calls; those are used to ensure the network hasn’t changed, but if you are using a third-party service, like Alchemy or INFURA, this isn’t a concern, which is why the StaticJsonRpcProvider exists. :)
Then every pollingInterval, a getBlockNumber call is made (because this is a relatively cheap call) to detect when a new block occurs; when a new block occurs, it uses the getLogs method to find any logs that occurred during that block. This minimizes the number of expensive getLogs calls.
You can increase or decrease the pollingInterval to trade-off latency for server resource cost.
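A minimal sketch of both suggestions, assuming ethers v5; the key is a placeholder (note that AlchemyProvider takes a bare API key, while the question's env var may hold a full RPC URL), and the 10 s interval is just an example (ethers' default is 4 s):

import { ethers } from 'ethers'

// AlchemyProvider extends StaticJsonRpcProvider, so the chainId
// call on every poll goes away
const provider = new ethers.providers.AlchemyProvider(
  'goerli',
  process.env.ALCHEMY_API_KEY // placeholder
)

// poll less often: cheaper on the backend, higher event latency
provider.pollingInterval = 10000

const contract = new ethers.Contract(contractAddress, abi.abi, provider)
contract.on('Transfer', () => { /* ... */ })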
And that’s how events work. :)
Does that make sense?
I have a service with a method called "getGmapsDistance()". Here I'm using the Google Maps API to get the distance between an origin and a destination.
export default Ember.Service.extend({
    getShortestDistanceInMeters: function(location) {
        var service = new google.maps.DistanceMatrixService();
        service.getDistanceMatrix({
            ...
        }, this.callback); //<<<<<< !!!
    },

    callback: function(response, status) {
        ....
    }
});
In my controller I've got an array with locations, and now I want to iterate over it and check for each element whether the distance is <= the max distance.
locationsNearby: Ember.computed('locations', function() {
    //...
    var filteredResult = [];
    locations.forEach(function(locat) {
        if (this.get('distanceService').getShortestDistanceInMeters(locat) <= maxDistance) {
            filteredResult.pushObject(locat);
        }
    });
    return filteredResult;
})
Unfortunately the GMaps API for distance calculation uses a callback so the request is async.
How can I solve that problem?
You cannot make an async call synchronous! This is a JavaScript language limitation and is important to understand: JavaScript has only one thread, so this can't be changed by a library.
The fancy new way to handle callbacks is Promises.
You really, really should check out the specification!
It's one of the most beautiful specifications you will ever read!
Ember uses Promises heavily! For example, a route's model hook waits for a Promise to resolve before going on with the transition.
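For illustration, a minimal sketch of such a model hook ('location' is a placeholder model name; findAll is just a stand-in for any promise-returning fetch):

export default Ember.Route.extend({
    model() {
        // the transition waits until this promise resolves
        return this.store.findAll('location');
    }
});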
In your case you want to update the computed property when the promise resolves. Because ember-data causes this to happen often, they provide two fancy classes: PromiseObject and PromiseArray. A computed property depending on a computed property that returns a PromiseObject/PromiseArray will recompute when the promise resolves:
locationsNearby: Ember.computed('locations', {
    get() {
        let promise = Ember.RSVP.all(this.get('locations').map(location => Ember.RSVP.hash({
            location,
            distance: this.get('distanceService').getShortestDistanceInMeters(location)
        }))).then(hashes => hashes.filter(hash => hash.distance <= maxDistance)
                                  .map(hash => hash.location));
        return DS.PromiseArray.create({promise});
    }
})
To explain it a little:
I build an array of hashes, each with the location and a promise for the distance:
let locationsWithDistancePromise = this.get('locations').map(location => ({
    distance: this.get('distanceService').getShortestDistanceInMeters(location),
    location
}));
Then I use RSVP.hash on each of them to get an array of promises, each of which resolves to a hash with distance and location:
let hashPromiseArr = locationsWithDistancePromise.map(h => Ember.RSVP.hash(h));
Now I use Ember.RSVP.all to get a promise that will resolve to an array of hashes with location and distance:
let hashArrPromise = Ember.RSVP.all(hashPromiseArr);
And finally I .then on the promise, filter out the nearby locations, and map the hashes to an array of locations.
let promise = hashArrPromise.then(hashes => {
    return hashes.filter(hash => hash.distance <= maxDistance)
                 .map(hash => hash.location);
});
And wrap it in a PromiseArray:
return DS.PromiseArray.create({promise});
You can just loop over this computed property from handlebars with {{#each}} (see the template sketch after the next snippet) or use it in another computed property:
allNearbyLocations: Ember.computed('locationsNearby.[]', {
    get() {
        return this.get('locationsNearby').toArray().join(' - ');
    }
})
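And in a template, a minimal {{#each}} sketch over the promise-backed array (property names as above; the list renders empty until the promise resolves, then fills in):

{{#each locationsNearby as |location|}}
    {{location}}
{{/each}}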
Of course you need to rewrite getShortestDistanceInMeters so that it returns a Promise:
getShortestDistanceInMeters(location) {
    var service = new google.maps.DistanceMatrixService();
    return new Ember.RSVP.Promise((resolve, reject) => {
        service.getDistanceMatrix({
            //...
        }, (response, status) => {
            if (status !== google.maps.DistanceMatrixStatus.OK) {
                reject(response);
            } else {
                resolve(response);
            }
        });
    });
}
I'm using NodeJS and an npm package called oauth to communicate with Twitter's search API. For some reason, however, Twitter is returning an empty array of statuses without any error... What is even more confusing is the fact that using a tool like Postman with the exact same request and keys returns the list of tweets. It makes no sense! Here is my request:
URL: https://api.twitter.com/1.1/search/tweets.json?count=100&q=hello&since_id=577103514154893312&max_id=577103544903462913
Here is my code:
var twitter_auth = new OAuth(
    "https://api.twitter.com/oauth/request_token",
    "https://api.twitter.com/oauth/access_token",
    config.consumer_key,
    config.consumer_secret,
    "1.0A",
    null,
    "HMAC-SHA1"
);

var request = twitter_auth.get(
    "https://api.twitter.com/1.1/search/tweets.json" + url,
    config.access_token,
    config.access_token_secret
);

var chunk = "", message = "", that = this;

request.on("response", function(response) {
    response.setEncoding("utf8");
    response.on("data", function(data) {
        chunk += data;
        try {
            message = JSON.parse(chunk);
        } catch(e) {
            return;
        }
        console.log(message);
        if (message.statuses) {
            for (var i = 0; i < message.statuses.length; i++) {
                var tweet = message.statuses[i];
                that.termData[term.name].push(tweet);
            }
            if (message.search_metadata.next_results) {
                that.openRequests.push(that.createNewSearch(message.search_metadata.next_results, term));
            } else {
                that.termCompleted(term);
            }
        } else if (message) {
            console.log("Response does not appear to be valid.");
        }
    });
    response.on("end", function() {
        console.log("Search API End");
    });
    response.on("error", function(err) {
        console.log("Search API Error", err);
    });
});

request.end();
The console.log(message) is returning this:
{
    statuses: [],
    search_metadata: {
        completed_in: 0.007,
        max_id: 577103544903462900,
        max_id_str: '577103544903462913',
        query: 'hello',
        refresh_url: '?since_id=577103544903462913&q=hello&include_entities=1',
        count: 100,
        since_id: 577103514154893300,
        since_id_str: '577103514154893312'
    }
}
Any ideas what is going on? Why is the statuses array empty in my code but full of tweets in Postman?
This issue was described at twittercommunity.com.
According to the answer of user rchoi (Twitter staff):
"Regarding web vs. API search, we're aware that the two return different results at the moment. We made upgrades to the web search. There is no timeline for those changes to be brought to other parts of our system."
Try to use
https://dev.twitter.com/rest/reference/get/statuses/mentions_timeline
https://dev.twitter.com/rest/reference/get/statuses/user_timeline
if you get empty results from the search API.
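For example, a minimal sketch reusing the question's twitter_auth client (the screen_name value is a placeholder):

var request = twitter_auth.get(
    "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=SOME_USER&count=100",
    config.access_token,
    config.access_token_secret
);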
Please follow this link
https://twittercommunity.com/t/search-tweets-api-returned-empty-statuses-result-for-some-queries/12257/6
I'm pretty new to Angular, so maybe I'm asking the impossible, but anyway, here is my challenge.
As our server cannot paginate JSON data, I would like to stream the JSON and add it page by page to the controller's model. The user doesn't have to wait for the entire stream to load, so I refresh the view for every X (pagesize) records.
I found oboe.js for parsing the JSON stream and added it using bower to my project. (bower install oboe --save).
I want to update the controller's model during the streaming. I did not use the $q implementation of promises, because only one .resolve(...) is possible and I want multiple pages of data loaded via the stream, so $digest needs to be called with every page. The RESTful service that is called is /service/tasks/search.
I created a factory with a search function which I call from within the controller:
'use strict';

angular.module('myStreamingApp')
    .factory('Stream', function() {
        return {
            search: function(schema, scope) {
                var loaded = 0;
                var pagesize = 100;
                // JSON streaming parser oboe.js
                oboe({
                    url: '/service/' + schema + '/search'
                })
                // process every node which has a schema
                .node('{schema}', function(rec) {
                    // push the record to the model data
                    scope.data.push(rec);
                    loaded++;
                    // if there is another page received then refresh the view
                    if (loaded % pagesize === 0) {
                        scope.$digest();
                    }
                })
                .fail(function(err) {
                    console.log('streaming error ' + (err.thrown ? err.thrown.message : ''));
                })
                .done(function() {
                    scope.$digest();
                });
            }
        };
    });
My controller:
'use strict';

angular.module('myStreamingApp')
    .controller('MyCtrl', function($scope, Stream) {
        $scope.data = [];
        Stream.search('tasks', $scope);
    });
It all seems to work. After a while, however, the system gets slow and the HTTP call doesn't terminate after refreshing the browser. Also, the browser (Chrome) crashes when there are too many records loaded.
Maybe I'm on the wrong track, because passing the scope to the factory's search function doesn't "feel" right, and I suspect that calling $digest on that scope is giving me trouble. Any ideas on this subject are welcome, especially if you have an idea on implementing it where the factory (or service) could return a promise and I could use
$scope.data = Stream.search('tasks');
in the controller.
I dug in a little further and came up with the following solution. It might help someone:
The factory (named Stream) has a search function which is passed the parameters for the Ajax request and a callback function. The callback is called for every page of data loaded by the stream, via a deferred.promise, so the scope is updated automatically with every page. To access the search function I use a service (named Search) which initially returns an empty array of data. As the stream progresses, the factory calls the callback passed by the service and the page is added to the data.
I can now call the Search service from within a controller and assign the return value to the scope's data array.
The service and the factory:
'use strict';

angular.module('myStreamingApp')
    .service('Search', function(Stream) {
        return function(params) {
            // initialize the data
            var data = [];
            // add the data page by page using a stream
            Stream.search(params, function(page) {
                // a page of records is received.
                // add each record to the data
                _.each(page, function(record) {
                    data.push(record);
                });
            });
            return data;
        };
    })
    .factory('Stream', function($q) {
        return {
            // the search function calls the oboe module to get the JSON data in a stream
            search: function(params, callback) {
                // the defer will be resolved immediately
                var defer = $q.defer();
                var promise = defer.promise;
                // counter for the received records
                var counter = 0;
                // I use an arbitrary page size.
                var pagesize = 100;
                // initialize the page of records
                var page = [];
                // call the oboe function to start the stream
                oboe({
                    url: '/api/' + params.schema + '/search',
                    method: 'GET'
                })
                // once the stream starts we can resolve the defer.
                .start(function() {
                    defer.resolve();
                })
                // for every node containing an _id
                .node('{_id}', function(node) {
                    // we push the node to the page
                    page.push(node);
                    counter++;
                    // if the pagesize is reached return the page using the promise
                    if (counter % pagesize === 0) {
                        promise.then(callback(page));
                        // initialize the page
                        page = [];
                    }
                })
                .done(function() {
                    // when the stream is done make sure the last page of nodes is returned
                    promise.then(callback(page));
                });
                return promise;
            }
        };
    });
Now I can call the service from within a controller and assign the response of the service to the scope:
$scope.mydata = Search({schema: 'tasks'});
Update August 30, 2014
I have created an angular-oboe module with the above solution, a little more structured.
https://github.com/RonB/angular-oboe
I tried to follow the tutorial here, and when I used their data service, it worked just fine.
I modified the source to use my data service (WCF Data Service v5.6, OData V2), and the list just shows the Loading sign and nothing happens.
The code should load any data type, it just has to be mapped accordingly. My service is available through the browser, I checked.
Here is the code:
DevExTestApp.home = function (params) {
    var viewModel = {
        dataSource: DevExpress.data.createDataSource({
            load: function (loadOptions) {
                if (loadOptions.refresh) {
                    try {
                        var deferred = new $.Deferred();
                        $.get("http://192.168.1.101/dataservice/dataservice.svc/People")
                            .done(function (result) {
                                var mapped = $.map(result, function (data) {
                                    return {
                                        name: data.Name
                                    };
                                });
                                deferred.resolve(mapped);
                            });
                    }
                    catch (err) {
                        alert(err.message);
                    }
                    return deferred;
                }
            }
        })
    };
    return viewModel;
};
What else should I set?
The try-catch block would not help in this case, because data loading is async. Instead, subscribe to the fail callback:
$.get(url)
    .done(doneFunc)
    .fail(failFunc);
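A concrete version for the question's code, as a sketch (the handler arguments follow jQuery's fail signature; rejecting the deferred is an assumption about how the failure should be surfaced to the DevExpress data source):

$.get("http://192.168.1.101/dataservice/dataservice.svc/People")
    .done(function (result) {
        // ... map and resolve as before ...
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // surface the real reason the list is stuck on "Loading"
        console.log(textStatus, errorThrown);
        deferred.reject(errorThrown); // assumption: reject so the widget stops waiting
    });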
Another common problem with accessing a web service from JavaScript is the Same-Origin Policy. Your OData service has to support either CORS or JSONP. Refer to this discussion.