Google references this:
https://developers.google.com/maps/documentation/javascript/geocoding
And states:
The per-session rate limit prevents the use of client-side services for batch requests, such as batch geocoding. For batch requests, use the Geocoding API web service.
However, when I go to the Geocoding API web service page, I see no reference to batch processing. The sentence above implies that batch processing is possible. I need to send a large number of addresses to get latitude and longitude, but making individual calls for each address takes an extremely long time, so I need a more efficient method, ideally a single batch call that sends all the addresses.
Any ideas on how to batch process addresses on Google to get latitude and longitude?
I have seen this: Google Batch Geocoding API
However, it states that you cannot, which is not what the Google statement above implies.
Per the Geocoding API web service documentation:
Other Usage Limits
While you are no longer limited to a maximum number of requests per day (QPD), the following usage limits are still in place for the Geocoding API:
50 requests per second (QPS), calculated as the sum of client-side and server-side queries.
So you are only limited to 50 requests per second, and you have to pay for them (after you use up the $200 credit).
The best way I found to solve this problem is to use the Directions API with up to 27 destinations (origin, destination and 25 waypoints) and read the geolocations from the legs of the response. From what I observed, the position accuracy is slightly lower than in the geocoding case, but it is still a great tradeoff.
In the worst case, you will have to call the Directions API twice, when one or more addresses are not found in your call. The good thing in this case is that the Directions API response includes geocoded_waypoints, which flags the NOT_FOUND locations via geocoder_status. After that, you can eliminate the bad ones and call again.
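A minimal sketch of this approach against the Directions API web service, assuming Node 18+ for the global fetch and a YOUR_API_KEY placeholder; geocodeViaDirections and its return shape are illustrative names, not part of any Google library:

// Hedged sketch: geocode up to 27 addresses with one Directions request.
async function geocodeViaDirections(addresses) {
  const origin = addresses[0];
  const destination = addresses[addresses.length - 1];
  const waypoints = addresses.slice(1, -1).join('|');
  const url = 'https://maps.googleapis.com/maps/api/directions/json' +
    '?origin=' + encodeURIComponent(origin) +
    '&destination=' + encodeURIComponent(destination) +
    '&waypoints=' + encodeURIComponent(waypoints) +
    '&key=YOUR_API_KEY'; // placeholder

  const data = await (await fetch(url)).json();

  // geocoded_waypoints is in request order; entries whose geocoder_status is
  // not "OK" (e.g. "NOT_FOUND") are the addresses to drop before calling again.
  const bad = data.geocoded_waypoints
    .map((wp, i) => (wp.geocoder_status !== 'OK' ? addresses[i] : null))
    .filter(Boolean);
  if (bad.length > 0) return { bad };

  // Each leg's start_location is the coordinate of the corresponding address;
  // the final leg's end_location covers the last address.
  const legs = data.routes[0].legs;
  const coords = legs.map(leg => leg.start_location);
  coords.push(legs[legs.length - 1].end_location);
  return { coords }; // [{ lat, lng }, ...] in the same order as addresses
}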
There is currently no feature in the Geocoding API that handles multiple addresses in a single call; however, you can implement the batch processing yourself, for example via cURL or another HTTP client, so that multiple requests are issued at once automatically. The implementation will depend on your use case.
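If you go that route, here is a hedged sketch of issuing the individual Geocoding API web service requests concurrently in chunks (plain JavaScript fetch instead of cURL; geocodeBatch, the chunk size and YOUR_API_KEY are placeholders to adapt to your use case):

// Hedged sketch: geocode many addresses by firing requests chunk by chunk.
async function geocodeBatch(addresses, chunkSize = 25) {
  const results = [];
  for (let i = 0; i < addresses.length; i += chunkSize) {
    const chunk = addresses.slice(i, i + chunkSize);
    // Fire one chunk of requests in parallel, then move on to the next chunk
    // to stay comfortably under the 50 QPS limit mentioned above.
    const chunkResults = await Promise.all(chunk.map(async address => {
      const url = 'https://maps.googleapis.com/maps/api/geocode/json' +
        '?address=' + encodeURIComponent(address) + '&key=YOUR_API_KEY';
      const data = await (await fetch(url)).json();
      return data.status === 'OK'
        ? { address, location: data.results[0].geometry.location } // { lat, lng }
        : { address, error: data.status };                         // e.g. ZERO_RESULTS
    }));
    results.push(...chunkResults);
  }
  return results;
}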
Related
I'm trying to see if there's a way, in a single API call, to find the ideal route between X destinations when the order doesn't matter.
For example, the program has 3 destinations: Jeff's house, Amy's house, and Valerie's house. We don't really care about the order we go in, but we'd like to visit each house with the least amount of driving.
Right now, I have it set up so that we try every ordering of destinations and settle on the one with the fastest time, but making that many API calls seems inefficient, and I can't see a way in the API to do what I want. Is what I want currently possible in the Google Maps API?
You can use waypoints in the Directions API web service, which returns a route that includes pass-throughs or stopovers at intermediate locations.
By default, the Directions service calculates a route through the provided waypoints in their given order. Optionally, you may pass optimize:true as the first argument within the waypoints parameter to allow the Directions service to optimize the provided route by rearranging the waypoints in a more efficient order.
Sample request:
https://maps.googleapis.com/maps/api/directions/json?
origin=Adelaide,SA&destination=Adelaide,SA
&waypoints=optimize:true|Barossa+Valley,SA|Clare,SA|Connawarra,SA|McLaren+Vale,SA
&key=YOUR_API_KEY
Note that requests using waypoint optimization are billed at a higher rate.
If you will be using the client-side Maps JavaScript API Directions Service instead, refer to this documentation and example.
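For the client-side case, a minimal sketch using the DirectionsService of the Maps JavaScript API (map and renderer setup omitted; the place names follow the sample request above):

const directionsService = new google.maps.DirectionsService();

directionsService.route({
  origin: 'Adelaide, SA',
  destination: 'Adelaide, SA',
  waypoints: [
    { location: 'Barossa Valley, SA' },
    { location: 'Clare, SA' },
    { location: 'Coonawarra, SA' },
    { location: 'McLaren Vale, SA' }
  ],
  optimizeWaypoints: true, // client-side equivalent of optimize:true in the web service
  travelMode: google.maps.TravelMode.DRIVING
}, (response, status) => {
  if (status === 'OK') {
    // waypoint_order is the optimized visiting order of the waypoints array above
    console.log(response.routes[0].waypoint_order);
  }
});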
Hope this helps!
I'm using the Google Maps Distance Matrix JavaScript API to query the travel time from one point to about 50 destinations.
In order to do this, I need to break my query up into multiple chunks, since the Google Maps API allows only a maximum of 25 destinations per request.
The problem that I'm having is the synchronization of the requests and the results.
Other APIs offer the option to include a key that lets the developer match a request to an asynchronous result.
However, since such a parameter is missing from the request specification, I have no idea how to make sure that I'm matching the right data to my incoming results.
I would be glad for a "clean" workaround. Currently the only idea I have is to use a unique number of destinations for each request, for instance the first with 25, the next with 24 destinations and so on, but I would not consider this a satisfactory approach.
Thank you
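For illustration, a minimal sketch of one common workaround: capture each chunk's offset in a closure when issuing the request, so every asynchronous result can be mapped back to the original destination indices (getTravelTimes and chunkSize are illustrative names; error handling is kept minimal):

function getTravelTimes(origin, destinations, chunkSize = 25) {
  const service = new google.maps.DistanceMatrixService();
  const chunks = [];
  for (let i = 0; i < destinations.length; i += chunkSize) {
    chunks.push(destinations.slice(i, i + chunkSize));
  }
  return Promise.all(chunks.map((chunk, chunkIndex) =>
    new Promise((resolve, reject) => {
      service.getDistanceMatrix({
        origins: [origin],
        destinations: chunk,
        travelMode: google.maps.TravelMode.DRIVING
      }, (response, status) => {
        if (status !== 'OK') return reject(status);
        // chunkIndex is captured in this closure, so each asynchronous result
        // can be mapped back to the index of the original destination.
        resolve(response.rows[0].elements.map((element, i) => ({
          destinationIndex: chunkIndex * chunkSize + i,
          duration: element.status === 'OK' ? element.duration : null
        })));
      });
    })
  )).then(chunked => chunked.flat());
}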
Quick question, just to clarify the wording and meaning (because it's changed a couple of times for Map loading...)
There are the following two statements in the Places API FAQs:
The Google Places API has the following query limits:
Users with an API key are allowed 1,000 requests per 24 hour period.
Users who have verified their identity through the APIs console are allowed 100,000 requests per 24 hour period. A credit card is required for verification, by enabling billing in the console. We ask for your credit card purely to validate your identity. Your card will not be charged for use of the Places API. While the lower limit is sufficient for development and testing, we recommend enabling the higher limit before launching your application. It is possible to request an additional quota. If granted, the additional quota is free of charge. If, at some stage in the future, an option becomes available to pay for an additional quota, that quota will be over and above the existing free quota, and you will need to sign up for it explicitly.
Note that some services may have a multiplier:
The Text Search service is subject to a 10-times multiplier. That is, each Text Search request that you make will count as 10 requests against your quota. The Radar Search service is subject to a 5-times multiplier. That is, each Radar Search request that you make will count as 5 requests against your quota. If you've purchased the Google Places API as part of your Maps API for Business contract, the multiplier may be different. Please refer to the Google Maps API for Business documentation for details.
This implies that use of the Google Places API is restricted to 100,000 queries per day, or 10,000 if you're doing a Text Search.
However, on the Uplift page, it says the following:
If you are developing a web based application that only needs to search for places, and is not submitting new places or Place Bumps, you should use the Places library of the Maps API rather than using the Places API web service. The Places library assigns a quota to each end user rather than to each key. This means that your available quota increases with your user base rather than being capped at a fixed amount.
I am using the Places API in the following way:
https://maps.googleapis.com/maps/api/js?key=XXX&libraries=places
...
service = new google.maps.places.PlacesService(map);
service.textSearch(request, callback);
And also for some details searching and photo searching.
Therefore my question is: given my usage of the Places API, am I subject to the 100,000-queries-per-day limit on my app, or am I essentially uncapped at the app level because my quota is per end user? (i.e. per unique IP? If I had 10,000 users, would I have an effective quota for my entire user base of 100,000 * 10,000?)
EDIT:
For clarity: if I throw my API key into https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=XXXX, it will increment my Places quota count in the Google API console; however, if I run queries through my JS app, I see no change in quota use. I want to make sure that I'm not suddenly going to be hit with massive quota usage.
Use of the JavaScript API services, like those provided by the Places library, has quota assigned to each end user. It's a very similar setup to objects like google.maps.Geocoder, which has been discussed in a bit more detail in this Geocoding Strategies article.
You may apply for an uplift to the quota so that the restriction is taken away and you have unlimited access:
https://docs.google.com/forms/d/18pkOdu0uofeI8tbQoReDVfkbOIAscLvjiKc9ZP06hEM/viewform
This form is applicable to Android, iOS and Web.
We have a page that accepts a city, state combination or a zip code. Upon submit, it passes that information to the Geocode Webservice for Google Maps and returns a list of store location results based on the information passed. We sporadically have issues with bots hitting that page multiple times and then Google shuts down our usage of the geocode webservice for the day. Is there a way to ask Google to restore this more quickly? How should we handle this?
Use the client-side geocoder; the limits there will be based on the client, not the server.
Geocoding Strategies
As specified in the documentation, "Use of the Google Geocoding API is subject to a query limit of 2,500 requests per day (Users of the Google Maps API for Business may perform up to 100,000 requests per day.)"
There's a nice article on Geocoding Strategies which discusses caching and other options to optimize geocoding. The "Quota Considerations" section should be useful.
Remember to use client-side geocoding if at all possible, so that the quota limit will be counted per client, instead of per service/server. If you're still hitting the limits, you might need to go for the Business version and pay for a larger limit.
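A minimal client-side sketch with google.maps.Geocoder, assuming the Maps JavaScript API is already loaded (userInput stands in for the submitted city/state or zip code, and showStoresNear is a hypothetical handler for the store lookup):

const geocoder = new google.maps.Geocoder();

geocoder.geocode({ address: userInput }, (results, status) => {
  if (status === 'OK') {
    const location = results[0].geometry.location;
    showStoresNear(location.lat(), location.lng()); // hypothetical store-lookup handler
  } else {
    // e.g. OVER_QUERY_LIMIT or ZERO_RESULTS; status is a GeocoderStatus string
    console.warn('Geocode failed: ' + status);
  }
});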
The following are the usage limits as specified by Google on their Developer Guide pages:
Google Maps JavaScript API v3 => For-profit web sites are permitted to generate up to 25,000 map loads per day
Google Geocoding API => subject to a query limit of 2,500 geolocation requests per day
Google Maps API for Business => may perform up to 100,000 requests per day
I am trying to evaluate using any one of the above for use with Visualforce on the Salesforce.com (SFDC) platform. [*]
I understand that for a public website the requests are counted per IP. Now for SFDC, there could be many different organizations on a particular server (say, NA1). So, two different companies using SFDC and the Google Maps API could have a URL at https://na1.salesforce.com/something_here, and their requests should be counted separately.
Will it be so? What will happen in case of each API?
[*] SFDC is a SaaS cloud for the purpose of our discussion. All users log in through the same page, but they could be logged into different "orgs"/"organizations" while their URLs look similar.
It's important to differentiate between the server-side and client-side limits here. The server-side Geocoding API would have the 2,500 limit enforced across the shared Salesforce instance, based on how many machines the requests come from (I assume NA1 isn't one huge server). Multiple organizations using the free Geocoding API would all share the same server-side geocoding limit. I've actually run into the same limits using Google's own App Engine platform, where a bunch of applications share the same outbound IP address.
For any sort of guaranteed performance you'll need to send the queries from your own server or go the Maps for Business route which lets you authenticate your queries to get those higher limits.
Client-side geocoding via the JavaScript API doesn't have these server-side limits, so if users perform any sort of action that triggers a geocode or two, using the JS API is the best route.
You can already create your own "bucket" to track your 25K map loads per day by signing up for an API Key.
This question on SO addresses the Geocoding API specifically being run from a Visualforce page directly: Salesforce: Google maps query status 620 G_GEO_TOO_MANY_QUERIES. It does seem to mean that without a key the limits are shared. I would suspect that unless you plan on giving away the app that you are working on, you will pretty much be forced to pick up an upgraded API key. One thing you may want to look at to work around this is hosting the maps portion in another location and iframing it into Salesforce.