Salesforce: Google maps query status 620 G_GEO_TOO_MANY_QUERIES - google-maps

In Salesforce I have created a future method that makes a Google Maps geocode callout. I know there is a limit of 2,500 requests per day, but our instance has made no more than 100 requests today, and probably a similar number yesterday. Yet the response code is 620 G_GEO_TOO_MANY_QUERIES.
Could it be that Google is seeing the IP address of the Salesforce instance and aggregating all of these requests as coming from one location, so that other companies sharing the address are causing my instance to hit this limit?
If not, can anyone suggest another cause?

This discussion suggests that it is the shared origin from Salesforce that is causing the problem.
That only makes sense, though, if you are doing the geocode lookup from the server and not from a client. If you do it from the client it will use the client's IP address, and you are dealing with the local client's lookup limits (see below).
If you are doing it on the server you should also check that what you are doing is actually allowed and does not break Google's Terms of Service. In this discussion you will get some background on that, and also a solution if you need to fix this on the server (buy a licence).
To be complete, G_GEO_TOO_MANY_QUERIES can mean one of two things:
You exceeded the daily limit (too many requests in a day).
You exceeded the speed limit (too many requests in too short a period).
Google has not specified an exact limit as far as I know, but they don't seem to enjoy automated lookups. If you look at various libraries and plugins you'll see that all of them force a delay between each request, and often they add a little randomness to the delay. You could experiment to see whether this makes any difference; a rough sketch of the idea follows below.
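As an illustration, here is a minimal Python sketch of that delay-with-jitter approach, assuming you call the Geocoding web service over HTTP with the requests library; the endpoint and the optional key parameter reflect the public Geocoding API and may need adjusting for your setup:

import random
import time

import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode_all(addresses, api_key=None, base_delay=0.5, jitter=0.5):
    """Geocode addresses sequentially, pausing between requests.

    A small randomized delay keeps the request rate low and avoids
    looking like a tight automated loop.
    """
    results = {}
    for address in addresses:
        params = {"address": address}
        if api_key:
            params["key"] = api_key
        data = requests.get(GEOCODE_URL, params=params).json()
        results[address] = data.get("results", [])
        # Sleep base_delay seconds plus up to `jitter` extra seconds.
        time.sleep(base_delay + random.uniform(0, jitter))
    return results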

Did this end up working for you? We're hitting a server-side 620 and all configuration looks A-OK... we have a Premier license and upgraded to 250k requests per day.

Related

Any Way to Detect Micro Delays using EWS?

I am encountering some very long response times from Exchange Online, called via the EWS Managed API 2.0 in C#. I suspect I am being throttled, but I cannot find anything that lets me prove this in the Admin portal for my O365 account. I have seen in some search results that using PowerShell you can see messages indicating "micro delays" have been applied, but I'm stuck in C#/EWS, so my question is: is there anything I can look at coming back in the responses to my EWS calls that can identify whether these micro delays have been applied? BTW, response times are very close to the 100-second timeout, which is killing my code.
Thx,
Paul
100 seconds isn't a micro delay; micro-delays are milliseconds (capped at 500 ms) and are aimed more at delaying a large volume of requests (e.g. if an app is going to make 100 sequential requests, a micro-delay spreads the load of those requests out over a greater time by penalizing the app more and more, which lowers the resource load on the server). One request taking 100 seconds to fulfill is probably more to do with the request itself, e.g. overuse of search filters or an over-complex search, which may also impact throttling; or, if you're using batching, each request within the batch could have a micro-delay applied.
EWS doesn't return metrics of throttle usage (the new REST API does give a little more information back in this regard). What you need is access to the EWS logs, which have that information. Each Exchange request the EWS Managed API makes has a client request ID to help correlate the request to its log entry; there is more detail in https://msdn.microsoft.com/en-us/library/office/dn720380(v=exchg.150).aspx

Documents List API authorizes less than 10 requests/second?

I am forced to use the Documents List API (and not the Drive SDK) because I need exact info about the ACLs, while the Drive SDK usually only provides the first/last name in the ACL info.
I am processing more than 1 million docs for which I need the ACLs. However, I discovered that when I try to perform more than 10 requests/second, I get "Request rate limit exceeded" errors from the Documents List API.
The answer to this question makes me think that the quota is supposed to be much bigger; can anyone confirm?
You might want to double-check in the Google APIs Console that you have set it to 10 requests per second; by default it's set lower than that. The maximum you can set it to is 10 requests per second, and there is nothing you can do to change that. It's probably there to prevent you from flooding the server.
If you do encounter the error you can implement exponential backoff and then try again after the allotted time has passed (a sketch follows below).
You could also try batching, as described here: https://developers.google.com/google-apps/documents-list/#batching_resource_operations_into_a_single_request
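For illustration, here is a minimal exponential-backoff sketch in Python, assuming a plain HTTP GET made with the requests library; the retryable status codes, retry count, and delays are placeholder choices to adapt to your own client:

import random
import time

import requests

def get_with_backoff(url, params=None, max_retries=5):
    """GET a URL, retrying with exponential backoff on rate-limit errors."""
    for attempt in range(max_retries):
        response = requests.get(url, params=params)
        # Treat 403 (rate limit exceeded) and 503 as retryable; anything
        # else is returned to the caller as-is.
        if response.status_code not in (403, 503):
            return response
        # Wait 2^attempt seconds plus a random fraction of a second.
        time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("Still rate limited after %d retries" % max_retries)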
My experience when using the Drive SDK equivalent wasn't great, i.e. I got "rate limit exceeded" at the same rate as with individual requests. However, since you're using the older API, you might have more luck.

What calls count toward the Google Drive Calls Per Second Rate Limit?

Also, is there a way to check what your current QPS limit is? We are currently experiencing a large number of errors with the following info:
<errors xmlns='http://schemas.google.com/g/2005'>
<error>
<domain>GData</domain>
<code>rateLimitExceeded</code>
<internalReason>Rate limit exceeded, lower query rate</internalReason>
<extendedHelp>Request rate limit exceeded.</extendedHelp>
</error>
</errors>
We have stats around all our Drive calls. As far as I can tell, we should be well under our QPS.
When counting only "Drive API" calls (calls made using the Drive API), we're at about half of what our QPS limit should be. If you add in calls made to the export links (which are in the format of the old Documents List API), we are still below what our limit should be.
I have tried reaching out to Google to confirm what our limit is, and whether or not we are actually going over our QPS (or if there's something else going on), but have not received any reply. Anyone have any thoughts?
Also, we are using exponential backoff and have implemented it as suggested in the Drive API docs.
In the API console you can see and increase the rate limit per second. Increasing it also means that a bad user could consume the entire quota. Making the API calls using a single user's auth is worse than distributing them across each user's own auth token, since the limit is per user.
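To illustrate that last point, here is a rough Python sketch of attaching each end user's own OAuth token to their requests instead of funnelling everything through one shared identity; the token store and the Drive files endpoint are illustrative placeholders, and how you obtain and refresh tokens depends on your OAuth setup:

import requests

FILES_URL = "https://www.googleapis.com/drive/v2/files"  # example endpoint

def list_files_per_user(user_tokens):
    """user_tokens maps each end user to their own OAuth access token,
    e.g. {"alice": "ya29...", "bob": "ya29..."}.

    Each request is counted against that user's per-user quota rather
    than all calls piling up against a single identity.
    """
    results = {}
    for user, token in user_tokens.items():
        response = requests.get(
            FILES_URL,
            headers={"Authorization": "Bearer %s" % token},
        )
        results[user] = response.json()
    return results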

Google Javascript API Geocoding Limits

What are the limits for client side geocoding with Google Maps JavaScript API v3?
My research:
Google Maps PHP API has a limit of 2500 geocode requests per day (https://developers.google.com/maps/documentation/geocoding/#Limits)
Google Maps Javascript API v3 has a limit of 25000 map loads per day (https://developers.google.com/maps/documentation/javascript/usage)
Google suggests using the JavaScript API for geocoding to avoid the 2,500 limit of the PHP API. It states that "running client-side geocoding, you generally don't have to worry about your quota" (https://developers.google.com/maps/articles/geocodestrat#client)
However, nowhere does it state in any of the documentation what the geocoding limits are through the Google Maps JavaScript API v.3.
(This has been bothering me for a while, and I have researched it on more than one occasion and failed to find a solid answer)
I work in Maps for Business support at Google. The following is my personal opinion, not Google's, but let's just say I'm rather familiar with this topic!
First, it's important to distinguish between client-side geocoding (JavaScript calls to google.maps.Geocoder) and server-side geocoding (HTTP requests to /maps/api/geocode). This question and answer are specifically about client-side geocoding; for server-side limits, see here. In particular, the oft-mentioned 2,500 requests per IP per day limit applies only to server-side geocoding, not client-side.
So the short answer is, there is no documented query limit specifically for client-side geocoding. Once a client has loaded the Google Maps JavaScript API library, it can make as many geocode requests as it likes, provided they're done at a sane speed. Two rules of thumb to ensure you don't run into problems:
Initiate geocoding in response to user interaction, and you'll be fine even if there are short request bursts (e.g. click a button to geocode several addresses).
Catch any OVER_QUERY_LIMIT errors and handle them gracefully, e.g. with exponential backoff; a sketch in Python follows after these rules of thumb.
But please do not try to busy-loop the geocoder or hammer away at it for hours on end; there are protections against abuse.
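A minimal sketch of that kind of handling, assuming server-side calls to the Geocoding web service with the requests library; the retry count and delays are arbitrary choices, not documented values:

import time

import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, max_retries=5):
    """Geocode one address, backing off when OVER_QUERY_LIMIT is returned.

    Note that OVER_QUERY_LIMIT arrives in the JSON "status" field of an
    HTTP 200 response, so checking the HTTP status code alone is not enough.
    """
    delay = 1.0
    for _ in range(max_retries):
        data = requests.get(GEOCODE_URL, params={"address": address}).json()
        status = data.get("status")
        if status == "OK":
            return data["results"][0]["geometry"]["location"]
        if status != "OVER_QUERY_LIMIT":
            raise RuntimeError("Geocoding failed: %s" % status)
        time.sleep(delay)
        delay *= 2  # exponential backoff
    raise RuntimeError("Still over the query limit after retries")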
UPDATE
As of 2017, the client-side quota is calculated against the corresponding web service quota. Please have a look at Christophe Roussy's answer.
Always read the latest information you can find on an official Google website for the proper API version; a dated Stack Overflow answer is not an up-to-date reference!
https://developers.google.com/maps/documentation/geocoding/usage-limits
Here is what it said when I wrote this answer:
Google Maps Geocoding API Usage Limits
2,500 free requests per day, calculated as the sum of client-side and
server-side queries.
50 requests per second, calculated as the sum of client-side and
server-side queries.
Enable pay-as-you-go billing to unlock higher quotas:
$0.50 USD / 1000 additional requests, up to 100,000 daily.
In general I think Google is happy as long as the queries come from real human users as this is of some value to them.
There are also ways to save a few requests by caching results for up to 30 days:
https://developers.google.com/maps/premium/optimize-web-services#optimize
Based on https://developers.google.com/maps/articles/geocodestrat,
Client-side geocoding through the browser is rate limited per map
session, so the geocoding is distributed across all your users and
scales with your userbase.
So, what are the rate limits per map session?
As geocoding limits are per user session, there is no risk that your
application will reach a global limit as your userbase grows.
Client-side geocoding will not face a quota limit unless you perform a
batch of geocoding requests within a user session. Therefore, running
client-side geocoding, you generally don't have to worry about your
quota.
OK, so what is the limit? Here we go:
https://developers.google.com/maps/documentation/business/articles/usage_limits
With a rate limit of 10 QPS (queries per second), on sending the 11th
request your application should check the timestamp of the first
request and wait until 1 second has passed. The same should be applied
to daily limits.
Still not clear? Me too.
This is the closest answer that I found in their documentation, and it says "rate limit of 10 requests per second":
https://developers.google.com/maps/documentation/business/webservices/quota
I don't think a user session can send 10 requests per second, so I consider it effectively limitless.
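For what it's worth, the waiting rule quoted above (on the 11th request, check the timestamp of the first request and wait until a second has passed) can be sketched as a small sliding-window limiter; this is just an illustration of the described rule in Python, not code from Google's documentation:

import collections
import time

class RateLimiter:
    """Allow at most max_per_second requests in any one-second window."""

    def __init__(self, max_per_second=10):
        self.max_per_second = max_per_second
        self.timestamps = collections.deque()

    def wait(self):
        """Block just long enough to stay under the limit, then record the request."""
        now = time.time()
        # Drop timestamps that fell out of the one-second window.
        while self.timestamps and now - self.timestamps[0] >= 1.0:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_per_second:
            # Wait until a full second has passed since the first request
            # in the current window, as the documentation describes.
            time.sleep(1.0 - (now - self.timestamps[0]))
            self.timestamps.popleft()
        self.timestamps.append(time.time())

Calling limiter.wait() before each geocode request keeps you at or below 10 queries per second.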
I asked this recently about the Places library (or rather, researched it and came to a conclusion):
Regarding Places Library for Google Maps API Quota limits
Basically, it seems that map loads with a key are limited to 25,000 per day per domain (that might not be right, but I'm pretty sure that's the case).
Geocoding limits are limited in the following way:
2,500 if you're using an API key without setting up billing
100,000 if you're using an API key and have set up billing
If you're doing server-side geocoding (pre-caching results or similar), then your geocoding is per domain, and you're going to run out of requests pretty quickly (i.e. in a similar vein to doing Places requests using the REST API, like so: https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=XXXX)
If you're using the JavaScript library, using the autocomplete functionality or PlacesService like so:
service = new google.maps.Geocoder();
then the limit (2,500 or 100,000 respectively) is per end user, so your limit scales with your user base.
According to the article you linked (under When to Use Server-Side Geocoding):
The 2,500 request limit is per IP address.
So - what exactly do you mean by limit? Limit for the hosting application, or for a client running it?
The statement above tells me that there is no limit for the "host"; without a key or enforceable referrer URLs, there is no (good) way for Google to determine from whence a geocoding request originates. That's kind of the beauty of REST, among other things.
It also tells me that even if they could enforce it, they haven't set a limit. This would likely be counter-productive to their business model.

Bypass Google Geocoding API IP Rate Limits

We are a division of a large corporation and are running into a MAJOR problem with the Google maps and geocoding API. The problem is that all 70 divisions of our corporation are behind the same IP address. So a customer service agent in India would appear to be coming from the same IP address as a developer in Cleveland, OH who would appear to be coming from the same IP address as a division president in western Europe. With over 130,000 employees, we routinely get blocked for exceeding the IP rate limit.
Aside from individuals just browsing to websites that happen to use Google Maps with client-side geocoding requests, any division that attempts to do batch geocoding or provides its own application for showing maps contributes to the same limit! We actually use our own Where To Buy service internally, which is publicly available (http://www.ridgid.com/Tools/Where-To-Buy/), to give customers information about local distributors.
While we are not running into an API limit (at least not yet, and when we do we can always buy the enterprise license), we are running into the IP rate limit and currently have no workaround. We are already following best practices in terms of caching geocoding results and reducing wasteful calls. Thankfully this isn't a problem for our customers, as they are not on the same IP as users within our corporation.
The question is whether there is ANY way we can get an exception from Google to our specific IP address to improve the IP-based limitations given our setup? This is really a question for Google. Thanks for your help!
A division of a large corporation with tens of thousands of employees should have an enterprise licence.
I don't think there's a limit associated with IP if you are using the client-side geocoding as described here:
https://developers.google.com/maps/articles/geocodestrat
Server-side geocoding through the Geocoding Web Service has a quota
of 2,500 requests per IP per day, so all requests in one day count
against the quota. In addition, the Web Service is rate-limited, so
that requests that come in too quickly result in blocking.
Client-side geocoding through the browser is rate limited per map session, so the geocoding is distributed across all your users and scales with your userbase.
A "map session" means every time the Javascript Map API is loaded. However there might be a limit on that as well, whether or not it's associated with IP / domain / sub-domain / URL still needs further clarification.
And the "rate limited" on client side doesn't mean there is a daily quota or something, I had tested this method, on a single day, it can geocode more than 2500 addresses. You will have to think and test how the "rate limit" works, my observation is it let you burst to around 10 addresses in no time, then limit you to a rate of around 1 geocode per second for the next 150 request, further request from the same "session" would take exponentially longer time.
One possibility is to use a batch of reverse geocodes instead of a single request. Google allows up to 24 waypoints in a single directions request, so instead of sending a single reverse geocode you can wait until you have 24 reverse geocodes accumulated in a batch (a rough sketch of the idea follows below). Or you can use Yahoo Maps, as in this answer: Issue in displaying static Google maps.
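Here is a rough Python sketch of that batching idea, assuming the Directions web service accepts lat/lng waypoints and that the resolved addresses can be read off the route legs; the waypoint handling, parameters, and parsing follow the answer above and the public Directions API, and this is not a verified drop-in replacement for the Geocoding API:

import requests

DIRECTIONS_URL = "https://maps.googleapis.com/maps/api/directions/json"

def reverse_geocode_batch(coords):
    """Resolve a batch of (lat, lng) pairs in one Directions request.

    The first and last coordinates become origin and destination and the
    rest become waypoints; each leg's start_address is the resolved
    address of the corresponding input point.
    """
    def to_str(c):
        return "%f,%f" % c

    params = {
        "origin": to_str(coords[0]),
        "destination": to_str(coords[-1]),
        "waypoints": "|".join(to_str(c) for c in coords[1:-1]),
        "sensor": "false",
    }
    data = requests.get(DIRECTIONS_URL, params=params).json()
    legs = data["routes"][0]["legs"]
    addresses = [leg["start_address"] for leg in legs]
    addresses.append(legs[-1]["end_address"])
    return addresses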
I'm wondering if there is a lot of overlap in the requests, like everyone keeps looking up the same 50 locations? You could write a caching proxy that checks an internal database first and only asks Google when the result isn't found. This could be done at the software level, by changing the geocode call to an internal geocode call, or maybe at the network level, by catching all the requests to the Google geocode servers and redirecting them. I'm not 100% sure how hard this would be to implement or whether it would help; maybe the requests are all unique.
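At the software level, the idea could look something like this minimal Python sketch, where an in-memory dict stands in for whatever internal database you would actually share across divisions and geocode_remote stands in for the existing Google call:

import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

_cache = {}  # stand-in for an internal, shared geocode database

def geocode_remote(address):
    """The existing call to Google; only reached on a cache miss."""
    data = requests.get(GEOCODE_URL, params={"address": address}).json()
    results = data.get("results")
    return results[0]["geometry"]["location"] if results else None

def geocode_cached(address):
    """Serve repeated lookups from the cache instead of hitting Google."""
    key = address.strip().lower()
    if key not in _cache:
        _cache[key] = geocode_remote(address)
    return _cache[key]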
If you are doing forward geocoding only, I suggest trying out GeocodeFarm. I just got an email from them the other day saying they would start releasing "reverse" geocoding soon too and that they are still working out the kinks. Maybe it would help you, since they have no request rate limits in place and the only limit is your daily 2,500 limit, same as Google... IDK. Just a suggestion.
The link: http://www.geocodefarm.com
The limit in the standard license is 50 to 100 QPS (queries per second), depending upon the service (e.g. the Elevation service has a 50 QPS limit, Geocoding has a 100 QPS limit).
Even though you may have a higher quota, you cannot perform more than 50 requests per second, otherwise the server will block your IP for about 2 hours (based on my experience). The solution is to cache the requests and perform them on a scheduled basis rather than as they occur.
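A scheduled approach could look something like the following Python sketch: requests are collected in a queue as they occur and drained by a periodic job that stays under a conservative per-second cap; the cap, the queue, and the drain interval are placeholders, not documented values:

import queue
import time

import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

pending = queue.Queue()   # addresses collected as they occur
results = {}              # filled in later by the scheduled job

def drain_pending(max_per_second=40):
    """Run on a schedule (e.g. from cron) and geocode the queued addresses,
    never sending more than max_per_second requests in any one second."""
    while not pending.empty():
        started = time.time()
        for _ in range(max_per_second):
            if pending.empty():
                break
            address = pending.get()
            data = requests.get(GEOCODE_URL, params={"address": address}).json()
            results[address] = data.get("results", [])
        # Sleep out the remainder of the one-second window, if any.
        elapsed = time.time() - started
        if elapsed < 1.0:
            time.sleep(1.0 - elapsed)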