We use a custom script to retrieve data from the Bookeo API with UrlFetchApp.fetch.
Everything went well until today, when we got the following error: "Service invoked too many times for one day: urlfetch".
We are aware of the limit of 20,000 calls/day as mentioned here https://developers.google.com/apps-script/guides/services/quotas, but we don't think we come close to this (maybe 1,000-1,500/day max).
The portion of the code where the error happens is:
var responseBooking = UrlFetchApp.fetch(urlBooking);
So I'm sure it's related to a quota issue.
The weird thing is that it works about 1 time out of every 5-6 tries.
My questions are:
Has Google changed its quota limits? (I didn't see any communication about it.)
Is there a way to see how many calls were made for each service?
Is there a sort of chat for technical support for Google Apps Script?
Answer(s):
Has Google changed its quota limits? (I didn't see any communication about it.)
No.
Is there a way to see how many calls were made for each service?
No.
Is there a sort of chat for technical support for Google Apps Script?
No.
More Information:
Aside from the 20,000 calls/day limit, there are also limits which restrict the number of calls in short periods of time.
The quota works based on a rolling average of service invocations. You have a quota of 20,000 per day, but if you exceed the rate of ~0.231 calls per second (20,000/86,400) for a sustained period of time, you can still trigger an error.
You can rectify this by waiting for a while so that the burst of invocations dies down. I would also suggest adding some form of exponential backoff to your code to stop this from happening again in the future.
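For example, here is a minimal sketch of backing off around the failing call in Apps Script; the fetchWithBackoff helper, retry count, and wait times are illustrative, not part of the UrlFetchApp API.
// Minimal sketch: retry UrlFetchApp.fetch with exponential backoff.
// "urlBooking" is assumed to be the same URL variable used in the question.
function fetchWithBackoff(url) {
  var maxRetries = 5;
  for (var attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return UrlFetchApp.fetch(url);
    } catch (e) {
      // Wait 1s, 2s, 4s, 8s, ... plus a little random jitter, then retry.
      var waitMs = Math.pow(2, attempt) * 1000 + Math.floor(Math.random() * 500);
      Utilities.sleep(waitMs);
    }
  }
  throw new Error('Fetch failed after ' + maxRetries + ' attempts: ' + url);
}
var responseBooking = fetchWithBackoff(urlBooking);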
References:
Quotas for Google Services | Apps Script | Google Developers
Exponential backoff - Wikipedia
I am forced to use the Documents List API (and not the Drive SDK) because I need exact info about the ACLs, while the Drive SDK usually only provides the first/last name in the ACL info.
I am processing more than 1 million docs for which I need the ACLs. However, I discovered that when I try to perform more than 10 requests/second, I get "Request rate limit exceeded" errors from the Documents List API.
The answer to this question makes me think that the quota is supposed to be much bigger; can anyone confirm?
You might want to double-check in the Google APIs Console that you have set it to 10 requests per second; by default it is set lower than that. The maximum you can set it to is 10 requests per second, and there is nothing you can do to change that. It's probably there to prevent you from flooding the server.
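If you need to stay under that 10 requests/second cap while working through a large batch of documents, the sketch below spaces requests out on the client side. It is illustrative only: fetchAclForDoc is a hypothetical stand-in for whatever call you make against the Documents List API for one document.
// Space requests so that no more than ~10 go out per second.
async function processDocs(docIds, fetchAclForDoc) {
  var minIntervalMs = 100;  // 10 requests/second => at most one every 100 ms
  for (var i = 0; i < docIds.length; i++) {
    var started = Date.now();
    await fetchAclForDoc(docIds[i]);  // hypothetical single-document ACL lookup
    var elapsed = Date.now() - started;
    if (elapsed < minIntervalMs) {
      // Wait out the remainder of the 100 ms slot before the next request.
      await new Promise(function (resolve) {
        setTimeout(resolve, minIntervalMs - elapsed);
      });
    }
  }
}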
If you do encounter the error, you can implement exponential backoff and then try again after the allotted time has passed.
You could try batching as described here https://developers.google.com/google-apps/documents-list/#batching_resource_operations_into_a_single_request
My experience when using the Drive SDK equivalent wasn't great, i.e. I got "rate limit exceeded" at the same rate as with individual requests. However, since you're using the older API, you might have more luck.
What are the limits for client-side geocoding with the Google Maps JavaScript API v3?
My research:
Google Maps PHP API has a limit of 2,500 geocode requests per day (https://developers.google.com/maps/documentation/geocoding/#Limits).
Google Maps JavaScript API v3 has a limit of 25,000 map loads per day (https://developers.google.com/maps/documentation/javascript/usage).
Google suggests using the JavaScript API for geocoding to avoid the 2,500 limit of the PHP API. It states that "running client-side geocoding, you generally don't have to worry about your quota" (https://developers.google.com/maps/articles/geocodestrat#client).
However, nowhere in the documentation does it state what the geocoding limits are through the Google Maps JavaScript API v3.
(This has been bothering me for a while, and I have researched it on more than one occasion and failed to find a solid answer)
I work in Maps for Business support at Google. The following is my personal opinion, not Google's, but let's just say I'm rather familiar with this topic!
First, it's important to distinguish between client-side geocoding (JavaScript calls to google.maps.Geocoder) and server-side geocoding (HTTP requests to /maps/api/geocode). This question and answer are specifically about client-side geocoding; for server-side limits, see here. In particular, the oft-mentioned 2,500 requests per IP per day limit applies only to server-side geocoding, not client-side.
So the short answer is, there is no documented query limit specifically for client-side geocoding. Once a client has loaded the Google Maps JavaScript API library, it can make as many geocode requests as it likes, provided they're done at a sane speed. Two rules of thumb to ensure you don't run into problems:
Initiate geocoding in response to user interaction, and you'll be fine even if there are short request bursts (e.g. click a button to geocode several addresses).
Catch any OVER_QUERY_LIMIT errors and handle them gracefully (e.g. with exponential backoff). Here is some sample code in Python; a JavaScript sketch follows this list.
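A minimal JavaScript sketch of that second rule of thumb, assuming the classic callback-based google.maps.Geocoder; the helper name, retry count, and wait times are illustrative, not part of the Maps API.
// Retry geocoding with exponential backoff when OVER_QUERY_LIMIT is returned.
function geocodeWithBackoff(address, attempt, done) {
  attempt = attempt || 0;
  new google.maps.Geocoder().geocode({ address: address }, function (results, status) {
    if (status === google.maps.GeocoderStatus.OVER_QUERY_LIMIT && attempt < 5) {
      var waitMs = Math.pow(2, attempt) * 1000;  // 1s, 2s, 4s, ...
      setTimeout(function () {
        geocodeWithBackoff(address, attempt + 1, done);
      }, waitMs);
    } else {
      done(results, status);  // success, or a non-retriable status
    }
  });
}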
But please do not try to busy-loop the geocoder or hammer away at it for hours on end; there are protections against abuse.
UPDATE
As of 2017, the client-side quota is calculated against the corresponding web service quota. Please have a look at Christophe Roussy's answer.
Always read the latest information you can find on an official Google website for the relevant API version; a dated Stack Overflow answer is not an up-to-date reference!
https://developers.google.com/maps/documentation/geocoding/usage-limits
Here is what it said when I wrote this answer:
Google Maps Geocoding API Usage Limits
2,500 free requests per day, calculated as the sum of client-side and
server-side queries.
50 requests per second, calculated as the sum of client-side and
server-side queries.
Enable pay-as-you-go billing to unlock higher quotas:
$0.50 USD / 1000 additional requests, up to 100,000 daily.
In general I think Google is happy as long as the queries come from real human users as this is of some value to them.
There are also ways to save a few requests by caching results for up to 30 days:
https://developers.google.com/maps/premium/optimize-web-services#optimize
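A rough sketch of the caching idea in the browser (illustrative only: an in-memory object will not actually persist for 30 days, and the linked guide and the ToS govern what you may cache and for how long):
// Cache geocoding results per address so repeated lookups cost no quota.
var THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;
var geocodeCache = {};  // address -> { results: [...], expiresAt: timestamp }
function cachedGeocode(address, done) {
  var hit = geocodeCache[address];
  if (hit && hit.expiresAt > Date.now()) {
    done(hit.results, google.maps.GeocoderStatus.OK);  // served from cache
    return;
  }
  new google.maps.Geocoder().geocode({ address: address }, function (results, status) {
    if (status === google.maps.GeocoderStatus.OK) {
      geocodeCache[address] = { results: results, expiresAt: Date.now() + THIRTY_DAYS_MS };
    }
    done(results, status);
  });
}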
Based on https://developers.google.com/maps/articles/geocodestrat,
Client-side geocoding through the browser is rate limited per map
session, so the geocoding is distributed across all your users and
scales with your userbase.
So, what is the rate limit per map session?
As geocoding limits are per user session, there is no risk that your
application will reach a global limit as your userbase grows.
Client-side geocoding will not face a quota limit unless you perform a
batch of geocoding requests within a user session. Therefore, running
client-side geocoding, you generally don't have to worry about your
quota.
OK, so what is the limit? Here we go:
https://developers.google.com/maps/documentation/business/articles/usage_limits
With a rate limit of 10 QPS (queries per second), on sending the 11th
request your application should check the timestamp of the first
request and wait until 1 second has passed. The same should be applied
to daily limits.
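In code, that check amounts to remembering the timestamps of the last 10 requests and waiting until the oldest one is more than a second old. A rough sketch (the helper below is illustrative, not a Google API):
// Sliding-window check for a 10 QPS limit.
var sentTimestamps = [];  // ms timestamps of recent requests, oldest first
function delayBeforeNextRequestMs() {
  var now = Date.now();
  while (sentTimestamps.length && now - sentTimestamps[0] >= 1000) {
    sentTimestamps.shift();  // forget requests older than one second
  }
  if (sentTimestamps.length < 10) {
    return 0;  // fewer than 10 requests in the last second: send now
  }
  return 1000 - (now - sentTimestamps[0]);  // wait until the oldest ages out
}
// After actually sending a request, record it:
// sentTimestamps.push(Date.now());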
Still not clear? Me neither.
This is the closest answer I found in their documentation, and it says "rate limit of 10 requests per second":
https://developers.google.com/maps/documentation/business/webservices/quota
I don't think a single user session can send 10 requests per second, so I consider it effectively limitless.
I asked this recently about the Places library (or rather, researched it and came to a conclusion):
Regarding Places Library for Google Maps API Quota limits
Basically, it seems that map loads with a key are limited to 25,000 per day per domain (that might not be right, but I'm pretty sure that's the case).
Geocoding limits are limited in the following way:
2,500 if you're using an API key without setting up billing
100,000 if you're using an API key and have set up billing
If you're doing server-side geocoding (pre-caching results or similar), then your geocoding is per domain and you're going to run out of requests pretty quickly (i.e. similar to making Places requests with the REST API like so: https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=XXXX).
If you're using the JavaScript library (the autocomplete functionality, the PlacesService, or the Geocoder) like so:
var service = new google.maps.Geocoder();
then the limit (2,500 or 100,000 respectively) is per end user, so your limit scales with your user base.
According to the article you linked (under When to Use Server-Side Geocoding):
The 2,500 request limit is per IP address.
So, what exactly do you mean by "limit"? A limit for the hosting application, or for a client running it?
The statement above tells me that there is no limit for the "host"; without a key or enforceable referrer URLs, there is no (good) way for Google to determine from whence a geocoding request originates. That's kind of the beauty of REST, among other things.
It also tells me that even if they could enforce it, they haven't set a limit. This would likely be counter-productive to their business model.
In Salesforce I have created a future method that makes a Google Maps geocode callout. I know there is a limit of 2,500 requests per day but our instance has made no more than 100 requests today. Maybe the same number of requests yesterday. Yet the response code is 620 G_GEO_TOO_MANY_QUERIES.
Could it be that Google is seeing the IP address of the Salesforce instance and aggregating all of these requests as coming from one location, so that other companies sharing the address are causing my instance to hit this limit?
If not, can anyone suggest another cause?
This discussion suggests that it is the shared origin from Salesforce that is causing the problem.
This only makes sense, though, if you are doing the geocode lookup from the server and not from a client. If you did it from the client, it would use the client's IP and you would be dealing with that local client's lookup limits (see below).
If you are doing it on the server, you should also check whether what you are doing is actually allowed and not breaking Google's ToS. This discussion gives some background on that, and also a solution if you need to fix this on the server (buy a licence).
To be complete, G_GEO_TOO_MANY_QUERIES can mean one of two things:
You exceeded the daily limit (too many in a day)
You exceeded the rate limit (too many requests in too short a period)
Google has not specified an exact limit as far as I know, but they don't seem to like automated lookups. If you look at various libraries and plugins, you'll see that all of them force a delay between each request, and often they add a little randomness to the delay. You could experiment with this to see if it makes any difference; a rough sketch of the pattern is below.
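Regardless of platform, the pattern those libraries use looks roughly like the following (shown in JavaScript for illustration; geocodeOne is a hypothetical callback that performs a single lookup, and the delays are arbitrary):
// Put a small, slightly randomised pause between consecutive geocode requests.
async function geocodeAllSlowly(addresses, geocodeOne) {
  for (var i = 0; i < addresses.length; i++) {
    await geocodeOne(addresses[i]);  // hypothetical single-address lookup
    var pauseMs = 200 + Math.random() * 300;  // roughly 200-500 ms between calls
    await new Promise(function (resolve) {
      setTimeout(resolve, pauseMs);
    });
  }
}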
Did this end up working for you? We're hitting a server-side 620 and all configuration looks A-OK; we have a Premier license and upgraded to 250k requests per day.