Google Apps Script limits on the number of times doGet can be called

The Google Apps Script limitations link shows that we can make 20,000 URL Fetch calls per day. That looks quite ambiguous to me. Inside a script, you can use UrlFetchApp to make GET/POST requests to external URLs. But what if we are calling a deployed Apps Script from an external, non-script client (e.g. a web browser or mobile device)?
Does that imply that we can only call the script (at a URL like abc/exec) 20,000 times a day from outside of Apps Script, where 20,000 is the total across all client devices?

I don't see the relationship between fetching a URL from within a script and running a web app from a browser. I have never seen any mention of a limit on how many times a web app can be called, but there are probably limits on the total processing time a script can use. The quota dashboard specifies the maximum processing time used by triggers; it does not, however, specify a limit on processing time driven by a human user.
If Google does not specify it, that means either they don't care or they don't want us to know... In both cases the result is the same: we have no way to get the info.
That said, I have never encountered any issue with an app being called too often, even though I know that some of them are heavily used at times.
Was your question purely rhetorical, or did you run into a real situation?
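For what it's worth, a minimal sketch of the two things being conflated (the URL and function bodies below are placeholders, not anything from your project):

```javascript
// Runs inside the script: each call here counts against the daily
// "URL Fetch calls" quota (the 20,000/day figure quoted above).
function checkExternalSite() {
  var response = UrlFetchApp.fetch('https://example.com/status'); // placeholder URL
  Logger.log(response.getResponseCode());
}

// Runs when an external client (browser, mobile app, curl...) hits the
// deployed .../exec URL. That is an incoming request served by the web app,
// not a UrlFetchApp call made by your script, so it is not what that
// quota is counting.
function doGet(e) {
  return ContentService.createTextOutput('ok');
}
```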

Related

Assuming I have edit access to a Google Sheet, how can I write to it programmatically?

A friend has given me edit access to a Google Sheet he owns. I want to write to it via web server code hosted elsewhere (at Wix, actually), where I've built a page that needs to update the spreadsheet from time to time, perhaps a dozen updates per day.
I have found several technologies that might solve the problem. Google Apps Script is one, the Sheets API another. The latter of these is (as far as I can tell) really three distinct options depending on authentication scheme: API key, service account, or OAuth2.
The question is this: Given my specific situation, which of these four approaches (or others I haven't found) is feasible and most appropriate? I'm not asking for opinions; I just don't want to go down one path only to learn later that it's an unworkable dead end (as preliminary research suggests that API key might be) or absurd overkill (as preliminary research suggests that OAuth2 with a Google-approved app might be). Note in particular that I have edit access to the spreadsheet in question and can give that access to others if necessary. If the choice depends on factors I haven't mentioned, what are those factors?
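If the Apps Script web app route turns out to be the fit, here is a minimal sketch of what the receiving end might look like; the sheet ID, tab name and payload fields are placeholders, not anything from your setup:

```javascript
// Apps Script deployed as a web app ("execute as me", accessible to anyone
// with the link). SHEET_ID and the 'Log' tab name are placeholders.
var SHEET_ID = 'YOUR_SHEET_ID';

function doPost(e) {
  var payload = JSON.parse(e.postData.contents); // e.g. {"name": "...", "value": 42}
  var sheet = SpreadsheetApp.openById(SHEET_ID).getSheetByName('Log');
  sheet.appendRow([new Date(), payload.name, payload.value]);
  return ContentService.createTextOutput(JSON.stringify({ok: true}))
      .setMimeType(ContentService.MimeType.JSON);
}
```

The Wix side would then POST JSON to the deployed .../exec URL; at roughly a dozen updates a day that is far below the quotas discussed elsewhere on this page.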

Suddenly: Service using too much computer time for one day

I have been putting out fires all day. Can't seem to make heads or tails of this error...
Today, for the first time after months of using the same script, I started getting failures. The script is triggered when a new record is added to a Google Sheet.
It seems to work on and off, but every few minutes I get a failure notice indicating "Service using too much computer time for one day".
Looking through the documentation and the posts on Stack Overflow, it is clear I am not the first to deal with this issue, but there does not seem to be a concise answer on how to resolve it. I looked for some way to reach Google assistance but am always directed back to Stack Overflow to submit my issue for consideration. I understand this could be an issue with my script, but I can't seem to find what might be causing it. Also confusing the matter is that the script does seem to fire 90% of the time.
My questions:
How do I check the "computer time" quota?
Should I turn off that script/trigger until 24 hours have passed?
Does anyone know how to get a hold of Google support directly?
I don't know of any way that the total script run time can be seen in a dashboard.
You can see duration times of individual script executions at:
https://script.google.com/home/executions
You could scroll through your executions to look for long durations. That might indicate an endless loop in your code.
To calculate the total run time of all your running scripts, you'd need to use the Apps Script API.
https://developers.google.com/apps-script/api/how-tos/view-processes
https://developers.google.com/apps-script/api/concepts/processes
I don't have any code to list and compile all the durations.
If anyone does, that would be very interesting.
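In case someone wants to try, here is a rough, untested sketch of what that compilation could look like, calling the Apps Script API with UrlFetchApp. It assumes the Apps Script API is enabled for the account, the https://www.googleapis.com/auth/script.processes scope is granted in the manifest, and that each Process has a "duration" string like "12.708s", as described on the pages above:

```javascript
// Rough, untested sketch: sum the durations of the current user's processes
// via the Apps Script API. Endpoint and field names follow the documentation
// linked above; adjust if your account's API setup differs.
function totalProcessTime() {
  var total = 0;
  var pageToken = '';
  do {
    var url = 'https://script.googleapis.com/v1/processes?pageSize=100' +
        (pageToken ? '&pageToken=' + encodeURIComponent(pageToken) : '');
    var response = UrlFetchApp.fetch(url, {
      headers: {Authorization: 'Bearer ' + ScriptApp.getOAuthToken()}
    });
    var data = JSON.parse(response.getContentText());
    (data.processes || []).forEach(function(p) {
      total += parseFloat(p.duration); // "12.708s" -> 12.708
    });
    pageToken = data.nextPageToken;
  } while (pageToken);
  Logger.log('Total execution time: ' + total + ' seconds');
}
```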
I don't know if deleting the trigger until the next day would gain you anything. I'm guessing that it shouldn't.
Google does not provide on-demand support people to answer questions about Apps Script; even G Suite customers don't get that. You can report bugs and request features through the Google Issue Tracker, but that won't get you direct contact with a Google support person. Even if you purchase a support plan, Google doesn't designate anyone to support Apps Script: they might try to help with an Apps Script question, but officially they aren't qualified or obligated to, and the first thing they'll do is search Stack Overflow and send you links to SO posts.
So, it's extremely unlikely that you're going to be able to talk with someone directly at Google.
The best thing to do is to review your code for performance issues. Avoid reading and writing data often. The ideal situation is to read all the data you need just once, process it, and write it back once. Cache data if you can. Avoid lots of calls to the Properties Service. Find the part of your code that takes the longest and try to improve it.
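To illustrate the read-once/write-once and caching points (the sheet name, the column being processed and the "expensive" work are placeholders):

```javascript
// Read once, process in memory, write once, instead of per-cell calls.
function batchedUpdate() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data'); // placeholder name
  var range = sheet.getDataRange();
  var values = range.getValues();                    // single read
  for (var i = 0; i < values.length; i++) {
    values[i][0] = String(values[i][0]).trim();      // placeholder processing
  }
  range.setValues(values);                           // single write
}

// Cache a value that is slow to recompute, instead of rebuilding it every run.
function getExpensiveValue() {
  var cache = CacheService.getScriptCache();
  var cached = cache.get('expensiveValue');
  if (cached !== null) return JSON.parse(cached);
  var value = {built: new Date().toISOString()};     // placeholder for the slow work
  cache.put('expensiveValue', JSON.stringify(value), 21600); // max 6 hours
  return value;
}
```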

Any known limits to calls per minute for webapps?

I'm getting inconsistent behaviour when sending POST requests to a Google Apps Script deployed as a web app.
I have a desktop app sending POST calls to a GAS web app. These calls can vary widely in cadence, from one every several minutes to bursts of dozens per second.
In my tests I have found requests that are seemingly lost, and requests that don't progress through the web app's internal logic flow (as if script instances get cut off or interrupted?), while others work flawlessly. There is no evident pattern.
However, trying things out, I found that if I space the calls, adding a pause between requests, everything normalizes.
Are there established, known limits for this? Is introducing these artificial intervals between calls the only option I have to solve this? I have not found information on this in the GAS quotas page.
Any help and ideas would be appreciated.
Confirming in the answer: there is no evident or documented per-minute limit on the number of requests to a GAS web app.
The issue I'm experiencing is related to concurrency. Even when they come from the same source, fast-paced requests can produce concurrency issues when accessing storage services like Cache or Properties.
This should be handled using the Lock Service.
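A minimal sketch of that pattern, serializing access to the shared storage behind a script lock (the timeout and the property being updated are placeholders):

```javascript
function doPost(e) {
  var lock = LockService.getScriptLock();
  lock.waitLock(30000); // wait up to 30 s for other simultaneous requests (placeholder timeout)
  try {
    var props = PropertiesService.getScriptProperties();
    var count = Number(props.getProperty('count') || 0); // placeholder shared state
    props.setProperty('count', String(count + 1));
    return ContentService.createTextOutput('ok');
  } finally {
    lock.releaseLock(); // always release, even if the body throws
  }
}
```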

Google Apps Script close spreadsheets programmatically

Google lists a maximum of 50 concurrent users with edit permission in a shared spreadsheet
https://support.google.com/docs/answer/2494827?hl=en&rd=1
I've got a Google Spreadsheet that will have a large number of potential editors at any moment, and I am concerned about idle users filling up the 50-user cap and blocking edits from fresh logons. There's a lot of interaction with the data, which discourages me from using Forms submissions, and there would be a large bound script on the document.
Is there an idle timeout function that can be called through the spreadsheet's bound script that will close the document or set an idle user to view-only status? This is in a Google Apps for Business domain.
This is a question that was asked very often 6 years ago on the old Google product forum, when I began to work with Google spreadsheets.
Strangely, the question disappeared after a while and never came back (until today!).
There are two main reasons for this question to have disappeared:
Since Google spreadsheets are a representation of a document hosted on a remote server, we cannot interact with what happens in another user's browser from a script that runs on that remote server.
I suppose this is rarely an issue in practice, and that in most cases it was mainly a theoretical question that vanished in real use (but that is really a supposition, I admit...).
As far as I know there is no practical solution to avoid that situation.

urlFetch terms of service and quotas - using Google Docs to monitor website uptime

I've been making improvements to a Google Docs spreadsheet (and the underlying script) that monitors websites for uptime/downtime. You can see a description and copy the spreadsheet here: http://agileadam.com/google-docs-uptime-monitor
I'm a web developer and need to monitor the uptime of 199 websites (the number will grow). This is an accumulation of many years of development, all sites we control. So I had a 5-minute time-driven trigger that ran my function, which loops through every URL and checks the HTTP status using UrlFetch.
I have two questions:
Is this a violation of the Google Docs TOS?
At 5:57am I got a message: "Exception: Service invoked too many times for one day: urlfetch." According to https://docs.google.com/macros/dashboard the limit is 20,000 for my account. According to my calculations I should only have been at around 14,000 uses. Why did it bomb out on me?
I am going to remove many of the domain names that are similar (I suppose I don't NEED to monitor .com and .org for the same site), and I've changed my trigger to run every 15 minutes. This will reduce the number of executions considerably.
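For context on the numbers: 199 URLs checked every 5 minutes works out to roughly 199 × 288 ≈ 57,000 fetches over a full day, and even the 15-minute interval is about 199 × 96 ≈ 19,000, so trimming the list matters as well. Below is a rough sketch of the kind of loop described above; the sheet layout and column positions are assumptions, and each request in fetchAll still counts against the daily URL Fetch quota:

```javascript
// Rough sketch of the monitoring loop described above (untested).
// The 'Sites' tab and column layout are assumptions, not the actual
// spreadsheet linked in the question.
function checkAllSites() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sites');
  var urls = sheet.getRange(2, 1, sheet.getLastRow() - 1, 1).getValues()
      .map(function(row) { return row[0]; });

  // fetchAll issues the requests in parallel; muteHttpExceptions keeps one
  // bad site from throwing and aborting the whole run.
  var requests = urls.map(function(url) {
    return {url: url, muteHttpExceptions: true};
  });
  var responses = UrlFetchApp.fetchAll(requests);

  // Write the status code and check time next to each URL in one batch.
  var results = responses.map(function(resp) {
    return [resp.getResponseCode(), new Date()];
  });
  sheet.getRange(2, 2, results.length, 2).setValues(results);
}
```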