Max size for POST request sent to webapps - google-apps-script

I use doPost() function callbacks a lot in my Google Apps Script projects.
I've recently been looking for documentation on the maximum request size accepted by GAS web apps handling POST requests, to no avail. The limits and quotas page mentions URL-fetch-related figures, which I presume refer to outgoing URL Fetch API calls, not the payloads of incoming POST requests.
Has anyone found information on this, run tests, or can anyone share some insight?
Thanks in advance.

After doing some tests, I uploaded data payloads of up to 24 MB via POST without issues. I hit a "Max file size" limit beyond 24 MB, but that seems related to Drive rather than GAS.
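For anyone who wants to repeat the test, here is a minimal sketch of the web app side: deploy something like this and POST increasingly large bodies at the /exec URL (the response shape is just illustrative):

function doPost(e) {
  // e.postData.contents holds the raw request body as a string;
  // its length is a close proxy for byte size with ASCII payloads
  var bytes = e.postData ? e.postData.contents.length : 0;
  return ContentService
    .createTextOutput(JSON.stringify({ receivedBytes: bytes }))
    .setMimeType(ContentService.MimeType.JSON);
}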

Related

Can I set up a Google Sheet to receive data from an external webhook/PUSH API?

I'm keen to understand whether I can use a Google Sheet / script to receive incoming data from an external (3rd party / non-Google) webhook.
The webhook requirements:
a defined/expected method out of POST (my preference), PUT, or PATCH
an endpoint to post to, which, if this is possible, I assume would be something like https://docs.google.com/spreadsheets/d/[sheet_id]/[service_name] or even https://script.google.com/d/[project_id]/[service_name], where ScriptApp can handle the data
a method of authentication; the current options available from the sending system are "none" or "oauth"
the incoming content; the body is a simple { "id": integer }
The OAuth inputs include authentication_url, azure_subscription_key, app_key, app_secret, and resource_id.
Hopefully this is enough information to determine whether this is possible. If not, please comment with questions and I'll do my best to answer them. Thanks in advance :)
Short answer: Yes, for the most part.
Explanation -
...receive incoming data from an external (3rd party / non google) webhook.
Google Apps Script (GAS) provides a feature called Web Apps, where you write a script to handle incoming requests and "deploy" the Apps Script itself as a web app. In doing so, GAS provides you with its own endpoint.
Adding this as a point of clarification: the endpoint/URL would not be that of a 3rd party, but that of GAS itself, which you would then supply to the 3rd-party application wherever it asks for an endpoint :)
Only GET and POST requests can be handled by an Apps Script web app (as of now), not the other methods you've listed.
The non-dev, prod-ready link would look something like this - https://script.google.com/macros/s/Unique-Script-ID-Goes-Here/exec
The available auth/permission settings are described here.
The request parameters documentation also describes the format of data that can be processed by the web app.
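To make that concrete, here is a minimal sketch of a doPost handler for the { "id": integer } body described in the question; the spreadsheet ID and sheet name are placeholders:

function doPost(e) {
  // Parse the incoming JSON body, e.g. { "id": 42 }
  var payload = JSON.parse(e.postData.contents);
  // Append the id plus a timestamp to a sheet (placeholder IDs)
  SpreadsheetApp.openById('YOUR_SPREADSHEET_ID')
    .getSheetByName('Sheet1')
    .appendRow([payload.id, new Date()]);
  // Return a 200 so the sending system treats the webhook as delivered
  return ContentService
    .createTextOutput(JSON.stringify({ status: 'ok' }))
    .setMimeType(ContentService.MimeType.JSON);
}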
Hope this helps but please feel free to ask for any clarifications too, as required.

Google Drive Rest API - Create File - Quota

I have a program that has to copy about 500,000 files into different folders in Google Drive. I use the Google Drive v3 Node.js API and issue about 2 uploads per second (one every 450 ms). After a while, I get ECONNRESET or "socket hang up" errors from the API.
When I look at the quota on console.cloud.google.com, I am nowhere near my quota. Why is it failing?
For kicks, I tried Google File Stream and it has no problem pushing the files into Drive under my user account. It's about 5 times faster.
Did anyone run into this problem?
I think your quota per se is not the problem here. This happens when you write too much data within a short time frame. Try slowing down, and try sharding the requests across different user accounts; that should help with the heavy lifting of the many requests you are performing. Also, don't forget to implement exponential backoff for retrying 4xx errors. My two cents.
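A sketch of that advice in Node.js; the retry count and delays are illustrative, and uploadOne stands in for whatever drive.files.create call you make:

async function uploadWithBackoff(uploadOne, maxRetries) {
  maxRetries = maxRetries || 5;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await uploadOne();
    } catch (err) {
      const status = err.code || (err.response && err.response.status);
      // Give up on the last attempt, or on errors that are not rate-limit/transient related
      if (attempt === maxRetries || [403, 429, 500, 503].indexOf(status) === -1) throw err;
      // Wait 2^attempt seconds plus random jitter before retrying
      const delayMs = Math.pow(2, attempt) * 1000 + Math.random() * 1000;
      await new Promise(function (resolve) { setTimeout(resolve, delayMs); });
    }
  }
}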
This does happen when I make the call passing a stream. There is no warning on developers.google.com, but there is one in their GitHub repository:
You can also upload media by specifying media.body as a Readable stream. This can allow you to upload very large files that cannot fit into memory.
Note: Your readable stream may be unstable. Use at your own risk.
Once I changed my code not to use streams, I started getting proper error messages, such as status code 403 for going over the rate limit.
I simply changed my code to use a straight buffer, read via fs.readFileSync before the call:
media: {
  mimeType: 'text/plain',
  body: buf
}
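For context, a self-contained sketch of the whole call with a buffered body; the file name, folder ID, and auth object are placeholders:

const fs = require('fs');
const { google } = require('googleapis');

async function uploadFile(auth, filePath, folderId) {
  const drive = google.drive({ version: 'v3', auth: auth });
  // Read the whole file into memory instead of handing the API a stream
  const buf = fs.readFileSync(filePath);
  const res = await drive.files.create({
    requestBody: { name: 'myfile.txt', parents: [folderId] },
    media: { mimeType: 'text/plain', body: buf }
  });
  return res.data.id;
}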

Google Maps requests are forbidden with a status of 403 after working very well for at least a day

I am using the Google Maps JavaScript API, v3, and everything works well up to a point where the requests for the map images are forbidden with a status of 403. Usually the map stops loading after the page/session has been open for some period of time: it may be 24 hours, it may be more than 48, I couldn't pin down a more accurate period.
Given that we want a live website and a testing one on different domains, I generated 2 different keys and load them conditionally, and the rendered HTML is the one expected.
var mapKey = VanillaRate.Domain.Settings.AppSettings.GoogleMapsApiKey;
and the script tag is:
<script src="https://maps.googleapis.com/maps/api/js?key=#(mapKey)&libraries=places" async defer></script>
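(For illustration, the same conditional loading could also be done client-side; the hostnames and keys in this sketch are placeholders for whatever the server-side templating resolves #(mapKey) to.)

// Pick the key for the current environment (placeholder values)
var mapKey = location.hostname === 'www.example-live.com' ? 'LIVE_KEY' : 'TEST_KEY';
// Inject the Maps script with that key, mirroring the async/defer tag above
var s = document.createElement('script');
s.src = 'https://maps.googleapis.com/maps/api/js?key=' + mapKey + '&libraries=places';
s.async = true;
s.defer = true;
document.head.appendChild(s);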
The usage limits were not exceeded, the referrer is well set.
The error appears when the map is zoomed and it's:
Failed to load resource: the server responded with a status of 403 () - maps.googleapis.com/maps/api/js/StaticMapService.GetMapImage?....
Since I couldn't find this exact situation posted anywhere, nor any documentation about it: could there be a timeout on Google's servers, for security reasons, that forbids requests once a session has been open for longer than a day?
EDIT: I forgot to mention that after refreshing the tab, everything works well again. If it were indeed the usage limit, would the server respond with success after a refresh? I've read that in that case the map wouldn't work for the rest of the day. Is that right?
If the response is still an HTTP 403 (Forbidden) error, the signature was not necessarily the problem; it may be related to usage limits instead.
This typically means your access to the web service has been blocked on the grounds that your application has been exceeding usage limits for too long or otherwise abused the web service.
I found this answer in the Google developer documentation. There is no simple way to resolve this problem. Google recommends two solutions:
Reduce requests to the server;
Or, 'purchasing additional allowance for your Google Maps APIs for Work license.'
You can also try to access the Google Cloud Support Portal to report your problem.
I found this information in the Google developer documentation here. That link details the solutions above and explains the cause of your problem.
"The usage limits were not exceeded"
Are you sure? You're loading the places library, in which case this applies:
Google Places API Web Service
Default 1,000 free requests per day, increased to 150,000 free requests per day after identity verification.
https://developers.google.com/maps/pricing-and-plans/
See also:
https://developers.google.com/places/web-service/usage
https://developers.google.com/maps/documentation/javascript/places#UsageLimits

Google Drive multiple files download

We have a client-server architecture that uses Google Drive for sharing files between the client and the server, without having to actually send them.
The client uses the Google Drive API to get a list of file IDs of all files it wants to share with the server.
The server then downloads the files with the appropriate authorization token.
Server response time is crucial for user experience.
We tried a few approaches:
First, we used the webContentLink. This worked until we started receiving large files from the client. Instead of getting the files' content, we got an HTML page with the warning "exceeds the maximum size that Google can scan". We could not find a header we could use to skip this check.
Second, we switched to the Google API resource URL with the alt=media query param. This works, but we then hit API quota errors (User Rate Limit Exceeded), since the server code was identified as a single user for all requests.
Then we added the quotaUser param to indicate on whose behalf each request is made. We still got many 403 responses.
In addition, we implemented exponential backoff for the failed requests.
We also added a cache for the successful requests.
Our current solution is a combination of the two: using the webContentLink whenever possible (which appears not to affect the Google API quota), and if the response is not as expected (i.e. an HTML page, wrong size, etc.), trying the Google API resource URL (with exponential backoff); see the sketch below.
(Most of the files are small enough to not exceed the scan size limit)
Both the client and the server use the same OAuth 2.0 client ID.
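A sketch of that fallback check, assuming Node.js 18+ with the global fetch API; downloadViaApi is a hypothetical helper (see the files.get sketch in the answer below):

// Try webContentLink first; fall back to the API when we get the
// HTML virus-scan warning page instead of the file's bytes
async function download(file, accessToken) {
  const res = await fetch(file.webContentLink, {
    headers: { Authorization: 'Bearer ' + accessToken }
  });
  const type = res.headers.get('content-type') || '';
  if (res.ok && type.indexOf('text/html') === -1) {
    return Buffer.from(await res.arrayBuffer()); // looks like real file content
  }
  return downloadViaApi(file.id); // hypothetical fallback helper
}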
Here are my questions:
1. Is it possible to skip the virus scan, so that all files can be downloaded using the webContentLink?
2. Is the size threshold for the virus scan documented? Assuming we know the file size we can then save the round-trip of the first request (using the webContentLink)
3. Is there anything else we can do other than applying for a higher quota?
Is it possible to skip the virus scan, so that all files can be downloaded using the webContentLink?
If the file is greater than 25 MB, it is not possible with webContentLink; but since you are making authorized requests, use files.get with alt=media. Apply appropriate error-handling options (which you have done with exponential backoff). The next step is to check whether your code is optimized; if you have applied the recommended optimizations and still receive the 403 Limit Exceeded error, it is time to apply for a higher quota.
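A sketch of that authorized download with the Node.js googleapis client; error handling is trimmed for brevity:

const { google } = require('googleapis');

async function downloadViaApi(fileId, auth) {
  const drive = google.drive({ version: 'v3', auth: auth });
  // alt=media requests the file's content rather than its metadata
  const res = await drive.files.get(
    { fileId: fileId, alt: 'media' },
    { responseType: 'arraybuffer' } // return raw bytes
  );
  return Buffer.from(res.data);
}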
Is the size threshold for the virus scan documented? Assuming we know the file size we can then save the round-trip of the first request (using the webContentLink)
To answer this, you can refer to the Google Drive Help Forum : How can I successfully download large files from google drive without network errors at the most end of the download:
Only files smaller than 25 MB can be scanned for viruses.
Is there anything else we can do other than applying for a higher quota?
You can do the following before applying for a higher quota:
Performance Tips
Drive Platform Best Practices
Handling API Errors
After all optimization is done, the only remaining option is to apply for a higher quota limit.
Hope this helps!

What is Google app script URLFetch service response size limit?

I'm writing a Google App Script for a Google spreadsheet and I'm facing a problem with the URLFetch service.
I'm requesting an external service and I sometimes receive an empty response. The external service is pretty stable and should always return something, at least an error message if something goes wrong.
But I sometimes receive an empty response, just nothing.
I can only solve this by modifying the request so that the expected response is smaller in size, and this always fixes the issue. That makes me think it's a response size limitation.
I doubt it's a random problem, because rerunning the script with the same request always fails, unless, as I said, I modify the request to receive a smaller response.
But on Google's quota page, I can't find a clear answer to my question.
At the time of asking this question, I'm facing a problem reading a response that should be around 14.1 KB. I know the size from running the request manually in my browser.
Does anyone know if there is a limitation, and what exactly it is?
In my experience the limit is 10 MB. It is definitely larger than 14.1 KB; an application I developed (http://www.blinkreports.com) routinely receives responses in excess of 1 MB.
Under the assumption that the same limits apply for UrlFetch in Google Apps Script as in App Engine, these limits apply:
Request size: 10 megabytes
Response size: 32 megabytes
See https://cloud.google.com/appengine/docs/java/urlfetch/#Java_Quotas_and_limits
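When debugging cases like this, a small probe helps distinguish a genuinely empty body from a failed request; muteHttpExceptions keeps UrlFetchApp from throwing on non-200 responses:

function probeResponse(url) {
  var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
  // Log the status code and how many bytes actually arrived
  var bytes = response.getBlob().getBytes().length;
  Logger.log('HTTP %s, %s bytes received', response.getResponseCode(), bytes);
  return response;
}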