Check and warn about Google Mail Storage Usage automatically - google-apps-script

I need to automatically check and warn (send mail) if my Gmail size increases above a particular limit (say > 25 GB), so that I can clean up my inbox before it reaches the full capacity of 30 GB.
I thought of using Google Apps Script for this, but I could only get the size of Google Drive using 'DriveApp.getStorageUsed()'; I couldn't find a similar method in GmailApp.
Is there any other way I could use Google Apps Script to monitor the size of my inbox?

For Gmail:
You can use the Admin SDK Reports advanced service (https://developers.google.com/apps-script/advanced/admin-sdk-reports), whose user usage report includes the accounts parameters listed at https://developers.google.com/admin-sdk/reports/v1/reference/usage-ref-appendix-a/users-accounts:
total_quota_in_mb integer Total storage (in MB) for the user.
used_quota_in_mb integer Total storage (in MB) used by the user.
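A minimal sketch of reading that report from Apps Script, assuming the Admin SDK Reports advanced service is enabled for the script and the account is a Google Workspace account with admin access (the Reports API is not available for plain consumer accounts); usage reports lag by a couple of days, so the sketch queries a recent past date:

function checkGmailQuota() {
  // Assumption: the AdminReports advanced service is enabled and the
  // account has Workspace admin access. Reports lag, so query 3 days back.
  var userEmail = Session.getActiveUser().getEmail();
  var date = Utilities.formatDate(
      new Date(Date.now() - 3 * 24 * 60 * 60 * 1000), 'UTC', 'yyyy-MM-dd');
  var report = AdminReports.UserUsageReport.get(userEmail, date, {
    parameters: 'accounts:used_quota_in_mb,accounts:total_quota_in_mb'
  });
  // Log each storage parameter, e.g. "accounts:used_quota_in_mb = 25600".
  report.usageReports[0].parameters.forEach(function (p) {
    Logger.log(p.name + ' = ' + p.intValue);
  });
}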
For Drive:
You can use the functions listed below (https://developers.google.com/apps-script/reference/drive/drive-app#getStorageUsed()):
getStorageLimit()
Gets the number of bytes the user is allowed to store in Drive.
Return
Integer — the number of bytes the user is allowed to store in Drive
getStorageUsed()
Gets the number of bytes the user is currently storing in Drive.
Return
Integer — the number of bytes the user is currently storing in Drive
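For the consumer-account case, a hedged sketch of the warning mail the question asks for, combining the two functions above with MailApp; the 25 GB threshold comes from the question, and the function would be run on a time-driven trigger:

function warnIfStorageHigh() {
  // Both DriveApp values are reported in bytes.
  var usedGb = DriveApp.getStorageUsed() / (1024 * 1024 * 1024);
  var limitGb = DriveApp.getStorageLimit() / (1024 * 1024 * 1024);
  if (usedGb > 25) { // threshold from the question
    MailApp.sendEmail(
        Session.getActiveUser().getEmail(),
        'Storage warning',
        'You are using ' + usedGb.toFixed(1) + ' GB of ' +
        limitGb.toFixed(1) + ' GB.');
  }
}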

Related

Using BigQuery as a data storage/cache system for a Google Sheets add-on

I have a Google Sheets add-on with custom formulas that fetch data from my API to show in their results.
The problem for many users is that the add-on frequently reaches the UrlFetch quota. So I'm trying to use another source of data for my formulas, and I've been trying to set up BigQuery for that (I know it is not meant to be used like that).
My approach would be something like this:
When a user executes a formula, I first look in BigQuery to see if the data is already there; if not, I fetch it from the API and store the result in BigQuery.
I tried a proof of concept, where I added a custom function to my add-on with the code sample from https://developers.google.com/apps-script/advanced/bigquery, replaced the projectId with my own, and queried a sample table.
When I executed the formula, I got this error:
GoogleJsonResponseException: API call to bigquery.jobs.query failed with error: Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.
I also tried executing the same code from a function called from the sidebar frontend, and got this error instead:
User does not have bigquery.jobs.create permission in project
From these errors I'm guessing I have to assign a BigQuery role in IAM to each user, but I have hundreds of them.
Is there any way that any add-on user can access the BigQuery project, or is my whole approach wrong?
How about using the Cache service instead of BigQuery? The Cache service is simple to use, does not require authentication, and also works in a custom function context (see the sketch after the list below).
Note these limitations:
The maximum length of a key is 250 characters.
The maximum amount of data that can be stored per key is 100KB.
The maximum expiration time is 21,600 seconds (6 hours), but cached data may be removed before this time if a lot of data is cached.
The cap for cached items is 1,000. If more than 1,000 items are written, the cache stores the 900 items farthest from expiration.
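A minimal sketch of the check-cache-first flow under those limits; the API URL here is a hypothetical stand-in for your own endpoint:

function fetchWithCache(key) {
  // CacheService needs no extra authorization and works in custom functions.
  var cache = CacheService.getScriptCache();
  var cached = cache.get(key);
  if (cached !== null) {
    return JSON.parse(cached); // cache hit: no UrlFetch quota spent
  }
  // Cache miss: fetch from the (hypothetical) API and cache the result.
  var text = UrlFetchApp
      .fetch('https://api.example.com/data?key=' + encodeURIComponent(key))
      .getContentText();
  cache.put(key, text, 21600); // keep for the 6-hour maximum; value < 100KB
  return JSON.parse(text);
}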

Google Drive API /files slow response

I want to ask for help/ideas on the issue I will describe below.
Our iOS app allows users to access their Google Drive files.
We use Changes API (https://developers.google.com/drive/api/v3/reference/changes). The main pre-condition to using this API is to build a local DB that holds the snapshot of the user's Drive file tree and the token. To initially fill the DB we must request the list of all files from user's Drive. Getting the list of all files (with metadata) takes too long for many of our users. This is the issue I want to address.
We request files with the series of Files requests (https://developers.google.com/drive/api/v3/reference/files/list). Most requests are plain files?q=trashed%20%3D%20false.
For example, on my own private Google Drive:
69K files
the initial listing of all files takes 5+ minutes with my current network speed (Download 527 Mbps, Upload 417 Mbps; ping www.googleapis.com – 40–45 ms)
~150 requests
each request brings information about ~460 files
each request takes around 2-2.5 seconds
Sometimes I observed requests taking up to 6 seconds, which means that getting the full file list took 15 minutes on my account.
If I look at the Developer Console, the latency is below 0.1s
Many of our users have Drives far bigger than mine. A standard iOS app user's session is not long enough to complete the initial request. We do save every intermediate page token, so data received during a single app session is not lost if the user leaves the app – in the next session we keep downloading from the last saved token. But there are still cases when our app needs the DB to be filled before starting some operations – in those cases our users see a "Pending..." progress indicator, and they complain that our app is slow.
So, questions:
is it possible to improve the described request speed/latency?
maybe there's some quota that we are missing and it can be changed?
maybe someone can advise a more effective way of getting the full file list?
P.S. We could potentially reduce the number of requests. We have to perform some double checks for Shared with Me folders, as we observed that the request for all files sometimes doesn't list all files from shared folders. That's a bit of a side story, and I don't think fixing it would dramatically improve the situation for us. I can provide more details on the actual set of requests we perform if necessary.
Are you returning all the fields? I would assume so, since the only query parameter provided is q=trashed = false. Do you need all the fields? Can you try reducing the query to return only the fields you really care about (using a fields mask) and see if that improves your performance?
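A sketch of what such a request could look like, written as Apps Script for brevity (an iOS client would build the same URL); the field list is a hypothetical minimum for a file-tree snapshot, and it also carries the pageToken so a long sync can resume between sessions, as the question describes:

function listFilesPage(pageToken) {
  // Assumes the script is authorized with a Drive scope
  // (e.g. https://www.googleapis.com/auth/drive.readonly).
  var url = 'https://www.googleapis.com/drive/v3/files'
      + '?q=' + encodeURIComponent('trashed = false')
      + '&pageSize=1000' // the maximum page size for files.list
      + '&fields=' + encodeURIComponent(
          'nextPageToken,files(id,name,mimeType,parents,modifiedTime)')
      + (pageToken ? '&pageToken=' + encodeURIComponent(pageToken) : '');
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
  // Persist nextPageToken from the parsed result to resume later.
  return JSON.parse(response.getContentText());
}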

What does Google mean by "user" in Maps Time Zone API quota?

Looking at the Time Zone API quota limit option that says:
"Requests per 100 seconds per user"
I was trying to find a clear definition of "user", but to no avail.
My intuitive guess is that it's an IP address... yet I would like to avoid unpleasant surprises.
It would be good to know what "user" means exactly. Any ideas?
I believe this is confusion that comes from the fact that the Google Maps APIs share the same developer console with Google Cloud Platform. "Requests per 100 seconds per user" makes sense in Google Cloud Platform, but I don't think it is really supported in the Google Maps APIs.
If we check the documentation of Google Cloud Platform, we see the following explanation:
To prevent individual users from using up your API quota, limit the number of requests per second per user for an API. Each API includes a default per-user limit, but you can modify that value as described in the previous section.
Individual users are identified by a unique string; if you're creating a server-side application (where the calling code is hosted on a server that you own) that makes requests on behalf of users, your requests must include the quotaUser parameter, as described below.
To identify a user, use the quotaUser=userID parameter. This value is for short term quota enforcement only, so you don't need to use a real user ID. You can choose any arbitrary string under forty characters long that uniquely identifies a user.
The quotaUser parameter is only used for capping requests per user per second. If you don't send the quotaUser parameter, then all calls are attributed to your server machines, in which case calls can't be capped by user.
source: https://cloud.google.com/apis/docs/capping-api-usage#limiting_requests_per_second_per_user
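For Cloud APIs that do honor it, attaching quotaUser is just an extra query parameter on each server-side request; a hypothetical sketch (the API URL is a placeholder):

function callOnBehalfOf(userId) {
  // quotaUser can be any stable string under 40 characters identifying
  // the end user; it affects per-user quota accounting, not authentication.
  var url = 'https://www.googleapis.com/someapi/v1/resource'
      + '?quotaUser=' + encodeURIComponent(userId);
  return UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
}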
As far as I know, none of the Google Maps APIs supports a quotaUser parameter, so this value doesn't make sense for the Google Maps web services; as mentioned in the last paragraph of the quote, Google Maps web services will attribute usage to your server machines, in other words to the IP addresses of your backend servers.
I hope this clarifies your doubt.

Google Drive multiple files download

We have a client-server architecture that uses Google Drive for sharing files between the client and the server, without having to actually send them.
The client uses the Google Drive API to get a list of file IDs of all files it wants to share with the server.
The server then downloads the files with the appropriate authorization token.
Server response time is crucial for user experience.
We tried a few approaches:
First, we used the webContentLink. This worked until we started receiving large files from the client: instead of getting the files' content, we got an HTML page with the warning "exceeds the maximum size that Google can scan". We could not find a header we could use to skip this check.
Second, we switched to the Google API resource URL with the alt=media query param. This works, but we then hit API quota errors (User Rate Limit Exceeded), since this is server code and all requests were identified as coming from a single user.
Then we added the quotaUser param to indicate on whose behalf each request is made. We still got many 403 responses.
In addition, we implemented exponential backoff for the failed requests.
We also added a cache for the successful requests.
Our current solution is a combination of the two: using the webContentLink whenever possible (which appears not to count against the Google API quota), and if the response is not as expected (i.e. HTML, wrong size, etc.), trying the Google API resource URL (with exponential backoff).
(Most of the files are small enough to not exceed the scan size limit)
Both client and server use the same OAuth 2.0 client ID.
Here are my questions:
1. Is it possible to skip the virus scan, so that all files can be downloaded using the webContentLink?
2. Is the size threshold for the virus scan documented? Assuming we know the file size we can then save the round-trip of the first request (using the webContentLink)
3. Is there anything else we can do other than applying for a higher quota?
Is it possible to skip the virus scan, so that all files can be downloaded using the webContentLink?
If the file is larger than 25 MB this is not possible with webContentLink, but since you are making authorized requests, use files.get with alt=media. Apply appropriate error handling (which you have done using exponential backoff). The next step would be checking whether your code is optimized; if, after applying the recommended optimizations, you still receive 403 Rate Limit Exceeded errors, it is time to apply for a higher quota.
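A minimal sketch of that authorized files.get download, written as Apps Script for brevity (your server would substitute its own access token):

function downloadFile(fileId) {
  // Assumes the script is authorized with a Drive scope.
  var url = 'https://www.googleapis.com/drive/v3/files/'
      + encodeURIComponent(fileId) + '?alt=media';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    muteHttpExceptions: true
  });
  if (response.getResponseCode() === 403) {
    return null; // rate limited: retry with exponential backoff
  }
  return response.getBlob();
}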
Is the size threshold for the virus scan documented? Assuming we know the file size we can then save the round-trip of the first request (using the webContentLink)
To answer this, you can refer to the Google Drive Help Forum thread "How can I successfully download large files from Google Drive without network errors at the very end of the download":
Only files smaller than 25 MB can be scanned for viruses.
Is there anything else we can do other than applying for a higher quota?
You can do the following before applying for a higher quota:
Performance Tips
Drive Platform Best Practices
Handling API Errors
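For reference, a generic sketch of the exponential backoff pattern those guides recommend (written as Apps Script; the retried status codes are the usual rate-limit responses):

function fetchWithBackoff(url, options) {
  for (var attempt = 0; attempt < 5; attempt++) {
    var response = UrlFetchApp.fetch(
        url, Object.assign({ muteHttpExceptions: true }, options));
    var code = response.getResponseCode();
    if (code !== 403 && code !== 429) {
      return response; // success or a non-retryable error
    }
    // Wait 2^attempt seconds plus random jitter before retrying.
    Utilities.sleep(Math.pow(2, attempt) * 1000
        + Math.floor(Math.random() * 1000));
  }
  throw new Error('Retries exhausted for ' + url);
}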
After all optimization is done, the only option is to apply for higher quota limit.
Hope this helps!

What is the limit on Google Drive API usage?

Because this product is new, I am looking forward to developing an app on it, so is there any limit on the API usage, such as:
upload and download quota
requests per app
etc
To view your allowed quota please create a project in the Google APIs Console. In the "Service" tab, the default quota allowed for each service is indicated.
Currently for the Drive API it reads "Courtesy limit: 1,000,000,000 queries/day". It's a per app quota.
After you've enabled the Drive API you can also set a per user rate limit (by default 1000 req per 100 sec) to prevent one user from depleting your app's quota. That's available in the "Quotas" tab.
There is also a link to request more quota in the "Quotas" tab in case you need more than the default 10M req/day; such requests will go through a (light) manual review process.
Also, files have per-file playback limits which depend on many factors (Is the file shared publicly or just to your domain/users? Is it a video? An audio file? etc...). These rules are not disclosed at this point, unfortunately, but for instance a publicly shared video can't be viewed by millions of anonymous sessions per day (use YouTube for that), nor can an image be used on a high-traffic website. Google Drive cannot be used as a web-scale CDN; it is scaled for personal content sharing (you share files with friends/work group/company).
I'm getting a message saying "Sorry, you can't view or download this file at this time.
Too many users have viewed or downloaded this file recently. Please try accessing the file again later. If the file you are trying to access is particularly large or is shared with many people, it may take up to 24 hours to be able to view or download the file. If you still can't access a file after 24 hours, contact your domain administrator."
so there IS a limit, just undocumented.
This is a file only I downloaded, about 50 times in the last hour (testing some stuff).
There's no mention of rate limiting on the best practices page or in the performance tips, but, most conspicuously of all, the documentation on handling errors does not list any errors for going over rate limits.
I've seen that the free request quota for the Google Drive API is
1,000,000,000 requests/day
and the default per-user limit (which you can increase) is
10 requests/second/user
You can visit the following page and log in with a valid account for more information:
https://console.developers.google.com/project/bionic-path-686/apiui/apiview/drive/quotas
It's not only about API queries; there is also a download limit, which I always ran into, but Google doesn't publish the allowed amount.
There are some types of limits which exist in addition to "Quotas".
Example error:
code=403
reason=subscriptionRateLimitExceeded
message=Rate limit exceeded for creating file subscriptions.
Similar rate limits are not well documented and can cause some pain for app developers.
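A hedged sketch of telling such retryable rate-limit errors apart from hard quota errors, based on the standard Google API error body; the reason strings beyond the one shown above are assumptions:

function isRetryableDriveError(responseBody) {
  // Standard error shape: { error: { errors: [{ reason, message }], code } }.
  var error = JSON.parse(responseBody).error;
  var reason = error.errors && error.errors[0] && error.errors[0].reason;
  return reason === 'rateLimitExceeded'              // assumed retryable
      || reason === 'userRateLimitExceeded'          // assumed retryable
      || reason === 'subscriptionRateLimitExceeded'; // from the example above
}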
Read the documentation and you should find what you're looking for.
"Any type of file can be stored in Drive, up to the user's storage limit or maximum file size of 10GB"
https://developers.google.com/drive/