Google Drive Rest API Limitations - google-drive-api

I want to use the Google Drive REST API as a CDN for my application. Are there any bandwidth or traffic limitations? Let's say 100,000 users want to download the same file (200 MB). Would that be a problem?

The Google Drive API comes with a free quota as follows:
Queries per day: 1,000,000,000
Queries per 100 seconds per user: 1,000
Queries per 100 seconds: 10,000
https://console.developers.google.com/apis/api/drive.googleapis.com/quotas
These limits can be increased on request.
There may also be other limits as outlined in this post
What is the limit on Google Drive API usage?
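Given per-100-second quotas like those above, a client that fans out many requests needs to pace itself. A minimal sketch (plain Python, using the quota numbers from the table above as example inputs, not authoritative values for your project) of computing a safe spacing between requests:

```python
# Sketch: compute the minimum delay between requests so a client stays
# under a "queries per N seconds" style quota on average.
# The quota figures mirror the table above; substitute your project's values.

def min_request_interval(quota_limit: int, window_seconds: int = 100) -> float:
    """Smallest spacing (in seconds) between requests that keeps the
    average rate at or below quota_limit requests per window."""
    if quota_limit <= 0:
        raise ValueError("quota_limit must be positive")
    return window_seconds / quota_limit

# Per-user quota: 1,000 queries per 100 seconds -> one query every 0.1 s.
per_user = min_request_interval(1_000)

# Per-project quota: 10,000 queries per 100 seconds -> one every 0.01 s.
per_project = min_request_interval(10_000)

print(per_user, per_project)
```

Note this only bounds the average rate; real clients should also handle 403 rate-limit responses with backoff, since Google may throttle bursts even below the average.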

Related

Trying to create a local copy of our Google Drive with rclone, bringing down all the files, constantly hitting rate limits

As the title states, I'm trying to create a local copy of our entire Google Drive. We currently use it as a file-storage service, which is obviously not the best use case, but to migrate elsewhere I of course need to get all the files; the entire drive is around 800 GB.
I am using rclone, specifically the copy command, to copy the files FROM Google Drive TO the local server; however, I am constantly running into user rate limit errors.
I am also authenticating with a Google service account, which I believe should provide higher usage limits.
2021/11/22 07:39:50 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User
Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may
consider re-evaluating expected per-user traffic to the API and adjust project quota
limits accordingly. You may monitor aggregate quota usage and adjust limits in the API
Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?
project=, userRateLimitExceeded)
But I don't really understand, since according to my usage I am not even coming close to the quota. What can I do to increase my rate limit (even if that means paying), or is there some other solution to this issue? Thanks
Error 403: User
Rate Limit Exceeded. Rate of requests for user exceed configured project quota.
The user rate limit is the number of requests a single user is making per second; it's basically flood protection, and you are flooding the server. It is unclear exactly how Google calculates this beyond the published per-user quota (1,000 requests per user per 100 seconds). If you are getting the error, there is really nothing you can do besides slow down your requests. It's also unclear from your question how you are running these requests.
If you could include the code, we could see how the requests are being performed. However, as you state, you are using a third-party tool called rclone, so there is no way of knowing how it works internally.
Your only option is to slow your requests down, if you have any control over that through this third-party application. If not, you may want to contact the owner of the product for direction on how to fix it.
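When you do control the code making the calls, the standard remedy for userRateLimitExceeded is exponential backoff with jitter. A minimal sketch, where `make_request` is a hypothetical zero-argument callable standing in for whatever Drive API call is being throttled:

```python
import random
import time

def call_with_backoff(make_request, max_retries=5, base_delay=1.0):
    """Retry a rate-limited call with exponential backoff plus jitter.

    make_request: any zero-argument callable that raises an exception
    (e.g. on an HTTP 403 userRateLimitExceeded response) when throttled.
    """
    for attempt in range(max_retries):
        try:
            return make_request()
        except Exception:
            if attempt == max_retries - 1:
                raise
            # Sleep base, 2*base, 4*base, ... plus a little random jitter
            # so parallel workers don't all retry at the same instant.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Demonstration with a fake request that fails twice, then succeeds
# (base_delay shrunk so the demo runs quickly):
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("403: User Rate Limit Exceeded")
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # -> ok
```

For rclone specifically, if memory serves it exposes throttling flags of its own (such as `--tpslimit`), which may be the more practical fix than touching code; check the rclone documentation for the current flag names.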

Support 5000 concurrent users of Apps Script Web App

I am in the process of architecting a small application for a G Suite customer. The app will be used by all employees at the customer and during peak times could see as many as 5,000 concurrent users.
They do not have App Maker and do not plan to enable App Maker anytime soon.
Will an Apps Script Web App be able to handle this many concurrent users?
Database:
I was originally thinking of using a Google Sheet as the database. There are no documented Apps Script limits around reading or writing data to a Google Sheet. So as long as I stay within the Google Sheets API quota limits I should be good.
With that said, I am also debating using Cloud SQL as the database. App Maker forces you to use Cloud SQL, so I assume it is a superior option to a Google Sheet. The only way I see to connect to Cloud SQL from Apps Script is via the JDBC service; however, there is a hard limit of 50,000 connections per day. https://developers.google.com/apps-script/guides/services/quotas#current_quotas
Does anyone know if this limit is per app script or per user?
If per app script, then 5,000 users would only have 10 calls each per day. That would be unusable for my needs.
Side note: Google Cloud SQL has a maximum of 4,000 connections. I am banking on reads and writes being fast enough that the number of open connections at any single point in time stays below 4,000.
As TheMaster noted in the comment above, you have a max of 30 simultaneous executions, which limits you to at most 30 concurrent users for a GAS web app.
As an alternative, you might be able to leverage Cloud Functions (basically just a Node.js/Express.js module). ~~Max number of concurrent users is 1000 though.~~ It's not a free platform, but it supports Cloud SQL and may be cheaper than getting a Google for Business account (US$11.00 per month per user) as required by App Maker.

What is the Google Sheets maximum access limit?

I currently have a Google web app that takes a value and checks it against 14,000 cells in a master Google Sheet. I've had the idea to break the function down into 14 functions that run simultaneously (each searching 1,000 cells), cutting the runtime from 6 seconds to below 1. The issue I'm concerned about is the limit of 30 simultaneously running functions. This web app will be used by more than one person at a time, so it needs to handle more than 30. My idea is to create an account for each user, with a copy of the web app in each account, so that simultaneous executions stay around 14 per user. So my question is this: does Google Sheets have a limit on the number of simultaneous reads/writes? Any help is greatly appreciated.
If you can implement this, I think there are no limits beyond the documented quotas:
This version of the Google Sheets API has a limit of 500 requests per
100 seconds per project, and 100 requests per 100 seconds per user.
Limits for reads and writes are tracked separately. There is no daily
usage limit.
https://developers.google.com/sheets/api/limits
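Worth noting: instead of 14 parallel lookups, the asker could fetch the whole 14,000-cell range in a single read and search it locally, consuming one read request against the quota above rather than fourteen. A sketch of the local search, with `fetched_column` as hypothetical stand-in data shaped like a single-column Sheets API values response:

```python
def find_value(rows, target):
    """Return the 0-based row index of target within a column of values
    fetched in one Sheets read, or -1 if absent.
    rows is a list of single-cell rows, as the values API returns them."""
    for i, row in enumerate(rows):
        if row and row[0] == target:
            return i
    return -1

# Hypothetical data standing in for one 14,000-cell read response:
fetched_column = [["id-%05d" % n] for n in range(14_000)]
print(find_value(fetched_column, "id-01234"))  # -> 1234
```

A linear scan over 14,000 in-memory values takes a few milliseconds, so the one-read-then-search approach avoids both the 6-second runtime and the simultaneous-execution limit in one move.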

Youtube API: Get subscribers counts of a million channels

Scenario: a MySQL database with a million entries containing YouTube channel IDs and subscriber counts. I would like to update the subscriber counts periodically (maybe once a week).
I am wondering if such large requests are even possible with the API. If so, what would be the most efficient way to get the subscriber counts of a million channels? If not, can you think of a workaround?
Thank you very much!
I think you may want to consider doing them in batches. You should check the Google developer console to see what your current quota for the API is; if there is a pencil icon next to the quota, you can increase it.
Also, checking the YouTube Data API (v3) Quota Calculator might help you decide how many requests to run each day.
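On the batching point: if memory serves, channels.list accepts up to 50 comma-separated channel IDs per call, which would turn a million channels into roughly 20,000 requests per refresh; verify that limit and the per-call quota cost against the current API reference before relying on it. A sketch of the batching step (the IDs here are hypothetical):

```python
def chunk_ids(channel_ids, batch_size=50):
    """Group channel IDs into comma-joined batches suitable for passing
    to a single channels.list call's id parameter."""
    for start in range(0, len(channel_ids), batch_size):
        yield ",".join(channel_ids[start:start + batch_size])

# A million hypothetical channel IDs:
ids = ["UC%06d" % n for n in range(1_000_000)]
batches = list(chunk_ids(ids))
print(len(batches))  # 20,000 API calls instead of 1,000,000
```

Spread over a week, 20,000 calls is a few thousand per day, which makes staying inside a daily quota far more plausible than one call per channel.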

Google Drive API Historical quota data

I had an interesting experience with my Google account quota, which could be solved with access to the following information (if it exists!). Basically, can I get my Google account quota data WITH a time stamp? In other words, can I determine how much memory my account was allocated on a certain date?
Context: I had 122 GB allocated because I was paying for 100 GB of extra storage. At some point I cleared enough files to get back down under 22 GB used, so I cancelled the 100 GB subscription. Upon doing so, my total allocated storage dropped to 17 GB. When I asked why, support simply said 15 GB + 2 GB for the security update (from 2016). They also said there is no historical record, so they couldn't verify the 122 GB number, and I do not know how I ended up with 20 GB of storage plus 2 GB for the security update. I didn't think to take screenshots of these numbers before cancelling the subscription, because this is Google we're talking about and I figured they had all this kind of data in case something went wrong. I've had a Google account since 2005, so I'm thinking maybe at some point I picked up a 5 GB bonus somehow.
Anyway, I'm a data scientist, so even if there's a complicated way to get this kind of information I'd be interested in hearing what data is available that might verify my account of the storage allocation.
Maybe this post from Google Drive Help Forum may help you.
The first thing to note is that when you delete or remove files from your Google Drive, you will want to empty your trash in order to actually free up that space. If you have orphaned files, you can find them by typing is:unorganized owner:me in the Drive search field.
Your Google Drive storage quota consists of Google Drive, Google Photos, and Gmail files. Changes made to your Google Drive account can sometimes take the Google servers 24-48 hours to properly sync and reflect back to your drive; usually it takes far less time, but it can happen. Here are some helpful links I use when managing my Google Drive storage quota:
There is no 'history' option, but if you want to check your storage, here is the path that will lead you. From there you will see how much capacity your account can use.
To see how much each of your files is taking up the quota, you may visit this link.
If you want to manage your quota, you may check this link for guidance.