Google Drive API: Historical quota data

I had an interesting experience with my Google account quota, which could be solved with access to the following information (if it exists!). Basically, can I get my Google account quota data WITH a time stamp? In other words, can I determine how much storage my account was allocated on a certain date?
Context: I had 122 GB allocated because I was paying for 100 GB of extra storage. At some point I cleared out enough files to get back down to under 22 GB used, so I cancelled the 100 GB subscription. Upon doing so, my total allocated storage dropped to 17 GB. When I asked why, support simply said 15 GB + 2 GB for the security update (from 2016). They also said there is no historical record, so they couldn't verify the 122 GB number, and I do not know how I ended up with 20 GB of storage + 2 GB for the security update. I didn't think to take screenshots of these numbers before cancelling the subscription, because this is Google we're talking about and I figured they had all this kind of data in case something went wrong. I've had a Google account since 2005, so I'm thinking maybe at some point I picked up a 5 GB bonus somehow.
Anyway, I'm a data scientist, so even if there's a complicated way to get this kind of info, I'd be interested in hearing what kind of data is available that might verify my account of the storage allocation.

Maybe this post from the Google Drive Help Forum can help you.
The first thing to note is that when you delete or remove files that count against your Google Drive storage quota, you will want to empty your trash in order to free up that space. If you have orphaned files, you can find them by typing the following in the Drive search field: is:unorganized owner:me
Your Google Drive storage quota consists of Google Drive, Google Photos, and Gmail files. Sometimes changes made to your Google Drive account will take the Google servers 24-48 hours to properly sync and reflect back to your drive. Usually this takes far less time, but it can happen. Here are some helpful pointers I use when managing my Google Drive storage quota:
There is no 'history' option, but if you want to check your storage, here is the path that will lead you. From there you will see the total capacity your account can use.
To see how much of the quota each of your files is taking up, you may visit this link.
If you want to manage your quota, you may check this link for guidance.
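And on the programmatic side: as far as I can tell, the Drive API only exposes the current quota via about.get; there is no historical endpoint. A minimal Apps Script sketch (assuming the advanced Drive service, v3, is enabled) that you could run on a time-driven trigger to at least build your own history going forward:

function logQuotaSnapshot() {
  // storageQuota values come back as byte counts in string form.
  const about = Drive.About.get({ fields: 'storageQuota' });
  const q = about.storageQuota;
  console.log(new Date().toISOString(), 'limit:', q.limit, 'usage:', q.usage);
}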

Related

Trying to create a local copy of our Google Drive with rclone, bringing down all the files, constantly hitting rate limits

As the title states, I'm trying to create a local copy of our entire Google Drive. We currently use it as a file storage service, which is obviously not the best use case, but to migrate elsewhere I of course need to get all the files. The entire Google Drive is around 800 GB.
I am using rclone, specifically the copy command, to copy the files FROM Google Drive TO the local server; however, I am constantly running into user rate limit errors.
I am using a Google service account to authenticate as well, which I believe should provide higher usage limits.
2021/11/22 07:39:50 DEBUG : pacer: low level retry 1/10 (error googleapi: Error 403: User
Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may
consider re-evaluating expected per-user traffic to the API and adjust project quota
limits accordingly. You may monitor aggregate quota usage and adjust limits in the API
Console: https://console.developers.google.com/apis/api/drive.googleapis.com/quotas?
project=, userRateLimitExceeded)
But I don't really understand, since according to my usage it is not even coming close. I am just wondering what exactly I can do to either increase my rate limit (even if that means paying), or is there some sort of solution to this issue? Thanks
Error 403: User Rate Limit Exceeded. Rate of requests for user exceed configured project quota.
The user rate limit is the number of requests your user is making per second; it's basically flood protection. You are flooding the server. It is unclear how Google calculates this, beyond the 100 requests per user per second. If you are getting the error, there is really nothing you can do besides slow down your code. It's also unclear from your question how you are running these requests.
If you could include the code, we could see how the requests are being performed. However, as you state, you are using something called rclone, so there is no way of knowing how that works.
Your only option would be to slow your code down, if you have any control over that through this third-party application. If not, you may want to contact the owner of the product for direction as to how to fix it.
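For what it's worth, rclone does expose client-side throttling knobs. A hedged sketch (the remote name gdrive: and the destination path are placeholders for your own):

# Throttle rclone so it stays under the per-user Drive quota.
rclone copy gdrive: /mnt/drive-mirror \
  --tpslimit 8 \
  --drive-pacer-min-sleep 200ms \
  --transfers 2 \
  --checkers 4 \
  -v

--tpslimit caps HTTP transactions per second across the whole run, --drive-pacer-min-sleep raises the minimum wait between Drive API calls, and lowering --transfers/--checkers reduces parallelism. Configuring rclone with your own client_id/client_secret, so you are not sharing the default project's quota, is another commonly suggested fix.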

Is it better to use the cache or a spreadsheet in Google Apps Script?

I'm writing google apps script web apps that share data between end users who are using the app at the same time.
I can write the data to a spreadsheet and allow others to read it, or put the data into script cache.
Either way I need a server call. The data is not large... I was just wondering if the cache is more server-efficient / faster / better practice?
Thanks
If you use the Cache Service, the max time-to-live for data in a key is 6 hours if you set the expiration; otherwise it lives in the cache for 10 minutes. Also, the maximum length of a key is 250 characters.
So it really depends on the architecture of your app, but using a sheet only as a database perhaps isn't the best solution either, although it may be convenient in many cases.
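For illustration, a minimal Apps Script sketch of a cache-first read with a sheet as the durable fallback (the spreadsheet ID 'SHEET_ID', sheet name 'Data', and cell A1 are placeholders):

function getSharedValue(key) {
  const cache = CacheService.getScriptCache(); // shared by all users of the script
  let value = cache.get(key);
  if (value === null) {
    // Cache miss or expired: fall back to the sheet, then re-cache.
    value = String(SpreadsheetApp.openById('SHEET_ID').getSheetByName('Data').getRange('A1').getValue());
    cache.put(key, value, 21600); // 21600 s (6 h) is the documented maximum lifetime
  }
  return value;
}

Serving from the cache avoids the much slower SpreadsheetApp call on every request, which is the efficiency win the question is asking about.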

How can I develop my own cloud storage, with access for other users, like Google Drive but much simpler?

My problem is that I need to build a cloud storage service for my customers/clients/users, who can log in to it.
I need to understand how such services work in the back end, how they're developed, and how I can build my own solution using a server where I can thin-provision my hard drive, let users see their data, etc. What resources and articles, along with the required skills, can I use? Or is there software for this, like WordPress is for websites?
Some additional points to better understand the problem:
How does Google Drive or Dropbox work in the background? Do they create a folder directory or a disk drive partition for each user?
Here's some of what I have in mind: I develop a website where users purchase a plan of, say, 10 GB. The site then sends the userId, password, and plan information to my cloud server, where I can assign storage to that user.
At first, I thought to solve the problem with a root folder, where each new user would have a folder of their own. But that's where I hit my first stumbling block: how do I assign a size limit to a folder?
I also need the free storage (that a user is not using) to be allocatable to other users, and I don't think that can be done with plain directories (correct me if I'm wrong).
So far, I've searched about cloud storage, folder sizing, thin provisioning, public cloud, private cloud, etc. Most of the courses I've found so far teach about Amazon, Google, etc. However, I need to host my own cloud service.
Someone suggested using Nextcloud or Syncthing, but they are not what I'm looking for (according to my understanding).
1- Syncthing works off a peer-to-peer architecture rather than a client-server architecture.
2- Nextcloud, from what I gather, offers cloud storage only for myself.
I apologize for the long post, but I'm in a real bind here. Any help will be much appreciated.
Nextcloud does what you want. You can create group folders and give permissions to registered users or groups. You can share files or folders with external users. It is not only for single private users; there are NC instances with thousands of users or more.
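On your "size limit to a folder" stumbling block: platforms like Nextcloud solve this in software rather than with partitions. Each user gets a quota that is checked at upload time, which is effectively the thin provisioning you describe, since disk space is only consumed as files actually arrive. If I remember Nextcloud's occ tooling correctly, a per-user quota is set like this ('alice' is a placeholder username):

# Assign a 10 GB software quota to one user on a self-hosted
# Nextcloud instance (run from the Nextcloud install directory).
sudo -u www-data php occ user:setting alice files quota "10 GB"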

Google Drive API - Service account storage limitation - Alternatives

I'm developing a web application that's going to start with 200 GB of data to be stored. Over the years, the same application could possibly reach 1 TB, perhaps 2 TB in 5 years.
What I want from this application is for clients to upload files to the server, and for the server to then upload the files to Google Drive, persisting the webViewLink in the database. It's working this way on localhost.
I know two options for authentication for Google Drive API: client account and service account.
The Service Account option fits me better because I want the server, not the client, to have control of the files.
But a Service Account can store very little data, and its storage limit can't be increased. The limit is something around 15 GB, I guess; not sure.
If the Service Account will not help me, what options would I have to store 2 TB of data or more? Should I find another way to store the files?
I'd like to stay with Google. If there's no option using the Google Drive API, please suggest anything else for this scenario.
You have a couple of options.
Use a regular account instead of a Service Account. You will still need to pay for the storage, but it will work and you'll have everything in a single account. From your question "I want the server to have control of the files, not the client have control" I suspect you have looked at the OAuth quickstart examples and concluded that only end users can grant access. That's not the case. It's perfectly valid, and really quite simple, for your server app to grant access to an account it controls. See How do I authorise an app (web or installed) without user intervention? for how to do this.
Use multiple Service Accounts and shard your data across them. The various accounts could all share their folders to a kinda master account which would then have a coherent view of the entire corpus.
Personally I'd go with option 1 because it's the easiest to set up and manage.
Either way, make sure you understand how Google will want to charge you for the storage. For example, although each Service Account has a free quota, it is ultimately owned by the regular user that created it and the standard user quota limits and charges probably apply to that user.
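To make option 1 concrete, here is a minimal Node.js sketch using the googleapis package; the client ID/secret and refresh token are placeholders you would obtain once, via a manual OAuth consent step, for the Google account your server controls:

const { google } = require('googleapis');

// Credentials for the server-controlled account (placeholders via env vars).
const oauth2Client = new google.auth.OAuth2(process.env.CLIENT_ID, process.env.CLIENT_SECRET);
oauth2Client.setCredentials({ refresh_token: process.env.REFRESH_TOKEN });

const drive = google.drive({ version: 'v3', auth: oauth2Client });

// Upload a file stream and return the webViewLink to persist in the database.
async function uploadAndGetLink(name, mimeType, body) {
  const res = await drive.files.create({
    requestBody: { name: name },
    media: { mimeType: mimeType, body: body },
    fields: 'id, webViewLink',
  });
  return res.data.webViewLink;
}

After the one-time consent, the stored refresh token lets the server keep minting access tokens without any end user being involved.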

Response time of a Google spreadsheet with a large number of rows

I'm developing a web application that will use a google spreadsheet as a database.
This will mean parsing up to 30,000 rows (a guesstimate) in regular operations, for searching IDs etc.
I'm worried about the response times I will be looking at. Does anybody have experience with this? I don't want to waste my time on something that will hit a dead end at an issue like that.
Thanks in advance
Using spreadsheets as a database for this data set is probably not a good idea. Do you already have this spreadsheet set up?
30K rows will allow you to have only 66 columns (the per-spreadsheet cell limit divided by 30,000 rows); is that enough for you? Check the Google Docs size limits help page for more info.
Anyway, Google Spreadsheets have a "live concurrent editing" nature that makes them a much slower database than any other option. You should probably consider something else.
Do you intend to use the spreadsheet to display data, or only as a storage place?
In the second case, the relative slowness of the spreadsheet will not be an issue, since you'll only have to read its data once to get it into an array and play with that...
This implies, of course, that you build every aspect of data reading and writing in a dedicated UI and never show the spreadsheet itself. The speed will then depend only on the JavaScript engine working on arrays, the speed of the UI, and the speed of your internet connection: all 3 factors are not very fast compared to a "normal" application, but with the advantage of being easily shareable and available anywhere. :-)
That said, I have written such a database app with about 20 columns of data and 1,000 rows, and it is perfectly usable, although it has some latency even for simple next/previous requests. On the other hand, the app can send mails and create docs... the advantages of Google service integration. :-)
You could have a look at this example to see what I'm talking about.
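As a sketch of that "read once into an array" approach (Apps Script; the spreadsheet ID, sheet name, and ID column are assumptions):

function findRowById(id) {
  // One server call pulls the whole table into memory...
  const sheet = SpreadsheetApp.openById('SHEET_ID').getSheetByName('Data');
  const rows = sheet.getDataRange().getValues();
  // ...after which lookups are plain in-memory array work,
  // fast even at ~30,000 rows.
  for (let i = 0; i < rows.length; i++) {
    if (rows[i][0] === id) return rows[i]; // assumes IDs live in column A
  }
  return null;
}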