googleapi: 403: The download quota for this file has been exceeded. Creds are good, file is new

Please note, this issue seems unique from other "quota exceeded" issues I've seen. It's not bad creds, and it's not a file that has been downloaded a lot.
2020/02/21 15:44:49 ERROR : test.mkv: Failed to copy: multpart copy: failed to open source: open file failed: googleapi: Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded
I'm only receiving this error through the API.
Other files in this shared drive are accessible with the same credentials.
This doesn't affect the file for other users (i.e., I tested with an actual Google account [rather than a Service Account] and was able to download the file via the UI).
So what could be going on? I'm used to 403s with Google APIs when the user doesn't have permission, and have even seen this "reason" in a scenario where I just hadn't granted permissions yet. But in this case, the permissions seem just fine [they're all set at the Drive level, anyway].
Important context:
The files that are having this problem were copied from... public sources. Those files likely have exceeded their download quota. But these are copies that I made into my own drive.
To be more specific, I have copied files into my Drive via the web UI before, and those files have never presented an issue.
However, the new files that are having issues were copied via the Drive API. They say they're owned by my Service Account, and I can't find ANY linkage back to the original file they were copied from. Just like when I copy them by hand, they have their own file IDs, different from the ones they were copied from.
Things I'm suspicious of:
Maybe the copy performed via the UI, and the copy performed by the API are somehow different?
Somehow my "copy" of the file is more strongly linked to the original file I copied from originally?
Otherwise, I'm out of ideas. Any suggestions would be highly appreciated.
EDIT:
Manual copy method: find the file, add a star to it, navigate to the Drive web UI, go to Starred items, right-click the file, and choose "Make a Copy".
New API method: use Rust to do this:
use std::collections::HashMap;

// Build the Drive v3 files.copy request for the source file.
let url = format!("https://www.googleapis.com/drive/v3/files/{}/copy", file_id);
let auth_hdr_val = format!("Bearer {}", token.as_str());

// Both flags are needed for the call to work against Shared Drives.
let mut query_params = HashMap::new();
query_params.insert("supportsAllDrives", true);
query_params.insert("supportsTeamDrives", true);

// The body sets the parent folder for the copy (here, a directory on a Shared Drive).
let mut body = HashMap::new();
body.insert("parents", [destination_parent_dir]);

// Assumes the async reqwest client, inside an async fn that returns a Result.
client
    .post(&url)
    .query(&query_params)
    .header(reqwest::header::AUTHORIZATION, auth_hdr_val)
    .json(&body)
    .send()
    .await?;
Nothing too wild: just calling the Drive v3 copy API with a parent directory specified that lives on a Shared Drive. (It also seems to happen if I copy via the API to My Drive first, then manually move the copy to the Shared Drive.)

Google Drive has undocumented API limits. It appears that one of them is a per-user, per-file limitation.
I switched the service account that I was using to download the particularly problematic files, and they downloaded successfully.
It appears that the "new" method was not the problem, but rather that another service was indexing "new" files in Google Drive and bumped up against this limit. The service account used for downloading was shared with the one indexing.
So, avoid hitting these limits, or use different service accounts for different roles, etc.
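For what it's worth, here is a minimal sketch of the "separate service account per role" idea: a downloader that uses its own key file, so its per-file quota isn't shared with whatever service is indexing the drive. This is Python with google-api-python-client and google-auth; the key-file path and function name are placeholders, not anything from the original setup.

import io

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

# Hypothetical key file for a service account used ONLY for downloads,
# kept separate from the account that indexes/scans files.
DOWNLOADER_KEY_FILE = "downloader-sa.json"
SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]

creds = service_account.Credentials.from_service_account_file(
    DOWNLOADER_KEY_FILE, scopes=SCOPES)
drive = build("drive", "v3", credentials=creds)

def download_file(file_id: str, dest_path: str) -> None:
    """Stream a Drive file to dest_path using the dedicated downloader account."""
    request = drive.files().get_media(fileId=file_id, supportsAllDrives=True)
    with io.FileIO(dest_path, "wb") as fh:
        downloader = MediaIoBaseDownload(fh, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()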

I know you found a workaround for your issue, but it seems the issue you were having was previously reported by other people on the Issue Tracker:
"downloadQuotaExceeded": The download quota for this file has been exceeded..
Receiving downloadQuotaExceeded error in python client.
You can hit the ☆ next to the issue number in the top left of those pages, as it lets Google know more people are encountering this, making it more likely to be addressed sooner.

Related

Why can't I see the files my code has uploaded to Google Drive?

I have code that uses the Google Drive v3 SDK.
I have a regular (free) Gmail account, and have created a service account. I created a folder in my Google Drive space and shared it with the service account. Lots of kudos to Linda Lawton, whose tutorials are waaay better than the Google docs!
My code uses the Id of the shared folder as its root, and can upload files to the folder and download them again. However, if I look in Google Drive in my browser when signed in with my regular account, the shared folder is empty.
When I upload files, I set a permission of anyone for the Reader role, so that shouldn't be the issue.
Can anyone explain why I can't see the files? I know that the service account has its own GD space, but I'm using the shared folder, which is in my GD space, not the service account's, so I expected to see the files.
Update: I'm pretty sure the problem is that the permissions are not being set correctly on the files. I'm doing the following...
// Grant read access to anyone for the uploaded file, then execute the request.
Permission permission = new() { Role = "reader", Type = "anyone" };
await _service.Permissions.Create(permission, fileId).ExecuteAsync();
Note that this uses the .NET SDK, but I don't think that should make a lot of difference, the code is almost identical in (say) Java.
I added a call to query the permissions on a file that my script had uploaded, and got the following two...
Type: anyone, Role: reader, Kind: drive#permission, Email:
Type: user, Role: owner, Kind: drive#permission, Email:
The first permission comes from the code shown above. The second one is from when I tried setting a permission for the Gmail account. It seems to have set the permission, but not stored the email address.
Either way, I would have expected that the first permission would have been enough to allow the Gmail account to see them from a browser.
Thanks
Well, after many hours of banging my head against the screen, I finally realised that when I created the top-level folder, I forgot to specify the shared folder, so my code was creating its folders and files in the service account's drive space, not in the shared folder.
I set the correct parent folder, and all works fine now! Don't know if this will ever help anyone, but I'll leave the question and this answer in case it does.
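In case it helps anyone hitting the same thing, the key point is passing the shared folder's ID as the parent when creating folders and files. A minimal sketch of that idea (in Python rather than the .NET SDK used above, with a placeholder key file and folder ID):

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder key file; the scope allows full Drive access.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/drive"])
drive = build("drive", "v3", credentials=creds)

# Hypothetical ID of the folder shared from the regular Gmail account.
SHARED_FOLDER_ID = "1AbC..."

# Without "parents", this folder is created in the service account's own
# Drive space (invisible in the browser); with it, it lands inside the
# shared folder and shows up for the regular account.
folder = drive.files().create(
    body={
        "name": "uploads",
        "mimeType": "application/vnd.google-apps.folder",
        "parents": [SHARED_FOLDER_ID],
    },
    fields="id",
).execute()
print(folder["id"])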

requests: Get last modified time of Google Doc or Sheet?

I want to download a Google Sheet (and/or Doc, or Colab Notebook) from an "Anyone can View" sharing URL, if the file is newer than my local copy. To do that, I need to find out when the remote file was last modified. Which I thought shouldn't be hard.
There are threads explaining how to do this for regular files on websites that send the HTTP Last-Modified header, but Google doesn't provide this field in its headers. It does provide a Date: header, but that's just the time of the download, which changes on every request.
I see threads about doing this from within the Doc or Sheet itself. My question is not about that. I'm talking about getting the info remotely by running a python script on my local machine.
I see a thread about using the Google Drive API v3, but... is it really necessary to go through all that (e.g. install OAuth libraries, register an API key, etc.; effectively create an entire Google app *) just to find out when a publicly-available file was last modified? Is there an easier way?
Thanks!
EDIT: * I started down the road of Google Drive API but I find it confusing and overwhelming. It's like they think I'm trying to create an app for general users for the Android Store, instead of just myself. (??)
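For reference, the Drive API route mentioned in the question boils down to a single metadata request once you have some credential. A minimal sketch (Python with requests; the file ID and API key are placeholders, and it assumes the file really is shared as "anyone can view", in which case an API key from a Cloud project with the Drive API enabled should be enough, with no OAuth flow needed):

import requests

FILE_ID = "your-file-id"   # taken from the sharing URL (placeholder)
API_KEY = "your-api-key"   # API key from a Google Cloud project (placeholder)

# files.get with fields=modifiedTime returns just the timestamp.
resp = requests.get(
    f"https://www.googleapis.com/drive/v3/files/{FILE_ID}",
    params={"fields": "modifiedTime", "key": API_KEY},
)
resp.raise_for_status()
print(resp.json()["modifiedTime"])  # e.g. "2023-05-01T12:34:56.000Z"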

Library with identifier tmplib is missing (perhaps it was deleted, or maybe you don't have read access?)

I am using the legacy free edition of G Suite to run my web application. What I want to achieve is to force users to log in, so I deployed the web application with these settings:
Execute as: User accessing the web app
Who has access: Anyone with Google account
Now I am able to identify the user accessing the web application; I get their email from Session.getActiveUser().getEmail(). Perfect. Then I want to save all their "data" in 'their own' spreadsheet. When I save a file using DriveApp.createFile(fileName, 'Hello, world!'), the file is saved to their Google Drive. I want all the application data to be on my Drive.
So I created a new project, deployed it as a library, and passed it the email because it will be the file name. It works if I access the web app as myself, the owner of both projects. If I access it as someone else, I get this error: We're sorry, a server error occurred while reading from storage. Error code NOT_FOUND.
Then I deployed the library as a web application with these settings:
Execute as: Me
Who has access: Anyone or Anyone with Google account
Again, if I access the web app as the owner it works, but anyone else gets the error: Library with identifier tmplib is missing (perhaps it was deleted, or maybe you don't have read access?)
Also, there is no log, just the error. In fact, there are no logs from the library at all; I tried both Logger.log() and console.log().
I also tried adding the library from the old editor, since an error reported in the past might cause this kind of behaviour.
Is there any way to use a web app and a library in the intended way so that the file is always saved as me?
UPDATE
If I share the library using the Share button with Anyone as Editor, then the web user has to grant the web application permission to access their Drive, and the file is saved in their Drive. I want the file to ALWAYS be saved in my Drive.
This appears to be a bug!
There is already a report on Google's Issue Tracker which details the same kind of behaviour:
Error when including library
You can also hit the ☆ next to the issue number in the top left of the aforementioned page, which lets Google know more people are encountering this, making it more likely to be addressed sooner.

Can download files (without file name or extension), but unable to see them?

I'm using the Google Drive API to generate a downloadable link for a file that looks like this:
https://www.googleapis.com/drive/v3/files/{file_id}?access_token={access_token}&alt=media
The problem is that the files are downloaded without a name and without an extension.
If I add the extension manually after the file downloads, it still works, but it's bad to tell my users they have to do that every time, and the user would have to know which extension the file actually has.
If I call https://www.googleapis.com/drive/v3/files/?access_token={access_token}
I can't see the files I've uploaded... I think it could be a problem related to access to the files' metadata, but I did authorize the auth.files.metadata scope.
I generate the access token using the refresh token, and I obtained the refresh token from the OAuth Playground. I'm confident I used the proper credentials for my application, but I could triple-check if someone suggests this could be the problem.
What can I do to debug this?
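One way to debug this, and to save downloads under their real names, is to ask for the file's metadata first and then fetch the content: if the metadata call fails, the problem is scopes/permissions rather than the download itself. A minimal sketch (Python with requests; the file ID and token are placeholders, and the token is sent in an Authorization header rather than the access_token query parameter):

import requests

FILE_ID = "your-file-id"        # placeholder
ACCESS_TOKEN = "your-token"     # placeholder
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1) Fetch metadata to learn the real file name (and confirm metadata access works).
meta = requests.get(
    f"https://www.googleapis.com/drive/v3/files/{FILE_ID}",
    params={"fields": "name,mimeType"},
    headers=headers,
)
meta.raise_for_status()
name = meta.json()["name"]

# 2) Download the content with alt=media and save it under that name,
#    so the extension comes along automatically.
content = requests.get(
    f"https://www.googleapis.com/drive/v3/files/{FILE_ID}",
    params={"alt": "media"},
    headers=headers,
)
content.raise_for_status()
with open(name, "wb") as fh:
    fh.write(content.content)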

Can chrome extension modify a file in our hard drive?

I am making a Chrome extension which needs to add/delete/modify files in any location on the hard drive; the location can be a temporary folder. How can I do this? Please give comments and helpful links that can help me get this done.
You cannot, but adding a local server (nodejs/deno/cs-script/go/python/lua/...) with fixed logic (for security) that does the file operations and exposes an HTTP endpoint to answer the extension's AJAX/JSONP requests would work.
The extension will not be able to install the software part.
edit: if you want to get started using nodejs, this could help
edit2: With File and Directory Entries API (this could help) you can get hold of a FILE OR complete FOLDER (getDirectory(), showDirectoryPicker()).
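To make the local-server suggestion above concrete, here is a minimal sketch in Python (the port, directory, and JSON shape are all arbitrary assumptions; a real version would also need CORS headers and some authentication so only the extension can talk to it):

import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

# The only directory this helper will ever touch.
ALLOWED_DIR = (Path.home() / "extension-data").resolve()
ALLOWED_DIR.mkdir(exist_ok=True)

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expects a JSON body like {"name": "notes.txt", "content": "..."}.
        length = int(self.headers.get("Content-Length", 0))
        data = json.loads(self.rfile.read(length))
        target = (ALLOWED_DIR / data["name"]).resolve()
        if ALLOWED_DIR not in target.parents:  # refuse paths outside the sandbox
            self.send_error(403, "path outside allowed directory")
            return
        target.write_text(data["content"])
        self.send_response(200)
        self.end_headers()

# The extension would call this with fetch("http://127.0.0.1:8765", { method: "POST", ... }).
HTTPServer(("127.0.0.1", 8765), Handler).serve_forever()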
Thankfully, this is impossible.
Google, or any other company, wouldn't have many friends if installing their extensions could compromise any file on your hard drive (i.e., give control over the machine). An extension can save information to disk in a location set aside for local storage, as mentioned. It will not have execute permission on the root or anywhere else, nor read or write permission outside that storage location.
However, extensions can still be malicious if they gather information from a user of a web page (I am sure Google can filter out some suspicious extensions).
If you really need to make changes on your hard drive, you can store the information on a server and poll for changes with a Windows client application, or perhaps find where the extension's storage is kept on disk and access it from there with a Windows app.