I have a directory (temp0) on Google Drive which holds 100,000 small files. I cannot open it in Google Colab, presumably because there are too many files.
So I have created a temp0.tar.gz file using Python's tarfile module, which is 489 KB, and I want to download it from my Google Drive and untar it in the Colab environment.
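For reference, the archive can be built with something along these lines (the paths are illustrative, not my exact ones):

import tarfile

# Pack the whole temp0 directory into a single gzipped tar archive.
with tarfile.open("temp0.tar.gz", "w:gz") as tar:
    tar.add("temp0", arcname="temp0")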
To download it, I have used:
!wget -O temp0.tar.gz https://drive.google.com/open?id=1lmGFLXtkvhucF033MmBW9yNiAaEgk4_d&authuser=alantjohnstone%40gmail.com&usp=drive_fs
It seems to work, reporting:
Resolving drive.google.com (drive.google.com)... 74.125.31.113, 74.125.31.102, 74.125.31.138, ...
Connecting to drive.google.com (drive.google.com)|74.125.31.113|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://accounts.google.com/ServiceLogin?service=wise&passive=1209600&continue=https://drive.google.com/open?id%3D1lmGFLXtkvhucF033MmBW9yNiAaEgk4_d&followup=https://drive.google.com/open?id%3D1lmGFLXtkvhucF033MmBW9yNiAaEgk4_d [following]
--2021-12-06 16:14:13-- https://accounts.google.com/ServiceLogin?service=wise&passive=1209600&continue=https://drive.google.com/open?id%3D1lmGFLXtkvhucF033MmBW9yNiAaEgk4_d&followup=https://drive.google.com/open?id%3D1lmGFLXtkvhucF033MmBW9yNiAaEgk4_d
Resolving accounts.google.com (accounts.google.com)... 64.233.170.84, 2607:f8b0:400c:c07::54
Connecting to accounts.google.com (accounts.google.com)|64.233.170.84|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘temp0.tar.gz’
temp0.tar.gz [ <=> ] 89.18K --.-KB/s in 0.001s
2021-12-06 16:14:13 (60.9 MB/s) - ‘temp0.tar.gz’ saved [91323]
However, it has only downloaded 90K of the 489K.
I have not been able to open the result to see what it actually is.
Could somebody tell me what I am doing wrong?
I discovered how to move a large file FROM Google Drive TO the Colab environment.
1. Get the file ID by right-clicking the file and selecting Get link.
Copy the link and change the permission to Anyone with the link.
2. In Colab, paste the link into a cell and prefix it with !gdown --id
3. NOTE: You have to remove all the garbage BEFORE AND AFTER the ID.
e.g. for https://drive.google.com/file/d/1m3NvCCyuRptopEPBXSPHaNCkfVx4E6ZR/view?usp=sharing
the ID is 1m3NvCCyuRptopEPBXSPHaNCkfVx4E6ZR
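Putting those steps together for the temp0.tar.gz above, the whole Colab cell looks roughly like this (the ID shown is just the example one, so substitute your own; the -O name is whatever you want the download called):

# ID from the example link above; replace it with your own file's ID
!gdown --id 1m3NvCCyuRptopEPBXSPHaNCkfVx4E6ZR -O temp0.tar.gz
# unpack into the current Colab working directory
!tar -xzf temp0.tar.gz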
I don't think you're able to download files directly from Google Drive like that; also, you need to put the URL in quotes, otherwise the shell treats the & characters as command separators. One solution would be to mount your Google Drive with the button on the left.
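If you'd rather do the mount in code than with the button, the usual cell is:

from google.colab import drive
drive.mount('/content/drive')

After that, the archive is available under a path like /content/drive/MyDrive/temp0.tar.gz (depending on where you saved it) and can be untarred from there without downloading anything.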
I have been trying in vain to update my website. I use the following information for a connection:
server
name + password
port
SFTP
When I save the file, I get a message that the file was uploaded successfully.
However, the website does not update.
In addition, the file on the server does not seem to correspond to the file displayed on, e.g., the homepage.
I would be happy about any answer.
Thank you in advance.
Maybe the files are cached, so even though you change them, the server might be responding with cached results. To check if that's the case, open your developer console, go to the Network tab, and refresh your page. When the data is loaded, check whether it says "cached" in the "Status" column. If so, try force-refreshing your page by pressing CTRL+F5.
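If you want to check what the server itself is sending, independent of the browser cache, here is a rough sketch using Python's requests library (the URL is a placeholder for your own page):

import requests

# Placeholder URL: use the page that looks stale in the browser.
url = "https://www.example.com/index.html"
# Ask any intermediate caches not to serve a stored copy.
resp = requests.get(url, headers={"Cache-Control": "no-cache"})

print(resp.status_code)
# Headers that reveal caching behaviour, if the server or a proxy sets them:
for name in ("Last-Modified", "ETag", "Cache-Control", "Age", "X-Cache"):
    print(f"{name}: {resp.headers.get(name)}")

If the Last-Modified date matches your upload, the file on the server is current and the stale page is coming from a cache.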
I just created a new API Management service.
I am following the tutorial instructions here.
I am copying the value
http://conferenceapi.azurewebsites.net?format=json
straight from the tutorial. Why is it not valid?
This is a false warning. Clicking Create does work.
At the end of my frustration with the same, I copied the contents of the generated swagger.json file into Notepad and saved it as swagger.json on my desktop. I skipped the validation URL in favor of the file-upload option and was able to Create.
All seems to work fine now... I don't know... I just don't know...
Followed through and it appears to be DNS related:
$ ping https://weatherdataapi157e60163e.azure-api.net
ping: https://weatherdataapi157e60163e.azure-api.net: Name or service not known
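Note that ping expects a bare hostname rather than a full URL, so a simple way to check whether the name itself resolves is something like this (using the hostname from the output above):

import socket

# Hostname taken from the ping output above.
host = "weatherdataapi157e60163e.azure-api.net"
try:
    print(host, "resolves to", socket.gethostbyname(host))
except socket.gaierror as err:
    print("DNS lookup failed:", err)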
I'm having trouble with shared links pointing to ZIP files. Even though I am still able to download the ZIP file, I get a message saying "A problem occurred with the file preview".
This message is displayed only when I am not authenticated and only with ZIP files in my case. If I am authenticated, the preview does work fine.
Any way to proceed with the download directly and not use the preview at all?
I'm working with the Chrome FileSystem API and I keep running into an issue with the number of files in a directory. It looks like there's a limit of 100 files per directory. Unfortunately I can't find any confirmation of this and was wondering if anyone else has encountered this or can confirm it?
Found the answer here: https://bugs.chromium.org/p/chromium/issues/detail?id=378883
You have to keep calling readEntries until it returns an empty result; each call only returns one batch of entries, which is why you only see the first 100.
My app creates a Drive folder, then uploads 100 files into it.
Using my app, I can read, update and delete the files individually.
However if I try to delete the folder, I get "The authenticated user may not have granted the app 698xxx995 write access to all of the children of file 0B6B-xxx".
NB: the upload is done using a server; the delete is being done in JavaScript.
I'm using the drive.file scope for my app, as recommended.
Any ideas?
Here are the steps:
1. Using the server, create a folder containing 100 files.
2. Try to delete the folder using the client; get the 403.
3. Using the same client, delete each of the 100 files in the folder; no problem, so now the folder is empty.
4. Try to delete the folder again; still get the 403 about the children.
To people with a similar error: maybe you can check your app's scope and change https://www.googleapis.com/auth/drive.file to https://www.googleapis.com/auth/drive.
It took me hours to find out what had happened.
For details: https://developers.google.com/drive/web/scopes
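As a rough sketch of what that looks like with the broader scope (placeholder credentials and folder ID, and assuming the Google API client for Python; the v2 files().trash call matches the trash requests quoted below):

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholders: credentials must be authorized with the full
# https://www.googleapis.com/auth/drive scope.
creds = Credentials(token="YOUR_ACCESS_TOKEN")
service = build("drive", "v2", credentials=creds)

# Move the folder (and its children) to the trash.
service.files().trash(fileId="YOUR_FOLDER_ID").execute()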
The OAuth 2.0 credentials you are using for server and client side should be from the same API console project.
This is a reproducible bug.
Using the code in "403 rate limit on insert sometimes succeeds":
1. create a folder containing 97 files
2. delete it (POST to the trash URL)
3. observe a 200 response; the folder and files are gone
4. create a folder containing 100 files
5. delete it (POST to the trash URL)
6. observe a 403: "The authenticated user may not have granted the app 698xxx995 write access to all of the children of file 0Bxxx"
POST when deleting a folder with 97 files:
Request URL: https://content.googleapis.com/drive/v2/files/0Bw3h_yCVtXbg/trash
Request Method: POST
Status Code: 200 OK

POST when deleting a folder with 100 files:
Request URL: https://content.googleapis.com/drive/v2/files/0Bw3h_/trash
Request Method: POST
Status Code: 403 Forbidden
I was having a similar thing happen when trying to delete folders that contained other nested folders. After a bit of experimenting, I found that I was able to prevent the error by deleting all child folders before deleting the parent folder, rather than just trying to delete the single top-level parent folder.
I haven't had this error yet when deleting folders that contain only assorted files (text files, images, etc.). Of course, the error may just pop up again in the future. I'll update this post if I start getting it again...
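In case it helps, here is a minimal sketch of that children-first workaround with the v2 Python client (placeholder credentials and folder ID; for nested folders you would recurse into each child folder the same way):

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Placeholders: real OAuth credentials and the ID of the folder to remove.
creds = Credentials(token="YOUR_ACCESS_TOKEN")
service = build("drive", "v2", credentials=creds)
FOLDER_ID = "YOUR_FOLDER_ID"

# Trash every direct child of the folder first, page by page ...
page_token = None
while True:
    children = service.children().list(
        folderId=FOLDER_ID, pageToken=page_token).execute()
    for child in children.get("items", []):
        service.files().trash(fileId=child["id"]).execute()
    page_token = children.get("nextPageToken")
    if not page_token:
        break

# ... then trash the now-empty folder itself.
service.files().trash(fileId=FOLDER_ID).execute()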