How to connect a Google Drive folder to Google Colab - google-drive-api

I need to access my data folder, but how do I do that in Colab? It works in Spyder, but I can't get it to work in Colab even though I have already mounted Google Drive.
import torch
import cv2
from data import BaseTransform, VOC_CLASSES as labelmap
from ssd import build_ssd
import imageio
[Image: Google Drive folder shown in the Colab file browser]
It would be better if I could interact with the whole Windows 1 folder.

You can connect Google Drive to Colab using:
from google.colab import drive
import os
drive.mount('/content/drive')
To access a folder, copy its path: right-click the folder in the Colab file browser --> Copy path. You can then use that path for reading and writing the files in that folder.
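For example (a sketch; the data folder and notes.txt below are just placeholder names), once the drive is mounted the copied path behaves like any local path:
import os
data_dir = '/content/drive/My Drive/data'  # path copied from the Colab file browser
print(os.listdir(data_dir))  # list the files in the folder
with open(os.path.join(data_dir, 'notes.txt'), 'w') as f:  # write a new file back to Drive
    f.write('hello from Colab\n')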

Thank you, I got the answer
I just did
cd /content/drive/Mypath
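(A bare cd works here because Colab treats it as the %cd line magic, which changes the notebook's working directory; !cd would not persist because it runs in a separate subshell.)
%cd /content/drive/Mypath  # equivalent explicit form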

Related

How to upload a file from a shared folder to Google Colab?

I used
from google.colab import drive
drive.mount('/content/drive')
But only my own drive loads; the shared folder I have access to does not appear.
I want to load a file from someone else's Google Drive into my Colab session and then use it.
How can I do that?
I have already tried passing the shared link, but I get
Mountpoint must be in a directory that exists
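A common workaround (not from this thread, just a sketch): in the Drive web UI, right-click the shared folder and choose "Add shortcut to Drive" so it appears under your own My Drive after mounting; alternatively, a file shared via link can often be fetched directly by its ID with gdown. FILE_ID below is a placeholder.
from google.colab import drive
drive.mount('/content/drive')
import os
print(os.listdir('/content/drive/My Drive'))  # the shortcut to the shared folder should show up here
# Or, for a single link-shared file, download it by its Drive file ID (placeholder):
!pip install -q gdown
!gdown https://drive.google.com/uc?id=FILE_ID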

Exporting Google Drive/Docs files to Google Cloud Storage

We need to scan files with Google DLP. However, Google DLP scanning is only supported on GCS (https://cloud.google.com/blog/products/identity-security/take-charge-of-your-data-scan-for-sensitive-data-in-just-a-few-clicks).
So I need to export the files, whose file IDs are known, to GCS and run the DLP scan there.
Does anybody know how to export files from Google Drive to GCS without manual intervention?
This might do the trick
https://colab.research.google.com/drive/1Xc8E8mKC4MBvQ6Sw6akd_X5Z1cmHSNca
# mount Google Drive into the Colab VM
from google.colab import drive
drive.mount('/content/drive')
# authenticate so gcloud and gsutil can act on your account
from google.colab import auth
auth.authenticate_user()
# point gcloud at the project that owns the target bucket
project_id = 'your-project-id'
!gcloud config set project {project_id}
!gsutil ls
# recursively copy the Drive folder into the bucket
bucket_name = 'medium_demo_bucket_190710'
!gsutil -m cp -r /content/drive/My\ Drive/Data/* gs://{bucket_name}/
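If you only have file IDs and want to avoid the Drive mount entirely, a possible sketch (file ID, bucket and project names are placeholders; Google-native Docs would need files().export instead of get_media) is to pull each file with the Drive API and push it to GCS with the Cloud Storage client:
from google.colab import auth
auth.authenticate_user()

import io
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload
from google.cloud import storage

file_id = 'YOUR_DRIVE_FILE_ID'    # placeholder
bucket_name = 'your-bucket-name'  # placeholder
project_id = 'your-project-id'    # placeholder

# download the Drive file into an in-memory buffer
drive_service = build('drive', 'v3')
request = drive_service.files().get_media(fileId=file_id)
buf = io.BytesIO()
downloader = MediaIoBaseDownload(buf, request)
done = False
while not done:
    _, done = downloader.next_chunk()

# upload the buffer to the GCS bucket
client = storage.Client(project=project_id)
bucket = client.bucket(bucket_name)
buf.seek(0)
bucket.blob('dlp-input/' + file_id).upload_from_file(buf)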

How to load data to google colaboratory for others to use without downloading the data folders?

I use 'mount drive' to access the dataset folder.
However, the Google Drive belongs only to me, so others who want to run my code on Colab have to download the dataset, upload it to their own Google Drive, and then mount that drive to run the code.
Is there any solution that lets others run my code without downloading the dataset themselves?
I made a library to do just that
!pip install kora
from kora import drive
drive.download_folder(folder_id)
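For this to work, the Drive folder generally has to be shared (anyone with the link can view), and folder_id is the long ID at the end of the folder's URL, e.g. (placeholder value):
folder_id = '1AbCdEfGhIjKlMnOpQrStUvWxYz'  # from https://drive.google.com/drive/folders/<folder_id>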

Reading gscript file with Colab

Imagine I have a gscript file sitting in a Test folder in my Google Drive. I can see it from Colab like so:
from google.colab import drive
drive.mount('/content/drive')
import os
files = os.listdir("drive/My Drive/Test")
files
['Test_app.gscript']
However, I tried some options to read Test_app.gscript, but all of them have given me OSError: [Errno 95] Operation not supported: 'drive/My Drive/Test/Test_app.gscript'
Is there a way to read this slippery gscript doc?
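A likely explanation (a guess, not from the thread): .gscript entries are Google-native Apps Script files, so the Drive mount only exposes a placeholder that cannot be opened directly, hence the Errno 95. A sketch of a workaround via the Drive API, assuming Apps Script can be exported with the MIME type application/vnd.google-apps.script+json:
from google.colab import auth
auth.authenticate_user()
from googleapiclient.discovery import build

drive_service = build('drive', 'v3')

# look up the file ID by name; the Drive name may not include the .gscript
# extension that the mount adds, so search by the base name from the question
resp = drive_service.files().list(
    q="name contains 'Test_app'",
    fields="files(id, name, mimeType)").execute()
file_id = resp['files'][0]['id']

# export the Apps Script project as JSON and print it
content = drive_service.files().export(
    fileId=file_id,
    mimeType='application/vnd.google-apps.script+json').execute()
print(content.decode('utf-8'))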

Google colab and google drive: Copy file from colab to Google Drive

There seem to be lots of ways to access a file on Google Drive from Colab but no simple way to save a file from Google Colab back to Google Drive.
For example, to access a Google Drive file from Colab, you can mount the Google Drive using
from google.colab import drive
drive.mount('/content/drive')
However, to save an output file you've generated in Colab back to Google Drive, the suggested methods seem very complicated, as in:
Upload File From Colab to Google Drive Folder
Once Google Drive is mounted, you can even view the drive files in the Table of Contents from Colab. Is there no simple way to save or copy a file created in Colab and visible in the Colab directory back to Google Drive?
Note: I don't want to save it to a local machine using something like
from google.colab import files
files.download('example.txt')
as the file is very large
After you have mounted the drive, you can just copy it there.
# mount it
from google.colab import drive
drive.mount('/content/drive')
# copy it there
!cp example.txt /content/drive/MyDrive
Other answers show how to copy a specific file; I would like to mention that you can also copy an entire directory, which is useful, for example, when copying callback logs from Colab to Drive:
from google.colab import drive
drive.mount('/content/drive')
In my case, the folder names were:
%cp -av "/content/logs/scalars/20201228-215414" "/content/drive/MyDrive/Colab Notebooks/logs/scalars/manual_add"
You can use shutil to copy or move files between Colab and Google Drive:
import shutil
# assumes the drive was mounted at /content/gdrive, so files live under "My Drive"
shutil.copy("/content/file.doc", "/content/gdrive/My Drive/file.doc")
When you are saving files, simply specify the Google Drive path under the mount point.
With large files, Colab sometimes syncs the VM and Drive asynchronously, so a file you just wrote may not appear in Drive immediately. To force the sync, run:
from google.colab import drive
drive.flush_and_unmount()
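A tiny illustration (output.txt is just a placeholder name): write straight to a path under the mount, then flush so the file is sure to land in Drive:
# write directly to the mounted Drive path
with open('/content/drive/My Drive/output.txt', 'w') as f:
    f.write('some results\n')
# force any pending writes to sync (note: this also unmounts the drive)
from google.colab import drive
drive.flush_and_unmount()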
In my case I use the common approach with the !cp command.
Sometimes it fails in Colab simply because the file path was not entered correctly.
basic code: !cp source_filepath destination_filepath
implementation code:
!cp /content/myfolder/myitem.txt /content/gdrive/MyDrive/mydrivefolder/
In addition, to get the path right, you can copy it from the file browser on the left by clicking the three-dot menu -> Copy path.
Once you see the file in the Table of Contents of Colab on the left, simply drag that file into the "/content/drive/My Drive/" directory located on the same panel. Once the file is inside your "My Drive", you will be able to see it inside your Google Drive.
After you mount your drive...
from google.colab import drive
drive.mount('/content/drive')
...just prepend the full path, including the mounted path (/content/drive) to the file you want to write.
someList = []
with open('/content/drive/My Drive/data/file.txt', 'w', encoding='utf8') as output:
    for line in someList:
        output.write(line + '\n')
In this case we save it in a folder called data located in the root of your Google Drive.
You may often run into quota limits using the gdown library.
Access denied with the following error:
Too many users have viewed or downloaded this file recently. Please
try accessing the file again later. If the file you are trying to
access is particularly large or is shared with many people, it may
take up to 24 hours to be able to view or download the file. If you
still can't access a file after 24 hours, contact your domain
administrator.
You may still be able to access the file from the browser:
https://drive.google.com/uc?id=FILE_ID
No doubt gdown is faster, but I copy my files with the command below and avoid the quota limits:
!cp /content/drive/MyDrive/Dataset/test1.zip /content/dataset