How to make a virtual environment on Google Colab that can be saved on Drive? - google-drive-api

How can I build a virtual environment whose libraries are preserved on Google Drive and reused on Google Colab, without having to reinstall them each time?
I tried cloning repos from GitHub and putting them in Drive, but that doesn't work for other dependencies like OpenCV, pandas, NumPy ...
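One common workaround (a sketch, not an official feature): install the packages into a folder on your Drive once, then add that folder to sys.path at the start of every later session. The colab_libs path below is a hypothetical example; any Drive folder works.
from google.colab import drive
drive.mount('/content/drive')

# One-time install into a Drive folder (hypothetical path) instead of the ephemeral VM:
!pip install --target=/content/drive/MyDrive/colab_libs opencv-python pandas numpy

# In every later session, mount Drive again and put the folder on sys.path:
import sys
sys.path.insert(0, '/content/drive/MyDrive/colab_libs')
import cv2, pandas, numpy  # now resolved from Drive, with no reinstall
Note that importing from Drive is slower than from local disk, and the packages only keep working while the Colab runtime's Python version matches the one they were installed under.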

Related

Is there a way to mount Google Drive on my local machine, as can be done in Colab?

In Colab, the following code snippet is used for mounting Google Drive.
from google.colab import drive
drive.mount('/test', force_remount=True)
I'm wondering whether it could work on my local machine too. When I run it locally, it fails with "no module named google", even after executing pip install google.
Is there another package that should be installed, or can this simply not be done? I've searched for a while, but it seems the only solution is to install Google Drive Desktop to get access to the remote files.
Although a google.colab Python library does exist, it is a collection of tools meant to work in conjunction with the Google Colab product.
Indeed, Google Drive Desktop is your best option to "mount" your Google Drive to your local machine.
Alternatively, there are several 3rd party Google Drive clients available.
Use google-drive-ocamlfuse.
Here are the step-by-step details: https://medium.com/@enthu.cutlet/how-to-mount-google-drive-on-linux-windows-systems-5ef4bff24288
Instead of mounting it to a home folder (named googledrive in the tutorial), I suggest mounting it so that the folder structure is the same on both Colab and your local machine. To do that:
Create your mount folder at the root (not recommended practice, but there is no harm in it). You need to use sudo: at /, run sudo mkdir test
Then create MyDrive inside test.
Change the owner of test or MyDrive to yourself: sudo chown <your username> MyDrive/
Mount to MyDrive with: google-drive-ocamlfuse MyDrive/
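Put together, the sequence looks like this (alice is a hypothetical username; substitute your own):
sudo mkdir -p /test/MyDrive            # mount point at the root, mirroring Colab's /test/MyDrive
sudo chown alice /test/MyDrive         # hypothetical username; use your own
google-drive-ocamlfuse /test/MyDrive/  # the first run asks you to authorize Drive access in a browser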
Enjoy!

How to load data to google colaboratory for others to use without downloading the data folders?

I use 'mount drive' to access the dataset folder.
However, the Google Drive belongs only to me, so others who want to run my code on Colab have to download the dataset, upload it to their own Google Drive, and then mount that drive to run the code.
Is there any solution that lets others run my code without downloading the dataset?
I made a library to do just that:
!pip install kora
from kora import drive
drive.download_folder(folder_id)
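A short usage note: folder_id is the ID from the Drive folder's sharing link, and the folder has to be shared ("anyone with the link can view") so others can read it. For example (illustrative ID, not a real folder):
# For a link like https://drive.google.com/drive/folders/1aBcD2eFgH3iJkLmN
# the folder_id is the final path segment:
drive.download_folder('1aBcD2eFgH3iJkLmN')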

Copy a file from Google Drive to Google Cloud Storage within Google

Is it possible to copy a file from Google Drive to Google Cloud Storage? I imagine it would be very fast since both are on a similar storage system.
I haven't seen any information on a way to do this seamlessly, without having to download the file (probably to an out-of-Google system) and then re-upload it. About one year ago, a similar question was asked: Transfer files from dropbox/drive to Google cloud storage.
Is there a way to do this directly, with the file staying within-Google the entire time?
@frunkad commented with a link to a nice workaround, but for completeness I will restate it here, as it is currently the top result in search.
You can open a Colab (a Jupyter notebook on Google servers), mount your gDrive, and use the gcloud CLI to copy files.
Open https://colab.research.google.com/ and connect to a runtime.
Mount your Drive by executing the code below, clicking the link, and pasting the authentication key.
from google.colab import drive
drive.mount('/content/drive')
Connect to your Google Cloud Storage. (Your project ID can be found in the Google Cloud Console.)
from google.colab import auth
auth.authenticate_user()
project_id = 'your-project-id'
!gcloud config set project {project_id}
!gsutil ls
Copy files using gsutil. Use the -m flag for multi-threaded copying to increase speed. (Note that your mounted drive contains a subfolder called "My Drive" that you have to include in the path.)
bucket_name = 'your_bucket_name'
!gsutil -m cp -r /content/drive/My\ Drive/Your-Data/* gs://{bucket_name}/
Original author: Philip Lies
Link to his colab: https://colab.research.google.com/drive/1Xc8E8mKC4MBvQ6Sw6akd_X5Z1cmHSNca
There's no direct way to do this. You could write an App Engine or Compute Engine app that reads from Drive using its API and writes to GCS using its API.
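For completeness, here is a minimal sketch of that approach in Python (assuming the app runs with credentials that carry both a Drive read scope and a Storage write scope; FILE_ID and BUCKET are placeholders). The bytes stream through the app's memory, so they stay within Google's network as long as the app itself runs on App Engine or Compute Engine:
import io
from googleapiclient.discovery import build          # google-api-python-client
from googleapiclient.http import MediaIoBaseDownload
from google.cloud import storage                     # google-cloud-storage

FILE_ID = 'your-drive-file-id'   # placeholder
BUCKET = 'your-bucket-name'      # placeholder

# Read the file out of Drive via the Drive v3 API.
drive_service = build('drive', 'v3')
request = drive_service.files().get_media(fileId=FILE_ID)
buf = io.BytesIO()
downloader = MediaIoBaseDownload(buf, request)
done = False
while not done:
    _, done = downloader.next_chunk()
buf.seek(0)

# Write the bytes into Cloud Storage under the file's original name.
name = drive_service.files().get(fileId=FILE_ID, fields='name').execute()['name']
storage.Client().bucket(BUCKET).blob(name).upload_from_file(buf)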

Google Cloud Functions: How do you share source code?

I have a Node server with multiple controllers that perform DB operations, and helpers (for e-mail, for example) within that directory.
I'd like to use source from that directory within my functions. Assuming the following directory structure:
src/
  server/
    app/controllers/email_helper.js
  fns/
    send-confirm/
What's the best way to use email_helper within the send-confirm function?
I've tried:
Symbolically linking the 'server' directory
Adding a local repo to send-confirm/package.json
Neither of the above works.
In principle, your Cloud Functions can use any other Node.js module, the same way any standard Node.js server would. However, since Cloud Functions needs to build your module in the cloud, it needs to be able to locate those dependency modules from the cloud. This is where the issue lies.
Cloud Functions can load modules from any one of these places:
Any public npm repository.
Any web-visible URL.
Anywhere in the functions/ directory that firebase init generates for you, and which gets uploaded on firebase deploy.
In your case, from the perspective of functions/package.json, the ../server/ directory doesn't fall under any of those categories, and so Cloud Functions can't use your module. Unfortunately, firebase deploy doesn't follow symlinks, which is why that solution doesn't work.
I see two possible immediate fixes:
Move your server/ directory to be under functions/. I realize this isn't the prettiest directory layout, but it's the easiest fix while hacking. In functions/package.json you can then have a local dependency on ./server (see the sketch after this list).
Expose your code behind a URL somewhere. For example, you could package up a .tar and put that on Google Drive, or on Firebase Cloud Storage. Alternatively, you can use a public git repository.
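For reference, both fixes end up as ordinary npm dependency specifiers in functions/package.json. A sketch with placeholder names, where file: points at the server/ directory after moving it under functions/, and git+https: points at a public repository:
{
  "dependencies": {
    "server": "file:./server",
    "email-helper": "git+https://github.com/your-org/email-helper.git"
  }
}
Because firebase deploy uploads the whole functions/ directory, the file: path can be resolved when the module is built in the cloud.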
In the future, I'd love it if firebase deploy followed symlinks. I've filed a feature request for that in Firebase's internal bug tracker.

Syncing network folder with Google Drive

I am not a developer, but I was looking into Google Drive for my business. I asked whether it could sync a folder on a networked external hard drive, and I was instructed to write an app. I run a business; I am not a developer. Does anyone have any suggestions?
You can try something like gdrive_sync
You don't have to write any code, but you will need to install Python, download the program, and do some typing in a command-line window.
After you download and install Python, download gdrive_sync and extract it to a directory with 7-Zip. Then open a command line, go to the directory you extracted it to, and type
python setup.py install
to install the program. Then follow the instructions under the "Usage" section of the link above to synchronize your drives with Google Drive. It's a powerful and free way to use Google Drive if you are willing to spend a bit of time on it.