I use XAMPP for localhost. I would like to have my files on localhost synced across different computers. I figure that I could just install Google Drive into the XAMPP directory, e.g. "localhost/Google Drive" (of course I will have to do this on all computers).
Before I do so, I wonder: are there any disadvantages to doing this?
I also wonder how to get "localhost/Google Drive/some-website/index.php" to work (note the space in "Google Drive")?
The best way to do this is to use a local service that fetches your Google Drive files by URL; check the Google Drive API documentation to see how to set up this integration.
Couldn't wait.
Works like a CHARM! Did not experience any disadvantages.
I solved the "Google Drive space problem" like this:
1. Quit Google Drive.
2. Move the folder into your "htdocs" folder and rename it to e.g. "google-drive" (my location is now "C:\xampp\htdocs\google-drive").
3. Restart Google Drive (it will say that the folder could not be found).
4. Choose the option to relink to the new folder.
Since the Google Drive folder is installed at "C:\xampp\htdocs\google-drive", I still have the option to work on localhost without syncing files, e.g. files in "C:\xampp\htdocs\my-unsynced-folder".
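An alternative I have not tried (a sketch assuming XAMPP's stock Apache 2.4 and its httpd.conf) is to keep the folder name with the space and add an Alias directive that maps a space-free URL onto it:

Alias "/google-drive" "C:/xampp/htdocs/Google Drive"
<Directory "C:/xampp/htdocs/Google Drive">
    Require all granted
</Directory>

After restarting Apache, "localhost/google-drive/some-website/index.php" would serve the same files without a space in the URL.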
In Colab, the following code snippet is used for mounting Google Drive.
from google.colab import drive
drive.mount('/test', force_remount=True)
And I'm wondering if it could work on my local machine. When I run this locally, it fails with "No module named google", even after running pip install google.
Is there another package that should be installed, or can this just not be achieved? I've searched for a while, but it seems the only solution is to install Google Drive Desktop to get access to the remote files.
Although the google.colab Python library can be found here, it is a collection of tools meant to work in conjunction with the Google Colab product.
Indeed, Google Drive Desktop is your best option to "mount" your Google Drive on your local machine.
Alternatively, there are several third-party Google Drive clients available.
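If what you actually need is programmatic access rather than a mounted filesystem, a minimal sketch using the official Drive v3 API client may be enough (assumes you have created OAuth client credentials as credentials.json in the Google Cloud console and run pip install google-api-python-client google-auth-oauthlib):

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive.readonly']

# One-time browser OAuth flow; credentials.json is the downloaded client secret.
flow = InstalledAppFlow.from_client_secrets_file('credentials.json', SCOPES)
creds = flow.run_local_server(port=0)

# List a few files to confirm access.
service = build('drive', 'v3', credentials=creds)
results = service.files().list(pageSize=10, fields='files(id, name)').execute()
for f in results.get('files', []):
    print(f['name'], f['id'])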
Use google-drive-ocamlfuse.
Here are the step-by-step details: https://medium.com/@enthu.cutlet/how-to-mount-google-drive-on-linux-windows-systems-5ef4bff24288
Instead of mounting it to a home folder (named googledrive in the tutorial), I suggest mounting it so that the folder structure is the same on both Colab and your local machine (a small sketch of the payoff follows these steps). To do that:
1. Create your mount folder at the root (not recommended practice, but harmless). You need to use sudo, i.e. at /, run sudo mkdir test.
2. Then create MyDrive inside test.
3. Change the owner of test or MyDrive to yourself: sudo chown <your username> MyDrive/
4. Mount into MyDrive with: google-drive-ocamlfuse MyDrive/
Enjoy!
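As a quick illustration of why the matching layout helps, here is a minimal sketch (my own, assuming the /test/MyDrive mount above) that runs unchanged in Colab and locally:

import os

DRIVE_ROOT = '/test/MyDrive'

# Locally, the google-drive-ocamlfuse mount is assumed to be in place already;
# in Colab, fall back to the usual mount call.
if not os.path.isdir(DRIVE_ROOT):
    from google.colab import drive
    drive.mount('/test', force_remount=True)

print(os.listdir(DRIVE_ROOT))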
I'm working on a project that I have stored on my Google Drive mount on Windows, and I would like to use Linux for portions of that project. The Windows Subsystem for Linux has served me well for most of my projects, but I've never had the need to mount a network drive. While it's not imperative that I use my Google Drive mount for this project (I could easily place it in my /downloads or /documents folder), I was curious as to how I could access my Google Drive from WSL.
I attempted to create a new mount via:
sudo mkdir /mnt/googledrive
This successfully created the directory, and then I used the command:
sudo mount -t drvfs G: /mnt/googledrive
This too seemed to be successful.
I was able to cd to the /mnt/googledrive directory, but I couldn't access any of my files (it reported that the '.' location was unavailable).
Perhaps I've simply misunderstood what I was doing?
Any help would be greatly appreciated!
I found a workaround using not the "Google Drive" application but "Backup and Sync" for individuals (https://www.google.com/drive/download/).
Basically it does the same thing for me, just in a different way. Backup and Sync lets you back up local folders to Google, but also sync your Google Drive locally.
If you choose to sync your Drive locally, you can even select specific folders; the files are synced to the C: drive under your user profile, at the same level as your "My Documents" folder.
That way, you can access your files from Linux via the working /mnt/c/... path (with default settings, something like /mnt/c/Users/<username>/Google Drive).
If this answer is too late for you, it might still be in time for others ;-)
Is it possible to copy a file from Google Drive to Google Cloud Storage? I imagine it would be very fast since both are on a similar storage system.
I haven't seen any information on ways to do this seamlessly, without having to download the file (probably to a system outside Google) and then re-upload it. About one year ago, a similar question was asked: Transfer files from dropbox/drive to Google cloud storage.
Is there a way to do this directly, with the file staying within-Google the entire time?
@frunkad commented a link to a nice workaround, but for completeness I will repeat it here, as it is currently the top search result.
You can open a Colab notebook (a Jupyter notebook on Google servers), mount your Google Drive, and use the gcloud/gsutil CLI to copy files.
1. Open https://colab.research.google.com/ and connect to a runtime.
2. Mount your Drive by executing the code below, clicking on the link, and pasting the authentication key:
from google.colab import drive
drive.mount('/content/drive')
3. Connect to your Google Cloud Storage (the project ID can be found in the Cloud console):
from google.colab import auth
auth.authenticate_user()
project_id = 'your-project-id'
!gcloud config set project {project_id}
!gsutil ls
4. Copy files using gsutil. Use the -m flag for multi-threading to increase speed. (Note that the mounted drive contains a subfolder called "My Drive" that you have to include in the path.)
bucket_name = 'your_bucket_name'
!gsutil -m cp -r /content/drive/My\ Drive/Your-Data/* gs://{bucket_name}/
Original author: Philip Lies.
Link to his colab: https://colab.research.google.com/drive/1Xc8E8mKC4MBvQ6Sw6akd_X5Z1cmHSNca
There's no direct way to do this. You could write an App Engine or Compute Engine app that reads from Drive using its API and writes to GCS using its API.
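As a rough sketch of that approach (the function and names below are mine, not an official recipe; assumes google-api-python-client and google-cloud-storage are installed and Drive credentials have already been obtained):

import io

from google.cloud import storage
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

def copy_drive_file_to_gcs(drive_creds, file_id, bucket_name, dest_name):
    # Download the Drive file into memory via the Drive v3 API...
    drive = build('drive', 'v3', credentials=drive_creds)
    request = drive.files().get_media(fileId=file_id)
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()
    buf.seek(0)
    # ...then upload it to the GCS bucket. The bytes still pass through the
    # machine running this code, but they stay on Google's network when it
    # runs on App Engine or Compute Engine.
    storage.Client().bucket(bucket_name).blob(dest_name).upload_from_file(buf)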
We are planning to open a company account on Google Drive which will be accessible only to people in the company.
The issue is that we want to put several files on our Drive and download them programmatically. We tried using the Google Drive APIs, but the download speed is very low.
We also tried wget, but that requires all the files to be made public, which we cannot do.
Is there any way to use wget with credentials, allowing a file to be downloaded via a URL?
Our typical file size is 50GB.
wget actually has options to specify a user and password. Have you tried the following?
wget --user='username' --ask-password https://docs.google.com/'ItemType'/Export='DocumentId'
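Note that wget's --user/--password options send HTTP basic auth, which Google Drive will not accept for private files. A hedged alternative is to script the download with OAuth credentials via the Drive API (the key file name and file ID below are placeholders; assumes a service account that has been granted access to the files, plus the google-auth and requests packages):

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

creds = service_account.Credentials.from_service_account_file(
    'service-account-key.json',  # placeholder key file
    scopes=['https://www.googleapis.com/auth/drive.readonly'])
session = AuthorizedSession(creds)

file_id = 'your-file-id'  # placeholder
url = f'https://www.googleapis.com/drive/v3/files/{file_id}?alt=media'

# Stream to disk so a 50 GB file never has to fit in memory.
with session.get(url, stream=True) as resp, open('output.bin', 'wb') as fh:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=1 << 20):
        fh.write(chunk)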
I've built a nice browsing window which shows all of the PDF files on my (or any user's) Google Drive for management purposes.
What I'm looking to do is simple: I want to take a PDF file from my Google Drive (I have all the info related to this file: "downloadUrl", "webContentLink", etc.) and just copy it to my (remote) server.
Any thoughts?
I guess I'm pretty late here, but this may help other people too.
You could try using Grive. Here's a straightforward tutorial: http://xmodulo.com/2013/05/how-to-sync-google-drive-from-the-command-line-on-linux.html
Even if you don't have root access on the server, you can simply build it from source and then:
$ mkdir ~/google_drive
$ cd ~/google_drive
$ grive -a
You'll receive an auth URL which you need to open in your browser; click "Allow Access" and you're done. Go to the google_drive directory on your server and run grive to sync between that directory and your Google Drive.