Mirror synchronization of local folders with google drive - google-drive-api

I want to synchronize some local folders from my desktop to my Google Drive account. I should mention that I have more than 2 million files totaling 1 TB, with individual files ranging from 1 byte to 100 GB (a zip archive).
Drive for desktop, Google's own synchronization application, takes forever: every time the app is opened it re-checks all the files, and given how many files I have, you can imagine how long that takes. Additionally, I have the impression that only 3 files can be uploaded simultaneously with this "Google Drive for desktop" app.
I am looking for an alternative that would let me back up my local folder in a "mirror" fashion: when modifications are made in the local folder on my computer, they are pushed to my Google Drive in real time.
Do you know of free software I could use that would not spend ages in an endless checking loop before synchronizing my files?
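To make the startup cost concrete: a sync client with no change journal has to stat every file on each launch. A minimal stdlib sketch of that kind of snapshot-and-compare scan (the function names and structure are illustrative, not how Drive for desktop actually works):

```python
# Sketch of incremental change detection with the stdlib only: keep a
# snapshot of (path -> mtime, size) and report files that changed since
# the last scan. Repeating this over millions of files is exactly why
# every startup of a journal-less sync client is slow.
import os

def snapshot(root):
    """Walk root and record (mtime_ns, size) for every file."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            p = os.path.join(dirpath, name)
            st = os.stat(p)
            state[p] = (st.st_mtime_ns, st.st_size)
    return state

def changed(old, new):
    """Paths whose signature differs from (or is absent in) the old snapshot."""
    return [p for p, sig in new.items() if old.get(p) != sig]
```

An event-driven tool (inotify on Linux, FSEvents on macOS) avoids this full rescan, which is what you want for real-time mirroring.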

I use Viper FTP for Mac for a similar task. There you can define what they call an "observed folder": a folder that ViperFTP watches, and whenever modifications are detected, new or modified files are uploaded to the defined server(s). The app also supports Google Drive.

Related

How can I access the files I created in my Google Cloud project?

I have spent a lot of time creating a project in https://console.cloud.google.com, enabling the Drive API, creating service account credentials, and finally writing a small NodeJS integration that lets me read and write files on Google Drive.
My intent is to store files (organized in folders) on Google Drive from my server, and to view them in the classic Google Drive app with my Google account to check that everything is fine.
My project seems correctly set up and I was able to create files from my NodeJS program (the files exist; I can list them with the same program), but I can't see the files anywhere in Google Drive when signed in with the Google account I used to create the project.
I was expecting this to be extremely simple, and that I would have an out-of-the-box Drive UI allowing me to review the changes.
The documentation says I can configure a UI integration, but I don't know whether that is what I'm looking for. It seems complicated, talking about my "app" and so on, while I just want a plain Google Drive UI for it!
Could anyone help me understand all this ?
Thanks
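For what it's worth, a common cause of this symptom: files created with service account credentials live in the service account's own Drive, not yours, so they never appear in your personal UI. One fix is to have the service account share the target folder with your personal account, after which it appears under "Shared with me". A hedged sketch (in Python rather than NodeJS; the email address and folder ID are placeholders, and the actual API call is left as a comment because it needs authenticated credentials):

```python
# Sketch: grant your personal account access to a folder the service
# account owns, so it shows up under "Shared with me" in your Drive UI.
# Drive v2-style permission body; "value" holds the email address.
permission = {
    "type": "user",
    "role": "writer",
    "value": "you@example.com",  # placeholder: your personal account
}

# With an authenticated google-api-python-client service object this would
# be roughly (not executed here):
#   service.permissions().insert(fileId=FOLDER_ID, body=permission).execute()
print(permission)
```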

Getting files from Google Drive "Computers" into Google Colab

I need to make use of Google Colab's GPUs, but I also need to constantly upload new files and make slight adjustments to others, so I used Google's Backup and Sync tool to automatically stream a folder from my local machine into Google Drive, so that new, updated, and deleted files are picked up automatically. The problem is that I can't figure out how to get that data from the computer into Colab.
Most solutions I've seen on Stack Overflow use:
from google.colab import drive
drive.mount("/content/gdrive",force_remount=True)
The problem is that after doing this, /content/gdrive contains only the folder "My Drive", whereas the files from my computer are saved in a different area, "Computers/My Computer/". These files therefore aren't accessible with this method. Is there a way in Colab to access the content in Computers/My Computer/?
The only other solutions I have seen have some code inside of Colab allowing you to directly upload files, which doesn't suit my purposes since I don't want to have to manually upload files every time.
In Colab, you might not be able to access any folder or files other than those under "/content/drive/My Drive". For example, you cannot access files under '/content/drive/Computers/My Laptop/Project_R_Py/lib_py'.
You can simply go to Google Drive on the web, right-click that folder ("lib_py"), and choose "Add shortcut to Drive". After that you should be able to access the folder by specifying "/content/drive/My Drive/Project_R_Py/lib_py".
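A minimal sketch of what that answer describes, assuming the shortcut has already been created in the Drive web UI; drive.mount only works inside Colab, so it is guarded here:

```python
# Sketch: reach a "Computers" folder in Colab through a Drive shortcut.
# Assumes you right-clicked the folder in the Drive web UI and chose
# "Add shortcut to Drive", placing a shortcut under My Drive.
import os

MOUNT_POINT = "/content/drive"
# Path through the shortcut, not through Computers/ (which is not exposed):
shortcut_path = os.path.join(MOUNT_POINT, "My Drive", "Project_R_Py", "lib_py")

try:
    from google.colab import drive  # only available inside Colab
    drive.mount(MOUNT_POINT, force_remount=True)
    print(os.listdir(shortcut_path))
except ImportError:
    print("Not running in Colab; the path would be:", shortcut_path)
```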
Hope this helps

Moving Google Drive Files to Another Computer Without Redownloading All of the Contents

I'm currently using Google Drive on my laptop. I'm getting a new laptop soon and want to use the same Google Drive account on this new laptop.
I know that I can install Google Drive and download all the files from the cloud, but I am concerned about my data usage since I'm living in India (my data gets capped at 20 GB every month).
I wanted to move all of the files in my current Google Drive folder manually to the new computer. Will Google Drive automatically recognize that the files are the same and link them, or will it download a new set from the cloud and mark them with (1) beside each file?
Is there any other solution that I am not considering that will allow me to move my files locally and not have to re-download / re-upload all of the existing files?
I know it's been 3 years, but for the sake of eternal internet knowledge, Michael Richardson's article might help you and maybe others that stumble here:
http://rainabba.blogspot.co.il/2013/07/how-to-really-move-your-google-drive.html
(I'm sorry, Michael, but StackOverflow won't allow me to paste the shortened URL)
The old guide said you should disconnect your account, move the files to the new folder, then edit the files in %appdata%\Local\Google\Drive\ with a hex editor, changing the stored path to the new one.
In an April 2017 update, Michael said that Google has updated the client so that it now recognizes existing folders, so you should simply point the Drive client at the new folder. I couldn't determine whether this needs to be done before or after disconnecting the account, so if anyone can try this (I can't right now), please share your conclusions.
Good luck!

How to get an accurate file list through the Google Drive API

I'm developing an application using the Google Drive API (Java version). The application saves files on Google Drive, mimicking a file system (i.e. it has a folder tree). I started by using the files.list() method to retrieve all the existing files on the drive, but the response got slower as the number of files increased (after a couple of hundred).
The Java Google API client hardcodes the response timeout to 20 seconds. I changed the code to load one folder at a time recursively instead, using files.list().setQ("'folderId' in parents"). This avoids the timeout problem, but it consistently misses about 2% of the files in my folders (the same files are missing each time). I can see those files through the Google Drive web interface, and even through the Drive API if I search for the file name directly with files.list().setQ("title='filename'").
I'm assuming that the "in parents" search uses some inexact indexing that may only be updated periodically. I need a file listing that's more robust and accurate.
Any ideas?
Could you use the paging mechanism to run the query multiple times, with each request asking for only a small number of results?
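A sketch of that paging approach in Python (the question is in Java, but the loop has the same shape in any client). Here fetch_page stands in for service.files().list(...).execute() in a Drive v2 client, which returns "items" plus an optional "nextPageToken"; the stub below fakes two pages so the loop is runnable:

```python
# A minimal sketch of exhaustive paging, assuming a Drive v2-style
# files.list response: {"items": [...], "nextPageToken": "..."}.
def list_all_children(fetch_page, folder_id):
    """fetch_page(query, page_token) -> response dict. Follows
    nextPageToken until the listing is exhausted."""
    query = "'%s' in parents and trashed=false" % folder_id
    files, token = [], None
    while True:
        resp = fetch_page(query, token)
        files.extend(resp.get("items", []))
        token = resp.get("nextPageToken")
        if not token:
            return files

# Stub standing in for service.files().list(...).execute():
pages = [{"items": [{"id": "a"}, {"id": "b"}], "nextPageToken": "t1"},
         {"items": [{"id": "c"}]}]
fetch = lambda q, tok: pages[0] if tok is None else pages[1]
print(len(list_all_children(fetch, "root")))  # 3
```

Keeping each page small (via maxResults in v2) keeps every individual request well under the 20-second timeout.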

How can I get multiple files from Google Drive through the Google Drive API?

I would like to know how I can obtain multiple files from Google Drive. I searched the API reference but could not find this information. I'm building a web application that will talk to Drive and retrieve a link to download a zip file of those files.
I'm using PHP with API v2.
That is currently not possible with the Drive API; you have to send multiple requests to retrieve multiple files.
I've been faced with a similar problem and while there's currently no way of doing this through Drive (to my knowledge), this is the solution I came up with.
I'm serving up hundreds of thousands of files to clients using a Drive folder as the storage with a custom built front-end built with the Drive API. With that many files, it's ridiculously tedious for users to download files one at a time. So the idea was to let the users select which files they'd like to download and then present them with a zip file containing the files.
To do that, first you'll need an array of the Drive files you want to download, whether generated programmatically or through check-boxes on the front end. Then loop through that array, grab the 'downloadUrl' value for each file, and perform a cURL request to that URL. Depending on how many files you plan to handle per request, you can either keep them all in memory or store them temporarily on disk or in a database. Regardless, once you have all of the files, you can zip them up using any of the many zip libraries out there, then send the resulting zip file to the user.
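The zip step of that approach can be sketched as follows (in Python rather than the asker's PHP; file contents are faked here as in-memory bytes instead of being fetched from downloadUrl):

```python
# Sketch of the "fetch then zip" step, assuming the file contents have
# already been downloaded and are held in memory keyed by filename.
import io
import zipfile

def bundle_files(contents):
    """contents: dict of filename -> bytes. Returns the zip archive as bytes."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in contents.items():
            zf.writestr(name, data)
    return buf.getvalue()

archive = bundle_files({"a.txt": b"hello", "b.txt": b"world"})
```

For large batches you would stream to a temporary file on disk instead of a BytesIO buffer, which is the memory/bandwidth trade-off the answer goes on to mention.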
In our case we ended up sticking with individual file downloads because of the potentially massive amount of resources and bandwidth this can consume, but it's a viable solution if you're not serving large numbers of files.
If I understand the question correctly: if you place the files in a folder on Google Drive, then as far as I know it is possible to download them as a complete group of files.