Transfer files from Dropbox/Drive to Google Cloud Storage - google-drive-api

We are planning to implement a feature that lets customers browse the images in their own Google Drive or Dropbox accounts (within our app) and select the ones they want to use in our app. We will then save these images into our Google Cloud Storage (GCS) bucket.
A simple approach is to use OAuth to download the customers' images from Dropbox/Drive to our local disk, then upload them to our GCS.
I am wondering if it is possible to make a direct transfer between Dropbox/Drive and GCS. GCS does provide a transfer service for S3. You can also supply a list of URLs, but each URL needs to be publicly accessible, which in our case does not seem possible.
Are there any better suggestions on how to implement this feature? Thanks

Well, you could stream the files so that at least you wouldn't have to save them to local disk, but other than that I think your solution is the way to go.
How streaming works exactly depends on your language, but basically you download the file and upload it to GCS right away without ever writing any bytes to local disk.
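In Python, for example, a minimal sketch might look like this, assuming an already-authorized Drive service from google-api-python-client and the google-cloud-storage client; the file ID, bucket, and blob names are placeholders. Note it buffers the file in memory rather than on disk:

```python
import io

from google.cloud import storage
from googleapiclient.http import MediaIoBaseDownload


def copy_drive_file_to_gcs(drive_service, file_id, bucket_name, blob_name):
    # Pull the Drive file into an in-memory buffer instead of a temp file.
    buffer = io.BytesIO()
    request = drive_service.files().get_media(fileId=file_id)
    downloader = MediaIoBaseDownload(buffer, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

    # Rewind the buffer and hand it straight to GCS.
    buffer.seek(0)
    bucket = storage.Client().bucket(bucket_name)
    bucket.blob(blob_name).upload_from_file(buffer)
```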

I ran into your question during my search for a way to transfer data from Google Drive to Google Cloud Storage. Since they are both under the Google umbrella, I guessed they should be able to upload/download seamlessly. So far in my search, without programming, the answer is no.
I don't have an exact solution for your question, because it really depends on how much data you have and how often you upload/download. Unfortunately, there is no straight shot (even for Google Drive!). But maybe think about making REST API calls to Google and connecting them with the Dropbox API. Hoping this could help.

With this recipe, you can use Google Colab to mount a Google Drive folder and then copy from the mounted folder to the storage bucket using gsutil, avoiding the need to first save the data to local disk.
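A minimal version of that recipe, assuming it runs inside a Colab notebook; the folder path and gs://your-bucket are placeholders:

```python
# Run inside a Colab notebook. authenticate_user() authorizes gsutil
# for your Google account; drive.mount() exposes Drive as a folder.
from google.colab import auth, drive

drive.mount('/content/drive')
auth.authenticate_user()

# -m parallelizes the copy; -r recurses into the folder.
!gsutil -m cp -r /content/drive/MyDrive/images gs://your-bucket/images
```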

Related

Is there a way to store images in Azure or AWS without hosting the application on their platforms?

I am currently developing a full-stack web application for the first time. It is a store that needs to give users the ability to upload "books" and to edit, delete, and manage them. As of now, I have a React front end that calls an Express API (using Axios), which queries a MySQL database. This can currently manage the product details, titles, and simple labels and relations.
However, I now need to store images and .json files dynamically as well. So I have researched it and need to use Azure Storage to store these images and allow the end user to access them; the client would like to use Azure Storage as well.
I have gotten quite overwhelmed looking into the Azure documentation for JavaScript image uploading, and every "tutorial" starts with creating a web application, a storage account, and an app service.
All I would like to do is store images from the user in Azure Storage, so that when I eventually deploy the website the data is available to be accessed, and my front end or API can call Azure to get the images for the user. I apologize if the question is confusing or vague; I am really just overwhelmed by Azure's documentation, and it seems like such a simple and common problem. Any guidance or references would be greatly appreciated.
Yes, it's quite possible. You simply create an Azure Storage account and upload your files as blobs via that account. Then they will be available publicly on the Internet, and your web application can reference them from wherever it is hosted.
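As a rough illustration (shown here with the Python azure-storage-blob package; the JavaScript SDK, @azure/storage-blob, follows the same flow), with the connection string, container name, and file name as placeholders:

```python
import os

from azure.storage.blob import BlobServiceClient

# The connection string comes from the storage account's "Access keys" page.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("book-images")  # placeholder name

# Upload a local file as a blob; the blob name is what clients will request.
with open("cover.png", "rb") as data:
    container.upload_blob(name="cover.png", data=data)

# The blob's public URL (if the container allows public blob access).
print(container.get_blob_client("cover.png").url)
```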
It is possible, and I believe it is quite common practice to have a Content Delivery Network deliver the images. I am not very familiar with Azure, but I am with AWS, and I can tell you that you can use an AWS S3 bucket to store the images and JSON. It comes with many different configuration options that allow content to be protected or open to the general internet.

Running a Python script on a database file located in Google Drive

I have a database file located on my own (private) Google Drive, and it is updated on a daily basis.
I wrote a Python script to track the data in the database file; however, I need to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From a bit of searching on the web, I found that there is no way to run the script in the Google Drive folder (presumably due to security issues), and using Google Cloud Platform is not a real option since it's not a free service (and, as I understood, there is no free trial).
Anyway, any method that would make my life easier would be happily accepted.
Sorry
That's not possible AFAIK. At least in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible via a local Python script.
Google Cloud Platform, like other services, does however offer a free tier, though whether it covers you depends on how much use you need and on your location. This can get quite involved quite quickly, and there are various options to choose from; for example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is that it is tedious, not that your network connection or hard drive can't handle it. If so:
The simple workaround may just be to have your Python script automatically download the database via the [Google Drive API](https://developers.google.com/drive), modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script, and you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about running over the free tier and getting billed for it.
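A minimal sketch of that download-modify-upload loop, assuming an already-authorized Drive v3 service from google-api-python-client; FILE_ID and db.sqlite are placeholders:

```python
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload

FILE_ID = "your-drive-file-id"  # placeholder: the Drive ID of the DB file
LOCAL_PATH = "db.sqlite"        # placeholder: local working copy


def download_db(drive_service):
    # Fetch the current copy of the database from Drive.
    request = drive_service.files().get_media(fileId=FILE_ID)
    with open(LOCAL_PATH, "wb") as fh:
        downloader = MediaIoBaseDownload(fh, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()


def upload_db(drive_service):
    # Push the (possibly modified) database back to the same Drive file.
    media = MediaFileUpload(LOCAL_PATH)
    drive_service.files().update(fileId=FILE_ID, media_body=media).execute()
```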
You could even use the Google Drive Desktop software to keep the database file synced locally. Since it is local, I doubt the software would be able to block a Python script on the file system.
If you don't even want to think about it, you could set up the Python script with a cron job, or a Scheduled Task if you are on Windows.

indexedDB in a Chrome App

I'm building a Chrome app which requires a persistent, local database, which in this case can be either IndexedDB or basic object storage. I have several questions before I begin developing the app:
Is it possible for IndexedDB data to persist after uninstallation of the Chrome app and the Chrome browser?
If the IndexedDB file/data persists, can I locate and view it?
If I can locate but can't view it, is it possible to change the location of the IndexedDB file?
Can I store the IndexedDB in a file located on the desktop or any other custom location?
If I had these requirements, I see a couple of options you might pursue:
1. Write a simple database backed by the FileSystem API, and periodically lock the database and back up that file. This would be pretty cool, because I don't know of anyone who has implemented a simple FileSystem-API-backed database, but I could see it being useful for other purposes. Any edits to the database would also be made to a copy of the database stored on your backup server, and I would write functions that could import snapshots from your backup.
2. Simply write functions to export from your IndexedDB to some backup format, and to import from that backup.
Both options seem quite time consuming. It would be cool if, when you create an IndexedDB, you could specify an HTML FileSystem API entry file to back it; that way you wouldn't have to do 1 or 2.
I agree that it seems like quite an oversight that an IndexedDB database is so difficult to back up.
I am writing a basic, browser-only application, with no back-end server code at this time, so I also have storage requirements, though I am not doing backup. I am looking at PouchDB as a solution: http://pouchdb.com/
Everything is looking good so far. They also mention that it works well with Google Apps.
http://pouchdb.com/faq.html#native_support
The nice thing is you could sync your pouchdb data with a server couchdb instance.
http://pouchdb.com/api.html#replication
http://pouchdb.com/api.html#sync
If you want to keep the application local to the browser with no server support you could backup the entire database by using a batch fetch.
http://pouchdb.com/api.html#batch_fetch
I would run the result through gzip before putting it on the filesystem.
I am currently attempting this very same thing. I am using the Chrome Sync File System API (http://goo.gl/5q8Z9M), but I am running into some instances where my file (or its contents) is deleted. With this approach I am writing out a JSON object. Hope this helps.

Google Realtime API - Using Collaborative Models with Files

Trying to understand this help doc. It says:
The Realtime API provides a persistent document for the data that users are collaborating on.
What is this "persistent document"? Is this a file in Google Drive? I'm not planning to store data in Google Drive and most likely will be using shortcut files. So how does that relate to my situation?
Applications that provide collaborative editing features, but need to work with files in various formats, must adopt one or more strategies for converting data between the Realtime API document and other file formats.
I'm storing my data as XML in a database. Does this qualify as "files in various formats"? How does that affect my situation?
Each file on Drive has its own content as well as permanent storage for the Realtime API. In your case, your shortcut files will have Realtime structures even though the files won't have any content at all.

How to sync files from a MySQL hosting site to Google Drive

We are creating a web application using MySQL as our database. Is there a way for some files from our application's hosting site to be synced to a user's Google Drive?
There's no direct way of synchronizing your local data with Drive.
However, you can pseudo-sync with little load on your server by using Changes. Basically, you can get the list of file changes since a point in time you specify. If I were you, I would make a cron job that checks for file changes from Drive and calls Files.get on the specific files that have changed.
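A rough Python sketch of that polling loop, assuming an authorized Drive v3 service from google-api-python-client; the page token is persisted between cron runs (the file path below is a placeholder):

```python
import json

TOKEN_PATH = "drive_page_token.json"  # placeholder for persisted state


def poll_changes(drive_service):
    # Load the saved page token, or start fresh from "now".
    try:
        with open(TOKEN_PATH) as fh:
            page_token = json.load(fh)["token"]
    except FileNotFoundError:
        page_token = drive_service.changes().getStartPageToken() \
            .execute()["startPageToken"]

    while page_token:
        response = drive_service.changes().list(pageToken=page_token).execute()
        for change in response.get("changes", []):
            # Fetch metadata or content for each changed file here.
            print("Changed file:", change.get("fileId"))
        if "newStartPageToken" in response:
            # All changes consumed; save the token for the next cron run.
            with open(TOKEN_PATH, "w") as fh:
                json.dump({"token": response["newStartPageToken"]}, fh)
            break
        page_token = response.get("nextPageToken")
```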
Besides what JunYoung Gwak suggests (polling Google for changes), you will also have to keep a last-edited date in your app, and there might be cases when the local file is newer. You will have to keep the same time zone as Google to make it work for <24-hour changes.
So both sides will need a "give me the changes since this date-time in this time zone" operation, and a way to take a file from one place and update it in the other.
You might have conflicts that need to be resolved with a diff tool.