Running a Python script on a database file located in Google Drive - google-drive-api

I have a database file which is located on my own Google Drive (private), and it's updated on a daily basis.
I wrote a Python script to track the data in the database file; however, I need to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it around manually.
From a little searching on the web I found that there is no way to run the script directly in the Google Drive folder (presumably for security reasons), and using Google Cloud Platform is not a real option for me since it's not a free service (and as I understood it, there is no free trial).
Anyway, any method that would make my life easier would be happily accepted.

Sorry
That's not possible AFAIK. At least in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing an SQL viewer around; I don't know if it is still available, and I don't think it was accessible from a local Python script.
Google Cloud Platform, like other services, does however offer a free tier - though how far that goes depends on how much use you need to make of it, and also on your location. This can get quite involved quite quickly, and there are various options to choose from; for example, BigQuery may be something you are interested in.
A Possible Workaround
I assume the reason you don't want to download it is that doing so is tedious, and not that your network connection or hard drive can't handle it. If so:
The simple workaround may just be to have your Python script download the database automatically via the [Google Drive API](https://developers.google.com/drive), modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script, and you wouldn't need to download the file and move it into a folder manually. You would also not need to worry about running over a free tier and getting billed for it.
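For illustration, here is a minimal sketch of that approach with the official google-api-python-client library. The file ID and the service-account key file are placeholders, and a service account only sees your private DB file if you share the file with its address (a normal OAuth user flow works too):

```python
import io

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload, MediaFileUpload

FILE_ID = "your-drive-file-id"  # placeholder: the DB file's ID in Drive

creds = service_account.Credentials.from_service_account_file(
    "service_account.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/drive"],
)
drive = build("drive", "v3", credentials=creds)

# Download the current copy of the database to a local working file
request = drive.files().get_media(fileId=FILE_ID)
with io.FileIO("local.db", "wb") as fh:
    downloader = MediaIoBaseDownload(fh, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

# ... run your existing tracking logic against local.db here ...

# Optionally push the (possibly modified) file back to the same Drive file
media = MediaFileUpload("local.db", resumable=True)
drive.files().update(fileId=FILE_ID, media_body=media).execute()
```

Combined with the cron job / scheduled task mentioned below, the whole daily run becomes hands-off.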
You could even use the Google Drive desktop software to keep the database file synced locally. Since the copy is local, the software won't stop a Python script from reading it off the file system.
If you don't even want to think about it, you could set up the Python script with a cron job, or a scheduled task if you are on Windows.

Related

How to run my colab notebook periodically

I've written a script in Google Colab that performs some queries via API to a server where I host some information - let's say weather data from several weather stations we own. That data is downloaded in JSON format, and I save those files in my Google Drive because I use them later in the same script to generate some condensed tables by country and by month; those summary files are also stored in my Google Drive account. After that, I send the summary files via email to some people.
I need to run this script every other day and, well, it is no big deal to do that by hand, but I'd like to make it happen without human intervention, mainly for scalability. One of the main points is that some of the libraries are not easy to install locally and it is much easier to make them work in Colab; that's why I don't use solutions like crontab or Windows Scheduler, and why I need to make the Colab notebook run periodically.
I've tried solutions like PythonAnywhere, but I've spent too much time trying to modify the script to work with PyDrive and, so far, haven't achieved it. I've also tried to run it using GCP, but haven't yet found how to access my Google Drive files from Google Cloud Functions. I've also been working on building an image using Docker; if I deploy it to someone else it will work, but the idea is to make the run schedulable, not only deployable. I've also read about the colabctl option but haven't been able to make it run yet.
Please, I'm open to receiving any suggestions in order to achieve my goal.
Thank you,
Billy.
There is a paid scheduler: https://cloud.google.com/scheduler
Or you can use https://github.com/TensorTom/colabctl with your own cron.

Transfer files from dropbox/drive to Google cloud storage

We are planning to implement a feature to let customers browse the images in their own Google Drive or Dropbox accounts (within our app) and select the ones they want to use in our app. We will then save these images into our Google Cloud Storage (GCS) bucket.
A simple approach is to use OAuth to download the customers' images from Dropbox/Drive to our local disk, then upload them to our GCS.
I am wondering if it is possible to make a direct transfer between Dropbox/Drive and GCS? GCS does provide a transfer service for S3. You can also supply a list of URLs, but each URL needs to be publicly accessible, which in our case does not seem possible.
Is there any better suggestion on how to implement this feature? Thanks
Well, you could stream the files so at least you wouldn't have to save them to local disk, but other than that I think your solution is the way to go.
How streaming works exactly depends on your language, but basically you download the file and upload it to GCS right away, without ever writing any bytes to local disk.
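In Python, for example, a minimal sketch might look like this - assuming an already-authorized Drive client built from the customer's OAuth token, the google-cloud-storage library, and placeholder file ID / bucket / object names:

```python
import io

from google.cloud import storage
from googleapiclient.http import MediaIoBaseDownload


def copy_drive_file_to_gcs(drive, file_id, bucket_name, blob_name):
    # Download the Drive file into an in-memory buffer instead of local disk
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, drive.files().get_media(fileId=file_id))
    done = False
    while not done:
        _, done = downloader.next_chunk()
    buf.seek(0)

    # Upload the buffer straight to the GCS bucket
    bucket = storage.Client().bucket(bucket_name)
    bucket.blob(blob_name).upload_from_file(buf)
```

This buffers each image in memory rather than chunk-streaming it, which is usually fine for photo-sized files and still never touches local disk.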
I ran into your question while searching for a way to transfer data from Google Drive to Google Cloud Storage. Since they are both under the Google umbrella, I guessed they would be able to upload/download to each other seamlessly; so far in my search, without programming, the answer is no.
I don't have an exact solution for your question, because it really depends on how much data you have and how often you upload/download. Unfortunately, there is no straight shot (even for Google Drive!). But you might think about making REST API calls to Google and connecting them with the Dropbox API. Hoping this could help.
With this recipe, you can use Google Colab to mount a Google Drive folder and then copy from the mounted folder to the storage bucket using gsutil, avoiding the need to first save the data to local disk.
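For illustration, a minimal sketch of that recipe as a Colab cell (the Drive folder and bucket name are placeholders):

```python
import subprocess

from google.colab import auth, drive

drive.mount("/content/drive")  # prompts you to authorize Drive access
auth.authenticate_user()       # authorizes this session's gsutil/GCS access

# Copy straight from the mounted Drive folder to the GCS bucket
subprocess.run(
    ["gsutil", "-m", "cp", "-r",
     "/content/drive/MyDrive/images",  # placeholder Drive folder
     "gs://your-bucket/images"],       # placeholder bucket path
    check=True,
)
```

The data still flows through the Colab VM, but nothing has to be staged on your own machine.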

Migrate Cloud Compute VM to Separate Google Cloud Account

Just curious if there's a standard method to take one VM instance and migrate it to a completely different Google Cloud account. If not, I guess I could download all the site files and server configuration files (virtual hosts, Apache config, php.ini, etc.), but I'm hoping there's a more streamlined approach.
Just to be clear, I have john@gmail.com and john2@gmail.com, both completely separate accounts, each with VMs on their respective Google Cloud accounts. I want to move a VM instance from john@gmail.com to john2@gmail.com.
I'm open to any method that will get the job done. I like Google's snapshot feature, but I have doubts I'll be able to move the specific snapshot over. I'm thinking maybe it'll be possible by creating an image, but even of that I'm not 100% sure.
Thanks!!
Create a custom image, then share it, then import that image into the target project.
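A rough sketch of those three steps through the Compute Engine API with google-api-python-client. The project IDs, zone, disk, image and account names are placeholders, each call has to run with credentials that can access the respective account, and the equivalent gcloud commands work just as well:

```python
from googleapiclient.discovery import build

compute = build("compute", "v1")  # uses Application Default Credentials

# 1. In the source account: create an image from the VM's boot disk
#    (stop the VM first, or image a snapshot of the disk)
compute.images().insert(
    project="source-project",
    body={
        "name": "migration-image",
        "sourceDisk": "projects/source-project/zones/us-central1-a/disks/my-vm",
    },
).execute()  # returns a long-running operation you would normally wait on

# 2. Still in the source account: let the other account use the image
compute.images().setIamPolicy(
    project="source-project",
    resource="migration-image",
    body={"policy": {"bindings": [
        {"role": "roles/compute.imageUser", "members": ["user:john2@gmail.com"]},
    ]}},
).execute()

# 3. In the target account: boot a new VM from the shared image
compute.instances().insert(
    project="target-project",
    zone="us-central1-a",
    body={
        "name": "migrated-vm",
        "machineType": "zones/us-central1-a/machineTypes/e2-medium",
        "disks": [{
            "boot": True,
            "initializeParams": {
                "sourceImage": "projects/source-project/global/images/migration-image",
            },
        }],
        "networkInterfaces": [{"network": "global/networks/default"}],
    },
).execute()
```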

how to sync files from mysql hosting site to google drive

We are creating a web application using MySQL as our database. Is there a way that some files from the hosting site of our application can be synced to the user's Google Drive?
There's no direct way of synchronizing your local data with Drive.
However, you can pseudo-sync with little load on your server by using Changes. Basically, you can get the list of file changes since a point in time you specify. If I were you, I would set up a cron job that checks Drive for file changes and calls Files.get on the specific files that have changed.
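For illustration, a minimal sketch of that polling loop with google-api-python-client, where `drive` is an already-authorized Drive v3 service and persisting the page token between runs is left to you:

```python
def poll_drive_changes(drive, saved_token=None):
    """Return (ids of changed files, token to store for the next poll)."""
    if saved_token is None:
        # First run: only record the current position in the change log
        saved_token = drive.changes().getStartPageToken().execute()["startPageToken"]

    changed_ids, page_token = [], saved_token
    while page_token:
        resp = drive.changes().list(pageToken=page_token, spaces="drive").execute()
        changed_ids += [c["fileId"] for c in resp.get("changes", [])]
        if "newStartPageToken" in resp:  # reached the end of the change log
            return changed_ids, resp["newStartPageToken"]
        page_token = resp.get("nextPageToken")
    return changed_ids, saved_token
```

Each changed file can then be fetched with drive.files().get_media(fileId=...) - the Files.get step mentioned above - and written back to your hosting site.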
Besides what JunYoung Gwak suggests - asking Google for changes by polling - you will also have to keep a last-edited date in your app, since there might be cases when the local file is newer. You will have to keep the same time zone as Google to make it work for <24-hour changes.
So both sides will need a way to ask for changes since a given date-time (in a given time zone), and a way to take a file from one place and update it in the other.
There might be conflicts that need to be resolved with a diff tool.

Looking for a webserver as an alternative to my localhost?

I have an Android app which successfully reads .json files from my local server; however, when the application runs on a mobile device it is unable to read the files. I have had a nightmare trying to resolve the issue, so I've given up.
I figured that perhaps uploading the .json files to an online web server would be an easier option. I have no previous experience with online web servers and wondered if someone could recommend one?
I would only be using it to host five .json files that are no more than 5 KB each, and would probably only need them hosted online for about a month; I'm not sure if there is an alternative to a monthly subscription.
There are a lot of free web services. However, if you want the fastest and easiest option, try Google Sites. It gives you an evaluation period of about a month, which should be enough for you.
However, if you just want to host a few files, you can go with any cheap web host such as Bluehost, if a monthly subscription is what you want.
You can also use Amazon EC2, which bills by the hour, so when you do not need it any more you can just turn it off.
http://aws.amazon.com/ec2/pricing/