We are creating a web application using MySQL as our database. Is there a way for some files from our application's hosting site to be synced to the user's Google Drive?
There's no direct way of synchronizing your local data with Drive.
However, you can pseudo-sync with little load on your server by using Changes. Basically, you can get a list of file changes since a point in time you specify. If I were you, I would set up a cron job that checks for file changes from Drive and then calls Files.get on the specific files that have changed.
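For example, a polling job along these lines could work (a minimal sketch with the Drive v3 Python client; the credential handling is a placeholder, and what you do with each changed file is up to you):

    from googleapiclient.discovery import build
    from google.oauth2.credentials import Credentials

    # token.json is a placeholder for wherever you keep authorized user credentials.
    creds = Credentials.from_authorized_user_file(
        "token.json", ["https://www.googleapis.com/auth/drive.readonly"])
    service = build("drive", "v3", credentials=creds)

    # Fetch this once, store it, and pass the saved token back in on each cron run.
    start_token = service.changes().getStartPageToken().execute()["startPageToken"]

    def fetch_changes(page_token):
        """List everything changed since page_token and look at each changed file."""
        while page_token:
            response = service.changes().list(pageToken=page_token, spaces="drive").execute()
            for change in response.get("changes", []):
                if change.get("removed"):
                    continue
                metadata = service.files().get(fileId=change["fileId"]).execute()
                # ... download or process the changed file here ...
            if "newStartPageToken" in response:
                return response["newStartPageToken"]  # save this for the next run
            page_token = response.get("nextPageToken")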
Besides what JunYoung Gwak suggests (polling Google for changes), you will also have to keep a last-edited date in your app, since there may be cases where the local file is newer. You will also have to use the same time zone as Google to make this work for changes less than 24 hours apart.
So both sides will need a way to ask for changes since a given date-time (in a consistent time zone) and a way to take a file from one place and update it in the other.
You might also have conflicts that need to be resolved with a diff tool.
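To illustrate the time-zone point, a minimal sketch of a comparison (assuming the local timestamp is already stored as UTC; Drive returns modifiedTime as an RFC 3339 UTC string):

    from datetime import datetime, timezone

    def newer_copy(drive_modified_time, local_modified_utc):
        """Return which copy is newer, comparing both timestamps in UTC."""
        drive_ts = datetime.strptime(
            drive_modified_time, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
        return "drive" if drive_ts > local_modified_utc else "local"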
I've written a script in Google Colab that performs some queries via API against a server where I host some information, let's say weather data from several weather stations we own. That data is downloaded in JSON format, and I save those files in my Google Drive because I use them later in the same script to generate some condensed tables by country and by month. Those summary files are also stored in my Google Drive account. After that, I send the summary files via email to some people.
I need to run this script every other day and, well, it is no big deal to do that manually, but I'd like to make it happen without human intervention, mainly because of scalability. One of the main points is that some libraries are not easy to install locally and it is way easier to make them work in Colab, which is why I don't use solutions like crontab or the Windows scheduler; I need to make the Colab notebook run periodically.
I've tried solutions like PythonAnywhere, but I've spent too much time trying to modify the script to work with PyDrive and, so far, haven't achieved it. I've also tried to run it on GCP but haven't yet found how to access my Google Drive files from Google Cloud Functions. I've also been working on building a Docker image; if I deploy it somewhere it will work, but the idea is to make the run schedulable, not only deployable. I've also read about the colabctl option but haven't been able to make it run yet.
Please, I'm open to receiving any suggestions in order to achieve my goal.
Thank you,
Billy.
There is a paid scheduler: https://cloud.google.com/scheduler
Or you can use https://github.com/TensorTom/colabctl with your own cron.
I have a database file which is located on my own (private) Google Drive and is updated on a daily basis.
I wrote a Python script to track the data in the database file; however, I need to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From a bit of searching on the web I found that there is no way to run the script in the Google Drive folder (obviously due to security issues), and using Google Cloud Platform is not a real option for me since it's not a free service (and, as I understood it, there is no free trial).
Anyway, any method that would make my life easier would be happily accepted.
Sorry, that's not possible AFAIK, at least in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible via a local Python script.
Google Cloud Platform, like other services, does however offer a free tier - though whether that covers you depends on how much usage you need and also on your location. This can get quite involved quite quickly, and there are various options to choose from; for example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is that it is tedious, not because your network connection or hard drive can't handle it. If so:
The simple workaround may just be to find a way to automatically download the database via the Google Drive API (https://developers.google.com/drive) with your Python script, modify or read it, and then upload it again if you need to. This can all be done within one script using the API. So all you would need to do is run the Python script, and you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about running over the free tier and getting billed for it.
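A rough sketch of that round trip with google-api-python-client (the file ID, local file name and credential handling are placeholders you would replace with your own):

    from googleapiclient.discovery import build
    from googleapiclient.http import MediaIoBaseDownload, MediaFileUpload
    from google.oauth2.credentials import Credentials

    FILE_ID = "your-database-file-id"  # placeholder

    creds = Credentials.from_authorized_user_file(
        "token.json", ["https://www.googleapis.com/auth/drive"])
    service = build("drive", "v3", credentials=creds)

    # Download the database file from Drive to the local disk.
    request = service.files().get_media(fileId=FILE_ID)
    with open("database.db", "wb") as fh:
        downloader = MediaIoBaseDownload(fh, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()

    # ... read or modify database.db with your own code here ...

    # Upload the (possibly modified) file back to Drive, replacing its content.
    media = MediaFileUpload("database.db", resumable=True)
    service.files().update(fileId=FILE_ID, media_body=media).execute()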
You could even use the Google Drive Desktop software to keep the database file synced locally. Since it is local, I doubt the software would be able to block a Python script on the file system.
If you don't even want to think about it, you could set up the Python script with a CRON job or a Scheduled task if you are on Windows.
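For example, a crontab entry to run the script every morning might look like this (the paths are placeholders):

    0 7 * * * /usr/bin/python3 /home/youruser/update_db.py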
I've got a nopCommerce project. I could publish it successfully to a server from my local machine, but it took about 1.5 hours for the upload to complete (depending on my upload speed).
Question: Is there a way to sync or upload only the files I modify inside VS, rather than uploading the whole published project? (Because, as I said above, it takes a lot of time for me.)
You have to publish the whole project to the deployment server the very first time; afterwards, there is no need to publish the whole project again and again.
Is there a way to sync or upload only the files I modify inside VS, rather than uploading the whole published project?
There are many version control tools available to track your file changes. You can use one of them and update only the required files and DLLs.
At our company we have a Google Spreadsheet which is shared by link with different employees. This spreadsheet is saved on a Google Drive to which only I have access. The link is configured so that anyone with the link can edit the spreadsheet, since all employees need to be able to make changes to the file.
Although this is very useful, it also presents a risk in the form of data loss. If a user were to (accidentally) delete or alter the wrong data and save the file, that data is permanently lost.
To prevent this I was wondering if it is possible to automatically have a backup created, say every day. Ideally, this backup would be saved in the same Google Drive. I know I could install the desktop client and have the file backed up by our daily company backup, but it seems a bit ridiculous to install it for just one file. I'm sure there has to be another solution to this, e.g. with scripts.
I followed the advice of St3ph and tried revision history. Not exactly what I meant, but an acceptable solution nonetheless.
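If a scripted backup still turns out to be necessary, one option is a small daily job that copies the file via the Drive API (a minimal sketch; the spreadsheet ID and credential handling are placeholders, and the script would be run by whatever scheduler is available):

    import datetime

    from googleapiclient.discovery import build
    from google.oauth2.credentials import Credentials

    SPREADSHEET_ID = "your-spreadsheet-file-id"  # placeholder

    creds = Credentials.from_authorized_user_file(
        "token.json", ["https://www.googleapis.com/auth/drive"])
    service = build("drive", "v3", credentials=creds)

    # Create a dated copy of the spreadsheet in the same Drive.
    backup_name = "Spreadsheet backup " + datetime.date.today().isoformat()
    service.files().copy(fileId=SPREADSHEET_ID, body={"name": backup_name}).execute()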
I am using SQL Server 2008.
I have a table (TBL_FILE) that stores user-uploaded files in a binary column. However, users do not want to open our system to access the files. They want to have a folder (a network drive, web folder, local drive, or local folder would all be accepted) that maps to the table (TBL_FILE), so they can open the files directly in File Explorer.
The key point is they want to open files directly in File Explorer.
Is it possible to do that? What kind of program would I need to write to do that, and how would security be handled?
Thanks!
Alex
Have you considered writing an application that would prompt for a login, then present a list of files to the user in a friendly user interface? You could drop a shortcut to that application in the folder they want these files to live in.
If you must have shortcuts directly from the filesystem into your binary data fields, then you are going to have to be a little bit hacky. Depending on how often the files are updated, you can try one of these options:
1 - Write an application that will run as a Windows Service or as a scheduled job. Periodically check for changed binary data, and save it to disk (see the sketch after this list). Disadvantage: the file system will only be updated at intervals, so database changes will not be immediately available.
2 - Write a trigger on the table that saves the binary file to disk. Fire the trigger whenever the row changes- preferably by monitoring a 'last modified time' or similar field, rather than checking the binary value directly. The trigger could fire a CLR stored procedure, or you can do it directly from T-SQL using the method described here.
Disadvantage: You have a potentially time-consuming trigger on the table.
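A rough sketch of option 1 (assuming pyodbc, a LastModified column on TBL_FILE, and placeholder column names and connection details; a Windows Service written in another language would follow the same shape):

    import os

    import pyodbc

    EXPORT_DIR = r"\\fileserver\exported_files"  # placeholder share the users browse
    CONN_STR = "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"  # placeholder

    def export_changed_files(since):
        """Write every file changed after 'since' out to the export folder."""
        conn = pyodbc.connect(CONN_STR)
        cursor = conn.cursor()
        # FileName, FileData and LastModified are assumed column names on TBL_FILE.
        cursor.execute(
            "SELECT FileName, FileData FROM TBL_FILE WHERE LastModified > ?", since)
        for file_name, file_data in cursor.fetchall():
            with open(os.path.join(EXPORT_DIR, file_name), "wb") as f:
                f.write(file_data)
        conn.close()

Run it from the scheduled job every few minutes, passing in the time of the previous run.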
In either case, security becomes the problem of the Windows filesystem. Just grant access to the folder to whoever should see the files.
Good luck!
After searching on Google, I finally found a solution to this problem.
We could create a logical drive with .NET technology or other third-party libraries. One of these libraries is Dokan (http://dokan-dev.net/en/).
Dokan lets us create a virtual drive on the computer and implement the logic ourselves. It seems it would be able to map a folder to a table in the database, but I haven't tried it yet.
Thanks!
Alex