Just curious if there's a standard method to take one VM instance and migrate it to a completely different Google Cloud account. If not, I guess I could download all the site files and server configuration files (virtual hosts, Apache config, php.ini, etc.), but I'm hoping there's a more streamlined approach.
Just to be clear, I have john@gmail.com and john2@gmail.com, both completely separate accounts, each with VMs on their respective Google Cloud accounts. I want to move a VM instance from john@gmail.com to john2@gmail.com.
I'm open to any method that will get the job done. I like Google's snapshot feature, but I have doubts I'll be able to move a specific snapshot over. I'm thinking it might be possible by creating an image, but even of that I'm not 100% sure.
Thanks!!
Create a custom image from the VM's boot disk, share it with the other account, then create a new instance from that image in the target project.
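If you prefer doing it from code, here is a rough sketch of the first two steps using the Compute Engine API Python client; the project, zone, disk and image names (and the target e-mail) are placeholders, and both calls return operations you would normally poll until they finish.

```python
# Rough sketch of "create image + share it" with the Compute Engine API
# Python client (pip install google-api-python-client). All names below are
# placeholders for your own values.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")  # uses Application Default Credentials

# 1. In john's project: build a custom image from the VM's boot disk.
compute.images().insert(
    project="johns-project",
    body={
        "name": "migrated-web-server",
        "sourceDisk": "projects/johns-project/zones/us-central1-a/disks/web-server",
    },
).execute()

# 2. Still in john's project: grant john2 permission to use the image.
#    (This overwrites the image's IAM policy; normally you'd fetch the
#    existing policy first and append the binding.)
compute.images().setIamPolicy(
    project="johns-project",
    resource="migrated-web-server",
    body={
        "policy": {
            "bindings": [
                {
                    "role": "roles/compute.imageUser",
                    "members": ["user:john2@gmail.com"],
                }
            ]
        }
    },
).execute()
```

Ideally stop the VM first so the image is consistent. From the other account, john2 can then spin up an instance whose boot disk points at projects/johns-project/global/images/migrated-web-server; the console's image picker and `gcloud compute instances create --image ... --image-project johns-project` both accept images shared from another project.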
I've written a script in Google Colab that queries, via an API, a server where I host some information (let's say weather data from several weather stations we own). That data is downloaded in JSON format, and I save those files to my Google Drive because I use them later in the same script to generate some condensed tables by country and by month; those summary files are also stored in my Google Drive account. After that, I send the summary files via email to some people.
I need to run this script every other day, and while that is no big deal to do by hand, I'd like to make it happen without human intervention, mainly for scalability. One of the main points is that some of the libraries are not easy to install locally and it is much easier to make them work in Colab; that's why I'm not using solutions like crontab or the Windows scheduler, and why I need to make the Colab notebook run periodically.
I've tried solutions like PythonAnywhere, but I've spent too much time trying to modify the script to work with PyDrive and, so far, haven't managed it. I've also tried to run it on GCP, but haven't yet found how to access my Google Drive files from Google Cloud Functions. I've also been working on building a Docker image, and if I deploy it somewhere it will work, but the idea is to make the run schedulable, not just deployable. I've also read about the colabctl option but haven't been able to make it run yet.
Please, I'm open to receiving any suggestions in order to achieve my goal.
Thank you,
Billy.
There is a paid scheduler, Cloud Scheduler: https://cloud.google.com/scheduler
Or you can use https://github.com/TensorTom/colabctl with your own cron job.
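To make the Cloud Scheduler suggestion a bit more concrete: it can only call an HTTP endpoint or publish to Pub/Sub on a cron schedule, so the notebook's logic has to be reachable behind some trigger it can hit (Cloud Run, Cloud Functions, your own server). Assuming such an endpoint exists, creating the job from Python could look roughly like this; the project, region, job name and URL below are all hypothetical.

```python
# Hedged illustration only; pip install google-cloud-scheduler
from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = "projects/my-project/locations/us-central1"  # placeholder project/region

job = {
    "name": f"{parent}/jobs/run-weather-report",
    "schedule": "0 6 */2 * *",  # 06:00 every second day
    "time_zone": "UTC",
    "http_target": {"uri": "https://my-service.example.com/run-report"},  # hypothetical endpoint
}

client.create_job(parent=parent, job=job)
```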
I have a database file which is located on my own (private) Google Drive, and it's updated on a daily basis.
I wrote a Python script to track the data in the database file; however, at the moment I have to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From a little searching on the web, I found that there is no way to run the script inside the Google Drive folder itself (obviously due to security issues), and using Google Cloud Platform isn't a real option since it's not a free service (and, as I understood it, there is no free trial).
Anyways, any method that would make my life easier would be happily accepted.
Sorry
That's not possible AFAIK. At least in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing an SQL viewer around; I don't know if it is still available, and I don't think it was accessible from a local Python script.
Google Cloud Platform, like other providers, does however offer a free tier of services, though this depends on how much use you need to make of it and also on your location. It can get quite involved quite quickly, and there are various options to choose from. For example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is because it is tedious, and not because your network connection or hard drive can't handle it, if so:
The simple workaround may just be to find a way to automatically download the database via the [Google Drive API](https://developers.google.com/drive) with your Python script, modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script; you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about going over a free tier and getting billed for it.
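As a rough, non-authoritative sketch of that loop with the Drive API v3 Python client (the file ID, local path and credentials file are placeholders, and it assumes the DB file is accessible to whatever credentials you use, e.g. a service account the file is shared with, or your own OAuth token):

```python
# pip install google-api-python-client google-auth
import io

from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload

SCOPES = ["https://www.googleapis.com/auth/drive"]
creds = service_account.Credentials.from_service_account_file(
    "credentials.json", scopes=SCOPES  # placeholder credentials file
)
drive = build("drive", "v3", credentials=creds)

FILE_ID = "your-database-file-id"  # placeholder Drive file ID

# 1. Download the DB file from Drive to a local path.
request = drive.files().get_media(fileId=FILE_ID)
with io.FileIO("local.db", "wb") as fh:
    downloader = MediaIoBaseDownload(fh, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

# 2. ...run your existing tracking logic against local.db here...

# 3. Optionally push the (modified) file back, overwriting the same Drive file.
media = MediaFileUpload("local.db", resumable=True)
drive.files().update(fileId=FILE_ID, media_body=media).execute()
```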
You could even use the Google Drive Desktop software to keep the database file synced locally. Since it is local, I doubt the software would be able to block a Python script on the file system.
If you don't even want to think about it, you could set up the Python script with a cron job (or a Scheduled Task if you are on Windows).
I'm trying to create a Google Apps Script that adds a new user to an Ubuntu VM I've created whenever a form is submitted. I'm wondering if there is some way to initiate an SSH connection from a Google Apps Script that would let me log in to the VM and create a new user. I have the IP and login credentials for the VM. I've set it up so that the script will run whenever a form is submitted, but I'm not sure where to go from there. I apologize in advance if there is a better way to do this; I could just manually create the accounts based on form submissions, but I really need the automation. If there is a solution to this, even if it doesn't involve SSH, I would really appreciate the help!
This is not trivial.
Google Apps Script does not support SSH out of the box, so you have to work around that.
The user "Perhaps you see this name" has given you a great idea. I'll explain below how to do what they suggested:
What you will need on the Linux machine:
- A web service callable from Google's IPs (you can whitelist them, or leave the service open to the public, which is dangerous and should only be done as a last resort).
- An account with user-creation permission on the Linux machine.
- A script to create the new users from the data received by the web service.
For the first part, you can use any technology you want. I recommend Node.js + Express.js, as it is easy to build what you want with child processes; there is a sketch of the idea after these steps.
For the second part, I'll assume you already have a user account able to create users; you probably want to use that.
The last part is just another Linux command (useradd). You can Google it and you'll find lots of examples.
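The recommendation above is Node.js + Express; purely to show the shape of it, here is the same idea sketched in Python with Flask instead (the endpoint path, token header and JSON field are all made up, and you would want to harden this considerably before exposing it): a small authenticated endpoint that shells out to useradd.

```python
# pip install flask
import subprocess

from flask import Flask, abort, request

app = Flask(__name__)
SHARED_SECRET = "change-me"  # placeholder; load from a real secret store


@app.route("/users", methods=["POST"])
def create_user():
    # Reject callers that don't present the shared token.
    if request.headers.get("X-Auth-Token") != SHARED_SECRET:
        abort(403)
    data = request.get_json(silent=True) or {}
    username = data.get("username", "")
    # Crude validation so nothing unexpected reaches the shell.
    if not username.isalnum():
        abort(400)
    # Must run as an account allowed to call useradd (e.g. via a sudoers rule).
    subprocess.run(["sudo", "useradd", "-m", username], check=True)
    return {"created": username}, 201


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

On the Apps Script side, the on-form-submit trigger can then call that endpoint with UrlFetchApp.fetch(), posting the username from the form response.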
There is one catch: while the real-time user-creation option with APIs might look enticing, I would strongly advise against leaving a public service exposed for something like creating users, as it could become a security risk.
What you might want to do instead is have a machine with nothing of value on it (i.e. a cheap machine you planned to throw away, with no important data and no confidential information) host the web service, and then have a script on your Ubuntu VM fetch the data from that service in an encrypted, secure way.
On GCP Compute Engine, I have a Project A with several VMs running. I am not using App Engine.
How can I duplicate or make an exact copy of Project A as a second Project B (with the same VM configuration and all of Project A's source and data files), resulting in two identical projects?
I can't seem to find a solution. Thank you.
Unfortunately, there is no internal "copy project" tool to perform such a task. Speaking about VMs, the fastest way to move an existing VM would be to make disk images and create new instances in the new project from the image in your old project, using the "--image-project IMAGE_PROJECT" flag.
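Purely as an illustration of that, a sketch of the instance-creation half with the Compute Engine API Python client might look like the following (project, zone, machine type and image names are placeholders); the gcloud equivalent is `gcloud compute instances create ... --image IMAGE --image-project PROJECT_A`.

```python
# Hedged sketch of "new instance in Project B from Project A's image" with the
# Compute Engine API Python client (pip install google-api-python-client).
from googleapiclient import discovery

compute = discovery.build("compute", "v1")

compute.instances().insert(
    project="project-b",  # the new project
    zone="us-central1-a",
    body={
        "name": "copied-vm",
        "machineType": "zones/us-central1-a/machineTypes/e2-medium",
        "disks": [
            {
                "boot": True,
                "autoDelete": True,
                # Image created in (and shared from) the old project:
                "initializeParams": {
                    "sourceImage": "projects/project-a/global/images/copied-vm-image"
                },
            }
        ],
        "networkInterfaces": [{"network": "global/networks/default"}],
    },
).execute()
```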
I'm currently in the process of switching to Google Compute Engine for my web hosting because my current provider's performance has been deteriorating over time; it also gives me more flexibility to upgrade as I need to.
I've got my website set up and working on Compute Engine, but the next steps need to go smoothly to ensure my customers don't experience any downtime.
I have a few things I need to work out:
- Does Google have a way of managing email addresses at your own domain, so that I can just send and receive from Gmail or another email client on my domain? Or do I have to set up an email server on my VM? If so, is there any way to put cPanel-like management software on it?
- To my understanding, I should just have to call my current provider, ask them for my SSL certificates, and have them transfer my domain over to Google and then point it at my VM? Or is there something I'm missing here?
- Are there any simple ways to ensure my server stays secure when I'm managing it myself, other than just updating packages manually? Like a website I can use to track known security problems with the packages I have installed?
Edit:
Please read Dan Cornilescu's comment on this question about setting up your own custom-domain email. He said it can possibly be managed using Google Apps.
On the topic of SSL/domains, I called my current provider and they said they would help me switch over if that's what I decided. They also upgraded my hosting plan, and things seem better now, comparable to the performance I was getting on my Google VM, so I'll be trying that for now.