Duplicating GCP Project in Compute Engine - google-compute-engine

On GCP Compute Engine, I have a Project A with several VMs running. I am not using App Engine.
How can I duplicate or make an exact copy of Project A into a second Project B (with the same VM configuration and all the source and data files from Project A), resulting in two identical projects?
I can't seem to find a solution. Thank you.

Unfortunately, there is no built-in "copy project" tool to perform such a task. As for the VMs, the fastest way to move an existing VM is to create disk images in the old project, then create new instances in the new project from those images using the "--image-project IMAGE_PROJECT" flag.
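As a minimal sketch of that flow (all project, disk, and image names below are placeholders, not anything from your actual setup):

```shell
# In Project A: create an image from the VM's boot disk.
gcloud compute images create vm1-image \
    --project=project-a \
    --source-disk=vm1-disk \
    --source-disk-zone=us-central1-a

# In Project B: create a new instance from that image,
# pointing --image-project at the old project.
gcloud compute instances create vm1-copy \
    --project=project-b \
    --zone=us-central1-a \
    --image=vm1-image \
    --image-project=project-a
```

Note that the account running the second command needs read access to the image in Project A; repeat the pair of commands for each VM you want to copy.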

Related

Running python script on database file located in google drive

I have a database file which is located on my own google drive (private) and it's updated on a daily basis.
I wrote a Python script to track the data in the database file, but I currently have to download the DB file to my PC and then run the script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From some searching on the web, I found that there is no way to run the script in the Google Drive folder (presumably due to security issues), and that using Google Cloud Platform is not a real option for me since it's not a free service (and, as I understood, there is no free trial).
Anyways, any method that would make my life easier would be happily accepted.
Sorry
That's not possible AFAIK. At least in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, are not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible via a local Python script.
Google Cloud Platform, like other services, does however offer a free tier, though this depends on how much usage you need and on your location, and it can get quite involved quite quickly. There are also various options to choose from; for example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is because it is tedious, and not because your network connection or hard drive can't handle it, if so:
The simple workaround may just be to find a way to automatically download the database via the [Google Drive API](https://developers.google.com/drive) from your Python script, modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script; you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about running over a free tier and getting billed for it.
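A rough sketch of that download step using google-api-python-client and google-auth (the file ID and credentials path below are placeholders you would replace with your own):

```python
# Sketch: fetch the database file from Google Drive before running the
# tracking logic. Assumes OAuth credentials have already been created
# in the Google Cloud console and saved locally.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

FILE_ID = "your-database-file-id"  # placeholder: the Drive ID of the DB file

creds = Credentials.from_authorized_user_file("credentials.json")
service = build("drive", "v3", credentials=creds)

# Stream the file's contents into a local file, chunk by chunk.
request = service.files().get_media(fileId=FILE_ID)
with open("database.db", "wb") as fh:
    downloader = MediaIoBaseDownload(fh, request)
    done = False
    while not done:
        status, done = downloader.next_chunk()

# ...read or modify database.db with your existing script here...
```

With the download inlined like this, the daily task reduces to running one script.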
You could even use the Google Drive Desktop software to keep the database file synced locally. Since it is local, I doubt the software would be able to block a Python script on the file system.
If you don't even want to think about it, you could set up the Python script with a cron job, or a scheduled task if you are on Windows.

Is it possible to partially calculate the graphs of Open Trip Planner?

I have a question about Open Trip Planner. In the project I'm working on I've automated the download of GTFS files whenever an update is available.
Every time my server downloads a new update, however, it is recreating the entire graph (including the map and the GTFS that have not been modified).
Is it possible to update only the graphs related to specific GTFS in OTP?
The guide says it is possible to save graphs, but this part of the tutorial is very sparse, or maybe I cannot find it.
Link to the guide I am referring to: http://docs.opentripplanner.org/en/latest/Basic-Tutorial/
I am using the following command to start OTP:
java -Xmx512m -jar "/otp-1.4.0-shaded.jar" --build "/downloads" --inMemory
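For the graph-saving part specifically, a sketch for OTP 1.x might look like the following (the /graphs directory layout and router name are assumptions, not from the question): the --inMemory flag skips writing Graph.obj to disk, so dropping it lets you build once and then serve the saved graph without rebuilding.

```shell
# Build the graph once and save Graph.obj under /graphs/default
# (no --inMemory, so the graph is written to disk):
java -Xmx512m -jar /otp-1.4.0-shaded.jar --build /graphs/default

# Later, start the server from the saved graph without rebuilding:
java -Xmx512m -jar /otp-1.4.0-shaded.jar --router default --graphs /graphs --server
```

This avoids rebuilding when only restarting the server; as far as I know, OTP still rebuilds the whole graph when any GTFS input changes, so this does not give a true partial rebuild.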

Migrate Cloud Compute VM to Separate Google Cloud Account

Just curious if there's a standard method to take one VM instance and migrate it to a completely different Google Cloud account. If not, I guess I could download all the site files and server configuration files (virtual hosts, apache config, php.ini, etc) but am hoping there's a more streamlined approach.
Just to be clear, I have john@gmail.com and john2@gmail.com, both completely separate accounts, each with VMs on their respective Google Cloud accounts. I want to move a VM instance from john@gmail.com to john2@gmail.com.
I'm open to any method that will get the job done. I like Google's snapshot feature but I have doubts I'll be able to move the specific snapshot over. I'm thinking maybe it'll be possible by creating an image but even that I'm not 100% sure.
Thanks!!
Create a custom image from the VM's disk, share it with the other account, then create a new instance from that image in the target project.
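A sketch of those three steps with gcloud (project IDs, resource names, and zones below are placeholders):

```shell
# In the source project: create an image from the VM's boot disk.
gcloud compute images create my-vm-image \
    --project=source-project \
    --source-disk=my-vm-disk \
    --source-disk-zone=us-central1-a

# Share the image with the other account.
gcloud compute images add-iam-policy-binding my-vm-image \
    --project=source-project \
    --member=user:john2@gmail.com \
    --role=roles/compute.imageUser

# In the target project, logged in as the second account:
gcloud compute instances create my-vm-copy \
    --project=target-project \
    --zone=us-central1-a \
    --image=my-vm-image \
    --image-project=source-project
```

This only copies the boot disk; any additional attached disks would need their own images or snapshots.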

Google compute engine not initialized on new project

I have started a new project on google cloud but I am unable to use the compute engine. I'm getting the following message even after 6-7 hours.
Error: Google Compute Engine is not ready for use yet in the project.
It may take several minutes if Google Compute Engine has just been
enabled, or if this is the first time you use Google Compute Engine in
the project.
Is there something I am missing or there is a problem?
I deleted the project and created a new one, and it worked. Also, the old one is stuck on pending deletion.
If someone from google wants to take a look, it would be nice to inform me on what happened.
When projects are deleted, it takes a few days before the project is permanently deleted. As for what happened, it could be that the project wasn't provisioned correctly when it was created, causing this issue; you solved it by creating a new project.

Google cloud fully global image

I want to create an image which is reachable in every project, for everybody.
How can I do that?
Currently I have a snapshot. I made a persistent disk from it, so my idea is to create a Google Storage blob, copy it across the projects, and create an image in every project.
Can I somehow copy a disk into the storage account?
Or does anybody have a better idea?
My suggestion is to export the image to Google Cloud Storage, following the steps that you can find in the documentation.
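The export step can be sketched as follows (the project, image, and bucket names are placeholders; this command uses Cloud Build behind the scenes, so that API needs to be enabled in the project):

```shell
# Export the image to a tar.gz in a Cloud Storage bucket.
gcloud compute images export \
    --project=my-project \
    --image=my-image \
    --destination-uri=gs://my-bucket/my-image.image.tar.gz
```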
After that you should be able to run the following command:
gcloud compute images create IMAGE_NAME --project PROJECT --source-uri gs://BUCKET_NAME/IMAGE_NAME.image.tar.gz
as per Adrian's answer:
Sadly Google don't make your life easy here... frankly the lack of easy Image migration project-to-project is a bit of a thumbs down at the moment for GCE.