Clone a Google Compute Engine instance to a local server - google-compute-engine

I have a VM instance on Google Compute Engine which runs a Python web application. Now I want to have a snapshot of the online VM deployed on my local server. Is it feasible to do that? If yes, how do I proceed?

Related

Can we migrate a local beanstalkd to a GCP Compute instance?

I have a local system with a beanstalkd application running. Now I want to migrate to GCP Compute Engine instances.
I have installed beanstalkd on Compute Engine. So how can I migrate this to Compute Engine?
The steps are straightforward:
1. Make sure the binlog is activated.
2. Locate the binlog file.
3. Stop the Beanstalkd instance for a fully graceful shutdown, so that the binlog file is flushed.
4. Upload the binlog file to Cloud Storage (either using the Console or the gcloud/gsutil utilities).
5. On the Compute Engine instance, make sure the Storage access scope is enabled (in the instance's edit section).
6. Download the binlog file to the machine itself using gcloud/gsutil commands (see the sketch after this list).
7. Start the beanstalkd service and voilà, your persisted messages are there.
You can also set up Beanstalkd Console Admin on the same machine or on other machines.
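A minimal sketch of the upload and download steps using gsutil (the Cloud Storage CLI bundled with the Cloud SDK). The binlog directory /var/lib/beanstalkd and the bucket name my-beanstalkd-migration are assumptions; adjust them to wherever your -b option points and to your own bucket:

    # On the local machine: stop beanstalkd so the binlog is flushed, then upload it.
    sudo systemctl stop beanstalkd
    gsutil cp /var/lib/beanstalkd/binlog.* gs://my-beanstalkd-migration/

    # On the Compute Engine instance (requires the Storage access scope):
    gsutil cp gs://my-beanstalkd-migration/binlog.* /var/lib/beanstalkd/
    sudo systemctl start beanstalkd

gsutil cp is used here for a one-off copy; the Console upload mentioned above works just as well.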

Azure API M - Local Development

What is the story for local development for API M? I want to pull down a local copy of API M for local developers to build their APIs against. Is that possible (also pointing the backend resources at local copies of the code that would run in the cloud)?
I was thinking maybe a Docker image of API M, configured for running locally on a developer machine for our environment. Is this possible?
Yes. Dev-tools support is not there yet, but you can run it locally with the self-hosted gateway: https://learn.microsoft.com/en-us/azure/api-management/self-hosted-gateway-overview. It still requires a connection to the cloud, though, and configuration must be done either through the Azure Portal or ARM.
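A rough sketch of what that looks like on a developer machine, following the self-hosted gateway doc linked above. The gateway name, port mapping, image tag, and the env.conf file (which holds the config.service.endpoint and config.service.auth values generated in the Azure Portal when you provision the gateway resource) are placeholders to substitute with your own:

    # env.conf comes from the Azure Portal's gateway deployment instructions.
    docker run -d \
      -p 8080:8080 -p 8081:8081 \
      --name apim-local-gateway \
      --env-file env.conf \
      mcr.microsoft.com/azure-api-management/gateway:latest

This gives developers a local API M data plane to call, but the gateway still phones home to the cloud service for its configuration, which is why the portal/ARM step cannot be skipped.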

Google Cloud Compute no storage left

Hi, when I try to SSH to a Google Cloud VM instance it doesn't connect, and when I check the logs it says there is no storage available.
But when I connect using the Google Cloud console it connects, and when I check the storage there is enough storage.
Also, one thing: my current persistent disk is 20 GB, but here it shows twice that amount. If anyone can explain to me what's going on here, that would help me out a lot.
The output that you are posting is from Cloud Shell.
When you start Cloud Shell, it provisions a g1-small Google Compute Engine virtual machine running a Debian-based Linux operating system. Cloud Shell instances are provisioned on a per-user, per-session basis. The instance persists while your Cloud Shell session is active; after an hour of inactivity, your session terminates and its VM is discarded. For more on usage quotas, refer to the limitations guide.
With the default Cloud Shell experience, you are allocated an ephemeral, pre-configured VM, and the environment you work with is a Docker container running on that VM. You can also choose to use a custom environment to save your configurations, in which case your environment will be your very own custom Docker image.
Cloud Shell provisions 5 GB of free persistent disk storage mounted as your $HOME directory on the virtual machine instance.
As Travis mentioned, you are running df -h --total in Cloud Shell, so the storage you see is Cloud Shell's, not the VM's.
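To see the difference for yourself, compare df inside Cloud Shell with df on the VM itself; the instance and zone names below are placeholders:

    # Inside Cloud Shell: this reports Cloud Shell's own 5 GB $HOME volume.
    df -h --total

    # On the actual VM (substitute your instance name and zone):
    gcloud compute ssh my-instance --zone=us-central1-a --command='df -h --total'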
Here you can find a related SO question with possible solutions to fix your issue: Disk is full, and I can't SSH to instance.

Invoking gsutil or gcloud from a Google Cloud Function?

I have a Google Firebase app with a Cloud Functions back-end. I'm using the Node.js 10 runtime, which runs on Ubuntu 18.04. Users can upload files to Google Cloud Storage, and that triggers a GCF function.
What I'd like that function to do is copy that file to Google Cloud Filestore (that's File with an L), Google's new NFS-mountable file server. They say the usual way to do that is from the command line with gsutil rsync or gcloud compute scp.
My question is: are either or both of those commands available on GCF nodes? Or is there another way? (It would be awesome if I could just mount the Filestore share on the GCF instance, but I'm guessing that's nontrivial.)
Using NFS-based storage is not a good idea in this environment. NFS works by providing a mountable file system, something that will not work in the Cloud Functions environment, as its file system is read-only with the exception of the /tmp folder.
You should consider using cloud-native storage systems like GCS, for which the Application Default Credentials are already set up. See the list of supported services here.
According to the official Cloud Filestore documentation:
Use Cloud Filestore to create fully managed NFS file servers on Google Cloud Platform (GCP) for use with applications running on Compute Engine virtual machine (VM) instances or Google Kubernetes Engine clusters.
You cannot mount Filestore on GCF.
Also, you cannot execute gsutil or gcloud commands from a Google Cloud Function (see Writing Cloud Functions):
Google Cloud Functions can be written in Node.js, Python, and Go, and are executed in language-specific runtimes.

Issues while setting up Objectify with the Datastore

I have an application which uses Objectify that I want to deploy on a Google Compute Engine instance to access Google Cloud Datastore. I have been able to test this application on the local development server using Objectify. I am also able to access Cloud Datastore from the Compute Engine instance by following the documentation at https://cloud.google.com/datastore/docs/getstarted/start_java/.
But when I deploy my application on the Google Compute Engine instance, I am not able to communicate with Cloud Datastore and get the following exception:
No API environment is registered for this thread.
I must be missing something. Kindly help me out.
As far as I can tell, Objectify currently only works with Datastore if you are running in Google App Engine (GAE), not for Datastore access via Google Compute Engine (GCE). There is an open issue: https://github.com/objectify/objectify/issues/203
The reason it does not work on Google Compute Engine is apparently that the API for Datastore access from GCE is different from the one used in GAE.