Hi, I am curious: is there a way to share my Google Cloud snapshot with another person, so that they can easily set up everything?
Assuming that by "sharing" you mean "another user should be able to create a VM which is a copy of mine, but in their project": you could create a VM image rather than a snapshot, add the user as a READER on your project, then have them run gcloud compute instances create with --image-project and --image pointing to your VM image.
There are multiple ways to achieve what you want.
If you want the second person to have their own instance which is a duplicate of yours, then what you probably want to do is to take a snapshot of the root persistent disk, and create a new instance from that snapshot. You can find more details on how to do that here:
https://cloud.google.com/compute/docs/disks/create-root-persistent-disks
Note that in this case you probably want to add the second person's Google account to your project so that they can interact with the instance you create for them.
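The snapshot route described above can be sketched with gcloud; all of the disk, snapshot, zone, and project names below are placeholders, not taken from the original answer:

```shell
# 1. In the source project: snapshot the VM's root persistent disk
gcloud compute disks snapshot my-vm --snapshot-names=my-vm-snapshot \
  --zone=us-central1-a --project=source-project

# 2. In the target project: create a disk from that snapshot
#    (the snapshot can be referenced by its full resource path)
gcloud compute disks create copied-disk \
  --source-snapshot=projects/source-project/global/snapshots/my-vm-snapshot \
  --zone=us-central1-a --project=target-project

# 3. Boot a new instance from the copied disk
gcloud compute instances create copied-vm \
  --disk=name=copied-disk,boot=yes \
  --zone=us-central1-a --project=target-project
```

Note that creating the disk in the second project requires the second user to have read access to the snapshot in the source project.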
Alternatively, you could create an account for the other person on your instance, then configure SSH to allow them to log in. You can find more details on how to set up SSH for Linux instances in Google Cloud here:
https://cloud.google.com/compute/docs/instances/connecting-to-instance
I merely wanted to add a comment to Alexey's answer, but my reputation is still below 50 :)
So, to save you time, the exact command to create an instance from the source image is:
gcloud compute instances create <new-instance-name> --image-project <name-of-source-project> --image <name-of-source-image>
I have a requirement where I need to trigger a Cloud Function, which in turn triggers a Dataflow job, once a file is placed in a Google Cloud Storage bucket of another project. Google documentation says it's not possible; please see here: https://cloud.google.com/functions/docs/calling/storage
However, I tried doing this and my cloud function failed to deploy with an error. It looks like this is a permission issue, and if the required permissions are granted, this will work.
Do I need to add and give the Owner permission to the @appspot.gserviceaccount.com account of the project (project A) from which I am trying to access the bucket of the other project (project B)?
So if the above is true, in my project B IAM page I will see two entries as below:
@appspot.gserviceaccount.com OWNER
@appspot.gserviceaccount.com EDITOR
Any inputs on this are much appreciated.
It's not possible to catch events from other projects for now, but there is a workaround.
In the project with the bucket, create a PubSub notification on Cloud Storage.
On the topic that you created, create a push subscription. Use the Cloud Functions URL, and secure the PubSub call (you can get inspiration from there; if you are stuck, let me know and I will take more time to describe this part).
On the Cloud Function, grant the PubSub service account the cloudfunctions.invoker role.
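The first step (routing Cloud Storage events to PubSub) can be sketched with gsutil; the bucket name here is a placeholder, and the topic matches the <GCSNotifTopic> placeholder used below:

```shell
# In the bucket's project: publish object-change events to a PubSub topic
gsutil notification create -t <GCSNotifTopic> -f json gs://<bucketName>
```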
EDIT 1
The security part isn't so easy at the beginning. In your project B (where you have your Cloud Storage bucket), you have created a PubSub topic. On this topic you can create a push subscription with a service account created in project B. Take care to fill in the audience correctly.
Then, you need to grant this "project B" service account the roles/cloudfunctions.invoker role on the Cloud Function of project A.
# Create the service account
gcloud iam service-accounts create pubsub-push --project=<ProjectB>

# Create the push subscription (note that the subscription needs a name)
gcloud pubsub subscriptions create <subscriptionName> \
  --push-endpoint=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
  --push-auth-service-account=pubsub-push@<projectB>.iam.gserviceaccount.com \
  --push-auth-token-audience=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
  --topic=<GCSNotifTopic> --project=<ProjectB>

# Grant the service account the invoker role on the function
gcloud functions add-iam-policy-binding <FunctionName> \
  --member=serviceAccount:pubsub-push@<ProjectB>.iam.gserviceaccount.com \
  --role=roles/cloudfunctions.invoker --project=<projectA>
Last traps:
A Cloud Function in project A doesn't have the same signature if it's an HTTP function (callable by a PubSub push subscription) as it does if it's a background function (callable by events, such as Cloud Storage events). You need to update this according to the documentation.
The PubSub message sent to the Cloud Function is slightly different from the direct Cloud Storage event. Take care to update the input parameters accordingly.
Google documentation says it's not possible
This is all you need to know. It is not possible. There are no workarounds, regardless of what sort of error messages you might see.
Please can you help me? I'm receiving this error when I'm trying to deploy a Google Cloud Function:
HTTP Error: 400, Default service account 'project-name@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.
The command used to deploy is:
firebase deploy --only functions
A temporary solution is fine, but it would be even better if you can help me solve it permanently.
Thanks in advance.
In my case, the App Engine default service account was deleted. It looks like this: {project_id}@appspot.gserviceaccount.com
So I had to restore the service account, like this:
You can recover deleted service accounts with the undelete API: https://cloud.google.com/iam/reference/rest/v1/projects.serviceAccounts/undelete
You have to get the UniqueID of the service account from the activity log: https://console.cloud.google.com/home/activity
Source: https://stackoverflow.com/a/55277567/888881
The API explorer is an easy way to use the IAM API: https://developers.google.com/apis-explorer/#p/
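If you prefer the command line over the API Explorer, a sketch with gcloud (the unique ID is the numeric ID found in the activity log; the project ID is a placeholder):

```shell
# Undelete a recently deleted service account by its numeric unique ID
gcloud beta iam service-accounts undelete <UNIQUE_ID> --project=<project_id>
```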
I was struggling to resolve this issue, then I raised a case with Google.
Here is a detailed article of my learnings:
https://medium.com/@ashirazee/http-error-400-default-service-account-appspot-gserviceaccount-com-accd178ea32a
First, navigate to Google Cloud Platform and view your service accounts.
Try to find <project_id>@appspot.gserviceaccount.com in your list of service accounts for the Firebase project; it is linked to App Engine.
If @appspot.gserviceaccount.com is missing, you cannot deploy anything (SEE EMAIL WITH GOOGLE BELOW). If it isn't missing, check whether it's enabled; try disabling it and enabling it again.
@appspot.gserviceaccount.com is created by default, regardless of whether you have a paid account or not. Try to recall whether at any time you may have deleted it, before or after deployment.
Now, if for any reason you deleted it more than 30 days ago, you cannot retrieve it, and you must create a new Firebase project. However, if it is within 30 days, you can undelete it.
EMAIL FROM GOOGLE:
Email #1
"
Hello Ali
I am checking the logs of your project, unfortunately the service account was deleted on Ma, there is no chance to recover it nor recreate it
The only workaround available is to create a new project and deploy the service desired there. I know this could not be the best option for you nevertheless it is the way this works by design.
Do not hesitate to write back if you have more questions.
Cheers,"
Email #2,
"Hello Ali
I am glad to read that you have been able to deploy your functions successfully, unfortunately that service account cannot be recovered after 30 days of being deleted and that is the only solution. If you have other questions, please let us know by contacting us again through our support channel.
Cheers,"
Lastly, here is a helpful command line that will help you debug this; however, it won't help if there is no service account, it'll just highlight the obvious:
firebase deploy --only functions --debug
this was my error:
"HTTP Error: 400, Default service account '<project_id>@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account."
Following the error message, you could enable the API from the console by accessing this URL and enabling the API.
Or by gcloud command:
gcloud services --project <project_id> enable cloudfunctions.googleapis.com
As the others stated, sadly, you need to use the default service account.
If within the 30-day period, you can use this guide to find and undelete the service account: https://cloud.google.com/iam/docs/creating-managing-service-accounts#undeleting_a_service_account
You have to enter the commands through the Google Cloud Console (one of the buttons on the right of the top blue app bar will open a terminal).
Try selecting a Google Cloud Platform (GCP) resource location in the Firebase Settings.
I get the following error while trying to create a Google Container Engine cluster.
An unknown error has occurred in Compute Engine: "EXTERNAL: Google Compute Engine: Required 'compute.zones.get' permission for 'projects/access-jobs/zones/europe-west2-c'". Error code: "18" RETRY
Can someone please help me out? I am new to Google Cloud Platform.
Based on the error message, it looks like you do not have the compute.zones.get permission in the access-jobs project. This permission is required to get information about a zone in GCE.
Required 'compute.zones.get' permission for 'projects/access-jobs/zones/europe-west2-c'
Enabling the Google Container Engine API
You will need to enable the API. You can visit this URL, choose a project, and then wait a few minutes for the API to be enabled. After that, you will be able to create container clusters.
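If you prefer the command line over the console URL, the same thing can be sketched with gcloud (the service name for Container Engine is container.googleapis.com; the project ID is a placeholder):

```shell
# Enable the Container Engine API for the project
gcloud services enable container.googleapis.com --project=<project_id>
```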
IAM Roles and Missing permissions
If you're still getting this error even after enabling the API, you might actually have a permission issue.
You can look at the list of Compute Engine IAM roles to understand the permissions any given role has. The account you use to access Google Cloud Platform needs to be granted one of the roles that provides the compute.zones.get permission. For creating container clusters (i.e. Google Container Engine (GKE) clusters), you will most likely need more permissions than just this one. You can look at the GKE IAM roles to understand which role would be most suitable. You can also look at the second option.
Use an account which has been granted the project owner or project editor role, so that you can edit your project freely. If you are not the project owner/admin, you might have to check with them to get this role granted, just like in option 1.
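As a sketch, granting such a role from the command line could look like this (the project ID, user email, and role below are examples, not taken from the question):

```shell
# Grant a user a role that includes compute.zones.get
gcloud projects add-iam-policy-binding <project_id> \
  --member=user:someone@example.com \
  --role=roles/container.admin
```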
I would like to use Google Cloud Storage for backing up my database. However, for security reasons, I would like to use a service account with a write-only role.
But it seems this role can also delete objects! So my question: can we make a bucket truly write-only, with no deletion allowed? And of course, how?
This is now possible with the Google Cloud Storage Object Creator role (roles/storage.objectCreator).
https://cloud.google.com/iam/docs/understanding-roles#storage.objectCreator
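As a sketch, granting that role on a single bucket could look like this (the service account and bucket names are placeholders):

```shell
# Grant only object creation on the backup bucket; objectCreator lacks
# storage.objects.delete, so this account cannot remove or overwrite
# existing objects.
gsutil iam ch serviceAccount:backup-sa@<project_id>.iam.gserviceaccount.com:objectCreator gs://<backup-bucket>
```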
You cannot do this, unfortunately. There is currently no way to grant permission to insert new objects while denying the permission to delete or overwrite existing objects.
You could perhaps implement this using two systems: the first being the backup service, which writes to a temporary bucket, and the second being an administrative service that exclusively has write permission on the final backup bucket and whose sole job is to copy in objects if and only if there is no existing object at that location. Basically, you would trust this second job as an administrator.
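The copy-if-absent step of that second job can be approximated with gsutil's no-clobber flag (bucket names are placeholders):

```shell
# -n (no-clobber) skips any object that already exists at the destination
gsutil cp -n gs://<temp-backup-bucket>/* gs://<final-backup-bucket>/
```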
I'm trying to use the WHMCS API / provisioning module to auto-create a MySQL database for a new WHM / cPanel client after their order for hosting goes through.
It seems like it should be simple enough to do, but I can't find anything about it in the docs. The WHMCS module is connected to the WHMCS MySQL instance, but I would want to connect to the "main" local MySQL instance where the client's data is held.
Would I simply connect to localhost in the provisioning module and create a new database and user? Or is there a more robust way to handle this without going outside the WHMCS / WHM system?
Please provide examples if possible!
Thanks for any help
I would use WHMCS's hooks. A hook is a function that is fired as a result of an internal action.
You should look at the invoice items after the invoice is paid but before the email, in case you want to send database information; if not, then after paid should be fine.
This would go in the billing directory, under includes/hooks, and it would look like:
<?php

function init_InvoicePaidPreEmail($vars) {
    // Create your DB here; be sure to look for the specific
    // invoice item "order for hosting" before provisioning.
}

add_hook("InvoicePaidPreEmail", 1, "init_InvoicePaidPreEmail");