I need to take snapshots of all my servers, from a script, across all of my projects in GCP.
Project count: 10
Servers per project: 5
The script is written on one server: Automation-server.
The script contains:
gcloud compute --project $PROJECT_ID disks snapshot ${DEVICE_NAME} --snapshot-names gcp-${DEVICE_NAME}-${DATE} --zone ${INSTANCE_ZONE}
As of now I have configured my e-mail ID on that server with gcloud auth.
(My e-mail ID has access to all of my projects, so I am able to take snapshots of all the servers.)
So I am able to do the same via scripting.
I do not want to do this via user authentication (i.e. my e-mail ID).
Is there any possibility of doing the above via an application, an API key, etc.?
That is, by granting access to all of the projects to an application or API key, and using that to take the snapshots from the script.
This would be used in the following case:
User X has access to 5 projects, and user Y has access to another set of 5 projects.
I need to take snapshots across all 10 projects using the script,
and in that case the gcloud auth would have to be done via an application, API key, etc.
Is this possible, or is there any other way to handle the above case?
This is possible:
Create a service account in your cloud project.
Go to each of your 10 projects, and grant the service account either the "Editor", "compute.instanceAdmin" or "compute.storageAdmin" IAM role.
Use gcloud auth activate-service-account in your script to have the script run as the service account.
You could also use multiple service accounts for different projects, and switch between them.
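As a rough sketch of those steps, assuming a single service account (the service account name, key file path, project IDs, and role choice below are placeholders, not values from the question):

# Grant the service account a role in each of the projects (role per the list above)
for PROJECT_ID in project-1 project-2 project-3; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
      --member="serviceAccount:snapshot-sa@automation-project.iam.gserviceaccount.com" \
      --role="roles/compute.storageAdmin"
done

# In the snapshot script, run as the service account instead of a user
gcloud auth activate-service-account --key-file=/path/to/snapshot-sa-key.json

DATE=$(date +%Y-%m-%d)
for PROJECT_ID in project-1 project-2 project-3; do
  # List each disk with its zone, then snapshot it as in the original command
  gcloud compute disks list --project "$PROJECT_ID" \
      --format="value(name,zone.basename())" |
  while read -r DEVICE_NAME INSTANCE_ZONE; do
    gcloud compute --project "$PROJECT_ID" disks snapshot "${DEVICE_NAME}" \
        --snapshot-names "gcp-${DEVICE_NAME}-${DATE}" \
        --zone "${INSTANCE_ZONE}"
  done
done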
Related
I would like to know whether it is possible to attach a service account created in my-project-a to a Google Compute Engine instance in, say, my-project-b.
The following command:
gcloud beta compute instances create my-instance \
--service-account=my-service-account@my-project-a.iam.gserviceaccount.com \
--scopes=https://www.googleapis.com/auth/cloud-platform \
--project=my-project-b
gives me the following error:
(gcloud.beta.compute.instances.create) Could not fetch resource:
- The user does not have access to service account 'my-service-account@my-project-a.iam.gserviceaccount.com'. User: 'me@mysite.com'. Ask a project owner to grant you the iam.serviceAccountUser role on the service account.
me@mysite.com is my account and I'm the owner of the org.
Not sure whether this is related, but looking at the UI (in my-project-b) there is no option to add a service account from any other project. I was hoping to be able to add the account my-service-account@my-project-a.iam.gserviceaccount.com.
You could follow these steps to authenticate a service account from my-project-a on an instance in my-project-b:
Create a service account in my-project-a with the proper role for Compute Engine.
Download its JSON key file.
Copy the email of the new my-project-a service account.
On my-project-b, add a team member using the email copied in the previous step.
Connect via SSH to your instance in my-project-b.
Copy the downloaded JSON key file onto your my-project-b instance.
Run the following command to activate the service account:
gcloud auth activate-service-account --key-file=YOUR_JSON_FILE
Verify by using the following command:
gcloud auth list
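For reference, a hedged sketch of the equivalent IAM grants from the command line (the role roles/compute.instanceAdmin.v1 is an illustrative choice, not taken from the question):

# Allow the my-project-a service account to act on resources in my-project-b
gcloud projects add-iam-policy-binding my-project-b \
    --member="serviceAccount:my-service-account@my-project-a.iam.gserviceaccount.com" \
    --role="roles/compute.instanceAdmin.v1"

# Grant yourself permission to act as the service account,
# which is what the original error message asks for
gcloud iam service-accounts add-iam-policy-binding \
    my-service-account@my-project-a.iam.gserviceaccount.com \
    --member="user:me@mysite.com" \
    --role="roles/iam.serviceAccountUser" \
    --project=my-project-a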
Is it possible to give a Google Compute Engine instance permission to delete itself without also giving it permission to delete other instances?
That is, I'd like instance name ABC to be able to run:
gcloud compute instances delete ABC
using its own name, ABC, but no other name.
From the delete instance API docs, to delete any instance in the project you have to have:
compute.instances.delete permission
One of the following OAuth scopes:
https://www.googleapis.com/auth/compute or https://www.googleapis.com/auth/cloud-platform
This seems to me to mean that you either have permission to delete any instance in the project, or none at all.
No. It is the service account assigned to the instance that runs the gcloud command, not the instance itself.
Permissions are granted by setting policies that grant roles to a user, group, or service account as a member of your project.
Example: the "Compute Instance Admin" role can create, modify, and delete virtual machine instances; that means all the instances in your project. You cannot scope it to a specific instance.
The gcloud command below can be applied to the ABC instance or to any other instance in your project.
gcloud compute instances delete ABC --zone <zone>
The permission compute.instances.delete is in these roles:
Compute Admin
Compute Instance Admin
Project Editor
Project Owner
You can also create a custom role with a mix of permissions and assign it to a service account, but you need to make sure you include every permission required for the action.
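A minimal sketch of such a custom role (the role ID, project, and service account below are placeholders, and a real role may need additional permissions beyond the one shown):

# Create a custom role containing only the delete permission
gcloud iam roles create instanceDeleter \
    --project=my-project \
    --title="Instance Deleter" \
    --permissions=compute.instances.delete

# Bind the custom role to the instance's service account at project level.
# Note: per the answer above, this still applies to every instance in the project.
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
    --role="projects/my-project/roles/instanceDeleter"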
Scopes select the type and level of API access that you grant to the VM.
By default: read-only access to Storage and Service Management, write access to Stackdriver Logging and Monitoring, and read/write access to Service Control.
But you can select which Cloud APIs the VM (that is, its service account) can access.
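As a sketch, the access scopes on an existing VM can be changed with something like the following (instance name, zone, and service account are placeholders; the instance must be stopped first):

# Stop the instance; scopes can only be changed while it is not running
gcloud compute instances stop my-instance --zone=us-central1-a

# Change the access scopes, keeping the same service account
gcloud compute instances set-service-account my-instance \
    --zone=us-central1-a \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=https://www.googleapis.com/auth/compute

gcloud compute instances start my-instance --zone=us-central1-a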
I have multiple environments in Google Compute Engine (dev, staging, and production), each with its own Google Cloud SQL instance. The instances connect via Cloud SQL Proxy and authenticate with a credential file that is tied to a service account. I want to have a separate service account for each environment, which would be restricted to accessing the SQL instance specific to that environment. Currently, it appears that any service account with role Cloud SQL Client can access any Cloud SQL instance within the same project.
I cannot find any way to restrict access on a Cloud SQL Instance to a specific service account. Is it possible, and if so, how? If not, is there a different way to achieve the goal of preventing a server in one environment from accessing a Cloud SQL instance in another environment?
NOTE: this configuration is possible with Google Cloud Storage; one can assign a specific service account to have various permissions on each bucket, so that the dev service account cannot accidentally access Production files.
Unfortunately, Cloud SQL currently does not support instance-level IAM policies.
The only workaround is hosting the instances in different projects.
As of the August 2021 release of Google Cloud SQL:
You can use IAM Conditions to define and enforce conditional, attribute-based access control for Google Cloud resources, including Cloud SQL instances.
See the documentation for IAM Conditions for information about how to restrict a user or service account to specific Cloud SQL instances.
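A hedged sketch of such a conditional binding (the project, instance, and service account names are placeholders; check the IAM Conditions documentation for the exact resource.name format expected for Cloud SQL instances):

# Grant Cloud SQL Client only for the dev instance, via an IAM Condition
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:dev-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client" \
    --condition='title=dev-instance-only,expression=resource.name == "projects/my-project/instances/dev-instance"'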
I'm trying to set up read/write access to a Cloud Storage bucket from a GCE instance, using a service account, but I don't get the permissions. I have done the following (a rough gcloud equivalent of these steps is sketched after the list):
Created a service account, let's say 'my-sa'
Created a bucket, let's say 'my-bucket'
In the IAM console for my project, assigned the 'Cloud Storage admin' role to the service account
Created a new GCE instance via the console, assigned to service account 'my-sa'. Access scope is then automatically set to cloud-platform
Connect to instance using gcloud compute ssh as my user (project owner)
Run gsutil ls gs://my-bucket
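For reference, roughly the gcloud equivalents of the console steps above (the zone is a placeholder, and 'Storage Admin' is assumed to be the role meant by 'Cloud Storage admin'):

# Create the service account and the bucket
gcloud iam service-accounts create my-sa --project=my-project
gsutil mb -p my-project gs://my-bucket

# Grant the service account the Storage Admin role on the project
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.admin"

# Create the instance with that service account and the cloud-platform scope
gcloud compute instances create my-instance \
    --project=my-project \
    --zone=us-central1-a \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=https://www.googleapis.com/auth/cloud-platform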
Expected behaviour: get list of items in bucket
Observed behaviour:
gsutil takes about 5 seconds to think, then gives:
AccessDeniedException: 403 my-sa#my-project.iam.gserviceaccount.com does not have storage.objects.list access to bucket my-bucket.
Things I've tried:
gcloud auth list on the instance does show the service account, and shows it as being active
I've added more permissions to the service account (up to Project Owner), which doesn't make a difference
I also can't use other permissions from the instance. When I give Compute Engine Admin role to the service account, I can't run gcloud compute instances list from the instance
I've removed the .gsutil dir to make sure the cache is cleared
With the default Compute Engine service account, I can list the buckets, but not write (as expected). When I add the Cloud Storage read/write access scope from the console, I can also write
I really don't have a clue how to debug this anymore, so any help would be much appreciated.
I have a Linux VM on Google Compute Engine that I am accessing via SSH. It works just fine, but when I go to the Cloud Console, it asks me if I want to create a new VM as if I have none. I know I'm on the right account because it shows my billing balance has gone down. Where did my server go?
It is weird, but it is important to make a distinction that is not obvious when you start using Google Cloud Platform: the credentials you are using to access the platform (your email or a service account); the project, which is the entity that any resource must be attached to; and the billing account, which is the payment profile that can have several projects associated with it.
In your case you could be looking at a different project that is associated with the same billing account.
To check the project where your machine is, run the following in the shell:
gcloud compute instances list
Here you will see the instances in your current project. If nothing appears, reset the gcloud configuration:
gcloud init
and change the project.
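Alternatively, a quick sketch to check which projects your account can see and to switch the active project without re-running gcloud init (the project ID is a placeholder):

# List every project visible to the active account
gcloud projects list

# Switch the active project directly
gcloud config set project my-project-id

# Then list the instances again
gcloud compute instances list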