I need to submit a Dataflow job from an existing GCE VM in Google Cloud. I've learned that a service account with the proper scopes has to be attached to the VM when the VM is created. What if the VM already exists? How do I attach a service account to an existing VM?
According to the GCE docs you cannot change the attached service account after instance creation:
After you have created an instance with a service account and specified scopes, you cannot change or expand the list of scopes.
See
https://cloud.google.com/compute/docs/authentication#using
for more details.
However, if you don't want to recreate your VM, you should be able to create a service account and authenticate as it using a private key, as described in the following:
https://developers.google.com/identity/protocols/OAuth2ServiceAccount
This is likely less convenient than using a VM service account because you'll need to manage the private key and authentication yourself.
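For example, a minimal sketch using gcloud on the VM, assuming you have created a service account and downloaded a JSON key for it (the account name and key path below are placeholders):

# On the VM, activate the service account using its private key:
gcloud auth activate-service-account my-sa@my-project.iam.gserviceaccount.com \
    --key-file=/path/to/key.json
# Subsequent gcloud commands (including Dataflow job submission) will use these credentials.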
Related
I want to write a Google Cloud Function that can interact with GCP's Dataproc service to programmatically launch Dataproc clusters. We already have a battle-hardened Dataproc infrastructure; we're just looking to extend the ways in which clusters get launched.
Our Dataproc clusters can only be launched using an appropriate IAM service account that is already a member of the appropriate IAM roles, hence the Cloud Function will need to authenticate to the Dataproc service using that service account. What is the most appropriate way for a Cloud Function to authenticate to other GCP services/APIs using a service account?
Options I suspect include:
* running the function as that service account
* providing a JSON key file & setting the GOOGLE_APPLICATION_CREDENTIALS environment variable
Is there a recognised way of achieving this?
I have had a look at:
* https://cloud.google.com/docs/authentication/
* https://cloud.google.com/docs/authentication/getting-started
but they are not specific to Cloud Functions.
I've also looked at
* https://cloud.google.com/functions/docs/writing/http
but that seems more concerned with how the caller of the function can authenticate.
I think this is what you're looking for: https://cloud.google.com/functions/docs/concepts/iam
At runtime, Cloud Functions defaults to using the App Engine default service account (PROJECT_ID@appspot.gserviceaccount.com), which has the Editor role on the project. You can change the roles of this service account to limit or extend the permissions for your running functions. You can also change which service account is used by providing a non-default service account on a per-function basis.
tl;dr gcloud functions deploy FUNCTION_NAME --service-account SERVICE_ACCOUNT_EMAIL
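As a hedged sketch, a deployment might look like this (the function name, runtime, trigger, and service account email are all illustrative assumptions):

# Deploy the function so it runs as a dedicated, least-privilege service account:
gcloud functions deploy launch-dataproc \
    --runtime python39 \
    --trigger-http \
    --service-account dataproc-launcher@my-project.iam.gserviceaccount.com
# Inside the function, the Google client libraries pick up these credentials
# automatically through Application Default Credentials.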
By the way, if you ever need more complex scheduling logic, consider looking into Cloud Composer (managed Apache Airflow): https://cloud.google.com/composer/
I'm trying to set up read/write access to a Cloud Storage bucket from a GCE instance using a service account, but the permissions don't seem to take effect. I have done the following:
Created service account, let's say 'my-sa'
Created a bucket, let's say 'my-bucket'
In the IAM console for my project, assigned the 'Storage Admin' role to the service account
Created a new GCE instance via the console, assigned to service account 'my-sa'. Access scope is then automatically set to cloud-platform
Connect to instance using gcloud compute ssh as my user (project owner)
Run gsutil ls gs://my-bucket
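For reference, the setup corresponds roughly to these commands (the project, bucket, and instance names are placeholders):

gcloud iam service-accounts create my-sa
gsutil mb gs://my-bucket
gcloud projects add-iam-policy-binding my-project \
    --member='serviceAccount:my-sa@my-project.iam.gserviceaccount.com' \
    --role='roles/storage.admin'
gcloud compute instances create my-instance \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform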
Expected behaviour: get list of items in bucket
Observed behaviour:
gsutil takes about 5 seconds to think, then gives:
AccessDeniedException: 403 my-sa@my-project.iam.gserviceaccount.com does not have storage.objects.list access to bucket my-bucket.
Things I've tried:
gcloud auth list on the instance does show the service account, and shows it as being active
I've added more permissions to the service account (up to project owner), doesn't make a difference
I also can't use other permissions from the instance. When I give the Compute Engine Admin role to the service account, I can't run gcloud compute instances list from the instance
I've removed the .gsutil dir to make sure the cache is cleared
With the default Compute Engine service account, I can list the buckets, but not write (as expected). When I add the Cloud Storage read/write access scope from the console, I can also write
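For completeness, this is how I checked the active credentials and the scopes the instance actually has (querying the metadata server is a standard way to inspect them):

# On the instance: confirm which account gcloud is using
gcloud auth list
# Ask the metadata server which scopes the instance's credentials carry:
curl -H 'Metadata-Flavor: Google' \
    'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes'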
I really don't have a clue how to debug this anymore, so any help would be much appreciated.
I have a Linux VM on Google Compute Engine that I am accessing via SSH. It works just fine, but when I go to the Cloud Console, it asks me if I want to create a new VM, as if I have none. I know I'm on the right account because it shows my billing balance has gone down. Where did my server go?
It is weird, but it is important to make a distinction that is not obvious when you start using Google Cloud Platform: the credentials you use to access the platform (your email or a service account), the project (the entity that every resource must be attached to), and the billing account (the payment profile, which can have several projects associated with it) are three different things.
In that case, you could be in a different project that is associated with the same billing account.
To check which project your machine is in, run this in the shell:
gcloud compute instances list
This shows the instances in your current project. If nothing appears, reset the gcloud configuration with
gcloud init
and change the project.
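Alternatively, you can list the projects your credentials can see and switch directly; a quick sketch (the project ID is a placeholder):

# List all projects visible to your account:
gcloud projects list
# Point gcloud at the project that actually contains the VM:
gcloud config set project my-other-project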
I'm currently really new to Google Cloud Platform and I have an issue to solve.
I've already created a Compute Engine instance in gcloud. When it was created, it was automatically assigned an external IP. I'd prefer the instances not to have any external IP.
I saw that you can pass the --no-address argument when creating an instance so it won't be assigned an external IP, but how do I release the external IP when the instance has already been created?
> gcloud compute addresses list
NAME       REGION        ADDRESS         STATUS
webserver  europe-west1  130.211.70.XXX  IN_USE
gcloud compute addresses delete webserver --region europe-west1
Or use the 'network' tab in the web interface if it's a one-off need.
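Note that an address listed as IN_USE has to be detached from its instance before it can be released. A minimal sketch, assuming the instance is also named webserver and sits in zone europe-west1-b (both assumptions):

# Detach the external IP from the running instance first
# (uses the default access-config name):
gcloud compute instances delete-access-config webserver --zone europe-west1-b
# Then release the now-unused reserved address:
gcloud compute addresses delete webserver --region europe-west1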
I'm trying to do gcloud init on my fresh GCE instance using a service account that I've created in the Developers Console. In the Developers Console, I see a few service accounts under Permissions, which I can't generate private key files for; I also see a service account that I made under Service accounts which I can get private keys for.
When I do gcloud init on the GCE instance, under "Pick credentials to use", I only see the service accounts in the Permissions tab (for which I don't have private keys). I'd like to use the service account that I have private keys for.
I can log in with my personal account for now, but this isn't scalable. Any advice?
You can use the gcloud auth activate-service-account command to get credentials via the private key for a service account. See the gcloud reference documentation for that command for more information and an example.
Elaborating on @Kamaran's answer after further discussion.
The basic solution is to enable the service account on the GCE instance.
First use gcloud compute copy-files <private json key file> <instance name>:remote/path/to/key to copy the file to the remote instance. Then run the gcloud auth activate-service-account <service account address> --key-file remote/path/to/key command on the remote instance. The new service account will then be available in the gcloud init menu.
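A concrete sketch of those two steps (the key file name, instance name, and service account address are placeholders):

# From your workstation, copy the private key to the instance:
gcloud compute copy-files key.json my-instance:~/key.json
# On the instance, activate the service account:
gcloud auth activate-service-account my-sa@my-project.iam.gserviceaccount.com \
    --key-file ~/key.json
# The service account will now appear as a credentials choice in gcloud init.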