Google Cloud Functions authentication to other GCP APIs - google-cloud-functions

I want to write a Google Cloud Function that can interact with GCP's Dataproc service to programmatically launch Dataproc clusters. We already have a battle-hardened Dataproc infrastructure; we're just looking to extend the ways in which those clusters get launched.
Our Dataproc clusters can only be launched using an appropriate IAM service account that is already a member of the appropriate IAM roles, so the Cloud Function will need to authenticate to the Dataproc service as that service account. What is the most appropriate way for a Cloud Function to authenticate to other GCP services/APIs using a service account?
Options I suspect include:
* running the function as that service account
* providing a JSON key file and setting the GOOGLE_APPLICATION_CREDENTIALS environment variable
Is there a recognised way of achieving this?
I have had a look at:
* https://cloud.google.com/docs/authentication/
* https://cloud.google.com/docs/authentication/getting-started
but they are not specific to Cloud Functions.
I've also looked at
* https://cloud.google.com/functions/docs/writing/http
but that seems more concerned with how the caller of the function can authenticate.

I think this is what you're looking for: https://cloud.google.com/functions/docs/concepts/iam
At runtime, Cloud Functions defaults to using the App Engine default service account (PROJECT_ID@appspot.gserviceaccount.com), which has the Editor role on the project. You can change the roles of this service account to limit or extend the permissions for your running functions. You can also change which service account is used by providing a non-default service account on a per-function basis.
tl;dr gcloud functions deploy FUNCTION_NAME --service-account SERVICE_ACCOUNT_EMAIL
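For example, assuming a hypothetical function called launch-dataproc-cluster and a dedicated service account dataproc-launcher@my-project.iam.gserviceaccount.com (both names are placeholders, and the role grant may already exist in your setup), the deployment could look roughly like this:

# Grant the service account a Dataproc role (pick whichever role your clusters actually need)
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:dataproc-launcher@my-project.iam.gserviceaccount.com" \
    --role="roles/dataproc.editor"

# Deploy the function so that it runs as that service account
gcloud functions deploy launch-dataproc-cluster \
    --runtime=python39 \
    --trigger-http \
    --service-account=dataproc-launcher@my-project.iam.gserviceaccount.com

Inside the function, the Google Cloud client libraries pick up this identity automatically through Application Default Credentials, so there is no need to ship a JSON key file or set GOOGLE_APPLICATION_CREDENTIALS.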
By the way, if you ever need more complex scheduling logic, consider looking into Cloud Composer (managed Apache Airflow): https://cloud.google.com/composer/

Related

Authenticating to cloud function from PubSub push subscription

We are using Pub/Sub for queuing, with a push subscription pointing at an HTTP-triggered Cloud Function. According to this documentation, Cloud Run and App Engine will both authenticate requests from Pub/Sub, but Cloud Functions isn't listed. We have used other Google services, like Cloud Scheduler, to invoke functions which require authentication, but have not had luck doing so with Pub/Sub.
My question is, does Cloud Functions support authentication from Pub/Sub through a service account set on the subscription, or is it required that the function read and deal with the JWT itself for authentication?
You need a few different things:
* a service account with the role roles/cloudfunctions.invoker
* Enable authentication ticked on the push subscription
* your service account selected there
* the Cloud Function URL (as provided in the Cloud Function details) added in the audience field; that's the missing part in the Ricco answer
EDIT 1
Pub/Sub needs to have the authorization to generate a token on the service account. Check the first step of that documentation: it shows how to grant the Pub/Sub service agent service account the Token Creator role.
Pub/Sub subscriptions support the use of service account authentication for push subscriptions.
To use service accounts, just specify the endpoint of the Cloud Function, enable authentication, and add a service account to be used to send requests to the Cloud Function. Make sure that the service account has the appropriate permissions to access both Pub/Sub and Cloud Functions.
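Putting those pieces together with gcloud, a rough sketch (the project, function, topic, subscription and service account names are all placeholders; PROJECT_NUMBER is your numeric project number):

# 1. Allow the push service account to invoke the authenticated function
gcloud functions add-iam-policy-binding my-function \
    --region=us-central1 \
    --member="serviceAccount:pubsub-pusher@my-project.iam.gserviceaccount.com" \
    --role="roles/cloudfunctions.invoker"

# 2. Let the Pub/Sub service agent create OIDC tokens for that service account
gcloud iam service-accounts add-iam-policy-binding \
    pubsub-pusher@my-project.iam.gserviceaccount.com \
    --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountTokenCreator"

# 3. Create the push subscription with authentication enabled,
#    using the function URL as both the push endpoint and the audience
gcloud pubsub subscriptions create my-sub \
    --topic=my-topic \
    --push-endpoint="https://us-central1-my-project.cloudfunctions.net/my-function" \
    --push-auth-service-account="pubsub-pusher@my-project.iam.gserviceaccount.com" \
    --push-auth-token-audience="https://us-central1-my-project.cloudfunctions.net/my-function"

With this in place, Pub/Sub attaches a signed OIDC token to each push request and Cloud Functions validates it, so the function does not need to deal with the JWT itself.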

How do you start a Dataflow template from a Compute Engine instance?

From my workstation I can fire templated Dataflow jobs with the gcloud dataflow jobs command. The required authorization to insert a new job comes from my workstation where I'm logged in.
On the Compute Engine instance I rely on its service account, the one starting with (number)-compute@. Within the IAM section I enabled Dataflow/Dataflow Admin, Dataflow/Dataflow Developer and Dataflow/Dataflow Worker for this service account, to be safe.
I even added Cloud Dataflow Service Agent when I came across that one.
Then I try to start a Dataflow job from the command line, but I get an error about insufficient authentication scopes: ERROR: (gcloud.dataflow.jobs.run) PERMISSION_DENIED: Request had insufficient authentication scopes.
If I do a gcloud auth login with my personal account, of course, it works.
Somehow I'm missing the proper permissions to set on the attached service account.
Is there a guideline I missed? Can somebody please point me into the right direction?
The error message indicates that the instance's access scopes are not set up properly. To launch a job from a GCE VM, the VM must have the compute.read-only, compute, or cloud-platform scope for the project.
The way to verify this is to run the command "gcloud compute instances describe [INSTANCE] --zone=[ZONE]" and look for the "scopes" section.
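For example (instance name and zone are placeholders, and set-service-account requires the instance to be stopped first):

# Inspect the service account and scopes currently attached to the VM
gcloud compute instances describe my-vm --zone=us-central1-a \
    --format="yaml(serviceAccounts)"

# If the scopes are too narrow, stop the VM, re-attach the service account
# with the broader cloud-platform scope, and start it again
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm --zone=us-central1-a \
    --service-account=PROJECT_NUMBER-compute@developer.gserviceaccount.com \
    --scopes=cloud-platform
gcloud compute instances start my-vm --zone=us-central1-a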
This document and this existing question may provide useful guidelines for you.

Can I restrict access to a Google Cloud SQL instance to specific service account?

I have multiple environments in Google Compute Engine (dev, staging, and production), each with its own Google Cloud SQL instance. The instances connect via Cloud SQL Proxy and authenticate with a credential file that is tied to a service account. I want to have a separate service account for each environment, which would be restricted to accessing the SQL instance specific to that environment. Currently, it appears that any service account with role Cloud SQL Client can access any Cloud SQL instance within the same project.
I cannot find any way to restrict access on a Cloud SQL Instance to a specific service account. Is it possible, and if so, how? If not, is there a different way to achieve the goal of preventing a server in one environment from accessing a Cloud SQL instance in another environment?
NOTE: this configuration is possible with Google Cloud Storage; one can assign a specific service account to have various permissions on each bucket, so that the dev service account cannot accidentally access Production files.
Unfortunately, Cloud SQL currently does not support instance level IAM policies.
The only workaround is hosting the instances in different projects.
As of the August 2021 release of Google Cloud SQL:
You can use IAM Conditions to define and enforce conditional, attribute-based access control for Google Cloud resources, including Cloud SQL instances
See the documentation for IAM Conditions for information about how to restrict a user or service account to specific Cloud SQL instances.
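As a rough sketch, a conditional binding for a dev-only service account might look like the following with gcloud (project, service account and instance names are placeholders, and the resource.name format should be double-checked against the IAM Conditions documentation):

# Grant Cloud SQL Client to the dev service account, limited to the dev instance only
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:dev-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/cloudsql.client" \
    --condition='expression=resource.name == "projects/my-project/instances/dev-instance",title=dev-instance-only'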

Default Compute Engine service account can't access Cloud SQL

I'm trying to get my Compute Engine instance to communicate with Cloud SQL using the Proxy. I keep getting this error when I try to start the proxy:
the default Compute Engine service account is not configured with
sufficient permissions to access the Cloud SQL API from this VM.
Please create a new VM with Cloud SQL access (scope) enabled under
"Identity and API access". Alternatively, create a new "service
account key" and specify it using the -credentials_file parameter
When I describe my instance using gcloud compute instances describe the service account and scopes are:
serviceAccounts:
- email: 123456-compute@developer.gserviceaccount.com
scopes:
- https://www.googleapis.com/auth/devstorage.full_control
- https://www.googleapis.com/auth/logging.write
- https://www.googleapis.com/auth/monitoring.write
- https://www.googleapis.com/auth/sqlservice
- https://www.googleapis.com/auth/sqlservice.admin
I can get this working if I create a new instance with full scope permissions:
serviceAccounts:
- email: 123456-compute@developer.gserviceaccount.com
scopes:
- https://www.googleapis.com/auth/cloud-platform
But this seems less secure than just specifying the scopes I need.
This is an issue that was fixed in https://github.com/GoogleCloudPlatform/cloudsql-proxy/pull/21.
We will roll out a new release on Monday (4/18). Alternatively, you can compile from the source on GitHub. Sorry for the inconvenience.
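For reference, once a proxy build containing that fix is in use, creating the VM with just the narrower scopes from the question should be enough; a sketch using gcloud's scope aliases (instance name and zone are placeholders):

# storage-full = devstorage.full_control, sql-admin = sqlservice.admin
gcloud compute instances create my-vm \
    --zone=us-central1-a \
    --scopes=storage-full,logging-write,monitoring-write,sql-admin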

How to attach a service account to an existing GCE VM?

I need to submit a Dataflow job from an existing GCE VM in Google Cloud. I've learned that a service account with the proper scope has to be attached to the VM when the VM is created. What if the VM already exists? How do I attach a service account to an existing VM?
According to the GCE docs you cannot change the attached service account after instance creation:
After you have created an instance with a service account and specified scopes, you cannot change or expand the list of scopes.
See
https://cloud.google.com/compute/docs/authentication#using
for more details.
However, if you don't want to recreate your VM, you should be able to create a service account and authenticate as it using a private key, as described in the following:
https://developers.google.com/identity/protocols/OAuth2ServiceAccount
This is likely less convenient than using a VM service account because you'll need to manage the private key and authentication yourself.
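A minimal sketch of that key-based approach with gcloud, assuming a hypothetical service account dataflow-launcher@my-project.iam.gserviceaccount.com:

# Create a key for the service account (treat this file as a long-lived credential)
gcloud iam service-accounts keys create /tmp/dataflow-launcher-key.json \
    --iam-account=dataflow-launcher@my-project.iam.gserviceaccount.com

# On the VM, make gcloud act as that service account
gcloud auth activate-service-account \
    dataflow-launcher@my-project.iam.gserviceaccount.com \
    --key-file=/tmp/dataflow-launcher-key.json

# Point client libraries (Application Default Credentials) at the same key
export GOOGLE_APPLICATION_CREDENTIALS=/tmp/dataflow-launcher-key.json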