Application Default Credentials Not Working in Cloud Function to access Google Workspace API

I am trying to use a service account from a Google Cloud Function to access the Workspace Directory API, using the Application Default Credentials (ADC) approach. Since the documentation doesn't mention any additional steps to be done on the Google Cloud Console side, I assume that credentials are supplied to the cloud function automatically by Google Cloud's metadata server.

The following code works perfectly in my local, emulated environment, where I have set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the JSON credential file. However, when I deploy the function to Google Cloud, I get the error "Not Authorized to access this resource/api". I have been searching and trying for days without any success.

As an aside, before I stumbled upon this thread, I was using the getCredentials() method of GoogleAuth to get the private_key, create a JWT auth (so I could set the "subject" property), and pass that auth to the Admin API call. That again worked perfectly in my local environment but fails in the cloud environment because the private_key returned by getCredentials() is null, which is probably expected behavior.

Any help is deeply appreciated. I am inexperienced with posting on Stack Overflow; if I need to follow any other protocol, please advise.
import * as functions from "firebase-functions";
import { google } from "googleapis";
import { GoogleAuth } from "google-auth-library";

export const listUsers = functions.https.onCall((data, context) => {
  return new Promise(async (resolve, reject) => {
    const envAuth = new GoogleAuth({
      scopes: ["https://www.googleapis.com/auth/admin.directory.user.readonly"],
      clientOptions: {
        subject: "admin@mydomain.com",
      },
    });
    const client = await envAuth.getClient();
    const service = google.admin({version: "directory_v1", auth: client});
    try {
      const response = await service.users.list({
        customer: "MYCUSTOMER_ID",
      });
      resolve(response);
    } catch (err) {
      reject(err);
    }
  });
});

When you run a Cloud Function without authenticating via a key file (the JSON credential file that holds the key allowing you to impersonate/authenticate as a different user), the function runs as the project's default service account (<YOUR_CLOUD_PROJECT_ID>@appspot.gserviceaccount.com). You don't have to do anything special, no authentication calls: it runs with the permissions of this service account by default.
When you run the code locally, it takes on the identity of whatever key file you have locally (and have referenced in the GOOGLE_APPLICATION_CREDENTIALS environment variable on your machine).
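A quick way to confirm which identity the deployed function is actually resolving is to log the ADC credentials at runtime. This is only a diagnostic sketch using the same google-auth-library package as the question, not part of the fix itself:

import { GoogleAuth } from "google-auth-library";

// Logs which service account (or user) Application Default Credentials resolved to.
export async function logAdcIdentity(): Promise<void> {
  const auth = new GoogleAuth();
  const credentials = await auth.getCredentials();
  console.log("ADC identity:", credentials.client_email);
  console.log("Project:", await auth.getProjectId());
}

Locally this should print the service account from your key file; in the deployed function it should print the <YOUR_CLOUD_PROJECT_ID>@appspot.gserviceaccount.com account.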
Without full security information on your environment, it's hard to know for sure... but I'm guessing that you're trying to access resources that your personal Google Account has proper access to, while the default service account for your Cloud Functions project does not.
So, a couple of routes to troubleshoot and/or fix the issue:
Give the default service account access rights to the Workspace resource(s) you're attempting to access.
Use the JSON key file you already set up locally, so the Cloud Function runs as the same identity it uses when you run locally (see the sketch after this list of options).
Or do a hybrid: create a new service account that ONLY has the permissions you want (instead of using the default service account or your personal user, both of which might have far more permissions than desired for safety/security), use a key file to run the Cloud Function under that identity, and grant the desired permissions only to that service account.
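For the second or third option, a minimal sketch of what the function code could look like, assuming the key file is bundled with the function as service-account.json (a hypothetical name) and the service account has been granted domain-wide delegation for the admin.directory.user.readonly scope in the Workspace Admin console:

import { GoogleAuth } from "google-auth-library";
import { google } from "googleapis";

// Hypothetical key file deployed with the function; the service account must have
// domain-wide delegation so it can impersonate a Workspace admin via "subject".
const auth = new GoogleAuth({
  keyFile: "./service-account.json",
  scopes: ["https://www.googleapis.com/auth/admin.directory.user.readonly"],
  clientOptions: {
    subject: "admin@mydomain.com", // the Workspace admin user to impersonate
  },
});

export async function listWorkspaceUsers(customerId: string) {
  const client = await auth.getClient();
  const service = google.admin({version: "directory_v1", auth: client});
  const response = await service.users.list({customer: customerId});
  return response.data.users;
}

This is essentially the question's code with the key file supplied explicitly instead of relying on the metadata server, since the credentials obtained from the metadata server generally cannot impersonate a Workspace user directly.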

Related

Calling Firebase Hosting API from a Firebase Cloud Function

I have a Firebase (node.js) cloud function that pulls in some data from my app's Firestore database and builds some static content for the web. I'd like that same cloud function to deploy the static content to Firebase hosting via the Firebase Hosting API, creating a static portion of my site with user generated content.
I understand the general flow thanks to the somewhat clear walkthrough, but am stuck on the first step: getting an access token to call the API. Obviously I'm not going to insecurely put my service account key in the cloud function itself, so the example in the walkthrough doesn't apply. And as I understand it, Firebase cloud functions are already associated with a service account, so presumably there's some way to get an access token to call other Google Cloud services from a cloud function.
So how do I get an access token to call the hosting API from a Cloud Function?
There are some red flags that make me think this isn't possible. For example, all of the use cases in the walkthrough allude to other server environments, as opposed to Google Cloud environments. And yet, this use case is the third bullet in the use case list in the walkthrough.
I've searched extensively here and elsewhere for some guidance, but I'm not finding anything. There are some older questions about accessing hosted files from a cloud function that aren't relevant. This promising question from 5 years ago about this exact use case only has dead ends.
You can use the google-auth-library package in Cloud Functions to get a token as shown below:
import { GoogleAuth } from "google-auth-library";

const token = await new GoogleAuth({
  scopes: ["https://www.googleapis.com/auth/cloud-platform"],
}).getAccessToken();
If you use the Firebase Admin SDK in your Cloud Functions, then you can get an access token for the default service account as shown below (do ensure the service account has the required permissions):
import { initializeApp } from "firebase-admin/app";
const admin = initializeApp();
const token = await admin.options.credential?.getAccessToken();
// ^ Google OAuth2 access token object used to authenticate with Firebase services.
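Either way you obtain it, the token is then sent as a Bearer token to the Hosting REST API. Note that getAccessToken() from google-auth-library returns the token string directly, while the Admin SDK call above returns an object whose access_token field holds the string. A rough sketch of the first Hosting API call in the walkthrough's flow (creating a new site version), where the site ID is a placeholder and Node 18+ is assumed so that fetch is available globally:

// `token` is the string from getAccessToken(); with the Admin SDK variant use token.access_token.
const versionRes = await fetch(
  "https://firebasehosting.googleapis.com/v1beta1/sites/YOUR_SITE_ID/versions",
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({}),
  },
);
console.log(await versionRes.json()); // response includes the version name to populate and release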

Cloud Identity Platform make Custom SAML ACS Callback

I'm trying to make a custom SAML app to integrate with Google Workspace (i.e. so that if a person in the organization wants to access it, they could do so from the apps list on google.com).
Because Google Cloud Identity Platform only supports service provider-initiated login, this does not seem possible using the default callback URL they provide. I saw this answer to a similar question, and was hoping to implement something like this. However, the SAMLResponse coming in seems to be encrypted, and I don't know enough about the encryption process to know how to decrypt it (or if that's even possible).
I'm using a Cloud Function as my callback URL, and to be clear, I'm trying to decrypt the req.body.SAMLResponse string:
exports.samlACSCallback = functions.https.onRequest(async (req, res) => {
  console.log(req.body.SAMLResponse)
})
My best guess is that it's somehow related to the certificate that I had to copy from the Google Admin console to the Cloud Identity setup page?
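For what it's worth, in the SAML HTTP-POST binding the SAMLResponse parameter is base64-encoded XML before any encryption comes into play, so a first diagnostic step (just a sketch, not a full SAML handler) is to decode it and check whether the assertion inside is plain XML or an EncryptedAssertion element:

import * as functions from "firebase-functions";

export const samlACSCallbackDebug = functions.https.onRequest(async (req, res) => {
  // The POST binding base64-encodes the SAMLResponse XML document.
  const xml = Buffer.from(req.body.SAMLResponse, "base64").toString("utf8");
  console.log(xml); // an <EncryptedAssertion> element here means the assertion itself is encrypted
  res.status(200).send("ok");
});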

Connecting Google add-on to google vm instance

I am developing an add-on for Google Drive.
As a part of the functionality of this add-on, I would like to incorporate a Google VM server that performs some processing on a user's Google Drive files (i.e. you click on a file through the add-on, send a request with a download link to the server, the server downloads the file, then finally responds to the request with some helpful information about the file). Both the Apps Script for the add-on and the VM instance are connected to the same Google Cloud project.
I am struggling with Google's OAuth 2.0 system and how I can connect the authorization of the add-on and the VM instance together.
Currently, when users open the add-on for the first time, they are brought to the standard authorization screen.
Once they authorize, my add-on has access to all the expected scopes, including read access to Google Drive files.
Now I want my server to have access to them as well. Unfortunately, I do not understand how to do this.
I have tried simply requesting the URL returned from file.getDownloadUrl() in Python. While the request returns a status code of 200, I cannot seem to get the file to download.
I have also looked into the Google Drive API for Python (I am running a Flask server). Unfortunately, it appears that I need an entirely new authorization flow to make it work.
Any clarity on this issue would be greatly appreciated. Frankly, I find Google's documentation on this matter very scattered and confusing, so even knowing the right place to look would be extremely helpful.
Many thanks!
EDIT
I am adding some additional code to help provide some clarity. This is currently how I make a request to my server from the add-on:
var route = 'http://exampleurl.com/process';
var data = {
  'oAuthToken': ScriptApp.getOAuthToken(),
  'stateToken': ScriptApp.newStateToken().withTimeout(120).createToken(),
  'fileId': e.parameters.fileId,
  'fileType': e.parameters.fileMimeType
};
var options = {
  'method' : 'post',
  'payload' : data
};
var response = UrlFetchApp.fetch(route, options);
This code successfully sends information to my VM instance running my server.
Now, I need to authorize the server to download the file specified by fileId.
When developing, I closely followed this tutorial to set up OAuth2.0 access to the Drive API. Here are two key routes:
@app.route('/google/login')
@no_cache
def login():
    session = OAuth2Session(CLIENT_ID, CLIENT_SECRET,
                            scope=AUTHORIZATION_SCOPE,
                            redirect_uri=AUTH_REDIRECT_URI)
    uri, state = session.create_authorization_url(AUTHORIZATION_URL)
    flask.session[AUTH_STATE_KEY] = state
    flask.session.permanent = True
    return flask.redirect(uri, code=302)

@app.route('/google/auth')
@no_cache
def google_auth_redirect():
    req_state = flask.request.args.get('state', default=None, type=None)
    if req_state != flask.session[AUTH_STATE_KEY]:
        response = flask.make_response('Invalid state parameter', 401)
        return response
    session = OAuth2Session(CLIENT_ID, CLIENT_SECRET,
                            scope=AUTHORIZATION_SCOPE,
                            state=flask.session[AUTH_STATE_KEY],
                            redirect_uri=AUTH_REDIRECT_URI)
    oauth2_tokens = session.fetch_access_token(
        ACCESS_TOKEN_URI,
        authorization_response=flask.request.url)
    flask.session[AUTH_TOKEN_KEY] = oauth2_tokens
    return flask.redirect(BASE_URI, code=302)
Is there a way to plug the two tokens I generate from the add-on into this OAuth flow? It appears that Google isn't anticipating this setup, given that I am required to provide a redirect URL, which wouldn't make much sense in the case of my add-on/server tech stack.
Currently, you can successfully send the access token from the Apps Script project (retrieved with getOAuthToken()) to the Flask server.
Since you already have the access token, you don't need to go through the whole OAuth process as defined here (using the application credentials to request the access token, providing user consent, redirecting, etc.). Sending the access token through the API request is actually the last step in the OAuth flow.
You just need to use the token to build the service, and the server will be able to access the file.
Using access token to build service:
from googleapiclient.discovery import build
import google.oauth2.credentials
ACCESS_TOKEN = requestBody['oAuthToken'] # Data coming from Apps Script via UrlFetch
creds = google.oauth2.credentials.Credentials(ACCESS_TOKEN)
drive_service = build('drive', 'v3', credentials=creds) # Build Drive service
Using Drive service to download files:
After you've built the Drive service to access the API, you can use this code sample to download Google Documents, or this one for other files (both examples include a tab with a Python sample).
Note:
You might need to install the Google Client Library if you haven't done so.
Reference:
Using OAuth 2.0 to Access Google APIs

Azure Function using MSI - Error Requesting Token

I have a Function in Azure with MSI (Managed Service Identity) enabled, which I am trying to use to access an Azure-based Web API (an App Service Web App) that in turn has Azure AD authentication enabled (all in the same Azure directory).
My Web API has an Azure AD app registered so it can use AAD authentication.
This app also has the necessary AppRoles configured in its manifest (for the types 'User' and 'Application').
I have also verified that the Function's identity (app) was successfully created in Azure AD when I enabled MSI on the Function.
When I try to obtain a token within my Function using MSI, I receive a 400 Bad Request response/error:
"ExceptionMessage": "AADSTS50105: Application '###' is not assigned to a role for the application '###'
"ErrorCode": "invalid_grant"
I have ensured the resource value I pass in is my Web API's App ID URI.
string resource = "<My App URI>";
string apiversion = "2017-09-01";
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("Secret", Environment.GetEnvironmentVariable("MSI_SECRET"));
var r = await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}", Environment.GetEnvironmentVariable("MSI_ENDPOINT"), resource, apiversion));
return r;
But I still get the same error. The code for requesting a token is fine, and the error does in fact point towards a permissions issue.
The one thing I have not been able to do (and I guess this is the problem) is find a way to add the new MSI/Function identity to the Users & Groups of the Web API's Azure AD app. No matter what I try, my Function's app identity does not appear in the Users list when I search for it to add it as a member of the Web API app (with the Application role).
Does anyone have any suggestions as to why I cannot add the Function's MSI to an app's Users & Groups or to an Azure AD group?
Or am I doing something else wrong perhaps?
Juuna was spot on in his response to my original post:
When enabling MSI on a service, only a service principal is created in Azure AD, and as such it won't appear in search results when you try to add the SP as a member of a group or to the Users & Groups of an Azure AD app.
In order to assign your MSI-created service principal permissions to your app, you need to:
Edit your app's manifest and ensure you have app roles with allowed member type = "Application".
Run the PowerShell cmdlet New-AzureADServiceAppRoleAssignment to grant your service principal the application role you added to your app's manifest.
Within your MSI-enabled service, re-try requesting a token.
For me, after following the above, I can now successfully request app tokens from my Function, so the same Azure Function can call my Web App (which has AAD authentication enabled).
In fact, when you enable MSI, it will create a service principal (not a user), so you cannot find it in Users & Groups.
You can find it in the Azure Portal under Azure Active Directory --> Enterprise applications and grant it the permissions you need.
Note: the service principal name is the same as your function name.
In the function, you should use the following code to get a token; please refer to this link.
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.KeyVault;
// ...
var azureServiceTokenProvider = new AzureServiceTokenProvider();
string accessToken = await azureServiceTokenProvider.GetAccessTokenAsync("https://management.azure.com/");
// OR
var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));

How do I configure project and gcloud to connect to Datastore?

I'm unable to get my Google Compute instance to speak to the Datastore (within the same project).
I believe I've set everything up correctly:
Google Compute instance has the full API scope
All the relevant APIs have been enabled in the project
Billing is enabled in the project
The tutorials claim that I won't even need a service account to use the API, but since it didn't work, I also tried setting up a service account and putting the key file on my instance. Still no luck.
Here's the code I'm trying to run (I created a Test entity in the Datastore of the project, and I can successfully look it up using Google's API explorer with an OAuth2 token for my account):
const gcloud = require('gcloud')({
  projectId: 'roger-web-client',
  keyFilename: './roger-web-client-8d1fbd8baae2.json',
});
const dataset = gcloud.datastore.dataset();

dataset.get(dataset.key(['Test', 5629499534213120]), (error, entity) => {
  console.log(error || entity);
});
This results in the error 403 Forbidden. If I comment out keyFilename, I get 401 Unauthorized instead, which seems to imply the magic authorization on Google Compute instances isn't working for me.
Ideas?