Azure Function using MSI - Error Requesting Token

I have an Azure Function with MSI (Managed Service Identity) enabled, which I am trying to use to access an Azure-hosted Web API (an App Service web app) that has Azure AD authentication enabled (all in the same Azure AD directory).
My Web API has an Azure AD app registration so it can use AAD authentication.
This app also has the necessary app roles configured in its manifest (for allowed member types 'User' and 'Application').
I have also verified that the Function's identity was successfully created in Azure AD when I enabled MSI on the Function.
When I try to obtain a token within my Function using MSI, I receive a 400 Bad Request response/error:
"ExceptionMessage": "AADSTS50105: Application '###' is not assigned to a role for the application '###'
"ErrorCode": "invalid_grant"
I have ensured that the resource value I pass in is my Web API's App ID URI:
// App ID URI of the target Web API (the resource to request a token for)
string resource = "<My App URI>";
string apiversion = "2017-09-01";

// MSI_ENDPOINT and MSI_SECRET are injected into the Function's environment when MSI is enabled
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("Secret", Environment.GetEnvironmentVariable("MSI_SECRET"));
var r = await client.GetAsync(String.Format("{0}/?resource={1}&api-version={2}", Environment.GetEnvironmentVariable("MSI_ENDPOINT"), resource, apiversion));
return r;
But I still get the same error. The code for requesting a token is fine, and the error does in fact point towards a permissions issue.
The one thing I have not been able to do (and I guess this is the problem) is find a way to add the new MSI/Function identity to the Users & Groups of the Web API's Azure AD app. No matter what I try, my Function's identity does not appear in the user list when I search for it to add it as a member of the Web API app (with the Application role).
Does anyone have any suggestions as to why I cannot add the Function's MSI to an app's Users & Groups or to an Azure AD group?
Or am I doing something else wrong perhaps?

Juuna was spot on in his response to my original post:
When enabling MSI on a service, only a service principal is created in Azure AD, and as such it won't appear in search results when trying to add it as a member of a group or to the Users & Groups of an Azure AD app.
In order to grant your MSI-created service principal permissions to your app, you need to:
1. Edit your app's manifest and ensure you have app roles with allowed member type "Application".
2. Run the PowerShell cmdlet New-AzureADServiceAppRoleAssignment to grant your service principal the application role you added to your app's manifest.
3. Within your MSI-enabled service, retry requesting a token.
Following the above, I can now successfully request app tokens from my Function, so the same Azure Function can call my web app (which has AAD authentication enabled).

In fact, when you enable MSI, it creates a service principal (not a user), so you cannot find it in Users & Groups.
You can find it in the Azure portal and grant it the permissions you need: Azure Active Directory --> Enterprise applications.
Note: the service principal has the same name as your Function.
In the Function, you can use the following code to get a token; please refer to this link.
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.KeyVault;
// ...
// The token provider picks up the Function's managed identity automatically;
// pass your Web API's App ID URI as the resource to get a token for your own API
var azureServiceTokenProvider = new AzureServiceTokenProvider();
string accessToken = await azureServiceTokenProvider.GetAccessTokenAsync("https://management.azure.com/");
// OR, for Key Vault:
var kv = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));

Related

Application Default Credentials Not Working in Cloud Function to access Google Workspace API

I am trying to use a service account from Google Cloud Functions to access the Workspace Directory API, via the Application Default Credentials approach. Since the documentation doesn't mention any additional steps on the Google Cloud Console side, I assume that Application Default Credentials (ADC) are automatically passed to the Cloud Function from Google Cloud's metadata server.
The following code works perfectly in my local, emulated environment, where I have set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the JSON credential file. However, when I deploy the function to Google Cloud, I get the error "Not Authorized to access this resource/api". I have been searching and trying for days without any success.
As an aside, before I stumbled upon this thread, I was using the getCredentials() method of GoogleAuth to get the private_key to create a JWT auth client (for the "subject" property) and passing that auth to the Admin API call. That again worked perfectly in my local environment but fails in the cloud environment because getCredentials() returns a null private_key, which is probably expected behavior.
Any help is deeply appreciated. I am inexperienced with posting on Stack Overflow; if I need to follow any other protocol, please advise.
import * as functions from "firebase-functions";
import {GoogleAuth} from "google-auth-library";
import {google} from "googleapis";

export const listUsers = functions.https.onCall((data, context) => {
  return new Promise(async (resolve, reject) => {
    const envAuth = new GoogleAuth({
      scopes: ["https://www.googleapis.com/auth/admin.directory.user.readonly"],
      clientOptions: {
        subject: "admin@mydomain.com",
      },
    });
    const client = await envAuth.getClient();
    const service = google.admin({version: "directory_v1", auth: client});
    try {
      const response = await service.users.list({
        customer: "MYCUSTOMER_ID",
      });
      resolve(response);
    } catch (err) {
      reject(err);
    }
  });
});
When you run a Cloud Function without authenticating via a key file (the JSON credential file that holds the key allowing you to impersonate/authenticate as a different user), the function runs as the default application credentials for the project (<YOUR_CLOUD_PROJECT_ID>@appspot.gserviceaccount.com). You don't have to do anything special, no authentication calls -- it runs with the permissions of this service account by default.
When you run the code locally, it takes on the identity of whatever key file you have locally (and have referenced in the GOOGLE_APPLICATION_CREDENTIALS environment variable on your machine).
Without full security information on your environment, it's hard to know for sure... but I'm guessing that you're trying to access resources that your personal Google account has proper access to; however, the default service account for your Cloud Functions project does not.
So, a couple of routes to troubleshoot and/or fix the issue:
Give the default service account access rights to the Workspace resource(s) you're attempting to access.
Use the JSON key file you already set up locally, so the Cloud Function runs as the same user as when you run locally.
Or do a hybrid: create a new service account that has ONLY the permissions you want (instead of using the default service account or your personal user, both of which might have far more permissions than desired for safety/security), use a key file to run the Cloud Function under that identity, and grant only the desired permissions to that service account.

Google Cloud Function :: Service account :: JWT token and Bearer token

I have a Google Cloud Function. I also have a web application. I want to authenticate requests to the cloud function by using a service account.
I have the json key file.
I know that I have to follow https://cloud.google.com/functions/docs/securing/authenticating#service-to-function, but that leads me to an IAP page that does not apply to Google Cloud Functions.
Similar instructions are found at https://developers.google.com/identity/protocols/oauth2/service-account
But if I follow the Python library route, I end up with the sample code there:
import googleapiclient.discovery
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta3', credentials=credentials)
response = sqladmin.instances().list(project='exciting-example-123').execute()
This does not directly relate to invoking a cloud function.
This question's answer somewhat deals with my requirement but uses a call API which is only suitable for testing.
Also, I want to expose this API to multiple applications using other technologies like .NET. So I believe the best option for me will be to use the HTTP method (given on the same page):
https://developers.google.com/identity/protocols/oauth2/service-account#httprest
But whatever I do I am unable to get the signature right.
Any help to get this sorted will be highly appreciated as I am stuck on this for the past few days.
You can use the Google auth library like this
from google.oauth2.id_token import fetch_id_token
from google.auth.transport import requests

# For an HTTP-triggered Cloud Function, the audience is the function's trigger URL
audience = "my_audience"
r = requests.Request()
token = fetch_id_token(r, audience)
print(token)
The fetch_id_token method will use the default credentials: either the service account key file defined in the environment variable GOOGLE_APPLICATION_CREDENTIALS, or the service account loaded in the Google Cloud environment.
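To then call a protected HTTP function, pass the ID token as a Bearer token. A minimal sketch building on the snippet above, assuming the audience is the function's own HTTP trigger URL (placeholder below) and that the calling identity has the Cloud Functions Invoker role:
import requests as http  # third-party 'requests', renamed to avoid clashing with google.auth.transport.requests
from google.oauth2.id_token import fetch_id_token
from google.auth.transport.requests import Request

function_url = "https://REGION-PROJECT_ID.cloudfunctions.net/my-function"  # placeholder trigger URL

# For an HTTP-triggered Cloud Function, the audience is the function's URL itself
token = fetch_id_token(Request(), function_url)

# The identity behind the default credentials needs roles/cloudfunctions.invoker on the function
response = http.get(function_url, headers={"Authorization": "Bearer " + token})
print(response.status_code, response.text)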
For now, I followed this answer in PHP
In the claims section, I removed the scope and instead added a target_audience claim:
"target_audience" => "google-function-http-trigger"
where the Cloud Function HTTP trigger looks like https://us-central1-test-project-name.cloudfunctions.net/function-name
This gives the required signed assertion.
Then I follow https://developers.google.com/identity/protocols/oauth2/service-account#httprest to get the id_token
Then with the id_token as the bearer token we can call the cloud function.
Please note that the token expires depending on the time set in the "exp" claim. Once it expires, you have to redo the steps to generate a new id_token.
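For reference, the same target_audience flow can be automated in Python with the google-auth library, which builds the signed assertion and exchanges it for an ID token for you; a rough sketch, assuming a service account key file on disk and using the function's trigger URL as the audience (both placeholders):
from google.oauth2 import service_account
from google.auth.transport.requests import Request

audience = "https://us-central1-test-project-name.cloudfunctions.net/function-name"  # the function's HTTP trigger URL
credentials = service_account.IDTokenCredentials.from_service_account_file(
    "service-account-key.json", target_audience=audience)  # placeholder key file path

# refresh() creates the JWT assertion with target_audience and exchanges it for an ID token;
# call it again after the "exp" time instead of redoing the steps by hand
credentials.refresh(Request())
id_token = credentials.token  # send as "Authorization: Bearer <id_token>" when calling the function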
"I want to authenticate requests to the cloud function by using a service account."
I am not sure I understand the context correctly, but I would try to assign the roles/cloudfunctions.invoker IAM role to that service account (the one used to run your code in the web application) - see Cloud Functions IAM Roles.
In that case, code running under that service account "can invoke an HTTP function using its public URL".
I reckon no JSON keys are required in this case.

Connecting Google add-on to google vm instance

I am developing an add-on for google drive.
As part of the functionality of this add-on, I would like to incorporate a Google VM server that performs some processing on a user's Google Drive files (i.e. you click on a file through the add-on, send a request with a download link to the server, the server downloads the file, and it finally responds to the request with some helpful information about the file). Both the Apps Script for the add-on and the VM instance are connected to the same Google Cloud project.
I am struggling with Google's OAuth 2.0 system and how I can connect the authorization of the add-on and the VM instance together.
Currently, when users open the add-on for the first time, they are brought to the authorization screen.
Once they authorize, my add-on has access to all the expected scopes, including read access to Google Drive files.
Now I want my server to have access to them as well. Unfortunately, I do not understand how to do this.
I have tried simply requesting the URL returned from file.getDownloadUrl() in Python. While the request returns a status code of 200, I cannot seem to get the file to download.
I have also looked into the Google Drive API for python (I am running a flask server). Unfortunately, it appears that I need an entirely new authorization flow to make it work.
Any clarity on this issue would be greatly appreciated. Frankly, I find Google's documentation on this matter very scattered and confusing, so even knowing the right place to look would be extremely helpful.
Many thanks!
EDIT
I am adding some additional code to help provide some clarity. This is currently how I make a request to my server from the add-on:
var route = 'http://exampleurl.com/process';
var data = {
  'oAuthToken': ScriptApp.getOAuthToken(),
  'stateToken': ScriptApp.newStateToken().withTimeout(120).createToken(),
  'fileId': e.parameters.fileId,
  'fileType': e.parameters.fileMimeType
};
var options = {
  'method' : 'post',
  'payload' : data
};
var response = UrlFetchApp.fetch(route, options);
This code successfully sends the information to my VM instance running my server.
Now, I need to authorize the server to download the file specified by fileId.
When developing, I closely followed this tutorial to set up OAuth 2.0 access to the Drive API. Here are the two key routes:
@app.route('/google/login')
@no_cache
def login():
    session = OAuth2Session(CLIENT_ID, CLIENT_SECRET,
                            scope=AUTHORIZATION_SCOPE,
                            redirect_uri=AUTH_REDIRECT_URI)
    uri, state = session.create_authorization_url(AUTHORIZATION_URL)
    flask.session[AUTH_STATE_KEY] = state
    flask.session.permanent = True
    return flask.redirect(uri, code=302)

@app.route('/google/auth')
@no_cache
def google_auth_redirect():
    req_state = flask.request.args.get('state', default=None, type=None)
    if req_state != flask.session[AUTH_STATE_KEY]:
        response = flask.make_response('Invalid state parameter', 401)
        return response
    session = OAuth2Session(CLIENT_ID, CLIENT_SECRET,
                            scope=AUTHORIZATION_SCOPE,
                            state=flask.session[AUTH_STATE_KEY],
                            redirect_uri=AUTH_REDIRECT_URI)
    oauth2_tokens = session.fetch_access_token(
        ACCESS_TOKEN_URI,
        authorization_response=flask.request.url)
    flask.session[AUTH_TOKEN_KEY] = oauth2_tokens
    return flask.redirect(BASE_URI, code=302)
Is there a way to plug the two tokens I generate in the add-on into this OAuth flow? It appears that Google isn't anticipating this setup, given that I am required to provide a redirect URL, which wouldn't make much sense for my add-on/server stack.
Currently, you can successfully send the access token from the Apps Script project (retrieved with getOAuthToken()) to the Flask server.
Since you already have the access token, you don't need to go through the whole OAuth process as defined here (use the application credentials to request the access token, provide user consent, redirect, etc.). Sending the access token with the API request is actually the last step in the OAuth flow.
You just need to use the token to build the service, and the server will be able to access the file.
Using access token to build service:
from googleapiclient.discovery import build
import google.oauth2.credentials
ACCESS_TOKEN = requestBody['oAuthToken'] # Data coming from Apps Script via UrlFetch
creds = google.oauth2.credentials.Credentials(ACCESS_TOKEN)
drive_service = build('drive', 'v3', credentials=creds) # Build Drive service
Using Drive service to download files:
After you've built the Drive service to access the API, you can use this code sample to download Google Documents, or this one for other files (both examples include a tab with a Python sample).
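As an illustration, a minimal sketch for the "other files" case, assuming the drive_service built above and a fileId taken from the same Apps Script payload (Google Docs/Sheets/Slides would need files().export_media() with a target MIME type instead):
import io
from googleapiclient.http import MediaIoBaseDownload

file_id = requestBody['fileId']  # assumed to come from the Apps Script payload shown earlier

request = drive_service.files().get_media(fileId=file_id)  # binary/uploaded files only
buffer = io.BytesIO()
downloader = MediaIoBaseDownload(buffer, request)
done = False
while not done:
    status, done = downloader.next_chunk()  # downloads the file in chunks into the buffer
file_content = buffer.getvalue()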
Note:
You might need to install the Google client library if you haven't done so.
Reference:
Using OAuth 2.0 to Access Google APIs

How can I publish a Google App Script using a domain-wide-delegation service account?

I'm trying to use what Google terms a 'domain-wide delegation' service account: https://developers.google.com/admin-sdk/directory/v1/guides/delegation
The specific API I'm trying to access with this delegation is: https://developers.google.com/apps-script/api/
Here's the code:
from google.oauth2 import service_account
import googleapiclient.discovery
import json
import os

SCOPES = ['https://www.googleapis.com/auth/script.projects', 'https://www.googleapis.com/auth/drive']
SERVICE_KEY = json.loads(os.environ['SERVICE_KEY'])
credentials = service_account.Credentials.from_service_account_info(SERVICE_KEY, scopes=SCOPES)
delegated_credentials = credentials.with_subject('fred.bloggs@my-gapps-domain.com')
script = googleapiclient.discovery.build('script', 'v1', credentials=delegated_credentials)
response = script.projects().get(scriptId='<myscriptId>').execute()
print(json.dumps(response))
This fails with:
google.auth.exceptions.RefreshError: ('unauthorized_client: Client is unauthorized to retrieve access tokens using this method.', u'{\n "error" : "unauthorized_client",\n "error_description" : "Client is unauthorized to retrieve access tokens using this method."\n}')
I'm pretty sure I've followed all the steps at https://developers.google.com/api-client-library/python/auth/service-accounts, including authorizing the 'https://www.googleapis.com/auth/script.projects' scope for the client ID of the service account JSON key I downloaded.
Note: I was able to get this particular snippet to work by skipping with_subject, going into the Script dashboard as the user, and 'sharing' the script project.
Unfortunately, that still doesn't allow uploading a new set of files (as 'sharing' doesn't grant the ability to delete). It does at least confirm my calling code is correct, albeit not authenticating properly with the JSON service key.
To clarify:
The script in question is what I believe is termed 'standalone' (not a web app)
It's owned by a bot user I've set up just like a regular G Suite user (as I didn't want scripts in regular users' Google Drives)
The script started in a Google Cloud Project that seemed to be automatically created, and with 'No organisation'. I then created a new project manually within the organisation, and moved the script to that project.
There's an official Google Apps Script client now, so I asked there too: https://github.com/google/clasp/issues/225#issuecomment-400174500 - although they use the JavaScript API (via TypeScript), the principles should be the same.

How do I configure project and gcloud to connect to Datastore?

I'm unable to get my Google Compute instance to speak to the Datastore (within the same project).
I believe I've set everything up correctly:
Google Compute instance has the full API scope
All the relevant APIs have been enabled in the project
Billing is enabled in the project
The tutorials claim that I won't even need a service account to use the API, but since it didn't work, I also tried setting up a service account and putting the key file on my instance. Still no luck.
Here's the code I'm trying to run (I created a Test entity in the Datastore of the project, and I can successfully look it up using Google's API explorer with an OAuth2 token for my account):
const gcloud = require('gcloud')({
  projectId: 'roger-web-client',
  keyFilename: './roger-web-client-8d1fbd8baae2.json',
});
const dataset = gcloud.datastore.dataset();
dataset.get(dataset.key(['Test', 5629499534213120]), (error, entity) => {
  console.log(error || entity);
});
This results in the error 403 Forbidden. If I comment out keyFilename, I get 401 Unauthorized instead, which seems to imply the magic authorization on Google Compute instances isn't working for me.
Ideas?