When sending SMS through Twilio, Twilio makes several requests to a specified URL to report the delivery status of that SMS via webhooks. I want to make this callback asynchronous, so I developed a Cloud Function that sends a representation of the request to a Cloud Task, which in turn reaches a dedicated endpoint of my app that recreates and replays the Twilio request internally.
Twilio signs its requests using:
my Twilio account's secret key
the absolute URL that it reaches out to
and the body of its request
So on my backend, I need to know which endpoint Twilio reached initially. I want to do this inside the Cloud Function, and I want to do it programmatically, because this "asynchronous webhook" is also used when people reply to an SMS.
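For context, the verification my app eventually performs looks roughly like this (a minimal sketch using the twilio Python helper; the function and argument names are just illustrative):
from twilio.request_validator import RequestValidator

def is_twilio_request_valid(auth_token, url, params, signature):
    # auth_token: the Twilio account's secret auth token
    # url: the exact absolute URL Twilio originally called
    # params: the POST form fields of the original request
    # signature: the value of the X-Twilio-Signature header
    return RequestValidator(auth_token).validate(url, params, signature)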
The current URL of my webhook has the following format:
https://<location>-<project>.cloudfunctions.net/<cloud function name>/<some SMS uuid>
The current payload sent to my Cloud Task is the following:
absoluteUri: req.protocol + '://' + req.hostname + req.originalUrl,
relativeUri: req.originalUrl,
queryParams: req.query,
headers: req.headers,
body: req.body,
The problem is that req.originalUrl does not contain the function name, so my absoluteUri currently comes out as:
https://<location>-<project>.cloudfunctions.net/<some SMS uuid>
So here is my question: inside a Cloud Function, is there a way to get its name programmatically?
I'm using Python 3.9 and get the Cloud Function name from the curiously named "K_SERVICE" environment variable:
import os
print("Current Cloud Function Name {}".format(os.environ.get('K_SERVICE')))
Enjoy them while you can, Google are removing more and more useful environment variables with every new language version.
https://cloud.google.com/functions/docs/configuring/env-var#python_37_and_go_111
As the Cloud Function name is the entry point and would otherwise have to be hardcoded, I ended up writing a deploy script that seds a placeholder.
It looks very ugly; if you have a better way, it will be welcome.
I have a Firebase (node.js) cloud function that pulls in some data from my app's Firestore database and builds some static content for the web. I'd like that same cloud function to deploy the static content to Firebase hosting via the Firebase Hosting API, creating a static portion of my site with user generated content.
I understand the general flow thanks to the somewhat clear walkthrough, but am stuck on the first step: getting an access token to call the API. Obviously I'm not going to insecurely put my service account key in the cloud function itself, so the example in the walkthrough doesn't apply. And as I understand it, Firebase cloud functions are already associated with a service account, so presumably there's some way to get an access token to call other Google Cloud services from a cloud function.
So how do I get an access token to call the hosting API from a Cloud Function?
There are some red flags that make me think this isn't possible. For example, all of the use cases in the walkthrough allude to other server environments, as opposed to Google Cloud environments. And yet, this use case is the third bullet in the use case list in the walkthrough.
I've searched extensively here and elsewhere for some guidance, but haven't found anything. There are some older questions about accessing hosted files from a cloud function that aren't relevant. This promising question from 5 years ago about this exact use case only has dead ends.
You can use the google-auth-library package in Cloud Functions to get a token as shown below:
import { GoogleAuth } from "google-auth-library";
const token = await new GoogleAuth({
  scopes: ["https://www.googleapis.com/auth/cloud-platform"],
}).getAccessToken();
If you use the Firebase Admin SDK in Cloud Functions, you can get an access token for the default service account as shown below (do ensure the service account has the required permissions):
import { initializeApp } from "firebase-admin/app";
const admin = initializeApp();
const token = await admin.options.credential?.getAccessToken();
// ^ Google OAuth2 access token object used to authenticate with Firebase services.
I have a Google Cloud Function. I also have a web application. I want to authenticate requests to the cloud function by using a service account.
I have the json key file.
I know that I have to follow https://cloud.google.com/functions/docs/securing/authenticating#service-to-function. But that is leading me to an IAP page that does not apply to Google Cloud Functions.
Similar instructions are found at https://developers.google.com/identity/protocols/oauth2/service-account
But if I follow the Python library code, I end up with the sample code there:
import googleapiclient.discovery
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta3', credentials=credentials)
response = sqladmin.instances().list(project='exciting-example-123').execute()
This does not directly relate to invoking a cloud function.
This question's answer somewhat deals with my requirement, but it uses a Call API that is only suitable for testing.
Also, I want to expose this API to multiple applications built with other technologies like .NET. So I believe the best option for me will be to use the HTTP method (given on the same page):
https://developers.google.com/identity/protocols/oauth2/service-account#httprest
But whatever I do I am unable to get the signature right.
Any help to get this sorted will be highly appreciated, as I have been stuck on this for the past few days.
You can use the Google auth library like this:
from google.oauth2.id_token import fetch_id_token
from google.auth.transport import requests

audience = "my_audience"
r = requests.Request()
token = fetch_id_token(r, audience)
print(token)
The fetch_id_token method will use the default credentials, in this order:
The service account key file defined in the environment variable GOOGLE_APPLICATION_CREDENTIALS
The service account loaded in the Google Cloud environment
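To illustrate, the resulting token is then sent as a bearer token when calling the function (a sketch; the URL is a placeholder, and this uses the plain requests HTTP package, not google.auth.transport.requests):
import requests as http_client  # the standard 'requests' package

# For Cloud Functions, the audience passed to fetch_id_token
# should be the function's own URL.
function_url = "https://<location>-<project>.cloudfunctions.net/<function-name>"
resp = http_client.get(function_url, headers={"Authorization": "Bearer " + token})
print(resp.status_code, resp.text)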
For now, I followed this answer in PHP
In the claims section, I removed the scope and instead added a target_audience claim:
"target_audience" => "google-function-http-trigger"
where the Cloud Function HTTP trigger looks like https://us-central1-test-project-name.cloudfunctions.net/function-name
This will give the required assertion.
Then I follow https://developers.google.com/identity/protocols/oauth2/service-account#httprest to get the id_token
Then with the id_token as the bearer token we can call the cloud function.
Please note that the token expires based on the time set in the "exp" claim. Once it expires, you have to redo the steps to generate a new id_token.
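If you would rather not build the assertion by hand, the Python google-auth library can do the same signing and exchange for you (a sketch; the key file path and URL are placeholders):
from google.auth.transport.requests import Request
from google.oauth2 import service_account

# Signs the JWT with the target_audience claim and exchanges it for an id_token.
credentials = service_account.IDTokenCredentials.from_service_account_file(
    "service-account.json",  # placeholder path to your JSON key file
    target_audience="https://us-central1-test-project-name.cloudfunctions.net/function-name",
)
credentials.refresh(Request())  # redo this once the "exp" time has passed
print(credentials.token)  # the id_token to use as the bearer token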
I want to authenticate requests to the cloud function by using a service account.
I am not sure I understand the context correctly, but I would try assigning the roles/cloudfunctions.invoker IAM role to that service account (the one used to run your code in the web application) - see Cloud Functions IAM Roles.
In that case, code running under that service account "Can invoke an HTTP function using its public URL".
I reckon no JSON keys are required in this case.
This is my first post here. I am sorry if it's a repost, but I've been searching for the answer to my problem for more than a month on every website and forum, and until now... no answers!
My goal is to set up a Gmail pub/sub watch() to trigger an action whenever I receive a new email.
To do so, according to the developer's website, I need to subscribe to Gmail watch() on a daily basis with the code:
# 'gmail' is an authorized Gmail API service object, e.g. built with
# googleapiclient.discovery.build('gmail', 'v1', credentials=creds)
request = {
    'labelIds': ['INBOX'],
    'topicName': 'projects/myproject/topics/mytopic'
}
gmail.users().watch(userId='me', body=request).execute()
Until now I have a working scheduled task with a service account with Invoker permissions. This part just works fine.
In my "initial authorization function" I have:
const {google} = require('googleapis');
// Retrieve OAuth2 config
const oauth2Client = new google.auth.OAuth2(
process.env.CLIENT_ID,
process.env.CLIENT_SECRET,
process.env.CALLBACK_URL
);
exports.oauth2init = (req, res) => {
// Define OAuth2 scopes
const scopes = [
'https://www.googleapis.com/auth/gmail.modify'
];
// Generate + redirect to OAuth2 consent form URL
const authUrl = oauth2Client.generateAuthUrl({
access_type: 'offline',
scope: scopes,
//prompt: 'none'// Required in order to receive a refresh token every time
});
return res.redirect(authUrl);
};
My issue now is that the access token is generated via the consent prompt the first time and is never renewed (the token expires after 1 hour...), which means this code stops working after that period and a "manual" intervention is required. According to the documentation, I need to use the "offline" method, and for "prompt" I can either omit it (only requests permissions the first time) or use none (never asks), as is said here.
I managed to make it work! Tomorrow I will continue with the process.
Should I post my working code here for reference?
Thanks!
I will rephrase the process you illustrated so that there is no ambiguity.
According to the documentation you linked:
You do not subscribe to watch(), you call watch()
watch() is a Gmail API call that enables automatic publication of events to a Pub/Sub topic you define, given conditions you specify. Who are you watching? For what events?
You subscribe to a Pub/Sub topic that is targeted by your previous watch() call
A process (e.g. a Google Cloud Function) subscribes to the topic and consumes the messages sent by the Gmail API
The call is to be renewed at least every seven days
Because Google needs to be sure you still need to monitor the targeted inbox, it needs a renewal from you. Another watch() call accomplishes this.
Cloud Scheduler enables this periodic renewal
This service will trigger the renewal script you put in your question. To do so, it needs to authenticate to the platform that hosts the script. It is easier if your script is hosted in a Google service (Cloud Functions, Cloud Run, ...), and the authentication type depends on the form of the target URL. In all cases YOU DO NEED an authentication token in your request header. The token is generated from a service account you created with the right permission to call your script (e.g. Cloud Run Invoker). By default the scheduler has the right to generate a token from it.
So far so good. Now comes the tricky part, which you don't mention in your question: how is your Gmail API client authenticated? You cannot monitor someone's inbox unless that person gave you permission, i.e. you call the API with the right OAuth2 token. Indeed, in the video you point to, they authenticate the user using this principle, which is implemented in their code with express-oauth2-handler.
So you will have a Cloud Function to initialize end-user authentication and watch his/her inbox. The renewal should do the same, but the problem is that the user will not be there to accept the consent screen. Here comes offline access, but it is beyond the scope of your question. Finally, a second function will subscribe to the Pub/Sub topic and consume the messages as you need. See their implementation code, which populates a spreadsheet.
The documentation you shared in the comments does not say that you can remove the token from the service account request headers; also, the Gmail API documentation you shared says that you only:
need to grant publish privileges to gmail-api-push#system.gserviceaccount.com. You can do this using the Cloud Pub/Sub Developer Console permissions interface following the resource-level access control instructions.
To achieve this, what you basically need is a setup of two Cloud Functions: the first, scheduled, function is responsible for setting up the watch() (you can check this documentation for how to deploy a scheduled function), and the second function is triggered by the Pub/Sub topic carrying the Gmail notifications (you can check this documentation for how to build an event-triggered function). Both processes are similar.
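As a rough sketch (I have not run this against the Gmail API; names are illustrative), the second, Pub/Sub-triggered function could look like this in Python, following the documented notification format:
import base64
import json

def on_gmail_notification(event, context):
    # Gmail publishes a base64-encoded JSON payload such as
    # {"emailAddress": "user@example.com", "historyId": "9876543210"}
    notification = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print("New activity for {} at history id {}".format(
        notification["emailAddress"], notification["historyId"]))
    # From here, call users.history.list with the historyId to see what changed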
NOTE: I have never used the Gmail API, so I am not sure if any extra steps are necessary, but then again, the documentation implies that setting up the permissions of that service account is enough to make it work.
EDIT:
As per the information you have shared, the issue is likely that you are not properly setting the Service Account to authenticate with the Cloud Function. As described in the documentation, you have to grant the Service Account the Cloud Functions Invoker role in IAM.
Let me know if this fixed the issue.
I'm working on a performance evaluation app in Google App Maker. One of the challenges we have with our current tool is that it doesn't sync with our G Suite directory when a person's manager changes or when a person has a name change -- their existing evaluations are linked to the person's old name, and we have to change them manually.
In my new app, I have an Employees datasource that includes a relation to the evaluation itself that was initially populated via the Directory API. Reading the documentation here, it seems as though I should be able to set up a watch on the Users resource to look for user updates and parse through them to make the appropriate name and manager changes in my Employees datasource. What I can't figure out, though, is what the receiving URL should be for the watch request.
If anyone has done this successfully within Google App Maker, or even solely within a Google Apps Script, I'd love to know how you did it.
EDITED TO ADD:
I created a silly little GAS test function to see if I can get #dimu-designs' solution below to work. Unfortunately, I just get a Bad Request error. Here's what I have:
function setUserWatch() {
var optionalArgs = {
"event": "update"
};
var resource = {
"id": "10ff4786-4363-4681-abc8-28166022425b",
"type": "web_hook",
"address": "https://script.google.com/a/.../...hXlw/exec"
};
AdminDirectory.Users.watch(resource);
}
Address is the current web app URL.
EDITED TO ADD MORE:
The (in)ability to use GAS to receive web hooks has been an active issue/feature request since Sep 2014 -- https://issuetracker.google.com/issues/36761910 -- which #dimu-designs has been on top of for some time.
This is a more comprehensive answer.
Google supports push notifications across many of their APIs. However, there are many subtle (and not so subtle) differences between them. Some that leverage webhooks send their data payloads primarily as HTTP headers; for example, the Drive API and Calendar API. Others mix their payloads across HTTP headers and a POST body (ex: AdminDirectory API). And it gets even crazier, with some APIs utilizing different mechanisms altogether (ex: the Gmail API leverages Cloud Pub/Sub).
There are nuances to each but your goal is to leverage AdminDirectory push notifications in a GAS app. To do that you need a GAS Web App whose URL can serve as a web-hook endpoint.
STEP 1 - Deploy A Stand-Alone Script As A Web App
Let's start with the following template script and deploy it as a Web App from the Apps Script Editor menu Publish > Deploy As Web App:
/** HTTP GET request handler */
function doGet(e) {
return ContentService.createTextOutput("GET message");
}
/** HTTP POST request handler */
function doPost(e) {
return ContentService.createTextOutput("POST message");
}
STEP 2 - Verify/Validate Domain Ownership And Add/Register Domain
NOTE: As of August 2019, GAS Web App URLs can no longer be verified using this method. Google Cloud Functions may be a viable
alternative.
With the web app deployed you now have to verify and register the domain of the receiving url, which in this case is also the web app url. This url takes the following form:
https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/exec
Technically you cannot own a GAS web app url's domain. Thankfully the App Script Gods at Google do provide a mechanism to verify and register a GAS web app url.
From the Apps Script Editor menu select Publish > Register in Chrome Web Store. Registering a published web app with the Chrome Web Store also validates the URL's domain (no need to fiddle with the search console).
Once validated you need to add the "domain" via the Domain verification page in the API Console. The "domain" is everything in the url sans the 'exec', so you'll add a string that looks like this:
https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/
STEP 3 - Make a watch request
For this step the AdminSDK/Directory API service should be enabled both for your App Script project and in the API Console.
Create a function that generates a watch request (this can be retooled for other event types):
function startUpdateWatch() {
var channel = AdminDirectory.newChannel(),
receivingURL = "https://script.google.com/macros/s/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/exec",
gSuiteDomain = "[business-name].com",
event = "update";
channel.id = Utilities.getUuid();
channel.type = "web_hook";
channel.address = receivingURL + "?domain=" + gSuiteDomain + "&event=" + event;
channel.expiration = Date.now() + 21600000; // max of 6 hours in the future; note: the watch must be renewed before expiration to keep notifications coming
AdminDirectory.Users.watch(
channel,
{
"domain":gSuiteDomain,
"event":event
}
);
}
Note that Directory API push notifications have an expiration, the max being 6 hours from starting the watch, so the watch must be renewed periodically to ensure notifications keep being sent to the endpoint URL. Typically you can use a time-based trigger to call this function every 5 hours or so.
STEP 4 - Update doPost(e) trigger to handle incoming notifications
Unlike the push mechanisms of other APIs, the Directory API sends a POST body along with its notifications, so the doPost(e) method is guaranteed to be triggered when a notification is sent. Tailor the doPost(e) trigger to handle incoming events and re-deploy the web app:
function doPost(e) {
switch(e.parameter.event) {
case "update":
// do update stuff
break;
case "add":
break;
case "delete":
break;
}
return ContentService.createTextOutput("POST message");
}
There is one caveat to keep in mind. Push notifications for update events only tell you that the user's data was updated; they won't tell you exactly what was changed. But that's a problem for another question.
Note that there are a ton of details I left out but this should be enough to get you up and running.
You can do this with GAS and the Admin SDK. The Directory API supports Notifications (note: this is scheduled to be deprecated, so I am not sure what is replacing this functionality). You can then set up a Gmail script to do what you need to do with the notification.
UPDATE: There are also PUSH notifications from the Directory API.
I was able to set up push notifications for a local resource (a spreadsheet) using a Heroku-based Node.js app as an intermediary API. The Node app captures the custom request headers and builds the payload to be consumed by the doPost(e) function of the GAS web app.
The code for constructing a watch request is simple:
//get the unique id
var channelId = Utilities.getUuid();
//build the resource object
var resource = {
"id": channelId,
"type": "web_hook",
"address": "https://yourapp.herokuapp.com/drivesub
}
//watch the resource
Drive.Files.watch(resource, fileId);
The challenge is to get that domain address verified. There are ways to verify the standalone (not file-bound!) GAS web app; however, as previous posters have mentioned, the Apps Script web app can't access custom headers.
After you've enabled the Pub/Sub API and created the topic & subscription, go to APIs & Services -> Credentials -> Domain verification. It gives you a few options for verifying your domain, including serving an html file. Download the file generated by Google. Thankfully, Heroku makes it very easy to deploy a Node app:
https://devcenter.heroku.com/articles/getting-started-with-nodejs
After your domain is verified you can make your subscription push data to the endpoint URL on Heroku.
I simply created a js file for the route handlers and added one specifically for domain verification:
handlers.verifyDomain = function(callback){
//Synchronously read from the static html file. Async method fs.readFile() doesn't make the file available to the router callback
var file = fs.readFileSync('./google/google_file.html');
callback(200, file);
}
Then include the handler in your router object like so:
var router = {
"google_file.html": handlers.verifyDomain
}
Finally, in your server-starting function, get the path from the URL (there are multiple ways of doing that) and execute the handler:
var routeHandler = router[path];
routeHandler(function(statusCode, file){
  //do something
});
Go back to the domain verification tool and load the HTML page to verify the URL. Once it's verified, the only remaining step is creating the GAS web app and posting to it.
Back to the Node app. Note that my endpoint is https://yourapp.herokuapp.com/drivesub
//The code for posting data to GAS web app. Of course, you need to
// update your router with router['driveSub'] = handlers.driveSub
handlers.driveSub = function(data, callback){
var headers = data.headers;
var options = {
method:"POST",
uri: your_gas_app_url,
json:{"headers":headers}
};
//Use NPM to install the request module
request.post(options, function(err, httpResponse, body){
console.log(err, body);
});
callback(200, data);
}
The Apps Script app (don't forget to publish it):
function doPost(req) {
  var postData = req.postData["contents"];
  var headers = JSON.parse(postData)["headers"];
  //take action
  // doPost must return a ContentService/HtmlService output, not a bare status code
  return ContentService.createTextOutput("OK");
}
Unfortunately you cannot, at least not solely using Apps Script.
Admin Directory push notifications require a web-hook URL endpoint to receive notifications. You might think deploying a GAS web app and using its URL as an endpoint would be sufficient. But the thing with Admin Directory push notifications is that their data payload resides in custom HTTP headers, which cannot be accessed from a GAS Web App. (This also holds true for push notifications across other APIs, including the Drive and Calendar APIs.)
You can however leverage Google Cloud Functions (a GCP service) in tandem with GAS, but you'll have to know your way around Node.js.
EDIT
After giving this some thought and reviewing your requirements, I believe there is a way to pull this off using just GAS.
You can set up a unique push notification channel for a given event per user/domain (the 'update' event in your use case) by setting the event parameter when initializing the watch. Thus the GAS web app will only be triggered when an update event occurs; you don't really need to rely on the HTTP headers to determine the event type.
If you want to track multiple events, you simply create a unique channel per event and use the same GAS Web app endpoint for each one. You differentiate between events by checking the event parameter sent in the POST request. This should remove the need for middle-man services such as Heroku or Google Cloud Functions.
I want to use the autocomplete API from my Android/iOS app. For this I need to call a URL like:
https://maps.googleapis.com/maps/api/place/autocomplete/json?input=paris&key=<myapikey>
The problem is: what prevents someone else from extracting my API key from my app and using it for their own purposes? This is important because, in the end, it's me who will be billed by Google for the usage...
Your intention is to call a Places API web service. Google Maps web services support only IP address restrictions.
You can check what type of restriction is supported by each API on the following page:
https://developers.google.com/maps/faq#keysystem
In order to protect an API key that is used with your sample request, you should create an intermediate server and send your requests from this server. So your application sends a request to the intermediate server, and the intermediate server sends the Places autocomplete request with the protected API key to Google and passes the response back to your app. In this case you can use the IP address of your intermediate server to prevent unauthorized use of your API key.
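For illustration, a minimal sketch of such an intermediate server (Flask is assumed; the route, parameter, and environment variable names are made up):
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
API_KEY = os.environ["PLACES_API_KEY"]  # the key never leaves the server

@app.route("/autocomplete")
def autocomplete():
    # The mobile app calls this endpoint instead of calling Google directly.
    upstream = requests.get(
        "https://maps.googleapis.com/maps/api/place/autocomplete/json",
        params={"input": request.args.get("input", ""), "key": API_KEY},
    )
    return jsonify(upstream.json())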
I hope this helps!
What if you create an intermediate server, create a token for each single user, and also create a monitoring service which blocks suspicious behavior?
For example, a normal user would request x times per day || hour || ...
Or
when a user runs the application for the first time, the application receives the [encrypted API key + decryption key] and stores them in a safe place like the keychain (for iOS).
As far as I know, if you request the Google Maps API directly, there is always a way to sniff the packets.