how can I properly invoke a google cloud function inside another function [duplicate] - google-cloud-functions

I am using a Cloud Function to call another Cloud Function on the free spark tier.
Is there a special way to call another Cloud Function? Or do you just use a standard http request?
I have tried calling the other function directly like so:
exports.purchaseTicket = functions.https.onRequest((req, res) => {
  fetch('https://us-central1-functions-****.cloudfunctions.net/validate')
    .then(response => response.json())
    .then(json => res.status(201).json(json))
})
But I get the error
FetchError: request to
https://us-central1-functions-****.cloudfunctions.net/validate
failed, reason: getaddrinfo ENOTFOUND
us-central1-functions-*****.cloudfunctions.net
us-central1-functions-*****.cloudfunctions.net:443
This sounds like Firebase is blocking the connection, even though the target is a Google-owned service and therefore shouldn't be blocked, given that
the Spark plan only allows outbound network requests to Google owned
services.
How can I use a Cloud Function to call another Cloud Function?

You don't need to go through the trouble of invoking that shared functionality via a whole new HTTPS call. You can simply abstract the common bits of code into a regular JavaScript function that either one calls. For example, you could modify the template helloWorld function like this:
var functions = require('firebase-functions');

exports.helloWorld = functions.https.onRequest((request, response) => {
  common(response)
})

exports.helloWorld2 = functions.https.onRequest((request, response) => {
  common(response)
})

function common(response) {
  response.send("Hello from a regular old function!");
}
These two functions will do exactly the same thing, but with different endpoints.

To answer the question directly: yes, you can make an HTTPS request to call another Cloud Function:
import fetch from 'node-fetch'

export const callCloudFunction = async (functionName: string, data: {} = {}) => {
  const url = `https://us-central1-${config.firebase.projectId}.cloudfunctions.net/${functionName}`
  await fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ data }),
  })
}
(Note we are using the npm package 'node-fetch' as our fetch implementation.)
And then simply call it:
callCloudFunction('search', { query: 'yo' })
There are legitimate reasons to do this. We used this to ping our search cloud function every minute and keep it running. This greatly lowers response latency for a few dollars a year.
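If you want the same keep-warm behavior, one way to wire it up is a scheduled function. Below is a minimal sketch using firebase-functions' Pub/Sub scheduler, assuming the callCloudFunction helper above and a deployed function named 'search' (the payload is illustrative):
const functions = require('firebase-functions')

// Minimal sketch: ping the 'search' function every minute to keep it warm.
exports.keepSearchWarm = functions.pubsub
  .schedule('every 1 minutes')
  .onRun(async () => {
    await callCloudFunction('search', { query: 'warmup' })
    return null
  })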

It's possible to invoke another Google Cloud Function over HTTP by including an authorization token. It requires a primary HTTP request to calculate the token, which you then use when you call the actual Google Cloud Function that you want to run.
https://cloud.google.com/functions/docs/securing/authenticating#function-to-function
const {get} = require('axios');

// TODO(developer): set these values
const REGION = 'us-central1';
const PROJECT_ID = 'my-project-id';
const RECEIVING_FUNCTION = 'myFunction';

// Constants for setting up metadata server request
// See https://cloud.google.com/compute/docs/instances/verifying-instance-identity#request_signature
const functionURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
  'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + functionURL;

exports.callingFunction = async (req, res) => {
  // Fetch the token
  const tokenResponse = await get(tokenUrl, {
    headers: {
      'Metadata-Flavor': 'Google',
    },
  });
  const token = tokenResponse.data;

  // Provide the token in the request to the receiving function
  try {
    const functionResponse = await get(functionURL, {
      headers: {Authorization: `bearer ${token}`},
    });
    res.status(200).send(functionResponse.data);
  } catch (err) {
    console.error(err);
    res.status(500).send('An error occurred! See logs for more details.');
  }
};
October 2021 update: you should not need to do this from a local development environment. Thank you Aman James for clarifying this.
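If you do need an identity token outside a Cloud Function (for example locally, with Application Default Credentials configured), the google-auth-library client can mint one without going through the metadata server. A minimal sketch, assuming the functionURL constant from the code above:
const {GoogleAuth} = require('google-auth-library');

async function callReceivingFunction() {
  const auth = new GoogleAuth();
  // The audience must be the URL of the function you are calling.
  const client = await auth.getIdTokenClient(functionURL);
  // client.request() attaches the Authorization: Bearer <id token> header.
  const response = await client.request({url: functionURL});
  return response.data;
}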

Although the question is tagged and the other answers are written in JavaScript, I want to share a Python example, since it reflects the title and also the authentication aspect mentioned in the question.
Google Cloud Functions provides a REST API that includes a call method, which can be used from another Cloud Function.
Although the documentation mentions using Google-provided client libraries, there is no dedicated one for Cloud Functions in Python yet.
Instead you need to use the general Google API Client Libraries. [This is the Python one.]
Probably the main difficulty with this approach is understanding the authentication process.
Generally you need to provide two things to build a client service:
credentials and scopes.
The simplest way to get credentials is to rely on Application Default Credentials (ADC). The right documentation about that is:
https://cloud.google.com/docs/authentication/production
https://github.com/googleapis/google-api-python-client/blob/master/docs/auth.md
The place to get scopes is each REST API method's documentation page.
For example, OAuth scope: https://www.googleapis.com/auth/cloud-platform
The complete code example of calling a 'hello-world' Cloud Function is below.
Before running it:
Create a default Cloud Function on GCP in your project.
Keep and note the default service account to use.
Keep the default body.
Note the project_id, the function name, and the location where you deploy the function.
If you will call the function outside the Cloud Functions environment (locally, for instance), set the environment variable GOOGLE_APPLICATION_CREDENTIALS according to the docs mentioned above.
If you call it from another Cloud Function, you don't need to configure credentials at all.
from googleapiclient.discovery import build
from googleapiclient.discovery_cache.base import Cache
import google.auth
import pprint as pp


def get_cloud_function_api_service():
    class MemoryCache(Cache):
        _CACHE = {}

        def get(self, url):
            return MemoryCache._CACHE.get(url)

        def set(self, url, content):
            MemoryCache._CACHE[url] = content

    scopes = ['https://www.googleapis.com/auth/cloud-platform']

    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set,
    # ADC uses the service account file that the variable points to.
    #
    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set,
    # ADC uses the default service account that Compute Engine, Google Kubernetes Engine,
    # App Engine, Cloud Run, and Cloud Functions provide.
    #
    # See more on https://cloud.google.com/docs/authentication/production
    credentials, project_id = google.auth.default(scopes)
    service = build('cloudfunctions', 'v1', credentials=credentials, cache=MemoryCache())
    return service


google_api_service = get_cloud_function_api_service()
name = 'projects/{project_id}/locations/us-central1/functions/function-1'  # replace {project_id} with your project ID
body = {
    'data': '{ "message": "It is awesome, you are develop on Stack Overflow language!"}'  # JSON passed as a string
}
result_call = google_api_service.projects().locations().functions().call(name=name, body=body).execute()
pp.pprint(result_call)
# expected output is:
# {'executionId': '3h4c8cb1kwe2', 'result': 'It is awesome, you are develop on Stack Overflow language!'}

These suggestions don't seem to work anymore.
To get this to work for me, I made calls from the client side using httpsCallable and imported the requests into Postman. There were some other links at https://firebase.google.com/docs/functions/callable-reference that were helpful, but figuring out where the information was available took a bit of work.
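For reference, the client-side call with the Firebase JS SDK looks roughly like the sketch below (modular v9 API; the argument shape and the 'crud' name mirror this answer's later example and are otherwise illustrative):
import { initializeApp } from 'firebase/app';
import { getFunctions, httpsCallable } from 'firebase/functions';

const app = initializeApp({ /* your Firebase config */ });
const fns = getFunctions(app);
const crud = httpsCallable(fns, 'crud');

async function readItems() {
  // The SDK wraps the argument as { data: ... } and attaches the auth token,
  // which is what shows up when you import the request into Postman.
  const result = await crud(['read']);
  return result.data;
}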
I wrote everything down here as it takes a bit of explaining and some examples.
https://www.tiftonpartners.com/post/call-google-cloud-function-from-another-cloud-function
Here's an inline version, in case the URL expires.
This 'should' work; it's not tested, but it's based on what I wrote and tested for my own application.
const axios = require('axios');

module.exports = function(name, context) {
  const { protocol, headers } = context.rawRequest;
  const host = headers['x-forwardedfor-host'] || headers.host;

  // there will be two different paths for
  // production and development
  const url = `${protocol}://${host}/${name}`;
  const method = 'post';
  const auth = headers.authorization;

  return async (...rest) => {
    const data = JSON.stringify({ data: rest });
    const config = {
      method, url, data,
      headers: {
        'Content-Type': 'application/json',
        'Authorization': auth,
        'Connection': 'keep-alive',
        'Pragma': 'no-cache',
        'Cache-control': 'no-cache',
      }
    };
    try {
      const { data: { result } } = await axios(config);
      return result;
    } catch (e) {
      throw e;
    }
  };
};
This is how you would call this function.
const crud = httpsCallable('crud',context);
return await crud('read',...data);
The context you get from the Google Cloud entry point is the most important piece; it contains the JWT token needed to make the subsequent call to your cloud function (in my example, crud).
To define the other httpsCallable endpoint you would write an export statement as follows
exports.crud = functions.https.onCall(async (data, context) => {})
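For illustration, a filled-in version of that stub might look like the sketch below; the 'read' action and the 'items' Firestore collection are hypothetical placeholders, not part of the original post:
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.crud = functions.https.onCall(async (data, context) => {
  // The helper above posts JSON.stringify({ data: rest }), so `data`
  // arrives here as the argument array, e.g. ['read', ...].
  const [action] = Array.isArray(data) ? data : [data];

  if (action === 'read') {
    const snapshot = await admin.firestore().collection('items').get();
    return snapshot.docs.map(doc => ({ id: doc.id, ...doc.data() }));
  }

  throw new functions.https.HttpsError('invalid-argument', `Unknown action: ${action}`);
});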
It should work just like magic.
Hopefully this helps.

I found that a combination of two of the above methods works best:
const anprURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
  'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + anprURL;

// Fetch the token
const tokenResponse = await fetch(tokenUrl, {
  method: 'GET',
  headers: {
    'Metadata-Flavor': 'Google',
  },
});
const token = await tokenResponse.text();

const functionResponse = await fetch(anprURL, {
  method: 'POST',
  headers: {
    'Authorization': `bearer ${token}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ "imageUrl": url }),
});

// Convert the response to text
const responseText = await functionResponse.text();
// Convert from text to JSON
const responseJson = JSON.parse(responseText);

Extending Shea Hunter Belsky's answer, I would like to point out that the call to Google's metadata server to fetch the authorization token does not work from a local machine.

Since fetch is not readily available in Node.js and my project was already using the axios library, I did it like this:
const url = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}`;
const headers = {
  'Content-Type': 'application/json',
};
const response = await axios.post(url, { data: YOUR_DATA }, { headers });

Related

Google Apps Script - URL Fetch App returning random numbers

I am new to Apps Script and was trying to build an API and call that API through a different script. I created the web app and published it.
This is the URL:
https://script.google.com/macros/s/AKfycbxKVmGy3fxDfoHxyDtQh7psqj7IdKF7qHbgxLAwNRoiKTA-bpKN4QKtArzwsYdFb-Hb/exec
When I open this link, I can see the data correctly, but when I try to fetch this data from a different script using UrlFetchApp, it returns random numbers. I need help with what I am doing incorrectly.
Script which I am using to call this data:
function GetCopies() {
  var options = {
    'contentType': "application/json",
    'method': 'get',
  };
  var Data = UrlFetchApp.fetch('https://script.google.com/macros/s/AKfycbxKVmGy3fxDfoHxyDtQh7psqj7IdKF7qHbgxLAwNRoiKTA-bpKN4QKtArzwsYdFb-Hb/exec', options);
  Logger.log(Data.getContent());
}
This is the log I get:
I tried parsing it, but it throws an error:
How can I get data from URL correctly?
A working sample:
Create two Google Apps Script projects. In my case API and fetcher
API
const doGet = () => {
  const myObj = {
    "name": "Mr.GAS",
    "email": "mrgas@blabla.com"
  }
  return ContentService
    .createTextOutput(JSON.stringify(myObj))
    .setMimeType(ContentService.MimeType.JSON)
}
fetcher
const LINK = "API_LINK"

const fetchTheAPI = async () => {
  const options = {
    'contentType': "application/json",
    'method': 'get',
  }
  const res = UrlFetchApp.fetch(LINK, options)
  const text = res.getContentText()
  console.log(JSON.parse(text))
}
Deploy the API: select type > Web app and Who has access > Anyone, then copy the URL (it is important to copy that URL, not the one redirected to in the browser).
Replace "API_LINK" with the URL.
Run the function.
You only need to adapt this example to suit your needs.
Documentation:
Content Service
Web Apps

Facing challenge to invoke cloud Function from cloud task using oidcToken

I am facing a challenge invoking a Cloud Function from a Cloud Task using an oidcToken.
Here are details of my IAM & Code:
const { CloudTasksClient } = require('@google-cloud/tasks');
const client = new CloudTasksClient();

// See https://cloud.google.com/tasks/docs/tutorial-gcf
module.exports = async (payload, scheduleTimeInSec) => {
  const project = process.env.GOOGLE_APPLICATION_PROJECTID;
  const queue = process.env.QUEUE_NAME;
  const location = process.env.QUEUE_LOCATION;
  const callBackUrl = 'https://asia-south2-trial-288318.cloudfunctions.net/cloud-function-node-expres/';

  // Construct the fully qualified queue name.
  const parent = client.queuePath(project, location, queue);
  const body = Buffer.from(JSON.stringify(payload)).toString('base64');

  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: callBackUrl,
      headers: { 'Content-Type': 'application/json' },
      body
    },
    scheduleTime: {
      seconds: scheduleTimeInSec,
    }
  };

  if (process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL) {
    task.httpRequest.oidcToken = {
      serviceAccountEmail: process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL
    }
  }

  const request = {
    parent: parent,
    task: task,
  };

  // Send create task request.
  try {
    let [responses] = await client.createTask(request);
    return ({ sts: true, taskName: responses.name, msg: "Email Schedule Task Created" });
  } catch (e) {
    return ({ sts: true, err: true, errInfo: e, msg: "Unable to Schedule Task. Internal Error." });
  }
}
The process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL has the Cloud Functions Invoker role, and the Cloud Function has the allAuthenticatedUsers member with the Cloud Functions Invoker role as per the docs.
But I am still seeing a 401 response received by the Cloud Task, and the Cloud Function is not getting called (see image below).
Any comments on what's going wrong here?
This seems to be related to the fact that you created the function in Firebase (guessing from the URL). It appears the "Cloud Functions Invoker" role is not enough for Firebase functions. I replicated similar behavior on a HelloWorld function from Firebase. The error is different (403), but I hope it will help you troubleshoot the same way.
After creating helloWorld in Firebase I tested it with the gcloud command in the following steps:
Create a service account with the role "Cloud Functions Invoker" or use an existing one
Download key for the account in JSON.
Change gcloud to act as service account:
gcloud auth activate-service-account <service-account#email> --key-file=<key-form-step-2.json>
gcloud functions call helloWorld
As the result of the last action I got this error:
ERROR: (gcloud.functions.call) ResponseError: status=[403], code=[Forbidden], message=[Permission 'cloudfunctions.functions.call' denied on resource 'projects/functions-asia-test-vitooh/locations/us-central1/functions/helloWorld' (or resource may not exist).]
So I created a custom role in IAM: Cloud Functions Invoker plus the cloudfunctions.functions.call permission from the error message.
The function started to work with the same gcloud functions call:
executionId: 3fgndpolu981
result: Hello from Firebase!
I think it will work for you as well. You can try adding the same permission; if it doesn't work, try the same testing steps.
References:
gcloud auth command
create custom role in Cloud IAM
gcloud function call

How to retrieve data from AWS Lambda and display it on a static website that's hosted on AWS S3?

I'm attempting to use AWS DynamoDB, Lambda, API Gateway, and S3 to create a simple website. DDB has a table and in the table is a single entry. S3 has a simple file for a HTML for a website. The goal is to display the entry located in DDB in the website, if I update the value in DDB, then refreshing the website should change the number to reflect the update in DDB. At the moment, I have a lambda function which successfully retrieves the entry from DDB. I am stuck in trying to tell the HTML file to call the lambda function and get the data back (using API Gateway). I have never worked with AWS before, forgive me if this isn't even the right approach for this goal.
Below is code for lambda function:
'use strict';

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: 'us-location-x'});

exports.handler = (event, context, callback) => {
  let tableToRead = {
    TableName: 'dataStore',
    Limit: 10
  };
  docClient.scan(tableToRead, function(err, data) {
    if (err) {
      callback(err, null);
    } else {
      callback(null, data);
    }
  });
};
And this is the HTML that's on S3:
<html>
  <head>
  </head>
  <body>
    <h1> This number that's in DDB </h1>
    <div id="entries">
    </div>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>
    <script type="text/javascript">
      // This is the link generated by API Gateway
      var API_URL = 'https://xxxxxxxxxx.execute-api.us-location-x.amazonaws.com/prod/retrieveDDB';
      $(document).ready(function(){
        $.ajax({
          type: 'GET',
          url: API_URL,
          success: function(data){
            $('#entries').html('');
            data.Items.forEach(function(tableItem){
              $('#entries').append('<p>' + tableItem.Row + '</p>');
            })
          }
        });
      });
    </script>
  </body>
</html>
When I run the lambda function using the "test" button, it successfully retrieves the data from the DDB. But when I try to run the HTML, it does say the header text but then it doesn't append the value from DDB. I'm assuming I'm just not understanding how to call/parse the lambda data (If I even have it set up properly).
Thanks in advance for any help!
The cross-origin request blocked error occurs because you are trying to access the API from a different domain (e.g. www.example.com). This is a security feature of browsers called CORS (cross-origin resource sharing). The browser sends a preflight request to the API to check whether the call should be allowed.
A preflight request is sent as an HTTP OPTIONS request to the API method.
There are two steps.
You need to enable CORS on the API Gateway resource method so that the OPTIONS method is enabled for the resource. In short, select the particular resource and then click Enable CORS from the Actions menu.
Reference: please read the "Enable CORS support on a REST API resource" section of https://docs.aws.amazon.com/apigateway/latest/developerguide/how-to-cors.html
You should return the origin header from your Lambda.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'us-location-x' });

exports.handler = async (event) => {
  let tableToRead = {
    TableName: 'dataStore',
    Limit: 10
  };
  const headers = {
    "Access-Control-Allow-Origin": '*'
    // other headers
  };
  try {
    const data = await docClient.scan(tableToRead).promise();
    return {
      statusCode: 200,
      body: JSON.stringify(data),
      headers: headers
    };
  } catch (error) {
    return {
      statusCode: 400,
      body: error.message,
      headers: headers
    };
  }
};
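On the S3 page side, the existing $.ajax call should then receive the scan result as JSON (with the Lambda proxy integration, the stringified body is returned to the browser). A minimal sketch of the success handler, assuming the table items have a Row attribute as in the original page:
$.ajax({
  type: 'GET',
  url: API_URL,
  dataType: 'json', // force JSON parsing of the Lambda's stringified body
  success: function (data) {
    $('#entries').html('');
    (data.Items || []).forEach(function (tableItem) {
      $('#entries').append('<p>' + tableItem.Row + '</p>');
    });
  },
  error: function (xhr) {
    console.error('API call failed', xhr.status, xhr.responseText);
  }
});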
Hope this helps. @Ashish Modi, thanks for sharing the link.

google actions sdk: using third party server to build templated responses

My Google Actions project points to a Google Cloud Function as a webhook. In that Cloud Function I am able to create the conversation responses using conv.ask(...).
However, what I am trying to do is build a generic conversation content framework that resides on another server (also a Google Cloud Function), where I would like to compose the response and send it back to the webhook function.
The relevant code in both of these servers looks like this:
// in the webhook function
app.intent('actions.intent.MAIN', (conv, input) => {
  // here I would like to call the second google function by
  // passing, say, the input and receiving a response that can
  // be passed on to the conv
  // something like
  // assume request-promise is being used
  //
  var options = {
    method: 'POST',
    uri: '..',
    body: {...},
    json: true
  };
  rp(options)
    .then(resp => {
      conv.ask(resp) // this is what I would like to do
    });
});
In the second Cloud Functions server, I am using Express as middleware. Here, based on some logic, templated responses get built:
const ..
const {
  SimpleResponse,
  BasicCard,
  ...
} = require('actions-on-google');
...
const express = require('express');
var app = express();
...
app.post('/main', function(req, res, next) {
  // here I would like to compose the response
  // and send it to the earlier function
  var convresp = new SimpleResponse({...});
  ..
  res.send(convresp);
  // this seems to be only sending the json
  // and causes the receiving response to give an error
  // when applying to conv.ask in the above code
});
The question is: how should the response be sent from the second function so that it can be passed to the conv.ask functionality in the first function? Thanks.

fetch to Wikipedia-api is not being made

I'm doing the Wikipedia viewer from the FCC projects with React. I'm trying to make the request by just passing it the searchQuery (from my state) with a template string, like this:
gettingArticle() {
  const { searchQuery, articlesList } = this.state;
  const API = `https://en.wikipedia.org/w/api.php?action=query&list=search&srsearch=${searchQuery}&prop=info&inprop=url&utf8=&format=json`;
  const body = { method: 'GET', dataType: 'json' };
  const myRequest = new Request(API, body);
  fetch(myRequest)
    .then(response => response.json())
    .then(data => this.setState({
      articlesList: data, articles: true }));
  console.log('data fetched' + displayedArticles);
}
I don't know for sure whether this is how I have to make the request; it's just what I saw in the docs. I want to make the request and, after receiving the data, iterate over the array of objects and put each piece I need in its corresponding tag inside a div. Here is my entire code: https://codepen.io/manAbl/pen/VxQQyJ?editors=0110
The issue is that you missed a key detail in the API documentation:
https://www.mediawiki.org/wiki/API:Cross-site_requests
Unauthenticated CORS requests may be made from any origin by setting the "origin" request parameter to "*". In this case MediaWiki will include the Access-Control-Allow-Credentials: false header in the response and will process the request as if logged out (in case credentials are somehow sent anyway).
So I updated your URL like below:
const API = `https://en.wikipedia.org/w/api.php?action=query&origin=*&list=search&srsearch=${searchQuery}&prop=info&inprop=url&utf8=&format=json`;
Also, your console.log was in the wrong place, so I changed it as below:
fetch(myRequest)
  .then(response => response.json())
  .then(data => {
    console.log('data fetched', data);
    this.setState({
      articlesList: data, articles: true })
  });
Below is an updated pen:
https://codepen.io/anon/pen/BxMyxX?editors=0110
And now you can see the API call works
Of course I didn't check why you have a white strip after the call succeeds, but that may be something wrong in your React code.
The problem is not really in your code but in CORS; basically you are not allowed to make the call because of the same-origin policy.
Change these 2 constants
const API = `https://crossorigin.me/https://en.wikipedia.org/w/api.php?action=query&list=search&srsearch=${searchQuery}&prop=info&inprop=url&utf8=&format=json`;
const body = { method: 'GET', dataType: 'json', mode: 'cors', cache: 'default'};
I added the crossorigin URL before your API because it 'overrides' CORS and enables you to make calls outside the same-origin policy. You should also modify your submit function:
handleSubmit(ev) {
  ev.preventDefault(); // This disables the default form behavior (reload/redirect of the website and loss of state)
  this.gettingArticle();
}