Trigger of a cloud function by a cloud storage bucket in different project - google-cloud-functions

I have a requirement where I need to trigger a Cloud Function, which in turn triggers a Dataflow job, once a file is placed in a Google Cloud Storage bucket of another project. Google documentation says it's not possible; please see here: https://cloud.google.com/functions/docs/calling/storage
However, I tried doing this anyway, and my Cloud Function failed to deploy with an error.
It looks like this is a permission issue, and if the required permissions are given, this should work.
Do I need to grant Owner permission to the <project-A-id>@appspot.gserviceaccount.com service account of the project (project A) from which I am trying to access the bucket of the other project (project B)?
If the above is true, then on my project B IAM page I will see two entries, as below:
<project-A-id>@appspot.gserviceaccount.com OWNER
<project-B-id>@appspot.gserviceaccount.com EDITOR
Any input on this is much appreciated.

It's not possible to catch events from other projects for now. But there is a workaround.
In the project with the bucket (project B), create a PubSub notification on Cloud Storage (see the sketch after this list)
On the topic that you created, create a push subscription. Use the Cloud Functions URL, and secure the PubSub call (you can get inspiration from there; if you are stuck, let me know, I will take more time to describe this part)
On the Cloud Function, grant the PubSub service account the cloudfunctions.invoker role
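As a sketch of the first step, the notification can be created with the Cloud Storage client library (an illustrative sketch, not from the original answer; it assumes google-cloud-storage is installed, and <projectB>, <bucket-name> and <GCSNotifTopic> are placeholders):

from google.cloud import storage

# Create a PubSub notification on the bucket in project B, publishing
# OBJECT_FINALIZE (new file) events to the topic in the same project
client = storage.Client(project="<projectB>")
bucket = client.bucket("<bucket-name>")
notification = bucket.notification(
    topic_name="<GCSNotifTopic>",
    topic_project="<projectB>",
    payload_format="JSON_API_V1",
    event_types=["OBJECT_FINALIZE"],
)
notification.create()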
EDIT 1
The security part isn't so easy at the beginning. In your project B (where you have your Cloud Storage), you have created a PubSub topic. On this topic you can create a push subscription that authenticates with a service account created in project B. Take care to fill in the audience correctly.
Then, you need to grant this "project B" service account the roles/cloudfunctions.invoker role on the Cloud Function of project A:
# Create the service account
gcloud iam service-accounts create pubsub-push --project=<projectB>

# Create the push subscription (the subscription name is a placeholder)
gcloud pubsub subscriptions create <subscriptionName> \
    --push-endpoint=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
    --push-auth-service-account=pubsub-push@<projectB>.iam.gserviceaccount.com \
    --push-auth-token-audience=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
    --topic=<GCSNotifTopic> --project=<projectB>

# Grant the service account on the Cloud Function in project A
gcloud functions add-iam-policy-binding <functionName> \
    --member=serviceAccount:pubsub-push@<projectB>.iam.gserviceaccount.com \
    --role=roles/cloudfunctions.invoker --project=<projectA>
Last traps:
The Cloud Function in project A doesn't have the same signature as an HTTP function (callable by a PubSub push subscription) and as a background function (callable by events, such as Cloud Storage events). You need to update it according to the documentation.
The PubSub message sent to the Cloud Function is slightly different. Take care to update the input parameter accordingly.
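To illustrate both traps, here is a minimal sketch (assuming the Python runtime; the entry-point and variable names are illustrative) of an HTTP function that unwraps the PubSub push envelope carrying the Cloud Storage notification:

import base64
import json

def gcs_event_handler(request):
    # PubSub push wraps the event: {"message": {"data": "<base64>", ...}, ...}
    envelope = request.get_json()
    payload = base64.b64decode(envelope["message"]["data"]).decode("utf-8")
    gcs_event = json.loads(payload)  # Cloud Storage object metadata (JSON_API_V1)
    print(f"New file gs://{gcs_event['bucket']}/{gcs_event['name']}")
    # ... launch the Dataflow job here ...
    return "OK", 200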

Google documentation says it's not possible
This is all you need to know. It is not possible. There are no workarounds, regardless of what sort of error messages you might see.

Related

Permission denied when scheduling Vertex Pipelines

I wish to schedule a Vertex Pipeline and deploy it from my local machine for now.
I have defined my pipeline, which runs well when I deploy it once using create_run_from_job_spec on AIPlatformClient.
When trying to schedule it with create_schedule_from_job_spec, the Cloud Scheduler object is created correctly, with an HTTP endpoint pointing to a Cloud Function. But when the scheduler runs, it fails with a Permission denied error. I used several service accounts with Owner permissions on the project.
Do you know what could have gone wrong?
Since AIPlatformClient from Kubeflow Pipelines raises a deprecation warning, I also want to use PipelineJob from google.cloud.aiplatform, but I can't see any direct way to schedule the pipeline execution.
I've spent about 3 hours banging my head on this too. In my case, what seemed to fix it was either:
disabling and re-enabling the Cloud Scheduler API (see the sketch after this list). Why did I do this? There is supposed to be a service account called service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com. If it is missing, then re-enabling the API might fix it
for older projects there is an additional step: https://cloud.google.com/scheduler/docs/http-target-auth#add
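A minimal sketch of the disable/re-enable step via the Service Usage API (my illustration, not from the original answer; it assumes google-api-python-client with application-default credentials, and <project_id> is a placeholder):

from googleapiclient import discovery

serviceusage = discovery.build("serviceusage", "v1")
name = "projects/<project_id>/services/cloudscheduler.googleapis.com"

# Disabling and re-enabling should recreate the Cloud Scheduler service agent
serviceusage.services().disable(name=name, body={}).execute()
serviceusage.services().enable(name=name, body={}).execute()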
Simpler explanations include not doing some of the following steps:
creating a service account for the scheduler job, granting it Cloud Functions Invoker during creation
using this service account (see create_schedule_from_job_spec below)
finding the (sneaky) Cloud Function that was created for you (it will be called something like 'templated_http_request-v1') and adding your service account as a Cloud Functions Invoker on it
# Assumes the AIPlatformClient from Kubeflow Pipelines (kfp), as in the
# question; the project and region values are placeholders
from kfp.v2.google.client import AIPlatformClient

client = AIPlatformClient(project_id="<project_id>", region="<region>")

response = client.create_schedule_from_job_spec(
    job_spec_path=pipeline_spec,
    schedule="*/15 * * * *",
    time_zone="Europe/London",
    parameter_values={},
    cloud_scheduler_service_account="<your-service-account>@<project_id>.iam.gserviceaccount.com"
)
If you are still stuck, it is also useful to run gcloud scheduler jobs describe <pipeline-name>, as it really helps to understand what the scheduler is doing. You'll see the Cloud Function URL and the POST payload, which is base64-encoded and contains the pipeline YAML, and you'll see that it is using OIDC / a service account for security. Also useful is to view the code of the 'templated_http_request-v1' Cloud Function (sneakily created!). I was able to invoke the Cloud Function from Postman using the payload obtained from the scheduler job.

Calling Webhook Cloud Function from DialogFlow

I am trying to create a Dialogflow CX agent and call a webhook Cloud Function. The primary criterion for authentication is to use the service account. I expected this to be created automatically as per the docs mentioned here, but I just can't see it in my IAM list.
Is this a bug, or am I looking at something else?
Service accounts are created only when needed. If you proceed to add the webhook to the Cloud Function, test it, and then go to the IAM & Admin page, you should be able to see it. Please remember to mark the Include Google-provided role grants checkbox to be able to see it. If it is not created, you can always create it manually, as shown in the documentation that you linked, with the command:
gcloud beta services identity create --service=dialogflow.googleapis.com --project=agent-project-id

Cloud Scheduler + Cloud Functions -> Gmail API watch() - WORKING NOW

This is my first post here. I am sorry if it's a repost, but I've been searching for more than one month for the answer to my problem on all websites and forums and, until now... no answers!
My goal is to make a Gmail pub/sub watch() to make an action whenever I receive a new email.
To do so, according to the developer's website, I need to subscribe to Gmail watch() on a daily basis with the code:
request = {
    'labelIds': ['INBOX'],
    'topicName': 'projects/myproject/topics/mytopic'
}
gmail.users().watch(userId='me', body=request).execute()
Until now I have a working scheduled task with a service account with Invoker permissions. This part just works fine.
In my "initial authorization function" I have:
const {google} = require('googleapis');

// Retrieve OAuth2 config
const oauth2Client = new google.auth.OAuth2(
  process.env.CLIENT_ID,
  process.env.CLIENT_SECRET,
  process.env.CALLBACK_URL
);

exports.oauth2init = (req, res) => {
  // Define OAuth2 scopes
  const scopes = [
    'https://www.googleapis.com/auth/gmail.modify'
  ];

  // Generate + redirect to OAuth2 consent form URL
  const authUrl = oauth2Client.generateAuthUrl({
    access_type: 'offline',
    scope: scopes,
    //prompt: 'none' // Required in order to receive a refresh token every time
  });
  return res.redirect(authUrl);
};
My issue now is that the access token is generated via the consent prompt the first time and is never refreshed (the token expires after 1 hour...), which means this code stops working after that period and a "manual" intervention is required. According to the documentation, I need to use the "offline" method, and for "prompt" I can omit it (only requests permissions the first time) or use none (never asks), as is said here.
I managed to make it work! Tomorrow I will continue with the process.
Should I post my working code here for reference?
Thanks!
I will rephrase the process you illustrated so that there is no ambiguity.
According to the documentation you linked:
You do not subscribe to watch(), you call watch()
watch() is an API call to the Gmail API that will enable automatic event publication on a PubSub topic you define, given conditions you specified. Who are you watching? On what events?
You subscribe to a PubSub topic that is targeted by your previous watch() call
A process (e.g. a Google Cloud Function) subscribes to the topic and will consume messages sent by the Gmail API
The call is to be renewed at least every seven days
Because Google needs to be sure you still need to monitor the targeted inbox, it needs a renewal from you. Another watch() call will do so.
Cloud Scheduler will enable this periodic renewal
This service will trigger the renewal script you put in your question (a sketch follows below). To do so, it needs to be authenticated to the platform that hosts the script. It is easier if your script is hosted in a Google service (Cloud Functions, Cloud Run, ...), and the authentication type depends on the target URL form. In all cases you DO NEED an authentication token in your request header. The token is generated from a service account you created with the right permission to call your script (e.g. Cloud Run Invoker). By default the scheduler has the right to generate a token from it.
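For illustration, a minimal sketch of such a renewal job created with the Cloud Scheduler client library (my sketch, not from the original answer; it assumes google-cloud-scheduler is installed, and every <...> value and the job name are placeholders):

from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("<project>", "<region>")

job = scheduler_v1.Job(
    name=f"{parent}/jobs/gmail-watch-renewal",
    schedule="0 6 * * *",  # daily renewal
    http_target=scheduler_v1.HttpTarget(
        uri="https://<region>-<project>.cloudfunctions.net/<renewalFunction>",
        http_method=scheduler_v1.HttpMethod.POST,
        # OIDC token generated from the invoker service account
        oidc_token=scheduler_v1.OidcToken(
            service_account_email="<scheduler-sa>@<project>.iam.gserviceaccount.com"
        ),
    ),
)
client.create_job(parent=parent, job=job)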
So far so good. Now comes the tricky part, and you don't mention it in your question: how is your Gmail API client authenticated? You cannot monitor someone's inbox unless this person gave you the permission to, i.e. you call the API with the right OAuth2 token. Indeed, in the video you point to, they authenticate the user using this principle, which is implemented in their code with Express-oauth2-handler.
So you will have a Cloud Function to initiate the end-user authentication and watch his/her inbox. The renewal should do so too, but the problem is that the user will not be there to accept the end-user consent. Here comes offline access, but it is beyond the scope of your question. Finally, a second function will subscribe to the PubSub topic and consume the messages as you need (a sketch follows below). See their implementation code, which populates a spreadsheet.
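As a sketch of that second function (my illustration, assuming the Python runtime for a background function; the entry-point name is made up), the Gmail notification arrives as a base64-encoded PubSub message containing the watched address and a history ID:

import base64
import json

def on_gmail_notification(event, context):
    # Gmail publishes e.g. {"emailAddress": "user@example.com", "historyId": 1234567}
    data = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print(f"New activity for {data['emailAddress']} (historyId {data['historyId']})")
    # ... call users.history.list with this historyId to fetch the changes ...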
The documentation you shared in the comments does not say that you can remove the token from the headers of the service account; also, the Gmail API documentation you shared says that you only:
need to grant publish privileges to gmail-api-push@system.gserviceaccount.com. You can do this using the Cloud Pub/Sub Developer Console permissions interface following the resource-level access control instructions.
In order to achieve this, basically what you will need is a setup of two Cloud Functions: the first, scheduled function is responsible for setting up the watch() (a sketch follows after the note below), and the second function is triggered by the PubSub topic of Gmail notifications. You can check this documentation for how to deploy a scheduled function, and this documentation for how to build an event-triggered function. Both processes are similar.
NOTE: I have never used the Gmail API, so I am not sure if any extra steps are necessary, but then again, the documentation implies that setting up the permissions of that service account is enough to make it work.
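A minimal sketch of that first, scheduled function (my illustration, assuming the Python runtime and google-api-python-client; rebuilding the user credentials from a stored refresh token is one way to get the "offline" access discussed above, and the environment variable names are made up):

import os
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

def renew_gmail_watch(request):
    # Rebuild the end user's OAuth2 credentials from a stored refresh token
    creds = Credentials(
        token=None,
        refresh_token=os.environ['REFRESH_TOKEN'],
        token_uri='https://oauth2.googleapis.com/token',
        client_id=os.environ['CLIENT_ID'],
        client_secret=os.environ['CLIENT_SECRET'],
    )
    gmail = build('gmail', 'v1', credentials=creds)
    body = {
        'labelIds': ['INBOX'],
        'topicName': 'projects/<project>/topics/<topic>',
    }
    # Renew the watch; the response contains the new historyId and expiration
    return gmail.users().watch(userId='me', body=body).execute()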
EDIT:
As per the information you have shared, the issue is likely that you are not properly setting the service account to authenticate with the Cloud Function. As described in the documentation, you have to grant the service account the Cloud Functions Invoker role in IAM.
Let me know if this fixed the issue.

Error on google function deploy, service account doesn't exist

Please can you help me? I'm receiving this error when I'm trying to deploy a Google Cloud Function:
HTTP Error: 400, Default service account 'project-name@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.
The command used to deploy is:
firebase deploy --only functions
A temporary solution is fine, but if you can help me solve it permanently, even better.
Thanks in advance.
In my case, the App Engine default service account was deleted. It looks like this: {project_id}@appspot.gserviceaccount.com
So I had to restore the service account like this:
You can now recover deleted service accounts via https://cloud.google.com/iam/reference/rest/v1/projects.serviceAccounts/undelete
You have to get the UniqueID of the service account from https://console.cloud.google.com/home/activity
Source: https://stackoverflow.com/a/55277567/888881
The API Explorer is an easy way to use the IAM API: https://developers.google.com/apis-explorer/#p/
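A minimal sketch of that undelete call through the discovery client (my illustration; it assumes google-api-python-client with suitable credentials, and <unique_id> is the value found in the activity log):

from googleapiclient import discovery

iam = discovery.build("iam", "v1")
# undelete addresses the account by its numeric unique ID, not its email
name = "projects/-/serviceAccounts/<unique_id>"
iam.projects().serviceAccounts().undelete(name=name, body={}).execute()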
I was struggling to resolve this issue, then I raised a case with Google.
Here is a detailed article of my learnings:
https://medium.com/@ashirazee/http-error-400-default-service-account-appspot-gserviceaccount-com-accd178ea32a
Firstly, navigate to Google Cloud Platform and view your service accounts.
Try to find <project_id>@appspot.gserviceaccount.com in your list of service accounts for the Firebase project; it is linked to App Engine.
If <project_id>@appspot.gserviceaccount.com is missing, you cannot deploy anything (SEE EMAIL WITH GOOGLE BELOW); if it isn't, check and see if it's enabled, and try disabling it and enabling it again.
<project_id>@appspot.gserviceaccount.com is pre-installed by default, regardless of a paid account or not. Try to recall if at any time you may have deleted it, after or before deployment.
Now, if you have for any reason deleted it more than 30 days ago, then you cannot retrieve it and you must create a new Firebase project. However, if it is within 30 days, you can undelete it.
EMAIL FROM GOOGLE:
Email #1
"
Hello Ali
I am checking the logs of your project, unfortunately the service account was deleted on Ma, there is no chance to recover it nor recreate it
The only workaround available is to create a new project and deploy the service desired there. I know this could not be the best option for you nevertheless it is the way this works by design.
Do not hesitate to write back if you have more questions.
Cheers,"
Email #2,
"Hello Ali
I am glad to read that you have been able to deploy your functions successfully, unfortunately that service account cannot be recovered after 30 days of being deleted and that is the only solution. If you have other questions, please let us know by contacting us again through our support channel.
Cheers,"
Lastly, here is a helpful command line that will help you debug this; however, it won't help if there is no service account, it'll just highlight the obvious:
firebase deploy --only functions --debug
This was my error:
"HTTP Error: 400, Default service account '<project_id>@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account."
Following the error message, you could enable the API from the console by accessing this URL and enabling the API.
Or by gcloud command:
gcloud services --project <project_id> enable cloudfunctions.googleapis.com
As the others stated, sadly, you need to use the default service account.
If you are within the 30-day period, you can use this guide to find and undelete the service account: https://cloud.google.com/iam/docs/creating-managing-service-accounts#undeleting_a_service_account
You have to enter the commands through the Google Cloud Console (one of the buttons will open a terminal at the right of the top blue app bar).
Try selecting a Google Cloud Platform (GCP) resource location in the Settings of Firebase.

Is Google Cloud Function Trigger from PubSub Topic a Subscription

I've got two environments in GCP with similar setups: Cloud Functions triggered by PubSub topics. In both, the Cloud Functions list the PubSub topic as the trigger.
In one environment the topic lists a subscription count of 1, which is listed as a push-to-endpoint subscription with a URL of https://guid-dot-uid-tp.appspot.com/_ah/push-handlers/pubsub/projects/name/topics/topic_name?pubsub_trigger=true
In the other environment the subscription count is 0. So I'm confused as to whether this push-to-endpoint URL subscription is related to the Cloud Function trigger, or whether it is just something I've accidentally set up somewhere along the way that is not actually doing anything. (I'd delete the subscription to find out, but it's actually in a live environment and I don't want to inadvertently break it!)
If you have a Google Cloud Function that is triggered on a Cloud Pub/Sub topic, then the topic has a subscription. The push-to-endpoint subscription that you posted,
https://guid-dot-uid-tp.appspot.com/_ah/push-handlers/pubsub/projects/name/topics/topic_name?pubsub_trigger=true
is indeed a trigger for one of your Cloud Functions.
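To check this non-destructively, a minimal sketch with the PubSub client library (my illustration; it assumes google-cloud-pubsub is installed, and the project and topic names are placeholders) lists the subscriptions attached to a topic without modifying anything:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("<project>", "<topic_name>")

# Read-only: print every subscription attached to the topic
for subscription in publisher.list_topic_subscriptions(topic=topic_path):
    print(subscription)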