Do pub/sub messages get deleted if cloud function is deleted? - google-cloud-functions

If I delete a Google Cloud Function and redeploy the same function a day later, will I have lost the Pub/Sub messages that trigger it for that day?

It depends on how you deployed your Cloud Function.
HTTP Functions: You deployed your Cloud Function with an HTTP trigger. In this case, you have to manually create a Pub/Sub push subscription that calls the Cloud Function when a message arrives.
If you delete your Cloud Function, the Pub/Sub subscription continues to live and to receive messages. They are kept until their retention deadline expires (7 days by default). The push HTTP call fails continuously because the Cloud Function no longer exists.
If you redeploy the Cloud Function in the same project with the same name, the HTTP URL will exist again and the messages will be delivered, without loss.
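For the HTTP case, the manually created push subscription might look like this (topic and subscription names are placeholders; --expiration-period=never prevents the subscription itself from ever expiring):
gcloud pubsub subscriptions create my-push-sub --topic=my-topic --push-endpoint=https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION --expiration-period=never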
Background Functions: You deployed your function with a trigger topic. In this case, a Pub/Sub subscription is created automatically when the function is created.
If you delete the Cloud Function, the automatically created Pub/Sub subscription is deleted automatically as well. So the existing messages and any new ones will be lost, because there is no longer a subscription to accept and store them.

Related

Can Timer Triggers call Activity Triggers in Azure?

I am creating a timer trigger function that is going to do certain checks. If a certain condition is hit, I want to send an email, and I have set this up through an Activity Trigger. I keep getting the error:
The function 'activityTriggerName' doesn't exist. Additional info: No orchestrator functions are currently registered!
I am brand new to durable functions and triggers in Azure; any direction on whether this is allowed, or other ideas, would be appreciated.
Activity functions can only be initiated by an Orchestrator. The usual pattern is to have a trigger function create an orchestrator and then the orchestrator initiates one or more activity functions.
If you’re using Visual Studio then the Orchestrator template will create a sample which includes an HTTP trigger, an orchestrator and an activity.
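A minimal sketch in Node.js with the durable-functions package (the function names, like "SendEmail", are placeholders, and each function still needs its own function.json bindings):
const df = require("durable-functions");

// Orchestrator: the only context from which activity functions may be called.
module.exports = df.orchestrator(function* (context) {
  // Forward the orchestration input to the hypothetical "SendEmail" activity.
  yield context.df.callActivity("SendEmail", context.df.getInput());
});
The timer trigger then starts this orchestrator via df.getClient(context).startNew(...) instead of calling the activity directly.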

Apps Script Activity Reporting/Visualization

I've been developing an apps script project for my company that tracks our time/expenses. I've structured the project like so:
The company has a paid G Suite account that owns all the spreadsheets hosted on the company's Google Drive.
Each employee has their own "user" spreadsheet which is shared from the company Gsuite account with the employee's personal gmail account.
Each of the user spreadsheets has a container-bound script that accesses a central library script.
The library script allows us to update the script centrally and the effects are immediate for each user. It also prevents users from seeing the central script and meddling with it.
Each of the user container-bound scripts has installable triggers that are authorized by the company account, so that the code being run has full authority to do what it needs to do to the spreadsheets.
This setup has been working quite well for us with about 40 users. The drawback to this setup is that since all the script activity is run by the company account via the triggers, the activity of all our users is logged under the single company account and therefore capped by the apps script server quotas for a single user. This hasn't been much of an issue for us yet as long as our script is efficient in how it runs. I have looked into deploying this project as a web-app for our company, but there doesn't seem to be a good way to control/limit user access to the central files. In other words, if this project was running as a web app installed by each user, each user would need to have access to all the central spreadsheets that the project uses behind the scenes. And we don't want that.
So with that background, here is my question: How do I efficiently track Apps Script activity to see how close we are to hitting our server quota, and identify which of my functions need to be optimized?
I started doing this by writing an entry into an "activity log" spreadsheet every time the script was called. It tracked which function was called and who the user was, and it had a start-time entry and an end-time entry so I could see how long unique executions took and which ones failed. This was great because I had a live view into the project activity and could graph it using the spreadsheet graphing tools. Where this began to break down was the fact that every execution of the script required two write-actions: one for initialization and another for completion. Since the script is executed every time a user makes an edit to their spreadsheet, during times of high traffic the activity log spreadsheet became inaccessible and errors were thrown all over the place.
So I have since transitioned to tracking activity by connecting each script file to a single Google Cloud Platform (GCP) project and using the Logger API. Writing logs is a lot more efficient than writing an entry to a spreadsheet, so the high traffic errors are all but gone. The problem now is that the GCP log browser isn't as easy to use as a spreadsheet and I can't graph the logs or sum up the activity to see where we stand with our server quota.
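(For illustration, a structured entry like the sketch below can be filtered by user or function in the Logs Explorer; the field names are arbitrary, not a fixed schema.)
function logActivity(functionName, startMs) {
  // One structured log line per execution; console.log from Apps Script
  // lands in Cloud Logging when the script is attached to a GCP project.
  console.log(JSON.stringify({
    fn: functionName,
    user: Session.getActiveUser().getEmail(),
    durationMs: Date.now() - startMs
  }));
}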
I've spent some time now trying to figure out how to automatically export the logs from the GCP so I can process the logs in real-time. I see how to download the logs as csv files, which I can then import into a google spreadsheet and do the calcs and graphing I need, but this is a manual process, and doesn't show live data.
I have also figured out how to stream the logs from GCP by setting up a "sink" that transfers the logs to a "bucket" which can theoretically be read by other services. This got me excited to try out Google Data Studio, which I saw is able to use Google Cloud Storage "buckets" as a data source. Unfortunately though, Google Data Studio can only read csv files in cloud storage, and not the json files that my "sink" is generating in my "bucket" for the logs.
So I've hit a wall. Am I missing something here? I'm just trying to get live data showing current activity on our apps script project so I can identify failed executions, see total processing time, and sort the logs by user or function so I can quickly identify where I need to optimize my script.
You've already referenced using the GCP side of your Apps Script.
Have a look at the Metrics Explorer; it lets you see quota usage per resource and auto-generates graphs for you.
But long term, I think re-building your solution may be a better idea. At minimum, switching to submitting data via Google Forms will save you operations.

Sending email with file attachment using Google Cloud Function and Pub/Sub

I have a Pub/Sub topic where I publish messages, and that topic is subscribed to by a Google Cloud Function which sends the email to the customer using the Mailgun API.
This has worked fine so far; now I have a requirement to send the email with a file attachment.
I can easily do this with a Google Cloud Function using an HTTP trigger, but how do I do it using a Pub/Sub topic?
Note: The file size could be more than 10 MB in my case.
First of all, if I understand your goal correctly, Pub/Sub alone will never be enough, as the actual work happens in the function.
To achieve the above, you need a Cloud Function with a Pub/Sub trigger.
Once the function is called, you need to either fetch the attachment from a Google Cloud Storage bucket, or download it from another API/source you have. You'll have to download it into the /tmp directory, which is used to store temporary files.
Once that's completed, you can use Mailgun or another tool of your choice. Remember the attachment is inside the /tmp directory and not ./
Using Mailgun or any other method does not change anything in the overall approach. For example, if you were to use the Gmail API, you'd add it to your Cloud Function for the final step.
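A minimal sketch of that flow, assuming the attachment already sits in a Cloud Storage bucket (so only its name travels in the Pub/Sub message, avoiding the 10 MB message limit); the bucket, domain, and message field names are placeholders:
const {Storage} = require('@google-cloud/storage');
const formData = require('form-data');
const Mailgun = require('mailgun.js');
const fs = require('fs');

const storage = new Storage();
const mg = new Mailgun(formData).client({username: 'api', key: process.env.MAILGUN_API_KEY});

exports.sendEmailWithAttachment = async (event, context) => {
  // The Pub/Sub payload carries only metadata, not the file itself.
  const msg = JSON.parse(Buffer.from(event.data, 'base64').toString());

  // /tmp is the writable directory for temporary files.
  const localPath = `/tmp/${msg.fileName}`;
  await storage.bucket('my-bucket').file(msg.fileName).download({destination: localPath});

  await mg.messages.create('my-domain.com', {
    from: 'noreply@my-domain.com',
    to: msg.to,
    subject: msg.subject,
    text: msg.body,
    attachment: [{filename: msg.fileName, data: fs.readFileSync(localPath)}]
  });
};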

Securely calling a Google Cloud Function via a Google Apps Script

How can I securely call a Google Cloud Function via a Google Apps Script?
✅ I have a Google Cloud Function, which I can access at https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION, and which I would like to allow certain users to invoke via an Apps Script.
✅ To secure the Cloud Function, I have set Cloud Function Invoker to only include a known email (e.g. USER@COMPANY.com, where this is a valid Google email).
✅ I am able to successfully invoke the Cloud Function via curl, while logged into gcloud with this email, by running: curl https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION -H "Authorization: Bearer $(gcloud auth print-identity-token)".
✅ I have granted the following oauthScopes in my Apps Script's manifest:
"https://www.googleapis.com/auth/script.external_request"
"https://www.googleapis.com/auth/userinfo.email"
"https://www.googleapis.com/auth/cloud-platform"
⛔️ However, when I attempt to invoke the Cloud Function via a Google Apps Script, while logged in with the email USER@COMPANY.com, I am unable to invoke it and am instead returned a 401. Here is how I have attempted to invoke the Cloud Function:
const token = ScriptApp.getIdentityToken();
const options = {
  headers: {'Authorization': 'Bearer ' + token}
};
UrlFetchApp.fetch("https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION", options);
ℹ️ I have also tried the following:
Using ScriptApp.getOAuthToken()
Adding additional oauthScopes, e.g. openid.
Creating an OAuth Client ID with https://script.google.com set as an Authorized Javascript origin.
Deploying the Apps Script.
Crying out to the sky in utter, abject despair
I struggled very much authenticating from Apps Script to invoke a Cloud Run application and just figured it out, and I believe it's similar for calling any Google Cloud application including Cloud Functions. Essentially the goal is to invoke an HTTP method protected by Google Cloud IAM using the authentication information you already have running Apps Script as the user.
The missing step I believe is that the technique you're using will only work if the Apps Script script and Google Cloud Function (or Run container in my case) are in the same GCP project. (See how to associate the script with the GCP project.)
Setting it up this way is much simpler than otherwise: when you associate the script with a GCP project, this automatically adds an OAuth Client ID configuration to the project, and Apps Script's getIdentityToken function returns an identity token that is only valid for that client ID (it's coded into the aud field of the token). If you wanted an identity token that works for another project, you'd need to get one another way.
If you are able to put the script and GCP function or app in the same GCP project, you'll also have to do these things, many of which you already did:
Successfully test authentication of your cloud function via curl https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION -H "Authorization: Bearer $(gcloud auth print-identity-token)" (as instructed here). If this fails then you have a different problem than is asked in this Stack Overflow question, so I'm omitting troubleshooting steps for this.
Ensure you are actually who the script is running as. You cannot get an identity token from a custom function in a spreadsheet, as those run anonymously. In other cases, the Apps Script code may be running as someone else, such as with certain triggers.
Redeploy the Cloud Function as mentioned here (or similarly redeploy the Cloud Run container as mentioned here) so the app will pick up any new Client ID configuration. This is required after any new Client ID is created, including the one created automatically by adding or re-adding the script to the GCP project. (If you move the script to another GCP project and then move it back again, it seems to create another Client ID rather than reuse the old one and the old one will stop working.)
Add the "openid" scope (and all other needed scopes, such as https://www.googleapis.com/auth/script.external_request) explicitly in the manifest. getIdentityToken() will return null without the openid scope which can cause this error. Note to readers: read this bullet point carefully - the scope name is literally just "openid" - it's not a URL like the other scopes.
"oauthScopes": ["openid", "https://...", ...]
Use getIdentityToken() and do NOT use getOAuthToken(). According to what I've read, getOAuthToken() returns an access token rather than an identity token. Access tokens do not prove your identity; rather, they just prove authorization to access certain resources.
If you are not able to add the script to the same project as the GCP application, I don't know what to do, as I've never successfully tried it. Generally, you're tasked with obtaining an OAuth identity token tied to one of your GCP client IDs. I don't think one app (or GCP project) is supposed to be able to obtain an identity token for a different OAuth app (different GCP project). Anyway, it may still be possible. Google discusses OAuth authentication at a high level in their OpenID Connect docs. Perhaps an HTML service doing a regular Google sign-in flow with a web client would work for user-present operations, if you get the user to click the redirect link, as Apps Script doesn't allow browser redirects. If you just need to protect your service from the public, perhaps you could try other authentication options that involve service accounts. (I haven't tried this either.) If the service just needs to know who the user is, perhaps you could parse the identity token and send the identifier of the user as part of the request. If the service needs to access their Google resources, then maybe you could have the user sign in to that app separately and use OAuth generally for long-term access to their resources, using it as needed when called by Apps Script.
The answer above is very good. But since I am new to this, I still had to spend a lot of time trying to figure it out.
This worked for me:
Apps Script code:
async function callCloudFunction() {
  const token = ScriptApp.getIdentityToken();
  const options = {
    headers: {'Authorization': 'Bearer ' + token}
  };
  const data = JSON.parse(await UrlFetchApp.fetch("https://MY_REGION-MY_PROJECT.cloudfunctions.net/MY_FUNCTION", options).getContentText());
  return data;
}
Make sure that in the script's project configuration you have set the same GCP project where your function is created.
After that, you can add the emails of the users you want to be able to invoke the function under the Permissions section of the function.
And as @alexander-taylor mentioned as well, make sure to add the scopes to your manifest file. You can make the manifest visible from the configuration tab in Apps Script. It took me some time to get that too.
Thanks to your comment, you can do 2 things. But first, you have to know that you can't (or at least I never achieved this) create a valid identity token, for being authenticated by Cloud Functions and Cloud Run, from a user credential. I opened a question on this.
But you can call the Google Cloud APIs with a user credential! So:
You can use the functions test-call API. The quotas limit you to 16 calls per 100 minutes (of course, it's designed for testing!)
You can publish a message to Pub/Sub and plug your function onto the topic. In this pattern, your call is asynchronous.
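A rough sketch of the second option from Apps Script, publishing via the Pub/Sub REST API with the user's OAuth token (the project and topic names are placeholders, and the manifest needs the https://www.googleapis.com/auth/pubsub or cloud-platform scope):
function publishToPubSub(payload) {
  const url = 'https://pubsub.googleapis.com/v1/projects/MY_PROJECT/topics/MY_TOPIC:publish';
  UrlFetchApp.fetch(url, {
    method: 'post',
    contentType: 'application/json',
    // An access token works here because this is a regular Google Cloud API,
    // not an IAM-protected function URL.
    headers: {Authorization: 'Bearer ' + ScriptApp.getOAuthToken()},
    payload: JSON.stringify({
      messages: [{data: Utilities.base64Encode(JSON.stringify(payload))}]
    })
  });
}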

Messages are not getting acknowledged in cloud functions

I am trying to use Google Pub/Sub with Cloud Functions. Everything worked great, but my messages are not being acknowledged.
I have done these things:
1) Created a topic
2) Created a function
3) Set the function's trigger to Google Cloud Pub/Sub
4) Selected the topic for Pub/Sub
5) Set the Node version to 8
This is my default cloud function:
/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event Event payload.
 * @param {!Object} context Metadata for the event.
 */
exports.helloPubSub = (event, context) => {
  const pubsubMessage = event.data;
  console.log(Buffer.from(pubsubMessage, 'base64').toString());
};
This created 2 subscriptions for this topic: one pull and one push.
My cloud function is getting called, but my messages are not getting acknowledged.
According to the docs:
Note: Cloud Functions acks the message internally upon successful function execution. For information on how to handle failures using retries, see Retrying Background Functions.
So functions should acknowledge automatically, but it is not working.
What is the problem in this flow? What am I doing wrong?
Thanks
You do not need to explicitly create the subscription for a Pub/Sub Cloud Function.
When deploying the cloud function and pointing it at a topic, the subscription is automatically created.
Create your-topic:
gcloud pubsub topics create your-topic
Deploy function pointing at your-topic (this automatically creates a subscription):
gcloud functions deploy your-function --region=us-central1 --trigger-topic=your-topic --runtime=python37 --entry-point=main
Run the deploy command from the folder containing your Cloud Function code. You can change the runtime and entry point to match your use case.
Any secondary subscription you create has no consumers, thus its messages will not be acknowledged. Also, if your Cloud Function is failing, its messages may not be acknowledged.
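As a rough illustration of that last point: a background function acks implicitly when it completes successfully, while throwing (assuming --retry was passed at deploy time) leaves the message unacknowledged so Pub/Sub redelivers it.
exports.helloPubSub = async (event, context) => {
  const text = Buffer.from(event.data, 'base64').toString();
  if (!text) {
    // Throwing signals failure; with --retry, Pub/Sub will redeliver.
    throw new Error('empty message');
  }
  console.log(text);
  // Returning normally (resolving) triggers the internal ack from the docs quote.
};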
If you want to try writing, deploying, and triggering a Background Cloud Function with a Cloud Pub/Sub trigger then the following documentation page is better.
https://cloud.google.com/functions/docs/tutorials/pubsub
I think you need to trigger the function:
gcloud pubsub topics publish YOUR_TOPIC_NAME --message YOUR_MESSAGE