I have a situation where I have to call an Azure Function periodically.
When I call the function, I need to check the state of the Azure Function.
If the Azure Function is running, then I need to postpone the call until it has completed.
I am reading from an email queue (as the emails come in) and need to send each email using Amazon SES.
I am using an HTTP trigger, and the email part is working fine.
I don't want the function to be called when it is already running.
If you consider the serverless architecture, each time you invoke a service endpoint a new instance may be created, and scaling is managed by the scale controller.
There is no built-in way to check whether the function is already running.
Without understanding more about your use-case, I think this is possible with Durable Functions. Look up Eternal Orchestrations that call themselves on an interval indefinitely. You can then query the status if required and have a workflow in the eternal orchestration that changes depending on certain criteria.
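For illustration, here is a minimal sketch of such an eternal orchestration in Python Durable Functions. It assumes a hypothetical activity named SendQueuedEmails that drains the email queue and sends via Amazon SES, and the 10-minute interval is arbitrary:

# Minimal eternal-orchestration sketch (Python Durable Functions).
# "SendQueuedEmails" is a hypothetical activity name; the interval is arbitrary.
import datetime
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # One unit of work per iteration.
    yield context.call_activity("SendQueuedEmails", None)

    # Sleep until the next run, then restart the orchestration with a fresh history.
    next_run = context.current_utc_datetime + datetime.timedelta(minutes=10)
    yield context.create_timer(next_run)
    context.continue_as_new(None)

main = df.Orchestrator.create(orchestrator_function)

Because the orchestration keeps itself alive as a single long-running instance, runs don't overlap, and its status can be queried through the Durable Functions management API if needed.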
I have created an HTTP Google Cloud Function that does not allow unauthenticated requests.
I have created a service account in the project with one role: Cloud Functions Invoker.
This service account is listed as a principal on my HTTP Cloud Function and is shown to have that role:
I have created a Cloud Scheduler Job to run this function.
In the job, I've specified that I want it to obtain an OIDC token for authenticating requests to the http function:
Whenever I trigger the job, it fails with a message indicating the request is unauthenticated:
Things I've tried:
Recreate the function
Recreate the job
Use a different user (the main service account user - that one doesn't work either)
Do a POST instead of a GET from the scheduler job (I've successfully created scheduled jobs for authenticated HTTP functions before, but this is the first time I've done a GET; just grasping at straws, really)
Did I miss something? Any idea why it is coming back with the "Unauthenticated" message?
I revisited this today. My IAP-protected HTTP function expects a query string parameter to be passed into it. The Cloud Platform web UI automatically sets the audience to the same URL (including the parameter) when creating the scheduled job. I figured Google knows what they are doing, so I left it that way originally.
Out of desperation I tried removing this parameter from the audience and that made the authentication work properly.
So, I changed the audience from
https://<myProject>.cloudfunctions.net/myFunction?p=abc
to
https://<myProject>.cloudfunctions.net/myFunction
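To sanity-check this outside of Cloud Scheduler, you can mint the OIDC token yourself with the bare URL as the audience while still passing the parameter on the request. A rough sketch using the google-auth library (keeping the <myProject> placeholder from the post; credentials are assumed to come from GOOGLE_APPLICATION_CREDENTIALS for the invoker service account):

# Sketch: the token audience is the bare function URL, while the query
# parameter still goes on the request itself. Assumes google-auth and requests.
import google.auth.transport.requests
import google.oauth2.id_token
import requests

FUNCTION_URL = "https://<myProject>.cloudfunctions.net/myFunction"

auth_request = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_request, FUNCTION_URL)  # no ?p=abc here

response = requests.get(
    FUNCTION_URL,
    params={"p": "abc"},  # parameter still sent with the call itself
    headers={"Authorization": f"Bearer {token}"},
)
print(response.status_code)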
I'm triggering a cloud function every minute with cloud scheduler [* * * * *].
The Stackdriver logs indicate the function appears to have been triggered and run twice in the same minute. Is this possible?
Pub/Sub promises at-least-once delivery, but I assumed that GCP would automatically handle duplicate triggers for scheduler -> function workflows.
What is a good pattern for preventing this function from running more than once per minute?
Your function needs to be made "idempotent" in order to ensure that a message gets processed only once. In other words, you'll have to maintain state somewhere (maybe a database) that a message was processed successfully, and check that state to make sure a message doesn't get processed twice.
All non-HTTP type Cloud Functions provide a unique event ID in the context parameter provided to the function invocation. If you see a repeat event ID, that means your function is being invoked again for the same message, for whatever reason.
This need for idempotence is not unique to Pub/Sub or Cloud Scheduler. It's a concern for all non-HTTP type background functions.
A full discussion of writing idempotent functions is a bit too much for a Stack Overflow answer, but there is a post on the Google Cloud blog that covers the issue pretty well.
See also: Cloud functions and Firebase Firestore with Idempotency
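As a rough illustration of "maintain state somewhere", here is a sketch for a Pub/Sub-triggered Python function that records each event ID in Firestore. The collection name and the do_the_actual_work helper are hypothetical placeholders:

# Sketch: skip re-delivered messages by recording the event ID in Firestore.
# Assumes the google-cloud-firestore client; "processed_events" and
# do_the_actual_work are placeholder names.
from google.api_core.exceptions import AlreadyExists
from google.cloud import firestore

db = firestore.Client()

def handler(event, context):
    # One marker document per Pub/Sub event ID; create() fails if it already exists.
    marker = db.collection("processed_events").document(context.event_id)
    try:
        marker.create({"processed_at": firestore.SERVER_TIMESTAMP})
    except AlreadyExists:
        print(f"Duplicate delivery of event {context.event_id}, skipping")
        return
    do_the_actual_work(event)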
I use the Serverless Framework to manage my cloud functions. Some of them are HTTP functions. Recently, all the HTTP functions started to fail with a 403 error, no matter whether you enter the URL in a browser or trigger it with Cloud Scheduler. The only place where it works is the function's testing tab in the Cloud Console, when you click the "Test the function" button.
So, I did not find the reason for the error, but it was fixed by removing the function and redeploying it:
serverless remove
serverless deploy
Is it possible that the Identity Aware Proxy has been enabled for the Cloud Function URLs? If you navigate to Cloud Console and then to "Security" and "Identity-Aware Proxy", you should be able to see the IAP settings and whether the Cloud Function is being protected by IAP.
If that is not the cause, I would advise putting some logging in your function to make it clear whether the function is getting called and then returning a 403 somewhere within its execution (indicating a problem with the function itself rather than the identity infrastructure), or whether the function is never getting called at all (the 403 is being produced outside the Cloud Function). In the latter case you may need to reach out to Cloud Support for help (if IAP isn't the cause).
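For example, a log line at the very top of the function is enough to tell the two cases apart. A sketch for a Python HTTP function; the names are arbitrary:

# Sketch: log on entry so the logs show whether the function was invoked at all
# before any 403 appears. Python HTTP function; names are placeholders.
import logging

def my_http_function(request):
    logging.info("my_http_function invoked: %s %s", request.method, request.path)
    # ... existing function body ...
    return ("ok", 200)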
Google Cloud Functions added some new IAM functionality, not sure how recently, and now new functions don’t have public access by default.
In case someone else comes here, I thought I'd share this information.
To allow your function to be invoked, you first have to add permissions to the function. You can do this by selecting the function in the functions list and adding allUsers to the Cloud Functions Invoker role; you can see the step-by-step at:
https://lukestoolkit.blogspot.com/2020/06/google-cloud-functions-error-forbidden.html
I have to create a timer function that will call a particular API at a specific interval.
I don't want to use any console application for that.
In short, I have to call a web API every 10 minutes.
You can also consider using Azure Logic Apps. It provides a declarative way to schedule tasks and workflows.
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-recurrence
Even though your post isn't really a question, you are probably looking for Azure Scheduler.
Using Azure Scheduler, you don't even have to write a function; just enter the URL you want to call and specify the trigger.
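Since the question asks for a timer function, a timer-triggered Azure Function also covers this directly. A minimal sketch, assuming the Python programming model; the schedule is an NCRONTAB expression and the target URL is a placeholder:

# Sketch: timer-triggered Azure Function (Python v2 programming model)
# that calls a web API every 10 minutes. The URL is a placeholder.
import logging
import urllib.request

import azure.functions as func

app = func.FunctionApp()

@app.schedule(schedule="0 */10 * * * *", arg_name="mytimer", run_on_startup=False)
def call_web_api(mytimer: func.TimerRequest) -> None:
    with urllib.request.urlopen("https://example.com/api/endpoint") as resp:
        logging.info("API call returned HTTP %s", resp.status)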
I'm trying to develop a test framework for some ActionScript code we're developing (Flex 3.5). What's happening is this:
As part of a Web Analytics function we are calling a track method in a class, providing the relevant information as part of the call. This method is provided in a library (SWC), and we have no access to the code.
Ultimately the track method sends an outgoing http request to the tracking server. We can see this quite happily in HttpFox.
I was hoping to be able to capture this outgoing request and interrogate it in my test class, allowing us to a) run tests in a more standalone fashion, and b) programmatically determine that the correct information is being tracked.
No problem, just run this developer tool, which displays all requests leaving your machine:
http://www.charlesproxy.com/
Unless you're going to use a sniffing tool, which would probably be hard to use for a programmatic evaluation, I would recommend using a proxy to channel your request. You could let the track method send the request to a PHP script on the proxy server, have it evaluate the request content, and then forward it to the actual tracking server. I suppose with a tracking system you won't need to worry about the response, so it shouldn't be too hard to implement.
You could run a web server on localhost (or anywhere, really) and just make sure the DNS entry the code is trying to access points to the server you are running.
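To make that concrete, a tiny capture server is enough for a test to assert on what the track call actually sent. A sketch using Python's standard library; the port and the captured fields are arbitrary choices:

# Sketch: a minimal local HTTP server that records incoming tracking requests
# so a test can inspect them afterwards. Port and captured fields are arbitrary.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

captured_requests = []

class TrackingCaptureHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        captured_requests.append({"path": self.path, "headers": dict(self.headers)})
        self.send_response(200)
        self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep test output quiet

server = HTTPServer(("localhost", 8888), TrackingCaptureHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
# Point the tracking host's DNS entry at localhost:8888, run the code under test,
# then assert on captured_requests (e.g. that the expected query string was sent).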