I have experience working with AWS Lambda, and at my previous job we were able to achieve a negligible cold start time for our use case (99% of cold starts < 500 ms). Now I am working for a new company with a similar use case, but they selected GCP as the main cloud platform.
Investigating Google Cloud Functions, it seems that the effective cold start is 5+ times longer than Lambda's (for Node.js), which unfortunately might make Cloud Functions unsuitable for user-facing APIs.
Is there anything in the pipeline to make the Cloud Functions cold start as short as (or shorter than) the one seen for AWS Lambda?
Regards
Yes, there is something in the pipeline. They talked about it at the last Google I/O Firebase Q&A; I remember watching it. Here is a link to that question and answer: https://youtu.be/3BMNzY_ljSw?t=865
No one can say yet when it will come to Firebase, but I hope soon :)
UPDATE
You can now check it in the release notes ;)
const functions = require('firebase-functions');

exports.getAutocompleteResponse = functions
  .runWith({
    // Keep 5 instances warm for this latency-critical function
    minInstances: 5,
  })
  .https.onCall((data, context) => {
    // Autocomplete a user's search term
  });
I've built a small web app that asks a user to perform a Google OAuth login and submit their data in a form to a Cloud Function. All of this uses the same API key, yet the usage shown for the last 30 days is 2000+ requests when it's just me testing it. When I look at the usage of the Cloud Function itself I see 300 (which looks more realistic to me).
The API key usage says it includes billable and non-billable requests. Is there any way I can actually see the breakdown, so I don't have a surprise slap me in the face at the end of the month once the app goes live?
Any help would be greatly appreciated :)
I have a situation where I have to call an Azure Function periodically.
When I call the function, I need to check the state of the Azure Function.
If the Azure Function is running, I need to postpone the call until it has completed.
I am watching an email queue (as the emails come in), and I need to send the emails using Amazon SES.
I am using an HTTP trigger, and the email part is working fine.
I don't want the function to be called when it is already running.
In a serverless architecture, each time you invoke a service endpoint a new instance may be created, and scaling is managed by the scale controller.
There is no built-in way to check whether the function is running or not.
Without understanding more about your use-case, I think this is possible with Durable Functions. Look up Eternal Orchestrations that call themselves on an interval indefinitely. You can then query the status if required and have a workflow in the eternal orchestration that changes depending on certain criteria.
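For illustration, here is a minimal sketch of an eternal orchestration in Node.js, assuming the durable-functions library; "SendQueuedEmails" is a hypothetical activity that drains the email queue via SES:

const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
  // Do one unit of work (hypothetical activity name).
  yield context.df.callActivity("SendQueuedEmails");

  // Wait five minutes, then restart the orchestration with
  // continueAsNew so its history does not grow without bound.
  const nextRun = new Date(
    context.df.currentUtcDateTime.getTime() + 5 * 60 * 1000
  );
  yield context.df.createTimer(nextRun);
  context.df.continueAsNew(undefined);
});

Because the orchestration runs under a single instance ID, an HTTP starter can first call client.getStatus(instanceId) and skip startNew while the status is Pending or Running, which is exactly the "don't call it while it's already running" behaviour you want.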
I would like to load data into my DB hosted on GKE using a Cloud Function (small ETL needs; Cloud Functions would be great for that case).
I'm working in the same region, and my GKE cluster has an internal load balancer exposing an internal IP.
The method call works perfectly from App Engine, but when doing it from a Cloud Function I get a connection error: "can't find client at IP".
I would like to know if this is possible?
If so, what would be the procedure?
Many thanks!!
Gab
We just released this feature to Beta. You can get started by following our docs:
https://cloud.google.com/functions/docs/connecting-vpc
https://cloud.google.com/appengine/docs/standard/python/connecting-vpc
https://cloud.google.com/vpc/docs/configure-serverless-vpc-access
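Once a Serverless VPC Access connector is attached to the function (per the docs above), calling the internal load balancer is plain HTTP. A minimal Node.js sketch, where the internal IP, port, and path are hypothetical placeholders:

const http = require('http');

exports.loadToGke = (req, res) => {
  // With the connector attached, traffic to private (RFC 1918)
  // addresses is routed into the VPC, so the GKE internal load
  // balancer becomes reachable from the function.
  http.get('http://10.128.0.42:8080/health', (upstream) => {
    let body = '';
    upstream.on('data', (chunk) => (body += chunk));
    upstream.on('end', () => res.status(200).send(body));
  }).on('error', (err) => res.status(502).send(err.message));
};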
This is not possible as of today.
https://issuetracker.google.com/issues/36859738
Thanks for your feedback.
You are totally right. At the moment the instances are only able to receive such requests via the external IP [1].
I have filed a feature request on your behalf so that this functionality might be considered for future deployments. I cannot guarantee it will be implemented or provide an ETA. Nevertheless, rest assured that your feedback is always taken seriously.
We also reached out to our Google Cloud representative, who confirmed this was a highly requested feature that was being looked at, but was unable to provide an ETA as to when it would be released.
I have lately been experimenting (as a noob!) with webhooks. However, I seem to be stuck on an "actions-on-google:error No user object" issue.
I would appreciate it if you could lend a hand, please.
(firebase log and index.js snippets omitted)
The inline editor uses Firebase Cloud Functions, and the issue is that Firebase doesn't allow you to make external requests (EXT_PRAYER_TIME_API_URL = ...) on your current plan (see Cloud Functions pricing). You need to set up a billing account for your project and change to a plan that allows outbound requests.
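For context, any request to a non-Google host counts as an outbound request. A minimal sketch of the kind of call the Spark plan blocks, where EXT_PRAYER_TIME_API_URL stands in for the elided URL above:

const https = require('https');

// On the Spark plan this request to a non-Google host fails;
// on the Blaze (pay-as-you-go) plan it goes through.
function fetchPrayerTimes(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (apiRes) => {
      let body = '';
      apiRes.on('data', (chunk) => (body += chunk));
      apiRes.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}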
Today I got this error in a Cloud Function:
Function killed. Error: memory limit exceeded
My function is based on the authenticated-json-api example from the Firebase sample functions. Because it worked like a charm, I extended it with multiple routes and tasks: connecting to multiple external APIs, turning base64 strings into PDFs on Storage, validation, logging, and so on...
I removed some routes and it looks more stable now. My question now is: is there a limit to the amount of code/processing within a single function? And would it be a better approach to split it up into multiple Express APIs?
I also found some questions about allocating memory to specific functions. However, I can't find the option to change it in the Google Cloud Platform console, nor an option in the Firebase package.json to set it.
I found the solution:
1. Go to the Google Cloud Platform Console (not the Firebase console).
2. Select Cloud Functions in the menu.
3. You should now see your Firebase function here; if not, check that you selected the right project.
4. Ignore all the checkboxes, buttons, and menu items; just click on the name of the function.
5. Click Edit (top menu), change only the allocated memory, and click Save.
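As a side note, newer versions of the firebase-functions SDK also let you set the allocation in code via runWith(); a minimal sketch (the function name is hypothetical):

const functions = require('firebase-functions');

exports.generatePdf = functions
  .runWith({ memory: '1GB', timeoutSeconds: 300 })
  .https.onRequest((req, res) => {
    // Heavy work such as turning base64 strings into PDFs
    res.status(200).send('done');
  });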
Regards, Peter