Today I got this error in a Cloud Function:
Function killed. Error: memory limit exceeded
My function is based on the authenticated-json-api example from the Firebase sample functions. Because it worked like a charm, I extended it with multiple routes and tasks such as connecting to several external APIs, turning base64 strings into PDFs in Storage, validation, logging, and so on...
I removed some routes and it looks more stable now. My question now is: is there a limit to the amount of code/processing a single function can contain? And would it be a better approach to split it up into multiple Express APIs?
I also found some questions about allocating memory to specific functions. However, I can't find the option to change it in the Google Cloud Platform console, nor an option in the Firebase package.json to set it.
I found the solution:
Go to the Google Cloud Platform Console (not the Firebase console)
Select Cloud Functions in the menu
You should now see your Firebase function listed here. If not, check that you selected the right project.
Ignore all checkboxes, buttons, and menu items; just click on the name of the function.
Click Edit (top menu), change only the allocated memory, and click Save.
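If you prefer to keep this setting in code, newer versions of the firebase-functions SDK also expose it via runWith; a minimal sketch (the export name and handler are placeholders):

const functions = require('firebase-functions');

// Allocate 1 GB of memory and a 300 s timeout to this HTTPS function ('api' is a placeholder).
exports.api = functions
  .runWith({ memory: '1GB', timeoutSeconds: 300 })
  .https.onRequest((req, res) => {
    res.send('ok');
  });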
Regards, Peter
I would like to quickly list all Google Cloud projects in an organization, without Apps Script folders.
gcloud projects list can be very slow. This documentation is about speeding it up, but does not show how to retrieve the Apps Script folder that is used for filtering. Can that be done from the command line?
Also, gcloud projects list does not have a way to filter by organization. It seems that this is impossible, as projects are not linked to their organization except through a tree of folders.
The documentation shows a way of walking the tree, apparently with Resource Manager API, which might do the job, but only pseudocode is shown. How can this be done with gcloud -- or else with Python or another language?
And if there is no way to accelerate this: How do I page through results using gcloud projects list? The documentation shows that page-size can be set, but does not show how to step through page by page (presumably by sending a page number with each command).
See also below for a reference to code I wrote that is the imperfect but best solution I could find.
Unfortunately there isn't a native Apps Script service available to work with the Cloud Resource Manager API.
However, it is possible to make an HTTP call directly to the Resource Manager API projects.list() endpoint with the help of the UrlFetchApp service.
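For illustration, a minimal Apps Script sketch of such a call (assuming the Cloud Resource Manager API is enabled for the associated Cloud project and the script is authorized for the cloud-platform scope):

// Calls the Cloud Resource Manager REST endpoint directly via UrlFetchApp.
// The cloud-platform OAuth scope must be declared in the script manifest.
function listProjects() {
  var url = 'https://cloudresourcemanager.googleapis.com/v1/projects';
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() },
    muteHttpExceptions: true
  });
  var projects = JSON.parse(response.getContentText()).projects || [];
  projects.forEach(function (p) {
    Logger.log(p.projectId + ' (' + p.name + ')');
  });
}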
Alternatively, using Python as mentioned, the recommended Google APIs client library for Python supports calls to the Resource Manager API. You can find the specific projects.list() method documentation here.
On an additional note, if you happen to use a Cloud project to generate credentials and authenticate the API call, you may want to enable the Cloud Resource Manager API on your project by following this URL.
I’d also recommend submitting a new Feature Request using this template.
Here is some code that lists projects in an organization as quickly as possible. It is in Clojure, but it uses Java APIs and you can translate it easily.
Key steps
Query all accessible projects using CloudResourceManager projects(), using setQuery to accelerate the query by filtering out, for example, the hundreds of sys- projects often generated by Apps Script. The query uses paging.
From the results:
Accept those that are the child of the desired org.
Reject those that are the child of another org.
For those that are the child of a folder, do this (concurrently, for speed): use gcloud projects get-ancestors $PROJECT_ID to determine whether the project is in your organization. (I don't see a way to do that in Java, so I call the CLI.)
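A rough Node.js equivalent of these steps, using the googleapis client (the organization ID is a placeholder and the filter string is an assumption; treat this as a sketch, not a drop-in replacement for the Clojure code):

// Lists accessible projects, skips Apps Script-generated sys- projects via a
// server-side filter, and keeps only projects under the given organization.
const { google } = require('googleapis');

async function listOrgProjects(orgId) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const crm = google.cloudresourcemanager({ version: 'v1', auth });

  const projectIds = [];
  let pageToken;
  do {
    const { data } = await crm.projects.list({
      filter: 'NOT name:sys-*', // assumption: adjust the query to your own needs
      pageToken,
    });
    for (const p of data.projects || []) {
      if (!p.parent) continue;
      if (p.parent.type === 'organization') {
        if (p.parent.id === orgId) projectIds.push(p.projectId);
      } else if (p.parent.type === 'folder') {
        // Walk the ancestry to see whether the folder sits under the desired org,
        // here via the v1 projects.getAncestry method instead of shelling out to gcloud.
        const { data: ancestry } = await crm.projects.getAncestry({
          projectId: p.projectId,
          requestBody: {},
        });
        const inOrg = (ancestry.ancestor || []).some(
          (a) => a.resourceId.type === 'organization' && a.resourceId.id === orgId
        );
        if (inOrg) projectIds.push(p.projectId);
      }
    }
    pageToken = data.nextPageToken;
  } while (pageToken);
  return projectIds;
}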
I use the Serverless Framework to manage my Cloud Functions. Some of them are HTTP functions. Recently, all the HTTP functions started to fail with a 403 error, no matter whether you enter the URL in a browser or trigger it with Cloud Scheduler. The only place where it works is the testing tab of the function in the Cloud Console, when you click the "Test the function" button.
I did not find the reason for the error, but it was fixed by removing the function and redeploying it:
serverless remove
serverless deploy
Is it possible that Identity-Aware Proxy has been enabled for the Cloud Function URLs? If you navigate to the Cloud Console and then to "Security" and "Identity-Aware Proxy", you should be able to see the IAP settings and whether the Cloud Function is being protected by IAP.
If that is not the cause, I would advise adding some logging to your function so you can tell whether it is actually being called. If it is called and returns a 403 somewhere within its execution, the problem lies with the function itself rather than with the identity infrastructure. If it is never called, the 403 is being produced outside the Cloud Function, in which case you may need to reach out to Cloud Support for help (if IAP isn't the cause).
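For example, a log line at the very top of the handler is enough to tell the two cases apart (a minimal sketch; the export name is a placeholder):

// If this line never appears in Cloud Logging for a failing request, the 403 is
// being returned before the function runs (IAM/IAP), not by your own code.
exports.myHttpFunction = (req, res) => {
  console.log('invoked:', req.method, req.path);
  // ... existing handler logic ...
  res.status(200).send('ok');
};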
Google Cloud Functions added some new IAM functionality, not sure how recently, and now new functions don’t have public access by default.
In case someone else comes here, I thought I'd share this information.
To allow your function to be invoked, you first have to add permissions to it. You can do this by selecting the function in the functions list and adding allUsers to the Cloud Functions Invoker role. You can see the step-by-step guide at:
https://lukestoolkit.blogspot.com/2020/06/google-cloud-functions-error-forbidden.html
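The same binding can also be added from the command line; a sketch (the function name and region are placeholders, and the exact flags may vary with your gcloud version):

gcloud functions add-iam-policy-binding FUNCTION_NAME --region=REGION --member="allUsers" --role="roles/cloudfunctions.invoker"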
I have lately been experimenting (as a noob!) with webhooks. However, I seem to be stuck with an "actions-on-google:error No user object" issue.
I would appreciate it if you could reach out and lend a hand, please.
(firebase log and index.js snippets omitted)
The inline editor uses Firebase Cloud Functions, and the issue is that Firebase isn't allowing you to make external requests (EXT_PRAYER_TIME_API_URL = ...) on the current plan (see Cloud Functions pricing). You need to set up a billing account for your project and change your plan to one that allows outbound requests.
Does Google Data Studio Community connector support pagination?
I work with an external data service. The service returns data page by page. It requires start and next parameters and allows at most 2 req/sec. Can I override a method like getData or extend the request argument to implement this feature?
If not, is there a best practice for getting data of this kind?
Community Connectors do not support pagination for web APIs at present.
The best practice would depend on your use case. If you want to get the full dataset for the user, you can make multiple UrlFetch calls to get the full dataset, merge it, and return the merged set as the getData() response. It might also make sense to cache this result to avoid making a large number of requests in the short term. You can cache using the Apps Script cache, a Sheet, or even BigQuery. Keep in mind that Apps Script has a 6 min/execution limit.
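A rough Apps Script sketch of that merge-and-cache approach (the endpoint, the start/next parameter handling, and the 500 ms pause are assumptions based on the question):

// Fetches every page from the paged API at <= 2 requests/second, merges the
// rows, and caches the merged result so repeated getData() calls stay cheap.
function fetchAllPages(baseUrl) {
  var cache = CacheService.getScriptCache();
  var cached = cache.get('dataset');
  if (cached) return JSON.parse(cached);

  var rows = [];
  var start = 0;
  while (true) {
    var response = UrlFetchApp.fetch(baseUrl + '?start=' + start);
    var page = JSON.parse(response.getContentText());
    rows = rows.concat(page.items || []);
    if (!page.next) break;   // assumption: the API returns a `next` offset while more pages exist
    start = page.next;
    Utilities.sleep(500);    // stay at or below 2 requests per second
  }
  // Cached values are limited to ~100 KB; fall back to a Sheet or BigQuery for larger sets.
  cache.put('dataset', JSON.stringify(rows), 21600); // 6 hours, the maximum expiration
  return rows;
}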
However, if you want to return only specific pages, the only way to configure that would be through getConfig, since configParams are passed with the getData() request. An example use case would be returning only the first n pages, where n is selected by the user in the config.
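For that route, a sketch of exposing the page count in getConfig and reading it back in getData (using the DataStudioApp connector service; 'maxPages' is a placeholder id):

var cc = DataStudioApp.createCommunityConnector();

function getConfig() {
  var config = cc.getConfig();
  config.newTextInput()
    .setId('maxPages')
    .setName('Number of pages to fetch')
    .setHelpText('Only the first N pages of the API will be requested.');
  return config.build();
}

function getData(request) {
  var maxPages = Number(request.configParams && request.configParams.maxPages) || 1;
  // ... fetch only `maxPages` pages and build the usual getData() response ...
}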
Our application uses the drive.file scope to make sure we only can see files of our users that have been created by our application.
However, a 'list' call returns files that have been shared with the user even though they were not created by our application.
That can be easily verified in the "Try it!" section of the files.list API documentation.
Authorize with the drive.file scope and run a simple list query without any parameters. That should return an empty list, but in my case it returns dozens of files that have been shared with me.
There was the same issue before (a slight variation only affecting queries with q parameter set): Listing files with search query returns out-of-scope results (drive.files.list call, using drive.files scope)
It was fixed in the meantime, but now it seems to be back for all list queries. It's problematic not mainly because it breaks our app, which expects nothing but its own files; there is also a privacy problem, because I can suddenly see the file names of our users' private data, which they never agreed to.
I believe this issue is due to the behavior of the API Explorer, not the Drive API itself. If the API Explorer already has a token with OAuth scopes capable of making the call, it will use it; so if you previously gave the API Explorer a scope that can see all user files, you'll get them all back. Try revoking ALL Explorer tokens for your account at:
https://accounts.google.com/b/0/IssuedAuthSubTokens?hl=en
then, after clearing all cookies/sessions for developers.google.com, try creating a new token with only the drive.file scope and attempt your API call again.
You should also note that files that are publicly shared will be returned.
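One way to double-check outside the API Explorer is a small script whose token holds nothing but the drive.file scope (a sketch using the Node.js googleapis client; the OAuth client setup is assumed to exist already):

// Lists the files visible to an app holding only the drive.file scope. With a
// token restricted to that scope, only files created or opened with this app
// (plus, as noted above, publicly shared files) should come back.
const { google } = require('googleapis');

async function listDriveFileScopeFiles(oauth2Client) {
  // oauth2Client must have been authorized with ONLY:
  //   https://www.googleapis.com/auth/drive.file
  const drive = google.drive({ version: 'v3', auth: oauth2Client });
  const { data } = await drive.files.list({ fields: 'files(id, name, shared)' });
  (data.files || []).forEach((f) => console.log(f.id, f.name, f.shared));
}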