I'd like to deploy a Google Cloud Function from GAE/Java, but I cannot find any info about deploying a function other than by using the gcloud command-line tool.
Is there a way to deploy a Cloud Function from Google App Engine (standard) / Java, e.g. by using the Cloud Storage API and setting some additional fields on the request (e.g. for the HTTPS trigger etc.)?
You can create Cloud Functions using the REST or RPC endpoints.
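For example, a minimal sketch (not a full implementation) of calling the v1 REST method projects.locations.functions.create from Java could look roughly like this. The project, region, bucket, entry point and function names below are placeholders, and the function source is assumed to already be zipped and uploaded to Cloud Storage:

import com.google.auth.oauth2.GoogleCredentials;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

public class DeployFunction {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your own project, region, bucket and names.
        String parent = "projects/my-project/locations/us-central1";
        String body = "{"
                + "\"name\":\"" + parent + "/functions/my-function\","
                + "\"entryPoint\":\"helloWorld\","
                + "\"runtime\":\"nodejs10\","
                + "\"sourceArchiveUrl\":\"gs://my-bucket/function-source.zip\","
                + "\"httpsTrigger\":{}"
                + "}";

        // Application Default Credentials with the cloud-platform scope.
        GoogleCredentials creds = GoogleCredentials.getApplicationDefault()
                .createScoped(Collections.singletonList(
                        "https://www.googleapis.com/auth/cloud-platform"));
        String token = creds.refreshAccessToken().getTokenValue();

        // projects.locations.functions.create in the v1 REST API.
        URL url = new URL("https://cloudfunctions.googleapis.com/v1/" + parent + "/functions");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Bearer " + token);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Create returned HTTP " + conn.getResponseCode());
    }
}

The same request can be made through a generated client library instead of raw HTTP; the key point is that the create call takes the Cloud Storage source URL and the trigger fields, which is effectively what the gcloud tool does on your behalf.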
I have some params in Remote Config that I want to update from Google Cloud Functions.
Should I use the Remote Config API when both Cloud Functions and Remote Config belong to the same account or project?
I am asking because Cloud Functions can read data from Firestore directly, without any explicit authentication or extra API.
Remote Config provides a REST API to update the parameters or the template.
You don't have to call it from a Google Cloud Function, but calling it from a Google Cloud Function or even a Cloud Function for Firebase is definitely workable.
You can even call the REST API from Postman or some other tool once you set up the call properly.
Check more details here: https://firebase.google.com/docs/reference/remote-config/rest
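As a rough sketch, fetching the current template over that REST API from any server-side environment could look like this (shown in Java for consistency with the other snippets in this thread; PROJECT_ID is a placeholder, and Application Default Credentials with the firebase.remoteconfig scope are assumed):

import com.google.auth.oauth2.GoogleCredentials;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Collections;

public class FetchRemoteConfig {
    public static void main(String[] args) throws Exception {
        // Application Default Credentials with the Remote Config scope.
        GoogleCredentials creds = GoogleCredentials.getApplicationDefault()
                .createScoped(Collections.singletonList(
                        "https://www.googleapis.com/auth/firebase.remoteconfig"));
        String token = creds.refreshAccessToken().getTokenValue();

        // PROJECT_ID is a placeholder for your Firebase project ID.
        URL url = new URL(
                "https://firebaseremoteconfig.googleapis.com/v1/projects/PROJECT_ID/remoteConfig");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", "Bearer " + token);

        // The ETag must be echoed back in an If-Match header when publishing changes.
        System.out.println("ETag: " + conn.getHeaderField("ETag"));
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            in.lines().forEach(System.out::println);
        }
    }
}

Updating parameters is then a PUT of the modified template to the same URL, sending the ETag back in an If-Match header, as described in the linked docs.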
With Remote Config backend APIs, you could use Remote Config with Cloud Functions for Firebase, changing values in your app based on events that happen server-side. For example, you can use Remote Config to promote a new feature in your app, and then turn off that promotion automatically once you detect enough people have interacted with the new feature.
Using the Remote Config REST API or the Admin SDKs described in this guide, you can bypass managing the template in the Firebase console to directly integrate Remote Config changes into your own processes.
As described here, Cloud Functions can be triggered in response to changes in Firebase Remote Config in the same Cloud project as the function. This makes it possible to change the behavior and appearance of your app without publishing an app update.
I have a Google Firebase app with a Cloud Functions back-end. I'm using the Node.js 10 runtime, which is based on Ubuntu 18.04. Users can upload files to Google Cloud Storage, and that triggers a GCF function.
What I'd like that function to do is copy that file to Google Cloud Filestore (that's File with an L), Google's new NFS-mountable file server. They say the usual way to do that is from the command line with gsutil rsync or gcloud compute scp.
My question is: are either or both of those commands available on GCF nodes? Or is there another way? (It would be awesome if I could just mount the Filestore share on the GCF instance, but I'm guessing that's nontrivial.)
Using NFS-based storage is not a good idea in this environment. NFS works by providing a mountable file system, something that will not work in the Cloud Functions environment, as its file system is read-only with the exception of the /tmp folder.
You should consider using cloud-native storage systems like GCS, for which the Application Default Credentials are already set up. See the list of supported services here.
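If the goal is simply to put the uploaded file somewhere durable, a minimal sketch along these lines keeps everything inside GCS and only touches /tmp when a local copy is genuinely needed (shown in Java for consistency with the other snippets here; the Node.js Storage client has equivalent copy and download calls, and the bucket and object names are placeholders):

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

import java.nio.file.Paths;

public class HandleUploadedObject {
    public static void main(String[] args) {
        // Uses the runtime's Application Default Credentials.
        Storage storage = StorageOptions.getDefaultInstance().getService();

        // Server-side copy to another bucket; no bytes pass through the function.
        storage.copy(Storage.CopyRequest.of(
                "uploads-bucket", "incoming/file.bin",
                BlobId.of("archive-bucket", "incoming/file.bin"))).getResult();

        // Or stage the object in /tmp, the only writable path, for local processing.
        storage.get(BlobId.of("uploads-bucket", "incoming/file.bin"))
                .downloadTo(Paths.get("/tmp/file.bin"));
    }
}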
According to the official Cloud Filestore documentation:
Use Cloud Filestore to create fully managed NFS file servers on Google Cloud Platform (GCP) for use with applications running on Compute Engine virtual machine (VM) instances or Google Kubernetes Engine clusters.
You cannot mount Filestore on GCF.
Also, you cannot execute gsutil or gcloud commands from a Google Cloud Function; see Writing Cloud Functions:
Google Cloud Functions can be written in Node.js, Python, and Go, and are executed in language-specific runtimes.
I want to write a Google Cloud Function that can interact with GCP's Dataproc service to programmatically launch Dataproc clusters. We already have a battle-hardened Dataproc infrastructure; we're just looking to extend the ways in which clusters get launched.
Our Dataproc clusters can only be launched using an appropriate IAM service account that is already a member of the appropriate IAM roles hence the Cloud Function will need to authenticate to the Dataproc service using that service account. What is the most appropriate way for a Cloud Function to authenticate to other GCP services/APIs using a service account?
Options I suspect include:
* running the function as that service account
* providing a JSON key file & setting the GOOGLE_APPLICATION_CREDENTIALS environment variable
Is there a recognised way of achieving this?
I have had a look at:
* https://cloud.google.com/docs/authentication/
* https://cloud.google.com/docs/authentication/getting-started
but they are not specific to Cloud Functions.
I've also looked at
* https://cloud.google.com/functions/docs/writing/http
but that seems more concerned with how the caller of the function can authenticate.
I think this is what you're looking for: https://cloud.google.com/functions/docs/concepts/iam
At runtime, Cloud Functions defaults to using the App Engine default service account (PROJECT_ID@appspot.gserviceaccount.com), which has the Editor role on the project. You can change the roles of this service account to limit or extend the permissions for your running functions. You can also change which service account is used by providing a non-default service account on a per-function basis.
tl;dr gcloud functions deploy FUNCTION_NAME --service-account SERVICE_ACCOUNT_EMAIL
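With the function deployed under that service account, a rough sketch of launching a cluster with the Dataproc Java client might look like the following. The project, region, cluster name and machine types are placeholders, not your existing configuration:

import com.google.cloud.dataproc.v1.Cluster;
import com.google.cloud.dataproc.v1.ClusterConfig;
import com.google.cloud.dataproc.v1.ClusterControllerClient;
import com.google.cloud.dataproc.v1.ClusterControllerSettings;
import com.google.cloud.dataproc.v1.InstanceGroupConfig;

public class LaunchDataprocCluster {
    public static void main(String[] args) throws Exception {
        String projectId = "my-project";
        String region = "us-central1";

        // The Dataproc client requires the regional endpoint.
        ClusterControllerSettings settings = ClusterControllerSettings.newBuilder()
                .setEndpoint(region + "-dataproc.googleapis.com:443")
                .build();

        try (ClusterControllerClient client = ClusterControllerClient.create(settings)) {
            Cluster cluster = Cluster.newBuilder()
                    .setClusterName("cf-launched-cluster")
                    .setConfig(ClusterConfig.newBuilder()
                            .setMasterConfig(InstanceGroupConfig.newBuilder()
                                    .setNumInstances(1)
                                    .setMachineTypeUri("n1-standard-2"))
                            .setWorkerConfig(InstanceGroupConfig.newBuilder()
                                    .setNumInstances(2)
                                    .setMachineTypeUri("n1-standard-2")))
                    .build();

            // Credentials come from the runtime service account;
            // no key file or GOOGLE_APPLICATION_CREDENTIALS needed.
            client.createClusterAsync(projectId, region, cluster).get();
        }
    }
}

Because the client library picks up the function's runtime service account automatically, running the function as the dedicated service account is generally cleaner than shipping a JSON key file with the deployment.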
By the way, if you ever need more complex scheduling logic, consider looking into Cloud Composer (managed Apache Airflow): https://cloud.google.com/composer/
I have an application which uses Objectify that I want to deploy on a Google Compute Engine instance to access Google Cloud Datastore. I have been able to test this application on the local development server using Objectify. I am also able to access Cloud Datastore from the Compute Engine instance by following the documentation at https://cloud.google.com/datastore/docs/getstarted/start_java/.
But when I deploy my application on Google Compute Engine, I am not able to communicate with Google Cloud Datastore and I get the following exception:
No API environment is registered for this thread.
I must be missing something. Kindly help me out.
As far as I can tell, Objectify currently only works with Datastore if you are running in Google App Engine (GAE), not for Datastore access via Google Compute Engine (GCE). There is an open issue: https://github.com/objectify/objectify/issues/203
The reason it does not work on Google Compute Engine is apparently that the API for Datastore access in GCE is different from the one used for GAE.
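A workaround on GCE is to drop down to the google-cloud-datastore client library instead of Objectify. A minimal sketch (the Task kind and property names are just placeholders):

import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;

public class DatastoreFromGce {
    public static void main(String[] args) {
        // Uses the VM's default service account credentials.
        Datastore datastore = DatastoreOptions.getDefaultInstance().getService();

        Key key = datastore.newKeyFactory().setKind("Task").newKey("sample-task");
        Entity task = Entity.newBuilder(key)
                .set("description", "written from Compute Engine")
                .build();
        datastore.put(task);

        System.out.println(datastore.get(key).getString("description"));
    }
}

You lose Objectify's typed entity mapping this way, but the low-level client does not depend on the App Engine API environment that the exception above is complaining about.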
I am new to Google App Engine. I am using a MySQL database locally and running a sample application on Google App Engine, but I also need the deployed application to use a database via Google Cloud SQL. I referred to some details in the Google developer documentation and found some information on connecting to Google Cloud SQL using the Google MySQL (jdbc:google:mysql://) connection.
I have used the following sample code for the MySQL connection:
String url;
if (SystemProperty.environment.value() == SystemProperty.Environment.Value.Production) {
    // Running on App Engine: load the class that provides the
    // new "jdbc:google:mysql://" prefix.
    Class.forName("com.mysql.jdbc.GoogleDriver");
    url = "jdbc:google:mysql://azk-net:your-instance-name/guestbook?user=root";
} else {
    // Local MySQL instance to use during development.
    Class.forName("com.mysql.jdbc.Driver");
    url = "jdbc:mysql://127.0.0.1:3306/guestbook?user=root";
    // Alternatively, connect to a Google Cloud SQL instance using:
    // jdbc:mysql://ip-address-of-google-cloud-sql-instance:3306/guestbook?user=root
}
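The url is then used to open the connection in the usual way with java.sql.DriverManager:

// Open the connection using whichever URL was chosen above.
Connection conn = DriverManager.getConnection(url);
// ... run queries against the guestbook database ...
conn.close();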
But the thing is, I cannot create an instance name for my project. Locally it works correctly, but when I deploy to Google App Engine I get an error since the instance name is not available.
When I try to create the instance in Google Cloud SQL, it is not created because it requires billing.
Can anyone help me connect the database to Google Cloud SQL by writing the connection properties for accessing a database on Google Cloud SQL?
There's no free quota for Google Cloud SQL; you have to enable billing to use it.
Google offers two billing plans for Cloud SQL: Packages and Per Use. More information on pricing for Google Cloud SQL can be found on the Cloud SQL Pricing page.