gcloud functions list returns 0 items even though there is an active cloud function - google-cloud-functions

I have deployed one Cloud Function (gen2) with an HTTP trigger to GCP. Once deployed, I verified in the console that it is active and I can browse to its URL, but when I run gcloud functions list it returns 0 items.
Has anyone had the same issue when using gcloud functions list? I wonder if the issue is specific to Cloud Functions gen2.

After upgrading gcloud (the Google Cloud SDK) to the latest version, 406.0.0, gcloud functions list works.
I think my gcloud version was 3xx-something before the upgrade.
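If you hit the same symptom, a quick sanity check is to confirm the SDK version and update it before digging further. A minimal sketch (if the SDK was installed through a package manager such as apt, update it through that instead):

    gcloud --version          # older 3xx releases predate full gen2 support
    gcloud components update  # upgrade the SDK in place
    gcloud functions list     # newer SDKs include gen2 functions in the listing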

Related

Dynatrace: invoke a shell script from a Linux server

Is there a way to create a problem metric in Dynatrace using a shell script that can be executed from a Linux server?
Here, "problem metric" means the following:
Assume we are using a shell script to check the status of services deployed on the Linux server.
Then that shell script should be callable by Dynatrace, and, based on the shell script's response, it should be able to create a Problem.
What do you mean by "problem metric"? You can create metrics via the Metrics API and Problems via the Events API.
You can call either endpoint from a shell script on Linux. If there is a OneAgent on the system, you could also use an extension.
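As a rough sketch of both calls from a shell script: the tenant URL and token below are placeholders (the token needs the metrics.ingest and events.ingest scopes), and the metric key, dimension, and entity selector are made-up examples.

    DT_ENV="https://abc12345.live.dynatrace.com"
    DT_TOKEN="dt0c01.XXXX"

    # Push a data point via the Metrics API v2 ingest endpoint
    curl -X POST "$DT_ENV/api/v2/metrics/ingest" \
        -H "Authorization: Api-Token $DT_TOKEN" \
        -H "Content-Type: text/plain" \
        --data 'custom.service.status,service=my-service 0'

    # Raise a problem-opening event via the Events API v2
    curl -X POST "$DT_ENV/api/v2/events/ingest" \
        -H "Authorization: Api-Token $DT_TOKEN" \
        -H "Content-Type: application/json" \
        --data '{
            "eventType": "CUSTOM_ALERT",
            "title": "Service check failed",
            "entitySelector": "type(HOST),entityName(my-linux-host)"
        }'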

How to create a function that runs a gcloud command?

I use the following command on my Compute Engine instance to run a script that's stored in Cloud Storage:
gsutil cat gs://project/folder/script.sh | sh
I want to create a function that runs this command, and eventually I want to schedule that function to run, but I don't know how to do this. Does anyone know how?
Cloud Functions is serverless and you can't manage the runtime environment. You don't know what is installed in the Cloud Functions runtime environment, and you can't assume that gcloud exists there.
The solution is to use Cloud Run. The behavior is very close to Cloud Functions: simply wrap your function in a web server (I wrote my first article on that) and, in your container, install what you want, especially the gcloud SDK (you can also use a base image with the gcloud SDK already installed). This time you will be able to call system binaries, because you know they exist: you installed them!
Anyway, be careful with your script execution: the container is immutable. You can't change files, binaries, stored files, and so on. I don't know the content of your script, but you aren't on a VM; you are still in a serverless environment with an ephemeral runtime.
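A rough sketch of that approach follows. The image name, region, scheduler job, and server.sh are all hypothetical; Google's cloud-sdk base image ships gcloud and gsutil, and on Cloud Run the server must listen on the port given in $PORT.

    cat > Dockerfile <<'EOF'
    FROM gcr.io/google.com/cloudsdktool/cloud-sdk:slim
    # server.sh is a hypothetical web server that, on each request, runs:
    #   gsutil cat gs://project/folder/script.sh | sh
    COPY server.sh /server.sh
    CMD ["/server.sh"]
    EOF

    # Build and deploy the container
    gcloud builds submit --tag gcr.io/PROJECT_ID/script-runner
    gcloud run deploy script-runner --image gcr.io/PROJECT_ID/script-runner --region us-central1

    # For the scheduling part of the question, Cloud Scheduler can invoke
    # the service on a cron schedule (URL and service account are placeholders):
    gcloud scheduler jobs create http run-script --schedule '0 * * * *' \
        --uri https://script-runner-xxxx-uc.a.run.app \
        --oidc-service-account-email invoker@PROJECT_ID.iam.gserviceaccount.com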

Google Functions + Execute bash script on instance

I need to execute a bash script/command within a Google Compute Engine instance using Google Cloud Functions and get a response.
AWS has an agent called SSM that lets me do that with no hassle from Lambda; I did not find anything like it on Google Cloud.
On AWS, using a Node.js Lambda, I use the following example:
// ssm is an AWS.SSM client from the AWS SDK for JavaScript
ssm.sendCommand({
    DocumentName: documentName,
    InstanceIds: [ instanceId ],
    TimeoutSeconds: 3600,
    Parameters: {
        'commands': commands
    }
}, (err, data) => { /* handle the response */ });
How can I achieve what I want on Google Cloud? Thank you.
The nature of Google Cloud Functions is that it is the most abstract of the serverless paradigms. It assumes that all you are providing is stateless logic to be executed in one of the supported languages. You aren't supposed to know nor care how that logic is executed. Running bash (for example) assumes that your Cloud Function is running in a Unix-like environment where bash is available. While that is likely to be true, it isn't part of the "core" contract that you have with the Cloud Functions environment.
An alternative solution I would suggest is to study Google's Cloud Run concept. Just like Cloud Functions, Cloud Run is serverless (you don't provision any servers and it scales to zero), but the distinction is that what is executed is a Docker container. Your code within that container is executed in the container environment when called. Google spins these containers up and down as needed to satisfy your incoming load. Since it is running in a container, you have 100% control over what your logic does, including running bash commands and providing any scripts or other environment needed to run.
This is not possible. You can perform some actions from a Cloud Function, such as starting or stopping a VM, but you can't get or list the directories within a VM. In this case the Compute Engine API is being used, but it only reaches the instance as a resource, not the guest OS inside it.
The workaround would be to create a request handler on your VM so that it can be reached by the Cloud Function. Proper security should be implemented to avoid requests from anonymous callers. You might use the public IP to reach your VM.
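A very rough sketch of such a handler, assuming a made-up port and script path; netcat flags vary by implementation, and this has no authentication, so don't expose it as-is:

    #!/bin/sh
    # Answer each HTTP request on port 8080 with the output of a
    # hypothetical status script, then wait for the next request.
    while true; do
        { printf 'HTTP/1.1 200 OK\r\n\r\n'; /opt/scripts/status.sh; } | nc -l -p 8080 -q 1
    done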
You can run a Bash script on Google Cloud Run with a Bash Docker image. As an example: https://github.com/sethvargo/cloud-run-bash-example
Then you can call this Cloud Run service from your other existing Google Cloud Functions, App Engine, etc.
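For a private Cloud Run service, the caller needs an identity token. From a shell the call might look like this (the service URL is a placeholder):

    curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
        https://my-bash-service-xxxx-uc.a.run.app/run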

I can't run any gcloud sql commands

MacBook-Air-2:~ Owner$ gcloud sql instances describe ahaha-mysql
ERROR: (gcloud.sql.instances.describe) There was no instance found at projects/ahaha-20180621/instances/ahaha-mysql or you are not authorized to access it.
You will need to grant the Compute Engine instance permission to access Cloud SQL:
1. Stop the instance
2. Edit the instance and change the Cloud API access scopes
3. Enable Cloud SQL
4. Restart the instance
Run gcloud auth login and authenticate as yourself by following the link (you might need to fix it in a text editor) and entering the verification code.
Then run gcloud sql instances describe ahaha-mysql; this should now work.
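The scope change in the steps above can also be made from the command line; a sketch, with a made-up instance name and zone, noting that the VM must be stopped before its scopes can change:

    gcloud compute instances stop my-instance --zone us-central1-a
    gcloud compute instances set-service-account my-instance \
        --zone us-central1-a --scopes sql-admin
    gcloud compute instances start my-instance --zone us-central1-a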

Deploy Google Cloud Function from GAE/Java

I'd like to deploy a Google Cloud Function from GAE/Java, but I cannot find any info about deploying a function other than by using the gcloud command-line tool.
Is there a way to deploy a cloud function from Google App Engine (standard)/Java, e.g. by using the Cloud Storage API and setting some additional fields on the request (e.g. for the httpsTrigger etc.)?
You can create Cloud Functions using the REST or RPC endpoints.
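As a sketch of the v1 REST call (projects.locations.functions.create), shown here with curl; from Java you could issue the same request with any HTTP client. PROJECT, REGION, the bucket path, and the runtime are placeholders, and the zipped source is assumed to be uploaded to Cloud Storage already:

    curl -X POST \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        "https://cloudfunctions.googleapis.com/v1/projects/PROJECT/locations/REGION/functions" \
        -d '{
            "name": "projects/PROJECT/locations/REGION/functions/my-function",
            "entryPoint": "myFunction",
            "runtime": "nodejs16",
            "sourceArchiveUrl": "gs://my-bucket/function-source.zip",
            "httpsTrigger": {}
        }'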