Functions and events in OCI

I have a scenario where I want to run commands in an instance using Functions, based on events in OCI (Oracle Cloud Infrastructure).
The flow:
Object Storage: object update/modification -> trigger event -> execute function -> run commands on specific instances using Run Command
Is this achievable?
As far as I can currently see, executing the Run Command service requires OCI config files (a profile).

To do this, you can use the OCI Java or Python SDK within the function to invoke the Run Command API.
See the API details under "Using the API" in the docs. You will also need to configure the Functions resource principal to have access to the Compute instance in question; see this for details on configuring the Functions resource principal.
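For reference, a minimal sketch of such a function in Python, assuming the compute_instance_agent client of the OCI Python SDK and a dynamic group/policy that lets the function use instance-agent commands; the OCIDs and the command text are placeholders:

# Sketch of an OCI Function handler that triggers Run Command on an instance.
# Assumes the function's resource principal belongs to a dynamic group with a
# policy such as:
#   Allow dynamic-group fn-dg to use instance-agent-command-family in compartment <name>
import io
import json

import oci

def handler(ctx, data: io.BytesIO = None):
    # Event payload from the Events service (identifies the changed object).
    event = json.loads(data.getvalue()) if data else {}

    # Authenticate as the function itself -- no config file/profile needed.
    signer = oci.auth.signers.get_resource_principals_signer()
    client = oci.compute_instance_agent.ComputeInstanceAgentClient(
        config={}, signer=signer
    )

    details = oci.compute_instance_agent.models.CreateInstanceAgentCommandDetails(
        compartment_id="ocid1.compartment.oc1..example",   # placeholder
        execution_time_out_in_seconds=600,
        target=oci.compute_instance_agent.models.InstanceAgentCommandTarget(
            instance_id="ocid1.instance.oc1..example"      # placeholder
        ),
        content=oci.compute_instance_agent.models.InstanceAgentCommandContent(
            source=oci.compute_instance_agent.models.InstanceAgentCommandSourceViaTextDetails(
                source_type="TEXT",
                text="echo 'object changed' >> /tmp/events.log",
            )
        ),
    )
    return client.create_instance_agent_command(details).data.id

Note that the target instance must be running the Oracle Cloud Agent with the Run Command plugin enabled for the command to actually execute.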

Related

Dynatrace Invoke shell script from Linux server

Is there a way to create a problem metric in Dynatrace using a shell script that can be executed from the Linux server?
Here, "problem metric" means:
Let's assume we are using a shell script to check the status of deployed services on the Linux server.
Then that shell script should be callable by Dynatrace, and based on the shell script's response, it should be able to create a Problem.
What do you mean by 'problem metric'?
You can create metrics via the Metric API and Problems via the Events API.
You can call either endpoint from a shell script on Linux. If there is a OneAgent on the system, you could also use an extension.
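As a hedged sketch of the Events side (raising a Problem when the check fails), assuming the Events API v2 ingest endpoint and an API token with the events.ingest scope; the tenant URL, token, script path, and entity selector are placeholders, and the same POST can be issued with curl straight from the shell script:

# Sketch: raise a problem-opening event in Dynatrace when a service check fails.
# Assumes Events API v2 (/api/v2/events/ingest) and a token with events.ingest.
import json
import subprocess
import urllib.request

DT_URL = "https://<your-tenant>.live.dynatrace.com/api/v2/events/ingest"
DT_TOKEN = "dt0c01.EXAMPLE"  # placeholder token

# Run the existing shell check; a non-zero exit code means a service is down.
check = subprocess.run(["/opt/checks/service_status.sh"], capture_output=True)

if check.returncode != 0:
    payload = {
        "eventType": "CUSTOM_ALERT",  # event type that opens a Problem
        "title": "Deployed service check failed",
        "entitySelector": 'type(HOST),entityName("my-linux-host")',  # placeholder
        "properties": {"detail": check.stdout.decode(errors="replace")},
    }
    req = urllib.request.Request(
        DT_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Api-Token {DT_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())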

How to create a function that runs a gcloud command?

I use the following command in my Compute Engine instance to run a script stored in Cloud Storage:
gsutil cat gs://project/folder/script.sh | sh
I want to create a function that runs this command, and eventually to schedule that function, but I don't know how to do this. Does anyone know?
Cloud Functions is serverless and you can't manage the runtime environment. You don't know what is installed in the Cloud Functions runtime, and you can't assume that gcloud exists there.
The solution is to use Cloud Run. The behavior is very close to Cloud Functions: simply wrap your function in a webserver (I wrote my first article on that) and, in your container, install what you want, especially the gcloud SDK (you can also use a base image with the gcloud SDK already installed). This time you will be able to call system binaries, because you know they exist: you installed them! A sketch of such a wrapper follows below.
Anyway, be careful with your script execution: the container is immutable; you can't change files, binaries, stored files, and so on. I don't know the content of your script, but you aren't on a VM; you are still in a serverless environment with an ephemeral runtime.
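To make that concrete, here is a minimal sketch of such a wrapper, assuming the container is built from an image that already contains the gcloud SDK (for example google/cloud-sdk) and has Flask installed; the bucket path is the one from the question:

# Minimal Cloud Run wrapper: a webserver whose handler shells out to gsutil.
# gsutil exists here because we put it in the image, not because the platform
# guarantees it.
import os
import subprocess

from flask import Flask

app = Flask(__name__)

@app.route("/", methods=["GET", "POST"])
def run_script():
    result = subprocess.run(
        "gsutil cat gs://project/folder/script.sh | sh",
        shell=True, capture_output=True, text=True,
    )
    status = 200 if result.returncode == 0 else 500
    return (result.stdout + result.stderr, status)

if __name__ == "__main__":
    # Cloud Run passes the port to listen on in the PORT env variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))

For the scheduling part of the question, Cloud Scheduler can then invoke the service's URL on a cron schedule.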

How to run an initialization script when creating a VM with the Oracle OCI Java SDK

In the Oracle OCI web console, when you create a new compute instance, the console allows you to upload an initialization script to run some setup commands automatically.
How can I achieve the same thing with the OCI Java SDK?
When you create an OCI VM, you can set its metadata to include an initialization script. Take a look at the example of how to set instance metadata as part of instance launch here. To specify an initialization script as part of the metadata, set the key user_data in the metadata, with the value as described here (see the "user_data" section).
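As a sketch of what that looks like in code (shown with the Python SDK for brevity; the Java SDK's LaunchInstanceDetails builder takes the same metadata map), note that the user_data value must be base64-encoded cloud-init content; all OCIDs below are placeholders:

# Sketch: pass an initialization (cloud-init) script at launch via the
# "user_data" metadata key. The value must be base64-encoded.
import base64

import oci

CLOUD_INIT = """#!/bin/bash
yum install -y nginx
systemctl enable --now nginx
"""

config = oci.config.from_file()
compute = oci.core.ComputeClient(config)

details = oci.core.models.LaunchInstanceDetails(
    availability_domain="Uocm:PHX-AD-1",                 # placeholder
    compartment_id="ocid1.compartment.oc1..example",     # placeholder
    shape="VM.Standard2.1",
    image_id="ocid1.image.oc1..example",                 # placeholder
    subnet_id="ocid1.subnet.oc1..example",               # placeholder
    metadata={
        "user_data": base64.b64encode(CLOUD_INIT.encode()).decode(),
        "ssh_authorized_keys": "<public key here>",
    },
)
instance = compute.launch_instance(details).data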

Google Functions + Execute bash script on instance

I need to execute a bash script/command within a Google Compute Engine instance using Google Cloud Functions and get a response.
AWS has an agent called SSM that lets me do that with no hassle from Lambda; I did not find anything like it on Google Cloud.
On AWS, using a Node.js Lambda, I use the following example:
const AWS = require('aws-sdk');
const ssm = new AWS.SSM();

ssm.sendCommand({
  DocumentName: documentName,
  InstanceIds: [instanceId],
  TimeoutSeconds: 3600,
  Parameters: {
    commands: commands
  }
}).promise();
How can I achieve what I want on Google Cloud? Thank you.
The nature of Google Cloud Functions is that it is the most abstract of the serverless paradigms. It assumes that all you are providing is stateless logic to be executed in one of the supported languages. You aren't supposed to know or care how that logic is executed. Running bash (for example) assumes that your Cloud Function is running in a Unix-like environment where bash is available. While that is likely true, it isn't part of the core contract you have with the Cloud Functions environment.
An alternative solution I would suggest is to study Google's Cloud Run concept. Just like Cloud Functions, Cloud Run is serverless (you don't provision any servers and it scales to zero), but the distinction is that what is executed is a Docker container. Your code is executed inside that container's environment when called, and Google spins these containers up and down as needed to satisfy your incoming load. Since it is running in a container, you have 100% control over what your logic does, including running bash commands and providing any scripts or other environment needed to run them.
This is not possible. You can perform some actions from a Cloud Function, such as starting or stopping a VM, but you can't, say, list the directories within a VM: the Compute Engine API being used here only reaches the instance as a resource, not the operating system inside it.
The workaround would be to create a request handler in your VM so that it can be reached by the Cloud Function. Proper security should be implemented to avoid requests from anonymous callers. You might use the public IP to reach your VM.
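A minimal sketch of that workaround, assuming Flask runs on the VM and a shared-secret header as stand-in authentication (a real deployment should use something stronger, and the command is deliberately fixed rather than taken from the request):

# Small handler running on the VM itself, reachable by the Cloud Function.
import subprocess

from flask import Flask, abort, request

app = Flask(__name__)
SHARED_SECRET = "replace-me"  # placeholder; use real auth in production

@app.route("/run", methods=["POST"])
def run_command():
    if request.headers.get("X-Auth-Token") != SHARED_SECRET:
        abort(403)
    # Run a fixed, known command rather than arbitrary caller input.
    result = subprocess.run(["ls", "-l", "/var/log"],
                            capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)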
You can run a Bash script on Google Cloud Run with a Bash Docker image. As an example: https://github.com/sethvargo/cloud-run-bash-example
Then you can call this Cloud Run service from your other existing Google Cloud Functions, App Engine, etc.

I can't use any commands on gcloud sql

MacBook-Air-2:~ Owner$ gcloud sql instances describe ahaha-mysql
ERROR: (gcloud.sql.instances.describe) There was no instance found at projects/ahaha-20180621/instances/ahaha-mysql or you are not authorized to access it.
You will need to enable permission for the Compute Engine instance to access Cloud SQL:
1. Stop the instance.
2. Edit the instance and change the Cloud API access scopes.
3. Enable Cloud SQL.
4. Restart the instance.
5. Run gcloud auth login and authenticate as yourself by following the link (you might need to fix the URL in a text editor) and entering the verification code.
6. Run gcloud sql instances describe ahaha-mysql; this should now work.