How to get logs of Eventarc events - google-cloud-functions

In the documentation, the Firebase image-resize extension has this piece of code:

const {onCustomEventPublished} = require("firebase-functions/v2/eventarc");
const logger = require("firebase-functions/logger");
const {initializeApp} = require("firebase-admin/app");
const {getFirestore} = require("firebase-admin/firestore");

initializeApp();

exports.onimageresized = onCustomEventPublished(
    "firebase.extensions.storage-resize-images.v1.complete",
    (event) => {
      logger.info("Received image resize completed event", event);
      // For example, write resized image details into Firestore.
      return getFirestore()
          .collection("images")
          .doc(event.subject.replace("/", "_")) // original file path
          .set(event.data); // resized images paths and sizes
    });
But when I check for that log in the function logs, there is nothing, only a create operation log (from when the function was deployed). The weird thing is that the function still writes to Firestore; I actually thought it was not firing at all.
Where are the Eventarc logs read from? I cannot seem to find them.

The following types of audit logs are available for Eventarc:

Admin Activity audit logs: include "admin write" operations that write metadata or configuration information. You can't disable Admin Activity audit logs.

Data Access audit logs: include "admin read" operations that read metadata or configuration information, as well as "data read" and "data write" operations that read or write user-provided data. To receive Data Access audit logs, you must explicitly enable them.

Eventarc audit logs use the service name eventarc.googleapis.com and the resource type audited_resource for all audit logs.
You can view audit logs in Cloud Logging by using the Google Cloud console, the Google Cloud CLI, or the Logging API.
To view them directly in the Cloud console, follow these steps:
1. In the Google Cloud console, go to the Logging > Logs Explorer page.
2. Select an existing Cloud project, folder, or organization.
3. In the Query builder pane, under Resource type, select the Google Cloud resource whose audit logs you want to see, and under Log name, select the audit log type that you want to see.
If you're experiencing issues when trying to view logs in the Logs Explorer, see the troubleshooting information.
Also check the documentation for functions calling Eventarc and the supported events.
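
If you prefer the Logging API route, here is a minimal sketch using the google-cloud-logging Python client to pull Eventarc audit log entries with the service name and resource type given above; the project ID and time window are placeholders to adapt:

from datetime import datetime, timedelta, timezone
from google.cloud import logging

client = logging.Client(project="my-project-id")  # placeholder project ID

# Audit logs written by Eventarc use this service name and resource type.
since = (datetime.now(timezone.utc) - timedelta(days=1)).isoformat()
log_filter = (
    'resource.type="audited_resource" '
    'AND protoPayload.serviceName="eventarc.googleapis.com" '
    f'AND timestamp>="{since}"'
)

for entry in client.list_entries(filter_=log_filter):
    print(entry.timestamp, entry.log_name)
    print(entry.payload)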

Related

How can I get Grafana to read a custom metric?

Currently my Grafana Dashboard reads system info from the Grafana agent that runs on my machine.
I have a script that executes hourly to do some action. If the script executes successfully then it can output that success to an XML file or create a file called "success.txt". If the script fails then it could create a file "fail.txt".
How can I get Grafana to check for the presence of a file or a file's content to get it to report back to the dashboard the status, basically a binary result, of a custom metric "Hourly script job" such as success or fail?
I've searched the web and found the any-json-to-metrics exporter, but I'm not sure that'll work. I'd like to avoid hosting a web server that exposes endpoints; I'd like the Grafana agent to pick up the custom metrics.
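
One possibility (an assumption on my part, not something from this thread) is the node_exporter textfile collector, which the Grafana agent can expose: the hourly script writes a small Prometheus-format file into a directory the agent scrapes, so no extra web server is needed. A minimal sketch of the writing side, with the directory path and metric name as placeholders:

import os
import tempfile

# Hypothetical directory configured as the agent's textfile collector path.
TEXTFILE_DIR = "/var/lib/grafana-agent/textfile"

def report_status(success: bool) -> None:
    """Write a 0/1 gauge in Prometheus textfile format, atomically."""
    body = (
        "# HELP hourly_script_success 1 if the last hourly run succeeded.\n"
        "# TYPE hourly_script_success gauge\n"
        f"hourly_script_success {1 if success else 0}\n"
    )
    fd, tmp_path = tempfile.mkstemp(dir=TEXTFILE_DIR)
    with os.fdopen(fd, "w") as f:
        f.write(body)
    # Rename is atomic, so the collector never sees a half-written file.
    os.replace(tmp_path, os.path.join(TEXTFILE_DIR, "hourly_script.prom"))

report_status(success=True)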

Permission denied when scheduling Vertex Pipelines

I wish to schedule a Vertex Pipeline and, for now, deploy it from my local machine.
I have defined my pipeline, and it runs well when I deploy it once using create_run_from_job_spec on AIPlatformClient.
When trying to schedule it with create_schedule_from_job_spec, the Cloud Scheduler object is created correctly, with an HTTP endpoint pointing to a Cloud Function. But when the scheduler runs, it fails with a Permission denied error. I have tried several service accounts with owner permissions on the project.
Do you know what could have gone wrong?
Since AIPlatformClient from Kubeflow Pipelines raises a deprecation warning, I also want to use PipelineJob from google.cloud.aiplatform, but I can't see any direct way to schedule the pipeline execution.
I've spent about 3 hours banging my head on this too. In my case, what seemed to fix it was either:
disabling and re-enabling the Cloud Scheduler API. Why did I do this? There is supposed to be a service account called service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com. If it is missing, re-enabling the API might recreate it.
for older projects there is an additional step: https://cloud.google.com/scheduler/docs/http-target-auth#add
Simpler explanations include not doing some of the following steps:
create a service account for the scheduler job, and grant it the Cloud Functions Invoker role during creation
use this service account (see create_schedule_from_job_spec below)
find the (sneaky) Cloud Function that was created for you, it will be called something like 'templated_http_request-v1', and add your service account as a Cloud Functions Invoker on it
response = client.create_schedule_from_job_spec(
    job_spec_path=pipeline_spec,
    schedule="*/15 * * * *",
    time_zone="Europe/London",
    parameter_values={},
    cloud_scheduler_service_account="<your-service-account>@<project_id>.iam.gserviceaccount.com"
)
If you are still stuck, it is also useful to run gcloud scheduler jobs describe <pipeline-name>, as it really helps to understand what the scheduler is doing. You'll see the Cloud Function URL and the POST payload, which is base64-encoded and contains the pipeline YAML, and you'll see that it uses OIDC with a service account for security. It is also useful to view the code of the 'templated_http_request-v1' Cloud Function (sneakily created!). I was able to invoke the Cloud Function from Postman using the payload obtained from the scheduler job.
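
For the PipelineJob part of the question: more recent releases of google-cloud-aiplatform added a create_schedule method on PipelineJob, which avoids the Cloud Scheduler plus Cloud Function setup entirely. A minimal sketch, assuming a recent SDK version and placeholder project, location, and path names:

from google.cloud import aiplatform

aiplatform.init(project="my-project-id", location="europe-west1")  # placeholders

job = aiplatform.PipelineJob(
    display_name="my-pipeline",
    template_path="pipeline.yaml",  # compiled pipeline spec
    pipeline_root="gs://my-bucket/pipeline-root",
)

# Available in recent SDK releases; runs the pipeline every 15 minutes.
schedule = job.create_schedule(
    display_name="my-pipeline-schedule",
    cron="*/15 * * * *",
    max_concurrent_run_count=1,
)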

Does Google Drive Activity API expose view as an action?

I've managed to get the Google Drive Activity API to run against a Google Doc in my personal drive (of a corporate paid account) using their example (after correcting for Python 2 syntax).
However, it shows only edit and comment as primary actions in that history; I don't see any actions saying just view.
Does that mean that there is no action recorded for document views? Or are they simply recorded as comment or something else, since the document URL always seems to redirect to .../edit even if I only grant View privileges?
There is no view action detail in the Drive Activity API.
The edit and comment action details cannot be used to get view action history. I tried viewing a document and then fetching all its activity, and no new record appeared (neither view, edit, nor comment).
If you want to get the view history of a document, here are some of your options:
Use the Drive API files.get method; it returns a file resource which contains a viewedByMe flag and a viewedByMeTime datetime parameter.
It will only show the last time you viewed the file. If the file is shared, you cannot use this to get the most recent time the file was viewed by another user.
If you have an admin account, you can use activities.list in the Admin SDK Reports API to access Drive audit logs, which contain view log activities and other events. See Drive Audit Activity Events for a list of available events.
Sample activities.list request parameters:
userKey: all
applicationName: drive
eventName: view
filters: doc_id==13NgKy87BggedOXnmkymygTyEh0xxxxxxxx
NOTE:
If the file was viewed by a user outside your organization, the email address will not be available (the user is anonymous).
Drive audit logs have a data retention time of 6 months; you can access Drive audit log data this far back.
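
A minimal sketch of both options using the google-api-python-client; `creds` is assumed to be an already-authorized credentials object, and the file ID is the placeholder from above:

from googleapiclient.discovery import build

FILE_ID = "13NgKy87BggedOXnmkymygTyEh0xxxxxxxx"  # placeholder

# Option 1: Drive API files.get, your own last-view time only.
drive = build("drive", "v3", credentials=creds)
meta = drive.files().get(
    fileId=FILE_ID, fields="viewedByMe, viewedByMeTime"
).execute()
print(meta.get("viewedByMe"), meta.get("viewedByMeTime"))

# Option 2: Admin SDK Reports API activities.list (admin account required).
reports = build("admin", "reports_v1", credentials=creds)
activities = reports.activities().list(
    userKey="all",
    applicationName="drive",
    eventName="view",
    filters=f"doc_id=={FILE_ID}",
).execute()
for item in activities.get("items", []):
    print(item["id"]["time"], item["actor"].get("email", "anonymous"))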
References:
View user Google Drive file activity
Data retention and lag times

How can I get logs for Cloud Functions that violate a Stackdriver alerting policy?

I am using Cloud Functions on Google Cloud Platform. I have set up a Stackdriver alerting policy to send me notifications when those functions exceed an execution time threshold.
I would like to specifically get the logs for function instances that violate this policy. But when I click on the "Logs" link on the Policy Violation page of the Stackdriver user interface, it shows me all the logs for that function.
How can I filter Cloud Function logs to only get logs of instances which violate a Stackdriver alerting policy?
Update:
To clarify, my intention is to get the text logs of instances that violate a policy, rather than a summary metric.
If you want to filter Cloud Function logs down to the instances that violate a Stackdriver alerting policy, you need to create a custom logs-based metric (Stackdriver > Logs-based metrics > Create Metric) that extracts the execution time of a particular function.
You can find more detailed information in this article, Getting Google Cloud Functions times in Stackdriver.
However, the custom metric only parses the logs and gives Stackdriver a number you can put in a chart. After that, you can set an alert for that chart; the alert itself will not link back to individual log lines.
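
To pull the underlying text logs yourself, here is a minimal sketch using the google-cloud-logging Python client; the project ID, function name, and threshold are placeholders, and the exact wording of the per-invocation log line ("Function execution took N ms") is an assumption to verify against your own logs:

import re
from google.cloud import logging

client = logging.Client(project="my-project-id")  # placeholder project ID

# Cloud Functions writes one "Function execution took N ms" line per invocation.
log_filter = (
    'resource.type="cloud_function" '
    'AND resource.labels.function_name="my-function" '
    'AND textPayload:"Function execution took"'
)

THRESHOLD_MS = 2000  # same threshold as the alerting policy

for entry in client.list_entries(filter_=log_filter):
    match = re.search(r"Function execution took (\d+) ms", entry.payload)
    if match and int(match.group(1)) > THRESHOLD_MS:
        # These are the invocations that would violate the policy; the
        # execution_id label can be used to fetch that instance's other logs.
        print(entry.timestamp, entry.labels.get("execution_id"), entry.payload)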

How to log the request and response in API Management

How can I log the response size, request size, and error message from an API Management instance?
And if that is possible, how can I fetch the data afterwards?
There is a built-in log-to-eventhub policy that you can use to send basically any information that exists on the context object (meaning the request/response plus a bit more) to an Event Hub. From there you can use any regular method for processing the events.
See How to log events to Azure Event Hubs in Azure API Management.
Alternatively, use Azure Monitor to configure diagnostic logs from API Management to either Storage, Event Hub, or Log Analytics. These logs have the data you are looking for.
I would start with the free tier of Log Analytics for easy querying, dashboards, and alerting. Refer to this.
For more custom logging, you can use the log-to-eventhub policy. Refer to this blog.
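
Once events land in the Event Hub, here is a minimal sketch for fetching them with the azure-eventhub Python package; the connection string and hub name are placeholders:

from azure.eventhub import EventHubConsumerClient

# Placeholders: your Event Hub namespace connection string and hub name.
client = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;...",
    consumer_group="$Default",
    eventhub_name="apim-logs",
)

def on_event(partition_context, event):
    # Each event body is whatever the log-to-eventhub policy serialized,
    # e.g. a JSON string with request/response sizes and error details.
    print(event.body_as_str())

with client:
    client.receive(on_event=on_event, starting_position="-1")  # from beginning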