CloudWatch Logs Insights aggregating data for application dashboard - json

I have a Lambda function that calls a ticketing API and returns a list of tickets with their attributes (open/resolved, assignee, etc.). I want to build a dashboard within CloudWatch to show this information, but I'm not sure I'm approaching the problem correctly.
For example, I'd want a visualization of "Open tickets which are assigned to Jason" that changes over time. I've tried using Logs Insights, but the response message is an array of tickets, which I haven't been able to query successfully. The logs are structured as:
{
  "level": "INFO",
  "location": "get_all_tickets:18",
  "message": [
    {
      "ticketId": "001",
      "status": "Open",
      "assignee": "Jason",
      "requester": "Paul",
      "createdAt": "2022-10-20 11:08:35.105000+00:00",
      "lastUpdatedAt": "2022-10-25 13:42:52.881000+00:00",
      "title": "Example Ticket 1"
    },
    {
      "ticketId": "002",
      "status": "Resolved",
      "assignee": "Jason",
      "requester": "John",
      "createdAt": "2022-10-20 11:09:35.105000+00:00",
      "lastUpdatedAt": "2022-10-25 13:42:52.881000+00:00",
      "title": "Example Ticket 2"
    }
  ],
  "timestamp": "2022-10-27 18:26:32,680+0000",
  "service": "ticket_metrics"
}
Within Logs Insights the fields are serialized as message.0.status, message.1.status, etc., but I haven't found a way to query and aggregate across them. Is there a way I can produce a metric, like the example above ("Open Tickets assigned to Jason"), from within Logs Insights?
I tried queries from the Logs Insights docs, but they didn't perform as expected or didn't apply to my use case.
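For completeness, this is the kind of query I have been experimenting with, run here through boto3 so it is reproducible (the log group name is made up, and the dotted fields only address one fixed array index at a time - which is exactly the limitation I am running into):

import time
import boto3

logs = boto3.client("logs")

# Dotted JSON fields like message.0.status address a single array index,
# so this only counts the first ticket of each log line.
query = """
fields @timestamp
| filter message.0.status = "Open" and message.0.assignee = "Jason"
| stats count(*) as openJasonTickets by bin(1h)
"""

start = logs.start_query(
    logGroupName="/aws/lambda/ticket_metrics",  # hypothetical log group name
    startTime=int(time.time()) - 24 * 3600,     # last 24 hours
    endTime=int(time.time()),
    queryString=query,
)

# Poll until the query finishes, then print the aggregated rows.
while True:
    result = logs.get_query_results(queryId=start["queryId"])
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)

for row in result.get("results", []):
    print({f["field"]: f["value"] for f in row})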

Related

FIWARE - Orion Context Broker as Context Provider

I'm having a hard time understanding how context providers work in the Orion Context Broker.
I followed the examples in the step-by-step guide written by Jason Fox. However, I still don't quite understand what happens in the background and how exactly the context broker creates the POST request from the registration. Here is what I am trying to do:
I have a WeatherStation that provides sensor data for a neighborhood.
{
  "id": "urn:ngsi-ld:WeatherStation:001",
  "type": "Device:WeatherStation",
  "temperature": {
    "type": "Number",
    "value": 20.5,
    "metadata": {}
  },
  "windspeed": {
    "type": "Number",
    "value": 60.0,
    "metadata": {}
  }
}
Now I would like the WeatherStation to act as a context provider for all buildings.
{
  "id": "urn:ngsi-ld:building:001",
  "type": "Building"
}
Here is the registration that I am trying to use.
{
  "id": null,
  "description": "Random Weather Conditions",
  "provider": {
    "http": {
      "url": "http://localhost:1026/v2"
    },
    "supportedForwardingMode": "all"
  },
  "dataProvided": {
    "entities": [
      {
        "id": "null",
        "idPattern": ".*",
        "type": "Building",
        "typePattern": null
      }
    ],
    "attrs": [
      "temperature",
      "windspeed"
    ],
    "expression": null
  },
  "status": "active",
  "expires": null,
  "forwardingInformation": null
}
The context broker accepts both entities and the registration without any error.
Since I have a multi-tenant setup, I use one fiware_service for the whole neighborhood, but every building would later have a separate fiware_servicepath. Hence, the WeatherStation has a different service path than the building, although I have also tried putting both on the same path.
For now I used the same headers for all entities.
{
  "fiware-service": "filip",
  "fiware-servicepath": "/testing"
}
Here is the log of the context broker (version: 3.1.0):
INFO#2021-09-23T19:17:17.944Z logTracing.cpp[212]: Request forwarded (regId: 614cd2b511c25270060d873a): POST http://localhost:1026/v2/op/query, request payload (87 bytes): {"entities":[{"idPattern":".*","type":"Building"}],"attrs":["temperature","windspeed"]}, response payload (2 bytes): [], response code: 200
INFO#2021-09-23T19:17:17.944Z logTracing.cpp[130]: Request received: POST /v2/op/query?options=normalized%2Ccount&limit=1000, request payload (55 bytes): {"entities": [{"idPattern": ".*", "type": "Building"}]}, response code: 200
The log says that the request is received and forwarded as expected. However, as I understand it, the forwarded query would simply point to the same Building entity again; hence it is effectively circular forwarding. I also cannot tell anything about the headers of the forwarded request.
I do not understand how the forwarded request from the building can actually query the weather station for information. When I query my building, I still only receive the entity without any properties of its own:
{
  "id": "urn:ngsi-ld:building:001",
  "type": "Building"
}
I also tried varying the url of the registration, but with no success.
Is this scenario actually possible with the current implementation? It would be very useful.
Is there any example for this that also includes the headers?
I know that I could simply use a reference attribute, but that would put more work on the user.
Thanks for any help on this.
It is messy, but you could achieve this via a subscription. Hold the weather station as a separate entity in the context broker and poll or push updates into that entity. The subscription would fire whenever the data changes and make two NGSI requests:
1. Find all entities which have a Relationship servicedBy=WeatherStationX
2. Run an upsert on all of those entities, adding a Property to each:
{
  "temperature": {
    "type": "Property",
    "value": 7,
    "unitCode": "CEL",
    "observedAt": "XXXXX",
    "providedBy": "WeatherStation1"
  }
}
Where observedAt comes either from the payload of the weather station or the notification timestamp.
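For illustration, here is a rough Python sketch of those two requests against the NGSI-v2 endpoints used elsewhere in this question (the property snippet above is NGSI-LD style, so this v2 translation, the servicedBy relationship name, and the tenant headers are assumptions, not a drop-in solution):

import requests

ORION = "http://localhost:1026/v2"
HEADERS = {
    "fiware-service": "filip",       # tenant headers as used in the question
    "fiware-servicepath": "/testing",
}

def on_weather_notification(temperature, observed_at):
    # 1. Find all Buildings holding a Relationship servicedBy=<this station>.
    resp = requests.get(
        f"{ORION}/entities",
        params={"type": "Building",
                "q": "servicedBy==urn:ngsi-ld:WeatherStation:001"},
        headers=HEADERS,
    )
    buildings = resp.json()

    # 2. Batch-append the measured Property to every matched entity.
    payload = {
        "actionType": "append",
        "entities": [
            {
                "id": b["id"],
                "type": b["type"],
                "temperature": {
                    "type": "Number",
                    "value": temperature,
                    "metadata": {
                        "observedAt": {"type": "DateTime", "value": observed_at},
                        "providedBy": {"type": "Text", "value": "WeatherStation1"},
                    },
                },
            }
            for b in buildings
        ],
    }
    requests.post(f"{ORION}/op/update", json=payload, headers=HEADERS)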
Within the existing IoT Agents, provisioning the link attribute allows a device to propagate measures to a second entity (e.g. this Thermometer device measures temperature for an associated Building entity):
{
  "entity_type": "Device",
  "resource": "/iot/d",
  "protocol": "PDI-IoTA-UltraLight",
  ..etc
  "attributes": [
    {
      "object_id": "l",
      "name": "temperature",
      "type": "Float",
      "metadata": {
        "unitCode": {"type": "Text", "value": "CEL"}
      }
    }
  ],
  "static_attributes": [
    {
      "name": "controlledAsset",
      "type": "Relationship",
      "value": "urn:ngsi-ld:Building:001",
      "link": {
        "attributes": ["temperature"],
        "name": "providedBy",
        "type": "Building"
      }
    }
  ]
}
At the moment the logic only links entities one-to-one, but it would be possible to raise a PR to check for an array and update multiple entities in an upsert - the relevant section of code is here

How do I test for Google Calendar events that are marked: "Automatically decline new and existing meetings"

Non-engineer here. I'm working on a program (using Google Sheets) that will help people in my office analyze their calendars.
The Google Calendar Service has methods that allow me to retrieve event start and end times, guest lists, and much more. Unfortunately, I haven't figured out how to tell the difference between a "Reminder" and an "Out of the office" event. They both show up as all-day events, but the former can be ignored while the latter causes that entire day to show up as "blocked" or "scheduled". I need to be able to tell them apart, but I'm not sure how.
Any suggestions?
MORE DETAIL:
I create a "Reminder" in Google Calendar by clicking just below the day, at the top. This gives me three choices: Event, Out of the office, and Appointment slots. When I am creating a Reminder for myself, I first name it (e.g. "Doctor's appointment at 2:00"), then I usually change the color to red so it's extra visible.
If I am going to be Out of the office, I choose that option. The option "Automatically decline new and existing meetings" is checked by default. I add a name like "Dev Conference" and save it. This causes the entire day to show in a light blue color, indicating that the entire day is blocked.
You can differentiate between a normal event and an "Out of Office" event because the latter does not include the transparency attribute and has its description attribute filled with the following value:
"This is an out-of-office event, which can only be edited in Google Calendar. Meetings during this time will be automatically declined."
You can test this by making an Events.list request and checking the results, for example:
Normal event:
{
  "kind": "calendar#event",
  "etag": "\"3167876463844000\"",
  "id": "1nqq8sg43po8itr8h0ebedfgbq",
  "status": "confirmed",
  "htmlLink": "https://www.google.com/calendar/event?eid=MW5xcThzZzQzcG84aXRyOGgwZWJlZGZnYnEgdGVzdGluYUBlZ3Mtc2J0MDExLmV1",
  "created": "2020-03-11T14:50:31.000Z",
  "updated": "2020-03-11T14:50:31.922Z",
  "summary": "test",
  "creator": {
    "email": "testina@egs-sbt011.eu",
    "self": true
  },
  "organizer": {
    "email": "testina@egs-sbt011.eu",
    "self": true
  },
  "start": {
    "date": "2020-03-16"
  },
  "end": {
    "date": "2020-03-17"
  },
  "transparency": "transparent",
  "iCalUID": "1nqq8sg43po8itr8h0ebedfgbq@google.com",
  "sequence": 0,
  "reminders": {
    "useDefault": false
  }
}
Out of Office event:
{
  "kind": "calendar#event",
  "etag": "\"3167876472386000\"",
  "id": "29g3lpl9hojb92bevvkvdccq6p",
  "status": "confirmed",
  "htmlLink": "https://www.google.com/calendar/event?eid=MjlnM2xwbDlob2piOTJiZXZ2a3ZkY2NxNnAgdGVzdGluYUBlZ3Mtc2J0MDExLmV1",
  "created": "2020-03-11T14:50:36.000Z",
  "updated": "2020-03-11T14:50:36.289Z",
  "summary": "test OOO",
  "description": "This is an out-of-office event, which can only be edited in Google Calendar. Meetings during this time will be automatically declined.",
  "creator": {
    "email": "testina@egs-sbt011.eu",
    "self": true
  },
  "organizer": {
    "email": "testina@egs-sbt011.eu",
    "self": true
  },
  "start": {
    "dateTime": "2020-03-17T00:00:00+01:00"
  },
  "end": {
    "dateTime": "2020-03-18T00:00:00+01:00"
  },
  "visibility": "public",
  "iCalUID": "29g3lpl9hojb92bevvkvdccq6p@google.com",
  "sequence": 0,
  "reminders": {
    "useDefault": false
  }
}
Regarding appointment slots: these are not supported by the Google Calendar API; you can +1 this open feature request to have them implemented.
For a while after the feature was introduced, there was no way to filter these via the API other than parsing the description. There is now an eventType key on events:
https://developers.google.com/calendar/api/v3/reference/events#eventType
Look for "eventType": "outOfOffice" events.
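A minimal sketch of that check with the Google API Python client (the question itself uses Apps Script, and the token.json credentials file is hypothetical, so treat this as an illustration of the eventType filter rather than a drop-in script):

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Hypothetical: OAuth credentials stored by the standard quickstart flow.
creds = Credentials.from_authorized_user_file("token.json")
service = build("calendar", "v3", credentials=creds)

events = service.events().list(
    calendarId="primary",
    singleEvents=True,
    timeMin="2020-03-01T00:00:00Z",
    timeMax="2020-04-01T00:00:00Z",
).execute()

ooo = [
    e for e in events.get("items", [])
    if e.get("eventType") == "outOfOffice"
    # Fallback for older payloads, per the answer above: no transparency
    # attribute plus the canned out-of-office description text.
    or ("transparency" not in e and "out-of-office" in e.get("description", ""))
]
for e in ooo:
    print(e.get("summary"), e.get("start"), e.get("end"))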

Retrieving "Description" or "Custom Attribute" fields using Autodesk Forge API

We are trying to retrieve the description or custom attribute field as shown in BIM360 Docs using Autodesk Forge API requests/commands.
We have tried the following requests to retrieve information about a specific file:
https://forge.autodesk.com/en/docs/data/v2/reference/http/projects-project_id-items-item_id-GET/
https://forge.autodesk.com/en/docs/data/v2/reference/http/ListItems/
We get a lot of information/data about our files, but we cannot see the Description field nor any Custom Attribute in the responses that we are getting.
"data": {
"type": "versions",
"id": "urn:adsk.wipprod:fs.file:vf.WKuhlYuyR8uK2WT8HE1bCQ?version=20",
"attributes": {
"name": "TESTING",
"displayName": "TESTING",
"createTime": "2019-07-29T09:37:33.0000000Z",
"createUserId": "*****",
"createUserName": "TESTING",
"lastModifiedTime": "2019-08-05T08:27:10.0000000Z",
"lastModifiedUserId": "*****",
"lastModifiedUserName": "TESTING",
"versionNumber": 20,
"storageSize": 27020,
"fileType": "xlsx",
"extension": {
"type": "versions:autodesk.bim360:File",
"version": "1.0",
"schema": {
"href": "https://developer.api.autodesk.com/schema/v1/versions/versions:autodesk.bim360:File-1.0"
},
"data": {
"processState": "PROCESSING_COMPLETE",
"extractionState": "UNSUPPORTED",
"splittingState": "NOT_SPLIT",
"revisionDisplayLabel": "20",
"sourceFileName": "TESTING"
}
}
},
...
}
Among all of these fields, we expected the fields "Description" or "Custom Attributes" to appear as well (as they are shown in BIM 360 Docs). Is it possible to retrieve these fields using API requests?
At the moment it is not possible to retrieve Custom Attribute information from BIM 360 Docs, but a new API to get this information back is under development and in private beta testing. Please check back with us in the near future, since this will become available soon. Thank you for understanding.

How to add date/time exception to recurring Google calendar event

I have code that creates Google Calendar entries (recurring events) using the Google Calendar API. This is working fine. Now I am trying to add exceptions to certain instances. I've got it working for deleted instances (by using the EXDATE parameter), but I'm having trouble changing the date/time of an instance. I am following the instructions found here:
https://developers.google.com/google-apps/calendar/recurringevents
After I create the basic recurring event, I retrieve a list of all instances. For testing, I am pulling out the entire section of json that applies to the instance I want to change, for example:
{
  "kind": "calendar#event",
  "etag": "\"3032281375946000\"",
  "id": "_69j3ge9iccp68or1c8p3ac1nccpm8p356gpj8pb66koj2c32c5h0_20180130T203000Z",
  "status": "confirmed",
  "htmlLink": "https://www.google.com/calendar/event?eid=XzY5ajNnZTlpY2NwNjhvcjFjOHAzYWMxbmNjcG04cDM1NmdwajhwYjY2a29qMmMzMmM1aDBfMjAxODAxMzBUMjAzMDAwWiBqbWNrYXk5MzUxQG0",
  "created": "2018-01-16T20:08:56.000Z",
  "updated": "2018-01-17T00:13:59.277Z",
  "summary": "Recurring test with deletes and changes",
  "description": "\n",
  "location": "SEattle",
  "creator": {
    "email": "jmckay9351@gmail.com",
    "displayName": "Jeffrey McKay",
    "self": true
  },
  "organizer": {
    "email": "jmckay9351@gmail.com",
    "displayName": "Jeffrey McKay",
    "self": true
  },
  "start": {
    "dateTime": "2018-01-30T13:30:00-08:00"
  },
  "end": {
    "dateTime": "2018-01-30T14:00:00-08:00"
  },
  "recurringEventId": "_69j3ge9iccp68or1c8p3ac1nccpm8p356gpj8pb66koj2c32c5h0",
  "originalStartTime": {
    "dateTime": "2018-01-30T12:30:00-08:00"
  },
  "iCalUID": "2f892c2dcab2507c3dde434ef5110bab",
  "sequence": 0,
  "reminders": {
    "useDefault": false,
    "overrides": [{
      "method": "popup",
      "minutes": 15
    }]
  }
}
Then I do an HTTP PUT of this data to this URL:
https://www.googleapis.com/calendar/v3/calendars/jmckay9351@gmail.com/events/_69j3ge9iccp68or1c8p3ac1nccpm8p356gpj8pb66koj2c32c5h0_20180130T203000Z
What I get back is an HTTP 400 error, and the error message in the returned JSON is "Missing end time". I get the same response regardless of how much or how little data I feed into the PUT. Any idea what I'm doing wrong here?
OK, I've got this working now - I don't understand what was wrong with my libcurl PUT code, but I changed it to be a POST with a custom request of "PUT" and that got the data uploaded correctly.
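For reference, a sketch of the same instance update in Python with the requests library (the access token is a placeholder); since requests sends the body with Content-Type: application/json, a "Missing end time" response here would genuinely mean the end field is absent, rather than that the body never reached the API - which is consistent with the libcurl workaround above:

import requests

CAL_ID = "jmckay9351@gmail.com"
EVENT_ID = "_69j3ge9iccp68or1c8p3ac1nccpm8p356gpj8pb66koj2c32c5h0_20180130T203000Z"
TOKEN = "ya29...."  # placeholder OAuth2 access token

# Move this one occurrence an hour later; originalStartTime identifies which
# instance of the recurring series is being overridden.
body = {
    "start": {"dateTime": "2018-01-30T13:30:00-08:00"},
    "end": {"dateTime": "2018-01-30T14:00:00-08:00"},
    "originalStartTime": {"dateTime": "2018-01-30T12:30:00-08:00"},
}

resp = requests.put(
    f"https://www.googleapis.com/calendar/v3/calendars/{CAL_ID}/events/{EVENT_ID}",
    json=body,  # requests sets the Content-Type: application/json header
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code, resp.json())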

Data Factory: AzureSQL in- and output for pipeline activity type AzureMLBatchExecution

In Azure Data Factory, I'm trying to call an Azure Machine Learning model from a Data Factory pipeline. I want to use an Azure SQL table as input and another Azure SQL table for the output.
First I deployed a Machine Learning (classic) web service. Then I created an Azure Data Factory pipeline, using a linked service (type 'AzureML', with the Request URI and API key of the ML web service) and an input and an output dataset (type 'AzureSqlTable').
Deploying and provisioning succeeded. The pipeline starts as scheduled, but keeps 'Running' without any result, and the pipeline activity is not shown in the Monitor & Manage Activity Windows.
On different sites and in tutorials, I only find JSON scripts using the activity type 'AzureMLBatchExecution' with Blob in- and outputs. I want to use Azure SQL in- and output, but I can't get this working.
Can someone provide a sample JSON script or tell me what's possibly wrong with the code below?
Thanks!
{
  "name": "Predictive_ML_Pipeline",
  "properties": {
    "description": "use MyAzureML model",
    "activities": [
      {
        "type": "AzureMLBatchExecution",
        "typeProperties": {},
        "inputs": [
          {
            "name": "AzureSQLDataset_ML_Input"
          }
        ],
        "outputs": [
          {
            "name": "AzureSQLDataset_ML_Output"
          }
        ],
        "policy": {
          "timeout": "02:00:00",
          "concurrency": 3,
          "executionPriorityOrder": "NewestFirst",
          "retry": 1
        },
        "scheduler": {
          "frequency": "Week",
          "interval": 1
        },
        "name": "My_ML_Activity",
        "description": "prediction analysis on ML batch input",
        "linkedServiceName": "AzureMLLinkedService"
      }
    ],
    "start": "2017-04-04T09:00:00Z",
    "end": "2017-04-04T18:00:00Z",
    "isPaused": false,
    "hubName": "myml_hub",
    "pipelineMode": "Scheduled"
  }
}
With a little help from a Microsoft technician, I've got this working. The JSON script above only needed changes in the schedule section:
"start": "2017-04-01T08:45:00Z",
"end": "2017-04-09T18:00:00Z",
A pipeline is active only between its start time and end time. Because the scheduler is set to weekly, the pipeline is triggered at the start of the week, so that trigger date must fall between the start and end dates. For more details about scheduling, see: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution
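A small illustration of that rule in Python, assuming weekly slices align to Monday 00:00 (the exact alignment in Data Factory v1 may differ; this only demonstrates why the original one-day window never fired):

from datetime import datetime, timedelta

def week_start(dt):
    # Midnight on the Monday of dt's week (assumed slice boundary).
    monday = dt - timedelta(days=dt.weekday())
    return monday.replace(hour=0, minute=0, second=0, microsecond=0)

def weekly_slice_fires(start, end):
    # A weekly slice fires only if a week boundary falls inside the window.
    boundary = week_start(start)
    if boundary < start:
        boundary += timedelta(days=7)
    return boundary <= end

# Original window: a single Tuesday, 09:00-18:00 -> no week boundary inside.
print(weekly_slice_fires(datetime(2017, 4, 4, 9), datetime(2017, 4, 4, 18)))      # False

# Corrected window spans Monday 2017-04-03 -> the weekly slice fires.
print(weekly_slice_fires(datetime(2017, 4, 1, 8, 45), datetime(2017, 4, 9, 18)))  # True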
The Azure SQL Input dataset should look like this:
{
  "name": "AzureSQLDataset_ML_Input",
  "properties": {
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "SRC_SQL_Azure",
    "typeProperties": {
      "tableName": "dbo.Azure_ML_Input"
    },
    "availability": {
      "frequency": "Week",
      "interval": 1
    },
    "external": true,
    "policy": {
      "externalData": {
        "retryInterval": "00:01:00",
        "retryTimeout": "00:10:00",
        "maximumRetry": 3
      }
    }
  }
}
I added the external and policy properties to this dataset (see the script above), and after that it worked.