Cygnus Mongo sink and metadata storing - FIWARE

I tried to store this entity with metadata, but it seems like Cygnus is only storing the entity data; no metadata was stored in the database.
Here is how I update my entity using NGSI v1 updateContext:
{
"contextElements": [
{
"type": "dummyMeta",
"isPattern": "false",
"id": "dummyMeta",
"attributes": [
{
"name": "dummy",
"type": "float",
"value": "26.5",
"metadatas": [
{
"name": "accuracy",
"type": "float",
"value": "1"
}
]
}
]
}
],
"updateAction": "APPEND"
}
Here is the subscription payload:
{
"entities": [
{
"id": "dummyMeta",
"type": "dummyMeta",
"isPattern": "false"
}
],
"attributes": [
"dummy"
]
,
"reference": "http://cygnusserver.ddns.net:5050/notify",
"duration":"P1M",
"notifyConditions": [
{
"type": "ONCHANGE",
"condValues": [
"dummy"
]
}
],
"throttling": "PT5S"
}
Here is how it is stored in the database:
> db['kura_/egmmqtt_dummyMeta_dummyMeta'].find().sort({$natural:-1})
{ "_id" : ObjectId("57c929d8902531258a3c6ed0"), "recvTime" : ISODate("2016-09-02T07:27:18.331Z"), "attrName" : "dummy", "attrType" : "float", "attrValue" : "26.5" }
{ "_id" : ObjectId("57c92990902531258a3c6ecc"), "recvTime" : ISODate("2016-09-02T07:26:04.148Z"), "attrName" : "dummy", "attrType" : "float", "attrValue" : "26.5" }
What am I missing to be able to store the whole information (data and metadata) about the attribute?
Thanks in advance for your help!

The MongoDB sink does not save the metadata by design. This is a requirement of our internal products currently using Cygnus.
That being said, I think it should not be very difficult to modify the code on your side in order to save the metadata.
Alternatively, I can create an issue about optionally saving metadata when enabled through a configuration parameter. Nevertheless, I cannot commit to an implementation date.
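Until such an option exists, the mapping itself is small enough to prototype outside Cygnus. A minimal sketch in Python (the function name and the extra `attrMd` field are illustrative, not Cygnus's actual code) of turning one NGSI v1 attribute, metadata included, into a document shaped like the rows shown above:

```python
from datetime import datetime, timezone

def attribute_to_doc(attr, recv_time=None):
    """Map one NGSI v1 attribute to a Mongo-style document shaped like the
    ones Cygnus writes, plus a hypothetical attrMd field for the metadata."""
    return {
        "recvTime": recv_time or datetime.now(timezone.utc),
        "attrName": attr["name"],
        "attrType": attr["type"],
        "attrValue": attr["value"],
        # Extra field Cygnus does not write today: the raw metadata list.
        "attrMd": attr.get("metadatas", []),
    }

attr = {
    "name": "dummy",
    "type": "float",
    "value": "26.5",
    "metadatas": [{"name": "accuracy", "type": "float", "value": "1"}],
}
doc = attribute_to_doc(attr)
print(doc["attrMd"])  # [{'name': 'accuracy', 'type': 'float', 'value': '1'}]
```

A custom receiver (or a patched sink) could then insert `doc` with pymongo instead of the metadata-less document Cygnus currently writes.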

Related

Unable to extract values from a dynamic list of lists in a JSON response in Power Automate

In Power Automate, from the following JSON response, which is obtained from azure-application-insights, I wish to extract each of the sub-lists in the “rows” list. The number of sub-lists varies depending on the responses from the insights.
{
"tables": [
{
"name": "PrimaryResult",
"columns": [
{
"name": "Name",
"type": "dynamic"
},
{
"name": "Question",
"type": "dynamic"
},
{
"name": "Answer",
"type": "dynamic"
}
],
"rows": [
[
"John",
"when are you going",
"January’ 2022?"
],
[
"Albert",
"who will add deal co-owners for a deal?",
"The authorizers"
],
[
"Mohan",
"co-deal owner name can be added by whom?",
"The approvers”
]
]
}
]
}
I wish to dynamically extract the contents of all the sub-lists, and use each of the extracted outputs separately.
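The extraction itself is just two levels of indexing: take `tables[0]`, then iterate its `rows`. In Python terms (in Power Automate the equivalent would be an Apply to each over an expression like `body('HTTP')?['tables']?[0]?['rows']`, where the action name `HTTP` is an assumption):

```python
import json

# Trimmed copy of the response from the question.
response = json.loads("""
{
  "tables": [
    {
      "name": "PrimaryResult",
      "columns": [
        {"name": "Name", "type": "dynamic"},
        {"name": "Question", "type": "dynamic"},
        {"name": "Answer", "type": "dynamic"}
      ],
      "rows": [
        ["John", "when are you going", "January 2022?"],
        ["Albert", "who will add deal co-owners for a deal?", "The authorizers"]
      ]
    }
  ]
}
""")

# Pair each positional row value with its column name.
columns = [c["name"] for c in response["tables"][0]["columns"]]
records = [dict(zip(columns, row)) for row in response["tables"][0]["rows"]]
for record in records:
    print(record["Name"], "->", record["Answer"])
```

Each `record` is then a self-describing object that can be used independently, regardless of how many sub-lists the response contains.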

How to search a Claim using an extension field

I have a Claim payload, in which I have added an extension block (not sure where the url came from):
"extension" : [{
"url" : "http://hl7.org/fhir/StructureDefinition/iso-21090-EN-use",
"valueString" : "MAPD"
}],
I want to search this Claim record using the extension, but don't know how to do it.
I tried a GET request to https://<azure_fhir_server>/Claim?extension=MAPD but it says:
{
"severity": "warning",
"code": "not-supported",
"diagnostics": "The search parameter 'extension' is not supported for resource type 'Claim'."
}
=====================
EDIT:
As suggested by @Nik Klassen, I posted the following payload to /SearchParameter:
{
"resourceType" : "SearchParameter",
"id": "b072f860-7ecd-4d73-a490-74acd673f8d2",
"name": "extensionValueString",
"status": "active",
"url" : "http://hl7.org/fhir/SearchParameter/extension-valuestring",
"description": "Returns a Claim with extension.valueString matching the specified one in request.",
"code" : "lob",
"base" : [
"Claim"
],
"type" : "string",
"expression" : "Claim.extension.where(url ='http://hl7.org/fhir/SearchParameter/extension-valuestring').extension.value.string"
}
I also did the $reindex on Claim, but couldn't find the column lob (the $reindex response is below):
{
"resourceType": "Parameters",
"id": "ee8786d2-616a-4b81-8f6a-8089591b1225",
"meta": {
"versionId": "1"
},
"parameter": [
{
"name": "_id",
"valueString": "28e808d6-e420-4a33-bb0b-7cd325c8c169"
},
{
"name": "status",
"valueString": "http://hl7.org/fhir/fm-status|active"
},
{
"name": "priority",
"valueString": "http://terminology.hl7.org/CodeSystem/processpriority|normal"
},
{
"name": "facility",
"valueString": "Location/Location"
},
{
"name": "patient",
"valueString": "Patient/f8d8477c-1ef4-4878-abed-51e514bfd91f"
},
{
"name": "encounter",
"valueString": "Encounter/67062d00-2531-3ebd-8558-1de2fd3e5aab"
},
{
"name": "use",
"valueString": "http://hl7.org/fhir/claim-use|claim"
},
{
"name": "identifier",
"valueString": "TEST"
},
{
"name": "_lastUpdated",
"valueString": "2021-08-25T07:39:15.3050000+00:00"
},
{
"name": "created",
"valueString": "1957-04-12T21:23:35+05:30"
}
]
}
I read somewhere that I need to create a StructureDefinition, but don't know how to do that.
Basically, I want to add a field "LOB" as an extension to all my resources, and search them using GET: https://fhir_server/{resource}?lob=<value>
By default you can only search on fields that are part of the FHIR spec. These are listed in a "Search Parameters" section on the page for each resource type, e.g. https://hl7.org/fhir/claim.html#search. To search on extensions you will need to create a custom SearchParameter (see https://learn.microsoft.com/en-us/azure/healthcare-apis/fhir/how-to-do-custom-search), e.g.
POST {{FHIR_URL}}/SearchParameter
{
"resourceType" : "SearchParameter",
"id" : "iso-21090-EN-use",
"url" : "ttp://hl7.org/fhir/SearchParameter/iso-21090-EN-use",
... some required fields ...
"code" : "iso-use",
"base" : [
"Claim"
],
"type" : "token",
"expression" : "Claim.extension.where(url = 'http://hl7.org/fhir/StructureDefinition/iso-21090-EN-use').value.string"
}
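To sanity-check what such an expression selects before reindexing, the `extension.where(url = '...').value` walk can be mimicked in plain Python (this only illustrates the FHIRPath semantics, it is not a FHIR library):

```python
def extension_values(resource, url):
    """Collect valueString entries of extensions matching the given url,
    mimicking Claim.extension.where(url = '...').value.string."""
    return [
        ext.get("valueString")
        for ext in resource.get("extension", [])
        if ext.get("url") == url
    ]

claim = {
    "resourceType": "Claim",
    "extension": [{
        "url": "http://hl7.org/fhir/StructureDefinition/iso-21090-EN-use",
        "valueString": "MAPD",
    }],
}
values = extension_values(
    claim, "http://hl7.org/fhir/StructureDefinition/iso-21090-EN-use")
print(values)  # ['MAPD']
```

Once the SearchParameter is created and the Claim resources are reindexed, the query uses the SearchParameter's code, e.g. GET {{FHIR_URL}}/Claim?iso-use=MAPD.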

Create MongoDB collection with standard JSON schema

I want to create a MongoDB collection using a JSON schema file.
Suppose the JSON file address.schema.json contains an address information schema (this file is one of json-schema.org's examples):
{
"$id": "https://example.com/address.schema.json",
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "An address similar to http://microformats.org/wiki/h-card",
"type": "object",
"properties": {
"post-office-box": {
"type": "string"
},
"extended-address": {
"type": "string"
},
"street-address": {
"type": "string"
},
"locality": {
"type": "string"
},
"region": {
"type": "string"
},
"postal-code": {
"type": "string"
},
"country-name": {
"type": "string"
}
},
"required": [ "locality", "region", "country-name" ],
"dependencies": {
"post-office-box": [ "street-address" ],
"extended-address": [ "street-address" ]
}
}
What is the MongoDB command, such as mongoimport or db.createCollection, to create a MongoDB collection using the above schema?
It would be nice if I could use the file directly in MongoDB without having to change the file format manually.
I wonder, if the JSON Schema format is a standard one, why do I need to change it to adapt it for MongoDB? Why does MongoDB not have this functionality built-in?
You can create it via the createCollection command or shell helper:
db.createCollection(
"mycollection",
{validator:{$jsonSchema:{
"description": "An address similar to http://microformats.org/wiki/h-card",
"type": "object",
"properties": {
"post-office-box": { "type": "string" },
"extended-address": { "type": "string" },
"street-address": { "type": "string" },
"locality": { "type": "string" },
"region": { "type": "string" },
"postal-code": { "type": "string" },
"country-name": { "type": "string" }
},
"required": [ "locality", "region", "country-name" ],
"dependencies": {
"post-office-box": [ "street-address" ],
"extended-address": [ "street-address" ]
}
}}})
You only need to specify bsonType instead of type if you want to use a type that exists in BSON but not in generic JSON Schema. You do have to remove the $id and $schema lines, as those are not supported by MongoDB's JSON Schema support (documented here).
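That clean-up step can be scripted rather than done by hand. A sketch in Python (the collection name and the pymongo call are illustrative; the call is commented out since it needs a running server):

```python
UNSUPPORTED = {"$id", "$schema"}  # top-level keywords MongoDB's $jsonSchema rejects

def to_validator(schema):
    """Drop the keywords MongoDB does not accept and wrap the rest in the
    options dict expected by create_collection."""
    cleaned = {k: v for k, v in schema.items() if k not in UNSUPPORTED}
    return {"validator": {"$jsonSchema": cleaned}}

# Normally the schema would come from the file:
#   import json; schema = json.load(open("address.schema.json"))
schema = {
    "$id": "https://example.com/address.schema.json",
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "required": ["locality", "region", "country-name"],
}
options = to_validator(schema)
# With pymongo:
#   from pymongo import MongoClient
#   MongoClient().mydb.create_collection("mycollection", **options)
print(sorted(options["validator"]["$jsonSchema"]))  # ['required', 'type']
```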
The only option you can use is adding a $jsonSchema validator during collection creation: https://docs.mongodb.com/manual/reference/operator/query/jsonSchema/#document-validator. This means that any document you insert or update in the collection will have to match the provided schema.

Orion JSON Bad Request

I'm currently trying to subscribe Orion to Cosmos. All data sent to Orion is being updated without any issue, but when posting to http://xxx.xxx.xx.xx:1026/v1/subscribeContext I'm getting the following error:
{
"subscribeError": {
"errorCode": {
"code": "400",
"reasonPhrase": "Bad Request",
"details": "JSON Parse Error"
}
}
}
This is the JSON string I'm sending:
{
"entities": [
{
"type": "Location",
"isPattern": "false",
"id": "Device-1"
}
],
"reference": "http://52.31.144.170:5050/notify",
"duration": "PT10S",
"notifyConditions": [
{
"type": "ONCHANGE",
"condValues": [
"position"
]
}
],
"attributes": [
"position"
]
}
The entity that updates OK in Orion is:
{
"type": "Location",
"isPattern": "false",
"id": "Device-1",
"attributes": [
{
"name": "position",
"type": "coords",
"value": "24,21",
"metadatas": [
{
"name": "location",
"type": "string",
"value": "WGS84"
}
]
},
{
"name": "id",
"type": "device",
"value": "1"
}
]
}
I've tried many different examples from readthedocs and other Stack Overflow answers, unsuccessfully.
What is the correct format? Should I call /subscribeContext before or after updating Orion with /contextEntities?
Orion Context Broker version is 0.26.1.
Thank you in advance.
Taking into account that the same payload works OK when sent using curl (see this execution session), I tend to think that some issue in the client (maybe hidden by a programming framework?) is causing the problem.
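One quick way to confirm that the framework, not the payload, is at fault is to round-trip the exact string being sent through a strict JSON parser right before the POST; if this raises, Orion's "JSON Parse Error" is reacting to what the client actually serialized (also double-check that the request sets Content-Type: application/json):

```python
import json

# The exact subscription payload from the question, as the client would send it.
payload = """
{
  "entities": [
    {"type": "Location", "isPattern": "false", "id": "Device-1"}
  ],
  "reference": "http://52.31.144.170:5050/notify",
  "duration": "PT10S",
  "notifyConditions": [
    {"type": "ONCHANGE", "condValues": ["position"]}
  ],
  "attributes": ["position"]
}
"""

parsed = json.loads(payload)  # raises ValueError on malformed JSON
print(sorted(parsed))  # ['attributes', 'duration', 'entities', 'notifyConditions', 'reference']
```

If the parse succeeds on the literal payload but Orion still complains, log the bytes the framework puts on the wire and run the same check on those.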

How can I index JSON in Elasticsearch

I am starting with Elasticsearch now and I don't know anything about it.
I have the following JSON:
[
{
"label": "Admin Law",
"tags": [
"#admin"
],
"owner": "generalTopicTagText"
},
{
"label": "Judicial review",
"tags": [
"#JR"
],
"owner": "generalTopicTagText"
},
{
"label": "Admiralty/Shipping",
"tags": [
"#shipping"
],
"owner": "generalTopicTagText"
}
]
My mapping is this:
{
"topic_tax": {
"properties": {
"label": {
"type": "string",
"index": "not_analyzed"
},
"tags": {
"type": "string",
"index_name": "tag"
},
"owner": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
I need to put the first JSON into Elasticsearch, but it does not work.
All I know is that I am defining only one of these:
{
"label": "Judicial review",
"tags": [
"#JR"
],
"owner": "generalTopicTagText"
}
So when I try to put all of them with my elasticsearch.init, it will not work.
But I really don't know how to declare the mapping JSON so that it accepts the whole array; it is like I need something like a for loop there.
You have to insert them one JSON document at a time. But what you should do is use the bulk API of Elasticsearch to insert multiple documents in one request. Check this API doc to see how it works.
You can do something like this (note that Elasticsearch's default HTTP port is 9200):
curl -XPUT 'localhost:9200/es/post/1' -H 'Content-Type: application/json' -d '{
"text" : "your test message!"
}'
Here is the documentation for indexing JSON with Elasticsearch.
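A minimal sketch of building a bulk request body from the array in the question (the index name `topics` is an assumption; the `_type` line matches the older, pre-7.x Elasticsearch versions of this question's era). The bulk format is newline-delimited JSON: one action line, then one source line per document, with a mandatory trailing newline:

```python
import json

docs = [
    {"label": "Admin Law", "tags": ["#admin"], "owner": "generalTopicTagText"},
    {"label": "Judicial review", "tags": ["#JR"], "owner": "generalTopicTagText"},
]

def bulk_body(index, doc_type, docs):
    """Build the newline-delimited body the Elasticsearch _bulk API expects:
    an {"index": ...} action line followed by the document source line."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"  # _bulk requires a trailing newline

body = bulk_body("topics", "topic_tax", docs)
# Send it with, e.g.:
#   curl -XPOST 'localhost:9200/_bulk' \
#        -H 'Content-Type: application/x-ndjson' --data-binary @body.ndjson
print(body.count("\n"))  # 4
```

This indexes the whole array in a single request, so no client-side for loop over individual PUTs is needed.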