Convert a string in a PySpark dataframe to a table, obtaining only the necessary fields from the string - json

{
"schema": {
"type": "struct",
"fields": [
{
"type": "int32",
"optional": true,
"field": "c1"
},
{
"type": "string",
"optional": true,
"field": "c2"
},
{
"type": "int64",
"optional": false,
"name": "org.apache.kafka.connect.data.Timestamp",
"version": 1,
"field": "create_ts"
},
{
"type": "int64",
"optional": false,
"name": "org.apache.kafka.connect.data.Timestamp",
"version": 1,
"field": "update_ts"
}
],
"optional": false,
"name": "foobar"
},
"payload": {
"c1": 67,
"c2": "foo",
"create_ts": 1663920002000,
"update_ts": 1663920002000
}
}
I have my JSON string in this format, and I don't want the whole thing in the table; I want the table in this format:
| c1 | c2  | create_ts           | update_ts           |
+----+-----+---------------------+---------------------+
| 1  | foo | 2022-09-21 10:47:54 | 2022-09-21 10:47:54 |
| 28 | foo | 2022-09-21 13:16:45 | 2022-09-21 13:16:45 |
| 29 | foo | 2022-09-21 14:19:10 | 2022-09-21 14:19:10 |
| 30 | foo | 2022-09-21 14:19:20 | 2022-09-21 14:19:20 |
| 31 | foo | 2022-09-21 14:29:19 | 2022-09-21 14:29:19 |

Skip the other (nested) attributes by selecting only the one you want in the resulting output:
(
spark
.read
.option("multiline","true")
.json("/path/json-path")
.select("payload.*")
.show()
)
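If the JSON string is already in a column of an existing DataFrame rather than in a file, the same idea works with from_json. A minimal sketch, assuming a DataFrame df with the string in a column named json_str, and converting the epoch-millisecond fields to timestamps:
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, IntegerType, LongType

# Only the payload part of the schema is declared; everything else is dropped.
payload_schema = StructType([
    StructField("payload", StructType([
        StructField("c1", IntegerType()),
        StructField("c2", StringType()),
        StructField("create_ts", LongType()),
        StructField("update_ts", LongType()),
    ]))
])

result = (
    df  # assumed DataFrame holding the JSON string in a column named "json_str"
    .select(F.from_json("json_str", payload_schema).alias("j"))
    .select("j.payload.*")
    # the payload carries epoch milliseconds; divide by 1000 and cast to timestamp
    .withColumn("create_ts", (F.col("create_ts") / 1000).cast("timestamp"))
    .withColumn("update_ts", (F.col("update_ts") / 1000).cast("timestamp"))
)
result.show(truncate=False)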

Related

QueryDSL with DB2: fetching a nested JSON object or JSON array aggregation response

I am trying to fetch nested JSON objects and a JSON list from the database using QueryDSL. I have used a native query with LISTAGG and JSON_OBJECT.
Native query:
SELECT b.id, b.bankName, b.account, b.branch,
       (SELECT CONCAT(CONCAT('[', LISTAGG(JSON_OBJECT('accountId' VALUE c.accountId, 'name' VALUE customer_name, 'amount' VALUE c.amount), ',')), ']')
        FROM CUSTOMER_DETAILS c
        WHERE c.bankId = b.id) AS customers
FROM BANK_DETAILS b
BANK_DETAILS
+----+---------+---------+----------+
| id | BankName| account | branch |
+----+---------+---------+----------+
| 1 | bank1 | savings | branch1 |
| 2 | bank2 | current | branch2 |
+----+---------+---------+----------+
CUSTOMER_DETAILS
+----+-----------+---------------+----------+-----------+
| id | accountId | customer_name | amount | BankId |
+----+-----------+---------------+----------+-----------+
| 1 | 50123 | Abc1 | 150000 | 1 |
| 2 | 50124 | Abc2 | 25000 | 1 |
| 3 | 50125 | Abc3 | 50000 | 2 |
| 4 | 50126 | Abc4 | 250000 | 2 |
+----+-----------+---------------+----------+-----------+
Expected Output for the above tables
[{
"id": "1",
"bankName": "bank1",
"account": "savings",
"branch": "branch1",
"customers": [
{
"accountId": "50123",
"name": "Abc1",
"amount": 150000
},
{
"accountId": "50124",
"name": "Abc2",
"amount": 25000
}
]
},{
"id": "2",
"bankName": "bank2",
"account": "current",
"branch": "branch2",
"customers": [
{
"accountId": "50125",
"name": "Abc3",
"amount": 50000
},
{
"accountId": "50126",
"name": "Abc4",
"amount": 250000
}
]
}]
I have tried writing this native query in QueryDSL as the multiple queries below, combining the results in a forEach loop to produce the same expected output.
class Repository {
    private final SQLQueryFactory queryFactory;

    public Repository(SQLQueryFactory queryFactory) {
        this.queryFactory = queryFactory;
    }

    public void fetchBankDetails() {
        // First query: load all banks
        List<BankDetails> bankList = queryFactory.select(QBankDetails.bankDetails)
                .from(QBankDetails.bankDetails)
                .fetch();
        // N further queries: one customer lookup per bank
        bankList.forEach(bankData -> {
            List<CustomerDetails> customerList = queryFactory.select(QCustomerDetails.customerDetails)
                    .from(QCustomerDetails.customerDetails)
                    .where(QCustomerDetails.customerDetails.bankId.eq(bankData.bankId))
                    .fetch();
            bankData.setCustomerList(customerList);
        });
        System.out.println(bankList);
    }
}
I need to improve my code and convert it into a single query using QueryDSL that returns the expected output.
Is there any other way or any suggestions?
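One possible direction for a single query (a sketch only, not tested against DB2): QueryDSL's GroupBy result transformer can join the two tables once and collect the customer rows per bank. The map key type and the column names here are assumptions based on the Q-types above:
import static com.querydsl.core.group.GroupBy.groupBy;
import static com.querydsl.core.group.GroupBy.list;

QBankDetails bank = QBankDetails.bankDetails;
QCustomerDetails customer = QCustomerDetails.customerDetails;

// One round trip: join banks to customers, then group the customer rows per bank id.
Map<Integer, List<CustomerDetails>> customersByBank = queryFactory
        .from(bank)
        .leftJoin(customer).on(customer.bankId.eq(bank.id))
        .transform(groupBy(bank.id).as(list(customer)));
// customersByBank.get(bankId) then holds the customers for that bank, ready to be
// set on the matching BankDetails before serialising the result to JSON.
With this shape, the nested JSON from the expected output would be produced by the serializer rather than by LISTAGG/JSON_OBJECT in SQL.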

How to insert a new key-value into the first row of a table containing a json (Snowflake)

I have a table "MY_TABLE" with one column "VALUE" and the first row of the column contains a json that looks like:
{
"VALUE": {
"c1": "name",
"c10": "age",
"c100": "gender",
"c101": "address",
"c102": "status"
}
}
I would like to add a new key-value pair to this json in the first row where the pair is "c125" : "job" so that the result looks like:
{
"VALUE": {
"c1": "name",
"c10": "age",
"c100": "gender",
"c101": "address",
"c102": "status",
"c125": "job"
}
}
I tried:
SELECT object_insert(OBJECT_CONSTRUCT(*),'c125', 'job') FROM MY_TABLE;
But it inserted the new key-value pair in the wrong spot, so the result looks like:
{
"VALUE": {
"c1": "name",
"c10": "age",
"c100": "gender",
"c101": "address",
"c102": "status"
},
"c125": "job"
}
Is there another way to do this? Thanks!
Another, similar approach, using OBJECT_INSERT.
For the original table (assuming the column data type is VARIANT; otherwise use the PARSE_JSON function):
select * from temp_1;
+------------------------+
| COL1 |
|------------------------|
| { |
| "VALUE": { |
| "c1": "name", |
| "c10": "age", |
| "c100": "gender", |
| "c101": "address", |
| "c102": "status" |
| } |
| } |
+------------------------+
Query with the added key ("c31": 101) as output:
select
object_insert(col1,'VALUE',object_insert(col1:VALUE,'c31',101),TRUE)
as output_col from temp_1;
+------------------------+
| OUTPUT_COL |
|------------------------|
| { |
| "VALUE": { |
| "c1": "name", |
| "c10": "age", |
| "c100": "gender", |
| "c101": "address", |
| "c102": "status", |
| "c31": 101 |
| } |
| } |
+------------------------+
The same clause used in an UPDATE (it can be predicated on another column used as a key):
update temp_1 set col1 = object_insert(col1,'VALUE',object_insert(col1:VALUE,'c31',101),TRUE);
After update -
select * from temp_1;
+------------------------+
| COL1 |
|------------------------|
| { |
| "VALUE": { |
| "c1": "name", |
| "c10": "age", |
| "c100": "gender", |
| "c101": "address", |
| "c102": "status", |
| "c31": 101 |
| } |
| } |
+------------------------+
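For the exact pair from the question, the clause becomes the following (a sketch reusing the temp_1/col1 names above; the trailing TRUE is OBJECT_INSERT's update flag, which allows overwriting the existing 'VALUE' key):
-- swap in your own table and column names
update temp_1 set col1 = object_insert(col1,'VALUE',object_insert(col1:VALUE,'c125','job'),TRUE);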
One approach could be to flatten the object first and construct it again:
CREATE TABLE MY_TABLE
AS
SELECT PARSE_JSON('{
"VALUE": {
"c1": "name",
"c10": "age",
"c100": "gender",
"c101": "address",
"c102": "status"
}
}') AS VALUE;
SELECT * FROM MY_TABLE;
Before: (the SELECT above returns the original VALUE object, without "c125")
Query:
WITH cte(key, value) AS (
SELECT 'c125', 'job'::VARIANT
UNION ALL
SELECT s.key, s.value
FROM MY_TABLE
,TABLE(FLATTEN (input => VALUE, path => 'VALUE')) s
)
SELECT OBJECT_CONSTRUCT('VALUE', OBJECT_AGG(key, value))
FROM cte;
Output: (the VALUE object now also contains "c125": "job")

Parse nested JSON that has the same attribute names with jq streaming mode

I want to parse the data from the nested JSON file below, but it has too many "keys" objects, which makes the data hard to parse.
{
"jobname": {
"keys": {
"jobid":"E000295",
"car":"BMW"
},
"property":{
"doctype":"File",
"areadesc":[
{
"areaid":"qaz",
"weather":"hot",
},
{
"areaid":"wsx",
"weather":"code",
},
{
"areaid":"edc",
"weather":"hot",
},
{
"areaid":"rfv",
"weather":"hot",
}
]
},
"toolJobs":[
{
"keys":{
"toolid":"123"
},
"reports":[
{
"keys":{
"oiltype":"a",
"oilcountry":"us"
},
"property":{"reportid":"001"},
"datas":[
{
"keys":{"areaid":"qaz"},
"data":[
{
"time": "2021-01-01",
"value": 1
},
{
"time": "2021-01-02",
"value": 3
},
]
},
{
"keys":{"areaid":"wsx"},
"data":[
{
"time": "2021-01-03",
"value": 5
},
{
"time": "2021-01-04",
"value": 7
},
]
},
]
},
{
"keys":{
"oiltype":"b",
"oilcountry":"china"
},
"property":{"reportid":"002"},
"datas":[
{
"keys":{"areaid":"edc"},
"data":[
{
"time": "2021-01-05",
"value": 2
},
{
"time": "2021-01-06",
"value": 4
},
]
},
{
"keys":{"areaid":"rfv"},
"data":[
{
"time": "2021-01-07",
"value": 6
},
{
"time": "2021-01-08",
"value": 8
},
]
},
]
}
]
}
]
}
}
So far, I can use the code below to get a basic result, but some columns are missing, such as oiltype, oilcountry, reportid, and areaid.
cat tmp1.json | jq -cn --stream '
[fromstream(
1|truncate_stream(inputs)
| (.[0][:2] | index("keys")) as $ix
| if $ix then .[0] |= .[1+$ix:]
else (.[0] | index("toolJobs")) as $iy | (.[0][$iy:$iy+3] | index("keys")) as $iz
| if $iz then .[0] |= .[1+$iy+$iz:]
else (.[0] | index("data")) as $ik
| if $ik then .[0] |= .[$ik:]
else empty
end
end
end
)] | .[0] as $header | .[1] as $tool | [.[2:][] | ($header+ $tool+.)] | .'
The result is
[
{"jobid":"E000295","car":"BMW","toolid":"123","data":[{"time":"2021-01-01","value":1},{"time":"2021-01-02","value":3}]},
{"jobid":"E000295","car":"BMW","toolid":"123","data":[{"time":"2021-01-03","value":5},{"time":"2021-01-04","value":7}]},
{"jobid":"E000295","car":"BMW","toolid":"123","data":[{"time":"2021-01-05","value":2},{"time":"2021-01-06","value":4}]},
{"jobid":"E000295","car":"BMW","toolid":"123","data":[{"time":"2021-01-07","value":6},{"time":"2021-01-08","value":8}]}]
I also tried the code below:
cat tmp1.json | jq -cn --stream '
[fromstream(
1|truncate_stream(inputs)
| (.[0][:2] | index("keys")) as $ix
| if $ix then .[0] |= .[1+$ix:]
else (.[0] | index("toolJobs")) as $iy | (.[0][$iy:$iy+3] | index("keys")) as $iz
| if $iz then .[0] |= .[1+$iy+$iz:]
else (.[0] | index("data")) as $ik
| if $ik then .[0] |= .[$ik:]
else (.[0] | index("reports")) as $iw | (.[0][$iw:$iw+3] | index("property")) as $ii
| if $ii then (.[0] |= .[$iw+$ii:])
else (.[0] | index("keys")) as $ij
| if $ij then (.[0] |= .[$ij:])
else empty
end
end
end
end
end
)] | .[0] as $header | .[1] as $prjob | [.[2:][] | ($header + $prjob + .)] | .'
but the result is strange
[
{"jobid":"E000295","car":"BMW","property":{"reportid":"001"},"toolid":"123","keys":{"oiltype":"a","oilcountry":"us","areaid":"qaz"},"data":[{"time":"2021-01-01","value":1},{"time":"2021-01-02","value":3}]},
{"jobid":"E000295","car":"BMW","property":{"doctype":"File","areadesc":[{"areaid":"qaz","weather":"hot"},{"areaid":"wsx","weather":"code"},{"areaid":"edc","weather":"hot"},{"areaid":"rfv","weather":"hot"}]},"toolid":"123","keys":{"areaid":"wsx"},"data":[{"time":"2021-01-03","value":5},{"time":"2021-01-04","value":7}]},
{"jobid":"E000295","car":"BMW","property":{"reportid":"002"},"toolid":"123","keys":{"oiltype":"b","oilcountry":"china","areaid":"edc"},"data":[{"time":"2021-01-05","value":2},{"time":"2021-01-06","value":4}]},
{"jobid":"E000295","car":"BMW","property":{"doctype":"File","areadesc":[{"areaid":"qaz","weather":"hot"},{"areaid":"wsx","weather":"code"},{"areaid":"edc","weather":"hot"},{"areaid":"rfv","weather":"hot"}]},"toolid":"123","keys":{"areaid":"rfv"},"data":[{"time":"2021-01-07","value":6},{"time":"2021-01-08","value":8}]}
]
Below is my expected result
[
{
"jobid":"E000295",
"car":"BMW",
"toolid":"123",
"oiltype":"a",
"oilcountry":"us",
"reportid":"001",
"areaid":"qaz",
"data":[
{
"time": "2021-01-01",
"value": 1
},
{
"time": "2021-01-02",
"value": 3
},
]
},
{
"jobid":"E000295",
"car":"BMW",
"toolid":"123",
"oiltype":"a",
"oilcountry":"us",
"reportid":"001",
"areaid":"wsx",
"data":[
{
"time": "2021-01-03",
"value": 5
},
{
"time": "2021-01-04",
"value": 7
},
]
},
{
"jobid":"E000295",
"car":"BMW",
"toolid":"123",
"oiltype":"b",
"oilcountry":"china",
"reportid":"002",
"areaid":"edc",
"data":[
{
"time": "2021-01-05",
"value": 2
},
{
"time": "2021-01-06",
"value": 4
},
]
},
{
"jobid":"E000295",
"car":"BMW",
"toolid":"123",
"oiltype":"b",
"oilcountry":"china",
"reportid":"002",
"areaid":"rfv",
"data":[
{
"time": "2021-01-07",
"value": 6
},
{
"time": "2021-01-08",
"value": 8
},
]
}
]
Does anyone have any idea?
Assuming the input has been corrected, the following "regular" jq program produces the desired result:
[
.jobname
| (.keys + .toolJobs[].keys) as $one
| .toolJobs[]
| .keys as $two
| .reports[]
| (.keys + .property) as $three
| .datas[]
| (.keys + {data}) as $four
| $one + $two + $three + $four
]
If your input is too large, you could reduce the memory requirements by creating a jq-to-jq pipeline, with the first invocation using the above program (or a --stream version of it) but with the outer brackets removed.
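A sketch of such a pipeline: the first invocation emits one compact object per datas[] entry, and the second slurps them back into a single array:
jq -c '
  .jobname
  | (.keys + .toolJobs[].keys) as $one
  | .toolJobs[]
  | .keys as $two
  | .reports[]
  | (.keys + .property) as $three
  | .datas[]
  | (.keys + {data}) as $four
  | $one + $two + $three + $four
' tmp1.json | jq -s '.'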

How to return parent value for each array iteration with jsonpath?

I'm attempting to import some JSON data into Grafana via the JSON API.
Here's a snippet of the JSON structure I'm working with:
[
{
"origin": "TS",
"id": "M8C8E02434D442725422CCB337057792F",
"type": "1.5.1:1",
"self": "https://metricsourcehost01/uimapiM8C8E02434D442725422CCB337057792F",
"source": "destinationhost01.our.domain.net",
"target": "destinationhost01.our.domain.net-0",
"probe": "cdm",
"for_computer_system": {
"id": "14873",
"self": "https://metricsourcehost01/uimapi/devices/14873",
"name": "destinationhost01.our.domain.net",
"ip": "10.1.1.16"
},
"for_device": {
"id": "D4F3D290D787D3FA4E7CD2824BFA6B1C8",
"self": "https://metricsourcehost01/uimapi/devices/D4F3D290D787D3FA4E7CD2824BFA6B1C8"
},
"for_configuration_item": {
"id": "CCE5006B73554FE7D307C1A355429286A",
"self": "https://metricsourcehost01/uimapi/TBD/CCE5006B73554FE7D307C1A355429286A",
"name": "CPU-0",
"qosName": "QOS_CPU_MULTI_USAGE",
"description": "Individual CPU Usage",
"unit": "%"
},
"uimMetricDefinition": null,
"minSampleValue": 61.17,
"maxSampleValue": 72.78,
"meanSampleValue": 64.864,
"sample": [
{
"time": "2021-09-02T00:50:32.000Z",
"timeSinceEpoch": 1630543832,
"value": 61.17,
"rate": 60
},
{
"time": "2021-09-02T00:49:32.000Z",
"timeSinceEpoch": 1630543772,
"value": 63.52,
"rate": 60
},
{
"time": "2021-09-02T00:48:32.000Z",
"timeSinceEpoch": 1630543712,
"value": 62.79,
"rate": 60
},
{
"time": "2021-09-02T00:47:32.000Z",
"timeSinceEpoch": 1630543652,
"value": 64.06,
"rate": 60
},
{
"time": "2021-09-02T00:46:32.000Z",
"timeSinceEpoch": 1630543592,
"value": 72.78,
"rate": 60
}
]
},
{
"origin": "TS",
"id": "M9D90857B9F9BE73EB15912D3314DB2DA",
"type": "1.5.1:1",
"self": "https://metricsourcehost01/uimapiM9D90857B9F9BE73EB15912D3314DB2DA",
"source": "destinationhost01.our.domain.net",
"target": "destinationhost01.our.domain.net-1",
"probe": "cdm",
"for_computer_system": {
"id": "14873",
"self": "https://metricsourcehost01/uimapi/devices/14873",
"name": "destinationhost01.our.domain.net",
"ip": "10.1.1.16"
},
"for_device": {
"id": "D4F3D290D787D3FA4E7CD2824BFA6B1C8",
"self": "https://metricsourcehost01/uimapi/devices/D4F3D290D787D3FA4E7CD2824BFA6B1C8"
},
"for_configuration_item": {
"id": "CF1D7A708DD4C6C9D303025AE3D2334AE",
"self": "https://metricsourcehost01/uimapi/TBD/CF1D7A708DD4C6C9D303025AE3D2334AE",
"name": "CPU-1",
"qosName": "QOS_CPU_MULTI_USAGE",
"description": "Individual CPU Usage",
"unit": "%"
},
"uimMetricDefinition": null,
"minSampleValue": 59.85,
"maxSampleValue": 72.31,
"meanSampleValue": 64.296,
"sample": [
{
"time": "2021-09-02T00:50:32.000Z",
"timeSinceEpoch": 1630543832,
"value": 59.85,
"rate": 60
},
{
"time": "2021-09-02T00:49:32.000Z",
"timeSinceEpoch": 1630543772,
"value": 63.88,
"rate": 60
},
{
"time": "2021-09-02T00:48:32.000Z",
"timeSinceEpoch": 1630543712,
"value": 60.17,
"rate": 60
},
{
"time": "2021-09-02T00:47:32.000Z",
"timeSinceEpoch": 1630543652,
"value": 65.27,
"rate": 60
},
{
"time": "2021-09-02T00:46:32.000Z",
"timeSinceEpoch": 1630543592,
"value": 72.31,
"rate": 60
}
]
}
]
It's CPU utilisation for 2 CPU cores from the same host.
Using $.[*].sample[*].time and $.[*].sample[*].value successfully returns the required time and value data which can be easily graphed:
| time | value |
| ------------------------ | ----- |
| 2021-09-02T00:50:32.000Z | 61.17 |
| 2021-09-02T00:49:32.000Z | 63.52 |
| 2021-09-02T00:48:32.000Z | 62.79 |
| 2021-09-02T00:47:32.000Z | 64.06 |
| 2021-09-02T00:46:32.000Z | 72.78 |
| 2021-09-02T00:50:32.000Z | 59.85 |
| 2021-09-02T00:49:32.000Z | 63.88 |
| 2021-09-02T00:48:32.000Z | 60.17 |
| 2021-09-02T00:47:32.000Z | 65.27 |
| 2021-09-02T00:46:32.000Z | 72.31 |
However, it combines all the data with no way to differentiate between the two CPU core data samples.
I've been trying to figure out a way to get a third column utilising the target value for each iteration of the sample array.
Ideally, the output should look like this when tabled:
| target | time | value |
| -------------------------------------- | ------------------------ | ----- |
| destinationhost01.our.domain.net-**0** | 2021-09-02T00:50:32.000Z | 61.17 |
| destinationhost01.our.domain.net-**0** | 2021-09-02T00:49:32.000Z | 63.52 |
| destinationhost01.our.domain.net-**0** | 2021-09-02T00:48:32.000Z | 62.79 |
| destinationhost01.our.domain.net-**0** | 2021-09-02T00:47:32.000Z | 64.06 |
| destinationhost01.our.domain.net-**0** | 2021-09-02T00:46:32.000Z | 72.78 |
| destinationhost01.our.domain.net-**1** | 2021-09-02T00:50:32.000Z | 59.85 |
| destinationhost01.our.domain.net-**1** | 2021-09-02T00:49:32.000Z | 63.88 |
| destinationhost01.our.domain.net-**1** | 2021-09-02T00:48:32.000Z | 60.17 |
| destinationhost01.our.domain.net-**1** | 2021-09-02T00:47:32.000Z | 65.27 |
| destinationhost01.our.domain.net-**1** | 2021-09-02T00:46:32.000Z | 72.31 |
Any advice would be greatly appreciated. I'm not sure it's even doable with JSONPath... hence why I'm reaching out to the experts.
Thanks
As the JSON API for Grafana uses the JSONPath Plus package, it's quite easy to accomplish what I was after.
The ^ is able to grab the parent of any matching item. Playing around with this in the JSONPath Demo site got me there. You can paste in my example from the original post and test the following queries:
$.[*].sample[*].time obtains the time from each sample.
$.[*].sample[*].value obtains the value from each sample.
$.[*].sample[*].value^^^^.for_configuration_item.name is the special sauce that will grab the for_configuration_item.name for each sample
Providing these three queries to Grafana makes a table like this:
| core  | time                     | value |
| ----- | ------------------------ | ----- |
| CPU-0 | 2021-09-02T00:50:32.000Z | 61.17 |
| CPU-0 | 2021-09-02T00:49:32.000Z | 63.52 |
| CPU-0 | 2021-09-02T00:48:32.000Z | 62.79 |
| CPU-0 | 2021-09-02T00:47:32.000Z | 64.06 |
| CPU-0 | 2021-09-02T00:46:32.000Z | 72.78 |
| CPU-1 | 2021-09-02T00:50:32.000Z | 59.85 |
| CPU-1 | 2021-09-02T00:49:32.000Z | 63.88 |
| CPU-1 | 2021-09-02T00:48:32.000Z | 60.17 |
| CPU-1 | 2021-09-02T00:47:32.000Z | 65.27 |
| CPU-1 | 2021-09-02T00:46:32.000Z | 72.31 |
From there, using the Group by feature in the Experimental tab on the core column graphs the values exactly as required.
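For reference, the same three queries can also be checked outside Grafana with the jsonpath-plus package itself. A minimal Node sketch, assuming the array from the original post has been loaded into a variable named data:
const { JSONPath } = require('jsonpath-plus');

// "data" is assumed to hold the JSON array shown in the question.
const times = JSONPath({ path: '$.[*].sample[*].time', json: data });
const values = JSONPath({ path: '$.[*].sample[*].value', json: data });
// each ^ selects the parent of a match; the query from the answer above climbs
// from the sample value back up far enough to read for_configuration_item.name
const cores = JSONPath({ path: '$.[*].sample[*].value^^^^.for_configuration_item.name', json: data });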

How to change values of writable basic data types on an OPC UA Server with the FIWARE OPC UA Agent

GOAL
Change writable values on the OPC UA Server by using the Fiware OPC UA Agent.
My test implementation
Adding the NodeId "7:PLC1_7:G_Communication_7:fi_heartbeat_i" to the "command" and "contextSubscription" sections in the config.json file. The value data type of the NodeId is Int16, but because this value is supposed to be written, I assume that "command" must be used as the type. Unfortunately, more detailed information cannot be found in the FIWARE OPC UA Agent manual.
Start a new test environment with OpcUa Agent, Orion Context broker and mongodb.
Expected behavior
The value on the server is updated when a request is sent to the Context Broker.
Current behaviour
The value of the parameter is read out correctly but with an incorrect data type (string instead of integer).
The value of the parameter is not updated when a request is sent to the Orion Context Broker.
Additional information
config.json
{
"logLevel" : "DEBUG",
"contextBroker" : {
"host" : "orion",
"port" : 1026
},
"server" : {
"port" : 4001,
"baseRoot" : "/"
},
"deviceRegistry" : {
"type" : "memory"
},
"mongodb" : {
"host" : "iotmongo",
"port" : "27017",
"db" : "iotagent",
"retries" : 5,
"retryTime" : 5
},
"providerUrl" : "http://iotopcua:4001",
"pollingExpiration" : "200000",
"pollingDaemonFrequency" : "20000",
"deviceRegistrationDuration" : "P1M",
"defaultType" : null,
"browseServerOptions" : null,
"service" : "test",
"subservice" : "/test",
"types" : {
"g_communication" : {
"service" : "test",
"subservice" : "/test",
"active" : [{
"name" : "7:PLC1_7:G_Communication_7:fo_smartControllerActive_b",
"type" : "Boolean"
} ],
"lazy" : [ ],
"commands" : [{
"name" : "7:PLC1_7:G_Communication_7:fi_heartbeat_i",
"type" : "Command"
}]
}
},
"contexts" : [ {
"id" : "plant",
"type" : "g_communication",
"service" : "test",
"subservice" : "/test",
"polling" : null,
"mappings" : [{
"ocb_id" : "7:PLC1_7:G_Communication_7:fo_smartControllerActive_b",
"opcua_id" : "ns=7;s=G_Communication.fo_smartControllerActive_b",
"object_id" : null,
"inputArguments" : []
} ]
}],
"contextSubscriptions" : [{
"id" : "plant",
"type" : "g_communication",
"mappings" : [{
"ocb_id" : "7:PLC1_7:G_Communication_7:fi_heartbeat_i",
"opcua_id" : "ns=7;s=G_Communication.fi_heartbeat_i",
"object_id" : "ns=7;s=G_Communication",
"inputArguments" : [{
"type": "Number"
}]
}]
}]
}
List Entities
curl 'http://localhost:1026/v2/entities/plant/' -H 'fiware-service: test' -H 'fiwate-servicepath: /test' | python -m json.tool
{
"7:PLC1_7:G_Communication_7:fi_heartbeat_i": {
"metadata": {
"ServerTimestamp": {
"type": "ISO8601",
"value": "null"
},
"SourceTimestamp": {
"type": "ISO8601",
"value": "null"
}
},
"type": "string",
"value": "4"
},
"7:PLC1_7:G_Communication_7:fi_heartbeat_i_info": {
"metadata": {},
"type": "commandResult",
"value": " "
},
"7:PLC1_7:G_Communication_7:fi_heartbeat_i_status": {
"metadata": {},
"type": "commandStatus",
"value": "UNKNOWN"
},
"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b": {
"metadata": {
"ServerTimestamp": {
"type": "ISO8601",
"value": "2021-05-04T07:38:01.150Z"
},
"SourceTimestamp": {
"type": "ISO8601",
"value": "2021-05-04T07:37:59.934Z"
}
},
"type": "Boolean",
"value": false
},
"id": "plant",
"type": "g_communication"
}
Registrations
curl 'http://localhost:1026/v2/registrations' -H 'fiware-service: test' -H 'fiwate-servicepath: /test' | python -m json.tool
[
{
"dataProvided": {
"attrs": [
"7:PLC1_7:G_Communication_7:fi_heartbeat_i"
],
"entities": [
{
"id": "plant",
"type": "g_communication"
}
]
},
"expires": "2021-06-03T07:37:38.00Z",
"id": "6090f9c254b918756abf1a7d",
"provider": {
"http": {
"url": "http://iotopcua:4001"
},
"legacyForwarding": true,
"supportedForwardingMode": "all"
},
"status": "active"
}
]
Test communication with iotopcua
curl "http://iotopcua:4001/version"
{"libVersion":"2.12.0-next","port":4001,"baseRoot":"/"}
Request for update
curl -X PUT \
'http://localhost:1026/v2/entities/plant/attrs/7:PLC1_7:G_Communication_7:fi_heartbeat_i?type=g_communication' \
-H 'content-type: application/json' \
-H 'fiware-service: test' \
-H 'fiware-servicepath: /test' \
-d '{
"value": 2
}'
Log OCB
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=logMsg.h[1844]:lmTransactionStart | msg=Starting transaction from 0.0.0.0:54232/v2/entities/plant/attrs/7:PLC1_7:G_Communication_7:fi_heartbeat_i
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=rest.cpp[874]:servicePathSplit | msg=Service Path 0: '/test'
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=connectionOperations.cpp[244]:collectionCount | msg=Database Operation Successful (count: { _id.id: "plant", _id.type: "g_communication", _id.servicePath: "/test" })
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=connectionOperations.cpp[94]:collectionQuery | msg=Database Operation Successful (query: { _id.id: "plant", _id.type: "g_communication", _id.servicePath: "/test" })
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=connectionOperations.cpp[182]:collectionRangedQuery | msg=Database Operation Successful (query: { query: { $or: [ { contextRegistration.entities.id: "plant", contextRegistration.entities.type: "g_communication" }, { contextRegistration.entities.id: ".*", contextRegistration.entities.isPattern: "true", contextRegistration.entities.type: { $in: [ "g_communication" ] } }, { contextRegistration.entities.id: ".*", contextRegistration.entities.isPattern: "true", contextRegistration.entities.type: { $exists: false } } ], expiration: { $gt: 1620114947 }, contextRegistration.attrs.name: { $in: [ "7:PLC1_7:G_Communication_7:fi_heartbeat_i" ] }, servicePath: "/test" }, orderby: { _id: 1 } })
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=logMsg.h[1844]:lmTransactionStart | msg=Starting transaction to http://iotopcua:4001//updateContext
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=httpRequestSend.cpp[550]:httpRequestSendWithCurl | msg=Sending message 4 to HTTP server: sending message of 458 bytes to HTTP server
from=10.1.17.1 | srv=test | subsrv=/test | comp=Orion | op=logMsg.h[1844]:lmTransactionStart | msg=Starting transaction from 10.1.17.1:58162/v1/updateContext
from=10.1.17.1 | srv=test | subsrv=/test | comp=Orion | op=rest.cpp[874]:servicePathSplit | msg=Service Path 0: '/test'
from=10.1.17.1 | srv=test | subsrv=/test | comp=Orion | op=connectionOperations.cpp[94]:collectionQuery | msg=Database Operation Successful (query: { _id.id: "plant", _id.type: "g_communication", _id.servicePath: "/test" })
from=10.1.17.1 | srv=test | subsrv=/test | comp=Orion | op=connectionOperations.cpp[454]:collectionUpdate | msg=Database Operation Successful (update: <{ _id.id: "plant", _id.type: "g_communication", _id.servicePath: "/test" }, { $set: { attrs.7:PLC1_7:G_Communication_7:fi_heartbeat_i_status: { value: "PENDING", type: "commandStatus", mdNames: [], creDate: 1620113858, modDate: 1620114947 }, modDate: 1620114947, lastCorrelator: "2324ca1e-acae-11eb-a4f7-226cad26e2cc" }, $unset: { location: 1, expDate: 1 } }>)
from=10.1.17.1 | srv=test | subsrv=/test | comp=Orion | op=logMsg.h[1874]:lmTransactionEnd | msg=Transaction ended
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=httpRequestSend.cpp[570]:httpRequestSendWithCurl | msg=Notification Successfully Sent to http://iotopcua:4001//updateContext
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=httpRequestSend.cpp[579]:httpRequestSendWithCurl | msg=Notification response OK, http code: 200
from=0.0.0.0 | srv=test | subsrv=/test | comp=Orion | op=logMsg.h[1874]:lmTransactionEnd | msg=Transaction ended
Log OPCUA Client
time=2021-05-04T07:55:47.191Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.GenericMiddlewares | msg=Request for path [//updateContext] from [iotopcua:4001]
time=2021-05-04T07:55:47.191Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.GenericMiddlewares | msg=Body:
{
"contextElements": [
{
"type": "g_communication",
"isPattern": "false",
"id": "plant",
"attributes": [
{
"name": "7:PLC1_7:G_Communication_7:fi_heartbeat_i",
"type": "Number",
"value": 2
}
]
}
],
"updateAction": "UPDATE"
}
time=2021-05-04T07:55:47.193Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.ContextServer | msg=Handling update from [iotopcua:4001]
time=2021-05-04T07:55:47.193Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.ContextServer | msg=[object Object]
time=2021-05-04T07:55:47.194Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.InMemoryGroupRegister | msg=Looking for device params ["service","subservice","type"]
time=2021-05-04T07:55:47.194Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.DeviceService | msg=deviceData after merge with conf: {"id":"plant","name":"plant","type":"g_communication","active":[{"name":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b","type":"Boolean","object_id":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b"}],"service":"test","subservice":"/test","polling":null,"endpoint":"opc.tcp://109.68.106.155:48050","registrationId":"6090f9c254b918756abf1a7d","creationDate":1620113858802}
time=2021-05-04T07:55:47.194Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.DeviceService | msg=deviceData before merge with conf: {"id":"plant","name":"plant","type":"g_communication","active":[{"name":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b","type":"Boolean","object_id":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b"}],"lazy":[],"commands":[{"name":"7:PLC1_7:G_Communication_7:fi_heartbeat_i","type":"Command","object_id":"7:PLC1_7:G_Communication_7:fi_heartbeat_i"}],"service":"test","subservice":"/test","polling":null,"endpoint":"opc.tcp://109.68.106.155:48050","registrationId":"6090f9c254b918756abf1a7d","creationDate":1620113858802,"internalAttributes":null,"staticAttributes":[],"subscriptions":[]}
time=2021-05-04T07:55:47.195Z | lvl=INFO | corr=n/a | trans=n/a | op=Index.CommandContextHandler | comp=iotAgent-OPCUA | srv=test | subsrv=/test | msg=method to call =[{"objectId":"ns=7;s=G_Communication","methodId":"ns=7;s=G_Communication.fi_heartbeat_i","inputArguments":[{"type":"Number"}]}]
time=2021-05-04T07:55:47.879Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=executeWithDeviceInfo entityName plant type undefined apikey undefined attributes [{"name":"7:PLC1_7:G_Communication_7:fi_heartbeat_i_status","type":"commandStatus","value":"PENDING"}] deviceInformation {"id":"plant","name":"plant","type":"g_communication","active":[{"name":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b","type":"Boolean","object_id":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b"}],"lazy":[],"commands":[{"name":"7:PLC1_7:G_Communication_7:fi_heartbeat_i","type":"Command","object_id":"7:PLC1_7:G_Communication_7:fi_heartbeat_i"}],"service":"test","subservice":"/test","polling":null,"endpoint":"opc.tcp://109.68.106.155:48050","registrationId":"6090f9c254b918756abf1a7d","creationDate":1620113858802,"internalAttributes":null,"staticAttributes":[],"subscriptions":[]}
time=2021-05-04T07:55:47.879Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=error {"name":"DEVICE_GROUP_NOT_FOUND","message":"Couldn\t find device group","code":404} in get group device
time=2021-05-04T07:55:47.880Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=typeInformation {"id":"plant","name":"plant","type":"g_communication","active":[{"name":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b","type":"Boolean","object_id":"7:PLC1_7:G_Communication_7:fo_smartControllerActive_b"}],"lazy":[],"commands":[{"name":"7:PLC1_7:G_Communication_7:fi_heartbeat_i","type":"Command","object_id":"7:PLC1_7:G_Communication_7:fi_heartbeat_i"}],"service":"test","subservice":"/test","polling":null,"endpoint":"opc.tcp://109.68.106.155:48050","registrationId":"6090f9c254b918756abf1a7d","creationDate":1620113858802,"internalAttributes":null,"staticAttributes":[],"subscriptions":[]}
time=2021-05-04T07:55:47.880Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=Updating device value in the Context Broker at [http://orion:1026/v1/updateContext]
time=2021-05-04T07:55:47.880Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=Using the following request:
{
"url": "http://orion:1026/v1/updateContext",
"method": "POST",
"headers": {
"fiware-service": "test",
"fiware-servicepath": "/test"
},
"json": {
"contextElements": [
{
"type": "g_communication",
"isPattern": "false",
"id": "plant",
"attributes": [
{
"name": "7:PLC1_7:G_Communication_7:fi_heartbeat_i_status",
"type": "commandStatus",
"value": "PENDING"
}
]
}
],
"updateAction": "UPDATE"
}
}
time=2021-05-04T07:55:47.886Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=Received the following request from the CB:
{
"contextResponses": [
{
"contextElement": {
"type": "g_communication",
"isPattern": "false",
"id": "plant",
"attributes": [
{
"name": "7:PLC1_7:G_Communication_7:fi_heartbeat_i_status",
"type": "commandStatus",
"value": ""
}
]
},
"statusCode": {
"code": "200",
"reasonPhrase": "OK"
}
}
]
}
time=2021-05-04T07:55:47.886Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.NGSIService | msg=Value updated successfully
time=2021-05-04T07:55:47.886Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.ContextServer | msg=Update action from [iotopcua:4001] handled successfully.
time=2021-05-04T07:55:47.886Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.ContextServer | msg=Generated update response: {"contextResponses":[{"contextElement":{"attributes":[{"name":"7:PLC1_7:G_Communication_7:fi_heartbeat_i","type":"Number","value":""}],"id":"plant","isPattern":false,"type":"g_communication"},"statusCode":{"code":200,"reasonPhrase":"OK"}}]}
time=2021-05-04T07:55:47.887Z | lvl=DEBUG | corr=n/a | trans=n/a | op=IoTAgentNGSI.DomainControl | msg=response-time: 697
As far as I can tell, the error resides in the contextSubscriptions snippet inside your config.json, which should look like the following (note the explicit dataType in inputArguments; 4 should correspond to the OPC UA built-in type Int16, matching your node's declared type):
"contextSubscriptions": [{
"id": "plant",
"type": "g_communication",
"service": "test",
"subservice": "/test",
"mappings": [{
"ocb_id": "7:PLC1_7:G_Communication_7:fi_heartbeat_i",
"opcua_id": "ns=7;s=G_Communication.fi_heartbeat_i",
"object_id": "ns=7;s=G_Communication",
"inputArguments": [{
"dataType": 4,
"type": "Intensity"
}]
}]
}]
Could you please give it a try?