Can IoT devices with different protocols use IoT Agents to "talk" to each other? - fiware

I'm playing around with FIWARE Orion, the IoT Agent JSON, and the IoT Agent OPC UA. Since all the IoT Agents connect to Orion and map different IoT protocols into NGSI, is it possible for devices using different protocols to communicate with each other without adding any additional application logic?
Let's consider an MQTT device A and an OPC UA server B. For example, is it possible that:
1. B reports its measurements to the Orion Context Broker and A subscribes to that attribute; something like
B --> IoT Agent OPC UA --> Orion --> IoT Agent JSON --> mosquitto --> A
(I tried to create a context provider registration. However, the URL of the B entity attributes (orion:1026/v2/B/attrs/XXX) obviously doesn't work, since Orion will send a POST to orion:1026/v2/B/attrs/XXX/op/query, which doesn't exist, and the provided attribute is not provisioned at the IoT Agent JSON... I feel like I'm heading in totally the wrong direction.)
2. A and B access the same entity and report their measurements to that entity in Orion. Since A and B both need their own IoT Agents, the same entity cannot be provisioned at both agents due to duplication...
Is it a bad idea to try to mix devices of several protocols into one entity? Thank you so much for answering my doubts in advance!

Each NGSI entity should map to something that has state in the real world. In your case your data models should be based on Device and you should have a separate OPC-UA-based Device entity and a second separate JSON Device entity. These are the low level entities within your system, which would hold readings from the IoT Devices and would also hold additional data (such as battery-level or a link to the documentation or whatever).
If you want to model the state of a second aggregated entity then you can do that as well - just subscribe to changes in context in the device and Upsert the values and metadata over to the other entity.
curl --location --request POST 'http://localhost:1027/v2/subscriptions/' \
--header 'Content-Type: application/json' \
--header 'fiware-service: openiot' \
--data-raw '{
  "description": "Notify Subscription Listener of Lamp context changes",
  "subject": {
    "entities": [
      {
        "idPattern": "Lamp.*"
      }
    ],
    "condition": {
      "attrs": ["luminosity"]
    }
  },
  "notification": {
    "http": {
      "url": "http://tutorial:3000/device/subscription/luminosity"
    },
    "attrs": ["luminosity", "controlledAsset", "supportedUnits"]
  },
  "throttling": 5
}'
Sample code to do the work from the listening endpoint ( /device/subscription/luminosity ) can be found here - it is a tutorial which is still a work-in-progress, so the full documentation is currently missing.
function shadowDeviceMeasures(req, res) {
  const attrib = req.params.attrib;

  async function copyAttributeData(device, index) {
    await upsertDeviceEntityAsLD(device);
    if (device[attrib]) {
      await upsertLinkedAttributeDataAsLD(device, 'controlledAsset', attrib);
    }
  }

  req.body.data.forEach(copyAttributeData);
  res.status(204).send();
}
The point here is that you can (and should) think of data entities at different levels:

- I have a thermometer Device - it is sending temperature readings. It has a temperature attribute.
- I have a Building - it has a thermometer in it. The Building has a temperature attribute and a metadata.providedBy link back to the Device.

Depending on your use case you may only need to consider entities at one layer, or you may need to use both.
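As a rough illustration of the "shadowing" step described above, the following sketch copies a low-level Device reading onto an aggregated Building entity via the NGSI v2 batch update endpoint (POST /v2/op/update). The host, port, service name and entity ids are assumptions for illustration, not values from the original post.

```python
import json
import urllib.request

ORION = "http://localhost:1026"  # assumed Orion Context Broker endpoint
HEADERS = {"Content-Type": "application/json", "fiware-service": "openiot"}

def shadow_reading(building_id, device_id, value):
    """Build the request that upserts a Device reading onto a Building entity."""
    payload = {
        "actionType": "append",  # "append" creates the attribute if missing
        "entities": [{
            "id": building_id,
            "type": "Building",
            "temperature": {
                "type": "Number",
                "value": value,
                "metadata": {
                    # link back to the low-level Device entity
                    "providedBy": {"type": "Relationship", "value": device_id}
                },
            },
        }],
    }
    return urllib.request.Request(
        ORION + "/v2/op/update",
        data=json.dumps(payload).encode("utf-8"),
        headers=HEADERS,
        method="POST",
    )

req = shadow_reading("urn:ngsi-ld:Building:001",
                     "urn:ngsi-ld:Device:thermometer001", 21.5)
# urllib.request.urlopen(req) would send it to a running Orion instance.
```

A subscription listener such as the one above would call something like this once per notified device.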


What is a useful Azure IoT Hub JSON message structure for consumption in Time Series Insights

The title sounds quite comprehensive, but my baseline question is quite simple, I guess.
Context
In Azure, I have an IoT Hub, which I am sending messages to. I use a modified version of one of the samples from the Azure IoT SDK for Python.
Sending works fine. However, instead of a string, I send a JSON structure.
When I watch the events flowing into the IoT hub, using the Cloud shell, it looks like this:
PS /home/marcel> az iot hub monitor-events --hub-name weathertestiothub
This extension 'azure-cli-iot-ext' is deprecated and scheduled for removal. Please remove and add 'azure-iot' instead.
Starting event monitor, use ctrl-c to stop...
{
  "event": {
    "origin": "raspberrypi-zero-wh",
    "payload": "{ \"timestamp\": \"1608643863720\", \"locationDescription\": \"Attic\", \"temperature\": \"21.941\", \"relhumidity\": \"71.602\" }"
  }
}
Issue
The data seems fine, except that the payload looks strange here. BUT, the payload is literally what I send from the device, using the SDK sample.
Is this the correct way to do it? In the end, I have a very hard time actually getting the data into the Time Series Insights model, so I guess my structure is to blame.
Question
What is a recommended JSON data structure to send to the IoT hub for later use?
You should add the following two lines to your message in your Python SDK sample:
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
This should resolve your formatting concern.
We've also updated our samples to reflect this: https://github.com/Azure/azure-iot-sdk-python/blob/master/azure-iot-device/samples/sync-samples/send_message.py
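Put together, a minimal sketch of the send path might look like the following. The JSON-building part is runnable as-is; the SDK calls at the end are commented out and mirror the azure-iot-device sync sample, with the connection string and sensor values as assumed placeholders.

```python
import json
from datetime import datetime, timezone

def build_json_payload(temperature, humidity, location):
    """Serialize the telemetry as a real JSON string (not a stringified blob)."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "locationDescription": location,
        "temperature": temperature,
        "relhumidity": humidity,
    })

payload = build_json_payload(21.941, 71.602, "Attic")

# With the real SDK (pip install azure-iot-device) the send would look like:
#
#   from azure.iot.device import IoTHubDeviceClient, Message
#   client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
#   msg = Message(payload)
#   msg.content_encoding = "utf-8"        # the two lines from the answer
#   msg.content_type = "application/json"
#   client.send_message(msg)
```

With the encoding and content type set, downstream consumers such as Time Series Insights can parse the payload as JSON instead of treating it as an opaque string.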
I ended up using the tip by @elhorton, but it was not the key change. Nonetheless, the formatting in the Azure Shell monitor now looks much better:
"event": {
  "origin": "raspberrypi-zero-wh",
  "payload": {
    "temperature": 21.543947753906245,
    "humidity": 69.22964477539062,
    "locationDescription": "Attic"
  }
}
The key was:

- Including the message source time in ISO format:
  from datetime import datetime
  timestampIso = datetime.now().isoformat()
  message.custom_properties["iothub-creation-time-utc"] = timestampIso
- Using the locationDescription as the Time Series ID Property, see https://learn.microsoft.com/en-us/azure/time-series-insights/how-to-select-tsid (maybe I could also have taken the iothub-connection-device-id, but I did not test that alone specifically).
I guess using "iothub-connection-device-id" would make "raspberrypi-zero-wh" the name of the time series instance. I agree with your choice of "locationDescription" as the TSID; so Attic becomes the time series instance name, and temperature and humidity will be your variables.

Twilio Autopilot, best method to check mobile number in external database

I would like to use Twilio Autopilot to build a WhatsApp chatbot. I need to know whether a user has already used our service before. This means that on initiation, I get the data from an external source; I can then use Functions to further specify the chatbot logic.
I am wondering what the best options in the Twilio environment are to get that external data loaded into event or memory. I see webhooks are only for diagnostics, and I don't know what the async capabilities of Functions are. Could someone elaborate a bit on the pros and cons of the different methods?
Thanks
You can take a look at The Autopilot Request. These request parameters are provided to your application (via a webhook associated with a task), where you can add additional logic to see whether this is a new or a returning user and return the appropriate Autopilot Actions, one of which is Remember.
One approach would be to set a webhook on the Assistant's assistant_initiation that determines whether the user has used the system before (more info).
The webhook can then reply with a JSON Remember + Redirect.
Example -
{
  "actions": [
    {
      "remember": {
        "isNewuser": "true"
      }
    },
    {
      "redirect": "task://newUserTask"
    }
  ]
}
OR
{
  "actions": [
    {
      "remember": {
        "isNewuser": "false"
      }
    },
    {
      "redirect": "task://oldUserTask"
    }
  ]
}
Also, since "isNewuser": "true" will be in the assistant's memory, you can use that info in any following task until the session expires (4 hours).
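A hypothetical sketch of such an assistant_initiation webhook is shown below: it looks the caller up in an external store and answers with the Remember + Redirect actions from the examples above. The lookup set, task names and the request field name in the comment are illustrative assumptions.

```python
import json

KNOWN_USERS = {"whatsapp:+14155551234"}  # stand-in for an external database

def handle_assistant_initiation(user_identifier):
    """Return the Autopilot Actions JSON for a new or returning user."""
    is_new = user_identifier not in KNOWN_USERS
    return {
        "actions": [
            {"remember": {"isNewuser": "true" if is_new else "false"}},
            {"redirect": "task://newUserTask" if is_new
                         else "task://oldUserTask"},
        ]
    }

# A Twilio Function or web-framework handler would serialize this, e.g.:
#   body = json.dumps(handle_assistant_initiation(request_params["UserIdentifier"]))
```

The remembered flag then stays available to every subsequent task in the session, as the answer notes.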

How to use Openshift Exposing Object Fields?

The OpenShift documentation has a feature, Exposing Object Fields, that I am struggling to comprehend. When I load my secret, I expose it as per the documentation. Yet it is unclear from the language of the documentation what the actual mechanism is to bind to the exposed variables. The docs state:
An example response to a bind operation given the above partial template follows:
{
  "credentials": {
    "username": "foo",
    "password": "YmFy",
    "service_ip_port": "172.30.12.34:8080",
    "uri": "http://route-test.router.default.svc.cluster.local/mypath"
  }
}
Yet that example isn't helpful, as it's not clear what was bound and how it was bound to actually pick up the exposed variables. What I am hoping is that the exposed values become ambient, and that when I run some other templates in the same project (???) they will automatically resolve (bind) the variables. Then I could decouple secret creation (happening at product creation time) from secret usage (happening when developers populate their project). Am I correct that this feature creates ambient properties that are picked up by any template? Are there any examples of using this feature to decouple secret creation from secret usage (i.e. using this feature for segregation of duties)?
I am running Redhat OCP:
OpenShift Master:
v3.5.5.31.24
Kubernetes Master:
v1.5.2+43a9be4

Modify device - IoTAgentUL

I need to modify a registered device in IoTAgent UltraLight. By modify I mean adding some attributes and deleting others.
I also want to update the entity in Orion CB.
Is it possible to do that? How can I do that?
IoTAs (and the IoTA library in general) expose a north provisioning interface for device creation. The core idea is that when you provision a device in the IoTA (directly, or through the IoTA Manager), an entity is automatically created in the Context Broker. That north provisioning interface allows for retrieval, removal and update, too.
That being said, the south interface of the IoTAs is designed to only accept measures and command execution results from the devices. Thus, if a new attribute comes into play and you provide values for it via an IoTA, the new attribute will not be appended at the Context Broker; that information will simply be dropped.
In order to accept data regarding new attributes, you'll first have to use the above-mentioned provisioning interface of the IoTAs, in particular the update device operation, to provision the new attribute; that will automatically append a new attribute to the entity at Context Broker level. From then on, values for the new attribute sent to the IoTA will be updated in the Context Broker.
Such an update request looks like:
PUT http://iota_host:iota_port/iot/devices/<dev_id>?protocol=<protocol_type>
Fiware-Service: <service>
Fiware-ServicePath: <subservice>
{
  "entity_type": <entity_type>,
  "attributes": [ <new_active_attrs_if_any> ],
  "lazy": [ <new_lazy_attrs_if_any> ],
  "commands": [ <new_commands_if_any> ],
  "static_attributes": [ <new_static_attrs_if_any> ]
}
Sadly, already existing attributes cannot be removed, for the time being.
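A stdlib-only sketch of building that update request follows. The host, port, protocol string, service names and the humidity attribute are illustrative assumptions; only the URL shape and headers come from the template above.

```python
import json
import urllib.request

def build_device_update(iota_host, iota_port, dev_id, protocol,
                        service, subservice, new_attrs):
    """Build the PUT /iot/devices/<dev_id> request that appends attributes."""
    url = (f"http://{iota_host}:{iota_port}/iot/devices/"
           f"{dev_id}?protocol={protocol}")
    body = json.dumps({"attributes": new_attrs}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Fiware-Service": service,
            "Fiware-ServicePath": subservice,
        },
        method="PUT",
    )

req = build_device_update(
    "localhost", 4061, "dev001", "PDI-IoTA-UltraLight",
    "myservice", "/mysubservice",
    [{"object_id": "h", "name": "humidity", "type": "Number"}],
)
# urllib.request.urlopen(req) would send it to a running IoT Agent.
```

After a successful update, measures for the new attribute sent over the south interface should reach the Context Broker entity.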

Querying lwm2m active attribute in Orion

I have the following configuration for an active lightweightm2m-iotagent attribute (a temperature sensor value). Fiware's IoT agent turns IPSO objects into lazy attributes but I add a mapping to make it an active attribute as in the documentation:
types: {
  'Type': {
    service: 'service',
    subservice: '/service',
    commands: [],
    lazy: [],
    active: [
      {
        "name": "t",
        "type": "number"
      }
    ],
    lwm2mResourceMapping: {
      "t": {
        "objectType": 3303,
        "objectInstance": 0,
        "objectResource": 5700
      }
    }
  },
According to the documentation for the iotagent-node-lib:
NGSI queries to the context broker will be resolved in the Broker database.
However, when I query my active attribute in Orion, Orion also queries the lightweightm2m-iotagent, requesting a bogus /3303/0/0 path which doesn't even exist in the IPSO definition.
curl -H "Fiware-service: service" -H "Fiware-servicepath: /service" http://172.17.0.1:1026/v2/entities/entity1:Type/attrs/t/value
How can I set up the configuration to get the behavior stated in the documentation, resolving a query for an active attribute in the broker database and avoiding these bogus queries?
Maybe the IoTAgent is not recognizing the active attribute as such, and it may be related to the static configuration of the types via config.js; this kind of configuration is not commonly used and may contain some errors (probably including the one you have found). Please try provisioning the device through the API, as explained in: https://github.com/telefonicaid/lightweightm2m-iotagent/blob/master/docs/deviceProvisioning.md. If it works then, maybe we should flag the static type configuration as buggy.