FIWARE IoT Agent - data loss issue

We are observing significant data loss in the IoT Agent. The loss happens when the IoT Agent receives data from our VerneMQ MQTT broker.
Our data flows at a rate of 1000 messages/min. When the IoT Agent is started it works fine for approximately 12 hours, but after that we see an average data loss of 20%.
We have one other subscriber, apart from the IoT Agent, which receives the same data from the VerneMQ topic, and that subscriber receives all the data. The loss occurs only in the IoT Agent.
We are running the IoT Agent and Orion Context Broker on AWS using ECS/Fargate services. Each IoT Agent and Orion instance runs in a separate dedicated container.
When the IoT Agent runs in a single ECS container it works fine without data loss, but when the ECS service is scaled to more than one container, data loss appears. Can you please guide us in resolving this issue?
Details about our environment:
IoT Agent version:
{
  "libVersion": "2.12.0-next",
  "port": "4041",
  "baseRoot": "/",
  "version": "1.14.0-next"
}
Orion Context Broker version:
{
  "orion": {
    "version": "2.3.0",
    "uptime": "0 d, 3 h, 51 m, 36 s",
    "git_hash": "764f44bff1e73f819d4e0ac52e878272c375d322",
    "compile_time": "Tue Nov 5 09:38:37 UTC 2019",
    "compiled_by": "root",
    "compiled_in": "38ab37448d3a",
    "release_date": "Tue Nov 5 09:38:37 UTC 2019",
    "doc": "https://fiware-orion.rtfd.io/en/2.3.0/"
  }
}
Environment variables set in IoT Agent:

Taking into account:
When the IoT Agent runs in a single ECS container it works fine without data loss, but when the ECS service is scaled to more than one container, data loss appears
it seems the problem is somehow related to the underlying infrastructure rather than to the FIWARE software itself. Thus, I'd suggest reviewing your AWS ECS settings (unfortunately, I'm not an expert in AWS ECS, so I cannot provide more specific feedback).

When you say "data loss", do you mean that VerneMQ logs that it is dropping messages? This can happen if a consumer is overloaded; in that case, VerneMQ will protect itself by load-shedding messages.
Your current message rate should not be a problem, though.
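One way to confirm where the loss happens is to embed a monotonically increasing sequence number in each published payload and count the gaps on each consumer. A minimal sketch of the gap bookkeeping (independent of any MQTT client library; the sequence-number field is an assumption about your payload):

```python
class LossCounter:
    """Tracks gaps in a stream of integer sequence numbers."""

    def __init__(self):
        self.last_seq = None   # last sequence number seen
        self.received = 0      # messages actually delivered
        self.lost = 0          # messages skipped in the sequence

    def on_message(self, seq):
        """Call with the sequence number extracted from each message."""
        self.received += 1
        if self.last_seq is not None and seq > self.last_seq + 1:
            self.lost += seq - self.last_seq - 1
        self.last_seq = seq

    def loss_rate(self):
        total = self.received + self.lost
        return self.lost / total if total else 0.0
```

Wiring `on_message` into the message callbacks of both the IoT Agent topic consumer and your healthy subscriber, and comparing the two loss rates, tells you whether messages are dropped before reaching the IoT Agent or inside it.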
André (with the VerneMQ project)

Related

Data retrieval from Orion Context Broker subscription fails

With a successful subscription in Orion Context Broker, a listening accumulator server fails to receive any data under any circumstances I can find to test.
We are using an Ubuntu virtual machine that has a nested virtual machine with FIWARE Orion in it. Having subscribed to Orion Context Broker, confirmed the subscription was successful by checking the database, and confirmed that data is successfully updated, the accumulator server still fails to respond. We are unable to tell whether this is a failure to send from Orion or a failure to receive by the accumulator, and we are unsure how to check and continue.
We have run the accumulator server both on a virtual machine on the same PC and on another PC running non-VM Ubuntu. The payload we are using to subscribe is presented below:
Orion VM
{
  "duration": "P1M",
  "entities": [
    {
      "type": "Thing",
      "id": "Sensor_GV_01",
      "isPattern": "false"
    }
  ],
  "throttling": "PT1S",
  "reference": "http://10.224.24.236:1028/accumulate",
  "attributes": [
    "temperature",
    "pressure"
  ]
}
EDIT 1
Upon using GET /v2/subscriptions we see that the subscription is present, but it gives only basic info, with no timesSent value. It is pretty much the same thing we get when we query MongoDB directly.
Also, I forgot to mention: the Orion version we are using is 1.9.0.
Subscription check
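To rule out the receiving side, it can help to replace the accumulator with a minimal HTTP listener and watch whether any notification arrives at all. A sketch using only the standard library (the port 1028 and the /accumulate path come from the subscription above):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class AccumulateHandler(BaseHTTPRequestHandler):
    """Prints every notification Orion POSTs to this server."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(self.path, body.decode("utf-8", errors="replace"))
        self.send_response(200)  # Orion only needs a 2xx back
        self.end_headers()

def serve(port=1028):
    HTTPServer(("", port), AccumulateHandler).serve_forever()

if __name__ == "__main__":
    serve()
```

If this listener never prints anything while entity updates flow, the notification is not reaching the host, which with nested VMs usually points to NAT/port-forwarding: 10.224.24.236:1028 must be reachable from inside the Orion VM (e.g. test with curl from that VM).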

How to test that my Ultralight 2.0 IoT Agent receives payloads

I am using the Ultralight 2.0 IoT Agent.
Although I can see that the payload my gateway sends is published to a specific topic on the Mosquitto MQTT broker, how could I test that the IoT Agent is listening on port 4061 and that it is receiving something from the broker?
I am referring to IoTAgent-UL, which is running on a CentOS 7 VM as a service. I have access to it via REST calls, and I can also see that the payloads my gateway sends reach the MQTT broker. How could I pass my payloads from Mosquitto to the IoT Agent and then on to the Context Broker?
Thanks a lot!
What you are asking for seems to be related to the basic operational workflow of the IoT Agents in general. Thus, I'd recommend you have a look at the following step-by-step guide. It is based on another agent (the one for JSON payloads instead of UL), but most of the procedure is the same, so I think it could be useful.
EDIT: The JSON format is documented here. The UL format is documented here. The payload format is independent of the transport, i.e. it is the same no matter whether you use HTTP or MQTT.
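A quick end-to-end check is to publish a measure by hand on the topic IoTAgent-UL subscribes to (by convention `/<api-key>/<device-id>/attrs`, which depends on your provisioning) and then query the entity in Orion. A small helper that builds the topic and the Ultralight 2.0 payload (the attribute names here are hypothetical; publishing itself can be done with any MQTT client, e.g. mosquitto_pub):

```python
def ul_measure(api_key, device_id, attrs):
    """Builds the MQTT topic and Ultralight 2.0 payload for one measure.

    attrs is a dict of attribute name -> value, e.g. {"t": 25, "h": 40},
    encoded as UL 'key|value' pairs joined with '|'.
    """
    topic = f"/{api_key}/{device_id}/attrs"
    payload = "|".join(f"{k}|{v}" for k, v in attrs.items())
    return topic, payload
```

For example, `mosquitto_pub -t '/myapikey/sensor01/attrs' -m 't|25|h|40'` followed by a GET on the corresponding entity in Orion: if the attribute value updated, the agent is receiving from the broker and forwarding to the Context Broker.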

checking command from orion to iot agent in Fiware

I've used the FIWARE Orion Context Broker and IoTAgent-UL in my project. I've registered a virtual device by sending a JSON message carrying the device attributes, the command attributes, the device endpoint address and the protocol used (UL2.0).
If I update the command attribute of the device entity in Orion Context Broker, how can I check that the command is sent to the IoT Agent successfully before it is forwarded to the virtual device itself?
Moreover, can I make the IP address of a Raspberry Pi the endpoint itself and assign a port to a device connected to the Raspberry Pi? And how could this be done?
Finally, in case I have no physical device, could I consider the IoT Agent's address an endpoint, to check whether any update of the command attribute in the context broker will be forwarded to that endpoint?
Thanks
There are three ways of checking that the update context/command has been sent to the agent, and from the agent to the device:
Check Orion or agent logs.
Check the MQTT broker logs, if you are using MQTT transport.
Check the device itself. If the command was received, you'll be able to see the effects of the command.
Regarding the place a Raspberry Pi may play in an architecture using IoT Agents: typically it is used to replace the agent :) I mean, if you have a device such as a Raspberry Pi, the usual scenario is to connect all your sensors and actuators to the R-Pi, as if it were a gateway, and then let the R-Pi connect directly to Orion Context Broker by implementing an NGSI client running on the R-Pi. Schematically:
Orion <---> R-Pi + NGSI client <---> sensor/actuator
Nevertheless, you can use the R-Pi as if it were a final device (sensor or actuator) in order to test IoT Agents. Regarding how to emulate the final device itself, you'll have to run certain logic on the R-Pi in order to accept the UL messages from the IoT Agent/MQTT broker. A simple netcat could help you; more complex emulation services could be run, of course. Schematically:
Orion <---> UL agent <---> R-Pi + netcat
Anyway, please note that a final device (sensor or actuator) is always required, either real or simulated (running netcat or similar on an R-Pi/server), since the UL agent must have an endpoint to which to send the UL payloads.
Orion <---> UL agent <---> R-Pi + netcat OR server + netcat OR real sensor/actuator
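The netcat role above can also be played by a few lines of Python if you want something scriptable on the R-Pi. A sketch of a one-shot TCP listener that captures whatever the agent pushes to the device endpoint (the port is an arbitrary choice; if the agent delivers the command over HTTP you will also see the HTTP headers in front of the UL payload, which is fine for eyeballing that the command arrived):

```python
import socket

def listen_once(host="0.0.0.0", port=7896):
    """Accepts one TCP connection and returns the bytes received,
    emulating a UL device endpoint the IoT Agent can push commands to."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(4096)  # one UL command fits easily
        return data

if __name__ == "__main__":
    print(listen_once().decode("utf-8", errors="replace"))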

FIWARE IoT Agent: can the IoT Agent send data to multiple context brokers?

I am using the MQTT IoT Agent to send data to my FIWARE context broker, and I am wondering if I can send data from my IoT Agent to multiple context brokers. Is that possible? If yes, how?
Thanks in advance for your help!
The MQTT IoT Agent is connected to a specific Context Broker instance depending on the service provision. If the service's Context Broker instance is not configured, then the "ngsi_urls" parameter is used.
Therefore, yes, you can deliver information to multiple Context Broker instances, but only one per defined FIWARE service.
If you want to send the information of one single service to multiple Context Broker instances, I think you may send it to one and then federate the other instances. To learn about Context Broker federation, you should check the Context Broker documentation.
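A sketch of what such a federation subscription could look like in NGSIv2, created on the first broker so it forwards changes to the second (the entity pattern and hostname are placeholders; Orion accepts federated notifications on its /v2/op/notify endpoint):

```json
{
  "description": "Federate all entity changes to a second broker",
  "subject": {
    "entities": [ { "idPattern": ".*" } ]
  },
  "notification": {
    "http": { "url": "http://second-broker:1026/v2/op/notify" }
  }
}
```

POSTing this to the first broker's /v2/subscriptions makes every matching change show up in the second broker as well.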
Thanks for using IDAS, and sorry for such a delayed response (we have been slower regarding support due to an internal migration process).

Synchronization and time keeping of multiple applications

How would I implement a system that will keep 20 applications running on a closed network to stay synchronized whilst performing various tasks?
Each application will be identical, on an identical machine. These machines will have a socket connection to the master application that will issue TCP commands to the units such as Play:"Video1.mp4". It is vital that these videos are played at the same time and keep time with each other.
The only difference between each unit is that the window will be offset on the desktop, so that each one has a different view port on the application - as this will be used in a multi-projector set up.
Any solutions/ideas would be greatly appreciated.
I did something similar some years ago: 5 computers running 5 instances of the same Flash app. Every app was displaying a "slice" of the same huge app, and everything needed to be synchronized to fraction-of-a-second precision.
I used a simple Python script (running on a 6th machine) that was sending OSC messages on the local network. The Flash apps were listening to these packets through FLOSC, and were sending messages about their status back to the Python script.
The setup ran at the Whitney Museum (NY) and at the Palais de Tokyo (Paris), so I'm quite confident about the solution :) I hope it helps you.
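The master-broadcasts-commands pattern above can be sketched with plain UDP (OSC itself is a thin layer over UDP); the port and message format here are arbitrary choices:

```python
import socket

SYNC_PORT = 9000  # arbitrary choice; all clients listen on it

def broadcast_command(command, port=SYNC_PORT):
    """Master side: sends one command string to every host on the subnet."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(command.encode("utf-8"), ("255.255.255.255", port))

def wait_for_command(port=SYNC_PORT, timeout=None):
    """Client side: blocks until a command arrives; each of the 20 machines
    runs this in a loop and reacts (e.g. starts the video) on receipt."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.settimeout(timeout)
        s.bind(("", port))
        data, _addr = s.recvfrom(1024)
        return data.decode("utf-8")
```

On a closed LAN a single broadcast reaches all machines within a few milliseconds; for tighter lockstep you would also sync the clocks (e.g. NTP) and broadcast "play Video1.mp4 at time T" rather than "play now", so each client compensates for its own receive latency.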
You have to keep track of the latest updated data in your master application and broadcast newly updated data to all connected clients: after any update from any client, you send the updated data to every connected client.
In FMS, a remote shared object is used to maintain data centrally across the applications connected via FMS. When any client sends an update, an onSync event is fired in every client application and the data is synchronized with the FMS remote shared object. You would have to develop this kind of flow for proper synchronization of data across the network.
You can also use an RPC system to sync data between all applications connected to the master application: the client makes an RPC to the master application to send the data update, and the master application then makes an RPC to every other client connected to it.