Data is not transferring to ThingsBoard from Node-RED via MQTT - JSON

I'm trying to send data to ThingsBoard via MQTT. The MQTT node shows a connected status, but the data is not transferring to ThingsBoard. Even though the data is in JSON format, it is not being received in ThingsBoard. Any help would be appreciated.
Sample data being sent:
{ "main-door": "closed", "main-light": "OFF"}

You need to follow the ThingsBoard rules for the payload format: https://thingsboard.io/docs/getting-started-guides/helloworld/#pushing-data-using-mqtt-coap-or-http
You also did not say which topic you are publishing to.
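A common cause is publishing to the wrong topic or not authenticating with the device access token. A minimal sketch of what the ThingsBoard docs describe, assuming the demo host and a placeholder token (the actual publish via paho-mqtt is shown in comments only):

```python
import json

# ThingsBoard accepts device telemetry on a fixed topic, with the device's
# access token used as the MQTT username (no password). Host and token
# below are placeholders.
TOPIC = "v1/devices/me/telemetry"
HOST = "demo.thingsboard.io"
ACCESS_TOKEN = "YOUR_DEVICE_ACCESS_TOKEN"

payload = json.dumps({"main-door": "closed", "main-light": "OFF"})

# With paho-mqtt (not executed here), the publish would look like:
#   client = mqtt.Client()
#   client.username_pw_set(ACCESS_TOKEN)
#   client.connect(HOST, 1883)
#   client.publish(TOPIC, payload)
```

If the publish lands on any other topic, ThingsBoard silently ignores it, which matches the "connected but no data" symptom.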

Related

Receiving strange values in Kafka topic from Kura

Trying to send data to a Kafka topic from Kura through the MQTT proxy, but receiving strange values in the Kafka topic.
We are trying to send JSON data to the Kafka topic.
The MQTT proxy only sends binary data (bytes or strings), and Control Center can only show strings.
You will want to verify the Kafka producer settings to see whether it is truly sending JSON, and not some other binary format or compressed/encrypted bytes.
You can debug further with kafka-console-consumer rather than using any UI tool.
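As a quick sanity check outside any UI, you can take the raw record bytes (from kafka-console-consumer output or a consumer library) and test whether they actually decode as JSON. A small sketch with hypothetical sample bytes:

```python
import json

def classify_payload(raw: bytes) -> str:
    """Best-effort check of what a Kafka record's bytes contain,
    to distinguish real JSON from other binary/compressed payloads."""
    try:
        text = raw.decode("utf-8")
    except UnicodeDecodeError:
        return "binary (not UTF-8 text)"
    try:
        json.loads(text)
        return "json"
    except json.JSONDecodeError:
        return "text (not JSON)"

classify_payload(b'{"temp": 21.5}')  # -> "json"
classify_payload(b"\x1f\x8b\x08")    # -> "binary (not UTF-8 text)" (gzip magic bytes)
```

If the classifier reports binary, the "strange values" are most likely a serialized or compressed format chosen by the producer, not corrupted JSON.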

Microsoft Azure IoT Hub: the measured sensor values sent via device to IoT Hub cannot be read from the stored JSON data

We are using a data acquisition system as a device and send some signal values via the MQTT protocol into a container which is assigned to an IoT Hub. The connection between the device and the IoT Hub works well, and we receive some JSON data. When we open the JSON data, we cannot read the temperature values in "Body" inside the JSON data, since they are encoded. I would be thankful if you could tell us how we should automatically convert the JSON data to a proper format so that we can read the values as numbers.
Please find below three lines of our JSON data. The rest of the lines are the same, but they are encoded differently.
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8600000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8600000Z"},"Body":"My42MjI3NTQ="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8750000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8750000Z"},"Body":"My42ODEyNDY="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.9070000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.9070000Z"},"Body":"My43Mzk1OTI="}
Thanks in advance!
Br
Masoud
You should add two parameters to the message topic, the content type (ct) and the content encoding (ce), as shown in the following example:
devices/device1/messages/events/$.ct=application%2Fjson&$.ce=utf-8
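Until those properties are set, the "Body" values in the stored messages are base64-encoded text; you can recover the numbers from the samples above by decoding them, e.g. in Python:

```python
import base64

# "Body" fields from the first two sample messages above
for body in ("My42MjI3NTQ=", "My42ODEyNDY="):
    value = float(base64.b64decode(body).decode("utf-8"))
    print(value)  # 3.622754, then 3.681246
```

With ct=application%2Fjson and ce=utf-8 set on the topic, IoT Hub stores the body as readable JSON and this decoding step is no longer needed.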

Configuring a linked service between zoho (CRM) and Azure Data Factory

I am trying to configure a linked service in Azure Data Factory (ADF) in order to load the ZOHO data to my SQL database. I am using a REST API linked service for this which successfully connects to ZOHO. I am mainly struggling to select the proper relative URL. Currently I use the following settings:
Base URL: https://www.zohoapis.eu/crm/v2/
Relative URL: /orgxxxxxxxxxx
When I try to preview the data, this results in the following error:
Error occurred when deserializing source JSON file ''. Check if the data is in valid JSON object format.
Unexpected character encountered while parsing value: <. Path '', line 0, position 0.
Activity ID: 8d32386a-eee0-4d2a-920a-5c70dc15ef06
Does anyone know what I have to do to make this work such that I can load all required Zoho tables into ADF?
As per the official Microsoft documentation:
The Zoho connector is supported for the following activities:
1. Copy activity with supported source/sink matrix
2. Lookup activity
The REST API linked service doesn't support Zoho.
You are expecting a JSON file as the source, but your sink is Azure SQL Database; you need to change your approach.
You can directly use the Zoho linked service in ADF to fetch the data from Zoho (CRM).
You can copy data from Zoho to any supported sink data store. This
connector supports Zoho access token authentication and OAuth 2.0
authentication.
You need to use a Copy activity to fetch the data from Zoho (CRM) and store it in an Azure Storage account. Once the data is received, you need to denormalize the JSON file to store it in Azure SQL Database. Use a Data Flow activity to denormalize the data with the flatten transformation.
Use the flatten transformation in mapping data flows to take array values inside
hierarchical structures such as JSON and unroll them into individual
rows. This process is known as denormalization.
Learn more here about the flatten transformation in mapping data flows.
Once flattening is done, you can store the data in Azure SQL Database.
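For intuition, the flatten transformation unrolls each element of an array inside a hierarchical record into its own flat row. A minimal Python sketch of the same idea (the record shape and field names "org"/"contacts" are hypothetical examples, not Zoho's actual schema):

```python
def flatten(record: dict, array_field: str) -> list:
    """Unroll the array under `array_field` into one flat row per element,
    repeating the record's other (scalar) fields on every row."""
    base = {k: v for k, v in record.items() if k != array_field}
    return [{**base, **item} for item in record[array_field]]

record = {"org": "acme", "contacts": [{"name": "a"}, {"name": "b"}]}
rows = flatten(record, "contacts")
# rows == [{"org": "acme", "name": "a"}, {"org": "acme", "name": "b"}]
```

Each resulting row is tabular, which is what lets the data land in an Azure SQL Database table.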

Retrieving forecast data from OpenWeatherMap in FIWARE ORION

I am trying to get weather forecast data from OpenWeatherMap and integrate it into Orion by performing a registration request.
I was able to register and get the API key from OpenWeatherMap; however, it returns a JSON file with all the data inside, which is not supported by Orion.
I have followed the step-by-step tutorial https://fiware-tutorials.readthedocs.io/en/latest/context-providers/index.html#context-provider-ngsi-proxy where the data is acquired from OpenWeatherMap using an NGSI proxy, with the API key set as an environment variable in the docker-compose file. However, the data acquired is the "current" data, not the forecast, and it is also specific to Berlin.
I have tried to access the files inside the container "fiware/tutorials.context-provider" and modify the parameters to match my needs, but I feel like I am taking a long, blocked path.
I don't think that's even considered good practice, but I have run out of ideas :(
Can anyone suggest how I could bring the forecast data to Orion and register it as a context provider?
Thank you in advance.
I imagine you aim to implement a context provider able to speak NGSI with Orion.
OpenWeatherMap surely doesn't implement NGSI ...
If you have the data from OpenWeatherMap as a JSON string, perhaps you should parse the JSON and create your entities using selected key-values from the parsed OpenWeatherMap response? Save the entity (or entities) locally and then register those keys in Orion.
Alternatively (easier, but I wouldn't recommend it), create local entities with the entire OpenWeatherMap data as the value of an attribute of the entity:
{
  "id": "id-from-OpenWeatherMap",
  "type": "OpenWeatherMap",
  "weatherData": {
    "value": ...
  },
  ...
}
Then you register id/weatherData in Orion.
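To illustrate the first approach, here is a small Python sketch that picks a few key-values out of an OpenWeatherMap-style response (sample inlined; the entity type and attribute names are illustrative choices, not mandated by Orion) and shapes them as an NGSI-v2 entity:

```python
# Sample shaped like an OpenWeatherMap response (inlined, no API call)
owm = {"name": "Berlin", "main": {"temp": 280.3, "humidity": 87}}

# Build an NGSI-v2 entity from selected key-values of the parsed response.
# The id scheme, type, and attribute names here are illustrative.
entity = {
    "id": f"urn:ngsi-ld:Weather:{owm['name']}",
    "type": "Weather",
    "temperature": {"type": "Number", "value": owm["main"]["temp"]},
    "humidity": {"type": "Number", "value": owm["main"]["humidity"]},
}
```

The forecast endpoint returns a list of such readings, so the same mapping would run once per forecast entry before registering the attributes in Orion.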

Cumulocity JSON via MQTT / measurement and error topic

I'm trying to send measurement values to my Cumulocity tenant via JSON MQTT.
For my test connection I use MQTTBox.
I have already successfully sent data / created new devices via MQTT:
100,MQTT-Simulation-1,c8y_MQTTdevice
211,80
I have now tried to change my format to JSON MQTT.
According to the Cumulocity IoT Guide (http://www.cumulocity.com/guides/mqtt/json/) I have defined my MQTT topic endpoint.
<api>/<resource>/<action>/<id>
measurement/measurements/create/231
Topic to publish: measurement/measurements/create/231
Payload Type: String / JSON / XML / Characters
Payload: {"c8y_Temperature": {"T": {"unit": "°C","value": 35.742410434759904}}}
Can the SmartREST payload still be used, as described in the guide?
The examples in the guide look different from what is described; no ID can be found there in the topic to publish.
A subscription to the topic error is also not possible, only to the SmartREST error topic s/e.
According to Cumulocity support, JSON via MQTT will only be possible in the next release for my main tenant.
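For reference once the feature is enabled, the measurement from the question can be sketched like this (topic per the linked guide, without the trailing ID; the "type" and "time" fields are assumptions based on the usual measurement shape, and the MQTT connection itself is omitted):

```python
import json

# Publish target per the guide: <api>/<resource>/<action>, no ID needed
# when creating a measurement for the authenticated device.
TOPIC = "measurement/measurements/create"

payload = json.dumps({
    "type": "c8y_TemperatureMeasurement",   # assumed measurement type
    "time": "2023-01-01T12:00:00.000Z",     # placeholder timestamp
    "c8y_Temperature": {"T": {"unit": "°C", "value": 35.74}},
})
```

In MQTTBox this payload string would go in the message body with the topic above, in place of the SmartREST lines (100,... / 211,...) used earlier.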