Receiving strange values in Kafka topic from Kura (JSON)

We are trying to send JSON data from Kura to a Kafka topic through the MQTT proxy, but we are receiving strange values in the Kafka topic.

The MQTT proxy only sends binary data (bytes or strings), and Control Center can only display strings.
You will want to verify the Kafka producer settings to confirm it is truly sending JSON, and not some other binary format or compressed/encrypted bytes.
You can debug further with kafka-console-consumer rather than any UI tool, for example as shown below.
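A minimal sketch (the broker address and topic name below are placeholders for your own values):
# consume the topic from the beginning and print the raw record values
kafka-console-consumer --bootstrap-server localhost:9092 --topic kura-data --from-beginning
If the output is readable JSON here but not in the UI, the problem is only in the display; if it is still garbage, the producer is not sending plain JSON.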

Related

Microsoft Azure IoT Hub: The measured sensor values sent via device to IoT Hub cannot be read from the stored JSON data

We are using a data acquisition system as a device and send some signal values via the MQTT protocol into a container which is assigned to an IoT Hub. The connection between the device and the IoT Hub works well, and we receive JSON data. When we open the JSON data, we cannot read the temperature values in the "Body" field, since they are encoded. We would be thankful if you could tell us how to automatically convert the JSON data to a proper format so that we can read the values as numbers.
Please find below three lines of our JSON data. The remaining lines are the same, except each "Body" is encoded differently.
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8600000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8600000Z"},"Body":"My42MjI3NTQ="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.8750000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.8750000Z"},"Body":"My42ODEyNDY="}
{"EnqueuedTimeUtc":"2022-02-09T10:00:30.9070000Z","Properties":{"Sensor":""},"SystemProperties":{"connectionDeviceId":"Iba","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637799949903534194","enqueuedTime":"2022-02-09T10:00:30.9070000Z"},"Body":"My43Mzk1OTI="}
Thanks in advance!
Br
Masoud
You should add two parameters to the message topic, the content-type (ct) and the content-encoding (ce), as shown in the following example:
devices/device1/messages/events/$.ct=application%2Fjson&$.ce=utf-8
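For the values already stored, the "Body" field is Base64-encoded text, so you can decode it directly on Unix (the sample value below is taken from the first record above):
# decode the Base64-encoded Body of the first record
echo 'My42MjI3NTQ=' | base64 --decode
# prints: 3.622754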

How to read data from a Kinesis stream using the AWS CLI?

I have a Kinesis stream in AWS and can send JSON data to it using the kinesis command, and I can get it back from the stream with:
SHARD_ITERATOR=$(aws kinesis get-shard-iterator --shard-id shardId-000000000000 --shard-iterator-type TRIM_HORIZON --stream-name mystream --query 'ShardIterator' --profile myprofile)
aws kinesis get-records --shard-iterator $SHARD_ITERATOR --profile myprofile
The output looks something like this:
HsKCQkidmlkZW9Tb3VyY2UiOiBbCgkJCXsKCQkJCSJicmFuZGluZyI6IHt9LAoJCQkJInByb21vUG9vbCI6IFtdLAoJCQkJImlkIjogbnVsbAoJCQl9CgkJXSwKCQkiaW1hZ2VTb3VyY2UiOiB7fSwKCQkibWV0YWRhdGFBcHByb3ZlZCI6IHRydWUsCgkJImR1ZURhdGUiOiAxNTgzMzEyNTA0ODAzLAoJCSJwcm9maWxlIjogewoJCQkiY29tcG9uZW50Q291bnQiOiAwLAoJCQkibmFtZSI6ICJTUUVfQVRfUFJPRklMRSIsCgkJCSJpZCI6ICJTUUVfQVRfUFJPRklMRV9JRCIsCgkJCSJwYWNrYWdlQ291bnQiOiAwLAoJCQkicGFja2FnZXMiOiBbCgkJCQl7CgkJCQkJIm5hbWUiOiAiUEVBQ09DSy1MVEEiLAoJCQkJCSJpZCI6ICJmZDk5NTRmZC03NDYwLTRjZjItOTU5Ni05YzBhMjcxNTViODgiCgkJCQl9CgkJCV0KCQl9LAoJCSJ3b3JrT3JkZXJJZCI6ICJTUUVfQVRfSk9CX1NVQk1JU1
How do I get the actual JSON message in raw format (so it looks like JSON), the same way it was originally when I sent it?
Thanks
As per the docs, you need to use a Base64 decoding tool or the KCL library to get the data back in the format it was sent:
The first thing you'll likely notice about your record in this part of the tutorial is that the data appears to be garbage; it's not the clear text testdata we sent. This is due to the way put-record uses Base64 encoding to allow you to send binary data. However, the Kinesis Data Streams support in the AWS CLI does not provide Base64 decoding because Base64 decoding to raw binary content printed to stdout can lead to undesired behavior and potential security issues on certain platforms and terminals. If you use a Base64 decoder (for example, https://www.base64decode.org/) to manually decode dGVzdGRhdGE= you will see that it is, in fact, testdata. This is sufficient for the sake of this tutorial because, in practice, the AWS CLI is rarely used to consume data, but more often to monitor the state of the stream and obtain information, as shown previously (describe-stream and list-streams). Future tutorials will show you how to build production-quality consumer applications using the Kinesis Client Library (KCL), where Base64 is taken care of for you. For more information about the KCL, see Developing KCL 1.x Consumers.
On Unix, you can use the base64 --decode command to decode the Base64-encoded Kinesis record data.
For example, to decode the data of the first record:
# define the name of the stream you want to read
KINESIS_STREAM_NAME='__your_stream_name_goes_here__';
# define the shard iterator to use (--output text strips the JSON quoting from the iterator value)
SHARD_ITERATOR=$(aws kinesis get-shard-iterator --shard-id shardId-000000000000 --shard-iterator-type TRIM_HORIZON --stream-name $KINESIS_STREAM_NAME --query 'ShardIterator' --output text);
# read the records, use `jq` to grab the data of the first record, and base64 decode it
aws kinesis get-records --shard-iterator $SHARD_ITERATOR | jq -r '.Records[0].Data' | base64 --decode

RawRabbit - How to publish/subscribe to JSON messages

I'm trying to do a few quick prototypes of using RabbitMQ as a message broker for internal services, as well as for messages from external clients received by a gateway over a websocket connection.
I decided it would be the best (and probably only) option for the client to publish messages as JSON, and for the gateway to then simply forward the unaltered JSON messages.
I've seen that RawRabbit has the ability to take raw JSON as a message and then deserialize it to a C# class.
What I can't find is an example and/or documentation of how that process should look. I also cannot find documentation of how the JSON message should be formatted.

Kafka to database data pipelining

I have data coming into Kafka, and I want to push it from Kafka to a database (PostgreSQL).
I am following the steps in this link "https://hellokoding.com/kafka-connect-sinks-data-to-postgres-example-with-avro-schema-registry-and-python/" but I am getting an error. Any suggestions?
No code is required; this can be achieved using Kafka Connect and the JDBC connector available from Confluent:
https://www.confluent.io/hub/confluentinc/kafka-connect-jdbc
The error you're getting is this:
Failed to deserialize the data for topic
Error deserializing Avro message
Unknown magic byte
This is because you have specified the Avro converter, but the data in your topic is not Avro.
See this article for details: https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained
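As a starting point, here is a minimal sketch of registering the JDBC sink via the Kafka Connect REST API. The connector name, connection details, topic, and table settings below are placeholder assumptions, and the value.converter must match how the data in your topic is actually serialized (JsonConverter for JSON with embedded schemas; AvroConverter only if the topic really holds Avro):
# register a JDBC sink connector over the Connect REST API (placeholder values throughout)
curl -X POST http://localhost:8083/connectors -H 'Content-Type: application/json' -d '{
  "name": "postgres-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "topics": "my-topic",
    "auto.create": "true",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}'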

Cumulocity JSON via MQTT / measurement and error topic

I'm trying to send measurement values to my Cumulocity tenant via JSON MQTT.
For my test connection I use MQTTBox.
I have already successfully sent data and created new devices via SmartREST MQTT:
100,MQTT-Simulation-1,c8y_MQTTdevice
211,80
I have now tried to change my format to JSON MQTT.
According to the Cumulocity IoT Guide (http://www.cumulocity.com/guides/mqtt/json/) I have defined my MQTT topic endpoint.
<api>/<resource>/<action>/<id>
measurement/measurements/create/231
Topic to publish: measurement/measurements/create/231
Payload Type: String / JSON / XML / Characters
Payload: {"c8y_Temperature": {"T": {"unit": "°C","value": 35.742410434759904}}}
Can the SmartREST payload be used here, as described in the guide?
The examples in the guide look different from what is described; no ID appears in the topic to publish to.
A subscription to the topic error is also not possible, only to the SmartREST error topic s/e.
According to Cumulocity support, JSON via MQTT will only become possible for my main tenant in the next release.
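Once JSON via MQTT is enabled on the tenant, a publish should look roughly like this (a minimal sketch using mosquitto_pub; the tenant host, credentials, and client ID are placeholders, and the device ID is omitted from the topic because the publishing client is the device itself):
# hypothetical publish once JSON via MQTT is enabled; host, credentials, and client ID are placeholders
mosquitto_pub -h <tenant>.cumulocity.com -p 1883 -u '<tenant>/<username>' -P '<password>' -i '<deviceClientId>' -t 'measurement/measurements/create' -m '{"type":"c8y_TemperatureMeasurement","time":"2022-02-09T10:00:30.000Z","c8y_Temperature":{"T":{"unit":"°C","value":35.74}}}'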