Read JSON data values

For example, if the data in the Kafka topic looks like this:
{
  "header": {
    "name": "jake"
  },
  "body": {
    "Data": "!#$%&&"
  }
}
How do I read the value "!#$%&&" from my consumer application? I need to process the data once I have it.

You'll need to consume the data using a String Serde, a JSON Serde, or one you define yourself.
If you define your own, then you'd call value.getBody().getData(), like any other Java object, where value is the argument to mapValues, peek, filter, etc. in the Kafka Streams DSL.
For the others, the answer depends on which JSON library you're using, but that isn't unique to Kafka, so read that library's documentation on parsing strings.
Here's one example of consuming with a String Serde: https://github.com/confluentinc/kafka-streams-examples/blob/7.1.1-post/src/main/java/io/confluent/examples/streams/JsonToAvroExample.java#L118
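For instance, with a String Serde and Jackson (just one possible JSON library), pulling out that field might look roughly like this. This is a minimal sketch, not the linked example: the topic name "input-topic" is a placeholder, and the field names follow the JSON above.

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;

StreamsBuilder builder = new StreamsBuilder();
ObjectMapper mapper = new ObjectMapper();

builder.stream("input-topic", Consumed.with(Serdes.String(), Serdes.String()))
        .mapValues(value -> {
            try {
                // path() returns a missing node rather than throwing if a key is absent
                return mapper.readTree(value).path("body").path("Data").asText();
            } catch (Exception e) {
                throw new RuntimeException("Record is not valid JSON", e);
            }
        })
        .foreach((key, data) -> System.out.println(data)); // prints "!#$%&&"

From there, mapValues hands you the extracted string, and any further processing is ordinary Java.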

Related

How do you efficiently handle highly dynamic JSON structures in Swift?

I am writing a Swift program to interact with our backend database. These database tables can change format, and each row comes from DynamoDB and is represented by a JSON object. The program grabs a cache from the server containing the database structures, and then the current contents of the database. I don't want to hardcode the database structures into the app, as that would obviously make it very hard to maintain. What is the best way to read these rows into a generic dictionary object with dynamic names that can hold any of the JSON types?
One way to handle parsing dynamic JSON is to have a key, or a set of keys, that indicates which type of model to decode. For example, you can have a key type in your JSON objects:
{
  ...
  "type": "some type"
  ...
}
and an intermediate model:
struct IntermediateModel: Codable {
    let type: String // or enum
}
You first decode all your JSON objects to the intermediate model, then decode them to the actual model type based on the value of type.
I have written some helper protocols that make this easier here.

How to convert JSON to GraphQL mutation format [duplicate]

I have a JSON object and I want to convert it to a GraphQL mutation query in order to invoke a GraphQL API.
My requirement is to create a Spring Boot project and expose a POST API that takes JSON as input, converts it into GraphQL mutation query format, and then consumes a GraphQL API with that mutation query. Can anyone please provide a pointer, as I am not able to proceed?
Pass your object as a variable. You can put it in JSON format in the GraphQL variables definition.
If you look at the official tutorial on mutations, in the query, they pass the ReviewInput as a variable, $review:
mutation CreateReviewForEpisode($ep: Episode!, $review: ReviewInput!) {
  createReview(episode: $ep, review: $review) {
    stars
    commentary
  }
}
Since the ReviewInput is, at least in my use case, a standard object from the model, used for both REST and GraphQL, the JSON can be pasted directly into the corresponding review field in the GraphQL variables:
{
  "ep": "JEDI",
  "review": {
    "stars": 5,
    "commentary": "This is a great movie!"
  }
}
{ "stars": 5, "commentary": "This is a great movie!" } being your JSON.
So, in short, no conversion is required; you just need your API to put the JSON into the correct mutation template.
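In a Spring Boot service, that boils down to wrapping the incoming JSON in a standard GraphQL request body, a "query" string plus a "variables" map, and POSTing it. A rough sketch (the endpoint URL is a placeholder, RestTemplate is just one client option, and JSON serialization assumes Jackson on the classpath, as in spring-boot-starter-web):

import java.util.Map;
import org.springframework.web.client.RestTemplate;

Map<String, Object> request = Map.of(
    "query", "mutation CreateReviewForEpisode($ep: Episode!, $review: ReviewInput!) { "
           + "createReview(episode: $ep, review: $review) { stars commentary } }",
    "variables", Map.of(
        "ep", "JEDI",
        "review", Map.of("stars", 5, "commentary", "This is a great movie!")));

// The JSON from your POST API goes into "variables" as-is -- no conversion step
String response = new RestTemplate()
    .postForObject("https://example.com/graphql", request, String.class);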

Parsing JSON from an incoming datastream to perform simple transformations in Flink

I am using Apache Flink with AWS Kinesis for the first time. Basically, my objective is to transform incoming data from a Kinesis stream so that I can perform simple transformations such as filtering and aggregating.
I am adding the source using the code below:
return env.addSource(new FlinkKinesisConsumer<>(inputStreamName, new SimpleStringSchema(), inputProperties));
Eventually, when I print the incoming stream, I get JSON data in as expected:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> input = createSourceFromStaticConfig(env);
input.print();
This is a sample result of the print:
{"event_num": "5530", "timestmap": "2019-03-04 14:29:44.882376",
"amount": "80.4", "type": "Purchase"} {"event_num": "5531", "timestmap": "2019-03-04
14:29:44.881379", "amount": "11.98", "type": "Service"}
Can someone enlighten me as to how I can access these JSON elements so that I can perform simple transformations, such as selecting only records with "Service" as the type?
As you are using SimpleStringSchema, the resulting stream of events is of type String. Therefore, you need to parse the string first, and then you can apply filters, etc.
You may also want to have a look at JsonNodeDeserializationSchema, which produces an ObjectNode directly.
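For the "Service" filter specifically, parsing each string with Jackson and then filtering could look something like this sketch (field names taken from the sample records above; a RichFilterFunction is used so the ObjectMapper is created once per task rather than per record):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.api.common.functions.RichFilterFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;

DataStream<String> services = input.filter(new RichFilterFunction<String>() {
    private transient ObjectMapper mapper;

    @Override
    public void open(Configuration parameters) {
        mapper = new ObjectMapper();
    }

    @Override
    public boolean filter(String value) throws Exception {
        // Keep only records whose "type" field equals "Service"
        return "Service".equals(mapper.readTree(value).path("type").asText());
    }
});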

Is Apache Ignite suited for NoSQL schema

Is JSON within a JSON supported within Apache Ignite?
Example:
{
  "stuff": {
    "onetype": [
      {"id": 1, "name": "John Doe"},
      {"id": 2, "name": "Don Joeh"}
    ],
    "othertype": {"id": 2, "company": "ACME"}
  },
  "otherstuff": {
    "thing": [[1, 42], [2, 2]]
  }
}
The goal is to be able to query based on any field in the JSON. So far with Apache Ignite I have seen that, by creating a class and storing objects of it, it is possible to add indexes and query the JSON on the first level of key/value pairs, but I did not see any example for nested JSON.
Would it be better to use MongoDB or Cassandra for that kind of need (to index and query any nested field within a JSON)?
JSON is treated as a regular string when it's put into a cache.
When the JSON has only a single level, it's possible to represent it as either a POJO or a BinaryObject, put it into a cache, and benefit from all the querying capabilities; but nested objects cannot be indexed and queried properly so far.
As an option, you could use ScanQueries, as sketched below.
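A ScanQuery runs a predicate over every entry, so it can match on nested fields in the raw JSON string, at the cost of a full scan rather than an index lookup. A rough sketch (the cache name "jsonCache" and the JSON-stored-as-string layout are assumptions):

import java.util.List;
import javax.cache.Cache;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.ScanQuery;

Ignite ignite = Ignition.start();
IgniteCache<Integer, String> cache = ignite.getOrCreateCache("jsonCache");

// The predicate sees every entry; here it matches on a nested field in the raw JSON.
// For anything beyond substring checks, parse the value with a JSON library instead.
List<Cache.Entry<Integer, String>> acme = cache.query(
        new ScanQuery<Integer, String>((key, json) -> json.contains("\"company\":\"ACME\"")))
        .getAll();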

Avro Schema: force to interpret value (map, array) as string

I want to convert JSON to Avro via NiFi. Unfortunately, the JSON has complex types as values that I want to treat as simple strings!
JSON:
"FLAGS" : {"FLAG" : ["STORED","ACTIVE"]}
How can I tell Avro to simply store "{"FLAG" : ["STORED","ACTIVE"]}" or "[1,2,3,"X"]" as a string?
Thank you sincerely!
The JSON-to-Avro conversion performed by NiFi's ConvertJSONToAvro processor does not really do transformation in the same step. There is a very limited ability to transform based on the Avro schema, mostly omitting input data fields in the output, but it won't coerce a complex structure to a string.
Instead, you should do a JSON-to-JSON transformation first, then convert your summarized JSON to Avro. I think what you are looking for is a structure like this:
{
  "FLAGS": "{\"FLAG\":[\"STORED\",\"ACTIVE\"]}"
}
NiFi's JoltTransformJSON and ExecuteScript processors are great for this. If your records are simple enough, maybe even a combination of EvaluateJsonPath ($.FLAGS) and ReplaceText ({ "FLAGS": "${flags:escapeJson()}" }) would do.
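Whichever processor you choose, the transformation itself just re-serializes the nested value as an escaped string. Outside NiFi, the equivalent step with Jackson would be roughly this (a standalone sketch, assuming the record shape from the question):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class FlattenFlags {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        ObjectNode root = (ObjectNode) mapper.readTree(
                "{\"FLAGS\": {\"FLAG\": [\"STORED\", \"ACTIVE\"]}}");

        // Replace the nested object with its own serialization, turning it into a plain string
        root.put("FLAGS", mapper.writeValueAsString(root.get("FLAGS")));

        System.out.println(root); // {"FLAGS":"{\"FLAG\":[\"STORED\",\"ACTIVE\"]}"}
    }
}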