Convert JSON to Avro in JMeter

I need to convert a JSON message to Avro so that I can send it to Kafka.
I have the sample JSON request and also the schema from an avsc file for Avro.
Any ideas how I can do this please?
Thanks

You will need:
To download the Avro Java libraries and put them (along with their dependencies) into the JMeter Classpath
To restart JMeter to pick up the libraries
To add a suitable JSR223 Test Element with the relevant Groovy code to perform the conversion; a piece of example code is below:
String schemaJson = 'your schema here'
String genericRecordStr = 'your json payload here'

// parse the Avro schema from the .avsc contents
def schemaParser = new org.apache.avro.Schema.Parser()
def schema = schemaParser.parse(schemaJson)

// decode the JSON payload into a GenericRecord using the schema
def decoderFactory = new org.apache.avro.io.DecoderFactory()
def decoder = decoderFactory.jsonDecoder(schema, genericRecordStr)
def reader = new org.apache.avro.generic.GenericDatumReader(schema)
def record = reader.read(null, decoder)
To send the message to Kafka: JMeter doesn't support Kafka out of the box, but a couple of options are described in the Apache Kafka - How to Load Test with JMeter article
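If you then need the GenericRecord as raw Avro binary bytes (for example to hand to a Kafka producer), a minimal sketch continuing from the snippet above:
// serialize the GenericRecord to Avro binary
def out = new ByteArrayOutputStream()
def writer = new org.apache.avro.generic.GenericDatumWriter(schema)
def encoder = org.apache.avro.io.EncoderFactory.get().binaryEncoder(out, null)
writer.write(record, encoder)
encoder.flush()
byte[] avroBytes = out.toByteArray()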

Assuming by "Avro", you are using the Schema Registry. If you're able to install the Kafka REST Proxy, you could send JSON events to it, then the proxy can be configured to integrate with the Registry to send Avro objects to the brokers
Or you can write a custom sampler, as linked in the other answer, but you'd swap the StringSerializer for an Avro Serializer implementation (you'd need to add that to the Jmeter classpath)
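A minimal sketch of the producer configuration such a custom sampler might use, assuming Confluent's KafkaAvroSerializer and its dependencies are on the classpath; the broker address, Registry URL and topic name are placeholders:
def props = new Properties()
props.put("bootstrap.servers", "localhost:9092") // placeholder broker
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer") // instead of StringSerializer
props.put("schema.registry.url", "http://localhost:8081") // placeholder Registry URL
def producer = new org.apache.kafka.clients.producer.KafkaProducer(props)
producer.send(new org.apache.kafka.clients.producer.ProducerRecord("your-topic", record)) // record is an Avro GenericRecord
producer.close()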

Related

How to create a Kafka topic from a JSON file and then extract a subset of the topic and output to another JSON file

I am new to the Kafka tool.
Here are the steps I would like to perform in Kafka:
Connect Kafka and input data from a JSON file (I am familiar with this part)
Publish to a Kafka topic
Extract subset of data from the topic and publish (create) to a new topic
Extract data from that new topic and output to a JSON file
Note: my coding preference is Python.
If you want to use Python, you can use Faust to map data between topics.
By default it uses JSON serialization, and if you want to write the result to a new file, you consume the output topic through a stream.

How do I add an assertion for a JSON request in data-driven testing? I am using the SoapUI free version

I am doing data-driven testing for a JSON request using the SoapUI free version. I have done the parameterization using Excel as the data source. Now I want to validate the response for each input value. Can anyone help me on how I can proceed with this?
You have to use Groovy scripting for that.
For example:
def Request = """{
"qoteRef": "H0005T",
"lineusiness": "4",
"scief": "A02",
"pode": "CH7ND",
"do": "1977-06-12",
"soeOfB": "hello"
}"""
The code below reads one element of the JSON, after which you can validate it:
def json = new groovy.json.JsonSlurper().parseText(Request)
def quoteRef = json.qoteRef
Using JsonSlurper inside a Groovy Script test step, you can proceed with your validations.
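If you want to turn that into an actual pass/fail check against your Excel data, a minimal sketch of a Script Assertion is below; the expectedQuoteRef property name is an assumption about how you load the Excel values into test case properties:
// Script Assertion on the request's test step (names are placeholders, adjust to your project)
def response = new groovy.json.JsonSlurper().parseText(messageExchange.response.contentAsString)
def expected = context.expand('${#TestCase#expectedQuoteRef}') // assumed to be filled from your Excel data
assert response.qoteRef == expected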

How to write the output from a DB to a JSON file using Spring Batch?

I am new to Spring Batch, and there is a requirement for me to read data from a DB and write it out in JSON format. What's the best way to do this? Are there any APIs? Or do we need to write a custom writer? Or do I need to use JSON libraries such as GSON or Jackson? Please guide me...
To read data from a relational database, you can use one of the database readers. You can find an example in the spring-batch-samples repository.
To write JSON data, Spring Batch 4.1.0.RC1 provides the JsonFileItemWriter that allows you to write JSON data to a file. It collaborates with a JsonObjectMarshaller to marshal an object to JSON format. Spring Batch provides support for both Gson and Jackson libraries (you need to have the one you want to use in the classpath). You can find more details here.
Hope this helps.
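For illustration, a minimal sketch of configuring the JsonFileItemWriter through its builder; the Customer item type, writer name and output path are assumptions:
// a Jackson-backed JSON writer (names and paths are placeholders)
def writer = new org.springframework.batch.item.json.builder.JsonFileItemWriterBuilder<Customer>()
        .name("customerJsonWriter")
        .jsonObjectMarshaller(new org.springframework.batch.item.json.JacksonJsonObjectMarshaller<Customer>())
        .resource(new org.springframework.core.io.FileSystemResource("customers.json"))
        .build()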
You do not need GSON or Jackson libraries if your DB supports JSON.
Example: in SQL Server there is an option to get data out of the DB as a JSON string instead of a resultset.
Reference - https://learn.microsoft.com/en-us/sql/relational-databases/json/format-query-results-as-json-with-for-json-sql-server?view=sql-server-2017
https://learn.microsoft.com/en-us/sql/relational-databases/json/format-nested-json-output-with-path-mode-sql-server?view=sql-server-2017
Example - select (select * from tableName for json path) as jsonString;
This already gives you the output as a JSON string, which you can write to a file.
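For instance, a minimal Groovy sketch that runs the query above over JDBC and writes the result to a file; the connection details are placeholders:
import groovy.sql.Sql

def sql = Sql.newInstance('jdbc:sqlserver://localhost;databaseName=mydb', 'user', 'pass',
        'com.microsoft.sqlserver.jdbc.SQLServerDriver') // placeholder connection details
def row = sql.firstRow('select (select * from tableName for json path) as jsonString')
new File('output.json').text = row.jsonString // write the JSON string straight to a file
sql.close()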

JSON request validation in JMeter

As my JSON request contains all mandatory parameters in the request body, I want to validate all parameters using JMeter.
Do let me know if it is possible to validate all request parameters present in the JSON body using JMeter or JMeter plugins.
Normally people are interested in validating responses, not requests, therefore I am not aware of any suitable Test Elements, either bundled or available via plugins. For response validation you have the JSON Path Assertion.
If for some reason you need to validate the request and fail the sampler when the validation fails, you can use the JSR223 PostProcessor. Example code:
def request = sampler.getArguments().getArgument(0).getValue()
def json = new groovy.json.JsonSlurper().parseText(request)
// do any checks you need, for example
if (!json.keySet().contains("somekey")) {
    log.info("Key \"somekey\" was not found in the request")
    prev.setSuccessful(false)
}
References:
JsonSlurper
Parsing and producing JSON
Groovy Is the New Black

Kafka Serializer JSON [duplicate]

This question already has answers here:
Writing Custom Kafka Serializer
I am new to Kafka, serialization and JSON.
What I want is for the producer to send a JSON file via Kafka and for the consumer to consume and work with the JSON file in its original form.
I was able to get it so the JSON is converted to a string and sent via a StringSerializer, and then the consumer would parse the string and recreate a JSON object, but I am worried that this isn't efficient or the correct method (I might lose the field types for JSON).
So I looked into making a JSON serializer and setting that in my producer's configurations.
I used the JsonEncoder here: Kafka: writing custom serializer
But when I try to run my producer now, it seems that in the toBytes function of the encoder the try block never returns anything like I want it to:
try {
    bytes = objectMapper.writeValueAsString(object).getBytes();
} catch (JsonProcessingException e) {
    logger.error(String.format("Json processing failed for object: %s", object.getClass().getName()), e);
}
It seems objectMapper.writeValueAsString(object).getBytes(); takes my JSON object ({"name":"Kate","age":25}) and converts it to nothing.
This is my producer's run function:
List<KeyedMessage<String,JSONObject>> msgList=new ArrayList<KeyedMessage<String,JSONObject>>();
JSONObject record = new JSONObject();
record.put("name", "Kate");
record.put("age", 25);
msgList.add(new KeyedMessage<String, JSONObject>(topic, record));
producer.send(msgList);
What am I missing? Would my original method (convert to a string, send, and then rebuild the JSON object) be okay, or is it just not the correct way to go?
Thanks!
Hmm, why are you afraid that a serialize/deserialize step would cause data loss?
One option you have is to use the Kafka JSON serializer that's included in Confluent's Schema Registry, which is free and open source software (disclaimer: I work at Confluent). Its test suite provides a few examples to get you started, and further details are described at serializers and formatters. The benefit of this JSON serializer and the schema registry itself is that they provide transparent integration with producer and consumer clients for Kafka. Apart from JSON there's also support for Apache Avro if you need that.
IMHO this setup is one of the best options in terms of developer convenience and ease of use when talking to Kafka in JSON -- but of course YMMV!
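If you'd rather keep a serializer of your own, a minimal sketch of a Jackson-based Kafka Serializer is below; the class name JsonPojoSerializer is made up for illustration. Note that Jackson serializes beans through their getters, and org.json.JSONObject doesn't expose its contents that way, which is the likely reason your toBytes produced an empty result; send a Map or a plain POJO as the value instead:
import com.fasterxml.jackson.databind.ObjectMapper
import org.apache.kafka.common.serialization.Serializer

// a sketch: works for Maps and plain POJOs, not for org.json.JSONObject
class JsonPojoSerializer<T> implements Serializer<T> {
    private final ObjectMapper mapper = new ObjectMapper()

    @Override
    void configure(Map<String, ?> configs, boolean isKey) { }

    @Override
    byte[] serialize(String topic, T data) {
        data == null ? null : mapper.writeValueAsBytes(data)
    }

    @Override
    void close() { }
}
You would then register it via the value.serializer producer property.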
I would suggest converting your JSON event string to a byte array, like:
byte[] eventBody = event.getBody();
This will improve performance, and you can parse the bytes back into JSON on the consumer side.
Please let me know if any further information is required.