I want to validate a JSON response against data in a database.
For example, I have a student table with columns "StudentID", "StudentName" and "StudentAddress", and the JSON response has elements "StudentNumber", "StuName" and "StuAddress" (the names differ between JSON and DB).
Question 1: How can I compare the entire JSON against the DB data in JMeter?
Question 2: If "StudentID" = 1 in the database, then "StudentNumber" in the JSON response should equal "A". How can I validate this in JMeter within a single script?
If you want to compare the entire JSON to the result of a database query which returns more than one row, the only option is going for a JSR223 Assertion and implementing your pass/fail criteria in Groovy:
Groovy has built-in JSON support, so it can parse the response and convert it into maps/lists/values.
Groovy can iterate the ResultSet object containing the query results; see Debugging JDBC Sampler Results in JMeter for more details.
If you want to compare individual entries:
In the JDBC test element, specify a Variable Name to store the value from the database
Use the JSON Extractor or JSON JMESPath Extractor to store the value from the JSON
Compare the 2 variables using a Response Assertion
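Putting the field-name mapping into code, here is a pure-Python sketch of the row-by-row comparison (illustrative only; in JMeter this logic would live in a JSR223/Groovy assertion). The FIELD_MAP names are taken from the question; a value translation such as StudentID 1 matching StudentNumber "A" would need its own lookup table on top of this:

```python
import json

# Mapping from DB column names to JSON field names (names from the question)
FIELD_MAP = {
    "StudentID": "StudentNumber",
    "StudentName": "StuName",
    "StudentAddress": "StuAddress",
}

def rows_match(db_rows, json_text, key="StudentID"):
    """Return True if every DB row has a JSON item with the same values,
    matching DB columns to JSON fields via FIELD_MAP."""
    items = json.loads(json_text)["items"]
    # Index JSON items by the mapped key field for O(1) lookup
    by_key = {item[FIELD_MAP[key]]: item for item in items}
    for row in db_rows:
        item = by_key.get(row[key])
        if item is None:
            return False
        if any(item[FIELD_MAP[col]] != val for col, val in row.items()):
            return False
    return True
```

The Groovy version would follow the same shape: build a map keyed on the identifier from the parsed JSON, then walk the JDBC ResultSet rows and compare field by field.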
The following is some sample code, used within a JSR223 element, to process the JSON response:
import groovy.json.JsonSlurper

try {
    def responseJSON = prev.getResponseDataAsString()
    if (responseJSON) {
        def jsonSlurper = new JsonSlurper()
        def object = jsonSlurper.parseText(responseJSON)
        def itemCount = object['items']['id'].size()
        log.info("Items $itemCount")
        for (i in 0..<itemCount) {
            log.info("Student Number ${object['items']['id'][i]}")
            log.info("Student Name ${object['items']['studentName'][i]}")
            log.info("Address ${object['items']['address'][i]}")
        }
    }
} catch (e) {
    log.info("Error processing JSON response", e)
}
Groovy has built-in support for working with JSON; see the documentation.
overall aim
I have data landing in blob storage from an Azure service in the form of JSON files, where each line in a file is a nested JSON object. I want to process this with Spark and finally store it as a Delta table with nested struct/map-type columns, which can later be queried downstream using dot notation (columnName.key).
data nesting visualized
{
  "key1": "value1",
  "nestedType1": {
    "key1": "value1",
    "keyN": "valueN"
  },
  "nestedType2": {
    "key1": "value1",
    "nestedKey": {
      "key1": "value1",
      "keyN": "valueN"
    }
  },
  "keyN": "valueN"
}
current approach and problem
I am not using the default Spark JSON reader, as it results in some incorrect parsing of the files. Instead, I load the files as text and parse them with UDFs using Python's json module (example below), after which I use explode and pivot to get the first level of keys into columns.
@udf('MAP<STRING,STRING>')
def get_key_val(x):
    try:
        return json.loads(x)
    except Exception:
        return None
After this initial transformation I now need to convert the nestedType columns to valid map types as well. Since the initial function returns map<string,string>, the values in the nestedType columns are no longer valid JSON, so I cannot use json.loads; instead I resort to regex-based string operations:
@udf('MAP<STRING,STRING>')
def convert_map(s):
    try:
        regex = re.compile(r"""\w+=.*?(?:(?=,(?!"))|(?=}))""")
        obj = dict((a.split('=')[0].strip(), a.split('=')[1]) for a in regex.findall(s))
        return obj
    except Exception:
        return None  # a MAP-typed UDF cannot return the exception itself
This is fine for the second level of nesting, but going any deeper would require yet another UDF and further complications.
question
How can I use a Spark UDF or native Spark functions to parse the nested JSON data such that it is queryable in columnName.key format?
Also, there is no restriction on the Spark version. Hopefully I was able to explain this properly; do let me know if you want me to put up some sample data and the code for ease. Any help is appreciated.
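The columnName.key access the question is after corresponds to recursively flattening the nested keys. A minimal pure-Python sketch of that idea, outside Spark (the flatten helper is hypothetical, not part of any API):

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten a nested dict into {'a.b.c': value} pairs."""
    flat = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            # Descend into nested objects, extending the dotted path
            flat.update(flatten(value, path))
        else:
            flat[path] = value
    return flat

line = '{"key1": 1, "nestedType2": {"nestedKey": {"keyN": 3}}}'
print(flatten(json.loads(line)))
```

In Spark itself, from_json with an explicit schema yields struct columns that are queryable with the same dot notation, which may avoid the UDF chain entirely.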
Response from http request:
{"Preferredvalue":{"notations":[]}}
def response = new groovy.json.JsonSlurper().parse(prev.getResponseData())
I am able to get up to notations and also the size.
If the size is 0, I want to update the notations as below
{"Preferredvalue":{"notations":[{"Name":${firstName},"lName":${lastName}}]}
firstName and lastName are JMeter variables fetched from other calls; I want to use these values in another call and send a PUT request.
Searched a lot but couldn't find an answer :(
Best,
Something like:
def response = new groovy.json.JsonSlurper().parse(prev.getResponseData())
def notations = response.Preferredvalue.notations
if (notations.size() == 0) {
    notations.add([Name: vars.get('firstName'), lName: vars.get('lastName')])
}
def request = new groovy.json.JsonBuilder(response).toPrettyString()
vars.put('request', request)
should do the trick for you. Refer to the generated value as ${request} where required.
More information:
Apache Groovy - Parsing and producing JSON
Apache Groovy: What Is Groovy Used For?
The JSON response returns an object with the following value:
2019-03-20T14:51:30.579+0000
I want to ignore the .579+0000 part for my validation. How can I trim it from the actual value so that I get:
2019-03-20T14:51:30
I would recommend parsing the object value as a Date; this way you will have the possibility to convert it to whatever format you like.
Given you have the following JSON response:
{
"someObject": "2019-03-20T14:51:30.579+0000"
}
You can do the transformation as follows:
Add JSR223 PostProcessor as a child of the request which returns the above JSON
Put the following code into "Script" area:
def originalDate = new groovy.json.JsonSlurper().parse(prev.getResponseData()).someObject
log.info("Original date: " + originalDate)
// 'Z' (RFC 822) matches the "+0000" zone suffix; a single 'X' would only match "Z" or "+00"
vars.put("myDate", Date.parse("yyyy-MM-dd'T'HH:mm:ss.SSSZ", originalDate).format("yyyy-MM-dd'T'HH:mm:ss"))
log.info("Converted date: " + vars.get("myDate"))
You will need to amend the someObject bit to the path of the JSON attribute holding the date. Once done, you should be able to access the "trimmed" date as ${myDate} where required.
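For comparison, the same parse-then-reformat idea sketched in Python (illustrative only; the Groovy snippet above is what runs inside JMeter):

```python
from datetime import datetime

def trim_timestamp(value):
    """Parse '2019-03-20T14:51:30.579+0000' and drop the millis and zone."""
    # %f matches the fractional seconds, %z the "+0000" offset
    parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f%z")
    return parsed.strftime("%Y-%m-%dT%H:%M:%S")
```

Parsing into a real date object first, rather than string-slicing, means the value is validated and can be reformatted to any target pattern.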
References:
SimpleDateFormat
Groovy: Parsing and producing JSON
Groovy Goodness: Working with Dates
The Groovy Templates Cheat Sheet for JMeter
import com.jayway.jsonpath.JsonPath

def idCSV = new File('id.csv')
def index = ['fileOne.json', 'fileTwo.json']
def newLine = System.getProperty('line.separator')
def jsonString

index.each { file ->
    jsonString = ________
    def ids = JsonPath.read(jsonString, '$..id')
    ids.each { id ->
        idCSV << id << newLine
    }
}
How do I fill in jsonString = ____ so that I can read each JSON file into a string and parse that string to extract the ids and some other information?
And I don't want to do it via an HTTP Request -> GET -> file approach.
Previously I extracted jsonString from an HTTP response and it worked well; now I want to do it this way.
Use JsonSlurper:
def jsonString = new groovy.json.JsonSlurper().parseText(new File("json.txt").text)
My expectation is that you're looking for the File.getText() function:
jsonString = file.text
I have no full vision of why you need to store the values from JSON in a CSV file; however, there is an alternative way of achieving this which doesn't require scripting. Note that your approach will only work with 1 concurrent thread; if you add more users writing into the same file, you'll run into a race condition:
You can read the files from the folder into JMeter Variables via Directory Listing Config
The file can be read using HTTP Request sampler
The values can be fetched using the JSON Extractor; they will be automatically stored into JMeter Variables so you will be able to use them later on
If you need the values to be present in a file (although I wouldn't recommend this approach because it will cause massive disk IO and can potentially ruin your test), you can go for the Flexible File Writer
I have a json response in 1 request like this:
{"total":1,"page":1,"records":2,"rows":[{"id":1034,"item_type_val":"Business
Requirement","field_name":"Assigned To","invalid_value":"Jmeter
System","dep_value":"","dep_field":""},{"id":1033,"item_type_val":"Risk","field_name":"Category","invalid_value":"Energy","dep_value":"Logged
User","dep_field":"Assigned To"}]}
and in 2nd request like this:
{"total":1,"page":1,"records":2,"rows":[{"id":1100,"item_type_val":"Business
Requirement","field_name":"Assigned To","invalid_value":"Jmeter
System","dep_value":"","dep_field":""},{"id":1111,"item_type_val":"Risk","field_name":"Category","invalid_value":"Energy","dep_value":"Logged
User","dep_field":"Assigned To"}]}
Both are the same apart from the ids. I need to compare the 1st JSON response with the 2nd and verify that both are the same, where differing ids are acceptable. How can I do this with a regex so I can ignore the ids and match the whole content?
Not sure if you can do it with a single regex, but another way out is to create a map of it and then compare everything except 'id'.
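That map-based comparison can be sketched as follows (Python for illustration; the rows/id names come from the question, and in JMeter the same logic would sit in a Groovy JSR223 element):

```python
import json

def equal_ignoring_ids(json_a, json_b):
    """Parse both payloads, drop the 'id' key from every row, then compare."""
    a, b = json.loads(json_a), json.loads(json_b)
    for payload in (a, b):
        for row in payload.get("rows", []):
            row.pop("id", None)  # ignore the id field when comparing
    return a == b
```

Comparing parsed structures rather than raw strings also makes the check immune to whitespace and key-ordering differences, which a regex would not be.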
I believe the easiest way would be just discarding these id entries using a JSR223 PostProcessor and the Groovy language, which comes with JSON support.
Add JSR223 PostProcessor as a child of the sampler, which returns your JSON
Put the following code into the JSR223 PostProcessor's "Script" area
import groovy.json.JsonBuilder
import groovy.json.JsonSlurper

def slurper = new JsonSlurper()
def jsonResponse = slurper.parseText(prev.getResponseDataAsString())
jsonResponse.rows.each { it.remove("id") } // drop the "id" entry from every row
def newResponse = new JsonBuilder(jsonResponse).toPrettyString()
// depending on what you need:
vars.put("responseWithoutId", newResponse) // store response without "id" into a JMeter Variable
prev.setResponseData(new String(newResponse)) // overwrite parent sampler response data
log.info(newResponse) // just print the new value to jmeter.log file
So you have the following choices:
vars.put("responseWithoutId", newResponse) - stores the new JSON (without these id) into a ${responseWithoutId} JMeter Variable
prev.setResponseData(new String(newResponse)) - after this line execution parent sampler data won't contain any "id"
log.info(newResponse) - just prints JSON without "id" to jmeter.log file
I don't know your test plan design; personally, I would store the responses from the 2 requests into 2 different JMeter Variables, i.e. ${response1} and ${response2}, using the above approach, and compare them with the Response Assertion like: