JMeter - Passing specific JSON response to HTTP request dynamically

I have a specific requirement in JMeter (2.13) where I need to pass two parameters, id and parentObjectApiName, multiple times dynamically:
{
"id":"SomeNumber",
"parentObjectApiName":"SomeName"
},
{
"id":"SomeNumber",
"parentObjectApiName":"SomeName"
}
These values I will be getting from a response like this:
{
"detailMap": {
"RootNumber": [
{
"id": "SomeNumber",
"properties": {
},
"isDeleted": false,
"version": "2017-11-20T08:13:30+00:00",
"referenceId": null,
"parentObjectApiName": "SomeName"
},
{
"id": "SomeNumber",
"properties": {
},
"isDeleted": false,
"version": "2017-04-21T15:40:10.742+00:00",
"referenceId": null,
"parentObjectApiName": "SomeName"
},
{
:
},
]
},
"state": {
"errorDetails": []
}
}
Is there any workaround for the above requirement using Beanshell in JMeter (2.13)?

Your requirement can be achieved by following the steps below.
Add "JSON Extractor" to the request where the response contains the parameters which you want to pass in next request.(Image-1)
JSON Extractor configuration(Image-2)
Keeping the JSON Extractor as it is, add a "Beanshell PostProcessor" to the same request with the following piece of code. The desired id and parentObjectApiName pairs will be stored in the variable "json", which you can reference in your next request as ${json}.
// Build the JSON body from the variables created by the JSON Extractor
// (id_1..id_N, parentObjectApiName_1..parentObjectApiName_N and id_matchNr)
StringBuilder output = new StringBuilder();
int max = Integer.parseInt(vars.get("id_matchNr"));
for (int i = 1; i <= max; i++) {
    output.append("{");
    output.append("\"id\":\"" + vars.get("id_" + i) + "\",\"parentObjectApiName\":\"" + vars.get("parentObjectApiName_" + i) + "\"");
    output.append("},");
}
// drop the trailing comma
String withoutLastComma = output.substring(0, output.length() - 1);
vars.put("json", withoutLastComma);
Image-1
Image-2
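As the screenshots are not reproduced here, the JSON Extractor settings would typically be along these lines (an assumption based on the response structure above; if your extractor only supports one expression per element, add one extractor per variable):
Names of created variables: id (and a second extractor for parentObjectApiName)
JSON Path expressions: $.detailMap.RootNumber[*].id and $.detailMap.RootNumber[*].parentObjectApiName respectively
Match No.: -1, so that all matches are extracted and the id_1 ... id_N and id_matchNr variables used by the Beanshell code above get created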

Be aware that since JMeter 3.1 it is recommended to use JSR223 Test Elements and the Groovy language for any scripting in JMeter. Groovy has much better performance than Beanshell; moreover, Groovy has built-in JSON support.
Add JSR223 PostProcessor as a child of the request which returns the above JSON
Make sure you have groovy selected in the "Language" dropdown and the "Cache compiled script if available" box ticked
Put the following code into "Script" area:
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

def text = prev.getResponseDataAsString()
log.info('Original response: ' + text)
def json = new JsonSlurper().parseText(text)
// keep only the "id" and "parentObjectApiName" of each entry
def data = json.detailMap.RootNumber.collect { rootNumber ->
    [id: rootNumber.id, parentObjectApiName: rootNumber.parentObjectApiName]
}
vars.put('json', JsonOutput.prettyPrint(JsonOutput.toJson(data)))
log.info('Generated json: ' + vars.get('json'))
The above code will generate the following JSON:
[
{
"id": "SomeNumber",
"parentObjectApiName": "SomeName"
},
{
"id": "SomeOtherNumber",
"parentObjectApiName": "SomeOtherName"
}
]
You will be able to access it as ${json} where required (e.g. in the next HTTP Request sampler's "Body Data" tab).
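For example, if the next request expects the generated array inside a wrapper object, the "Body Data" tab could contain something like the following (the "records" key is only a hypothetical placeholder; use whatever field name your API actually expects):
{
"records": ${json}
}
Since ${json} already holds the complete JSON array produced by the script above, no extra brackets are needed around it.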

Related

How to parse a JsonArray in Scala and write it to a DataFrame?

Using my Scala HTTP Client I retrieved a response in JSON format from an API GET call.
My end goal is to write this JSON content to an AWS S3 bucket in order to make it available as a table on Redshift by running a simple AWS Glue crawler.
My thinking is to parse this JSON message and somehow convert it into a Spark DataFrame, so that later on I can save it to my preferred S3 location as .csv, .parquet, or whatever.
The JSON file looks like this:
{
"response": {
"status": "OK",
"start_element": 0,
"num_elements": 100,
"categories": [
{
"id": 1,
"name": "Airlines",
"is_sensitive": false,
"last_modified": "2010-03-19 17:48:36",
"requires_whitelist_on_external": false,
"requires_whitelist_on_managed": false,
"is_brand_eligible": true,
"requires_whitelist": false,
"whitelist": {
"geos": [],
"countries_and_brands": []
}
},
{
"id": 2,
"name": "Apparel",
"is_sensitive": false,
"last_modified": "2010-03-19 17:48:36",
"requires_whitelist_on_external": false,
"requires_whitelist_on_managed": false,
"is_brand_eligible": true,
"requires_whitelist": false,
"whitelist": {
"geos": [],
"countries_and_brands": []
}
}
],
"count": 148,
"dbg_info": {
"warnings": [],
"version": "1.18.1621",
"output_term": "categories"
}
}
}
The content I would like to map to a DataFrame is the content of the "categories" JSON array.
I have managed to parse the message using the json4s.JsonMethods parse method this way:
val parsedJson = parse(request) \\ "categories"
Obtaining the following:
output: org.json4s.JValue = JArray(List(JObject(List((id,JInt(1)), (name,JString(Airlines)), (is_sensitive,JBool(false)), (last_modified,JString(2010-03-19 17:48:36)), (requires_whitelist_on_external,JBool(false)), (requires_whitelist_on_managed,JBool(false)), (is_brand_eligible,JBool(true)), (requires_whitelist,JBool(false)), (whitelist,JObject(List((geos,JArray(List())), (countries_and_brands,JArray(List()))))))), JObject(List((id,JInt(2)), (name,JString(Apparel)), (is_sensitive,JBool(false)), (last_modified,JString(2010-03-19 17:48:36)), (requires_whitelist_on_external,JBool(false)), (requires_whitelist_on_managed,JBool(false)), (is_brand_eligible,JBool(true)), (requires_whitelist,JBool(false)), (whitelist,JObject(List((geos,JArray(List())), (countries_and_brands,JArray(List()))))))))
However, I am completely lost on how to proceed. I have even tried using another library for Scala called uJson:
val json = (ujson.read(request))
val tuples = json("response")("categories").arr /* <-- categories is an array */ .map { item =>
  (item("id"), item("name"), item("is_sensitive"), item("last_modified"))
}
This time I have only parsed a few fields for testing, but this shouldn't change much. Hence, I obtained the following structure:
tuples: scala.collection.mutable.ArrayBuffer[(ujson.Value, ujson.Value, ujson.Value, ujson.Value)] = ArrayBuffer((1,"Airlines",false,"2010-03-19 17:48:36"), (2,"Apparel",false,"2010-03-19 17:48:36"))
However, this time too I do not know how to move forward, and everything I try results in errors, mostly related to format incompatibility.
Please feel free to propose any other approach to achieve my goal, even if it totally changes my workflow. I would rather learn something properly. Thanks.
We can use the following code to convert the JSON to a Spark DataFrame/Dataset:
import spark.implicits._ // needed for .toDS() on a Seq[String]
// JSON_OUTPUT is the raw JSON string returned by the API (the `request` value in the question)
val df00 = spark.read.option("multiline", "true").json(Seq(JSON_OUTPUT).toDS())
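To get just the "categories" array as a flat table (which is what the question asks for), one option is to explode the nested array. A minimal sketch, assuming a SparkSession named spark and the raw API response held in a String variable jsonString (both names are placeholders):
import org.apache.spark.sql.functions.{col, explode}
import spark.implicits._

// Parse the whole response, then keep one row per element of response.categories
val raw = spark.read.json(Seq(jsonString).toDS())
val categoriesDf = raw
  .select(explode(col("response.categories")).as("category"))
  .select("category.*") // flatten the struct into top-level columns

categoriesDf.show(false)
// categoriesDf.write.parquet("s3://your-bucket/your-prefix/") // placeholder S3 path
The resulting DataFrame has one row per category (id, name, is_sensitive, ...), which can then be written to S3 in whatever format the Glue crawler should pick up.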

JMeter Dynamic JSON list generation

I'm using the following Groovy script:
def oldRequest = new groovy.json.JsonSlurper().parseText(sampler.getArguments().getArgument(0).getValue())
oldRequest.values().removeAll{it.equals('null')}
oldRequest.advancedFilters.values().removeAll{it.equals('null')}
def newRequest = new groovy.json.JsonOutput().toJson(oldRequest)
sampler.getArguments().removeAllArguments()
sampler.setPostBodyRaw(true)
sampler.addNonEncodedArgument('',new groovy.json.JsonOutput().prettyPrint(newRequest),'')
to remove keys from the JSON request where the values are "null". I also want to include logic to convert the JSON below:
{
"sortOrder": "A",
"sortField": "policyNumber",
"searchTerritories": [ter1|ter2|ter3],
"pageNumberRequested": "1",
"pageCountRequested": "50",
"policyStatus" : "${ActionStatus}",
"includeTerm : "null",
"advancedFilters": {
"test" : "null",
"test1" : [A|B],
"test1" : [C|D|E]}
}
to:
{
"sortOrder": "A",
"sortField": "policyNumber",
"searchTerritories": ["ter1","ter2","ter3"],
"pageNumberRequested": "1",
"pageCountRequested": "50",
"policyStatus" : "${ActionStatus}",
"advancedFilters": {
"test1" : ["A","B"],
"test2" : ["C","D","E"]}
}
I want the input JSON values to be converted from [ter1|ter2|ter3] to ["ter1","ter2","ter3"], and [A|B] and [C|D|E] converted to ["A","B"] and ["C","D","E"]. Please help me with the Groovy script modifications required.
Your source data is not valid JSON; you can check it using e.g. an online JSON validation tool.
Therefore, unfortunately, you will not be able to use JsonSlurper; you will have to treat the source data as plain text and amend it according to your needs using e.g. regular expressions.
Example code:
def oldRequest = sampler.getArguments().getArgument(0).getValue()
log.info('Before: ' + oldRequest)
// quote the pipe-separated values ([ter1|ter2|ter3] -> ["ter1","ter2","ter3"]) and drop any line containing "null"
oldRequest = oldRequest.replaceAll('(\\w+)\\|', '"$1",').replaceAll('(\\w+)\\]', '"$1"]').replaceAll("(?m)^.*null.*(?:\\r?\\n)?", "")
// re-number the remaining "testN" keys sequentially (test1, test2, ...)
def i = 0
oldRequest = oldRequest.replaceAll(/test\d+/) { 'test' + ++i }
log.info('After: ' + oldRequest)
sampler.getArguments().removeAllArguments()
sampler.addNonEncodedArgument('', oldRequest, '')
sampler.setPostBodyRaw(true)
More information:
Groovy Goodness: Matchers for Regular Expressions
Apache Groovy - Why and How You Should Use It

How to manipulate JSON data in JMeter

In a JMeter script, I need to process an HTTP response and manipulate a JSON payload to send in the next request, because currently this manipulation happens in an Angular client.
My HTTP response:
[
{
"function":"nameA",
"rast":"F1",
"tag":"EE",
"probs":"0,987"
},
{
"function":"nameB",
"rast":"F2",
"tag":"SE",
"probs":"0,852"
},
{
"function":"nameC",
"rast":"F3",
"tag":"CE",
"probs":"0,754"
}
]
I need to convert the result to the JSON below to post in the next request:
[
{
"function":"nameA",
"rast":"F1",
"type":{
"name":"EE"
},
"id":"alpha"
},
{
"function":"nameB",
"rast":"F2",
"type":{
"name":"SE"
},
"id":"alpha"
},
{
"function":"nameC",
"rast":"F3",
"type":{
"name":"CE"
},
"id":"alpha"
}
]
I filter the response with this JSON Extractor:
[*].["function", "rast", "tag"]
But now I need to solve other problems:
Add an id attribute (same for all functions)
Add an object with the name type.
Move the tag attribute into the object called type.
Rename the tag attribute to name.
Add JSR223 PostProcessor as a child of the request which returns the original JSON
Put the following code into "Script" area:
def json = new groovy.json.JsonSlurper().parse(prev.getResponseData()).each { entry ->
    // move the "tag" value into a nested "type" object and drop the attributes that are no longer needed
    entry << [type: [name: entry.get('tag')]]
    entry.remove('tag')
    entry.remove('probs')
    // same id for all entries
    entry.put('id', 'alpha')
}
def newJson = new groovy.json.JsonBuilder(json).toPrettyString()
log.info(newJson)
That's it, you should see the generated JSON in the jmeter.log file.
If you need to have it in a JMeter Variable add the next line to the end of your script:
vars.put('newJson', newJson)
and you will be able to access the generated value as ${newJson} where required.
More information:
Groovy: Parsing and producing JSON
Apache Groovy - Why and How You Should Use It

JMeter Dynamic JSON Array Generation from CSV file

I have the following JSON data to post.
{
"id": 1,
"name": "Zypher",
"price": 12.50,
"tags": [{
"tag": 1,
"tagName": "X"
},
{
"tag": 2,
"tagName": "Y"
},
{
"tag": 2,
"tagName": "Z"
}]
}
My JMeter Test Plan is as follows:
- Test Plan
- Thread Group
- Http Request Defaults
- Http Cookie Manager
- Simple Controller
- CSV Data Set Config (Sheet_1)
- Http Header Manager
- Http Request (The hard coded json was provided here as body data)
Everything works fine. Now I want to use a CSV file to parametrise my JSON.
Sheet_1:
id,name,price
1,Zypher,12.50
I modified the JSON with these 3 parameters and it works for me. Now I want to parametrise the details portion. I have no idea how to do this.
I want to keep my JSON like this:
{
"id": ${id},
"name": ${name},
"price": ${price},
"tags": [
{
"tag": ${tag},
"tagName": ${tagName}
}]
}
How could I dynamically build the tags JSON array for the details portion from the CSV data? I want it to loop over the rows provided in the CSV file.
Updated CSV:
id,name,price,tag,tagname
1,Zypher,12.50,7|9|11,X|Y|Z
It would be great in this format:
id,name,price,tag
1,Zypher,12.50,7:X|9:Y|11:Z
tag has two properties divided by :
You can do it using JSR223 PreProcessor and Groovy language, something like:
Given you have the following CSV file structure:
id,name,price,tag
1,Zypher,12.50,X|Y|Z
And the following CSV Data Set Config settings:
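The screenshot of the CSV Data Set Config is not reproduced here; the relevant settings would be something like the following (an assumption; note in particular that the script below reads a JMeter variable called tags, so the "Variable Names" field must define it even though the CSV header says tag):
Filename: path to your CSV file
Variable Names (comma-delimited): id,name,price,tags
Ignore first line: True (the file contains a header row)
Delimiter: ,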
Add JSR223 PreProcessor as a child of the HTTP Request sampler and put the following code into "Script" area:
import groovy.json.JsonBuilder

def json = new JsonBuilder()
def tagsValues = vars.get("tags").split("\\|")

class Tag { int tag; String tagName }

List<Tag> tagsList = new ArrayList<>()
def counter = 1
tagsValues.each {
    tagsList.add(new Tag(tag: counter, tagName: it))
    counter++
}

json {
    id Integer.parseInt(vars.get("id"))
    name vars.get("name")
    price Double.parseDouble(vars.get("price"))
    tags tagsList.collect { tag ->
        ["tag"    : tag.tag,
         "tagName": tag.tagName]
    }
}

sampler.addNonEncodedArgument("", json.toPrettyString(), "")
sampler.setPostBodyRaw(true)
Remove any hard-coded data from the HTTP Request sampler "Body Data" tab (it should be absolutely blank)
Run your request - the JSON payload should be populated dynamically by the Groovy code.
References:
Parsing and producing JSON - Groovy
Groovy Is the New Black
Update:
for CSV format
id,name,price,tag
1,Zypher,12.50,7:X|9:Y|11:Z
Replace the below Groovy code:
List<Tag> tagsList = new ArrayList<>()
def counter = 1
tagsValues.each {
    tagsList.add(new Tag(tag: counter, tagName: it))
    counter++
}
with
List<Tag> tagsList = new ArrayList<>()
tagsValues.each {
    String[] tag = it.split("\\:")
    tagsList.add(new Tag(tag: Integer.parseInt(tag[0]), tagName: tag[1]))
}
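With the updated CSV row (1,Zypher,12.50,7:X|9:Y|11:Z) the generated payload should then look roughly like this (price is parsed as a double, so 12.50 is rendered as 12.5):
{
    "id": 1,
    "name": "Zypher",
    "price": 12.5,
    "tags": [
        {
            "tag": 7,
            "tagName": "X"
        },
        {
            "tag": 9,
            "tagName": "Y"
        },
        {
            "tag": 11,
            "tagName": "Z"
        }
    ]
}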

Get rid of Mongo $ signs in JSON

I am building a Python backend for a SPA (Angular) using MongoDB.
Here is what I use: Python 3.4, MongoDB 3, Flask, flask-mongoengine and flask-restful
Now I receive the following JSON from my backend:
[
{
"_id": {
"$oid": "55c737029380f82fbf52eec3"
},
"created_at": {
"$date": 1439129906376
},
"desc": "Description.....",
"title": "This is title"
},
etc...
]
And I want to receive something like this:
[
{
"_id": "55c737029380f82fbf52eec3",
"created_at": 1439129906376,
"desc": "Description.....",
"title": "This is title"
},
etc...
]
My code for now:
from flask import json
from vinnie import app
from flask_restful import Resource, Api
from vinnie.models.movie import Movie

api = Api(app)

class Movies(Resource):
    def get(self):
        movies = json.loads(Movie.objects().all().to_json())
        return movies

api.add_resource(Movies, '/movies')
Model:
import datetime
from vinnie import db

class Movie(db.Document):
    created_at = db.DateTimeField(default=datetime.datetime.now, required=True)
    title = db.StringField(max_length=255, required=True)
    desc = db.StringField(required=True)

    def __unicode__(self):
        return self.title
What is the best way to format convenient JSON for front-end?
If you are confident you want to get rid of all the similar cases, then you can certainly write code that matches that pattern. For example:
info = [
    {
        "_id": {
            "$oid": "55c737029380f82fbf52eec3"
        },
        "created_at": {
            "$date": 1439129906376
        },
        "desc": "Description.....",
        "title": "This is title"
    },
    # etc...
]

def fix_array(info):
    ''' Change out dict items in the following case:
        - dict value is another dict
        - the sub-dictionary only has one entry
        - the key in the sub-dictionary starts with '$'
        In this specific case, one level of indirection
        is removed, and the dict value is replaced with
        the sub-dict value.
    '''
    for item in info:
        for key, value in item.items():
            if not isinstance(value, dict) or len(value) != 1:
                continue
            (subkey, subvalue), = value.items()
            if not subkey.startswith('$'):
                continue
            item[key] = subvalue

fix_array(info)
print(info)
This will produce the following:
[{'title': 'This is title', 'created_at': 1439129906376, 'desc': 'Description.....', '_id': '55c737029380f82fbf52eec3'}]
Obviously, reformatting that with JSON is trivial.
I found a neat solution to my problem in flask-restful extension which I use.
It provides fields module.
Flask-RESTful provides an easy way to control what data you actually render in your response. With the fields module, you can use whatever objects (ORM models/custom classes/etc.) you want in your resource. fields also lets you format and filter the response so you don’t have to worry about exposing internal data structures.
It’s also very clear when looking at your code what data will be rendered and how it will be formatted.
Example:
from flask_restful import Resource, fields, marshal_with

resource_fields = {
    'name': fields.String,
    'address': fields.String,
    'date_updated': fields.DateTime(dt_format='rfc822'),
}

class Todo(Resource):
    @marshal_with(resource_fields, envelope='resource')
    def get(self, **kwargs):
        return db_get_todo()  # Some function that queries the db
Flask-RESTful Output Fields Documentation