I need some help with LoadRunner scripting for a REST API. In my script I'm passing a JSON file to the web_custom_request LR function.
The following is the content of the JSON file:
{"serviceLayerOperationRequest":{
"contextObjectId": "36045467715",
"payload":"{\"UserSessionSearchCriteria\":{\"os_st_id\":\"36045467715\",\"LastName\":\"test\",\"FirstName\":\"test\"}}",
"operationLabel": "CustomerSearch",
"serviceOpInvocationId": "1111111",
"sessionId": "{SessionID}"
}}
The new value is fetched from the previous response body and successfully written into the parameter SessionID.
Currently, the request sends the literal string {SessionID} instead of the parameter's value.
In the above JSON file, the value of sessionId is dynamic, so I want to parameterize it from the Parameters list. What is the correct syntax for this?
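For reference, a minimal sketch of one common approach: keep the JSON inline in the Body= argument so that LoadRunner substitutes the {SessionID} parameter at runtime (parameters inside a body file referenced by path are typically sent as-is). The URL below is a placeholder; the field values are taken from the file above.
web_custom_request("CustomerSearch",
    "URL=http://yourhost/serviceLayer/customerSearch",   /* placeholder URL */
    "Method=POST",
    "EncType=application/json",
    "Body={\"serviceLayerOperationRequest\":{"
        "\"contextObjectId\":\"36045467715\","
        "\"payload\":\"{\\\"UserSessionSearchCriteria\\\":{\\\"os_st_id\\\":\\\"36045467715\\\",\\\"LastName\\\":\\\"test\\\",\\\"FirstName\\\":\\\"test\\\"}}\","
        "\"operationLabel\":\"CustomerSearch\","
        "\"serviceOpInvocationId\":\"1111111\","
        "\"sessionId\":\"{SessionID}\""                   /* replaced with the correlated value */
    "}}",
    LAST);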
I have a CSV file like this:
"{""data"":{""student"":{""name"":""random name""}}}",
"{""data"":{""student"":{""name"":""random name2""}}}"
Here I have two JSON strings.
I tried to send these as a JMeter variable in the POST body as ${body}. It does take the value from the CSV, but JMeter sends the value as a quoted string rather than as a JSON body. Is there any way to parse this data from the CSV and send it as a POST JSON body?
For example, the POST body should be like this:
{
  "data": {
    "student": {
      "name": "random name"
    }
  }
}
But right now it looks like this:
"{""data"":{""student"":{""name"":""random name""}}}"
I configured the CSV Data Set Config in JMeter and send the variable this way, referencing ${body} in the request body.
Just for your information, I do not want to split the JSON apart and put a separate variable in the POST body for each field; I want to send the full JSON body from the CSV.
JMeter sends exactly what it finds in the CSV file; remove the extra quotation marks from the CSV file and JMeter will start sending valid JSON.
If you cannot manipulate the data in the CSV file, i.e. it comes from an external source, you can remove the extra quotation marks using a JSR223 PreProcessor.
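A minimal Groovy sketch of such a PreProcessor, assuming the CSV Data Set Config puts the raw line into a variable named body (the variable name is only an example):
def raw = vars.get('body')                      // value as read from the CSV file
if (raw != null) {
    // drop the surrounding quotes and collapse CSV-escaped "" back to "
    def cleaned = raw.replaceAll(/^"|"$/, '').replace('""', '"')
    vars.put('body', cleaned)                   // ${body} now holds valid JSON
}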
If you just want to send the next line from the file with each subsequent request, and the file is not very big, take a look at the __StringFromFile() function; it simply returns the next line of the file each time it is called.
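For example, assuming the payloads are stored one complete JSON document per line, without the CSV-style quoting, in a file called bodies.json next to the test plan (the file name is only an example), the whole HTTP Request body could simply be:
${__StringFromFile(bodies.json)}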
More information on JMeter Functions concept: Apache JMeter Functions - An Introduction
I am trying to pass the values of basketId and basketItemId from the Create Basket JSON response into JMeter variables so that I can use these values in the next requests. I set up everything according to the documentation; however, I am not able to see the values in the Debug Sampler's Response Data area in the View Results Tree. Any help is appreciated.
You're not seeing your values because your JSON Extractor hasn't been executed.
It has not been executed because you need to provide the same number of "Default Values" as the variables you declared. If you look at the jmeter.log file you will see something like:
Mismatch between number of variables, json expressions and default values
The solution would be to add the default values, like:
null;null
And assuming your JSON Path expressions match something in the response, you will see both values.
More information: API Testing With JMeter and the JSON Extractor
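For reference, a JSON Extractor configuration with matching counts could look like this (the JSON Path expressions are only illustrative, since the actual response structure isn't shown):
Names of created variables: basketId;basketItemId
JSON Path expressions: $..basketId;$..basketItemId
Default Values: null;null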
I need the real JSON payload data to be able to assert it against another hardcoded JSON file in MUnit (Mule 3.9 and DataWeave 1). The issue is that the payload shows up as "org.mule.munit.common.util.ReusableByteArrayInputStream#53534c15". When I convert it to Java I can see the data, but not in JSON format. How can I extract the JSON from this byte array stream so that I can assert it against a hardcoded JSON file?
I resolved it by using the "Byte Array to String" transformer block.
Then I added the "Assert Equals" block, but made sure to format both values like this:
#[payload.replaceAll("\\s+","")]
#[getResource('sample.json').asString().replaceAll("\\s+","")]
This did exactly what I needed.
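In Mule 3.9 XML this ends up looking roughly like the following (a sketch only: the assertion element name may differ between MUnit versions, and sample.json is assumed to be on the project classpath; stripping whitespace keeps formatting differences from failing the assertion):
<byte-array-to-string-transformer doc:name="Byte Array to String"/>
<munit:assert-on-equals doc:name="Assert Equals"
    message="Payload does not match sample.json"
    expectedValue="#[getResource('sample.json').asString().replaceAll('\\s+','')]"
    actualValue="#[payload.replaceAll('\\s+','')]"/>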
I have a JSON file in Azure Blob Storage that I need to parse and insert rows into SQL using a Logic App.
I am using the "Get Blob Content" action, and my first attempt was to then pass the result to "Parse JSON". It returns an error: "InvalidTemplate. Unable to process template language expressions in action 'Parse_JSON' inputs at line '1' and column '2856'".
I found some discussion indicating that the content needs to be converted to a string, so I used "Compose" and edited the code as suggested to:
"inputs": "#base64ToString(body('Get_blob_content').$content)"
This works, but then the InvalidTemplate issue gets pushed to the Parse JSON action and I get the InvalidTemplate error there. I have tried wrapping the output in a JSON expression and a few other things, but I just can't get it to parse.
If I take a sample, or even the entire JSON, and put it directly into the input of the Parse JSON action, it works without issue, but it will not accept the blob content as JSON.
The only thing I have been able to do successfully with the blob content is to take it as a string and update a row in SQL, to later use OPENJSON in SQL... but I run into an issue there that is for another post.
I am at a loss of what to do.
You don't post much information about your Logic App actions, so maybe you could refer to my flow design. I tested with JSON data containing an array.
Below is my flow design. I'm not using a Compose action; I use decodeBase64(body('Get_blob_content')['$content']) as the Parse JSON content.
And if you select a property from the JSON, you need to set the array index. I set a variable to get a value with body('Parse_JSON')[1]['name'].
You could have a try with this; if it still fails, please provide more information or a sample so we can test.
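In code view, the Parse JSON action then looks roughly like this (a sketch: the blob action is assumed to be named Get_blob_content, and a real schema would normally be generated from a sample payload rather than left empty):
"Parse_JSON": {
    "type": "ParseJson",
    "runAfter": {
        "Get_blob_content": [ "Succeeded" ]
    },
    "inputs": {
        "content": "@decodeBase64(body('Get_blob_content')['$content'])",
        "schema": {}
    }
}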
I'm validating the response from a GET call against a .json file:
match response == read('match_response.json')
Now I want to reuse this file for various other features, as only one field in the .json varies. Let's say this field in the JSON file is "varyingField".
I'm trying to pass this field in every time I match the response, but I'm not able to:
def varyingField = 'VARIATION1'
match response == read('match_response.json') {'varyingField' : '#(varyingField)'}}
In the JSON file I have:
"varyingField": "#(varyingField)"
You are trying to pass an argument to read() for a JSON file? Sorry, such a thing is not supported in Karate; please read the docs.
Use this pattern:
create a JSON file that has all your "happy path" values set
use the read() syntax to load the file (which means this is re-usable across multiple tests)
use the set keyword to update only the field for your scenario or negative test
For more details, refer to this answer: https://stackoverflow.com/a/51896522/143475
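Putting those steps together, a minimal sketch (assuming match_response.json holds the normal happy-path value for varyingField rather than a placeholder):
* def expected = read('match_response.json')
* set expected.varyingField = 'VARIATION1'
* match response == expected
With this approach the JSON file stays a plain, reusable expected response, and each scenario overrides only the field it cares about before the match.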