Parsing CSV by column in JMeter

Given a CSV file with some data arranged in columns instead of rows:
Parameters;Data Set 1;Data Set 2
param_1;A;1
param_2;B;2
param_3;C;3
param_4;D;4
param_5;E;5
Is it possible to use this as a "CSV config element" in JMeter? For sure it won't work with the standard config elements but maybe there is another way?

The CSV config element cannot do this. You should use a BeanShell Sampler or JSR223 Sampler to read the file and process each line. Here is simple Java code for a BeanShell sampler:
BufferedReader br = new BufferedReader(new FileReader("filename"));
String line = br.readLine();
while (line != null) {
    String[] parts = line.split(";");
    String paramName = parts[0];
    String dataSet1 = parts[1];
    String dataSet2 = parts[2];
    // save them in JMeter props or vars and use later
    line = br.readLine();
}
br.close();
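An alternative to processing the file line-by-line in the sampler is to transpose the column-oriented file once, so the standard CSV Data Set Config can consume it row-by-row. A minimal standalone sketch of that transposition (the class name and sample data are illustrative, not part of JMeter):

```java
import java.util.ArrayList;
import java.util.List;

public class TransposeCsv {
    // Transpose semicolon-separated rows: row i, column j becomes row j, column i.
    static List<String> transpose(List<String> rows) {
        String[][] cells = rows.stream().map(r -> r.split(";")).toArray(String[][]::new);
        List<String> out = new ArrayList<>();
        for (int col = 0; col < cells[0].length; col++) {
            StringBuilder sb = new StringBuilder();
            for (int row = 0; row < cells.length; row++) {
                if (row > 0) sb.append(';');
                sb.append(cells[row][col]);
            }
            out.add(sb.toString());
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> columnar = List.of(
            "Parameters;Data Set 1;Data Set 2",
            "param_1;A;1",
            "param_2;B;2");
        // After transposing, each output line is one data set and can feed
        // a regular CSV Data Set Config.
        transpose(columnar).forEach(System.out::println);
    }
}
```

Each transposed line then holds one complete data set, which is exactly the row-per-iteration shape the CSV Data Set Config expects.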

In the CSV Data Set Config, write the names of the parameters in Variable Names (comma-delimited), separated by commas, e.g. Parameters,DataSet1,DataSet2.
Set the Loop Count of the Thread Group to the number of lines you have to read.
Now you can get the values by accessing the variables ${Parameters}, ${DataSet1} and ${DataSet2}.

Instead of the "CSV config element", use the "User Parameters" pre-processor; it will work as you expected, but you need to add the values manually.

Related

Parse a CSV after a PreProcessor script on JMeter

I'm trying to create a performance test on JMeter where I need to have a variable number of parameters.
This is the CSV file I'm using, so in this case I need 2 variables
inputParameter,var
7,v5
-2,v8
I found that it can be done by using JSR223 PreProcessor so I tried using this script
{
    BufferedReader reader = new BufferedReader(new FileReader("path"));
    String row = reader.readLine();
    String[] header = row.split(",");
    row = reader.readLine();
    String[] values = row.split(",");
    for (int i = 0; i < header.length; i++) {
        String name = header[i];
        String value = values[i];
        sampler.addArgument(name, value);
    }
}
This script creates the variables as it should and puts the value of the first row into them. But the problem I have is that I can't find a way to parse the CSV file after the script to change the variables' values.
I tried this
String value = "${"+name+"}";
But it does not get the value of ${inputParameter} that I get from the CSV Data Set Config, it just adds the value %24%7inputParameter%24%7
Is there any way to parse the CSV file after the script runs to modify the value of the variables created by it?
Thanks in advance!
Use vars:
String value = vars.get(name);
vars - JMeterVariables - e.g. vars.get("VAR1");
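The preprocessor loop above is essentially pairing each header with the value at the same column index; that pairing can be sketched outside JMeter as a plain map build (class name illustrative, data taken from the question's CSV):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvHeaderMap {
    // Pair each header with the value at the same index, the same way the
    // preprocessor pairs sampler argument names with row values.
    static Map<String, String> zip(String headerRow, String valueRow) {
        String[] headers = headerRow.split(",");
        String[] values = valueRow.split(",");
        Map<String, String> args = new LinkedHashMap<>();
        for (int i = 0; i < headers.length; i++) {
            args.put(headers[i], values[i]);
        }
        return args;
    }

    public static void main(String[] args) {
        System.out.println(zip("inputParameter,var", "7,v5")); // prints {inputParameter=7, var=v5}
    }
}
```

Inside JMeter, vars.get(name) plays the role of the lookup into this map, with the CSV Data Set Config having done the per-iteration population for you.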
Unfortunately your explanation doesn't make a lot of sense (at least to me); going forward, consider:
Providing first 3 rows of your CSV file
Configuration of your CSV Data Set Config
Actual output of the HTTP Request sampler (Request -> Request Body) tab of the View Results Tree listener
Expected output of the HTTP Request sampler
Output of the Debug Sampler (Response Data -> Response Body tab of the View Results Tree listener)

Empty csv file while using BeanShell Postprocessor

I'm trying to save my variables into a csv file using a BeanShell PostProcessor. Code:
String id = "${userID}";
FileWriter fstream = new FileWriter("JmeterBean.csv",true);
fstream.write(id+"\n");
fstream.close();
Test Plan:
HTTP Request GetUsersById => return all IDs
Json extractor => from my response
[{"#class":"com.test.dto.userDTO",
  "author":"John",
  "id":"89BC331D723F"},
 {"#class":"com.test.dto.userDTO",
  "author":"Alex",
  "id":"FTH7JBDRF567"}]
Name of variable: userID
JSON path expression: $.[?(#.#class=='com.test.dto.userDTO')].id
Match Numbers: -1
BeanShell Postprocessor
But my csv file is always empty.
Use vars to get variable
String id = vars.get("userID");
vars - (JMeterVariables) - gives read/write access to variables:
vars.get(key);
And prefer using a JSR223 PostProcessor over the BeanShell PostProcessor:
String id = vars.get("userID");
FileWriter fstream = new FileWriter("JmeterBean.csv",true);
fstream.write(id+"\n");
fstream.close();
I did it but I got the same result, with null in my csv file.
If the JSON Extractor produces more than 1 match you don't have the userID variable, you will have something like:
userID_1=89BC331D723F
userID_2=FTH7JBDRF567
userID_matchNr=2
so I would recommend double checking which JMeter Variables are produced by the JSON Extractor using Debug Sampler and View Results Tree listener combination.
Since JMeter 3.1 it's recommended to use JSR223 Test Elements and Groovy language for scripting
Assuming all of the above, add a JSR223 PostProcessor (make sure it's located after the JSON Extractor) and use the following code:
1.upto(vars.get('userID_matchNr') as int, { number ->
    new File('JmeterBean.csv') << vars.get('userID_' + number) << System.getProperty('line.separator')
})
The below-mentioned solution worked for me:
OrderID = vars.get("primaryRefID");
FileWriter fstream = new FileWriter("C://Users/Documents/JMeter/OrderID.csv", true);
BufferedWriter out = new BufferedWriter(fstream);
out.write("TC-" + OrderID);
out.write(System.getProperty("line.separator"));
out.close();
fstream.close();
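The FileWriter snippets above leak the stream if a write throws; a try-with-resources variant closes it in every case. A standalone sketch (the class name is illustrative, and a temp file stands in for the real path):

```java
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class AppendLine {
    // Append one line to a CSV file; try-with-resources guarantees the
    // writer is closed even when write() fails.
    static void appendLine(String path, String line) throws IOException {
        try (BufferedWriter out = new BufferedWriter(new FileWriter(path, true))) {
            out.write(line);
            out.write(System.lineSeparator());
        }
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("orders", ".csv");
        tmp.deleteOnExit();
        appendLine(tmp.getPath(), "TC-12345");
        appendLine(tmp.getPath(), "TC-67890");
    }
}
```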

Read json file in jsr223 sampler in jmeter and extract data

import com.jayway.jsonpath.JsonPath
def idCSV = new File('id.csv')
def index = ['fileOne.json', 'fileTwo.json']
def jsonString
index.each { file ->
jsonString = ________
def ids = JsonPath.read(jsonString, '$..id')
ids.each { id ->
idCSV << id << newLine
}
}
How to fill in jsonString = ________, so that I can read the json file into a string and parse that string to extract the ids and some other information from it?
And I don't want to do it via http request -> GET -> file format.
Previously I extracted the jsonString from the http response and it worked well; now I want to do it this way.
Use JsonSlurper:
def jsonString = new groovy.json.JsonSlurper().parseText(new File("json.txt").text)
My expectation is that you're looking for File.getText() function
jsonString = file.text
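Groovy's file.text reads the whole file into a String; in plain Java the equivalent is Files.readString. A standalone sketch (class name illustrative, using a temp file for demonstration):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadWholeFile {
    // Read an entire file into a String: the Java analogue of Groovy's file.text.
    static String slurp(Path path) throws IOException {
        return Files.readString(path);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("json", ".txt");
        Files.writeString(tmp, "{\"id\": 1}");
        System.out.println(slurp(tmp)); // prints the JSON text back
        Files.delete(tmp);
    }
}
```

The resulting String can then be handed to whichever JSON path library the script uses, exactly as the question's loop intends.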
I have no full vision of why you need to store the values from JSON in a CSV file; however, there is an alternative way of achieving this which doesn't require scripting. Note that your approach will work with 1 concurrent thread only: if you add more users writing into the same file, you'll run into a race condition:
You can read the files from the folder into JMeter Variables via Directory Listing Config
The file can be read using the HTTP Request sampler
The values can be fetched using the JSON Extractor; they will be automatically stored into JMeter Variables so you will be able to use them later on
If you need the values to be present in the file (although I wouldn't recommend this approach because it will cause massive disk IO and can potentially ruin your test) you can go for the Flexible File Writer

NIFI: get value from json

I have a QueryCassandra processor which generates json like this one:
{"results":[{"term":"term1"},{"term":"term2"}..]}
Now, I want to get all the term values from this, separated by some separator, in string format; e.g.:
term1,term2,term3
So I can pass this list as a string parameter to a java main program which I've already set up.
(I only need the transformation, not the java program execution.)
Thank you!
You can easily get those values in the following way:
GetFile --> EvaluateJsonPath --> PutFile
In GetFile you have to specify the location of the json file.
In EvaluateJsonPath configure the following properties:
Destination: flowfile-attribute
Return Type: json
input.term1: $.results.[0].term // To get term
input.term2: $.results.[1].term
As the result of EvaluateJsonPath you have two attributes holding those values.
Result attributes:
input.term1: term1
input.term2: term2
The above works for me, so feel free to upvote/accept as answer.
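Since the goal is a single comma-separated string rather than separate attributes, the join itself can be sketched in plain Java; a regex is enough for the flat structure shown, though a real flow should prefer a JSON parser (class name illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class JoinTerms {
    // Pull every "term" value out of the result JSON and join them with commas.
    static String joinTerms(String json) {
        Matcher m = Pattern.compile("\"term\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        List<String> terms = new ArrayList<>();
        while (m.find()) {
            terms.add(m.group(1));
        }
        return String.join(",", terms);
    }

    public static void main(String[] args) {
        String json = "{\"results\":[{\"term\":\"term1\"},{\"term\":\"term2\"}]}";
        System.out.println(joinTerms(json)); // prints term1,term2
    }
}
```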
As a variant, use ExecuteScript with the groovy language:
import groovy.json.*
//get input file
def flowFile = session.get()
if(!flowFile)return
//parse json to map/array objects
def content = session.read(flowFile).withStream{ stream-> return new JsonSlurper().parse( stream ) }
//transform
content = content.results.collect{ it.term }.join(',')
//write new content
flowFile = session.write(flowFile,{ stream->
stream << content.getBytes("UTF-8")
} as OutputStreamCallback)
session.transfer(flowFile, REL_SUCCESS)

Using CSV file to read test data from

I need to test a various links of a site (no need to login) with 100's of users and loop it for some number of times using JMeter. I want to put those links in a "CSV file", so that all the links to be tested are read from file.
How do I accomplish this task?
Prepare a csv file with a list of your test params and use it to parametrize your test samplers, using at least one of the following:
CSV Data Set Config
Look into the following links for details:
How to get Jmeter to use CSV data for GET parameters?
Use jmeter to test multiple Websites
use csv parameters in jmeter httprequest path
Force a thread to use same input line when using CSV Data Set Config
Jmeter functions:
__CSVRead,
__StringFromFile.
Variables From CSV sampler from jmeter-plugins.
1. Prepare your test-urls in csv-file, e.g. in the following format:
url1
url2
...
urlN
Ensure that test-URLs don't contain http:// prefix (as per HTTP Request params -> Server).
2. Use schema for your script as below:
CSV Data Set Config:
Filename: [path to your csv-file with test-urls]
Variable Names: testURL
Recycle on EOF?: True
Stop thread on EOF?: False
Sharing mode: Current thread
Thread Group:
Number of Threads: N
Loop Count: M
HTTP Request // your http call
Server Name or IP: ${testURL} // use variable with extracted URL
This will start N users; each user will read M entries from the list of test-urls. If M > the number of entries in the list, the user will recycle the list on EOF.
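The recycle-on-EOF behaviour described above amounts to simple modular arithmetic over the file's lines; a standalone illustration (not JMeter code, class name illustrative):

```java
public class RecycleDemo {
    // With Recycle on EOF, loop iteration i reads line (i mod lineCount),
    // mirroring how CSV Data Set Config wraps around the file.
    static int lineFor(int iteration, int lineCount) {
        return iteration % lineCount;
    }

    public static void main(String[] args) {
        int lineCount = 4;            // entries in the url file
        for (int i = 0; i < 6; i++) { // Loop Count M = 6 > lineCount
            System.out.println("iteration " + i + " -> url line " + lineFor(i, lineCount));
        }
    }
}
```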
In one of the comments, it's mentioned that you can't read the CSV more than once per loop. You can have multiple threads, each reading the CSV file once, but then the file is closed and won't be read on the next loop. Also, if you set the CSV to recycle, the CSV file is read over and over again indefinitely. So the question becomes: how do you loop a CSV file a certain number of times, as opposed to indefinitely?
I posted my answer to that in another post (https://stackoverflow.com/a/64086009/4832515), but I'll copy & paste it in case that link doesn't work in the future.
I couldn't find a simple solution to this. I ended up using BeanShell scripts, which let you use code very similar to Java to do some custom stuff. I made an example JMeter project to demonstrate how to do this (yes, it's ridiculously complicated, considering all I want to do is repeat the CSV read):
Files:
my file structure:
JMeterExample
|
⊢--JMeterTests.jmx // the JMeter file
⊢--example.csv // the CSV file
contents of my CSV:
guest-id-1,"123 fake street",
guest-id-2,"456 fake street",
guest-id-3,"789 fake street",
So in this thread group, I'm going to have just 1 user, and I'll loop 2 times. I intend to send 1 request per CSV line, so there should be 6 requests sent in total.
Thread Group
User Defined Variables
This is kind of optional, but the filepath is subject to change, and I don't like changing my scripts just for a change in configuration. So I store the CSV filename in a "User Defined Variables" node.
If you are storing the CSV file in the same directory as your JMeter test, you can just specify the filename only.
If you are saving the CSV in a folder other than the directory containing your JMeter file, you will need to supply an absolute path, and then slightly modify the beanshell script below: you'll need to comment out the line that loads the file relatively, and comment in the line that loads from an absolute path.
BeanShell Sampler to parse and store CSV lines
Add a Beanshell Sampler which will basically take in a path, and parse & store each line as a variable. The first line will be stored as a variable called csv_line_0, the 2nd line will be csv_line_1 and so on. I know it's not a clean solution but... I can't find any clean simple way of doing this clean simple task. I copied and pasted my code below.
import org.apache.jmeter.services.FileServer;
import java.text.*;
import java.io.*;
import java.util.*;
String temp = null;
ArrayList lines = new ArrayList();
BufferedReader bufRdr;
// get the file
try {
    // you can use this line below if your csvFilePath is an absolute path
    // File file = new File("${csvFilePath}");
    // you can use this line below if your csvFilePath is a relative path, relative to where you saved this JMeter file
    File file = new File(FileServer.getFileServer().getBaseDir() + "/" + "${csvFilePath}");
    if (!file.exists()) {
        throw new Exception("ERROR: file ${csvFilePath} not found");
    }
    bufRdr = new BufferedReader(new InputStreamReader(new FileInputStream(file), "UTF8"));
} catch (Exception e) {
    log.error("failed to load file");
    log.error(e.getMessage());
    return;
}
// For each CSV line, save it to a variable
int counter = 0;
while (true) {
    try {
        temp = bufRdr.readLine();
        if (temp == null || temp.equals("<EOF>")) {
            break;
        }
        lines.add(temp);
        vars.put("csv_line_" + String.valueOf(counter), temp);
        counter++;
    } catch (Exception e) {
        log.error("failed to get next line");
        log.error(e.getMessage());
        break;
    }
}
bufRdr.close();
// store the number of CSV lines there are for the loop counter
vars.put("linesCount", String.valueOf(lines.size()));
Loop Controller
Add a Loop Controller that loops once for each CSV line. ${linesCount} is a count of the number of CSV lines and is calculated from the above beanShell script.
Beanshell script to extract data from current CSV Line
This script will run once per CSV line. It will go and grab the current line, and parse out whatever data is on it. You'll have to modify this script to get the data you want. In my example, I only had 2 columns, where column 1 is a "guestId", and column 2 is an "address".
__jm__loopController__idx is a variable JMeter defines for you, and is the index of the loop controller. The variable name is __jm__{loop controller name}__idx.
String index = vars.get("__jm__loopController__idx");
String line = vars.get("csv_line_" + index);
String [] tokens = line.split(",");
vars.put("guestId", tokens[0]);
vars.put("address", tokens[1]);
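One caveat with the extraction script: with the sample CSV above, split(",") leaves the surrounding double quotes on the address field. A small helper can strip them; a standalone sketch (class name illustrative, not part of the original script):

```java
public class CsvField {
    // Strip the optional surrounding double quotes that split(",") leaves behind.
    static String unquote(String field) {
        field = field.trim();
        if (field.length() >= 2 && field.startsWith("\"") && field.endsWith("\"")) {
            return field.substring(1, field.length() - 1);
        }
        return field;
    }

    public static void main(String[] args) {
        String[] tokens = "guest-id-1,\"123 fake street\",".split(",");
        System.out.println(unquote(tokens[1])); // prints 123 fake street
    }
}
```

Note this only handles simple quoting; fields containing embedded commas or escaped quotes need a real CSV parser.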
Http request sampler
Here's the HTTP request that's using the data extracted.
result
When running this, as desired, I end up sending 6 http requests over to the endpoint I defined.