How to get the CSV file alias for a thread in JMeter when the sharing mode is "Current thread"?

Here is my test plan structure.
User Login
Runtime Controller
  While Controller (condition: not EOF)
    CSV Data Set Config (items to add)
    Search and add to cart
    Click cart
    Proceed to checkout
    Order submit
    Beanshell Sampler to close the CSV
User Logout
I want each thread to read the CSV till EOF and add those items to the cart, hence I used the sharing mode "Current thread". Since add to cart and order submission are repeated for the whole test duration, I am closing the file and resetting the variable after the order is submitted, so that the next iteration starts reading from the beginning again.
The Beanshell code is:
import org.apache.jmeter.services.FileServer;

// Close the CSV file so that the next iteration starts reading from the first line again
FileServer.getFileServer().closeFile("Scripts_Helan\\DSOrderParts.csv");

// Reset the variable that drives the While Controller condition
String pPartNum = vars.get("pPartNum");
vars.put("pPartNum", "");
But when I run the test, the JMeter log shows the file name as
Stored: Scripts_Helan\DSOrderParts.csv Alias: Scripts_Helan\DSOrderParts.csv#1309262272
Don't I have to use the alias in closeFile? How can I get it?

I don't exactly get why you are using Beanshell code here.
You can handle the "start all over when done" part by setting up the Thread Group accordingly.
You can handle the "stop thread at end of file" part by setting up the CSV Data Set Config accordingly.
Please clarify what makes handling the file in Beanshell code necessary.
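That said, if the file really has to be closed from code, below is a minimal Beanshell sketch; it assumes that closing all open files is acceptable, since FileServer.closeFiles() closes every file the FileServer has opened and therefore sidesteps the per-thread alias suffix (the "#1309262272" part) without having to look it up:
import org.apache.jmeter.services.FileServer;

// Close every file the FileServer has opened for this test.
// Broader than closing a single alias, but no alias lookup is needed.
FileServer.getFileServer().closeFiles();

// Reset the variable the While Controller checks, so the next
// iteration starts reading from the first line again.
vars.put("pPartNum", "");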


SnappyData submit a jar to cluster with parameters

The SnappyData documentation gives an example of how to submit a jar to a cluster:
https://snappydatainc.github.io/snappydata/howto/run_spark_job_inside_cluster/
But what if I need to submit the jar with the same class CreatePartitionedRowTable multiple times, but with different parameters, say a different suffix to append to the names of the tables created? How do I do that?
UPDATE:
To be more precise, say I want to pass different parameters when I submit the jar, something like this:
bin/snappy-job.sh submit
--app-name CreatePartitionedRowTable
--class org.apache.spark.examples.snappydata.CreatePartitionedRowTable
--app-jar examples/jars/quickstart.jar
--lead localhost:8090
--CustomeParam suffix
The additional
--CustomeParam suffix
would be passed in to the job; the code could then pick up this parameter and append the suffix to the names of the tables it creates, so that I don't have to modify the code every time I want to submit the jar with a different suffix.
Update 2:
I just went through the examples and found an example usage:
https://github.com/SnappyDataInc/snappydata/blob/master/examples/src/main/scala/org/apache/spark/examples/snappydata/CreateColumnTable.scala
So basically it is run like this:
bin/snappy-job.sh submit
--app-name CreateColumnTable
--class org.apache.spark.examples.snappydata.CreateColumnTable
--app-jar examples/jars/quickstart.jar
--lead [leadHost:port]
--conf data_resource_folder=../../quickstart/src/main/resources
and the job uses its config to get the customized parameter.
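Applied to the original question, the submit command would then presumably look something like this (table_suffix is just a made-up key name; the job reads it back from the config object it receives):
bin/snappy-job.sh submit
--app-name CreatePartitionedRowTable
--class org.apache.spark.examples.snappydata.CreatePartitionedRowTable
--app-jar examples/jars/quickstart.jar
--lead localhost:8090
--conf table_suffix=_test1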
Each time you submit your app jar with snappy-job.sh, it will create a new job and run it. It could be the same jar with different contents. Do you see any exception, or is the modified class (CreatePartitionedRowTable) not getting picked up?

Automating Requests Through Postman

I have ~10,000 records in a CSV file that I need to run through Postman and check the output it gives.
I was wondering (because I've been doing this manually) if there's a way to import the CSV file into Postman, use the rows as variables, and just 'scroll' through each result it gives out.
Cheers,
Daniel
This feature exists in the Postman Collection Runner.
You have to make sure your CSV column names match the names of the variables in your collection.
You select your CSV file in the "data" part, just above the "start run" button, and when you run your collection it will parse your file, replace the variables with the values from the corresponding columns, and execute the request.
Then it will loop and get the next value and execute, and so on.
It just depends on the content of your CSV file and how you use it in your Postman request.
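For example, a data file like this (the column names are made up):
record_id,expected_name
1001,Alpha
1002,Beta
combined with a request that references a column as {{record_id}} (in the URL, headers or body) will be executed once per row, with that row's values substituted before the request is sent.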
You should have a look at these links:
https://www.getpostman.com/docs/postman/collection_runs/working_with_data_files
http://blog.getpostman.com/2014/10/28/using-csv-and-json-files-in-the-postman-collection-runner/
They will give you complete explanation about how it works.
Alex

How to run a particular HTTP request multiple times in JMeter?

I have multiple HTTP samplers in my Test Plan, but I want to run only one request multiple times based on a CSV Data Set Config; the remaining HTTP requests should run once that particular sampler has completed all its iterations over the CSV data.
In the image above I have 5 HTTP samplers and a CSV data set for the first request. Once the first request has run through the entire CSV test data, the other samplers should run. Please suggest how to perform this kind of operation in JMeter.
You can use a Loop Controller and provide the number of iterations you need.
To add a Loop Controller:
Right click on the Thread Group
Click on Add
Select Logic Controller
Select Loop Controller
According to the problem, there is a need to execute a particular request in a loop. The loop could be organized in the following way:
-> Thread Group
--> Loop Controller
---> Particular Request
--> Other Request
The CSV Data Set Config can be found under Config Elements and should contain the path to the file with the data (e.g. accounts.csv).
Both elements can be added via:
Loop Controller: Add -> Logic Controller -> Loop Controller
CSV Data Set Config: Add -> Config Elements -> CSV Data Set Config
The loop can run for the exact number of records in the file if the Loop Count is equal to the number of lines in accounts.csv, or it can run Forever, in which case it is stopped automatically by the test duration or manually by the Stop button.
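For example, if accounts.csv contains 100 data lines, setting the Loop Count to 100 makes the loop read each line exactly once before the remaining requests continue.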
The CSV Data Set Config should have the path to the file with the data.
If the file contains headers, the variables will be assigned based on the column names from the first line; otherwise it is necessary to provide the variable names in the configuration.
ANSWER: If you use 9 variables and all of them are set as header names in the file, then don't provide them in the CSV Data Set Config; they will be created automatically, so you can use all 9 variables through the names in your file.
If headers are not provided, list all 9 in the Variable Names field, using a comma as the separator.
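For illustration (shortened to three made-up columns instead of nine), a file like the following names its variables on the first line, so the Variable Names field can stay empty:
username,password,itemId
user1,pass1,1001
user2,pass2,1002
Each header name can then be referenced in the samplers as ${username}, ${password} and ${itemId}.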
Configure your CSV Data Set Config like:
Reference name: foo
Recycle on EOF: false
Stop thread on EOF: false
Put the CSV Data Set Config under the While Controller and put ${__javaScript("${foo}" != "<EOF>",)} into "Condition" input
Add If Controller under the While Controller and put "${foo}" != "<EOF>" into the "Condition" input
Put your first request as a child of the If Controller
Replace foo with your actual JMeter Variable reference name everywhere in the script.
This way JMeter will keep looping as long as there are entries left in the CSV file and will proceed with the remaining requests once the end of the CSV file is reached.
See Reading all Values from the CSV and Continue chapter of the Using the While Controller in JMeter article for more detailed information on configuring JMeter for your scenario.
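Putting those steps together, the resulting Test Plan outline would look roughly like this (sampler names are placeholders):
-> Thread Group
--> While Controller (Condition: ${__javaScript("${foo}" != "<EOF>",)})
---> CSV Data Set Config (Reference name: foo, Recycle on EOF: false, Stop thread on EOF: false)
---> If Controller (Condition: "${foo}" != "<EOF>")
----> First HTTP Request (the one driven by the CSV data)
--> Second HTTP Request
--> Remaining HTTP Requests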
You can verify this with only one variable; the remaining variables are not required in the If and While Controller conditions.

How to generate logs with a timestamp in Sikuli?

I want to generate logs with a timestamp in place of the popups I have mentioned.
Please suggest some steps for generating logs with a timestamp.
Code:
click("1450851018693.png")
wait(2)
click(Pattern("1450851091319.png").targetOffset(1,2))
click(Pattern("1450851555941.png").targetOffset(-201,1))
type("111")
click(Pattern("1450851201892.png").targetOffset(-13,2))
type("121")
wait(1)
if exists("1450851253342.png"):
    popup("start button is enabled")
    click("1450851253342.png")
In the above code, instead of popups I want the messages to be logged to a file with a timestamp.
Please help.
You can use Python's logging module by importing it and getting a logger instance:
import logging

# %(asctime)s adds a timestamp to every log record
FORMAT = '%(asctime)-15s %(message)s'
logging.basicConfig(format=FORMAT)  # pass filename='sikuli.log' here to write to a file instead of the console
logger = logging.getLogger('')
Then use it in your code (by default, level INFO is not printed, but WARNING is):
logger.warning('My message')
You should then have a log entry like:
2016-03-07 13:10:43,151 My message
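In the script from the question, each popup(...) call can then simply be replaced with a logger.warning(...) call carrying the same message.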
See the Python logging documentation for a description and a basic tutorial.

Manipulating a JSON string with JMeter

I'm new to JMeter and web applications in general, and I need some help with a JSON POST.
I have an application with a POST request that sends a JSON string to save the data that was created or changed.
Here is an example of the JSON sent through the POST request:
{"ID":0,"Description":"Test 1"}
With that, the first user will create a new registry "Test 1", and the second user will create another registry with the same description. I'd like to be able to set the ID and Description as variables so I can manipulate them. When the POST receives ID 0 it creates a new registry; when it receives a specific ID it updates that registry.
I'm trying to simulate scenarios like: user 1 creates and user 2 updates, or user 1 creates and user 2 creates a different one.
If your aim is to update the ID and the Description at run time,
keep the ID and the Description in a CSV file (something like below):
0, Test1
0, Test1
1, Test2
Use "CSV Data Set Config" (under Config element) to read the CSV file. update the file name and enter "ID,Description" in the variable names.
Update the JSON as
{"ID":${ID},"Description":"${Description}"}
This approach will read the data from the CSV, update the JSON at run time, and send the request.
The first thread/loop will take the first row and the second thread will take the second row.
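For example, with the sample file above, the first row results in a request body like {"ID":0,"Description":"Test1"}, which creates a new registry, while the third row results in {"ID":1,"Description":"Test2"}, which updates registry 1.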
It should work; let me know if anything comes up.