How to maintain sessionid across different thread groups - JMeter

I'm doing API testing with JSON payloads in JMeter.
My JMeter Test Plan looks like this:
Test Plan
Thread Group 1 (run once)
- Login
Thread Group 2 (I will run this multiple times)
- Do some operation
Thread Group 3 (run once)
- Logout
I want to pass sessionid from Thread Group 1 to Thread Group 2 and 3.

To extract the sessionId, use a Regular Expression Extractor.

You can use the following code to pass a value to another thread group using a Beanshell PostProcessor element in JMeter.
Beanshell code to save the value as a JMeter property:
import org.apache.jmeter.util.JMeterUtils;
JMeterUtils.setProperty("propname", "value");
Beanshell code to retrieve the value in another thread group:
import org.apache.jmeter.util.JMeterUtils;
vars.put("localvariable", JMeterUtils.getProperty("propname"));
String testVar = vars.get("localvariable");
log.info("# NEXT THREAD GROUP value=" + testVar);
This code uses JMeter's getProperty()/setProperty() API to pass the values. Alternatively, the JMeter Plugins project has an Inter-Thread Communication plugin.
Hope this helps. :)
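The reason properties work here while variables don't is scope: JMeter variables (vars) are local to a single thread, whereas properties (props) are shared across the whole JVM. A rough Python analogy (not JMeter API, just a sketch of the thread-local vs. shared distinction):

```python
import threading

local_vars = threading.local()  # like JMeter "vars": one copy per thread
props = {}                      # like JMeter "props": shared by all threads

def login():
    local_vars.session_id = "abc123"            # visible only to this thread
    props["sessionid"] = local_vars.session_id  # published for all threads

def do_operation(results):
    # Another thread can read the shared property, but it could never see
    # the login thread's local variable.
    results.append(props.get("sessionid"))

t1 = threading.Thread(target=login)
t1.start(); t1.join()

results = []
t2 = threading.Thread(target=do_operation, args=(results,))
t2.start(); t2.join()

print(results[0])  # abc123
```

This is exactly why the Beanshell code above copies the value into a property in Thread Group 1 and back into a local variable in Thread Groups 2 and 3.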

The How to use Beanshell guide contains an example of sharing cookies between different thread groups; scroll down to the Advanced Examples section.
If your "session" is a cookie-based session, you'll need to do the following:
Add an HTTP Cookie Manager to the Test Plan (or to each thread group if you prefer)
Tell the Cookie Manager to save cookies as variables by setting the CookieManager.save.cookies property to true, either in the jmeter.properties file (which lives under the /bin folder of your JMeter installation) or as a command-line argument:
jmeter -JCookieManager.save.cookies=true -n ... -t ... -l ...

Another approach is to have only one Thread Group instead of three and add an HTTP Cookie Manager to it, with a Loop Controller to run the operation multiple times.
Your test can be structured as:
Test Plan
Thread Group
- Cookie Manager
- Login
- Loop Controller (run this multiple times)
  - Do some operation
- Logout


Cloud Build fails due to Cloud Function container not being ready yet

We have a Google Cloud Build pipeline that does the following:
Create a temp function
Run some tests
Deploy the production function
However, the second step often fails because the container from the first step is not ready yet:
Step #3: Unexpected token '<' at 2:1
Step #3: <html><head>
Step #3: site unreachable
It looks like it is returning some placeholder HTML from nginx.
How can we fix that?
Currently, we just put an ugly sleep between the steps.
You probably want to have a look at the Cloud Functions API. There you can find the operations endpoint, which will tell you whether the operation is finished (assuming v1; otherwise look below): https://cloud.google.com/functions/docs/reference/rest/v1/operations/get
The operation ID is the same one returned by the creation call. You can also list operations with the list endpoint (in the same doc).
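In a build step this amounts to polling the operation until it reports done, instead of sleeping a fixed amount. A minimal sketch, where operation_done is a hypothetical stand-in for a call to GET https://cloudfunctions.googleapis.com/v1/{operation_name} returning the response's "done" field:

```python
import time

def wait_for_operation(operation_done, timeout=300, interval=5):
    """Poll until the deployment operation reports done, or give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if operation_done():
            return True
        time.sleep(interval)
    raise TimeoutError("function deployment did not finish in time")

# Simulated operation that becomes ready on the third poll:
state = {"polls": 0}
def fake_done():
    state["polls"] += 1
    return state["polls"] >= 3

print(wait_for_operation(fake_done, timeout=10, interval=0))  # True
```

The timeout keeps a stuck deployment from hanging the pipeline forever, which the bare sleep never guaranteed either.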

Execution ID on Google Cloud Run

I am wondering whether there is an execution ID in Cloud Run like the one in Google Cloud Functions?
An ID that identifies each invocation separately; it's very useful with "Show matching entries" in Cloud Logging to get all logs related to one execution.
I understand the execution model is different (Cloud Run allows concurrency), but is there a workaround to assign each log to a certain execution?
My final need is to group the request and the response together. As of now, I am printing them separately, and if a few requests arrive at the same time, I can't see which response corresponds to which request...
Thank you for your attention!
OpenTelemetry looks like a great solution, but the learning and setup time isn't negligible, so I'm going with a custom ID created in before_request, stored in Flask's g, and included in every print().
import uuid

from flask import Flask, g

app = Flask(__name__)

@app.before_request
def before_request_func():
    # Generate a unique ID per request so all its logs can be correlated
    g.execution_id = uuid.uuid4()
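If you'd rather not thread the ID through every print() call by hand, the standard library can stamp it on every log line for you. A sketch (not tied to Flask) using contextvars plus a logging filter; the names ExecutionIdFilter and handle_request are made up for the example:

```python
import contextvars
import logging
import uuid

# One execution ID per request context, defaulting to "-" outside a request.
execution_id = contextvars.ContextVar("execution_id", default="-")

class ExecutionIdFilter(logging.Filter):
    def filter(self, record):
        # Attach the current request's ID to every record so the formatter
        # below can print it.
        record.execution_id = execution_id.get()
        return True

logging.basicConfig(format="%(execution_id)s %(message)s")
logger = logging.getLogger("app")
logger.addFilter(ExecutionIdFilter())

def handle_request():
    execution_id.set(uuid.uuid4().hex)  # one ID per request
    logger.warning("request received")
    logger.warning("response sent")     # same ID on both lines

handle_request()
```

Every log line from one request then carries the same prefix, so "Show matching entries" on that ID groups the request with its response.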

How can I use multiple JSON files in a Thread Group (bzm - Concurrency Thread Group) in JMeter 5.1.1 to do concurrent testing?

Using JMeter 5.1.1, I want to do concurrent testing.
I use a "Thread Group" or "bzm - Concurrency Thread Group" in a "Test Plan" in JMeter.
I have to use multiple JSON files,
so I created multiple HTTP requests and fed a JSON file into each of them.
my JMeter screenshot
multiple JSON files in multiple HTTP requests
Is this concurrent testing?
How can I use multiple JSON files in one HTTP request?
In your scenario JMeter will kick off 3 threads in 1 second and let them run for another second. The number of HTTP Request samplers which will be executed depends on your application's response time. If 3 virtual users sending requests for 1 second is what you call "concurrent testing", then yes. You can always check how many virtual users were online during each test phase using the Active Threads Over Time listener.
Looking at the structure of your requests, you can achieve the same with a single HTTP Request sampler. JMeter offers a number of functions which can be used for parameterization; for example, you can use:
__time() function as createdDate
__Random() function as Id
__RandomString() function as LastName and/or Title
Check out the Apache JMeter Functions - An Introduction article for more information on the JMeter Functions concept.
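Putting those functions together, the single HTTP Request sampler's body could look something like the fragment below. The field names (Id, LastName, Title, createdDate) come from the answer above; the exact payload layout in your screenshot may differ, so treat this as a sketch:

```json
{
  "Id": "${__Random(1,100000,)}",
  "LastName": "${__RandomString(10,abcdefghijklmnopqrstuvwxyz,)}",
  "Title": "${__RandomString(5,abcdefghijklmnopqrstuvwxyz,)}",
  "createdDate": "${__time(yyyy-MM-dd'T'HH:mm:ss,)}"
}
```

Each thread evaluates the functions independently on every iteration, so one sampler produces as many distinct JSON payloads as you need, replacing the per-file HTTP requests.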
I added multiple "Thread Group" elements to the "Test Plan". In the "Test Plan", unselect "Run Thread Groups consecutively".
In every "Thread Group", configure "Number of Threads (users)" as "1", "Ramp-Up Period" as "0", and "Loop Count" as "1".
Now add an "HTTP Request" to each "Thread Group". In every "HTTP Request" you can add your JSON file and configure the URL.
Configure all the "HTTP Request" samplers accordingly.
Now add a "View Results Tree" listener to the "Test Plan".
Now run the Test Plan.
example of using multiple JSON files to do concurrent testing

Supplying 1 JDBC request result to multiple threads in JMeter

I'm facing an issue in JMeter. The API I am testing gets its parameters from a prior JDBC request.
This works fine when there is only 1 thread, but when I run multiple threads it throws the error below:
{"Message": "A transient error has occurred. Please try again. (1205)","Data":null}
Here is the screenshot
I need to run 5 threads without having to run the JDBC request 5 times.
I can retrieve 5 results in 1 JDBC call and supply them sequentially to each of the threads. Is this possible? How can I do this?
Worst case, I will have to manually set up a CSV file instead of JDBC calls.
Normally people use a setUp Thread Group for test data preparation and a tearDown Thread Group for eventual clean-up. I would suggest moving your JDBC Request under a setUp Thread Group and running it with 1 virtual user.
If you have to keep the test plan structure as it is and can amend the SQL query to return more results, be aware that, according to the JDBC Request sampler documentation, the results look like:
myVar_#=5
myVar_1=foo
myVar_2=bar
myVar_3=baz
myVar_4=qux
myVar_5=corge
Therefore you can access the values using a combination of the __V() and __threadNum() functions, like:
${__V(myVar_${__threadNum},)}
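How that nesting resolves: __threadNum() evaluates first (1 for the first thread, 2 for the second, and so on), so thread 3 ends up reading ${myVar_3}. A Python sketch of the same distribution scheme, using the sample values above:

```python
import threading

# Variables as the JDBC Request sampler would store them (one row each):
jdbc_vars = {
    "myVar_#": "5",
    "myVar_1": "foo", "myVar_2": "bar", "myVar_3": "baz",
    "myVar_4": "qux", "myVar_5": "corge",
}

picked = {}
def worker(thread_num):
    # Each thread picks the row matching its own (1-based) thread number,
    # mirroring ${__V(myVar_${__threadNum},)} in JMeter.
    picked[thread_num] = jdbc_vars[f"myVar_{thread_num}"]

threads = [threading.Thread(target=worker, args=(n,)) for n in range(1, 6)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(picked[3])  # baz
```

One JDBC call thus feeds all 5 threads, each consuming a different row, so the database is hit only once.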

How do I get the Junit test results using Email-ext in Jenkins

What do I need to add to the default html_gmail.jelly script to have it show the classes that were tested, including how many tests were run within each class?
When a Jenkins job is complete you can drill down to the Junit Test Results in an address that looks like:
http://somecompany.jenkins.com/view/App_Automation/job/Application_Under_Test/129/testReport/com.AUT.testing.mobile/
The test results are generated by the build.xml, so is it just a matter of pointing to that XML file?
The email-ext page shows a clean example but not the tokens that are used to achieve that: http://wiki.hudson-ci.org/download/attachments/3604514/html.jpg
Currently, using the ${FAILED_TESTS} token generates a nice Tested; Failed; Skipped number, but nothing that points to which tests passed/failed/skipped. I would like to show the total number of tests, including which tests were actually run.
Thanks ahead of time
OK, I figured out how to display the passed and failed methods by adding var="pass" or var="fail" to the token of those assignments.
First, go to the Jelly script at this path:
~/.hudson/plugins/email-ext/WEB-INF/classes/hudson/plugins/emailext/templates/automation.jelly
$DEFAULT_SUBJECT (${build.testResultAction?.failCount} ${build.testResultAction?.failureDiffString})
SETTING UP THE CONFIG IN JENKINS
DEFAULT SUBJECT:
$PROJECT_NAME - Build # $BUILD_NUMBER - $BUILD_STATUS!
DEFAULT CONTENT:
$PROJECT_NAME - Build # $BUILD_NUMBER - $BUILD_STATUS:
Check console output at $BUILD_URL to view the results.
Changes:
${CHANGES}
Changes Since Last Success
${CHANGES_SINCE_LAST_SUCCESS}
Failed Tests:
${FAILED_TESTS}
Build Log:
${BUILD_LOG}
Total Amount of Tests:
Total = ${TEST_COUNTS}
Failed = ${TEST_COUNTS,var="fail"}
Passed = ${TEST_COUNTS,var="pass"}
Job Description:
${JOB_DESCRIPTION}
Place this in the email job
${JELLY_SCRIPT,template="html-with-health-and-console"}
Note: the available templates live under ~/.hudson/plugins/email-ext/WEB-INF/classes/hudson/plugins/emailext/templates/, or you can create your own.