Can anyone tell me how I can loop over the same CSV file twice without having to stop the thread?
Currently I have
Thread Group
+-- While Controller
+-- CSV (True,True,False)
Loop Controller (2)
+-- SOAP/RPC Request
So the idea is to retrieve 5 requests and then repeat the same 5 requests again.
NOTE: I do not want to send the same request twice.
Assuming your question means "I want to reuse the data set, but do not want to send the same request concurrently", you should be able to set up the following configuration. You shouldn't need a While Controller.
Thread Group
* Loop Count = 2
* Thread Count = 5
|
+-- CSV
* Recycle on EOF = true
* Stop thread on EOF = false
* Sharing mode = All threads
To break that down:
The Loop Count on the Thread Group will run the entire thread group the specified number of times, with 5 threads, each of which will make one request from your CSV file.
The "Recycle" and "Stop thread" settings on the CSV Data Set config will cause the data set to be reused until the threads have completed.
Another way of accomplishing this is to set up a loop count as a multiple of the number of URLs you are going to sample:
Thread Group
* Loop Count = 10
* Thread Count = 1
|
+-- CSV
* Recycle on EOF = true
* Stop thread on EOF = false
* Sharing mode = All threads
That's slightly less efficient, since it means the requests have to execute serially rather than in parallel threads, but it is another way of doing it.
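Conceptually, "Sharing mode = All threads" gives the whole thread group one shared, recycling cursor over the file, and the Loop Count decides how many rows each thread pulls. A rough Python sketch of that behaviour (illustrative names, not JMeter internals):

```python
import itertools
import threading

# Five rows, standing in for the 5-line CSV file (names are illustrative).
rows = ["user1", "user2", "user3", "user4", "user5"]
shared_cursor = itertools.cycle(rows)   # Recycle on EOF = true
lock = threading.Lock()                 # Sharing mode = All threads
counts = {row: 0 for row in rows}

def run_thread(loop_count):
    for _ in range(loop_count):         # Thread Group Loop Count = 2
        with lock:
            counts[next(shared_cursor)] += 1

threads = [threading.Thread(target=run_thread, args=(2,)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counts)  # every row is taken exactly twice, never concurrently
```

With 5 threads each looping twice over a 5-row cycle, every row is consumed exactly two times, which matches "reuse the data set without duplicating a request in flight".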
Below is my test plan for reading data from multiple CSV files. I want to test a scenario like:
1. 10 users perform an operation on 100 documents. Ideally each user should get 10 documents and perform the operation on them.
TestPlan
Thread Group
While controller
LoginUserDataConfig
LoginRequestRecordingController
HTTPLoginRequest
DocumentOperationRecordingController
DocIDList
HttpSaveRequest
But with the above plan it takes only 10 documents and then stops. I ran the script with different CSV Data Set Config settings, changing Sharing mode to All Threads\Current Thread, but I am not getting the desired output.
Can anyone correct my test plan?
Thread Settings:
Number of Thread: 10
Ramp-Up Period: 2
loop count: 1
LoginUserDataConfig Settings:
Allowed Quoted Data: False
Recycle on EOF? False
Stop Thread on EOF: True
Sharing mode: Current Thread Group
DocIDList Settings:
Allowed Quoted Data: False
Recycle on EOF? False
Stop Thread on EOF: True
Sharing mode: Current Thread Group
You should mark the loop count as Forever; it will then continue until the end of the CSV file (100 IDs) is reached.
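With the loop set to forever and Stop Thread on EOF = True, each of the 10 threads keeps taking rows from the shared file until it runs dry, so the 100 document IDs are spread across the users. A rough Python sketch of that behaviour (illustrative, not JMeter internals):

```python
import queue
import threading

# 100 document IDs, standing in for the rows of DocIDList (illustrative).
doc_ids = queue.Queue()
for i in range(100):
    doc_ids.put("doc%03d" % i)

processed = {}  # user name -> number of documents handled

def user_thread(name):
    handled = 0
    while True:                    # While Controller / Loop Count = Forever
        try:
            doc_ids.get_nowait()   # take the next row from the shared file
        except queue.Empty:        # EOF reached: Stop Thread on EOF = True
            break
        handled += 1
    processed[name] = handled

threads = [threading.Thread(target=user_thread, args=("user%d" % u,))
           for u in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(processed.values()))  # 100: every document processed exactly once
```

Each document is processed exactly once in total; the per-user split is roughly even but depends on thread scheduling, just as it does in JMeter.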
I have code to calculate the mean of the first five values of each column of a file, and I then use these values as a reference point for the whole set. The problem is that now I need to do the same for many files, so I need to obtain the mean of each file and then use these values again with the original files. I have tried it this way, but I get an error. Thanks.
%%% - Loading the file of each experiment
myfiles = dir('*.lvm'); % To load every file of .lvm
for i = 1:length(myfiles) % Loop with the number of files
files=myfiles(i).name;
mydata(i).files = files;
mydata(i).T = fileread(files);
arraymean(i) = mean(mydata(i));
end
The files that I need to compute are more or less like this:
Delta_X 3.000000 3.000000 3.000000
***End_of_Header***
X_Value C_P1N1 C_P1N2 C_P1N3
0.000000 -0.044945 -0.045145 -0.045705
0.000000 -0.044939 -0.045135 -0.045711
3.000000 -0.044939 -0.045132 -0.045706
6.000000 -0.044938 -0.045135 -0.045702
Your first line results in 'myfiles' being a structure array with components that you will find defined when you type 'help dir'. In particular, the names of all the files are contained in the structure element myfiles(i).name; to display all the file names, type myfiles.name. So far so good. In the for loop you use 'fileread', but fileread (see 'help fileread') returns the character string rather than the actual values. I have named your prototype .lvm file DinaF.lvm and written a very, very simple function to read the data in that file: it skips the first three lines, then stores the following matrix, assumed to have 4 columns, in an array called T inside the function and arrayT in the main program.
Here is a modified script, where a function read_lvm has been included to read your 'model' lvm file.
The '1' in the first line tells Octave that there is more to the script than just the following function: the main program has to be interpreted as well.
1;
function T=read_lvm(filename)
fid = fopen (filename, "r");
%% Skip by first three lines
for lhead=1:3
junk=fgetl(fid);
endfor
%% Read nrow lines of data, quit when file is empty
nrow=0;
while (! feof (fid) )
nrow=nrow + 1;
thisline=fscanf(fid,'%f',4);
T(nrow,1:4)=transpose(thisline);
endwhile
fclose (fid);
endfunction
## main program
myfiles = dir('*.lvm'); % To load every file of .lvm
for i = 1:length(myfiles) % Loop with the number of files
files=myfiles(i).name;
arrayT(i,:,:) = read_lvm(files);
columnmean(i,1:4)=mean(arrayT(i,:,:))
end
Now the tabular values associated with each .lvm file are in the array arrayT, and the mean for that data set is in columnmean(i,1:4). If i>1 then columnmean is an array, with each row containing the column means for the corresponding .lvm file.
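For comparison, the same skip-the-header-then-average logic, applied to the sample data above, can be sketched in Python (purely illustrative; the Octave function above is the actual answer):

```python
sample = """Delta_X 3.000000 3.000000 3.000000
***End_of_Header***
X_Value C_P1N1 C_P1N2 C_P1N3
0.000000 -0.044945 -0.045145 -0.045705
0.000000 -0.044939 -0.045135 -0.045711
3.000000 -0.044939 -0.045132 -0.045706
6.000000 -0.044938 -0.045135 -0.045702"""

# Skip the three header lines, then parse the 4-column matrix.
rows = [[float(v) for v in line.split()] for line in sample.splitlines()[3:]]
ncols = len(rows[0])
colmean = [sum(r[c] for r in rows) / len(rows) for c in range(ncols)]
print(colmean)  # one mean per column; the first column averages to 2.25
```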
This discussion is getting to be too distant from the initial question. I am happy to continue to help. If you want more help, close this discussion by accepting my answer (click the swish), then ask a new question with a heading like 'How to read .lvm files in Octave'. That way you will get the insights from many more people.
I am writing a load-testing script for a RADIUS server using Tcl and Expect.
I am invoking radclient, which comes built in with the RADIUS server, from my script on a remote server.
The script does the following:
- take remote server IP
- spawn ssh to remote server
- invoke radclient
- perform load test using radclient commands
- collect the result from the output (as shown in the sample output) into a variable
- extract authentications/sec as transactions per second (TPS) from the output or from the variable in the previous step
I need help with the last two steps.
Sample output from radclient:
*--> timetest 20 10 2 1 1
Cycles: 10, Repetitions: 2, Requests per Cycle: 10
Starting User Number: 1, Increment: 1
Current Repetition Number=1
Skipping Accounting On Request
Total Requests=100, Total Responses=100, Total Accepts=0 Total Not Accepts=100
1: Sending 100 requests and getting 100 responses took 449ms, or 0.00 authentications/sec
Current Repetition Number=2
Skipping Accounting On Request
Total Requests=100, Total Responses=100, Total Accepts=0 Total Not Accepts=100
2: Sending 100 requests and getting 100 responses took 471ms, or 0.00 authentications/sec
Expected Output:
TPS achieved = 0
You might use something like this:
expect -re {([\d.]+) authentications/sec}
set authPerSec $expect_out(1,string)
puts "TPS achieved = $authPerSec"
However, that's not to say that the information extracted is the right information. For example, when run against your test data it is likely to come unstuck, as there are two places where authentications/sec appears because of the repetitions; we don't account for that at all! More complex patterns might extract more information, and so on.
expect {
-re {([\d.]+) authentications/sec} {
set authPerSec $expect_out(1,string)
puts "TPS achieved #[incr count] = $authPerSec"
exp_continue
}
"bash$" {
# System prompt means stop expecting; tune for what you've got...
}
}
Doing the right thing can be complex sometimes…
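For what it's worth, the same capture can be checked outside Expect; a Python equivalent over the captured output shows that the pattern matches once per repetition (illustrative only):

```python
import re

# The relevant lines from the radclient sample output above.
output = """1: Sending 100 requests and getting 100 responses took 449ms, or 0.00 authentications/sec
2: Sending 100 requests and getting 100 responses took 471ms, or 0.00 authentications/sec"""

# Same pattern as the Expect script: capture the number before the unit.
rates = re.findall(r"([\d.]+) authentications/sec", output)
print(rates)                        # one match per repetition
print("TPS achieved = %s" % rates[-1])
```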
In my project I am using JMeter for load testing.
I have created a test plan as follows:
Login to the application using a CSV file which has 10 unique usernames and passwords (successfully done)
A regular expression extractor will get the session id from the response which will be used in the next sampler for submitting a request. (sessionId extracted successfully)
The next sampler will submit a request. This sampler accepts two parameters: sessionId & licenseRequest. Some details of the JavaScript call are as follows:
url: "groovy/enqueue",
type: "POST",
global: false,
data: {sessionId: uSessionId, licenseRequest: JSON.stringify(requestJSON) },
dataType: "text",
For submitting the request I have created a CSV file.
The CSV looks like this:
Entry 1:
{"activations":["<activation-code>","<activation-code>"],"email":"<emailIdofUser>","csvEntries":[{"model":"<modelname>","serial":"<serialNo>"}],"comment":"testing jmeter"}
What I have found is that when submitting the request in the second sampler, the POST data is malformed:
POST data:
sessionId=vZNjFjW38cid&licenseRequest=%3CEOF%3E
As you can see, the licenseRequest value is not correct. It is sending <EOF>, which is not desired.
Ensure that you have set the CSV Data Set Config's Recycle on EOF and Stop Thread on EOF values correctly:
if you plan to iterate over the file more than once, you must set Recycle on EOF = True (i.e. instruct JMeter to move back to the top of the CSV file);
set Stop Thread on EOF = False if you are using a Loop Controller, or Stop Thread on EOF = True if you are using a While Controller and want to stop after reading the whole CSV data set;
if you need each of N threads to read and use a single, unique line from the CSV file, set Sharing mode: Current thread group on the CSV Data Set Config (in this case the number of CSV entries should match the number of threads, or otherwise Recycle on EOF = False should be set);
if you need each of N threads to read and use all lines from the CSV file, set Sharing mode: Current thread on the CSV Data Set Config.
Don't forget to look into jmeter.log or use the Log Viewer to detect any issues with CSV usage.
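The practical difference between the sharing modes can be sketched with plain Python iterators (a deliberate simplification of what JMeter does internally):

```python
rows = ["row1", "row2", "row3", "row4"]

# Sharing mode = All threads: one cursor shared by every thread.
shared = iter(rows)
first_reads_shared = [next(shared) for _ in range(4)]  # 4 threads, first read each
print(first_reads_shared)   # each thread gets a different row

# Sharing mode = Current thread: every thread has its own private cursor.
cursors = [iter(rows) for _ in range(4)]
first_reads_private = [next(c) for c in cursors]
print(first_reads_private)  # every thread starts at row1
```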
The simplest case is like the following:
Test Group
Number of Threads = 10
CSV Data Set Config (User Logins)
Filename: ... (your csv-file should have 10 entries)
Recycle on EOF = False
Stop Thread on EOF = True
Sharing Mode = All threads
CSV Data Set Config (License Request)
Filename: ... (your csv-file should have 10 entries)
Recycle on EOF = False
Stop Thread on EOF = True
Sharing Mode = All threads
Login Sampler
License request Sampler
This will produce 10 threads, each with a separate login entry and license-request entry from the CSVs.
Scenario:
Package#1 creates a flat file that contains multiple messages (one per line) that is dropped in an external system's "INPUT" folder. The file is picked up by the external system and processed and the responses are written in the same format to a file in the "OUTPUT" folder. The file starts to be written while the external system is still processing messages, so it is written as foo.rsppro. When processing is complete and all response messages are written it is renamed foo.rsp.
I need to pick up that file once it is complete (i.e. after the rename) and process it using Package#2, which will start immediately after Package#1. When Package#2 starts, the external system could be in three states:
Processing the first message not yet complete and no response file written yet, in which case I need to wait for foo.rsppro to be written, then renamed to foo.rsp
Processing in progress and foo.rsppro written, in which case I need to wait for foo.rsppro to be renamed to foo.rsp
Processing completed, foo.rsppro has been written and been renamed to foo.rsp, in which case I just need to process foo.rsp.
I have tried:
using a File In Use task, but that errors if the expected file isn't present when the task begins (i.e. it errors for scenarios 1 and 2)
using a File Watcher Task, but that appears to ignore file renames by design, so it will never handle scenarios 1 or 2
Aside from building a script task, is there a custom task that will handle all three scenarios?
Edit: SSIS 2008 R2
Only a script task can help in your case.
Consider using a FileSystemWatcher within the script if possible, or have an application/Windows service that monitors the file system using FileSystemWatcher and invokes your packages when the event is triggered.
Hmm, it seems that you can solve it by using a Foreach Loop container on the output folder, set to read only .rsp files. That would deal with your .rsp files.
How can scenarios 1 and 2 happen if package 2 only runs after package 1 finishes? As I understand it, package 1 renames the file, so it will only end when all the files are processed and renamed.
EDIT:
OK, no worries, there is a solution for everything.
How about this: create a variable in package 1 called #TotalNumberOfFiles with the total number of files to be processed, then use package 1 to call package 2 (not sure if you are doing this already, but if not it is very simple, just use an Execute Package task). In package 2 create a "parent package variable" (this is very simple too, in case you have never done it), and have package 2 start processing only when there are #TotalNumberOfFiles files with the .rsp extension in the output folder.
EDIT2:
I don't know if there is a command to get that, maybe Google it, but if you don't find one you can add a Foreach Loop container pointing to the output directory and do something like this in a script task:
Public Sub Main()
    'Increment the running count (note: SSIS variable names are case-sensitive, so use one consistent name)
    Dts.Variables("User::FilesCount").Value = CType(Dts.Variables("User::FilesCount").Value, Integer) + 1
    Dts.TaskResult = ScriptResults.Success
End Sub
After it finishes counting, just compare with TotalNumberOfFiles. If they are equal, move to the next task; otherwise sleep for a while and count again.
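That count-then-wait idea looks roughly like this in Python (a sketch only; wait_for_rsp_files is a hypothetical helper name, and the real implementation would live in an SSIS script task):

```python
import glob
import os
import tempfile
import time

def wait_for_rsp_files(folder, total_files, interval_s=1.0, max_attempts=30):
    """Poll 'folder' until it holds 'total_files' completed *.rsp files."""
    for _ in range(max_attempts):
        if len(glob.glob(os.path.join(folder, "*.rsp"))) >= total_files:
            return True          # all responses are complete: move on
        time.sleep(interval_s)   # not there yet: sleep and count again
    return False                 # gave up after max_attempts

# Quick demonstration against a throwaway OUTPUT folder:
output = tempfile.mkdtemp()
for name in ("foo1.rsp", "foo2.rsp", "foo3.rsppro"):  # .rsppro = still being written
    open(os.path.join(output, name), "w").close()

ready = wait_for_rsp_files(output, 2, interval_s=0.01, max_attempts=3)
print(ready)  # True: the two completed .rsp files are present
```

Because the glob only matches *.rsp, an in-progress foo.rsppro is ignored until the external system renames it, which is exactly the completion signal described in the scenario.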
The final code used is as follows. Basically it loops until either the file is found or the maximum specified number of attempts is hit.
Imports System.Threading is required for Thread.Sleep. It may not be the most processor-efficient method, but this is 100% dedicated hardware and the packages are running in serial.
'Loop until the number of required attempts is hit or the file is found
Do Until iCounter = iAttempts Or bFileFound = True
    'Check if the file exists
    If File.Exists(sFilename) Then
        'Switch bFileFound to true
        bFileFound = True
        'Report that the file has been found via the VERIFY_Input_File_Exists_INT variable
        Dts.Variables("VERIFY_Input_File_Exists_INT").Value = True
        Dts.Events.FireInformation(1, "DEBUG:", sFilename & " found successfully.", "", 0, False)
    Else
        'Sleep for the specified time
        Thread.Sleep(iInterval * 1000)
        Dts.Events.FireInformation(1, "DEBUG:", sFilename & " not found. Sleeping for " & iInterval & " seconds.", "", 0, False)
    End If
    'Increment the counter
    iCounter = iCounter + 1
Loop