JMeter HTML Reporting - Why is "infinity" appearing for throughput and network speed?

This appeared for two larger requests, neither of which failed/errored, in a test case with a single user run.
However, this does not appear for the five-user run of the same test case.
I haven't been able to find any documentation on Apache regarding the appearance of infinity during test runs.
Has anyone faced this? If so, did you find a way to get the reporting tool to list the true numeric value?
Example of "infinity" appearing in the statistics.json:

If you have "Infinity" in the statistics.json it means that the relevant Sampler has failed somewhere somehow (it hasn't been executed for some reason).
The reason can be found in:
the .jtl results file (take a look at the "responseMessage" column)
the jmeter.log file
If you want to see where the values come from and how the statistics are built and processed, increase JMeter's logging verbosity for the HTML Reporting Dashboard packages by adding the following line to the log4j2.xml file:
<Logger name="org.apache.jmeter.report.dashboard" level="debug" />
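For context, this line goes inside the <Loggers> section of the log4j2.xml file in JMeter's bin folder, alongside the loggers that are already defined (a minimal sketch; the existing entries are elided):
<Loggers>
  <!-- keep the existing Logger and Root definitions as they are -->
  <Logger name="org.apache.jmeter.report.dashboard" level="debug" />
</Loggers>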
The easiest way to reproduce the issue is to create a "bad" request, for example an HTTP Request sampler with a full URL (scheme included) typed into the "Server Name or IP" field, for instance http://example.com.
It won't be executed because you cannot have :// characters in a DNS hostname, and this transaction will have "Infinity" values in the statistics.json.

Related

Why my Soffid JSON REST Web Services Connector does not update an object in the target system?

I am trying to connect my Soffid 3 server with our custom web application named Schrift. I am using a JSON REST Web Services Connector for this purpose. I added the REST Web service plugin and then configured an agent with the JSON/XML/SOAP Rest webservice type.
Loading of objects is working fine. My REST connector connects to the web service successfully and gets data of the accounts.
The problem is that when I try to update some data (for example, when I try to lock an account), nothing happens. And unfortunately I don't know what should be happening. When should the REST connector send updated data to the managed system, and in which way? I didn't find any log entries saying that the REST connector was trying to update an object on the managed system. Maybe I did something wrong or missed something.
I would appreciate any help. I can post any config or log details if you need.
Update#1
(I did some investigation after the first answer)
I checked the agent settings: Read only and Manual account creation are set to no
The account was set to the unmanaged type, but I succeeded in changing its type to shared and then to single without getting an error. Now it is set to single.
The task queue is empty.
Also, I've checked that the update method is present and the update properties are set correctly. updateParams is not set (which means that all attributes should be sent to the managed system).
But when I change the status of the account (from Enable to Disable), nothing happens.
In the console log I can see only these lines
14-Sep-2021 13:26:29.708 INFO [BPM-Scheduler:192.168.7.121:1] com.soffid.iam.bpm.job.JobExecutorThread.run No job to execute
When I manually run the task Analize impact for changes on Schrift, the Execution log shows
Changes detected for accounts
=============================
NO CHANGE DETECTED
Changes detected for roles
=============================
NO CHANGE DETECTED
Update#2
After many attempts I made some progress. Now when I make some changes in the account, a task named UpdateAccount baklykov#irf.com.ua#Schrift appears, but it runs with an error.
At first it was a 415 Unsupported Media Type error, as I wrote in the comments, but now it looks a little different:
Throws exception updating object : Extensible object [type = account]
EmployeeEmail: baklykov#irf.com.ua
IsLockedOut: true (log truncated) ...
caused by Unexpected response, Content-Type: null
Update#3
I found out that Soffid's request for updating the object was in an improper format (all the parameters were passed in the query string of the HTTP request instead of being put in the JSON body).
After researching, I found a method property called Encoding and set it to the application/json value.
Now the parameters are passed in the JSON body (which is what I need), but the problem is that Soffid puts all the parameters in the JSON body, including the key parameter by which the object to update should be determined. My guess is that this is the reason why the object in the target system is still not updated.
In other words my application expects a request like this:
https://myapp.mysite.com/api/v1/Soffid/Employees?EmployeeEmail=baklykov%40irf.com.ua :
{"EmployeeLastName":"Baklykov","EmployeeFirstName":"Ivan"}
but Soffid sends this:
https://myapp.mysite.com/api/v1/Soffid/Employees:
{"EmployeeLastName":"Baklykov","EmployeeFirstName":"Ivan","EmployeeEmail":"baklykov#irf.com.ua"}
The system should have created an UpdateAccount task in the task queue. Please verify:
The task engine is in automatic mode. In read-only or manual mode, no task will be created.
If you are updating an account, check that the account is not set as unmanaged. In that case, no task is created.
Finally, verify the task queue has not held the task up.
Have you checked the engine mode? Look at Main Menu > Administration > Configure Soffid > Integration engine > Smart engine settings
It should be set to automatic.

Getting specific data from video surveillance web-interface in Zabbix

Hi guys! I'm looking for a solution or some ideas on how to solve my task.
There is a video surveillance camera (vendor: Hikvision) with an accessible web interface.
In the web interface there is a field, Device Name, containing data I need to retrieve by means of the Zabbix server and then use for renaming discovered hosts.
Since Hikvision cameras support SNMP, I've tried the SNMP agent in Zabbix. It turned out that the Hikvision MIB doesn't contain data from that field.
Also, exploring the web interface through the Developer Tools in Google Chrome, I stumbled upon the string Request URL: http://10.90.187.16/ISAPI/System/deviceInfo which gives this response in XML format:
<DeviceInfo xmlns="http://www.hikvision.com/ver20/XMLSchema" version="2.0">
<deviceName>1.5.1.1</deviceName>
<deviceID>566eec0b-6580-11b3-81a1-1868cb48861f</deviceID>
<deviceDescription>IPCamera</deviceDescription>
<deviceLocation>hangzhou</deviceLocation>
<systemContact>Hikvision.China</systemContact>
<model>DS-2CD2155FWD-IS</model>
<serialNumber>DS-2CD2155FWD-IS20170417AAWR749464587</serialNumber>
<macAddress>18:68:cb:48:86:1f</macAddress>
<firmwareVersion>V5.4.5</firmwareVersion>
<firmwareReleasedDate>build 170124</firmwareReleasedDate>
<encoderVersion>V7.3</encoderVersion>
<encoderReleasedDate>build 170123</encoderReleasedDate>
<bootVersion>V1.3.4</bootVersion>
<bootReleasedDate>100316</bootReleasedDate>
<hardwareVersion>0x0</hardwareVersion>
<deviceType>IPCamera</deviceType>
<telecontrolID>88</telecontrolID>
<supportBeep>false</supportBeep>
<supportVideoLoss>false</supportVideoLoss>
</DeviceInfo>
Here the tag <deviceName>1.5.1.1</deviceName> contains the required data, and now the question is how to put two and two together by means of Zabbix.
Digging into the Zabbix documentation, I've found an article about creating an item based on the HTTP agent with an XML request. Unfortunately there aren't any examples of how to do it exactly.
Has somebody had such experience? Any clues will be helpful.
You can create an HTTP Agent item, set it to TEXT type and point it to http://10.90.187.16/ISAPI/System/deviceInfo (don't forget the authentication, if required!). Zabbix will retrieve the full XML.
To get the desired value you have to create a dependent item, point it to the previous item and set up a preprocessing step.
Create a single XML XPath preprocessing rule with the parameter string(/DeviceInfo/deviceName) to get the 1.5.1.1 value (XPath is case-sensitive, so match the element names exactly as they appear in the response).
If you want to get the firmware version, create another dependent item and set up the XPath to string(/DeviceInfo/firmwareVersion), and so on for every element you need.
If you want a single value you can use a single item, adding the preprocessing rule to the http agent item. I use my solution for flexibility, maybe one day I'll need another XML element or maybe a firmware update will add some element to the page.
Dependent items are more flexible, but of course the full XML uses more storage in the database for stuff you don't need right now: it's a tradeoff, either way works!
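If you want to sanity-check the XPath expression before wiring it into Zabbix, a quick standalone sketch like this should print the device name (the camera URL is taken from the question; authentication, if the camera requires it, is left out):

import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class XPathCheck {
    public static void main(String[] args) throws Exception {
        // Fetch and parse the camera's deviceInfo XML; the default
        // (namespace-unaware) parser lets the unprefixed XPath below match
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new URL("http://10.90.187.16/ISAPI/System/deviceInfo").openStream());
        // The same expression as in the Zabbix preprocessing step
        String name = XPathFactory.newInstance().newXPath()
                .evaluate("string(/DeviceInfo/deviceName)", doc);
        System.out.println(name); // expect "1.5.1.1"
    }
}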

Wireshark shows a yellow row on the parameter line of SDP

The context: I'm building an RTSP/RTP server in C#. I saw that there were two main libs that could help me: the first is Managed Media Aggregation, the second is SharpRTSP. I use the first for the packetization of data for RTP and the second to handle the RTSP side with the SDP.
I analyzed packets with Wireshark because when I try to access the video content of my server, it connects successfully but I have no data incoming, so I'm looking at everything that could be the cause. The analysis shows me that the following line in the SDP (in the server's DESCRIBE answer) is marked in yellow:
fmtp:96 packetization-mode=1; // param 1
profile-level-id=4267; // param 2
sprop-parameter-sets=Z0IACvhBog==,aM44gA== // param 3
The question: My question is a two-part question.
1. Can someone tell me why this line is returned with a warning by Wireshark?
2. Is it possible that Wireshark returns a warning because one of the parameters isn't correct?
Thanks a lot for your time !
If you expand the packet details, you should see an "Expert Info" indication as to why Wireshark categorized the packet as a warning. You can also open the "Expert Infos" dialog via Analyze -> Expert Info or by clicking on the small circle in the lower left hand side of the status bar. That dialog will show you all the "Expert Infos" for all packets, grouped by severity.
Further, you can even apply a display filter for expert infos. The syntax differs depending on which version of Wireshark you're using, but it's one of these two:
Versions 1.2.0 to 1.10.14: expert.severity == "Warn"
Versions 1.12.0 to 2.2.3 (and later): _ws.expert.severity == "Warn"
Wireshark's SDP dissector adds several "Expert Info" entries; you can always browse the packet-sdp.c source code to try to find out more information as to why Wireshark might have added a particular one. The expert info details begin down around line 3153 and the only entry that is categorized as PI_WARN is for an "Invalid conversion", the logic which determines this being up around lines 1338-1370. So is that the "Expert Info" you're seeing? If so, then there would appear to be something wrong with the profile-level-id, but without a packet capture to look at, I wouldn't be able to tell you exactly what that is.
It's also possible there was a Wireshark bug with the conversion. You haven't stated which version of Wireshark you're running, but you could try updating to the latest available version to see if the warning goes away. If it doesn't and you're confident that the packet is correctly formatted, you could open a Wireshark bug report and supply a capture file for the developers to use for testing.
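If you'd rather check from the command line, the same display filter should work with tshark (a sketch; the capture file name is assumed):
tshark -r capture.pcap -Y '_ws.expert.severity == "Warn"'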

JMeter: how to stop only one thread, not all the threads

I have 50 users in my ThreadGroup with a 50-second ramp-up (50 rows in my .csv config file). After a certain HTTP request I would like to test for a certain condition and, if it passes, continue to the next HTTP requests. I sort of read on Google that a BeanShell Assertion with the code
String response = SampleResult.getResponseDataAsString();
if (response.contains("\"HasError\":true")) {
    SampleResult.setStopThread(true);
}
should resolve my problem. But the problem is that this function actually stops the entire test execution, including all remaining users (where I might have some more values in the .csv file to test). Is there any convenient way not to stop the entire test? If anybody has faced that problem, please advise.
You can set a thread to stop on a Sampler error by configuring it in the Thread Group component: select 'Stop Thread' in the 'Action to be taken after a Sampler error' section.
To ensure that your condition produces a Sampler error, configure a Response Assertion.
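Alternatively, if you prefer to keep the scripted check, an assertion along these lines should stop only the current thread rather than the whole test (a sketch for a BeanShell Assertion, where JMeter exposes the ctx, SampleResult and AssertionResult variables):

String response = SampleResult.getResponseDataAsString();
if (response.contains("\"HasError\":true")) {
    // Mark the sample as failed so the condition is visible in listeners
    AssertionResult.setFailure(true);
    AssertionResult.setFailureMessage("HasError flag found in the response");
    // Stop only this thread; the other threads keep running
    ctx.getThread().stop();
}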

503 Service Unavailable: No registered leader was found after waiting for 4000 ms

I've recently started using Solr. I'm using the latest version, Solr 6.1.0. I followed the quick start tutorial to get a feel for it. Being a Windows user, I had to resort to the other way of importing my .csv data, using the Post tool for Windows.
I am primarily interested in seeing how Solr can handle and search large data sets like the one I have. It is a 522 MB my_db.csv file which is properly formatted (I ran various Python scripts to check that).
I started the Solr cloud by the usual procedure. Then I imported a part of this dataset (to be specific, 29 lines of my_db.csv) to see if it works.
Shell:
C:\Users\MAC\Downloads\solr-6.1.0\solr-6.1.0>java -Dc=gettingstarted -Ddata=files -Dauto=yes -jar example\exampledocs\post.jar example\exampledocs\29lines.csv
Result was:
SimplePostTool version 5.0.0
Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
POSTing file 29lines.csv (text/csv) to [base]
1 files indexed.
COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
Time spent: 0:01:28.106
Fortunately, it worked perfectly and I was able to use the default Velocity search wrapper that they provide by going to http://localhost:8983/solr/gettingstarted_shard2_replica1/browse. It had all my data stored so far, 29 rows to be precise.
Now I wanted to see if the whole 522 MB of data would be imported, for which I used the same command (just replaced the .csv file, of course) and then ran it. I did expect it to take a while, and after nearly 10 minutes it had inserted around 32,674 out of 1,300,000 rows and then threw out this error.
Result was:
SimplePostTool version 5.0.0
Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
POSTing file omdbFull.csv (text/csv) to [base]
SimplePostTool: WARNING: Solr returned an error #503 (Service Unavailable) for url: http://localhost:8983/solr/gettingstarted/update
SimplePostTool: WARNING: Response: <?xml version="1.0" encoding="UTF-8"?>
<response>
<lst name="responseHeader"><int name="status">503</int><int name="QTime">128191</int></lst><lst name="error"><lst name="metadata"><str name="error-class">org.apache.solr.common.SolrException</str><str name="root-error-class">org.apache.solr.common.SolrException</str></lst><str name="msg">No registered leader was found after waiting for 4000ms , collection: gettingstarted slice: shard2</str><int name="code">503</int></lst>
</response>
SimplePostTool: WARNING: IOException while reading response: java.io.IOException: Server returned HTTP response code: 503 for URL: http://localhost:8983/solr/gettingstarted/update
1 files indexed.
COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
Time spent: 0:08:36.342
Summary
This was strange. I wasn't exactly sure why this had happened. Is it perhaps that I have to change some kind of "timeout" parameter for it to commit? Unfortunately, I wasn't able to see any such option for the Windows Post tool.
I found the solution to my problem. The problem wasn't that the file was huge, which in my case was around 500 MB of CSV; I'm sure it will go through for even larger files.
The thing is, I think Solr does some kind of automatic recognition of the kind of values that are input into an index. For instance, my CSV had a column with years like "2015", "2014", "1970", etc., but this column also had improper years which I didn't know about, like "2014-2015" and "1980-1988".
Solr would stop and throw an exception because these were not years but year ranges; it wasn't expecting values of this sort.
Summary
To fix the problem, I simply filtered out the faulty year rows and voilà! It processed my 500 MB CSV in around 15 minutes. After that, I had a nice database ready to be searched!
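For anyone hitting the same thing, a filter along these lines is all it takes (a rough sketch; the year column index and file names are assumptions, and the naive comma split ignores quoted commas):

import java.io.IOException;
import java.nio.file.*;
import java.util.ArrayList;
import java.util.List;

public class FilterYears {
    public static void main(String[] args) throws IOException {
        List<String> lines = Files.readAllLines(Paths.get("my_db.csv"));
        List<String> kept = new ArrayList<>();
        kept.add(lines.get(0)); // keep the header row
        int yearCol = 3;        // assumed index of the year column
        for (String line : lines.subList(1, lines.size())) {
            String[] fields = line.split(",", -1); // naive split: no quoted commas
            // Keep only plain four-digit years; drop ranges like "2014-2015"
            if (fields.length > yearCol && fields[yearCol].matches("\"?\\d{4}\"?")) {
                kept.add(line);
            }
        }
        Files.write(Paths.get("my_db_clean.csv"), kept);
    }
}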