I am using the below Ant Target:
<target name="test-report" depends="compile-ommittedmanage-unittest" description="Generate Test Results as HTML">
<delete dir="${deploy}\html" />
<delete dir="${deploy}\xml" />
<delete dir="${deploy}\html\release" />
<mkdir dir="${deploy}\html" />
<mkdir dir="${deploy}\html\release" />
<mkdir dir="${deploy}\xml" />
<taskdef name="junitreport" classname="org.apache.tools.ant.taskdefs.optional.junit.XMLResultAggregator" />
<coverage>
<junit printsummary="on" haltonfailure="off" haltonerror="off" fork="yes" forkmode="once">
<batchtest fork="yes" todir="${deploy}\xml" filtertrace="on">
<fileset dir="${ommittedmanage-unittestbin}" includes="**/*Test*.class" />
</batchtest>
<formatter type="plain" usefile="false" />
<formatter type="xml" usefile="true" />
<classpath>
<path refid="ommittedManage.classpath" />
</classpath>
</junit>
</coverage>
<echo message="running JUnit Report" />
<junitreport todir="${deploy}\xml">
<fileset dir="${deploy}\xml">
<include name="TEST-*.xml" />
</fileset>
<report format="frames" todir="${deploy}\html" />
</junitreport>
</target>
to try to generate XML and HTML reports. But no XML reports are generated, even though the Jenkins server I use runs through all the tests:
test-report:
[delete] Deleting directory C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\html
[delete] Deleting directory C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\xml
[mkdir] Created dir: C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\html
[mkdir] Created dir: C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\html\release
[mkdir] Created dir: C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\xml
[coverage] Enhancing junit with coverage
[junit] WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/C:/Users/jenkins/jenkins-dependencies/apache-ant-1.9.4-bin/apache-ant-1.9.4/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/C:/Users/jenkins/omitted/workspace/omitted%20Ant/deploy/classpath/ant.jar!/org/apache/tools/ant/Project.class
[junit] Running JUnitplayground.ParameterizedTest
[junit] Testsuite: JUnitplayground.ParameterizedTest
[junit] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,64 sec
[junit] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,64 sec
[junit] ------------- Standard Output ---------------
[junit] Addition with parameters : 1 and 2
[junit] Addition with parameters : 2 and 3
[junit] Addition with parameters : 3 and 4
[junit] Addition with parameters : 4 and 5
[junit] ------------- ---------------- ---------------
[junit]
[junit] Testcase: sumTest[0] took 0,015 sec
[junit] Testcase: sumTest[1] took 0 sec
[junit] Testcase: sumTest[2] took 0 sec
[junit] Testcase: sumTest[3] took 0 sec
[junit] Running com.omitted.coruscant.util.DateUtilUnitTest
[junit] Testsuite: com.omitted.coruscant.util.DateUtilUnitTest
[junit] Tests run: 43, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,188 sec
[junit] Tests run: 43, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,188 sec
[junit]
[junit] Testcase: testGetNextDay took 0 sec
[junit] Testcase: testIsSameInstant took 0 sec
[junit] Testcase: testGetFirstDayOfWeek took 0 sec
[junit] Testcase: testFormatUserDateTime took 0 sec
[junit] Testcase: testEndOfDay took 0 sec
[junit] Testcase: testDateTime took 0 sec
[junit] Testcase: testDifferenceHours took 0 sec
[junit] Testcase: testDifferenceMonth took 0 sec
[junit] Testcase: testMaxDateIfNullDate1 took 0 sec
[junit] Testcase: testOverlap took 0,094 sec
[junit] Testcase: testDay took 0 sec
[junit] Testcase: testGetCalendarFromDate took 0 sec
[junit] Testcase: testToday took 0 sec
[junit] Testcase: testGetDatePart took 0 sec
[junit] Testcase: testGetNextDayOrMidnight took 0 sec
[junit] Testcase: testLatestDate took 0 sec
[junit] Testcase: testBuildDate took 0 sec
[junit] Testcase: testIsForever took 0 sec
[junit] Testcase: testAddMonth took 0 sec
[junit] Testcase: testGetDayOfWeekIndex took 0 sec
[junit] Testcase: testFormatDate took 0 sec
[junit] Testcase: testIsOlderThanDays took 0 sec
[junit] Testcase: testDifferenceDays took 0 sec
[junit] Testcase: testCreateMonthEndingDatesBetween took 0 sec
[junit] Testcase: testDayOfWeek took 0 sec
[junit] Testcase: testIsBeforeOrSameDay took 0 sec
[junit] Testcase: testParseISODate took 0 sec
[junit] Testcase: testParseISOTime took 0 sec
[junit] Testcase: testCreateDayDatesBetween took 0 sec
[junit] Testcase: testParseISODateTime took 0,016 sec
[junit] Testcase: testIsOlderThanHours took 0 sec
[junit] Testcase: testIsLastDayOfMonth took 0 sec
[junit] Testcase: testNullIfmaxDate took 0 sec
[junit] Testcase: testToSqlTimestamp took 0 sec
[junit] Testcase: testGetOverlappingHours took 0 sec
[junit] Testcase: testEarliestDate took 0 sec
[junit] Testcase: testToSqlDate took 0 sec
[junit] Testcase: testMaxDateIfNullDate took 0 sec
[junit] Testcase: testCreatePeriodBreak took 0 sec
[junit] Testcase: testIsFirstDayOfMonth took 0 sec
[junit] Testcase: testGetHolidayYear took 0 sec
[junit] Testcase: testStartOfDay took 0 sec
[junit] Testcase: testLastDayOfMonth took 0 sec
[junit] Running com.omitted.manage.bl.AdressUtilUnitTest
[junit] Testsuite: com.omitted.manage.bl.AdressUtilUnitTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,015 sec
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,015 sec
[junit]
[junit] Testcase: testSynchAddressWithNullArguments took 0 sec
[junit] Running com.omitted.util.calc.FITest
[junit] Testsuite: com.omitted.util.calc.FITest
[junit] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,031 sec
[junit] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,031 sec
[junit]
[junit] Testcase: testEAN took 0 sec
[junit] Testcase: test took 0,015 sec
[junit] Running com.omitted.util.calc.IntrestTest
[junit] Testsuite: com.omitted.util.calc.IntrestTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,015 sec
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,015 sec
[junit]
[junit] Testcase: calulateInterestFeeByMonthTest took 0,015 sec
[junit] Running com.omitted.util.calc.JUnitEANValidationTest
[junit] Testsuite: com.omitted.util.calc.JUnitEANValidationTest
[junit] Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,047 sec
[junit] Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,047 sec
[junit] ------------- Standard Output ---------------
[junit] Checking validity of EAN Checksum: 5798009932600
[junit] Checking validity of EAN Checksum: 5798000428614
[junit] Checking validity of EAN Checksum: 5798000428256
[junit] Checking validity of EAN Checksum: 5798000428620
[junit] Checking validity of EAN Checksum: 5798000428621
[junit] Checking validity of EAN Checksum: 5798000428622
[junit] Checking validity of EAN Checksum: 5798000428623
[junit] Checking validity of EAN Checksum: 5798000428624
[junit] Checking validity of EAN Checksum: 5798000428625
[junit] Checking validity of EAN Checksum: 5798000428626
[junit] Checking validity of EAN Checksum: 5798000428627
[junit] Checking validity of EAN Checksum: 5798000428628
[junit] Checking validity of EAN Checksum: 5798000428629
[junit] Checking validity of EAN Checksum: 5798000428601
[junit] Checking validity of EAN Checksum: 4012195172451
[junit] Checking validity of EAN Checksum: 57980004286
[junit] EAN number must be 13 digits
[junit] Checking validity of EAN Checksum: 5798000428629123
[junit] EAN number must be 13 digits
[junit] ------------- ---------------- ---------------
[junit]
[junit] Testcase: testValidationOfFICheckCipher[0] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[1] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[2] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[3] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[4] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[5] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[6] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[7] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[8] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[9] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[10] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[11] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[12] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[13] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[14] took 0 sec
[junit] Testcase: testValidationOfFICheckCipher[15] took 0,016 sec
[junit] Testcase: testValidationOfFICheckCipher[16] took 0 sec
[junit] Running com.omitted.util.calc.JUnitFICheckCipherTest
[junit] Testsuite: com.omitted.util.calc.JUnitFICheckCipherTest
[junit] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,016 sec
[junit] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0,016 sec
[junit] ------------- Standard Output ---------------
[junit] Checking Cipher: 2684014996532
[junit] Checking Cipher: 660405152
[junit] Checking Cipher: 33259336900139
[junit] ------------- ---------------- ---------------
[junit]
[junit] Testcase: testCalculationOfFICheckCipher[0] took 0 sec
[junit] Testcase: testCalculationOfFICheckCipher[1] took 0 sec
[junit] Testcase: testCalculationOfFICheckCipher[2] took 0 sec
[echo] running JUnit Report
[junitreport] Processing C:\Users\jenkins\omitted\workspace\omitted Ant\deploy\xml\TESTS-TestSuites.xml to C:\Windows\TEMP\null28497120
[junitreport] Loading stylesheet jar:file:/C:/Users/jenkins/jenkins-dependencies/apache-ant-1.9.4-bin/apache-ant-1.9.4/lib/ant-junit.jar!/org/apache/tools/ant/taskdefs/optional/junit/xsl/junit-frames.xsl
[junitreport] Transform time: 1328ms
[junitreport] Deleting: C:\Windows\TEMP\null28497120
Since no XML is generated, I wager that is why the HTML reports come out empty. The reason the ${deploy}\html\finish folder exists is that the plugin responsible for publishing the HTML report copies the files to a new location.
How can I make the XML reports generate properly?
I am fairly new to Ant and Jenkins, so please excuse my beginner mistakes.
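One diagnostic sketch, assuming the same properties as the target above: run the plain junit task outside the coverage wrapper and check whether TEST-*.xml files appear under ${deploy}\xml on their own.

<!-- Diagnostic sketch: the same <junit> call without the <coverage> wrapper.
     If TEST-*.xml files appear under ${deploy}\xml with this plain call,
     the coverage enhancement is likely interfering with the XML formatter. -->
<junit printsummary="on" haltonfailure="off" haltonerror="off" fork="yes" forkmode="once">
    <batchtest fork="yes" todir="${deploy}\xml" filtertrace="on">
        <fileset dir="${ommittedmanage-unittestbin}" includes="**/*Test*.class" />
    </batchtest>
    <formatter type="xml" usefile="true" />
    <classpath>
        <path refid="ommittedManage.classpath" />
    </classpath>
</junit>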
Related
My test suite was working fine while it was on version 0.9.2.
I have a test runner with KarateOptions in it to specify the feature files to be executed:
@KarateOptions(tags = {"~@ignore"},
        features = {
                "src/test/java/com/pro/api/tests/features/beforesuitescenarios/feature1.feature",
                "src/test/java/com/pro/api/tests/features/customerscenarios/feature2.feature",
                "src/test/java/com/pro/api/tests/features/servicerequestscenarios/feature3.feature",
                "src/test/java/com/pro/api/tests/features/invoicescenarios/feature4.feature",
        })
And the test runner for this was using the Cucumber runner:
@Test
public void testAllFeatures() throws Exception {
    String karateOutputPath = "target/surefire-reports";
    KarateStats stats = CucumberRunner.parallel(getClass(), 1, karateOutputPath);
    generateReport(karateOutputPath);
    assertTrue("There are scenario failures", stats.getFailCount() == 0);
}
I tried upgrading the framework to 0.9.5 and modified the runner as described in the latest docs:
@Test
public void testAllFeatures() throws Exception {
    String karateOutputPath = "target/surefire-reports";
    Results stats = Runner.parallel(getClass(), 1, karateOutputPath);
    generateReport(karateOutputPath);
    assertTrue("There are scenario failures", stats.getFailCount() == 0);
}
Now when I execute this suite, the tests are executed properly. But after all the feature files have finished executing, it throws an error for the line
Results stats = Runner.parallel(getClass(), 1, karateOutputPath);
with the following IllegalArgumentException:
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1,295.291 s <<< FAILURE! - in com.pro.api.tests.features.TestRunner
[ERROR] testAllFeatures(com.pro.api.tests.features.TestRunner) Time elapsed: 1,295.22 s <<< ERROR!
java.lang.IllegalArgumentException: Illegal group reference
at com.pro.api.tests.features.TestRunner.testAllFeatures(TestRunner.java:55)
What am I missing when calling the runner? How can I fix this issue?
Further, when I added an exception handler for the failing step, I got the following error log:
java.lang.IllegalArgumentException: Illegal group reference
at java.base/java.util.regex.Matcher.appendExpandedReplacement(Matcher.java:1068)
at java.base/java.util.regex.Matcher.appendReplacement(Matcher.java:998)
at java.base/java.util.regex.Matcher.replaceFirst(Matcher.java:1408)
at java.base/java.lang.String.replaceFirst(String.java:2081)
at com.intuit.karate.core.Engine.saveTimelineHtml(Engine.java:500)
at com.intuit.karate.Runner.parallel(Runner.java:357)
at com.intuit.karate.Runner$Builder.parallel(Runner.java:181)
at com.pro.api.tests.features.TestRunner.testAllFeatures(TestRunner.java:56)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:384)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:345)
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:126)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:418)
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1,181.97 s <<< FAILURE! - in com.pro.api.tests.features.TestRunner
[ERROR] testAllFeatures(com.pro.api.tests.features.TestRunner) Time elapsed: 1,181.905 s <<< ERROR!
java.lang.NullPointerException
at com.pro.api.tests.features.TestRunner.testAllFeatures(TestRunner.java:61)
Something in saveTimelineHtml is failing.
Thanks for the hint - this is indeed a bug in the timeline reporting code.
Issue reference: https://github.com/intuit/karate/issues/1085
You will need to wait for the next version; there should be an RC version pretty soon that you can try out.
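For context, a minimal Java sketch (illustrative values, not Karate code) of why String.replaceFirst throws this exception: a bare $ in the replacement string is parsed as a capture-group reference, and Matcher.quoteReplacement is the standard escape.

import java.util.regex.Matcher;

public class IllegalGroupDemo {
    public static void main(String[] args) {
        String template = "name: PLACEHOLDER";
        String value = "paid in $USD"; // '$U' is read as a group reference

        try {
            // Throws java.lang.IllegalArgumentException: Illegal group reference,
            // because '$' in the replacement introduces a group reference and
            // 'U' is not a valid group digit.
            template.replaceFirst("PLACEHOLDER", value);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // Illegal group reference
        }

        // Escaping the replacement string avoids the parse entirely.
        String fixed = template.replaceFirst("PLACEHOLDER", Matcher.quoteReplacement(value));
        System.out.println(fixed); // name: paid in $USD
    }
}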
@RunWith(MockitoJUnitRunner.class) // Class cannot be resolved to a type
// @SpringBootTest
public class MbankingApplicationTest {

    @Mock CanmbTransactionDao dataServiceMock;
    @Mock CanmbBaseDao baseDao;
    @InjectMocks CanmbTransactionServiceImpl businessImpl;
    @Autowired OminiController controller;
    Customer customer;

    @Test
    public void test() {
        Customer cust = new Customer();
        cust.setMbnumber("+919990176197");
        cust.setDeviceid("abcdef");
        UserProfileMaster profile = new UserProfileMaster();
        profile.setChkflag(22);
        profile.setStatus("ACTIVE");
        cust.setUserProfile(profile);
        cust.setMpin("123456");
        cust.setMpinsalt("12345");
        when(dataServiceMock.getUserMbAndDevice("+919990176197", "abcdef")).thenReturn(cust);
        this.customer = cust;
        assertEquals(cust, businessImpl.getUserMbAndDevice("+919990176197", "abcdef"));
    }

    @Test
    public void testMpin() {
        Customer cust = new Customer();
        cust.setMbnumber("+919990176197");
        cust.setDeviceid("abcdef");
        UserProfileMaster profile = new UserProfileMaster();
        profile.setChkflag(22);
        profile.setStatus("ACTIVE");
        cust.setUserProfile(profile);
        cust.setMpin("d150cb2c64171a95eb3fa1bbf2ea786aef16b04d389a1ac67a52c75e95f61e66");
        cust.setMpinsalt("12345");
        when(dataServiceMock.getUserMbAndDevice("+919990176197", "abcdef")).thenReturn(cust);
        // assertEquals(cust, businessImpl.getUserMbAndDevice("+919990176197", "abcdef"));
        MBSOMNIIntegration reqData = new MBSOMNIIntegration();
        reqData.setMbnumber("+919990176197");
        reqData.setDeviceid("abcdef");
        reqData.setMpin("123456");
        OMNIIntegration omni = new OMNIIntegration();
        // businessImpl.validateOmniMpin(reqData, omni, "123");
        ResponseData data = new ResponseData();
        Map<String, String> ominiMap = new HashMap<>();
        ominiMap.put("Msg", "verified");
        ominiMap.put("statusCode", "0");
        data.setStatusCode(0);
        data.setTid("");
        data.setData(ominiMap);
        when(businessImpl.validateOmniMpin(reqData, omni, "123")).thenReturn(data);
        assertEquals(data, businessImpl.validateOmniMpin(reqData, omni, "123"));
    }
}
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.835 sec <<< FAILURE! - in com.npst.mb.MbankingApplicationTest
testMpin(com.npst.mb.MbankingApplicationTest)  Time elapsed: 0.823 sec  <<< FAILURE!
java.lang.AssertionError: expected: com.npst.mb.pojo.ResponseData but was: com.npst.mb.pojo.ResponseData
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:120)
        at org.junit.Assert.assertEquals(Assert.java:146)
        at com.npst.mb.MbankingApplicationTest.testMpin(MbankingApplicationTest.java:93)

Results :

Failed tests:   MbankingApplicationTest.testMpin:93 expected: com.npst.mb.pojo.ResponseData but was: com.npst.mb.pojo.ResponseData

Tests run: 2, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.493 s
[INFO] Finished at: 2019-01-17T14:57:43+05:30
[INFO] Final Memory: 36M/346M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project CapexMbankingPhase2: There are test failures.
[ERROR]
[ERROR] Please refer to /home/npstx/raj/Canara Projects/MBS_APP_OMNI/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
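As an aside, an assertEquals failure printing expected: com.npst.mb.pojo.ResponseData but was: com.npst.mb.pojo.ResponseData usually means two distinct instances were compared and the class does not override equals(), so Object's reference equality is used. A hedged sketch of the usual fix, with hypothetical fields since the real POJO is not shown:

import java.util.Objects;

// Hypothetical, trimmed-down version of the POJO for illustration only.
public class ResponseData {
    private int statusCode;
    private String tid;

    // Without this override, assertEquals(a, b) compares references and
    // fails for two distinct instances even when every field matches.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof ResponseData)) return false;
        ResponseData other = (ResponseData) o;
        return statusCode == other.statusCode && Objects.equals(tid, other.tid);
    }

    @Override
    public int hashCode() {
        return Objects.hash(statusCode, tid);
    }
}

Note also that when(...) stubbing only works on @Mock or @Spy instances; businessImpl is the @InjectMocks target, so the real validateOmniMpin runs and returns its own ResponseData instance, which can likewise make this assertion fail.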
I am using Ant version 1.9.7 and JUnit 4.12. My build.xml looks like this:
<target name="run-junit" depends="init, compile, compile-junit">
<junit printsummary="yes" >
<formatter type="xml"/>
<classpath><pathelement location="lib/junit-4.12.jar"/></classpath>
<batchtest fork="yes" todir="${out.dir}">
<fileset dir="${bin.dir}">
<include name="**/*Test.class"/>
</fileset>
</batchtest>
</junit>
</target>
When I run ant run-junit in a console, it just gives me:
[junit] Running at.abc.def.ghi.ABCTest
[junit] Tests run: 5, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.065 sec
Test at.abc.def.ghi.ABCTest FAILED
But no more details. How can I resolve this?
To get more details about each failed test, use <formatter type="plain" usefile="false"/> instead of <formatter type="xml"/>:
<junit printsummary="yes">
    <formatter type="plain" usefile="false"/>
    <classpath>
        <pathelement location="lib/junit-4.12.jar"/>
    </classpath>
    <batchtest fork="yes" todir="${out.dir}">
        <fileset dir="${bin.dir}">
            <include name="**/*Test.class"/>
        </fileset>
    </batchtest>
</junit>
Using a "plain" formatter gives output similar to the following:
[junit] Running TestMyTest1
[junit] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.014 sec
[junit]
[junit] Testcase: testMyTest took 0.003 sec
[junit] FAILED
[junit] expected:<firefox> but was:<null>
[junit] junit.framework.AssertionFailedError: expected:<firefox> but was:<null>
[junit] at TestMyTest1.testMyTest(TestMyTest1.java:17)
[junit]
[junit] Test TestMyTest1 FAILED
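If the XML files are still needed later (for example by a junitreport task), both formatters can be active on the same run; a sketch assuming the same properties as above:

<junit printsummary="yes">
    <!-- Human-readable detail on the console... -->
    <formatter type="plain" usefile="false"/>
    <!-- ...and TEST-*.xml files in ${out.dir} for report aggregation. -->
    <formatter type="xml"/>
    <classpath>
        <pathelement location="lib/junit-4.12.jar"/>
    </classpath>
    <batchtest fork="yes" todir="${out.dir}">
        <fileset dir="${bin.dir}">
            <include name="**/*Test.class"/>
        </fileset>
    </batchtest>
</junit>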
I'm using siege to locate some problem pages on our new sitemap and am having trouble getting it to stop after it runs through the urls.txt file. I have tried using reps=once on the command line as well as in the .siegerc config file. I find that I have to use the config file, as I want the output written verbosely to a log file so that I can see page load times, 302 and 404 errors, etc., and import them into Excel. However, no matter what I try, I cannot get siege to stop when it completes the urls.txt file; it just reruns it over again.
I have configured 40 concurrent users, the time and reps variables are commented out in the config, and the urls.txt file is set in the config. The syntax I run at the command line is:
sudo siege --reps=once -v > outputfile.csv
I have tried setting reps in the config, with no luck. Any ideas where I'm going wrong?
I ran into similar problems and, after trying multiple options, got it to work with:
# siege -c 10 -b -r 10 -f urls.txt
where urls.txt is a simple list of URLs like
http://ip-address/url1.html
http://ip-address/url2.html
....
....
The logs were written to the file specified in the .siegerc file (${HOME}/var/siege.log):
2016-08-05 17:52:59, 100, 0.88, 4, 0.09, 113.64, 4.55, 9.67, 100, 0
2016-08-05 17:53:00, 100, 0.91, 4, 0.09, 109.89, 4.40, 9.76, 100, 0
2016-08-05 17:53:01, 100, 0.90, 4, 0.09, 111.11, 4.44, 9.78, 100, 0
2016-08-05 17:53:02, 100, 0.89, 4, 0.09, 112.36, 4.49, 9.64, 100, 0
2016-08-05 17:53:03, 100, 0.86, 4, 0.08, 116.28, 4.65, 9.84, 100, 0
2016-08-05 17:53:04, 100, 0.89, 4, 0.09, 112.36, 4.49, 9.80, 100, 0
2016-08-05 17:53:05, 100, 0.88, 4, 0.09, 113.64, 4.55, 9.83, 100, 0
2016-08-05 17:53:06, 100, 0.88, 4, 0.09, 113.64, 4.55, 9.89, 100, 0
2016-08-05 17:53:07, 100, 0.87, 4, 0.09, 114.94, 4.60, 9.79, 100, 0
2016-08-05 17:53:07, 100, 0.88, 4, 0.09, 113.64, 4.55, 9.85, 100, 0
I also observed that the logfile option is either buggy or very strict.
'-l filename.log' does not work:
$ siege -c 10 -b -r 10 -f urls.txt -l ./siege.log
** SIEGE 2.70
** Preparing 10 concurrent users for battle.
The server is now under siege...
done.
Transactions: 0 hits
Availability: 0.00 %
Elapsed time: 0.08 secs
Data transferred: 0.00 MB
Response time: 0.00 secs
Transaction rate: 0.00 trans/sec
Throughput: 0.00 MB/sec
Concurrency: 0.00
Successful transactions: 0
Failed transactions: 100
Longest transaction: 0.00
Shortest transaction: 0.00
FILE: /home/xxxx/var/siege.log
You can disable this annoying message by editing
the .siegerc file in your home directory; change
the directive 'show-logfile' to false.
But --log=filename.log works, e.g.:
$ siege -c 10 -b -r 10 -f urls.txt --log=./siege.log
** SIEGE 2.70
** Preparing 10 concurrent users for battle.
The server is now under siege...
HTTP/1.1 200 0.08 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.08 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.09 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.09 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 45807 bytes ==> /8af6cacb-50ed-40b6-995f-49480f9f74fa.html
HTTP/1.1 200 0.10 secs: 55917 bytes ==> /create_and_delete_networks.html
HTTP/1.1 200 0.10 secs: 55917 bytes ==> /create_and_delete_networks.html
HTTP/1.1 200 0.10 secs: 55917 bytes ==> /create_and_delete_networks.html
HTTP/1.1 200 0.10 secs: 55917 bytes ==> /create_and_delete_networks.html
HTTP/1.1 200 0.09 secs: 55917 bytes ==> /create_and_delete_networks.html
done.
Transactions: 100 hits
Availability: 100.00 %
Elapsed time: 0.89 secs
Data transferred: 4.60 MB
Response time: 0.09 secs
Transaction rate: 112.36 trans/sec
Throughput: 5.16 MB/sec
Concurrency: 9.74
Successful transactions: 100
Failed transactions: 0
Longest transaction: 0.15
Shortest transaction: 0.05
FILE: ./siege.log
You can disable this annoying message by editing
the .siegerc file in your home directory; change
the directive 'show-logfile' to false.
Hope this helps.
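As a usage note, the settings discussed above can also be collected in ~/.siegerc; a hedged sketch, with directive names taken from the question and siege's own messages (verify them against the template .siegerc for your version):

# ~/.siegerc sketch -- directive names assumed from the question/output above
verbose = true                    # per-URL lines (status code, time, bytes)
logging = true                    # append the summary to the logfile
logfile = ${HOME}/var/siege.log   # where summaries are written
concurrent = 40                   # 40 concurrent users
reps = once                       # one pass through the URLs file, then stop
file = /path/to/urls.txt          # the URLs file
# keep 'time' commented out so 'reps' takes effect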
I want to separate tests into 3 different categories:
unit
component
system
Then I want to run them separately in different phases and display the results of these tests in 3 different Surefire reports, or maybe one report with the test results divided into 3 different categories.
How can I achieve this with Maven?
I know I can run tests separately using the Maven Failsafe plugin, so that is not the problem.
The only problem I have is dividing the report into 3 categories.
I am using the maven-surefire-plugin with JUnit categories.
<plugin>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.12.2</version>
    <executions>
        <execution>
            <id>unit-tests</id>
            <goals>
                <goal>test</goal>
            </goals>
            <configuration>
                <groups>com.mycompany.mavenproject2.UnitTest</groups>
                <reportsDirectory>${project.build.directory}/surefire-reports/unit</reportsDirectory>
                <reportNameSuffix>UNIT</reportNameSuffix>
            </configuration>
        </execution>
        <execution>
            <id>comp-tests</id>
            <goals>
                <goal>test</goal>
            </goals>
            <configuration>
                <groups>com.mycompany.mavenproject2.ComponentTest</groups>
                <reportsDirectory>${project.build.directory}/surefire-reports/comp</reportsDirectory>
                <reportNameSuffix>COMPONENT</reportNameSuffix>
            </configuration>
        </execution>
        <execution>
            <id>sys-tests</id>
            <goals>
                <goal>test</goal>
            </goals>
            <configuration>
                <groups>com.mycompany.mavenproject2.SystemTest</groups>
                <reportsDirectory>${project.build.directory}/surefire-reports/sys</reportsDirectory>
                <reportNameSuffix>SYSTEM</reportNameSuffix>
            </configuration>
        </execution>
    </executions>
</plugin>
It works fine, except that it first runs all the tests without separating them into categories. How can I remove this behaviour?
The build produced this output:
T E S T S
Running com.mycompany.mavenproject2.AppTest
UnitTest
ComponentTest
SystemTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 sec
Results :
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0
[surefire:test]
Surefire report directory: C:\Users\mz\Documents\NetBeansProjects\mavenproject2\target\surefire-reports\unit
T E S T S
Concurrency config is parallel='none', perCoreThreadCount=true, threadCount=2, useUnlimitedThreads=false
Running com.mycompany.mavenproject2.AppTest
UnitTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 sec
Results :
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
[surefire:test]
Surefire report directory: C:\Users\mz\Documents\NetBeansProjects\mavenproject2\target\surefire-reports\comp
T E S T S
Concurrency config is parallel='none', perCoreThreadCount=true, threadCount=2, useUnlimitedThreads=false
Running com.mycompany.mavenproject2.AppTest
ComponentTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 sec
Results :
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
[surefire:test]
Surefire report directory: C:\Users\mz\Documents\NetBeansProjects\mavenproject2\target\surefire-reports\sys
T E S T S
Concurrency config is parallel='none', perCoreThreadCount=true, threadCount=2, useUnlimitedThreads=false
Running com.mycompany.mavenproject2.AppTest
SystemTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 sec
Results :
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
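For reference: the first, uncategorised T E S T S run comes from Surefire's implicit default-test execution, which runs in addition to the three custom executions. A common way to suppress it is to override that execution id and skip it; a hedged sketch to add alongside the executions above:

<!-- Sketch: override the implicit execution so that only the three
     categorised executions (unit-tests, comp-tests, sys-tests) run. -->
<execution>
    <id>default-test</id>
    <configuration>
        <skip>true</skip>
    </configuration>
</execution>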