Parsing JUnit XML with Hudson

I'm using Jasmine-reporters with Jasmine to output a set of JUnit XML format files that I need Hudson to parse and report success/failure on. Does anybody know how I would get Hudson to parse a bunch of XML files like this as part of the build process? Thanks!

In the job's "Post-build Actions", there should be a check box for "Publish JUnit test result report". You can put an ant-glob expression (as if you were writing an "includes" element) in that field to match your XML files. I'm basing this on what I see in my Jenkins server; Hudson should behave the same for your purposes.
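For example, if jasmine-reporters is configured to write its result files into a reports/ directory somewhere in the workspace (the directory name here is just an assumption; use whatever save path you gave the reporter), a glob such as the following would match them:

**/reports/*.xml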

Related

FitNesse: How to send test execution reports from Jenkins to an endpoint in JSON format?

I have a task to send reports of periodic FitNesse test executions to a specific endpoint in a specific JSON format.
I have set up periodic execution of the tests in the Jenkins job configuration, and the results are saved as XML, but now I need to parse the results out of that file.
It doesn't seem to be something I can do with just a post-build step in Jenkins (or maybe it can be, but I don't know of a plugin for it), so what should that step be and how do I do this?
In particular, I don't need details about individual tests, only general information such as the date of the run, pass rate, status, name of the project, etc.
I think the best way to solve this is to write a script that parses the XML file and creates the required JSON file. We normally use Python scripts for this.
If you need certain generic information about the build in the script, like the build number, you can pass it in using the Jenkins environment variables.
To call the script, just add a batch or shell step and place it below your FitNesse build step, to make sure the XML is generated before the script runs.
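The answer above suggests a Python script; purely as an illustration (written in Java here, to keep all code in this thread in one language), a minimal sketch that pulls the summary attributes out of a JUnit-style result file and writes a small JSON file could look like the following. The file names, the JSON shape, and the assumption that the report is JUnit-style (a <testsuite> root with tests/failures/errors attributes) are all placeholders, not part of the answer:

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;

public class FitNesseResultToJson {
    public static void main(String[] args) throws Exception {
        // Hypothetical report location; point this at the XML your job produces.
        String xml = args.length > 0 ? args[0] : "target/failsafe-reports/results.xml";
        Element suite = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File(xml)).getDocumentElement();

        // A JUnit-style <testsuite> root carries tests/failures/errors counts.
        int tests = Integer.parseInt(suite.getAttribute("tests"));
        int failures = Integer.parseInt(suite.getAttribute("failures"));
        int errors = Integer.parseInt(suite.getAttribute("errors"));
        double passRate = tests == 0 ? 0.0 : 100.0 * (tests - failures - errors) / tests;

        // Jenkins exposes build metadata as environment variables.
        String json = String.format(
                "{\"project\":\"%s\",\"build\":\"%s\",\"date\":\"%s\",\"passRate\":%.1f,\"status\":\"%s\"}",
                System.getenv("JOB_NAME"), System.getenv("BUILD_NUMBER"),
                java.time.LocalDate.now(), passRate,
                (failures + errors) == 0 ? "PASSED" : "FAILED");
        Files.write(Paths.get("fitnesse-summary.json"), json.getBytes("UTF-8"));
    }
}

The resulting JSON could then be sent to the endpoint from the same shell step (for example with curl), or from the script itself.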
FitNesse comes with a JUnit runner which allows you to execute a test/suite. If you create a test class annotated with @RunWith(FitNesseRunner.class) and include its execution in a Jenkins Maven job (where the JUnit class is executed by either the Surefire or the Failsafe plugin), the outcome of the executed tests will be picked up automatically by Jenkins, just like it picks up other/regular JUnit tests (as Surefire or Failsafe will include them in their XML reports and Jenkins will pick those up).
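A minimal sketch of such a test class, assuming FitNesse's stock fitnesse.junit.FitNesseRunner (the customised runner in the hsac project and the exact annotation names may differ between versions; the wiki suite name and directories below are placeholders):

import org.junit.runner.RunWith;
import fitnesse.junit.FitNesseRunner;

@RunWith(FitNesseRunner.class)
@FitNesseRunner.Suite("FrontPage.MySuite")            // wiki suite to execute
@FitNesseRunner.FitnesseDir(".")                      // directory containing FitNesseRoot
@FitNesseRunner.OutputDir("target/fitnesse-results")  // where the HTML results are written
public class FitNesseSuiteIT {
    // no test methods needed; the runner executes the wiki suite
}

Giving the class an IT suffix lets the Failsafe plugin pick it up during the integration-test phase mentioned below.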
You can find a sample Maven FitNesse project using (a slightly customised version of) this approach at https://github.com/fhoeben/sample-fitnesse-project. How to run the tests on Jenkins is described at https://github.com/fhoeben/hsac-fitnesse-fixtures#to-run-the-tests-on-a-build-server:
Have the build server check out the project and execute mvn clean test-compile failsafe:integration-test. The results, in JUnit XML format, can be found in target/failsafe-reports (Jenkins will pick these up automatically for a Maven job).
You indicate you don't need the HTML results, but they will be made available. They can be found in: target/fitnesse-results/index.html, and you could choose to use the 'HTML Publisher' Jenkins plugin to link to them from each build.

JUnit Pass/Fail reports in SonarQube using JUnit XML

Can we have JUnit test reports (not coverage), i.e. a test pass/fail percentage report, on the SonarQube dashboard (from my understanding, it should appear in the 'Measures' tab), based on the JUnit report XML generated by the build?
What is the step-by-step procedure to get unit test execution (pass/fail) reports?
You need to set the sonar.tests path in the sonar-project.properties file. It should be the directory where your test case report XML files are placed (see https://docs.sonarqube.org/display/SONAR/Analysis+Parameters).
Once you have set the correct path, you will be able to see the reports under Measures > Coverage.
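As an illustration only, a minimal sonar-project.properties might look like the sketch below. All paths are placeholders for your own layout, and the exact property used to point the scanner at the JUnit execution reports has varied between SonarQube versions, so check the Analysis Parameters page linked above:

# minimal sonar-project.properties sketch (all paths are placeholders)
sonar.projectKey=org.example:myproject
sonar.sources=src/main/java
sonar.tests=src/test/java
# depending on the SonarQube version, the location of the JUnit XML execution
# reports may also need to be given explicitly, for example:
# sonar.junit.reportPaths=target/surefire-reports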
Everything is clearly documented on the "Code Coverage by Unit Tests for Java Project" documentation page.
This will create metrics that you can indeed browse on the "Measures" page.

Generating reports in Jenkins

I need to run a job in Jenkins that, after the tests have run successfully, generates test reports as a post-build action.
For this I have configured
Publish JUnit test result report
with the field
Test Report XMLs: continuum/*/target/surefire-reports/*TestSuite.xml
but I get the error
'continuum/*/target/surefire-reports/*TestSuite.xml' doesn't match anything: 'continuum' exists but not 'continuum/*/target/surefire-reports/*TestSuite.xml'
Can you please help me out in resolving the error?
I assume you have an 'Execute JUnit tests' Build step. This will produce a JUnit XML file to a location you specify, say, TestOutput/junitresults.xml.
In the 'Publish JUnit test result report' Post Build step you just need to specify TestOutput/junitresults.xml.
As long as your tests executed and produced the output file, the Post Build step won't fail to publish it, whether the tests passed or failed.
You shouldn't be trying to publish files in the surefire-reports directory unless that is where you told JUnit to write its output file. Normally you wouldn't.
If you want to make it even simpler, just tell JUnit to write its output file to the root of the Jenkins WORKSPACE by removing the TestOutput/ prefix and specifying just junitresults.xml.

Bamboo's JUnit Parser won't parse my gtest output.xml

I was trying to add some automated Unit Tests to my project with Bamboo and have been facing some problems. The Unit Tests themselves are done with googletest, which creates an XML file which should be compatible with the JUnit parser.
However, I'm getting the following error when executing Bamboo's JUnit Parser:
02-Apr-2013 12:11:22 Starting task 'Parse UnitTest output' of type 'com.atlassian.bamboo.plugins.testresultparser:task.testresultparser.junit'
02-Apr-2013 12:11:22 Parsing test results...
02-Apr-2013 12:11:22 Failing task since test cases were expected but none were found.
02-Apr-2013 12:11:22 Finished task 'Parse UnitTest output'
This doesn't seem to have anything to do with the .xml file itself, as I've tried a few. This included my own output.xml, generated by googletest and the sample outputs from https://confluence.atlassian.com/display/BAMBOO/JUnit+parsing+in+Bamboo.
I also adapted said files against the two proposed .xsd files, which should match the output that the JUnit Parser expects, but all to no effect.
Update:
Up until now I told the JUnit Parser to look for ${bamboo.build.working.directory}/output.xml
When I tried **/*.xml it worked.
As I understand it now, after very carefully reading the task description, I have to give it a folder. But I can also give it the files, if I do it in Ant style (with a glob?). This is, at the very least, very confusing and still doesn't fully answer the initial question. So if anyone could enlighten me, please do.
This is a super-old question, but I figured I'd add an answer for posterity. As a few people have commented, the configuration value for test output files requires a relative path. The question is, relative to what?
I think the answer to that depends on how you have your source repositories configured, but in general it will be relative to the root of your project. If all else fails, look at where Bamboo puts your source code when it gets checked out; that is the directory to which Bamboo appends the test output search path.
For the configuration syntax, you're correct that Ant-style patterns can be used (see "Learning Ant path style" for reference).
Just as an example, if you have a project which on your local machine lives at C:\git\MyProject, and your test results end up at C:\git\MyProject\Output\Tests\output.xml, then you'd specify Output/Tests/output.xml in the 'Specify custom results directories' field of the appropriate task configuration. You could also use Output/**/*.xml to search for all .xml files in the Output directory.

Generating JUnit reports from the command line

I have a test setup for a cloud system that uses a mixture of Python for process-level control and JUnit for internal state inspection. Essentially, I bring up several VMs to serve as the cloud, plus a JUnit VM which is a member of the cloud but drives tests and checks internal state. Our existing cloud management stuff is driven by Python and I would like to keep it that way.
I have a working setup that will run the JUnit command line via
java -ea -cp <classpath> org.junit.runner.JUnitCore <tests>
but this does not produce a report file. I know that Ant is capable of producing an XML report, but I do not want to involve Ant in this process (I have enough moving parts already).
Is there a way to launch JUnit from the command line such that it produces a report?
Ideally, I would have the JUnit tests produce XML reports, the Python tests produce XML reports, and then merge them together for consumption by our CI system.
Update: The command-line execution must support Windows, Linux, and Mac. We are not allowed to ship an external Ant, although packaging an internal Ant might be an option.
The JUnit library does not have any XML output options. To achieve such a thing, you'll need to write your own RunListener, which listens for the test events and, in your case, writes the XML file.
However, to get the XML file in the correct format so that it can be read by your CI system, I think it would be far easier to just use Ant, either via the command line with a build.xml (JUnitReport), or via the Java API (see "How can I use Apache Ant programmatically").
EDIT: Initially, we had four options:
Use ant from the command line
Use ant programmatically (using the Java API)
Use the XMLJUnitResultFormatter directly with JUnitCore
Create a custom RunListener which produces the correct XML output.
Given the restrictions added by the OP, we can't use ant from the command line, which eliminates 1.
After looking more closely at the Ant JUnit task, it seems to be impossible to use its formatter with JUnitCore (by adding it as a TestListener), because the formatter references Ant's own runner classes directly, so you can't write a bridge class. From XMLJUnitResultFormatter.java:
private void formatError(String type, Test test, Throwable t) {
...
nested.setAttribute(ATTR_TYPE, t.getClass().getName());
String strace = JUnitTestRunner.getFilteredTrace(t);
Text trace = doc.createTextNode(strace);
nested.appendChild(trace);
}
This eliminates 3.
As for option 2, invoking Ant programmatically via the Java API: I can't find any recent documentation on this, and it seems to be hard.
So, finally, I would go with option 4, a custom RunListener, using the code from XMLJUnitResultFormatter as a base. And then I'd publish it on github.com, so this question could be answered properly :-)
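As a rough illustration of option 4 (not the published solution the answer alludes to), a stripped-down RunListener might look like the sketch below. It only records test names and failure messages, does no XML escaping and no timing, and the class and file names are placeholders:

import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.junit.runner.Description;
import org.junit.runner.JUnitCore;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

public class XmlReportListener extends RunListener {
    private final List<String> cases = new ArrayList<>();
    private final Set<Description> failed = new HashSet<>();

    @Override
    public void testFailure(Failure f) {
        // fired before testFinished when a test fails
        failed.add(f.getDescription());
        cases.add("  <testcase classname=\"" + f.getDescription().getClassName()
                + "\" name=\"" + f.getDescription().getMethodName() + "\">\n"
                + "    <failure message=\"" + f.getMessage() + "\"/>\n  </testcase>");
    }

    @Override
    public void testFinished(Description d) {
        // only record a plain <testcase> for tests that did not fail
        if (!failed.contains(d)) {
            cases.add("  <testcase classname=\"" + d.getClassName()
                    + "\" name=\"" + d.getMethodName() + "\"/>");
        }
    }

    public void writeTo(String file) throws Exception {
        try (PrintWriter out = new PrintWriter(file, "UTF-8")) {
            out.println("<testsuite name=\"cli\" tests=\"" + cases.size()
                    + "\" failures=\"" + failed.size() + "\">");
            for (String c : cases) {
                out.println(c);
            }
            out.println("</testsuite>");
        }
    }

    public static void main(String[] args) throws Exception {
        XmlReportListener listener = new XmlReportListener();
        JUnitCore core = new JUnitCore();
        core.addListener(listener);
        core.run(Class.forName(args[0]));  // test class name given on the command line
        listener.writeTo("junitresults.xml");
    }
}

Invoking it as java -cp <classpath> XmlReportListener com.example.MyTest (the class name is a placeholder) then replaces the plain JUnitCore invocation from the question and leaves a junitresults.xml behind for the CI system to merge with the Python results.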