Show test results from custom plugin (utPLSQL) - JUnit

I've been using Sonar for quite a long time, and for me it is a really great tool. Now, on a PL/SQL-based project, I have decided to use the utPLSQL Maven plugin to keep an eye on the PL/SQL test results. The utPLSQL plugin outputs reports in a JUnit-like XML format. Unfortunately, Sonar does not present the data from the utPLSQL reports. This is PL/SQL, so there is no coverage and there are no real Java test classes - just an XML report. How can I feed Sonar so that it shows just the test results - only the main statistics, like failed, passed, and total?

You might want to have a look at the Generic Test Coverage plugin. It will not be able to import xUnit-type reports directly, but a bit of XSLT should allow you to convert them to the correct format.
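If it helps, the target format the plugin consumes (in newer SonarQube versions this is the built-in generic test execution report; element names can vary between plugin versions, so treat this as a sketch and check your plugin's documentation) looks roughly like this:

```xml
<testExecutions version="1">
  <file path="src/packages/my_package.pks">
    <testCase name="test_calculates_total" duration="12"/>
    <testCase name="test_rejects_null" duration="8">
      <failure message="expected 0, got 1">details here</failure>
    </testCase>
  </file>
</testExecutions>
```

An XSLT that matches the JUnit testsuite/testcase elements and emits the elements above is typically only a handful of templates.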

Related

CSV to JSON benchmarks

I'm working on a project that uses parallel methods to convert text from one form to another. We're going to implement a CSV to JSON converter to demonstrate the speedups that are possible with our parallel framework.
We want to benchmark our converter once it's finished. What are the fastest libraries/stand-alone programs/etc. out there that are capable of doing CSV-to-JSON conversion? I found a list of potential candidates here: Large CSV to JSON/Object in Node.js, but I'm not sure how fast the listed options are. In the worst case I'll benchmark them myself, but if someone already knows what the "best in class" converters are, it'd save me some time.
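For a sense of what the candidates are doing under the hood, here is a deliberately naive CSV-to-JSON converter with a crude timing harness (sketched in Java; all names are mine, and it assumes a header row with no quoted fields or embedded commas - real libraries handle those):

```java
public class CsvToJson {
    // Deliberately naive CSV-to-JSON conversion: first row is the header,
    // no quoted fields or embedded commas (real libraries handle those).
    static String convert(String csv) {
        String[] lines = csv.split("\n");
        String[] header = lines[0].split(",");
        StringBuilder out = new StringBuilder("[");
        for (int i = 1; i < lines.length; i++) {
            String[] cells = lines[i].split(",");
            out.append(i > 1 ? "," : "").append('{');
            for (int j = 0; j < header.length; j++) {
                out.append(j > 0 ? "," : "")
                   .append('"').append(header[j]).append("\":\"")
                   .append(cells[j]).append('"');
            }
            out.append('}');
        }
        return out.append(']').toString();
    }

    public static void main(String[] args) {
        String csv = "country,continent\nFrance,Europe\nKenya,Africa";
        long start = System.nanoTime();
        String json = convert(csv);
        System.out.println(json);
        System.out.println("took " + (System.nanoTime() - start) / 1_000 + " us");
    }
}
```

A real benchmark would of course run each library over the same large input file many times and discard warm-up iterations.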
Looks like the maintainer of csvtojson has developed a benchmark application. I think I can add my CSV-to-JSON converter to his benchmark project to test it.
If your project can consider in-browser apps, I suggest csvtojson, as it is by far the fastest converter on the market as of 2017.
I created it myself, so I may be a bit biased, but I specifically developed it for a bigger project that required heavy CSV-to-JSON crunching.
Let me know if it helps.

How to take a screenshot on test failure with JUnit 5

Can someone please tell me how to take a screenshot when a test method fails (JUnit 5)? I have a base test class with @BeforeEach and @AfterEach methods. All other classes with @Test methods extend the base class.
Well, it is possible to write java code that takes screenshots, see here for example.
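A sketch of the capture part in plain Java (class and method names are mine, not from any library; in JUnit 5 you would typically trigger this from a TestWatcher extension's testFailed callback, registered on the base class with @ExtendWith):

```java
import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class FailureScreenshots {
    // Builds a file name like "shots/LoginTest_badPassword.png"; pure and testable.
    static String shotPath(String testClass, String testMethod) {
        return "shots/" + testClass + "_" + testMethod + ".png";
    }

    // Captures the whole screen; deliberately a no-op on headless CI,
    // where taking a screenshot makes no sense.
    static boolean capture(String path) {
        if (GraphicsEnvironment.isHeadless()) return false;
        try {
            Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
            BufferedImage img = new Robot().createScreenCapture(screen);
            File out = new File(path);
            if (out.getParentFile() != null) out.getParentFile().mkdirs();
            ImageIO.write(img, "png", out);
            return true;
        } catch (AWTException | java.io.IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String path = shotPath("LoginTest", "badPassword");
        System.out.println(path + " captured: " + capture(path));
    }
}
```

If you drive a browser with Selenium, its TakesScreenshot interface is usually the better capture mechanism than java.awt.Robot.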
But I am very much wondering about the real problem you are trying to solve this way. In case you haven't figured it out yet: the main intention of JUnit is to provide a framework that runs your tests in various environments.
Of course it is nice that you can run JUnit within your IDE, and maybe there you would find a screenshot helpful. But "normally" unit tests also run during nightly builds and the like - in environments where "taking a screenshot" might not make any sense!
Beyond that: screenshots are an extremely ineffective way of collecting information! When a test fails, you should be looking at textual log files, HTML/XML reports, and so on. You want failing tests to generate information that can be easily digested.
So, the real answer here is: step back from what you are doing right now, and reconsider non-screenshot solutions to the problem you actually want to solve!
You don't need to take screenshots for JUnit test failures/passes; the recommended way is to generate various reports (tests passed/failed report, code coverage report, code complexity report, etc.) automatically using the tools/plugins below.
You can use the Cobertura Maven plugin or the SonarQube code quality tool, and these will generate the reports for you automatically.
You can look here for the Cobertura Maven plugin and here for SonarQube for more details.
You should integrate these tools with your CI (Continuous Integration) environment and ensure that if the code does NOT pass certain quality gates (in terms of test coverage, code complexity, etc.), then the project build (war/ear) fails automatically.

Publishing results from shell script onto Jenkins

I have a regression build script which builds 90+ modules. The script maintains a list of what passed and what failed. Is there a plugin or an easy workaround to display the status of all those modules?
Yes -- you can use the JUnit plugin to do that. Despite its name, it's not tied to unit testing alone.
The plugin can:
- display the success status of individual sub-tests
- give a "failed since" indication for failed sub-tests
- provide summary statistics on total passed/failed counts across builds
Only caveat: you must convert your result list to JUnit XML format so the plugin can process it as input data. The format is rather straightforward, though, and the conversion should not be much effort.
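A minimal result file in that format, with one testcase element per module (all names here are invented), could look like this:

```xml
<testsuite name="regression-build" tests="3" failures="1">
  <testcase classname="modules" name="module-a" time="12.3"/>
  <testcase classname="modules" name="module-b" time="8.1"/>
  <testcase classname="modules" name="module-c">
    <failure message="build failed">compiler output here</failure>
  </testcase>
</testsuite>
```

Point the plugin's "Publish JUnit test result report" post-build step at this file and Jenkins will track the per-module pass/fail history.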

Cucumber examples reuse in different features/scenarios

I've been using Cucumber for a while and I've stumbled upon a problem:
Actual question:
Is there a way to import the examples from a single file/DB and use them in Cucumber specifically as Examples?
Or, alternatively, is there a way to define a variable as an example while already inside a step?
Or, alternatively again, is there an option to pass the examples in as variables when I launch the feature file/scenario?
The Problem:
I have a couple of scenarios where I would like to use exactly the same examples, over and over again.
It sounds rather easy, but the Examples table is very large (more specifically, it contains all the countries in the world and their respective continents). Repeating it would therefore be very troublesome, especially if the table ever needs changing (I would have to change every instance of the table separately).
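To make the duplication concrete, a made-up outline of the kind described above might look like this, with the full table pasted under every scenario that needs it:

```gherkin
Scenario Outline: Show the continent for a country
  When I look up "<country>"
  Then I should see "<continent>"

  Examples:
    | country | continent |
    | France  | Europe    |
    | Kenya   | Africa    |
    # ...one row per country in the world, repeated under every outline
```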
Complication:
I have a rerun function that knows when a specific example failed and reruns it after the test is done.
Restrictions:
I do not want to edit my rerun file
Related:
I've noticed that there is already an open discussion about importing examples from CSV here:
Importing CSV as test data in Cucumber?
However, that discussion doesn't help me, because my rerun function only knows how to work with Examples, and the solution suggested there breaks that.
Thank you!
You can use CSV and other external data sources with QAF using a different BDD syntax.
If you want to use Cucumber steps or the Cucumber runner, you can use QAF-cucumber with BDD2 (preferred) or Gherkin syntax. QAF-cucumber enables external test data and other QAF features with Cucumber.
Below is an example feature file using BDD2 syntax; it can be run with the TestNG or Cucumber runner.
Feature: feature uses external data file

  #datafile:resources/${env}/testdata.csv
  #regression
  Scenario: Another scenario exploring different combinations using a data provider
    Given a "${precondition}"
    When an event occurs
    Then the outcome should "${be-captured}"
The testdata.csv file might look like this:
TestcaseId,precondition,be-captured
123461,abc,be captured
123462,xyz,not be captured
You can run it with the TestNG or Cucumber runner, and use any of the built-in data providers or a custom one.

Generate a questionnaire from HTML to XML

I have a logistics web application with a form for submitting new packages. I'm currently writing tests for that website and I want to add some for that package form... The problem is that I want to run a lot of tests with different data in the form, and doing that by editing an XML file is a bit tiresome.
I am looking for a tool that can generate a form from an HTML page (or from the .class object) with all the fields it contains, which I can fill in to generate the XML for the tests automatically.
My boss told me it could probably be done with "WSDL", but I don't have any idea about that. Can you help me with a solution?
I'm working with .NET C#, and for the tests: gladio, WatiN and NUnit.
I'm not sure what you mean by a form to submit new packages, but if you are looking for sample data for your tests, here are some sample data generators: http://www.webresourcesdepot.com/test-sample-data-generators/
You can use the sample data to create and save XML files for your tests.
Also, WSDL is short for Web Services Description Language, which is of no use for your purpose.
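As a sketch of the "generate the XML automatically" idea (shown in Java for brevity; the same approach ports directly to C#'s XmlWriter, and all names here are invented), you can build one XML document per test case from a map of field names to values:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FormXmlBuilder {
    // Builds a flat XML document from form field names and values.
    // Escaping is limited to the characters below; a real implementation
    // should use a proper XML library (e.g. XmlWriter in .NET).
    static String toXml(String root, Map<String, String> fields) {
        StringBuilder sb = new StringBuilder("<" + root + ">");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            sb.append('<').append(e.getKey()).append('>')
              .append(escape(e.getValue()))
              .append("</").append(e.getKey()).append('>');
        }
        return sb.append("</").append(root).append('>').toString();
    }

    static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }

    public static void main(String[] args) {
        Map<String, String> pkg = new LinkedHashMap<>();
        pkg.put("sender", "ACME & Co");
        pkg.put("weightKg", "2.5");
        System.out.println(toXml("package", pkg));
    }
}
```

Feeding such a builder with values from the sample data generators above gives you one saved XML file per test case without hand-editing.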