RSpec report failure outside tests to JUnit

How do I configure rspec to report the absence of examples as a failure in the JUnit report?
To give you some context: I recently parallelized my RSpec test suite using parallel_rspec. In doing so, I forgot to migrate a --require XYZ from the .rspec file to .rspec_parallel. Since most of my specs start with require 'XYZ' on their own, I didn't notice the error right away. If one of the runners starts with a spec that lacks that require, it fails because some symbols are not defined (see here). However, the generated JUnit reports just state that no tests were executed, so the failing workers went unnoticed in our Jenkins.
I am fairly certain that the missing --require flag is what broke the runner, but I would like to catch the case where that was not the only cause. So: how do I configure RSpec to report "no tests" as a failure when using --format RspecJunitFormatter --out tmp/rspec.xml?
$ rspec --version
RSpec 3.7
- rspec-core 3.7.1
- rspec-expectations 3.7.0
- rspec-mocks 3.7.0
- rspec-support 3.7.1
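
For what it's worth, rspec-core 3.7 added a fail_if_no_examples setting that makes a run defining zero examples exit with a non-zero status, so a worker that loads no specs fails the build even if the JUnit XML itself may still list zero tests. A minimal sketch (file placement is just a suggestion; it only helps if the runner gets far enough to evaluate the configuration):

# spec/spec_helper.rb (option available since rspec-core 3.7)
RSpec.configure do |config|
  # Treat a run that defines zero examples as a failure (non-zero exit status).
  config.fail_if_no_examples = true
end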

Related

Errors Uploading to CircleCI with MySQL Database - Maven Clean Also Creates Issues When Run Before Maven Test

I'm using the Hibernate framework along with Maven in IntelliJ. I'm creating a MySQL database, I also have some ORM classes that map the MySQL database, and then I'm running some JUnit tests to make sure everything works.
Where I'm having trouble is in two places, which are related to each other:
When I run mvn test, sometimes my JUnit tests work fine and are able to query the simulated database, establish a connection (even though it's just with the simulated database), execute a statement, etc. However, sometimes, if I run mvn clean before running mvn test, while the JUnit tests still execute, the tests output with failures (not errors, just failures, though this is still bad, of course).
The problem outlined in #1 is essentially duplicated when I upload to GitHub and run CircleCI (which isn't surprising, since CircleCI runs mvn test when doing its integration testing). Most of my uploads failed, but one of them finally worked. However, I'm not exactly sure why the "final" upload was successful while the others weren't.
The error messages I'm getting either from mvn test or the CircleCI builds are typically as follows. These errors are from my penultimate upload, the one I did just prior to the next upload which actually worked:
java.sql.SQLNonTransientConnectionException: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
com.mysql.cj.exceptions.CJException: Public Key Retrieval is not allowed
java.sql.SQLNonTransientConnectionException: Could not create connection to database server
I should also note that my intention is to run mvn clean first, then upload to CircleCI, however, running mvn clean seems to be somehow involved in perpetuating these errors.
As far as different resources I'm using go, here they are. If I'm forgetting something, please let me know and I should be able to include it.
In my hibernate.cfg.xml file, I have the following lines:
<property name="connection.url">jdbc:mysql://localhost:3306/stocks</property>
<property name="connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
At the end of the word "stocks" on the first line, I have sometimes appended one or more of the following, depending on the error(s) from either Maven or CircleCI. Appending some combination of these parameters seemed to help get things to work, but running mvn clean seemed to halt any effect these additions were having:
autoReconnect=true
useSSL=false
allowPublicKeyRetrieval=true
Running the JUnit tests from within IntelliJ usually works, but if I run mvn clean first, then IntelliJ usually won't run them, unless I then go back into this file and append ?autoReconnect=true&useSSL=false. If I do that, then IntelliJ will run the JUnit tests fine.
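
For illustration, combining those parameters directly in hibernate.cfg.xml might look like the line below (note that & must be escaped as &amp; inside XML; the exact parameter combination is an assumption and depends on what your Connector/J version requires):

<property name="connection.url">jdbc:mysql://localhost:3306/stocks?useSSL=false&amp;allowPublicKeyRetrieval=true&amp;autoReconnect=true</property>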
In my config.yml file for CircleCI, I have the following code. Certain statements were added in MAVEN_OPTS based on other research I did to try to counteract the errors I was getting, but I don't know if these statements are having any impact one way or the other:
# Java Maven CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-java/ for more details
#
version: 2
jobs:
  build:
    docker:
      # specify the version you desire here
      - image: circleci/openjdk:8-jdk

      # Specify service dependencies here if necessary
      # CircleCI maintains a library of pre-built images
      # documented at https://circleci.com/docs/2.0/circleci-images/
      # - image: circleci/postgres:9.4
      - image: circleci/mysql:latest-ram
        environment:
          - MYSQL_ROOT_PASSWORD: (my real password goes here)
          - MYSQL_DATABASE: stocks
          - MYSQL_USER: bob
          - MYSQL_PASSWORD: (the real password goes here)

    working_directory: ~/repo

    environment:
      # Customize the JVM maximum heap limit
      MAVEN_OPTS: -Xmx3200m -Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true -Dmaven.wagon.http.ssl.ignore.validity.dates=true

    steps:
      - checkout
      - run: sudo apt install -y mysql-client

      # Download and cache dependencies
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "pom.xml" }}
            # fallback to using the latest cache if no exact match is found
            - v1-dependencies-

      - run: mvn dependency:go-offline

      - save_cache:
          paths:
            - ~/.m2
          key: v1-dependencies-{{ checksum "pom.xml" }}

      # run tests!
      - run: mvn integration-test
If anyone has any idea what's going on, I appreciate the help. My goal is to be able to upload to CircleCI by first running mvn clean so only the src files, pom.xml file, and .circleci folder are included in the upload. Also, not to belabor the point, but my most recent upload to CircleCI did in fact work, and I'm not sure what made that build work while all the other ones did not.

Running unit tests with Codeception in Yii2 project

Trying to set up remote Codeception unit tests in PhpStorm in a Yii2 project.
Using SSH, I can log into the server, go to the root directory of my Yii2 project, and run:
> vendor/bin/codecept run unit
and the tests run.
I'm trying to run these remote tests via PhpStorm. I've set up a remote PHP CLI interpreter, and I'm pointing to the Codeception library in my Yii2 project folder:
/var/www/vhosts/mydomain.com/httpdocs/yii2/vendor/bin/codecept
Test Runner points to:
/var/www/vhosts/mydomain.com/httpdocs/yii2/codeception.yml
When I try to run the tests, the following command is executed:
> ssh://user@mydomain.com:22/opt/plesk/php/5.6/bin/php /root/.phpstorm_helpers/phpunit.php --no-configuration /var/www/vhosts/mydomain.com/httpdocs/yii2/tests
The process fails, as it complains that it cannot find PHPUnit:
Process finished with exit code 1
Cannot find PHPUnit in include path (.:/opt/plesk/php/5.6/share/pear)
How do I get PhpStorm to look for PHPUnit in the yii2/vendor folder? Can I just tell PhpStorm to run a different command instead of this phpstorm_helpers script? It seems that the documentation is out of date and the screenshots JetBrains provides are from a different version of PhpStorm; I'm running PhpStorm 2017.3.
So after a LOT of digging, the issue was with the Run/Debug Configuration. Despite adding Codeception to the Test Frameworks section, clicking the run button still tried to execute a pure PHPUnit test.
To switch to running the test as Codeception, look at the top toolbar above the file tabs, where you can define various Run/Debug configurations.
Under Run you'll then have additional options: choose the blue Codeception icon to run the test using Codeception instead of PHPUnit.

Visual Studio Team Services Building JSON Scripts

I'm currently building scripts using Selenium Builder (which saves files as JSON) and I'm having a hard time running these scripts on VSTS. My question specifically is: can Visual Studio Team Services build JSON scripts and tie them in with its CI? If so, which approach must I take in order to do this / make it possible?
Thanks!
Here are my steps for your reference:
Deploy your own private build agent by following this link.
Configure the required environment on the build agent, such as the Selenium driver and Firefox, so that the tests can run on the build agent.
Upload the JSON file generated by Selenium Builder into the VSTS repository.
Create a build definition with two Command Line tasks: the first runs an npm install command to install se-interpreter, and the second runs the se-interpreter command to execute the test in the JSON file (see the sketch after these steps).
Queue the build, and you will see the test being executed during the build.
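
For illustration, the two task commands might look roughly like this (a sketch assuming the se-interpreter npm package and a hypothetical script path; adjust the install style and path to your repository layout):

# task 1: install the Selenium Builder JSON interpreter
npm install -g se-interpreter
# task 2: play back the recorded script
se-interpreter tests/my-selenium-script.json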

Hudson + Maven + Emma/Sonar = Build Cycle Runs 2x

I have a bunch of Maven projects building in Hudson, with Sonar sitting on the sidelines. Sonar gives me Sonar stats, FindBugs stats, and code coverage.
I've noticed that regardless of whether I use Sonar or EMMA via Maven directly, the entire build cycle runs twice. This includes init (which, in my case, reinitializes the database -- expensive) and unit tests (a few hundred -- also expensive).
How can I prevent this? I did a lot of reading, and it seems like this is due to the design of code-coverage plugins -- to keep uninstrumented classes separated from instrumented ones.
I've tried configurations like:
Maven runs: deploy, EMMA
Maven runs: deploy; deploy to Sonar on completion
The Sonar documentation recommends running the Sonar plugin in two stages:
mvn clean install -Dtest=false -DfailIfNoTests=false
mvn sonar:sonar
The tests are bypassed in the first stage and run implicitly in the second stage.
A one-line alternative is to run the following command:
mvn clean install sonar:sonar -Dmaven.test.failure.ignore=true
but this will run the tests twice - as you have found.
To add to @Strawberry's answer, you could reuse the unit test reports instead of running them again. Refer to the section "Reuse existing unit test reports" in the Sonar documentation.
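
A rough sketch of what the reuse setup might look like on the command line (the property names sonar.dynamicAnalysis and sonar.surefire.reportsPath come from older Sonar versions and are assumptions here; verify them against the documentation for your Sonar release):

mvn clean install -Dmaven.test.failure.ignore=true
mvn sonar:sonar -Dsonar.dynamicAnalysis=reuseReports -Dsonar.surefire.reportsPath=target/surefire-reports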
Once this is done, you can configure the following in Hudson
clean deploy sonar:sonar

Hudson + JUnit + embedded GlassFish, how to provide domain configuration?

I'm using NetBeans and GlassFish 3.0.1 to create an EJB3 application. I have written a few Unit Tests, which get run via JUnit and make use of the embedded GlassFish. Whenever I run these tests on my development machine (so from within NetBeans), it's all good.
Now I would like to let Hudson run those tests. At the moment it is failing with a lookup failure on a resource (in this case the datasource of a JPA persistence unit):
[junit] SEVERE: Exception while invoking class org.glassfish.persistence.jpa.JPADeployer prepare method
[junit] java.lang.RuntimeException: javax.naming.NamingException: Lookup failed for 'mvs_devel' in SerialContext
After searching around and trying to learn about this, I believe it is related to the embedded GlassFish not having been configured with resources. In other words it's missing a domain.xml file. Right?
Two questions:
Why does it work with NetBeans on my dev box? What magic does NetBeans do in the background?
How should I provide the file? Where does the embedded GlassFish on the Hudson-box expect it?
Hudson is using the same Ant build-scripts (created by NetBeans).
I've read this post about instanceRoot and the EmbeddedFileSystemBuilder, but I don't understand enough of that. Is this needed for every TestCase (Emb. GF gets started/stopped for each bean-under-test)? Is this part of EJBContainer.createEJBContainer()? Again, why is it not necessary to do this when running tests on NetBeans?
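
For reference, passing a domain configuration to the embedded container programmatically might look roughly like this (a sketch; the org.glassfish.ejb.embedded.glassfish.* property names and the paths are assumptions to verify against the embedded GlassFish 3.x documentation):

import java.util.HashMap;
import java.util.Map;
import javax.ejb.embeddable.EJBContainer;

public class EmbeddedContainerBootstrap {
    public static void main(String[] args) {
        Map<String, Object> props = new HashMap<String, Object>();
        // Point the embedded container at an existing domain so that
        // resources such as the 'mvs_devel' datasource are defined.
        // Property names are assumptions for the GlassFish 3.x embedded container.
        props.put("org.glassfish.ejb.embedded.glassfish.instance.root",
                "/path/to/glassfish/domains/domain1");
        props.put("org.glassfish.ejb.embedded.glassfish.configuration.file",
                "/path/to/glassfish/domains/domain1/config/domain.xml");

        EJBContainer container = EJBContainer.createEJBContainer(props);
        try {
            // container.getContext().lookup(...) and test logic would go here
        } finally {
            container.close();
        }
    }
}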
Update
Following Peter's advice I can confirm: when running Ant on a freshly checked-out copy of the code, with the same properties as Hudson is configured with, the tests do get executed!
Ten to one it is a classpath issue, as IDEs tend to swap paths in and out depending on whether you run normally or run unit tests.
Try running the tests on a command line from a freshly checked-out version from your SCM. Chances are you'll get the same error. Debugging on your local machine is a lot easier than on a remote machine.
When it builds reliably on the command line (in a separate directory), then it is time to move to Hudson.
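
As a concrete illustration of that advice, the check could look something like this (a sketch only; the repository URL, property name, and Ant target are placeholders for whatever Hudson actually uses):

# check out a clean copy outside the IDE workspace (URL and SCM are placeholders)
svn checkout https://example.org/svn/myproject/trunk myproject-clean
cd myproject-clean
# run the same Ant target with the same properties that Hudson passes
ant -Dsome.property.from.hudson=value test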