Using JUnit, Maven and Hudson/Jenkins for integration tests

We are going to use a Hudson/Jenkins build server both to build our server applications (just by calling Maven) and to run integration tests against them. We are going to prepare three Hudson/Jenkins jobs: build, deploy, and run integration tests, which trigger each other in that order. All of these jobs (build, deploy, integration tests) will run nightly.
The integration tests are written with JUnit and are invoked by mvn test (which in turn will be invoked by the "test" Hudson/Jenkins job). Since they require the server to be up and running, we have to run that "deploy" job first.
Does it make sense? Is there any special server to deploy the application and run tests against, or is Hudson/Jenkins OK for that?
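To make the setup concrete, one of these JUnit integration tests might look roughly like the sketch below (the /health endpoint is a placeholder, and the server.base.url property is our own invention; we assume the "test" job passes the deployed server's address to mvn test that way):

    import static org.junit.Assert.assertEquals;

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    public class ServerSmokeTest {

        // Assumption: the Jenkins "test" job passes the deployed server's address,
        // e.g. mvn test -Dserver.base.url=http://test-host:8080
        private static final String BASE_URL =
                System.getProperty("server.base.url", "http://localhost:8080");

        @Test
        public void healthEndpointRespondsWithOk() throws IOException {
            // "/health" is a placeholder endpoint on the deployed application
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(BASE_URL + "/health").openConnection();
            connection.setRequestMethod("GET");
            try {
                assertEquals(HttpURLConnection.HTTP_OK, connection.getResponseCode());
            } finally {
                connection.disconnect();
            }
        }
    }
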

It definitely makes sense; basically, you are referring to a build pipeline. There is a Jenkins plugin to help visualize the upstream/downstream projects (you create a new pipeline view in Jenkins).
As for the deployment of the server component, this depends on what technology/stack you are running on. For instance, you could write a script that deploys the application to a test environment and run it from a post-build step in Jenkins.
Another option is to use a Maven plugin to deploy the application. You can separate the deployment step into its own profile and run only the deploy goal in the deploy step, etc.
Basically, there are a lot of options, but the idea of a build pipeline makes a lot of sense. To read up on build pipelines and related topics, I would suggest taking a look at Continuous Deployment.
For more information related to Jenkins, have a look at this video.
Does it make sense? Is there any special server to deploy the application and run tests against, or is Hudson/Jenkins OK for that?
You can run the application on the same server as Jenkins, but whether that makes sense depends on the application. If it depends heavily on a specific server setup, a better choice may be to run the server in a VM and put the configuration in source control. There are plenty of tools to help automate this; off the top of my head you have Puppet, Chef and Vagrant.

Depending on the technology of your server, you could do all of this in a single Hudson project, executing your integration tests using Maven's Failsafe plugin instead of Surefire.
This allows you to start and deploy your server prior to executing the integration tests, and to shut it down after they have completed. It also allows you to separate your integration tests from your unit tests.
For Java EE applications, you can perform the start/deploy/stop steps using Cargo, or use an embedded Jetty container and the Jetty Maven plugin.
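As a rough sketch of how the separation works with Failsafe's default naming convention: classes ending in IT are picked up by Failsafe during the integration-test phase, while Surefire runs the *Test classes during the test phase. The URL below is a placeholder, and we assume the container is started and the application deployed in the pre-integration-test phase (e.g. by Cargo or the Jetty plugin):

    import static org.junit.Assert.assertEquals;

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.BeforeClass;
    import org.junit.Test;

    // The IT suffix makes the Failsafe plugin (not Surefire) pick this class up,
    // so it only runs during the integration-test phase, after the container
    // has been started and the application deployed.
    public class DeployedApplicationIT {

        // Placeholder: the address the container is assumed to listen on.
        private static final String BASE_URL = "http://localhost:8080/myapp";

        @BeforeClass
        public static void waitForServer() throws Exception {
            // Give the freshly started container a little time to come up.
            for (int attempt = 0; attempt < 30; attempt++) {
                try {
                    HttpURLConnection connection =
                            (HttpURLConnection) new URL(BASE_URL).openConnection();
                    connection.setRequestMethod("GET");
                    connection.getResponseCode();
                    connection.disconnect();
                    return; // server answered, we can start testing
                } catch (IOException notUpYet) {
                    Thread.sleep(1000);
                }
            }
            throw new IllegalStateException("Server did not come up: " + BASE_URL);
        }

        @Test
        public void applicationIsDeployed() throws IOException {
            HttpURLConnection connection =
                    (HttpURLConnection) new URL(BASE_URL).openConnection();
            assertEquals(HttpURLConnection.HTTP_OK, connection.getResponseCode());
            connection.disconnect();
        }
    }

Note that Failsafe binds to the integration-test and verify phases, so such tests are typically executed with mvn verify rather than mvn test.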

Related

How to call an Informatica workflow running in a different integration service

I have two workflows: workflow 1 in Integration Service 1 and workflow 2 in Integration Service 2.
How do I call workflow 2 from workflow 1? I am currently trying to call it using the command prompt, but it didn't work.
Just to let you know, Integration Service 1 is Informatica 9.2 and Integration Service 2 is Informatica version 10.2.
PowerCenter does not provide support for cross-workflow dependencies, regardless of whether these are configured to use the same or a different Integration Service.
The best way to solve this kind of challenge is to use a separate scheduling tool, such as Airflow, Control-M, Autosys, or any other.
It is also possible to expose the workflow as a web service and call it from different workflows, if needed. Not really convenient, but possible.
Lastly, it's possible to use the pmcmd startworkflow command-line interface in a Command task of one workflow to have the other one started.
I have done something similar this way:
The other WF is a web service workflow, or is executed alongside a web service.
Add an application connection.
The WSH where your WF runs should be the endpoint of that connection.
Add this WF inside the mapping of the other one as a Web Service transformation.

How can I write a test module that needs a working MySQL database?

I'm writing a Node.js npm module that needs a MySQL database to operate.
I would like to write a test module that connects to a "fake" database and does some operations on it.
I have already set up my test database locally on my development machine, but I would like these tests to work on any machine.
What's the best practice for writing integration test modules that depend on a running MySQL database?
Is there any public service on the net where I can get a temporary MySQL user/password to do some operations with, for a limited time/size?
Usually you would set up a continuous integration (CI) system that executes your tests every time you commit a change to your version control system. The CI system would provide a clean MySQL database for your tests to run against. If you use a CI system in the cloud, you can often configure it to provide the database easily. E.g. see Travis CI.
If you set up a CI system, other developers will still need to run their own MySQL database on their computers if they want to execute the tests. Alternatively, you may use a mock instead of the real database in your tests. For details see: How do you mock MySQL (without an ORM) in Node.js
However, using a mock alone won't give you sufficient test results, since the mock just emulates the database in a simple way. Sometimes the mock may be too simple or just be buggy, so you will need to run at least some of your tests against the real database as well. Thus you may choose to run the tests against the real database with your CI system and run them against the mock during development.
Not quite fully understanding the question, I would suggest Amazon RDS for short-lived testing where accessing something over the public internet is required. Amazon Web Services can get pricey quickly if used for any real traffic, but it is still a good option for a proof of concept.

Single application, multiple environments in OpenShift

Is it possible to have a single application with multiple environments in OpenShift? I want to have environments like development, test, and production in OpenShift. Is there a way to do this type of deployment automatically?
It's definitely possible to accomplish this in OpenShift. You can leverage Jenkins to do your builds and Git to create dev, test, and prod branches.

Executing JUnit on a remote server in Jenkins

Can anyone help with this?
I am using Jenkins to deploy a build to a remote server; so far, so good. However, I want to run JUnit tests on that remote server, but I cannot find out how to do this within Jenkins. I have tried it from Ant, but it gives me an error regarding junit.jar.
I believe that the tests are executing locally rather than remotely.
Any help would be appreciated; Jenkins is a very new experience to me.
Initially, you have to be aware of a few things. Jenkins is a CI tool built with plenty of features to automate things. If you need to run tests on a remote server, then follow this sequence to create such a setup:
Install Jenkins on a machine and properly configure it as a CI server.
Set up your remote server with the necessary tools and configure it properly.
On the Jenkins server, install the SSH plugin to run jobs on the remote machine via SSH.
Add the remote server as a slave node under the Jenkins -> Manage Jenkins -> Manage Nodes -> Add Node menu on the Jenkins server.
Configure the node as per your requirements.
Create a new job that runs the JUnit tests, with pre-/post-build actions, in Jenkins.
Finally, schedule the build for the slave node and kick it off.
For step-by-step instructions, refer to this answer.

What is the difference between using Glassfish Server -> Local and Remote

I am using IntelliJ IDEA to develop my applications, and I use GlassFish as the server for them.
When I want to run/debug my application, I can configure it from Glassfish Server -> Local and define arguments there. However, besides the Glassfish Server Local section, there is also a Remote section for configuration, where I can easily configure and debug my application just by defining host and port variables.
So my question is: why do I need the Glassfish Server Local configuration (except for defining extra parameters), and what is the difference between them (I mean performance, etc.)?
There are a number of development workflow optimizations and automations that can be performed by an IDE when it is working with a local server. I don't have a strong background in IDEA, so I am not sure which of the following they may have implemented:
1. Using in-place/exploded/directory deployment can eliminate jar/war/ear creation in the IDE and deconstruction in the server. This can be a significant time saver.
2. Linked to 1 is smarter redeployment. In some cases, a file change (like changing a jsp or an html file) does not need to trigger redeployment.
3. JDBC driver integration allows users to configure their IDE to access a DB and then propagates that configuration (which usually includes driver jars, etc.) into the server's classpath as part of deployment of an app.
4. Access to server log files during deployment and execution.
5. The ability to start and stop the server... even today, you do need to restart GlassFish sometimes.
6. Viewing the generated Java sources of a JSP.
Most of these features are not available with a remote server and that has a negative effect on iterative development since the break between edit and validate can be fairly long.
This answer is based on my familiarity with the work that we have done for the NetBeans/GlassFish integration. The guys at IntelliJ are smart, so I would not be surprised if they have other features that are available when you are working with a local server.
Local starts GlassFish for you and performs the deployment. With Remote, you start GlassFish manually. Remote can be used to debug apps running on other machines; Local is useful for development and testing.