I am completely new to Apache Camel and have only a basic understanding of it so far.
I am now going through videos and documents to get ideas for implementing JUnit test cases for an Apache Camel Spring DSL based Spring Boot application, but it's not clear to me, since there are many ways to implement them and most of the material stays at a very high level.
I am confused about which approach to follow and what is actually happening in those JUnit tests.
Does anyone have an example, link, or video that explains JUnit coverage for an Apache Camel Spring DSL based Spring Boot application?
I am particularly looking for JUnit tests.
Also, if you know of any good Apache Camel tutorials, let me know.
JUnit and Camel don't work the same way as JUnit and "normal" code, and as far as I am aware there are only fairly rudimentary ways to get coverage of a Camel route from JUnit. A Camel route is a processing model that is essentially an in-memory model of the various steps that need to run, so you can't use code coverage tools to track which parts get executed.
Consider this route (in a subclass of RouteBuilder):
public void configure() throws Exception {
    from("jms:queue:zzz_in_document_q")
        .routeId("from_jms_to_processor_to_jms")
        .transacted()
        .log(LoggingLevel.INFO, "step 1/3: ${body}")
        .bean(DocBean.class)
        .log(LoggingLevel.INFO, "step 2/3 - now I've got this: ${body}")
        .process(new DocProcessor())
        .log(LoggingLevel.INFO, "step 3/3 - and finally I've got this: ${body}")
        .to("jms:queue:zzz_out_document_q");
}
and an associated test case, in a class that extends CamelTestSupport:
@Test
public void testJmsAndDbNoInsert() throws Exception {
    long docCountBefore = count("select * from document");
    template.sendBody("jms:queue:zzz_in_document_q", new Long(100));
    Exchange exchange = consumer.receive("jms:queue:zzz_out_document_q", 5000);
    assertNotNull(exchange);
    Document d = exchange.getIn().getBody(Document.class);
    assertNotNull(d);
    long docCountAfter = count("select * from document");
    assertEquals(docCountBefore, docCountAfter);
}
When the unit test runs, the app context will run the configure method, so I've got 100% coverage of my route before I even put a message on the queue! Except I don't, because all that has happened is that the execution model has been created in the Camel routing engine; the various components and processors have merely been wired up to run in the right order.
Beans and processors will get included in the coverage reports, but if you have complex logic in the routes themselves (choices, filters, expressions), the reports won't give you coverage of it.
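If the goal is mainly to prove that every step runs, one option is to swap the JMS endpoints for direct/mock endpoints with adviceWith. A sketch, assuming camel-test 2.x and the route above (the test class name is made up):

import org.apache.camel.builder.AdviceWithRouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class DocRouteTest extends CamelTestSupport {

    // assumes the RouteBuilder above is registered, e.g. via an
    // overridden createRouteBuilder()

    @Override
    public boolean isUseAdviceWith() {
        return true; // context is not auto-started, so we can advise the route first
    }

    @Test
    public void everyStepIsExercised() throws Exception {
        context.getRouteDefinition("from_jms_to_processor_to_jms")
            .adviceWith(context, new AdviceWithRouteBuilder() {
                @Override
                public void configure() throws Exception {
                    // replace the JMS consumer and producer so no broker is needed
                    replaceFromWith("direct:in");
                    weaveByToUri("jms:queue:zzz_out_document_q")
                        .replace().to("mock:out");
                }
            });
        context.start();

        MockEndpoint out = getMockEndpoint("mock:out");
        out.expectedMessageCount(1);
        template.sendBody("direct:in", 100L);
        out.assertIsSatisfied();
    }
}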
There is this capability, delivered around 2017 - https://issues.apache.org/jira/browse/CAMEL-8657 - but I haven't used it and am not sure how well it integrates with whatever coverage tooling you use.
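If you want to experiment with it, enabling it appears to be a one-method override in a camel-test (2.19+) test class; a sketch I haven't verified against any coverage tooling:

import org.apache.camel.test.junit4.CamelTestSupport;

public class DocRouteCoverageTest extends CamelTestSupport {

    @Override
    public boolean isDumpRouteCoverage() {
        // dumps per-test route coverage XML under target/camel-route-coverage
        return true;
    }

    // ... tests as in the example above
}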
I'm trying to write a test that needs both Robolectric 2.2 and PowerMock, as the code under test depends on some Android libraries and third party libraries with final classes that I need to mock.
Given that I'm forced to use the Robolectric test runner through:
@RunWith(RobolectricTestRunner.class)
...I cannot use the PowerMock test runner, so I'm trying to go with the PowerMock java agent alternative, without luck so far.
I have set up everything according to this guide, but I'm facing a collision between classes required by the javaagent library and by Robolectric through its dependency on asm-1.4. Both depend on org.objectweb.asm.ClassVisitor, but javaagent-1.5.1 ships with its own version, in which ClassVisitor is an interface, while in the asm-1.4 version of the same namespace it is an abstract class, with the corresponding error at runtime:
java.lang.IncompatibleClassChangeError: class org.objectweb.asm.tree.ClassNode has interface org.objectweb.asm.ClassVisitor as super class
I have even tried modifying the javaagent library jar to remove the org.objectweb.asm classes entirely, but that doesn't work either: a ClassNotFoundException occurs afterwards, because some other classes in the org.objectweb.asm package ship only in the javaagent jar and not in the asm one.
Any ideas? According to examples out there, the agent seems to work fine with at least the Spring test runner.
I had the same problem, and while I didn't solve it as such, I wanted to share my approach, which removes the need for PowerMock (always a good thing in my view). I wanted to mock a call to
Fragment fooFragment = new FooFragment();
So what I did was add another level of indirection. I created a FragmentFactory class:
public FragmentFactory fragmentFactory = new FragmentFactory();
[...]
Fragment fooFragment = fragmentFactory.getFooFragment();
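For reference, the factory itself can be as small as this (a sketch; FooFragment is the class from the question):

import android.app.Fragment;

public class FragmentFactory {
    // real implementation; replaced by a mock in tests
    public Fragment getFooFragment() {
        return new FooFragment();
    }
}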
After I did this, I could just mock out the factory with standard Mockito, like this:
FragmentFactory mockFactory = mock(FragmentFactory.class);
activity.fragmentFactory = mockFactory;
when(mockFactory.getFooFragment()).thenReturn(mockFooFragment);
I do the following:
From the Package Explorer I select "New, Other, JUnit Test Case"
I write this code:
package dk.sample;

import org.junit.*;
import static org.junit.Assert.*;

public class TestCase {
    @Test
    public void alwaysTrue() {
        assertTrue(true);
    }
}
I then select "Run As, JUnit test"
Get this error: "Class not found dk.sample.TestCase
java.lang.ClassNotFoundException: ...."
What am I missing? I have tried different run configurations, but it seems like I'm missing a classpath entry somewhere. But for what, and where?
To make JUnit work within Domino Designer you need to perform a few additional steps:
set up source control for your application
adjust the on-disk project to be recognized as Java application
run JUnit tests within your on-disk project
Please note that Java agents have to be tested in a different way.
You can find more detailed explanation about enabling JUnit for both XPages and Agents in the following blog post: Unit Tests for Lotus Domino Applications
Here's also a great how-to on this topic.
Couldn't get JUnit to work inside Domino Designer. Instead of running the tests from DDE, I now run them from an XPage, and this works like a dream. I made my own 'JUnit runner' class - that is, I just call the JUnit runners but handle the results myself in order to display them as HTML on the XPage.
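The core of such a runner class is just JUnit 4's programmatic API; a minimal sketch (the class name and HTML formatting are illustrative, the real code is in the links below):

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class XPageJUnitRunner {

    // runs the given test classes and renders the outcome as an HTML fragment
    public String runAsHtml(Class<?>... testClasses) {
        Result result = JUnitCore.runClasses(testClasses);
        StringBuilder html = new StringBuilder();
        html.append("<p>Ran ").append(result.getRunCount())
            .append(" tests, ").append(result.getFailureCount())
            .append(" failed</p><ul>");
        for (Failure failure : result.getFailures()) {
            html.append("<li>").append(failure).append("</li>");
        }
        return html.append("</ul>").toString();
    }
}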
Code can be found here: http://xpages.dk/wp-content/uploads/2013/10/junitrunner.txt
Danish blog post here: http://xpages.dk/?p=1162
I have a Grails service that sends out e-mails using a 3rd-party service by making an HTTP call:
class EmailService {
    def sendEmail(values) {
        def valueJson = values as JSON
        ... // makes HTTP call to 3rd party service
    }
}
I've written a unit test to test this service (because an integration test spins up Hibernate and the entire domain framework, which I don't need):
@TestFor(EmailService)
class EmailServiceTests {
    void testEmailServiceWorks() {
        def values = [test: 'test', test2: 'test2']
        service.sendEmail(values)
    }
}
However, when I execute this unit test, it fails with this exception when it tries to do the as JSON conversion:
org.apache.commons.lang.UnhandledException: org.codehaus.groovy.grails.web.converters.exceptions.ConverterException: Unconvertable Object of class: java.util.LinkedHashMap
I then re-wrote my unit test to just do the following:
void testEmailServiceWorks() {
    def value = [test: 'test', test2: 'test2']
    def valueJson = value as JSON
}
And I get the same exception when it tries to do the as JSON conversion.
Does anyone know why I'm getting this exception, and how I can fix it?
Even though you are testing a service, you can apply the @TestMixin(ControllerUnitTestMixin) annotation to your test class to get Grails to set up the JSON converter.
The as JSON magic is created when the domain framework spins up.
You have to either change your test to an integration one or mock the asType.
def setUp() {
    java.util.LinkedHashMap.metaClass.asType = { Class c ->
        new grails.converters."$c"(delegate)
    }
}
Remember to clean up after yourself in the tearDown; you wouldn't want metaprogramming leaks in your test suite.
def tearDown() {
    java.util.LinkedHashMap.metaClass.asType = null
}
Edit:
If you come from the future, consider this answer: https://stackoverflow.com/a/15485593/194932
As of Grails 3.3.x, the grails-test-mixins plugin is deprecated; see the migration guide.
For this problem you should implement GrailsWebUnitTest, which comes from the Grails Testing Support Framework.
You can initialise the JSON converter in setUp(). There are various marshallers implementing ObjectMarshaller which need to be added to the ConverterConfiguration for JSON conversion to work.
http://grails.github.io/grails-doc/2.4.4/api/index.html?org/codehaus/groovy/grails/web/converters/marshaller/json/package-summary.html
Example:

DefaultConverterConfiguration<JSON> defaultConverterConfig = new DefaultConverterConfiguration<JSON>()
defaultConverterConfig.registerObjectMarshaller(new CollectionMarshaller())
defaultConverterConfig.registerObjectMarshaller(new MapMarshaller())
defaultConverterConfig.registerObjectMarshaller(new GenericJavaBeanMarshaller())
ConvertersConfigurationHolder.setTheadLocalConverterConfiguration(JSON.class, defaultConverterConfig) // method name spelling as in the Grails API
I just ran into this, and I really didn't want to implement GrailsWebUnitTest as recommended in another answer here. I want to keep my service test as "pure" and lean as possible. I ended up doing this:
void setupSpec() {
    defineBeans(new ConvertersGrailsPlugin())
}

void cleanupSpec() {
    ConvertersConfigurationHolder.clear()
}
This is how it happens under the hood when you implement GrailsWebUnitTest (via WebSetupSpecInterceptor and WebCleanupSpecInterceptor).
That said, the converters seem to be meant for use in the web tier, primarily for making it easy to transparently return data in different formats from a controller. It's worth considering why the service you're testing needs the converters in the first place.
For example, in my case, someone used the JSON converter to serialize some data to a string so it could be stored in a single field in the database. That doesn't seem like an appropriate use of the converters, so I plan on changing how it's done. Making the converters available in my service test is a temporary solution to let me improve our test coverage before I refactor things.
I was getting the same error when trying to unit test a controller that calls "render myMap as JSON". We use Grails 1.3.7 and none of the other solutions worked for me without introducing other problems. Upgrading Grails was not an alternative for us at the moment.
My solution was to use JSONBuilder instead of "as JSON", like this:
render(contentType: "application/json", {myMap})
See http://docs.grails.org/latest/guide/theWebLayer.html#moreOnJSONBuilder
(I realize this is old, but came here in search for a solution and so might others)
I have a method which works like this:
public void deploy(UserInput userInput) {
    if (userInput is wrong)
        return;
    // start deployment process
}
The userInput is validated by individual checks inside the deploy method. Now I'd like to JUnit-test whether the input checks behave correctly (i.e., whether the deployment process starts or not, depending on right or wrong user input). So I need to test this with both right and wrong user inputs. I could do this by checking whether anything was deployed at all, but in this case that is very cumbersome.
So I wonder if it's somehow possible to know, in the corresponding JUnit test, whether the deploy method was aborted or not (due to wrong user input)? (By the way, changing the deploy method is not an option.)
As you describe your problem, you can only check your method for side effects, or for whether it throws an exception. The easiest way to do this is with a mocking framework like JMockit or Mockito. You have to mock the first method called after the user-input checks have finished:
public void deploy(UserInput userInput) {
    if (userInput is wrong)
        return;
    // start deployment process
    startDeploy(); // mock this method
}
You can also extend the class under test and override startDeploy(), if that's possible. This avoids having to use a mocking framework.
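A sketch of that subclass-and-override approach ("Deployment" and "wrongUserInput" are hypothetical names, not from the question):

import static org.junit.Assert.assertFalse;
import org.junit.Test;

public class DeployAbortTest {

    // "Deployment" stands in for the class under test
    static class TestableDeployment extends Deployment {
        boolean deployStarted;

        @Override
        void startDeploy() {
            deployStarted = true; // record the call instead of really deploying
        }
    }

    @Test
    public void abortsOnWrongUserInput() {
        TestableDeployment target = new TestableDeployment();
        target.deploy(wrongUserInput()); // hypothetical helper building bad input
        assertFalse(target.deployStarted);
    }
}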
Alternative - Integration tests
It sounds like the deploy method is large and complex, and deals with files, file systems, external services (FTP), etc.
It is sometimes easier in the long run to accept that you're dealing with external systems and to test against those external systems. For instance, if deploy() copies a file to directory x, test that the file exists in the target directory. I don't know how complex deploy is, but mocking these methods can often be as hard as testing the actual behaviour. This may be cumbersome, but like most tests, it allows you to refactor your code so it is simpler to understand. If your goal is refactoring, then in my experience it's easier to refactor when you're testing actual behaviour rather than mocks.
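As an illustration only (the target directory, artifact name, and helper are invented for the example):

import static org.junit.Assert.assertTrue;

import java.nio.file.Files;
import java.nio.file.Path;

import org.junit.Test;

public class DeploySideEffectTest {

    @Test
    public void deployCopiesFileToTargetDirectory() throws Exception {
        Path targetDir = Files.createTempDirectory("deploy-test");
        UserInput input = inputDeployingTo(targetDir); // hypothetical helper

        new Deployment().deploy(input); // "Deployment" as above, hypothetical

        // assert on the observable side effect rather than internals
        assertTrue(Files.exists(targetDir.resolve("app.war")));
    }
}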
You could create a UserInput stub / mock with the correct expectations and verify that only the expected calls (and no more) were made.
However, from a design point of view, if you can split your validation and deployment logic into separate classes, then your code can be as simple as:
if (_validator.isValid(userInput)) {
    _deployer.deploy(userInput);
}
This way you can easily test that the deployer is never called when the validator returns false, and that it is called when the validator returns true (using a mocking framework such as jMock).
It will also enable you to test your validation and deployment code separately, avoiding the issue you're currently having.
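A sketch of that test with Mockito (jMock would look similar; all the type names are hypothetical stand-ins matching the snippet above):

import static org.mockito.Mockito.*;

import org.junit.Test;

public class DeploymentServiceTest {

    // hypothetical types mirroring the _validator/_deployer snippet
    interface Validator { boolean isValid(UserInput input); }
    interface Deployer { void deploy(UserInput input); }
    static class UserInput { }

    static class DeploymentService {
        private final Validator validator;
        private final Deployer deployer;

        DeploymentService(Validator validator, Deployer deployer) {
            this.validator = validator;
            this.deployer = deployer;
        }

        void deploy(UserInput input) {
            if (validator.isValid(input)) {
                deployer.deploy(input);
            }
        }
    }

    @Test
    public void deployerIsNotCalledWhenInputIsInvalid() {
        Validator validator = mock(Validator.class);
        Deployer deployer = mock(Deployer.class);
        UserInput input = new UserInput();
        when(validator.isValid(input)).thenReturn(false);

        new DeploymentService(validator, deployer).deploy(input);

        verify(deployer, never()).deploy(input);
    }
}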
I have a solution with multiple projects and one of these projects is my service class which calls into the persistence manager.
I would like to write a unit test as follows:
[Test]
public void Create_HappyPath_Success()
{
    // Arrange
    UnitOfMeasure unitOfMeasure = new UnitOfMeasure();
    unitOfMeasure.Code = "Some new unit of measure";
    unitOfMeasure.DataOwner = 1;

    // Act
    this.UoMService.Create(unitOfMeasure); // Fails here as UoMService is null

    // Assert something
}
Now, I'm getting a null reference exception on this line:
this.UoMService.Create(unitOfMeasure); // Fails here as UoMService is null
I believe that it's because Castle Windsor is not getting invoked, and hence the UoMService isn't getting instantiated. My Castle Windsor application installer is defined in another project, i.e. my ASP.NET MVC project. So my first question is whether it's possible to reuse that installer to run my unit tests.
Now to get around this problem, I created a new installer in my unit test project by linking to the installer in my web project. Then I used the following code in my set up:
[SetUp]
public void ControllersInstallerTests()
{
    this.containerWithControllers = new WindsorContainer();
    IoC.Initialize(this.containerWithControllers);
    this.containerWithControllers.Install(FromAssembly.This());
}
This time when I run the tests, I get the following error:
SetUp : Castle.Windsor.Configuration.Interpreters.XmlProcessor.ConfigurationProcessingException : Error processing node resource FileResource: [] []
----> Castle.Core.Resource.ResourceException : File C:\Projects\DavidPM\Services\MyProject.Services.ServiceImpl.Test.Unit\bin\Debug\Config\Windsor.config could not be found
The question is why is it looking in the bin\Debug folder?
As a newbie to Castle Windsor, I am not sure what I should be doing to hook it into my unit tests.
You should not be hooking up your IoC container in your unit tests. In production, your IoC container resolves dependencies; in unit tests, you create the dependencies as part of your tests, usually with a mocking framework, so you can test in isolation.
Make your config file copy to the output directory: in Visual Studio, set the Windsor.config file's "Copy to Output Directory" property to "Copy always" or "Copy if newer". The test runner executes from bin\Debug, which is why the file is being looked for there.