I'm taking my first steps with Camel and am currently working on a simple JUnit test that uses JMS as a transport.
Here is the code I wrote:
public class FirstMockTest extends CamelTestSupport {

    @Override
    protected RoutesBuilder createRouteBuilder() throws Exception {
        return new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("jms:topic:quote")
                    .to("mock:quote");
            }
        };
    }

    @Test
    public void testMessageCount() throws InterruptedException {
        MockEndpoint mockEndpoint = getMockEndpoint("mock:quote");
        mockEndpoint.setExpectedMessageCount(1);

        template.sendBody("jms:topic:quote", "Camel rocks");

        mockEndpoint.assertIsSatisfied();
    }
}
Because of the missing connectionFactory I got the following exception:
org.apache.camel.FailedToCreateRouteException: Failed to create route route1: Route(route1)[[From[jms:topic:quote]] -> [To[mock:quote]]] because of connectionFactory must be specified
I'm able to fix it by adding the following lines to my route:
ConnectionFactory connectionFactory =
        new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
context.addComponent("jms", JmsComponent.jmsComponent(connectionFactory));
But I don't like adding components to my context inside the route. Also, if I want another route, I will need to do it again.
Obviously, there should be another way to tell my test about the connection factory.
Thank you in advance!
It's a good idea to define the JMS connection factory outside of your Camel context and, if possible, reuse it. How to do that depends on your component model / execution environment.
If you're using a Java SE version that supports CDI, that would be an obvious choice. You'd define your JMS connection factory as a named component once and inject it everywhere you need it. Have a look at http://camel.apache.org/cdi.html and for testing support at http://camel.apache.org/cdi-testing.html
If you're using Spring, define your connection factory as a Spring bean and inject it wherever you need it.
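For the Spring case, a minimal Java-config sketch could look like this (assuming ActiveMQ and camel-jms are on the classpath; the class and bean names are only illustrations). Camel resolves a component bean named "jms" from the registry, so your routes can keep using the "jms:" URIs:

import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.camel.component.jms.JmsComponent;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JmsConfig {

    @Bean
    public ConnectionFactory jmsConnectionFactory() {
        // defined once, outside any route, so every route (and test) can reuse it
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }

    @Bean
    public JmsComponent jms(ConnectionFactory jmsConnectionFactory) {
        // the bean name "jms" is what the routes refer to as the "jms:" component
        return JmsComponent.jmsComponent(jmsConnectionFactory);
    }
}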
If you're using Java EE on an application server, you'd usually define the JMS connection factory using the mechanisms of that app server. You'd then look up the JMS connection factory using JNDI.
If you're running in an OSGi container, you should define the JMS connection factory in its own bundle and export it as an OSGi service. In the bundle of your Camel context, import that OSGi service and inject it into the Camel context.
In all above cases you should consider using a pooled JMS connection factory.
For CDI, Spring and OSGi, have a look at: http://activemq.apache.org/maven/5.14.5/apidocs/org/apache/activemq/jms/pool/PooledConnectionFactory.html
For Java EE, the way to set pooling parameters depends on your app server.
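For the CDI, Spring and OSGi cases, wrapping the plain ActiveMQ connection factory in the pooled variant might look roughly like this (a sketch only; the broker URL, pool size and the way the component is registered are illustrative):

ActiveMQConnectionFactory amqConnectionFactory =
        new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");

// wrap it in the pooled variant from activemq-jms-pool
PooledConnectionFactory pooledConnectionFactory = new PooledConnectionFactory();
pooledConnectionFactory.setConnectionFactory(amqConnectionFactory);
pooledConnectionFactory.setMaxConnections(8); // illustrative pool size, tune for your load

context.addComponent("jms", JmsComponent.jmsComponent(pooledConnectionFactory));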
Note of caution: for Java SE CDI and Spring there should be only one Camel context per application (you can have many routes, though). So if the JMS connection factory is only used in that one Camel context, there is not much reuse. Despite that, I still think it's preferable to define the JMS connection factory outside of the Camel context in a separate component. It's, well, cleaner.
Since you are writing a JUnit test, you can avoid creating a ConnectionFactory altogether if you stub the JMS endpoint. You can name the endpoint stub:jms:topic:quote. Have a look at the sample at https://github.com/camelinaction/camelinaction2/blob/master/chapter9/mock/src/test/java/camelinaction/FirstMockTest.java
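A sketch of how the test from the question could look with the stubbed endpoint, following the linked example (no broker, JMS component or ConnectionFactory needed):

public class FirstMockTest extends CamelTestSupport {

    @Override
    protected RoutesBuilder createRouteBuilder() throws Exception {
        return new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                // stub: simulates the JMS endpoint, so no ConnectionFactory is required
                from("stub:jms:topic:quote")
                    .to("mock:quote");
            }
        };
    }

    @Test
    public void testMessageCount() throws InterruptedException {
        MockEndpoint mockEndpoint = getMockEndpoint("mock:quote");
        mockEndpoint.setExpectedMessageCount(1);

        template.sendBody("stub:jms:topic:quote", "Camel rocks");

        mockEndpoint.assertIsSatisfied();
    }
}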
Related
I have recently upgraded our project to Spring Boot 2. The app is just a REST API, and now all our 400 and 500 responses are being returned as HTML instead of JSON.
I am defining a custom ErrorAttributes, just like the docs say to do.
@Configuration
public class WebConfig implements WebMvcConfigurer {
    ...
    @Bean
    public ErrorAttributes errorAttributes() {
        return new DefaultErrorAttributes() {
            @Override
            public Map<String, Object> getErrorAttributes(WebRequest webRequest,
                                                          boolean includeStackTrace) {
                Map<String, Object> errorAttributes = super.getErrorAttributes(webRequest, true);
                return errorAttributes;
            }
        };
    }
    ...
I would like to debug this issue locally, but I cannot find where in the code Spring Boot decides to produce a JSON response for errors. The docs here: https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-error-handling say:
For machine clients, it produces a JSON response with details of the error, the HTTP status, and the exception message.
I'm thinking that I must have a bean defined locally that is causing this not to be configured correctly by the Spring Boot auto-configuration.
I finally figured this out. I think there were some changes between Spring Security 4 and Spring Security 5 that were causing an NPE early in the filter chain for our app. Compounding the difficulty of debugging the issue, with the Spring Boot upgrade the /error route was now forced to be authenticated.
I ended up fixing the NPE, allowing everyone to access the /error mapping, and then making sure ErrorMvcAutoConfiguration was being initialized correctly. All is working now.
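For reference, opening up the /error mapping could look roughly like this in a Spring Security 5 configuration (a sketch only; the class name and the other rules are assumptions, not the poster's actual setup):

import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;

@Configuration
@EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter {

    @Override
    protected void configure(HttpSecurity http) throws Exception {
        http.authorizeRequests()
                .antMatchers("/error").permitAll() // let machine clients reach the JSON error body
                .anyRequest().authenticated()
            .and()
            .httpBasic(); // assumption: some authentication mechanism is in place
    }
}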
I have a dead simple FeignClient interface that I would like to "unit"/integration test with a fake HTTP server, WireMock for example. The idea is to test the mapping with a sampled HTTP API response, without configuring a whole Spring Boot/Cloud Context.
@FeignClient(name = "foo", url = "${foo.url}")
public interface FooClient {

    @RequestMapping(value = "/foo/{foo-id}/bar", method = RequestMethod.GET)
    public Bar getBar(@PathVariable("foo-id") String fooId);
}
Is there any way to programmatically instantiate this interface, like a Spring Data repository through a *RepositoryFactoryBean?
I see a FeignClientFactoryBean in the source code, but it is package protected, and it relies on an ApplicationContext object to retrieve its dependencies anyway.
Well, you can fake a real REST client with WireMock for testing purposes, but that is more about functionally testing that the Feign clients themselves work. That is mostly not what you really want to test; the actual need is to test that the components using your client behave in the specified way.
For me the best practice is not to make life hard by maintaining a fake server, but to mock the client's behavior with Mockito. If you use Spring Boot 1.4.0, here is the way to go:
Suppose you have some FooBarService, which internally uses your FooClient in FooBarService::someAction(String fooId) to perform business logic that needs to work with the foo with the given id:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = App.class)
public class FooUnitTest {

    @Autowired
    private FooBarService fooBarService;

    @MockBean
    private FooClient fooClient;

    @Test
    public void testService() {
        given(fooClient.getBar("1")).willReturn(new Bar(...));

        fooBarService.someAction("1");

        // assert here that someAction did what it is supposed to do for that bar
    }
}
At this point you should first specify what you expect the REST client to return when asked for "/foo/1/bar", by mocking exactly that case and giving back the Bar object you expect to receive from that API, and then assert that your application ends up in the desired state.
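If it helps, the interaction itself can also be verified. A sketch of a second test method in the same class (assuming static imports of BDDMockito.given/then and Mockito.mock; mocking Bar avoids guessing its constructor):

    @Test
    public void testServiceCallsClient() {
        Bar bar = mock(Bar.class); // stand-in Bar, since its constructor is not shown here
        given(fooClient.getBar("1")).willReturn(bar);

        fooBarService.someAction("1");

        // verify that someAction() really asked the client for foo "1"
        then(fooClient).should().getBar("1");
    }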
I'm trying to use spring-test (3.2.10) and integration tests with TestNG, following this link.
I created RootTest.java
@WebAppConfiguration
@ContextConfiguration("file:src/test/resources/root-context2.xml")
public class ReferenceServiceTest extends AbstractTestNGSpringContextTests {
...
The Spring context loads successfully, but my global variables are not instantiated because web.xml is ignored. In web.xml I have my own "listener-class" (an implementation of ServletContextListener) and "context-param" entries. How can I load the web.xml context (and have all application startup listeners called) in a Spring integration test context?
As stated in the reference manual, the Spring MVC Test Framework...
"loads the actual Spring configuration through the TestContext framework and always uses the DispatcherServlet to process requests, thus approximating full integration tests without requiring a running Servlet container."
The key point there is "without ... a Servlet container". Thus web.xml does not come into the picture here. In other words, there is no way for configuration in web.xml to have an effect on integration tests that use the Spring MVC Test Framework.
Now, having said that, it is possible to register a Servlet Filter with MockMvc like this:
mockMvcBuilder.addFilters(myServletFilter);
or
mockMvcBuilder.addFilters(myResourceFilter, "/resources/*");
And you can configure context-param entries by adding them manually to the ServletContext (which is actually Spring's MockServletContext) before you execute assertions on MockMvc like this:
wac.getServletContext().setInitParameter(name, value);
But... there is no way to configure a ServletContextListener using Spring MVC Test. If you want to have a listener applied to all of your requests that pass through Spring MVC, as an alternative you could consider implementing a custom HandlerInterceptor or WebRequestInterceptor (see Configuring interceptors in the reference manual).
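A minimal sketch of that interceptor alternative with the Spring 3.x API (the class name and the work done in preHandle are illustrative assumptions):

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.springframework.web.servlet.handler.HandlerInterceptorAdapter;

public class StartupParamInterceptor extends HandlerInterceptorAdapter {

    @Override
    public boolean preHandle(HttpServletRequest request,
                             HttpServletResponse response,
                             Object handler) throws Exception {
        // do the per-request work that the ServletContextListener used to set up
        return true; // continue with the handler chain
    }
}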
Regards,
Sam
Try with a MockServletContext:
@Before
public void before() {
    MockServletContext mockServletContext = new MockServletContext();
    mockServletContext.setInitParameter("parameterName", "parameterValue");
    new MyListenerClass().contextInitialized(new ServletContextEvent(mockServletContext));
}
I have a Spring MVC app that is running fine on local Tomcat etc. It's a Spring 3.1 MVC/Hibernate app.
I am using (where possible) pure Java @Configuration for the app, and I am now trying to deploy the app to CloudFoundry (via STS), but I am struggling to get the MySQL db configured (from memory, with XML config you don't need to do anything and Spring/CloudFoundry auto-injects the required user/password etc., but it's been a while since I deployed anything to CF).
I have tried both of the following configurations:
@Bean
public BasicDataSource dataSource() throws PropertyVetoException {
    // CloudFoundry config
    final CloudEnvironment cloudEnvironment = new CloudEnvironment();
    final List<MysqlServiceInfo> mysqlServices = cloudEnvironment.getServiceInfos(MysqlServiceInfo.class);
    final MysqlServiceInfo serviceInfo = mysqlServices.get(0);

    BasicDataSource bean = new BasicDataSource();
    bean.setDriverClassName("com.mysql.jdbc.Driver");
    bean.setUrl(serviceInfo.getUrl());
    bean.setUsername(serviceInfo.getUserName());
    bean.setPassword(serviceInfo.getPassword());
    return bean;
}
The above failed with an out-of-bounds exception on the .get(0) line of the mysqlServices list. This was based on the answer suggested here.
I also tried leaving the datasource configured the same way as when it runs locally, to see if the properties just get injected, but no luck there either. (The below was tried with the values as per the Spring sample code here, and also using property placeholders from my db.connection props file.)
@Bean
public BasicDataSource dataSource() throws PropertyVetoException {
    BasicDataSource bean = new BasicDataSource();
    bean.setDriverClassName("com.mysql.jdbc.Driver");
    bean.setUrl("");
    bean.setUsername("spring");
    bean.setPassword("spring");
    return bean;
}
Edit
I have also used the getServiceInfo(String, Class) method, passing in the name of the MySQL service that I have created and bound to the application, but that just throws an NPE, similar to the getServiceInfos(..) approach.
OK, this was just a stupid mistake: when I deployed the app via STS I had selected the "Java Web" application type rather than "Spring". Not sure why that would make the CloudEnvironment properties unavailable (I was under the impression that this approach was the common method to inject the details into non-Spring apps), but re-deploying it to the server as a Spring app resolved the problem!
I'm trying to use the EJB 3.1 embeddable EJBContainer on Glassfish 3.1 for integration testing my EJBs. There's a classloading issue I can't figure out.
My EJBs are built into dum-ejb.jar. They use EclipseLink JPA. I also create an EJB client jar, dum-ejb-client.jar, while attempting to fight the classloading issues. The client jar contains the EJB interfaces and the entity classes (which are usually parameters or return values). It also contains a lot of unneeded classes that could be dropped (but I don't see how that would solve the problem).
The problem is that since EclipseLink does bytecode weaving on the entity classes, the entity classes must not be on the classpath when the JUnit tests are run: http://www.java.net/forum/topic/glassfish/glassfish/embedded-glassfish-and-weaving
I can do that and configure the classpath so that dum-ejb.jar is not included. If I use EJBContainer so that I look up my service as a java.lang.Object and call its methods via reflection, the test works. But of course, that's not how I want to write my tests.
A typical test would look like:
@Test
public void testInEJBContainer() throws Exception {
    File ejbJarFile = new File("target/dum/dum-ejb.jar");

    Map<String, Object> props = new HashMap<String, Object>();
    props.put("org.glassfish.ejb.embedded.glassfish.instance.root",
            "target/classes/instance-root");
    props.put(EJBContainer.MODULES, new File[]{ejbJarFile});

    EJBContainer container = EJBContainer.createEJBContainer(props);
    CompanyService service = (CompanyService)
            container.getContext().lookup("java:global/dum/CompanyServiceImpl");
    log.info("result of findAll() " + service.findAll(false));
}
How could I run the test if the CompanyService interface and the returned Company entity classes cannot be on the classpath?
Even if dum-ejb.jar is not on the classpath and dum-ejb-client.jar is, EclipseLink weaving gets broken.
Isn't this exactly the typical use case for EJBContainer? Shouldn't there be a simple solution to this?
It turns out I ran into classloading problems because I was running the EJBContainer from the Maven EAR project.
When I run it from the Maven EJB project itself, there are no such issues and EJBContainer is easy to use.