CDI testing with Arquillian - junit

3 days ago I completed the Arquillian "getting started" guide and decided that this would be a good thing to use for unit testing of my part of a CQRS system.
Unfortunately this has proved a little less than straightforward. I have googled for the past three days and the issue is not resolved by any of the solutions that have worked for others.
I am coming to the conclusion that the problem is with my code, though I don't see how.
My task is to write an event listener that listens to an ActiveMQ topic for events and then updates the "view" in a Mongo DB.
There will be many events in the system so it seemed reasonable to me to create an abstract base class that all event listeners extend.
This base class contains the Mongo client and registers to listen to the topic. It uses an overloaded getter for the listener name, which it uses as a bean reference in a Camel route. The listener client ID is generated from a static long which is incremented on each listener registration. This ensures that every listener gets to see every event posted to the topic. The intention is to later add a filter to reduce the number of events received.
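For illustration, a minimal sketch of the kind of base class described (the class name, the getter, and the Mongo/registration details are hypothetical, not the actual code):
import java.util.concurrent.atomic.AtomicLong;
import javax.inject.Inject;
import com.mongodb.MongoClient;

// Hypothetical sketch only; the real base class also registers itself on the ActiveMQ topic via a Camel route.
public abstract class AbstractEventListener {

    // Incremented on each registration so every listener gets a unique client ID
    private static final AtomicLong CLIENT_ID_SEQUENCE = new AtomicLong();

    private final long clientId = CLIENT_ID_SEQUENCE.incrementAndGet();

    @Inject
    protected MongoClient mongoClient; // used to update the "view" collection

    /** Supplied by each concrete listener; used as the bean reference in the Camel route. */
    protected abstract String getListenerName();

    /** Called for every event posted to the topic. */
    public abstract void onEvent(String eventBody);

    protected long getClientId() {
        return clientId;
    }
}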
I have built this code and driven it from a timer generating event topic posts and it all works fine.
The trouble with that is a quality requirement to have Cobertura report 80% code coverage from unit tests.
My test application is not a unit test so my code coverage is 0%.
I have come to Arquillian via a couple of other methods of unit testing in CDI but Arquillian does seem to be the best option if I could only get it to work.
The error I am getting is:
java.lang.IllegalStateException: Could not find beans for Type=class org.apache.deltaspike.core.impl.scope.window.WindowBeanHolder and qualifiers:[]
I have included DeltaSpike in the POM and I have added it to the ShrinkWrap deployment.
POM extract
<dependency>
    <groupId>org.apache.deltaspike.core</groupId>
    <artifactId>deltaspike-core-api</artifactId>
    <version>${deltaspike.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.deltaspike.core</groupId>
    <artifactId>deltaspike-core-impl</artifactId>
    <version>${deltaspike.version}</version>
</dependency>
<dependency>
    <groupId>org.jboss.shrinkwrap.resolver</groupId>
    <artifactId>shrinkwrap-resolver-impl-maven</artifactId>
    <version>2.0.0</version>
</dependency>
Test class
@RunWith(Arquillian.class)
public class ListenerTest {

    AbstractEventListener listener = null;
    WindowBeanHolder w = new WindowBeanHolder();

    @Deployment
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class)
                .addAsLibraries(Maven.resolver().loadPomFromFile("pom.xml")
                        .resolve("org.apache.deltaspike.core:deltaspike-core-api",
                                "org.apache.deltaspike.core:deltaspike-core-impl")
                        .withoutTransitivity().asFile())
                .addAsWebInfResource("beans.xml");
    }

    @Test
    public void testExecute() {
        Assert.assertNotNull(listener);
    }
}
My beans.xml:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       bean-discovery-mode="all">
</beans>
As you can see I have even tried adding the WindowBeanHolder to the code.
I have made many changes to the code over the last few days. I have not included the full POM etc. as it may not be needed, but I can add it if required.
If you have any suggestions as to where I can go from here, many thanks in advance.

Only the org.apache.deltaspike packages are needed:
return ShrinkWrap.create(WebArchive.class)
        .addClasses(HealthResource.class)
        .addPackages(true, "org.apache.deltaspike")
        .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");

Abstract the database layer by way of an interface. Provide a Mongo impl for production, but a unit-testable, stateful "dummy" impl for testing.
Have all your code refer to the interface and inject (using reflection if necessary) the dummy impl to test your code prior to running your unit tests.
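For example, a minimal sketch of that pattern (the interface and class names here are hypothetical, not from the question):
import java.util.HashMap;
import java.util.Map;

public interface EventViewStore {
    void upsert(String viewId, Map<String, Object> fields);
    Map<String, Object> find(String viewId);
}

// The production implementation would wrap the Mongo client; it is omitted here.

// Stateful in-memory "dummy" used by unit tests; no Mongo and no container required.
public class InMemoryEventViewStore implements EventViewStore {

    private final Map<String, Map<String, Object>> store = new HashMap<>();

    @Override
    public void upsert(String viewId, Map<String, Object> fields) {
        store.computeIfAbsent(viewId, k -> new HashMap<>()).putAll(fields);
    }

    @Override
    public Map<String, Object> find(String viewId) {
        return store.get(viewId);
    }
}
Your listeners then depend only on EventViewStore, and the tests assert against the in-memory store's contents.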

Related

Apache camel with spring DSL and Junit Coverage

I am completely new to Apache Camel.
I have got a basic understanding of it.
Now I am going through some videos and documents to get ideas for implementing JUnit test cases for an Apache Camel Spring DSL based Spring Boot application, but it's not clear to me, since there are many ways to implement them or they are only covered at a very high level.
I am confused about which one to follow and what is actually happening in those JUnit tests.
Does anyone have an example, link, or video that explains JUnit coverage for an Apache Camel Spring DSL based Spring Boot application?
I am particularly looking for JUnit tests.
Also, if you know of any good tutorials about Apache Camel, let me know.
JUnit and Camel don't work the same way as JUnit and "normal" code, and as far as I am aware there are only fairly rudimentary ways to get coverage of a Camel route from JUnit. Camel routes are a processing model that is essentially an in-memory model of the various steps that need to run, so you can't use code coverage tools to track which parts get executed.
Consider this route (in a subclass of RouteBuilder):
public void configure() throws Exception {
    from("jms:queue:zzz_in_document_q")
        .routeId("from_jms_to_processor_to_jms")
        .transacted()
        .log(LoggingLevel.INFO, "step 1/3: ${body}")
        .bean(DocBean.class)
        .log(LoggingLevel.INFO, "step 2/3 - now I've got this: ${body}")
        .process(new DocProcessor())
        .log(LoggingLevel.INFO, "step 3/3 - and finally I've got this: ${body}")
        .to("jms:queue:zzz_out_document_q");
}
and an associated test case, in a class that extends CamelBaseTestSupport:
@Test
public void testJmsAndDbNoInsert() throws Exception {
    long docCountBefore = count("select * from document");
    template.sendBody("jms:queue:zzz_in_document_q", new Long(100));
    Exchange exchange = consumer.receive("jms:queue:zzz_out_document_q", 5000);
    assertNotNull(exchange);
    Document d = exchange.getIn().getBody(Document.class);
    assertNotNull(d);
    long docCountAfter = count("select * from document");
    assertEquals(docCountAfter, docCountBefore);
}
When the unit test runs, the app context will run the configure method, so I've got 100% coverage of my route before I even put a message on the queue! Except I don't, because all it has done is create the execution model in the Camel route system so that the various components and processors will run in the right order.
Beans and Processors will get included in the coverage reports, but if you have complex logic in the routes themselves, it's not going to give you coverage of that.
There is this capability, delivered around 2017 (https://issues.apache.org/jira/browse/CAMEL-8657), but I haven't used it and am not sure how well it will work with whatever coverage tooling you use.

I am not able to store an entity using em.merge in Broadleaf

I am new to the Broadleaf application. I am able to run the application using the Tomcat + MySQL integration well. Now I want to move on with development and customize the site project as per my requirements.
I am stuck on persistence in the Broadleaf site module. I have tried using em.merge, which returns my entity but does not save it in the database, and I have also tried @Transactional(value="blTransactionManager"), but the problem still persists. I have tried the code below in applicationContext-servlet.xml:
<aop:config>
    <aop:pointcut id="blMyStoreDao" expression="execution(* com.mycompany.dao.StoreDaoImpl.save*(..))"/>
    <aop:advisor advice-ref="blTxAdvice" pointcut-ref="blMyStoreDao"/>
</aop:config>
Here is my controller code
newStore.setCustomer(customer);
newStore.setProductList(new ArrayList<ProductImpl>());
Store getStore=store.save(em, newStore);
System.out.println(getStore.getCustomer().getUsername());
System.out.println("customer fetched: "+customer.getEmailAddress());
Here is my DAO impl code:
@Repository("blMyStoreDao")
@Transactional(value = "blTransactionManager")
public class StoreDaoImpl implements StoreDao {

    @PersistenceContext(unitName = "blPU")
    protected EntityManager em;

    @Transactional(value = "blTransactionManager")
    public Store save(EntityManager em, Store store) {
        System.out.println(em);
        System.out.println(store.getCustomer().getUsername());
        Store s = em.merge(store);
        return s;
    }
}
But that also didn't resolve my issue.
The code runs as it should, but it doesn't save my entity in the database.
Any help is appreciated. Thanks in advance.
There isn't any reason to use <aop:config>, especially in applicationContext-servlet.xml (if anywhere, it should be in the root application context).
You should use @Transactional(TransactionUtils.DEFAULT_TRANSACTION_MANAGER) to annotate your method.
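For example, a sketch of that annotation applied to the question's save method (assuming Broadleaf's TransactionUtils constant, which resolves to the blTransactionManager qualifier; adjust the import to your Broadleaf version):
import org.broadleafcommerce.common.util.TransactionUtils;
import org.springframework.transaction.annotation.Transactional;

@Transactional(TransactionUtils.DEFAULT_TRANSACTION_MANAGER)
public Store save(Store store) {
    // merge returns the managed copy; the transaction manager named above commits it
    return em.merge(store);
}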
It is likely that your class was not being scanned by Spring. In Broadleaf, there is a default component scan set up in applicationContext.xml to scan com.mycompany.core.
I would recommend verifying that your DAO is actually scanned by Spring and initialized as a Spring bean. The fact that the entity manager did not get injected indicates that it did not get loaded by Spring correctly. One way to verify this would be to add a @PostConstruct method and print something or set a breakpoint to verify that it gets hit.
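A minimal sketch of that check, added to the StoreDaoImpl from the question:
import javax.annotation.PostConstruct;

@PostConstruct
public void verifySpringWiring() {
    // If this never prints (or a breakpoint here is never hit), the class is not being
    // picked up by Spring's component scan and em will stay null.
    System.out.println("StoreDaoImpl created by Spring, em = " + em);
}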

Class loading collision between Robolectric and Powermock

I'm trying to write a test that needs both Robolectric 2.2 and PowerMock, as the code under test depends on some Android libraries and third party libraries with final classes that I need to mock.
Given that I'm forced to use the Robolectric test runner through:
@RunWith(RobolectricTestRunner.class)
...I cannot use the PowerMock test runner, so I'm trying to go with the PowerMock java agent alternative, without luck so far.
I have set up everything according to this guide, but I'm facing a collision problem between classes required by the javaagent library and by Robolectric through its dependency on asm-1.4. Both depend on
org.objectweb.asm.ClassVisitor
but javaagent-1.5.1 ships with its own version where ClassVisitor is an interface, while the asm-1.4 version of the same class is an abstract class, with the corresponding error at runtime:
java.lang.IncompatibleClassChangeError: class org.objectweb.asm.tree.ClassNode has interface org.objectweb.asm.ClassVisitor as super class
I have even tried modifying the javaagent library jar to entirely remove the org.objectweb.asm classes in there, but that doesn't work, as a ClassNotFoundException happens afterwards due to some other classes in the org.objectweb.asm package that only ship in the javaagent library jar and not in the asm one.
Any ideas? According to examples out there the agent seems to work fine with, at least, the Spring test runner.
I had the same problem and while I didn't solve this problem as such, I wanted to share my approach, which removes the need for PowerMock (which is always a good thing in my view): I wanted to mock a call to
Fragment fooFragment = new FooFragment();
So what I did was add another level of indirection. I created a FragmentFactory class:
public FragmentFactory fragmentFactory = new FragmentFactory();
[...]
Fragment fooFragment = fragmentFactory.getFooFragment();
After I did this, I could just mock out the factory with standard Mockito, like this:
FragmentFactory mockFactory = mock(FragmentFactory.class);
activity.fragmentFactory = mockFactory;
when(mockFactory.getFooFragment()).thenReturn(mockFooFragment);
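For completeness, a minimal sketch of what that factory might look like (FooFragment is the answer's hypothetical example):
import android.app.Fragment; // or the support-library Fragment, matching your code

// The factory's only job is to construct fragments, so tests can swap it for a mock.
public class FragmentFactory {
    public Fragment getFooFragment() {
        return new FooFragment();
    }
}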

Liferay 6 Using Common Service Builder layer Error - BeanLocatorException - BeanLocator has not been set

We are trying to use Liferay Service Builder as a common layer for all our portlets. We have created a separate common portlet project where we are building the service using service.xml. This generates a service.jar file for us. We are copying this jar to each portlet's WEB-INF/lib dir.
When we run the portlet it throws the following error in the logs, and a "Portlet is temporarily unavailable" message is displayed on the portlet.
14:43:17,447 ERROR [jsp:154] com.liferay.portal.kernel.bean.BeanLocatorException: BeanLocator has not been set
at com.liferay.portal.kernel.bean.PortletBeanLocatorUtil.locate(PortletBeanLocatorUtil.java:40)
at com.cogs.common.service.CourseLocalServiceUtil.getService(CourseLocalServiceUtil.java:223)
at com.cogs.common.service.CourseLocalServiceUtil.getCoursesCount(CourseLocalServiceUtil.java:187)
at org.apache.jsp.jsps.course.course_005fview_jsp._jspService(course_005fview_jsp.java:542)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:377)
at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:313)
at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:260)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646)
at org.apache.catalina.core.ApplicationDispatcher.doInclude(ApplicationDispatcher.java:551)
at org.apache.catalina.core.ApplicationDispatcher.include(ApplicationDispatcher.java:488)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646)
I am sure that this approach should work seamlessly, but I found several people complaining about it on the Liferay forums and did not find any solution yet. Please let us know if you have found a way to use Service Builder as a common layer and it worked for you.
We are using Maven for building all portlet projects.
The Liferay version is 6.0.5.
And we are using Spring Portlet MVC for our portlet development.
You have to build the service AND deploy the portlet/hook that is required for your current portlet; you can find it by looking at its name in the liferay-plugin-package.properties file as:
required-deployment-contexts=[Portlet-Hook name]
I tried everything written on that page, but nothing worked for me until I added the project's version to the plugin name in the POM:
<configuration>
    <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
    <appServerDeployDir>${liferay.app.server.deploy.dir}</appServerDeployDir>
    <appServerLibGlobalDir>${liferay.app.server.lib.global.dir}</appServerLibGlobalDir>
    <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
    <liferayVersion>${liferay.version}</liferayVersion>
    <pluginType>portlet</pluginType>
    <pluginName>${project.artifactId}-${project.version}</pluginName>
</configuration>
and in liferay-plugin-package.properties:
artifactId-version-deployment-context=artifactId-version
for example:
portlet-sample-1.0-deployment-context=portlet-sample-1.0
where artifactId = portlet-sample
and version = 1.0
After all that, I built the services and redeployed my WAR.
I came to the solution because I debugged
com.liferay.portal.kernel.bean.PortletBeanLocatorUtil
where
BeanLocator beanLocator = getBeanLocator(servletContextName);
is called, which always returned null without the version number...
I hope this helps someone.
I had a hard time finding a solution to this error, so I will post what we did. The name of the portlet had changed, we rebuilt the service, and when running the portlet it threw the same error:
com.liferay.portal.kernel.bean.BeanLocatorException: BeanLocator has not been set for servlet context
In our case we had to remove the jar file ../docroot/WEB-INF/lib/portlet-service.jar.
We had a requirement to use something similar: have a portlet (let's say Source-portlet) whose services will be used by other portlets.
So we moved the generated sourceportlet-service.jar from the Source-portlet's WEB-INF/lib to the {tomcat_home}/lib/ext folder where other jars like portlet-service.jar etc. reside.
The downside of this approach is that whenever there is a change in the Source-portlet we need to restart the server.
If the other portlets are your custom plugin portlets, then another approach would be to copy the generated sourceportlet-service.jar to the other portlets' WEB-INF/lib. This approach does not work if you are using the service in a JSP hook.
Hope this helps.
The previous answer by Martin Gamulin is correct. If you have two separate web apps, one for Spring portlets and another with your Service Builder (which seems to be the correct way to do things in Liferay), then you need to ensure that your Spring portlets do not reference your ServiceBuilder classes during initialization.
If they do, then depending on the order in which your app server instantiates your webapps (and in Tomcat you can't specify a startup order), the BeanLocatorException will happen every time the portlets webapp deploys before the builder webapp.
In our case this meant moving an XxxLocalServiceUtil.createXxx(0) call from the constructor of the portlet Controller to the relevant methods.
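As an illustration, a rough before/after sketch of that change (XxxLocalServiceUtil and the controller name are placeholders, as in the answer):
// Before: the Liferay service is touched while the portlet webapp is starting up,
// which fails whenever this webapp deploys before the Service Builder webapp.
public class StoreController {
    private final Xxx prototype = XxxLocalServiceUtil.createXxx(0);
}

// After: defer the call until a request actually needs it; by then the
// BeanLocator for the service webapp has been set.
public class StoreController {
    public Xxx createPrototype() {
        return XxxLocalServiceUtil.createXxx(0);
    }
}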
I had a similar problem doing a Maven portlet.
First I did the portlet and then I put in the service.xml.
The issue was that the generator was looking for a portlet name that was not there.
I solved it by making explicit the portlet name I wanted the generator to look for.
In particular, to do this two POM nodes must be equal:
project.artifactId (Liferay creates a bean locator for this)
and
project.build.(liferay plugin).configuration.pluginName (the internal name of the portlet for the generator)
As an example, a small excerpt from my pom.xml:
<modelVersion>4.0.0</modelVersion>
<groupId>io.endeios</groupId>
<artifactId>ShowTheCats-portlet</artifactId><!-- ONE -->
<packaging>war</packaging>
<name>ShowTheCats Portlet</name>
<version>1.0-SNAPSHOT</version>
<build>
    <plugins>
        <plugin>
            <groupId>com.liferay.maven.plugins</groupId>
            <artifactId>liferay-maven-plugin</artifactId>
            <version>${liferay.maven.plugin.version}</version>
            <executions>
                <execution>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>build-css</goal>
                    </goals>
                </execution>
            </executions>
            <configuration>
                <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
                <appServerDeployDir>${liferay.app.server.deploy.dir}</appServerDeployDir>
                <appServerLibGlobalDir>${liferay.app.server.lib.global.dir}</appServerLibGlobalDir>
                <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
                <liferayVersion>${liferay.version}</liferayVersion>
                <pluginType>portlet</pluginType>
                <pluginName>ShowTheCats-portlet</pluginName><!-- TWO -->
            </configuration>
        </plugin>
ONE and TWO must be the same.
The problem with the BeanLocator in my Spring portlet, for me, was
that my portlet's Spring context was getting initialized before Liferay's Spring context was.
I was using
ClassName className = ClassNameLocalServiceUtil.getClassName(JournalArticle.class.getName());
in my constructor. Liferay's context was not loaded, hence the error. I moved that piece of code so it was called only when the first request needed it. Problem solved.
So do not depend on Liferay during initialization of your portlet; do some kind of "lazy" wiring of dependencies to Liferay.
Since you are using Maven, try to make sure your WAR name equals your portlet project name.
After debugging I found that ClpSerializer defines _servletContextName, which is equal to the <artifactId> of the WAR project. If you deploy an artifact named artifactId-1.0.0-snapshot.war the context will be created with that name, but the code generated by service-gen expects it to be artifactId. Verify with your ClpSerializer.
I also had the same problem. I placed the following line in the liferay-plugin-package.properties file of the portlets that use the service layer of the common portlet. It worked for me.
required-deployment-contexts=common-portlet
It's better to copy the service.jar file to tomcat/lib/ext instead of every portlet's WEB-INF/lib.
I did the following to solve the above problem:
Set the plugin config property pluginName in pom.xml to the correct context
<plugin>
    <groupId>com.liferay.maven.plugins</groupId>
    <artifactId>liferay-maven-plugin</artifactId>
    <version>${liferay.version}</version>
    <configuration>
        <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
        <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
        <liferayVersion>${liferay.version}</liferayVersion>
        <pluginType>portlet</pluginType>
        <pluginName>XXXX-portlet</pluginName>
    </configuration>
</plugin>
Optionally set the XXXX-portlet-deployment-context property in the Liferay plugin properties file or the portlet.properties file:
XXXX-portlet-deployment-context=XXXX-portlet
Re-build the services.
Verify that the generated ClpSerializer.java contains the correct contexts:
public static String getServletContextName() {
    if (Validator.isNotNull(_servletContextName)) {
        return _servletContextName;
    }

    synchronized (ClpSerializer.class) {
        if (Validator.isNotNull(_servletContextName)) {
            return _servletContextName;
        }

        try {
            ClassLoader classLoader = ClpSerializer.class.getClassLoader();

            Class<?> portletPropsClass = classLoader.loadClass(
                    "com.liferay.util.portlet.PortletProps");

            Method getMethod = portletPropsClass.getMethod("get",
                    new Class<?>[] { String.class });

            String portletPropsServletContextName = (String) getMethod.invoke(null,
                    "XXXX-portlet-deployment-context");

            if (Validator.isNotNull(portletPropsServletContextName)) {
                _servletContextName = portletPropsServletContextName;
            }
        } catch (Throwable t) {
            if (_log.isInfoEnabled()) {
                _log.info(
                    "Unable to locate deployment context from portlet properties");
            }
        }

        if (Validator.isNull(_servletContextName)) {
            try {
                String propsUtilServletContextName = PropsUtil.get(
                        "XXXX-portlet-deployment-context");

                if (Validator.isNotNull(propsUtilServletContextName)) {
                    _servletContextName = propsUtilServletContextName;
                }
            } catch (Throwable t) {
                if (_log.isInfoEnabled()) {
                    _log.info(
                        "Unable to locate deployment context from portal properties");
                }
            }
        }

        if (Validator.isNull(_servletContextName)) {
            _servletContextName = "upay-portlet";
        }

        return _servletContextName;
    }
}
Deploy the WAR, and verify the WAR name and the logs for the correct context name.

Android JUnit: Define a different Application subclass

So for my normal Android project, I have the following in AndroidManifest.xml:
<application android:name=".utilities.App" ...>
....
</application>
And then I have my App class:
public class App extends Application {
....
}
And then I have an Android JUnit test project associated with the Android project. Everything is all fine and dandy and I can write JUnit tests. However, I'm trying to run code coverage with my JUnit tests and I'm getting bloated results. The reason is that my App class gets called and initialized as if my application were actually started. I do not want my custom App class to execute when I run the JUnit tests or code coverage. Any setup I need for the JUnit tests will go in the appropriate JUnit setup() method. Is there any way I can prevent it from executing my custom App class, or a way that any classes/methods/lines that are executed due to the creation of my App class aren't counted towards the code coverage?
A temporary solution that I've found will work, unless someone has better ideas:
1. Go into the main Android project's AndroidManifest.xml.
2. Change the android:name attribute from ".utilities.App" to "android.app.Application".
3. Run the code coverage utility/JUnit tests.
4. Change the android:name attribute back from "android.app.Application" to ".utilities.App".
5. Re-deploy the app onto the device (so that it uses the right Application class when it runs outside the code coverage/JUnit tests).
I'm sure the real solution is to automate this process, but I'm too lazy to do so, and it just feels hackish and wrong. But at least it's a workaround unless someone has any ideas.