I have a large number of tests and want to see on stdout/stderr (when triggered via Maven) which test method is currently being executed.
I do not want or need that inside the test itself, but rather want to follow what is going on -- a little bit like IntelliJ does with the circles that turn red or green.
Is there a command line option to do this, or do I have to write my own test runner that does this for me?
Your best option is probably a RunListener, which you can plug into Maven:
A RunListener listens to test events such as test start, test end, test failure, and test success:
public class RunListener {
    public void testRunStarted(Description description) throws Exception {}
    public void testRunFinished(Result result) throws Exception {}
    public void testStarted(Description description) throws Exception {}
    public void testFinished(Description description) throws Exception {}
    public void testFailure(Failure failure) throws Exception {}
    public void testAssumptionFailure(Failure failure) {}
    public void testIgnored(Description description) throws Exception {}
}
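For the use case in the question -- printing each test method as it runs -- a listener only needs to override a couple of these callbacks. A minimal sketch (the class name matches the com.mycompany.MyResultListener value used in the configuration below, but is otherwise arbitrary):

package com.mycompany;

import org.junit.runner.Description;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

public class MyResultListener extends RunListener {

    @Override
    public void testStarted(Description description) {
        // Printed as soon as Surefire starts the test method
        System.out.println("STARTED  " + description.getClassName() + "." + description.getMethodName());
    }

    @Override
    public void testFailure(Failure failure) {
        System.out.println("FAILED   " + failure.getDescription());
    }

    @Override
    public void testFinished(Description description) {
        System.out.println("FINISHED " + description.getClassName() + "." + description.getMethodName());
    }
}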
Then, in surefire, you can specify a custom listener, using:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.10</version>
  <configuration>
    <properties>
      <property>
        <name>listener</name>
        <value>com.mycompany.MyResultListener,com.mycompany.MyResultListener2</value>
      </property>
    </properties>
  </configuration>
</plugin>
This is from the Maven Surefire Plugin documentation, "Using JUnit -- Using custom listeners and reporters".
What Matthew wrote works for JUnit; TestNG has a similar interface as well.
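For TestNG, the rough equivalent is an org.testng.ITestListener; a minimal sketch using the TestListenerAdapter convenience base class might look like this (it can be registered through the same listener property shown above):

package com.mycompany;

import org.testng.ITestResult;
import org.testng.TestListenerAdapter;

public class MyTestNGListener extends TestListenerAdapter {

    @Override
    public void onTestStart(ITestResult result) {
        System.out.println("STARTED  " + result.getTestClass().getName() + "." + result.getMethod().getMethodName());
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        System.out.println("PASSED   " + result.getMethod().getMethodName());
    }

    @Override
    public void onTestFailure(ITestResult result) {
        System.out.println("FAILED   " + result.getMethod().getMethodName());
    }
}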
I'm using TestNG, and setting the verbosity high does the job for me:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>${maven-surefire-plugin.version}</version>
      <configuration>
        <properties>
          <property>
            <name>surefire.testng.verbose</name>
            <value>10</value>
          </property>
        </properties>
      </configuration>
    </plugin>
  </plugins>
</build>
It is documented here http://maven.apache.org/surefire/maven-surefire-plugin/examples/testng.html
The verbosity level is between 0 and 10 where 10 is the most detailed.
You can specify -1 and this will put TestNG in debug mode (no longer
slicing off stack traces and all). The default level is 0.
It has also been answered here Set TestNG's verbosity level from Maven
The output then contains lines like the following as the tests progress:
[TestNG] INVOKING: "Surefire test" -
org.example.MyTest.myMethod(java.lang.String,
java.lang.String)(value(s) ...
The @BeforeClass and @AfterClass methods are mentioned as well.
I have an HTTPRepository initialised with the URL of the repository. I use a RepositoryConnection to retrieve and add (weather) data to the repository. The data is retrieved from a web service, then transformed into RDF statements, and added to the repository. This is done periodically by a stand-alone application.
When I run this application within IntelliJ, everything works fine.
To run this application on a server I created a jar file (containing all dependencies). The application starts as expected and is able to retrieve data from the repository.
However, when the application tries to write data to the repository I get an UnsupportedRDFormatException:
org.eclipse.rdf4j.rio.UnsupportedRDFormatException: Did not recognise RDF format object BinaryRDF (mimeTypes=application/x-binary-rdf; ext=brf)
at org.eclipse.rdf4j.rio.Rio.lambda$unsupportedFormat$0(Rio.java:568) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at java.util.Optional.orElseThrow(Optional.java:290) ~[na:1.8.0_111]
at org.eclipse.rdf4j.rio.Rio.createWriter(Rio.java:134) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.rio.Rio.write(Rio.java:371) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.rio.Rio.write(Rio.java:324) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.addModel(HTTPRepositoryConnection.java:588) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.flushTransactionState(HTTPRepositoryConnection.java:662) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.commit(HTTPRepositoryConnection.java:326) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.conditionalCommit(AbstractRepositoryConnection.java:366) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.add(AbstractRepositoryConnection.java:431) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at nl.wur.fbr.data.weather.WeatherApp.retrieveData(WeatherApp.java:122) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at nl.wur.fbr.data.weather.WeatherData$WeatherTask.run(WeatherData.java:105) [weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at java.util.TimerThread.mainLoop(Timer.java:555) [na:1.8.0_111]
at java.util.TimerThread.run(Timer.java:505) [na:1.8.0_111]
The source code in which the error occurs is:
public void retrieveData(){
    logger.info("Retrieving data for weather for app: "+ID+" ");
    RepositoryConnection connection = null;
    ValueFactory vf = SimpleValueFactory.getInstance();
    try {
        connection = repository.getConnection();
        // Retrieving the locations from the repository (no problem here).
        List<Location> locations = this.retrieveLocations(connection);
        List<Statement> statements = new ArrayList<>();
        // Retrieving weather data from each location and transforming it to statements.
        for(Location location : locations){
            List<Weather> retrievedWeather = weatherService.retrieveWeatherData(location.name,location.latitude,location.longitude);
            for(Weather weather : retrievedWeather){
                BNode phenomenon = vf.createBNode();
                statements.add(vf.createStatement(location.ID,WEATHER.HAS_WEATHER,phenomenon,rdfStoreGraph));
                statements.addAll(weather.getStatements(phenomenon,vf,rdfStoreGraph));
                statements = this.correctOMIRIs(statements,vf);
            }
        }
        // Adding data retrieved from the weather API
        // This is where the exception happens.
        connection.add(statements,rdfStoreGraph);
    } catch (Exception e) {
        logger.error("Could not retrieve data for weather app: '"+ID+"' because no monitor locations could be found.",e);
    } finally {
        if(connection != null){
            connection.close();
        }
    }
}
The HTTPRepository is initialised like so:
repository = new HTTPRepository(rdfStore.toString());
((HTTPRepository)repository).setPreferredRDFFormat(RDFFormat.BINARY);
((HTTPRepository)repository).setPreferredTupleQueryResultFormat(TupleQueryResultFormat.BINARY);
I've tried changing the formats to TURTLE. But it makes no difference.
Can you tell me how to solve this?
NB. Both the RDF4J server and library have version 2.0.1 (rdf4j).
To run this application on a server I created a jar file (containing all dependencies).
There's your problem: you created a "fat jar" and probably haven't properly merged SPI registry files.
RDF4J's Rio parsers (and several other modules as well) use Java's Service Provider Interface (SPI) mechanism to register themselves. This mechanism relies on a text file in META-INF/services in the jar file containing the fully-qualified name of each parser implementation.
The problem comes when you merge jars: each Rio parser jar has a registry file with the same name, but different contents. If you are using something like the maven assembly plugin to create the fat jar, each registry file gets overwritten by the next one. The consequence is that at the end, RDF4J can only find one parser - the one whose registry file was added last to the fat jar.
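A quick way to confirm that this is what happened is to list, at runtime, which writer formats are actually registered inside the fat jar. This is only a diagnostic sketch, assuming RDF4J 2.x on the classpath:

import org.eclipse.rdf4j.rio.RDFFormat;
import org.eclipse.rdf4j.rio.RDFWriterRegistry;

public class RioFormatCheck {
    public static void main(String[] args) {
        // With a correctly assembled jar this lists all bundled writer formats
        // (Binary, Turtle, N-Triples, ...); with a broken fat jar only the one
        // whose registry file survived the merge will show up.
        for (RDFFormat format : RDFWriterRegistry.getInstance().getKeys()) {
            System.out.println(format.getName());
        }
    }
}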
The solution is to either not create a fat jar at all, or if you must, use a different technique to create it, which merges the registry files rather than overwriting them. The maven shade plugin has a good config option for this: the ServicesResourceTransformer.
I am resurrecting this post because I was stuck on this for several hours. In the end I was able to generate an executable jar by using the maven-shade-plugin with the following configuration:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <configuration>
    <filters>
      <filter>
        <artifact>*:*</artifact>
        <excludes>
          <exclude>META-INF/*.SF</exclude>
          <exclude>META-INF/*.DSA</exclude>
          <exclude>META-INF/*.RSA</exclude>
        </excludes>
      </filter>
    </filters>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>${fully.qualified.main.class}</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
I used the shade plugin with the ManifestResourceTransformer to create the executable jar and indicate the main class of my project, and with the ServicesResourceTransformer to merge the RDF4J service registry files so that one parser does not override the previous one. I also had to include the filter section to avoid signature verification errors caused by signed dependency jars.
I hope this is useful for someone.
Greetings.
I intend to annotate some of my JUnit4 tests with a @Category annotation. I would then like to exclude that category of tests from running in my Maven builds.
I see from the Maven documentation that it is possible to specify a category of tests to run. I want to know how to specify a category of tests not to run.
Thanks!
You can do this as follows:
Your pom.xml should contain the following setup.
<configuration>
  <excludes>
    <exclude>**/TestCircle.java</exclude>
    <exclude>**/TestSquare.java</exclude>
  </excludes>
</configuration>
If you want regex support just use
<excludes>
<exclude>%regex[.*[Cat|Dog].*Test.*]</exclude>
</excludes>
http://maven.apache.org/surefire/maven-surefire-plugin/examples/inclusion-exclusion.html
If you want to use the @Category annotation, you need to do the following:

@Category(com.myproject.annotations.Exclude.class)
@Test
public void testFoo() {
    ....
}
In your maven configuration, you can have something like this
<configuration>
<excludedGroups>com.myproject.annotations.Exclude</excludedGroups>
</configuration>
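The category referenced above is just an ordinary marker type that has to be present on the test classpath; a minimal sketch:

package com.myproject.annotations;

// Marker interface used only as a JUnit category; it has no members.
public interface Exclude {
}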
I have the following configuration:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.6</version>
  <executions>
    <execution>
      <id>analyze</id>
      <goals>
        <goal>analyze-only</goal>
      </goals>
      <configuration>
        <failOnWarning>false</failOnWarning>
      </configuration>
    </execution>
    <!--Copy the dependencies so ant build has the same versions-->
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.basedir}/lib</outputDirectory>
        <overWriteIfNewer>true</overWriteIfNewer>
        <stripVersion>true</stripVersion>
        <overWriteReleases>false</overWriteReleases>
        <overWriteSnapshots>true</overWriteSnapshots>
        <excludeTransitive>false</excludeTransitive>
      </configuration>
    </execution>
  </executions>
</plugin>
The above configuration dumps everything into the same folder. I tried excluding the test scope by adding the test configuration, but it gives an error:
Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.6:copy-dependencies (copy-dependencies) on project pcgen: Can't exclude Test scope, this will exclude everything.
Is there a way to separate test dependencies from the rest so I can copy to different folders?
I tried excluding the test scope by adding the test configuration, but it gives an error
I just stumbled across this, probably for very different reasons, but I think I found us both the answer. Try this, for example. You'll need pom.xml in the current directory, of course.
mvn dependency:copy-dependencies \
-DincludeScope=runtime \
-DexcludeScope=provided \
-DoutputDirectory=target/war/WEB-INF/lib
A huge belated thanks to Brian Fox, who writes on Maven Dependency Plugin Issue #128:
You shouldn't ever need to include or exclude two scopes at the same time because they are comprised of each other. The default is to include test scope, which includes everything. If you don't want any test dependencies or provided dependencies, then include runtime and exclude provided.
The scopes being interpreted are the scopes as maven sees them, not as specified in the pom. So the "test" scope includes everything, runtime includes compile but not provided etc.
In May 2013, the includeScope documentation was updated to:
/**
 * Scope to include. An Empty string indicates all scopes (default).
 * The scopes being interpreted are the scopes as
 * Maven sees them, not as specified in the pom. In summary:
 * <ul>
 * <li><code>runtime</code> scope gives runtime and compile dependencies,</li>
 * <li><code>compile</code> scope gives compile, provided, and system dependencies,</li>
 * <li><code>test</code> (default) scope gives all dependencies,</li>
 * <li><code>provided</code> scope just gives provided dependencies,</li>
 * <li><code>system</code> scope just gives system dependencies.</li>
 * </ul>
 *
 * @since 2.0
 */
@Parameter( property = "includeScope", defaultValue = "" )
protected String includeScope;
Use includeScope indeed; the test scope includes every other scope, which is why excluding it fails.
<includeScope>runtime</includeScope>
I am getting an unexpected error when trying to perform a simple test with mocks.
@RunWith(MockitoJUnitRunner)
class AccessorTest {

    @Mock
    private DeviceBuilder deviceBuilder

    @Test
    void shouldCreateDeviceFromFilesystem() {
        //given
        URI uri = this.class.classLoader.getResource("sample-filesystem").toURI()
        File deviceRoot = new File(uri)
        Accessor accessor = new Accessor(deviceBuilder)
        Device expectedDevice = new Device(deviceRoot)
        when(deviceBuilder.build(eq(deviceRoot))).thenReturn(expectedDevice)

        //when
        Device device = accessor.readFrom(deviceRoot)

        //then
        assert device == expectedDevice
        verify(deviceBuilder).build(deviceRoot)
    }
}
The DeviceBuilder is a single method interface Device::DeviceBuilder#build(File root). Device has a well defined equals method as per Josh Bloch.
The exception is thrown on the when() line, and none of the variables in scope are null. The full exception is:
org.mockito.exceptions.misusing.InvalidUseOfMatchersException:
Invalid use of argument matchers!
0 matchers expected, 1 recorded.
This exception may occur if matchers are combined with raw values:
//incorrect:
someMethod(anyObject(), "raw String");
When using matchers, all arguments have to be provided by matchers.
For example:
//correct:
someMethod(anyObject(), eq("String by matcher"));
For what it's worth here's a snippet of my POM:
<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.3.2</version>
      <configuration>
        <compilerId>groovy-eclipse-compiler</compilerId>
      </configuration>
      <dependencies>
        <dependency>
          <groupId>org.codehaus.groovy</groupId>
          <artifactId>groovy-eclipse-compiler</artifactId>
          <version>2.7.0-01</version>
        </dependency>
      </dependencies>
    </plugin>
  </plugins>
</build>

<dependencies>
  <dependency>
    <groupId>org.codehaus.groovy</groupId>
    <artifactId>groovy-all</artifactId>
    <version>2.0.4</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.10</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-all</artifactId>
    <version>1.8.5</version>
    <scope>test</scope>
  </dependency>
</dependencies>
My suspicions are either some weirdness from the way Groovy manages classes, or some version incompatibility, but I hope it's something obvious I just can't see.
I know this is an old question, but this may be helpful to someone else.
The issue described in the question is detailed at http://code.google.com/p/mockito/issues/detail?id=303
I ran into the very same issue and I got it fixed by using the library provided at https://github.com/cyrusinnovation/mockito-groovy-support
Sometimes Mockito doesn't flag inappropriate uses at the time they occur but at the next invocation. You could have a look at the test that runs before this one; maybe the problem is there.
The message says that in an invocation you either have to provide only matchers or only real objects, but not a combination of the two. You can usually fix that by using eq(obj) instead of the object itself (or sameInstance(obj)), so that all arguments are matchers. However, in this case I don't find any place in the code you posted that fits that description, which is why I suspect a problem earlier in the code.
I'm not answering the original question.
I've searched everywhere and the only question about the error I got points to this discussion.
In case someone else gets this error: do not call mock methods to provide values for matchers. The call to deviceBuilder.build should go BEFORE verify(...).
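In other words, a schematic sketch of the difference (the Supplier/List mocks here are stand-ins, not the original poster's classes):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.List;
import java.util.function.Supplier;

public class VerifyOrderSketch {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        Supplier<String> builder = mock(Supplier.class);
        List<String> sink = mock(List.class);
        when(builder.get()).thenReturn("device");

        // Code under test interacts with both mocks:
        sink.add(builder.get());

        // Risky: builder.get() is itself a call on a mock and runs in the middle
        // of the verify(...) statement, after Mockito has already been switched
        // into verification mode -- this is the pattern the answer warns about.
        //   verify(sink).add(builder.get());

        // Safer: call the mock first, then pass the plain value to verify().
        String expected = builder.get();
        verify(sink).add(expected);
    }
}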
We are trying to use Liferay Service Builder as a common layer for all our portlets. We have created a separate common portlet project where we build the service using service.xml. This generates a service.jar file for us. We copy this jar to every portlet's WEB-INF/lib directory.
When we run the portlet it throws the following error in the logs, and a "Portlet is temporarily unavailable" message is displayed in the portlet.
14:43:17,447 ERROR [jsp:154] com.liferay.portal.kernel.bean.BeanLocatorException: BeanLocator has not been set
at com.liferay.portal.kernel.bean.PortletBeanLocatorUtil.locate(PortletBeanLocatorUtil.java:40)
at com.cogs.common.service.CourseLocalServiceUtil.getService(CourseLocalServiceUtil.java:223)
at com.cogs.common.service.CourseLocalServiceUtil.getCoursesCount(CourseLocalServiceUtil.java:187)
at org.apache.jsp.jsps.course.course_005fview_jsp._jspService(course_005fview_jsp.java:542)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:377)
at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:313)
at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:260)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646)
at org.apache.catalina.core.ApplicationDispatcher.doInclude(ApplicationDispatcher.java:551)
at org.apache.catalina.core.ApplicationDispatcher.include(ApplicationDispatcher.java:488)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:646)
I am sure that this approach should work seamlessly, but I found several people complaining about it on the Liferay forums and no solution so far. Please let us know if you found a way to use Service Builder as a common layer and it worked for you.
We are using maven for building all portlet projects.
Liferay Version is 6.0.5
And we are using Spring Portlet MVC for our portlet development.
You have to build the service AND deploy the portlet/hook that is required by your current portlet; you can find its name in the liferay-plugin-package.properties file as:
required-deployment-contexts=[Portlet-Hook name]
I tried everything written on this page, but nothing worked for me until I added the project's version to the pluginName of the liferay-maven-plugin in the pom:
<configuration>
  <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
  <appServerDeployDir>${liferay.app.server.deploy.dir}</appServerDeployDir>
  <appServerLibGlobalDir>${liferay.app.server.lib.global.dir}</appServerLibGlobalDir>
  <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
  <liferayVersion>${liferay.version}</liferayVersion>
  <pluginType>portlet</pluginType>
  <pluginName>${project.artifactId}-${project.version}</pluginName>
</configuration>
and in liferay-plugin-package.properties:
artifactId-version-deployment-context=artifactId-version
for example:
portlet-sample-1.0-deployment-context=portlet-sample-1.0
where artifactId = portlet-sample
and version = 1.0
After that I built the services and redeployed my war.
I found the solution by debugging
com.liferay.portal.kernel.bean.PortletBeanLocatorUtil
where
BeanLocator beanLocator = getBeanLocator(servletContextName);
is called; without the version number it always returned null.
I hope this helps someone.
I had a hard time finding a solution to this error, so I will post what we did. The name of the portlet had changed; we rebuilt the service, and when running the portlet it threw the same error:
com.liferay.portal.kernel.bean.BeanLocatorException: BeanLocator has not been set for servlet context
In our case we had to remove the jar file ../docroot/WEB-INF/lib/portlet-service.jar.
We had a requirement to use something similar: have a portlet (let's say Source-portlet) whose services will be used by other portlets.
So we moved the generated sourceportlet-service.jar from the Source-portlet's WEB-INF/lib to the {tomcat_home}/lib/ext folder where other jars like portlet-service.jar reside.
The downside of this approach is that whenever there is a change in the Source-portlet we need to restart the server.
If the other portlets are your custom plugin portlets, then another approach would be to copy the generated sourceportlet-service.jar to each portlet's WEB-INF/lib. This approach does not work if you are using the service in a JSP hook.
Hope this helps.
The previous answer by Martin Gamulin is correct. If you have two separate web apps, one for Spring portlets and another with your Service Builder (which seems to be the correct way to do things in Liferay), then you need to ensure that your Spring portlets do not reference your ServiceBuilder classes during initialization.
If they do then depending on the order in which your app server instantiates your webapps (and in Tomcat you can't specify a startup order), the BeanLocatorException will happen every time the portlets webapp deploys before the builder webapp.
In our case this meant moving a XxxLocalServiceUtil.createXxx(0) call from the constructor of the portlet Controller to the relevant methods.
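As a sketch of that kind of change (the controller and createCourse call below are illustrative, loosely echoing the CourseLocalServiceUtil from the question's stack trace, not the actual project code):

import com.cogs.common.model.Course;                   // hypothetical Service Builder entity
import com.cogs.common.service.CourseLocalServiceUtil; // the *LocalServiceUtil seen in the stack trace

public class CourseController {

    private Course defaultCourse;

    public CourseController() {
        // Don't do this: it runs while the portlet war is still initializing,
        // and fails with BeanLocatorException if the Service Builder war has
        // not been deployed/started yet.
        // this.defaultCourse = CourseLocalServiceUtil.createCourse(0);
    }

    // Instead, resolve the value lazily, on the first request that needs it.
    private Course getDefaultCourse() {
        if (defaultCourse == null) {
            defaultCourse = CourseLocalServiceUtil.createCourse(0);
        }
        return defaultCourse;
    }
}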
I had a similar problem with a Maven portlet.
First I created the portlet and then I added the service.xml.
The issue was that the generator was looking for a portlet name that was not there.
I solved it by making explicit the portlet name I wanted the generator to look for.
In particular, these two pom nodes must be equal:
project.artifactId (Liferay creates a bean locator for this)
and
project.build.(liferay plugin).configuration.pluginName (the internal name of the portlet for the generator).
As an example, a small excerpt from my pom.xml:
<modelVersion>4.0.0</modelVersion>
<groupId>io.endeios</groupId>
<artifactId>ShowTheCats-portlet</artifactId><!-- ONE -->
<packaging>war</packaging>
<name>ShowTheCats Portlet</name>
<version>1.0-SNAPSHOT</version>

<build>
  <plugins>
    <plugin>
      <groupId>com.liferay.maven.plugins</groupId>
      <artifactId>liferay-maven-plugin</artifactId>
      <version>${liferay.maven.plugin.version}</version>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals>
            <goal>build-css</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
        <appServerDeployDir>${liferay.app.server.deploy.dir}</appServerDeployDir>
        <appServerLibGlobalDir>${liferay.app.server.lib.global.dir}</appServerLibGlobalDir>
        <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
        <liferayVersion>${liferay.version}</liferayVersion>
        <pluginType>portlet</pluginType>
        <pluginName>ShowTheCats-portlet</pluginName><!-- TWO -->
      </configuration>
    </plugin>
ONE and TWO must be the same.
The problem with the BeanLocator for my Spring portlet was that my portlet's Spring context was getting initialized before Liferay's Spring context.
I was using
ClassName className = ClassNameLocalServiceUtil.getClassName(JournalArticle.class.getName());
in my constructor. Liferay's context was not loaded yet, hence the error. I moved that piece of code so that it is only called when the first request needs it. Problem solved.
So do not depend on Liferay during the initialization of your portlet; do some kind of "lazy" wiring of your dependencies on Liferay.
Since you are using Maven, make sure your war name equals your portlet project name.
After debugging I found that ClpSerializer defines _servletContextName, which is equal to the <artifactId> of the war project. If you deploy an artifact named artifactId-1.0.0-snapshot.war, the context will be created with that name, but the code generated by the service generator expects it to be artifactId. Verify against your ClpSerializer.
I also had the same problem. I placed the following line in the liferay-plugin-package.properties file of the portlets that use the service layer of the common portlet. It worked for me.
required-deployment-contexts=common-portlet
It's better to copy the service.jar file to tomcat/lib/ext instead of every portlet's WEB-INF/lib.
I did the following to solve the above problem:
Set the plugin config property pluginName in pom.xml to the correct context
<plugin>
  <groupId>com.liferay.maven.plugins</groupId>
  <artifactId>liferay-maven-plugin</artifactId>
  <version>${liferay.version}</version>
  <configuration>
    <autoDeployDir>${liferay.auto.deploy.dir}</autoDeployDir>
    <appServerPortalDir>${liferay.app.server.portal.dir}</appServerPortalDir>
    <liferayVersion>${liferay.version}</liferayVersion>
    <pluginType>portlet</pluginType>
    <pluginName>XXXX-portlet</pluginName>
  </configuration>
</plugin>
Optionally set the XXXX-portlet-deployment-context property in liferay plugin properties file or portlet.properties file
XXXX-portlet-deployment-context=XXXX-portlet
Re-Build the services
Verify if the generated ClpSerializer.java contains the correct contexts
public static String getServletContextName() {
    if (Validator.isNotNull(_servletContextName)) {
        return _servletContextName;
    }

    synchronized (ClpSerializer.class) {
        if (Validator.isNotNull(_servletContextName)) {
            return _servletContextName;
        }

        try {
            ClassLoader classLoader = ClpSerializer.class.getClassLoader();

            Class<?> portletPropsClass = classLoader.loadClass(
                "com.liferay.util.portlet.PortletProps");

            Method getMethod = portletPropsClass.getMethod("get",
                new Class<?>[] { String.class });

            String portletPropsServletContextName = (String) getMethod.invoke(null,
                "XXXX-portlet-deployment-context");

            if (Validator.isNotNull(portletPropsServletContextName)) {
                _servletContextName = portletPropsServletContextName;
            }
        } catch (Throwable t) {
            if (_log.isInfoEnabled()) {
                _log.info(
                    "Unable to locate deployment context from portlet properties");
            }
        }

        if (Validator.isNull(_servletContextName)) {
            try {
                String propsUtilServletContextName = PropsUtil.get(
                    "XXXX-portlet-deployment-context");

                if (Validator.isNotNull(propsUtilServletContextName)) {
                    _servletContextName = propsUtilServletContextName;
                }
            } catch (Throwable t) {
                if (_log.isInfoEnabled()) {
                    _log.info(
                        "Unable to locate deployment context from portal properties");
                }
            }
        }

        if (Validator.isNull(_servletContextName)) {
            _servletContextName = "upay-portlet";
        }

        return _servletContextName;
    }
}
Deploy the war, verify the war name and the logs for the correct context name.