Can the Configuration API in Liferay DXP be used for a Plugins SDK portlet?

I have followed the two tutorials below to use the Configuration API in a Liferay DXP Plugins SDK portlet built using Ant/Ivy:
Configuration API 1
Configuration API 2
Below is the configuration interface used:
package com.preferences.interfaces;

import com.liferay.portal.configuration.metatype.annotations.ExtendedObjectClassDefinition;
import aQute.bnd.annotation.metatype.Meta;

@ExtendedObjectClassDefinition(
    category = "preferences",
    scope = ExtendedObjectClassDefinition.Scope.GROUP
)
@Meta.OCD(
    id = "com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration",
    name = "UnsupportedBrowser.group.service.configuration.name"
)
public interface UnsupportedBrowserGroupServiceConfiguration {

    @Meta.AD(deflt = "", required = false)
    public String displayStyle();

    @Meta.AD(deflt = "0", required = false)
    public long displayStyleGroupId(long defaultDisplayStyleGroupId);

}
After following the steps, I am getting the error below:
ERROR [CM Configuration Updater (ManagedService Update: pid=[com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration])][org_apache_felix_configadmin:97] [org.osgi.service.cm.ManagedService, id=7082, bundle=297//com.liferay.portal.configuration.settings-2.0.15.jar?lpkgPath=C:\dev\Liferay\osgi\marketplace\Liferay Foundation.lpkg]: Unexpected problem updating configuration com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration {org.osgi.service.cm.ConfigurationAdmin}={service.vendor=Apache Software Foundation, service.pid=org.apache.felix.cm.ConfigurationAdmin, service.description=Configuration Admin Service Specification 1.2 Implementation, service.id=56, service.bundleid=643, service.scope=bundle}
Caused by: java.lang.IllegalArgumentException: wrong number of arguments
So, does this process require an OSGi module, or can it also be done with a Plugins SDK portlet built using Ant?

Without dissecting the error message Caused by: java.lang.IllegalArgumentException: wrong number of arguments:
The way you build your plugin (Ant, Maven, Gradle, manually) doesn't make a difference, as long as you build a plugin that the runtime will understand. aQute.bnd.annotation.metatype.Meta points firmly into the OSGi world and makes it almost certain that you'll need an OSGi module. You can build this with Ant, of course. Even in Ant you can embed tools like bnd, or you can write the proper MANIFEST.MF to include in your module manually (just kidding - you don't want to do that manually, but it would work).
Recommendation: instead of moving everything over, try to reproduce this with a minimal example in Gradle, or better in a Liferay Workspace (which is Gradle based), just to get all the automatic wiring in. Check whether it makes a difference, and compare the generated output of your Ant build process with the workspace output. Pay specific attention to the Manifest.
In order to build the proper Manifest, you want to use bnd. If the Manifest turns out to be your issue, find a way to embrace bnd - whether that means saying goodbye to Ant or tweaking your build script remains your decision.
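One more thing worth double-checking while you build that minimal reproduction (an assumption on my side, not a dissection of your stack trace): metatype configuration interfaces normally declare parameterless accessor methods, because the framework reads them reflectively without arguments. A sketch of the interface in that style, using the same annotations as above:

package com.preferences.interfaces;

import com.liferay.portal.configuration.metatype.annotations.ExtendedObjectClassDefinition;
import aQute.bnd.annotation.metatype.Meta;

@ExtendedObjectClassDefinition(
    category = "preferences",
    scope = ExtendedObjectClassDefinition.Scope.GROUP
)
@Meta.OCD(
    id = "com.preferences.interfaces.UnsupportedBrowserGroupServiceConfiguration",
    name = "UnsupportedBrowser.group.service.configuration.name"
)
public interface UnsupportedBrowserGroupServiceConfiguration {

    @Meta.AD(deflt = "", required = false)
    public String displayStyle();

    // No method parameter here: accessors are expected to be zero-argument
    // methods, and a parameterized variant is a plausible source of a
    // reflective "wrong number of arguments" error (assumption - verify it
    // in the minimal reproduction).
    @Meta.AD(deflt = "0", required = false)
    public long displayStyleGroupId();

}

If the group-scoped default really needs to come from somewhere else, that fallback belongs in the code that reads the configuration, not in the interface itself.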

Related

How can I externalize IScheduledExecutorService to run tasks in an external Hazelcast cluster (Hazelcast 5.2) without using UserCodeDeployment?

I am working on externalizing our IScheduledExecutorService so I can run tasks on an external cluster. I am able to write a test and get the Runnable to actually run ONLY if I turn on user code deployment. If I want to change this task at all and run the tests again, I get the following in my external cluster member's logs:
java.lang.IllegalStateException: Class com.mycompany.task.ScheduledTask is already in local cache and has conflicting byte code representation
I want to be able to change the task if I need to, redeploy, and have Hazelcast just handle it. I do this kind of thing with our external maps now; they can handle different versions of our objects using compact serialization.
Am I stuck using user code deployment for these functional objects? If I need to make a change to the task, I would have to change the class name and redeploy to production. I'm hoping to get this task right the first time and never have to do that, but I have a way of handling it if I do.
The cluster is already running in production, and I'll have to add the following to each member:
HZ_USERCODEDEPLOYMENT_ENABLED=true
and the appropriate client code (listed below) to enable this.
What I've done...
Added the following to my local Docker file:
HZ_USERCODEDEPLOYMENT_ENABLED=true
and also added the following in the code that creates a Hazelcast client connecting to my external cluster:
ClientConfig clientConfig = new ClientConfig();
ClientUserCodeDeploymentConfig clientUserCodeDeploymentConfig = new ClientUserCodeDeploymentConfig();
clientUserCodeDeploymentConfig.addClass("com.mycompany.task.ScheduledTask");
clientUserCodeDeploymentConfig.setEnabled(true);
clientConfig.setUserCodeDeploymentConfig(clientUserCodeDeploymentConfig);
However, if I remove those two pieces, I get the following exception with a failing test; the cluster doesn't know about my class at all.
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.mycompany.task.ScheduledTask
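For reference, this is roughly how such a task gets submitted from the client (a minimal sketch, assuming ScheduledTask implements Runnable and Serializable and using the executor name from the logs above); it is the member-side deserialization of that class that makes it required on the members' classpath or via user code deployment:

import java.io.Serializable;
import java.util.concurrent.TimeUnit;

import com.hazelcast.client.HazelcastClient;
import com.hazelcast.client.config.ClientConfig;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.scheduledexecutor.IScheduledExecutorService;

public class SchedulerExample {

    // The task travels to a member as serialized bytes and is instantiated
    // there, so the member must be able to resolve this class.
    public static class ScheduledTask implements Runnable, Serializable {
        @Override
        public void run() {
            System.out.println("running on a member");
        }
    }

    public static void main(String[] args) {
        ClientConfig clientConfig = new ClientConfig();
        HazelcastInstance client = HazelcastClient.newHazelcastClient(clientConfig);

        IScheduledExecutorService scheduler =
            client.getScheduledExecutorService("myScheduledExecutorService");
        scheduler.schedule(new ScheduledTask(), 10, TimeUnit.SECONDS);
    }
}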
Side Note:
We are already using compact serialization for several maps, and when I try to configure this Runnable task via compact serialization I get the error below. I don't think that's the right approach either.
[Scheduler: myScheduledExecutorService][Partition: 121][Task: 7afe68d5-3185-475f-b375-5a82a7088de3] Exception occurred during run
java.lang.ClassCastException: class com.hazelcast.internal.serialization.impl.compact.DeserializedGenericRecord cannot be cast to class java.lang.Runnable (com.hazelcast.internal.serialization.impl.compact.DeserializedGenericRecord is in unnamed module of loader 'app'; java.lang.Runnable is in module java.base of loader 'bootstrap')
at com.hazelcast.scheduledexecutor.impl.ScheduledRunnableAdapter.call(ScheduledRunnableAdapter.java:49) ~[hazelcast-5.2.0.jar:5.2.0]
at com.hazelcast.scheduledexecutor.impl.TaskRunner.call(TaskRunner.java:78) ~[hazelcast-5.2.0.jar:5.2.0]
at com.hazelcast.internal.util.executor.CompletableFutureTask.run(CompletableFutureTask.java:64) ~[hazelcast-5.2.0.jar:5.2.0]

Sonar Unit tests report parameter - sonar.junit.reportPath vs sonar.java.junit.reportPath

I found that my SonarQube instance (5.1 or 5.1.1, with the latest sonar-runner 2.x) stopped showing part of the unit test info (the unit test widget) on the project's dashboard.
The properties I had were (in Gradle's sonarRunner > sonarProperties section):
property "sonar.junit.reportsPath", "build/test-results/UT"
property "sonar.surefire.reportsPath", "build/test-results/UT"
To fix it, I had to include the following properties as well:
property "sonar.java.junit.reportsPath", "build/test-results/UT"
property "sonar.java.surefire.reportsPath", "build/test-results/UT"
Just FYI: all my unit test reports go under the build/test-results/UT folder, all integration test result files go under build/test-results/IT, and so on.
I'm wondering whether this is due to the Gradle version I'm using (2.3) or due to a later version of SonarQube (4.5+), as I have both SQ 5.1 and 5.1.1 instances.
I know the SonarQube team introduced multi-language support in SonarQube 4.2:
Since SonarQube 4.2, it is possible to run an analysis on a multi-language project.
Now, this raises a question. To get the same unit test info for Groovy-based projects, do I need to use:
property "sonar.groovy.junit.reportsPath", "build/test-results/UT"
property "sonar.groovy.surefire.reportsPath", "build/test-results/UT"
or something like that, if my project has Groovy code instead of Java?
Searching "**sonar.java.junit.reportPath"** with using double quotes shows No results found in Google and it forces me to try and see google results if I can run the search again without using " double quotes (for this property).
Doing the same in SonarQube site "search box" shows:
No results found for sonar.java.junit.reportPath. Please try one of the following suggestions:
Though in Gradle, inside the sonarRunner task:
sonarRunner {
    sonarProperties {
        // ... section where I define various sonar props ...
    }
    // ...
}
I can define both sonar.junit.reportPath and sonar.java.junit.reportPath (and similarly sonar.surefire.reportPath and sonar.java.surefire.reportPath), and running the sonarRunner task in Gradle doesn't error out. This makes me believe that these property names are valid.
There are also issues with running sonarRunner, or the stand-alone sonar-runner command, for a mixed Java and Groovy project (i.e. source code in Java but tests in Groovy). Setting sonar.language=java,grvy didn't help. I posted that question on Stack Overflow, but so far I have no definitive answer on how to get a full-fledged Sonar dashboard up and running for a Groovy project like I get for a Java project:
Groovy project - Sonar - Publish project and Unit + Integration Test code coverage data
PS: I have tried various values for the sonar.. properties (as far as the source, tests, etc. properties mentioned in the docs section of their site are concerned).
The only valid property to use as of now is sonar.junit.reportsPath, which tells the Java SonarQube plugin where to import your unit test results from.
For Groovy, this is a work in progress; see: http://jira.sonarsource.com/browse/SONARGROOV-2
All the other properties you mentioned do not exist and are not taken into account.

Debugging and troubleshooting Apache Karaf 3

So I've OSGi-ified a WAR file. It still works in Tomcat. I have all the requisite fields in the manifest, and the libraries are all embedded for now; I'll externalize them later. There are two which are not OSGi enabled. The WAR file has Log4j 2 embedded, BTW; it will be removed later.
The WAR file in question contains three simple Jersey-based REST/JSON services.
It starts and goes to the Active state, but I can't hit it with SoapUI where I expect to find it. The logs show it starting, but that is all they show.
How can I squeeze more info out of Karaf so that I can properly figure out what is going on?
Is there something special I have to do in the Activator to get it to fire up?
Note: This is a simple REST / JSON service that wraps WURFL. By license, it's Open Source but it hasn't been released yet.
karaf#root()> bundle:headers MobileWURFL
MobileWURFL Maven Webapp (104)
-------------------------------
Manifest-Version = 1.0
Bnd-LastModified = 1395276484402
Archiver-Version = Plexus Archiver
Tool = Bnd-2.1.0.20130426-122213
Embed-Directory = WEB-INF/lib
Embedded-Artifacts = WEB-INF/lib/org.osgi.core-4.3.0.jar;g="org.osgi";a="org.osgi.core";v="4.3.0",WEB-INF/lib/org.osgi.compendium-1.4.0.jar;g="org.apache.felix";a="org.osgi.compendium";v="1.4.0",WEB-INF/lib/org.osgi.core-1.4.0.jar;g="org.apache.felix";a="org.osgi.core";v="1.4.0",WEB-INF/lib/javax.servlet-1.0.0.jar;g="org.apache.felix";a="javax.servlet";v="1.0.0",WEB-INF/lib/org.osgi.foundation-1.2.0.jar;g="org.apache.felix";a="org.osgi.foundation";v="1.2.0",WEB-INF/lib/servlet-api-2.5.jar;g="javax.servlet";a="servlet-api";v="2.5",WEB-INF/lib/log4j-api-2.0-rc1.jar;g="org.apache.logging.log4j";a="log4j-api";v="2.0-rc1",WEB-INF/lib/log4j-core-2.0-rc1.jar;g="org.apache.logging.log4j";a="log4j-core";v="2.0-rc1",WEB-INF/lib/disruptor-3.0.1.jar;g="com.lmax";a="disruptor";v="3.0.1",WEB-INF/lib/commons-lang-2.6.jar;g="commons-lang";a="commons-lang";v="2.6",WEB-INF/lib/log4j-slf4j-impl-2.0-rc1.jar;g="org.apache.logging.log4j";a="log4j-slf4j-impl";v="2.0-rc1",WEB-INF/lib/slf4j-api-1.7.5.jar;g="org.slf4j";a="slf4j-api";v="1.7.5",WEB-INF/lib/commons-collections-3.2.1.jar;g="commons-collections";a="commons-collections";v="3.2.1",WEB-INF/lib/wurfl-1.5.1.jar;g="net.sourceforge.wurfl";a="wurfl";v="1.5.1",WEB-INF/lib/json-20140107.jar;g="org.json";a="json";v="20140107",WEB-INF/lib/jersey-server-1.8.jar;g="com.sun.jersey";a="jersey-server";v="1.8",WEB-INF/lib/asm-3.1.jar;g="asm";a="asm";v="3.1",WEB-INF/lib/jersey-core-1.8.jar;g="com.sun.jersey";a="jersey-core";v="1.8"
Built-By = Coder_Guy
Embed-Dependency = *;scope=compile|runtime
Embed-Transitive = true
Webapp-Context = MobileWURFL
Web-ContextPath = MobileWURFL
Build-Jdk = 1.7.0_51
Created-By = Apache Maven Bundle Plugin
Bundle-Name = MobileWURFL Maven Webapp
Bundle-SymbolicName = MobileWURFL
Bundle-Version = 0.0.1.SNAPSHOT
Bundle-ManifestVersion = 2
Bundle-ClassPath = .,WEB-INF/classes,WEB-INF/lib/org.osgi.core-4.3.0.jar,WEB-INF/lib/org.osgi.compendium-1.4.0.jar,WEB-INF/lib/org.osgi.core-1.4.0.jar,WEB-INF/lib/javax.servlet-1.0.0.jar,WEB-INF/lib/org.osgi.foundation-1.2.0.jar,WEB-INF/lib/servlet-api-2.5.jar,WEB-INF/lib/log4j-api-2.0-rc1.jar,WEB-INF/lib/log4j-core-2.0-rc1.jar,WEB-INF/lib/disruptor-3.0.1.jar,WEB-INF/lib/commons-lang-2.6.jar,WEB-INF/lib/log4j-slf4j-impl-2.0-rc1.jar,WEB-INF/lib/slf4j-api-1.7.5.jar,WEB-INF/lib/commons-collections-3.2.1.jar,WEB-INF/lib/wurfl-1.5.1.jar,WEB-INF/lib/json-20140107.jar,WEB-INF/lib/jersey-server-1.8.jar,WEB-INF/lib/asm-3.1.jar,WEB-INF/lib/jersey-core-1.8.jar
As you are using an OSGi-fied WAR, the WAR extender of Pax Web will kick in, so no activator is needed. For debugging, just start the Karaf container with karaf debug and attach your debugger to port 8787.
Depending on your embedded JARs there might be an issue with those; for example, a servlet JAR or similar will result in deployment errors. It's also possible that the log4j2 JAR could cause an issue.
What's the result of bundle:headers for this WAR?
With the command
web:list
you also get information about the state of the web bundle.
UPDATE:
It is right there in your Bundle-ClassPath. The servlet JAR is not allowed to be in a WAR, per spec, by the way. In OSGi it collides with the packages provided by Pax Web. In Tomcat, the servlet JAR is already loaded by the container, so it still works, because the classloader resolves classes first-come-first-served. With OSGi, the first Servlet class is found inside the WAR, and therefore the resolver doesn't use the one provided by Pax Web. It is essential that you remove that JAR.
And I don't think adding those OSGi JARs helps either; they will most likely collide with the bundles provided by the container.
I strongly suggest using the maven-bundle-plugin to generate this WAR, so the imports are created properly. Or just drop all the OSGi meta information and deploy a standard WAR. If you use the following type of URL:
webbundle:mvn:groupID/artifactID/version/war?Web-ContextPath=Mobile-WURFL
It will generate a proper OSGi Manifest for your war.

Want to use JUnit in Domino Designer / Java Beans - but keep getting a "Class not found" error?

I do the following:
From the Package Explorer I select "New, Other, JUnit Test Case"
I write this code:
package dk.sample;

import org.junit.*;
import static org.junit.Assert.*;

public class TestCase {

    @Test
    public void alwaysTrue() {
        assertTrue( true );
    }

}
I then select "Run As, JUnit test"
Get this error: "Class not found dk.sample.TestCase
java.lang.ClassNotFoundException: ...."
What am I missing? I have tried different Run Configurations, but it seems like I'm missing a classpath somewhere. But to what, and where?
To make JUnit work within Domino Designer you need to perform a few additional steps:
set up source control for your application
adjust the on-disk project to be recognized as Java application
run JUnit tests within your on-disk project
Please note that Java agents have to be tested in a different way.
You can find more detailed explanation about enabling JUnit for both XPages and Agents in the following blog post: Unit Tests for Lotus Domino Applications
Here's also a great how-to on this topic.
I couldn't get JUnit to work inside Domino Designer. Instead of running the tests from DDE, I now run them from an XPage. This works like a dream. I made my own 'JUnit runner' class - that is, I just call the JUnit runners but handle the result myself in order to display it as HTML on the XPage.
Code can be found here: http://xpages.dk/wp-content/uploads/2013/10/junitrunner.txt
Danish blog post here: http://xpages.dk/?p=1162
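The idea, roughly, looks like this (a minimal sketch assuming JUnit 4 on the classpath; it is not the code behind the links above, and the class name is only illustrative):

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class SimpleJUnitRunner {

    // Runs the given test classes programmatically and returns a small
    // HTML summary that an XPage (or anything else) can render.
    public static String runAsHtml(Class<?>... testClasses) {
        Result result = JUnitCore.runClasses(testClasses);
        StringBuilder html = new StringBuilder();
        html.append("<p>Tests run: ").append(result.getRunCount())
            .append(", failures: ").append(result.getFailureCount()).append("</p>");
        for (Failure failure : result.getFailures()) {
            html.append("<p>").append(failure.getTestHeader())
                .append(": ").append(failure.getMessage()).append("</p>");
        }
        return html.toString();
    }
}

From the XPage you would then call something like runAsHtml(TestCase.class) and write the returned string into the page.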

MEF: "Unable to load one or more of the requested types. Retrieve the LoaderExceptions for more information"

Scenario: I am using the Managed Extensibility Framework to load plugins (exports) at runtime, based on an interface contract defined in a separate DLL. In my Visual Studio solution I have three different projects: the host application, a class library defining the interface ("IPlugin"), and another class library implementing the interface (the export, "MyPlugin.dll").
The host looks for exports in its own root directory, so during testing, I build the whole solution and copy Plugin.dll from the Plugin class library bin/release folder to the host's debug directory so that the host's DirectoryCatalog will find it and be able to add it to the CompositionContainer. Plugin.dll is not automatically copied after each rebuild, so I do that manually each time I've made changes to the contract/implementation.
However, a couple of times I've run the host application without having copied (an updated) Plugin.dll first, and it has thrown an exception during composition:
Unable to load one or more of the requested types. Retrieve the LoaderExceptions for more information
This is of course due to the fact that the Plugin.dll it's trying to import from implements a different version of IPlugin, where the property/method signatures don't match. Although it's easy to avoid this in a controlled and monitored environment, by simply avoiding (duh) obsolete IPlugin implementations in the plugin folder, I cannot rely on such assumptions in the production environment, where legacy plugins could be encountered.
The problem is that this exception effectively botches the whole Compose action and no exports are imported. I would have preferred that the mismatching IPlugin implementations are simply ignored, so that other exports in the catalog(s), implementing the correct version of IPlugin, are still imported.
Is there a way to accomplish this? I'm thinking either of several potential options:
There is a flag to set on the CompositionContainer ("ignore failing imports") prior to or when calling Compose
There is a similar flag to specify on the <ImportMany()> attribute
There is a way to "hook" on to the iteration process underlying Compose(), and be able to deal with each (failed) import individually
Using strong name signing to somehow only look for imports implementing the current version of IPlugin
Ideas?
I have also run into a similar problem.
If you are sure that you want to ignore such "bad" assemblies, then the solution is to call AssemblyCatalog.Parts.ToArray() right after creating each assembly catalog. This will trigger the ReflectionTypeLoadException which you mention. You then have a chance to catch the exception and ignore the bad assembly.
When you have created AssemblyCatalog objects for all the "good" assemblies, you can aggregate them in an AggregateCatalog and pass that to the CompositionContainer constructor.
This issue can be caused by several factors (any exception on the loaded assemblies); as the error message says, look at the LoaderExceptions property to (hopefully) get some idea.
Another problem/solution I found: when using DirectoryCatalog, if you don't specify the second parameter "searchPattern", MEF will load ALL the DLLs in that folder (including third-party ones) and start looking for export types, which can also cause this issue. A solution is to use a naming convention for all the assemblies that export types and specify it in the DirectoryCatalog constructor. I use *_Plugin.dll, so that MEF only loads assemblies that contain exported types.
In my case MEF was loading an NHibernate DLL and throwing an assembly version error in the LoaderExceptions (this can happen with any of the DLLs in the directory); this approach solved the problem.
Here is an example of the above-mentioned methods:
var di = new DirectoryInfo(Server.MapPath("../../bin/"));
if (!di.Exists) throw new Exception("Folder not exists: " + di.FullName);
var dlls = di.GetFileSystemInfos("*.dll");
AggregateCatalog agc = new AggregateCatalog();
foreach (var fi in dlls)
{
    try
    {
        var ac = new AssemblyCatalog(Assembly.LoadFile(fi.FullName));
        var parts = ac.Parts.ToArray(); // throws ReflectionTypeLoadException for a "bad" assembly
        agc.Catalogs.Add(ac);
    }
    catch (ReflectionTypeLoadException ex)
    {
        // log and skip the bad assembly; all other catalogs are still aggregated
        Elmah.ErrorSignal.FromCurrentContext().Raise(ex);
    }
}
CompositionContainer cc = new CompositionContainer(agc);
_providers = cc.GetExports<IDataExchangeProvider>();