Spring XD: Using log4j with logback

I have a collection of XD job modules that all use log4j logging. I have recently upgraded to Spring XD 1.3.1 and my modules are no longer logging.
I have tried adding my packages to the xd-singlenode-logback.groovy configuration file. This has no effect.
I have created a dummy module using slf4j, which logs correctly.
I have tried to find any information on log4j and logback compatibility, but haven't found a definitive answer.
Do I have to switch log4j out for slf4j, or is there something I am missing?

Do you have log4j on the classpath?
xd/lib contains log4j-over-slf4j-1.7.12.jar, which should route your log4j calls to slf4j, which in turn calls logback.
You should not have the real log4j on the classpath for this to work properly, though.
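A minimal sketch of what that means for a module build, assuming the module is built with Gradle (the coordinates are the standard log4j and slf4j ones):

configurations.all {
    // keep the real log4j implementation and the slf4j-to-log4j binding off the classpath,
    // so the log4j-over-slf4j bridge in xd/lib can supply the org.apache.log4j classes instead
    exclude group: 'log4j', module: 'log4j'
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
}

With the real log4j excluded, the module's existing org.apache.log4j logger calls should be served by the bridge at runtime and end up in logback, without rewriting the module against slf4j.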

Related

Passing the archive name to uploadArchives in Gradle

Having migrated from Spring Boot 1.5.19 to Spring Boot 2.0.4, we are encountering problems with the build on Jenkins (using Gradle 4.2.1). We think the behavioural changes in the Spring Boot Gradle plugin between these versions are causing our issue.
The Spring Boot Gradle plugin has also been updated from 1.5.19 to 2.0.4.
Our target artefact naming convention is:
project-name-<version>-<branch>-RELEASE.jar
The jar file is generated correctly once the following is specified in the build.gradle file:
bootJar {
    baseName = 'project_name'
}
The problem occurs when the uploadArchives task is executed. This task looks for an artefact with the following naming convention:
<path-folder-name>-<version>-<branch>-RELEASE.jar
where <path-folder-name> is the name of the folder path on Jenkins.
It doesn't seem to be picking up the baseName config.
The build pipeline runs successfully when we don't perform the uploadArchives task. Also, prior to the Spring Boot upgrade, this was not an issue.
Is there a way to get the uploadArchives task to look for the generated jar file name?
I resolved this eventually by adding a settings.gradle file and defining a root project name in it:
rootProject.name = "project_name"
I think upgrading the Spring Boot Gradle plugin must have changed the way the project name was being resolved.
The 1.5.* version seemed to take the project name from the baseName in the jar task, but the newer version was using the name of the folder where the app sits.
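A minimal sketch of the resulting configuration, using the placeholder project_name from above (bootJar.baseName controls the file name of the generated jar, while uploadArchives derives the artifact name from the project name, so both settings matter):

settings.gradle:

rootProject.name = 'project_name'

build.gradle:

bootJar {
    // still controls the file name of the generated jar
    baseName = 'project_name'
}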
That was fun

Apache Flink: logback configuration ignored

I have a Flink job in which I am using logback as my logging framework. Apart from the file and console appenders, I am also using the logstash-logback-encoder appender to send my logs to a Logstash instance.
If I run the Flink job from Eclipse, the logs are sent to the specified Logstash server.
I can also see the logs being sent to Logstash if I package the application as a jar and run it outside Eclipse.
However, if I run the Flink application as a job (by uploading the same jar) from the Flink dashboard, the logs are not sent to Logstash.
My Flink setup is running on Windows as per the instructions in "Running Flink on Windows", and I start Flink using start-cluster.bat.
I think the logback configuration is being ignored. I have placed the logback configuration at src/main/resources in my application. How can I get the logback configuration recognized by the Flink setup?
I have tried the steps mentioned in the Best Practices guide. Are those steps for replacing log4j with logback meant for the jobmanager and taskmanager logs, or for application logs?
Try adding the logstash-logback-encoder jar to the lib/ folder along with the logback jars.
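For reference, a hedged sketch of the dependencies those jars come from (the coordinates are the standard ones, the versions are only examples); the recipe described on the Flink Best Practices page is to copy these jars into Flink's lib/ folder, remove the log4j and slf4j-log4j12 jars shipped there, and keep a logback.xml in Flink's conf/ folder:

dependencies {
    // jars to place in <flink>/lib/ so the cluster itself runs on logback
    runtimeOnly 'ch.qos.logback:logback-core:1.1.11'
    runtimeOnly 'ch.qos.logback:logback-classic:1.1.11'
    runtimeOnly 'org.slf4j:log4j-over-slf4j:1.7.25'                   // bridges Flink's internal log4j calls
    runtimeOnly 'net.logstash.logback:logstash-logback-encoder:4.11'  // the encoder used by the logstash appender
}

The logback configuration bundled at src/main/resources in the job jar is generally not picked up by a standalone cluster; the jobmanager and taskmanager use the configuration from conf/, which would explain why the appender works in Eclipse but not when the jar is uploaded through the dashboard.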

Using JAIN SIP in an OSGi bundle

I am trying to use the JAIN SIP API in an OSGi bundle. When I use it in a standard Java application, it works as long as I include the log4j jar; but when I don't include it, I get an exception when I call the SipFactory.createSipStack(Properties p) method. From what I saw in my research, this is because the log4j jar is absent.
Now, when I try this in an OSGi environment, I have the same problem, even when I put the log4j library alongside the JAIN SIP jars. I just don't know how to make it work; maybe there is a specific step needed to be able to use log4j.
Moreover, I already have another plugin that uses log4j. I tried to export the log4j packages from that bundle and import them in my JAIN SIP bundle, but that didn't work either.
JAIN-SIP uses log4j-1.2.15.jar. Have you checked if that version of log4j makes any difference?
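For context, a minimal sketch of the call path the question describes (shown in Groovy, equivalent in Java; the stack name is a placeholder):

import javax.sip.SipFactory

def props = new Properties()
props.setProperty("javax.sip.STACK_NAME", "osgiSipStack")   // required property; name is a placeholder

def sipFactory = SipFactory.getInstance()
sipFactory.setPathName("gov.nist")                          // select the NIST reference implementation
def sipStack = sipFactory.createSipStack(props)             // the call that fails when log4j cannot be resolved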

java.lang.NoClassDefFoundError even after importing jar files into Android Studio

I receive the error java.lang.NoClassDefFoundError: org.codehaus.jackson.map.ObjectMapper in Android Studio, even after adding the Jackson jar files.
I also tried a few links, namely this, which did not help.
Edit:
After some research I discovered the root of the error. The dynamo-geo.jar library provided by Amazon is flawed in that it refers to an outdated Jackson version. Looking inside, I can see that the geoJsonMapper class refers to the deprecated ObjectMapper from the old 1.x.x versions of Jackson. I opened the source code of dynamo-geo.jar here and edited the ObjectMapper import from the outdated version to import com.fasterxml.jackson.databind.ObjectMapper;.
Now the issue is that I am not sure whether there is a way to compile a JAR file in Android Studio, so that I can get the newly updated library into my other Android Studio project.
EDIT:
Solution - read this.
If you are using Jackson 2 then you will want to import com.fasterxml.jackson.databind.ObjectMapper instead of org.codehaus.jackson.map.ObjectMapper. You may also have a mix of Jackson 1 and Jackson 2 JAR files in your classpath.
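If the project is built with Gradle (the default in Android Studio), a minimal sketch of depending on Jackson 2 instead of the old 1.x jars; the version is illustrative only:

dependencies {
    // Jackson 2 lives under com.fasterxml.jackson; make sure no org.codehaus.jackson (1.x) jars remain
    // (use compile instead of implementation on older Android Gradle plugin versions)
    implementation 'com.fasterxml.jackson.core:jackson-core:2.9.8'
    implementation 'com.fasterxml.jackson.core:jackson-databind:2.9.8'
    implementation 'com.fasterxml.jackson.core:jackson-annotations:2.9.8'
}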
You should be able to fork dynamodb-geo, make your changes, and use Maven to package the new JAR file (run the command mvn clean package). The new JAR file would be located in /dynamodb-geo/target/.

How to integrate Parasoft (JTest) in Hudson?

I normally use Parasoft Jtest as a plugin in Eclipse.
But now I need to integrate Jtest into Hudson, in such a way that, as a post-build step, Jtest runs its tests over a Maven project.
So my questions are:
How do I integrate Jtest into Hudson? I found a Parasoft plugin for Cpptest, but not for Jtest...
How do I specify which tests should be run on the project? For example, how do I configure Jtest to run "Find unused code", which is included in "Static Analysis"?
Thanks a lot.
Jtest has a fully functional command-line interface, so integration should generally not be a problem.
As for your questions:
1) There is a Jtest plugin for Maven, so you will be able to trigger your post-build action easily. It is thoroughly described at http://build.parasoft.com .
2) You can specify the Test Configuration of your choice by using the -Dparasoft.config option (e.g. mvn parasoft:jtest -Dparasoft.config="user://Unused Code").
You can find all the parameters that can be used with the parasoft:jtest goal described here: http://build.parasoft.com/docs/maven-parasoft-plugin/jtest-mojo.html .
We have integrated Jenkins with Jtest (on Linux):
We downloaded the Jtest installers and installed them on the Jenkins server (and on the slaves too, if you have slaves attached).
The corresponding environment variable (JTEST_HOME) has been set.
Now, without any entries for Jtest in the pom.xml or build.xml files, we can directly call the jtestcli commands in an "invoke shell" section, or use the Jtest goals with Maven.
We need to make sure that maven-parasoft-plugin 3.12 and the Jtest dependencies are available in the Maven repository (for Maven projects), and that parasoft-ant-3.12.jar is available and placed in the Ant lib folder (for Ant projects).
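As a hedged illustration of the same commands in one place, a Jenkins Pipeline sketch rather than the freestyle "invoke shell" step described above (the JTEST_HOME path is an assumption; the goal and Test Configuration are taken from the answer above):

pipeline {
    agent any
    environment {
        JTEST_HOME = '/opt/parasoft/jtest'   // hypothetical install location
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean install'
            }
        }
        stage('Jtest static analysis') {
            steps {
                // same goal and option as in the Maven-based answer above
                sh 'mvn parasoft:jtest -Dparasoft.config="user://Unused Code"'
            }
        }
    }
}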