springdoc-openapi-ui '1.6.11' dependency conflict - springdoc

I have a conflict, and it looks like an internal conflict among the springdoc dependencies. How can I solve it?
org.slf4j:slf4j-api:1.7.36
\--- org.webjars:webjars-locator-core:0.50
     \--- org.springdoc:springdoc-openapi-ui:1.6.11
          \--- compileClasspath

org.slf4j:slf4j-api:1.7.35 -> 1.7.36
\--- io.swagger.core.v3:swagger-core:2.2.2
     \--- org.springdoc:springdoc-openapi-common:1.6.11
          \--- org.springdoc:springdoc-openapi-webmvc-core:1.6.11
               \--- org.springdoc:springdoc-openapi-ui:1.6.11
                    \--- compileClasspath
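Note that Gradle has already reconciled both requests to 1.7.36 (the "1.7.35 -> 1.7.36" arrow means the lower request was upgraded), so this output is usually harmless. If you still want to pin slf4j-api explicitly, a minimal sketch in the Gradle Groovy DSL (coordinates taken from the tree above) would be:

configurations.all {
    resolutionStrategy {
        // pin slf4j-api so every consumer resolves to the same version
        force 'org.slf4j:slf4j-api:1.7.36'
    }
}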

Related

How to enable FIPS OpenSSL in MySQL

I was following the blog post below to enable FIPS for MySQL:
https://dev.mysql.com/blog-archive/openssl-fips-support-in-mysql-8-0/
I downloaded these two tarballs, openssl-1.0.2t.tar.gz and openssl-fips-2.0.16.tar.gz, and tried to build them first into an executable, but I am running into a lot of issues. The FIPS build is erroring out over a missing nmake.dll. Here is the output from the build script in the MySQL blog:
C:\openssl\openssl-fips-2.0.16>rem #echo off
C:\openssl\openssl-fips-2.0.16>SET ASM=no--asm
C:\openssl\openssl-fips-2.0.16>SET EXARG=
C:\openssl\openssl-fips-2.0.16>SET MFILE=ntdll.mak
C:\openssl\openssl-fips-2.0.16>if NOT X == X goto wince
C:\openssl\openssl-fips-2.0.16>if NOT XAMD64 == X goto defined
C:\openssl\openssl-fips-2.0.16>if AMD64 == x86 goto X86
C:\openssl\openssl-fips-2.0.16>if AMD64 == IA64 goto IA64
C:\openssl\openssl-fips-2.0.16>if AMD64 == AMD64 goto AMD64
C:\openssl\openssl-fips-2.0.16>echo Auto Configuring for AMD64
Auto Configuring for AMD64
C:\openssl\openssl-fips-2.0.16>SET TARGET=VC-WIN64A
C:\openssl\openssl-fips-2.0.16>if xno--asm == xno-asm goto compile
C:\openssl\openssl-fips-2.0.16>if xno--asm == xno-asm SET EXARG=no-asm
C:\openssl\openssl-fips-2.0.16>perl Configure VC-WIN64A no-asm fipscheck
Configuring for VC-WIN64A
no-asm [option] OPENSSL_NO_ASM
no-ec_nistp_64_gcc_128 [default] OPENSSL_NO_EC_NISTP_64_GCC_128 (skip dir)
no-gmp [default] OPENSSL_NO_GMP (skip dir)
no-jpake [experimental] OPENSSL_NO_JPAKE (skip dir)
no-krb5 [krb5-flavor not specified] OPENSSL_NO_KRB5
no-md2 [default] OPENSSL_NO_MD2 (skip dir)
no-rc5 [default] OPENSSL_NO_RC5 (skip dir)
no-rfc3779 [default] OPENSSL_NO_RFC3779 (skip dir)
no-store [experimental] OPENSSL_NO_STORE (skip dir)
no-zlib [default]
no-zlib-dynamic [default]
IsMK1MF=1
CC =cl
CFLAG =-DOPENSSL_FIPSCANISTER -DOPENSSL_THREADS -DDSO_WIN32 -W3 -Gs0 -Gy -nologo -DOPENSSL_SYSNAME_WIN32 -DWIN32_LEAN_AND_MEAN -DUNICODE -D_UNICODE -D_CRT_SECURE_NO_DEPRECATE
EX_LIBS =
CPUID_OBJ =mem_clr.o
BN_ASM =bn_asm.o
DES_ENC =des_enc.o fcrypt_b.o
AES_ENC =aes_core.o aes_cbc.o
BF_ENC =bf_enc.o
CAST_ENC =c_enc.o
RC4_ENC =rc4_enc.o rc4_skey.o
RC5_ENC =rc5_enc.o
MD5_OBJ_ASM =
SHA1_OBJ_ASM =
RMD160_OBJ_ASM=
CMLL_ENC =camellia.o cmll_misc.o cmll_cbc.o
MODES_OBJ =
ENGINES_OBJ =
PROCESSOR =
RANLIB =true
ARFLAGS =
PERL =perl
SIXTY_FOUR_BIT mode
DES_INT used
RC4_CHUNK is unsigned long long
Configured for VC-WIN64A.
WARNING: OpenSSL has been configured to generate a fipscanister.o object module.
That compiled module is NOT FIPS 140-2 validated or suitable for use in
satisfying a requirement for the use of FIPS 140-2 validated cryptography
UNLESS the requirements of the Security Policy are followed exactly (see
http://openssl.org/docs/fips/ or http://csrc.nist.gov/cryptval/).
This is the OpenSSL FIPS 2.0 module.
C:\openssl\openssl-fips-2.0.16>pause
Press any key to continue . . .
C:\openssl\openssl-fips-2.0.16>echo on
C:\openssl\openssl-fips-2.0.16>perl util\mkfiles.pl 1>MINFO
C:\openssl\openssl-fips-2.0.16>perl util\mk1mf.pl dll no--asm VC-WIN64A 1>ms\ntdll.mak
***************************
****FIPS BUILD FAILURE*****
***************************

After upgrading Kafka to 2.x, my unit tests cannot be launched using the latest EmbeddedKafkaRule

My test dependencies related to Kafka are as follows:
| +--- org.apache.kafka:kafka_2.11:2.0.0 -> 2.0.1
| | +--- org.apache.kafka:kafka-clients:2.0.1 -> 2.1.1-cp1
| +--- org.apache.kafka:kafka-clients:2.0.0 -> 2.1.1-cp1 (*)
| +--- io.confluent:kafka-avro-serializer:5.1.2
| | +--- io.confluent:kafka-schema-registry-client:5.1.2
| | | +--- org.apache.kafka:kafka-clients:2.1.1-cp1 (*)
| +--- io.confluent:kafka-schema-registry-client:5.1.2 (*)
+--- org.springframework.kafka:spring-kafka:2.2.4.RELEASE
| \--- org.apache.kafka:kafka-clients:2.0.1 -> 2.1.1-cp1 (*)
+--- org.springframework.kafka:spring-kafka-test:2.2.4.RELEASE
| +--- org.apache.kafka:kafka-clients:2.0.1 -> 2.1.1-cp1 (*)
| \--- org.apache.kafka:kafka_2.11:2.0.1 (*)
I added the rule in my JUnit entry-point class:
@ClassRule
public static EmbeddedKafkaRule embeddedKafka = new EmbeddedKafkaRule(1, true,
        SENDER_TOPICS.toArray(new String[0]));
But the unit test failed to launch with the following exception:
08:11:06.122 [main] ERROR kafka.server.BrokerMetadataCheckpoint - Failed to read meta.properties file under dir C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778\meta.properties due to C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778\meta.properties
08:11:06.125 [main] ERROR kafka.server.KafkaServer - Fail to read meta.properties under log directory C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778
java.nio.file.NoSuchFileException: C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778\meta.properties
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:79) ~[?:1.8.0_172]
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97) ~[?:1.8.0_172]
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102) ~[?:1.8.0_172]
at sun.nio.fs.WindowsFileSystemProvider.newByteChannel(WindowsFileSystemProvider.java:230) ~[?:1.8.0_172]
at java.nio.file.Files.newByteChannel(Files.java:361) ~[?:1.8.0_172]
at java.nio.file.Files.newByteChannel(Files.java:407) ~[?:1.8.0_172]
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384) ~[?:1.8.0_172]
at java.nio.file.Files.newInputStream(Files.java:152) ~[?:1.8.0_172]
at org.apache.kafka.common.utils.Utils.loadProps(Utils.java:560) ~[kafka-clients-2.1.1-cp1.jar:?]
at kafka.server.BrokerMetadataCheckpoint.liftedTree2$1(BrokerMetadataCheckpoint.scala:63) ~[kafka_2.11-2.0.0.jar:?]
at kafka.server.BrokerMetadataCheckpoint.read(BrokerMetadataCheckpoint.scala:62) ~[kafka_2.11-2.0.0.jar:?]
at kafka.server.KafkaServer$$anonfun$getBrokerIdAndOfflineDirs$1.apply(KafkaServer.scala:665) [kafka_2.11-2.0.0.jar:?]
at kafka.server.KafkaServer$$anonfun$getBrokerIdAndOfflineDirs$1.apply(KafkaServer.scala:663) [kafka_2.11-2.0.0.jar:?]
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) [scala-library-2.11.8.jar:?]
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35) [scala-library-2.11.8.jar:?]
at kafka.server.KafkaServer.getBrokerIdAndOfflineDirs(KafkaServer.scala:663) [kafka_2.11-2.0.0.jar:?]
at kafka.server.KafkaServer.startup(KafkaServer.scala:209) [kafka_2.11-2.0.0.jar:?]
at kafka.utils.TestUtils$.createServer(TestUtils.scala:132) [kafka_2.11-2.0.1-test.jar:?]
at kafka.utils.TestUtils.createServer(TestUtils.scala) [kafka_2.11-2.0.1-test.jar:?]
at org.springframework.kafka.test.EmbeddedKafkaBroker.afterPropertiesSet(EmbeddedKafkaBroker.java:223) [spring-kafka-test-2.2.4.RELEASE.jar:2.2.4.RELEASE]
at org.springframework.kafka.test.rule.EmbeddedKafkaRule.before(EmbeddedKafkaRule.java:109) [spring-kafka-test-2.2.4.RELEASE.jar:2.2.4.RELEASE]
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46) [junit-4.12.jar:4.12]
at org.junit.rules.RunRules.evaluate(RunRules.java:20) [junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.run(ParentRunner.java:363) [junit-4.12.jar:4.12]
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:538) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:760) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:460) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:206) [.cp/:?]
08:11:06.455 [main] ERROR kafka.server.LogDirFailureChannel - Failed to create or validate data directory C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778
java.io.IOException: Failed to load C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778 during broker startup
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:152) [kafka_2.11-2.0.0.jar:?]
at kafka.log.LogManager$$anonfun$createAndValidateLogDirs$1.apply(LogManager.scala:149) [kafka_2.11-2.0.0.jar:?]
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) [scala-library-2.11.8.jar:?]
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) [scala-library-2.11.8.jar:?]
at kafka.log.LogManager.createAndValidateLogDirs(LogManager.scala:149) [kafka_2.11-2.0.0.jar:?]
at kafka.log.LogManager.<init>(LogManager.scala:80) [kafka_2.11-2.0.0.jar:?]
at kafka.log.LogManager$.apply(LogManager.scala:953) [kafka_2.11-2.0.0.jar:?]
at kafka.server.KafkaServer.startup(KafkaServer.scala:237) [kafka_2.11-2.0.0.jar:?]
at kafka.utils.TestUtils$.createServer(TestUtils.scala:132) [kafka_2.11-2.0.1-test.jar:?]
at kafka.utils.TestUtils.createServer(TestUtils.scala) [kafka_2.11-2.0.1-test.jar:?]
at org.springframework.kafka.test.EmbeddedKafkaBroker.afterPropertiesSet(EmbeddedKafkaBroker.java:223) [spring-kafka-test-2.2.4.RELEASE.jar:2.2.4.RELEASE]
at org.springframework.kafka.test.rule.EmbeddedKafkaRule.before(EmbeddedKafkaRule.java:109) [spring-kafka-test-2.2.4.RELEASE.jar:2.2.4.RELEASE]
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46) [junit-4.12.jar:4.12]
at org.junit.rules.RunRules.evaluate(RunRules.java:20) [junit-4.12.jar:4.12]
at org.junit.runners.ParentRunner.run(ParentRunner.java:363) [junit-4.12.jar:4.12]
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:538) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:760) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:460) [.cp/:?]
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:206) [.cp/:?]
08:11:06.458 [main] ERROR kafka.log.LogManager - Shutdown broker because none of the specified log dirs from C:\Users\shanh\AppData\Local\Temp\kafka-2116758262206608778 can be created or validated
Before, I used org.apache.kafka:kafka_2.11:0.10.1.1 with spring-kafka-test:1.1.3.RELEASE and they worked perfectly well. I don't know whether it's an issue with my new dependencies or a bug in spring-kafka-test.
The issue is that the Kafka version was upgraded to 2.1.1 in my dependency tree: org.apache.kafka:kafka-clients:2.0.1 -> 2.1.1-cp1.
After forcing "org.apache.kafka:kafka-clients:2.0.1" via Gradle's forcedModules, my unit tests run without issue; see the sketch below.
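In Gradle's Groovy DSL, that pin looks roughly like this (a sketch; apply it to whichever configurations your tests actually use):

configurations.all {
    resolutionStrategy {
        // force kafka-clients back to the version that kafka_2.11:2.0.1 and
        // spring-kafka-test:2.2.4.RELEASE were built against
        forcedModules = ['org.apache.kafka:kafka-clients:2.0.1']
    }
}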

Failing to start my spring boot application due to 'javax.sql.DataSource' that could not be found

I am a beginner in Spring Boot and am trying to write a simple Spring Boot application.
My folder structure is as follows:
-> Project
   -> build.gradle
   -> settings.gradle
   -> src/main/java
      -> package
         -> Main.java
         -> UserController.java
         -> UserRepository.java
         -> dto
            -> User.java
   -> src/main/resources
      -> application.properties
My build.gradle is as follows:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath('org.springframework.boot:spring-boot-gradle-plugin:1.5.6.RELEASE')
        classpath('mysql:mysql-connector-java:5.1.34')
    }
}
apply plugin: 'java'
apply plugin: 'spring-boot'

sourceCompatibility = 1.8
targetCompatibility = 1.8

repositories {
    mavenCentral()
}

dependencies {
    compile(
        'org.springframework.boot:spring-boot-starter-actuator',
        'org.springframework.boot:spring-boot-starter-web',
        'org.springframework.boot:spring-boot-starter-data-jpa'
    )
    compile('mysql:mysql-connector-java')
    testCompile('org.springframework.boot:spring-boot-starter-test')
}
The application.properties is as follows:
spring.jpa.hibernate.ddl-auto=create
spring.datasource.url=jdbc:mysql://localhost:3306/user
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.jpa.database-platform=org.hibernate.dialect.MySQLDialect
spring.jpa.database=MYSQL
spring.jpa.show-sql = true
My Main.java is as follows:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

@SpringBootApplication
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}
I am able to successfully build the application. When I run it, it fails to start with the following stack trace:
2017-08-14 20:43:02.976 WARN 27205 --- [ main] ationConfigEmbeddedWebApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
2017-08-14 20:43:02.978 INFO 27205 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2017-08-14 20:43:03.007 INFO 27205 --- [ main] utoConfigurationReportLoggingInitializer :
Error starting ApplicationContext. To display the auto-configuration report re-run your application with 'debug' enabled.
2017-08-14 20:43:03.198 ERROR 27205 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of constructor in org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration required a bean of type 'javax.sql.DataSource' that could not be found.
- Bean method 'dataSource' not loaded because @ConditionalOnProperty (spring.datasource.jndi-name) did not find property 'jndi-name'
- Bean method 'dataSource' not loaded because @ConditionalOnBean (types: org.springframework.boot.jta.XADataSourceWrapper; SearchStrategy: all) did not find any beans
Action:
Consider revisiting the conditions above or defining a bean of type 'javax.sql.DataSource' in your configuration.
I have checked the dependency tree and can find both Hibernate and the MySQL connector in it.
I have tried removing @EnableAutoConfiguration(exclude={DataSourceAutoConfiguration.class}); in that case I get: Cannot load driver class: com.mysql.jdbc.Driver
You should move the dto package inside your Main.java class's package; in your case that would be src/main/java/package/dto.
That way, when Spring Boot scans, your entity will be visible to the scanner.
Also make sure you have added the MySQL driver dependency:
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.6</version>
</dependency>
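Since the question's build uses Gradle rather than Maven, a rough equivalent in the Groovy DSL would be the following sketch (the explicit 5.1.34 version is borrowed from the buildscript block above; any recent 5.1.x should work):

dependencies {
    // MySQL JDBC driver, pinned explicitly instead of relying on the BOM
    compile 'mysql:mysql-connector-java:5.1.34'
}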

Apollo Graphql Custom Scalar JSON - complains "TypeError: type.getFields is not a function"

I'm working on an apollo-express GraphQL server. I attempted to integrate the 'graphql-type-json' module, following its description of how to integrate it. I've tried many things, but it seems the type isn't being passed to the resolver correctly. I've hit a wall in my debugging and could use a little help. Here is a summary of what I'm doing:
import { makeExecutableSchema } from 'graphql-tools';
const GraphQLJSON = require('graphql-type-json');
//Have also tried import GraphQLJSON from 'graphql-type-json';
const schema = `
scalar JSON

type Device {
  deviceconfig: JSON
}

type Query {
  foo: Foo
}
`;

const resolveFunctions = {
  JSON: GraphQLJSON,
  // JSON: { return GraphQLJSON } stops the error but other issues come up...
  Query: ...
};

const jsSchema = makeExecutableSchema({
  typeDefs: schema,
  resolvers: resolveFunctions,
  resolverValidationOptions: {
    requireResolversForNonScalar: false,
  },
  allowUndefinedInResolve: true,
  printErrors: true,
});
Not sure if it's relevant, but there are a few issues with my npm dependencies:
graphql-type-json#0.1.4
UNMET PEER DEPENDENCY graphql#0.8.2 invalid
├─┬ graphql-tools#0.4.2
│ ├── UNMET PEER DEPENDENCY graphql#^0.5.0 || ^0.6.0
npm ERR! peer dep missing: graphql#^0.6.1 || ^0.7.0, required by apollo-server#0.3.3
npm ERR! peer dep missing: graphql#^0.5.0, required by graphql-tools#0.4.2
npm ERR! extraneous: casual#1.5.8 /home/apollo/node_modules/casual
npm ERR! extraneous: mongoose#4.6.6 /home/apollo/node_modules/mongoose
npm ERR! extraneous: mysql-events#0.0.8 /home/apollo/node_modules/mysql-events
npm ERR! peer dep missing: graphql#^0.5.0 || ^0.6.0, required by express-widgetizer#0.5.11
I resolved the custom scalar JSON like this in my resolvers:
JSON: {
  __serialize(value) {
    return GraphQLJSON.parseValue(value);
  }
}
It worked fine for me; I think it will help you.

JUnit testing Struts 2.x actions

I have a simple Action class which I want to unit test:
package com.gam.action.test;

import org.apache.struts2.StrutsTestCase;

public class HelloWorldActionTest extends StrutsTestCase {
    /**
     * Test method for {@link com.gam.action.HelloWorldAction#execute()}.
     */
    public void testExecute() {
        fail("Not yet implemented");
    }
}
I've created this test case using the JUnit wizard in Eclipse. I get the following error when I run the test:
Class: com.opensymphony.xwork2.spring.SpringObjectFactory
File: SpringObjectFactory.java
Method: getClassInstance
Line: 230 - com/opensymphony/xwork2/spring/SpringObjectFactory.java:230:-1
at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:449)
at org.apache.struts2.util.StrutsTestCaseHelper.initDispatcher(StrutsTestCaseHelper.java:54)
at org.apache.struts2.StrutsTestCase.initDispatcher(StrutsTestCase.java:196)
at org.apache.struts2.StrutsTestCase.setUp(StrutsTestCase.java:182)
at junit.framework.TestCase.runBare(TestCase.java:132)
at junit.framework.TestResult$1.protect(TestResult.java:110)
at junit.framework.TestResult.runProtected(TestResult.java:128)
at junit.framework.TestResult.run(TestResult.java:113)
at junit.framework.TestCase.run(TestCase.java:124)
at junit.framework.TestSuite.runTest(TestSuite.java:243)
at junit.framework.TestSuite.run(TestSuite.java:238)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: java.lang.NullPointerException
at com.opensymphony.xwork2.spring.SpringObjectFactory.getClassInstance(SpringObjectFactory.java:230)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.verifyResultType(XmlConfigurationProvider.java:538)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addResultTypes(XmlConfigurationProvider.java:509)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:465)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:278)
at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:112)
at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:204)
at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:66)
at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:390)
at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:436)
... 17 more
The problem is that I don't know which JAR files should be provided. I'm not using Spring in my project, but if I don't provide the Spring JARs I get a different error, and when I do provide them I get this one.
What combination of JAR files is needed to simply run the test? (As you can see, I've reduced my test method to a dummy stub.)
The struts2-junit-plugin introduces its own dependencies, shown by this Maven output:
[INFO] +- org.apache.struts:struts2-junit-plugin:jar:2.3.1:compile
[INFO] | +- org.springframework:spring-test:jar:3.0.5.RELEASE:compile
[INFO] | +- org.springframework:spring-core:jar:3.0.5.RELEASE:compile
[INFO] | | +- org.springframework:spring-asm:jar:3.0.5.RELEASE:compile
[INFO] | | \- commons-logging:commons-logging:jar:1.1.1:compile
[INFO] | +- org.springframework:spring-context:jar:3.0.5.RELEASE:compile
[INFO] | | +- org.springframework:spring-aop:jar:3.0.5.RELEASE:compile
[INFO] | | | \- aopalliance:aopalliance:jar:1.0:compile
[INFO] | | +- org.springframework:spring-beans:jar:3.0.5.RELEASE:compile
[INFO] | | \- org.springframework:spring-expression:jar:3.0.5.RELEASE:compile
[INFO] | \- junit:junit:jar:4.8.2:compile
It sounds like you're not using Maven, which is almost certainly a Bad Idea. Managing transitive dependencies yourself is not terribly entertaining; save yourself some time and manual labor.
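If you do adopt a build tool, a single test-scoped declaration pulls in the whole tree shown above. A sketch in Gradle's Groovy DSL, using the plugin version from that output:

dependencies {
    // brings in spring-test, spring-core, spring-context, junit, etc. transitively
    testCompile 'org.apache.struts:struts2-junit-plugin:2.3.1'
}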