@RunWith(MockitoJUnitRunner.class) // Class cannot be resolved to a type
// @SpringBootTest
public class MbankingApplicationTest {
@Mock CanmbTransactionDao dataServiceMock;
@Mock CanmbBaseDao baseDao;
@InjectMocks CanmbTransactionServiceImpl businessImpl;
@Autowired OminiController controller;
Customer customer;
@Test
public void test() {
Customer cust= new Customer();
cust.setMbnumber("+919990176197");
cust.setDeviceid("abcdef");
UserProfileMaster profile = new UserProfileMaster();
profile.setChkflag(22);
profile.setStatus("ACTIVE");
cust.setUserProfile(profile);
cust.setMpin("123456");
cust.setMpinsalt("12345");
when(dataServiceMock.getUserMbAndDevice("+919990176197", "abcdef")).thenReturn(cust);
this.customer = cust;
assertEquals(cust, businessImpl.getUserMbAndDevice("+919990176197", "abcdef"));
}
@Test
public void testMpin() {
Customer cust = new Customer();
cust.setMbnumber("+919990176197");
cust.setDeviceid("abcdef");
UserProfileMaster profile = new UserProfileMaster();
profile.setChkflag(22);
profile.setStatus("ACTIVE");
cust.setUserProfile(profile);
cust.setMpin("d150cb2c64171a95eb3fa1bbf2ea786aef16b04d389a1ac67a52c75e95f61e66");
cust.setMpinsalt("12345");
when(dataServiceMock.getUserMbAndDevice("+919990176197", "abcdef")).thenReturn(cust);
//assertEquals(cust, businessImpl.getUserMbAndDevice("+919990176197", "abcdef"));
MBSOMNIIntegration reqData = new MBSOMNIIntegration();
reqData.setMbnumber("+919990176197");
reqData.setDeviceid("abcdef");
reqData.setMpin("123456");
OMNIIntegration omni=new OMNIIntegration();
// businessImpl.validateOmniMpin(reqData, omni, "123");
ResponseData data= new ResponseData();
Map<String,String> ominiMap= new HashMap<>();
ominiMap.put("Msg", "verified");
ominiMap.put("statusCode", "0");
data.setStatusCode(0);
data.setTid("");
data.setData(ominiMap);
when(businessImpl.validateOmniMpin(reqData, omni, "123")).thenReturn(data);
assertEquals(data,businessImpl.validateOmniMpin(reqData, omni, "123"));
}
}
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.835 sec <<< FAILURE! - in com.npst.mb.MbankingApplicationTest
testMpin(com.npst.mb.MbankingApplicationTest)  Time elapsed: 0.823 sec  <<< FAILURE!
java.lang.AssertionError: expected: com.npst.mb.pojo.ResponseData but was: com.npst.mb.pojo.ResponseData
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:120)
at org.junit.Assert.assertEquals(Assert.java:146)
at com.npst.mb.MbankingApplicationTest.testMpin(MbankingApplicationTest.java:93)
Results:
Failed tests: MbankingApplicationTest.testMpin:93 expected: com.npst.mb.pojo.ResponseData but was: com.npst.mb.pojo.ResponseData
Tests run: 2, Failures: 1, Errors: 0, Skipped: 0
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.493 s
[INFO] Finished at: 2019-01-17T14:57:43+05:30
[INFO] Final Memory: 36M/346M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project CapexMbankingPhase2: There are test failures.
[ERROR]
[ERROR] Please refer to /home/npstx/raj/Canara Projects/MBS_APP_OMNI/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
I have to migrate millions of BLOB records from multiple MySQL databases to a physical location as files over a WAN network.
I chose Spring Batch and have already made it work. However, I am struggling with a timeout error that happens on random partitioned steps.
Here is some context:
There are multiple MySQL databases storing >10M records accumulated over 20 years.
The source tables index two composite keys of varchar datatype (there is no ID key), so I have to use an un-indexed datetime column to partition the records by year and week, keeping the number of records per partition at a reasonable average of about 200. If there is any better advice, it would be welcome!
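For reference, the year/week split described above can be expressed as a Spring Batch Partitioner roughly like this (a minimal sketch of what managerStep() below wires in as dateRangePartitioner(); the class name, date range, and context keys are assumptions, not the actual code):
import java.time.LocalDate;
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// Hypothetical week-by-week partitioner; each partition carries a [fromDate, toDate) window
// that the reader's WHERE clause can apply to the un-indexed datetime column.
public class DateRangePartitioner implements Partitioner {

    private final LocalDate start = LocalDate.of(2002, 1, 1); // assumed start of the archive
    private final LocalDate end = LocalDate.now();

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        LocalDate cursor = start;
        int index = 0;
        while (cursor.isBefore(end)) {
            LocalDate next = cursor.plusWeeks(1);
            ExecutionContext ctx = new ExecutionContext();
            ctx.putString("fromDate", cursor.toString());
            ctx.putString("toDate", next.toString());
            partitions.put("week-" + index++, ctx);
            cursor = next;
        }
        return partitions;
    }
}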
My issue: when the number of records per partition is high enough, the step executors randomly fail due to a timeout:
Could not open JDBC Connection for transaction; nested exception is java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30000ms
I have done some tweaks to the DataSource properties and transaction properties, but no luck. Can I get some advice, please? Thanks.
Terminal log:
org.springframework.transaction.CannotCreateTransactionException: Could not open JDBC Connection for transaction; nested exception is
java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30000ms.
at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:309)
~[spring-jdbc-5.3.16.jar:5.3.16]
...
Caused by: java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30000ms.
2022-03-05 10:05:43.146 ERROR 15624 --- [main] o.s.batch.core.step.AbstractStep : Encountered an error executing step managerStep in job mainJob
org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step at ...
The job is marked as [FAILED] or [UNKNOWN] sometimes, and not restartable.
org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:112) ~[spring-batch-core-4.3.5.jar:4.3.5]
2022-03-05 10:05:43.213 INFO 15624 --- [main] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=mainJob]] completed with the following parameters: [{run.id=20}] and the following status: [FAILED] in 3m13s783ms
2022-03-05 10:05:43.590 INFO 15624 --- [SpringApplicationShutdownHook] com.zaxxer.hikari.HikariDataSource : HikariPool-2 - Shutdown initiated...
2022-03-05 10:05:43.624 INFO 15624 --- [SpringApplicationShutdownHook] com.zaxxer.hikari.HikariDataSource : HikariPool-2 - Shutdown completed.
2022-03-05 10:05:43.626 INFO 15624 --- [SpringApplicationShutdownHook] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown initiated...
2022-03-05 10:05:43.637 INFO 15624 --- [SpringApplicationShutdownHook] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown completed.
DataSource builder: I have tried to increase the connection timeout and pool size, but the settings do not seem to be applied.
@Bean(name = "srcDataSource")
// @ConfigurationProperties(prefix = "spring.datasource.hikari")
public HikariDataSource dataSource() {
HikariDataSource hikariDS = new HikariDataSource();
hikariDS.setDriverClassName("com.mysql.jdbc.Driver");
hikariDS.setJdbcUrl("jdbc:mysql://dburl");
hikariDS.setUsername("dbuser");
hikariDS.setPassword("dbpwd");
// the properties below do not solve the problem
hikariDS.setMaximumPoolSize(16);
hikariDS.setConnectionTimeout(30000);
// hikariDS.addDataSourceProperty("serverName",
// getConfig().getString("mysql.host"));
// hikariDS.addDataSourceProperty("port", getConfig().getString("mysql.port"));
// hikariDS.addDataSourceProperty("databaseName",
// getConfig().getString("mysql.database"));
// hikariDS.addDataSourceProperty("user", getConfig().getString("mysql.user"));
// hikariDS.addDataSourceProperty("password",
// getConfig().getString("mysql.password"));
// hikariDS.addDataSourceProperty("autoReconnect", true);
// hikariDS.addDataSourceProperty("cachePrepStmts", true);
// hikariDS.addDataSourceProperty("prepStmtCacheSize", 250);
// hikariDS.addDataSourceProperty("prepStmtCacheSqlLimit", 2048);
// hikariDS.addDataSourceProperty("useServerPrepStmts", true);
// hikariDS.addDataSourceProperty("cacheResultSetMetadata", true);
return hikariDS;
}
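If the programmatic settings really are not taking effect, a sketch of the alternative hinted at by the commented-out annotation above is to let Spring Boot bind the spring.datasource.hikari.* properties itself (the DataSourceProperties injection and the property prefix are assumptions about this setup, not a confirmed fix):
// Sketch: bind pool settings from spring.datasource.hikari.* instead of hard-coding them.
// Assumes org.springframework.boot.autoconfigure.jdbc.DataSourceProperties is on the classpath
// and that the connection URL/credentials come from spring.datasource.*.
@Bean(name = "srcDataSource")
@ConfigurationProperties(prefix = "spring.datasource.hikari")
public HikariDataSource dataSource(DataSourceProperties properties) {
    return properties.initializeDataSourceBuilder()
            .type(HikariDataSource.class)
            .build();
}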
ManagerStep:
@Bean
public Step managerStep() {
return stepBuilderFactory.get("managerStep")
.partitioner(workerStep().getName(), dateRangePartitioner())
.step(workerStep())
// .gridSize(52) // number of workers; not necessary with the date partitioner
.taskExecutor(new SimpleAsyncTaskExecutor())
.build();
}
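One variant worth experimenting with (a sketch, not a confirmed fix): SimpleAsyncTaskExecutor spawns a new thread per partition with no limit by default, so all partitions compete for the same 16-connection pool at once; throttling the concurrency keeps the workers below the pool size. The limit of 8 is an assumed value:
@Bean
public Step managerStep() {
    // Sketch: cap how many worker partitions run concurrently against the shared pool.
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    taskExecutor.setConcurrencyLimit(8); // assumed value; keep it below maximumPoolSize
    return stepBuilderFactory.get("managerStep")
            .partitioner(workerStep().getName(), dateRangePartitioner())
            .step(workerStep())
            .taskExecutor(taskExecutor)
            .build();
}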
WorkerStep: I also tried to increase the transaction timeout, but no luck.
@Bean
public Step workerStep() {
DefaultTransactionAttribute attribute = new DefaultTransactionAttribute();
attribute.setPropagationBehavior(Propagation.REQUIRED.value());
attribute.setIsolationLevel(Isolation.DEFAULT.value());
// attribute.setTimeout(30);
attribute.setTimeout(1000000);
return stepBuilderFactory.get("workerStep")
.<Image, Image>chunk(10)
.reader(jdbcPagingReader(null))
.processor(new ImageItemProcessor())
.writer(imageConverter())
// .listener(wrkrStepExecutionListener)
.transactionAttribute(attribute)
.build();
}
Job builder:
@Bean
public Job mainJob() {
return jobBuilderFactory.get("mainJob")
// .incrementer(new RunIdIncrementer())
.start(managerStep())
// .listener()
.build();
}
The top module is as follows:
class PE (DataWidth: Int, NumLinks: Int, NumEntries: Int, FifoDepth: Int) extends Module {
val io = IO(new Bundle {
...
})
...
}
I think this is ordinary style for Chisel3.
I run sbt as follows to lint the code:
sbt 'test:runMain noc.PEMain'
Then I get the error messages below:
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Running noc.NoCMain
[info] [0.002] Elaborating design...
[error] (run-main-0) chisel3.internal.ChiselException: Error: attempted to instantiate a Module without wrapping it in Module().
[error] chisel3.internal.ChiselException: Error: attempted to instantiate a Module without wrapping it in Module().
[error] at chisel3.internal.throwException$.apply(Error.scala:13)
[error] at chisel3.core.BaseModule.<init>(Module.scala:90)
[error] at chisel3.core.UserModule.<init>(UserModule.scala:18)
[error] at chisel3.core.ImplicitModule.<init>(UserModule.scala:102)
[error] at chisel3.core.LegacyModule.<init>(UserModule.scala:127)
[error] at noc.NumGen.<init>(NoC.scala:328)
[error] at noc.FanIn_Link.<init>(NoC.scala:376)
[error] at noc.PE$$anonfun$12.apply(NoC.scala:490)
[error] at noc.PE$$anonfun$12.apply(NoC.scala:490)
[error] at chisel3.core.Module$.do_apply(Module.scala:49)
[error] at noc.PE.<init>(NoC.scala:490)
[error] at noc.NoCMain$$anonfun$1.apply(NoCMain.scala:27)
[error] at noc.NoCMain$$anonfun$1.apply(NoCMain.scala:27)
...
[error] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:297)
[error] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:295)
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at chisel3.internal.Builder$.build(Builder.scala:295)
[error] at chisel3.Driver$.elaborate(Driver.scala:93)
[error] at chisel3.Driver$.execute(Driver.scala:140)
[error] at chisel3.iotesters.setupTreadleBackend$.apply(TreadleBackend.scala:139)
...
[error] at logger.Logger$$anonfun$makeScope$1.apply(Logger.scala:138)
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at logger.Logger$.makeScope(Logger.scala:136)
...
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at chisel3.iotesters.Driver$.execute(Driver.scala:38)
[error] at chisel3.iotesters.Driver$.execute(Driver.scala:100)
[error] at noc.NoCMain$.delayedEndpoint$noc$NoCMain$1(NoCMain.scala:27)
[error] at noc.NoCMain$delayedInit$body.apply(NoCMain.scala:26)
[error] at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
...
[error] at scala.App$class.main(App.scala:76)
[error] at noc.NoCMain$.main(NoCMain.scala:26)
[error] at noc.NoCMain.main(NoCMain.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
[error] at java.lang.Thread.run(Thread.java:745)
And lastly this error:
[error] (Test / runMain) Nonzero exit code: 1
I also see this warning:
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
Running sbt 'show discoveredMainClasses' shows:
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /Users/hoge/Desktop/NoC/project
[info] Loading settings from build.sbt ...
[info] Set current project to en-noc (in build file:/Users/hoge/Desktop/NoC/)
[info] *
[success] Total time: 1 s, completed 2019/10/22 2:08:49
What does this error message mean and how can I fix it?
Running sbt 'testOnly noc.PETester' produced:
[info] at chisel3.core.LegacyModule.<init>(UserModule.scala:127)
This is caused at:
val io = IO(new Bundle {
val No = Output(Vec(NumLinks, UInt((log2Ceil(NumLinks)).W)))
})
It seems that the main problem is:
attempted to instantiate a Module without wrapping it in Module()
This can arise when you create a new instance of a class that extends Module but do not wrap it in Module().
For example, in your code you may be doing something like:
val test = new module_class
whereas you should be doing:
val test = Module(new module_class)
I am facing the issue below when trying to create and run a large number of JUnit 5 dynamic tests using maven-surefire-plugin 2.21.0:
[ERROR] GC overhead limit exceeded
[ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException: There was an error in the forked process
[ERROR] GC overhead limit exceeded
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:673)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:535)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:280)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:245)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1124)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:954)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:832)
[ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:154)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:146)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
[ERROR] at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
Below is my code. It is a DB comparison tool: basically I stream data from two databases and compare the records, failing a test if any two records differ, so there will be as many dynamic tests as there are records in the DB (~14M). That is pretty large, and I am not sure dynamic tests were meant to be used at this scale. Any help in this regard is greatly appreciated :)
@TestFactory
Stream<DynamicNode> dynamicTestsWithContainers() throws SQLException {
return tableNameProvider()//List of db tables ~100
.map(tableName -> dynamicContainer(tableName, dynamicNodeStream(tableName)));
}
private Stream<DynamicNode> dynamicNodeStream(String tableName) {
try {
System.out.println("Testing " + tableName);
Stream<Row> rows = sourceRepo.rows(tableName);
Stream<List<Row>> batchRows = batch(rows, 10000);
Optional<TableSchema> tableInfo = sourceRepo.getTableSchema(tableName);
final Stream<DynamicNode> dynamicNodeStream = batchRows
.flatMap(batch -> {
Map<String, Row> sourceRowsMap = buildRowMap(batch, tableInfo.get());//HashMap with 10000 Objects
Map<String, Row> targetRowsMap = targetRepo.getTargetDBRows(sourceRowsMap, tableInfo.get());//HashMap with 10000 Objects
Set<String> commonKeys = Sets.intersection(sourceRowsMap.keySet(), targetRowsMap.keySet());
final Stream<DynamicTest> dynamicTestStream = Streams.concat(
Stream.of(
dynamicTest("All source records should be present in target DB", () -> assertThat(targetRowsMap.keySet())
.as("Comparing " + sourceRepo.getServerName() + " against " + targetRepo.getServerName())
.hasSameElementsAs(sourceRowsMap.keySet()))
),
commonKeys
.stream()
.map(rowKey -> dynamicTest(tableName + " Row with #" + rowKey + " should be same in target DB",
() -> assertThat(targetRowsMap.get(rowKey).getRowData())
.isEqualToComparingFieldByFieldRecursively(sourceRowsMap.get(rowKey).getRowData())
))
);
return dynamicTestStream;
});
return dynamicNodeStream;
} catch (Exception e) {
throw new RuntimeException("Error running tests on table " + tableName, e);
}
}
Try:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>-XX:+UseConcMarkSweepGC</argLine>
</configuration>
</plugin>
in your build profile to allow concurrent garbage collection.
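If the forked JVM still hits the GC overhead limit after that, the same argLine is also the usual place to give it more heap (for instance an -Xmx setting sized to the machine; any specific number here would only be an assumption).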
I have been following tutorials on the Play Framework, but it seems that they are all outdated and written for older versions of Play.
I want to try a JUnit test using a MySQL database, not the H2 in-memory database.
I am using the Ebean ORM (which clearly has a different API than it used to have in Play 2.2, and the API is not really well documented).
Anyway, I want to run a JUnit test against the MySQL database, but I always get a configuration error.
This is how the JUnit test class looks:
public class ModelsTest extends WithApplication {
public Application app;
@Before
public void setUp() throws FileNotFoundException, IOException {
java.util.Properties externalProps=new java.util.Properties();
externalProps.load(new FileInputStream("resources/test-ebean.properties"));
ServerConfig config = new ServerConfig();
config.setName("test");
config.setDefaultServer(true);
config.loadFromProperties(externalProps);
EbeanServer server = EbeanServerFactory.create(config);
app = Helpers.fakeApplication();
Helpers.start(app);
}
@Test
public void createAndRetrieveUser() {
new User("bob#bob.bob", "admin", "admin").save();
User bob = User.find.where().eq("email", "bob#bob.bob").findUnique();
assertNotNull(bob);
assertEquals("admin", bob.login);
}
@After
public void stopApp() {
Helpers.stop(app);
}
}
The test-ebean.properties file:
ebean.ddl.generate=true
ebean.ddl.run=true
datasource.default=db
datasource.db.username="root"
datasource.db.password="root"
datasource.db.databaseUrl="jdbc:mysql://localhost:3306/test?characterEncoding=UTF-8"
datasource.db.databaseDriver=com.mysql.jdbc.Driver
When running the test I get this error:
[error] Test ModelsTest.createAndRetrieveUser failed: play.api.Configuration$$anon$1: Configuration error[null], took 4.969 sec
[error] at play.api.Configuration$.configError(Configuration.scala:154)
[error] at play.api.Configuration.reportError(Configuration.scala:806)
[error] at play.Configuration.reportError(Configuration.java:366)
[error] at play.db.ebean.DefaultEbeanConfig$EbeanConfigParser.parse(DefaultEbeanConfig.java:81)
[error] at play.db.ebean.DefaultEbeanConfig$EbeanConfigParser.get(DefaultEbeanConfig.java:60)
[error] at play.db.ebean.DefaultEbeanConfig$EbeanConfigParser.get(DefaultEbeanConfig.java:44)
[error] at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:81)
[error] at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:72)
[error] at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:61)
[error] at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:62)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
[error] at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
[error] at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:145)
[error] at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
[error] at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38)
[error] at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62)
[error] at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:104)
[error] at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
[error] at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:267)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
[error] at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
[error] at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:145)
[error] at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
[error] at com.google.inject.internal.FactoryProxy.get(FactoryProxy.java:56)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
[error] at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
[error] at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
[error] at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:145)
[error] at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
[error] at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:205)
[error] at com.google.inject.internal.InternalInjectorCreator$1.call(InternalInjectorCreator.java:199)
[error] at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
[error] at com.google.inject.internal.InternalInjectorCreator.loadEagerSingletons(InternalInjectorCreator.java:199)
[error] at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:180)
[error] at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:110)
[error] at com.google.inject.Guice.createInjector(Guice.java:96)
[error] at com.google.inject.Guice.createInjector(Guice.java:84)
[error] at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala:181)
[error] at play.api.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.scala:123)
[error] at play.api.test.FakeApplication.<init>(Fakes.scala:209)
[error] at play.test.FakeApplication.<init>(FakeApplication.java:51)
[error] at play.test.Helpers.fakeApplication(Helpers.java:124)
[error] at play.test.WithApplication.provideFakeApplication(WithApplication.java:46)
[error] at play.test.WithApplication.provideApplication(WithApplication.java:33)
[error] at play.test.WithApplication.startPlay(WithApplication.java:51)
[error] ...
[error] Caused by: java.lang.NullPointerException
[error] at play.db.ebean.DefaultEbeanConfig$EbeanConfigParser.parse(DefaultEbeanConfig.java:79)
[error] ... 78 more
[error] Test ModelsTest.createAndRetrieveUser failed: java.lang.NullPointerException: null, took 4.979 sec
[error] at play.test.Helpers.stop(Helpers.java:376)
[error] at ModelsTest.stopApp(ModelsTest.java:58)
[error] ...
[error] Failed: Total 1, Failed 1, Errors 0, Passed 0
[error] Failed tests:
[error] ModelsTest
[error] (test:testOnly) sbt.TestsFailedException: Tests unsuccessful
I just started learning Play (but actually most of the tutorials are outdated) and I have spent more time trying to configure it to run than actually coding. I guess I should look into another framework.
You don't need to set up a whole database just for testing (you are free to, of course). Play relies strongly on in-memory databases (e.g., during development), and you can use this in your tests as well:
@Test
public void findById() {
running(fakeApplication(inMemoryDatabase("test")), () -> {
User bob = User.findById(21L);
assertEquals("bob@bob.bob", bob.email);
assertEquals("admin", bob.login);
});
}
On the other hand, if you really want to test the database access code, you can go as far as creating a Database test object:
Database database = Databases.createFrom(
"com.mysql.jdbc.Driver",
"jdbc:mysql://localhost/test"
);
Which again can be in-memory:
Database database = Databases.inMemory(
"mydatabase",
ImmutableMap.of(
"MODE", "MYSQL"
),
ImmutableMap.of(
"logStatements", true
)
);
Just don't forget to release the resources after the test:
@After
public void shutdownDatabase() {
database.shutdown();
}
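And if the goal is still to run the test against the real MySQL instance, one sketch (using Play's standard db.default.* configuration keys; the values are placeholders copied from the properties file above) is to hand the settings straight to the fake application instead of loading a separate properties file:
@Before
public void setUp() {
    // Sketch: point the fake application at MySQL via configuration overrides.
    // Assumes java.util.Map / java.util.HashMap are imported.
    Map<String, Object> config = new HashMap<>();
    config.put("db.default.driver", "com.mysql.jdbc.Driver");
    config.put("db.default.url", "jdbc:mysql://localhost:3306/test?characterEncoding=UTF-8");
    config.put("db.default.username", "root");
    config.put("db.default.password", "root");
    app = Helpers.fakeApplication(config);
    Helpers.start(app);
}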
I have jaxb2-maven-plugin version 1.5 and axistools-maven-plugin version 1.4. When I run Maven generate-sources I get an error. Any help would be really appreciated.
[INFO]
[INFO] --- jaxb2-maven-plugin:1.5:xjc (default) @ TouchStoneCore ---
[INFO] Generating source...
[INFO] parsing a schema...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.456s
[INFO] Finished at: Wed Jun 26 15:40:58 PDT 2013
[INFO] Final Memory: 28M/351M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:jaxb2-maven-plugin:1.5:xjc (default) on project TouchStoneCore: Execution default of goal org.codehaus.mojo:jaxb2-maven-plugin:1.5:xjc failed: An API incompatibility was encountered while executing org.codehaus.mojo:jaxb2-maven-plugin:1.5:xjc: java.lang.NoSuchMethodError: org.apache.xerces.impl.xs.XMLSchemaLoader.loadGrammar([Lorg/apache/xerces/xni/parser/XMLInputSource;)V
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.codehaus.mojo:jaxb2-maven-plugin:1.5
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/C:/M2/repository/org/codehaus/mojo/jaxb2-maven-plugin/1.5/jaxb2-maven-plugin-1.5.jar
[ERROR] urls[1] = file:/C:/M2/repository/org/codehaus/plexus/plexus-interpolation/1.1/plexus-interpolation-1.1.jar
[ERROR] urls[2] = file:/C:/M2/repository/com/sun/xml/bind/jaxb-xjc/2.1.13/jaxb-xjc-2.1.13.jar
[ERROR] urls[3] = file:/C:/M2/repository/com/sun/xml/bind/jaxb-impl/2.1.13/jaxb-impl-2.1.13.jar
[ERROR] urls[4] = file:/C:/M2/repository/javax/xml/bind/jaxb-api/2.1/jaxb-api-2.1.jar
[ERROR] urls[5] = file:/C:/M2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
[ERROR] urls[6] = file:/C:/M2/repository/javax/activation/activation/1.1/activation-1.1.jar
[ERROR] urls[7] = file:/C:/M2/repository/org/codehaus/plexus/plexus-compiler-api/1.9.1/plexus-compiler-api-1.9.1.jar
[ERROR] urls[8] = file:/C:/M2/repository/org/codehaus/plexus/plexus-utils/3.0.4/plexus-utils-3.0.4.jar
[ERROR] urls[9] = file:/C:/M2/repository/org/sonatype/plexus/plexus-build-api/0.0.7/plexus-build-api-0.0.7.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
Try the snippet below in your plugin configuration:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxb2-maven-plugin</artifactId>
<version>1.6</version>
</plugin>
For Java 8 you can try higher versions.
This is an issue with versioning. Try using a different version of jaxb2-maven-plugin until you get this to pass.
I use Linux and resolved this issue by setting JAVA_HOME:
export JAVA_HOME=/usr/lib/jvm/{java version}
Example:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64