I need to write unit tests for a Java application using JUnit.
I thought of using an embedded database like H2, but the challenge I am facing is that I have multiple test classes, and I planned to write a SQL script file that creates the tables and initializes the data for the test DB.
But how do I ensure it runs only once, before any test case?
With multiple test classes, I cannot run these scripts in a @BeforeClass method in every test class.
There should be some other way. My application uses an Oracle DB, but for testing I thought of using H2.
Any suggestions would be really appreciated!
1) Like @BeforeClass, which runs once per test class before any of its tests, there is @Before, which runs before every single test.
2) If you are unit testing a specific class or piece of code, it is better practice not to work with a real DB, but with a mock that you inject into the code under test and whose behaviour you stub. A popular mocking framework is Mockito.
So you can do things like this:
// Stub the repository's behaviour instead of querying a real database:
when(carsRepository.findById(id)).thenReturn(null);
when(carsRepository.findAllByCustomerName(customerName)).thenReturn(someCars);
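Putting both points together, a complete test might look like the following sketch (CarsRepository and CarsService are hypothetical stand-ins for your own repository and the class under test):

    import static org.junit.Assert.assertNull;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import org.junit.Before;
    import org.junit.Test;

    // Hypothetical names: CarsRepository is the dependency being mocked,
    // CarsService is the class under test.
    public class CarsServiceTest {

        private CarsRepository carsRepository;
        private CarsService carsService;

        @Before
        public void setUp() {
            // Runs before every test: fresh mock, fresh object under test.
            carsRepository = mock(CarsRepository.class);
            carsService = new CarsService(carsRepository);
        }

        @Test
        public void returnsNullForUnknownCar() {
            // Stub the repository instead of touching a real database.
            when(carsRepository.findById(42L)).thenReturn(null);

            assertNull(carsService.findById(42L));
        }
    }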
I'm writing integration tests and I need to clean up the database after one test suite (ordered via @TestMethodOrder) is finished, whether it ends successfully or with a failure.
Obviously the first thing that came to mind was to use a method with the @AfterAll annotation, but it needs to be static. I'm using an @Autowired JdbcTemplate, which cannot be static. After a few searches I found out that no database connection should be static.
Is there any good solution for integration testing, optionally with suites?
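For what it's worth, JUnit 5 lets a test class opt into a per-class lifecycle, in which case @AfterAll no longer has to be static and can reach the autowired JdbcTemplate. A minimal sketch, assuming JUnit Jupiter and Spring Boot test support; the class and table names are hypothetical:

    import org.junit.jupiter.api.AfterAll;
    import org.junit.jupiter.api.TestInstance;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.jdbc.core.JdbcTemplate;

    // Hypothetical integration test class; the PER_CLASS lifecycle allows a
    // non-static @AfterAll, so the injected JdbcTemplate is visible to it.
    @SpringBootTest
    @TestInstance(TestInstance.Lifecycle.PER_CLASS)
    class OrderFlowIT {

        @Autowired
        private JdbcTemplate jdbcTemplate;

        @AfterAll
        void cleanUpDatabase() {
            // Runs once after all tests in this class, whether they passed or failed.
            // Table name is illustrative.
            jdbcTemplate.update("DELETE FROM orders");
        }

        // ... ordered @Test methods go here ...
    }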
I want to write an integration test that uses MySQL to test my queries. How do I do this in Go?
This breaks down into a few questions:
How do I set up a MySQL (in-memory?) server in a Go test?
How do I clean/recreate the data model before/after each test so that tests do not leave garbage behind?
How do I tear down MySQL after all the tests are done?
If you really want an embedded MySQL, you can use Go's C bindings (cgo) to integrate with libmysqld: https://dev.mysql.com/doc/refman/5.1/en/libmysqld.html. I haven't seen any project packaging these bindings up in a nice Go package; that would be an interesting small project.
Otherwise you can use Docker to set up a throwaway MySQL server, though this requires some setup/teardown steps before you run go test. This is what we do where I work.
In both cases, you will need to write setup/teardown helpers that create and drop tables as needed for your tests. These are just normal SQL statements: DROP DATABASE, CREATE TABLE, etc.
Testify (https://github.com/stretchr/testify) has tooling for setup/teardown, but just writing a helper function for this works fine too.
I am collecting data and storing it in a MySQL database using Java. Additionally, I use Maven for building the project, TestNG as the test framework, and Spring-Jdbc for accessing the database. I've implemented a DAO layer that encapsulates the access to the database. Besides adding data using the DAO classes, I want to execute some queries which aggregate the data and store the results in some other tables (like materialized views).
Now, I would like to write some test cases which check whether the DAO classes are working as they should. Therefore, I thought of using an in-memory database populated with some test data. Since I am also using MySQL-specific SQL queries for aggregating data, I ran into some trouble:
First, I thought of simply using the embedded-database functionality provided by Spring-Jdbc to instantiate an embedded database; I decided to use the H2 implementation. There I ran into trouble because of the aggregation queries, which use MySQL-specific constructs (e.g. time-manipulation functions like DATE()). Another disadvantage of this approach is that I need to maintain two DDL files: the actual DDL file defining the tables in MySQL (where I define the encoding and add comments to tables and columns, both MySQL-specific features), and a test DDL file that defines the same tables but without comments etc., since H2 does not support comments.
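(For reference, the Spring-Jdbc embedded-database setup mentioned above typically boils down to a builder call like the following sketch; the class and script names are illustrative.)

    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabase;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

    public class TestDatabaseFactory {

        // Builds an in-memory H2 database and runs the given classpath scripts
        // once when it is created; "schema-h2.sql" and "test-data.sql" are
        // illustrative file names.
        public static EmbeddedDatabase createTestDatabase() {
            return new EmbeddedDatabaseBuilder()
                    .setType(EmbeddedDatabaseType.H2)
                    .addScript("schema-h2.sql")
                    .addScript("test-data.sql")
                    .build();
        }
    }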
I've also found a description of using MySQL as an embedded database within test cases (http://literatitech.blogspot.de/2011/04/embedded-mysql-server-for-junit-testing.html). That sounded really promising to me. Unfortunately, it didn't work: a MissingResourceException occurred, "Resource '5-0-21/Linux-amd64/mysqld' not found". It seems that the driver is not able to find the database daemon on my local machine, but I don't know what to look for to solve that issue.
Now I am a little bit stuck, and I am wondering if I should have designed the architecture differently. Does someone have tips on how I should set up an appropriate system? I have two other options in mind:
Instead of using an embedded database, I'll go with a native MySQL instance and set up a database that is only used for the test cases. This option sounds slow, though. Actually, I might want to set up a CI server later on, and I thought that using an embedded database would be more appropriate since the tests run faster.
I erase all the MySQL-specific stuff from the SQL queries and use H2 as an embedded database for testing. If this option is the right choice, I would need to find another way to test the SQL queries that aggregate the data into materialized views.
Or is there a third option that I haven't thought of?
I would appreciate any hints.
Thanks,
XComp
I've created a Maven plugin exactly for this purpose: jcabi-mysql-maven-plugin. It starts a local MySQL server in the pre-integration-test phase and shuts it down in post-integration-test.
If it is not possible to get the in-memory MySQL database to work, I suggest using the H2 database for the "simple" tests and a dedicated MySQL instance to test the MySQL-specific queries.
Additionally, the tests against the real MySQL database can be configured as integration tests in a separate Maven profile so that they are not part of the regular Maven build. On the CI server you can create an additional job that runs the MySQL tests periodically, e.g. daily or every few hours. With such a setup you can keep and test your product-specific queries while your regular build does not slow down. You can also run a normal build even if the test database is not available.
There is a nice Maven plugin for integration tests called maven-failsafe-plugin. It provides pre- and post-integration-test steps that can be used to set up the test data before the tests and to clean up the database afterwards.
I am writing an MVC3 application using Windsor as the IoC container. I am using Cassini-dev and WatiN in the acceptance tests and have a number of basic tests which work fine.
What I normally do in my acceptance testing is fire up a new database with a unique name, populate it with some data, run the test and then throw the database away.
In order to do this I need to provide my MVC3 application with the new database connection string, which is wired up to a configuration object passed into Windsor.
Additionally I will need to mock out a couple of components that do not exist in my testing environment and pass those into Windsor instead of the real objects.
If anyone has done this or something similar I would be interested to hear about your experience.
Cassini is merely a web server and ASP.NET host; it has nothing to do with this. But you could use web.config transformations to select different configurations depending on the environment. Here's an example that shows how to change connection strings.
My entire environment (Java, JS, and PHP) is set up with our continuous integration server (Hudson).
But how do I get our database into the mix?
I would like to deploy fresh MySQL databases for unit testing, development, and QA.
And then I'd like to diff development against production and have an update script that would be used for releasing.
I would look at Liquibase (http://www.liquibase.org/). It's an open source, Java-based DB migration tool that can be integrated into your build script and can handle DB diffing. I've used it before to manage DB updates on a project, with a lot of success.
You could write a script in Ant to do all that stuff and execute it during the build.
Perhaps investigate database migration tools such as migrate4j.
Write a script that sets up your test database. Run it from your build tool, whatever that is, before your tests run. I do this manually and it works pretty well; I'm still integrating it into Maven. It shouldn't be too much trouble.
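A minimal sketch of such a setup step in plain Java/JDBC (the URL, credentials, schema, and DDL are all illustrative; normally the DDL would live in a checked-in .sql file):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Drops and recreates the test schema so every test run starts clean.
    public class ResetTestDatabase {

        public static void main(String[] args) throws Exception {
            try (Connection connection = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/?user=test&password=test");
                 Statement statement = connection.createStatement()) {

                statement.executeUpdate("DROP DATABASE IF EXISTS app_test");
                statement.executeUpdate("CREATE DATABASE app_test");
                statement.executeUpdate(
                    "CREATE TABLE app_test.users ("
                    + "id INT PRIMARY KEY AUTO_INCREMENT, "
                    + "name VARCHAR(100) NOT NULL)");
            }
        }
    }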
Isn't the HyperSQL in-memory DB (http://hsqldb.org/) better for running your tests?
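For illustration, an in-memory HSQLDB needs nothing more than a JDBC URL, assuming the hsqldb driver jar is on the classpath (the database and table names below are arbitrary):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class HsqldbSmokeTest {

        public static void main(String[] args) throws Exception {
            // "testdb" lives purely in memory and disappears when the JVM exits.
            try (Connection connection = DriverManager.getConnection(
                     "jdbc:hsqldb:mem:testdb", "SA", "");
                 Statement statement = connection.createStatement()) {

                statement.executeUpdate(
                    "CREATE TABLE users (id INT PRIMARY KEY, name VARCHAR(100))");
            }
        }
    }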
For managing migrations to your database schema between releases, you could do a lot worse than to use Scala Migrations:
http://opensource.imageworks.com/?p=scalamigrations
It's an open source tool that I've found to integrate well in a Java development ecosystem, and has extra appeal if any of your team have been looking at ways to introduce Scala.
It should also be able to build you a database from scratch, for testing purposes.
Give http://code.google.com/p/mysql-php-migrations/ a try!
Very PHP-oriented, but it seems to work well for most purposes.