JUnit testing with hard coded database queries - junit

Can I use JUnit for unit testing when the function I want to test contains hard coded database queries (these are written in Java).

You can start a real database instance via Docker before running your tests. For example, I use the Gradle Docker Compose Plugin to start a PostgreSQL server before the tests run; the database schema update is then performed automatically by Liquibase during Spring Boot initialization.
Note that you have to make sure the database state is clean, or at least under control, before running every test.
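A rough sketch of that setup (plugin version, service names, and credentials are illustrative assumptions, not from the answer):

```groovy
// build.gradle — hook the Compose lifecycle into the test task
plugins {
    id 'com.avast.gradle.docker-compose' version '0.17.6'
}

dockerCompose {
    useComposeFiles = ['docker-compose.test.yml']
}

// Start the containers before `test` runs and stop them afterwards
dockerCompose.isRequiredBy(test)
```

```yaml
# docker-compose.test.yml — a throwaway PostgreSQL for the test run
services:
  postgres:
    image: postgres:15
    environment:
      POSTGRES_DB: testdb
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
    ports:
      - "5432:5432"
```

With this wiring, `./gradlew test` brings the database up fresh each run, which also helps with the "clean state" requirement.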

Related

Can I run a Docker container to use in another Google Cloud Build step?

I'd like to run a fresh MySQL instance in a Docker Container as a Cloud Build step, and then access that MySQL DB in a later step to run Unit Tests against. Is this possible? It appears as if I can run a Docker Container in a build step, but the step doesn't complete until the Container exits. I'd like this MySQL container to remain running until after the final build step completes.
FWIW I'd like to use this on a Ruby on Rails project to run rspec tests. I currently use a CloudSQL instance to run tests against, but it's pretty slow, even though the same tests run quickly locally. Changing the machine-type for the Cloud Builder to something powerful didn't help, so I assume latency is my biggest killer, which is why I want to try a peer Container MySQL instance instead.
It turns out there are at least 2 ways to skin this cat:
Use the docker-compose cloud builder to spin up multiple containers in 1 step: MySQL and a test runner. The downside here is that the step will never complete on its own, since MySQL runs in the background and never exits. I suppose one could write a wrapper to cause it to die after a few minutes.
You can actually start a container with -d in an early build step and ensure it's on the cloudbuild Docker network, and then later steps can connect to it if they're also on the cloudbuild network. Essentially the MySQL step will "complete" quickly, as it just starts the server in daemon mode and then continues to the next build step. Later, the test runner will run tests against the fresh DB, and its build step completes when the tests are actually done.
I went with option 2, and my 16-min unit tests (run against CloudSQL in same region) shrunk down to 1.5mins using the dockerized MySQL server.
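Option 2 might look roughly like this in cloudbuild.yaml (image tags, the container name `testdb`, and credentials are placeholders, and in practice you'd poll until the server accepts connections instead of connecting immediately):

```yaml
steps:
  # Start MySQL detached; all build steps share the "cloudbuild" Docker network
  - name: 'gcr.io/cloud-builders/docker'
    args: ['run', '-d', '--network=cloudbuild', '--name=testdb',
           '-e', 'MYSQL_ROOT_PASSWORD=secret',
           '-e', 'MYSQL_DATABASE=app_test',
           'mysql:5.7']
  # This step returns as soon as the daemonized container is up;
  # later steps reach the database at host "testdb"
  - name: 'ruby:2.7'
    entrypoint: 'bash'
    args: ['-c', 'bundle install && bundle exec rspec']
    env: ['DATABASE_HOST=testdb']
```

The key detail is `--network=cloudbuild`: because every build step's container is attached to that network, the test-runner step can resolve the MySQL container by its `--name`.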
AFAIK, you can't do this. Each step is independent and you can't run a background step.
My solution here is to run MySQL as a background process in the same step as your unit tests. Quite boring (because you have to install and run MySQL in your step), but I don't have a better solution.
For easier reuse, you can create your own custom builder for Cloud Build.
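That single-step variant could be sketched like this (the image, wait strategy, and test script are illustrative; a loop polling `mysqladmin ping` would be more robust than a fixed sleep):

```yaml
steps:
  - name: 'mysql:5.7'
    entrypoint: 'bash'
    args:
      - '-c'
      - |
        # Start the server in the background inside this step's container
        docker-entrypoint.sh mysqld &
        # Crude wait for the server to come up
        sleep 20
        # Run the tests in the foreground; the step ends when they finish
        ./run_unit_tests.sh
```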

SpringBoot Redis integration testing

I have an application with MySQL and SpringBoot.
Redis is being used for caching with spring cache annotations.
Now, starting up Redis etc is not an issue as I am using Docker compose to dynamically allocate containers for testing.
But, what is the proper way of verifying that the data is actually being written and read from Redis cache not from Mysql?
You could prevent the client code from communicating with the MySQL database during the part of the integration test where you want to ensure that only Redis is used.
You don't specify exactly how you communicate with the MySQL database, so I cannot give you specific advice.
But here are some ideas:
rely on a MySQL backend service implementation that throws an exception as soon as any method is invoked
shutdown the MySQL database
use an empty MySQL database
To test it, I create the record from the rest endpoint, then remove it from the db using the Spring repository directly. Since it is cached, it should still be accessible via the rest endpoint (served from the cache) even though it is no longer present in the db.
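The logic of that check, stripped of the Spring and Redis wiring, is a cache-aside read: if the record survives deletion from the backing store, it must have come from the cache. A minimal self-contained stand-in sketch (plain maps play the roles of MySQL and Redis; all names are made up):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Simplified stand-in for a service backed by a DB with a cache in front
class RecordService {
    final Map<Long, String> db = new HashMap<>();    // plays the MySQL role
    final Map<Long, String> cache = new HashMap<>(); // plays the Redis role

    void create(long id, String value) {
        db.put(id, value);
    }

    // Cache-aside read: consult the cache first, fall back to the DB
    Optional<String> read(long id) {
        if (cache.containsKey(id)) {
            return Optional.of(cache.get(id));
        }
        String v = db.get(id);
        if (v != null) {
            cache.put(id, v); // populate the cache on a miss
        }
        return Optional.ofNullable(v);
    }

    public static void main(String[] args) {
        RecordService svc = new RecordService();
        svc.create(1L, "hello");
        svc.read(1L);      // first read populates the cache
        svc.db.remove(1L); // delete directly from the "DB", bypassing the cache
        // Still readable => it must have come from the cache
        System.out.println(svc.read(1L).orElse("MISS")); // prints "hello"
    }
}
```

In the real integration test, `create` is the REST call, `db.remove` is the direct repository delete, and the final `read` is the second REST call that should hit Redis.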

Arquillian with JBoss - How to run an .sql script at the very end to repopulate the database once all Arquillian tests have finished?

We have a project which uses JUnit test cases and runs with Arquillian on top of a JBoss server. When the tests run, the database tables are emptied and repopulated with the test data.
Is it possible to repopulate the database with default data from a .sql file at the very end of the tests? I could call the .sql file after all tests have finished in a single test class using #AfterClass but this approach is not efficient as the default dataset is too large.
I would appreciate any feedback.
What you mention here is something at the suite level. Currently, I am not aware of a way to do this with APE, which always works at the test class level. In this specific case, I suggest you use a tool like Flyway (with its Maven plugin, for example) to populate everything before the test run phase.
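The Flyway suggestion might look roughly like this in the POM (the version, JDBC URL, and credentials are placeholders):

```xml
<plugin>
  <groupId>org.flywaydb</groupId>
  <artifactId>flyway-maven-plugin</artifactId>
  <version>9.22.3</version>
  <configuration>
    <url>jdbc:mysql://localhost:3306/testdb</url>
    <user>test</user>
    <password>test</password>
  </configuration>
  <executions>
    <execution>
      <!-- Wipe and repopulate BEFORE each test run,
           instead of restoring defaults after the run -->
      <phase>process-test-resources</phase>
      <goals>
        <goal>clean</goal>
        <goal>migrate</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Resetting before the run rather than after it sidesteps the problem that there is no convenient "after all test classes" hook.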

Logging DB, should Dev, Test and Production all write to the same Log DB?

I have done some searching on the internet, and I have learnt that it is good practice to separate the logging DB from the main project DB...
If we have a dedicated logging DB, would it make sense to use the same logging DB for Dev, Test and Prod environments? or each environment requires its own logging DB?
The main reason I thought about using the same server is the cost of the extra DBs... My application is in ASP.NET MVC, my DB is MySQL, and I am using NLog for logging.

How do I spin up a MySQL database in Maven for use with our JUnit tests?

We're using Maven 3.0.3, MySql 5.5, JUnit 4.8.1, and Liquibase. Here's what we would like to do. I would like to spin up a MySQL database, run our liquibase database scripts against it, and then run our JUnit tests, using this newly-created database as the datasource. If someone knows of a way to do this using Maven, know that any sample script you post will get more votes than you know what to do with -- and the respect you so dearly deserve.
I realize that the H2 in-memory database supports a MySQL mode, but we don't want to use that. We want to get as close to the real thing as possible.
Steps:
Use the Maven SQL plugin to create a database.
Use the Maven liquibase plugin to run your scripts.
Run your tests (standard surefire).
Use the Maven SQL plugin again to drop the database, so you can start fresh the next time.
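The steps above can be sketched in the POM along these lines (plugin versions, credentials, and the changelog path are assumptions; note that binding a drop-then-create to `process-test-resources` resets the database at the start of each run, which avoids needing a reliable "after tests" hook for plain surefire unit tests):

```xml
<!-- Steps 1 and 4: (re)create the schema with the SQL Maven plugin -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>sql-maven-plugin</artifactId>
  <version>1.5</version>
  <configuration>
    <driver>com.mysql.jdbc.Driver</driver>
    <url>jdbc:mysql://localhost:3306</url>
    <username>root</username>
    <password>root</password>
  </configuration>
  <executions>
    <execution>
      <id>recreate-db</id>
      <phase>process-test-resources</phase>
      <goals><goal>execute</goal></goals>
      <configuration>
        <sqlCommand>DROP DATABASE IF EXISTS testdb; CREATE DATABASE testdb;</sqlCommand>
      </configuration>
    </execution>
  </executions>
</plugin>

<!-- Step 2: apply the Liquibase changelog before surefire runs -->
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.10.3</version>
  <configuration>
    <changeLogFile>src/main/resources/db/changelog.xml</changeLogFile>
    <driver>com.mysql.jdbc.Driver</driver>
    <url>jdbc:mysql://localhost:3306/testdb</url>
    <username>root</username>
    <password>root</password>
  </configuration>
  <executions>
    <execution>
      <phase>process-test-resources</phase>
      <goals><goal>update</goal></goals>
    </execution>
  </executions>
</plugin>

<!-- Step 3: surefire then runs the JUnit tests against testdb as usual -->
```

Plugins bound to the same phase run in POM order, so the SQL plugin must appear before the Liquibase plugin for the create-then-migrate sequence to hold.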