Dockerising your legacy integration tests

As most of you will know, every now and then you can expect the question ‘Can you help fix something on this ancient project?’ According to the Universal Software Ageing table, that means anything from last month to much, much older. I got such a question recently, being in-between projects, and this project fell squarely into the ‘much older’ category. As luck would have it, there was still a previous developer around, for some definition of around that is. (Sidenote: if you’re not that lucky, I suggest you look into this other thing called Software Archaeology.)

So, having had the ultra-quick introduction (‘there’s the git repo, good luck’) I started looking at what was needed to get this thing running on my local machine and earn that much-coveted ‘Works on My Machine’ certification. Luckily it was already using Java 8 and maven, so I could immediately start firing off:

```shell
mvn clean verify
```
Obviously, that didn’t go too well out of the box. Someone had found it necessary not only to run some integration tests against a database, but also to have that database be a regular MySQL instead of an in-memory one like H2 or HSQLDB.

Now, how do we get past this hurdle? In the old days, we would have installed a local copy of MySQL, manually configured it with the user and database the project needs, and then run the tests again. But even a grey-beard developer like myself learns a new trick every now and then, and for the last few years there’s been this thing called Docker that helps us run stuff if and when we need it.

Integration testing with MySQL – take 1

So, we now know we need MySQL and want to use Docker. Let’s get started there.

```shell
docker container run mysql
```

Ok. If that were all, there wouldn’t be much of a point to this blog, now would there? Indeed, the above is not all. Our tests require the database server to be a specific version, available at a certain address, with a certain user and a certain database. And since we don’t want to keep a console open the whole time, let’s run it in the background:
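A sketch of what that background run could look like, using the official mysql image; the container name, database name, and credentials here are illustrative, not taken from the project:

```shell
# Run MySQL 5.7 detached, reachable on localhost:3306, with a user and
# database matching what the tests expect (names are illustrative).
docker container run -d \
  --name project-mysql \
  -p 3306:3306 \
  -e MYSQL_ROOT_PASSWORD=secret \
  -e MYSQL_DATABASE=projectdb \
  -e MYSQL_USER=project \
  -e MYSQL_PASSWORD=project \
  mysql:5.7
```

The `-d` flag detaches the container, and the `MYSQL_*` environment variables are honoured by the official image’s entrypoint to create the database and user on first start.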

Ok. At least the tests want to start now, but for some reason the build complains that the tables generated by Flyway (for an introduction to that, see a write-up by another colleague) cannot be found. Long story short, we hit that particular quirk where MySQL is running on Linux (in the container) and we’re using Hibernate to talk to it: MySQL created the tables in all lower case and Hibernate asks for the camel-cased name. This is usually solved by hacking the /etc/mysql/my.cnf file on the server, but this is Docker and we don’t want to build our own custom image just for that. Luckily, the official MySQL Docker image allows you to pass command arguments straight to the server, so using
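something along these lines (anything after the image name is handed to mysqld as a server option; container and credential names are again illustrative):

```shell
# Same run as before, but with the table-name workaround passed
# straight to mysqld as a command argument.
docker container run -d --name project-mysql -p 3306:3306 \
  -e MYSQL_ROOT_PASSWORD=secret -e MYSQL_DATABASE=projectdb \
  -e MYSQL_USER=project -e MYSQL_PASSWORD=project \
  mysql:5.7 --lower_case_table_names=1
```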

we finally manage to get those pesky tests to pass. Does that mean we’re done? Of course not. Although this solution has gone from a laborious set of documented steps to a significantly shorter one, it is still documentation, and we all know how well that gets read…

Integration testing with MySQL – take 2

The question then becomes whether there is a way to automate all of this a little further. Perhaps as part of the build, so we don’t have to write any documentation, nor do anything but run the mvn clean verify command.
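One way to do this is the io.fabric8 docker-maven-plugin, which can start and stop containers around the integration-test phase. A sketch of what that configuration could look like; the plugin version, credentials, and wait settings are assumptions to be adjusted for your project:

```xml
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.34.1</version>
  <configuration>
    <dockerHost>unix:///var/run/docker.sock</dockerHost>
    <images>
      <image>
        <name>mysql:5.7</name>
        <run>
          <ports>
            <port>3306:3306</port>
          </ports>
          <env>
            <MYSQL_ROOT_PASSWORD>secret</MYSQL_ROOT_PASSWORD>
            <MYSQL_DATABASE>projectdb</MYSQL_DATABASE>
            <MYSQL_USER>project</MYSQL_USER>
            <MYSQL_PASSWORD>project</MYSQL_PASSWORD>
          </env>
          <!-- same table-name workaround as the manual docker run -->
          <cmd>--lower_case_table_names=1</cmd>
          <wait>
            <log>ready for connections</log>
            <time>60000</time>
          </wait>
        </run>
      </image>
    </images>
  </configuration>
  <executions>
    <execution>
      <id>start-mysql</id>
      <phase>pre-integration-test</phase>
      <goals><goal>start</goal></goals>
    </execution>
    <execution>
      <id>stop-mysql</id>
      <phase>post-integration-test</phase>
      <goals><goal>stop</goal></goals>
    </execution>
  </executions>
</plugin>
```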

This makes sure that before the integration tests are started, we start the Docker container, and when the integration tests are done, we stop it. You did properly separate the integration tests from the regular unit tests, didn’t you?

So, that would be a slight alteration to your default maven surefire plugin configuration then:
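For instance, a common convention is to have Surefire skip the integration test classes and let the Failsafe plugin pick them up during the integration-test phase. A sketch, assuming your integration tests follow the `*IT` naming pattern:

```xml
<!-- Surefire runs unit tests only; integration tests are excluded. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludes>
      <exclude>**/*IT.java</exclude>
    </excludes>
  </configuration>
</plugin>
<!-- Failsafe runs the *IT classes in the integration-test phase,
     after pre-integration-test (container start) and before
     post-integration-test (container stop). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```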

Integration testing with MySQL – take 3

According to the Rule of Three, there has to be a Part 3. Rejoice then, because there is!

After part two, we had our database automatically handled via maven, and we could leave it at that. But what if I would like to run a test from my IDE? Or develop a new test, or debug an existing one? Do I keep running everything via maven? Nooooooooo.

Ok. Halloween is not here yet, so hold off on the dramatics. Still, there is that snag of the solution being tied to maven… can we work around that too?

Yes, we can! Using an even newer-fangled thingy called Testcontainers instead of the fabric8.io plugin. It allows us to use a container in our test code as if it were just another object.

Unlike in the previous two takes, we do have to add a custom .cnf file to work around the table-name issue. The mysql_overrides parameter refers to a directory on the classpath where we could have a file called override.cnf with the following content:

```ini
[mysqld]
lower_case_table_names = 1
```

Now, it so happens that our integration tests use Spring. Mind you, that would be the plain Spring Framework, not that recent whatchamacallit upstart Spring Boot. So we could completely rewrite our tests, ditching all of that existing work, or we could integrate this testcontainers.org magic into our existing tests. Like this, for instance:
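A sketch of how that integration could look with Testcontainers’ MySQLContainer in a JUnit 4 / plain-Spring test; the class names, property keys, and the `mysql_overrides` resource directory are illustrative:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.testcontainers.containers.BindMode;
import org.testcontainers.containers.MySQLContainer;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = TestConfig.class)
public class CustomerRepositoryIT {

    // One container for the whole test class.
    private static final MySQLContainer<?> MYSQL =
            new MySQLContainer<>("mysql:5.7")
                    // Mount override.cnf from src/test/resources/mysql_overrides
                    // into MySQL's conf.d to apply lower_case_table_names = 1.
                    .withClasspathResourceMapping("mysql_overrides",
                            "/etc/mysql/conf.d", BindMode.READ_ONLY)
                    .withDatabaseName("projectdb")
                    .withUsername("project")
                    .withPassword("project");

    static {
        // Start before the Spring context is built, so the context can
        // pick up the container's randomised JDBC URL from system properties.
        MYSQL.start();
        System.setProperty("jdbc.url", MYSQL.getJdbcUrl());
        System.setProperty("jdbc.user", MYSQL.getUsername());
        System.setProperty("jdbc.password", MYSQL.getPassword());
    }

    @Test
    public void findsCustomers() {
        // ... existing test code, unchanged ...
    }
}
```

The static initialiser is one way to guarantee the container is up before Spring builds the context; a `@ClassRule` would be the more idiomatic Testcontainers approach if your context wiring allows it.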

Using this, we can continue using the existing tests and don’t have to introduce all kinds of maven plugins. Just modify all existing test code…

Summary

Now, where does this leave us?

Whatever route we take, we have to have Docker installed. That used to be a hurdle, but most systems are fully supported now, so I’ll assume it is not an issue anymore. Besides, only having to support Docker on your machine instead of a gazillion databases and other server software in as many different versions sounds like an improvement in its own right.

Step 1 is obviously the least invasive and quickest to implement. All you have to do is get the right docker run command and you’re done. And then you have to remember that command, hand it to your successor somehow, and not forget to keep it running after a reboot… You get the point, I guess.

Step 2 gives you some more automation and should be compatible with CI environments as well. It also requires changes only to the build configuration. There are some drawbacks, though. I’ve already mentioned that you’re now dependent on running the integration tests through maven all the time. Observant readers may also have noticed that the plugin configuration contained a line starting with unix. If you’re working on Windows, that would have to change, and if you’re working in an environment like mine, where people can choose their own tools (within limits, though I am surprised we haven’t got someone running OpenBSD or somesuch on their development machine), you may end up having to maintain several variants of it.

Step 3 arguably gives you the most power. It uses Docker and is independent of any build tool, so it works in your favourite IDE as well. It does come at a cost, though: it requires code changes, which, depending on the age and state of the project, may not be that easy.
I also noted during my experiments that the build times would vary quite a bit. Whereas the maven plugin solution would clock in at 2 minutes sharp every time for my project, the testcontainers.org solution would usually be a bit slower and vary between 1:56 and 2:13 in my admittedly highly non-scientific experiments. YMMV.

So, what am I going to do? Integrate the testcontainers.org solution into our project as that seems the most useful for us. And for the future, I’ve got a few more options to choose from.
