On OSX and Windows, as you might know, file systems are case-insensitive by default. So the directories CaseSensitive and casesensitive are the same on those operating systems. On Linux, however, they are different directories.

The interesting thing is that when you use Docker on OSX or Windows and attach a volume to a container, that directory will be case-insensitive, while the other directories within the container are case-sensitive (because the container runs Linux). This makes perfect sense, as the directory in the volume is managed by your own OS.
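You can observe this behaviour from Java itself. A quick sketch (file names are just examples): create a file with a mixed-case name and check whether the lowercase variant resolves to the same file.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class CaseSensitivityCheck {

    // Returns true when the directory's file system treats names case-sensitively.
    static boolean isCaseSensitive(File dir) throws IOException {
        File original = new File(dir, "CaseSensitive.txt");
        original.createNewFile();
        // On OSX/Windows defaults the lowercase variant resolves to the same
        // file, so exists() returns true; on Linux it returns false.
        boolean sensitive = !new File(dir, "casesensitive.txt").exists();
        original.delete();
        return sensitive;
    }

    public static void main(String[] args) throws IOException {
        File tmp = Files.createTempDirectory("case-check").toFile();
        System.out.println("Case sensitive file system: " + isCaseSensitive(tmp));
    }
}
```

Run it on your host and inside a container with and without a mounted volume and you can see the difference for yourself.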

While working with Terraform, I thought I’d share the way I set up the DNS for my Virtual Private Cloud (VPC).

If you want traffic to be routed to one of your services, you need a DNS. AWS offers Route 53 as a DNS service. ‘Hosted zones’ in the Route 53 service define records the way a telephone book defines phone numbers. At the time of writing, every hosted zone you add costs about $0.61 per month. If you buy a domain from AWS, Route 53 will create a record with the four name servers that translate the domain to an IP address. If the domain is registered with another party, AWS offers straightforward steps to either migrate that party’s DNS to Route 53, or add the records of the external name servers to Route 53.
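In Terraform, a hosted zone and a record inside it are just two resources. A minimal sketch (the zone name and IP address are placeholders, not from my actual setup):

```hcl
# Creates a hosted zone; AWS assigns the four name servers automatically.
resource "aws_route53_zone" "main" {
  name = "example.com"
}

# An example record inside that zone, pointing a subdomain at an IP.
resource "aws_route53_record" "www" {
  zone_id = aws_route53_zone.main.zone_id
  name    = "www.example.com"
  type    = "A"
  ttl     = 300
  records = ["203.0.113.10"]
}
```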

There are ways to use AWS services with an external DNS provider, but I recommend Route 53 to save yourself a headache.

Separating concerns is something we as developers are used to thinking about in terms of code. But the same also applies to identity management. If you’ve dabbled in AWS, you can get started right away with a root account. However, when it goes beyond dabbling, it might be a good idea to start splitting up responsibilities.

Introduction

You might have heard about Mutation Testing before. In the last 5 or 6 years it’s been a reasonably hot (“warm”?) topic in blogs and dev talks. So what is the added value over code coverage with just unit testing? Even if you can pride yourself on over 90% line and branch coverage, that coverage means nothing other than that your unit tests touch production code. It says nothing about how well that code is tested; it doesn’t even care whether any asserts exist in your tests. Imagine an engineer who tests a power drill he designed on a sheet of paper, and declares that it does exactly what it was designed for: drilling holes. Obviously this test is meaningless for a power drill that is meant to be used on wood, steel or stone.

Mutation tests aim to expose tests that cover the lines they’re meant to cover but are insufficient in testing the intent of the code. The idea behind this is fairly simple: introduce “mutants” in the code that is being tested, and check whether the unit tests that cover these mutants still succeed or start to fail. If a test still succeeds, that means the test falls short of verifying the complete intent of the code!
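To make this concrete, here is a small sketch (the class and method names are made up for illustration): a fully covered method, a hand-written “mutant” of it, and a weak test that passes for both — exactly the situation a mutation testing tool would flag.

```java
public class MutationDemo {

    // Production code: a person is an adult from age 18 onwards.
    static boolean isAdult(int age) {
        return age >= 18;
    }

    // A typical mutant: the boundary condition is changed (>= becomes >).
    static boolean isAdultMutant(int age) {
        return age > 18;
    }

    // A weak test: it covers the line, but never checks the boundary.
    static boolean weakTestPasses(java.util.function.IntPredicate isAdult) {
        return isAdult.test(30) && !isAdult.test(5);
    }

    public static void main(String[] args) {
        // The weak test passes for the original AND for the mutant,
        // so the mutant "survives" despite 100% line coverage.
        System.out.println("original passes: " + weakTestPasses(MutationDemo::isAdult));
        System.out.println("mutant passes:   " + weakTestPasses(MutationDemo::isAdultMutant));
    }
}
```

A single extra assertion on the boundary, isAdult(18), would kill this mutant: it returns true for the original and false for the mutant.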

When creating a Spring Boot REST service, you can configure Spring to convert a LocalDateTime to an ISO 8601 date string when returning a JSON response. To get this working you have to do a few things. Firstly, you need the following dependency:


com.fasterxml.jackson.datatype:jackson-datatype-jsr310

This dependency has all the JSON serialisers and deserialisers for the Java 8 time API, and when you use Spring Boot with auto configuration, it should load all the correct serialisers. Secondly, you need to add the following to your application properties:
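The setting usually meant here is Jackson’s timestamp switch (assuming Spring Boot’s standard property names):

```properties
# Render java.time types as ISO-8601 strings instead of numeric timestamps
spring.jackson.serialization.write-dates-as-timestamps=false
```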

Although Java has always been awesome, Java 8 brought several features the language was in dire need of. Apart from the long-awaited improved DateTime API and the introduction of Optionals, Java 8 finally gave behaviour the attention it deserves by incorporating (a form of) functional programming into the language using lambdas.
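As a tiny reminder of what that looks like (the names here are mine, not from any particular project): behaviour that used to require an anonymous inner class is now passed as a lambda.

```java
import java.util.List;
import java.util.stream.Collectors;

public class LambdaDemo {

    // The mapping behaviour is passed as a lambda instead of an
    // anonymous inner class implementing a one-method interface.
    static List<String> shout(List<String> names) {
        return names.stream()
                .map(name -> name.toUpperCase())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(shout(java.util.Arrays.asList("alice", "bob", "carol")));
        // prints [ALICE, BOB, CAROL]
    }
}
```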

Some time ago, I was working on a project where I had to fix an issue raised by our OWASP ZAP scanner, a free security tool that runs in the test phase of the project’s Jenkins build. It checks for security vulnerabilities that you want to prevent from reaching production.

The error/warning that was raised looked like this:

X-Frame-Options header is not included in the HTTP response to protect against ‘ClickJacking’ attacks.

That’s pretty generic, and anything could have caused it. The odd thing was that we actually had anti-clickjacking libraries in place for our service, so where was this coming from?

Getting STOMP working with Spring is supposed to be very easy. It didn’t seem that easy to me, because I tested my implementation with an integration test that failed at least as often as it succeeded. In this post I will take you through my journey towards a stable integration test for Spring WebSockets.

“What is this Kafka I’ve been hearing about?”

In short, Kafka is a horizontally scalable streaming platform. In other words, Kafka is a message broker that can be run on multiple servers as a cluster. Different data streams are called topics. Producers place messages on a topic, and consumers subscribe to topics. Topics can be configured for single and multiple delivery of messages. Consumers can be grouped into so-called consumer groups, which makes it possible for multiple consumers to act as one when it comes to single delivery.
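The consumer-group behaviour is the part that trips people up most. As a toy in-memory sketch of the semantics (this illustrates the idea only — it is not real Kafka, and the names are made up):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ToyBroker {

    // topic -> list of messages (a real Kafka topic is a partitioned log)
    private final Map<String, List<String>> topics = new HashMap<>();
    // "group:topic" -> next offset to hand out
    private final Map<String, Integer> groupOffsets = new HashMap<>();

    void produce(String topic, String message) {
        topics.computeIfAbsent(topic, t -> new ArrayList<>()).add(message);
    }

    // Consumers in the same group share one offset: each message is
    // delivered to only one of them (single delivery within the group).
    String poll(String topic, String group) {
        List<String> log = topics.getOrDefault(topic, new ArrayList<>());
        String key = group + ":" + topic;
        int offset = groupOffsets.getOrDefault(key, 0);
        if (offset >= log.size()) return null;
        groupOffsets.put(key, offset + 1);
        return log.get(offset);
    }
}
```

Two consumers polling with the same group name split the messages between them, while a consumer in a different group sees every message again — that is the “act as one” behaviour of consumer groups.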

But don’t take my word for it. There’s a lot more to Kafka than I can get into in this post and the original documentation is much clearer, so check out the documentation at https://kafka.apache.org/.

“How do I use Kafka in my Spring applications?”

Among all the abstractions Spring Boot delivers there is also an abstraction layer for using Kafka, called Spring Cloud Stream. This cloud messaging API makes it very easy to produce messages to Kafka and to consume them.
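In Spring Cloud Stream’s functional style, a message handler is nothing more than a plain java.util.function bean. Here is the shape of one, stripped of the Spring wiring so the logic stands alone (the class, method, and payload names are assumptions for illustration):

```java
import java.util.function.Function;

public class OrderProcessor {

    // In a Spring Cloud Stream application this would be exposed as a
    // @Bean; the framework binds it to an input and an output topic and
    // invokes it once per incoming message.
    static Function<String, String> enrichOrder() {
        return payload -> "processed:" + payload;
    }

    public static void main(String[] args) {
        System.out.println(enrichOrder().apply("order-42"));
        // prints processed:order-42
    }
}
```

Because the handler is an ordinary function, it can be unit tested without any broker or Spring context at all.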

Normally when using Swagger, you generate a swagger.yaml file for your API. But what if you already have a swagger.yaml file and want to generate the API interface and models from it, like you would with a web service using a WSDL file? For this, Swagger has a great tool: swagger-codegen. The tool has a CLI and a Maven plugin, but no Gradle plugin. So how do we use it with Gradle?

Here is a list of things we need to do to get it to work:

Create a Gradle task that generates the API interface every time we build (generating to src/generated/java to keep everything separated)

Make sure ‘gradle clean’ also cleans the generated files

Support gradle incremental builds

Make sure it compiles: the generated classes should be available to the code in src/main/java, and the generate task should run before build/compile

BONUS: IntelliJ should generate the files automatically when you import or sync the project.

A big thanks to my colleague Willem Cheizoo from JDriven for helping me create this list and pointing me in the right direction.

Creating a generate task

To be able to use the codegen in our Gradle task, we need to add the dependency to the buildscript in our build.gradle file.
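That buildscript block looks roughly like this (the version number is a placeholder — pick the current swagger-codegen release):

```groovy
buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        // makes the swagger-codegen classes available to our task definitions
        classpath 'io.swagger:swagger-codegen:2.2.3'
    }
}
```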