So why is this important?

This was the first question that came up when I discussed this with my clients and colleagues: why is it so important that we can now use the Kafka protocol with Event Hubs? When working with Kafka, we have several options for hosting our Kafka cluster.

One of these is PaaS, for example with HDInsight, which still requires you to manage the cluster yourself.

As always, these options affect both the amount of work you have to do yourself and the control you have over your environment.

Work vs. responsibility

At our clients, we almost always encounter a SaaS-over-PaaS-over-IaaS-over-on-premises policy, as they want to focus on delivering value rather than worry about keeping things running.

So how does Event Hubs fit in here? Event Hubs is a fully managed PaaS solution, which places it in the serverless space: all maintenance is handled by Microsoft, so we can focus on delivering value instead. For Kafka, this means we don't have to worry about the cluster either; we can simply point our Kafka application to our Event Hubs endpoint, and everything is handled for us. So if you don't need any special configuration, but just need a way to handle your data, Event Hubs is a perfect fit.

Similarities and differences

When looking at Event Hubs and Kafka, we see a lot of similarities. Both are designed to handle large streams of messages, allowing for real-time data streaming. Both ensure messages are handled reliably, and both scale extremely well, even under high load.

Event Hubs architecture

But of course, there are also differences. The biggest one has already been mentioned: with Event Hubs you don't have to worry about configuring and managing your brokers, nor about the underlying servers and network. But hosting is not the only big difference; the two platforms also vary considerably in the features and tools they provide.

Event Hubs gives us many great features

By taking advantage of Event Hubs' very interesting features and combining them with Kafka's ecosystem and tools, we can truly have the best of both worlds. Using Event Hubs with Kafka support, you can keep using your existing tools to get insights into and work with your existing applications, including using MirrorMaker to replicate your messages between Kafka and Event Hubs, while gaining all the features and possibilities Event Hubs provides.

Kafka has some great tools and ecosystem

And you can mix and match this: for example, you could have producers sending messages with the Kafka protocol to Azure Event Hubs, while consumers use the Event Hubs protocol to process those messages.
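As a sketch of this mix-and-match setup in Python: the producer side speaks the Kafka protocol (here via the confluent-kafka package) while the consumer side uses the native azure-eventhub SDK. The namespace name, Event Hub name, and connection string below are placeholders, not values from this article.

```python
def kafka_producer_config(namespace: str, connection_string: str) -> dict:
    """Kafka client settings that target an Event Hubs namespace
    over its Kafka-compatible endpoint (port 9093)."""
    return {
        # Event Hubs exposes a Kafka endpoint on the namespace host, port 9093
        "bootstrap.servers": f"{namespace}.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        # Event Hubs expects the literal string "$ConnectionString" as username,
        # and the namespace connection string as the password
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
    }

# Producer side, via the Kafka protocol (requires confluent-kafka):
# from confluent_kafka import Producer
# producer = Producer(kafka_producer_config("my-namespace", "<connection string>"))
# producer.produce("my-event-hub", b"hello")
# producer.flush()

# Consumer side, via the native Event Hubs SDK (requires azure-eventhub):
# from azure.eventhub import EventHubConsumerClient
# client = EventHubConsumerClient.from_connection_string(
#     "<connection string>",
#     consumer_group="$Default",
#     eventhub_name="my-event-hub",
# )
```

Because Event Hubs translates between both protocols on the service side, neither client needs to know how the other is connected.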

So how do we do this?

Getting started with Event Hubs for Kafka is extremely easy and only requires two changes to the configuration file of the Kafka client. The first is to switch the endpoint the client connects to over to our Azure Event Hubs instance, and the second is to update the security protocol to SASL PLAIN, using the connection string from our Event Hubs instance as the password.
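For a Java-style Kafka client, those two changes could look like the properties below. The namespace name and the connection string are placeholders you would replace with your own values.

```properties
# 1. Point the client at the Event Hubs Kafka endpoint (port 9093)
bootstrap.servers=mynamespace.servicebus.windows.net:9093

# 2. Switch to SASL PLAIN over TLS, using the literal username
#    "$ConnectionString" and the namespace connection string as password
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="$ConnectionString" \
    password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
```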

Conclusion

As we have seen, with just a few minor configuration changes we can now connect our Kafka clients to Azure Event Hubs, giving us the best of both worlds. We can keep working with our existing Kafka applications and their tools and ecosystem, while leveraging the ease of use and many great features of Event Hubs.