Configuring Kafka Topics with Spring Kafka

Here’s a common scenario: your engineering team decides to use Kafka and starts writing producers, consumers, connectors, streams, you name it. Before long, six engineers have manually created 20+ topics in the lower environment, and it’s time to promote that great work into the next environment. Some teams may be fine continuing down the path of manual management and assigning a “topic master” to handle all topic change requests. Other teams won’t be, and will look toward automation.

I recently came across a scenario similar to this, and during my research I was surprised at the lack of solutions for managing a Kafka cluster’s topics. With Spring Kafka already in the mix, I started perusing its documentation and stumbled on a small section that talks about configuring topics via a NewTopic class. The rest of this post details my findings as well as a solution for managing topic configurations.

NewTopic Beans

To test out creating new topics via code, I created a simple Spring Boot application and defined three NewTopic beans. I gave each topic a name, a number of partitions, and a replication factor. You can also provide the NewTopic with a map of config values to set any of the topic-level configurations; to keep things simple, I left this out.
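The snippet below is a minimal sketch of what those bean definitions can look like; the topic names, partition counts, and replication factors are placeholders rather than the values from the original project.

```java
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TopicConfiguration {

    // Each NewTopic bean is discovered by Spring Kafka's KafkaAdmin at startup.
    @Bean
    public NewTopic topicOne() {
        // name, partitions, replication factor
        return new NewTopic("topic-one", 3, (short) 1);
    }

    @Bean
    public NewTopic topicTwo() {
        return new NewTopic("topic-two", 3, (short) 1);
    }

    @Bean
    public NewTopic topicThree() {
        // NewTopic#configs(Map<String, String>) can also set topic-level configs
        return new NewTopic("topic-three", 1, (short) 1);
    }
}
```

With Spring Boot on the classpath, no extra wiring is needed; the auto-configured KafkaAdmin picks these beans up at startup.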

After running the app, I jumped onto my local broker and listed the topics, and there they were. Easy!

The KafkaAdmin

Behind the scenes, a KafkaAdmin bean taps into the afterSingletonsInstantiated lifecycle hook, pulls all NewTopic beans out of the Application Context, and runs through some logic to figure out which topics need to be created and which need to be updated.

If the broker is running Kafka 1.0.0 or higher, the KafkaAdmin can increase a topic’s partitions. It will not decrease the number of partitions.
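Spring Boot auto-configures a KafkaAdmin from the spring.kafka.* properties, so you usually don’t need to declare one yourself. Outside of Boot, or when you want to override the defaults, a sketch of a manual declaration looks like the following (the bootstrap address is an assumption):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.admin.AdminClientConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaAdmin;

@Configuration
public class KafkaAdminConfiguration {

    // Spring Boot normally provides this bean automatically from spring.kafka.* properties;
    // declaring it by hand is only needed outside Boot or to customize the admin client.
    @Bean
    public KafkaAdmin kafkaAdmin() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        return new KafkaAdmin(configs);
    }
}
```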

This was a great start to getting away from manual topic management, but I still wanted to get the configuration out of the code and into properties to make it more flexible across environments as well as simpler to maintain.

Dynamically Creating NewTopic Beans

The first step in pushing the topic configuration out of the code was to come up with a YAML format that maps to a POJO. The snippet below shows what that format looks like as well as the @ConfigurationProperties model it maps to.
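Since the original gist isn’t reproduced here, the following is a sketch of one way that mapping can look; the kafka.topics prefix, the property names, and the class names are illustrative assumptions rather than the exact ones from the project.

```java
import java.util.ArrayList;
import java.util.List;

import org.springframework.boot.context.properties.ConfigurationProperties;

// Binds a yaml block shaped roughly like this (prefix and property names are illustrative):
//
// kafka:
//   topics:
//     - name: topic-one
//       partitions: 3
//       replication-factor: 1
//     - name: topic-two
//       partitions: 6
//       replication-factor: 1
@ConfigurationProperties(prefix = "kafka")
public class TopicProperties {

    private List<Topic> topics = new ArrayList<>();

    public List<Topic> getTopics() { return topics; }
    public void setTopics(List<Topic> topics) { this.topics = topics; }

    public static class Topic {
        private String name;
        private int partitions;
        private short replicationFactor;

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public int getPartitions() { return partitions; }
        public void setPartitions(int partitions) { this.partitions = partitions; }
        public short getReplicationFactor() { return replicationFactor; }
        public void setReplicationFactor(short replicationFactor) { this.replicationFactor = replicationFactor; }
    }
}
```

The nested Topic class relies on Spring Boot’s relaxed binding, so replication-factor in the YAML binds to the replicationFactor field.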

Now that these properties are available for injection, I created a @Configuration class and wired them in. In a @PostConstruct method, the topic configurations are iterated over and injected into the Application Context as NewTopic beans so that they get picked up by the normal KafkaAdmin logic mentioned above.
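Here is a sketch of that wiring, assuming the TopicProperties class above; registering each topic as a singleton in the bean factory is one way to get the NewTopic instances into the Application Context, though the actual project may do it slightly differently.

```java
import javax.annotation.PostConstruct;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableConfigurationProperties(TopicProperties.class)
public class DynamicTopicConfiguration {

    private final TopicProperties properties;
    private final ConfigurableApplicationContext context;

    public DynamicTopicConfiguration(TopicProperties properties, ConfigurableApplicationContext context) {
        this.properties = properties;
        this.context = context;
    }

    @PostConstruct
    public void registerTopicBeans() {
        // Register one NewTopic singleton per configured topic so the KafkaAdmin
        // finds them when it scans the context for NewTopic beans.
        properties.getTopics().forEach(topic ->
                context.getBeanFactory().registerSingleton(
                        topic.getName() + "-topic",
                        new NewTopic(topic.getName(), topic.getPartitions(), topic.getReplicationFactor())));
    }
}
```

Because afterSingletonsInstantiated fires only after every singleton (and its @PostConstruct) has been initialized, the dynamically registered NewTopic singletons are already in place when the KafkaAdmin does its scan.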

This project makes a Kafka cluster more maintainable as well as more secure by reducing the need for direct broker access. It centralizes topic configurations and provides an audit trail of operations on a topic. Future environments can be quickly configured by adding a new application-{env}.yml file and running the app with that {env} as one of the active profiles.


Source Code: https://github.com/schroedermatt/spring-kafka-config

About the Author


Matt Schroeder

Principal Technologist

A wide range of professional experience and a Master’s Degree in Software Engineering have become the foundation that enables Matt to lead teams to the best solution for every problem.

Comments on “Configuring Kafka Topics with Spring Kafka”

  1. TRWS says:

    How to achieve this with KafkaTemplate?

  2. Victor says:

    This is really really useful and cool. Thanks

  3. Somebody says:

    Link to “a small section of the docs” is broken.

    1. Thanks. The link is updated.

