The Power of the “Tolerant Reader” in Kafka Microservices 💪🚀

Andreas Loizou
3 min read · Jun 14, 2023
The Tolerant Reader Service Design Pattern

Hello, world! 🌍

Today, we’re embarking on an intriguing journey into the world of Kafka and microservices, focusing on the “Tolerant Reader” pattern. Popularized by industry guru Martin Fowler, it offers a robust way to manage data flowing between services.

We’ll provide a real-world example involving a Transaction object and dive into actual code snippets to breathe life into this potent concept. Buckle up and let’s get started! 🧘‍♂️💡

What is the “Tolerant Reader”? 🧐

The “Tolerant Reader” is a service design pattern that encourages services to be as forgiving as possible when reading data from other services. Instead of expecting a specific, unchanging structure, it advocates that readers (services consuming the data) should tolerate unexpected, irrelevant, or missing fields. This way, services can handle changes gracefully, without falling apart at the first sign of unpredictability!

To learn more about it, you can visit the corresponding link on Martin Fowler’s page here: https://martinfowler.com/bliki/TolerantReader.html

Kafka Example with Transaction Object 🏦

Imagine a microservice, TransactionReader, operating in a Kafka-based system. It consumes messages from a topic, TransactionTopic. Each message encapsulates a Transaction object, which historically has a stable structure with 25 fields.

A full Transaction class in Java might look something like this:

import com.fasterxml.jackson.annotation.JsonProperty;

public class Transaction {

    @JsonProperty("transactionID")
    private String transactionID;

    @JsonProperty("transactionType")
    private String transactionType;

    // ...23 more fields...

    @JsonProperty("transactionStatus")
    private String transactionStatus;
}

Interestingly, TransactionReader only needs one field from the Transaction object: transactionID. In line with the "Tolerant Reader" pattern, we can create a new class that extracts only this field.

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;

// ignoreUnknown is essential here: without it, Jackson's default
// ObjectMapper rejects the 24 fields this class does not declare.
@JsonIgnoreProperties(ignoreUnknown = true)
public class PartialTransaction {

    @JsonProperty("transactionID")
    private String transactionID;
}
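To see the tolerant read in action, here is a minimal, self-contained sketch using Jackson’s ObjectMapper. The sample payload, the getter, and the demo class name are illustrative additions, not part of the original service:

```java
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TolerantReaderDemo {

    // Ignoring unknown properties is what makes this reader "tolerant":
    // without it, Jackson fails on every field we don't declare.
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class PartialTransaction {
        @JsonProperty("transactionID")
        private String transactionID;

        public String getTransactionID() {
            return transactionID;
        }
    }

    public static void main(String[] args) throws Exception {
        // A payload carrying more fields than PartialTransaction declares.
        String payload = "{\"transactionID\":\"tx-42\","
                + "\"transactionType\":\"DEBIT\","
                + "\"transactionStatus\":\"SETTLED\"}";

        ObjectMapper mapper = new ObjectMapper();
        PartialTransaction tx = mapper.readValue(payload, PartialTransaction.class);

        // Only the field we care about is materialized.
        System.out.println(tx.getTransactionID()); // prints tx-42
    }
}
```

Note that the extra fields (transactionType, transactionStatus) are silently dropped rather than causing a deserialization failure.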

By following this pattern, we can unlock two significant benefits:

  1. Efficiency: By deserializing only the necessary data, we save on CPU time and memory.
  2. Robustness to Changes: As long as the transactionID field remains intact, the service can continue to operate even if other parts of the Transaction object change.
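In a Kafka setting, this tolerant projection plugs into an ordinary consumer loop. The sketch below is illustrative only: the bootstrap address, group id, and class names are assumptions, and it expects the topic to carry JSON-encoded string values:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TransactionReader {

    // The tolerant projection of the full Transaction payload.
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class PartialTransaction {
        @JsonProperty("transactionID")
        private String transactionID;

        public String getTransactionID() {
            return transactionID;
        }
    }

    // Hypothetical connection settings; adjust for your cluster.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "transaction-reader");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps())) {
            consumer.subscribe(List.of("TransactionTopic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    try {
                        // Tolerant read: only transactionID is materialized,
                        // no matter how many fields the producer sends.
                        PartialTransaction tx =
                                mapper.readValue(record.value(), PartialTransaction.class);
                        System.out.println("Read transaction " + tx.getTransactionID());
                    } catch (Exception e) {
                        // A malformed record should not kill the consumer.
                        System.err.println("Skipping unreadable record: " + e.getMessage());
                    }
                }
            }
        }
    }
}
```

Keeping the JSON-to-object step inside a try/catch is a deliberate companion to the pattern: a single badly shaped message degrades to a logged skip instead of crashing the whole consumer.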

Things to Ponder

Despite the advantages, this approach comes with its share of caveats. Depending on your stack, implementing the “Tolerant Reader” may require a deserialization library (or configuration) that supports lenient, partial deserialization; with Jackson, for example, unknown properties must be explicitly ignored.

Additionally, partial representations can breed confusion: different applications may need different fields, and each will maintain its own model representing only a fraction of the full object.

Wrapping Up 🎁

In summary, the “Tolerant Reader” offers a sturdy and efficient strategy for managing data in Kafka microservices. While not a panacea, when applied judiciously, it can significantly enhance your services’ resilience to changes. As is the case with all patterns, the key lies in knowing where and when to apply them appropriately. 🧰🔨

Thank you for reading! Let’s continue to explore, adapt, and reinforce our microservices. Stay tuned for more exciting insights into the fascinating world of microservices and Kafka. Until next time, happy coding! 💻👋
