Sponsored by Red Hat

Why Modern Applications Need an Event-driven Architecture


As businesses strive to be more responsive and react in real time, they must incorporate events data and develop and deploy applications based on microservices and cloud-native architectures.

Businesses today must make fast decisions as changes are taking place. Those decisions are based on insights derived from the analysis of events data from multiple sources. That’s quite different from the way organizations worked before. And it requires changes in the way applications are architected and how data is collected, analyzed, and incorporated into business processes.

RTInsights recently sat down with David Codelli, Product Marketing Manager, Red Hat, to discuss the growing need to use events data, what type of architecture is needed to use them, and more. Here is a summary of our conversation.

Why the great interest in using events?

RTInsights: Why is there such great interest in using events in modern business applications?

Codelli: Events are just an abstraction of a change. An event is any kind of change in an organization’s own data systems, in the systems of its partners, or in things about its customers; any kind of change can be a useful data point. So the interest in events is saying, okay, if we can capture all those data points as they happen and process them faster, I could serve my customers better, treat my patients better, or gain some business advantage.

How is events data different?

RTInsights: What makes handling events data different than other types of data that might be used in an application?

Codelli: First, let’s talk about things that are not events. Any kind of data that’s exchanged in a batch is the obvious example. Say my architecture is that I receive changes to my master data in the morning: I get a big file that says, these are your new product codes. During the day, transactional systems are handling things. They’re receiving orders, filling orders, doing business things, making changes to databases in various parts of my enterprise.

One approach to handling changes with this batch approach is saying, well, I have my new product codes in my reporting systems, and changes have happened, and at the end of the day, everybody’s going to send me their file of all the changes. There’s this class of products, ETL products, that often run in a batch mode saying, once per day, I’m going to go to each system, get a file of their changes, and load them in my data warehouse. Then I can start generating reports. That’s an example of a useful, interesting process that’s not event-driven. It’s based on timings. It’s based on collecting data in batches.

An event-driven approach says, let’s capture the event as it happens and publish it in a generalized manner that allows that event to be captured by interested parties. And if I’m capturing the event and publishing it, I may not even know how it’s being used in all the other various applications to which I give access.
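The decoupling Codelli describes, where a publisher emits an event without knowing who consumes it, can be sketched in a few lines. This is a hypothetical in-memory illustration only; a production EDA would use a broker such as Apache Kafka rather than a class like the `EventBus` below:

```python
# Minimal in-memory publish/subscribe sketch (illustrative, not production code).
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # The publisher does not know (or care) who is listening.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("orders", lambda e: received.append(e))   # an interested party
bus.publish("orders", {"order_id": 42, "status": "created"})
print(received)
```

The key property is that `publish` never names its consumers: new applications can subscribe to the "orders" topic later without the publisher changing at all.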

What is an EDA?

RTInsights: What is an event-driven architecture (EDA), and what role does it play in using events data?

Codelli: An event-driven architecture is basically the people, platforms, and processes that facilitate that goal of capturing events as soon as they’re raised, received, or available. For people, it means changing your development practices to use things like publish and subscribe. For platforms, you’re going to want to select from a wide variety of platforms that help you. Some of these platforms have been around for a while. For example, the well-established Java language and the various .NET languages have had events as first-class objects. They have been around long enough that your developers can say, I have this notion of events that are raised programmatically, and I can respond to them. That gets you part of the way there, but those things don’t work in the cloud-native world because cloud nativity is all about microservices and separate developer teams not sharing a platform. Right now, the state of the art for an event-driven architecture in terms of platforms is something like Apache Kafka. It’s probably the most popular approach to event-streaming backbones upon which you can build an event-driven architecture, but there are others.

What are some EDA examples?

RTInsights: What are some examples where the benefits and capabilities of an event-driven architecture shine?

Codelli: As long as there’s been computing, we’ve had this idea of a cycle of identifying situations, predicting outcomes of those situations, deciding the best course of action, and then acting on it. So: identifying, predicting, deciding, executing. When big business intelligence (BI) systems first burst onto the scene in the 1990s, that cycle was a daily, weekly, or sometimes even quarterly process you did for your financial reporting. The time between gathering the data you need, generating insights, and executing on them used to be very long. With an EDA and state-of-the-art platforms, you can shrink that cycle time to minutes, seconds, and even milliseconds.

An EDA with good processes and state-of-the-art platforms is super useful in the financial world when trying to detect fraud. You’re trying to connect data points such as an ATM card being used in Singapore just minutes after it was used in the U.S. Information like that probably means you are dealing with a hacked or stolen card. Reacting rapidly and correlating multiple data points is super useful in financial services.
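The ATM example amounts to an "impossible travel" check over a stream of card events. A minimal sketch, assuming each event carries a timestamp, a card ID, and a country (the function name, event shape, and 30-minute window are all hypothetical choices for illustration):

```python
# Hypothetical "impossible travel" check over a time-ordered card event stream.
from datetime import datetime, timedelta

def flag_impossible_travel(events, window=timedelta(minutes=30)):
    """events: iterable of (timestamp, card_id, country), assumed time-ordered.
    Returns alerts for a card used in two countries within the window."""
    last_seen = {}  # card_id -> (timestamp, country) of the most recent use
    alerts = []
    for ts, card, country in events:
        if card in last_seen:
            prev_ts, prev_country = last_seen[card]
            if country != prev_country and ts - prev_ts <= window:
                alerts.append((card, prev_country, country))
        last_seen[card] = (ts, country)
    return alerts

events = [
    (datetime(2024, 1, 1, 12, 0), "card-1", "US"),
    (datetime(2024, 1, 1, 12, 5), "card-1", "SG"),  # Singapore, 5 minutes later
]
print(flag_impossible_travel(events))  # [('card-1', 'US', 'SG')]
```

In a real EDA the `events` iterable would be a live subscription to a topic on a streaming platform, so the alert fires within milliseconds of the second swipe rather than in a nightly batch.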

The same capabilities are also very useful in healthcare. With the COVID pandemic, we want to know the progress of managing outbreaks. Contact tracing could benefit a lot from an event-driven architecture. So healthcare is fertile ground for EDAs.

Another field that can benefit from EDAs is manufacturing. Consider a very sophisticated manufacturing environment where you have components, cards, or gears that can fail. From past insights, you might learn that failures follow once a certain temperature threshold is reached. Processing a huge array of temperature readings and selecting the ones that predict failure is something an EDA does well.
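The manufacturing case boils down to filtering a stream of sensor readings for the ones that cross a failure-predicting threshold. A minimal sketch, assuming readings arrive as (sensor ID, temperature) pairs and that 85 °C is the learned threshold (both the data shape and the threshold value are hypothetical):

```python
# Hypothetical sketch: keep only sensor readings at or above a learned
# failure-predicting temperature threshold.
THRESHOLD_C = 85.0  # assumed threshold learned from past failure data

def over_threshold(readings, threshold=THRESHOLD_C):
    """readings: iterable of (sensor_id, temperature_celsius) pairs."""
    for sensor_id, temp in readings:
        if temp >= threshold:
            yield (sensor_id, temp)

readings = [("gear-7", 72.4), ("card-3", 91.2), ("gear-7", 86.0)]
print(list(over_threshold(readings)))  # [('card-3', 91.2), ('gear-7', 86.0)]
```

Because the filter is a generator, it can consume an unbounded event stream and emit alerts as readings arrive, which is exactly the shrink-the-cycle-time property described above.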


About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications, including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He is also the author of three business technology books.
