Why Event-Driven is More Than Just Kafka


Developers and businesses need a scalable platform to host event-driven Kafka applications.

Competitive pressures are driving the need for new thinking when it comes to developing applications that help a business make decisions in real time. At the heart of these efforts is the ability to act on events as they happen, through so-called event-driven applications.

The reason: whenever a transaction takes place, that event calls for an action. For example, when a customer places an order or makes a deposit to a bank account, that event drives a next step. Customers expect responsive experiences when they interact with companies, so being able to make real-time decisions based on an event becomes critical.

The challenge is how to derive insights from streaming event data from multiple sources in real time. The first place many businesses start is Apache Kafka, the open-source distributed event streaming platform. It is used for high-performance data pipelines, streaming analytics, and data integration.

Kafka offers numerous capabilities and delivers many benefits. For example, it can process streams of events with joins, aggregations, filters, transformations, and more, using event-time semantics and exactly-once processing.
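To make that concrete, here is a minimal sketch of a Kafka Streams application in Java that filters and aggregates an event stream with exactly-once processing enabled. The topic names ("orders", "order-counts-by-customer"), the customer-ID keys, and the JSON status field are illustrative assumptions, not part of any specific deployment.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class OrderCountStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        // Enable exactly-once processing semantics for the whole topology.
        props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read order events keyed by customer ID, keep only completed orders,
        // and maintain a running count per customer.
        KStream<String, String> orders = builder.stream("orders");
        KTable<String, Long> counts = orders
                .filter((customerId, order) -> order.contains("\"status\":\"completed\""))
                .groupByKey()
                .count();
        counts.toStream().to("order-counts-by-customer",
                Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The running count updates continuously as new order events arrive, which is the kind of always-on, per-event computation that distinguishes stream processing from periodic batch jobs.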

It also eliminates much of the time-consuming work of getting access to data. Specifically, its Kafka Connect interface integrates with hundreds of event sources and event sinks, including Postgres, JMS, Elasticsearch, AWS S3, and more. Additionally, client libraries let users read, write, and process streams of events in many programming languages.
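As a simple illustration of those client libraries, the following sketch uses the standard Java producer client to publish an order event. The "orders" topic, broker address, and JSON payload are assumptions made for the example.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one order event keyed by customer ID; any number of
            // consumers or stream processors can react to it as it arrives.
            producer.send(new ProducerRecord<>("orders", "customer-42",
                    "{\"orderId\":\"1001\",\"status\":\"completed\"}"));
            producer.flush();
        }
    }
}
```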

Apache Kafka supports a range of use cases where high throughput and scalability are vital. Since Apache Kafka minimizes the need for point-to-point integrations for data sharing in certain applications, it can reduce latency to milliseconds. This means data is available to users faster, which can be advantageous in use cases that require real-time data availability.

Scalability is critical

Apache Kafka is a natural foundation for cloud-native development. Cloud-native applications are typically event-driven, and Kafka serves as the backbone that manages those events, enabling core event-driven capabilities including distributed streaming, real-time processing, and high scalability.

However, Kafka alone is not enough for many situations. Developers and businesses need a scalable platform to host Kafka applications. Many are turning to Kubernetes to help streamline the deployment, configuration, management, and use of Apache Kafka. By combining Kafka and Kubernetes, businesses get the benefits of Kafka and the advantages of Kubernetes, which include scalability, high availability, portability, and easy deployment.

In particular, Kubernetes lets a developer or business scale resources up and down with a simple command, or scale automatically based on usage, so resources are used efficiently. With Kubernetes, Apache Kafka clusters can span on-premises infrastructure and public, private, or hybrid clouds, and run on different operating systems.

As such, Kubernetes is an essential element for implementing, incorporating, and scaling event-driven applications.

For organizations determined to use Kafka and Kubernetes as the foundation of an event-driven architecture, the next decision is how to proceed. One choice is to build and run everything themselves using the open-source offerings. In that case, they will need to consider and address several issues, including:

  • Do they have properly trained staff and resources to undertake such an effort?
  • Does the staff have the expertise to integrate the various components (container orchestrator, event-processing engine, etc.) into a cohesive platform upon which event-driven applications can be deployed?
  • Does the staff have the time to carry out the many tasks related to installation, configuration, and upgrades?
  • How will they move an application from development and piloting to full-scale production use?

Many companies are turning to technology partners to help address these issues. Such partners bring expertise and enterprise-class features that supplement the core open-source offerings.

A leader in the event-driven architecture (EDA) field that many turn to for technical expertise and solutions is Red Hat. For event-driven companies that want to use Kafka, it offers AMQ Streams, which deploys on Red Hat OpenShift. AMQ Streams is a massively scalable, distributed, high-performance data streaming platform based on Kafka. It provides a distributed backbone that allows microservices and other applications to share data with high throughput and low latency. Additionally, AMQ Streams includes pre-built container images for Apache Kafka and ZooKeeper to streamline deployment, as well as operators for managing and configuring Apache Kafka clusters, topics, and users on top of Kubernetes.


About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
