DIY EDA Security Challenges and Concerns

With many deployment options, it’s easy to undermine the inherent EDA security benefits by deploying onto a substandard environment.

Written By
Joel Hans
Sep 21, 2021

As technology-savvy companies move through their digital transformation, they inevitably come to a point where they want—or need—to modernize their applications. In the process, they move to development and deployment strategies like containers and microservices and test out orchestration with Kubernetes. They’re figuring out whether they can go fully cloud-native or whether a hybrid cloud environment will offer more flexibility, particularly when more employees are working from home. No matter where they end up, they have to continuously improve their security posture across the entire software development lifecycle. Event-driven architecture (EDA) is one standard paradigm for navigating these issues.

In this pattern, producers create events, which are immutable records of a state change, and publish them to a router. The router ingests, filters, queues, and pushes events to consumers, which receive each event and take action, potentially creating further events. In the EDA paradigm, events always move “downstream.”

A typical example of an EDA in action is a retail website with a shopping cart service that creates a ‘new order’ event when a visitor checks out. The event router ingests this event and pushes it to an inventory database consumer, which decrements the stock count for that product by one. The router might also push this event to additional consumers, but the inventory database isn’t aware of, or dependent on, any of them.
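
To make that flow concrete, here is a minimal, illustrative Python sketch of the retail example. The Event, EventRouter, and InventoryConsumer names are hypothetical stand-ins, not the API of any particular EDA product; a real deployment would use a managed event broker rather than an in-process router.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable


@dataclass(frozen=True)  # frozen=True keeps each event an immutable record
class Event:
    event_type: str
    payload: dict
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class EventRouter:
    """Ingests events from producers and pushes them downstream to subscribers."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Event], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[Event], None]) -> None:
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event: Event) -> None:
        # Push the event to every consumer registered for this type; consumers
        # only ever hear from the router, never from each other.
        for handler in self._subscribers.get(event.event_type, []):
            handler(event)


class InventoryConsumer:
    """Decrements stock when a 'new order' event arrives."""

    def __init__(self) -> None:
        self.stock = {"sku-123": 10}

    def handle(self, event: Event) -> None:
        self.stock[event.payload["sku"]] -= event.payload.get("quantity", 1)


router = EventRouter()
inventory = InventoryConsumer()
router.subscribe("new_order", inventory.handle)

# The shopping-cart service (producer) emits an event when a visitor checks out.
router.publish(Event("new_order", {"sku": "sku-123", "quantity": 1}))
print(inventory.stock)  # {'sku-123': 9}
```

The shopping cart never calls the inventory database directly; it only publishes a fact, and any number of consumers can subscribe to that fact without the producer knowing or caring.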

See also: The Challenges of Implementing and Scaling a DIY EDA

Inherent EDA security benefits

No implementation or deployment strategy is foolproof, but this one-way flow of immutable events sets up businesses for security success from the first day. They get a modern, cloud-ready application with additional layers of resiliency and visibility.

  • Resiliency in the producer-router-consumer chain: In an EDA, each service is decoupled from all others, with no dependencies beyond the flow of events. Producers, routers, and consumers all run (and fail) independently of one another. From a security standpoint, decoupling means that if a single service fails due to a security breach, no other producers or consumers are directly affected. And because EDAs typically run on containers and microservices, often under an orchestrator like Kubernetes, services don’t have to share a node, the same hardware, or even the same location.
  • Isolation due to push-based workflow: Let’s say that in the above EDA example, an attacker gains access to the containerized inventory database service due to a zero-day exploit in the open-source database’s code. While this is still a very concerning security breach, developers can still deploy patches and restore databases to a reliable state. EDAs win out in this case because of isolation, where each consumer only listens to events from the router, preventing an attacker from pushing malicious code “upstream” or hopping to another service.
  • Filtering and buffering in the event router: The buck stops at the event router. This crucial service inspects incoming events, validates those matching expected structures or values, and sends only those on to waiting consumers, limiting the damage a compromised producer can do (see the sketch after this list). On the other side, if a consumer service fails due to a security breach, messages from upstream producers queue in the router until the consumer is back online. The router acts as a smart, elastic buffer that absorbs changes in event flow without affecting other services.
  • Single source of security truth: In an EDA, the event stream is fundamentally a sequence of facts, each recording a state change. This sequence, the “narrative” of how events are produced, routed, and consumed, can be analyzed in real time to spot unexpected behavior and trigger security alerts. It’s also auditable after the fact, which lets developers and security professionals replay the log to diagnose a problem, or even run events through machine learning to model the difference between “normal” and “anomalous” and steadily refine alerts.
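
As referenced in the list above, here is a small, hypothetical Python sketch of router-side filtering, buffering, and audit logging. The FilteringRouter class and its event schema are assumptions for illustration only; real event routers (Apache Kafka, cloud event buses, and so on) implement these ideas with far more sophistication.

```python
from collections import deque

# Hypothetical schema: only well-formed 'new_order' events may reach consumers.
REQUIRED_FIELDS = {"sku", "quantity"}


class FilteringRouter:
    """Validates incoming events, buffers them while a consumer is offline,
    and keeps an append-only log that can be audited or replayed later."""

    def __init__(self) -> None:
        self.queue = deque()    # elastic buffer in front of the consumer
        self.audit_log = []     # the "single source of security truth"

    def ingest(self, event: dict) -> None:
        # Reject events that don't match the expected structure, so a
        # compromised producer can't push arbitrary payloads downstream.
        if event.get("type") != "new_order":
            return
        if not REQUIRED_FIELDS <= set(event.get("payload", {})):
            return
        self.audit_log.append(event)
        self.queue.append(event)

    def drain(self, consumer_online: bool, handler) -> None:
        # Events wait in the queue until the consumer is back online.
        while consumer_online and self.queue:
            handler(self.queue.popleft())


router = FilteringRouter()
router.ingest({"type": "new_order", "payload": {"sku": "sku-123", "quantity": 1}})
router.ingest({"type": "new_order", "payload": {"cmd": "rm -rf /"}})  # rejected
print(len(router.audit_log), len(router.queue))  # 1 1
```

The audit log doubles as the replayable record described above: the same sequence that feeds consumers can also feed monitoring and anomaly detection.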

While the EDA provides some inherent security benefits on the application layer, the environment is another matter entirely. Many businesses pursuing modern applications and digital transformation aren’t ready to migrate fully to the cloud or don’t want to abstract every piece of their infrastructure to a third party. While that is a legitimate strategy for some, it’s easy to undermine an EDA’s inherent benefits by deploying onto a substandard environment.

One common DIY strategy is microservices for the producers/consumers and Apache Kafka as the event router, with everything managed by Kubernetes. Managing and improving the security of this complex environment, which requires a great deal of time and specialist experience, might be out of reach for smaller businesses or those who don’t yet have the talent on staff. Mix that with the fact that deployments operate in not just one security environment, but many—think on-prem, hybrid clouds, multi-cloud, and CDNs that are essentially black boxes for your traffic—and the DIY approach quickly feels expensive and unmanageable.
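
To give a sense of scale: even a single producer connecting to a self-managed Kafka cluster carries security configuration the DIY team owns end to end. The snippet below is a minimal sketch assuming the confluent-kafka Python client; the broker address, credentials, CA path, and topic name are placeholders, and a real deployment also has to manage broker-side listeners, ACLs, certificate rotation, and network policy in Kubernetes.

```python
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "kafka-0.internal:9093",   # placeholder broker address
    "security.protocol": "SASL_SSL",                # encrypt traffic in transit
    "sasl.mechanisms": "SCRAM-SHA-512",             # authenticate this producer
    "sasl.username": "orders-service",
    "sasl.password": "<from-your-secrets-store>",   # never hard-code in practice
    "ssl.ca.location": "/etc/kafka/certs/ca.pem",   # trust your cluster's CA
}

producer = Producer(conf)
producer.produce("orders", key="sku-123", value=b'{"sku": "sku-123", "quantity": 1}')
producer.flush()
```

Multiply that by every producer, consumer, broker, and environment (on-prem, hybrid, multi-cloud), and the operational burden described above becomes clear.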

That’s why many businesses are opting to work with an EDA provider in the cloud to address EDA security. They deploy cloud-native applications in the cloud, forget about the hassle of managing on-prem or hybrid environments, and hand network- and hardware-level security off to a better-staffed security operations center (SOC). They then get to collect on all the downstream benefits: more time to focus on continuously deploying new service-level code that makes their applications even more resilient and secure.

Joel Hans

Joel Hans is a copywriter and technical content creator for open source, B2B, and SaaS companies at Commit Copy, bringing experience in infrastructure monitoring, time-series databases, blockchain, streaming analytics, and more. Find him on Twitter @joelhans.
