When looked at side by side, event-driven architecture (EDA) and serverless seem like a perfect match. EDA is a fast-growing software development paradigm that connects distributed services through asynchronous events, with an emphasis on processing those events as close to real time as possible. Serverless is a deployment strategy that lets developers package applications into discrete containers that the provider provisions and runs only when the code is needed.
Thus, an event-driven serverless deployment consists of many producers and consumers working in tandem, likely across different environments or clouds, with the hard work of building, deploying, securing, and scaling all of those discrete components abstracted away to the serverless provider.
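As a rough illustration of that shape (not tied to any particular provider or SDK), here is a minimal TypeScript sketch in which a producer emits an event and a serverless consumer function handles it; the event shape and the publish callback are hypothetical stand-ins for whatever broker or provider is actually in use.

```typescript
// Minimal sketch of an event-driven serverless flow. `OrderEvent` and the
// `publish` callback are hypothetical stand-ins, not a real provider API.

interface OrderEvent {
  type: "order.placed";
  orderId: string;
  placedAt: string; // ISO-8601 timestamp
}

// Producer: runs anywhere (a web app, another function) and simply emits an event.
export async function placeOrder(
  orderId: string,
  publish: (event: OrderEvent) => Promise<void>
): Promise<void> {
  await publish({ type: "order.placed", orderId, placedAt: new Date().toISOString() });
}

// Consumer: a serverless function the provider invokes once per event (or batch).
// Note that no server, queue, or scaling logic appears in the application code.
export async function onOrderPlaced(event: OrderEvent): Promise<void> {
  console.log(`Fulfilling order ${event.orderId} placed at ${event.placedAt}`);
  // ...call downstream services or emit follow-up events here.
}
```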
Then why haven’t the two already fully overlapped in the hearts and minds of CTOs, developers, and the businesses they work for?
Serverless adoption hasn’t been without a few hiccups, especially considering that some version of running abstracted functions in the cloud has existed since the mid-aughts. Developers appreciated instant scalability and abstracted provisioning and security, and their IT leads liked the sound of pay-as-you-go pricing to reduce costs and simplify capacity planning. But they couldn’t always see the overall value of pushing their existing deployments to a serverless provider.
However, as more of these same businesses dig into event-driven architectures, they’re also starting to see how the serverless benefits touted for years become even more compelling in a few unique ways.
Simplified provisioning and deployment
One of the biggest benefits of serverless is that developers don’t have to worry about provisioning their servers, whether physical or virtual, and they know that deployment is generally as simple as uploading their code and waiting for the green light. They also benefit from the provider’s global infrastructure, meaning their code runs close to the user for minimal latency.
But when developers run a serverless EDA, this simplicity compounds. They can pour their focus into delivering the best end-user experience, no longer held back by capacity planning or debates over scaling horizontally versus vertically; the main design concern left is building reliable interactions between discrete producers and consumers.
Serverless detractors often point out the threat of vendor lock-in, but the reality is that EDA incentivizes the smallest possible discrete parts, which simplifies migration. In addition, most serverless providers support one or more open container standards, such as those from the Open Container Initiative (OCI), which means the underlying logic can run on any number of clouds regardless of how or where developers built it.
Scalability where it matters
One clear benefit of using serverless is that the provider can instantly run multiple instances of a single, containerized application during periods of high demand. This is an automated version of horizontal scaling, where a team managing its own infrastructure would add physical or virtual nodes to handle the increased load and keep the end-user experience intact. This kind of scaling is great, but it’s not particularly “smart.”
Once again, the inherent features of an event-driven architecture make the scalability picture even more enticing. For example, a SaaS application might include an area for users to update their profiles. Submitting the form creates an event that’s sent to three different event channels and many different consumers, but the process of converting and resizing the user’s profile picture takes the longest by far. A serverless EDA can scale that particular consumer on its own, keeping every event queue as short as possible during a spike in load.
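A sketch of what that slow consumer might look like, with hypothetical event fields and placeholder image and storage helpers, shows why it scales so naturally: the function handles exactly one event, so the provider can simply run more copies of it as the queue grows.

```typescript
// Hypothetical consumer for the slowest step in the profile-update flow: resizing
// the uploaded picture. The provider invokes one instance per queued event, so
// when resize events pile up it simply runs more copies of this function in parallel.

interface ProfilePictureUploaded {
  userId: string;
  imageUrl: string;
}

async function downloadImage(url: string): Promise<Uint8Array> {
  const response = await fetch(url);
  return new Uint8Array(await response.arrayBuffer());
}

// Placeholder for a real image library; resizing is what makes this consumer slow.
async function resizeImage(image: Uint8Array, width: number, height: number): Promise<Uint8Array> {
  return image; // e.g., hand off to sharp, ImageMagick, or a similar tool
}

// Placeholder for writing the thumbnail to object storage.
async function saveThumbnail(userId: string, thumbnail: Uint8Array): Promise<void> {
  console.log(`Saved ${thumbnail.byteLength}-byte thumbnail for ${userId}`);
}

export async function onProfilePictureUploaded(event: ProfilePictureUploaded): Promise<void> {
  const original = await downloadImage(event.imageUrl);
  const thumbnail = await resizeImage(original, 256, 256);
  await saveThumbnail(event.userId, thumbnail);
  // The other event channels (audit logging, notifications, etc.) are handled by
  // their own consumers, each scaled independently of this one.
}
```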
Throughput is no longer capped by a single overworked service, and there is far less worry about how much load a single database node can handle. A serverless EDA gets most businesses much closer to the fabled land of “elasticity.”
Low latency in multiple forms
One early critique of serverless deployments was the “cold start”: the time a containerized application needs to relaunch after the provider has spun it down during an idle period to keep costs down.
The small, decoupled, and discrete foundation of an EDA makes cold starts far less likely when going serverless: even a nominal amount of user interaction keeps most producers and consumers active and responding quickly. In the rare case that an event does have to wait, the consumer is generally lightweight enough, with no dependencies on other parts of the infrastructure, to start almost instantaneously. In addition, EDA allows multiple consumers to receive the same event simultaneously, which reduces latency and improves throughput for a more elastic-feeling infrastructure.
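That fan-out can be pictured with a small sketch. In a real deployment the subscriptions live in provider or broker configuration rather than an in-memory list, but the parallel delivery is the point.

```typescript
// Sketch of fan-out: one event delivered to several independent consumers. The
// in-memory registry below only illustrates the shape; real subscriptions are
// declared in provider or broker configuration.

interface ProfileUpdated {
  userId: string;
}

type Consumer = (event: ProfileUpdated) => Promise<void>;

const consumers: Consumer[] = [
  async (event) => console.log(`Reindexing search results for ${event.userId}`),
  async (event) => console.log(`Writing audit record for ${event.userId}`),
  async (event) => console.log(`Invalidating cached profile for ${event.userId}`),
];

// Delivering to every consumer in parallel keeps end-to-end latency close to the
// slowest single consumer rather than the sum of all of them.
async function publish(event: ProfileUpdated): Promise<void> {
  await Promise.all(consumers.map((consume) => consume(event)));
}

publish({ userId: "user-123" }).catch(console.error);
```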
Serverless providers have also been busy building new features to help find the right balance between warm and cold. Developers can now configure what “idle” means, when to scale down to zero, or even run more instances of a particular event flow in certain regions of the world based on real-time demand.
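The exact knobs differ from provider to provider, and the field names below are purely illustrative placeholders rather than any real API, but the general shape of those settings tends to look something like this:

```typescript
// Hypothetical scaling settings for a single event-driven function. Every field
// name here is an illustrative placeholder, not a real provider API; actual
// providers expose similar knobs through their own config files, consoles, or SDKs.

interface FunctionScalingConfig {
  minInstances: number;        // instances kept warm (0 allows scale-to-zero)
  maxInstances: number;        // hard ceiling during traffic spikes
  idleTimeoutSeconds: number;  // how long an instance sits unused before reclamation
  regionalOverrides?: Record<string, { minInstances: number }>;
}

const resizeAvatarScaling: FunctionScalingConfig = {
  minInstances: 0,             // accept the occasional cold start to minimize cost
  maxInstances: 200,
  idleTimeoutSeconds: 300,
  regionalOverrides: {
    // Keep warm instances where real-time demand is consistently high.
    "eu-west-1": { minInstances: 2 },
  },
};

export default resizeAvatarScaling;
```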
The additive benefits of event-driven architecture and serverless together also include faster deployments, reduced costs, improved observability into the health and performance of a deployment, and much more. For developers and CTOs building an event-driven architecture for the first time, or those who gave up on serverless back when they were maintaining monoliths, now just might be the time to see how the two continue to grow together.