Sponsored by Hazelcast

Stream Processing in Financial Services


Stream processing is transforming how the most ambitious financial services companies serve customers and how they operate in complex, always-changing environments.

The banking, finance, and payment processing world has aggressively pursued millisecond-level response times for years, if not decades. The digitization of the financial services industry, perhaps best illustrated by Wall Street's famous trading floors being replaced wholesale by direct fiber-optic connections to exchanges and artificial intelligence (AI) brokers, is coming for every organization eager to differentiate itself and expand its market share.

At some organizations, senior technical leadership is trying to compete by collecting real-time data streams and storing them in traditional data lakes to be analyzed later in batches. But no matter how sophisticated these analysts might be, they are always slowed down by having to keep up with changing data, write sophisticated SQL queries, and merge immense volumes of data into meaningful analysis.

All while competitors stream ahead with real-time responsiveness in valuable applications:

  • 50x faster payment processing: With real-time and peer-to-peer payments on the dramatic rise, not just between friends but across borders and currencies, organizations can now scale to tens of thousands of transactions per second with millisecond-level latency—all while maintaining data security and over five-nines of availability.
  • Providing recommendations and personalization: If a retail bank can offer a financial product (e.g., a personal loan) at the right time, both the bank (in added revenue) and the customer benefit. For example, one bank found that a good time to offer a personal loan is when a customer tries to withdraw money from an ATM but has insufficient funds. Unfortunately, its batch-based system could not calculate a suitable loan until two days later. With stream processing, the bank was able to calculate the “right offer at the right time” immediately and deliver the loan offer over SMS. Its loan conversion rate increased by 400%.
  • Managing unpredictable loads: In retail banking, an organization can never truly predict the next major spike in customer demand. If infrastructure can’t scale to accommodate these requests, customers start to wonder about the integrity and availability of their money. Stream processing unlocks faster and more scalable transaction infrastructure to get customers on their way.
  • Navigating risk and compliance: Fraud detection is a compute-intensive process requiring thousands of rules executed in memory, often by machine learning (ML) and other artificial intelligence (AI) technologies. By offloading the effort from human analysts, financial services organizations can apply more checks than even a team of fraud analysts could, within the low-latency requirements of modern banking.
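The in-memory rule evaluation described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the `Txn` type, the rule thresholds, and the `fraud_score` function are all assumptions made for the example, and a real system would run thousands of such rules, many learned by ML models.

```python
from dataclasses import dataclass

@dataclass
class Txn:
    account: str
    amount: float
    country: str

# Hypothetical rule set: each rule is evaluated in memory on the hot
# path, with no database round-trip per transaction.
RULES = [
    lambda t: t.amount > 10_000,              # unusually large amount
    lambda t: t.country not in {"US", "GB"},  # outside usual regions
]

def fraud_score(txn: Txn) -> int:
    """Count how many rules a transaction trips."""
    return sum(1 for rule in RULES if rule(txn))

# A large transaction from an unusual region trips both rules.
score = fraud_score(Txn("acct-3", 50_000.0, "BR"))
```

Because every check is a pure in-memory function, adding another rule is a one-line change, which is how a streaming system can apply far more checks than a human team at millisecond latencies.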

See also: How AI/ML Can Help Banks Bridge the Human-Digital Divide

Enabling real-time responsiveness with stream processing

The value proposition of stream processing is clear, but the path toward real-time capabilities is much less so. These organizations operate in uniquely complex and highly regulated landscapes, where every digital system must be built with data encryption, security, and customer privacy in mind from the start, and their inflexible existing solutions often hamper them.

As engineering teams work to implement streaming capabilities, the complexity of legacy code stretches their development lifecycles into multiple months. Even when they can extend existing applications with real-time responsiveness, they struggle with old infrastructure and data storage that can’t operate at the speed of stream processing, much less meet the demanding SLAs and availability requirements of today’s customers.

When looking to connect existing systems to a modern and extensible stream processing platform, speed is a given, but other qualities are equally important:

  • Ease of integration: In cases where a legacy architecture prevents adding real-time capabilities directly into the core architecture, a stream processing solution can seamlessly ingest data from multiple streams, store it in memory, and send results on to other sinks without being constrained by the legacy core.
  • Fault tolerance: When dealing with a customer’s money and other financial assets, not even the smallest detail can slip through the cracks. In the unfortunate event of a hardware failure or network outage, the stream processing system must provide exactly-once end-to-end processing guarantees and disaster recovery using multiple data centers.
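The source-process-sink shape described under "ease of integration" can be sketched generically. This is a minimal illustration under stated assumptions, not tied to any specific product: the `source`, `process`, and `sink` functions are stand-ins for a real message broker, an in-memory processing stage, and a downstream system, and real platforms add the exactly-once and failover guarantees discussed above.

```python
def source():
    # Stand-in for a payment feed or message-broker topic.
    yield from [{"id": 1, "amount": 120.0}, {"id": 2, "amount": -5.0}]

def process(events):
    # In-memory stage: drop invalid amounts, tag the rest.
    for event in events:
        if event["amount"] > 0:
            yield {**event, "status": "ok"}

def sink(results, out):
    # Stand-in for writing results to a downstream system.
    out.extend(results)

out = []
sink(process(source()), out)
```

The key property is that the pipeline sits beside the legacy core rather than inside it: events flow from source to sink through the streaming layer, so the core systems never have to change to gain real-time behavior.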

The fundamentals of stream processing are already transforming how the most ambitious financial services companies serve customers and how they operate in complex, always-changing environments. Real-time action is poised to reshape their industry all over again.

Joel Hans

About Joel Hans

Joel Hans is a copywriter and technical content creator for open source, B2B, and SaaS companies at Commit Copy, bringing experience in infrastructure monitoring, time-series databases, blockchain, streaming analytics, and more. Find him on Twitter @joelhans.
