
EDA in Financial Services: Modernizing Apps and Embracing Real-Time

An event-driven architecture provides a foundation upon which financial services organizations can move from batch to real-time applications and modernize their offerings.

Financial services organizations face growing challenges from FinTech competitors and must meet rising customer expectations for instant access to information and fast responses to requests. Legacy approaches that relied heavily on batch processing must be replaced with modern applications that operate in real time.

Recently, RTInsights sat down with Rafael Marins, Senior Principal Technical Marketing Manager for Financial Services at Red Hat, and talked about the technology challenges of moving to real-time applications, the move to cloud-native applications and microservices, and the role of event-driven architectures. Here is a summary of our conversation.

RTInsights: What are the limitations of using batch and other current technologies in the financial services sector?

Marins: To give you an example, many types of payments are settled in batch mode. The batch runs after hours to reconcile all of the transactions received in a given time frame, combine them into a total, and then deposit the money in the merchant’s bank account.
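To make that pattern concrete, here is a minimal sketch in Java of what a nightly settlement batch does conceptually: collect the day's payments, group them by merchant, and settle one total per merchant hours after the customer actually paid. The Payment record, merchant IDs, and the printed "deposit" are hypothetical placeholders, not any institution's actual system.

```java
import java.math.BigDecimal;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of batch settlement: payments captured during the day are grouped
// by merchant and deposited as a single total after hours. All data here is
// hypothetical and stands in for the day's transaction file.
public class NightlySettlementBatch {

    record Payment(String merchantId, BigDecimal amount) {}

    public static void main(String[] args) {
        // In a real batch job these would be read from the day's records.
        List<Payment> todaysPayments = List.of(
                new Payment("merchant-42", new BigDecimal("19.99")),
                new Payment("merchant-42", new BigDecimal("5.00")),
                new Payment("merchant-7",  new BigDecimal("120.00")));

        // Reconcile: combine every payment in the window into one total per merchant.
        Map<String, BigDecimal> totals = todaysPayments.stream()
                .collect(Collectors.groupingBy(Payment::merchantId,
                        Collectors.reducing(BigDecimal.ZERO, Payment::amount, BigDecimal::add)));

        // Settle: one deposit per merchant, long after the customer paid.
        totals.forEach((merchant, total) ->
                System.out.printf("Depositing %s to %s%n", total, merchant));
    }
}
```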

Even today, this has an impact on people’s lives. Batch processing was perhaps the second generation in the evolution of systems in the financial sector, and it represented a huge improvement over the previous alternative, which was the manual execution of financial operations from centralized records. For many portfolios, such as consumer fund settlements and bookkeeping, the first generation is where the large, centralized database emerged. Institutions then reproduced that same operating model in systems that process transactions and customer data, and those systems are still running today.

These systems and the infrastructure model that supports them have served most institutions well for many years. It is quite incredible how the technology in financial services has evolved over time, and that evolution has brought significant improvements to people’s lives. But that history also helps to explain the current problems and limitations. Most institutions live with four or five generations of systems and processes implemented over the past decades.

If the processes are long and time-consuming, with a fixed set of operating windows, then it’s easy to imagine the problems that batch processing errors can generate. The worst part of maintaining batch processing is that when it breaks, typically at 3:00 a.m., you are the one getting the phone call to fix the problem.

Now, from the customer’s perspective, when records are not processed properly or a failure occurs, frustration mounts, and brand reputation can suffer. Going from problem identification to correcting the record with customer service can take a couple of days. That is the limitation of batch.

RTInsights: What are some of the pressing issues in the financial services sector that are driving the need for EDA adoption and use?

Marins: When talking about drivers, it’s always about customers. They are the heart of the business. Financial institutions are moving to a customer-centric mindset. They need to leave behind the verticalized structures they have used and learn that by loosening the status quo, it’s possible to create new and better customer experiences. Working with FinTech or other companies who specialize in a specific function of the value chain is what can make the difference. Today, financial services customers have a digital experience at their fingertips and can initiate a transaction with one click.

Institutions wage a real internal battle over customer interactions on digital channels because, without real-time events, it’s not possible to give immediate feedback on the user’s screen. They have to cope with multiple systems that embody the operating models of several generations, and it’s not uncommon for those technical gaps to be visible in the customer experience. There are multiple demands: increased transactions driving up volumes, paired with customer expectations for shorter settlement times and constant pressure to reduce costs. That means organizations need to transform and modernize the way they host, run, and process their transactions, which impacts both their systems and applications.

RTInsights: What are the technical challenges of making use of event-driven and real-time data?

Marins: The technical challenges are tied to adopting new processes related to IT systems, development, and IT operations to address these new business drivers. It usually means becoming more agile: changing the way you operate systems, deploy them, and put them into production. It really alters the way financial services institutions operate. They need to transform not only the business and the way they engage with customers; it also requires an internal cultural shift to make this happen.

The cultural shift is to stop thinking about centralized systems and begin decomposing these systems into small, flexible business capabilities. Those small business capabilities are mainly microservices that can be rapidly developed and integrated into the existing banking architecture. One advantage of this is that as things are decomposed, they are also reconsidered for digital transactions, so improvements are made in the business processes, policies, and procedures as these microservices are defined. And they can provide value to internal or external users with digital simplicity and speed of engagement.

With microservices, you need a cloud-native platform. That is a different paradigm in terms of architecture and design principles.

This is where event-driven architecture comes into play. It helps you build a scalable business operation with multiple business capabilities, one that is resilient in terms of data streaming and in how multiple distributed components consume data.

You need things like stream processing to meet these challenges, along with orchestration so everything operates per company and regulatory directives. Applications are composed of a set of event-driven services connected via data streaming. Stream processing enables continuous, real-time applications to react to, process, and transform multiple streams of events at scale, or to capture streams of events and data changes and feed them into other data systems or processes in real time. At the same time, you can publish the data to another database for historical analysis and retention.
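As a rough sketch of that pattern, the Kafka Streams snippet below consumes a stream of payment events, enriches each one as it arrives, and publishes the result to another topic for downstream consumers. The topic names, broker address, and string-based event format are illustrative assumptions, not anything from a specific deployment.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

// Minimal Kafka Streams sketch: continuously react to payment events and
// forward enriched records to another topic. Topic names are hypothetical.
public class PaymentStreamApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-stream-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments");

        // Transform each event as it arrives instead of waiting for a batch window.
        payments.mapValues(value -> value + ",processed-at=" + System.currentTimeMillis())
                .to("payments-processed");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The same input stream can feed several such topologies at once, which is what lets one copy of the data serve the real-time path and a historical database in parallel.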

In our view, an agile process is needed that connects technologists to the bank’s legacy applications and to the firm’s financial services in general. It also needs an architecture framework built on open source that drives new efficiencies in systems operations by aligning containerized microservices, hybrid cloud, and APIs with agile and DevOps processes. Such processes are designed to accelerate the delivery of new products and services to financial services customers.

RTInsights: How does EDA let financial services companies modernize and improve the customer experience?

Marins: We have a multinational issuer that is transforming from the mainframe to a microservices-based, event-driven architecture. They are preserving data integrity and consistency between internal, web, and mobile customer-facing operational applications. To connect core legacy data systems with new business applications and channels, they needed to roll out a new enterprise-scale, event-driven architecture in less than one year. They were able to handle event replication via streaming, reducing the time from days to seconds.

Event streaming and replication across systems lets IT teams work faster and with greater efficiency, rolling out modern applications that provide real-time customer insights.

Another example is a bank that’s incrementally adding event-driven microservices to a monolithic mainframe application. They were able to replace batch jobs with real-time streaming applications for fraud detection and reporting. This decreased the anomaly detection time from weeks to seconds, reduced costs and fraud, and let the bank provide better services to customers.
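As a hedged illustration of the kind of streaming job that can replace such a batch report (the topic name, five-minute window, and alert threshold below are all assumptions made for the sketch), a Kafka Streams topology can count transactions per card in a short window and emit an alert within seconds of a suspicious burst:

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

// Sketch: flag cards with an unusual burst of transactions in a 5-minute
// window, emitting alerts in seconds rather than via a periodic batch job.
public class FraudAlertApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-alert-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Assumes each record is keyed by card ID.
        KStream<String, String> transactions = builder.stream("card-transactions");

        transactions.groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                .count()
                .toStream()
                .filter((windowedCardId, count) -> count > 10) // illustrative threshold
                .foreach((windowedCardId, count) ->
                        System.out.printf("ALERT: card %s made %d transactions in 5 minutes%n",
                                windowedCardId.key(), count));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

In practice an alert like this would be published to its own topic for a case-management service rather than printed, but the shape of the job is the same.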

RTInsights: What role can Kafka and Red Hat offerings play to help?

Marins: Red Hat understands the complexities of the financial industry’s technology footprint and the challenges the financial services industry faces in its efforts to revamp and modernize real-time systems. To that end, Red Hat offers capabilities and technologies that drive integration and automation so that firms have the tools they need to modernize.

That means open source, as part of Red Hat’s offerings, can deliver substantial cost savings over proprietary products. It provides a technology foundation that is provider-agnostic and consistent throughout the enterprise, allowing any real-time solution to take advantage of external resources and enabling internal teams to reuse their skill sets across a wide variety of IT projects.

For example, efficient messaging is critical for real-time systems in financial services. A distributed messaging platform lets systems across the infrastructure communicate with each other. Red Hat AMQ is a flexible messaging technology based on open source. It integrates and provides messaging services to application endpoints and devices, and the skill sets gained using Red Hat AMQ apply to digital channels, branches, or any other part of the firm.
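To give a feel for that style of messaging, here is a minimal sketch using the standard JMS 2.0 API, which AMQ's broker supports over AMQP. The broker URL, queue name, and message payload are illustrative assumptions, and the Qpid JMS client is just one way to obtain a ConnectionFactory.

```java
import javax.jms.ConnectionFactory;
import javax.jms.JMSContext;
import org.apache.qpid.jms.JmsConnectionFactory;

// Minimal JMS 2.0 sketch: one service publishes a payment notification to a
// queue on a broker; any endpoint subscribed to that queue receives it.
// The broker URL, queue name, and payload are hypothetical.
public class PaymentNotifier {
    public static void main(String[] args) {
        ConnectionFactory factory = new JmsConnectionFactory("amqp://localhost:5672");
        try (JMSContext context = factory.createContext()) {
            context.createProducer()
                   .send(context.createQueue("payments.notifications"),
                         "payment-received:order-1234");
            System.out.println("Notification sent");
        }
    }
}
```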

Then there’s the Apache Kafka stream processing platform, which delivers reliable information to support real-time integration and connection. With Red Hat AMQ Streams, you can share data between microservices and applications running on Red Hat OpenShift with high throughput and low latency. It also provides partitioning, replication, and fault tolerance.
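On the producing side, a minimal sketch of publishing an event to a Kafka topic looks like the following. The topic, key, and broker address are assumptions, and acks=all is one way an application opts into the replication guarantees mentioned above.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Sketch: publish an event to a Kafka topic. acks=all waits for the in-sync
// replicas to confirm the write, which is what provides fault tolerance.
public class EventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for replicas

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The key determines the partition, so events for one account stay ordered.
            producer.send(new ProducerRecord<>("account-events", "account-42", "balance-updated"));
        }
    }
}
```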

Also part of this story are containers, which can simplify application development, deployment, and portability across platforms. This eliminates the need to refactor services to run them on different infrastructure and makes your environment more efficient. Containers also let developers deploy services like data streams, caching, automation, or business rules on demand without needing to provision additional infrastructure. It also means that internal policies can be codified into software delivery: access is authorized for those who need it, common definitions are shared, and security updates are applied consistently.

A production-grade container application platform like Red Hat OpenShift provides services to these containers, workloads, and components. It delivers built-in security features for container-based applications, including role-based access control, isolation, and checks throughout the container build process.

As the world leader in enterprise open source, Red Hat helps communities create the future of cloud technology and enables companies to innovate, promoting repeatability and simplifying technology so they can compete more successfully in the marketplace.

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
