Businesses should optimize their cloud architectures to deliver the continuous analytics that are fundamental to ongoing operational efficiency and commercial success.
In 2017, The Economist published an article titled “The world’s most valuable resource is no longer oil, but data.” In truth, firms in the financial services sector, especially capital markets, have known this for many years.
The digitization of financial services arguably began with the Big Bang of 1986, when major exchanges like the London Stock Exchange switched to an automated quotation system, replacing the trading floor. From that moment on, technology has continued to disrupt and transform the sector – the introduction of High-Frequency Trading (HFT) and cryptocurrencies are two obvious examples, but there are many more.
Data, more specifically, the analysis of data, has been fundamental to this transformation. Financial services firms were among the first companies to realize that faster access to deeper, richer insights could give them a significant competitive advantage over their peers, especially in areas such as HFT.
However, and rather inevitably, the focus on analytics and the development of ever more sophisticated – and automated – algorithms using Machine Learning to generate maximum alpha and minimize risk has created what can best be described as an ‘arms race’ in the sector. For banks to remain agile and competitive, they must constantly update their offering to customers, developing new strategies that advance positions and protect customers. For that to happen, they need to re-think their approach to data analytics.
The time-series continuum
Banks can’t out-compete their rivals with aged data, yet many organizations are still working to timescales of days and weeks when it comes to developing, validating, and launching new algorithms and models – when they need to be thinking in seconds and minutes.
This might seem a strange statement after lauding the sector in the opening paragraphs for being at the vanguard of data and analytics, but the truth is many existing data management and analytics platforms simply can’t deliver the speed of insight required on the huge volumes of data being created.
Why is this? Arguably a combination of legacy systems – inflexible on-prem data centers and applications that cannot be easily scaled to deliver the storage and compute needed to tackle the huge volumes of data being created – and an approach to analytics that puts too much focus on historical, or big data.
I view all data, especially financial, as time series, meaning it has a specific value at a fixed point in time. The real-time context here is incredibly important as the business value of that data begins to perish from the moment it is created.
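One way to make that perishability concrete is to weight each observation by its age, so that stale data contributes exponentially less to a signal than fresh data. The sketch below is purely illustrative – the `Tick` structure and the five-second half-life are invented for this example, not drawn from any particular platform:

```python
from dataclasses import dataclass
from math import exp, log


@dataclass
class Tick:
    timestamp: float  # seconds since some epoch
    price: float


def decayed_average(ticks: list[Tick], now: float, half_life_s: float = 5.0) -> float:
    """Average tick prices, weighting each by how recently it arrived.

    A tick loses half its weight every `half_life_s` seconds, so the
    business value of an observation literally decays from the moment
    it is created.
    """
    decay = log(2) / half_life_s
    weights = [exp(-decay * (now - t.timestamp)) for t in ticks]
    weighted_sum = sum(w * t.price for w, t in zip(weights, ticks))
    return weighted_sum / sum(weights)


ticks = [Tick(0.0, 100.0), Tick(9.0, 104.0), Tick(10.0, 106.0)]
# The two most recent ticks dominate; the ten-second-old print barely counts.
signal = decayed_average(ticks, now=10.0)
```

The same idea underpins the exponentially weighted statistics common in real-time market analytics: recency is not an afterthought but the core of the weighting scheme.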
By adopting an approach that treats data analytics as a continuous insights ‘engine’ rather than a batch process (even if that batch runs hourly or daily), there are never any ‘gaps’ in an organization’s understanding of how the business is operating. The analytics become a continuous, recorded stream of truth – one that enables the rapid innovation and development of products and services.
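To illustrate the ‘engine, not batch’ distinction, here is a minimal, hypothetical Python sketch of a rolling statistic that refreshes on every incoming event. Nothing waits for an hourly job: each tick immediately updates the analytic, so there is no gap between something happening and it being reflected in the numbers:

```python
from collections import deque


class RollingMean:
    """Mean over the last `window` observations, updated per event.

    Contrast with a batch job: here the statistic is current after
    every single tick, not only after the next scheduled recompute.
    """

    def __init__(self, window: int):
        self.values: deque[float] = deque(maxlen=window)
        self.total = 0.0

    def update(self, price: float) -> float:
        if len(self.values) == self.values.maxlen:
            self.total -= self.values[0]  # evict the oldest before it rolls off
        self.values.append(price)
        self.total += price
        return self.total / len(self.values)


stream = [100.0, 101.0, 103.0, 102.0, 105.0]
rm = RollingMean(window=3)
readings = [rm.update(px) for px in stream]  # an up-to-date value per event
```

A production engine would add time-based windows, out-of-order handling, and persistence, but the principle is the same: the analytic is a continuously maintained state, not a periodically regenerated report.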
Continuous analytics in the cloud
All financial organizations have a cloud strategy. The benefits of scale, flexibility, and cost savings are well understood. The strategic focus should now shift to ensuring their cloud architecture is optimized to deliver the continuous analytics that are fundamental to ongoing operational efficiency and commercial success.
Three key considerations:
1: Access to data sources: The more market, trade, order, and reference data that can feed into an analytics platform, the better the quality of the insights that can be extracted. However, financial organizations have struggled with managing the sheer volumes, formats, and location of the data. Continuous analytics depends on the ability to instantly connect data sources, regardless of location, and build data pipelines to run analytics on. How easy is it to build these pipelines? Are they resilient, and can they scale?
2: Optimized for coding: How quickly can quants, data analysts, and data scientists extract value from the data rather than spend time managing and processing it? Can they use preferred languages such as SQL and Python to develop and validate new algorithms in timescales that deliver a competitive advantage? Are there inbuilt Machine Learning interfaces or easy access to relevant microservices? Can reports be built quickly and visualized in a way that delivers immediate value?
3: Upgrade path: Most, if not all, banks, as part of their cloud strategy, will be looking at how to use the cloud to re-architect and, eventually, rebuild their data management and analytics systems. What does that journey look like? How easy will it be to move data and optimize costs? What is the path to re-architecture and, in some cases, re-write applications? How integrated will it be with their cloud vendor’s wider ecosystem of services and support?
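To ground the second consideration – letting analysts extract value in their preferred languages rather than wrangle infrastructure – here is a small, self-contained sketch using Python’s built-in `sqlite3` module. The `trades` table, its columns, and the figures are all invented for illustration; the point is that a per-minute VWAP, a typical first question asked of a new feed, takes only a few lines of SQL once the data is queryable:

```python
import sqlite3

# Land some trade records in an in-memory store (schema is hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (ts TEXT, price REAL, size INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [
        ("2024-01-02 09:30:05", 100.0, 200),
        ("2024-01-02 09:30:40", 100.5, 100),
        ("2024-01-02 09:31:10", 101.0, 300),
        ("2024-01-02 09:31:55", 100.8, 400),
    ],
)

# Per-minute volume-weighted average price, grouped on the minute prefix
# of the timestamp string.
vwap = conn.execute(
    """
    SELECT substr(ts, 1, 16) AS minute,
           SUM(price * size) / SUM(size) AS vwap
    FROM trades
    GROUP BY minute
    ORDER BY minute
    """
).fetchall()
```

In a real deployment the store would be a cloud analytics platform fed by resilient pipelines rather than an in-memory database, but the analyst-facing experience – familiar SQL or Python against instantly connected data – is exactly what the considerations above are asking vendors to deliver.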
Most, if not all, banks will eventually move the vast majority of their data management and analytics requirements to the cloud. There will still be some processes requiring ultra-low latency that can only be delivered on-prem, but otherwise, the cloud offers a myriad of benefits. However, all clouds, just like all real-time analytics platforms, are not the same, and banks need to give careful consideration to how their existing data sources, applications, and processes can be migrated, all the while prioritizing the need for a continuous stream of analytics and insights.