Extreme Transaction Processing: Not Just for Wall Street

An extreme transaction processing system could be used for anything from mobile app purchases to ecommerce and the Internet of Things.

Extreme transaction processing, or XTP, has been a viable concept for more than a decade. It has enabled Wall Street firms and financial services giants to process and settle trades and transactions at subsecond speeds. In the fast and fickle stock market, the ability to buy at one fraction of a price and sell a moment later at another is the stuff of high-paced finance.

Now, XTP is drawing interest from organizations well beyond Wall Street and the finance sector. Businesses across industries are looking to invest in extreme transaction processing systems to support capabilities within the emerging global digital economy.

In a new report published by Forbes Insights and SAP, extreme transaction processing is defined as the “ability to process at least two million transactions per hour.” In today’s digital economy, the report’s authors point out, there’s “no tolerance for latency, loss, inconsistency or failure. Digital business innovation depends on transactions that are measured in milliseconds or microseconds—millions of them per hour.”
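For a rough sense of scale, here is a back-of-the-envelope conversion of that two-million-per-hour threshold into per-second throughput and a per-transaction latency budget. This is a minimal sketch: the figures follow directly from the report’s definition, while the worker count is an arbitrary assumption for illustration.

```python
# Back-of-the-envelope math for the report's XTP threshold of
# 2 million transactions per hour. The worker count below is an
# illustrative assumption, not a figure from the report.

TX_PER_HOUR = 2_000_000

tx_per_second = TX_PER_HOUR / 3600        # ~556 transactions per second, sustained
budget_ms_serial = 1000 / tx_per_second   # ~1.8 ms per transaction if handled one at a time

print(f"{tx_per_second:.0f} transactions/second sustained")
print(f"{budget_ms_serial:.1f} ms per transaction if processed strictly serially")
print("With 32 concurrent workers (assumed), each could take "
      f"{budget_ms_serial * 32:.0f} ms and still keep pace")
```

Even at that seemingly modest rate, a pause of just a few seconds leaves a backlog of thousands of transactions, which is why the report frames latency budgets in milliseconds or microseconds.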

Today’s transactions go beyond traditional orders and payments, the report states, encompassing everything from big data to the Internet of Things. “They now include machine-to-machine transactions, micropayments, peer-to-peer money transfers, pay-per-second products and services, mobile apps and purchases through TV set-top boxes. And any other data-generating event—sensor readings, streaming information from Internet-connected products and activity recorded by in-store beacons, to name just a few.”

Related: How an ecommerce site cut latency to 0.07 milliseconds

What to look for in an XTP system

The report makes several recommendations on instituting a successful XTP approach:

Keep in mind it’s more of an organizational challenge than a technical one. The technology for supporting millions of transactions per second is readily available and cost-effective. However, the business needs to be ready to capitalize on the capability. “No single technology solution will fit every organization or business case, and the evolution to extreme transaction processing is as much a change management challenge as a technology one,” the report states.

Define which transactions have value. Not every transaction in the enterprise needs to run at blazing speeds; in some cases, though, competitiveness may depend on it. As one analyst put it in the report: “There are nuggets of valuable information, but unless you know what those nuggets look like, it makes no sense to buy the shovels and start digging.”

Start small, and experiment. As with many things, the only way to figure out where XTP can make a difference is through experimentation. The report urges this experimentation in several key areas, including edge computing, “which shifts transaction processing from core datacenters to the outer edges of a network, where the device generating the transaction is located,” and distributed transaction processing and monitoring systems, “which ensure that when transactions involve multiple databases, each remains synchronized.” The report also suggests experimenting with in-memory technology, “which keeps critical data in main memory and performs the continuous queries required to analyze data in motion.” Another area worth exploring is complex event processing engines, “which identify and analyze relationships from large volumes of fast-moving data and initiate a response, such as flagging fraud before a transaction completes or rescheduling a shipment in a supply chain to avoid delays.” (A simple sketch of that kind of fraud-flagging rule follows these recommendations.)

Consider the cloud. Cloud providers offer XTP capabilities that can be turned on with a few clicks in their portals. “Because companies can pay only for the computing they use, the public cloud offers flexibility that supports the shifting workload levels and volumes of data inherent in extreme transaction processing.”

Examine business processes, as well as IT infrastructure, databases and applications that support them. Don’t let XTP become another layer of complexity, the authors state. “Pilot projects may point to ways to simplify and streamline. Otherwise, despite the best of intentions, you put layers of new complexity on top of old complexity and you’ll choke the new data to death.”

Stay current with the technology. Because XTP is gaining converts across industries, there will be a race to achieve these capabilities, just as there was in financial services a decade ago. “Any business that does not evolve its transaction processing technology runs the risk of being unable to make use of live data in the moment, as it is needed,” the authors warn. “That risk is compounded by the fact that competitors are probably developing that capability, so those that don’t will be seen as slow and unresponsive to market needs and customer requirements.”
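To make the complex event processing idea mentioned above concrete, below is a minimal sketch of the kind of sliding-window rule such an engine might apply: flagging suspicious card activity before a transaction completes. The event fields, thresholds and window size are illustrative assumptions, not anything taken from the Forbes Insights/SAP report.

```python
# Minimal sketch of a complex-event-processing-style rule, assuming a
# hypothetical fraud check: flag a card that produces too many
# high-value transactions inside a short sliding window.
# All thresholds and field names below are illustrative assumptions.
from collections import defaultdict, deque

WINDOW_SECONDS = 60        # sliding window length (assumed)
MAX_TX = 3                 # high-value transactions allowed per window (assumed)
HIGH_VALUE = 500.00        # amount that counts as "high value" (assumed)

recent = defaultdict(deque)  # card_id -> timestamps of recent high-value transactions


def check_transaction(card_id: str, amount: float, ts: float) -> bool:
    """Return True if the transaction may proceed, False if it should be flagged.

    Runs before the transaction completes, so a flag can block settlement.
    """
    if amount < HIGH_VALUE:
        return True

    window = recent[card_id]
    # Drop events that have fallen out of the sliding window.
    while window and ts - window[0] > WINDOW_SECONDS:
        window.popleft()

    window.append(ts)
    return len(window) <= MAX_TX


if __name__ == "__main__":
    # Four high-value purchases on one card within seconds:
    # the fourth trips the rule and would be flagged for review.
    for i, t in enumerate([0, 5, 12, 20]):
        ok = check_transaction("card-42", 750.00, float(t))
        print(f"tx {i + 1} at t={t:>2}s -> {'ok' if ok else 'FLAGGED'}")
```

In practice a rule like this would run inside a stream processing or CEP engine against live event feeds; the point of the sketch is simply that the decision is made on data in motion, before the transaction settles.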

(Disclosure: I have conducted work in the past 12 months for Forbes Insights, mentioned in this post. I was not involved with the work discussed above.)

More on this topic:

Event stream processing

Edge computing

About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he has served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. (full bio). Follow him on Twitter @joemckendrick.
