To address latency challenges, organizations are adopting stream processing engines, in-memory computing databases and, soon, 5G wireless services.
The rate at which transactions are processed has always been of critical importance in the enterprise. However, as organizations look to digitize business processes, latency is rapidly becoming a bigger hurdle to overcome.
A survey of 351 IT decision makers across five vertical industries conducted by Hazelcast, a provider of an in-memory computing platform, finds well over half of organizations (58%) are now measuring performance in milliseconds and microseconds rather than seconds. To put that in perspective, the report notes the average blink of an eye takes 300 milliseconds.
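To make those units concrete, here is a minimal sketch, in Python, of timing an operation at millisecond and microsecond resolution; the workload shown is a placeholder standing in for whatever transaction an organization actually measures:

```python
import time

def measure_latency_ms(operation):
    """Time a single call and return its latency in milliseconds."""
    start = time.perf_counter()
    operation()
    return (time.perf_counter() - start) * 1_000

# Placeholder "transaction": in practice this would be a database query,
# cache lookup, or message round trip rather than a local computation.
latency_ms = measure_latency_ms(lambda: sum(range(10_000)))
print(f"{latency_ms:.3f} ms ({latency_ms * 1_000:.0f} µs)")
```

`time.perf_counter()` is used rather than `time.time()` because it is a monotonic, high-resolution clock suited to interval measurement.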
The report finds a full 64% of respondents said the need to deliver a quicker and easier customer experience is a significant or major burden on their technology infrastructure. The challenge organizations now face is that, to drive digital business processes, they need to apply analytics to transactions at rates faster than any human can consume, much less fathom.
As a result, reliance on machine learning algorithms and other forms of artificial intelligence (AI) to automate a wide range of processes is accelerating at a dramatic rate, says Steve Wooledge, chief marketing officer for Hazelcast.
“It’s kind of scary,” says Wooledge. “But that’s the reality.”
Co-sponsored by Intel, the report identifies AI and machine learning as the most disruptive technologies impacting organizations (40%), followed by Internet of Things (IoT) (28%), robotic process automation (23%) and hybrid/multi-cloud computing (22%).
All told, the report finds 65% of respondents are tracking costs that, in one form or another, are associated with latency.
Among organizations that measure latency in milliseconds or microseconds, databases, financial management software and networks were identified as the biggest causes of latency. The report also finds 49% of organizations are having difficulty handling the large influx of data generated by IoT devices.
To address those latency challenges, organizations are adopting stream processing engines, in-memory computing databases and, soon, 5G wireless networking services, says Wooledge. That combination of technologies will make it possible to collect and process data at the volume required by the AI models being developed to drive processes in real time.
Naturally, that transition will put a significant amount of pressure on aging IT infrastructure. Organizations are looking to modernize everything from legacy mainframe applications to client/server systems that have been running for decades. Most of those applications are anchored around transaction processing systems that export data to a data warehouse platform where data can later be analyzed.
The challenge organizations face today is that by the time that data gets analyzed, it's too late to have an impact on the events being analyzed. Modern IT systems take advantage of stream processing engines to analyze data in real time in order to optimize a business outcome while it's occurring. In fact, the most impactful digital business transformation initiatives typically involve applying analytics in, at the very least, near real time. Organizations that are unable to surface those types of insights will soon find themselves consigned to the proverbial scrap heap of business history.
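The batch-versus-streaming contrast can be sketched with a toy tumbling-window aggregation in Python. The event shape, amounts, and one-second window are illustrative assumptions, not details from the report; the point is that totals are computed as events arrive rather than after an export to a data warehouse:

```python
from collections import defaultdict

def tumbling_window_totals(events, window_ms=1_000):
    """Aggregate (timestamp_ms, amount) events into per-window totals
    as they arrive, instead of exporting them for later batch analysis."""
    totals = defaultdict(float)
    for ts, amount in events:
        # Integer division assigns each event to its tumbling window.
        totals[ts // window_ms] += amount
    return dict(totals)

# Illustrative transaction stream: (timestamp in ms, transaction amount).
stream = [(120, 9.99), (450, 4.50), (1100, 20.00), (1800, 2.25)]
print(tumbling_window_totals(stream))  # two 1-second windows of totals
```

A production stream processor would additionally handle out-of-order events, state checkpointing, and unbounded input, which this sketch deliberately omits.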