Real-time Analytics Requires Modern IT Infrastructure

A new HfS Research/Accenture survey finds 80% of organizations can’t make data-driven decisions due to a lack of skills and tech. Modernization is needed.

Organizations that make better decisions faster will inevitably leave their rivals behind. That basic concept has led to massive investment in advanced analytics. The trouble is that most organizations are not truly capable of processing data in real time, for a variety of reasons.

A survey of 460 senior IT leaders conducted by HfS Research on behalf of Accenture finds that 80 percent of organizations are currently unable to make data-driven decisions due to a paucity of skills and technology. In fact, between 50 and 90 percent of data is reported to be unstructured and largely inaccessible.

See also: Infrastructure-as-a-service gains in popularity

This issue lies at the root of any organization’s ability to effectively compete, says Debbie Polishook, group chief executive at Accenture Operations, a unit of the IT services provider that focuses specifically on applications and IT infrastructure.

“Businesses are worried about being disrupted out of existence,” says Polishook.

Thanks to the cloud, new competitors can emerge overnight. Startups can leverage readily available IT infrastructure in the cloud to disrupt entire industries because the cost barrier to entering a sector has fallen sharply. Just about every one of those offerings makes extensive use of real-time analytics to provide a better customer experience.

That means not only analyzing data residing in a data warehouse; it also means making extensive use of technologies such as streaming analytics to gain insights into data as it moves across the enterprise. By the time most data gets analyzed in a data warehouse, the opportunity to act on it in real time has been lost. Streaming analytics software that provides the ability to analyze data in motion is now a digital business prerequisite.
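To make the contrast with warehouse-based analysis concrete, the following is a minimal, platform-neutral Python sketch of the core idea behind streaming analytics: a rolling aggregate that is updated the moment each event arrives, so a decision (here, a simple threshold flag) can be made on data in motion rather than after it lands at rest. The event shape, the 60-second window, and the threshold are illustrative assumptions, not a reference to any particular streaming product.

from collections import deque
from dataclasses import dataclass
from time import time


@dataclass
class Event:
    """A single measurement arriving on the stream (illustrative shape)."""
    timestamp: float
    value: float


class SlidingWindowAverage:
    """Rolling average over the most recent window_seconds of events,
    updated as each event arrives rather than in a later batch job."""

    def __init__(self, window_seconds=60.0):
        self.window_seconds = window_seconds
        self.events = deque()
        self.running_sum = 0.0

    def add(self, event):
        # Evict events that have aged out of the window.
        cutoff = event.timestamp - self.window_seconds
        while self.events and self.events[0].timestamp < cutoff:
            self.running_sum -= self.events.popleft().value
        # Fold in the new event and return the up-to-date average.
        self.events.append(event)
        self.running_sum += event.value
        return self.running_sum / len(self.events)


if __name__ == "__main__":
    window = SlidingWindowAverage(window_seconds=60.0)
    # Simulated feed; in production this loop would consume from a message
    # broker rather than a hard-coded list of readings.
    for value in [101.2, 99.8, 103.5, 250.0]:
        avg = window.add(Event(timestamp=time(), value=value))
        if value > 1.5 * avg:  # arbitrary threshold, purely for illustration
            print(f"Flag: {value} is well above the rolling average of {avg:.1f}")

The point of the sketch is that the check happens inside the ingestion loop itself; a warehouse query running hours later could compute the same average but could no longer act on the event in time.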

The extent of the challenge, says Polishook, reaches all the way down to the IT infrastructure level. Most legacy IT infrastructure is designed to support batch-oriented applications, which are typically not as latency-sensitive as real-time analytics. As a result, many IT organizations find they need to significantly upgrade their IT infrastructure to support the real-time analytics at the core of any meaningful digital business transformation. And many organizations are moving workloads to the cloud simply because the cost of upgrading the entire on-premises IT environment would be prohibitive.

The cloud helps real-time analytics but has its issues

Of course, the cloud often introduces additional latency into the IT environment. The compute horsepower available via the cloud is almost inexhaustible, but the networks through which those clouds are accessed often have limited bandwidth, which is why so many organizations are now setting up dedicated virtual networks between their on-premises IT environments and public clouds.

Even so, there is no getting around the need to also upgrade the on-premises IT environment as part of modernizing legacy applications that, for performance, compliance, or security reasons, in many cases cannot be deployed on shared public cloud infrastructure.

Real-time computing will sooner rather than later become the de facto standard for any digital business. The ability to process and analyze data in real time is going to be the difference between staying in business and being consigned to the dust heap of history. Achieving that goal requires everything from significant investments in IT infrastructure to implementing DevOps processes to manage it. Unfortunately, most business leaders still don't fully appreciate what is required from an IT perspective to become an actual digital business.
