Survey Shines Light on Scope of Real-Time Challenge

At a time when more organizations are looking to make better fact-based decisions faster, many are finding they are unable to make current data available to end users.

A survey of 303 IT decision makers at organizations with 250 or more employees finds that 94% of IT leaders say it is important to have a system that ensures users receive current data, yet 58% said it is only somewhat likely, or not likely, that their organization is using fresh or current data. About half of those respondents specifically identified data complexity stemming from siloed applications as the top barrier to providing real-time insights.

See also: Process + Data: It’s About Time!

Given those issues, it should not come as a surprise that, on average, only 54% of available data is being harnessed, and only 26% of respondents are fully maximizing potentially actionable insights. More troubling still, only a third (34%) said they are driving breakthrough insights with that data rather than simply using it for traditional operational reporting.

Despite the hype surrounding personalization and digital business transformation, not enough organizations yet fully appreciate what’s required from a data perspective to drive processes in near real time, says Jack Mardack, vice president of marketing for Actian.

“These systems simply don’t work when accessing static data,” says Mardack.

The survey makes it clear that fundamental issues, such as where data is stored, remain a major challenge. A full 89% of respondents said they need more flexibility from their platforms, while 87% said they will need a hybrid cloud platform capable of spanning cloud and on-premises IT environments to access data. Nearly half of respondents, however, said they were satisfied with their current platforms. At the same time, 45% specifically identified handling Big Data within a real-time environment as a major challenge.

Given the need to include on-premises IT environments, it should not be surprising to see cost emerge as a significant factor as well. A total of 43% cited cost as the biggest challenge to deploying their ideal analytics platform, followed by competing management priorities (41%), achieving faster user adoption (41%), lack of internal flexibility (35%), and management approval (22%). A full 84% of respondents also noted they would deploy more data if doing so were less expensive and easier to manage.

As data increasingly comes to be viewed as an asset to be exploited rather than a burden to be minimized, it is clear organizations are struggling with often conflicting data strewn across the enterprise. In many cases, organizations have deployed applications in isolation with little regard for consistency. As a result, IT organizations are now wrestling with a host of data quality issues as they attempt to apply predictive analytics and artificial intelligence (AI) to a range of processes in near real time. Those issues become even thornier when those processes involve multiple applications spanning multiple organizations.

At this juncture, there’s no doubt organizations are shifting toward real-time processes to provide more actionable insights within the context of an automated business process. The challenge organizations now face is sorting the proverbial data wheat from the chaff to achieve that goal.
