A recent survey shows over three-quarters of IT execs identify streaming analytics, trickle, and change capture as critical capabilities.
A new survey finds that a full 79 percent of respondents now consider high-volume, low-latency analytics embedded within an application to be important or critical capabilities.
Bill Hostmann, a research fellow with Dresner Advisory Services and co-author of the 2018 Analytical Data Infrastructure Market Study, says that level of interest indicates that the ability to analyze data in real time, within the context of an application and using techniques such as streaming analytics, is on the rise.
In fact, just over three-quarters identify real-time/streaming, trickle, and increments/change capture as either important or critical capabilities.
Among the use cases for analytics studied by Dresner, embedded analytics still comes in fourth behind reporting and dashboards, data discovery, and data science. But only data science and embedded analytics saw any increase compared to the 2017 Dresner survey; dashboards and data discovery remained flat year over year.
None of this means any one form of analytics is starting to supplant another. Rather, Hostmann says other use cases for analytics are now starting to see increased adoption as more organizations become focused on making better data-driven decisions across all areas of the business.
The Dresner survey also makes it clear that when it comes to analytics, performance, scalability, security, usability, features, agility, adaptability, and price are all critical metrics. Less critical is any adherence to corporate standards, the study finds. Corporate standards may in time play a bigger role. But for now, analytics is an emerging set of technologies that organizations are willing to evaluate broadly, which is why so many startup vendors are attracted to the category, says Hostmann.
Cloud-based streaming analytics rules for small users
In terms of deployment preferences, the survey also makes it clear that smaller organizations prefer to consume analytics as a cloud service, while larger organizations are much more invested in on-premises applications. That dichotomy generally reflects the location where most of the data that needs to be analyzed resides, says Hostmann.
In terms of software licensing preferences, concurrent-user and subscription services were roughly tied, followed closely by the per-user licensing model. Finally, in terms of the types of data most critical to analyze, transactions come first, followed by metadata, Excel/CSV files, text, and machine logs.
While internal IT organizations are still critical to setting up and managing analytics applications, their role is changing. The days when end users patiently waited for canned reports prepared by the IT staff are largely over. As part of making the shift toward becoming a digital business, end users want to be able to interrogate the latest data the organization possesses within minutes of its being acquired. That enables decisions that can still affect a specific business outcome within hours.
“The business has to be more agile,” says Hostmann.
The degree to which that need for agility ultimately affects the types of analytics technologies employed remains to be seen. But in all probability, organizations will employ a much broader mix of real-time analytics alongside traditional business intelligence tools to compete more effectively, or risk being left in the dust by rivals benefiting from faster access to actionable intelligence.