Continuous Intelligence Needs Real-Time and Historical Data

Businesses must adopt streaming analytics to support the dynamic workflows demanded in the era of continuous intelligence.

In an increasingly data-rich world fixated on the next stage of growth and innovation, the primary challenge (and opportunity) presenting itself to many organizations, regardless of sector, is not one of data volume but of decision-making speed. Continuous intelligence can play an important role.

Every minute, organizations of all shapes and sizes are inundated with data from all directions, but the value of this data can quickly degrade depending on how long it takes to capture, process, analyze, and act on it. Often framed in microseconds, this ‘window of opportunity’ is becoming a key metric for competitive differentiation in sectors as diverse as financial services, space exploration, telecommunications, and utilities.

See also: Considerations for Successful Continuous Data Ingestion and Analysis

This ability to operate in microseconds can empower organizations to make the best decisions: ones that drive profitability, grow customer numbers, set new trends, and future-proof operations for years to come. But it’s not just about real-time data on its own. Bringing together real-time and historical data for real-time analysis (with historical context) allows for not just faster but smarter decision-making. This powerful combination yields what we call “continuous intelligence.”

Take, for example, temperature data captured by a machine sensor. Understanding this data in real time is useful for checking that the machine is operating efficiently. But when you combine it with historical data, mapped over many days and months, you develop a richer understanding of how the machine is performing and can build predictive models, based on other machines’ performance profiles, to anticipate when problems are likely to occur and take action in advance.
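To make this concrete, here is a minimal sketch in Python of that pattern: scoring each live sensor reading against a rolling historical baseline. The window size, warm-up length, and 3-sigma threshold are illustrative assumptions, not values from any particular product.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative parameters: roughly one day of minute-level readings,
# a short warm-up period, and a 3-sigma anomaly threshold.
WINDOW = 1440
WARMUP = 30
history = deque(maxlen=WINDOW)

def check_reading(temp_c: float) -> str:
    """Assess a live reading against the rolling historical baseline."""
    if len(history) < WARMUP:      # not enough historical context yet
        history.append(temp_c)
        return "baseline-building"
    mu, sigma = mean(history), stdev(history)
    history.append(temp_c)
    if sigma > 0 and abs(temp_c - mu) > 3 * sigma:
        return "anomaly"           # act before the problem occurs
    return "normal"

# Usage: feed readings as they arrive from the sensor stream.
for temp in [70.1, 70.3, 69.9] * 12 + [88.4]:
    status = check_reading(temp)
print(status)  # the final out-of-range reading is flagged as "anomaly"
```

In production, the baseline would typically come from a historical store spanning days or months rather than an in-memory window, but the principle is the same: a live reading becomes far more meaningful when scored against historical context.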

This example of how historical data informs and shapes large volumes of incoming data, as that data is created, can be extrapolated to any number of industries and scenarios: from NASCAR telemetry alerting engineers to engine abnormalities, to search queries from home shoppers informing which advertisements Google serves alongside results, to location data from sensors on autonomous vehicles or drones warning of obstructions. The list is almost endless.

Evolving from big to fast data

Now is the time to move beyond the age of data management and into the era of continuous intelligence. The challenge for many organizations, however, is figuring out how to seamlessly deliver continuous intelligence with siloed and heavily fragmented software stacks.

In response to the data explosion over the last several years, many organizations invested in numerous large-scale data infrastructure solutions – enterprise-class databases feeding off data warehouses and data lakes, all stitched together with integration and governance systems. As a result, many organizations have been left with complex software stacks using multiple applications from different vendors, covering storage, analytics, visualization, and more.

While some of this complexity is both intentional and understandable – due to legal and contractual requirements, privacy control, and confidentiality techniques, for example – it nevertheless poses challenges to organizations looking to implement an overarching technology. While these big data solutions are relatively adept at managing large volumes of historical data, they are not well suited to delivering real-time insights from data captured ‘in the moment.’ They also introduce performance inefficiencies and increased total cost of ownership (TCO), as well as risks of data duplication when multiple applications use the same data sets.

As data becomes ever richer, faster, and exponentially greater in both velocity and volume, the siloed approach to data management is poorly suited to the era of continuous intelligence. Fortunately, there are technologies and platforms that offer the interoperability and intelligence to acquire and merge data at speed from silos and enable in-the-moment analysis and visualization.
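As a rough illustration of what that merging looks like, here is a minimal Python sketch of a stream-to-history join, in which each in-flight event is enriched with context from a historical reference table before analysis. The schemas and the `historical_avg` table are invented for the example; a real deployment would source them from a warehouse or data lake.

```python
# Historical reference data, e.g., loaded from a warehouse or data lake.
# The sensor IDs and averages here are invented for the example.
historical_avg = {
    "sensor-1": 70.2,
    "sensor-2": 54.8,
}

def enrich(event: dict) -> dict:
    """Join a live event against historical reference data in memory."""
    baseline = historical_avg.get(event["sensor_id"])
    deviation = None if baseline is None else round(event["value"] - baseline, 2)
    return {**event, "baseline": baseline, "deviation": deviation}

# A few in-flight events standing in for a live stream.
stream = [
    {"sensor_id": "sensor-1", "value": 71.0},
    {"sensor_id": "sensor-2", "value": 61.3},
]

for event in stream:
    print(enrich(event))
```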

Getting started with streaming analytics

The collective name for these technologies is real-time or streaming analytics, and for businesses looking to implement such a solution, there are a few considerations and best practices to keep in mind.

Organizations must consider how this technology can slot into an existing data management environment. Many organizations have invested heavily in data storage and management solutions, and it’s unrealistic to ask them to rip and replace. Ideally, streaming analytics solutions should be compatible with the major cloud platforms and computing architectures, interoperable with popular programming languages, and flexible in terms of deployment method, depending on preferences for running on-premises, in the cloud, or in a hybrid model.

As with any new solution deployment, enterprises should consider TCO and any impact that a streaming analytics deployment could have on costs. Having a low memory footprint and the ability to run on commodity hardware are important considerations, especially for IoT and other scenarios where analytics at or near the edge requires software to run on devices that are unlikely to have significant compute power.
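To show why a small memory footprint is realistic for streaming aggregates at or near the edge, here is a short Python sketch using Welford’s online algorithm, which maintains a running mean and variance in constant memory no matter how many readings stream past. The sensor values are made up for the example.

```python
# Constant-memory streaming statistics via Welford's online algorithm:
# three numbers suffice, regardless of how many readings have arrived.
class RunningStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Usage: update incrementally as readings stream in from the device.
stats = RunningStats()
for reading in (70.1, 70.4, 69.8, 70.2):
    stats.update(reading)
print(stats.mean, stats.variance)
```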

Ongoing maintenance and operational costs are other factors to account for, along with the level of professional services available to support the analysis, remediation, and migration of data. Businesses may also want to assess the expertise within the organization to determine whether the appropriate skill sets exist or whether training and hiring policies need to be updated.

Prerequisites for continuous intelligence

Critically, any solution must have both the interoperability and intelligence to merge and acquire data on demand regardless of location, format, and size.

Unlocking continuous intelligence calls for a reshaping of data environments – moving away from siloed solutions and toward a horizontal platform approach that provides the performance and scale needed to meet the demands of a modern enterprise.

Businesses that implement the right streaming analytics platform gain access to total analytics, enabling them not only to serve applications in real time but also to gain better overall business intelligence through improved data warehousing and access to historical data outputs. This, in turn, enables the business to support the flexible and dynamic workflows demanded in the era of continuous intelligence.

About James Corcoran

James Corcoran is CTO, Enterprise Solutions at KX, leading the platform, product development, and research teams across cloud and edge computing. He is responsible for all aspects of the technology roadmap, engineering operations, and the portfolio of KX business applications. He joined KX in 2009 as a software engineer and spent several years building enterprise platforms at global investment banks before being appointed to a series of engineering leadership positions covering pre-sales, solution engineering, and customer implementations. He holds an MSc in Quantitative Finance from University College Dublin.
