Center for Continuous Intelligence

How AI Makes Real-Time Analytics More Real


While great strides have been made in the adoption of real-time analytics in the marketplace, artificial intelligence could accelerate that adoption considerably.

We’ve come a long way with analytics in recent years, applying data to algorithms and analytics engines to determine what it means for the business.

Lately, there’s been a lot of progress with real-time analytics, especially when applied against streaming data from systems or devices. But with artificial intelligence coming into the picture, we ain’t seen nothing yet.

That’s the word from a group of McKinsey Global Institute analysts, led by Michael Chui, who connected the dots between AI and hundreds of use cases from across 20 industries in a recent study. Notably, they observe, the greatest value from AI, as indicated by more than two-thirds of the projects studied (69%), lies in improving the performance of existing analytics efforts. For purposes of clarity, the analysts define AI as “deep learning techniques using artificial neural networks.”

See also: AI needs big data, and big data needs AI

It’s significant that every industry is finding a way to benefit from AI-driven analytics because the potential case studies vary considerably. A manufacturer may be concerned with syncing its production-floor machines with its supply chain, while a retailer may want to know which customers are using which channels, and a healthcare establishment may be concentrating on better ways to track patients’ vital signs remotely. Recognizing the wide variety of use cases for real-time analytics and operations, RTInsights maintains a library of case studies across a number of major industries. Every company has a different story to tell, and different ways of innovating.

When cognitive computing technologies such as AI are applied to enhance real-time analytics, the innovation explodes. Chui and his McKinsey team describe the following key applications arising from the intersection of AI and analytics:

Predictive maintenance. AI is being trained to detect a wide range of anomalies. “Deep learning’s capacity to analyze very large amounts of high dimensional data can take existing preventive maintenance systems to a new level,” Chui and his co-researchers observe. “Layering in additional data, such as audio and image data, from other sensors—including relatively cheap ones such as microphones and cameras—neural networks can enhance and possibly replace more traditional methods. AI’s ability to predict failures and allow planned interventions can be used to reduce downtime and operating costs while improving production yield.”
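To make the anomaly-detection idea concrete, here is a deliberately simple sketch, not the deep learning approach McKinsey describes: a rolling z-score flags sensor readings that deviate sharply from their recent baseline. The function name, window size, and threshold are all illustrative assumptions.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate sharply from the rolling baseline.

    A production predictive-maintenance system would use a trained neural
    network over high-dimensional sensor, audio, and image data; this
    rolling z-score is only a toy stand-in for the same core idea.
    """
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies

# Steady vibration signal with one spike injected at index 30.
signal = [1.0 + 0.01 * (i % 5) for i in range(60)]
signal[30] = 5.0
print(detect_anomalies(signal))  # → [(30, 5.0)]
```

A deep learning model earns its keep where such hand-set thresholds break down, such as when “normal” varies across machines, loads, and seasons.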

AI-driven logistics optimization. “AI can reduce costs through real-time forecasts and behavioral coaching,” the McKinsey team states. “Application of AI techniques such as continuous estimation to logistics can add substantial value across sectors. AI can optimize routing of delivery traffic, thereby improving fuel efficiency and reducing delivery times. One European trucking company has reduced fuel costs by 15 percent, for example, by using sensors that monitor both vehicle performance and driver behavior; drivers receive real-time coaching, including when to speed up or slow down, optimizing fuel consumption and reducing maintenance costs.”
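The real-time coaching loop in the trucking example can be pictured with a rule-based sketch. The thresholds and function name below are made up for illustration; an actual system would fuse many telematics streams with learned models rather than fixed bands.

```python
def coach_driver(speed_kph, rpm, efficient_band=(60, 90), max_rpm=2200):
    """Return a real-time coaching hint from two telematics readings.

    Purely illustrative: the efficient speed band and RPM ceiling are
    hypothetical, standing in for what a learned model would estimate
    per vehicle, load, and route.
    """
    if rpm > max_rpm:
        return "shift up / slow down"  # high revs waste fuel, wear the engine
    if speed_kph < efficient_band[0]:
        return "speed up"
    if speed_kph > efficient_band[1]:
        return "slow down"
    return "hold steady"

print(coach_driver(100, 1800))  # → slow down
print(coach_driver(70, 2500))   # → shift up / slow down
print(coach_driver(70, 1800))   # → hold steady
```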

Customer service management and personalization. “Improved speech recognition in call center management and call routing as a result of the application of AI techniques allow a more seamless experience for customers—and more efficient processing,” Chui and his co-authors state. “For example, deep learning analysis of audio allows systems to assess a customer’s emotional tone; in the event that a customer is responding badly to the system, the call can be rerouted automatically to human operators and managers.”
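The rerouting logic sitting on top of such an audio model can be sketched as follows. Here the per-utterance emotion scores are assumed to come from an upstream deep learning model (not shown); the score range, threshold, and `patience` parameter are hypothetical.

```python
def monitor_call(utterance_scores, threshold=-0.3, patience=2):
    """Escalate a call to a human operator once the customer's tone has
    been negative for `patience` consecutive utterances.

    Scores are assumed to be in [-1, 1], produced by a deep learning
    model analyzing the call audio; negative means the customer is
    responding badly. Returns the route and the escalation turn, if any.
    """
    negative_streak = 0
    for turn, score in enumerate(utterance_scores):
        if score < threshold:
            negative_streak += 1
            if negative_streak >= patience:
                return ("human_operator", turn)
        else:
            negative_streak = 0  # tone recovered; reset the streak
    return ("automated_system", None)

print(monitor_call([0.2, -0.4, -0.6, 0.1]))  # → ('human_operator', 2)
print(monitor_call([0.5, 0.1, -0.2]))        # → ('automated_system', None)
```

Requiring a short streak rather than a single negative utterance is one simple way to avoid rerouting on a momentary misread of tone.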

As AI takes real-time analytics to a whole new level, new types of requirements arise as well. For starters, Chui and his colleagues point out, “data requirements for deep learning are substantially greater than for other analytics in terms of both volume and variety.” Deep learning often is built upon thousands of data records that enable models “to become relatively good at classification tasks and, in some cases, millions for them to perform at the level of humans. By one estimate, a supervised deep-learning algorithm will generally achieve acceptable performance with around 5,000 labeled examples per category and will match or exceed human-level performance when trained with a data set containing at least 10 million labeled examples.”

Perhaps, with the ever-expanding Internet of Things and the terabytes and petabytes’ worth of data streaming in from various devices, systems, and applications, this threshold can be achieved.


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
