Latest Cisco-IBM Deal Brings Watson to the Edge


“We’re trying to reduce the unnecessary transmission of data to the cloud.”

IBM’s fabled Watson system is well-known as an analytics platform that puts together archived data to provide likely answers to intractable problems or questions. A new initiative between Watson IoT and Cisco, however, is likely to result in real-time insights from any and all endpoints.

The Cisco-IBM collaboration seeks to enable organizations in remote areas to tap the combined power of Watson IoT and Cisco's edge analytics capabilities to more deeply understand and act on critical data at the network edge.

Cisco, through its offerings and support for "fog computing" initiatives, is a proponent of deploying intelligence at the edge, or anywhere else along the spectrum of connections between devices and centralized servers. "We're pairing Cisco's edge capabilities with Watson so we can make intelligent decisions at the edge of the network, on what data is valuable," says Mike Flannagan, vice president of data and analytics at Cisco.

Cisco provides the “ability to do edge analytics on streaming data, and make sense of data – we’re looking at aggregates and summaries, such as time series, be it one-minute, five-minute, or one-hour averages,” Flannagan explains. Through this partnership, “we’re providing the ability for Watson to distribute a little piece of their code to the edge, to tell our software what data is interesting to them.”
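The kind of aggregation Flannagan describes can be sketched as a tumbling-window average: readings are grouped into fixed time windows (one minute, five minutes, an hour) and each window is reduced to a single summary value before anything leaves the edge. This is a minimal illustration, not Cisco's actual streaming-analytics API; the function name and data shapes are assumptions.

```python
def window_averages(readings, window_s=60):
    """Group (timestamp, value) readings into fixed-size time windows
    and reduce each window to its average.

    A minimal sketch of tumbling-window aggregation at the edge; real
    edge-analytics software exposes configurable windows and many more
    aggregate functions.
    """
    windows = {}
    for ts, value in readings:
        bucket = int(ts // window_s)  # which window this reading falls into
        total, count = windows.get(bucket, (0.0, 0))
        windows[bucket] = (total + value, count + 1)
    # Key each average by the window's start time
    return {bucket * window_s: total / count
            for bucket, (total, count) in windows.items()}

readings = [(0, 10.0), (30, 20.0), (61, 40.0)]
print(window_averages(readings, window_s=60))
# → {0: 15.0, 60: 40.0}
```

Three raw readings collapse into two summary values here; at sensor data rates, the same reduction is what makes it practical to report aggregates rather than ship every sample upstream.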

The ability to move more analytics processing to the edge is key, particularly for real-time insights based on IoT and other distributed data. But for businesses without easy access to high-bandwidth connectivity, these capabilities are sometimes out of reach or take too long. Cisco and IBM's intent is to provide such capabilities at the edge of networks in environments such as oil rigs, factories, shipping operations and mines, where time is of the essence but bandwidth is often lacking.

For example, workers in remote environments will now be able to better monitor the health and behavior of critical machinery and more accurately plan for needed maintenance and equipment upgrades. As Flannagan illustrates, "a machine operator may need to take readings for things like vibrations, temperature and pressure. If there's a problem that's going to cause a safety issue, I need to find that right now, not three days from now."

In addition, it may be too costly and wasteful to be constantly moving all data being generated to a centralized location for analytics. “You may need to do some decision making at the point or very near to the point of generation,” Flannagan says. “You need to decide how valuable it is, and what you need to do with it.”

Typically, companies using IBM Watson "would have sent all their raw data to the cloud," he continues. "Then Watson would have to decide once it was already there how valuable it was. You send a lot of irrelevant data up to the cloud, with all the infrastructure required to do that — routers, switches, bandwidth, cloud capacity — then discarding most of it at the end because it was not relevant to the problem you were trying to solve. We're trying to reduce the unnecessary transmission of data to the cloud."
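The decision-at-the-point-of-generation idea can be sketched as a simple edge filter: readings that look anomalous are forwarded to the cloud immediately, while routine ones are collapsed into a single summary record. The function, field names, and threshold below are hypothetical, chosen only to illustrate the pattern Flannagan describes.

```python
def edge_filter(readings, threshold):
    """Split sensor readings at the edge: anomalous ones are forwarded
    raw; routine ones are reduced to one summary record.

    Hypothetical sketch — real edge software would apply richer rules
    (trends, multi-sensor correlation) than a single threshold.
    """
    to_cloud = [r for r in readings if r["vibration"] > threshold]
    routine = [r for r in readings if r["vibration"] <= threshold]
    summary = {
        "count": len(routine),
        "avg_vibration": (sum(r["vibration"] for r in routine) / len(routine))
        if routine else None,
    }
    return to_cloud, summary

readings = [{"vibration": 0.2}, {"vibration": 0.3}, {"vibration": 9.0}]
anomalies, summary = edge_filter(readings, threshold=1.0)
# One anomalous reading goes upstream; two routine ones become a summary.
```

Instead of three raw records crossing the network, only the one reading that matters does — which is the transmission reduction the quote is describing.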

The solution will be marketed and sold jointly, combining the cognitive and real-time insights of IBM's Watson IoT Platform with Cisco's streaming edge analytics. This will enable real-time analytics to be performed at the edge while data is collected for longer-term analysis in the cloud.

Related:

IoT architectures for edge analytics

Why edge computing is crucial for the IoT

Special report: fog computing reference architecture


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor. He is a regular contributor to Forbes on digital, cloud and Big Data topics. He served on the organizing committee for the recent IEEE International Conference on Edge Computing (full bio). Follow him on Twitter @joemckendrick.
