Testing Edge Processing for the Industrial IoT


A new initiative from the Industrial Internet Consortium seeks to test edge computing and analytics for industrial IoT applications.

A new testbed launched under the aegis of the Industrial Internet Consortium (IIC), in partnership with Hewlett Packard Enterprise, seeks to support more rapid development of edge computing technologies and industrial IoT innovation.

Many emerging industrial IoT applications require coordinated, real-time analytics at the “edge” of a network, using algorithms that require a scale of computation and data volume/velocity previously seen only in the data center. Frequently, the networks connecting these edge devices do not provide sufficient capability, bandwidth, reliability, or cost structure to enable analytics-based control or coordination algorithms to run in a separate location from the machines.


IIC members HPE and Real-Time Innovations (RTI) have teamed up on the Edge Intelligence Testbed. Its primary objective is to significantly accelerate the development of edge computing architectures and algorithms by removing a barrier many developers face: the lack of affordable access to a wide variety of advanced compute hardware and software configurable to closely resemble state-of-the-art edge systems.

Edge Computing Architectures

“Edge intelligence requirements have been significantly underserved,” says Lin Nease, chief technologist for IoT at HPE. There are few resources at this time, he says, that can be employed for processing on the edge.

“If someone wanted to simulate the cloud, they wouldn’t have to go very far, they already have the cloud,” he said. The need for edge processing technologies can be found in “an airport, or in a plant environment, or on an oil rig, or in an aircraft or a train locomotive.” Such configurations may include “special-purpose devices, graphics-processing units, digital-signal processors, and different processor architectures that you would find in many different embedded and controlled system environments.” This type of on-site hardware “is not easy to find from a cloud provider,” Nease says. “Basically we’re providing a form of a testbed cloud for embedded electronics.”

For its part, HPE is providing the server hardware at no cost to participating companies, he continued. “We are partnering with whomever is required to get the testbed done. In some cases we’re required to carry gear around to different locations in order to fully simulate the environment that makes the testbed real.”

Use Cases for Edge Analytics

Nease provides examples of use cases that the testbed is exploring. For example, “let’s say I have an IoT device that’s doing some condition monitoring on some complex machinery that I sell. It always helps to have centralized correlation of data coming from all the different installations, so I can do a better job of understanding systematic failures or quality trends and the like.”

Often, he says, localized or control-type decision making is needed, “where some of that information needs to be gleaned very rapidly, and cannot be done over a wide area network with its latency and bandwidth limitations.”

There is a place for both architectures, he continues. “The centralized data will always be valuable for looking for trends and understanding in the aggregate how a globally distributed deployment is working. But at the same time, there will always be a need for rapid intelligence at the edge, either for control system purposes, or people who need to make controlled decisions.”
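The split Nease describes can be made concrete with a small sketch. The following Python is not part of the IIC testbed; it is a hypothetical illustration of an edge node that makes control decisions locally (avoiding WAN latency) while batching raw readings for later centralized trend analysis. The sensor, threshold, and function names are all invented for illustration.

```python
import random
import statistics

# Hypothetical shutdown threshold for a vibration sensor.
VIBRATION_LIMIT = 8.0


def read_sensor():
    # Stand-in for a real sensor read; returns a simulated vibration level.
    return random.gauss(5.0, 2.0)


def edge_loop(cycles, batch):
    """Decide locally on each reading; defer aggregation to the cloud.

    Returns the number of local shutdown decisions taken.
    """
    shutdowns = 0
    for _ in range(cycles):
        reading = read_sensor()
        # Local, low-latency decision: no WAN round trip required.
        if reading > VIBRATION_LIMIT:
            shutdowns += 1  # e.g. trip a relay in a real deployment
        batch.append(reading)  # queued for periodic upload to the cloud
    return shutdowns


batch = []
edge_loop(100, batch)
# Centralized side: trend analysis over the uploaded batch of readings.
print(f"mean vibration across batch: {statistics.mean(batch):.2f}")
```

The point of the sketch is the division of labor: the time-critical comparison happens on the device, while the statistics that benefit from aggregation happen wherever the batches land.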

The Edge Intelligence Testbed is now active, with facilities in Houston, USA; Grenoble, France; and Singapore. Nease hopes to extend the work coming out of the test cases into a standardized architecture, which is “what’s missing from the industry today,” he says. “My hope is that we can emerge some standard platforms for doing distributed event processing. None exists yet that are in widespread use. Also, higher-layer security protocols that address some problems, and a lot of that process will need to occur at the edge of the network.”

Nease adds that “a lot of the value proposition of edge processing today is simply presenting legacy devices on an IT network. Much of the gateway functionality people talk about is devoted to exposing devices on IT networks. That need for the edge will be pervasive, but relatively minor compared to distributed event processing and some of the analytics requirements are going to work their way to the edge.”
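The gateway role Nease describes, presenting legacy devices on an IT network, usually amounts to translating a device's proprietary frame format into something IT systems can consume. The sketch below is hypothetical (the 10-byte frame layout, field names, and values are all invented), but it shows the shape of that translation: unpack a binary frame, emit JSON.

```python
import json
import struct

# Hypothetical legacy frame layout (big-endian, 10 bytes):
#   2-byte unsigned device ID, 4-byte float temperature,
#   4-byte unsigned reading counter.
FRAME_FORMAT = ">HfI"


def decode_legacy_frame(frame: bytes) -> dict:
    """Translate a legacy binary frame into an IT-friendly dict."""
    device_id, temperature, counter = struct.unpack(FRAME_FORMAT, frame)
    return {
        "device_id": device_id,
        "temperature_c": round(temperature, 2),
        "reading_count": counter,
    }


# Simulate a frame a legacy device might emit, then expose it as JSON.
raw = struct.pack(FRAME_FORMAT, 42, 21.5, 1000)
print(json.dumps(decode_legacy_frame(raw)))
# prints {"device_id": 42, "temperature_c": 21.5, "reading_count": 1000}
```

In a real gateway this translation would sit behind a network listener (MQTT, HTTP, OPC UA), but the byte-level decoding shown here is the part that makes a legacy device visible to IT tooling at all.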

IIC testbeds bring together collaborating companies and customers from among the consortium’s 200 participating companies to build an ecosystem: a vendor takes the initiative to build a solution, then works with collaborators to pursue test projects and implemented projects for real customers. The IIC also oversees a number of other testbeds geared to specific functions.




About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he has served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
