Edge Computing Finally Gets a Framework


The distributed computing framework put forth by the IIC provides insights into the capabilities and architectures of edge computing nodes and systems.

What is edge computing? Ask 10 industry analysts that question, and you will get 11 responses. It’s about devices, it’s about remote systems, it’s about remote clouds, it’s about embedded systems, it’s about embedded chips.

To bring clarity to this fast-emerging part of the business technology space, the Industrial Internet Consortium recently issued an overview. “There is much architectural work still needed to address all the challenges in distributed computing in the edge,” the IIC authors point out.

The IIC report authors offer up a definition of edge systems:

“A series of modules, including a system security management module, a system provisioning and management module, a trusted computing module, an end-to-end security module, and a node management module.”
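As a rough sketch of that composition, the module names below come straight from the IIC definition above, while the class layout and completeness check are illustrative assumptions, not part of the report:

```python
from dataclasses import dataclass, field
from typing import Set

# Module names taken from the IIC definition quoted above; the structure itself is illustrative.
REQUIRED_MODULES: Set[str] = {
    "system_security_management",
    "system_provisioning_and_management",
    "trusted_computing",
    "end_to_end_security",
    "node_management",
}

@dataclass
class EdgeSystem:
    modules: Set[str] = field(default_factory=set)

    def is_complete(self) -> bool:
        # In this definition, an edge system is the full series of modules.
        return REQUIRED_MODULES <= self.modules

system = EdgeSystem(modules=set(REQUIRED_MODULES))
print(system.is_complete())  # True
```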

The distributed computing framework put forth by the IIC “provides insights into the capabilities and architectures of distributed computing, edge nodes, and edge systems.” The IIC framework focuses on two essential properties of edge systems: end-to-end security and system management.

The essential components of edge architecture include the following deployment models:

Simple edge deployment: “An edge node is an assembly of hardware and software components that implement edge functions. Such a standalone edge computing node can be installed anywhere in the edge system to provide computing, networking, and storage services close to data producers or consumers.”

Somewhat complex deployment: “In a slightly more complex model of edge deployment, multiple logical edge nodes may be instantiated in a single physical edge node. They share the same hardware platform but are fully isolated from each other. This deployment model is modular, scalable, and efficient. It is the primary support mechanism for multi-tenancy.”

Extremely complex model: “A logical edge computing node is assembled from one or more physical or logical edge nodes. One version of this merges the capabilities of multiple physical edge computing nodes, which may be peers on the same or adjacent layer(s), to handle a computation, networking, or storage load heavier than a single physical edge node can manage. In a variation, multiple physical edge nodes are grouped into a fault-tolerant cluster, so that a failure in one of the edge nodes will be mitigated by its peers.”
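A minimal sketch of these three deployment shapes, assuming hypothetical PhysicalNode and LogicalNode classes; the names, capacity numbers, and admission check are illustrative stand-ins for real isolation mechanisms such as VMs or containers, not anything prescribed by the IIC:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicalNode:
    """A logical edge node: an isolated slice of compute and storage."""
    name: str
    cpu_cores: int
    storage_gb: int

@dataclass
class PhysicalNode:
    """A physical edge node hosting one or more isolated logical nodes (multi-tenancy)."""
    name: str
    cpu_cores: int
    storage_gb: int
    tenants: List[LogicalNode] = field(default_factory=list)

    def instantiate(self, tenant: LogicalNode) -> None:
        # Simple admission check standing in for real isolation between logical nodes.
        used_cpu = sum(t.cpu_cores for t in self.tenants)
        used_storage = sum(t.storage_gb for t in self.tenants)
        if used_cpu + tenant.cpu_cores > self.cpu_cores or used_storage + tenant.storage_gb > self.storage_gb:
            raise RuntimeError(f"{self.name}: insufficient capacity for {tenant.name}")
        self.tenants.append(tenant)

# Simple deployment: one standalone physical node close to the data producers.
gateway = PhysicalNode("factory-gateway", cpu_cores=8, storage_gb=256)

# Slightly more complex: multiple isolated logical nodes share that hardware (multi-tenancy).
gateway.instantiate(LogicalNode("tenant-a-analytics", cpu_cores=4, storage_gb=64))
gateway.instantiate(LogicalNode("tenant-b-protocol-bridge", cpu_cores=2, storage_gb=32))

# Extremely complex: a logical node assembled from several physical peers,
# e.g. a fault-tolerant cluster where peers absorb the load of a failed member.
cluster = [gateway, PhysicalNode("line-2-gateway", 8, 256), PhysicalNode("line-3-gateway", 8, 256)]
print([n.name for n in cluster])
```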

Defining Edge Computing

Edge systems are a key element of resiliency, a capability whose importance cannot be overstated. These systems “help to balance the load between peer-level edge nodes. If one edge node is experiencing an overload, it can shuttle some of its traffic to a nearby edge node that is less heavily loaded. This balances the loads on the nodes in a layer and improves overall network resource utilization.”
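A minimal sketch of that peer-level load shedding, assuming each node exposes a current-load figure; the threshold, node model, and function names are illustrative assumptions rather than part of the IIC framework:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EdgeNode:
    name: str
    capacity: int        # requests per second the node can handle
    current_load: int    # requests per second it is handling now

    @property
    def utilization(self) -> float:
        return self.current_load / self.capacity

def shed_load(overloaded: EdgeNode, peers: List[EdgeNode], threshold: float = 0.8) -> Optional[EdgeNode]:
    """If a node is above the threshold, move the excess traffic to the least-loaded peer."""
    if overloaded.utilization <= threshold:
        return None
    target = min(peers, key=lambda n: n.utilization)
    excess = overloaded.current_load - int(overloaded.capacity * threshold)
    movable = min(excess, target.capacity - target.current_load)
    overloaded.current_load -= movable
    target.current_load += movable
    return target

nodes = [EdgeNode("edge-1", 100, 95), EdgeNode("edge-2", 100, 40), EdgeNode("edge-3", 100, 55)]
shed_load(nodes[0], nodes[1:])
print([(n.name, n.current_load) for n in nodes])  # edge-1 drops to 80; edge-2 absorbs the excess
```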

The security architecture shows how to start with a trusted computing module and build edge node security up into a complete, trustworthy system.
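One way to picture that layering is a measured-boot-style chain, where the trusted computing module anchors a sequence of integrity checks; the component names, reference values, and hashing scheme below are purely illustrative assumptions, not the report's security architecture:

```python
import hashlib

def measure(component: bytes) -> str:
    """Hash a component image, as a stand-in for a TPM-style measurement."""
    return hashlib.sha256(component).hexdigest()

# Reference measurements the trusted computing module is provisioned with (illustrative values).
expected = {
    "bootloader": measure(b"bootloader-v1.2"),
    "edge-os": measure(b"edge-os-v5.0"),
    "node-agent": measure(b"node-agent-v3.1"),
}

def verify_boot_chain(images: dict) -> bool:
    """Each layer is only trusted if every layer beneath it measured correctly."""
    for name, image in images.items():
        if measure(image) != expected[name]:
            print(f"integrity check failed at {name}; node is not trustworthy")
            return False
    return True

print(verify_boot_chain({
    "bootloader": b"bootloader-v1.2",
    "edge-os": b"edge-os-v5.0",
    "node-agent": b"node-agent-v3.1",
}))  # True: trust extends from the root of trust up to the application layer
```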

The system management capabilities described illustrate how to manage edge nodes and application services, supporting capabilities like orchestration, resource management, and the operation of distributed computing workloads throughout their lifecycle.
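A minimal sketch of one such lifecycle, modeling a distributed workload as a small state machine; the states and allowed transitions are an assumption for illustration, not the IIC's management model:

```python
from enum import Enum, auto

class WorkloadState(Enum):
    PROVISIONED = auto()   # node and resources allocated by the orchestrator
    DEPLOYED = auto()      # application service placed on an edge node
    RUNNING = auto()       # service actively handling traffic
    UPDATING = auto()      # rolling update managed by the orchestrator
    RETIRED = auto()       # workload decommissioned, resources reclaimed

# Allowed lifecycle transitions (illustrative).
TRANSITIONS = {
    WorkloadState.PROVISIONED: {WorkloadState.DEPLOYED},
    WorkloadState.DEPLOYED: {WorkloadState.RUNNING, WorkloadState.RETIRED},
    WorkloadState.RUNNING: {WorkloadState.UPDATING, WorkloadState.RETIRED},
    WorkloadState.UPDATING: {WorkloadState.RUNNING},
    WorkloadState.RETIRED: set(),
}

def advance(current: WorkloadState, target: WorkloadState) -> WorkloadState:
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move workload from {current.name} to {target.name}")
    return target

state = WorkloadState.PROVISIONED
for nxt in (WorkloadState.DEPLOYED, WorkloadState.RUNNING, WorkloadState.UPDATING, WorkloadState.RUNNING):
    state = advance(state, nxt)
print(state.name)  # RUNNING
```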

Beyond the security focus of this document, edge computing also has implications for the four other dimensions of trustworthiness defined by the IIC (privacy, safety, reliability, and resilience).


About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
