Fog computing appears to be taking shape.
This week’s announcement from Cisco and SAS of an edge-to-enterprise analytics platform, following the release of a fog computing reference architecture in early February, may bring the vision of fog computing out of the mist and into form.
A fog computing architecture generally describes the ability to perform analytics at various points in a network — from edge devices to the cloud and data center. Proponents of the approach claim it is needed to deal with issues of latency and security at the network edge, for example in connected cars, where control systems must respond instantly.
Under the Cisco-SAS platform, Cisco’s networking, edge and data center infrastructure will link with SAS’s software for streaming and advanced analytics. Edge routers will run SAS event stream processing software, connecting to external apps such as control systems, while also sending data to the data center for analysis including pattern detection. Cisco and SAS see applications in smart grids; connected cars; and smart manufacturing, including predictive maintenance.
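The division of labor described above — react instantly at the edge, forward aggregates to the data center for pattern analysis — can be sketched in a few lines. This is an illustrative toy, not the SAS event stream processing API; the threshold, window size, and field names are all hypothetical.

```python
# Sketch of edge-side event stream processing (illustrative only, not SAS's API):
# the edge node acts locally on threshold breaches and forwards only
# compact summaries upstream for pattern detection.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class EdgeStreamProcessor:
    threshold: float                 # hypothetical local alert threshold
    window: int = 10                 # readings per summary batch (assumed)
    _buffer: list = field(default_factory=list)
    alerts: list = field(default_factory=list)     # immediate local actions
    upstream: list = field(default_factory=list)   # summaries for the data center

    def ingest(self, reading: float) -> None:
        # React at the edge the moment a reading breaches the threshold.
        if reading > self.threshold:
            self.alerts.append(reading)
        # Buffer raw readings; ship only aggregates upstream.
        self._buffer.append(reading)
        if len(self._buffer) >= self.window:
            self.upstream.append({
                "count": len(self._buffer),
                "mean": mean(self._buffer),
                "max": max(self._buffer),
            })
            self._buffer.clear()
```

The design choice this illustrates is that raw sensor volume never leaves the edge; only alerts fire locally and only summaries cross the network, which is what makes the latency and bandwidth arguments for fog computing work.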
The announcement from Cisco and SAS follows the release of a reference architecture for fog computing from the OpenFog Consortium, of which Cisco is a founding member, on Feb. 8.
“Just as TCP/IP became the standard and universal framework that enabled the Internet to take off, members of OpenFog have created a standard and universal framework to enable interoperability for 5G, IoT and AI applications,” said Helder Antunes, chairman of the OpenFog Consortium and senior director for the Corporate Strategic Innovation Group at Cisco.
Why fog computing?
According to the OpenFog Consortium, massive amounts of data produced, transported, analyzed and acted upon within industries such as transportation, healthcare, manufacturing and energy — collectively measured in zettabytes — expose challenges both in cloud-only architectures and in operations that reside only at the edge of the network.
For example, in an autonomous vehicle system, a smart car will generate terabytes of data per trip while it connects and communicates in motion with traffic control, municipal infrastructure and other vehicles. Current IoT system architectures cannot address sub-millisecond latency requirements or situations where reliable network availability and bandwidth are crucial.
The OpenFog architecture, which can feature multiple layers of fog nodes acting on data closer to its source and managing the fog-to-thing, fog-to-fog and fog-to-cloud interfaces, successfully addresses these requirements, the group said.
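One way to picture the layered fog hierarchy is as a chain of nodes that keep latency-critical work close to the data source and escalate looser-deadline work fog-to-fog toward the cloud. The sketch below is a simplified illustration under assumed latency budgets, not anything defined by the OpenFog Reference Architecture; the node names and `capacity_ms` figures are invented.

```python
# Toy model of a fog hierarchy (illustrative, not the OpenFog spec):
# each tier handles requests whose deadlines it alone can meet and
# escalates the rest upward, edge -> fog -> cloud.

from typing import Optional

class FogNode:
    def __init__(self, name: str, capacity_ms: float,
                 parent: "Optional[FogNode]" = None):
        self.name = name
        self.capacity_ms = capacity_ms   # assumed round-trip latency of this tier
        self.parent = parent             # next tier up (fog-to-fog / fog-to-cloud)

    def handle(self, deadline_ms: float) -> str:
        # Offload upward only when the deadline tolerates the parent's
        # latency; otherwise act here, close to the data source.
        if self.parent is not None and deadline_ms >= self.parent.capacity_ms:
            return self.parent.handle(deadline_ms)
        return self.name

# Hypothetical three-tier deployment.
cloud = FogNode("cloud", 100.0)
fog = FogNode("fog", 10.0, parent=cloud)
edge = FogNode("edge", 1.0, parent=fog)
```

In this model a 0.5 ms deadline stays at the edge, a 50 ms deadline lands at the intermediate fog layer, and a 500 ms batch job rises all the way to the cloud — the "multiple layers of fog nodes" idea in miniature.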
According to the OpenFog Consortium, fog computing also deals with issues related to security, cognition, agility, and efficiency. “While fog computing is starting to be rolled out in smart cities, connected cars, drones and more, it needs a common, interoperable platform to turbocharge the tremendous opportunity in digital transformation. The new OpenFog Reference Architecture is an important giant step in that direction,” said Antunes.
The OpenFog Reference Architecture contains a medium- to high-level view of system architectures for fog nodes (smart, connected devices) and networks, deployment and hierarchy models, and use cases. Future documents will provide updated requirements and lower-level details, including formal, enumerated requirements that will form the basis of quantitative testbeds, certifications and the specified interoperability of fog elements. The reference architecture is based on eight core technical principles, including security, scalability, openness, autonomy, RAS (reliability, availability, and serviceability), agility, hierarchy and programmability.