After a year of intense debate over its definition, the OpenStack Foundation has published a white paper laying out what “the edge” should mean.
The OpenStack Foundation has published a white paper on edge computing, the result of a year of intense debate on the definition of “the edge” and the potential use cases for edge technologies.
Ildiko Vancsa, one of the authors and ecosystem technical lead at the OpenStack Foundation, said the white paper is a collaborative effort involving people from multiple industries, companies, and open source communities.
In it, OpenStack defines edge computing as “offering application developers and service providers cloud computing capabilities, as well as an IT service environment at the edge of a network. The aim is to deliver compute, storage, and bandwidth much closer to data inputs and/or end users.”
It also sets out some of the requirements for edge computing, which include a virtual machine, an image manager, a storage manager, a network manager, administrative tools, a way to address storage latency, reinforced security, resource-utilization monitoring, orchestration tools to manage multiple sites, and load balancing across hardware.
Alongside defining the edge, the white paper puts forward some emerging use cases for edge technologies beyond SDN and 5G, which already look commercially viable. Network functions virtualization, the Internet of Things, virtual reality, and gaming are all considered potential use cases for edge computing.
Fundamentally, almost any area of IT seeing a large increase in demand for compute and networking could be a candidate for edge computing. MEF, ONF, ETSI, and The Linux Foundation are all working on their own definitions and use cases for the edge, and some may see broader use cases than the OpenStack Foundation does.