Managing edge computing workloads is an increasingly compelling use case for Kubernetes.
Edge computing and Kubernetes aren’t frequently mentioned in the same sentence. Kubernetes can be used in a variety of contexts, and edge computing is only one of them.
Yet as interest in extending workloads to the edge grows, Kubernetes is likely to become more important to edge computing implementations. The platform offers a variety of benefits in deploying and managing edge workloads.
What does Kubernetes do?
Kubernetes is a platform designed mainly for deploying and managing applications that run inside containers (although it can be used to manage virtual machines, too).
Kubernetes has become popular over the past several years because of its ability to automate complex, large-scale application deployments. It would be virtually impossible to manage dozens of containerized applications, comprising hundreds of container instances, by hand. Kubernetes manages them automatically by scaling applications up and down in response to demand, restarting failed applications, and moving workloads between servers in a cluster to balance load.
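To make that concrete, here is a rough sketch of what this automation looks like in a declarative Kubernetes manifest: a Deployment that keeps a fixed number of replicas running (restarting or rescheduling any that fail), paired with a HorizontalPodAutoscaler that scales the replica count with demand. All names, image references, and thresholds below are illustrative, not from any particular deployment.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sensor-api            # hypothetical edge application
spec:
  replicas: 3                 # Kubernetes keeps three instances running at all times
  selector:
    matchLabels:
      app: sensor-api
  template:
    metadata:
      labels:
        app: sensor-api
    spec:
      containers:
      - name: sensor-api
        image: registry.example.com/sensor-api:1.0   # illustrative image
        resources:
          requests:
            cpu: 100m         # baseline the autoscaler below measures against
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: sensor-api
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: sensor-api
  minReplicas: 3
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70   # add replicas when average CPU passes 70%
```

The key design point is that the operator declares the desired state; Kubernetes continuously reconciles reality toward it, which is exactly the "hands-off" management described above.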
In other words, Kubernetes brings to every organization the type of massive, automated management that previously was available only to hyperscale organizations with complex, purpose-built orchestration frameworks.
Why use Kubernetes for edge computing?
As noted above, Kubernetes' features make the platform useful for a variety of use cases. You can use Kubernetes to manage any on-premises or cloud-based environment. You can also use it in a multi-cloud or hybrid context, wherein Kubernetes becomes the central control plane for multiple clouds, or for both cloud-based and on-prem infrastructure, at the same time.
For edge computing, however, Kubernetes offers some key benefits.
Centralized management of disparate infrastructure
Chief among them is the fact that, as we just noted, Kubernetes can manage multiple types of infrastructure at the same time. That makes Kubernetes ideal for an edge computing scenario in which some applications (or parts of applications) reside on-premises or in the cloud, while others run at the edge.
Kubernetes can manage all these resources centrally, eliminating the need to juggle different management tools for each environment.
Automated reactions to change
Because of its automated management features, Kubernetes can react quickly to changes at the edge.
For example, if an edge location goes offline – which it may in some edge use cases, such as those involving vehicles that power down when they are not in use, or IoT sensors with intermittent, long-range network connectivity – Kubernetes can react in real time by rerouting traffic to an alternate location. A disconnected edge location is not that different from a failed node, which Kubernetes is designed to handle with ease.
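Kubernetes exposes knobs for tuning exactly this behavior. When a node becomes unreachable, Kubernetes taints it, and pods are evicted and rescheduled elsewhere once a per-pod toleration timeout expires (the default is 300 seconds). A sketch of a pod spec fragment that shortens that window for a latency-sensitive edge workload; the 30-second value is purely illustrative:

```yaml
# Pod spec fragment: how long a pod may stay bound to an
# unreachable or not-ready node before Kubernetes evicts it
# and reschedules it on a healthy node.
tolerations:
- key: node.kubernetes.io/unreachable
  operator: Exists
  effect: NoExecute
  tolerationSeconds: 30   # fail over quickly instead of the 300s default
- key: node.kubernetes.io/not-ready
  operator: Exists
  effect: NoExecute
  tolerationSeconds: 30
```

Conversely, for edge nodes that are expected to drop offline routinely (a vehicle powering down, a sensor with intermittent connectivity), setting a longer `tolerationSeconds` keeps pods bound to the node rather than churning workloads on every brief disconnect.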
The fact that Kubernetes can deploy new container or application versions quickly is also an advantage for edge computing use cases. If a new location needs to be brought online or an application requires an update, Kubernetes offers the automated deployment features necessary to implement the change quickly.
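For instance, a Deployment's rollout strategy controls how an update propagates, which matters at the edge where taking an application fully offline may not be acceptable. A minimal sketch (values are illustrative):

```yaml
# Deployment spec fragment: roll out a new image version gradually,
# keeping the application available throughout the update.
spec:
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0   # never take the app fully offline during the rollout
      maxSurge: 1         # bring up one new pod at a time
```

With this in place, an update can be triggered by changing the container image (for example with `kubectl set image`) and observed with `kubectl rollout status`; Kubernetes replaces pods one at a time and halts the rollout if new pods fail to become ready.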
Easy to deploy
A third advantage of Kubernetes for edge use cases is the ease of deployment.
You can deploy a Kubernetes environment on virtually any infrastructure, using any deployment approach. You can set it up yourself on your own servers. You can deploy it as a managed service in the cloud using a platform like EKS or GKE. You can use a management layer like Platform9, which simplifies Kubernetes administration, but is not itself a Kubernetes distribution.
The fact that Kubernetes is a proven and easy-to-deploy platform makes it especially attractive for edge computing, given that edge remains a relatively new technological niche. Being able to implement edge environments using technology that organizations already know and trust, and that they can set up quickly, beats having to learn and deploy brand-new management solutions purpose-built for the edge.
This is another reason why it’s likely that Kubernetes will remain an important part of the edge computing landscape for the foreseeable future.
You can do many things with Kubernetes, and it’s hard to imagine a future where Kubernetes is used only for edge computing. But managing edge workloads is an increasingly compelling use case for Kubernetes, given the ways in which its features cater to the challenges posed by edge computing.