by Michael Maxey

How to make Kubernetes work at the edge

feature
Feb 06, 2024 | 5 mins
Cloud Computing | Software Development

Kubernetes thrives in centralized data centers, but raises size, scalability, security, and interoperability concerns in distributed edge deployments. Here are some strategies to address them.


Kubernetes and edge computing are poised to power the new generation of applications, both together and separately. The enterprise market for edge computing is expected to grow four to five times faster than spending on networking equipment and overall enterprise IT. At the same time, Kubernetes is the default choice for managing containerized applications in typical IT environments: a record 96% of organizations report that they are either using or evaluating Kubernetes, up from 83% in 2020 and 78% in 2019.

Combining the two would open up tremendous opportunities in a range of industries, from retail and hospitality to renewable energy and oil and gas. With the proliferation of connected devices and equipment generating tremendous amounts of data, the processing and analysis that has been managed in the cloud is increasingly moving to the edge. Similarly, now that the vast majority of new software is being managed in a container, Kubernetes is the de facto choice for deploying, maintaining, and scaling that software.

But the pairing isn’t without its complexities. The nature of edge deployments (remote locations, distributed environments, safety and security concerns, unreliable network connections, and few skilled IT personnel in the field) is at odds with the assumptions of Kubernetes, which thrives in centralized data centers but doesn’t natively scale out to the distributed edge, fit smaller edge node footprints, or provide a robust zero-trust security model.

Here are four common concerns about deploying Kubernetes at the edge and some real-world strategies to overcome them.

Concern #1: Kubernetes is too big for the edge

Although originally designed for large-scale cloud deployments, the core principles behind Kubernetes (containerization, orchestration, automation, and portability) are also attractive for distributed edge networks. So while lifting a cloud-scale deployment to the edge unchanged doesn’t make sense, developers can select a Kubernetes distribution that fits their edge hardware and deployment requirements. Lightweight distributions like K3s carry a low memory and CPU footprint but may not adequately address elastic scaling needs. Flexibility is key: companies should look for partners that support any edge-ready Kubernetes distribution, with optimized configurations, integrations, and ecosystems.
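As an illustration of tailoring a lightweight distribution, K3s can be slimmed down further by disabling bundled components an edge workload doesn’t need. The following server config is a sketch only; which components are safe to disable depends entirely on the deployment:

```yaml
# /etc/rancher/k3s/config.yaml -- illustrative K3s server config for a
# resource-constrained edge node (component choices are assumptions,
# not a recommendation for any particular workload).
disable:
  - traefik          # skip the bundled ingress controller
  - servicelb        # skip the bundled service load balancer
  - metrics-server   # skip metrics collection on tightly constrained nodes
write-kubeconfig-mode: "0644"
```

Trimming components like these reduces the memory and CPU the control plane consumes, at the cost of features the workload may later need.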

Concern #2: Scaling Kubernetes at the edge

It’s common for an operator managing Kubernetes in the cloud to handle three to five clusters that scale up to 1,000 nodes or more. However, the numbers are typically flipped at the edge, with thousands of clusters running three to five nodes each, overwhelming the design of current management tools.

There are a couple of different approaches to scaling Kubernetes at the edge. In the first scenario, companies aim to keep each management plane at a manageable size by sharding, running multiple orchestrator instances that each own a subset of the edge clusters. This method is ideal for users who intend to leverage core Kubernetes capabilities or who have internal Kubernetes expertise.
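To make the sharding idea concrete, here is a minimal Python sketch (not taken from any specific orchestration product; the cluster names and shard count are illustrative) that maps each edge cluster to a stable orchestrator shard by hashing its ID:

```python
# Sketch: assign thousands of small edge clusters to a handful of
# orchestrator (management-plane) instances via stable hashing.
import hashlib

def assign_shard(cluster_id: str, num_shards: int) -> int:
    """Map a cluster ID to a stable shard index in [0, num_shards)."""
    digest = hashlib.sha256(cluster_id.encode()).hexdigest()
    return int(digest, 16) % num_shards

# Hypothetical fleet: 5,000 retail-site clusters, 10 orchestrator instances,
# so each instance manages roughly 500 clusters.
clusters = [f"store-{i:04d}" for i in range(5000)]
NUM_SHARDS = 10

shards: dict[int, list[str]] = {}
for cluster in clusters:
    shards.setdefault(assign_shard(cluster, NUM_SHARDS), []).append(cluster)

for shard_id, members in sorted(shards.items()):
    print(f"shard {shard_id}: {len(members)} clusters")
```

Because the mapping depends only on the cluster ID, a cluster always reports to the same orchestrator instance, and adding capacity means re-sharding rather than reconfiguring every site by hand.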

In the second scenario, you would implement Kubernetes workflows in a non-Kubernetes environment. This approach takes a Kubernetes artifact such as a Helm chart and deploys it on a different container runtime, such as EVE-OS, an open-source operating system developed under the Linux Foundation’s LF Edge consortium that supports running virtual machines and containers in the field.

Concern #3: Avoiding software and firmware attacks

Moving devices out of a centralized data center or the cloud and out to the edge greatly increases the attack surface and exposes them to a variety of new and existing security threats, including physical access to both the device and the data it contains. Security measures at the edge must extend beyond Kubernetes containers to include the devices themselves as well as any software running on them.

The ideal approach here is an infrastructure solution, like EVE-OS, which was purpose-built for the distributed edge. It addresses common edge concerns such as avoiding software and firmware attacks in the field, ensuring security and environmental consistency with unsecured or flaky network connections, and deploying and updating applications at scale with limited or inconsistent bandwidth.

Concern #4: Interoperability and performance requirements vary

Distributed edge applications involve a wide diversity of workloads and a large number of hardware and software providers, which makes it harder to ensure technology and resource compatibility and to meet performance requirements across the edge ecosystem. An open-source solution provides the best path forward here: one that avoids vendor lock-in and facilitates interoperability across an open edge ecosystem.

Kubernetes and edge computing: A harmonic convergence

It remains to be seen whether Kubernetes will one day be compatible with every edge computing project, or whether it will provide as powerful a solution at the edge as it does in the cloud. But what has been proven is that Kubernetes and the edge are a viable combination, often delivering new levels of scale, security, and interoperability.

The key to success with Kubernetes at the edge is building in the time to plan for and solve potential issues and demonstrating a willingness to make trade-offs to tailor a solution to specific concerns. This approach may include leveraging vendor orchestration and management platforms to build the edge infrastructure that works best for specific edge applications.

With careful planning and the right tools, Kubernetes and edge computing can work in harmony to enable the next generation of connected, efficient, scalable, and secure applications across industries. The future looks bright for these two technologies as more organizations discover how to put them to work successfully.

Michael Maxey is VP of business development at ZEDEDA.

New Tech Forum provides a venue for technology leaders—including vendors and other outside contributors—to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to doug_dineley@foundryco.com.