David Linthicum
Contributor

What happened to edge computing?

analysis
Oct 20, 2023 | 4 mins
Analytics, Artificial Intelligence, Cloud Computing

Edge computing offers lower latency and bandwidth savings, but the lack of standards and lingering problems with interoperability and security still need to be addressed.


Edge computing emerged as a revolutionary tool to address the rising demand for real-time data processing. By enabling data processing at the edge of the network, closer to where it’s generated, edge computing significantly reduces latency and bandwidth use. 
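The bandwidth argument is easy to see with a back-of-envelope sketch. The numbers below are illustrative assumptions, not figures from any real deployment: a sensor fleet that aggregates readings at the edge and uploads only summaries, versus shipping every raw sample to a central cloud region.

```python
# Illustrative, assumed numbers: compare upload bandwidth for a sensor
# fleet with and without edge-side aggregation.

RAW_SAMPLE_BYTES = 200      # assumed size of one raw sensor reading
SAMPLES_PER_SEC = 1_000     # assumed fleet-wide sampling rate
AGGREGATE_BYTES = 2_000     # assumed size of one edge-side summary
AGGREGATES_PER_SEC = 1      # one summary per second after edge filtering

raw_bps = RAW_SAMPLE_BYTES * SAMPLES_PER_SEC     # cloud-only upload rate
edge_bps = AGGREGATE_BYTES * AGGREGATES_PER_SEC  # upload rate with edge

savings = 1 - edge_bps / raw_bps
print(f"cloud-only: {raw_bps} B/s, with edge: {edge_bps} B/s")
print(f"bandwidth saved: {savings:.0%}")
```

Under these assumptions the edge-side summary cuts upload traffic by 99%; the real ratio depends entirely on how much of the raw stream a given workload can discard locally.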

That’s the story we’ve been told for years, but how will it evolve with the new demands of generative AI and bandwidth explosion?

Edge computing today

Currently, edge computing is a major force in many sectors. It promises lower latency and optimized data delivery, or at least it has the potential for both. The internet of things, autonomous vehicles, and Industry 4.0 all incorporate edge computing widely.

However, edge computing has entered its awkward teenage years. The number of applications never matched early expectations. In many instances, edge computing first looked like the target architecture, but it turned out to make more sense to centralize processing and data storage.

This is mainly due to the expanding availability of bandwidth, such as 5G, and to the difficulty of managing many devices and systems at the edge. I believe the latter is the most significant hindrance, and I'll explain why.

Edge computing challenges

Despite the many benefits, edge computing is full of challenges. For instance, decentralizing data processing brings security and privacy concerns. A friend who deployed edge systems on oil rigs had 10% of the edge computing devices stolen, along with data stored on the devices. The data was encrypted, but what a huge wake-up call when systems can grow legs and walk away. That’s never been a problem with the cloud.

Standardizing edge computing devices and ensuring their interoperability are other significant hurdles. There is no common way to leverage communications or management standards to operate these systems. Edge computing vendors need to get on the same page.

Despite the rise of some common standards, edge computing largely lacks interoperability with systems in enterprise data centers. With each edge computing vendor supporting their own “standard,” it gets expensive to keep the various skills around to support edge-based systems.

Edge computing vendors are quick to excuse the lack of standards by pointing out that each edge-based system's mission is vastly different from the others. One may focus on high-speed data gathering and processing to support airplane engine operations. Another may support point-of-sale terminals. Both are edge computing, but they have very different missions.

Edge computing continues to find a path of promising innovation. However, we may be at innovation saturation and need to focus on expansion and operations.

The future at the edge

Developments such as 5G networking and generative AI will further elevate edge computing’s potential. Knowledge engines running within the edge are a massive area of growth right now. The advent of 5G will dramatically speed up data relay and computational tasks, while AI will enable much more sophisticated data processing at the edge.

The core issues with edge computing are the lack of standards and vast heterogeneity leading to complexity. The resulting operational problems may be more difficult to overcome than most understand. There are a few ways to look at this issue.

First, the recognition of edge computing as a valid architecture pattern is a clear success. We've understood that moving data and processing closer to the point of generation is a better approach for many use cases, and now we have the technology and bandwidth to pull it off.
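The "valid architecture pattern" point can be made concrete with a small placement heuristic. This is a hypothetical sketch, not an established rule: the function name, thresholds, and inputs are all illustrative assumptions about how one might decide, per workload, whether processing belongs at the edge or in a central cloud.

```python
# Hypothetical decision helper: given a workload's latency budget and the
# round-trip time to the nearest cloud region, decide where to process.
# The logic and names here are illustrative assumptions, not a standard.

def place_workload(latency_budget_ms: float,
                   cloud_rtt_ms: float,
                   edge_capable: bool) -> str:
    """Return 'edge' when the cloud round trip alone would exceed the
    latency budget and local hardware can handle the job; else 'cloud'."""
    if edge_capable and cloud_rtt_ms >= latency_budget_ms:
        return "edge"
    return "cloud"

# A tight control loop (e.g., vehicle sensing) lands at the edge;
# loose batch reporting (e.g., point of sale) can centralize.
print(place_workload(latency_budget_ms=10, cloud_rtt_ms=40, edge_capable=True))   # edge
print(place_workload(latency_budget_ms=500, cloud_rtt_ms=40, edge_capable=True))  # cloud
```

Real placement decisions weigh far more (data gravity, cost, connectivity, security), but even this toy version shows why the pattern survives: some workloads simply cannot afford the round trip.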

Second, given the diverse set of problems that edge computing solves, it’s unlikely that we’ll have common standards anytime soon. You can’t expect the data storage standards for an oil rig and an autonomous vehicle to be the same. They are attempting to solve very different problems, and you don’t want to implement “standards” to limit what they need to do.

Edge computing will likely evolve into different usage patterns during the next few years. Most of these will be defined by technology developed for those applications. Standards will follow those usage patterns, and we'll likely see many of them.

Edge computing will grow with cloud computing, AI, cloud-native, etc., but we must understand that it will vary by application. It’s a concept that can leverage many different technology types, and that’s why it’s useful.


David S. Linthicum is an internationally recognized industry expert and thought leader. Dave has authored 13 books on computing, the latest of which is An Insider’s Guide to Cloud Computing. Dave’s industry experience includes tenures as CTO and CEO of several successful software companies, and upper-level management positions in Fortune 100 companies. He keynotes leading technology conferences on cloud computing, SOA, enterprise application integration, and enterprise architecture. Dave writes the Cloud Computing blog for InfoWorld. His views are his own.
