
IoT edge devices need benchmarking standards

This is the third part of a four-part series. Start with the first post here.

In the first two installments of this four-part series, I discussed what edge computing is and why it matters. I also outlined some key architectural and design considerations based on the increasing complexity of hardware and software as you approach the device edge from the cloud. In this installment, I’ll dig even deeper.

Infrastructure size matters

Cameras are one of the best sensors around, and computer vision — applying AI to image-based streaming data — is the first killer app for edge computing. Only someone who wants to sell you wide-area connectivity thinks it’s a good idea to blindly send high-resolution video over the Internet. A smarter practice is to store video in place at your various edges and review or backhaul footage only when meaningful events occur.
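
As a minimal sketch of that store-in-place, backhaul-on-event pattern, here is some illustrative Python. The frame rate, buffer length, detector and uploader are assumptions for the example, not any product’s API:

```python
import collections
import time

FPS = 15                  # assumed camera frame rate
BUFFER_SECONDS = 30       # assumed pre-event context retained in memory

# Rolling buffer: video is held at the edge and discarded unless needed.
frame_buffer = collections.deque(maxlen=FPS * BUFFER_SECONDS)

def detect_event(frame):
    """Stand-in for a real detector (motion, person, vehicle, etc.)."""
    return frame["motion_score"] > 0.8

def backhaul_clip(frames):
    """Stand-in for shipping a short clip upstream for review."""
    print(f"backhauling {len(frames)} frames of event context")

def handle_frame(frame):
    frame_buffer.append(frame)             # always stored locally first
    if detect_event(frame):
        backhaul_clip(list(frame_buffer))  # send the event, not the stream
        frame_buffer.clear()

# Simulated stream: mostly quiet frames, one meaningful event.
for i in range(100):
    handle_frame({"ts": time.time(), "motion_score": 0.9 if i == 60 else 0.1})
```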

For any given use of edge devices, a key value a provider can offer customers is pre-validation and performance benchmarking of workloads. This ensures customers purchase right-sized infrastructure up front and get reliable performance in the field. Surveillance for safety and security is fairly constrained in terms of variables, such as the number of cameras, resolution, frame rate and footage retention time. The marketplace for camera makers and video management providers is well-established. Together, these constrained variables make benchmarking appropriate infrastructure sizes relatively straightforward.
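
To see why those constrained variables make sizing tractable, consider a back-of-the-envelope storage estimate. The camera count, per-stream bitrate and retention period below are illustrative assumptions, not vendor figures:

```python
# Rough surveillance storage sizing from the constrained variables:
# camera count, stream bitrate and retention time. All input values
# are illustrative assumptions.

cameras = 16
bitrate_mbps = 4          # assumed 1080p H.264 stream per camera
retention_days = 30

seconds = retention_days * 24 * 3600
bytes_total = cameras * (bitrate_mbps * 1_000_000 / 8) * seconds
print(f"~{bytes_total / 1e12:.1f} TB to retain {cameras} cameras "
      f"for {retention_days} days")   # ~20.7 TB with these assumptions
```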

As new tools and the rise of edge computing enable computer vision at scale, applications become less constrained than surveillance, and data on behavior and performance needs is harder to come by. For example, in a brick-and-mortar retail scenario, the compute power needed for an AI model to identify basic customer demographics — such as gender or age range — is different from what’s needed to assess individual identity. Retailers often don’t have power or cooling available in their equipment closets, so they must get creative. It would be valuable for them to know the power and cooling load requirements in advance.
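
As a hedged sketch of how that difference could be quantified before buying hardware, here is a tiny throughput harness. Both functions are hypothetical stand-ins, with sleeps simulating a lightweight demographics classifier versus a heavier identity model:

```python
import time

def run_demographics(frame):
    time.sleep(0.005)     # pretend ~5 ms inference per frame

def run_identity(frame):
    time.sleep(0.040)     # pretend ~40 ms inference per frame

def benchmark(model, frames=100):
    """Measure sustained frames per second for a given workload."""
    start = time.perf_counter()
    for _ in range(frames):
        model(None)       # a real harness would feed captured frames
    return frames / (time.perf_counter() - start)

for name, model in [("demographics", run_demographics),
                    ("identity", run_identity)]:
    print(f"{name}: ~{benchmark(model):.0f} frames/sec on this box")
```

Run against real models on candidate hardware, numbers like these tell a retailer how many camera streams a given box can sustain for each workload.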

Users’ needs are likely to grow over time as more workloads are consolidated on the same infrastructure. It’s important to be able to deploy additional compute capacity in the field and to invest in the right modular, software-defined approach up front, so you can readily redistribute workloads anywhere as your needs evolve.

Fragmented edge technology makes benchmarking trickier

In more traditional telemetry- and event-based IoT, measuring efficacy and developing benchmarks are especially tough due to the inherent fragmentation near the device edge. Practically every deployment is a special case. With so many different uses and tool sets, there are no established benchmark baselines.

I often draw edge-to-cloud architecture outlines left to right on a page because that orientation fits better on landscape slides. During a presentation a few years back, while I was talking about many of these concepts, someone pointed out that the cloud on the right is like the East, with its long legacy of refinement and stability, whereas the edge on the left is the Wild West. That pretty much nails it, and it’s why it’s so important for us to collaborate on open tools like EdgeX Foundry that bring interoperability and order to an inherently fragmented edge solution stack. It takes a village to deploy an IoT solution that spans technology and domain expertise.

In addition to facilitating open interoperability, tools like the EdgeX Foundry framework provide the bare-minimum plumbing that not only serves as a center of gravity for assembling predictable solutions, but also enables stronger performance benchmarks regardless of use case.
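
To illustrate the kind of plumbing meant here, the sketch below normalizes heterogeneous device payloads into one common event envelope. The field names are illustrative, loosely modeled on the event/reading pattern such frameworks use; this is not the actual EdgeX Foundry schema:

```python
import json
import time

def normalize(device, resource, value, units):
    """Map any device's payload into one common, benchmarkable shape."""
    return {
        "device": device,
        "origin": time.time_ns(),   # shared timestamp basis across sources
        "readings": [
            {"resource": resource, "value": value, "units": units},
        ],
    }

# A Modbus temperature sensor and a BLE humidity sensor land in the
# same envelope, so the same benchmarks apply to both.
print(json.dumps(normalize("modbus-temp-01", "temperature", 21.4, "C")))
print(json.dumps(normalize("ble-hum-07", "humidity", 44.0, "%")))
```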

Tools should fit a standard for IoT edge interoperability, so IT pros can focus on adding value instead of reinventing plumbing. An IoT standard would also enable benchmarking of infrastructure sizes, so customers can better anticipate their needs over time.

