Edge computing is focused on putting compute infrastructure and storage in or near enterprise branches instead of in central data centers. The technology is coming to the fore as we move into an era of compute tasks that require short response times or that operate on masses of data. These tasks are more practical to handle locally than to send to another location for analysis.
For example, IoT applications, such as real-time control of high-speed manufacturing lines, may require sub-millisecond response times along with massive data analysis.
What is edge computing anyway?
Edge computing is not just putting servers back in branches the way many organizations did in the past. Enterprises have spent too much time pulling servers out of branches and into data centers just to have them flow back in. The benefits of centralization in both service quality and staffing make the old version of distributed infrastructure a non-starter.
When we talk about edge computing in the enterprise context, we mean something more orderly, sustainable and limited. Edge computing is about creating a distributed infrastructure that has three main characteristics: central management, hands-free and lights-out operation, and cloud-style infrastructure.
Central management is critical to preventing the slide back into unevenly maintained systems, massive variations in performance and availability, and serious security risks that characterized overly local infrastructure.
With fully centralized management, edge infrastructure is expected to be lights out. In other words, staff on site are not expected to interact with the edge infrastructure under normal circumstances. It can live in the dark, perhaps in a wiring closet with adequate climate controls, perhaps in a local hosting facility.
Cloud-style infrastructure is key to enabling central management and lights-out operation. Generic compute and storage blocks, and especially converged or hyperconverged infrastructure, can host centrally provisioned and managed workloads. By using such building blocks, IT can easily deliver the services that have to run locally and change that portfolio of services as needs evolve. This setup also lets IT add more resources as needed: plug-and-play components, for instance, can be installed with little effort and time, then automatically discovered and brought into the resource pools for local use.
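As a loose illustration of that plug-and-play model — the node names, capacities and discovery mechanism below are all hypothetical, not drawn from any particular product — a central controller's resource pool might absorb newly installed building blocks along these lines:

```python
# Hypothetical sketch of a centrally managed edge resource pool.
# Node names and capacities are invented for illustration.

from dataclasses import dataclass


@dataclass
class Node:
    name: str
    cpus: int
    storage_gb: int


class ResourcePool:
    """Central controller's view of one branch's edge infrastructure."""

    def __init__(self):
        self.nodes = []

    def discover(self, node: Node):
        # A plug-and-play block announces itself and is pooled
        # automatically, with no on-site staff involvement
        # (lights-out operation).
        self.nodes.append(node)

    def capacity(self):
        # Aggregate capacity available to centrally provisioned workloads.
        return {
            "cpus": sum(n.cpus for n in self.nodes),
            "storage_gb": sum(n.storage_gb for n in self.nodes),
        }


pool = ResourcePool()
pool.discover(Node("edge-blade-1", cpus=16, storage_gb=2000))
pool.discover(Node("edge-blade-2", cpus=16, storage_gb=2000))
print(pool.capacity())  # {'cpus': 32, 'storage_gb': 4000}
```

The point of the sketch is the shape of the workflow, not the code itself: installation adds capacity, discovery pools it, and provisioning draws from the pool without anyone on site touching a console.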
The rise of edge computing will offer some benefits to the network. Edge infrastructure will offload some traffic and ease some demands for the highest-performing connectivity. It can also offer a more convenient way to provision virtual network stacks.
Reduced traffic comes from eliminating the streams associated with the workloads that run locally. If the factory floor automation, for example, no longer flings several gigabytes of sensor data at the data center every minute, that eases the load on the WAN.
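To make the traffic saving concrete, here is a minimal, hypothetical sketch of local preprocessing: rather than shipping every raw sensor reading over the WAN, the edge node forwards only a compact per-minute summary. The data volumes and summary fields are invented for illustration.

```python
# Hypothetical sketch: summarize raw sensor readings locally and
# send only compact statistics upstream, cutting WAN traffic.

import json
import statistics


def summarize(readings):
    """Reduce a minute's worth of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
    }


# One minute of (simulated) high-rate sensor data handled at the edge.
raw = [20.0 + 0.001 * i for i in range(60_000)]

summary = summarize(raw)
payload = json.dumps(summary)

# The WAN now carries a summary of ~100 bytes instead of the raw stream.
print(len(payload), "bytes instead of", len(json.dumps(raw)), "bytes")
```

Whether this kind of reduction is acceptable depends on the workload, of course; some analytics genuinely need the raw stream, which is exactly the case for keeping the compute at the edge.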
If the need for ultra-low response times and ultra-low packet loss is removed, the branch should not need as much high-quality WAN capacity. In that case, IT can consider decreasing the amount of MPLS bandwidth to the branch, or migrating off MPLS there altogether, especially if IT is also deploying SD-WAN.
Lastly, as enterprises continue to virtualize their network stacks broadly, especially in branches, the need for high-quality infrastructure to run virtual appliances or virtualized network functions will rise. Edge infrastructure can provide that and even form the nucleus of an internet and security hub for the organization, offloading network tasks from other branches and from central data centers.