

What the future of edge computing means for cloud and networks

IoT and application processing needs are shaping the future of edge computing, driving the growth of micro modular data centers and raising the question of cloud computing's relevance.

Edge computing could be defined as any computing environment located outside a cloud environment. This definition is broad, however, and includes many different application types. A better description of edge computing is that it places processing close to the data source, rather than performing concentrated processing at a distant central location.

Companies investigating possible edge computing applications shouldn't assume they can simply select an option off the shelf. The wide variety of applications means no single network architecture exists for the edge. Each network must be designed to meet the specific requirements of an application.

Edge computing is not new -- branch office servers have been executing applications for years. But much of the recent interest in edge computing has stemmed from the growth of IoT. Hundreds of IoT device types have been developed, ranging from simple devices that periodically report temperature to sophisticated machine tools.

As IoT growth and application processing requirements influence the future of edge computing, it's also useful to compare edge computing with cloud computing. While each option has its advantages, the ultimate decision comes down to an organization's requirements and customization needs.

How IoT sensors affect the future of edge computing

Applications that produce a large amount of data are often best processed at the edge. A large warehouse or industrial facility might have thousands of temperature and humidity sensors, for example. An application must examine this data as it arrives from the sensors and take action if the temperature or humidity exceeds limits. That application could be located either at the edge or in a cloud.
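The kind of monitoring application described above can be sketched in a few lines. This is a minimal illustration, not a reference implementation; the sensor names, limits and alert format are hypothetical.

```python
# Minimal sketch of edge-side threshold monitoring. Sensor IDs,
# limits and the alert format are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    temperature_c: float
    humidity_pct: float

TEMP_LIMIT_C = 30.0
HUMIDITY_LIMIT_PCT = 60.0

def check_reading(r: Reading) -> list:
    """Return alert messages for any limit the reading exceeds."""
    alerts = []
    if r.temperature_c > TEMP_LIMIT_C:
        alerts.append(f"{r.sensor_id}: temperature {r.temperature_c} C over limit")
    if r.humidity_pct > HUMIDITY_LIMIT_PCT:
        alerts.append(f"{r.sensor_id}: humidity {r.humidity_pct}% over limit")
    return alerts

# Process a small batch of readings as they arrive.
batch = [
    Reading("rack-01", 22.5, 45.0),
    Reading("rack-02", 31.2, 48.0),  # over the temperature limit
]
alerts = [a for r in batch for a in check_reading(r)]
```

Running this loop at the edge means only the alerts (and perhaps periodic summaries) need to cross the network, rather than every raw reading.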

[Figure: Edge computing brings data processing closer to the data source.]

Processing the application in a cloud environment would require transferring all the data readings across the network. Processing at the edge, however, would eliminate the need to transfer those readings. The link between the edge and the cloud would carry only periodic reports, so it would cost less than a link that carried a constant flow of high-volume data. The tradeoff would be the continuing cost of a communication link versus the cost of locating and maintaining a processor at the edge.

Processors have continued to fall in price, so edge computing will be less expensive than cloud computing in many cases. But each application is different, and organizations must study each one carefully before choosing an option.

Consider response time and security

Response time can also dictate the choice of edge computing vs. cloud computing. For example, a refinery might have thousands of sensors. In this case, the application that monitors the sensors would need to respond quickly when a temperature exceeds its limit. The delay that comes when sending data to the cloud and waiting for a response could cause a serious problem.
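The response-time argument can be illustrated with a simple latency budget. The round-trip and processing times below are assumed values chosen only to show the structure of the comparison, not measurements.

```python
# Illustrative latency budget; all figures are assumptions.
LOCAL_RTT_MS = 2    # sensor to edge processor, same facility
WAN_RTT_MS = 60     # facility to cloud region, round trip
PROCESSING_MS = 5   # time to evaluate the reading and decide

# Edge: only the local round trip plus processing.
edge_response_ms = LOCAL_RTT_MS + PROCESSING_MS

# Cloud: the same work, plus a WAN round trip in the path.
cloud_response_ms = LOCAL_RTT_MS + WAN_RTT_MS + PROCESSING_MS

# edge_response_ms  -> 7
# cloud_response_ms -> 67
```

For a refinery sensor that must trigger a shutdown within milliseconds, the fixed WAN round trip dominates the budget no matter how fast the cloud processing itself is.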


Security is also a major issue when choosing a computing architecture, and neither edge computing nor cloud computing is always the better option. One could argue that processing and storing critical information at the edge means less information is stored at each location, so an attack on one site will not compromise all of an enterprise's data. The counterargument is that storing data at numerous sites makes it more vulnerable overall.

The fact that specialized security personnel are available only at a central site is not an issue. In most cases, centralized staff monitors and manages the edge sites. Here again, it's necessary to examine the application, considering factors like the vulnerability of data at the edge and its value. Credit card numbers, for example, require more protection than temperatures and pressure readings.

Implement edge computing with MMDCs

An edge computing facility can be located within a warehouse, refinery or retail store, but other options are also available. In the past few years, micro modular data centers (MMDCs) have grown in popularity. An MMDC is a complete computing facility in a box, and it contains everything included in a data center: processors, network, power, cooling and fire suppression, as well as protection from electromagnetic interference, shock and vibration.

MMDCs are constructed and configured at a central facility and then shipped to a location close to the data source. Units range from small enclosures containing a single rack to larger configurations with multiple racks and processors. Organizations can install MMDCs within a building, as well as in locations like cell phone towers.

MMDCs address versatile computing requirements. Cloud providers have purchased MMDC units and placed them near customers as extensions of their cloud facilities, but with the advantages of lower-cost communications and quicker response. For service providers, MMDCs enable the expansion of processing services. Enterprises are also using MMDC units as a cost-effective and efficient option for internal needs.

Future edge computing developments hinge on IoT and MMDCs

Edge computing will undoubtedly continue to grow as new types of IoT devices develop. It could also prove a viable alternative -- or complement -- to cloud computing, especially for applications that require fast response times. Options such as MMDCs will make edge computing easier and less expensive than acquiring components and assembling them at remote sites.
