It will be some time before the edge vs. cloud computing question has a clear-cut answer, but organizations can use options such as fog nodes to move in the right direction with confidence.
IoT is expanding at an exponential rate, and with it comes a growing requirement for edge computing, the network architecture that brings data processing closer to the source of data or applications. Part of what prevents organizations from choosing edge computing for IoT infrastructure is that edge architecture is still so new, and so flexible, that there is no single best way to configure it. Even though edge computing and dedicated networks beyond the enterprise firewall offer the flexibility many IoT projects need, organizations can find the setup challenging without the established architectural standards of more mature technologies. Organizations gain real efficiencies and insightful data from their IoT investments even when using cloud computing, but extending resources to the edge requires a continued effort to get the most out of IoT.
IoT is a new frontier when it comes to infrastructure, and it's not always as simple as edge vs. cloud computing. Organizations must determine whether edge or cloud computing will best distribute processing resources for optimal performance and then weigh the challenges.
The architecture that brings the cloud to the edge
Sometimes the edge needs an extra boost -- the fog node -- which changes the considerations of edge and cloud computing. Most edge and cloud computing architectures need components to make their data processes possible. Typically, cloud computing uses IoT gateways to organize and transport data through the network, routing traffic while applying security protocols. Edge nodes collect data from IoT devices that speak many different communication protocols and translate it into a form usable by cloud computing. The problem is that gateways and edge nodes are no longer enough for IoT applications, such as a facial recognition system, that need real-time analysis at the edge. The latency introduced by a round trip to the cloud to crunch data diminishes the usefulness of an application intended to be real-time.
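To make the gateway's translation role concrete, here is a minimal sketch of normalizing readings from two different device protocols into one record a cloud backend can ingest. The topic layout, register map, and names are hypothetical, chosen only for illustration:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """Normalized record the cloud can ingest regardless of source protocol."""
    device_id: str
    metric: str
    value: float

def from_mqtt(topic: str, payload: float) -> Reading:
    # Hypothetical MQTT topic layout: "site/<device_id>/<metric>"
    _, device_id, metric = topic.split("/")
    return Reading(device_id, metric, payload)

def from_modbus(unit_id: int, register: int, raw: int) -> Reading:
    # Hypothetical register map: 40001 holds temperature in tenths of a degree
    metric = {40001: "temperature"}.get(register, f"register_{register}")
    return Reading(f"plc-{unit_id}", metric, raw / 10.0)

# The gateway funnels both protocols into one outbound format:
readings = [
    from_mqtt("site/sensor-7/temperature", 21.5),
    from_modbus(unit_id=3, register=40001, raw=215),
]
```

However the devices report, downstream systems see one uniform `Reading` shape.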
The fog node is the architectural answer to this problem, an addition to edge architecture that can tip the scales in favor of the edge in the cloud vs. edge debate. A fog node is a physical server that brings real-time analytical processing to IoT, above and beyond the routing and messaging functionality of simple edge nodes. This extension of resources to the edge greatly empowers IoT. The fog node consolidates gateway functionality -- including device monitoring, message routing, and data cleanup and aggregation -- with augmented computing resources that can perform analytics at big data scale, without the back-and-forth to the cloud. This makes the IoT environment far more efficient and responsive. Fog nodes can, in principle, make the IoT deployment self-sufficient.
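The split between the fog node's real-time path and its cloud-bound aggregation can be sketched as follows. This is a toy illustration, not a reference implementation; the threshold, batch size, and class names are assumptions:

```python
import statistics
from collections import deque

class FogNode:
    """Sketch: decide locally in real time, ship only aggregates to the cloud."""

    def __init__(self, alert_threshold: float, batch_size: int = 4):
        self.alert_threshold = alert_threshold
        self.batch_size = batch_size
        self.batch = deque()
        self.uploaded = []  # stands in for a cloud upload queue

    def ingest(self, value: float) -> str:
        # Real-time path: the decision happens here, no cloud round trip.
        decision = "alert" if value > self.alert_threshold else "ok"
        # Aggregation path: summarize locally, upload batches, not raw data.
        self.batch.append(value)
        if len(self.batch) >= self.batch_size:
            self.uploaded.append(statistics.mean(self.batch))
            self.batch.clear()
        return decision

node = FogNode(alert_threshold=90.0)
decisions = [node.ingest(v) for v in [70.0, 95.0, 80.0, 75.0]]
```

The alert fires immediately at the edge, while the cloud receives only a periodic mean, which is what makes the environment both responsive and bandwidth-efficient.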
When to use fog nodes
To understand when technologists should use edge with fog nodes or cloud computing, they should ask the following questions.
Does the system need to respond in real time? Applications that require real-time decisions with minimal latency will require fog nodes. For example, facial recognition in a security system requires AI; a camera system in the field would therefore need fog nodes to do the heavy lifting. Traffic management systems are another example.
Is there a human in the loop? A low latency requirement in an IoT application is usually driven by machine-to-machine communication; processing must happen immediately because a device is waiting for a response. If an IoT application needs processing power for analytics in the service of decision support, and the result is notifying a human being, then there's probably time for that round trip to the cloud. Fog nodes would be an unnecessary expenditure.
Are you gathering multipurpose data? An increasingly common scenario is that data gathered by IoT has more than one use and serves more than one machine learning purpose. In this case, it falls to the fog nodes to parse the data: what stays on the edge, and what gets sent to the cloud?
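The first two questions above reduce to a simple heuristic, sketched here as a toy function. Real placement decisions would also weigh cost, bandwidth, and data governance; the function and its return strings are purely illustrative:

```python
def placement(real_time: bool, human_in_loop: bool) -> str:
    """Toy heuristic mirroring the questions above."""
    if real_time and not human_in_loop:
        # A machine is waiting on the response; latency rules out the cloud.
        return "fog node at the edge"
    # A human can absorb the round-trip latency, so the cloud suffices.
    return "cloud"
```

A facial recognition camera (`placement(True, False)`) lands on a fog node; an analytics dashboard that merely notifies a person does not.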
One typical scenario would be machine monitoring on a factory floor. The edge system works in real time, ready to respond to hardware failures that might threaten human safety, and machine performance metrics go to the cloud for long-term process efficiency and resource consumption analysis. Another example is a retail store that uses smart mirrors to provide real-time imaging and recommendations of in-stock products for the consumer via the edge system. The consumer's responses are sent to the cloud to feed the predictive model, which will later be used to update the mirrors.
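The factory-floor scenario amounts to partitioning one event stream by destination. The event shapes and field names below are invented for illustration:

```python
def partition(events):
    """Split a factory-floor event stream: safety-critical failures stay on the
    edge for real-time response; performance metrics go to the cloud for
    long-term efficiency and resource consumption analysis."""
    edge, cloud = [], []
    for event in events:
        target = edge if event["type"] == "hardware_failure" else cloud
        target.append(event)
    return edge, cloud

stream = [
    {"type": "hardware_failure", "machine": "press-2"},
    {"type": "performance_metric", "machine": "press-2", "cycle_time_s": 4.1},
]
edge_events, cloud_events = partition(stream)
```

The same pattern fits the smart mirror example: the imaging and recommendations happen on the edge path, while consumer responses take the cloud path to feed the predictive model.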