
Edge computing use cases led by autonomous cars and coffee bars

Edge computing can decrease latency dramatically and has found its place in autonomous vehicles, manufacturing plants and retail.

Edge computing has found successful adoption in numerous industries, including industrial automation, autonomous vehicles and local coffee shops. By allowing users to distribute their IT architecture, edge computing can significantly reduce latency.

Edge computing uses smart devices with computation power at the edge of the network, as close to the user as possible, decentralizing storage and processing centers.

The adoption of edge computing raises questions about the risks of expanding the number of smart devices processing data, and about whether it should work in tandem with cloud computing or replace it.

Edge computing vs. cloud computing

Cloud computing has data gathered and processed in a centralized location, which allows for greater security and control than when devices are deployed at the edge of the network. But the two approaches are not mutually exclusive, and companies will most likely need a combination of both in order to succeed.

"Edge computing can optimize cloud computing systems by processing data at the edge of the network, in the cloud, near the source of the data," said Shamik Mishra, vice president of technology and innovation at Altran, an engineering services company headquartered in Paris.

Mobile and IoT applications generally have their services hosted on clouds. In edge computing, parts of those applications would be moved closer to the devices and to the edge of the network in order to decrease latency and provide for immediate responses and decision-making.

However, that does not prevent other parts of the application from continuing to reside in the cloud. In fact, companies that want answers from big data analytics still require the cloud: only by combining the data gathered from all devices can relevant conclusions and trends be identified and understood.

Why lowering latency matters

"It's really about lowering latency between you and your data," Cody Hill, field CTO at Packet, a New York-based web hosting company, said. "Gone are the days of a single large data center serving your data around the world."

Having the capability to store data and process it right at the source of the input decreases latency for those interacting with the technology. One of the more relevant use cases is autonomous vehicles and their need for immediate responses. Each vehicle can process the data that it takes in and act on this information rather than sending it to a central location and waiting for a response before proceeding. For these vehicles, having immediate response time is necessary for both basic operations and public safety.
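The value of that local decision loop can be sketched with a toy calculation. The latency figures, braking rule and function names below are illustrative assumptions, not figures from any real vehicle system; the point is only that a network round trip translates directly into distance traveled before the vehicle can react.

```python
CLOUD_ROUND_TRIP_S = 0.150   # assumed WAN round trip to a distant data center
LOCAL_PROCESS_S = 0.002      # assumed on-vehicle inference time

def brake_decision(obstacle_distance_m: float, speed_mps: float) -> bool:
    """Toy on-vehicle rule: brake if stopping distance exceeds the gap."""
    stopping_distance = speed_mps ** 2 / (2 * 7.0)  # ~7 m/s^2 braking decel
    return stopping_distance >= obstacle_distance_m

def decide_at_edge(distance_m: float, speed_mps: float):
    # Sensor data never leaves the vehicle; only the compute cost applies.
    return brake_decision(distance_m, speed_mps), LOCAL_PROCESS_S

def decide_via_cloud(distance_m: float, speed_mps: float):
    # Same logic, but the answer arrives a network round trip later.
    return brake_decision(distance_m, speed_mps), LOCAL_PROCESS_S + CLOUD_ROUND_TRIP_S

decision, edge_latency = decide_at_edge(20.0, 25.0)
_, cloud_latency = decide_via_cloud(20.0, 25.0)
# At 25 m/s, the vehicle covers roughly 3.75 m more while waiting on the cloud.
extra_travel_m = 25.0 * (cloud_latency - edge_latency)
```

Even with generous assumptions, the cloud path adds meters of travel per decision, which is why the inference has to happen on the vehicle itself.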

Edge computing use cases are rising, but the cases may not be the same for all industries. For the aforementioned vehicles, the devices are right there with the user, but for the cellphone industry perhaps 10 locations spread across the U.S. is enough for edge computing to work. They may not require the latency to be down to 1 millisecond and may instead accept more latency in exchange for fewer places to safeguard against potential risks.

Successful edge computing use cases

  1. Industrial automation
  2. Autonomous vehicles
  3. Intelligent assistants
  4. Hands-free robotic surgery
  5. Video streaming
  6. Connected homes

Edge computing cuts inoperability

In other markets, edge computing may be deployed for a wholly different reason than decreasing latency. For coffee shops or retail chains with numerous branches, the ability to store data and operate partially independently offers protection from widespread problems. If the core data storage center crashes, not every store is affected or rendered inoperable.

For companies that run multiple stores but don't require extremely low latency, edge computing can still find a place. Industries have reached the point where sending endless amounts of data to a central hub to be processed and stored just does not make sense, fiscally or operationally.

Manufacturing facilities that operate numerous robots accrue a lot of data over time. Collecting IoT sensor data from these robots and tuning them to ensure efficiency is important, but sending everything that is gathered to a central location wouldn't make sense. Instead, they collect and aggregate the data on site, then send the results off to their analytics platform.
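The aggregate-on-site pattern can be illustrated with a minimal sketch. The function name, summary fields and sample torque readings are hypothetical; the idea is simply that a small summary, rather than every raw sample, crosses the WAN.

```python
from statistics import mean

def summarize_on_site(readings: list[float]) -> dict:
    """Aggregate raw robot sensor readings at the plant before upload."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# Illustrative torque readings from one robot arm over a shift.
raw = [41.2, 40.8, 43.1, 39.9, 44.0]
summary = summarize_on_site(raw)
# Only this four-field summary is sent to the analytics platform,
# not the full stream of raw samples.
```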

Having a local point-of-sale option protects the individual store if its global SaaS application goes down or its connection to the global company is interrupted. The store can still take purchases, including card payments, and send them off later once the issue is resolved, Hill said.
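That store-and-forward behavior amounts to a simple offline-first queue. The class and method names below are hypothetical, a sketch of the pattern Hill describes rather than any vendor's actual POS software:

```python
from collections import deque

class OfflineFirstPOS:
    """Queue card transactions locally; flush them when the link returns."""

    def __init__(self):
        self.pending = deque()   # captured while offline
        self.settled = []        # confirmed with the central system

    def take_payment(self, txn: dict, online: bool) -> None:
        if online:
            self.settled.append(txn)   # normal path: settle immediately
        else:
            self.pending.append(txn)   # store-and-forward while offline

    def reconnect(self) -> int:
        """Replay everything captured while offline; return the count flushed."""
        flushed = 0
        while self.pending:
            self.settled.append(self.pending.popleft())
            flushed += 1
        return flushed

pos = OfflineFirstPOS()
pos.take_payment({"sku": "latte", "amount": 4.50}, online=False)
pos.take_payment({"sku": "scone", "amount": 3.25}, online=False)
flushed = pos.reconnect()   # both sales settle once connectivity returns
```

The store keeps selling coffee throughout the outage; the central system simply sees the transactions a little later.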

Safeguarding from risk

The risks associated with this approach, however, are clear. With a larger number of distributed compute nodes, interconnected as either a mesh or a tree, the overall attack surface increases, Mishra said. There are more points at which a company needs to safeguard the data being collected.

The risks can be mitigated, however, as long as companies employ safeguards.

"Devices have to be authenticated. Applications need to be authorized. Access to edge through APIs needs to be regulated and secured," Mishra said. "The fact that the device and the edge need to authenticate each other is critical to ensure secure computing at the edge of the network."
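The mutual authentication Mishra describes can be sketched as a two-way challenge-response over a pre-shared key. This is an illustrative simplification, not a production protocol; real deployments would typically use certificates and mutual TLS. The key and function names here are assumptions for the example.

```python
import hmac
import hashlib
import secrets

SHARED_KEY = b"provisioned-at-enrollment"  # hypothetical pre-shared device key

def respond(challenge: bytes, key: bytes) -> bytes:
    """Prove knowledge of the key without ever transmitting it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(device_key: bytes, edge_key: bytes) -> bool:
    """Each side challenges the other; a failure on either side aborts."""
    edge_challenge = secrets.token_bytes(32)     # edge -> device
    device_challenge = secrets.token_bytes(32)   # device -> edge
    device_proof = respond(edge_challenge, device_key)
    edge_proof = respond(device_challenge, edge_key)
    device_ok = hmac.compare_digest(device_proof, respond(edge_challenge, edge_key))
    edge_ok = hmac.compare_digest(edge_proof, respond(device_challenge, device_key))
    return device_ok and edge_ok

session_allowed = mutual_auth(SHARED_KEY, SHARED_KEY)  # matching keys
session_blocked = mutual_auth(SHARED_KEY, b"wrong")    # mismatched keys
```

The point of the two-way exchange is exactly what Mishra notes: a rogue device cannot join the edge, and a rogue edge node cannot impersonate the legitimate one.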
