For the past few years, it seems like the hot topic in enterprise architecture has been the cloud — a centralized approach where simple browsers at the edge of a network access cloud-based services and analytics.
There’s been a lot written about why the cloud is the best place to store, manage and analyze data. A centralized model can deliver the processing power and cost advantages of hyperscale, coupled with the ability to more easily manage and coordinate thousands of geographically separated endpoints.
But not every application or use case is best served by the cloud.
By 2020, more than 20 billion devices will likely be connected via the internet of things. Every minute, they create staggering amounts of data, measured in zettabytes, a unit equal to one trillion gigabytes. By the end of next year, IoT will generate more than 500 zettabytes of data per year, and in the years beyond, that number is expected to grow exponentially.
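To put that annual figure on a per-minute footing, a quick back-of-the-envelope calculation (using only the numbers above: 500 zettabytes per year, with one zettabyte equal to one trillion gigabytes) looks like this:

```python
# Back-of-the-envelope check of the figures above.
ZETTABYTE = 10**21  # bytes; equal to one trillion (10**12) gigabytes
GIGABYTE = 10**9    # bytes

annual_bytes = 500 * ZETTABYTE        # projected IoT output per year
minutes_per_year = 365 * 24 * 60      # 525,600 minutes

per_minute_bytes = annual_bytes / minutes_per_year
per_minute_petabytes = per_minute_bytes / 10**15

assert ZETTABYTE == 10**12 * GIGABYTE  # "one trillion gigabytes"
print(f"{per_minute_petabytes:,.0f} PB per minute")  # ≈ 951 PB per minute
```

In other words, 500 zettabytes a year works out to nearly an exabyte of data every minute, which is why moving all of it to a central cloud for processing is impractical.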
As more users and things connect and interact with one another, the volume and velocity of content gathered from or delivered to the edge increase rapidly. Greater use of rich media types, such as video, also intensifies this challenge. It takes time to move data to the cloud, perform some service on it and then move it back to the edge.
Distributed deployment models are often better at addressing connectivity and latency challenges and bandwidth constraints, and at exploiting the greater processing power and storage now embedded at the edge of the network. Creating a layer of “edge” computing keeps the heaviest traffic and processing closest to the end-user applications and devices that generate and consume data.
Edge sensors and devices handle very specific roles in very specific locations, such as a retail store, a stadium, or a home or office. In many of these use cases, edge devices have existed for decades. According to a study by IDC, 45% of all data created by IoT devices will be stored, processed, analyzed and acted upon close to or at the edge of a network by 2020.
The lag time between sensing and acting, known as latency, can be one of the most pressing issues for connected devices. Placing computing power at or near the sensors, in effect on the edge of the enterprise, and having devices talk and provide data directly to each other can dramatically reduce latency. This can be critical for some use cases. For instance, in an autonomous vehicle traveling at highway speeds, any delay between sensing an impending collision and initiating a course change (steering, braking) can have catastrophic consequences. Latency is also a problem for industrial safety systems like fire alarms, where seconds lost in data transmission can be critical. For oil and gas companies, near-instant data capture and analysis at well sites can help anticipate signs of a disaster and prevent a catastrophe before it starts.
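The autonomous-vehicle case above can be sketched in a few lines. This is a minimal, hypothetical illustration (the function name and threshold are invented for this example, not drawn from any real vehicle system): the safety-critical decision is made locally, with no network hop in the critical path, and only non-urgent telemetry is forwarded to the cloud.

```python
# Hypothetical sketch: an edge controller acts on a sensor reading
# immediately instead of waiting on a cloud round trip.

BRAKE_DISTANCE_M = 30.0  # illustrative threshold, not a real spec

def edge_decision(obstacle_distance_m: float) -> str:
    """Decide locally; no network call sits in the critical path."""
    if obstacle_distance_m < BRAKE_DISTANCE_M:
        return "brake"   # act immediately at the edge
    return "report"      # non-urgent: forward telemetry to the cloud

# A vehicle at 30 m/s covers 3 meters during a 100 ms cloud round trip;
# the local comparison above costs microseconds.
print(edge_decision(12.5))  # -> brake
print(edge_decision(80.0))  # -> report
```

The point is architectural rather than algorithmic: whatever the real decision logic, keeping it on the device removes transmission delay from the sense-to-act loop.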
Edge computing can also accelerate and simplify data processing and deliver insights where and when you need them. For example, in a brick-and-mortar retail environment, edge devices such as beacons interacting with customer smartphones, combined with analytics of sales data, coupon redemptions, traffic patterns and video, can provide valuable insights into consumer behavior.
In the financial sector, edge computing can help institutions identify and prevent non-compliant transactions in real time.
Despite the potential benefits of robust edge and near-edge capabilities and diverse edge systems, centralized cloud services are probably not going away. Cloud technology is still an excellent way to provide, manage and update software and services on edge and near-edge devices. Centralized cloud services also have a role in coordinating operations across highly distributed edge devices, and in aggregating and archiving data from the edge or from intermediate gateways and servers. They can also provide significantly more robust and scalable machine learning and sophisticated processing capabilities that link to traditional back-office processing.
With more processing and storage readily available in ever smaller implementations, sophisticated manipulation, storage and communication of data become possible on users’ phones, laptops or tablets, as well as on devices in remote or space-limited locations. Operations that were once the realm of dedicated data centers are now reduced to apps. The opportunity is for an edge layer to provide a richer stream of data to the cloud, while also offering real-time responsiveness and local authority to a network of connected devices.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.