It’s one thing to recognize that workloads are becoming more distributed and ushering in new opportunities. It’s quite another to understand how distributed workloads can be effectively monetized at the edge.
One example of edge monetization is a major medical equipment provider that has deployed thousands of edge locations within hospitals. Medical information, along with machine sensor data, is anonymized and transmitted to the cloud, where information across all hospital deployments is aggregated and analyzed to improve diagnostics, equipment performance and uptime. The provider's IoT deployment is part of a larger architecture that includes containers, machine learning and cloud processing.
Another example is a leading automobile manufacturer that is treating each autonomous driving test vehicle as a digital mobile edge that generates terabytes of data per vehicle per day. Each vehicle is part of a larger system that is constantly learning, gaining intelligence across all events and pushing the collective intelligence back out to the edge. A similar pattern appears at leading energy companies that have deployed real-time drilling applications that continuously adjust to pursue the optimal drill path while monitoring to prevent breakage and downtime.
The key to these applications, and to monetizing the edge in general, is understanding how to coordinate each edge location as part of a larger whole. That requires collecting data from every edge to see the global picture. Descriptive analytics that merely report on historical events at the edge are not enough. Predictive analytics should build on that history to anticipate future events, such as equipment failures. Most significant, though, are prescriptive analytics that inject intelligence at the edge to respond in real time. These are the foundation of game-changing edge applications such as autonomous driving and real-time drilling.
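To make the descriptive/predictive/prescriptive distinction concrete, here is a minimal sketch in Python. The sensor readings, thresholds and function names are all illustrative assumptions, not part of any real edge platform; the point is only that prescriptive logic acts locally, without a round trip to the cloud.

```python
# Toy illustration of the three analytics tiers for an edge device
# monitoring machine vibration. All values and names are hypothetical.

from statistics import mean

def descriptive(readings):
    """Report what happened: summarize historical vibration data."""
    return {"mean": mean(readings), "max": max(readings)}

def predictive(readings, failure_threshold=0.8):
    """Estimate failure risk from the recent trend (toy heuristic)."""
    trend = readings[-1] - readings[0]
    risk = min(1.0, max(0.0, readings[-1] + trend))
    return risk >= failure_threshold

def prescriptive(readings):
    """Act at the edge in real time: throttle the machine before it fails."""
    if predictive(readings):
        return "throttle"   # local action, no round trip to the cloud
    return "continue"

print(prescriptive([0.2, 0.4, 0.7]))  # rising vibration -> "throttle"
print(prescriptive([0.1, 0.1, 0.1]))  # steady and low -> "continue"
```

In a real deployment, the predictive model would be trained centrally across all edges and the resulting logic pushed back out, as the examples above describe.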
Monetizing the edge depends on a persistent data fabric. A data fabric encompasses many different data types: files, tables, streams, videos and so forth. A fabric can also parse event streams that act as digital threads for advanced AI. New models can replay these streams or threads, making it easy to compare new models with existing ones, shorten burn-in time and increase accuracy. This is the layer that collects and processes interesting event data so the system can learn globally and act locally.
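The replay idea can be sketched as follows. This is a hypothetical stand-in, not a real data-fabric API: the recorded stream, the two threshold models and the scoring function are all invented for illustration, to show how a stored event stream lets a candidate model be evaluated against the incumbent on identical history.

```python
# Replaying a recorded event stream (a "digital thread") to compare a
# new model against an existing one. All data and models are illustrative.

recorded_stream = [
    {"sensor": "temp", "value": 71, "failed": False},
    {"sensor": "temp", "value": 96, "failed": True},
    {"sensor": "temp", "value": 88, "failed": False},
]

def old_model(event):
    return event["value"] > 95      # incumbent: predicts failure above 95

def new_model(event):
    return event["value"] > 85      # candidate with a lower threshold

def replay_accuracy(model, stream):
    """Replay every recorded event through a model and score its predictions."""
    hits = sum(model(e) == e["failed"] for e in stream)
    return hits / len(stream)

print(replay_accuracy(old_model, recorded_stream))  # 1.0
print(replay_accuracy(new_model, recorded_stream))  # ~0.67
```

Because both models see exactly the same replayed history, the comparison is apples to apples, which is what shortens burn-in time for new models.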
So, as you look at IoT at the edge, keep in mind that it's not just a one-way path of sensor data collected centrally. A larger system of data flows to, from and across edge devices is required. And to fully monetize the edge, you will eventually need to embrace AI, cloud and container technologies.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.