Go back 20 years: You visit a remote mining site a good distance from any major city. It’s humming with human activity. Operators work within a closed control network, manually monitoring and adjusting set points to keep things running smoothly. Physical dangers abound.
You return to the remote mining site today and it has few, if any, human workers. Autonomous trucks and other assets move about, directed by operators from a remote operating center hundreds of miles away. In some cases, machine learning or AI suggests the optimal operating conditions that direct these autonomous operations. In fact, cement manufacturer Cemex unveiled at a recent conference its success with a pilot for AI-controlled operation of its clinker coolers. AI analyzes data, forecasts optimal operating conditions and then adjusts critical set points. (Disclaimer: The conference, PI World, is hosted by OSIsoft, where I am currently the CTO.)
For both AI-directed autonomous and operator-controlled remote operations, trust rests on three criteria:
- That the entity giving the directions, whether human or algorithm, will get the operating parameters correct;
- That those parameters can be communicated to assets at the edge of operations; and
- That those assets can adjust behavior based on that communication.
Autonomous and remote operations require edge-to-edge communication, as well as cloud-to-edge communication. Unlike the cloud, which has consolidated around a handful of vendors, including Microsoft Azure, AWS and Google Cloud, the edge is specialized and more diverse by nature, encompassing everything from IoT gateways to large mining trucks to local transmission lines. As a result, the edge is a fragmented and distributed landscape. The variety of edge devices and the jobs they perform make it unlikely that a single vendor or handful of vendors will gain the critical mass to win the market.
Edge devices and assets will continue to come from different vendors, complicating coordination of communication efforts. For example, how does a truck from one vendor talk to other assets that need to know it is coming down the road at 37 kilometers per hour, while continually optimizing its overall cost of operation? How can the cloud tell different manufacturing assets that while they can stamp out 10,000 widgets per hour, the optimal rate for the whole process and supply chain is 7,200?
One answer is the cloud. A centralized IoT platform can coordinate edge devices with the cloud as an intermediary. But latency, security, regulatory compliance and the need for self-sufficient, local operations make cloud-based coordination suboptimal. That mining truck traveling 37 kilometers per hour in a harsh environment simply cannot rely on a server hundreds or thousands of miles away to tell it to adjust course for road conditions. It needs to make those physical adjustments locally.
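The local-first principle above can be sketched in code: an edge controller makes safety-critical decisions from local sensor data with no network round trip, and treats remote set points as advisory, usable only while fresh. This is a minimal illustrative sketch; the class, field names and thresholds are my own assumptions, not any real vendor API.

```python
class EdgeController:
    """Hypothetical edge asset (e.g., a haul truck) that adjusts speed locally.

    Remote set points from the cloud are advisory only; local conditions
    always take precedence, so a lost or slow link never blocks a decision.
    """

    def __init__(self, max_speed_kph=37.0):
        self.max_speed_kph = max_speed_kph
        self.remote_set_point = None  # (speed_kph, timestamp) last pushed from the cloud

    def on_remote_set_point(self, speed_kph, timestamp):
        # Accept cloud guidance, recording its arrival time so stale data can be ignored.
        self.remote_set_point = (speed_kph, timestamp)

    def decide_speed(self, road_friction, now):
        # Safety-critical decision made locally, with no network round trip.
        if road_friction < 0.3:  # slippery road: slow down immediately
            return min(10.0, self.max_speed_kph)
        # Honor the remote set point only if it is fresh (here, under 30 seconds old).
        if self.remote_set_point is not None:
            speed, ts = self.remote_set_point
            if now - ts < 30.0:
                return min(speed, self.max_speed_kph)
        # No fresh guidance: fall back to the locally configured limit.
        return self.max_speed_kph
```

The design choice being illustrated is the priority order: local safety checks first, recent remote guidance second, a local default last, so the asset keeps operating even when the cloud is unreachable.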
This suggests the need for a common framework and language, so that edge devices and assets can talk directly to one another, no matter their maker or origin. The internet provides a useful example: Decentralized and distributed by nature, the internet came of age only when TCP/IP emerged as the dominant protocol, allowing seamless data transfer. Similarly, the edge needs standards that allow devices from different vendors to “plug and play” so information can flow freely across operations.
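To make the "plug and play" idea concrete, consider what a shared message format might look like: a self-describing status record with explicit units and timestamps that any peer can parse without vendor-specific code. The schema and field names below are purely illustrative assumptions, not taken from any published standard.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AssetStatus:
    """Hypothetical vendor-neutral status message for edge assets."""
    asset_id: str      # unique identifier, regardless of manufacturer
    asset_type: str    # e.g., "haul_truck", "stamping_press"
    measurement: str   # what is being reported, e.g., "speed"
    value: float
    unit: str          # explicit units avoid cross-vendor ambiguity
    timestamp: float   # seconds since epoch

    def to_wire(self):
        # Serialize to a common wire format (JSON here) any peer can read.
        return json.dumps(asdict(self))

    @staticmethod
    def from_wire(payload):
        return AssetStatus(**json.loads(payload))

# A truck from vendor A publishes its status...
msg = AssetStatus("truck-17", "haul_truck", "speed", 37.0,
                  "km/h", 1700000000.0).to_wire()
# ...and a roadside asset from vendor B decodes it with no vendor-specific code.
decoded = AssetStatus.from_wire(msg)
```

As with TCP/IP, the value is not in any one field but in the agreement itself: once every device emits and consumes the same structure, coordination logic can be written once rather than per vendor.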
Today, some of the most promising developments on the edge aren’t coming from individual vendors, but from consortiums that are slowly converging on the necessary standards. The OpenFog Consortium merged with the Industrial Internet Consortium, which has a joint agreement with Plattform Industrie 4.0. In that capacity, these groups are aligning to accelerate and drive change by laying the common ground of language, architecture and shared concerns. (Full disclosure: I am the CTO for OSIsoft, which is one of 60 founding members of LF Edge and a member of OpenFog and the IIC.)
Add to this groups like the Linux Foundation, which recently aligned a number of efforts through LF Edge. Using the mindset and dynamics of open source, various interests are working together in the belief that finding common ground through an open source framework is needed to accelerate time to value for edge systems.
While standards may not be the most exciting of technological topics, they are critical to realizing the full value of industrial IoT and edge computing, and to establishing the hardware and software infrastructure for the development of applications and analytics. According to McKinsey, “Interoperability between IoT systems is critically important to capturing maximum value; on average, interoperability is required for 40% of potential value across IoT applications and by nearly 60% in some settings.”
Interoperability will only be achieved when we reach a critical mass of participation across the IIoT and edge landscape. Unlike the critical mass phenomenon that drives the “network” effect, where the first mover has the opportunity to dominate a segment, IIoT and edge are governed by the “internet” effect, which depends on the participation of anyone or anything that cooperates, yet stays decentralized, fluid and open.
The work of consortiums and foundations to bring together vendors, academia and consumers to establish shared semantics for communication and an ontology for representing what devices do and how they interact will speed the adoption of IoT and edge technologies. Today, we have lights-out factories operating with semi-autonomy, but these are just a micro version of what is possible. With a more complete framework and seamless interoperability, the edge and IoT have the potential to reach the critical mass of participation needed to continue revolutionizing manufacturing, entire supply chains and whole industries.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.