This is the first part in a collection of articles covering IoT and AI.
There is a Buddhist parable about problems: we all have 83 of them, and as soon as we solve one, another appears to take its place. I don’t know about you, but I am fairly certain the same applies to the Pareto principle.
Most of us are familiar with the Pareto principle, or the 80/20 rule: 20% of your effort can yield 80% of the desired value, while the remaining 80% of your effort yields only 20% of the value. Whether in sales, relationships, learning or the arts, this rule tells us that focusing our efforts on the right 20% will drive the most impactful outcomes.
But in the world of software where inputs are more interdependent and the 80% is a precursor to reaching the 20%, we must work to automate and be more efficient so that these precursors demand less time, investment and energy. This is one of the promises that IoT and AI can deliver.
Many problems that the 80% represents can be streamlined, automated and optimized through AI. But first, you must have the right data and context.
AI in manufacturing
Applying AI to manufacturing is often cited as a classic 80/20 dilemma: gathering, organizing and preparing data must come before the high-value learning and modeling that lets IoT applications increase efficiency. Strategic investment and effort can reduce this burden not only now, but also for all future data science programs that use similar data sources.
Deep learning neural networks have been shown to reduce the need for feature engineering, one of the most time-consuming contributors to the 80%. The key ingredient here is ample data. Manufacturing can learn from consumer spaces, where big data collected from trillions of financial and consumption transactions has been used to drive customer and product insights, producing enormous value.
Manufacturer investment in sensor technology, data collection, IoT and other monitoring opportunities is crucial to collecting more relevant data that will lead to a big value add. Many manufacturers have copious amounts of dark data that is not properly collected or managed. Targeting this low-hanging fruit, along with strategic IoT applications, can free the time needed for impactful learning in AI.
Understanding the context of data is vital
It is critical not only to gather this data, but to do so with a thoughtful plan that takes context into account. In operations, much of the data is first and foremost used for process control, real-time operator insight and forensic analysis. On the maintenance side of manufacturing, the data serves to schedule people, equipment and services as efficiently as possible by considering production schedules and customer commitments that typically reside in a separate data source.
Even data gathered from IoT initiatives has a core function that should not be compromised in the pursuit of data collection for AI. For example, consider vibration analysis, a specific data set with targeted advanced analytics that creates metadata and failure predictions about the equipment being monitored. AI-driven data collection initiatives must be designed in a way that doesn’t muddy the waters.
OSIsoft recently learned this lesson the hard way when an undisciplined data acquisition from its building management system (BMS) caused the entire system to crash. Rather than becoming more efficient, the building ground to a halt in the middle of a busy workday. Like OSIsoft’s BMS, almost every source of data that can feed an AI initiative has an original or core purpose that must not be jeopardized.
For example, a SCADA system has a primary function that requires real-time responsiveness for guaranteed control actions, and it requires specific, timely data to make those decisions. This critical functionality competes for the very resources a data management system needs to gather and manage the underlying data streams and control decisions.
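One common way to resolve this kind of contention is to make the AI-facing collector strictly rate-limited and non-blocking, so it always yields to the source system's primary function. Below is a minimal sketch in Python of that pattern; the class, the read callback and the rate limits are all hypothetical illustrations, not a real SCADA or historian API:

```python
import time
from collections import deque


class ThrottledCollector:
    """Hypothetical sketch: samples a data source for AI/analytics use
    while capping the read rate so the source's primary function (e.g.,
    real-time control) is never starved. Names are illustrative only."""

    def __init__(self, read_fn, max_reads_per_sec=10, buffer_size=1000):
        self.read_fn = read_fn                    # callable returning one sample
        self.min_interval = 1.0 / max_reads_per_sec
        # Bounded buffer: if analytics falls behind, old samples are
        # dropped rather than letting memory grow and blocking the source.
        self.buffer = deque(maxlen=buffer_size)
        self._last_read = float("-inf")

    def poll(self):
        """Read one sample only if the rate budget allows; otherwise
        return None immediately, yielding to the control system."""
        now = time.monotonic()
        if now - self._last_read < self.min_interval:
            return None
        self._last_read = now
        sample = self.read_fn()
        self.buffer.append(sample)
        return sample
```

The design choice is that collection is best-effort: the collector skips reads and drops old samples rather than ever back-pressuring the system it observes, which is exactly the failure mode the BMS example above illustrates.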
Ultimately, the challenge of gathering data is not just in finding it, but also in learning how to effectively collect data from various systems without putting their core functions at risk.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.