The year was 1990. The Berlin Wall had just crumbled. The Human Genome Project had formally begun. The first commercial internet service providers, PSINet and EUnet, were starting to sell dial-up and trunk access to the nascent internet in America and Europe, seriously disrupting the BBS and CompuServe communities of the day, which provided connectivity and file sharing for their members.
And I had just completed a degree in chemical engineering at Berkeley and joined a small process engineering firm turned software company as a software developer, building operational data management technology for industrial plant operations data on Digital Equipment Corporation VMS systems.
Today, the concept of data access through your personal computer may seem obvious, but at that time PCs weren't typically "connected." We have since moved beyond connected desktops to mobile devices that can give us real-time directions, track our daily steps, monitor our sleep and suggest entertainment tailored to our taste. But in 1990, it was far less obvious that personal computing and information access would become ubiquitous; the inertia still favored centralized systems. Although I had seen young engineering students using PCs in their college labs (the liberal arts kids used the cool new Macs), not everyone agreed that PCs would displace mainframes for data access. I started a side project to build an operational data integration for PCs and Unix, despite internal naysaying that PCs were toys and that no customer would pay for the tool. Two years later, at our 1992 conference, I gave an ad hoc demo of the PC access tool. It was the runaway hit of the conference and quickly became part of our core offering. I had bet right.
No one can predict the future with 100% accuracy, and not even the most powerful AI will change that. However, after three decades of trying to read the tea leaves of technological trends to bring successful products and services to market, I have one paramount rule: The best predictions come from observation.
OK, Aristotle may have said this first, but it’s as true now as it was in the 4th century B.C. When we observe our world closely and objectively, the magic happens. In fact, the tremendous advances we’ve seen in AI in the past 20 years have come in part because software developers have shifted from a Boolean, rule-based view of AI to one based on Bayesian probability.
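To make that shift concrete, here is a toy sketch of the two mindsets applied to the same question (is a piece of equipment faulty, given a temperature reading?). Every number in it, including the prior and the likelihoods, is an illustrative assumption, not real plant data.

```python
def rule_based_fault(temp_c: float) -> bool:
    """Boolean, rule-based view: a hard threshold yields a yes/no answer."""
    return temp_c > 90.0


def bayesian_fault_probability(temp_c: float) -> float:
    """Bayesian view: combine a prior belief with likelihoods into a probability.

    Assumed toy model: faults are rare (2% prior), and a high temperature
    is much more likely under a fault than under normal operation.
    """
    p_fault = 0.02                                       # prior P(fault)
    # Likelihood of this observation under each hypothesis (assumed values)
    p_temp_given_fault = 0.7 if temp_c > 90.0 else 0.3   # P(temp | fault)
    p_temp_given_ok = 0.1 if temp_c > 90.0 else 0.9      # P(temp | ok)
    # Bayes' rule: P(fault | temp) = P(temp | fault) P(fault) / P(temp)
    numerator = p_temp_given_fault * p_fault
    evidence = numerator + p_temp_given_ok * (1 - p_fault)
    return numerator / evidence


print(rule_based_fault(95.0))                        # True
print(round(bayesian_fault_probability(95.0), 3))    # 0.125
```

The rule-based version can only say yes or no; the Bayesian version says how strongly to believe it, and a rare-fault prior correctly keeps a single hot reading from being treated as proof.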
Today, as CTO, my colleagues often want me to talk about digital transformation and the big technology trends impacting our world — internet of things, cloud computing, machine learning and artificial intelligence, the fourth industrial revolution, blockchain, and so on. But it is easy to have a conversation about these topics and talk past each other, missing the key observations that can tell us how these trends can and will mature and shape the world.
When it comes to the explosion of sensor data, IT and operations are often the ones talking past each other. Let's take machine data, for example. A chorus of analysts has estimated it will unleash trillions in economic value. But how does this value actually get realized? How does the insight get translated into the physical world through changed equipment, changed materials, changed processes and changed thinking? Much of it doesn't require machine learning or AI algorithms; sometimes it is as simple as an engineer looking at one or two data streams.
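As a hypothetical illustration of that "one or two data streams" point: comparing an inlet flow meter against an outlet flow meter can reveal a leak with no modeling at all. The meter readings and the 5% tolerance below are made-up assumptions for the sketch.

```python
def leak_suspected(inlet_flow: list, outlet_flow: list,
                   tolerance: float = 0.05) -> bool:
    """Flag a possible leak if average outlet flow lags average inlet flow
    by more than the given relative tolerance (assumed at 5% here)."""
    avg_in = sum(inlet_flow) / len(inlet_flow)
    avg_out = sum(outlet_flow) / len(outlet_flow)
    return (avg_in - avg_out) / avg_in > tolerance


inlet = [100.2, 99.8, 100.1, 100.0]    # illustrative inlet meter readings, m3/h
outlet = [93.5, 93.9, 93.7, 93.6]      # illustrative outlet meter readings, m3/h
print(leak_suspected(inlet, outlet))   # True: roughly 6% of flow unaccounted for
```

No data lake, no model training: two streams, a subtraction and a threshold, and an engineer who knows the process can decide what to do next.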
IT teams, however, don’t live on the plant floor — they know the physics of digital computation, communications and storage, not typically the physics and chemistry and mechanics of their operations. They live on the cutting edge of digital technology, and in my experience, they often want to send machine and operations data to the tools they know — the data warehouses, data lakes and relational databases where data from other parts of the business is handled.
Plant managers, maintenance crews and operators, however, know that operational data has to be handled in real time to optimize operations. After all, they use this data every day to make decisions about how to operate assets and production lines worth millions of dollars.
On the flip side, plant floor operations are generally slower than their IT counterparts to see the value of big data, data science and artificial intelligence. Most operations personnel are engineers trained on first principles and deterministic models. They care about seeing data locally to model and run their plant. As a result, they have sometimes failed to fully observe how and where the shift from human-derived to computer-derived insight is occurring.
How can we get past the buzzwords and the hype and start to talk about what impact technology will have on the world? The past is prelude. When we observe how the world has changed, we start to see how it will change.
For all the talk in Silicon Valley about disruption, most technology matures and comes of age over decades, not months. The lightbulb was invented in 1879. However, it was another 40 years before the first integrated electricity system (aka the electric grid) came to be.
The sensor data driving IIoT and Industrie 4.0 has a lot in common with the lightbulb. It will transform how our world operates, and 50 years from now, data infrastructure will likely be just as critical as our power grids. However, initially, value realization of IIoT won’t happen in the cloud, but on the plant floor, where a few key data points in the right context and the right hands can drive thousands, or even millions, in value.
As our capabilities to collect and manage sensor data mature, the value will continue to migrate to the cloud, where IT will see AI, machine learning and big data truly come of age. But looking ahead, I see the greatest potential in learning to use available and changing digital resources (compute, storage and communications) dynamically, adaptively and optimally: moving data to where it is used, and computation to where the data lives, across a comprehensive edge-to-cloud topology of intelligent digital infrastructure.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.