Digital Twin Consortium CTO Dan Isaacs explains the organization's work and assesses the progress made in digital twin technology, standards and best practices.
Digital twins are a potentially important technology that, like IoT, could not exist without reliable connections. In the case of digital twins, the connection is between virtual copies of real-world entities or processes -- i.e., digital twins -- and the things they represent.
Interoperability, that age-old bugaboo of computing, is therefore essential. Facilitating it is one of the primary roles of the Digital Twin Consortium, a program of the Object Management Group (OMG), the Boston-based standards development organization.
The consortium's charge is much broader, however: to further the digital twin cause by bringing vendors, users, governments and academia together to define use cases, best practices and other practicalities. The consortium doesn't try to develop standards on its own, instead seeking to influence them by working to specify requirements for digital twins, which it then submits to standards bodies such as the OMG and ISO.
In this podcast, I talked to Dan Isaacs, CTO of the Digital Twin Consortium, to learn more about the organization's work, its successes and challenges, and where digital twin technology stands today.
Uniting IT and OT
Interoperability often comes down to having a common language for the information transmitted across digital pipelines rather than the mechanics of the pipelines themselves.
"The lifeblood of a digital twin is data," Isaacs said. "You need to be able to understand what the data is telling you to gain actionable insights."
This emphasis on data in turn raises general, or horizontal, issues that cut across the specialized needs of industry verticals. The consortium has working groups dedicated to industries that have been early adopters of digital twins, such as manufacturing, healthcare, aerospace and agriculture. Other working groups focus on the horizontal issues that affect all the verticals, especially security, data storage and the analytics and machine learning for interpreting data, according to Isaacs.
"We're looking across the lifecycle of the digital twin at each of these different areas, and within the horizontal working groups we look at it effectively in terms of creating a blueprint," he said. The blueprints become reference architectures that capture the different industry domains and their areas of commonality -- in other words, the intersection of information technology and operational technology, he said.
The goal is to create a unified view of the foundational elements of digital twins.
Reaching consensus on a definition
In the 20 years since Michael Grieves, a product lifecycle management expert, coined the term, digital twins have suffered from a certain fuzziness in how they're defined. Things only got worse in the past two years as digital twins achieved buzzword status and some vendors applied the term to digitization that lacks the differentiating characteristics of digital twins.
Isaacs discussed the consortium's own considerable efforts to reach consensus on a definition, which took several months. "Our steering committee said, 'Does this really pass the sniff test? Can you do like a Turing Test on it?' We found some areas that we needed to update, [so] we went back through. It was a very iterative process, but it was very collaborative in nature." He said the definition is part of the consortium's open source process and there is a mechanism to accommodate changes.
The current wording reads more like a political platform than the concise product of dictionary writers. It starts with the basics, calling a digital twin "a virtual representation of real-world entities and processes, synchronized at a specified frequency and fidelity." Then it veers into a combination of salesmanship and idealism, referring to business transformation through "accelerating holistic understanding, optimal decision-making and effective action."
Digital twins, according to this definition, are also "motivated by outcomes, tailored to use cases, powered by integration, built on data, guided by domain knowledge and implemented in IT/OT systems."
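The definition's core phrase -- a virtual representation "synchronized at a specified frequency and fidelity" -- can be made concrete with a small sketch. The class and field names below are purely illustrative assumptions, not part of any consortium specification; they show one way a twin might mirror only the attributes its use case calls for, at a chosen sync interval.

```python
class DigitalTwin:
    """Illustrative sketch only: a virtual copy kept in sync with a
    real-world entity at a specified frequency and fidelity.
    All names here are hypothetical, not from the consortium's spec."""

    def __init__(self, sync_interval_s, fidelity_fields):
        self.sync_interval_s = sync_interval_s  # "specified frequency"
        self.fidelity_fields = fidelity_fields  # "specified fidelity": which attributes to mirror
        self.state = {}

    def sync(self, sensor_reading):
        # Mirror only the fields the use case calls for
        # ("tailored to use cases, built on data")
        self.state = {k: v for k, v in sensor_reading.items()
                      if k in self.fidelity_fields}
        return self.state


# Hypothetical usage: mirror temperature and vibration, drop everything else
twin = DigitalTwin(sync_interval_s=5,
                   fidelity_fields={"temp_c", "vibration_hz"})
reading = {"temp_c": 71.5, "vibration_hz": 12.0, "debug_blob": "..."}
twin.sync(reading)
```

In a real deployment the sync step would be driven by the IT/OT data pipeline the consortium's blueprints describe, rather than a manual call.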
Other topics discussed include the following:
- the role of digital twins in the metaverse;
- the consortium's recent white paper on reality capture technology; and
- a capabilities framework shown in the form of a periodic table to facilitate the design of digital twins.
To hear the podcast, click on the link above.