PLM platforms evolve for a role in IoT

Integration with IoT systems can extend the possibilities for PLM platforms, but according to experts, the platforms must first evolve technically to handle IoT data and processes.

While IoT promises a stage for PLM's second act, existing PLM platforms need to evolve to address the challenges of supporting smart, connected products throughout their entire lifecycle.

This evolution will need to address critical areas, such as big data management, visualization and analytics, and integration capabilities, according to experts.

Originally positioned more than 20 years ago as a central repository for all product-related data and tasks, product lifecycle management (PLM) platforms have struggled to gain traction beyond engineering. They have served primarily as systems for managing 3D files and related documents and for facilitating domain-specific workflows, such as engineering change orders, design collaboration and bill of materials management.

The combination of IoT and sensor-equipped products promises to deliver real-time data streams that PLM platforms can mine for insights into the performance of a product, such as a car or consumer appliance, long after it's been manufactured. Historically, it's been this lack of intelligence about how a product is used or how it performs in the field that has stymied more widespread adoption of PLM platforms, limiting their utility outside of engineering.

"The roots of PLM are still based in engineering -- the owners in industrial organizations are more often than not the engineering groups, and that's where most of the funding comes from," according to Peter Bilello, president of CIMdata, a PLM consultancy. "In a way, a lot of companies remain stuck with a legacy of how to structure the business and ownership of PLM initiatives."

According to a recent CIMdata survey on PLM, usage of the platforms is still heavily weighted toward "traditional product data management" aspects of PLM, including engineering release/change management (83%), engineering data management (82%), configuration management (70%), product engineering process management (59%) and global engineering collaboration (55%). The survey found that far fewer respondents reported using PLM platforms to support new digital processes, such as digital twins (15%), analytics and big data (11%), and IoT (5%).

Interdisciplinary approach needed

To make that next leap, existing PLM platforms are being modernized with new capabilities, while vendors in this category augment their portfolios with complementary products, such as IoT connectivity platforms, big data analytics and visualization tools, and advanced simulation capabilities.

One key change for many PLM offerings is the addition of cross-domain functionality for managing the electrical and software aspects of a product's lifecycle, which are increasingly as important as, if not more important than, the mechanical aspects, according to Steve Chalgren, business strategy and planning executive with Altium, a vendor of printed circuit board design software.

"Instead of being just mechanical PLM or electrical PLM, now, we're all working toward dealing with multiple disciplines within one PLM platform," he said, underscoring the importance of an interdisciplinary approach in order to properly manage IoT-enabled products throughout the entire lifecycle. In a similar vein, PLM platforms should also support new structures as part of a model-based systems engineering framework to enable nonengineering-related tasks, like cost optimization and project management, according to experts.

While PLM platforms don't necessarily need to become the repository for the streams of data collected from IoT products, they do need to be able to sync up with the big data lakes and cloud platforms that will ultimately store and manage that data. Similarly, PLM platforms don't need built-in big data analytics capabilities, but they will need to integrate with analytics and visualization tools, Chalgren said.

"PLM doesn't need analytics -- it just needs a way to expose that data set in the platform," he explained.

More robust integration capabilities -- specifically, support for open standards -- are also important for PLM to be viable in the emerging world of IoT products. Rather than a monolithic platform that stores everything, the new vision of PLM will be to exchange product data and facilitate closed-loop workflows with other core systems, from ERP to field maintenance software.

Thermo Fisher Scientific has just started down the integration path, syncing custom return authorization data for its high-end cameras used in radiation applications into its Qube ERP platform through Omnify's PLM platform, according to Simon Gu, senior systems analyst on the project.

The PLM-ERP integration was critical because the ERP package had limited capabilities for handling defect data, unlike the PLM package, whose highly scalable architecture could track that data adequately. In the next phase of the project, Gu's team will build an integration from Omnify PLM back to the ERP platform to record replaced parts for inventory control purposes. Further down the line, Gu could see collecting more data from the cameras' usage in the field and feeding that data back into PLM to help guide future development.
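As a rough illustration of how that next phase could work, the sketch below reads replaced-part records from a PLM return authorization and posts matching inventory transactions to an ERP system. The endpoints, field names and response shapes are assumptions made for illustration, not Omnify's or Qube's actual interfaces.

```python
# Illustrative sketch of the closed-loop pattern only: the endpoints and field
# names are assumptions and do not represent Omnify's or Qube's actual APIs.
import requests

PLM_URL = "https://plm.example.com/api/v1"   # hypothetical PLM endpoint
ERP_URL = "https://erp.example.com/api/v1"   # hypothetical ERP endpoint

def sync_replaced_parts(rma_id: str) -> None:
    """Read the parts replaced under a return authorization from the PLM system
    and post matching inventory transactions to the ERP system."""
    rma = requests.get(f"{PLM_URL}/rmas/{rma_id}", timeout=30)
    rma.raise_for_status()

    for part in rma.json()["replaced_parts"]:   # assumed response shape
        txn = {
            "part_number": part["part_number"],
            "quantity": part["quantity"],
            "reason": "RMA replacement",
            "rma_reference": rma_id,
        }
        resp = requests.post(f"{ERP_URL}/inventory/transactions", json=txn, timeout=30)
        resp.raise_for_status()  # fail loudly so PLM and ERP records never silently diverge
```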

"I can see a point that we'd like to collect as much information from the field as possible to see how the product is used and what the source of defects are," he said.
