When you are buying a luxury car, you want it to come fully loaded with amenities such as navigation, in-vehicle infotainment, intelligent park assist and collision avoidance. These bells and whistles make for a comfortable customer experience. But for the manufacturers and OEMs making these software-driven vehicles with upwards of 100 million lines of code, product development is a challenge.
And it’s not just automobiles that are morphing into complex electronic systems. Many of the smart things saturating our everyday lives as part of the Internet of Things (IoT) are engineered with mechanical, electrical, software and firmware components, all of which amount to connected chaos when it comes to managing configurations, revisions and quality control as part of the product lifecycle management (PLM) system.
Here’s why: PLM traditionally has been a way for manufacturers to bring new products to market faster by reducing development and production time, cutting costs and maintaining product quality. And it all starts with the computer-aided design (CAD) tools to create mechanical diagrams of materials, processes, tolerances and dimensions using 2D vector-based drafting systems and 3D solid and surface models.
In the past, PLM was very much rooted in design and providing a way to manage the lifecycle of a product from inception through engineering and manufacturing and, ultimately, end-of-life product disposal. It was based on a collaborative framework for organizing, connecting and tracking product documents, including CAD and computer-aided manufacturing (CAM) files. More recently, there’s been a push to close the loop between engineering, production and enterprise software—like manufacturing execution systems (MES) and enterprise resource planning (ERP)—to create synchronicity and visibility across design, manufacturing, orders and inventory.
Now the industry has entered a new stage in which suppliers and customers are also an important part of the product lifecycle. Suppliers, in some ways, are becoming design partners and customers are leveraging social networks to provide feedback on product functionality. In addition, the “connected products” that make up the IoT are creating a convergence of electronics and software, so there is a need for more integrated simulation models.
Collectively, these factors are feeding into the next generation of PLM, which industry experts are calling the product innovation platform. It ties the traditional engineering workgroup to the upstream front end of innovation, including understanding customer needs, and to downstream manufacturing processes to head off quality issues. This unified, cross-functional architecture also factors in which suppliers to work with and what materials to use.
“It is no longer PLM and CAD tools for a few guys in an engineering workgroup,” says Jeff Hojlo, program director of product innovation strategies at analyst firm IDC, noting that the existing setup focuses exclusively on R&D. “PLM [now] encompasses suppliers and non-engineers on the manufacturing side so that changes can be made quickly, quality issues can be serviced quickly, and adjustments can be made [based on] product demand.”
As a result, the PLM providers have been making some unconventional acquisitions that bring an array of expertise together into a unified lifecycle management platform for the digital enterprise.
A few examples include:
- Autodesk, which made a handful of acquisitions that enhance its AutoCAD design and Inventor modeling/simulation products. In the past few years, the company has bought: Delcam, a provider of CAM software; Netfabb, a developer of industrial additive design software; CadSoft, maker of the Eagle printed circuit board design software; and Magestic Systems, a maker of manufacturing software for CNC cutting applications.
- Dassault Systèmes added manufacturing operations management to its 3D design portfolio with the acquisition of Apriso in 2013, and more recently bought Computer Simulation Technology (CST), a maker of electromagnetic and electronics simulation. It also acquired Next Limit Dynamics, a developer of simulation for highly dynamic fluid flow; and Quintiq, a provider of on-premise and on-cloud supply chain and operations planning software.
- PTC, known for its Creo CAD software and Windchill PLM software, bought: Kepware Technologies, a maker of manufacturing connectivity tools; ThingWorx, which offers an ecosystem of IoT development tools; Axeda, another IoT company; ColdLight, which has machine learning and predictive analytics; Vuforia, which adds augmented reality into the mix; and Servigistics, a service parts management tool.
- Siemens PLM brought UGS into the fold way back in 2007 as it outlined its vision to blend virtual and real worlds. In 2012, the company acquired LMS International for its test and mechatronic simulation software. Last year, the company picked up CD-adapco, a simulation company covering a range of engineering disciplines, including fluid dynamics, solid mechanics, heat transfer and electrochemistry. Siemens also added Mentor Graphics to its portfolio for its design automation software, including for automotive integrated circuits and system-on-chip devices.
Though it might seem that PLM providers are on a random buying spree, they are all on the same path: away from islands of disparate data and toward shared visibility and collaboration among engineering, manufacturing, the supply chain and the customer.
“The future opportunity is to use the information around connected products [as it relates to] usage and performance in order to make better products,” Hojlo says.
The digital world changes everything. Think about cars. Once all mechanical, automobiles more recently have begun to look like rolling computers. Now the automotive industry must evolve with its products.
“The automotive industry is becoming the mobility industry,” says Tom Maurer, senior director of strategy at Siemens PLM. “It’s not about selling automobiles, but about mobility in transportation.”
Such shifts also open the door for selling products as a service. Maurer points to Konecranes, a maker of heavy-lifting equipment. The company is not just selling cranes anymore, but rather selling lifting as a service. To do that, Konecranes worked with Siemens to equip its cranes with sensors that report operating data back to the manufacturer, which applies analytics to determine when each crane needs service. This new business model allows the customer to buy an operational service rather than capital equipment.
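The crane example works because analytics sit on top of a steady stream of sensor reports. A minimal sketch of that idea follows; the class name, sensor fields and thresholds are illustrative assumptions, not Konecranes' or Siemens' actual system.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class CraneMonitor:
    """Hypothetical service-monitoring record for one crane in the field."""
    crane_id: str
    load_limit_kg: float
    readings: list = field(default_factory=list)  # (motor temp degC, load kg)

    def record(self, motor_temp_c: float, load_kg: float) -> None:
        """Ingest one sensor report sent back by the crane."""
        self.readings.append((motor_temp_c, load_kg))

    def needs_service(self, temp_threshold_c: float = 80.0) -> bool:
        """Flag the crane for service if the motor runs hot on average
        or any lift exceeded the rated load."""
        if not self.readings:
            return False
        avg_temp = mean(t for t, _ in self.readings)
        overload = any(load > self.load_limit_kg for _, load in self.readings)
        return avg_temp > temp_threshold_c or overload
```

In a real deployment the threshold logic would be replaced by proper analytics models, but the shape is the same: telemetry in, service decision out.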
Such shifts also mean products must be designed differently. The Siemens PLM investments are supporting the digital twin concept, not just for the product but across the entire lifecycle. “Now we can simulate all of the physics in a product and also its performance,” Maurer says.
The idea that a manufacturer can monitor and track a product’s performance is a major gain that comes from a new type of digital thread sewn into every asset, whether it is mechanical, electrical or software. And PLM is providing that single source of truth throughout the entire organization.
“For years, we’ve been in the CAD world designing digital prototypes. But as an engineer, the design is based upon assumptions in the requirements,” says Paul Sagar, vice president of product management at PTC, noting that in the past the only kind of feedback engineers received came from an irate customer calling to say a product is broken. “With IoT, I know how those products are behaving and can not only monitor them, but understand usage scenarios to optimize designs going forward.”
To that end, PTC is adding more IoT capability into its Creo software to connect with physical products in the field and feed that information back into the CAD system. The program, called Design for Connectivity, takes information from sensors on products in the field and feeds it through ThingWorx, which connects back to Creo to provide a real representation of forces and constraints on the model. “When you connect the digital and the physical, it gives traceability throughout design,” Sagar says.
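The closed loop Sagar describes amounts to condensing field telemetry into the loads a designer applies back to the model. A simplified sketch of that reduction step, with a hypothetical function name and an assumed 1.5x safety factor (not PTC's actual implementation):

```python
def field_load_case(force_samples_n: list, safety_factor: float = 1.5) -> dict:
    """Reduce force telemetry (newtons) from units in the field to the
    kind of load case a designer would apply to the CAD model."""
    peak = max(force_samples_n)
    return {
        # Size the part for the worst load actually observed, plus margin.
        "design_load_n": peak * safety_factor,
        # Mean load is useful for fatigue or duty-cycle assumptions.
        "mean_load_n": sum(force_samples_n) / len(force_samples_n),
        "samples": len(force_samples_n),
    }
```

The point of the sketch is the direction of data flow: real measured forces, not requirement-document assumptions, drive the constraints on the next design iteration.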
The third platform and PLM
Of course, when it comes to collaboration, analytics and mobility, discussions about the third-platform technologies like the cloud and Big Data always come up. And it is no different for PLM, which IDC’s Hojlo sees as a growth area over the next five years for quality and service planning. “There is a need for speed when it comes to executing on service and delivery, and if there is a quality issue, you want to react quickly to that,” he says.
The adoption of cloud-based PLM is on the rise, agrees Chuck Cimalore, CTO of Omnify Software. The PLM provider offers both on-premise and hosted systems, and for the past two years new customers have split 50/50 between them. For customers opting for the hosted model, the biggest concerns have been intellectual property security and access to design data. But they are growing more comfortable as cloud suppliers demonstrate due diligence around encryption and access control. As a result, Cimalore expects Omnify’s cloud instances to outgrow on-premise deployments, reaching 70-80 percent of installations over the next five years.
IoT is a big motivation for moving to the cloud because companies will want their smart devices to talk directly to the PLM system. To support that, Omnify’s latest release of its Empower PLM product is built on a representational state transfer (REST) service platform to enhance the integration framework for third-party systems, providing a standard way for devices to communicate. “IoT will present more opportunities to provide synchronized solutions that tie the product record to the device in the field,” Cimalore says.
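To illustrate the idea of devices talking to the product record over REST, here is a minimal sketch of a request handler tying a field device to its record. The routes, record fields and behavior are assumptions made for illustration, not Omnify Empower's actual API.

```python
import json

# Toy in-memory product-record store, keyed by part number.
product_records = {
    "PN-1001": {"revision": "B", "field_firmware": None, "incidents": []},
}

def handle_request(method: str, path: str, body: str) -> tuple:
    """Dispatch a (method, path, JSON body) triple the way a REST
    integration layer would, returning (status_code, response_dict)."""
    parts = path.strip("/").split("/")  # e.g. /products/PN-1001/telemetry
    if len(parts) == 3 and parts[0] == "products" and parts[2] == "telemetry":
        record = product_records.get(parts[1])
        if record is None:
            return 404, {"error": "unknown part number"}
        if method == "POST":
            payload = json.loads(body)
            # The device reports its running firmware and any fault codes,
            # keeping the product record in sync with the unit in the field.
            record["field_firmware"] = payload.get("firmware")
            record["incidents"].extend(payload.get("faults", []))
            return 200, {"recorded": True}
    return 405, {"error": "unsupported route"}
```

A real implementation would sit behind an HTTP server with authentication, but the essential contract is the same: a standard, resource-oriented way for any third-party device or system to read and update the product record.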
Arena Solutions, a pioneer of cloud-based software-as-a-service (SaaS) PLM, also sees the product record as a key enabler to next-generation PLM. Arena’s PLM platform includes a quality management system (QMS), application lifecycle management (ALM) and supply chain collaboration in one holistic platform. A new product, Arena Verify, adds requirements and defect management. All these systems are tied to the same product records, allowing all teams—engineering, electrical, mechanical and software—to work together across the supply chain. In addition, a partnership with cloud-based quality control supplier 1factory adds the ability to identify non-conformance to specifications early in the design and manufacturing process to accelerate the necessary corrective action.
Indeed, the ability to keep track of all the moving parts in a unified way is the biggest benefit of PLM.
GFS, which makes natural gas conversion systems for high-horsepower diesel engines, uses Arena for creating bills of material (BOMs) and managing technical documents and quality control processes to ensure the company builds a consistent and reliable product. The company’s products allow mining trucks, oil drilling rigs, power generators and other industrial applications to run on a combination of liquefied natural gas (LNG) and diesel fuel to cut fuel costs and reduce emissions. The conversion systems include a comprehensive list of sensors, hardware, wiring harnesses and electronic controls.
Prior to the adoption of Arena in 2013, the company was operating in product development mode without a formal PLM system. BOMs were created and tracked using conventional spreadsheets, which proved increasingly difficult to manage as the products matured.
In the future, GFS plans to more fully utilize PLM to optimize the design of the conversion systems, but for now the company is leveraging Arena as a unified platform that allows its procurement, production and engineering teams to access a common set of BOMs using strict document control and tracking.
“The move to Arena allowed us to make an efficient transition from a product developer to a manufacturing company,” says Kerry Hackney, chief marketing and communication officer at GFS.
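A multi-level BOM of the kind GFS manages is, at bottom, a tree that procurement, production and engineering all need flattened the same way. A toy sketch of that roll-up, with made-up part numbers (not GFS data or Arena's API):

```python
from collections import Counter

# Each entry: parent part -> list of (child part, quantity per parent).
bom = {
    "CONV-SYS-A": [("SENSOR-PKG", 1), ("WIRE-HARNESS", 2), ("ECU", 1)],
    "SENSOR-PKG": [("PRESSURE-SENSOR", 4), ("TEMP-SENSOR", 2)],
}

def roll_up(part: str, qty: int = 1) -> Counter:
    """Flatten a multi-level BOM into total quantities of leaf parts,
    so every team works from one consistent set of numbers."""
    totals = Counter()
    for child, child_qty in bom.get(part, []):
        if child in bom:  # subassembly: recurse with multiplied quantity
            totals += roll_up(child, qty * child_qty)
        else:             # leaf component: accumulate directly
            totals[child] += qty * child_qty
    return totals
```

Spreadsheets break down exactly here: once assemblies nest and revisions diverge, there is no single function like this that everyone's copy agrees on, which is the gap a PLM system closes.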
An open, additive approach
As more complexity is introduced into the design and lifecycle management process, it’s important to keep an open mind—and open infrastructure—to be able to scale and evolve with the changing product requirements.
The PLM systems of the past were rooted in a rigid architecture that managed CAD first and dealt with the enterprise later, which was a clunky approach. Today’s PLM should embrace an open architecture, even when customers are using legacy systems or different CAD tools.
To help with this, Rockwell will soon introduce a new release of its design tool, Studio 5000 Architect, that includes new data exchange interfaces with adapters for different electrical CAD software. These interfaces will be used to close the loop between electrical and automation systems.
Also on the flexibility path is Aras, which built its PLM on an open data model that can work with other PLM offerings and still manage configurations across different points in the lifecycle. For example, GE Aviation uses Siemens PLM Teamcenter to manage the engineering BOM, then uses Aras to transition that data seamlessly into a manufacturing BOM (MBOM) that is released into ERP for procurement.
A new product from Aras called the Manufacturing Process Planning (MPP) application adds the ability to manage the manufacturing process plan, work instructions and the MBOM, making them interdependent and automatically synchronized as changes occur.
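Conceptually, the EBOM-to-MBOM handoff regroups engineering items by the manufacturing operation that consumes them. A toy sketch of that regrouping, with illustrative part numbers and routing (not Aras's actual data model or API):

```python
# Engineering BOM: what the product is made of.
ebom = [
    {"part": "BRACKET-01", "qty": 2},
    {"part": "PCB-MAIN", "qty": 1},
    {"part": "FASTENER-M4", "qty": 8},
]

# Process plan routing: which manufacturing operation consumes which part.
routing = {
    "BRACKET-01": "OP10-FRAME-ASSY",
    "FASTENER-M4": "OP10-FRAME-ASSY",
    "PCB-MAIN": "OP20-ELECTRONICS",
}

def to_mbom(ebom_lines: list, routing: dict) -> dict:
    """Regroup EBOM lines by manufacturing operation so the MBOM can stay
    synchronized with the process plan as either one changes."""
    mbom = {}
    for line in ebom_lines:
        op = routing.get(line["part"], "OP99-UNROUTED")
        mbom.setdefault(op, []).append(line)
    return mbom
```

Rerunning this transformation whenever the EBOM or the routing changes is, in miniature, the synchronization the MPP application automates: the process plan, work instructions and MBOM never drift apart.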
“The ability to manage configurations at different points in the lifecycle is one of our differentiators,” says Doug Macdonald, product marketing director at Aras.
An open approach will prove especially useful as manufacturers adopt 3D printing. It is a very different way to make parts, and therefore demands a very different way of designing, simulating and configuring. It is for these reasons that Autodesk acquired Netfabb, whose software analyzes a design up front to flag where an additive build will run into trouble.
No matter what crosses the PLM path—be it IoT, analytics or additive manufacturing—all vendors understand that manufacturers require repeatability of the design and the manufacturing processes. And that’s what they aim to give them.