Everyone’s talking about predictive maintenance. When it comes to implementing it in practice, though, many users fail because they don’t have access to the right data. Mathias Mayer at Audi Neckarsulm was faced with this very situation. His experience showed that 90% of the data in body construction is never used or accessed, and the usual response was to install an additional sensor.
That’s not the path Mayer wanted to go down. Quite to the contrary, he thought, “Let’s process the unused data first. If an additional sensor really ends up being necessary, I’d certainly be willing to talk about it.”
To Mayer, better utilization of available data is the most important requirement for reducing downtimes and working more efficiently. This will become even more important as the complexity of production processes and the degree of automation continue to increase in the near future.
Why is data collection so difficult, though? A glimpse into body construction at the Neckarsulm site reveals the challenge. It’s at this location where A4, A6, A7, A8, R8, and A5 Cabrio model Audis are assembled by around 2,500 industrial robots. Each individual system is controlled via a programmable logic controller (PLC). “We always see the PLC as a puppet master making up to ten robots dance,” said Mayer, describing the situation in his division. The actual value creation takes place at the robot, which is why access to robot data is so immensely important.
In addition to the large number of systems involved, the variety of production methods used also makes data access and evaluation more difficult. Reducing weight while retaining maximum durability, for example, can only be achieved by combining different materials, which in turn requires a variety of connection technologies. The new A8 alone uses a whole range of joining technologies, from a wide variety of welding processes to gluing and riveting; all told, 15 different processes need to be coordinated. Should production falter, experts in each of these individual processes are needed. That is expensive and time-consuming in three-shift production, since a large number of employees would need to be trained and qualified.
Breaking new ground
To Mayer, deviating from tried-and-tested processes is out of the question. “Our qualification process is definitely expensive and time-consuming, but our customers expect top quality.” Having different employees simply examine something can yield different results—in contrast to the data, which is always the same.
“It’s precisely this data which we have to use to optimize both production and processes,” said Mayer with conviction. For this purpose, the process data has to be processed in such a way that even a non-expert can get a friction welding process started up again, for example.
The goal is to reduce unplanned production system downtimes and to increase availability, process efficiency, and quality, for example through live system monitoring and the automatic adaptation of process parameters. Existing process monitoring and optimization methods based on expert knowledge are retained alongside this. Ultimately, this reduces maintenance costs and minimizes testing effort.
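As a rough illustration of what live monitoring with automatic parameter adaptation can look like, here is a minimal sketch in Python. The class name, thresholds, and the simple adaptation rule are illustrative assumptions, not Audi's actual control logic:

```python
from collections import deque

class ProcessMonitor:
    """Minimal live-monitoring sketch: track a rolling mean of a process
    value and nudge a setpoint back toward the target when it drifts.
    All names and thresholds are illustrative, not a real control system."""

    def __init__(self, target, tolerance, window=10):
        self.target = target        # nominal value, e.g. a weld current
        self.tolerance = tolerance  # allowed drift before adapting
        self.samples = deque(maxlen=window)
        self.setpoint = target      # the parameter we are allowed to adapt

    def update(self, sample):
        """Feed one measurement; return the (possibly adapted) setpoint."""
        self.samples.append(sample)
        mean = sum(self.samples) / len(self.samples)
        drift = mean - self.target
        if abs(drift) > self.tolerance:
            # compensate half of the observed drift per cycle
            self.setpoint -= 0.5 * drift
        return self.setpoint
```

In this sketch, small fluctuations inside the tolerance band are ignored, while a sustained drift triggers a gradual correction rather than a hard stop, which is the basic idea behind keeping the process running without an expert on site.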
How does this work in practice, though? In the process chain of the future for body construction, the corresponding data from the devices will be collected, integrated, and visualized directly without additional gateways, as the robots have enough capacity. At the end, there’s an employee who understands the process and can intervene if necessary.
In Mayer’s view, this division of labor is the key to success. It’s only on this basis that data mining and machine learning can be implemented successfully.
In the Audi architecture, OPC UA and MQTT serve as the transport for the data, which is routed to an edge layer; a big data platform sits above it. Applications such as diagnostic analytics for condition monitoring and predictive analytics for condition-based maintenance can then be placed on top.
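To make the transport layer concrete, here is a minimal sketch of how a single robot signal might be packaged as an MQTT topic and JSON payload on its way to the edge layer. The topic scheme and payload fields are assumptions for illustration, not Audi's actual schema:

```python
import json
import time

def to_mqtt_message(plant, line, robot_id, signal, value, unit):
    """Package one robot signal for MQTT transport to the edge layer.
    The hierarchical topic scheme and field names are illustrative
    assumptions, not a real production schema."""
    topic = f"{plant}/{line}/{robot_id}/{signal}"
    payload = json.dumps({
        "value": value,
        "unit": unit,
        "ts": time.time(),  # timestamp applied at the data source
    })
    return topic, payload

# Example: one weld-current reading from a (hypothetical) robot
topic, payload = to_mqtt_message(
    "neckarsulm", "body-a8", "rob-017", "weld_current", 8.2, "kA")
```

A hierarchical topic like this lets edge applications subscribe selectively, for example to all signals of one line or one robot, which is one reason MQTT pairs well with OPC UA as a source of structured device data.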
This path is also set out in the Profinet OPC UA companion specification, whose essential content includes the collection and presentation of asset management and diagnostic data. For this purpose, asset and diagnostic data from today's devices is collected in a system controller via existing Profinet services and passed on to higher-level systems by means of OPC UA. Profinet's openness makes it possible, for instance, to add sensors with an OPC UA interface that send their data directly to the corresponding cloud services or edge gateways, without laboriously reworking the automation solution. This allows innovative diagnostic methods to be implemented even in existing systems.
Access to all data
It’s a situation that’s also known in body construction. “To us, a robot is simply a subordinate device of the PLC. We’d like penetration down to the data all the way at the bottom, but we don’t want to put a separate network in place,” said Mayer, who then immediately provided a pragmatic explanation. “If you have to lay an additional cable to more than 2,000 robots, it just doesn’t work. Not only this, but we don’t just use a single robot manufacturer. Depending on the application, we rely on a host of different manufacturers.”
In addition, OPC UA hasn't yet been implemented by all manufacturers. It's still missing in the most important body-construction technologies, such as spot welding, stud welding, gluing, and riveting. Robot manufacturers, by contrast, are already a good step ahead, as are RFID manufacturers. Mayer also attributes the continuing reluctance to the fact that Audi hasn't yet demanded it, something upcoming calls for proposals will change.
When implementing these technologies, one thing becomes clear: In applications where a combination of Profinet and OPC UA has already been introduced, the advantages quickly became apparent.
A good example of this at the Audi site in Neckarsulm is the in-line measurement system for feeding rivets through a highly flexible hose from the filling area to the riveting tool on a robot arm. The challenge lies in the speed at which the rivets travel, a relatively high 20 meters per second. The hose has to be replaced sometime between the 500,000th and the one-millionth rivet.
The hose is now no longer replaced during production but at a more convenient time, since the process has to be stopped for 20 to 30 minutes for each change. The team built a time-series analysis to detect wear in the rivet-feed hose. Implementation was relatively easy: more air flows through the hose as soon as even the smallest porous spots appear. These measurements are recorded, forwarded via OPC UA in parallel with Profinet, and visualized. Every employee can now find out about events on the lowest level and react faster, even without an additional cable.
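A detector of this kind can be sketched in a few lines. Based only on the observation above (airflow increases once porous spots appear), the following example compares current airflow against a baseline taken from a known-good hose; the function name, threshold, and window sizes are illustrative assumptions, not the values used at Audi:

```python
def detect_hose_wear(airflow, baseline_n=50, threshold=1.10, sustain=5):
    """Return the sample index at which hose wear becomes apparent.
    A baseline is averaged over the first `baseline_n` samples of a
    known-good hose; wear is reported once the airflow stays `threshold`
    times above the baseline for `sustain` consecutive samples.
    All parameter values are illustrative."""
    baseline = sum(airflow[:baseline_n]) / baseline_n
    streak = 0
    for i, flow in enumerate(airflow[baseline_n:], start=baseline_n):
        streak = streak + 1 if flow > threshold * baseline else 0
        if streak >= sustain:
            return i  # first index at which wear is confirmed
    return None       # no wear detected yet
```

Requiring several consecutive samples above the threshold filters out brief airflow spikes, so the system flags a genuinely porous hose rather than a single noisy reading, and maintenance can then be scheduled for a convenient time.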
Mayer hopes that specifications will be implemented faster by device manufacturers in the future. He also reminds users that they shouldn’t wait too long, either. “If you want to achieve benefits in production, you have to get involved with this subject early on. From my perspective, Industry 4.0 already arrived in practice a while ago. All we have to do now is implement it.”