Not long ago, a certain supplier of automotive seat lumbar assemblies required four workers to transfer bins stacked full of actuators from a feeder conveyor to the assembly table. There, workers picked each actuator from the bin, removing dividers inside the bin to reach subsequent layers of actuators, and placed each actuator into a “nest” for final assembly with a wire mat. Producing each part took approximately 90 seconds, primarily because of the difficulty of manipulating the sharp-edged, flexible wire mat used in the assembly process.
To speed up this process, the supplier turned to Systematix, a system integrator, to develop a vision-guided, robot bin-picking application that could serve as part of a work cell for the company’s automotive seat lumbar-actuator assembly.
Clearly, the biggest challenge for the robots would be removing the actuators and wire mats from the bins, placing the parts into the “nests,” and assembling the connections.
Rob Veldhuis of Systematix noted that one of the critical challenges when designing this work cell, which consists of seven Yaskawa robots, was how to program the robots to reliably pick up randomly ordered parts with high accuracy and speed. The robots had to meet the new work cell’s cycle-time requirement of five seconds per assembly. On top of that demanding requirement, Systematix had to deliver without redesigning the bins, which would have added cost and affected other production steps.
The layering of actuators in these bins was not the only challenge the robots had to overcome. Because the actuators were not secure in the bins, they could shift when moved around the facility. This meant the robots needed the ability to locate each actuator for picking. In addition, because of the supplier’s practice of layering actuators in the bin, the robot’s vision system would also have to be able to adjust to differences in vertical distance as it picked through the bins.
To address this specific vision issue, Systematix did not opt for a typical setup using a three-dimensional (3D) sensor. Why? Because of the time such systems require to capture the image, send it to a separate computer or device for processing, execute advanced 3D analysis algorithms, and communicate the results to the robot. Speed was critical given the new work cell’s five-second-per-assembly requirement. Instead of a 3D sensor, Systematix opted for a two-dimensional (2D) part-location system paired with a photoelectric distance-measurement sensor to provide the third-dimension (height) information.
The vision system
The biggest challenge for the vision system was locating the randomly positioned actuators and bin dividers at high speed. To address this, Systematix paired an IFM Efector 200 photoelectric distance-measuring sensor with a Cognex In-Sight 8000 smart camera mounted to an IAI servo-driven slide. The In-Sight smart camera machine vision system comes standard with Cognex’s PatMax RedLine pattern-search algorithm—which can, reportedly, speed up part location by up to 10 times. PatMax is able to do this because it learns an object’s geometry using a set of boundary curves tied to a pixel grid and then looks for similar shapes in the image without relying on specific gray levels. Cognex says this dramatically improves the system’s ability to accurately find objects despite changes in angle, size and shading.
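PatMax itself is proprietary, but the idea of matching on shape geometry rather than pixel gray levels can be illustrated with a deliberately simplified sketch. The descriptor below (sorted distances of boundary points from their centroid) is a toy stand-in of our own devising, not Cognex’s algorithm; it merely shows why a geometric signature survives translation and rotation while an intensity template would not.

```python
import math

def shape_signature(points):
    """Toy gray-level-free shape descriptor: distances of boundary points
    from their centroid, sorted. Invariant to translation and rotation of
    the point set. (Illustrative only; PatMax's actual boundary-curve
    matching is proprietary and far more sophisticated.)"""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sorted(math.hypot(x - cx, y - cy) for x, y in points)

def shapes_match(a, b, tol=1e-6):
    """Compare two boundary point sets by their geometric signatures."""
    sa, sb = shape_signature(a), shape_signature(b)
    return len(sa) == len(sb) and all(abs(u - v) < tol for u, v in zip(sa, sb))

# A 2x2 square, and the same square after the part shifted in the bin:
square = [(0, 0), (0, 2), (2, 2), (2, 0)]
moved = [(5, 5), (7, 5), (7, 7), (5, 7)]
```

Because the signature depends only on geometry, `shapes_match(square, moved)` is true even though every pixel coordinate differs.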
With PatMax, the In-Sight 8000 camera in the bin-picking application processes a part image and compares it to a reference image to determine each part’s orientation for the robot. Camera images measured in pixels are translated into robot pick coordinates in millimeters. Transfer of data from the camera to the robot is handled with Cognex Connect software, which Cognex says requires no intermediate processing between industrial devices.
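The pixel-to-millimeter translation amounts to a calibrated 2D transform between the camera frame and the robot frame. The sketch below assumes a rigid transform (scale, rotation, translation) with made-up calibration constants; a real cell would obtain these from a camera-to-robot calibration routine, and the names here are hypothetical, not from Cognex’s or Systematix’s software.

```python
import math

# Hypothetical calibration values; in practice these come from a
# camera-to-robot calibration routine, not from the article.
MM_PER_PIXEL = 0.25                    # scale at the camera's fixed working distance
CAMERA_ANGLE_RAD = math.radians(1.5)   # small rotation between camera and robot axes
ROBOT_ORIGIN_MM = (412.0, 188.0)       # robot-frame position of the image origin

def pixel_to_robot(px: float, py: float) -> tuple:
    """Map a part location found in the image (pixels) to robot pick
    coordinates (millimeters): scale, rotate, then translate."""
    x_mm = px * MM_PER_PIXEL
    y_mm = py * MM_PER_PIXEL
    cos_a, sin_a = math.cos(CAMERA_ANGLE_RAD), math.sin(CAMERA_ANGLE_RAD)
    xr = x_mm * cos_a - y_mm * sin_a + ROBOT_ORIGIN_MM[0]
    yr = x_mm * sin_a + y_mm * cos_a + ROBOT_ORIGIN_MM[1]
    return (xr, yr)
```

A rigid transform suffices here only because the slide keeps the camera at a fixed working distance; otherwise the scale factor would change with bin depth.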
In Systematix’s arrangement of these automation technologies, the sensor measures the vertical distance to the bin divider, the results of which direct the motion of the slide so that it maintains a fixed distance between the camera and the divider. Using the sensors and slider this way eliminates the need to re-calculate the 2D camera image as the bin layers empty.
A key aspect of Systematix’s strategy in meeting the time requirements for this new work cell involved enabling the robot to pick a part from the bin behind the pick bin directly in front of the robot. Though this may seem counterintuitive, it allows more time for the empty bin to move out and the new bin to advance while still meeting the requirement to have a new part assembled every five seconds.
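The arithmetic behind this overlap strategy is straightforward: any parts picked from the rear bin buy whole cycles during which the bin exchange can complete. The article gives only the five-second cycle time; the exchange durations below are assumptions used to illustrate the calculation.

```python
import math

CYCLE_S = 5.0  # required time per assembly (from the article)

def parts_to_bridge(exchange_s: float, cycle_s: float = CYCLE_S) -> int:
    """Number of parts the robot must source from the rear bin to keep
    producing every cycle while the front bin is being exchanged.
    Exchange time is an assumed example value, not from the article."""
    return math.ceil(exchange_s / cycle_s)
```

Under these assumptions, an exchange that fits within one 5 s cycle needs only a single rear-bin pick, while an 8 s exchange would require two.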