Robotics, Embedded PC Make This Handle Applicator Tick

March 5, 2010
‘Dog-bone’-style carrier handles are applied at speeds up to 60 cases/min by a pair of vision-guided robots, and it’s all governed by an industrial PC.

PakTech (www.paktech-opi.com) has turned to a robotic solution from Fanuc (www.fanucrobotics.com) to add to its range of carrier-handle applicators a machine capable of operating at 60 cases/min. As with many robotic applications in packaging, this one relies on a machine-vision system to communicate the exact location of the cases holding the bottles to be handled. The vision system comes from Sony (www.sony.com).

“Fanuc suggested the Sony camera and we elected to go with it partly because it has a wide field of view,” says Dan Shook, director of operations at PakTech. “The space where we capture the digital image is only about three or three-and-one-half feet above the conveyor, so to get a good image to analyze, we needed that wide angle.” Controls on the PakTech handle applicator are managed by a Beckhoff (www.beckhoff.com) CX1020 embedded industrial PC.
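Shook’s point about the wide angle comes down to simple geometry: the closer the camera sits to the conveyor, the wider its field of view must be to cover the whole case. The short Python sketch below is a rough, hypothetical illustration of that trade-off; only the three- to three-and-one-half-foot mounting height comes from the article, and the two-foot case width is an assumed figure.

    import math

    def required_hfov_deg(mount_height_ft: float, target_width_ft: float) -> float:
        """Horizontal field of view (degrees) needed to span target_width_ft
        from a camera mounted mount_height_ft above the conveyor."""
        return math.degrees(2 * math.atan((target_width_ft / 2) / mount_height_ft))

    # Assumed 2-ft case width; mounting heights quoted in the article.
    for height in (3.0, 3.5):
        print(f"{height:.1f} ft above conveyor -> {required_hfov_deg(height, 2.0):.1f} deg HFOV")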

Shook says this is PakTech’s first use of robotics. “We used to use linear actuators on a system that was what you might call ‘robot-like’ but was really more of a pick-and-place system. Speeds were limited because it was an intermittent-motion solution,” says Shook. “We tried a few options that would get us to continuous motion, but the price started to reach a point where who would buy it?” By integrating the Fanuc robots into its machine, PakTech brought both continuous motion and affordability within reach.

“One of the key benefits with the controller from Beckhoff is that we can build our own HMI applications, including an ability to offer customers remote assistance capabilities,” says Shook. The HMI panel, he notes, is also from Beckhoff.

The two robots on the PakTech system operate as a tag team, with the upstream robot functioning as master and the downstream robot as slave. As a case passes under the Sony camera’s field of view, the camera takes a picture at a set interval of encoder pulse counts. That data is used to mark the position and orientation of the case, and because the robots are on the network, they, too, know the position and orientation of each case. From there it’s just a matter of tracking that position down the line. If the upstream robot is occupied and can’t handle a case, the downstream robot “knows” this and will apply the handles itself, because the two robots share information over the EtherCAT network.
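In essence, Shook is describing encoder-based conveyor tracking with a shared work queue. The sketch below is a minimal, hypothetical Python illustration of that logic only; it is not Fanuc’s or Beckhoff’s actual tracking software, and the encoder scaling, pick-window positions, and class names are all assumptions made for illustration.

    from dataclasses import dataclass

    FT_PER_PULSE = 0.001  # assumed conveyor travel per encoder count

    @dataclass
    class Case:
        pulses_at_capture: int  # encoder count when the camera fired
        offset_ft: float        # lateral offset reported by vision
        angle_deg: float        # case orientation reported by vision

        def travel_ft(self, pulses_now: int) -> float:
            """Distance the case has moved downstream since its picture was taken."""
            return (pulses_now - self.pulses_at_capture) * FT_PER_PULSE

    class Robot:
        def __init__(self, name: str, window_start_ft: float, window_end_ft: float):
            self.name = name
            self.window = (window_start_ft, window_end_ft)  # pick window downstream of camera
            self.busy = False

        def can_take(self, case: Case, pulses_now: int) -> bool:
            pos = case.travel_ft(pulses_now)
            return not self.busy and self.window[0] <= pos <= self.window[1]

    def dispatch(case: Case, pulses_now: int, master: Robot, slave: Robot):
        """Upstream (master) robot gets first chance at a case; if it is busy,
        the shared position data lets the downstream (slave) robot apply the handles."""
        for robot in (master, slave):
            if robot.can_take(case, pulses_now):
                robot.busy = True
                return robot
        return None

    # Example: a case photographed at encoder count 10,000, checked 6 ft downstream.
    upstream = Robot("master", 2.0, 5.0)
    downstream = Robot("slave", 5.0, 8.0)
    upstream.busy = True  # suppose the master is still placing handles on the previous case
    case = Case(pulses_at_capture=10_000, offset_ft=0.2, angle_deg=3.0)
    print(dispatch(case, pulses_now=16_000, master=upstream, slave=downstream).name)  # -> slave

On the real machine this shared state lives on the EtherCAT network rather than in a single program, but the handoff decision follows the same pattern: track each case by encoder counts from the moment of image capture, and let whichever robot is free and in range take it.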

● See video of this line at packworld.com/video-29038

PakTech
www.paktech-opi.com

Fanuc
www.fanucrobotics.com

Sony
www.sony.com

Beckhoff
www.beckhoff.com
