Robotics, Embedded PC Make this Handle Applicator Tick

March 5, 2010
‘Dog-bone’-style carrier handles are applied at speeds up to 60 cases/min by a pair of vision-guided robots, and it’s all governed by an industrial PC.

PakTech (www.paktech-opi.com) has turned to a robotic solution from Fanuc (www.fanucrobotics.com) to add to its range of carrier-handle applicators a machine capable of operating at 60 cases/min. As with many robotic applications in packaging, this one relies on a machine-vision system to communicate the exact location of the cases holding the bottles that are to receive handles. The vision system comes from Sony (www.sony.com).

“Fanuc suggested the Sony camera and we elected to go with it partly because it has a wide field of view,” says Dan Shook, director of operations at PakTech. “The space where we capture the digital image is only about three or three-and-one-half feet above the conveyor, so to get a good image to analyze, we needed that wide angle.” Machine control on the PakTech handle applicator comes from a Beckhoff (www.beckhoff.com) CX1020 embedded industrial PC.
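To put a rough number on the wide-angle requirement Shook describes, a minimal geometry check is sketched below. It is an illustration only: the 24-in. belt width is an assumed figure, not a dimension given by PakTech.

```python
import math

# Rough check of the horizontal viewing angle needed when the camera sits
# only about 3 to 3.5 ft above the conveyor. Simple trigonometry:
#     half_angle = atan((belt_width / 2) / mounting_height)
# The 24 in. belt width is an assumed value for illustration.
BELT_WIDTH_IN = 24.0

for height_ft in (3.0, 3.5):
    height_in = height_ft * 12.0
    half_angle = math.degrees(math.atan((BELT_WIDTH_IN / 2.0) / height_in))
    print(f"{height_ft:.1f} ft mount -> ~{2.0 * half_angle:.0f} deg horizontal field of view")
```

At a 3-ft working distance the camera needs on the order of a 37-degree horizontal view just to see a 2-ft-wide belt, which is why a wide-angle lens mattered in this installation.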

Shook says this is PakTech’s first use of robotics. “We used to use linear actuators on a system that was what you might call ‘robot-like’ but was really more of a pick-and-place system. Speeds were limited because it was an intermittent-motion solution,” says Shook. “We tried a few options that would get us to continuous motion, but the price started to reach a point where who would buy it?” By integrating the Fanuc robots into its machine, PakTech brought both continuous motion and affordability within reach.

“One of the key benefits with the controller from Beckhoff is that we can build our own HMI applications, including an ability to offer customers remote assistance capabilities,” says Shook. The HMI panel, he notes, is also from Beckhoff.

The two robots on the PakTech system operate as a tag team, with the upstream robot functioning as master and the downstream robot as slave. As a case passes through the Sony camera’s field of view, the camera takes a picture at a fixed interval of encoder pulse counts. That image is used to mark the position and orientation of the case, and because both robots sit on the same network, each one knows the position and orientation of every case. From there it’s just a matter of tracking that position down the line. If the upstream robot is occupied and can’t handle a case, the downstream robot “knows” this and applies the handles itself, because the two robots share information over the EtherCAT network.
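The tracking and hand-off scheme described above can be summarized in a short sketch. The code below is an illustration of the idea under stated assumptions, not PakTech’s or Fanuc’s actual software: the encoder scaling, trigger interval, pick windows, and all names are invented for the example.

```python
from dataclasses import dataclass

COUNTS_PER_TRIGGER = 500   # assumed: encoder counts between camera triggers
MM_PER_COUNT = 0.1         # assumed: conveyor travel per encoder count

@dataclass
class CaseRecord:
    trigger_count: int     # encoder count at the moment the image was taken
    offset_mm: float       # case position across the belt, from vision
    angle_deg: float       # case orientation, from vision

class TrackedCase:
    """Position of a case along the belt, derived from the shared encoder."""
    def __init__(self, record: CaseRecord):
        self.record = record

    def travel_mm(self, current_count: int) -> float:
        # Distance the case has moved since its picture was taken.
        return (current_count - self.record.trigger_count) * MM_PER_COUNT

def dispatch(cases, current_count, master_busy,
             master_window=(800.0, 1000.0), slave_window=(1600.0, 1800.0)):
    """Decide which robot should take each tracked case.

    A case inside the master's pick window goes to the master unless the
    master is busy; a skipped case stays on the shared list, so the slave
    takes it once it reaches the slave's pick window farther downstream.
    """
    assignments = []
    for case in cases:
        travel = case.travel_mm(current_count)
        if master_window[0] <= travel <= master_window[1] and not master_busy:
            assignments.append(("master", case))
        elif slave_window[0] <= travel <= slave_window[1]:
            assignments.append(("slave", case))
    return assignments

if __name__ == "__main__":
    cases = [
        TrackedCase(CaseRecord(trigger_count=0, offset_mm=12.5, angle_deg=3.0)),
        TrackedCase(CaseRecord(trigger_count=7000, offset_mm=-8.0, angle_deg=-1.5)),
    ]
    # With the master busy, the case in its window is skipped for now,
    # while the case that has reached the slave's window is assigned to it.
    print(dispatch(cases, current_count=17000, master_busy=True))
```

The point of the sketch is the shared state: because every case record and the live encoder count are visible to both robots, the downstream robot can cover for the upstream one without any extra signaling beyond the network they already share.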

● See video of this line at packworld.com/video-29038

PakTech
www.paktech-opi.com

Fanuc
www.fanucrobotics.com

Sony
www.sony.com

Beckhoff
www.beckhoff.com
