Packaging Robots: More Than Just Brawn

June 1, 2011
What’s ahead for robotic packaging systems? A simplified, more integrated and more economical control architecture, plus a more sophisticated approach toward sensing.

The use of robots in packaging is changing. Let’s start with the obvious: There are more of them. A visit to a few plants, or the floor of a trade show like Pack Expo, should be enough to confirm that observation. Packagers are seeking to reduce headcount and run leaner while simultaneously striving to deal with ever-greater product variation, and flexible automation in the form of robotic systems hits the mark on both counts.

Equally significant, though, are the less obvious ways in which packaging robots are changing. “One important way that machine builders and automation solution providers are helping enhance the capabilities of robots is to integrate them into the machine control system,” says Leo Petrokonis, packaging business development manager for automation supplier Rockwell Automation Inc., Milwaukee.

Machine builders have three main options for achieving this integration, Petrokonis says. The first is through a networked connection over Ethernet. In this option, a robot’s controller is linked with the machine’s controller. The operator can use a single human-machine interface (HMI) terminal to command the robot and the packaging machine, but each still has its own controller.

The second way, known as embedded integration, has the machine controller communicating with the robot across the backplane to issue commands and monitor status. “The third way, which offers the most benefits to the end user,” insists Petrokonis, “is a fully integrated application in which the robot is integrated directly to the main controller, eliminating a separate controller for the robot.”

Going from two controllers to one reduces system complexity, simplifies programming, cuts programming and support costs, decreases machine footprint, and potentially permits faster communication and data manipulation.
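To make the distinction concrete, here is a minimal sketch, in plain Python, of the two ends of that spectrum. Every name and interface in it is invented for illustration; it is not Rockwell’s, Kuka’s or any other vendor’s actual programming model.

# Hypothetical sketch only: all names and interfaces below are invented,
# not any vendor's actual API.
import math
import time

def networked_architecture(target_xyz):
    """Option 1: the machine PLC and the robot controller are separate devices.
    The PLC builds a move request, sends it over the network (stubbed here)
    and polls for completion -- two programs and two controllers to maintain."""
    request = {"cmd": "MOVE_L", "xyz": target_xyz}  # message for the robot controller
    return request  # ...would be transmitted over the network, then status polled

def integrated_architecture(target_xyz, scan_period_s=0.002):
    """Option 3: the robot kinematics run inside the machine controller itself.
    Each scan of the main program computes joint setpoints directly, so there is
    one program, one controller and one tag database behind the HMI."""
    x, y = target_xyz[0], target_xyz[1]
    reach = math.hypot(x, y)                      # toy 2-axis stand-in for the robot model
    joint_setpoints = (math.atan2(y, x), reach)   # handed straight to the drives
    time.sleep(scan_period_s)                     # emulate one controller scan
    return joint_setpoints

if __name__ == "__main__":
    print(networked_architecture((0.3, 0.1, 0.05)))
    print(integrated_architecture((0.3, 0.1, 0.05)))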

A number of relatively new systems illustrate this trend. One of them is the RTL-MX (Robotic Top Loader—Multi Access) case packer from Pearson Packaging Systems of Spokane, Wash. The system is used to case pack a wide variety of products, depending on the end-of-arm tooling Pearson designs for it. In the past, when Pearson included a robot in one of its systems (typically controlled via a Rockwell platform), the robot’s controller had to be programmed and maintained separately from the rest of the system, then interfaced with the Rockwell system. By contrast, the current version of the RTL-MX employs a Kuka robot integrated directly into Rockwell’s Allen-Bradley ControlLogix controller.

Natural Progression 

The entire robotic application “is being controlled directly from the system’s PLC (programmable logic controller),” says Michael Senske, president of Pearson. This has eliminated the need for two separate controllers, he says, and has resulted in easier synchronization, lower operating costs and reduced maintenance. It has also saved valuable floor space.

“I think this is a natural progression, this blurring of the lines between the proprietary robot controller and the system controller,” says Senske. “A lot of customers have indicated that this is a direction in which they really want to go because it lowers the total cost of ownership and makes integration easier. I think that over the next several years you are going to see more robot companies move down this path.”

Edson Packaging Machinery, a Hamilton, Ontario, Canada-based provider of end-of-line packaging systems, doesn’t necessarily push robotics. According to Bob Krouse, Edson’s operations manager, the company will not automatically try to “force feed” a robot into an application that might be better and more economically served by the custom-designed hard automation it builds. Last year, however, it unveiled its VI case packer, a system for the growing number of high-variability applications in which a robot’s flexibility is decidedly welcome.

The VI employs a 4-axis Delta robot (called the DeltaBot) from AEMK Systems Inc., of Waterloo, Ontario, Canada, a specialist in high-speed, vision-based robotics systems. The DeltaBot offers integrated vision and a control system from Beckhoff Automation LLC, Burnsville, Minn. The controller is a CX1010 embedded personal computer (PC) with Beckhoff’s TwinCAT software. It also uses high-speed EtherCAT input/output (I/O) terminals and Beckhoff EtherCAT servo drives.

According to AEMK President Amir Khajepour, the Beckhoff control system was chosen because it was open enough for AEMK to write its own kinematics, and it had the power and flexibility needed for integrated control of other components such as cameras and conveyors. Beckhoff has augmented this openness by adding a kinematics software library to its TwinCAT control platform. Building robotic functions such as Delta or SCARA kinematics into the TwinCAT system greatly facilitates the creation of integrated control applications.
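The Python sketch below shows the kind of geometry such a kinematics layer has to supply, using the widely published inverse-kinematics derivation for a conventional rigid-link Delta robot. The dimensions are invented example values, and the code is illustrative only: it reflects neither AEMK’s cable-driven design nor Beckhoff’s TwinCAT library.

import math

# Invented example dimensions for a generic rigid-link Delta robot (metres).
F = 0.40    # side length of the fixed base triangle
E = 0.10    # side length of the moving effector triangle
RF = 0.25   # upper (driven) arm length
RE = 0.60   # lower (parallelogram) arm length
TAN30 = math.tan(math.radians(30))

def _arm_angle(x0, y0, z0):
    """Shoulder angle for the arm whose motor axis lies along X (the YZ plane)."""
    y1 = -0.5 * TAN30 * F                 # base joint y-coordinate
    y0 = y0 - 0.5 * TAN30 * E             # shift target to the effector joint
    # The elbow lies on the line z = a + b*y (difference of the two sphere equations).
    a = (x0 * x0 + y0 * y0 + z0 * z0 + RF * RF - RE * RE - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + RF * RF * (b * b + 1.0)   # discriminant
    if d < 0:
        raise ValueError("target outside workspace")
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)   # elbow, outward-facing root
    zj = a + b * yj
    return math.atan2(-zj, y1 - yj)

def delta_ik(x, y, z):
    """Motor angles (radians) for effector position (x, y, z), z negative below the base."""
    c, s = math.cos(math.radians(120)), math.sin(math.radians(120))
    return (_arm_angle(x, y, z),
            _arm_angle(x * c + y * s, y * c - x * s, z),   # target seen from arm 2
            _arm_angle(x * c - y * s, y * c + x * s, z))   # target seen from arm 3

if __name__ == "__main__":
    print([round(math.degrees(t), 2) for t in delta_ik(0.05, 0.03, -0.45)])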

Khajepour maintains that this single controller approach nets big savings in terms of hardware and wiring costs when compared with multiple-controller architectures, and yields floor space reductions as well. Interestingly, this integrated and relatively open control system was itself integrated into the Rockwell Automation control system of the new Edson VI case packer.

It was a good fit, according to Edson’s Bob Krouse, because with its newer control platforms, Rockwell has “moved toward open source motion profiles and robot kinematics, so this allows the robot program to be basically open to everybody.”

Open System Benefits

Edson also chose to go with EtherNet/IP CIP Motion (for Common Industrial Protocol) for its motion networking needs rather than the widely used Sercos digital motion control bus. “Again this goes back to the idea of open source and the desirability of not having a lot of black boxes and hidden intellectual property,” says Krouse. “Sercos 3 does use Ethernet as a cable, but it’s proprietary in its communication structure. So we chose to go with CIP Motion, which is open and available.

“Eliminating networks eliminates a lot of hardware and upkeep as well,” Krouse continues. “And since CIP is nonproprietary, in a pinch you can go into your local Best Buy and buy an Ethernet cable. Those are the types of things that appeal to us, and to our customers as well.”

Krouse could have mentioned Delta robots as one of the things that appeal both to the folks at Edson and to their customers. Over the last three years there has been an explosion of Delta robots in packaging applications as machine builders realize how easily these small arms can be embedded in packaging processes when compared to the large automotive-type robots formerly employed.

“When the Delta robots first came on the scene, they were typically used to pick and place small items utilizing a suction cup,” notes Clay Cooper, engineer/corporate development with Applied Robotics Inc. of Glenville, N.Y. “Then, as various applications arose, the Delta robots started taking on increasing complexity. Today, with their increased payload and speed, they are doing more.” In the near future they will be used even more because, as he puts it, “the need to automate or perish will continue to open new doors for the Delta robot.”

However, the AEMK Delta robot—the DeltaBot—used in Edson’s VI case packer differs noticeably from the Delta robots that manufacturers have become increasingly used to seeing. Instead of rigid metal or carbon fiber arms, DeltaBots have arms made of steel cables. It’s a design that was developed at the Department of Mechanical and Mechatronics Engineering at the University of Waterloo (Ontario) where Khajepour is a professor.

Khajepour says the comparatively lightweight cables reduce inertia, which in turn boosts the robot’s speed (AEMK reports the design attains speeds of 120 pick-and-place cycles per minute). The reduction in inertia tends to heighten accuracy. The switch to cables also makes the robot less expensive to build and lowers maintenance costs.

Machine Vision Explosion

Just as the Edson system is an example of the popularity of Delta robots in packaging today, it is also illustrative of the prevalence of machine vision. Only a few short years ago, robots with pre-engineered vision integration were looked upon as marvelously high tech. Since then, however, the increased standardization of vision system components has driven down engineering and installation costs, making machine vision applications more desirable. This, in turn, has led vendors to introduce an increasing number of low-cost vision sensors to take advantage of the trend—making vision applications even easier to justify.

That’s what Dick Motley has been seeing. Motley, who is national account manager, packaging integrator network, for Fanuc Robotics America Corp., Rochester Hills, Mich., says he has observed an increasing demand for Fanuc’s built-in iRVision robot vision. He says a growing number of both integrators and end users now recognize the benefits machine vision systems can deliver.

And not just vision, but three-dimensional vision, according to Steve Prehn, senior product manager for material handling and vision at Fanuc. “3D vision could become standard equipment on robots in a decade or less,” he maintains.

Prehn explains that a machine vision system takes an image and uses algorithms to find the features it has been trained to recognize. An image is basically data: a series of pixels, each with a gray-scale value. Image-processing algorithms recognize patterns or structures in that data. With vision cameras, operators can find product features, mathematically determine where the part is in space, and guide the robot to it.

Because the robot knows where an image was taken, it can determine where an object is sitting and make judgments about its size, whether it is part A or part B, and whether it has defects. Whatever task the robot is programmed to perform, it can adapt based on what the images and algorithms report. For example, a larger part might call for a different path, or part A might need to be dropped off at a different place than part B.
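A bare-bones Python/OpenCV sketch of that pipeline appears below: a synthetic gray-scale image, a trained template located by normalized cross-correlation, a pixel-to-millimeter conversion under an assumed camera distance, and a decision branch driven by the match score. The scale factor, offsets and threshold are hypothetical, and none of this represents Fanuc’s iRVision internals.

import cv2
import numpy as np

# Synthetic 8-bit gray-scale "camera image" with one bright square part in view.
image = np.zeros((240, 320), dtype=np.uint8)
cv2.rectangle(image, (180, 90), (220, 130), 255, -1)
# "Trained" template of the part the system is supposed to find.
template = np.zeros((50, 50), dtype=np.uint8)
cv2.rectangle(template, (5, 5), (45, 45), 255, -1)

# Pattern matching: slide the template over the image and score every position.
scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)

# Convert the pixel location to a hypothetical robot frame with a fixed scale
# and offset -- valid only while the camera-to-belt distance stays constant.
MM_PER_PIXEL = 0.8
ORIGIN_MM = (100.0, 50.0)
part_x_mm = ORIGIN_MM[0] + best_xy[0] * MM_PER_PIXEL
part_y_mm = ORIGIN_MM[1] + best_xy[1] * MM_PER_PIXEL

# The program adapts to what the image shows, e.g. routing by match confidence.
if best_score > 0.8:
    print(f"part A found at ({part_x_mm:.1f}, {part_y_mm:.1f}) mm -> place in case")
else:
    print("no confident match -> divert or re-image")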

Now, when extracting part positions with a single two-dimensional camera, certain assumptions must be made. “2D systems find a part in X, Y and rotation if you assume the Z (distance from the camera) did not change,” Prehn explains. “If the part moves closer to the camera, changes size, or is tipped differently, traditional 2D systems may miscalculate where the part is in space. 3D systems are more robust, and allow the robot to know exactly where the part is, so the robot can pick up the part cleanly.”
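The toy pinhole-camera numbers below illustrate the failure mode he describes: a 2D system calibrated for one camera-to-part distance converts pixels to millimeters with a single scale factor, so a part sitting 200 mm closer than expected appears larger and its position is misjudged. The focal length and distances are invented for the example.

FOCAL_PX = 1000.0       # focal length in pixels (assumed)
ASSUMED_Z_MM = 800.0    # camera-to-part distance the 2D system was calibrated for

def pixels_to_mm(u_px, assumed_z_mm=ASSUMED_Z_MM):
    """The 2D assumption: X = u * Z / f, with Z frozen at the calibration distance."""
    return u_px * assumed_z_mm / FOCAL_PX

def project_to_pixels(x_mm, true_z_mm):
    """What the camera actually records for a point at its true distance Z."""
    return FOCAL_PX * x_mm / true_z_mm

true_x_mm = 120.0
for true_z in (800.0, 600.0):   # calibrated distance, then the part 200 mm closer
    u = project_to_pixels(true_x_mm, true_z)
    print(f"true Z = {true_z:.0f} mm: 2D estimate of X = {pixels_to_mm(u):.1f} mm "
          f"(true X = {true_x_mm} mm)")
# At 800 mm the estimate is exact; at 600 mm it reads 160 mm, a 40 mm error that a
# 3D system avoids because it measures Z instead of assuming it.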

Perception in Packaging

Packaging, says Prehn, is an area where machine vision is critical. Food products often come down a conveyor or slide down a ramp into a pickup area. There’s no repeatable positioning. The products end up in different positions and need to be picked up, oriented and placed in the package. Vision allows the robots to find the product and make that happen.

3D vision in particular also offers performance advantages when stacking parts on a wooden pallet, or removing them. Parts may not only shift side to side, but may also be at different heights, or at different angles. Pallets are easily damaged by fork trucks, and this damage translates into inconsistencies in the presentation of the parts to the vision system. “Robots that handle parts or boxes that are stacked on pallets can’t blindly assume they are always in the exact same place,” says Prehn.

The key here, says Erik Nieves, technology director at Yaskawa America Inc.’s Motoman Robotics Division, West Carrollton, Ohio, is the utilization of multiple sensing technologies to achieve something akin to perception. “The future of robots is tied to the development of perception,” he insists. “There is a coming flood of sensors that may be applied to robotics—vision, time-of-flight, force/torque, tactile—but sensors alone are not enough; they are just data.”

The key, says Nieves, is to develop algorithms whereby these multiple streams of sensor information are analyzed together to achieve a higher synthesis. “That is perception. And whether the application for the robot is assembly, inspection, collaboration, whatever, the need for intelligence in sensing is paramount.”
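As a rough illustration of what analyzing streams together can mean, the sketch below fuses two independent height estimates, one from vision and one from a time-of-flight reading, by inverse-variance weighting, and lets a tactile contact flag gate the final decision. It is a generic textbook-style example, not Yaskawa’s or anyone else’s algorithm, and all the numbers are invented.

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted average of two independent measurements."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)    # fused estimate and its variance

vision_z_mm, vision_var = 412.0, 4.0   # camera: accurate but slower
tof_z_mm, tof_var = 405.0, 25.0        # time-of-flight sensor: fast but noisier
contact_detected = False               # tactile / force-torque input

z_mm, z_var = fuse(vision_z_mm, vision_var, tof_z_mm, tof_var)
# The decision draws on all three streams together, not on any one sensor alone.
if contact_detected:
    print("unexpected contact -> stop and re-plan")
elif z_var < 10.0:
    print(f"confident height estimate {z_mm:.1f} mm -> proceed with the pick")
else:
    print("estimates disagree -> take another image before moving")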

Fanuc’s Steve Prehn agrees: “The ability to recognize features and correctly react to changes in the environment is clearly a key point on the evolutionary path of robots.” He feels that some of his company’s applications that utilize machine vision together with tactile and force feedback illustrate this.

“For instance, a customer recently approached us to help them solve a problem that required very quick and accurate placement of a part on a surface that was not consistently flat,” says Prehn. The challenge required calculating a best fit plane for the part held by the robot, and the surface it was to be placed upon. “Once the part was placed square to the surface, a precise amount of pressure was applied by the robot to force the mating surfaces together, allowing the part to flex, maximizing surface contact. Measured moments around the direction of the force told the robot how to alter pressure applied to the part,” he explains.
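Two pieces of math sit behind that description, sketched below with invented numbers: a least-squares best-fit plane recovered from sampled surface points via singular value decomposition, and a simple moment-driven correction to the applied force. The probe data, gain and units are hypothetical, not Fanuc’s actual method.

import numpy as np

def best_fit_plane(points):
    """Least-squares plane through an Nx3 array of points: (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the centered
    # points is the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1] / np.linalg.norm(vt[-1])
    return centroid, normal

# Surface probed at four spots (mm); it is slightly tilted rather than flat.
surface_pts = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 1.2],
                        [0.0, 100.0, 0.4], [100.0, 100.0, 1.5]])
centroid, normal = best_fit_plane(surface_pts)
print("orient the part square to this surface normal:", np.round(normal, 3))

# After contact, moments measured about the pressing axis reveal uneven seating,
# so the controller adjusts the force components until the moments balance out.
measured_moment_xy = np.array([0.8, -0.3])   # N*m from the force/torque sensor
GAIN = 5.0                                    # N per N*m, illustrative only
print("force correction components (N):", -GAIN * measured_moment_xy)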

Prehn says this sort of sophistication will become increasingly common in robotic applications in the future.
