But augmented reality (AR) technology can also be used for real-world, non-video applications in which computer-generated information is merged with, or overlaid onto, a person’s view of the physical world. Researchers have already demonstrated AR’s use as an aid to surgical procedures, for example, and as a guide to industrial workers in performing complex assembly tasks.
In another promising industrial application, researchers at the Georgia Institute of Technology (Georgia Tech), in Atlanta, are now testing AR technology for use in poultry plants to improve communication between automated systems and workers. Using AR technology, the researchers have designed two systems that project graphical instructions from an automated inspection system onto birds on a processing line. These symbols tell workers how to trim birds to eliminate damaged or diseased sections, or whether to discard a bird entirely.
Today, inspections are done by U.S. Department of Agriculture inspectors, who communicate instructions to trimmers using hand gestures. But an automated machine vision inspection system developed by the Georgia Tech Research Institute (GTRI) is being commercialized, and other, competitive systems are also on the market, says J. Craig Wyvill, division chief of the GTRI Food Technology Processing Division (http://foodtech.gatech.edu). Pending proposed changes in USDA inspection protocols, these automated systems could eventually replace the human inspectors on poultry lines.
Before that can happen, however, the poultry processing industry will need an efficient way to communicate instructions from the automated inspection systems to the trimmers, says Wyvill. Techniques such as audio headphones and computer displays located near the trim stations have been tried. But with lines moving at up to 182 birds per minute (roughly one bird every third of a second), these methods cannot reliably convey to workers exactly which set of trim instructions applies to which bird.
To overcome this problem, GTRI is collaborating with researchers at the Georgia Tech College of Computing to develop an AR-based solution. The idea is to visually overlay the trim instructions directly onto each bird, eliminating worker confusion.
A team led by Blair Macintyre, an assistant professor at the College of Computing, has developed two AR solutions. The first approach uses a location-tracked, see-through, head-mounted display that would be worn by a trimmer. It directly overlays graphical instructions on a trimmer’s view of the birds. The second solution uses a laser scanner, mounted at a fixed location near the processing line, to project graphical instructions directly onto each bird that requires some action, such as trimming. In this approach, the bird carcasses, but not the head position of the user, must be tracked for the instructions to correctly appear.
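The projection approach reduces to a simple geometry problem: given where a carcass was when it was inspected and how fast the line moves, predict where it is now and compute the deflection angle for the fixed scanner. A minimal sketch of that idea follows; all names, positions, and speeds here are illustrative assumptions, not details of the Georgia Tech systems.

```python
import math
from dataclasses import dataclass

# Assumed setup (hypothetical values, for illustration only).
LINE_SPEED_M_S = 0.5       # shackle-line speed, meters per second
PROJECTOR_X_M = 2.0        # scanner's position along the line, meters
PROJECTOR_HEIGHT_M = 1.5   # scanner's height above the line, meters

@dataclass
class Carcass:
    x_at_t0: float   # position along the line at inspection time (m)
    t0: float        # inspection timestamp (s)
    action: str      # instruction issued by the inspection system

def position_at(carcass: Carcass, t: float) -> float:
    """Predict the carcass position at time t, assuming constant line speed."""
    return carcass.x_at_t0 + LINE_SPEED_M_S * (t - carcass.t0)

def aim_angle(carcass: Carcass, t: float) -> float:
    """Deflection angle (radians) for the scanner to hit the carcass at time t."""
    dx = position_at(carcass, t) - PROJECTOR_X_M
    return math.atan2(dx, PROJECTOR_HEIGHT_M)

# A bird flagged at x = 0 m, t = 0 s; 4 s later it sits directly
# under the scanner (x = 2.0 m), so the deflection angle is zero.
bird = Carcass(x_at_t0=0.0, t0=0.0, action="trim damaged section")
angle = aim_angle(bird, t=4.0)
```

Only the carcass position enters this calculation, which is why the projection system, unlike the head-mounted one, needs no tracking of the worker at all.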
“Each solution looks to have advantages and disadvantages,” says Macintyre. Both systems provide potential for advance warning to trimmers of the workload coming down the line, which current practices don’t allow, he notes. “But in the near term, I think the projection version will be more practical, for a variety of reasons.”
Among other issues, the head-mounted display version could pose hygiene and ergonomic concerns, says Macintyre, and “any time you start putting something on workers, they probably aren’t going to like it.” The head-mounted system would offer advantages in environments in which users are required to move around, he observes, but the poultry trimmers are relatively stationary.
Poultry plants are typically wet and slippery and must be washed down with high-pressure water systems daily—factors that would impact the design of the laser projection-based system, Macintyre notes. “But once you put it in a hardened case and attach it to the line so that nobody has to interact with it, I think the projection system will be easier to install and maintain.”
The Georgia Tech researchers plan to conduct laboratory experiments beginning this fall on both AR proof-of-concept systems, and will likely later choose one to develop further, says Macintyre. Depending on market need, the projection-based AR system could be ready for commercial deployment within one to two years, he says. The head-mounted version would likely take longer, owing to cost and technology constraints.