GE has been at the cutting edge of robotics development since the 1960s, when the company began its work on a powered exoskeleton called Hardiman that was designed to enable the wearer to easily lift loads of 1,500 lbs.
At a recent visit to GE Global Research in Niskayuna, N.Y., I got a chance to see some of GE’s newest developments in robotics—taking on a whole new range of tasks in a technology space that has gotten more exciting than ever.
Moore’s Law, additive manufacturing, advances in sensing, computer vision, artificial intelligence and other technologies have come together to create a world of possibilities. “They’ve opened up a lot of use cases that we couldn’t even think about five years ago,” said John Lizzi, robotics breakout leader for GE Global Research. “How do you get these things to work together in collaborative ways?”
GE has some ideas. Like tiny robots that can travel into very small spaces for in situ inspection and repair inside aircraft engines or gas turbines, for example (minimally invasive surgery for machinery, so to speak). Or magnetic crawlers that can scale and inspect tank walls.
GE is putting some of these concepts to work through Avitas Systems, a business launched earlier this year by GE Ventures. “It came out of our work with robotics; something that we built from the ground up,” Lizzi said.
For inspection of critical infrastructure in the oil and gas, transportation and energy industries, humans in harnesses are still the state of the art today, Lizzi said. Avitas instead combines robotics, AI and predictive analytics to create targeted 3D inspections that make asset checks easier and quicker. Just in September, Avitas announced it was working with Nvidia and its AI supercomputers to create inspection services powered by AI-trained drones and crawlers.
Armed with sensors, the autonomous and semi-autonomous robots can gather data that is then sent to a cloud-based platform hosted on GE’s Predix software, where it can be analyzed and combined with other types of data—such as maintenance or regulatory records, along with geological or weather data, for example. Small computers on the drones can be used for edge analytics as well, Lizzi noted. “In some ways, drones could be thought of as edge devices,” he said.
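The split Lizzi describes between onboard analytics and cloud analysis can be pictured as a simple triage rule running on the drone itself. The sketch below is purely illustrative; the data fields, thresholds and function names are my own assumptions, not GE's Predix API.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    asset_id: str
    temperature_c: float
    image_blur_score: float  # 0.0 = sharp capture, 1.0 = unusable

def triage_on_edge(reading: SensorReading, temp_alert_c: float = 90.0):
    """Run lightweight checks on the drone; decide what goes to the cloud.

    Returns a (decision, payload) pair:
      "discard" - bad capture, re-fly the waypoint instead of uploading
      "alert"   - anomaly flagged immediately from the edge
      "upload"  - send to the cloud platform for fusion with other data
    """
    if reading.image_blur_score > 0.8:
        return ("discard", None)
    if reading.temperature_c > temp_alert_c:
        return ("alert", reading)
    return ("upload", reading)
```

In practice the interesting design choice is exactly this partition: which decisions are cheap and urgent enough to make on the drone, and which need the maintenance, regulatory, geological or weather data that only lives in the cloud.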
Refineries and flare stacks are great examples of the types of environments well suited to these inspection services. The drones can get pretty close to hot assets, which means a flare stack can keep operating during inspection, for example, rather than being shut down for days to cool enough for human inspectors. Inspection is also improved through the use of infrared cameras and other sensing technologies attached to the robots.
The Avitas engineering team continues to work closely with GE Global Research in Niskayuna, where I got a demonstration of the drone technology at GE Research’s outdoor test site. Considering all the technologies being combined in the work, it takes a diversified team, commented Judy Guzzo, a senior research scientist in aerial robotics systems at GE Global Research. The team includes experts in computer vision, hardware engineering, software engineering, robotics and more, she said.
Together, the researchers have developed the first autonomous 3D inspection platform, Guzzo said. Its value proposition is to inspect assets quickly and automatically while taking humans out of the loop. “We have done a lot of work to automate this process,” she said.
First generating a 3D model of the asset, the drones enable targeted 3D inspection—zeroing in specifically on the data that an inspector wants, explained Shiraj Sen, a lead scientist in aerial robotics systems at GE Global Research.
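Targeted inspection of this kind can be thought of as planning camera waypoints only around a region of interest on the reconstructed model. Here is a minimal sketch under the simplifying assumption that the 3D model is just a list of surface points; the function and parameter names are hypothetical, not drawn from GE's system.

```python
import math

def plan_targeted_waypoints(model_points, region_center, radius, standoff=2.0):
    """From a coarse 3D model, pick the surface points inside the region
    of interest and push each one outward to a camera standoff position.

    model_points  - iterable of (x, y, z) surface points from the 3D model
    region_center - (x, y, z) center of the area the inspector cares about
    radius        - only points within this distance of the center are used
    standoff      - how far the camera should hover from the surface
    """
    waypoints = []
    for p in model_points:
        d = math.dist(p, region_center)
        if d <= radius:
            # Scale the center-to-point vector so the waypoint sits
            # `standoff` beyond the surface point (degenerate d == 0 is
            # left at the center).
            scale = (d + standoff) / d if d else 1.0
            waypoints.append(tuple(
                region_center[i] + (p[i] - region_center[i]) * scale
                for i in range(3)
            ))
    return waypoints
```

The payoff Sen describes falls out of the filtering step: instead of flying the whole asset, the drone only visits poses that image the data the inspector actually asked for.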
See the video at the bottom of this story for a demonstration of the drone creating a 3D model of a small tower at GE Research’s test site, including explanation from Sen about how the drone is creating the model.
Right now, the technology makes a model of a specific asset, Sen said, but could ultimately be generalized to a whole fleet of assets. The team would also eventually like to take the drone capabilities beyond the line of sight, Lizzi said, but that depends more on the U.S. Federal Aviation Administration (FAA) loosening its regulations than on the technology itself.
In GE’s new Forge Lab, which had been open only a couple of days when I got a chance to look around, engineers are working on another project geared toward human-robot collaboration. Rather than the collaborative robotics that have humans and robots working side by side on an assembly line, GE’s telerobotics demo shows the possibility of humans and robots working together over long distances, explained John Hoare, lead robotics and autonomous systems engineer.
The current project in the Forge Lab has a mobile robot, operated by a human back in a control room, that could turn a valve on, say, an offshore oil platform. “In a way, it’s a physical telepresence,” Hoare said. “The robot has sensors and a knowledge of that world. So it feels like the person is there.”
Key reasons for using such technology would be to keep people out of dangerous locations or to reduce the need for travel out to remote locations. Though GE is working on valve turning specifically, Hoare said, the concept could be applied to other tasks as well.
Humans maintain the role of intelligent decision-making, and multiple people with different skillsets could be working together in the same virtual environment. But the researchers are developing the robot to do more of the work on its own.
“We’re trying to push more and more autonomy out to that robot,” Hoare said. “Humans show the robot how to do a task, and the robot can then do more. Today the robot can do a little more than it could do yesterday. We’d inevitably like to get to a point where the robot is totally autonomous.”
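The progression Hoare describes, from step-by-step teleoperation toward tasks the robot can repeat on its own, can be caricatured in a few lines. This is a toy sketch of demonstration-based autonomy in general, with invented names throughout; it says nothing about how GE's actual system is built.

```python
class TeachableRobot:
    """Toy model of demonstration-based autonomy: an operator demonstrates
    a task once as a sequence of primitive actions, after which the robot
    can replay the task without step-by-step teleoperation."""

    def __init__(self):
        self.skills = {}  # task name -> list of primitive actions

    def demonstrate(self, task, actions):
        """Record an operator-demonstrated action sequence for a task."""
        self.skills[task] = list(actions)

    def execute(self, task):
        """Replay a previously demonstrated task autonomously."""
        if task not in self.skills:
            raise ValueError(f"'{task}' has not been demonstrated yet")
        return [f"do:{action}" for action in self.skills[task]]
```

Each demonstrated task shrinks the operator's role a little, which is the "do a little more than it could do yesterday" trajectory in miniature.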
The technology could not only help keep people out of harm’s way, but could also improve knowledge of the assets, Hoare said. Asset inspection at a remote location might be done only every two years because it’s expensive to send somebody out, he explained. “But if there were a resident robot, maybe it could check the asset every week. It would get a lot more data on that asset.”