Big Data continues to be a hot topic, with many manufacturers excited about the opportunities all that data presents, for predictive maintenance alone. Even so, we continue to hear more realistic viewpoints from those in the trenches.
“We’re in perhaps the most transformational time in industry that we’ve ever been in,” says Jim Walsh, general manager for GE Intelligent Platforms. He spoke last week at GE’s Connected World conference in Chicago, and was referring to all the promise that Big Data potentially delivers. But the question is: How do you make it real?
GE has been dealing with Big Data for quite some time, Walsh says, with more than 200,000 GE assets connected around the world. How does Big Data get big? Consider just one consumer goods manufacturer, which takes 152,000 sensor samples every second. “That adds up to 4 trillion samples a year that they’ve got to figure out a way to deal with,” Walsh notes.
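A quick back-of-the-envelope check of that figure: running 152,000 samples per second continuously for a year actually works out to roughly 4.8 trillion samples, so the quoted 4 trillion appears to be a round-number approximation (or reflects less-than-continuous sampling).

```python
# Sanity-check the sample-rate arithmetic quoted above.
SAMPLES_PER_SECOND = 152_000
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # 31,536,000 seconds in a non-leap year

samples_per_year = SAMPLES_PER_SECOND * SECONDS_PER_YEAR
print(f"{samples_per_year:,}")  # 4,793,472,000,000 -- roughly 4.8 trillion
```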
But it’s not just about the size of the data; it’s about the type as well, Walsh says. With 4 trillion data points, you need to be able to correlate that data, managing it in a consistent and coherent way.
Walsh emphasizes the importance of laying a solid foundation, however, and not trying to rush the data process. “Everybody wants to start too far up the continuum,” he says. “They want the change-the-world analytics. But you need to lay the foundation. That’s not as sexy as the analytics that have an impact on your balance sheets.” He adds, however, “If you have the foundation set up, your ability to accelerate up that value continuum increases exponentially.”
In a separate conversation, Brian Courtney, general manager of industrial data intelligence for GE’s software and services business, reiterated the importance of that foundation. “You can’t really cheat,” he says. “You can’t optimize the process if you don’t know what’s wrong.”
Courtney echoes the sentiments mentioned in a post earlier this year—that manufacturers too often want to get the ultimate gain from Big Data without taking the baby steps they need to start with. They want to leap to maximize throughput, say, without first using the data to improve stability, reliability and uptime. “But you can’t get there until you have stability,” Courtney says. “There’s too much process variability.”
The technology of Big Data isn’t actually the hard part; it’s the people, Courtney says. To a large degree, there is a shift in thinking that needs to happen for Big Data to really be effective. An operator often knows just by the vibration or sound that a machine is going to go down. And now we’re asking that operator to trust numbers instead of his own instincts. “If you can’t get him to take action, predictive analytics is useless.”
Courtney cites an example from a recent pilot with a potential customer. GE’s sensor data found vibration on a turbine that didn’t make sense. So GE asked the manufacturer to shut it down to take a look. The operator, however, insisted there was nothing wrong, and didn’t want to stop production for what he saw as load-related vibration. Some time went by, and GE was still monitoring the disturbing vibration on what was a very expensive asset. So they contacted the company again, urging them to shut it down.
Because it was a trial situation, the customer essentially said that if they shut it down and GE was wrong, that would be it for their relationship. But GE stood by its data, confident that there was a problem that needed to be explored. The customer shut down the turbine, put a borescope in, and found corrosion running most of the way through one of the blades. As Chad Stoecker, who runs GE's Industrial Performance and Reliability Center, put it, "That blade was three to five days from liberation."
It was a $30 million catch—considering the potential lost production and cost to repair—and the customer was suddenly convinced of the value of the data proposition. “They have to trust the analytics more than their own brains,” Courtney says. “To get to predictive analytics, that’s a big step.”
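GE has not published the analytics behind catches like this, but the basic idea of flagging vibration readings that "don't make sense" can be illustrated with a simple rolling z-score check. This is a hypothetical sketch, not GE's method; the function name, window size, and threshold are all illustration-only assumptions.

```python
# Hypothetical vibration anomaly detection via rolling z-score.
# This does NOT reflect GE's actual analytics; window and threshold
# are arbitrary values chosen for illustration.
from collections import deque
from statistics import mean, stdev

def find_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the trailing window
    mean by more than `threshold` standard deviations."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) > threshold * sigma:
                anomalies.append(i)
        history.append(x)
    return anomalies

# Steady baseline with small cyclic variation, plus one out-of-family
# spike at index 30:
data = [1.0 + 0.05 * ((i % 3) - 1) for i in range(40)]
data[30] = 5.0
print(find_anomalies(data))  # [30]
```

In practice an operator's "load-related vibration" explanation corresponds to a seasonal or load-correlated baseline, which is exactly why naive thresholds produce false alarms and why the trust problem Courtney describes is real.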
But they're not all such big finds. A customer is easily convinced of the value of data analytics when there's $30 million at stake. The fact is, though, that only about 5% of the issues GE's reliability center finds are high-priority ones. The majority are geared toward giving the client weeks or months of early warning, according to Stoecker.
The big catches might come early on, but as manufacturers continue to rely on the data, they start finding instead little problems that are kept from turning into big problems, Courtney says. “Uptime is the key factor,” he emphasizes. “The value is not in the big catch, but in the day-to-day stuff.”
One example of a little problem that would've turned big if not caught early was at a facility in a remote part of Russia, with nobody stationed on site. GE alerted the customer to a vibration issue with a key asset. The customer sent out a technician, who found that a footing bolt had come loose. He tightened the bolt and the issue was fixed. The client was annoyed, saying, "We're not paying you to discover a bolt loose." GE's response? "Yes you are."
If that bolt had stayed loose, the customer would’ve had a much bigger problem on its hands. “We’re finding problems further and further into the future. As you get closer to asset health, the problems get smaller and smaller,” Courtney says. But if the customer doesn’t accept a shift in thinking, understanding the value in finding the little problems, then they will face much greater expenses in lost production and asset repair. “That’s expensive learning for a customer.”
GE’s Industrial Performance and Reliability Center has $5 billion worth of assets under its management—160+ units, 4,000+ assets and 100,000+ sensors—and avoids tens of millions of dollars in costs for its customers. The equipment is monitored based on what’s critical to the individual customer, Stoecker notes, letting them make the shift from reactive to proactive maintenance. “We estimate that we save $80 million a year across all our customers,” he says.