Big Data headlines not only the tech news but also the popular news, as in: what is the government doing with all the information it's storing about us? Yet Big Data is just a twig compared with the full-grown oak that Big Analog Data can grow into. National Instruments Fellow Tom Bradicich noted twice, in separate interviews during NIWeek last month, that all of the analog data acquired from manufacturing and products, a.k.a. the Internet of Things (IoT), dwarfs what is currently known as Big Data.
When thinking about data, consider the flow. First comes acquisition from analog measurements, which may or may not be used in real time. Then there is data in motion and data at rest. Finally, the data is archived. You can also characterize data by where it is, but the insight comes from how it is used. Real time matters if you are monitoring a motor about to catch fire; on the other hand, you may want to comb through three years of data to look for a trend.
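That flow, from acquisition through real-time use to data at rest, can be sketched as a toy pipeline. Everything here is illustrative: the sensor values, the alarm threshold and the variable names are invented for the example and are not NI APIs.

```python
import random
from collections import deque

def acquire_samples(n):
    """Simulate analog acquisition: motor temperature readings in deg C."""
    for _ in range(n):
        yield 60 + random.random() * 50  # illustrative sensor values

archive = deque()          # "data at rest", available for later trend analysis
ALARM_THRESHOLD_C = 100.0  # hypothetical real-time alarm limit

alarms = 0
for reading in acquire_samples(1000):   # data in motion
    if reading > ALARM_THRESHOLD_C:     # real-time insight: motor overheating
        alarms += 1
    archive.append(reading)             # store everything for archiving

print(f"archived {len(archive)} samples, {alarms} real-time alarms")
```

The point of the sketch is the split Bradicich describes: the overheating check must happen while the data is in motion, while the trend analysis can wait until the data is at rest.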
“In test and measurement, we might debate with IT about whose data is bigger,” Bradicich says. “It’s not just size, but also velocity. When data leaves NI devices, it’s in motion. Then first it hits a switch, server or workstation. Now it is at rest in an IT server. Now the IT world takes over for analytics, then archiving. The question for us is, Where do customers want to derive insight? Maybe closer to the instrument, or maybe later at the desk. The four variables of data classically are volume, velocity, variety and value. We have added a fifth—visibility—for who needs to see and analyze results.”
Since NI is a measurement company, it has partnered with several companies to bring a complete Big Data solution to market. IBM has become a close partner, which is not surprising given that NI's senior vice president of R&D and Bradicich both came from IBM. The IBM product in question is InfoSphere Streams, part of the IBM Big Data platform. It processes vast amounts of streaming data in real time and lets user-developed applications quickly ingest, analyze and correlate information as it arrives from thousands of real-time sources, at throughput rates of up to millions of events or messages per second.
Terabytes of data
An NI partner, Phasor Measurement, has developed a solution to monitor the electric power grid. Bradicich says it can generate 5 TB of data per month. A wind turbine can generate 10 TB per day, and a jet engine 20 TB per hour. It's easy to see how such streaming data adds up quickly.
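Those figures translate into sustained data rates. A quick back-of-the-envelope calculation, assuming decimal terabytes and a 30-day month, shows the spread:

```python
TB = 10**12  # one terabyte in bytes (decimal convention, an assumption)

# Sustained rates in bytes per second for the three examples in the article
rates = {
    "power grid (5 TB/month)": 5 * TB / (30 * 86400),
    "wind turbine (10 TB/day)": 10 * TB / 86400,
    "jet engine (20 TB/hour)": 20 * TB / 3600,
}

for source, bytes_per_sec in rates.items():
    print(f"{source}: {bytes_per_sec / 10**6:.1f} MB/s sustained")
```

Roughly 2 MB/s for the grid monitor, about 116 MB/s for the turbine and over 5.5 GB/s for the jet engine; the last of these alone saturates conventional storage and network links, which is exactly the Big Analog Data problem.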
Duke Energy built a system to conquer the problem of monitoring and analyzing diagnostics across its "fossil fuel fleet" of generating plants. The old way sent condition monitoring specialists to each site with handheld data collection devices. The company figured that the specialists spent 80 percent of their time merely collecting data and only 20 percent actually analyzing it. With a Big Analog Data solution in place, predictive maintenance specialists in remote centers watch key signatures from equipment and note abnormalities. When necessary, they can compare these signatures to a fault signature database and take corrective action much more quickly.
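The comparison step might look like the following sketch: a measured signature matched against a small fault-signature database by nearest distance. The database entries, feature vectors and fault names are all invented for illustration; Duke Energy's actual signatures and matching method are not described in the article.

```python
import math

# Hypothetical fault-signature database: feature vectors derived from
# equipment measurements (e.g., vibration spectra), invented for this example
FAULT_SIGNATURES = {
    "bearing wear":    [0.9, 0.2, 0.1],
    "shaft imbalance": [0.1, 0.8, 0.3],
    "normal":          [0.1, 0.1, 0.1],
}

def euclidean(a, b):
    """Distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(signature):
    """Return the name of the closest known signature in the database."""
    return min(FAULT_SIGNATURES,
               key=lambda name: euclidean(signature, FAULT_SIGNATURES[name]))

measured = [0.85, 0.25, 0.15]   # illustrative measurement from a remote center
print(classify(measured))       # the closest match drives corrective action
```

A nearest-neighbor lookup is the simplest possible matcher; a production system would likely add confidence thresholds so that a signature far from every known fault is flagged for human review rather than force-fit to the nearest entry.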
When you delve into the guts of a buzzword, you sometimes find a solution to an intractable problem. So don't be turned off by the Big Data hype; see how you can use it to solve your major engineering problems.
Gary Mintchell, email@example.com, is Founding Editor of Automation World.