For example, an engineer may develop a real-time embedded control system at the same time as its human-machine interface. Maybe the system also has a computation-intensive task such as high-speed motion control or machine vision.
Traditional logic has been sequential—that is, a program executes one step at a time. But today’s increasingly complex systems often require several things to happen at the same time and then be brought together.
Jeff Meisel, LabVIEW product manager for multicore technologies at National Instruments Corp. (www.ni.com), the Austin, Texas-based instrumentation and control supplier, describes the complexities. “We’ve seen the silicon guys bringing out new technologies, from multicore processors to field programmable gate array (FPGA) chips. Another engineering complexity is the trend toward distributed processing. To get the most performance from this hardware, I have to think in a parallel way. Fundamentally, the process is to break up the problem into different tasks.”
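That task decomposition can be sketched in a few lines. The sketch below is a minimal illustration, not NI code: it assumes two hypothetical workloads, a control-loop update and a machine-vision step, and runs them concurrently before bringing the results back together.

```python
from concurrent.futures import ThreadPoolExecutor

def update_control_loop(setpoint):
    # Placeholder for a real-time control calculation.
    return setpoint * 0.5

def analyze_frame(pixels):
    # Placeholder for a computation-intensive vision task.
    return sum(pixels) / len(pixels)

def run_cycle(setpoint, pixels):
    # Break the problem into independent tasks, execute them
    # concurrently, then join the results.
    with ThreadPoolExecutor(max_workers=2) as pool:
        control = pool.submit(update_control_loop, setpoint)
        vision = pool.submit(analyze_frame, pixels)
        return control.result(), vision.result()

print(run_cycle(10.0, [1, 2, 3]))  # (5.0, 2.0)
```

The structure, not the arithmetic, is the point: each task is independent, so the runtime is free to schedule them on separate cores.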
Data revolution
Parallel programming is a topic discussed in “high technology” circles with bleedover to manufacturing. For example, Joe Hellerstein, a professor of Computer Science at the University of California, Berkeley, writing in the popular tech blog GigaOm (www.gigaom.com), states, “We’re now entering what I call the ‘Industrial Revolution of Data,’ where the majority of data will be stamped out by machines: software logs, cameras, microphones, RFID readers, wireless sensor networks and so on. These machines generate data a lot faster than people can, and their production rates will grow exponentially with Moore’s Law.
“Storing this data is cheap, and it can be mined for valuable information,” Hellerstein continues. “In this context, there is some good news for parallel programming. Data analysis software parallelizes fairly naturally. In fact, software written in SQL (Structured Query Language) has been running in parallel for more than 20 years. But with ‘Big Data’ now becoming a reality, more programmers are interested in building programs on the parallel model—and they often find SQL an unfamiliar and restrictive way to wrangle data and write code.”
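Hellerstein’s observation that data analysis “parallelizes fairly naturally” is easy to see in miniature. The sketch below is an illustrative example, not drawn from his work: an aggregate such as SQL’s SUM can be computed on partitions of the data independently and the partial results combined, the classic map/reduce shape.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    # Each worker aggregates only its own slice of the data.
    return sum(partition)

def parallel_sum(rows, workers=4):
    # Partition the rows, aggregate the partitions concurrently,
    # then combine the partial results.
    size = max(1, len(rows) // workers)
    partitions = [rows[i:i + size] for i in range(0, len(rows), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, partitions))

print(parallel_sum(list(range(1, 101))))  # 5050
```

Because no partition depends on another, the same query scales across cores or across networked machines with no change to its logic.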
Programming for parallel processing, as opposed to the traditional sequential manner, poses unique challenges. The online encyclopedia Wikipedia (http://en.wikipedia.org) notes, “Parallel computing…uses multiple processing elements simultaneously to solve a problem. This is accomplished by breaking the problem into independent parts so that each processing element can execute its part of the algorithm simultaneously with the others. The processing elements can be diverse and include resources such as a single computer with multiple processors, several networked computers, specialized hardware, or any combination of the above.”
National Instruments, whose LabVIEW graphical programming system is inherently parallel, according to co-founder and NI Fellow Jeff Kodosky, believes that dataflow programming models offer inherent advantages over languages such as C. A white paper at www.ni.com states, “With a dataflow model, nodes on a block diagram are connected to one another to express the logical execution flow, and they can be used to easily express parallelism. When a block diagram node receives all required inputs, it produces output data and passes that data to the next node in the dataflow path. The movement of data through the nodes determines the execution order of the functions on the block diagram.”
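The firing rule the white paper describes can be modeled in a few lines. The following is a simplified sequential sketch of a hypothetical dataflow graph, not LabVIEW itself: each node fires only once all of its inputs have arrived, so nodes with no shared inputs are free, in a real runtime, to execute on parallel workers.

```python
def run_dataflow(nodes, inputs):
    # nodes: mapping of name -> (function, list of input names).
    # Assumes the graph is acyclic and every dependency is satisfiable.
    values = dict(inputs)
    pending = dict(nodes)
    while pending:
        for name, (func, deps) in list(pending.items()):
            if all(dep in values for dep in deps):
                # All required inputs present: the node produces its
                # output and passes it downstream.
                values[name] = func(*(values[d] for d in deps))
                del pending[name]
    return values

# "scaled_a" and "scaled_b" share no inputs, so a parallel scheduler
# could run them simultaneously; "combine" waits on both.
graph = {
    "scaled_a": (lambda x: x * 2, ["a"]),
    "scaled_b": (lambda x: x + 1, ["b"]),
    "combine": (lambda x, y: x + y, ["scaled_a", "scaled_b"]),
}
print(run_dataflow(graph, {"a": 3, "b": 4})["combine"])  # 11
```

Notice that the programmer never specifies an execution order; the wiring of the graph determines it, which is why dataflow expresses parallelism without extra effort.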
While NI has a justifiable bias in this regard, the point stands: choosing a programming language with built-in support for parallelism offers a significant advantage to industrial control developers. They don’t need to be experts in the low-level intricacies of parallel hardware, such as multicore processors and FPGAs, in order to reap the performance benefits.
So whether you’re developing multi-processing control applications or massive database operations, parallel programming may be in your future.
Gary Mintchell, [email protected], is Automation World’s Editor in Chief.
National Instruments Corp.
www.ni.com