Bringing a steady stream of new and better products to the global marketplace is a balancing act. Just ask Hamid Khazaei. As product test director at Opnext Inc., it’s his job to ensure that the Fremont, Calif.-based company produces quality, high-speed optical subsystems for telecommunications and data networks. Introducing quality new products quickly is a continuing struggle for makers of electronics subsystems, where product lifecycles are measured in months, rather than years.
Like many of his counterparts at other companies, Khazaei worked with his colleagues to enhance this balance through a new generation of automated test and measurement tools. Not only are these increasingly powerful tools improving product quality throughout the supply chain, but their flexibility is also helping manufacturers bring new products to market faster.
At Opnext, Khazaei and his colleagues exploited the power of some of these tools while designing and preparing to build a new 40-Gbps optical subsystem for transferring digital data. To accelerate the new product introduction without sacrificing quality, they approached Montreal-based Averna Technologies Inc. (www.averna.com) for an automated platform that would perform the required functional testing. “We asked them to propose a series of test stations and a software infrastructure to connect, monitor and manage the stations,” says Khazaei.
Because Averna’s staff would be participating in a new product introduction, their challenge was to fit into Opnext’s current engineering schedule, offering advice as needed, testing prototypes, and delivering the final stations. On-time delivery required that they begin their work before the design of the dense wavelength division multiplexing (DWDM) module was completed.
“Typically, in new product introductions, the design from R&D is never 100 percent complete,” notes Jacques Routhier, business development manager at Averna. “It always needs to be tweaked.” Because these tweaks often creep beyond the time allotted in the original schedule, outside partners must offer technology that is flexible enough to accommodate these changes.
Working with Opnext’s engineering team, the specialists at Averna designed and built three kinds of automated test stations for Opnext’s high-volume manufacturing and assembly environment. The first checks printed circuit boards to make sure that they work, before they are joined to optical components to produce the DWDM module. “You don’t want to risk integrating a defective circuit board with very expensive optics and having to scrap the assembly,” says Routhier. Once the module is built, the other two test stations conduct a series of performance, calibration and functional tests.
Because these test stations would eventually be installed worldwide in a variety of manufacturing facilities belonging to Opnext and its contract manufacturers, Averna linked them together with its Proligent Enterprise Test Software. The software holds all of Opnext’s product specifications, station configurations, reports and deployment histories in a central environment, which gives Opnext visibility into all manufacturing sites, control over the tests, and traceability.
For Opnext and other global manufacturers, this kind of software provides the necessary cohesion over quality control. “Not only is quality control being integrated more into manufacturing, but it is also being moved increasingly upstream to the supplier’s locations,” notes Routhier. “These manufacturers may be exchanging information with multiple design centers and contract manufacturing partners spread globally.” Deploying a common test platform among all players in what he calls “the ecosystem” gives them almost real-time visibility over the entire supply chain.
The current generation of software platforms like Proligent is not the only technology that has enhanced the power of test and measurement tools over the last several years. Other technical developments have made contributions, too. One is field programmable gate arrays (FPGAs). “These electronic systems permit users to reprogram processor-level types of activities very quickly, making it a very flexible type of processor technology that can be adapted and reconfigured depending on the application,” says Routhier. Averna has a technical center in Atlanta for FPGA programming.
Other reasons for the greater power in test and measurement tools have been ever more capable Intel microprocessors and the advent of high-throughput, highly time-deterministic fieldbuses. Because of these developments, some are rethinking their view of automation technology, expanding it beyond just programmable logic controllers (PLCs), motion control and human-machine interfaces (HMIs) to include test and measurement as well.
>> Quality in the Manufacturing Process. Read how AV&R Vision and Robotics created an automated, industrial system to inspect and deburr high-precision turbine airfoils. Visit http://bit.ly/awapp008
“Processing power together with fast, deterministic access to field inputs and outputs has allowed unprecedented access to real-time information,” explains Robert Trask, PE, senior electrical engineer at Beckhoff Automation LLC (www.beckhoff.com/usa) in Burnsville, Minn. “You can access, process and write outputs in a highly cyclic way.”
Trask points to the EtherCAT I/O and TwinCAT software that his company offers. EtherCAT, combined with the power of today’s PC-based controllers and automation software, enables all kinds of technological breakthroughs in test and measurement, he says. Among them is the use of multicore processors—microcomputer chips that contain more than one processor, or core, to distribute computing tasks. For example, one core might run the basic control, and others might handle measurements, the HMI, and so on.
“This balancing of tasks across processor cores is what allows adding more functionality to a single IPC [industrial PC],” notes Trask. Having only one controller powerful enough to do all the work simplifies the system, making it cheaper to install and maintain.
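The task-balancing idea Trask describes can be sketched in plain Python. This is not Beckhoff's TwinCAT API, just a minimal illustration of farming separate automation tasks (a control loop and a measurement task, both placeholders) out to different worker processes so they can run on different cores:

```python
# Sketch of distributing automation tasks across CPU cores on one
# machine. Task names, arguments and workloads are illustrative
# stand-ins, not any real automation vendor's API.
import multiprocessing as mp

def control_loop(cycles):
    # Placeholder for the deterministic machine-control task.
    return ("control", cycles)

def measurement_task(samples):
    # Placeholder for high-rate data acquisition and filtering.
    return ("measurement", sum(samples) / len(samples))

def run_on_cores():
    # Two worker processes, which the OS can schedule on separate cores.
    with mp.Pool(processes=2) as pool:
        ctrl = pool.apply_async(control_loop, (1000,))
        meas = pool.apply_async(measurement_task, ([1.0, 2.0, 3.0],))
        return ctrl.get(), meas.get()

if __name__ == "__main__":
    print(run_on_cores())
```

In a real IPC deployment, the runtime pins tasks to cores and enforces cycle times; the sketch only shows the division-of-labor principle.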
Technical developments such as these were necessary before this integration could take place, because test and measurement not only require higher precision from industrial devices but also generate huge amounts of data. With TwinCAT, EtherCAT, the right IPC and high-performance I/O hardware, “we can handle test and measurement and run the algorithms necessary to gather all parameters that measure and track the health of a system,” says Trask.
The availability of more computing power has also led to greater use of simulation in functional testing of subsystems. Not only are many more products assembled from such subsystems now than there used to be, but they also tend to be designed to communicate with high-powered processors. Consequently, “simulators can test hydraulic actuators, electronic circuit boards and other devices by making them think that they are sitting in the actual end product,” says Jim Campbell, president of Viewpoint Systems Inc. (www.viewpointusa.com) in Rochester, N.Y. “Aircraft actuators, for example, need to think that they are on an airplane.”
Besides using this technology to check the function of moving parts, Campbell’s engineers were also able to validate that the algorithms in a Gatling gun controller for jet fighters could fire rounds and detect faults. Rather than connecting the controller to a real gun and firing rounds, the engineers connected it to a simulator, so that the controller thought it was actually firing the gun. “They could change the performance of the gun in software much more rapidly than designing a whole new gun,” Campbell says. “More importantly, they could cause faults that are hard to do with a real Gatling gun.”
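The hardware-in-the-loop pattern Campbell describes can be sketched as a controller that talks to a software model of the gun through the same interface the real hardware would present, so faults can be injected on demand. All class, method and fault names here are hypothetical:

```python
# Hedged sketch of hardware-in-the-loop testing: the controller under
# test drives a simulated gun instead of real hardware. Names invented
# for illustration; not Viewpoint Systems' actual implementation.
class GunSimulator:
    def __init__(self):
        self.rounds_fired = 0
        self.fault = None

    def inject_fault(self, fault):
        # Faults can be triggered in software at any time --
        # the key advantage over testing against a real gun.
        self.fault = fault

    def fire(self):
        if self.fault == "jam":
            return "fault: jam"
        self.rounds_fired += 1
        return "fired"

class GunController:
    def __init__(self, gun):
        self.gun = gun  # real hardware or a simulator, same interface

    def fire_burst(self, n):
        # The fault-detection path is the logic being validated.
        results = []
        for _ in range(n):
            status = self.gun.fire()
            results.append(status)
            if status.startswith("fault"):
                break  # stop firing as soon as a fault is detected
        return results
```

Because the controller only sees the shared interface, the same control code can later be pointed at real hardware unchanged.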
Automated test and measurement tools like these are reinforcing the continuing trend to move quality control out of the laboratory and integrate it into the manufacturing process itself. They also can be implemented in stages to weed out bad components as early as possible, rather than throwing good money after bad by investing further resources in assemblies destined for scrap.
This level of test and measurement is already a reality, according to Trask at Beckhoff Automation. He points to a customer that builds fuel injectors for diesel trucks. Because a fuel injector is a very precise mechanism, automation at each stage verifies that the previous task has been completed successfully. Verification on this production line sometimes occurs by means of a vision system, and other times with a functional check.
The automation rejects any assembly that fails a test at any stage. This avoids situations where defective parts are discovered much later in the process and the entire part has to be discarded after all the components are put in place, notes Trask. Not wasting resources on bad assemblies can lead to significant savings, especially in mass production or for expensive workpieces.
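The stage-by-stage verification described above reduces to a simple rule: run each check in order and reject at the earliest failure, before any more value is added to a bad part. A minimal sketch, with invented stage names standing in for real vision systems and functional checks:

```python
# Sketch of early-reject staged quality control: each station verifies
# the previous step; an assembly that fails any check is rejected
# immediately instead of being completed and scrapped later.
def run_line(assembly, stages):
    """stages: list of (name, check_fn) pairs, in production order.
    Returns (passed, failed_stage)."""
    for name, check in stages:
        if not check(assembly):
            return False, name  # reject now, before adding more parts
    return True, None

# Hypothetical two-stage line: test the circuit board before mating
# it with the expensive optics, then test the finished module.
stages = [
    ("board_test", lambda a: a["board_ok"]),
    ("module_test", lambda a: a["module_ok"]),
]
```

The ordering matters: putting the cheap check first is what keeps a defective board from ever meeting the costly optics.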
The challenges are different for continuous processes. For them, the strategy is usually to detect minute changes in the process and to either make the appropriate adjustments or flag a bad section. As an example, Trask offers a continuous process that produces optical fibers. If the fiber goes out of tolerance, spooling stops, and the fiber is rerouted until the process returns to specification. Only then does spooling resume.
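The fiber example amounts to a tolerance gate on a continuous measurement stream: spool while the reading is in specification, divert while it is not. A minimal sketch, with the nominal diameter and tolerance chosen only for illustration:

```python
# Sketch of in-line tolerance monitoring for a continuous process,
# such as fiber drawing. Nominal value and tolerance band are
# illustrative, not real product specifications.
def spooling_states(diameters, nominal=125.0, tol=1.0):
    """Return 'spool' or 'reroute' for each diameter sample (microns)."""
    states = []
    for d in diameters:
        if abs(d - nominal) <= tol:
            states.append("spool")     # within tolerance: keep spooling
        else:
            states.append("reroute")   # out of spec: divert the fiber
    return states
```

A production system would add hysteresis and rate limits so a single noisy sample does not toggle the spooler, but the accept/divert decision is the core of it.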
“It’s always cumbersome and expensive to stop a continuous process,” says Trask. “So, we always strive to avoid disruptions by making good decisions on trusted information.” He believes that this requires access to information that is both accurate and timely, and the ability to base responses upon measured deterministic data.
Although most original equipment manufacturers (OEMs) admit that quality control has a huge impact upon the bottom line, few have the near real-time visibility into their quality metrics necessary for generating those savings. In fact, surveys reveal that only about 10 percent of manufacturers have this visibility, reports Julie Fraser, an independent analyst and advisor based in southeastern Massachusetts. More than half need longer than a shift to collect and disseminate quality data to the people who need it both internally and throughout their supply chains.
Why? The answer is people. “There are still a lot of places where data may not be collected on a clipboard anymore, but some human being is still entering information into a spreadsheet or other electronic repository,” explains Fraser. “Studies show that human error in data collection is a foundational issue that companies need to address. Your data is not going to be clean if you are touching it.”
Her recommendation is to use the automated data-collection features that come with most controllers, when that is feasible, and to seek other automated or semi-automated data collection methods when it’s not.