Real-Time Analytics for Batch Processes

Nov. 18, 2014
Batch processes have been difficult to control and analyze because each batch is unique. But batch analytics software can help.

Companies that use batch manufacturing processes for specialty chemicals, food, pharmaceuticals, biotech products and other products can use data that are already being collected to make real-time decisions and take appropriate actions. Real-time batch analytics can help companies gain a better understanding of their processes, minimize variation, and identify where to improve the process while it is still running.

Historically, batch processes have been difficult to control and analyze because each batch is unique: batches are not the same length; time lags differ; raw materials can differ; and there are often differences in equipment, operating conditions and process activities. Advanced batch controls can be complex. Understanding how these variations impact batch quality while the batch is running can provide enormous benefits.

Manufacturers use batch analytics software to compare batches to help uncover potential problems in real time. One analytics solution enables users to compare ideal, “golden” batches to other batches to better understand how variables affect the current batch. It uses a technique called dynamic time warping (DTW) to align the data, compare parameters across many batches, and accommodate variable timing differences. The technology aligns the data accurately between batches and matches parameters from historical batch data to the variations found in live batches. The data analytics software can be used to determine how the batch is progressing and predict whether a batch will meet specifications or have to be reworked, modified or even discarded.
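To illustrate the alignment idea, the sketch below implements a bare-bones DTW routine in Python for a single measured profile (say, temperature) and warps a slower-running live batch onto a golden batch's timeline. It is only a conceptual sketch with synthetic data, not the supplier's implementation, and the names golden and live are assumptions for the example.

# Minimal dynamic time warping (DTW) sketch in plain NumPy. Illustrates the
# alignment idea only; not the vendor's implementation.
import numpy as np

def dtw_align(golden, live):
    """Return the DTW cost matrix and the warping path that maps each
    sample of the live batch onto the golden-batch timeline."""
    n, m = len(golden), len(live)
    acc = np.full((n + 1, m + 1), np.inf)   # accumulated cost, inf boundary
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(golden[i - 1] - live[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j],      # stretch the live batch
                                   acc[i, j - 1],      # stretch the golden batch
                                   acc[i - 1, j - 1])  # advance both together
    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return acc[1:, 1:], path[::-1]

# Example: a "golden" temperature profile and a slower, noisier live batch.
golden = np.sin(np.linspace(0, 3, 100))
live = np.sin(np.linspace(0, 3, 130)) + 0.05 * np.random.randn(130)
cost, path = dtw_align(golden, live)
print("alignment cost:", cost[-1, -1], "path length:", len(path))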

Significantly, the batch analytics software expands the range of processes that can take advantage of advanced process control. The solution can be used to interpret data to help optimize the process in real time. The software compares batch trajectories across different batches and relates them to other parameters and variables. The primary multivariate methods employed are principal component analysis (PCA) and projection to latent structures (PLS). PCA provides a concise overview of a data set; it recognizes patterns, including outliers, trends, groups and relationships, and helps detect abnormal operation. PLS establishes relationships between input and output variables and develops predictive models of a process for quality predictions.
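The division of labor between the two methods can be sketched with scikit-learn on batch-wise unfolded data (one row per batch, one column per aligned time/variable point). The data below are synthetic and the shapes are assumptions; the snippet only illustrates the statistical techniques named above, not the analytics product itself.

# Illustrative PCA and PLS on synthetic, batch-wise unfolded data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))            # 40 historical batches, 200 aligned features
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=40)   # quality attribute

# PCA: a concise overview of the batch data set; scores far from the origin
# flag outlier batches and abnormal operation.
pca = PCA(n_components=3).fit(X)
scores = pca.transform(X)
print("explained variance:", pca.explained_variance_ratio_)

# PLS: relate input trajectories to measured quality for prediction.
pls = PLSRegression(n_components=3).fit(X, y)
y_hat = pls.predict(X).ravel()
print("first three predictions vs. actual:", y_hat[:3].round(2), y[:3].round(2))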

In addition, the included model predictive multivariable analysis (MVA) software enables users to adjust batch trajectories and predictions for control using a comparative model. MVA helps the engineer look at the batch parameters holistically, identify interactions among the variables, and uncover what is contributing to a particular condition. Real-time analytics can help determine how all the variables affect the batch. By drilling down on individual parameters, an engineer can determine if something is out of range or “not quite right,” make decisions about the process, and take appropriate actions. The software can also help predict when problems are beginning to develop so that corrective measures can be taken.
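One simple way to picture the drill-down step is to compare each aligned parameter of the running batch against bands derived from known-good batches and flag anything out of range. The sketch below does exactly that with synthetic data; the parameter names and the three-sigma threshold are illustrative assumptions, not settings from the software.

# Sketch of the "drill down" step: flag parameters that fall outside bands
# derived from good historical batches. Data and thresholds are illustrative.
import numpy as np

def flag_out_of_range(good_batches, current, names, n_sigma=3.0):
    """good_batches: (batches x parameters) aligned values from good runs.
    current: parameter vector for the running batch."""
    mean = good_batches.mean(axis=0)
    std = good_batches.std(axis=0) + 1e-9          # avoid divide-by-zero
    z = (current - mean) / std
    return [(name, float(score)) for name, score in zip(names, z)
            if abs(score) > n_sigma]

rng = np.random.default_rng(1)
names = ["temperature", "pressure", "flow", "pH", "agitator_speed"]
good = rng.normal(loc=[70, 2.0, 15, 5.4, 120],
                  scale=[1, 0.05, 0.5, 0.1, 3], size=(30, 5))
current = np.array([70.4, 2.02, 15.1, 6.2, 119])   # pH is drifting high
print(flag_out_of_range(good, current, names))     # -> [('pH', ...)]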

The analysis can examine conditions and measurements that impact product quality. By visualizing the data, the engineer can determine whether a batch should be used for model generation. It is critical to compare data from multiple batches, with their parameters aligned, to determine how a batch is doing.

During data extraction and model building, data for the selected batches are automatically aligned with the correct parameters using DTW. By generating models and using the DTW screens, the manufacturer can determine whether a parameter differs from batch to batch. In the past it was not easy to access, visualize and compare data, or to generate models that compare parameters on the fly; this technology makes doing so far easier.
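A rough sketch of that extraction step appears below: each selected historical batch is warped onto the golden batch's timeline and the aligned trajectories are stacked into a single training matrix. The dtw_align argument stands in for an alignment routine like the one sketched earlier and is an assumption, not the product's API.

# Sketch of the data-extraction step: align selected batches to a reference
# ("golden") batch and stack the aligned trajectories for model building.
import numpy as np

def build_training_matrix(batches, golden, dtw_align):
    """batches: list of 1-D trajectories of varying length.
    Returns an array with one row per batch, resampled onto the golden
    batch's timeline via the DTW warping path."""
    rows = []
    for traj in batches:
        _, path = dtw_align(golden, traj)
        aligned = np.full(len(golden), np.nan)
        for g_idx, t_idx in path:
            aligned[g_idx] = traj[t_idx]           # last match per index wins
        for k in range(1, len(aligned)):           # fill any gaps forward
            if np.isnan(aligned[k]):
                aligned[k] = aligned[k - 1]
        rows.append(aligned)
    return np.vstack(rows)

# Usage (assuming dtw_align from the earlier sketch):
# X = build_training_matrix(historical_trajectories, golden_profile, dtw_align)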

The included model-building application tools enable workers who are familiar with their process to step through model building. Users can select which batches should be used to generate models, compare the results with other models, and check predictions. Lab analysis data can be used to validate the models and determine when a model is working well. The analysis can also help determine what is working well in the process and what needs to be improved.
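As an illustration of the validation step, the sketch below holds out some batches, fits a PLS quality model on the rest, and scores the predictions against (synthetic) lab results. The array shapes, the split, and the metrics are assumptions chosen for the example, not the product's workflow.

# Sketch of validating a quality-prediction model against lab analysis data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 150))                     # aligned batch trajectories
y_lab = X[:, :10].mean(axis=1) + rng.normal(scale=0.05, size=60)  # lab assays

X_train, X_test, y_train, y_test = train_test_split(
    X, y_lab, test_size=0.25, random_state=0)
model = PLSRegression(n_components=4).fit(X_train, y_train)
y_pred = model.predict(X_test).ravel()

print("R^2 vs. lab data:", r2_score(y_test, y_pred))
print("RMSE:", mean_squared_error(y_test, y_pred) ** 0.5)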

Applying batch analytics in brewing

A major brewer is using batch analytics software as part of a beta trial to identify process problems. According to one of its engineers, the brewing company used the software to model its Briggs lauter tun, a unit that separates the extracted wort (the sugar-rich liquid drawn from the grain) from the spent grain, to identify the critical quality parameters during production runs.

The brewing company runs 60-80 batches a week on this tun, and the company loses money if it deviates from the standard operating procedures. It chose this unit for the analysis because it was already collecting a lot of data on it.

The batch analytics software is used to build a model of the batch process or unit, and that model executes alongside the running batch. The models aid in predicting quality parameters, identifying variables that are affecting the process, and detecting faults early in the process. The model built here compares the running batch against historical batches and lets users drill down on individual parameters and compare them with other batches to determine if something is out of range or otherwise not right. Using the model's advanced statistics, the company determined that a steam pressure solenoid was plugged.
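Conceptually, running a comparison model alongside a live batch might look like the sketch below: each aligned sample from the running batch is projected onto a PCA model built from good batches, and a fault flag is raised when the squared prediction error (SPE) exceeds a limit. The simple mean-plus-three-sigma limit, the synthetic data, and the stuck steam-pressure example are illustrative assumptions, not the brewer's actual configuration.

# Sketch of online fault detection with a PCA model and an SPE limit.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
good_batches = rng.normal(size=(50, 12))           # 50 good batches, 12 parameters
pca = PCA(n_components=3).fit(good_batches)

# A simplified SPE limit: mean + 3 standard deviations over the training data.
recon = pca.inverse_transform(pca.transform(good_batches))
spe_train = ((good_batches - recon) ** 2).sum(axis=1)
spe_limit = spe_train.mean() + 3 * spe_train.std()

def check_sample(sample):
    """Return (spe, is_fault) for one aligned sample vector from the live batch."""
    r = pca.inverse_transform(pca.transform(sample.reshape(1, -1)))
    spe = float(((sample - r.ravel()) ** 2).sum())
    return spe, spe > spe_limit

normal = rng.normal(size=12)
faulty = normal.copy()
faulty[4] += 6.0                                   # e.g., a stuck steam-pressure signal
print(check_sample(normal))
print(check_sample(faulty))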

“Creating batch process models can be particularly challenging for batch applications because of the inherent time variability from batch to batch,” the plant engineer says. “Batch lengths vary because of equipment, operating conditions, faults in one stage of the batch, time lags, and raw material variations. The analytics can be used to compare the current batch against what we consider to be a good batch to find the cause of a problem.”

The multivariate analysis built into the model showed the parameter outliers and helped identify potential parameters that might be an issue. The company was also able to use the DTW feature, which overlays different batches and matches the parameters, to identify abnormal conditions with its pH meters. The company corrected the problem, improving efficiency, and is now using the technology to identify other issues.
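A contribution-style drill-down of the kind described here can be sketched by ranking each parameter's squared residual against the multivariate model once a sample trips the fault limit; in this synthetic example a drifting pH reading dominates the list. The parameter names and data are assumptions for illustration, not the brewer's model.

# Sketch of ranking per-parameter residual contributions to find the
# offending variable (here, a misbehaving pH measurement).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
names = ["temp", "steam_pressure", "flow", "pH", "level", "turbidity"]
good = rng.normal(size=(60, 6))
pca = PCA(n_components=2).fit(good)

sample = rng.normal(size=6)
sample[3] += 5.0                                   # pH reading drifts badly

residual = sample - pca.inverse_transform(pca.transform(sample.reshape(1, -1))).ravel()
contrib = residual ** 2
for name, c in sorted(zip(names, contrib), key=lambda t: -t[1]):
    print(f"{name:15s} {c:8.3f}")                  # pH should top the list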

>>Janice Abel, [email protected], is principal consultant at ARC Advisory Group.
