Danaca Jordan’s Question
What is the technical basis and capability of technologies other than PID and model predictive control (MPC)? These technologies seem fascinating, and I would like to know more, particularly as I study for the ISA Certified Automation Professional (CAP) exam.
Greg McMillan’s Answer
Michel Ruel has achieved considerable success in the use of fuzzy logic control (FLC) in mineral processing as documented in Ruel’s Rules for Use of PID, MPC and FLC. The process interrelationships and dynamics in the processing of ores are not defined due to the predominance of missing measurements and unknown effects. Mineral processing PID loops are often in manual, not only for the usual reasons of valve and measurement problems, but also because process dynamics between a controlled and manipulated variable radically change, including even the sign of the process action (reverse or direct) based on complex multivariable effects that can’t be quantified.
If the FLC configuration and interface are set up properly for visibility, understandability and adjustability of the rules, the plant can change the rules as needed, enabling sustainable benefits. In the application cited by Michel Ruel, metallurgists validate the rules every week, make slight adjustments, and work with control engineers on further refinements. A production record was achieved in the first week. The average use of energy per ton decreased by 8 percent, and the tonnage per day increased by 14 percent.
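To make the idea of visible, adjustable rules concrete, here is a minimal sketch of a Mamdani-style fuzzy rule set with triangular membership functions and weighted-average defuzzification. The variables (mill power error, feed-rate adjustment) and all breakpoints are hypothetical illustrations, not taken from the plant Michel Ruel describes.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_feed_adjust(power_error):
    """Map a (hypothetical) mill power error in kW to a feed-rate change in t/h.

    The rule base is deliberately small and readable, so a metallurgist
    could validate or adjust it, as the text recommends:
      IF error is negative THEN increase feed
      IF error is zero     THEN hold feed
      IF error is positive THEN decrease feed
    """
    # Firing strength of each rule from its membership function
    neg  = tri(power_error, -200.0, -100.0, 0.0)
    zero = tri(power_error, -100.0, 0.0, 100.0)
    pos  = tri(power_error, 0.0, 100.0, 200.0)
    strengths = [neg, zero, pos]
    outputs = [+5.0, 0.0, -5.0]  # singleton feed changes, t/h (assumed values)
    total = sum(strengths)
    if total == 0.0:
        return 0.0
    # Weighted average (centroid of singletons) defuzzification
    return sum(s * o for s, o in zip(strengths, outputs)) / total
```

Because each rule is a single readable line with one tunable output value, it is easy to see which rule fired and which number to change, addressing the maintainability concern raised later in this column.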
There have been successful applications of PID and MPC in the mining industry as detailed in the Control Talk columns Process control challenges and solutions in mineral processing and Smart measurement and control in mineral processing.
I have successfully used FLC on a waste treatment pH system to prevent RCRA violations at a Pensacola, Fla. plant because of my initial excitement about the technology. It did very well for decades but the plant was afraid to touch it. The Control magazine article Virtual Control of Real pH with Mark Sowell showed how you could replace the FLC with an MPC and PID strategy that could be better maintained, tuned and optimized.
We used FLC integrated into the software of a major supplier of expert systems in the 1980s and 1990s, but there were no real success stories for FLC. There was one successful application of an expert system for a smart level alarm, but it did not use FLC, and a simple material balance could have done as well. Several applications for smart alarms were turned off. After nearly 100 man-years, we had not much at all to show for these expert systems. You could add a lot of rules for FLC and logic based on the expertise of the application's developer, but how these rules played together, and how you could tell which rule needed to be changed, was a major problem. When the developer left the production unit, operators and process engineers were not able to make the changes that were inevitably needed.
The standalone field FLC advertised for better temperature setpoint response cannot do better than a well-tuned PID if you use all of the PID options summarized in the Control magazine article The greatest source of process control knowledge, including PID structures such as two degrees of freedom (2DOF) or a setpoint lead-lag. You can also use gain scheduling in the PID if necessary. The problem with FLC is how you tune it and update it for changing process conditions. I wrote the original section on FLC in A Guide to the Automation Body of Knowledge, but the next edition will omit it by mutual agreement between ISA and me, since making more room to help get the most out of your PID was judged more generally useful.
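A minimal sketch of the 2DOF structure mentioned above, assuming the common arrangement in which a setpoint weight (beta) softens the proportional kick and the derivative acts on the measurement only. Tuning values and the discrete form are illustrative, not a vendor implementation.

```python
class TwoDOFPID:
    """Discrete 2DOF PID sketch: P on weighted setpoint, I on full error,
    D on the process variable only (so a setpoint step causes no kick)."""

    def __init__(self, kc, ti, td, beta, dt):
        self.kc, self.ti, self.td = kc, ti, td  # gain, reset time, rate time
        self.beta = beta                        # 0..1 setpoint weight on P term
        self.dt = dt                            # execution period
        self.integral = 0.0
        self.prev_pv = None

    def update(self, sp, pv):
        # Proportional on the weighted setpoint error
        p = self.kc * (self.beta * sp - pv)
        # Integral always acts on the full error, so there is no offset
        self.integral += self.kc * (sp - pv) * self.dt / self.ti
        # Derivative on measurement only
        d = 0.0
        if self.prev_pv is not None:
            d = -self.kc * self.td * (pv - self.prev_pv) / self.dt
        self.prev_pv = pv
        return p + self.integral + d
```

With beta below 1, a setpoint change produces a smaller initial output step while load response is unchanged, which is the gentler setpoint behavior the standalone FLC products advertise.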
FLC has been used in pulp and paper. I remember instances of FLC for kiln control but since then we have developed much better PID and MPC strategies that eliminate interaction and tuning problems.
As for artificial neural networks (ANN), I have seen some successful applications in batch end point detection and prediction and in inferential dryer moisture control. For continuous operations, time delays must be inserted on the inputs so that they coincide with the measured output. For plug flow operations like dryers, this can be readily done since the deadtime is simply the volume divided by the flow rate. For continuous vessels and columns, the insertion of very large lag times and possibly a small lead time is needed besides the dead time. No dynamic compensation is needed for batch operation end point prediction.
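The plug-flow alignment described above can be sketched as a simple sample shift, assuming deadtime = volume / flow rate as the text states. The dryer numbers are made up for illustration.

```python
def align_input(series, volume, flow, dt):
    """Shift an input series by the plug-flow deadtime so each sample
    lines up with the output it actually influenced.

    deadtime = volume / flow (e.g. ft3 / (ft3/min) = min), converted to
    a whole number of sample periods dt.
    """
    deadtime = volume / flow
    shift = int(round(deadtime / dt))
    # Pad the front with the first value; drop the newest samples.
    return [series[0]] * shift + series[:len(series) - shift]

# Hypothetical dryer inlet moisture trend, 1-minute samples
moisture_in = [5.0, 5.2, 5.4, 5.6, 5.8]
aligned = align_input(moisture_in, volume=20.0, flow=10.0, dt=1.0)
# A 20/10 = 2-minute deadtime shifts the series right by two samples.
```

For well-mixed vessels and columns, a first-order (or second-order) lag filter would be applied to each input in addition to this delay, per the guidance above.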
You have to be very careful not to go outside of the test data range because of bizarre nonlinear predictions. You can also get local reversals of the process gain sign, causing buzzing if the predicted variable is used for closed loop control. Finally, you need to eliminate correlations between inputs. I prefer multivariate statistical process control (MSPC), which eliminates cross correlation of inputs by virtue of principal component analysis and does not exhibit process gain sign reversals or bizarre nonlinearity upon extrapolation outside of the test data range. Also, MSPC can provide a piecewise linear fit to nonlinear batch profiles, a technique we commonly apply with signal characterizers for any nonlinearity. I think there is an opportunity for MSPC to provide more intelligent and linear variables for an MPC, as we do with signal characterizers.
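A signal characterizer of the kind mentioned above is just a piecewise-linear breakpoint table. Here is a minimal sketch using linear interpolation; the breakpoint values are illustrative, not from a real titration or batch profile.

```python
import numpy as np

# Hypothetical breakpoint table: raw measurement in, linearized signal (%) out.
x_points = [0.0, 2.0, 4.0, 6.0, 8.0]       # raw measurement
y_points = [0.0, 10.0, 30.0, 70.0, 100.0]  # linearized signal, %

def characterize(x):
    """Pass the raw signal through the piecewise-linear breakpoint table.

    np.interp clamps to the end segments outside the table, so extrapolation
    stays linear with no sign reversals or humps, unlike an overfit ANN.
    """
    return float(np.interp(x, x_points, y_points))
```

Each segment has a single, monotonic slope, which is why a characterized variable cannot produce the local gain-sign reversals and buzzing described for ANN predictions.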
Join the ISA Mentor Program
The ISA Mentor Program enables young professionals to access the wisdom and expertise of seasoned ISA members, and offers veteran ISA professionals the chance to share their wisdom and make a difference in someone’s career. Click this link to learn more about how you can join the ISA Mentor Program.
Hunter Vegas’ Answer:
I implemented a neural network some years ago on a distillation column level control. The column was notoriously difficult to control. The level would swing all over and anything would set it off, such as weather or feed changes. The operators had to run it in manual because automatic was a hopeless waste of time.
At the time (and this information might be dated) the neural network was created by bringing a stack of parameters into the calculation and “training” it on the data. Theoretically the calculation would strengthen the parameters that mattered, weaken the parameters that didn’t, and eventually configure itself to learn the system.
The process taught me much. Here are my main learning points:
1) Choose the training data wisely. If you give it straight line data then it learns straight lines. You need to teach it using upset data so it learns what to do when things go wrong. (Then use new upset data to test it.)
2) Choose the input parameters wisely. I started by giving it everything. Over time I came to realize that the data it needed wasn’t the obvious. In this case it needed:
- The level valve output (not a surprise).
- The incoming flow (again, not a surprise).
- The pressure control valve position (This was a surprise. I figured it wanted pressure, but the control valve kept the pressure very flat. However as the control valve moved around to maintain the pressure, the level swung so knowing the valve position helped the level controller).
- The temperature valve position (same idea as pressure).
- Sometimes the derivative (rate of change) of a parameter is much more important than the parameter itself.
3) Ultimately the system worked very well – but honestly by the time I had gone through four iterations of training and building the system I KNEW the physics behind it. The calculation for controlling the level was fairly simple when all was said and done. I probably could have just fed it into a feedforward PID and accomplished the same thing.
The experience was interesting and fun, and I actually got an award from ISA for the work. However, when all was said and done, I realized it wasn’t nearly as impressive a tool as all the marketing brochures suggested. (At the time it was all the rage – companies were selling neural network controller packages and magazine articles were predicting it would replace PID in a matter of years.)
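Hunter's point that a feedforward PID could have accomplished the same thing can be sketched as static feedforward terms summed into the level controller output. The inputs mirror the ones his network settled on (feed flow and the pressure- and temperature-valve positions); the gains are assumed values for illustration, not from the actual column.

```python
def level_controller_output(pi_output, feed_flow, press_valve, temp_valve):
    """Combine PI feedback with static feedforward terms.

    The feedforward gains below are hypothetical tuning values; in practice
    they would be identified from step-test or historical data.
    """
    K_FEED, K_PRESS, K_TEMP = 0.8, -0.3, -0.2  # assumed feedforward gains
    feedforward = (K_FEED * feed_flow
                   + K_PRESS * press_valve
                   + K_TEMP * temp_valve)
    return pi_output + feedforward
```

Dynamic compensation (a deadtime block and lead-lag on each feedforward input) would normally be added so the corrections arrive at the level at the same time as the disturbances they cancel.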
Danaca Jordan’s Subsequent Question:
Thank you, this is a lot more practical insight than I have been able to glean from the books.
I imagine the batch data analytics program offered by a major supplier of control systems is an example of the MSPC you mentioned. I think I have some papers on it stashed somewhere, since we have considered using it for some of our batch systems. What is batch data analytics and what can it do?
Greg McMillan’s Answer:
Yes, batch data analytics uses MSPC technology with some additional features, such as dynamic time warping. The supplier of the control system software worked with Lubrizol’s technology manager Robert Wojewodka to develop and improve the product for batch processes, as highlighted in the Control magazine article Data Analytics in Batch Operations. Data analytics eliminates relationships between process inputs (cross correlations) and reduces the number of process inputs by constructing principal components that are orthogonal and thus independent of each other in a plot of a process output versus the principal components. For two principal components, this is readily seen as an X, Y and Z plot with each axis at a 90-degree angle to the others. The X and Y axes cover the ranges of the principal component scores, and the Z axis is the process output. The user can drill down into each principal component to see the contribution of each process input. The use of graphics to show this can greatly increase operator understanding. Data analytics excels at identifying unsuspected relationships. For process conditions outside of the data range used in developing the empirical models, linear extrapolation helps prevent bizarre extraneous predictions. Also, the use of a piecewise linear fit means there are no humps or bumps that cause a local reversal of process gain and buzzing.
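The core principal-component idea can be demonstrated in a few lines: two strongly correlated process inputs are replaced by orthogonal score vectors, eliminating the cross correlation. The data below is synthetic, purely to illustrate the mechanics, and this is only the PCA step, not the full batch analytics product.

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(size=(50, 1))
# Two heavily cross-correlated "process inputs" (second is ~2x the first)
X = np.hstack([t, 2.0 * t + 0.01 * rng.normal(size=(50, 1))])

Xc = X - X.mean(axis=0)                 # mean-center each input
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # principal-component scores

# The score columns are orthogonal: the cross correlation the raw
# inputs carried has been eliminated by construction.
cross = float(scores[:, 0] @ scores[:, 1])
```

Plotting a process output on the Z axis against the first two score columns on X and Y gives exactly the 90-degree-axes picture described above, and the rows of Vt show each raw input's contribution when drilling down into a component.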
Batch data analytics (MSPC) does not need to identify the process dynamics because all of the process inputs are focused on a process output at a particular part of the batch cycle (e.g., endpoint). This is incredibly liberating. The piecewise linear fit to the batch profile enables batch data analytics to deal with the nonlinearity of the batch response. The results can be used to make mid-batch corrections.
There is an opportunity for ANN to be used with MSPC to deal with some of the nonlinearities of inputs, but the proponents of MSPC and ANN often think their technology is the total solution and do not work together. Some even think their favorite technology can replace all types of controllers.
Getting laboratory information on a consistent basis is a challenge. I think that for training the model, you could enter the batch results manually. When choosing batches, you want to include a variety of batches, but all with normal operation (no outliers from device or equipment failures or improper operations). The applications noted in the Wojewodka article emphasize that the model should represent the average batch, not the best batch (not the “golden batch”). I think this is right for starting to detect abnormal batches, but process control seeks to find the best and reduce the variability from the best, so eventually you want a model that is representative of the best batches.
I like MSPC “worm plots” because they show, from tail to head, the past and future of batches, with the tightness of the coil adding insight. The worm plot is a series of batch end points, expressed as a key process variable (PV1n), plotted as scores of principal component 1 (PC1) versus principal component 2 (PC2).
If you want to do some automated correction of the prediction by taking a fraction of the difference between the predicted result and the lab result, you would need to get the lab result into your DCS, probably via OPC or a lab entry system interfaced to your DCS. Again, the timing of the correction is not important for batch operations. Whenever the bias correction comes in, the prediction is improved for the next batch. The bias correction is similar to what is done in MPC, and the trend of the bias is useful as a history of how the accuracy is changing and whether there is possibly noise in the lab result or model prediction.
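The bias correction just described can be sketched as a simple filtered update: each lab result moves an additive bias a fraction of the way toward the latest prediction error, as in MPC model updating. The filter factor and the numbers are assumed illustration values.

```python
def update_bias(bias, predicted, lab_result, filter_factor=0.3):
    """Move the additive bias a fraction of the way toward the latest
    prediction error. filter_factor (0..1) is an assumed tuning value:
    smaller values filter lab noise more but converge more slowly."""
    return bias + filter_factor * (lab_result - (predicted + bias))

# Hypothetical end-of-batch correction: model predicted 95.0, lab says 97.0
bias = update_bias(0.0, predicted=95.0, lab_result=97.0)
corrected_prediction = 95.0 + bias  # used for the next batch
```

Trending the bias value over successive batches gives the accuracy history mentioned above; a noisy bias trend points to noise in the lab result or model, while a drifting one points to a changing process.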
The really big name in MSPC is John F. MacGregor at McMaster University in Ontario, Canada. McMaster University has expanded beyond MSPC to offer a process control degree. Another big name there is Tom Marlin, who I think came originally from the Monsanto Solutia Pensacola Nylon Intermediates plant. Tom gives his view in the InTech magazine article Educating the engineer, Part 2 of a two-part series. Part 1 of the series, Student to engineer, focused on engineering curriculum in universities.
For more on my view of why some technologies have been much more successful than others, see my Control Talk blog Keys to Successful Control Technologies.
Additional Mentor Program Resources
See the ISA book 101 Tips for a Successful Automation Career that grew out of this Mentor Program to gain concise and practical advice. See the InTech magazine feature article Enabling new automation engineers for candid comments from some of the original program participants. See the Control Talk column How to effectively get engineering knowledge with the ISA Mentor Program protégée Keneisha Williams on the challenges faced by young engineers today, and the column How to succeed at career and project migration with protégé Bill Thomas on how to make the most out of yourself and your project. Providing discussion and answers besides Greg McMillan and co-founder of the program Hunter Vegas (project engineering manager at Wunderlich-Malec) are resources Mark Darby (principal consultant at CMiD Solutions), Brian Hrankowsky (consultant engineer at a major pharmaceutical company), Michel Ruel (executive director, engineering practice at BBA Inc.), Leah Ruder (director of global project engineering at the Midwest Engineering Center of Emerson Automation Solutions), Nick Sands (ISA Fellow and Manufacturing Technology Fellow at DuPont), Bart Propst (process control leader for the Ascend Performance Materials Chocolate Bayou plant) and Daniel Warren (senior instrumentation/electrical specialist at D.M.W. Instrumentation Consulting Services, Ltd.).