2020 Virtual AIChE Annual Meeting

(578c) Data-Driven Process Optimization Under Neural Network Surrogate Model Uncertainty

Authors

Yang, S. B. - Presenter, University of Alberta
Li, Z., University of Alberta
With advances in machine learning and data science, data-driven modeling and optimization have received significant attention in recent years. Among the various machine learning techniques, the artificial neural network (ANN) is a powerful surrogate modeling method that can handle highly nonlinear processes. While this technique has been studied extensively, the uncertainty in the resulting surrogate model has been widely ignored. The main purpose of this work is to propose an effective approach for handling the uncertainty in a neural network surrogate model and addressing it within an optimization framework. We aim to obtain more reliable and robust solutions by accounting for the ANN prediction uncertainty caused by variation of the training data set.

Neural networks with rectified linear units (ReLU) are considered in this work. The main advantage of this type of network is that a mixed-integer linear programming (MILP) model can be formulated from the trained network [1], which is useful for addressing complex large-scale process optimization problems in process systems engineering. In the proposed approach for addressing ANN model uncertainty, an ensemble of ANNs trained on different training data sets is first generated. These networks are then used to predict the output variation caused by variation of the training data set [2, 3]. Furthermore, a mixture density network (MDN) [4, 5] is used to approximate the ensemble of ANNs for estimating the prediction uncertainty. With the trained mixture density network, a chance-constrained optimization problem is formulated to enforce solution robustness through the probability of constraint satisfaction. To handle objective uncertainty, the mean and standard deviation of the objective are combined into a mean-variance type of criterion to find a trade-off between solution quality and variability. The output mean and standard deviation are modeled through the mixture density network surrogate and embedded in the optimization problem. The final deterministic counterpart is an MILP problem and can be solved with off-the-shelf solvers such as CPLEX.
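
As an illustration of the MILP embedding above, the following sketch encodes a single trained ReLU hidden layer with big-M constraints and binary activation indicators, in the spirit of the formulation in [1]. The weights, variable bounds, big-M value, and Pyomo model structure are hypothetical placeholders rather than the networks used in this work.

import numpy as np
import pyomo.environ as pyo

# Hypothetical trained weights of one ReLU hidden layer (2 inputs -> 3 units)
W1 = np.array([[0.7, -1.2],
               [0.4, 0.9],
               [-0.5, 0.3]])
b1 = np.array([0.1, -0.2, 0.05])
w_out = np.array([1.0, -0.8, 0.6])  # linear output layer (illustrative)
b_out = 0.2
M = 100.0  # big-M bound on the pre-activations (assumed valid for the input box)

m = pyo.ConcreteModel()
m.I = pyo.RangeSet(0, 1)  # input indices
m.J = pyo.RangeSet(0, 2)  # hidden-unit indices
m.x = pyo.Var(m.I, bounds=(-5, 5))               # decision variables (inputs)
m.z = pyo.Var(m.J, within=pyo.NonNegativeReals)  # ReLU outputs
m.y = pyo.Var(m.J, within=pyo.Binary)            # activation indicators

def pre_act(m, j):
    # Pre-activation s_j = W1[j, :] x + b1[j]
    return sum(W1[j, i] * m.x[i] for i in m.I) + b1[j]

# z_j = max(0, s_j) via big-M: z >= s, z >= 0, z <= s + M(1 - y), z <= M y
m.relu_lb = pyo.Constraint(m.J, rule=lambda m, j: m.z[j] >= pre_act(m, j))
m.relu_on = pyo.Constraint(m.J, rule=lambda m, j: m.z[j] <= pre_act(m, j) + M * (1 - m.y[j]))
m.relu_off = pyo.Constraint(m.J, rule=lambda m, j: m.z[j] <= M * m.y[j])

# Surrogate output (e.g., a predicted cost) used as the objective
m.obj = pyo.Objective(expr=sum(w_out[j] * m.z[j] for j in m.J) + b_out,
                      sense=pyo.minimize)
# pyo.SolverFactory('cplex').solve(m)  # any MILP solver can be used

A deeper trained network is embedded by repeating this block layer by layer, and the resulting linear constraints are added to the process optimization model [1].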

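A minimal sketch of the mixture density network component follows, assuming a Gaussian mixture output and PyTorch as the modeling framework (the abstract does not specify either; see [4, 5] for the MDN formulation). The architecture, layer sizes, and toy data are illustrative only. The mixture_moments helper indicates how a predictive mean and standard deviation, of the kind used in the mean-variance criterion and chance constraints, can be recovered from the mixture parameters.

import torch
import torch.nn as nn

class MDN(nn.Module):
    # Gaussian mixture density network with ReLU hidden layers (illustrative sizes)
    def __init__(self, n_in, n_hidden, n_components):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU())
        self.pi = nn.Linear(n_hidden, n_components)         # mixing weights (logits)
        self.mu = nn.Linear(n_hidden, n_components)         # component means
        self.log_sigma = nn.Linear(n_hidden, n_components)  # log of component std devs

    def forward(self, x):
        h = self.trunk(x)
        return (torch.softmax(self.pi(h), dim=-1),
                self.mu(h),
                torch.exp(self.log_sigma(h)))

def mdn_nll(pi, mu, sigma, y):
    # Negative log-likelihood of y under the predicted Gaussian mixture
    comp = torch.distributions.Normal(mu, sigma)
    log_prob = comp.log_prob(y.unsqueeze(-1)) + torch.log(pi + 1e-12)
    return -torch.logsumexp(log_prob, dim=-1).mean()

def mixture_moments(pi, mu, sigma):
    # Predictive mean and standard deviation of the mixture (law of total variance)
    mean = (pi * mu).sum(-1)
    var = (pi * (sigma ** 2 + mu ** 2)).sum(-1) - mean ** 2
    return mean, torch.sqrt(var)

# Toy usage: fit to synthetic data and report predictive moments at one point
x = torch.rand(256, 4)                     # stand-in inputs (4 design variables)
y = x.sum(dim=1) + 0.1 * torch.randn(256)  # stand-in noisy output
model = MDN(n_in=4, n_hidden=32, n_components=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = mdn_nll(*model(x), y)
    loss.backward()
    opt.step()
mean, std = mixture_moments(*model(x[:1]))

Moments of this kind can enter the mean-variance objective and a deterministic reformulation of the chance constraints, for instance by requiring the predicted mean plus a multiple of the standard deviation to remain within the specified limit.
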
To demonstrate the proposed approach, minimization of the total annual cost (TAC) of a water-ethanol distillation column is studied. In this problem, the input variables are the reflux ratio, reboiler duty, total number of stages, and feed stage number, and the output variables are the top mole fraction of ethanol, the top recovery of ethanol, and the TAC. Two chance constraints, on the ethanol top mole fraction and the top recovery, are applied in this optimization problem. First, 120,000 data points are generated from a distillation column simulation model in Aspen Plus. Among them, 20,000 samples are kept aside as the validation set and the remaining 100,000 samples are used for training. Next, 100 different training data sets (80,000 samples each) are drawn from the 100,000-sample training set using bootstrap resampling with replacement. An ensemble of 100 ReLU ANNs is then trained on these different training sets. The mean and standard deviation of the ANN prediction at a given input point can be calculated from this ensemble. The ReLU mixture density network is then trained, validated, and embedded in the MILP optimization problem for minimizing the TAC subject to the chance constraints. The solution from the proposed approach is shown to satisfy the constraints with probability greater than or close to the corresponding confidence levels in the chance-constrained optimizations, meaning the solution is more reliable in terms of constraint satisfaction. In addition, the obtained solution has less performance variability than the traditional approach based on a single neural network surrogate model.
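
The bootstrap-ensemble step of the case study can be sketched as follows. The scikit-learn networks, layer sizes, and synthetic data are stand-ins for the 100 ReLU ANNs and the 80,000-sample resamples trained on the Aspen Plus data; the point is only to show how the ensemble mean and standard deviation of a prediction are obtained.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical data standing in for the Aspen Plus samples:
# X = [reflux ratio, reboiler duty, total stages, feed stage], y = an output such as TAC
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 4))
y = X @ np.array([1.0, 2.0, -0.5, 0.3]) + 0.05 * rng.standard_normal(1000)

n_models, n_resample = 20, 800  # the study uses 100 networks and 80,000-sample resamples
ensemble = []
for _ in range(n_models):
    idx = rng.choice(len(X), size=n_resample, replace=True)  # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(32, 32), activation='relu',
                       max_iter=1000, random_state=0)
    ensemble.append(net.fit(X[idx], y[idx]))

# Ensemble mean and standard deviation at a query point approximate the
# prediction uncertainty induced by variation of the training data set
x_query = np.array([[0.5, 0.5, 0.5, 0.5]])
preds = np.array([net.predict(x_query)[0] for net in ensemble])
print(preds.mean(), preds.std())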

References

[1] B. Grimstad and H. Andersson, "ReLU networks as surrogate models in mixed-integer linear programs," Computers & Chemical Engineering, vol. 131, p. 106580, 2019.

[2] R. Srivastav, K. Sudheer, and I. Chaubey, "A simplified approach to quantifying predictive and parametric uncertainty in artificial neural network hydrologic models," Water Resources Research, vol. 43, no. 10, 2007.

[3] K. Kasiviswanathan and K. Sudheer, "Quantification of the predictive uncertainty of artificial neural network based river flow forecast models," Stochastic Environmental Research and Risk Assessment, vol. 27, no. 1, pp. 137-146, 2013.

[4] C. M. Bishop, "Mixture density networks," Aston University, Technical Report, 1994.

[5] R. Herzallah and D. Lowe, "A mixture density network approach to modelling and exploiting uncertainty in nonlinear control problems," Engineering Applications of Artificial Intelligence, vol. 17, no. 2, pp. 145-158, 2004.