2025 AIChE Annual Meeting

(679d) Optimization with Piecewise-Linear Functions Using Generalized Disjunctive Programming

Authors

Bashar Ammari - Presenter, Carnegie Mellon University
Emma Johnson, Sandia National Laboratories

Piecewise-linear functions, and their computational performance as surrogate models within optimization problems, continue to attract attention in the engineering and operations research communities. Formulating piecewise-linear functions as mixed-integer programs (MIPs) is useful across several application spaces, especially for approximating nonconvex functions in order to convert mixed-integer nonlinear programs into mixed-integer linear programs. Additionally, several recent papers propose formulations for embedding machine learning models that are in fact piecewise-linear functions into optimization problems, including neural networks with rectified-linear-unit activation functions (Fischetti & Jo, 2018), gradient-boosted decision trees (Mistry et al., 2021), and linear model decision trees (Ammari et al., 2023). The broad applicability of piecewise-linear surrogates justifies further investigation into the best formulations for including piecewise-linear functions in optimization problems.

The literature contains several different MIP formulations for piecewise-linear functions. Vielma et al. (2008, 2010), Vielma (2015, 2018), and Huchette and Vielma (2023) present the multiple-choice, disaggregated convex-combination, convex-combination, disaggregated logarithmic, logarithmic, binary zigzag, general-integer zigzag, and incremental models. They discuss the properties of these formulations and compare their computational performance for univariate and bivariate piecewise-linear functions embedded within optimization problems.
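For concreteness, the (aggregated) convex-combination model for a continuous piecewise-linear function $f$ defined over a family of simplices $\mathcal{S}$ with vertex set $V$ can be sketched as follows; the notation is ours, following the unifying framework of Vielma et al. (2010):

```latex
% Convex-combination (CC) model: x and the function value y = f(x) are
% written as convex combinations of the vertices of the single simplex
% selected by the binary variables z_S.
\begin{align*}
  x &= \sum_{v \in V} \lambda_v \, v, &
  y &= \sum_{v \in V} \lambda_v \, f(v), &
  \sum_{v \in V} \lambda_v &= 1, \\
  \lambda_v &\le \sum_{S \in \mathcal{S} :\, v \in V(S)} z_S \quad \forall v \in V, &
  \sum_{S \in \mathcal{S}} z_S &= 1, &
  \lambda_v &\ge 0, \quad z_S \in \{0,1\}.
\end{align*}
```

Here $y$ stands in for $f(x)$ wherever it appears in the surrounding model; the disaggregated variant instead introduces one $\lambda$ variable per (simplex, vertex) pair.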

In this work, we present generalized disjunctive programming (GDP) formulations for piecewise-linear functions and show that many of the well-known MIP formulations in the literature can be recovered by applying standard big-M, convex-hull, and multiple-big-M transformations. Beyond these well-known formulations, we show that systematic application of GDP transformations yields formulations that have not previously been considered. We demonstrate the contrib.piecewise package in Pyomo as a tool for generating these formulations automatically, and we run computational comparisons of these formulations for multivariate piecewise-linear functions.

We import nonlinear model instances from MINLPLib (Bussieck et al., 2003) and replace the nonlinear expressions in these instances with piecewise-linear approximations using the contrib.piecewise package. We then generate the different GDP representations of these piecewise-linear functions and transform the GDPs into MIPs by applying the big-M, convex-hull, and multiple-big-M transformations. We report the time to solve these MIPs using Gurobi. Results show that the formulation of multivariate piecewise-linear functions corresponding to the convex-combination model performs well across many instances from MINLPLib, and similar formulations derived from the same GDP representation also perform well. We expect that the performance profiles of these formulations will change with new versions of Gurobi; we therefore provide a general software implementation of all the formulations so that we and others can rerun these benchmarks against new solver versions in the future.
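The approximation step itself (independent of Pyomo and Gurobi) amounts to interpolating the nonlinear expression at chosen breakpoints. A minimal sketch, with the function (exp) and breakpoints as our own illustrative choices:

```python
import math

def pwl_interpolant(f, breakpoints):
    """Return a function evaluating the piecewise-linear interpolant of f
    through the given breakpoints (sorted internally)."""
    xs = sorted(breakpoints)
    ys = [f(x) for x in xs]

    def g(x):
        if not xs[0] <= x <= xs[-1]:
            raise ValueError("x is outside the approximation domain")
        # Find the segment containing x and interpolate linearly on it.
        for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
            if x <= x1:
                t = 0.0 if x1 == x0 else (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)

    return g

g = pwl_interpolant(math.exp, [0.0, 0.5, 1.0, 1.5, 2.0])

# The interpolant matches exp exactly at breakpoints and, since exp is
# convex, overestimates it between them; sample the gap on a fine grid.
max_err = max(g(0.01 * k) - math.exp(0.01 * k) for k in range(201))
```

Refining the breakpoint grid shrinks `max_err`, trading approximation error against the number of pieces (and hence binary variables) in the resulting MIP.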

Acknowledgements

Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-NA0003525. This paper describes objective technical results and analysis. Any subjective views or opinions that might be expressed in the paper do not necessarily represent the views of the U.S. Department of Energy or the United States Government.

References

Ammari BL, Johnson ES, Stinchfield G, Kim T, Bynum M, Hart WE, Pulsipher J, Laird CD (2023) “Linear Model Decision Trees as Surrogates in Optimization of Engineering Applications.” Computers & Chemical Engineering 178:108347

Bussieck MR, Drud AS, Meeraus A (2003) “MINLPLib—A Collection of Test Models for Mixed-integer Nonlinear Programming.” INFORMS Journal on Computing 15:114–119

Fischetti M, Jo J (2018) “Deep Neural Networks and Mixed Integer Linear Optimization.” Constraints 23:296–309

Huchette J, Vielma JP (2023) “Nonconvex Piecewise Linear Functions: Advanced Formulations and Simple Modeling Tools.” Operations Research 71:1835–1856

Mistry M, Letsios D, Krennrich G, Lee RM, Misener R (2021) “Mixed-integer Convex Nonlinear Optimization with Gradient-boosted Trees Embedded.” INFORMS Journal on Computing 33:1103–1119

Vielma JP (2015) “Mixed Integer Linear Programming Formulation Techniques.” SIAM Review 57:3–57

Vielma JP (2018) “Embedding Formulations and Complexity for Unions of Polyhedra.” Management Science 64:4721–4734

Vielma JP, Ahmed S, Nemhauser G (2010) “Mixed-integer Models for Nonseparable Piecewise-linear Optimization: Unifying Framework and Extensions.” Operations Research 58:303–315

Vielma JP, Keha AB, Nemhauser GL (2008) “Nonconvex, Lower Semicontinuous Piecewise Linear Optimization.” Discrete Optimization 5:467–488