2025 AIChE Annual Meeting

(681e) Hyperplane Decision Trees As Piecewise Linear Surrogate Models for Optimization Problems

Authors

Sneha Akhade, Lawrence Livermore National Laboratory
Matthew McNenly, Lawrence Livermore National Laboratory
John Kitchin, Carnegie Mellon University

Recent advances in process systems engineering emphasize the importance of data-driven modeling in optimization and decision-making. In particular, regression models such as neural networks can be embedded as surrogates in optimization problems. While neural networks can effectively capture complex, high-dimensional relationships, their training can be expensive, unreliable, and data-hungry. In this presentation, we introduce hyperplane decision trees (HTs) as a general, highly expressive, and efficient surrogate model architecture. HTs are locally linear and feature linear decision boundaries, yielding a piecewise linear model that can be formulated as a set of mixed-integer linear constraints. This is achieved through a linear feature engineering step and a recursive training procedure. Our open-source PyTorch implementation provides a fast, flexible, and accessible tool for building accurate piecewise linear models that can be embedded directly in optimization problems via Pyomo and the Optimization and Machine Learning Toolkit (OMLT).
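
To illustrate the idea of embedding a piecewise linear tree model as mixed-integer linear constraints, the sketch below encodes a hypothetical two-leaf hyperplane tree in Pyomo using a standard big-M formulation. This is an illustrative assumption only, not the authors' implementation or the OMLT interface: the split hyperplane and leaf coefficients are made-up placeholders standing in for a trained HT.

```python
# Minimal sketch (assumed, not the authors' code): a two-leaf hyperplane tree
# y(x) = c[k] @ x + d[k] on each side of the split a @ x <= b, embedded as
# mixed-integer linear constraints via big-M.
import pyomo.environ as pyo

# Hypothetical trained-tree parameters for a 2-D input.
a, b = [1.0, -2.0], 0.5               # split hyperplane: a @ x <= b -> leaf 0
c = {0: [3.0, 1.0], 1: [-1.0, 4.0]}   # leaf linear models y = c[k] @ x + d[k]
d = {0: 0.2, 1: -1.0}
M = 100.0                             # big-M constant (a valid bound for this sketch)

m = pyo.ConcreteModel()
m.x = pyo.Var(range(2), bounds=(-10, 10))
m.y = pyo.Var()                       # surrogate output
m.z = pyo.Var(within=pyo.Binary)      # z = 1 -> leaf 0, z = 0 -> leaf 1

split = sum(a[i] * m.x[i] for i in range(2)) - b
m.split_leaf0 = pyo.Constraint(expr=split <= M * (1 - m.z))   # active when z = 1
m.split_leaf1 = pyo.Constraint(expr=-split <= M * m.z)        # active when z = 0

def leaf_expr(k):
    return sum(c[k][i] * m.x[i] for i in range(2)) + d[k]

# Output equals the active leaf's linear model; the other leaf's bounds relax.
m.y_lo0 = pyo.Constraint(expr=m.y >= leaf_expr(0) - M * (1 - m.z))
m.y_hi0 = pyo.Constraint(expr=m.y <= leaf_expr(0) + M * (1 - m.z))
m.y_lo1 = pyo.Constraint(expr=m.y >= leaf_expr(1) - M * m.z)
m.y_hi1 = pyo.Constraint(expr=m.y <= leaf_expr(1) + M * m.z)

# Optimize over the surrogate, e.g. minimize the predicted output.
m.obj = pyo.Objective(expr=m.y, sense=pyo.minimize)
# pyo.SolverFactory("gurobi").solve(m)  # any MILP solver works here
```

A deeper tree follows the same pattern, with one binary indicator per leaf and constraints along each root-to-leaf path; tools such as OMLT automate this kind of transcription so the surrogate can be attached to a larger Pyomo model.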