2025 AIChE Annual Meeting

(204b) Learning Lyapunov Functions Via Constrained Symbolic Regression

Authors

Ilias Mitrai - Presenter, University of Minnesota
In this work we propose a constrained symbolic regression approach for learning Lyapunov functions from data. Lyapunov functions are key tools for analyzing the stability of dynamical systems and for designing nonlinear controllers. However, identifying a Lyapunov function for a dynamical system is nontrivial: on one hand, one must search the space of all possible functions, which can be very large; on the other hand, the search space is highly constrained, since a Lyapunov function must be nonnegative and have a negative semidefinite time derivative (or increment, for discrete-time systems).
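The two conditions above can be illustrated with a minimal numerical sketch (an assumed toy example, not part of the proposed method): for the stable linear system dx/dt = Ax, the candidate V(x) = x1² + x2² is checked on sampled states by verifying V(x) ≥ 0 and dV/dt = ∇V(x)·Ax ≤ 0 at every sample.

```python
import numpy as np

# Hypothetical example: verify the two Lyapunov conditions on sampled data
# for the stable linear system dx/dt = A x, with candidate V(x) = x1^2 + x2^2.
A = np.array([[-1.0, 0.0],
              [0.0, -2.0]])

def V(x):
    # Candidate Lyapunov function (assumed form, for illustration only).
    return x[0]**2 + x[1]**2

def dV_dt(x):
    # Time derivative along trajectories: dV/dt = grad(V) . f(x) = 2x . (Ax).
    grad = np.array([2.0 * x[0], 2.0 * x[1]])
    return grad @ (A @ x)

# Sample states and check V(x) >= 0 and dV/dt <= 0 at every point.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(1000, 2))
is_lyapunov = all(V(x) >= 0 and dV_dt(x) <= 0 for x in samples)
print(is_lyapunov)  # True: V is a valid Lyapunov function for this system
```

Here dV/dt = -2x1² - 4x2² is nonpositive everywhere, so the candidate passes on any dataset; a data-driven method must certify exactly these pointwise conditions on the given samples.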

Over the years, several approaches have been proposed to automate the search for a Lyapunov function. For example, one can leverage genetic programming to search for a Lyapunov function by randomly mutating expressions [1,2]. However, the search in this approach is random, and constraint satisfaction cannot always be guaranteed. An alternative approach is to postulate a functional form and estimate the values of its parameters using mathematical optimization [3-5]. Although this approach has been used for polynomial and hybrid dynamical systems, the functional form must be selected a priori. Finally, machine learning approaches have been used to learn Lyapunov functions from data by combining supervised learning with constraint satisfaction to ensure that the learned neural network is a Lyapunov function [6,7]. However, the returned function is a black box, since it is a neural network. Thus, existing approaches either return a black-box function or assume the functional form a priori.

In this work, we propose a novel approach to overcome these limitations by combining constrained symbolic regression and global optimization.

  • Motivated by the literature on symbolic regression [8,9], we model the Lyapunov function V(x) as an expression tree of fixed depth. This modeling approach can consider many functional forms for the Lyapunov function using simple mathematical operations such as addition, division, subtraction, and multiplication.
  • Furthermore, we impose constraints that guarantee that the output of the Lyapunov function is always non-negative and its derivative is negative semi-definite, i.e., the returned function is a valid Lyapunov function for the given dataset. The resulting learning problem is a mixed-integer nonlinear programming (MINLP) problem, which can be solved with state-of-the-art branch-and-bound-based global optimization solvers. The objective can be either the complexity of the returned function (i.e., finding the simplest Lyapunov function for the given data) or the convergence rate.
  • Finally, we develop a hybrid branch-and-bound and check algorithm to solve the learning problem efficiently. Specifically, during branch and bound, once an integer-feasible solution is found at a node of the branch-and-bound tree, we check whether the function is a valid Lyapunov function and stop if the required conditions are satisfied, i.e., V(x) ≥ 0 and dV(x)/dt ≤ 0; otherwise the branch-and-bound search continues.
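The two ingredients above can be sketched together in a minimal form (an assumed illustration, not the authors' implementation): a candidate from a branch-and-bound node is an expression tree built from the four basic operations, and the "check" step evaluates the Lyapunov conditions on the dataset before accepting it.

```python
import numpy as np

# Assumed representation: an expression tree is a nested tuple
# (operator, left_child, right_child); a leaf is an index into the state x.
OPS = {"+": np.add, "-": np.subtract, "*": np.multiply, "/": np.divide}

def evaluate(tree, x):
    """Evaluate an expression tree at state x."""
    if isinstance(tree, int):          # leaf: state-vector component
        return x[tree]
    op, left, right = tree             # internal node
    return OPS[op](evaluate(left, x), evaluate(right, x))

def check_candidate(tree, states, derivs, eps=1e-9, h=1e-6):
    """The 'check' step: accept the candidate V only if V(x) >= 0 and
    dV/dt <= 0 hold at every data point. dV/dt is approximated by a
    finite difference of V in the direction f(x) = dx/dt."""
    for x, f in zip(states, derivs):
        v = evaluate(tree, x)
        vdot = (evaluate(tree, x + h * f) - v) / h   # directional derivative
        if v < -eps or vdot > eps:
            return False
    return True

# Candidate V(x) = x0*x0 + x1*x1 for the system dx/dt = -x.
candidate = ("+", ("*", 0, 0), ("*", 1, 1))
rng = np.random.default_rng(1)
states = rng.uniform(-1.0, 1.0, size=(200, 2))
derivs = -states
print(check_candidate(candidate, states, derivs))  # True
```

In the actual MINLP formulation the operator choices at each tree node are binary variables and the conditions are imposed as constraints; this sketch only shows why verifying an integer-feasible candidate at a node is cheap relative to continuing the global search.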

We apply the proposed approach to several case studies involving linear and polynomial continuous dynamical systems. In all cases, the proposed approach returns a valid Lyapunov function and the solution time is always less than 100 seconds.

References

[1] Grosman, B. and Lewin, D.R., 2006, October. Lyapunov-based stability analysis automated by genetic programming. In 2006 IEEE Conference on Computer Aided Control System Design, 2006 IEEE International Conference on Control Applications, 2006 IEEE International Symposium on Intelligent Control (pp. 766-771). IEEE.

[2] McGough, J.S., Christianson, A.W. and Hoover, R.C., 2010. Symbolic computation of Lyapunov functions using evolutionary algorithms. In Proceedings of the 12th IASTED International Conference (Vol. 15, pp. 508-515).

[3] Papachristodoulou, A. and Prajna, S., 2002, December. On the construction of Lyapunov functions using the sum of squares decomposition. In Proceedings of the 41st IEEE Conference on Decision and Control, 2002. (Vol. 3, pp. 3482-3487). IEEE.

[4] Kapinski, J., Deshmukh, J.V., Sankaranarayanan, S. and Arechiga, N., 2014, April. Simulation-guided Lyapunov analysis for hybrid dynamical systems. In Proceedings of the 17th International Conference on Hybrid Systems: Computation and Control (pp. 133-142).

[5] Chen, S., Fazlyab, M., Morari, M., Pappas, G.J. and Preciado, V.M., 2021, May. Learning Lyapunov functions for hybrid systems. In Proceedings of the 24th International Conference on Hybrid Systems: Computation and Control (pp. 1-11).

[6] Chang, Y.C., Roohi, N. and Gao, S., 2019. Neural Lyapunov control. Advances in Neural Information Processing Systems, 32.

[7] Kolter, J.Z. and Manek, G., 2019. Learning stable deep dynamics models. Advances in Neural Information Processing Systems, 32.

[8] Cozad, A. and Sahinidis, N.V., 2018. A global MINLP approach to symbolic regression. Mathematical Programming, 170, pp. 97-119.

[9] Kim, J., Leyffer, S. and Balaprakash, P., 2023. Learning symbolic expressions: Mixed-integer formulations, cuts, and heuristics. INFORMS Journal on Computing, 35(6), pp. 1383-1403.