(204b) Learning Lyapunov Functions Via Constrained Symbolic Regression
Over the years, several approaches have been proposed to automate the search for a Lyapunov function. For example, genetic programming can be used to search for a Lyapunov function by randomly mutating candidate expressions [1,2]; however, the search is stochastic, and constraint satisfaction cannot always be guaranteed. An alternative is to postulate a functional form and estimate its parameters via mathematical optimization [3-5]. Although this approach has been applied to polynomial and hybrid dynamical systems, the functional form must be selected a priori. Finally, machine learning approaches learn Lyapunov functions from data by combining supervised learning with constraint satisfaction to ensure that the learned neural network is a Lyapunov function [6,7]; however, the returned function is a black box because it is a neural network. In summary, existing approaches either return black-box functions or require the functional form to be fixed a priori.
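Regardless of the search strategy, each of these methods must certify the same conditions. For a continuous-time system $\dot{x} = f(x)$ with an equilibrium at the origin and a domain $\mathcal{D} \ni 0$ (generic notation, used here only for reference), a Lyapunov function $V$ must satisfy

$$
V(0) = 0, \qquad V(x) > 0 \quad \text{and} \quad \nabla V(x)^{\top} f(x) < 0 \quad \text{for all } x \in \mathcal{D} \setminus \{0\}.
$$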
In this work, we propose a novel approach that overcomes these limitations by combining constrained symbolic regression with global optimization [8,9].
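To make the idea concrete, the sketch below is a deliberately simplified stand-in, not the formulation used in this work: it enumerates sparse candidates from a fixed monomial library (a crude surrogate for the discrete structure decisions a constrained symbolic-regression model in the spirit of [8,9] would make) and checks the Lyapunov conditions on sampled states of an assumed stable linear system. The system, the library, and the sampling-based check are all illustrative assumptions.

```python
# Minimal sketch of the idea, NOT the formulation used in this work:
# enumerate sparse candidates from a fixed monomial library and check
# the Lyapunov conditions on sampled states of an assumed test system.
import itertools
import numpy as np
import sympy as sp

x1, x2 = sp.symbols("x1 x2")
f = sp.Matrix([-x1 + x2, -x1 - x2])          # assumed stable linear system
basis = [x1**2, x2**2, x1*x2, x1**4, x2**4]  # candidate monomial terms

rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(200, 2))
samples = samples[np.linalg.norm(samples, axis=1) > 1e-3]  # exclude the origin

def satisfies_sampled_conditions(V):
    """Check V > 0 and dV/dt < 0 on sampled states (necessary, not sufficient)."""
    Vdot = (sp.Matrix([V]).jacobian([x1, x2]) * f)[0]
    V_num = sp.lambdify((x1, x2), V, "numpy")
    Vdot_num = sp.lambdify((x1, x2), Vdot, "numpy")
    vals = V_num(samples[:, 0], samples[:, 1])
    dots = Vdot_num(samples[:, 0], samples[:, 1])
    return np.all(vals > 0) and np.all(dots < 0)

def search():
    # Prefer simpler expressions: try combinations with few terms first.
    for k in range(1, len(basis) + 1):
        for combo in itertools.combinations(basis, k):
            V = sum(combo)
            if satisfies_sampled_conditions(V):
                return V
    return None

print("Candidate Lyapunov function:", search())  # expected: x1**2 + x2**2
```

A sampled check like this is only necessary, not sufficient; the appeal of casting the search as a constrained global-optimization problem is that constraint satisfaction can be certified rather than spot-checked.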
We apply the proposed approach to several case studies involving linear and polynomial continuous dynamical systems. In all cases, the proposed approach returns a valid Lyapunov function, and the solution time is below 100 seconds.
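For the linear case studies, validity of a quadratic candidate $V(x) = x^{\top} P x$ for $\dot{x} = Ax$ can be certified exactly through the Lyapunov equation. The snippet below is our own illustrative check on the same assumed system as above, not the verification procedure of the paper.

```python
# Illustrative certification of a quadratic Lyapunov function for a
# linear system x' = A x (assumed example, not the paper's procedure):
# V(x) = x^T P x is valid iff P > 0 and A^T P + P A < 0.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-1.0,  1.0],
              [-1.0, -1.0]])            # same assumed stable system as above
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)  # solves A^T P + P A = -Q

assert np.all(np.linalg.eigvalsh(P) > 0)                 # V positive definite
assert np.all(np.linalg.eigvalsh(A.T @ P + P @ A) < 0)   # dV/dt negative definite
print("Certified V(x) = x^T P x with P =\n", P)
```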
References
[1] Grosman, B. and Lewin, D.R., 2006, October. Lyapunov-based stability analysis automated by genetic programming. In 2006 IEEE Conference on Computer Aided Control System Design, 2006 IEEE International Conference on Control Applications, 2006 IEEE International Symposium on Intelligent Control (pp. 766-771). IEEE.
[2] McGough, J.S., Christianson, A.W. and Hoover, R.C., 2010. Symbolic computation of Lyapunov functions using evolutionary algorithms. In Proceedings of the 12th IASTED International Conference (Vol. 15, pp. 508-515).
[3] Papachristodoulou, A. and Prajna, S., 2002, December. On the construction of Lyapunov functions using the sum of squares decomposition. In Proceedings of the 41st IEEE Conference on Decision and Control, 2002. (Vol. 3, pp. 3482-3487). IEEE.
[4] Kapinski, J., Deshmukh, J.V., Sankaranarayanan, S. and Arechiga, N., 2014, April. Simulation-guided Lyapunov analysis for hybrid dynamical systems. In Proceedings of the 17th International Conference on Hybrid Systems: Computation and Control (pp. 133-142).
[5] Chen, S., Fazlyab, M., Morari, M., Pappas, G.J. and Preciado, V.M., 2021, May. Learning Lyapunov functions for hybrid systems. In Proceedings of the 24th International Conference on Hybrid Systems: Computation and Control (pp. 1-11).
[6] Chang, Y.C., Roohi, N. and Gao, S., 2019. Neural Lyapunov control. Advances in Neural Information Processing Systems, 32.
[7] Kolter, J.Z. and Manek, G., 2019. Learning stable deep dynamics models. Advances in Neural Information Processing Systems, 32.
[8] Cozad, A. and Sahinidis, N.V., 2018. A global MINLP approach to symbolic regression. Mathematical Programming, 170, pp. 97-119.
[9] Kim, J., Leyffer, S. and Balaprakash, P., 2023. Learning symbolic expressions: Mixed-integer formulations, cuts, and heuristics. INFORMS Journal on Computing, 35(6), pp. 1383-1403.