2024 AIChE Annual Meeting
(607e) Transformer-Based Hybrid Modeling and Control of Evolving, Nonlinear Processes
Our hybrid modeling framework incorporates a recent innovation: attention-based time-series transformers (TSTs) coupled with positional encoding. This work is an early application of the transformer architecture, the algorithm at the core of ChatGPT's success, to nonlinear, time-varying processes. By attending to data across both current and preceding time steps, the TST captures immediate as well as historical changes in process states, providing contextual insight into process dynamics that mirrors ChatGPT's understanding of textual context. The TST-based hybrid model identifies correlations between process parameters and state variables. Its versatility is evident in the range of models it accommodates, from density functional theory to computational fluid dynamics, and in the scales it spans, from laboratory setups to large industrial installations. We will present applications of this hybrid modeling and control architecture, from laboratory studies to industrial processes, made possible through partnerships with leading chemical process companies.
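The temporal-context mechanism described above, positional encoding plus self-attention over current and preceding time steps, can be sketched in a minimal form. This is an illustrative NumPy sketch under stated assumptions, not the authors' implementation: the toy state trajectory, the hypothetical embedding matrix `W_embed`, and the single-head identity-projection attention are all simplifications for clarity.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding, so attention can distinguish time steps."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])  # even dims: sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])  # odd dims: cosine
    return pe

def self_attention(x):
    """Single-head scaled dot-product self-attention (identity Q/K/V projections)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of all time steps: temporal context.
    return weights @ x

# Toy process history: 16 time steps of 2 state variables (hypothetical data).
rng = np.random.default_rng(0)
seq_len, d_model = 16, 8
states = rng.normal(size=(seq_len, 2))
W_embed = rng.normal(size=(2, d_model))          # hypothetical embedding
x = states @ W_embed + positional_encoding(seq_len, d_model)
context = self_attention(x)                      # contextualized representation
print(context.shape)
```

In a full TST, the embedding and attention projections are learned, multiple attention heads and layers are stacked, and the contextualized representation feeds a head that predicts the next process state; the sketch isolates only the context-building step.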