2023 AIChE Annual Meeting

(216b) Training a Deep Learning Model on GPU to Estimate the Remaining Useful Life of a Lithium-Ion Battery

Authors

Savai, M., MathWorks
Gonuguntla, S., MathWorks
Gopinath, A., MathWorks
MATLAB is a widely used scientific computing tool in chemical engineering for tasks such as numeric and symbolic computation, data analysis and visualization, and the implementation of control, optimization, and AI algorithms. Some of these computational tasks, such as training AI models, may require parallel computing and GPUs. Chemical engineers share the results of their work with their audiences and collaborators in various forms, including reports, stand-alone executables, and web apps, and they maintain their work with source control tools. To test their models in the real world, they need deployable code.

Many chemical engineers use data-driven modeling techniques for process modeling, predictive maintenance, and remaining useful life estimation. We implemented an AI model in MATLAB to estimate the remaining useful life of a lithium-ion battery, based on “Data-driven prediction of battery cycle life before capacity degradation” by Severson et al. To shorten the training time of the deep learning model, we took advantage of MATLAB’s built-in GPU support for deep learning. For deployment, we automatically generated C code from the MATLAB code, and we maintained the code base using source control integration. In this talk, we will showcase how we created these artifacts. Our MATLAB-based approach to scaling up computations, sharing results, and creating deployable code can be applied to other chemical engineering modeling tasks as well.
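As a rough illustration of the workflow described above, the sketch below shows how a regression network can be trained on a GPU in MATLAB and how C code can be generated from a prediction function with MATLAB Coder. The feature matrix, layer sizes, and the predictRUL entry-point function are illustrative assumptions, not the exact model presented in the talk.

```matlab
% Minimal sketch: train a feedforward regression network on a GPU and
% generate C code for a prediction entry point. Variable names, layer
% sizes, and predictRUL are assumptions for illustration only.

% XTrain: numObservations-by-numFeatures matrix of cycling-derived features
% YTrain: numObservations-by-1 vector of remaining-useful-life targets
numFeatures = size(XTrain, 2);

layers = [
    featureInputLayer(numFeatures)
    fullyConnectedLayer(64)
    reluLayer
    fullyConnectedLayer(32)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer];

options = trainingOptions("adam", ...
    "MaxEpochs", 100, ...
    "MiniBatchSize", 64, ...
    "ExecutionEnvironment", "gpu", ...  % train on a supported GPU if available
    "Plots", "training-progress", ...
    "Verbose", false);

net = trainNetwork(XTrain, YTrain, layers, options);

% Generate C code from a prediction entry-point function (predictRUL.m)
% that loads the trained network and calls predict on new feature vectors.
% Depending on the release and target, a deep learning target library may
% also need to be set via cfg.DeepLearningConfig.
cfg = coder.config("lib");
codegen -config cfg predictRUL -args {zeros(1, numFeatures, "single")}
```

With the MATLAB "gpu" execution environment, the same training script falls back to the CPU if no supported GPU is present by switching ExecutionEnvironment to "auto" or "cpu", which is one reason the approach transfers readily to other modeling tasks.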