Resource-efficient artificial intelligence for battery capacity estimation using convolutional FlashAttention fusion networks



Bibliographic Details
Published in: eTransportation (Amsterdam), 2025-01, Vol. 23, p. 100383, Article 100383
Main authors: Lv, Zhilong; Zhao, Jingyuan
Format: Article
Language: English
Online access: Full text
Description
Abstract: Accurate battery capacity estimation is crucial for optimizing lifespan and monitoring health conditions. Deep learning has made notable strides in addressing long-standing issues in the artificial intelligence community. However, large AI models often face challenges such as high computational resource consumption, extended training times, and elevated deployment costs. To address these issues, we developed an efficient end-to-end hybrid fusion neural network model. This model combines FlashAttention-2 with local feature extraction through convolutional neural networks (CNNs), significantly reducing memory usage and computational demands while maintaining precise and efficient health estimation. For practical implementation, the model uses only basic parameters, such as voltage and charge, and employs partial charging data (from 80 % SOC to the upper limit voltage) as features, without requiring complex feature engineering. We evaluated the model using three datasets: 77 lithium iron phosphate (LFP) cells, 16 nickel cobalt aluminum (NCA) cells, and 50 nickel cobalt manganese (NCM) oxide cells. For LFP battery health estimation, the model achieved a root mean square error of 0.109 %, a coefficient of determination of 0.99, and a mean absolute percentage error of 0.096 %. Moreover, the proposed convolutional and flash-attention fusion networks deliver an average inference time of 57 milliseconds for health diagnosis across the full battery life cycle (approximately 1898 cycles per cell). The resource-efficient AI (REAI) model operates at an average of 1.36 billion floating point operations (FLOPs), with GPU power consumption of 17 W and memory usage of 403 MB. This significantly outperforms the Transformer model with vanilla attention. Furthermore, the multi-fusion model proved to be a powerful tool for evaluating capacity in NCA and NCM cells using transfer learning.
The results emphasize its ability to reduce computational complexity, energy consumption, and memory usage, while maintaining high accuracy and robust generalization capabilities.

Highlights:
• Developed a resource-efficient AI hybrid neural network model for battery capacity estimation.
• Optimized FlashAttention-2 to enhance computational efficiency and reduce memory usage.
• Achieved millisecond-level diagnosis (averaging 57 milliseconds) over the full battery life cycle.
• Validated generalization ability in health diagnosis from LFP to NCM cells using transfer learning.
• The multi-fusion model stri…
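The memory saving behind FlashAttention comes from processing the key/value sequence in blocks with a running softmax, so the full attention score matrix is never materialized. The sketch below illustrates that tiling idea in NumPy for a single head; it is a simplified illustration of the general technique, not the authors' REAI model or the actual FlashAttention-2 GPU kernel, and all function names here are invented for the example.

```python
import numpy as np

def attention_full(q, k, v):
    # Standard attention: materializes the full (n x n) score matrix.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def attention_tiled(q, k, v, block=4):
    # FlashAttention-style streaming: visit K/V in blocks, keeping a
    # running row max and normalizer, so only (n x block) scores exist
    # at any time instead of the full (n x n) matrix.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(v, dtype=float)
    m = np.full(n, -np.inf)   # running row-wise max of scores
    l = np.zeros(n)           # running softmax normalizer
    for start in range(0, k.shape[0], block):
        kb, vb = k[start:start + block], v[start:start + block]
        s = (q @ kb.T) * scale                 # partial scores (n, block)
        m_new = np.maximum(m, s.max(axis=-1))
        alpha = np.exp(m - m_new)              # rescale old accumulators
        p = np.exp(s - m_new[:, None])
        l = l * alpha + p.sum(axis=-1)
        out = out * alpha[:, None] + p @ vb
        m = m_new
    return out / l[:, None]
```

Both functions return the same result; the tiled version trades the quadratic score buffer for a per-block one, which is the source of the memory and bandwidth savings the abstract reports.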
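The error metrics quoted in the abstract (RMSE, coefficient of determination, MAPE) are standard regression metrics. A minimal pure-Python sketch of how such capacity-estimation errors are conventionally computed (function names and inputs are illustrative, not taken from the paper):

```python
import math

def rmse(y_true, y_pred):
    # Root mean square error of predicted vs. measured capacity.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mape(y_true, y_pred):
    # Mean absolute percentage error, reported in percent.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 - SS_res / SS_tot.
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```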
ISSN: 2590-1168
DOI: 10.1016/j.etran.2024.100383