SOC Estimation using Deep Bidirectional Gated Recurrent Units with Tree Parzen Estimator Hyperparameter Optimization

Publication Name

IEEE Transactions on Industry Applications

Abstract

State-of-charge (SOC) is a crucial battery quantity that must be monitored constantly to ensure cell longevity and safe operation. However, SOC is not an observable quantity and cannot be measured practically outside laboratory environments. Hence, machine learning (ML) has been employed to map correlated observable signals such as voltage, current, and temperature to SOC values. In recent studies, deep learning (DL) has been a prominent ML approach, outperforming many existing methods for SOC estimation. However, obtaining optimal performance from DL models relies heavily on the appropriate selection of hyperparameters. At present, researchers rely on established heuristics to select hyperparameters through manual tuning or exhaustive search methods such as grid search (GS) and random search (RS), which lengthens development time and yields less accurate, less efficient models. This study proposes a systematic and automated approach to hyperparameter selection using a Bayesian optimization strategy known as the Tree Parzen Estimator (TPE) in combination with the Hyperband pruning algorithm. The TPE optimization is run on various DL models such as BGRU, GRU, LSTM, CNN, FCN, and DNN to find the best hyperparameter combination for each architecture. The Hyperband pruning algorithm prunes unpromising trials during the TPE run, saving time and computational resources. Experimental results show that the best-performing model, BGRU-TPE, incurs a low computational cost without compromising accuracy, maintaining the best balance of the two. The proposed BGRU-TPE model achieves the lowest RMSE and MAE of 0.8% and 0.56%, respectively, on various electric vehicle drive cycles at varying ambient temperatures, while maintaining a low computational cost of 15,600 FLOPs and a model size of only 193.3 kilobytes.
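To make the search procedure concrete, the sketch below shows one way a TPE sampler combined with Hyperband pruning could be applied to a bidirectional GRU SOC estimator. It uses the Optuna library (which provides a TPESampler and HyperbandPruner) and PyTorch; the library choice, hyperparameter ranges, synthetic data, and network sizes are illustrative assumptions and not the authors' exact experimental settings.

```python
# Minimal sketch: TPE hyperparameter search with Hyperband pruning for a BGRU
# SOC estimator. Data and search ranges are illustrative placeholders only.
import numpy as np
import optuna
import torch
import torch.nn as nn

# Synthetic stand-in for (voltage, current, temperature) -> SOC sequences.
rng = np.random.default_rng(0)
X = rng.standard_normal((256, 50, 3)).astype("float32")    # batch, time, features
y = rng.uniform(0.0, 1.0, (256, 50, 1)).astype("float32")  # SOC in [0, 1]
X_t, y_t = torch.from_numpy(X), torch.from_numpy(y)
X_train, y_train, X_val, y_val = X_t[:200], y_t[:200], X_t[200:], y_t[200:]


class BGRU(nn.Module):
    """Bidirectional GRU regressor mapping measured signals to SOC."""

    def __init__(self, hidden, layers):
        super().__init__()
        self.gru = nn.GRU(3, hidden, layers, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):
        out, _ = self.gru(x)
        return self.head(out)


def objective(trial):
    # TPE proposes a hyperparameter combination for each trial.
    hidden = trial.suggest_int("hidden_units", 16, 128)
    layers = trial.suggest_int("num_layers", 1, 3)
    lr = trial.suggest_float("learning_rate", 1e-4, 1e-2, log=True)

    model = BGRU(hidden, layers)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()

    for epoch in range(20):
        model.train()
        opt.zero_grad()
        loss = loss_fn(model(X_train), y_train)
        loss.backward()
        opt.step()

        model.eval()
        with torch.no_grad():
            val_rmse = torch.sqrt(loss_fn(model(X_val), y_val)).item()

        # Report intermediate validation error so Hyperband can stop
        # unpromising trials early instead of training them to completion.
        trial.report(val_rmse, epoch)
        if trial.should_prune():
            raise optuna.TrialPruned()

    return val_rmse


study = optuna.create_study(
    direction="minimize",
    sampler=optuna.samplers.TPESampler(seed=0),
    pruner=optuna.pruners.HyperbandPruner(),
)
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```

In this setup the TPE sampler models the distribution of good versus poor trials to propose new hyperparameter combinations, while the Hyperband pruner terminates low-performing trials after a few epochs, which is the source of the time and compute savings described in the abstract.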

Link to publisher version (DOI)

http://dx.doi.org/10.1109/TIA.2022.3180282