A data-driven strategy using long short-term memory models and reinforcement learning to predict building electricity consumption

Publication Name

Applied Energy

Abstract

Data-driven modeling has emerged as a promising approach to predicting building electricity consumption and facilitating building energy management. However, most existing models suffer from performance degradation during the prediction process. This paper presents a new strategy that integrates Long Short-Term Memory (LSTM) models and Reinforcement Learning (RL) agents to forecast a building's next-day electricity consumption and peak electricity demand. In this strategy, LSTM models were first developed and trained on historical data as the base models for prediction. RL agents were then constructed and introduced to learn a policy that dynamically tunes the parameters of the LSTM models according to the prediction error. The strategy was tested using electricity consumption data collected from a group of university buildings and student accommodations. The results showed that, for the student accommodations, which exhibited relatively large monthly variations in daily electricity consumption, the proposed strategy increased the prediction accuracy by up to 23.5% compared with the strategy using the LSTM models only. However, when applied to buildings with insignificant monthly variations in daily electricity consumption, the prediction accuracy showed no obvious improvement over the use of the LSTM models alone. This study demonstrates how LSTM models and reinforcement learning with a self-optimization capability can provide more reliable predictions of daily electricity consumption and thereby facilitate optimal building operation and demand-side management.
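
The abstract does not specify the exact RL formulation (state, action and reward definitions) or how the agent adjusts the LSTM parameters, so the sketch below is only illustrative of the general idea: a PyTorch LSTM forecaster for next-day load, combined with a simple tabular Q-learning agent that observes the latest relative prediction error and decides how strongly to fine-tune the LSTM online. All names (e.g. DailyLoadLSTM, FINE_TUNE_RATES) and the synthetic data are hypothetical and not taken from the paper.

```python
# Illustrative sketch only, not the paper's exact method: base LSTM trained on
# history, then a tabular Q-learning agent that picks a fine-tuning strength
# for the LSTM from the most recent relative prediction error.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
np.random.seed(0)

SEQ_LEN = 7                              # past week of daily loads as input
FINE_TUNE_RATES = [0.0, 1e-4, 1e-3]      # actions: how aggressively to adapt


class DailyLoadLSTM(nn.Module):
    """Maps a window of past daily loads to the next day's load."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                # x: (batch, SEQ_LEN, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])     # (batch, 1)


def windows(series):
    xs = [series[i:i + SEQ_LEN] for i in range(len(series) - SEQ_LEN)]
    ys = [series[i + SEQ_LEN] for i in range(len(series) - SEQ_LEN)]
    return (torch.tensor(xs, dtype=torch.float32).unsqueeze(-1),
            torch.tensor(ys, dtype=torch.float32).unsqueeze(-1))


# 1. Train the base LSTM on historical data (synthetic series here).
history = 100 + 20 * np.sin(np.arange(400) / 30) + np.random.randn(400) * 3
x_train, y_train = windows(history[:300])
model = DailyLoadLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    nn.functional.mse_loss(model(x_train), y_train).backward()
    opt.step()

# 2. RL agent tunes the LSTM online from the observed prediction error.
q_table = np.zeros((3, len(FINE_TUNE_RATES)))   # states: small/medium/large error
eps, alpha, gamma = 0.1, 0.5, 0.9

def error_state(rel_err):
    return 0 if rel_err < 0.05 else 1 if rel_err < 0.15 else 2

state = 0
x_test, y_test = windows(history[300:])
for x, y in zip(x_test, y_test):
    x, y = x.unsqueeze(0), y.unsqueeze(0)
    # epsilon-greedy choice of fine-tuning strength
    a = (np.random.randint(len(FINE_TUNE_RATES)) if np.random.rand() < eps
         else int(q_table[state].argmax()))
    pred = model(x)
    rel_err = float(abs(pred - y) / y)
    next_state = error_state(rel_err)
    reward = -rel_err                            # smaller error, larger reward
    q_table[state, a] += alpha * (reward + gamma * q_table[next_state].max()
                                  - q_table[state, a])
    # apply the chosen action: one gradient step at the selected learning rate
    if FINE_TUNE_RATES[a] > 0:
        tune_opt = torch.optim.SGD(model.parameters(), lr=FINE_TUNE_RATES[a])
        tune_opt.zero_grad()
        nn.functional.mse_loss(model(x), y).backward()
        tune_opt.step()
    state = next_state
```

In this toy formulation the state is a coarse bin of the previous day's relative error, the action is the fine-tuning learning rate, and the reward is the negative relative error; the paper's actual design of these elements may differ.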

Open Access Status

This publication is not available as open access

Volume

306

Article Number

118078

Link to publisher version (DOI)

http://dx.doi.org/10.1016/j.apenergy.2021.118078