IEEE Access (Jan 2025)

Enhancing Reinforcement Learning-Based Energy Management Through Transfer Learning With Load and PV Forecasting

  • Chang Xu,
  • Masahiro Inuiguchi,
  • Naoki Hayashi,
  • Wong Jee Keen Raymond,
  • Hazlie Mokhlis,
  • Hazlee Azil Illias

DOI
https://doi.org/10.1109/access.2025.3548990
Journal volume & issue
Vol. 13
pp. 43956 – 43972

Abstract


Effective energy management in microgrids with renewable energy sources is crucial for maintaining system stability while minimizing operational costs. However, traditional Reinforcement Learning (RL) controllers often encounter challenges, including long training times and instability during learning. This study introduces a novel approach that integrates Transfer Learning (TL) techniques with RL controllers to address these issues. Using synthetic datasets generated by advanced forecasting models, such as ResNet18+BiLSTM, the proposed method pre-trains RL agents, embedding domain knowledge to enhance performance. The results, based on one year of operational data, show that TL-enhanced RL controllers significantly reduce cumulative operation costs and system imbalance, achieving up to a 62.63% reduction in costs and an 80% improvement in system balance compared to baseline models. Furthermore, the proposed method improves initial performance and shortens the training duration needed to reach operational thresholds. This approach demonstrates the potential of combining TL with RL to develop efficient, cost-effective solutions for real-time energy management in complex power systems.
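The core idea in the abstract (pre-training an RL agent on synthetic data, then fine-tuning it with a small budget so it starts from a warm state rather than from scratch) can be illustrated with a deliberately tiny sketch. Everything below is illustrative and not taken from the paper: the toy microgrid, the discretized net-load states, the cost function, and the episode counts are all assumptions, and a one-step tabular Q-learner stands in for the paper's RL controller. Synthetic and "real" data are modeled identically here for brevity.

```python
import random

# Hypothetical toy microgrid: each step, the controller picks a battery
# action given a discretized net-load level (load minus PV). All names
# and parameters here are illustrative assumptions, not from the paper.
ACTIONS = [-1, 0, 1]          # discharge, idle, charge (per-step energy unit)
N_STATES = 5                  # discretized net-load levels

def step_cost(state, action):
    # Cost grows with the residual imbalance: discharging (-1) offsets
    # load; charging (+1) adds to it.
    net_load = state - 2      # centre states around zero net load
    return abs(net_load + action)

def train(q, episodes, rng, alpha=0.5):
    # One-step Q-learning on randomly sampled (state, action) pairs,
    # with reward defined as negative cost.
    for _ in range(episodes):
        s = rng.randrange(N_STATES)
        a = rng.randrange(len(ACTIONS))
        target = -step_cost(s, ACTIONS[a])
        q[s][a] += alpha * (target - q[s][a])
    return q

def greedy_cost(q):
    # Total cost over all states when acting greedily w.r.t. q.
    return sum(step_cost(s, ACTIONS[max(range(len(ACTIONS)),
                                        key=lambda a: q[s][a])])
               for s in range(N_STATES))

rng = random.Random(0)
fresh = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]

# Transfer learning step: pre-train extensively on synthetic
# (forecast-generated) data ...
pretrained = train([row[:] for row in fresh], episodes=500, rng=rng)
# ... then give both the warm-started agent and a cold-started agent the
# same short fine-tuning budget on the target ("real") data.
warm = train([row[:] for row in pretrained], episodes=50, rng=rng)
cold = train([row[:] for row in fresh], episodes=50, rng=rng)

print(greedy_cost(warm), greedy_cost(cold))
```

The warm-started agent typically reaches the optimal greedy policy with the small fine-tuning budget, while the cold-started one may not, which mirrors the abstract's claim that TL improves initial performance and shortens the training needed to reach an operational threshold.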

Keywords