
Research on Energy Management of Microgrid in Power Supply System Using Deep Reinforcement Learning
Author(s) -
Xianzhi Jin,
Fei Lin,
Ye Wang
Publication year - 2021
Publication title -
IOP Conference Series: Earth and Environmental Science
Language(s) - English
Resource type - Journals
eISSN - 1755-1307
pISSN - 1755-1315
DOI - 10.1088/1755-1315/804/3/032042
Subject(s) - microgrid, reinforcement learning, computer science, energy storage, scheduling (production processes), energy management, control engineering, artificial intelligence, control (management), power (physics), engineering, energy (signal processing), operations management, physics, statistics, mathematics, quantum mechanics
Aiming at the energy-storage scheduling problem of a microgrid system with wind power generation, this paper proposes a microgrid energy-management strategy based on deep reinforcement learning. To analyze how different scenario combination models affect the microgrid energy-storage dispatching strategy, an environment model of the microgrid dispatching problem is constructed, taking a residential-user microgrid system as an example. When multiple factors such as power generation and load vary, the coordinated control of composite energy storage is a complex optimization and decision-making problem: different schemes can affect the stability of the system's power supply, its utilization efficiency, and its economic benefits. To this end, this paper applies three deep reinforcement learning algorithms and compares them experimentally. The results show clear differences in the algorithms' ability to converge to the optimal strategy, but they also demonstrate the feasibility and effectiveness of deep reinforcement learning for the coordinated control of composite energy storage, which has both academic significance and engineering value and can be applied to similar problems.
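The abstract does not give the paper's environment model, but the setup it describes (state from wind output, load, and battery state of charge; an action that charges or discharges storage; a reward tied to economic cost) can be sketched as a toy reinforcement-learning environment. Everything below is an illustrative assumption — the capacity, tariff, and load/wind profiles are invented, not taken from the paper:

```python
import random

class MicrogridEnv:
    """Toy microgrid dispatch environment (illustrative assumptions only)."""

    def __init__(self, capacity=10.0, seed=0):
        self.capacity = capacity          # assumed battery capacity, kWh
        self.rng = random.Random(seed)
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.capacity / 2      # state of charge, kWh
        self.wind, self.load = self._profiles()
        return (self.wind, self.load, self.soc)

    def _profiles(self):
        # Hypothetical stochastic wind output and residential load
        # with an evening demand peak.
        wind = max(0.0, self.rng.gauss(3.0, 1.0))
        load = 2.0 + (1.5 if 18 <= self.t % 24 < 22 else 0.0)
        return wind, load

    def step(self, action):
        """action in {-1, 0, +1}: discharge, idle, or charge 1 kWh."""
        # Clamp the transfer to the physical battery limits.
        delta = max(-self.soc, min(self.capacity - self.soc, float(action)))
        self.soc += delta
        # Any shortfall is imported from the grid at an assumed tariff;
        # the reward is the negative import cost, so an RL agent that
        # maximises reward minimises the economic cost of dispatch.
        grid_import = max(0.0, self.load + delta - self.wind)
        reward = -0.2 * grid_import
        self.t += 1
        self.wind, self.load = self._profiles()
        done = self.t >= 24               # one-day episode
        return (self.wind, self.load, self.soc), reward, done
```

A deep RL agent (e.g. one of the three algorithms the paper compares) would be trained against such an interface by repeatedly calling `reset()` and `step()`; a random policy rollout over one episode gives a baseline cumulative reward to beat.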