Open Access
Local Energy Trading Behavior Modeling With Deep Reinforcement Learning
Author(s) - Tao Chen, Wencong Su
Publication year - 2018
Publication title - IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/ACCESS.2018.2876652
Subject(s) - aerospace; bioengineering; communication, networking and broadcast technologies; components, circuits, devices and systems; computing and processing; engineered materials, dielectrics and plasmas; engineering profession; fields, waves and electromagnetics; general topics for engineers; geoscience; nuclear engineering; photonics and electrooptics; power, energy and industry applications; robotics and control systems; signal processing and analysis; transportation
In this paper, we model prosumers' energy trading behavior, together with the operation of an energy storage system, in a proposed event-driven local energy market. By modeling a prosumer's local energy trading strategies within the proposed holistic market model, the prosumer's decision-making process is formulated as a Markov decision process with many continuous variables. This decision-making process for local market participation is then solved with deep reinforcement learning using an experience replay mechanism. Specifically, a deep Q-learning algorithm for local energy trading, adapted from the deep Q-network (DQN), is proposed to facilitate such decision-making within an intelligent energy system and to promote prosumers' willingness to participate in the localized energy ecosystem.
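The abstract describes solving the prosumer's Markov decision process with Q-learning plus an experience replay buffer. The following is a minimal, hypothetical sketch of that loop: the state, action set, reward, and environment dynamics here are illustrative stand-ins (the paper's actual formulation is not reproduced), and a linear Q-function substitutes for the deep network so the example stays self-contained.

```python
import random
from collections import deque

import numpy as np

# Hypothetical prosumer state: (battery level, local price, net demand).
# Actions: discretized trade quantities from "buy a lot" to "sell a lot".
STATE_DIM, N_ACTIONS = 3, 5
GAMMA, ALPHA = 0.95, 0.01

rng = np.random.default_rng(0)
W = np.zeros((N_ACTIONS, STATE_DIM))   # linear stand-in for the deep Q-network
replay = deque(maxlen=1000)            # experience replay buffer

def q_values(state):
    # Q(s, a) = W[a] @ s for each discretized trading action a
    return W @ state

def act(state, eps=0.1):
    # epsilon-greedy exploration over trading actions
    if rng.random() < eps:
        return int(rng.integers(N_ACTIONS))
    return int(np.argmax(q_values(state)))

def step(state, action):
    # Toy dynamics: reward selling (high action index) when price is high.
    price = state[1]
    reward = (action - N_ACTIONS // 2) * (price - 0.5)
    next_state = rng.random(STATE_DIM)
    return reward, next_state

def train_batch(batch_size=32):
    # Sample a minibatch of stored transitions and take TD(0) update steps,
    # mirroring the experience replay mechanism described in the abstract.
    batch = random.sample(list(replay), min(batch_size, len(replay)))
    for s, a, r, s2 in batch:
        target = r + GAMMA * np.max(q_values(s2))
        td_error = target - q_values(s)[a]
        W[a] += ALPHA * td_error * s   # gradient step on the linear Q-function

state = rng.random(STATE_DIM)
for t in range(500):
    a = act(state)
    r, s2 = step(state, a)
    replay.append((state, a, r, s2))   # store transition for later replay
    state = s2
    if len(replay) >= 32:
        train_batch()
```

Replacing the linear Q-function with a neural network (and adding a periodically frozen target network) recovers the standard DQN structure the paper's algorithm is adapted from.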
