On the Practical Art of State Definitions for Markov Decision Process Construction
Author(s) -
William T. Scherer,
Stephen Adams,
Peter A. Beling
Publication year - 2018
Publication title -
IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2018.2819940
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Many problems faced by decision makers today involve the management of large-scale, complex systems that can be modeled as state-based control problems, specifically discrete Markov decision processes (MDPs). Typical examples include transportation systems, defense systems, healthcare networks, financial organizations, and general infrastructure problems. In all of these problems, decision makers have difficulty forecasting the future state of their system and capturing the dynamics of the states over time. In this paper, we discuss, via numerous examples, practical experiences in trying to build such models. Much of the literature addresses theoretical issues of solution convergence and algorithm performance; unfortunately, much of this research does not help with the practical business of building an actual MDP model. Thus, numerous books begin with a statement of the form "given the state space S...". A critical question for the practitioner is the creation of this state space S. We focus on this first step in the MDP modeling process, an often neglected and difficult step, and we discuss the practical implications and issues associated with the state definition, illustrating these issues with numerous examples. This paper is not meant to be a survey of state-based or MDP applications, but an overview of experiences building many of these models in diverse applications.
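As a hedged illustration of the abstract's point that modeling begins with choosing the state space S, the sketch below builds a toy discrete MDP. Everything here is assumed for illustration and not taken from the paper: a small inventory problem where the state is units on hand, actions are order quantities, demand is a coin flip, and all prices, capacities, and the discount factor are invented numbers. Once S, actions, and transitions are fixed, a standard algorithm such as value iteration applies mechanically.

```python
# Hypothetical toy MDP: inventory control. All parameters are assumptions
# chosen only to make the state-definition step concrete.

MAX_STOCK = 3  # assumed warehouse capacity

# The modeling choice the paper emphasizes: here S = {0, 1, 2, 3} units on hand.
states = list(range(MAX_STOCK + 1))

def actions(s):
    """Feasible order quantities in state s (cannot exceed capacity)."""
    return list(range(MAX_STOCK - s + 1))

def transitions(s, a):
    """Return [(next_state, probability, reward), ...] for taking a in s.

    Assumed dynamics: demand is 0 or 1 unit, each with probability 0.5;
    each sale earns 5, each unit ordered costs 1.
    """
    stocked = s + a
    out = []
    for demand, p in [(0, 0.5), (1, 0.5)]:
        sold = min(stocked, demand)
        out.append((stocked - sold, p, 5 * sold - 1 * a))
    return out

def value_iteration(gamma=0.9, sweeps=200):
    """Standard value iteration over the chosen state space S."""
    V = {s: 0.0 for s in states}
    for _ in range(sweeps):
        V = {s: max(sum(p * (r + gamma * V[s2])
                        for s2, p, r in transitions(s, a))
                    for a in actions(s))
             for s in states}
    return V
```

Note that the solution code is routine; the substantive decisions were all made above it, in fixing what a "state" means and which variables it records, which is the step the paper examines.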