A COMPARISON OF SOME DYNAMIC, LINEAR AND POLICY ITERATION METHODS FOR RESERVOIR OPERATION
Author(s) -
Loucks D. P.,
Falkson L. M.
Publication year - 1970
Publication title -
JAWRA Journal of the American Water Resources Association
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.957
H-Index - 105
eISSN - 1752-1688
pISSN - 1093-474X
DOI - 10.1111/j.1752-1688.1970.tb00489.x
Subject(s) - inflow, dynamic programming, mathematical optimization, stochastic programming, linear programming, Markov chain, Markov decision process, Markov process, discrete time and continuous time, flow (mathematics), volume, mathematics, statistics, computer science
Within the past few years, a number of papers have been published in which stochastic mathematical programming models, incorporating first-order Markov chains, have been used to derive alternative sequential operating policies for a multiple-purpose reservoir. This paper attempts to review and compare three such mathematical modeling and solution techniques, namely dynamic programming, policy iteration, and linear programming. It is assumed that the flows into the reservoir are serially correlated stochastic quantities. The design parameters are assumed fixed, i.e., the reservoir capacity and the storage and release targets, if any, are predetermined. The models are discrete since the continuous variables of time, volume, and flow are approximated by discrete units. The problem is to derive an optimal operating policy. Such a policy defines the reservoir release as a function of the current storage volume and inflow. The form of the solution and some of the advantages, limitations, and computational efficiencies of each of the models and their algorithms are compared using a simplified numerical example.
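The setting the abstract describes — discrete storage, release, and inflow levels, with inflows following a first-order Markov chain — can be sketched as a Markov decision process and solved by policy iteration, one of the three techniques the paper compares. The sketch below is illustrative only: the capacity, inflow chain, release target, and quadratic deviation penalty are assumptions for the example, not values from the paper.

```python
import numpy as np

# Illustrative discretization (all numbers are assumptions, not from the paper):
CAP = 2                      # storage levels 0..CAP
INFLOWS = [0, 1]             # discrete inflow volumes
P = np.array([[0.7, 0.3],    # first-order Markov chain for inflows:
              [0.4, 0.6]])   # P[i, j] = Pr(next inflow j | current inflow i)
RELEASES = [0, 1, 2]
TARGET = 1                   # release target; reward penalizes deviation from it
GAMMA = 0.9                  # discount factor

# A state pairs current storage with the current inflow index.
states = [(s, q) for s in range(CAP + 1) for q in range(len(INFLOWS))]
idx = {st: k for k, st in enumerate(states)}

def step(s, q, r):
    """Transition: release r from storage s given inflow INFLOWS[q].

    Returns (next_storage, reward), or None if r exceeds available water.
    Storage above capacity spills."""
    water = s + INFLOWS[q]
    if r > water:
        return None
    return min(water - r, CAP), -(r - TARGET) ** 2

def policy_iteration():
    policy = {st: 0 for st in states}   # releasing 0 is always feasible
    while True:
        # Policy evaluation: solve (I - GAMMA * P_pi) v = r_pi exactly.
        n = len(states)
        A, b = np.eye(n), np.zeros(n)
        for (s, q) in states:
            s_next, rew = step(s, q, policy[(s, q)])
            b[idx[(s, q)]] = rew
            for q2 in range(len(INFLOWS)):
                A[idx[(s, q)], idx[(s_next, q2)]] -= GAMMA * P[q, q2]
        v = np.linalg.solve(A, b)

        # Policy improvement: greedy one-step lookahead over feasible releases.
        def q_value(s, q, r):
            out = step(s, q, r)
            if out is None:
                return None
            s_next, rew = out
            return rew + GAMMA * sum(P[q, q2] * v[idx[(s_next, q2)]]
                                     for q2 in range(len(INFLOWS)))
        stable = True
        for (s, q) in states:
            best_r = policy[(s, q)]
            best_val = q_value(s, q, best_r)
            for r in RELEASES:
                val = q_value(s, q, r)
                if val is not None and val > best_val + 1e-9:
                    best_val, best_r = val, r
            if best_r != policy[(s, q)]:
                policy[(s, q)] = best_r
                stable = False
        if stable:
            return policy, v
```

With these numbers the resulting policy releases the target volume whenever enough water is available and releases nothing otherwise, illustrating the paper's point that the policy maps each (storage, inflow) state to a release decision.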
