Canonical Transformations and the Hamilton‐Jacobi Theorem in the Optimum Control Theory
Author(s) - Djukić Djordje S., Vujanović Bozidar D.
Publication year - 1977
Publication title - ZAMM ‐ Journal of Applied Mathematics and Mechanics / Zeitschrift für Angewandte Mathematik und Mechanik
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.449
H-Index - 51
eISSN - 1521-4001
pISSN - 0044-2267
DOI - 10.1002/zamm.19770571105
Subject(s) - mathematics , hamilton–jacobi equation , hamilton–jacobi–bellman equation , optimal control , bellman equation , partial differential equation , maximum principle , pontryagin's minimum principle , hamiltonian (control theory) , dynamic programming , first order partial differential equation , differential equation , mathematical analysis , mathematical optimization
Abstract This paper deals with the problem of establishing a profound relationship between Pontryagin's maximum principle and Bellman's dynamic programming method via canonical transformations of the variables, as is the case in classical mechanics. A rigorous form of the Hamilton‐Jacobi theorem is proved for optimal control systems. Further, it is shown that controlled systems may be treated by a partial differential equation of the Hamilton‐Jacobi type which, however, is fundamentally different from the Bellman functional equation. The solution of this equation is a function of the generalized momenta appearing in Pontryagin's theory and of the time. In some cases this equation can be used to advantage in place of the Bellman equation. An example is solved to illustrate the presented theory.
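For orientation, a standard textbook form of the Bellman (Hamilton‐Jacobi‐Bellman) equation mentioned in the abstract is sketched below; the symbols V, f, L, and H are the usual ones for a value function, system dynamics, running cost, and Pontryagin Hamiltonian, and are assumptions for illustration rather than the paper's own notation, whose Hamilton‐Jacobi‐type equation is instead posed in the generalized momenta and the time.

% Sketch (not the paper's formulation): Bellman equation for the value
% function V(x,t) of a system \dot{x} = f(x,u,t) with running cost L(x,u,t),
% and its Hamilton--Jacobi form via the Pontryagin Hamiltonian
% H(x,p,u,t) = p^{\mathsf{T}} f(x,u,t) - L(x,u,t) (sign conventions vary).
\[
  -\frac{\partial V}{\partial t}(x,t)
  \;=\;
  \min_{u}\Bigl[\, L(x,u,t) + \nabla_x V(x,t)^{\mathsf{T}} f(x,u,t) \Bigr],
\]
\[
  \frac{\partial V}{\partial t}(x,t)
  \;-\;
  \max_{u}\, H\bigl(x,\,-\nabla_x V(x,t),\,u,\,t\bigr)
  \;=\; 0 .
\]

In this standard setting the unknown is a function of the state x and the time; the equation discussed in the paper differs in that its unknown depends on the generalized momenta of Pontryagin's theory and the time.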