
Dynamic Programming Approaches to the Hamilton-Jacobi-Bellman Theory of Energy Systems

Stanislaw Sieniutycz, Piotr Kuran
Dynamical problems of maximum power produced in thermal systems, and the associated problems of minimum entropy production, are governed by Hamilton–Jacobi–Bellman (HJB) equations which describe the corresponding optimal functions and controls. Yet the optimal relaxation curve is often non-exponential, and the governing HJB equations cannot be solved analytically. Systems with nonlinear kinetics (e.g., radiation engines) are particularly difficult; discrete counterparts of the original HJB equations and numerical approaches are therefore applied. Discrete algorithms of dynamic programming (DP), which approximate the original continuous algorithms and lead to work limits and generalized availabilities, are effective.
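For orientation, a generic HJB equation of this kind is sketched below for a maximum-work problem over a fixed horizon; the symbols ($V$, $T$, $u$, $f_0$, $f$) are illustrative placeholders rather than the paper's own notation.

$$
-\frac{\partial V(t,T)}{\partial t} \;=\; \max_{u}\left\{\, f_0(T,u) \;+\; \frac{\partial V(t,T)}{\partial T}\, f(T,u) \right\},
\qquad V(t_f,T)=0,
$$

where $V$ is the optimal (maximum-work) function, $T$ the state of the resource (e.g., its temperature), $u$ the control, $f_0$ the power intensity, and $f=\dot{T}$ the relaxation kinetics.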
We investigate the convergence of discrete algorithms to solutions of HJB equations, discrete approximations of controls, and the role of the Lagrange multiplier associated with the duration constraint. In analytical discrete schemes, the Legendre transformation is a significant tool leading to the original work function. We also describe numerical algorithms of dynamic programming and consider dimensionality reduction in these algorithms.
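As an illustration of such a discrete DP scheme, the sketch below runs a backward Bellman recursion on time, state, and control grids. It assumes a toy linear (Newtonian) heat-exchange model with a Carnot-type converter, not the paper's radiation kinetics; all names and numerical values are hypothetical.

```python
import numpy as np

# Toy backward dynamic-programming sweep approximating an HJB solution.
# Illustrative model (not the paper's radiation-engine kinetics):
#   state   T   -- temperature of a finite hot body
#   control Tp  -- contact temperature of a Carnot-type converter
#   kinetics    dT/dt = -(g / C) * (T - Tp)
#   power       p = g * (T - Tp) * (1 - T0 / Tp)
T0, g, C = 300.0, 1.0, 1.0          # environment temperature, conductance, heat capacity
t_final, n_stages = 2.0, 200        # fixed process duration and number of time stages
dt = t_final / n_stages

T_grid = np.linspace(310.0, 600.0, 240)      # state grid
Tp_grid = np.linspace(301.0, 600.0, 300)     # control grid

V = np.zeros_like(T_grid)                    # terminal condition: no work left at t_final
for _ in range(n_stages):                    # backward sweep over stages
    V_new = np.empty_like(V)
    for i, T in enumerate(T_grid):
        q = g * (T - Tp_grid)                        # heat flux for each candidate control
        stage_work = q * (1.0 - T0 / Tp_grid) * dt   # work produced in one stage
        T_next = T - q / C * dt                      # explicit state update
        cost_to_go = np.interp(T_next, T_grid, V)    # interpolate value at the new state
        V_new[i] = np.max(stage_work + cost_to_go)   # Bellman recursion (maximum work)
    V = V_new

# V now approximates the maximum-work function on the grid; refining dt and the
# grids should drive it toward the solution of the continuous HJB equation.
print(f"approximate max work from T = 600 K: {np.interp(600.0, T_grid, V):.2f} J")
```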