Control of a stochastic nonlinear system by the method of dynamic programming

  • Choi, Wan-Sik (TT&C Section Satellite Communication Division, ETRI, Yusong P.O.Box 106, Taejon, 305-600)
  • Published : 1994.10.01

Abstract

In this paper, we consider an optimal control problem for a nonlinear stochastic system. The dynamic programming approach is employed to formulate the stochastic optimal control problem. As an optimality condition, the dynamic programming equation, also called the Bellman equation, is obtained; this equation seldom admits an analytical solution and is also very difficult to solve numerically. We obtain the numerical solution of the Bellman equation using an algorithm based on a finite difference approximation and the contraction mapping method. Optimal controls are constructed from the solution of the Bellman equation. We also construct a test case in order to investigate the actual performance of the algorithm.
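The sketch below is not the paper's code; it is a minimal illustration of the general approach the abstract describes: discretize the state space, approximate the Bellman equation with finite differences (here via the standard Markov-chain approximation for a controlled diffusion), and apply successive approximation, which converges because the discounted Bellman operator is a contraction. The dynamics `f`, running cost, noise level `sigma`, discount rate `beta`, grids, and boundary treatment are all assumptions made for the example.

```python
import numpy as np

# Assumed 1-D controlled diffusion: dx = f(x, u) dt + sigma dW, with discounted cost.
def f(x, u):
    return -x + u                  # hypothetical drift (a nonlinear term could be added)

def running_cost(x, u):
    return x**2 + 0.1 * u**2       # hypothetical running cost

sigma = 0.5                        # diffusion coefficient (assumed)
beta = 0.5                         # discount rate; beta > 0 makes the operator a contraction

# State and control grids (assumed ranges).
h = 0.05
xs = np.arange(-2.0, 2.0 + h, h)
us = np.linspace(-1.0, 1.0, 21)
V = np.zeros_like(xs)              # initial guess for the value function

def bellman_operator(V):
    """One sweep of the discretized Bellman equation (finite-difference /
    Markov-chain approximation), minimizing over the control grid."""
    V_new = np.empty_like(V)
    policy = np.empty_like(V)
    for i, x in enumerate(xs):
        best, best_u = np.inf, 0.0
        for u in us:
            b = f(x, u)
            Q = sigma**2 + h * abs(b)
            dt = h**2 / Q                          # local time step of the approximating chain
            p_up = (0.5 * sigma**2 + h * max(b, 0.0)) / Q
            p_dn = (0.5 * sigma**2 + h * max(-b, 0.0)) / Q
            V_up = V[min(i + 1, len(xs) - 1)]      # reflecting boundary (assumed)
            V_dn = V[max(i - 1, 0)]
            val = running_cost(x, u) * dt + np.exp(-beta * dt) * (p_up * V_up + p_dn * V_dn)
            if val < best:
                best, best_u = val, u
        V_new[i], policy[i] = best, best_u
    return V_new, policy

# Successive approximation: since the discount factor exp(-beta*dt) < 1, the operator
# is a contraction and the iteration converges to the discrete value function.
for it in range(2000):
    V_new, policy = bellman_operator(V)
    if np.max(np.abs(V_new - V)) < 1e-6:
        break
    V = V_new

print("converged after", it, "iterations")
print("optimal control at x = 1.0:", policy[np.argmin(np.abs(xs - 1.0))])
```

Once the fixed point is reached, the minimizing control at each grid point gives a feedback law, which is the sense in which optimal controls are constructed from the solution of the Bellman equation.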

Keywords