Optimal Control of Stochastic Bilinear Systems

확률적 이선형시스템의 최적제어

  • Published: 1982.07.01

Abstract

We derive an optimal control for stochastic bilinear systems. To that end, we first formulate the stochastic bilinear system and estimate its state when the state is not directly observable. The optimal control problem for this system is then reviewed in light of three optimization techniques. An optimal control is derived from the Hamilton-Jacobi-Bellman equation via dynamic programming; it is a combination of terms linear and quadratic in the state. This negative feedback control also stabilizes the system, provided the value function is chosen to be a Lyapunov function. Several other properties of this control are discussed.
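To illustrate the kind of closed loop the abstract describes, the sketch below simulates a scalar stochastic bilinear system under a state feedback combining linear and quadratic terms. All parameter values, the feedback gains `k1` and `k2`, and the function name `simulate_bilinear` are hypothetical choices for illustration, not the paper's actual system or derived control law.

```python
import numpy as np

def simulate_bilinear(x0=2.0, a=0.5, b=-1.0, sigma=0.2,
                      k1=1.0, k2=0.5, dt=1e-3, steps=5000, seed=0):
    """Euler-Maruyama simulation of an assumed scalar stochastic
    bilinear system
        dx = (a*x + b*x*u) dt + sigma*x dW,
    closed with a hypothetical feedback u = k1*x + k2*x**2 that, as in
    the abstract, combines terms linear and quadratic in the state."""
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(steps):
        u = k1 * x + k2 * x**2            # linear + quadratic state feedback
        drift = a * x + b * x * u         # bilinear: control multiplies state
        diffusion = sigma * x             # multiplicative (state-dependent) noise
        x += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
    return x
```

Here the open loop (`u = 0`) grows since `a > 0`, while the feedback drives the trajectory toward a small neighborhood of the origin, a rough analogue of the Lyapunov-based stability argument sketched in the abstract.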

Keywords