Taylor Series Discretization Method for Input-Delay Nonlinear Systems

  • 장정 (Dept. of Electronics and Information Engineering, Chonbuk National University) ;
  • 정길도 (Dept. of Electronics and Information Engineering, Chonbuk National University)
  • Published : 2007.04.27

Abstract

A new discretization method for input-driven nonlinear continuous-time systems with time delay is proposed. It is based on a combination of Taylor series expansion and a first-order hold assumption. The mathematical structure of the new discretization scheme is explored, and the performance of the proposed discretization procedure is evaluated through case studies. The results demonstrate that the proposed scheme satisfies the system requirements even under a large sampling period. A simulated comparison between the first-order hold and zero-order hold assumptions is also presented.
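To illustrate the idea behind the abstract, the following is a minimal sketch (not the paper's exact algorithm) of a second-order Taylor series discretization of a scalar input-delay system x'(t) = f(x) + g(x)u(t − τ), under a first-order hold (FOH) assumption in which the input varies linearly between samples. The example drift f(x) = −x³, the unit input gain, and the assumption that the delay is an integer multiple of the sampling period (τ = dT) are all hypothetical choices made for illustration.

```python
import numpy as np

# Illustrative sketch: 2nd-order Taylor discretization with FOH input.
# System (hypothetical example):  x'(t) = f(x) + g(x) * u(t - tau)
# FOH assumption: u(t) is linear on [kT, (k+1)T], so u'(t) = (u[k+1]-u[k])/T.
# Delay assumed to be an integer multiple of the sampling period: tau = d*T.

def f(x):          # example drift term (chosen for illustration)
    return -x**3

def df(x):         # its derivative with respect to x
    return -3 * x**2

def g(x):          # example input gain (constant here)
    return 1.0

def dg(x):
    return 0.0

def taylor_foh_step(x, u_d, u_d_next, T):
    """One step of a 2nd-order Taylor discretization with FOH input.

    u_d      : delayed input sample u(kT - tau)
    u_d_next : next delayed sample, used for the FOH slope
    """
    udot = (u_d_next - u_d) / T                      # FOH input derivative
    x1 = f(x) + g(x) * u_d                           # x'(kT)
    x2 = (df(x) + dg(x) * u_d) * x1 + g(x) * udot    # x''(kT) by chain rule
    return x + T * x1 + 0.5 * T**2 * x2              # truncated Taylor step

# Simulate with a sinusoidal input; u_delayed[k] holds u(kT - tau).
T, d, N = 0.05, 3, 200
u_delayed = np.sin(0.5 * T * np.arange(N + 1))
x = 1.0
for k in range(N):
    x = taylor_foh_step(x, u_delayed[k], u_delayed[k + 1], T)
```

Higher-order Taylor terms would require further Lie derivatives of f and g; truncating at second order keeps the sketch short while still showing how the FOH slope enters the expansion.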

Keywords