An improved plasma model by optimizing neuron activation gradient

  • 김병환 (Department of Electrical Engineering, Chonnam National University);
  • 박성진 (Department of Electrical Engineering, Chonnam National University)
  • Published: 2000.10.01

Abstract

Back-propagation neural network (BPNN) is the most widely used paradigm for modeling semiconductor manufacturing processes, and it typically employs a bipolar or unipolar sigmoid function as the neuron activation function in both the hidden and output layers. In this study, the applicability of a linear function as the neuron activation function is investigated. The linear function was used in combination with the sigmoid functions. Comparison revealed that one particular combination, the bipolar sigmoid function in the hidden layer and the linear function in the output layer, yields the highest prediction accuracy. For the BPNN with this combination, predictive performance was further optimized by incrementally adjusting the gradient (slope) of each activation function. A total of 121 gradient combinations were examined, from which one optimal set was determined. The predictive performance of the corresponding model was compared to that of the non-optimized model, revealing that the optimized model is more accurate than its non-optimized counterpart by more than 30%. This demonstrates that the proposed gradient-optimized learning for a BPNN with a linear function in the output layer is an effective means of constructing plasma models. The plasma modeled is a hemispherical inductively coupled plasma, characterized by a 2⁴ (16-run) full factorial design. To validate the models, another eight experiments were conducted. The process variables varied in the design are source power, pressure, chuck holder position, and chlorine flow rate. The plasma attributes, measured with a Langmuir probe, are electron density, electron temperature, and plasma potential.
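
As a rough illustration of the approach described in the abstract, the Python sketch below trains a one-hidden-layer BPNN with a bipolar sigmoid hidden activation and a linear output activation, then grid-searches 11 x 11 = 121 combinations of the two activation gradients and keeps the set with the lowest validation error. All data, gradient ranges, network sizes, and function names here are hypothetical stand-ins, not values taken from the paper.

    import numpy as np

    def bipolar_sigmoid(x, g):
        # Bipolar sigmoid with adjustable gradient (slope) g; output range (-1, 1).
        return 2.0 / (1.0 + np.exp(-g * x)) - 1.0

    def bipolar_sigmoid_deriv(y, g):
        # Derivative expressed in terms of the activation value y.
        return 0.5 * g * (1.0 - y * y)

    def train_bpnn(X, T, n_hidden=4, g_hidden=1.0, g_out=1.0,
                   lr=0.05, epochs=2000, seed=0):
        # One hidden layer (bipolar sigmoid) and a linear output layer whose
        # gradient g_out simply scales the weighted sum.
        rng = np.random.default_rng(seed)
        W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
        b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.5, size=(n_hidden, T.shape[1]))
        b2 = np.zeros(T.shape[1])
        for _ in range(epochs):
            H = bipolar_sigmoid(X @ W1 + b1, g_hidden)   # hidden activations
            Y = g_out * (H @ W2 + b2)                    # linear output
            err = Y - T
            # Back-propagate the error through both layers (mean squared error).
            dY = err * g_out
            dH = (dY @ W2.T) * bipolar_sigmoid_deriv(H, g_hidden)
            W2 -= lr * H.T @ dY / len(X)
            b2 -= lr * dY.mean(axis=0)
            W1 -= lr * X.T @ dH / len(X)
            b1 -= lr * dH.mean(axis=0)
        return W1, b1, W2, b2

    def predict(X, params, g_hidden, g_out):
        W1, b1, W2, b2 = params
        return g_out * (bipolar_sigmoid(X @ W1 + b1, g_hidden) @ W2 + b2)

    # Synthetic data standing in for the 16 factorial training runs and the
    # eight validation runs (inputs: power, pressure, position, Cl2 flow).
    rng = np.random.default_rng(1)
    X_train, X_val = rng.uniform(-1, 1, (16, 4)), rng.uniform(-1, 1, (8, 4))
    f = lambda X: np.tanh(X @ np.array([[0.8], [-0.5], [0.3], [0.6]]))
    T_train, T_val = f(X_train), f(X_val)

    # Examine 11 x 11 = 121 gradient combinations and keep the best one.
    grads = np.linspace(0.5, 1.5, 11)
    best = None
    for gh in grads:
        for go in grads:
            params = train_bpnn(X_train, T_train, g_hidden=gh, g_out=go)
            rmse = np.sqrt(np.mean((predict(X_val, params, gh, go) - T_val) ** 2))
            if best is None or rmse < best[0]:
                best = (rmse, gh, go)
    print("best RMSE %.4f at hidden gradient %.2f, output gradient %.2f" % best)

The grid search is the simplest way to realize the "121 combinations" mentioned above; in practice each candidate model would be trained on the factorial-design data and scored on the eight held-out experiments for each plasma attribute.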

Keywords