A Verification Algorithm for Temperature Uniformity of the Large-area Susceptor

  • Received : 2014.05.22
  • Accepted : 2014.08.27
  • Published : 2014.10.01

Abstract

The performance of a next-generation susceptor is governed by its temperature uniformity, which must be assured to produce large-sized flat panel displays reliably. In this paper, we propose a learning-based estimation model of the susceptor to predict and appropriately assess temperature uniformity. Artificial Neural Networks (ANNs) and Support Vector Machines (SVMs) are compared for their suitability as the learning estimation model. It is shown that SVMs provide a more suitable verification of the uniformity model than ANNs at each stage of temperature variation. A practical procedure for estimating the uniformity of the susceptor temperature was developed using the SVM prediction algorithm.
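
As a rough, hypothetical illustration of the ANN-versus-SVM comparison described in the abstract (not the authors' data or implementation), the sketch below trains an SVM regressor and a small neural network on synthetic susceptor-like temperature data and reports their test errors; the input variables, the synthetic temperature response, and the scikit-learn model settings are all assumptions.

```python
# Hypothetical sketch: compare an SVM regressor and a small ANN for
# predicting a susceptor temperature reading from process inputs.
# The synthetic data and model settings are illustrative assumptions only.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Assumed inputs: e.g. heater power, chamber pressure, heating time (normalized).
X = rng.uniform(0.0, 1.0, size=(500, 3))
# Assumed target: a smooth nonlinear temperature response plus measurement noise.
y = (300.0 + 80.0 * X[:, 0] + 25.0 * np.sin(3.0 * X[:, 1])
     - 15.0 * X[:, 2] ** 2 + rng.normal(0.0, 1.0, size=500))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

models = {
    "SVM (RBF kernel)": SVR(kernel="rbf", C=100.0, epsilon=0.5),
    "ANN (1 hidden layer)": MLPRegressor(hidden_layer_sizes=(16,),
                                         max_iter=5000, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.2f} (temperature units)")
```

A consistently lower test error for the SVM on held-out data at each temperature stage is the kind of evidence the paper relies on to favor SVMs over ANNs for the uniformity verification model.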

Keywords
