
Supervised Learning Artificial Neural Network Parameter Optimization and Activation Function Basic Training Method using Spreadsheets

  • Hur, Kyeong (Department of Computer Education, Gyeong-In National University of Education)
  • Received : 2021.07.31
  • Accepted : 2021.08.18
  • Published : 2021.08.31

Abstract

In this paper, we propose a supervised-learning artificial neural network parameter optimization method and a basic teaching method for activation functions, in order to design an introductory artificial neural network curriculum offered as a liberal arts course for non-majors. To this end, the parameter optimization solution is found in a spreadsheet, without any programming. This teaching method allows instruction to concentrate on the basic principles of artificial neural network operation and implementation, and the visualized spreadsheet data increases the interest and educational effect for non-majors. The proposed content consists of artificial neurons with sigmoid and ReLU activation functions, generation of supervised learning data, configuration and parameter optimization of a supervised-learning artificial neural network, implementation and performance analysis of the network in a spreadsheet, and an analysis of education satisfaction. Considering parameter optimization that permits negative parameter values for both the sigmoid-neuron and ReLU-neuron artificial neural networks, we propose a method for teaching four performance-analysis results on artificial neural network parameter optimization and conduct an education satisfaction analysis.
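As an illustration of the kind of exercise described above, the minimal sketch below implements a single artificial neuron with a selectable sigmoid or ReLU activation and finds its parameters by evaluating a small grid that allows negative values, which mirrors tabulating candidate parameters in spreadsheet cells. The AND-gate training data, the grid range, and the mean-squared-error criterion are assumptions made for this example and are not taken from the paper, which carries out the equivalent search with spreadsheet formulas rather than code.

    import numpy as np

    # Activation functions discussed in the paper.
    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):
        return np.maximum(0.0, x)

    # Hypothetical supervised-learning data for a single neuron: an AND gate.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1], dtype=float)

    def mse(w1, w2, b, activation):
        # Mean squared error of one neuron with weights (w1, w2) and bias b.
        out = activation(X[:, 0] * w1 + X[:, 1] * w2 + b)
        return np.mean((out - y) ** 2)

    # Grid search over a range that allows negative parameter values,
    # analogous to listing candidate parameters in spreadsheet cells.
    candidates = np.arange(-2.0, 2.01, 0.1)
    best = None
    for w1 in candidates:
        for w2 in candidates:
            for b in candidates:
                err = mse(w1, w2, b, sigmoid)  # swap in relu to compare activations
                if best is None or err < best[0]:
                    best = (err, w1, w2, b)

    print("best MSE %.4f at w1=%.1f, w2=%.1f, b=%.1f" % best)

Replacing sigmoid with relu in the call to mse reproduces the comparison between the two activation functions; in a spreadsheet, the same cell-by-cell evaluation can be visualized directly with a chart of the resulting errors.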

