http://dx.doi.org/10.14702/JPEE.2021.233

Supervised Learning Artificial Neural Network Parameter Optimization and Activation Function Basic Training Method using Spreadsheets  

Hur, Kyeong (Department of Computer Education, Gyeongin National University of Education)
Publication Information
Journal of Practical Engineering Education, vol. 13, no. 2, pp. 233-242, 2021
Abstract
In this paper, we propose a parameter-optimization method for supervised-learning artificial neural networks and a basic teaching method for activation functions, intended as a liberal-arts course for non-majors and as the foundation of an introductory artificial neural network curriculum. The approach finds parameter-optimization solutions in a spreadsheet, without any programming, so that students can concentrate on the basic principles of artificial neural network operation and implementation. The spreadsheet's visualized data also raises non-majors' interest and improves the educational effect. The proposed content consists of artificial neurons with sigmoid and ReLU activation functions, supervised-learning data generation, configuration and parameter optimization of a supervised-learning artificial neural network, spreadsheet-based implementation and performance analysis of that network, and an analysis of education satisfaction. Considering the optimization of negative parameters in both the sigmoid and the ReLU neuron networks, we present a teaching method built around four performance-analysis results on artificial neural network parameter optimization and report an analysis of training satisfaction.
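The method itself runs entirely in a spreadsheet, with no programming. As a rough illustration of the same idea, the Python sketch below fits a single neuron y = f(wx + b) with a sigmoid or ReLU activation by a brute-force grid search over (w, b), the kind of derivative-free parameter search a spreadsheet Solver or goal-seek pass performs. The training data, parameter grid, and mean-squared-error measure here are assumptions for illustration, not taken from the paper.

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def relu(z):
        return max(0.0, z)

    def mse(w, b, data, act):
        # Mean squared error of the neuron act(w*x + b) over pairs (x, t).
        return sum((act(w * x + b) - t) ** 2 for x, t in data) / len(data)

    # Hypothetical supervised data: inputs in [0, 1] with a step-like target,
    # standing in for the paper's "supervised learning data generation" step.
    data = [(x / 10.0, 1.0 if x >= 5 else 0.0) for x in range(11)]

    for name, act in (("sigmoid", sigmoid), ("ReLU", relu)):
        best = (float("inf"), 0.0, 0.0)
        # Grid search over a range that includes negative parameter values,
        # since the paper also considers optimizing negative parameters.
        for wi in range(-100, 101):
            for bi in range(-100, 101):
                w, b = wi / 10.0, bi / 10.0
                err = mse(w, b, data, act)
                if err < best[0]:
                    best = (err, w, b)
        print(f"{name}: MSE = {best[0]:.4f} at w = {best[1]}, b = {best[2]}")

A spreadsheet realizes the same search with a grid of formula cells and the built-in Solver, which is why no programming is required and why intermediate values stay visible to students at every step.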
Keywords
Activation function; Artificial neural network; AI education; Deep learning; Non-major undergraduates; Parameter optimization; Supervised learning;