http://dx.doi.org/10.3745/KTSDE.2020.9.7.213

Performance Improvement Method of Convolutional Neural Network Using Agile Activation Function  

Kong, Na Young (Department of Culture Technology, Jeonju University)
Ko, Young Min (Department of Artificial Intelligence, Jeonju University)
Ko, Sun Woo (Department of Smart Media, Jeonju University)
Publication Information
KIPS Transactions on Software and Data Engineering / v.9, no.7, 2020, pp. 213-220
Abstract
A convolutional neural network is composed of convolutional layers and fully connected layers, and a nonlinear activation function is applied in each of them. The activation function used in a neural network models the way a biological neuron transmits information: it passes a signal on when the input exceeds a certain threshold and remains silent otherwise. Conventional activation functions have no direct relationship with the loss function, so the search for an optimal solution is slow. To improve on this, an agile activation function that generalizes the conventional activation function is proposed. The agile activation function can improve the performance of a deep neural network by selecting the optimal agile parameter through learning, using the first derivative of the loss function with respect to the agile parameter during backpropagation. Experiments on the MNIST classification problem show that the agile activation function outperforms conventional activation functions.
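The abstract does not give the exact functional form of the agile activation function, but the core mechanism it describes — treating an activation parameter as trainable and updating it with the first derivative of the loss with respect to that parameter — can be sketched as follows. Here a sigmoid with a hypothetical trainable slope parameter `k` stands in for the agile function; the form of the function, the single-neuron model, and all names are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def agile_sigmoid(x, k):
    """Hypothetical parametric activation: a sigmoid whose slope k is learned."""
    return 1.0 / (1.0 + np.exp(-k * x))

def train_step(x, y, w, k, lr=0.1):
    """One gradient step on a one-neuron model y_hat = agile_sigmoid(w*x, k).

    Both the weight w and the activation parameter k are updated, the latter
    using dL/dk, the first derivative of the loss with respect to the
    activation's own parameter (the idea described in the abstract).
    """
    z = w * x
    y_hat = agile_sigmoid(z, k)
    err = y_hat - y                 # dL/dy_hat for L = 0.5 * (y_hat - y)^2
    s = y_hat * (1.0 - y_hat)       # derivative of sigmoid at k*z
    dL_dw = err * s * k * x         # chain rule through the product k*z
    dL_dk = err * s * z             # gradient w.r.t. the activation parameter
    return w - lr * dL_dw, k - lr * dL_dk

# Toy fit: push the output toward 1 for input x = 2.
w, k = 0.5, 1.0
for _ in range(200):
    w, k = train_step(x=2.0, y=1.0, w=w, k=k)
```

Because `k` receives its own gradient, the shape of the activation co-adapts with the weights instead of staying fixed, which is the sense in which the activation participates in minimizing the loss.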
Keywords
Convolutional Neural Network; Agile Activation Function; Backpropagation; Learning;