[1] Y. M. Ko, P. H. Li, and S. W. Ko, "Performance improvement method of fully connected neural network using combined parametric activation functions," KIPS Transactions on Software and Data Engineering, Vol.11, No.1, pp.1-10, 2022.
[2] N. Y. Kong, Y. M. Ko, and S. W. Ko, "Performance improvement method of convolutional neural network using agile activation function," KIPS Transactions on Software and Data Engineering, Vol.9, No.7, pp.213-220, 2020.
[3] V. Nair and G. Hinton, "Rectified linear units improve restricted Boltzmann machines," In Proceedings of the 27th International Conference on Machine Learning (ICML), pp.807-814, 2010.
[4] S. Qian, H. Liu, C. Liu, S. Wu, and H. Wong, "Adaptive activation functions in convolutional neural networks," Neurocomputing, Vol.272, pp.204-212, 2017.
[5] M. Roodschild, J. Gotay Sardinas, and A. Will, "A new approach for the vanishing gradient problem on sigmoid activation," Springer Nature, Vol.20, Iss.4, pp.351-360, 2020.
[6] B. Xu, N. Wang, T. Chen, and M. Li, "Empirical evaluation of rectified activations in convolutional network," arXiv:1505.00853, 2015.
[7] Y. M. Ko and S. W. Ko, "Alleviation of vanishing gradient problem using parametric activation functions," KIPS Transactions on Software and Data Engineering, Vol.10, No.10, pp.407-420, 2021.
[8] Y. Qin, X. Wang, and J. Zou, "The optimized deep belief networks with improved logistic sigmoid units and their application in fault diagnosis for planetary gearboxes of wind turbines," IEEE Transactions on Industrial Electronics, Vol.66, No.5, pp.3814-3824, 2018.
[9] X. Wang, Y. Qin, Y. Wang, S. Xiang, and H. Chen, "ReLTanh: An activation function with vanishing gradient resistance for SAE-based DNNs and its application to rotating machinery fault diagnosis," Neurocomputing, Vol.363, pp.88-98, 2019.
[10] D. Clevert, T. Unterthiner, and S. Hochreiter, "Fast and accurate deep network learning by exponential linear units (ELUs)," arXiv:1511.07289, 2016.
[11] I. Goodfellow, Y. Bengio, and A. Courville, "Deep Learning," MIT Press, 2016.
[12] C. C. Aggarwal, "Neural Networks and Deep Learning: A Textbook," Springer International Publishing AG, 2018.
[13] N. Y. Kong and S. W. Ko, "Performance improvement method of deep neural network using parametric activation functions," Journal of the Korea Contents Association, Vol.21, No.3, pp.616-625, 2021.
[14] A. Apicella, F. Donnarumma, F. Isgro, and R. Prevete, "A survey on modern trainable activation functions," Neural Networks, Vol.138, pp.14-32, 2021.
[15] K. He, X. Zhang, S. Ren, and J. Sun, "Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification," arXiv:1502.01852, 2015.
[16] S. Hochreiter, "The vanishing gradient problem during learning recurrent neural nets and problem solutions," International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, Vol.6, No.2, pp.107-116, 1998.
[17] S. Kong and M. Takatsuka, "Hexpo: A vanishing-proof activation function," In Proceedings of the International Joint Conference on Neural Networks (IJCNN), pp.2562-2567, 2017.