[1] I. Goodfellow, Y. Bengio, and A. Courville, "Deep learning," MIT Press, 2016.
[2] V. Nair and G. Hinton, "Rectified linear units improve restricted Boltzmann machines," International Conference on Machine Learning, pp. 807-814, 2010.
[3] S. Kong and M. Takatsuka, "Hexpo: A vanishing-proof activation function," International Joint Conference on Neural Networks, pp. 2562-2567, 2017.
[4] W. Yoshiyuki and W. Sadami, "Deep Learning with Excel," Seoul: Seongandang, 2020.
[5] M. Roodschild, J. Gotay Sardiñas, and A. Will, "A new approach for the vanishing gradient problem on sigmoid activation," Progress in Artificial Intelligence, vol. 9, pp. 351-360, 2020.
[6] X. Wang, Y. Qin, Y. Wang, S. Xiang, and H. Chen, "ReLTanh: An activation function with vanishing gradient resistance for SAE-based DNNs and its application to rotating machinery fault diagnosis," Neurocomputing, vol. 363, pp. 88-98, 2019.
[7] K. Hornik, M. Stinchcombe, and H. White, "Multilayer feedforward networks are universal approximators," Neural Networks, vol. 2, no. 5, pp. 359-366, 1989.
[8] Y. Qin, X. Wang, and J. Zou, "The optimized deep belief networks with improved logistic Sigmoid units and their application in fault diagnosis for planetary gearboxes of wind turbines," IEEE Transactions on Industrial Electronics, vol. 66, no. 5, pp. 3814-3824, May 2019.