
Understanding the Dropout Algorithm (Dropout 알고리즘에 대한 이해)

Choe, Hui-Yeol (Samsung Advanced Institute of Technology, Samsung Electronics)
Min, Yun-Hong (Samsung Advanced Institute of Technology, Samsung Electronics)
  • References
1 G. Hinton and R. Salakhutdinov, "Reducing the dimensionality of data with neural networks," Science, Vol. 313, No. 5786, pp. 504-507, Jul. 2006.
2 J. Schmidhuber, "Deep Learning in Neural Networks: An Overview," Technical Report IDSIA-03-14, 2014.
3 J. Markoff, "How Many Computers to Identify a Cat? 16,000," New York Times, June 25, 2012.
4 J. Markoff, "Scientists See Promise in Deep-Learning Programs," New York Times, November 24, 2012.
5 G. Marcus, "Is 'Deep Learning' a Revolution in Artificial Intelligence?" The New Yorker, November 25, 2012.
6 G. Hinton, S. Osindero, and Y. Teh, "A fast learning algorithm for deep belief nets," Neural Computation, Vol. 18, pp. 1527-1554, 2006.
7 G. Hinton, N. Srivastava, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov, "Improving neural networks by preventing co-adaptation of feature detectors," http://arxiv.org/abs/1207.0580, 2012.
8 P. Baldi and P. J. Sadowski, "Understanding dropout," Advances in Neural Information Processing Systems (NIPS), 2013.
9 J. Tompson, R. Goroshin, A. Jain, Y. LeCun, and C. Bregler, "Efficient Object Localization using Convolutional Networks," Computer Vision and Pattern Recognition (CVPR), pp. 648-656, 2015.
10 V. Pham, T. Bluche, C. Kermorvant, and J. Louradour, "Dropout improves recurrent neural networks for handwriting recognition," International Conference on Frontiers in Handwriting Recognition (ICFHR), 2014.
11 W. Zaremba, I. Sutskever, and O. Vinyals, "Recurrent Neural Network Regularization," http://arxiv.org/abs/1409.2329v5
12 T. Moon, H. Choi, H. Lee, and I. Song, "RnnDrop: A Novel Dropout for RNNs in ASR," Automatic Speech Recognition and Understanding (ASRU), 2015 (submitted).
13 A. Graves, N. Jaitly, and A. Mohamed, "Hybrid speech recognition with deep bidirectional LSTM," Automatic Speech Recognition and Understanding (ASRU), 2013.
14 A. Krizhevsky, I. Sutskever, and G. Hinton, "ImageNet classification with deep convolutional neural networks," Advances in Neural Information Processing Systems (NIPS), Lake Tahoe, NV, 2012.
15 S. Wager, S. Wang, and P. Liang, "Dropout training as adaptive regularization," Advances in Neural Information Processing Systems (NIPS), 2013.
16 G. Dahl, T. N. Sainath, and G. Hinton, "Improving deep neural networks for LVCSR using rectified linear units and dropout," International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2013.
17 L. Wan, M. Zeiler, S. Zhang, Y. LeCun, and R. Fergus, "Regularization of neural networks using dropconnect," International Conference on Machine Learning (ICML), 2013.
18 A. Krogh and J. Hertz, "A simple weight decay can improve generalization," Advances in Neural Information Processing Systems (NIPS), 1991.
19 C. Bishop, "Training with noise is equivalent to Tikhonov regularization," Neural Computation, Vol. 7, No. 1, 1995.
20 B. Olshausen and D. Field, "Emergence of simple-cell receptive field properties by learning a sparse code for natural images," Nature, Vol. 381, No. 6583, 1996.
21 Y. Bengio, A. Courville, and P. Vincent, "Representation learning: A review and new perspectives," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 35, No. 8, 2013.