Acknowledgement
Grant: Core technology development of spontaneous speech dialogue processing for language learning
Supported by: Institute for Information & Communications Technology Promotion (IITP)
References
- Y. Bengio, R. Ducharme and P. Vincent, "A neural probabilistic language model," Journal of Machine Learning Research, vol. 3, pp. 1137-1155, 2003.
- T. Mikolov, A. Deoras, D. Povey, L. Burget and J. Cernocky, "Strategies for training large scale neural network language models," Automatic Speech Recognition and Understanding (ASRU), 2011.
- T. Mikolov, K. Chen, G. Corrado and J. Dean, "Efficient Estimation of Word Representations in Vector Space," International Conference on Learning Representations (ICLR), 2013.
- A. Neelakantan, J. Shankar, A. Passos and A. McCallum, "Efficient non-parametric estimation of multiple embeddings per word in vector space," Empirical Methods in Natural Language Processing (EMNLP), 2014.
- E. Chung and J. Park, "Word Embedding based Class Language Model," Human & Cognitive Language Technology (HCLT), 2015.
- S. F. Chen and J. Goodman, "An empirical study of smoothing techniques for language modeling," Technical Report TR-10-98, Computer Science Group, Harvard University, 1998.
- Y. Goldberg and O. Levy, "word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method," arXiv:1402.3722 [cs.CL], 2014.
- X. Rong, "word2vec Parameter Learning Explained," arXiv:1411.2738 [cs.CL], 2016.
- C. Lee, J. Kim and J. Kim, "Korean Dependency Parsing using Deep Learning," Human & Cognitive Language Technology (HCLT), 2014.
- D. Q. Nguyen, D. D. Pham and S. B. Pham, "RDRPOSTagger: A Ripple Down Rules-based Part-Of-Speech Tagger," European Chapter of the Association for Computational Linguistics (EACL), pp. 17-20, 2014.