Korean Transition-based Dependency Parsing with Recurrent Neural Network

  • 이건일 (Dept. of Computer Science and Engineering, POSTECH) ;
  • 이종혁 (Dept. of Computer Science and Engineering, POSTECH)
  • Received : 2015.03.27
  • Accepted : 2015.05.29
  • Published : 2015.08.15

Abstract

Existing transition-based approaches to Korean dependency parsing require considerable effort to design the features to be used. Recently, many studies have applied artificial neural networks to reduce the time and effort spent on feature engineering, but these methods must make each transition decision from the information in a limited context window. In this paper, we use a recurrent neural network model to reduce the effort required for feature design, and exploit its recurrent structure to capture long-distance dependency relations. Experimental results show a 0.51% improvement over a standard multi-layer neural network, achieving a UAS of 90.33%.

Transition-based dependency parsing requires much time and effort to design and select features from a very large number of possible combinations. Recent studies have successfully applied Multi-Layer Perceptrons (MLP) to address this problem and to reduce data sparseness. However, most of these methods adopt greedy search and can only consider a limited amount of information from the context window. In this study, we use a Recurrent Neural Network to handle long dependencies between the sub-dependency trees of the current state and the current transition action. The results indicate that our method provides a higher accuracy (UAS) than an MLP-based model.
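As a rough illustration of the approach described in the abstract, the toy sketch below pairs greedy arc-standard transition parsing with a simple Elman-style recurrent unit, so that the hidden state carries information across transitions rather than relying on a fixed context window alone. All dimensions, the feature choice (top of stack and front of buffer), and the untrained random weights are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
EMB, HID, N_ACTIONS = 8, 16, 3   # actions: SHIFT=0, LEFT_ARC=1, RIGHT_ARC=2

W_xh = rng.normal(scale=0.1, size=(HID, 2 * EMB))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(HID, HID))      # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(N_ACTIONS, HID))

def embed(token_id):
    """Deterministic toy embedding for a token id (stand-in for learned vectors)."""
    g = np.random.default_rng(token_id)
    return g.normal(size=EMB)

def parse(tokens):
    """Greedy arc-standard parse; returns head[i] for each token (-1 = root)."""
    stack, buffer = [], list(range(len(tokens)))
    heads = [-1] * len(tokens)
    h = np.zeros(HID)                       # recurrent state spans the whole parse
    while buffer or len(stack) > 1:
        s0 = embed(tokens[stack[-1]]) if stack else np.zeros(EMB)
        b0 = embed(tokens[buffer[0]]) if buffer else np.zeros(EMB)
        h = np.tanh(W_xh @ np.concatenate([s0, b0]) + W_hh @ h)
        scores = W_hy @ h
        # Mask actions that are illegal in the current configuration.
        if len(stack) < 2:
            scores[1] = scores[2] = -np.inf  # arcs need two stack items
        if not buffer:
            scores[0] = -np.inf              # cannot SHIFT from an empty buffer
        act = int(np.argmax(scores))
        if act == 0:                         # SHIFT
            stack.append(buffer.pop(0))
        elif act == 1:                       # LEFT_ARC: second item depends on top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        else:                                # RIGHT_ARC: top depends on second item
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads
```

Because `h` is updated at every transition, the score of the current action depends on the entire sequence of earlier decisions, which is the intuition behind replacing an MLP over a fixed window with a recurrent model.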

Funding Information

Supported by : Institute for Information & Communications Technology Promotion (IITP), National Research Foundation of Korea (NRF)

참고문헌

  1. M. Collins, B. Roark, "Incremental Parsing with the Perceptron Algorithm," Proc. of the Association for Computational Linguistics 2004, pp. 111-118, 2004.
  2. L. Huang, K. Sagae, "Dynamic Programming for Linear-Time Incremental Parsing," Proc. of the Association for Computational Linguistics 2010, pp. 1077-1086, 2010.
  3. X. Zheng, H. Chen, T. Xu, "Deep Learning for Chinese Word Segmentation," Proc. of the Empirical Methods in Natural Language Processing 2013, pp. 647-657, 2013.
  4. R. Socher, A. Perelygin, J. Y. Wu, J. Chuang, C. D. Manning, A. Y. Ng, C. Potts "Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank," Proc. of the Empirical Methods in Natural Language Processing 2013, pp. 1631-1642, 2013.
  5. J. Devlin, R. Zbib, Z. Huang, T. Lamar, R. Schwartz, J. Makhoul, "Fast and Robust Neural Network Joing Models for Statistical Machine Translation," Proc. of the Association for Computational Linguistics 2014, pp. 1370-1380, 2014.
  6. T. Mikolov, M. Karafiat, L. Burget, J. Cernocky, S. Khudanpur, "Recurrent Neural Network based Language Model," Proc. of the INTERSPEECH 2010, pp. 1045-1048, 2010.
  7. D. Chen, C. D. Manning, "A Fast and Accurate Dependency Parser using Neural Networks," Proc. of the Empirical Methods in Natural Language Processing 2014, pp. 740-750, 2014.
  8. C. Lee, J. Kim, J. Kim, "Korean Dependency Parsing using Deep Learning," Proc. of the 26th Annual Conference on Human and Cognitive Lanugage Technology, pp. 87-91, 2014. (In Korean)
  9. J. Li, J. Lee, "Morpheme-based Korean Dependency Parsing with Deep Neural Network," Proc. of the 41st KIISE Winter Conference, pp. 432-434, 2014.
  10. B. Bohnet, "Very High Accuracy and Fast Dependency Parsing is not a Contradiction," Proc. of the 23rd International Conference on Computational Liguistics 2010, pp. 89-97, 2010.
  11. D. E. Rumelhart, G. E. Hinton, R. J. Williams, "Learning Internal Representations by Error Propagation," Nature, Vol. 323, No. 9, pp. 533-536, Oct. 1986. https://doi.org/10.1038/323533a0
  12. Y. Bengio, R. Ducharme, P. Vincent, C. Janvin, "A Neural Probabilistic Language Model," The Journal of Machine Learning Research, Vol. 3, pp. 1137-1155, Mar. 2003.
  13. G. B. Orr, K-R. Muller, Neural Networks: Tricks of the trade, pp. 9-50, Springer-Verlag, Berlin Heidelberg, 1988.
  14. S. Hochreiter, J. Schmidhuber, "Long Short-Term Memory," Neural Computation, Vol. 9, No. 8, pp. 1735-1780, 1997. https://doi.org/10.1162/neco.1997.9.8.1735
  15. J. Yoon, K. Choi, "Study on KAIST Corpus," In CS-TR-99-139 KAIST CS, pp. 285-288, 1999.
  16. F. Bastien, P. Lamblin, R. Pascanu, J. Bergstra, I. Goodfellow, A. Bergeron, N. Bouchard, D. Warde-Farley, Y. Bengio, "Theano: New Features and Speed Improvements," Proc. of Advances in Neural Information Processing Systems 2012 Deep Learning Workshop, 2012.
  17. J. Bergstra, O. Breuleux, F. Bastien, P. Lamblin, R. Pascanu, G. Desjardins, J. Turian, D. Warde- Farley, Y. Bengio, "Theano: A CPU and GPU Math Compiler in Python," Proc. of the Python for Scientific Computing Conference (SciPy) 2010, pp. 3-9, 2010.