http://dx.doi.org/10.3745/KTSDE.2021.10.6.235

Korean Dependency Parsing Using Stack-Pointer Networks and Subtree Information  

Choi, Yong-Seok (Department of Electronics, Radio and Information Communications Engineering, Chungnam National University)
Lee, Kong Joo (Department of Radio and Information Communications Engineering, Chungnam National University)
Publication Information
KIPS Transactions on Software and Data Engineering, Vol. 10, No. 6, 2021, pp. 235-242
Abstract
In this work, we develop a Korean dependency parser based on a stack-pointer network, which consists of a pointer network and an internal stack. The parser has an encoder and a decoder and builds a dependency tree for an input sentence in a depth-first manner: the encoder encodes the input sentence, and at each step the decoder selects a child for the word on top of the stack. Because the internal stack stores the search path, the parser can exploit information about previously derived subtrees when selecting a child node. Previous studies used only the grandparent and the most recently visited sibling, without considering the full subtree structure. In this paper, we introduce graph attention networks that can represent a previously derived subtree, and we modify our stack-pointer-network parser to utilize the subtree information they produce. We train the dependency parser on the Sejong corpus and the Everyone's Corpus and evaluate its performance. Experimental results show that, when adopting 2-depth graph attention networks, the proposed parser outperforms previous approaches in sentence-level accuracy.
Keywords
Stack-Pointer Networks; Korean Dependency Parser; Graph Attention Networks; Subtree; Pre-trained Word Representation Model; Everyone's Corpus
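
To make the decoding scheme concrete, below is a minimal sketch of depth-first stack-pointer decoding as described in the abstract: the word on top of the internal stack points to its next child among the encoder states, the parser descends into that child, and the word signals completion by pointing at itself, after which the stack is popped. The `score_child` scorer, the greedy argmax selection, and the self-pointing convention are illustrative assumptions (the latter follows Ma et al.'s stack-pointer formulation), not the authors' exact implementation.

```python
import torch

def decode(encoder_states, score_child, root_index=0):
    """Depth-first stack-pointer decoding (greedy, minimal sketch).

    encoder_states : (n, dim) tensor of encoder outputs for one sentence.
    score_child    : hypothetical scorer; given the encoder states and the
                     index of the stack-top word, returns (n,) pointer logits.
    Returns the derived (head, dependent) arcs.
    """
    n = encoder_states.size(0)
    stack = [root_index]             # internal stack = the current search path
    attached = {root_index}
    arcs = []
    while stack:
        head = stack[-1]
        scores = score_child(encoder_states, head)   # (n,) pointer logits
        mask = torch.zeros(n, dtype=torch.bool)
        mask[list(attached)] = True
        mask[head] = False           # stack-top stays selectable: pointing at
                                     # itself means "no more children"
        child = int(scores.masked_fill(mask, float("-inf")).argmax())
        if child == head:
            stack.pop()              # subtree under `head` is complete; backtrack
        else:
            arcs.append((head, child))
            attached.add(child)
            stack.append(child)      # descend depth-first into the new child

    return arcs

# Toy run with a random scorer (illustration only)
torch.manual_seed(0)
states = torch.randn(5, 16)
print(decode(states, lambda s, i: torch.randn(s.size(0))))
```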
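
The subtree representation itself can be sketched with a single graph attention layer in the style of Velickovic et al.: attention is restricted to the arcs derived so far (plus self-loops), so each word aggregates features from its partial subtree, and stacking two such layers yields the 2-depth configuration mentioned in the abstract. Layer sizes, the undirected treatment of arcs, and all names below are assumptions for illustration, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """One graph-attention layer over the partially built tree (minimal sketch)."""

    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)    # shared node transform
        self.a = nn.Linear(2 * dim, 1, bias=False)  # scores a^T [Wh_i || Wh_j]

    def forward(self, h, adj):
        # h:   (n, dim) one feature vector per word
        # adj: (n, n) boolean adjacency of the derived arcs, with self-loops
        z = self.W(h)
        n = z.size(0)
        pair = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                          z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pair).squeeze(-1))   # (n, n) attention logits
        e = e.masked_fill(~adj, float("-inf"))       # attend only along tree arcs
        alpha = torch.softmax(e, dim=-1)             # self-loops keep rows finite
        return F.elu(alpha @ z)

# Example: a 4-word sentence where arcs (0,1) and (1,2) have been derived.
# Stacking two layers ("2-depth") also mixes in grandparent information.
n = 4
adj = torch.eye(n, dtype=torch.bool)
for head, dep in [(0, 1), (1, 2)]:
    adj[head, dep] = adj[dep, head] = True
layer1, layer2 = GATLayer(8), GATLayer(8)
subtree_repr = layer2(layer1(torch.randn(n, 8), adj), adj)   # (4, 8) features
```

Masking the attention logits with the adjacency matrix is what ties the layer to the parser state: as new arcs are derived during decoding, the mask widens and the same layer computes progressively richer subtree features for the pointer decoder.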