http://dx.doi.org/10.4218/etrij.2018-0456

Linear-Time Korean Morphological Analysis Using an Action-based Local Monotonic Attention Mechanism  

Hwang, Hyunsun (Department of Computer Science, Kangwon National University)
Lee, Changki (Department of Computer Science, Kangwon National University)
Publication Information
ETRI Journal, vol. 42, no. 1, 2020, pp. 101-107
Abstract
For Korean language processing, morphological analysis is a critical and labor-intensive component. Morphological analysis can be performed in an end-to-end manner, without complicated feature engineering, using a sequence-to-sequence model. However, when the attention mechanism is used to achieve high performance, the sequence-to-sequence model has a time complexity of O(n²) for an input of length n. In this study, we propose a linear-time Korean morphological analysis model that uses a local monotonic attention mechanism, exploiting the monotonic alignment that is characteristic of Korean morphological analysis. The proposed model achieves a significant speed improvement in a single-threaded environment and a high morpheme-level F1-measure, even as a hard attention model that eliminates the soft attention computation.
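The core idea of the abstract can be illustrated with a minimal sketch: instead of computing soft attention over all n encoder states at every decoder step (O(n²) in total), each step attends only to a fixed-size window around a monotonically advancing source position, giving O(n·w) = O(n) total cost for a constant window size w. This is a simplified NumPy illustration, not the authors' implementation; in particular, the assumption that the attention center advances by one position per step is a hypothetical stand-in for a learned alignment action.

```python
import numpy as np

def local_monotonic_attention(enc, dec_states, window=3):
    """Sketch of local monotonic attention.

    enc:        (n, d) encoder hidden states
    dec_states: (m, d) decoder query vectors
    Each decoder step attends only to a window of width 2*window+1
    around a monotonically advancing center, so the total cost is
    O(m * window) rather than O(m * n).
    """
    n, _ = enc.shape
    outputs = []
    pos = 0  # monotonic attention center (never moves backward)
    for q in dec_states:
        lo = max(0, pos - window)
        hi = min(n, pos + window + 1)
        scores = enc[lo:hi] @ q                # dot-product scores, window only
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()               # softmax over the local window
        outputs.append(weights @ enc[lo:hi])   # context vector for this step
        pos = min(n - 1, pos + 1)              # advance center monotonically
    return np.array(outputs)
```

Because the window never looks left of an already-passed region, the alignment is monotonic by construction, matching the left-to-right correspondence between Korean input syllables and output morphemes described in the abstract.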
Keywords
deep learning; Korean morphological analysis; local attention mechanism; natural language processing; sequence-to-sequence learning