[1] T. Nguyen and J. Salazar, "Transformers without tears: Improving the normalization of self-attention," in Proceedings of the 16th International Workshop on Spoken Language Translation, 2019.
[2] A. Graves, "Sequence transduction with recurrent neural networks," in Proceedings of the 29th International Conference on Machine Learning Workshop on Representation Learning, Edinburgh, Scotland, 2012.
[3] D. Ra, M. Cho, and Y. Kim, "Enhancing a Korean part-of-speech tagger based on a maximum entropy model," Journal of the Korean Data Analysis Society, Vol.9, No.4, pp.1623-1638, 2007.
[4] K. Cho, et al., "Learning phrase representations using RNN encoder-decoder for statistical machine translation," in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp.1724-1734, 2014.
[5] A. Vaswani, et al., "Attention is all you need," in Advances in Neural Information Processing Systems, pp.6000-6010, 2017.
[6] I. Sutskever, O. Vinyals, and Q. V. Le, "Sequence to sequence learning with neural networks," in Advances in Neural Information Processing Systems, pp.3104-3112, 2014.
[7] D. Bahdanau, K. Cho, and Y. Bengio, "Neural machine translation by jointly learning to align and translate," in Proceedings of the International Conference on Learning Representations, San Diego, California, 2015.
[8] T. Luong, H. Pham, and C. D. Manning, "Effective approaches to attention-based neural machine translation," in Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, pp.1412-1421, 2015.
[9] J. Zhu, et al., "Incorporating BERT into neural machine translation," in Proceedings of the International Conference on Learning Representations, Addis Ababa, Ethiopia, 2020.
[10] Q. Wang, et al., "Learning deep transformer models for machine translation," in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp.1810-1822, 2019.
[11] J. Devlin, M. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," in Proceedings of NAACL-HLT, Minneapolis, Minnesota, pp.4171-4186, 2019.
[12] S.-W. Kim and S.-P. Choi, "Research on joint models for Korean word spacing and POS (part-of-speech) tagging based on bidirectional LSTM-CRF," Journal of Korean Institute of Information Scientists and Engineers, Vol.45, No.8, pp.792-800, 2018.
[13] B. Choe, I.-h. Lee, and S.-g. Lee, "Korean morphological analyzer for neologism and spacing error based on sequence-to-sequence," Journal of Korean Institute of Information Scientists and Engineers, Vol.47, No.1, pp.70-77, 2020.
[14] M. Freitag and Y. Al-Onaizan, "Beam search strategies for neural machine translation," in Proceedings of the First Workshop on Neural Machine Translation, Vancouver, Canada, pp.56-60, 2017.
[15] E. Battenberg, et al., "Exploring neural transducers for end-to-end speech recognition," in Proceedings of 2017 IEEE Automatic Speech Recognition and Understanding Workshop, Okinawa, Japan, pp.206-213, 2017.
[16] H. S. Hwang and C. K. Lee, "Korean morphological analysis using sequence-to-sequence learning with copying mechanism," in Proceedings of the Korea Computer Congress 2016, pp.443-445, 2016.
[17] J. Li, E. H. Lee, and J.-H. Lee, "Sequence-to-sequence based morphological analysis and part-of-speech tagging for Korean language with convolutional features," Journal of Korean Institute of Information Scientists and Engineers, Vol.44, No.1, pp.57-62, 2017.
[18] J. Min, S.-H. Na, J.-H. Shin, and Y.-K. Kim, "Stack pointer network for Korean morphological analysis," in Proceedings of the Korea Computer Congress 2020, pp.371-373, 2020.
[19] J. Y. Youn and J. S. Lee, "A pipeline model for Korean morphological analysis and part-of-speech tagging using sequence-to-sequence and BERT-LSTM," in Proceedings of the 32nd Annual Conference on Human & Cognitive Language Technology, pp.414-417, 2020.
[20] Y. Choi and K. J. Lee, "Performance analysis of Korean morphological analyzer based on transformer and BERT," Journal of Korean Institute of Information Scientists and Engineers, Vol.47, No.8, pp.730-741, 2020.