References
- R. Sasano, D. Kawahara, and S. Kurohashi, A fully-lexicalized probabilistic model for Japanese zero anaphora resolution, in Proc. 22nd Int. Conf. Comput. Linguistics (Manchester, UK), Aug. 2008, pp. 769-776.
- H. Nakaiwa and S. Shirai, Anaphora resolution of Japanese zero pronouns with deictic reference, in Proc. Int. Conf. Comput. Linguistics (Copenhagen, Denmark), 1996, pp. 812-817.
- R. Iida, K. Inui, and Y. Matsumoto, Zero anaphora resolution by learning rich syntactic pattern features, ACM Trans. Asian Lang. Inform. Process. 6 (2007), no. 4, 12:1-22.
- J. Devlin et al., BERT: Pre-training of deep bidirectional transformers for language understanding, in Proc. North American Chap. Assoc. Comput. Linguistics (Minneapolis, MN, USA), 2019, pp. 4171-4186.
- A. Vaswani et al., Attention is all you need, Advances in Neural Inform. Process. Syst. 30 (2017), 6000-6010.
- I. Tsochantaridis et al., Support vector machine learning for interdependent and structured output spaces, in Proc. 21st Int. Conf. Mach. Learn. (Banff, Canada), 2004, pp. 823-830.
- O. Vinyals, M. Fortunato, and N. Jaitly, Pointer networks, Advances in Neural Inform. Process. Syst. 28 (2015), 2674-2682.
- C. Park and C. Lee, Co-reference resolution for Korean pronouns using pointer networks, J. Korean Inst. Inform. Sci. Eng. 44 (2017), 496-502.
- C. Chen and V. Ng, Chinese zero pronoun resolution with deep neural networks, in Proc. Annu. Meet. Assoc. Comput. Linguistics (Berlin, Germany), 2016, pp. 778-788.
- Q. Yin et al., Zero pronoun resolution with attention-based neural network, in Proc. Int. Conf. Comput. Linguistics (Santa Fe, NM, USA), 2018, pp. 13-23.
- R. Iida et al., Intra-sentential subject zero anaphora resolution using multi-column convolutional neural network, in Proc. Conf. Empirical Methods Natural Language Process. (Austin, TX, USA), 2016, pp. 1244-1254.
- M. Okumura and K. Tamura, Zero pronoun resolution in Japanese discourse based on centering theory, in Proc. Int. Conf. Comput. Linguistics (Copenhagen, Denmark), 1996, pp. 871-876.
- M. Murata and M. Nagao, Resolution of verb ellipsis in Japanese sentences using surface expressions and examples, in Proc. Natural Language Process. Pacific Rim Symp. (Phuket, Thailand), 1997, pp. 75-80.
- S. Nariyama, Grammar for ellipsis resolution in Japanese, in Proc. Int. Conf. Theoret. Methodol. Issues Mach. Transl. (Keihanna, Japan), 2002, pp. 135-145.
- K. Seki, A. Fujii, and T. Ishikawa, A probabilistic method for analyzing Japanese anaphora integrating zero pronoun detection and resolution, in Proc. Int. Conf. Comput. Linguistics (Taipei, Taiwan), 2002, pp. 911-917.
- S. Lim, C. Lee, and M. Jang, Restoring an elided entry word in a sentence for encyclopedia QA system, in Proc. Int. Joint Conf. Natural Language Process. (Jeju, Rep. of Korea), 2005, pp. 215-219.
- S. Zhao and H. T. Ng, Identification and resolution of Chinese zero pronouns: A machine learning approach, in Proc. Joint Conf. Empirical Methods Natural Language Process. Comput. Natural Language Learn. (Prague, Czech Republic), 2007, pp. 541-550.
- R. Iida and M. Poesio, A cross-lingual ILP solution to zero anaphora resolution, in Proc. Annu. Meet. Assoc. Comput. Linguistics (Portland, OR, USA), 2011, pp. 804-813.
- S. Jung and C. Lee, Deep neural architecture for recovering dropped pronouns in Korean, ETRI J. 40 (2018), 257-264. https://doi.org/10.4218/etrij.2017-0085
- Z. Lan et al., ALBERT: A lite BERT for self-supervised learning of language representations, in Proc. Int. Conf. Learn. Represent. (Addis Ababa, Ethiopia), 2020.
- Z. Yang et al., XLNet: Generalized autoregressive pretraining for language understanding, arXiv preprint, CoRR, 2019, arXiv:1906.08237.
- Y. Liu et al., RoBERTa: A robustly optimized BERT pretraining approach, arXiv preprint, CoRR, 2019, arXiv:1907.11692.
- Q. Yin et al., Chinese zero pronoun resolution: A collaborative filtering-based approach, ACM Trans. Asian Low-Resour. Lang. Inf. Process. 19 (2019), no. 1, 3:1-20.
- F. Kong, M. Zhang, and G. Zhou, Chinese zero pronoun resolution: A chain-to-chain approach, ACM Trans. Asian Low-Resour. Lang. Inf. Process. 19 (2019), no. 1, 2:1-21.
- I. Goodfellow, Y. Bengio, and A. Courville, Deep learning, MIT Press, Cambridge, MA, USA, 2016.
- S. Hochreiter and J. Schmidhuber, Long short-term memory, Neural Comput. 9 (1997), 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735
- J. Chung et al., Empirical evaluation of gated recurrent neural networks on sequence modeling, in Proc. Neural Inform. Process. Syst., Workshop Deep Learn., Dec. 2014.
- I. Sutskever, O. Vinyals, and Q. V. Le, Sequence to sequence learning with neural networks, in Proc. Int. Conf. Neural Inform. Process. Syst. (Montreal, Canada), 2014, pp. 3104-3112.
- D. Bahdanau, K. H. Cho, and Y. Bengio, Neural machine translation by jointly learning to align and translate, in Proc. Int. Conf. Learn. Represent. (San Diego, CA, USA), 2015.
- J. L. Ba, J. R. Kiros, and G. E. Hinton, Layer normalization, arXiv preprint, CoRR, 2016, arXiv:1607.06450.
- Y. Wu et al., Google's neural machine translation system: Bridging the gap between human and machine translation, arXiv preprint, CoRR, 2016, arXiv:1609.08144.
- M. Schuster and K. Nakajima, Japanese and Korean voice search, in Proc. IEEE Int. Conf. Acoust., Speech Signal Process. (Kyoto, Japan), 2012, pp. 5149-5152.
- R. Sennrich, B. Haddow, and A. Birch, Neural machine translation of rare words with subword units, in Proc. Annu. Meet. Assoc. Comput. Linguistics (Berlin, Germany), 2016, pp. 1715-1725.
- Google, TensorFlow code and pre-trained models for BERT, available at https://github.com/google-research/bert.
- ETRI, Public Artificial Intelligence Open API DATA, Korean BERT language model, available at http://aiopen.etri.re.kr/service_dataset.php (in Korean).
- Facebook, fastText: Library for text representation and classification, available at https://fasttext.cc/.