http://dx.doi.org/10.15207/JKCS.2021.12.5.023

A study on performance improvement considering the balance between corpora in Neural Machine Translation

Park, Chanjun (Department of Computer Science and Engineering, Korea University)
Park, Kinam (Creative Information and Computer Institute, Korea University)
Moon, Hyeonseok (Department of Computer Science and Engineering, Korea University)
Eo, Sugyeong (Department of Computer Science and Engineering, Korea University)
Lim, Heuiseok (Department of Computer Science and Engineering, Korea University)
Publication Information
Journal of the Korea Convergence Society / v.12, no.5, 2021, pp. 23-29
Abstract
Recent deep learning-based natural language processing studies improve performance by training on large amounts of data gathered from diverse sources. However, merging data from heterogeneous sources into a single training set may actually hinder performance. In machine translation, parallel corpora diverge in translation approach (liberal vs. literal), style (colloquial, written, formal, etc.), and domain, and combining such corpora into one training set can adversely affect performance. In this paper, we propose a new Corpus Weight Balance (CWB) method that accounts for the balance between parallel corpora in machine translation. In our experiments, the model trained on the balanced corpora outperformed the baseline model trained on the naively merged data. In addition, we propose a complementary corpus construction process, designed to coexist with the human translation market, that can build a high-quality parallel corpus even from a monolingual corpus.
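The abstract introduces the Corpus Weight Balance (CWB) method but does not specify its weighting scheme here. As a minimal sketch only, the Python snippet below illustrates one common way such balancing is realized, temperature-scaled sampling across sub-corpora, so that smaller corpora (e.g., a colloquial corpus) are not drowned out by larger ones. The function sample_balanced, the temperature parameter, and the toy corpora are all hypothetical illustrations, not the authors' implementation.

# Hypothetical sketch of corpus-balanced sampling for NMT training data.
# Assumes temperature-scaled sampling over sub-corpora, a common balancing
# device; the paper's actual CWB weighting may differ.
import random

def sample_balanced(corpora, num_samples, temperature=0.5, seed=0):
    """Draw sentence pairs so that smaller corpora are up-weighted.

    corpora: dict mapping corpus name -> list of (src, tgt) pairs.
    temperature: 1.0 reproduces size-proportional sampling;
                 values < 1.0 flatten the distribution toward uniform.
    """
    rng = random.Random(seed)
    sizes = {name: len(pairs) for name, pairs in corpora.items()}
    total = sum(sizes.values())
    # Temperature-scaled, normalized sampling probability per corpus.
    scaled = {name: (n / total) ** temperature for name, n in sizes.items()}
    norm = sum(scaled.values())
    names = list(corpora)
    weights = [scaled[name] / norm for name in names]
    # First pick a corpus, then a sentence pair from it.
    picks = rng.choices(names, weights=weights, k=num_samples)
    return [rng.choice(corpora[name]) for name in picks]

# Toy usage: a written-style corpus dwarfs a colloquial one; at
# temperature 0.5 the colloquial share rises from 10% to about 25%,
# keeping the smaller style visible during training.
corpora = {
    "written":    [("src_w%d" % i, "tgt_w%d" % i) for i in range(9000)],
    "colloquial": [("src_c%d" % i, "tgt_c%d" % i) for i in range(1000)],
}
batch = sample_balanced(corpora, num_samples=8)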
Keywords
Machine Translation; Parallel Corpus; Human Translation; High Quality Data; Deep Learning; Language Conversion
Citations & Related Records
연도 인용수 순위
  • Reference
1 A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N Gomez, L. Kaiser & I. Polosukhin. (2017). Attention is all you need. In Advances in neural information processing systems, 5998-6008.
2 C. Park, Y. Yang, K. Park & H. Lim. (2020). Decoding strategies for improving low-resource machine translation. Electronics, 9(10), 1562.   DOI
3 C. Park, C. Lee, Y. Yang & H. Lim. (2020). Ancient Korean Neural Machine Translation. IEEE Access, 8, 116617-116625.   DOI
4 C. Park & H. Lim. (2020). A Study on the Performance Improvement of Machine Translation Using Public Korean-English Parallel Corpus. Journal of Digital Convergence, 18(6), 271-277. DOI : 10.14400/JDC.2020.18.6.271   DOI
5 K. Papineni, S. Roukos, T. Ward & W. J. Zhu. (2002, July). Bleu: a method for automatic evaluation of machine translation. In Proceedings of the 40th annual meeting of the Association for Computational Linguistics 311-318.
6 Sen, Sukanta, Asif Ekbal, and Pushpak Bhattacharyya. "Parallel Corpus Filtering based on Fuzzy String Matching." Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2). 2019.
7 K. Song, X. Tan, T. Qin, J. Lu & T. Y. Liu. (2019). Mass: Masked sequence to sequence pre-training for language generation. arXiv preprint arXiv:1905.02450.
8 Y. Liu, M. Ott, N. Goyal, J. Du, M. Joshi, D. Chen & V. Stoyanov. (2019). Roberta: A robustly optimized bert pretraining approach. arXiv preprint arXiv:1907.11692.
9 C. Park, Y. Lee, C. Lee & H Lim, (2020). "Quality, not Quantity? : Effect of parallel corpus quantity and quality on Neural Machine Translation," The 32st Annual Conference on Human Cog-nitive Language Technology.
10 P. Koehn, V. Chaudhary, A. El-Kishky, N. Goyal, P. J. Chen & F. Guzman. (2020, November). Findings of the WMT 2020 shared task on parallel corpus filtering and alignment. In Proceedings of the Fifth Conference on Machine Translation 726-742.
11 C. J. Park, Y. D. Oh, J. K. Choi, D. P. Kim & H. Lim. (2020). Toward High Quality Parallel Corpus Using Monolingual Corpus. The 10th International Conference on Convergence Technology (ICCT 2020), Volume 10, 146-147.
12 S. Edunov et al. (2018). "Understanding back-translation at scale." arXiv preprint arXiv:1808.09381.
13 T. B. Brown, B. Mann, N. Ryder, M. Subbiah, J. Kaplan, P. Dhariwal ... & D. Amodei. (2020). Language models are few-shot learners. arXiv preprint arXiv:2005.14165.
14 H. Yang, M. Wang, D. Wei, H. Shang, J. Guo, Z. Li, ... & Y. Chen. (2020, November). HW-TSC's Participation at WMT 2020 Automatic Post Editing Shared Task. In Proceedings of the Fifth Conference on Machine Translation (pp. 797-802).
15 E. Fonseca et al. (2019). "Findings of the WMT 2019 Shared Tasks on Quality Estimation." Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2). 2019.