The Verification of the Transfer Learning-based Automatic Post Editing Model

  • Moon, Hyeonseok (Department of Computer Science and Engineering, Korea University)
  • Park, Chanjun (Department of Computer Science and Engineering, Korea University)
  • Eo, Sugyeong (Department of Computer Science and Engineering, Korea University)
  • Seo, Jaehyung (Department of Computer Science and Engineering, Korea University)
  • Lim, Heuiseok (Department of Computer Science and Engineering, Korea University)
  • Received : 2021.08.03
  • Accepted : 2021.10.20
  • Published : 2021.10.28

Abstract

Automatic post editing (APE) is a research field that aims to automatically correct errors in machine translation output. APE research has mainly focused on high-resource language pairs such as English-German. Recent APE studies largely adopt transfer learning, utilizing pre-trained language models or translation models generated through self-supervised learning. Although translation-model-based APE has shown superior performance in recent work, those studies were conducted on high-resource languages, so their conclusions cannot be applied directly to low-resource languages. In this work, we apply both transfer learning strategies, one starting from a pre-trained language model and one from a translation model, to Korean-English APE, a representative low-resource language pair, and show that training the model on translation before APE fine-tuning significantly improves APE performance.
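
As a concrete illustration of the translation-based transfer strategy, the sketch below outlines the two-stage setup in Python with the Hugging Face Transformers library and an mBART-50 checkpoint. The checkpoint name, the separator-based multi-source input format, the example sentences, and all hyperparameters are illustrative assumptions for this sketch, not the authors' exact configuration.

```python
# Minimal sketch of "translate first, then post-edit" transfer learning.
# Assumptions: the mBART-50 checkpoint, the </s>-separator input format,
# and all hyperparameters are illustrative, not the paper's exact setup.
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="ko_KR", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")

# Stage 1 (translation transfer): fine-tune the model on Korean-English
# parallel data as an ordinary translation task (standard seq2seq training
# loop, omitted here for brevity).

# Stage 2 (APE fine-tuning): train on (src, mt, pe) triplets. The encoder
# sees the source sentence and the raw machine translation joined by a
# separator; the decoder is supervised with the human post-edit.
def encode_ape_example(src: str, mt: str, pe: str):
    enc = tokenizer(src + " </s> " + mt, truncation=True,
                    max_length=256, return_tensors="pt")
    enc["labels"] = tokenizer(text_target=pe, truncation=True,
                              max_length=256, return_tensors="pt")["input_ids"]
    return enc

batch = encode_ape_example(
    src="그는 어제 서울에 도착했다.",
    mt="He arrive at Seoul yesterday.",   # raw MT output with errors
    pe="He arrived in Seoul yesterday.",  # human post-edit used as the label
)
loss = model(**batch).loss  # standard seq2seq cross-entropy loss
loss.backward()
```

Under this scheme, the competing language-model strategy differs only in the starting checkpoint: stage 1 is skipped and APE fine-tuning begins directly from a pre-trained (cross-lingual) language model, so the two strategies can be compared with the same stage-2 data and training loop.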

Acknowledgement

This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2018-0-01405) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation).
