Acknowledgement
This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2018-0-01405) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation).
References
- C. Park & H. Lim. (2020). Automatic Post Editing Research. Journal of the Korea Convergence Society, 11(5), 1-8. https://doi.org/10.15207/JKCS.2020.11.5.001
- S. Pal, N. Herbig, A. Krüger & J. van Genabith. (2018, October). A transformer-based multi-source automatic post-editing system. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers (pp. 827-835).
- P. Isabelle, C. Goutte & M. Simard. (2007). Domain adaptation of MT systems through automatic post-editing. MT Summit XI, 102.
- S. Chollampatt, R. H. Susanto, L. Tan & E. Szymanska. (2020). Can Automatic Post-Editing Improve NMT? arXiv preprint arXiv:2009.14395.
- R. Chatterjee, M. Freitag, M. Negri & M. Turchi. (2020, November). Findings of the WMT 2020 Shared Task on Automatic Post-Editing. In Proceedings of the Fifth Conference on Machine Translation (pp. 646-659).
- A. V. Lopes, M. A. Farajian, G. M. Correia, J. Trenous & A. F. Martins. (2019). Unbabel's Submission to the WMT2019 APE Shared Task: BERT-based Encoder-Decoder for Automatic Post-Editing. arXiv preprint arXiv:1905.13068.
- J. Lee, W. Lee, J. Shin, B. Jung, Y. G. Kim & J. H. Lee. (2020, November). POSTECH-ETRI's Submission to the WMT2020 APE Shared Task: Automatic Post-Editing with Cross-lingual Language Model. In Proceedings of the Fifth Conference on Machine Translation (pp. 777-782).
- H. Yang et al. (2020, November). HW-TSC's Participation at WMT 2020 Automatic Post Editing Shared Task. In Proceedings of the Fifth Conference on Machine Translation (pp. 797-802).
- B. Zoph, D. Yuret, J. May & K. Knight. (2016). Transfer learning for low-resource neural machine translation. arXiv preprint arXiv:1604.02201.
- Y. Liu et al. (2020). Multilingual denoising pre-training for neural machine translation. Transactions of the Association for Computational Linguistics, 8, 726-742. https://doi.org/10.1162/tacl_a_00343
- A. Vaswani et al. (2017). Attention is all you need. In Advances in neural information processing systems (pp. 5998-6008).
- M. Negri, M. Turchi, R. Chatterjee & N. Bertoldi. (2018). eSCAPE: a large-scale synthetic corpus for automatic post-editing. arXiv preprint arXiv:1803.07274.
- W. Lee, B. Jung, J. Shin & J. H. Lee. (2021, April). Adaptation of Back-translation to Automatic Post-Editing for Synthetic Data Generation. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (pp. 3685-3691).
- W. Lee, J. Shin, B. Jung, J. Lee & J. H. Lee. (2020, November). Noising Scheme for Data Augmentation in Automatic Post-Editing. In Proceedings of the Fifth Conference on Machine Translation (pp. 783-788).
- R. Sennrich, B. Haddow & A. Birch. (2015). Improving neural machine translation models with monolingual data. arXiv preprint arXiv:1511.06709.
- J. Lim, H. Moon, C. Lee, C. Woo & H. Lim. (2021). An Automated Industry and Occupation Coding System using Deep Learning. Journal of the Korea Convergence Society, 12(4), 23-30. https://doi.org/10.15207/JKCS.2021.12.4.023
- S. Eo, C. Park, H. Moon, J. Seo & H. Lim. (2021). Comparative Analysis of Current Approaches to Quality Estimation for Neural Machine Translation. Applied Sciences, 11(14), 6584. https://doi.org/10.3390/app11146584
- J. Devlin, M. W. Chang, K. Lee & K. Toutanova. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805.
- A. Conneau & G. Lample. (2019). Cross-lingual language model pretraining. Advances in Neural Information Processing Systems, 32, 7059-7069.
- I. Sutskever, O. Vinyals & Q. V. Le. (2014). Sequence to sequence learning with neural networks. In Advances in neural information processing systems (pp. 3104-3112).
- C. Park, Y. Yang, K. Park & H. Lim. (2020). Decoding strategies for improving low-resource machine translation. Electronics, 9(10), 1562. https://doi.org/10.3390/electronics9101562
- H. Moon, C. Park, S. Eo, J. Park & H. Lim. (2021). Filter-mBART Based Neural Machine Translation Using Parallel Corpus Filtering. Journal of the Korea Convergence Society, 12(5), 1-7. https://doi.org/10.15207/JKCS.2021.12.5.001
- C. Park & H. Lim. (2020). A Study on the Performance Improvement of Machine Translation Using Public Korean-English Parallel Corpus. Journal of Digital Convergence, 18(6), 271-277. https://doi.org/10.14400/JDC.2020.18.6.271
- M. Snover, B. Dorr, R. Schwartz, L. Micciulla & J. Makhoul. (2006). A study of translation edit rate with targeted human annotation. In Proceedings of the 7th Conference of the Association for Machine Translation in the Americas: Technical Papers (pp. 223-231).
- K. Papineni, S. Roukos, T. Ward & W. J. Zhu. (2002, July). Bleu: a method for automatic evaluation of machine translation. In Proceedings of the 40th annual meeting of the Association for Computational Linguistics (pp. 311-318).
- T. Wolf et al. (2019). HuggingFace's Transformers: State-of-the-art natural language processing. arXiv preprint arXiv:1910.03771.
- T. Kudo & J. Richardson. (2018). SentencePiece: A simple and language independent subword tokenizer and detokenizer for neural text processing. arXiv preprint arXiv:1808.06226.
- C. Park, S. Eo, H. Moon & H. Lim. (2021, June). Should we find another model?: Improving Neural Machine Translation Performance with ONE-Piece Tokenization Method without Model Modification. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers (pp. 97-104).
- N. Houlsby et al. (2019, May). Parameter-efficient transfer learning for NLP. In International Conference on Machine Learning (pp. 2790-2799). PMLR.
- C. Park, J. Seo, S. Lee, C. Lee, H. Moon, S. Eo & H. Lim. (2021). BTS: Back TranScription for Speech-to-Text Post-Processor using Text-to-Speech-to-Text. In Proceedings of the 8th Workshop on Asian Translation (pp. 106-116).