Acknowledgement
This research was supported by the Software-Centered University Program (2021-0-01409), the Artificial Intelligence Convergence Innovation Human Resources Development Program (IITP-2023-RS-2023-00256629), and the University ICT Research Center Program (IITP-2024-RS-2024-00437718), funded by the Ministry of Science and ICT (MSIT) and the Institute of Information & Communications Technology Planning & Evaluation (IITP).
References
- Park, K., Choe, Y. J., and Ham, J. "Jejueo Datasets for Machine Translation and Speech Synthesis." arXiv preprint arXiv:1911.12071, 2019.
- Conneau, A., and Lample, G. "Cross-Lingual Language Model Pretraining." Advances in Neural Information Processing Systems, vol. 32, 2019.
- Khanittanan, W. (also spelled Kanittanan). South Asian Languages: Structure, Convergence and Diglossia. New Delhi: Motilal Banarsidass, 1986, pp. 174-178.
- Zheng, F., Marrese-Taylor, E., and Matsuo, Y. "Improving Jejueo-Korean Translation With Cross-Lingual Pretraining Using Japanese and Korean." Proceedings of the 9th Workshop on Asian Translation, International Conference on Computational Linguistics, Gyeongju, Republic of Korea, 2022, pp. 44-50.