Acknowledgement
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2022R1F1A1074696).
References
- J. Devlin, M. W. Chang, K. Lee, and K. Toutanova, "BERT: Pre-training of deep bidirectional transformers for language understanding," in Proceedings of NAACL-HLT 2019, Minneapolis, MN, pp. 4171-4186, 2019.
- H. Shin, M. Kim, Y. M. Jo, H. Jang, and A. Cattle, "Annotation Scheme for Constructing Sentiment Corpus in Korean," in Proceedings of the 26th Pacific Asia Conference on Language, Information, and Computation, Bali, Indonesia, pp. 181-190, 2012.
- S. A. Lee and H. P. Shin, "A Method of Infusing Additional Features into Pre-Trained BERT Models for Sentiment Analysis," in Proceedings of the 2020 Korea Software Symposium, Online, pp. 275-277, 2020.
- SKTBrain, KoBERT source code guide [Internet]. Available: https://github.com/SKTBrain/KoBERT.
- J. B. Lee, "KcBERT: Korean Comments BERT," in Proceedings of the 32nd Annual Conference on Human and Cognitive Language Technology, Online, pp. 437-440, 2020.
- J. W. Park, KoELECTRA source code guide [Internet]. Available: https://github.com/monologg/KoELECTRA.
- J. S. Bae, C. G. Lee, J. H. Lim, and H. K. Kim, "BERT-based Data Augmentation Techniques for Korean Semantic Role Labeling," in Proceedings of the Korean Computer Science and Technology Conference, Online, pp. 335-337, 2020.
- NamuWiki, Korean opinion poll overview [Internet]. Available: https://namu.wiki/w/%EC%97%AC%EB%A1%A0%EC%A1%B0%EC%82%AC.