Acknowledgement
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (Ministry of Education) in 2020. (2020R1I1A1A01058353)
References
- J. Lee, J. Eune, "A Study on the Preference Factors of KakaoTalk Emoticon", Cartoon & Animation Studies, no.51, pp.361-390, 2018. (in Korean)
- S. Suh, "A Study on way to Promote Learners' Participation in Real-Time Distance Education", Journal of Creative Information Culture, vol.6, no.3, pp.149-158, 2020. (in Korean) https://doi.org/10.32823/JCIC.6.3.202012.149
- S. Ahn, "Analysis of Changes in the Actual Condition of Distance Learning due to COVID-19", Journal of Creative Information Culture, vol.6, no.3, pp.189-197, 2020. (in Korean) https://doi.org/10.32823/JCIC.6.3.202012.189
- A. Cho, J. Lee, "An Exploratory Study on Vicious Cyber Replies", The Korea Journal of Youth Counseling, vol.18, no.2, pp.117-131, 2010. (in Korean) https://doi.org/10.35151/kyci.2010.18.2.008
- S. Lee, "A Swearword Filter System for Online Game Chatting", Journal of the Korea Institute of Information and Communication Engineering, vol.15, no.7, pp.1531-1536, 2011. (in Korean) https://doi.org/10.6109/JKIICE.2011.15.7.1531
- I. Na, S. Lee, J. Lee, J. Koh, "Abusive Detection Using Bidirectional Long Short-Term Memory Networks", The Korea Journal of BigData, vol.4, no.2, pp.35-45, 2019. (in Korean) https://doi.org/10.36498/KBIGDT.2019.4.2.35
- J. Devlin, M. Chang, K. Lee, K. Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", arXiv preprint arXiv:1810.04805, 2019.
- A. Radford, K. Narasimhan, T. Salimans, I. Sutskever, "Improving Language Understanding by Generative Pre-Training". [Online]. Available: https://www.cs.ubc.ca/~amuham01/LING530/papers/radford2018improving.pdf (accessed March 2021).
- E. Jang, H. Choi, H. Lee, "Stock Prediction Using Combination of BERT Sentiment Analysis and Macro Economy Index", Journal of The Korea Society of Computer and Information, vol.25, no.5, pp.47-56, 2020. (in Korean) https://doi.org/10.9708/JKSCI.2020.25.05.047
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, I. Polosukhin, "Attention Is All You Need", Advances in Neural Information Processing Systems, pp.5998-6008, 2017.
- M. E. Peters, M. Neumann, M. Iyyer, M. Gardner, "Deep Contextualized Word Representations", arXiv preprint arXiv:1802.05365, 2018.
- Y. Wu, M. Schuster, Z. Chen, Q. V. Le, M. Norouzi, W. Macherey, M. Krikun, Y. Cao, Q. Gao, K. Macherey, J. Klingner, A. Shah, M. Johnson, X. Liu, L. Kaiser, S. Gouws, Y. Kato, T. Kudo, H. Kazawa, K. Stevens, G. Kurian, N. Patil, W. Wang, C. Young, J. Smith, J. Riesa, A. Rudnick, O. Vinyals, G. Corrado, M. Hughes, J. Dean, "Google's Neural Machine Translation System: Bridging the Gap Between Human and Machine Translation", arXiv preprint arXiv:1609.08144, 2016.
- Y. Choi, K. Lee, "Performance Analysis of Korean Morphological Analyzer based on Transformer and BERT", Journal of The Korean Institute of Information Scientists and Engineers, vol.47, no.8, pp.730-741, 2020. (in Korean)
- Y. Chun, "A Study on the Validation of Transfer Learning Effect Using BERT Transformer and Deep Learning: Application of Legal Consultation Data Classification Problem", Journal of the Korea Management Engineers Society, vol.24, no.4, pp.77-89, 2019. (in Korean) https://doi.org/10.35373/kmes.24.4.5
- S. Hwang, D. Kim, "BERT-based Classification Model for Korean Documents", The Journal of Society for e-Business Studies, vol.25, no.1, pp.203-214, 2020. (in Korean)
- S. Seo, S. Cho, "A Transfer Learning Method for Solving Imbalance Data of Abusive Sentence Classification", Journal of Korea Information Science Society, vol.44, no.12, pp.1275-1281, 2017. (in Korean)
- C. Sun, X. Qiu, Y. Xu, X. Huang, "How to Fine-Tune BERT for Text Classification?", Chinese Computational Linguistics, Lecture Notes in Computer Science, vol.11856, pp.194-206, 2019. https://doi.org/10.1007/978-3-030-32381-3_16