References
- Kang, Hyungsuc, & Yang, Janghoon (2020). Performance comparison of Word2Vec and fastText embedding models. Journal of Digital Contents Society, 21(7), 1335-1343. https://doi.org/10.9728/dcs.2020.21.7.1335
- Lee, Da-Bin, & Choi, Sung-Pil (2019). Comparative analysis of various Korean morpheme embedding models using massive textual resources. Journal of KIISE, 46(5), 413-418. https://doi.org/10.5626/JOK.2019.46.5.413
- Lee, Yong-Gu (2020). A study on book categorization in social sciences using kNN classifiers and table of contents text. Journal of the Korean Society for Information Management, 37(1), 1-21. https://doi.org/10.3743/KOSIM.2020.37.1.001
- Bojanowski, P., Grave, E., Joulin, A., & Mikolov, T. (2017). Enriching word vectors with subword information. Transactions of the Association for Computational Linguistics, 5, 135-146. https://doi.org/10.1162/tacl_a_00051
- Chen, Q., & Sokolova, M. (2021). Specialists, scientists, and sentiments: Word2Vec and Doc2Vec in analysis of scientific and medical texts. SN Computer Science, 2(5), 414. https://doi.org/10.1007/s42979-021-00807-1
- Devlin, J., Chang, M., Lee, K., & Toutanova, K. (2019). BERT: pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics, Volume 1, 4171-4186. https://doi.org/10.18653/v1/N19-1423
- Dharma, E. M., Gaol, F. L., Warnars, H. L. H. S., & Soewito, B. (2022). The accuracy comparison among Word2Vec, GloVe, and fastText towards convolution neural network (CNN) text classification. Journal of Theoretical and Applied Information Technology, 100(2), 349-359.
- Goularas, D., & Kamis, S. (2019). Evaluation of deep learning techniques in sentiment analysis from Twitter data. In 2019 International Conference on Deep Learning and Machine Learning in Emerging Applications (Deep-ML), 12-17. https://doi.org/10.1109/Deep-ML.2019.00011
- Harris, Z. S. (1954). Distributional structure. Word, 10(2-3), 146-162. https://doi.org/10.1080/00437956.1954.11659520
- Hinton, G. E., McClelland, J. L., & Rumelhart, D. E. (1986). Distributed representations. In Rumelhart, D. E., McClelland, J. L., & the PDP Research Group (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Volume 1. Cambridge, MA: MIT Press, 77-109.
- McCann, B., Bradbury, J., Xiong, C., & Socher, R. (2017). Learned in translation: contextualized word vectors. In Proceedings of the 31st International Conference on Neural Information Processing Systems, 6297-6308.
- Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013a). Efficient estimation of word representations in vector space. arXiv preprint. https://doi.org/10.48550/arXiv.1301.3781
- Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., & Dean, J. (2013b). Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26.
- Park, S., Byun, J., Baek, S., Cho, Y., & Oh, A. (2018). Subword-level word vector representations for Korean. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2429-2438. https://doi.org/10.18653/v1/P18-1226
- Pennington, J., Socher, R., & Manning, C. D. (2014). GloVe: global vectors for word representation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, 1532-1543. https://doi.org/10.3115/v1/D14-1162
- Peters, M. E., Neumann, M., Iyyer, M., Gardner, M., Clark, C., Lee, K., & Zettlemoyer, L. (2018). Deep contextualized word representations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics, Volume 1, 2227-2237. https://doi.org/10.18653/v1/N18-1202
- Salton, G., Wong, A., & Yang, C. S. (1975). A vector space model for automatic indexing. Communications of the Association for Computing Machinery, 18(11), 613-620. https://doi.org/10.1145/361219.361220
- Sitender, Sangeeta, Sushma, N. S., & Sharma, S. K. (2023). Effect of GloVe, Word2Vec and fastText embedding on English and Hindi neural machine translation systems. In Proceedings of Data Analytics and Management 2022, 433-447. https://doi.org/10.1007/978-981-19-7615-5_37
- Wang, B., Wang, A., Chen, F., Wang, Y., & Kuo, C. C. J. (2019). Evaluating word embedding models: methods and experimental results. Asia Pacific Signal and Information Processing Association Transactions on Signal and Information Processing, 8, e19. https://doi.org/10.1017/ATSIP.2019.12
- Wang, C., Nulty, P., & Lillis, D. (2020). A comparative study on word embeddings in deep learning for text classification. In Proceedings of the 4th International Conference on Natural Language Processing and Information Retrieval, 37-46. https://doi.org/10.1145/3443279.3443304
- Yang, X., Macdonald, C., & Ounis, I. (2018). Using word embeddings in Twitter election classification. Information Retrieval Journal, 21, 183-207. https://doi.org/10.1007/s10791-017-9319-5