References
- Bahdanau, D., Cho, K., and Bengio, Y., "Neural machine translation by jointly learning to align and translate", CoRR, abs/1409.0473, 2014.
- Cho, K., van Merrienboer, B., Gulcehre, C., Bougares, F., Schwenk, H., and Bengio, Y., "Learning phrase representations using RNN encoder-decoder for statistical machine translation", CoRR, abs/1406.1078, 2014.
- Han, J., Kim, J., Jeon, W., Kim, H., and Hong, Y., "A Counseling Matching System Using BERT Language Model", The Korean Institute of Information Scientists and Engineers, pp. 1566-1568, 2020.
- Hochreiter, S., Bengio, Y., Frasconi, P., and Schmidhuber, J., "Gradient flow in recurrent nets: The difficulty of learning long-term dependencies", 2001.
- Hong, S., Na, S., Kim, K. W., Shin, B., and Chung, Y., "BERT for Alzheimer's Disease and Schizophrenia diagnosis", The Korean Institute of Information Scientists and Engineers, pp. 419-421, 2020.
- Devlin, J., Chang, M., Lee, K., and Toutanova, K., "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", arXiv preprint arXiv:1810.04805, 2018.
- Kim, J., "Analysis on the Increase in Unemployment Rates Since 2014", KDI (Korea Development Institute) Feature Article (2018.11. 06) Eng, 2018.
- Kim, Y., Denton, C., Hoang, L., and Rush, A. M., "Structured attention networks", In International Conference on Learning Representations, 2017.
- Le, Q. V. and Mikolov, T., "Distributed Representations of Sentences and Documents", Proceedings of the International Conference on Machine Learning (ICML), 2014.
- Luong, M., Pham, H., and Manning, C. D., "Effective approaches to attention-based neural machine translation", arXiv preprint arXiv:1508.04025, 2015.
- Maheshwary, S. and Misra, H., "Matching Resumes to Jobs via Deep Siamese Network", Proceedings of The Web Conference 2018 (WWW '18), 2018, pp. 87-88.
- Pessemier, T. D., Vanhecke, K., and Martens, L., "A scalable, high-performance algorithm for hybrid job recommendations", Proceedings of the Recommender Systems Challenge (RecSys Challenge '16), 2016.
- SKTBrain, "Korean BERT pre-trained cased (KoBERT)", https://github.com/SKTBrain/KoBERT.
- Son, B., "NCS Utilization in Recruitment and Selection : The case of HRDKorea.", Korean Management Consulting Review, Vol. 15, No. 4, 2015, pp. 217-228.
- Song, H. S., "A design of content-based metric learning model for HR matching", Journal of Information Technology Applications & Management, Vol. 27, No. 6, 2020, pp. 141-151. https://doi.org/10.21219/JITAM.2020.27.6.141
- Statistics Korea, "December 2020 and Annual Employment Trends", 2020.12.
- Sutskever, I., Vinyals, O., and Le, Q. V., "Sequence to sequence learning with neural networks", In Advances in Neural Information Processing Systems, 2014, pp. 3104-3112.
- Valverde-Rebaza, J., Puma, R., Bustios, P., and Silva, N. C., "Job Recommendation based on Job Seeker Skills: An Empirical Study", Proceedings of the First Workshop on Narrative Extraction From Text (Text2Story 2018), 2018, pp. 47-51.
- Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., and Polosukhin, I., "Attention is all you need", arXiv preprint arXiv:1706.03762, 2017.
- Yoo, S. and Jeong, O., "An Intelligent Chatbot Utilizing BERT Model and Knowledge Graph", The Journal of Society for e-Business Studies, Vol. 24, No. 3, 2019, pp. 87-98.