http://dx.doi.org/10.7236/JIIBC.2022.22.4.23

News Recommendation Exploiting Document Summarization based on Deep Learning  

Heu, Jee-Uk (College of KNU Charm-Injae, Kangnam University)
Publication Information
The Journal of the Institute of Internet, Broadcasting and Communication, v.22, no.4, 2022, pp. 23-28
Abstract
Recently, smart devices such as smartphones and tablet PCs have come to serve as information gateways, and consuming web news through portal sites has become increasingly important for many users. However, the sheer volume of news published on the web makes it hard for users to find the information they want, and similar, repeated articles add to their confusion. In this paper, we propose a news recommendation system based on KoBART document summarization, which selects news for users from the candidate articles on a news portal. Experimental results show that, by pre-training and fine-tuning KoBART on collected news data, the proposed system achieves higher performance and recommends news efficiently.
Keywords
BART; BERT; Document-Summarization; Recommendation; RNN; Seq2Seq;
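The abstract describes selecting news for users from a pool of candidate articles that contains similar and repeated content. As a rough illustration only (not the paper's actual method), the sketch below stands in for KoBART-generated summaries with plain strings and uses a toy bag-of-words cosine similarity to show how near-duplicate candidates could be filtered before recommendation; the tokenization, the similarity measure, and the 0.6 threshold are all assumptions for the sketch.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(summaries: list[str], threshold: float = 0.6) -> list[str]:
    # Greedily keep a summary only if it is not too similar to any
    # already-selected one, filtering near-duplicate articles.
    selected: list[str] = []
    vecs: list[Counter] = []
    for s in summaries:
        v = Counter(s.lower().split())
        if all(cosine(v, u) < threshold for u in vecs):
            selected.append(s)
            vecs.append(v)
    return selected

# Toy candidate summaries; the second is a near-duplicate of the first.
news = [
    "Central bank raises interest rates to curb inflation",
    "Central bank raises rates to curb rising inflation",
    "New KoBART model tops Korean summarization benchmark",
]
print(recommend(news))  # the near-duplicate second article is dropped
```

In the system described by the paper, the candidate strings would instead be KoBART summaries of the full articles, so redundancy is judged on condensed content rather than raw text.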