http://dx.doi.org/10.3745/KTSDE.2021.10.11.521

Sentence Recommendation Using Beam Search in a Military Intelligent Image Analysis System  

Na, Hyung-Sun (Department of Artificial Intelligence Convergence, Kwangwoon University)
Jeon, Tae-Hyeon (Department of Computer Engineering, Hoseo University)
Kang, Hyung-Seok (Agency for Defense Development)
Ahn, Jinhyun (Department of Management Information Systems, Jeju National University)
Im, Dong-Hyuk (School of Information Convergence, Kwangwoon University)
Publication Information
KIPS Transactions on Software and Data Engineering / v.10, no.11, 2021, pp. 521-528
Abstract
In existing military image analysis systems, image analysts analyze and identify imagery themselves and then write and disseminate the related reports; this workflow involves frequent repetitive tasks and a correspondingly heavy workload. In this paper, to address this problem, we propose an algorithm that runs the Seq2Seq model, which normally operates on whole sentences, at the word level, and we apply the Attention mechanism to improve accuracy. In addition, by applying the Beam Search technique, we recommend a variety of candidate identification sentences for the current image based on the past identification records of a specific area. Experiments confirmed that Beam Search recommends sentences more effectively than the conventional Greedy Search, and that recommendation accuracy increases as the beam size grows.
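
As a rough sketch of the decoding idea described above: beam search keeps the k highest-scoring partial sentences at each step, whereas greedy search keeps only the single best token at each step (the special case k = 1). The sketch below is generic and illustrative, not the authors' implementation; the toy bigram table, token ids, and the beam_width and max_len parameters are all invented for the example, with the paper's word-level Seq2Seq decoder with attention standing behind step_log_probs.

import math

# Toy bigram table used only so this sketch runs end to end. In the paper this
# role is played by a word-level Seq2Seq decoder with attention; the table,
# token ids, and probabilities here are assumptions made for illustration.
TOY_BIGRAMS = {
    0: {1: 0.6, 2: 0.4},   # <bos> -> token 1 | token 2
    1: {2: 0.7, 3: 0.3},   # token 1 -> token 2 | <eos>
    2: {3: 0.9, 1: 0.1},   # token 2 -> <eos> | token 1
}

def step_log_probs(prefix):
    """Log-probability of each possible next token given the current prefix."""
    return {tok: math.log(p) for tok, p in TOY_BIGRAMS[prefix[-1]].items()}

def beam_search(bos_id=0, eos_id=3, beam_width=3, max_len=10):
    """Return complete hypotheses, best first; greedy search is beam_width=1."""
    beams = [([bos_id], 0.0)]            # (token sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in step_log_probs(seq).items():
                candidates.append((seq + [tok], score + logp))
        # Prune: keep only the beam_width highest-scoring partial sentences.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_width]:
            (finished if seq[-1] == eos_id else beams).append((seq, score))
        if not beams:                    # every surviving hypothesis has ended
            break
    return sorted(finished, key=lambda c: c[1], reverse=True)

if __name__ == "__main__":
    # With beam_width=3 several alternative sentences are recommended at once.
    for seq, score in beam_search(beam_width=3):
        print(seq, round(score, 3))

With beam_width = 1 the same loop degenerates to greedy search, which is why a larger beam can surface higher-scoring complete sentences that a greedy decoder would have pruned away at an early step; this matches the reported result that recommendation accuracy improves as the beam size grows.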
Keywords
NLP; Beam Search; Military; Seq2Seq; Attention Mechanism