A Video Summarization Study on Selecting Out Topic-Irrelevant Shots Using N400 ERP Components During Real-Time Video Watching

  • Kim, Yong Ho (Dept. of Mass Communications, College of Humanities and Social Sciences, Pukyong National University)
  • Kim, Hyun Hee (Dept. of Library and Information Science, College of Humanities, Myongji University)
  • Received : 2017.07.17
  • Reviewed : 2017.07.31
  • Published : 2017.08.31

Abstract

The 'semantic gap' is a long-standing problem in automatic video summarization; it refers to the gap between the semantics implied in video summarization algorithms and what people actually infer from watching videos. Using external EEG bio-feedback obtained from video watchers as a solution to this semantic gap problem raises several further issues. First, how should noise be defined and measured against ERP waveforms as signals? Second, are individual differences among subjects in noise and SNR, as conventionally measured in ERP studies using still images captured from videos, the same as those conceptualized and measured from the videos themselves? Third, do individual differences among subjects in noise and SNR levels help to detect, as signals, topic-irrelevant shots that do not match a subject's own semantic topical expectations (mismatch negativity at around 400 ms after stimulus onset)? The results of a repeated-measures ANOVA show a two-way interaction between topic relevance and noise level, implying that subjects with a low noise level during the video-watching session are sensitive to topic-irrelevant visual shots, as well as a three-way interaction among topic relevance, noise level, and SNR level, implying that subjects with a high noise level are sensitive to topic-irrelevant visual shots only if their SNR level is low.
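The abstract does not spell out how per-subject noise and SNR are quantified for continuous video watching. The sketch below shows one common operationalization on simulated epoched EEG, purely as an illustration: baseline RMS as the noise estimate and the N400-window amplitude over that RMS as the SNR. The sampling rate, window choices, and all variable names are assumptions, not the authors' method.

```python
import numpy as np

# Minimal sketch (assumed operationalization, not the paper's pipeline):
# estimate per-subject ERP noise and SNR from simulated epoched EEG.

rng = np.random.default_rng(0)
fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1 / fs)            # epoch: -200 ms to 800 ms
n_trials = 60

# Fake single-subject epochs: an N400-like negative deflection at ~400 ms
# plus Gaussian noise on every trial.
signal = -2e-6 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
epochs = signal + rng.normal(0, 5e-6, (n_trials, t.size))

erp = epochs.mean(axis=0)                   # trial-averaged ERP waveform

# Noise: RMS of the pre-stimulus baseline of the averaged waveform.
baseline = erp[t < 0]
noise_rms = np.sqrt(np.mean(baseline ** 2))

# Signal: mean amplitude in an assumed N400 window (300-500 ms).
n400_win = (t >= 0.3) & (t <= 0.5)
n400_amp = erp[n400_win].mean()

snr = abs(n400_amp) / noise_rms
print(f"baseline noise (RMS): {noise_rms:.2e} V, N400 SNR: {snr:.2f}")
```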
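To make the reported interaction tests concrete, here is a minimal sketch of the two-way case (topic relevance as a within-subject factor, noise level as a between-subject group) on simulated N400 amplitudes, using pingouin's mixed_anova. The data, group sizes, effect magnitudes, and column names are all hypothetical; the three-way test with SNR level would add a second between-subject factor and is not shown here.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Hypothetical sketch of the abstract's two-way test: topic relevance
# (within-subject) x noise level (between-subject) on N400 amplitude.
rng = np.random.default_rng(1)
rows = []
for subj in range(20):
    noise_group = "low" if subj < 10 else "high"
    for relevance in ("relevant", "irrelevant"):
        # Low-noise subjects get a larger (more negative) N400 to
        # topic-irrelevant shots, mimicking the reported interaction.
        effect = -3.0 if (noise_group == "low" and relevance == "irrelevant") else -1.0
        rows.append({"subject": subj,
                     "noise": noise_group,
                     "relevance": relevance,
                     "n400_uv": effect + rng.normal(0, 1.0)})
df = pd.DataFrame(rows)

# Mixed-design ANOVA: one within factor, one between factor.
aov = pg.mixed_anova(data=df, dv="n400_uv", within="relevance",
                     subject="subject", between="noise")
print(aov[["Source", "F", "p-unc"]])
```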

Keywords

References

  1. S. Lu, M.R. Lyu, and I. King, "Semantic Video Summarization Using Mutual Reinforcement Principle and Shot Arrangement Patterns," Proceedings of the 11th International Multimedia Modelling Conference, pp. 60-67, 2005.
  2. A.G. Money and H. Agius, "Video Summarisation: A Conceptual Framework and Survey," Journal of Visual Communication and Image Representation, Vol. 19, No. 2, pp. 121-143, 2008. https://doi.org/10.1016/j.jvcir.2007.04.002
  3. A.W.M. Smeulders, M. Worring, S. Santini, A. Gupta, and R. Jain, "Content-based Image Retrieval at the End of the Early Years," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 12, pp. 1349-1380, 2000. https://doi.org/10.1109/34.895972
  4. A.G. Money and H. Agius, "Analysing User Physiological Responses for Affective Video Summarisation," Displays, Vol. 30, No. 2, pp. 59-70, 2009. https://doi.org/10.1016/j.displa.2008.12.003
  5. S. Koelstra, C. Muhl, and I. Patras, "EEG Analysis for Implicit Tagging of Video Data," Proceedings of the 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, pp. 1-6, 2009.
  6. M.J.A. Eugster, T. Ruotsalo, M.M. Spape, I. Kosunen, S. Barral, N. Ravaja, et al., "Predicting Term-Relevance from Brain Signals," Proceedings of the 37th International Association for Computing Machinery Special Interest Group on Information Retrieval Conference on Research and Development in Information Retrieval, pp. 425-434, 2014.
  7. M. Allegretti, Y. Moshfeghi, J. Hadjigeorgieva, F.E. Pollick, J.M. Jose, and G. Pasi, "When Relevance Judgement is Happening? An EEG-based Study," Proceedings of the 38th International Association for Computing Machinery Special Interest Group on Information Retrieval Conference on Research and Development in Information Retrieval, pp. 719-722, 2015.
  8. Y.H. Kim and H.H. Kim, "Automatic Extraction Techniques of Topic-relevant Visual Shots Using Realtime Brainwave Responses: ERP N400 and P600 Hypotheses Test," Journal of Korea Multimedia Society, Vol. 19, No. 8, pp. 1260-1274, 2016. https://doi.org/10.9717/kmms.2016.19.8.1260
  9. T.W. Picton, S. Bentin, P. Berg, E. Donchin, S.A. Hillyard, R.J. Johnson, et al., "Guidelines for Using Human Event-related Potentials to Study Cognition: Recording Standards and Publication Criteria," Psychophysiology, Vol. 37, No. 2, pp. 127-152, 2000. https://doi.org/10.1111/1469-8986.3720127
  10. M. Fabiani, G. Gratton, and M.G.H. Coles, Event-related Brain Potentials: Methods, Theory, and Applications, Handbook of Psychophysiology, Cambridge University Press, Cambridge, 2000.
  11. M. Kutas and S.A. Hillyard, "Reading Senseless Sentences: Brain Potentials Reflect Semantic Incongruity," Science, Vol. 207, No. 4427, pp. 203-205, 1980. https://doi.org/10.1126/science.7350657
  12. S.J. Luck, Ten Simple Rules for Designing and Interpreting ERP Experiments, Event-Related Potentials: A Methods Handbook, Massachusetts Institute of Technology Press, Cambridge, 2004.