http://dx.doi.org/10.4218/etrij.13.0113.0194

Extraction of User Preference for Video Stimuli Using EEG-Based User Responses  

Moon, Jinyoung (Software Research Laboratory, ETRI, KAIST)
Kim, Youngrae (Software Research Laboratory, ETRI)
Lee, Hyungjik (Software Research Laboratory, ETRI)
Bae, Changseok (Software Research Laboratory, ETRI)
Yoon, Wan Chul (Department of Industrial and System Engineering, KAIST)
Publication Information
ETRI Journal / v.35, no.6, 2013, pp. 1105-1114
Abstract
Owing to the large number of video programs available, a method is needed for accessing preferred videos efficiently through personalized video summaries and clips. Automatically recognizing a user's state while viewing a video is essential for extracting meaningful video segments. Although there have been many studies on emotion recognition using various user responses, electroencephalogram (EEG)-based research on recognizing preference for videos is at a very early stage. This paper proposes classification models, based on linear and nonlinear classifiers, that use EEG features of band power (BP) values and asymmetry scores to distinguish four preference classes. The quadratic-discriminant-analysis (QDA)-based model using BP features achieves a classification accuracy of 97.39% (±0.73%), and the models based on the other nonlinear classifiers using BP features achieve accuracies of over 96%, surpassing previous work that addressed only binary preference classification. These results show that the proposed approach is accurate and discriminative enough to be employed in personalized video segmentation.
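The feature pipeline the abstract describes, band power per frequency band plus hemispheric asymmetry scores fed to a QDA classifier, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the band boundaries, sampling rate, channel pairing, and the log-difference form of the asymmetry score are common conventions in the EEG literature and are assumed here, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Assumed band definitions (Hz) and sampling rate; these vary across studies.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
FS = 128

def band_power(signal, fs=FS):
    """Mean Welch PSD within each band for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def features(epoch, pairs=((0, 1),)):
    """BP features for every channel, plus asymmetry scores
    (log right-channel BP minus log left-channel BP) per band
    for each assumed (left, right) electrode index pair."""
    bps = [band_power(ch) for ch in epoch]
    feats = [v for bp in bps for v in bp.values()]
    for left, right in pairs:
        for band in BANDS:
            feats.append(np.log(bps[right][band]) - np.log(bps[left][band]))
    return feats

# Synthetic demo: two channels, two toy classes differing in alpha amplitude.
rng = np.random.default_rng(0)
t = np.arange(FS * 4) / FS

def make_epoch(alpha_amp):
    noise = rng.normal(0.0, 1.0, (2, t.size))
    return noise + alpha_amp * np.sin(2 * np.pi * 10 * t)  # 10 Hz alpha

X = np.array([features(make_epoch(a)) for a in [0.2] * 20 + [2.0] * 20])
y = np.array([0] * 20 + [1] * 20)

clf = QuadraticDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```

The paper reports four preference classes; the two-class toy labels above only keep the demo compact, and the same feature matrix would feed a four-class QDA unchanged.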
Keywords
Preference; video; EEG; classification; feature selection; brain-computer interface
Citations & Related Records
Times Cited By KSCI : 2