
Emotion Recognition based on Tracking Facial Keypoints  

Lee, Yong-Hwan (Dept. of Digital Contents, Wonkwang University)
Kim, Heung-Jun (Dept. of Computer Science and Engineering, Gyeongnam National University of Science and Technology)
Publication Information
Journal of the Semiconductor & Display Technology, v.18, no.1, 2019, pp. 97-101
Abstract
Understanding and classifying human emotion are important tasks in human-machine communication systems. This paper proposes a novel emotion recognition method that extracts facial keypoints with an Active Appearance Model and interprets them with a proposed classification model of facial features, enabling the system to understand and classify human emotion. The appearance model captures variations in expression, which the proposed classification model evaluates as the facial expression changes. The method classifies four basic emotions (normal, happy, sad, and angry). To evaluate its performance, we measure the recognition success rate on common datasets and achieve a best accuracy of 93% and an average accuracy of 82.2% in facial emotion recognition. The results show that the proposed method recognizes emotion effectively compared with existing schemes.
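
A minimal illustrative sketch of the pipeline described above (keypoint extraction followed by emotion classification) is given below in Python. It is not the authors' implementation: dlib's pre-trained 68-point landmark detector stands in for the Active Appearance Model, a scikit-learn SVM stands in for the paper's classification model, and the feature construction and model file name are assumptions made only for illustration.

# Hypothetical sketch of keypoint-based emotion classification.
# dlib's 68-point landmark detector replaces the paper's Active Appearance
# Model, and an SVM replaces the paper's (unspecified) classification model.
import numpy as np
import dlib
from sklearn.svm import SVC

EMOTIONS = ["normal", "happy", "sad", "angry"]  # the four classes in the paper

detector = dlib.get_frontal_face_detector()
# Pre-trained landmark model; downloaded separately from dlib.net.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def extract_keypoints(gray_image):
    """Return a (68, 2) array of facial landmarks, or None if no face is found."""
    faces = detector(gray_image, 1)
    if len(faces) == 0:
        return None
    shape = predictor(gray_image, faces[0])
    return np.array([[p.x, p.y] for p in shape.parts()], dtype=np.float64)

def keypoints_to_features(points):
    """Translation- and scale-normalized landmark coordinates as a 136-D feature vector."""
    center = points.mean(axis=0)
    scale = np.linalg.norm(points - center)
    return ((points - center) / scale).ravel()

# Training and prediction (X_train / y_train would come from a labelled facial
# expression dataset such as CK+; shown here only as a usage example):
# clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)
# label = EMOTIONS[clf.predict([keypoints_to_features(points)])[0]]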
Keywords
Facial Emotion Recognition; Active Appearance Model; Facial Keypoints; Facial Feature Tracking; Emotion Classification Model;
Citations & Related Records
Times Cited by KSCI: 1