Analysis of Facial Movement According to Opposite Emotions

  • Eui Chul Lee (Department of Computer Science, Sangmyung University) ;
  • Yoon Kyung Kim (Department of Computer Science, Graduate School, Sangmyung University) ;
  • Min Kyoung Bae (Department of Computer Science, Graduate School, Sangmyung University) ;
  • Han Sol Kim (Department of Computer Science, Graduate School, Sangmyung University)
  • Received : 2015.06.19
  • Accepted : 2015.07.22
  • Published : 2015.10.28

Abstract

In this paper, facial movements are analyzed in terms of opposite emotion stimuli through image processing of facial images captured with a Kinect camera. To induce the two opposite emotion pairs "Sad - Excitement" and "Contentment - Angry", which sit at origin-symmetric positions on Russell's 2D emotion model, visual and auditory stimuli were presented to subjects simultaneously. First, 31 main points that represent facial movement well were chosen among the 121 facial feature points of the active appearance model provided by the Kinect Face Tracking SDK, and pixel changes around these 31 points were measured. A local minimum shift matching method was used to handle the non-linear movement of facial muscles. As a result, right-side facial movement dominated for the "Sad" emotion and left-side movement for the "Excitement" emotion, so the locations of facial movement for these two opposite stimuli were themselves opposite. Left-side facial movement occurred comparatively more often for the "Contentment" emotion, whereas the "Angry" emotion produced movement on both sides of the face; the difference between this opposite pair was therefore observed on the right side.
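The paper does not include source code, but the local minimum shift matching step it describes can be sketched as local block matching: for each selected feature point, search a small neighborhood in the next frame for the shift that minimizes the sum of squared differences (SSD) between the patch around the point in the two frames. The function name, patch size, and search radius below are illustrative assumptions, not the authors' actual parameters.

```python
def local_minimum_shift(prev_frame, next_frame, point, patch=7, search=5):
    """Return the shift (dy, dx) around `point` that minimizes the sum of
    squared differences between a patch in prev_frame and the shifted patch
    in next_frame. Frames are 2D grayscale arrays (lists of rows); patch
    size and search radius are illustrative assumptions."""
    y, x = point
    h = patch // 2

    def ssd(dy, dx):
        # SSD matching cost between the reference patch and the candidate
        # patch displaced by (dy, dx) in the next frame.
        total = 0
        for i in range(-h, h + 1):
            for j in range(-h, h + 1):
                diff = prev_frame[y + i][x + j] - next_frame[y + dy + i][x + dx + j]
                total += diff * diff
        return total

    shifts = [(dy, dx) for dy in range(-search, search + 1)
                       for dx in range(-search, search + 1)]
    # The shift with minimum cost is taken as the local, possibly non-linear,
    # displacement of the facial feature point between the two frames.
    return min(shifts, key=lambda s: ssd(*s))
```

Summing the magnitudes of these per-point displacements separately for the left and right halves of the face would then yield the kind of side-by-side movement comparison reported in the results.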

Keywords

References

  1. M. W. Park, C. J. Kim, M. Whang, and E. C. Lee, "Individual Emotion Classification between Happiness and Sadness by Analyzing Photoplethysmography and Skin Temperature", 2013 Fourth World Congress on Software Engineering (WCSE), pp.190-194, 2013.
  2. N. Wu, H. Jiang, and G. Yang, "ECG Emotion recognition based on physiological signals", Lecture Notes in Computer Science, pp.311-320, 2012.
  3. F. C. Chang, C. K. Chang, C. C. Chiu, S. F. Hsu, and Y. D. Lin, "ECG Variations of HRV analysis in different approaches", In Proc. of Computers in Cardiology, pp.17-20, 2007.
  4. K. Kaneko and K. Sakamoto, "Spontaneous Blinks as a Criterion of Visual Fatigue during Prolonged Work on Visual Display Terminals", J. of Perceptual and Motor Skills, Vol.92, No.1, pp.234-350, 2001. https://doi.org/10.2466/pms.2001.92.1.234
  5. E. Lee, H. Heo, and K. Park, "The Comparative Measurements of Eyestrain Caused by 2D and 3D Displays", IEEE Transactions on Consumer Electronics, Vol.56, No.3, pp.1677-1683, 2010. https://doi.org/10.1109/TCE.2010.5606312
  6. H. Gunes and M. Piccardi, "Bi-modal emotion recognition from expressive face and body gestures", J. of Network and Computer Applications, Vol.30, No.4, pp.1334-1345, 2007. https://doi.org/10.1016/j.jnca.2006.09.007
  7. T. Partala and V. Surakka, "Pupil Size Variation as an Indication of Affective Processing", Int. J. Human-Computer Studies, Vol.59, No.1-2, pp.185-198, 2003. https://doi.org/10.1016/S1071-5819(03)00017-X
  8. Y. Song, L. Morency, and R. Davis, "Learning a Sparse Codebook of Facial and Body Microexpressions for Emotion Recognition", ICMI, pp.237-244, 2013.
  9. B. Heisele, "Face Recognition with Support Vector Machines: Global Versus Component-based Approach", ICCV, Vol.2, pp.688-694, 2001.
  10. T. Wang, H. Ai, B. Wu, and C. Huang, "Real Time Facial Expression Recognition with Adaboost", ICPR, Vol.3, pp.926-929, 2004.
  11. H. Sunghee and B. Hyeran, "Facial Expression Recognition using Eigen-points", Korean Institute of Information Science and Engineering, Vol.31, No.1, pp.817-819, 2004.
  12. L. Zhang, "Facial Expression Recognition Using Facial Movement Features", IEEE Transactions on Affective Computing, Vol.2, No.4, pp.219-229, 2011.
  13. E. Lee, M. Whang, D. Ko, S. Park, and S. Hwang, "A new social emotion estimating method by measuring micro movement of human bust", Industrial Application of Affective Engineering, pp.19-26, 2014.
  14. A. Jana, Kinect for Windows SDK Programming Guide, Packt Publishing, 2014.
  15. M. J. Lyons, S. Akamatsu, M. Kamachi, and J. Gyoba, "Coding Facial Expressions with Gabor Wavelets", Third IEEE International Conference on Automatic Face and Gesture Recognition, Nara, Japan, IEEE Computer Society, pp.200-205, 1998.
  16. J. Russell, "A Circumplex Model of Affect", J. of Personality and Social Psychology, Vol.39, No.6, pp.1161-1178, 1980. https://doi.org/10.1037/h0077714
  17. Y. Kim, H. Kim, and E. Lee, "Emotion Classification using Facial Temporal Sparsity", Int. J. of Applied Engineering Research, Vol.9, No.24, pp.24793-24801, 2014.
  18. T. Ahonen, A. Hadid, and M. Pietikainen, "Face Description with Local Binary Patterns: Application to Face Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.28, No.12, pp.2037-2041, 2006. https://doi.org/10.1109/TPAMI.2006.244
  19. R. W. Hamming, "Error Detecting and Error Correcting Codes", Bell System Technical Journal, Vol.29, No.2, pp.147-160, 1950. https://doi.org/10.1002/j.1538-7305.1950.tb00463.x
  20. http://hmi.ewi.utwente.nl/verslagen/capita-selecta/CS-Oude_Bos-Danny.pdf
  21. J. Machajdik and A. Hanbury, "Affective Image Classification Using Features Inspired by Psychology and Art Theory", ACM MM, pp.83-92, 2010.
  22. http://musicovery.com/index.php?ct=us
  23. P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, 1978.