Effect Analysis of Data Imbalance for Emotion Recognition Based on Deep Learning


  • Hajin Noh (Dept. of IT Engineering, Sookmyung Women's University);
  • Yujin Lim (Division of Artificial Intelligence Engineering, Sookmyung Women's University)
  • Received : 2022.12.19
  • Accepted : 2023.02.22
  • Published : 2023.08.31

Abstract

In recent years, as online counseling for infants and adolescents has increased, CNN-based deep learning models have been widely adopted as assistive tools for emotion recognition. However, since most emotion recognition models are trained mainly on adult data, their performance is limited when they are applied to infants and adolescents. In this paper, to analyze these performance constraints, we use LIME, one of the explainable AI (XAI) techniques, to compare the facial-expression features used for emotion recognition in infants and adolescents against those used in adults. In addition, we perform the same experiments on male and female groups to analyze gender-specific facial-expression features. Based on the results, we explain the age-specific and gender-specific findings in terms of the data distribution of the CNN models' pre-training datasets and highlight the importance of balanced training data.

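The abstract's core tool is LIME: perturb an input image by switching superpixel segments on and off, query the CNN on each perturbation, and fit a weighted linear surrogate whose coefficients rank segments (facial regions) by importance. As an illustration only, not the authors' implementation, the procedure can be sketched in NumPy; `lime_explain`, the toy segmentation, and the predict function below are all hypothetical names introduced for this sketch:

```python
import numpy as np

def lime_explain(predict_fn, image, segments, n_samples=500, seed=0):
    """Minimal LIME-style explanation for one image.

    Perturbs the image by switching superpixel segments on/off,
    queries the model on each perturbation, and fits a weighted
    linear surrogate whose coefficients rank segment importance.
    """
    rng = np.random.default_rng(seed)
    seg_ids = np.unique(segments)
    k = len(seg_ids)

    # Binary design matrix: one row per perturbed sample, one column per segment.
    Z = rng.integers(0, 2, size=(n_samples, k))
    Z[0] = 1  # keep the unperturbed image as the first sample

    preds = np.empty(n_samples)
    for i, z in enumerate(Z):
        perturbed = image.copy()
        for j, on in enumerate(z):
            if not on:
                perturbed[segments == seg_ids[j]] = 0.0  # grey out segment
        preds[i] = predict_fn(perturbed)

    # Exponential kernel: perturbations closer to the original weigh more.
    dist = 1.0 - Z.mean(axis=1)
    w = np.exp(-(dist ** 2) / 0.25)

    # Weighted least squares for the surrogate coefficients (with intercept).
    X = np.hstack([Z, np.ones((n_samples, 1))])
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(X * sw, preds * sw[:, 0], rcond=None)
    return coef[:k]  # importance score per segment
```

In the paper's setting, `predict_fn` would be the pretrained CNN's score for one emotion class and `segments` a superpixel map of the face; comparing the resulting importance maps across age and gender groups is what exposes the pre-training data imbalance.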

Keywords

Acknowledgments

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1F1A1047113).
