Using Deep Learning Facial Emotion Recognition Technology: Focusing on College Students' Mental Health

Exploration of deep learning facial emotion recognition technology in college students' mental health

  • Li, Bo (Department of Psychological Counseling, Paichai University)
  • Cho, Kyung-Duk (Department of Psychology & Counselling, Paichai University)
  • Received : 2021.10.20
  • Accepted : 2022.02.02
  • Published : 2022.03.31

Abstract

COVID-19 has made everyone anxious and has forced people to keep their distance, so it is necessary to conduct collective assessment and screening of college students' mental health at the start of each semester. This study adopted and trained a multi-layer perceptron neural network model with deep learning to identify facial emotions. After training, real pictures and videos were input; face detection located the faces in each sample, the detected faces were classified by emotion, and the predicted emotional result for each sample was drawn back onto the picture. The results are as follows: accuracy was 93.2% on the test set and 95.57% in practical use. By class, the recognition rate was 95% for Anger, 97% for Disgust, 96% for Happiness, 96% for Fear, 97% for Sadness, 95% for Surprise, and 93% for Neutral. Such efficient emotion recognition can provide objective data support for capturing students' negative emotions. A deep learning emotion recognition system can complement traditional psychological activities and provide additional dimensions of indicators for improving mental health.
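The pipeline described above (detect a face, pass the crop through a trained multi-layer perceptron, report the predicted emotion) can be sketched as follows. The paper does not specify the network architecture; the 48x48 grayscale input, the single 128-unit hidden layer, and the `EmotionMLP` class are illustrative assumptions, and the randomly initialized weights stand in for weights that would come from training on labeled face images.

```python
import numpy as np

# The seven emotion classes reported in the abstract.
EMOTIONS = ["Anger", "Disgust", "Fear", "Happiness", "Sadness", "Surprise", "Neutral"]

def softmax(z):
    """Convert raw scores to class probabilities."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

class EmotionMLP:
    """Minimal multi-layer perceptron: flattened 48x48 grayscale face -> 7 emotion probabilities.

    Hypothetical architecture; in the actual system the weights would be
    learned by deep-learning training, not randomly initialized.
    """
    def __init__(self, input_dim=48 * 48, hidden_dim=128, n_classes=len(EMOTIONS), seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.05, (input_dim, hidden_dim))
        self.b1 = np.zeros(hidden_dim)
        self.W2 = rng.normal(0.0, 0.05, (hidden_dim, n_classes))
        self.b2 = np.zeros(n_classes)

    def predict(self, face):
        x = face.reshape(-1) / 255.0               # normalize pixel intensities
        h = np.maximum(0.0, x @ self.W1 + self.b1)  # ReLU hidden layer
        return softmax(h @ self.W2 + self.b2)       # probability per emotion class

# After a face detector locates a face region, the crop is classified:
model = EmotionMLP()
face_crop = np.random.default_rng(1).integers(0, 256, (48, 48))  # stand-in for a detected face
probs = model.predict(face_crop)
label = EMOTIONS[int(probs.argmax())]  # the label drawn back onto the picture
```

In the described system the face detection step would supply `face_crop` from a real picture or video frame, and `label` would be overlaid on the image as the displayed prediction.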

