http://dx.doi.org/10.13067/JKIECS.2022.17.6.1137

Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification  

Dong-Kyu Kim (Dept. of Human Intelligence Robot Engineering, Sangmyung University)
So Hwa Lee (Dept. of Human Intelligence Robot Engineering, Sangmyung University)
Jae Hwan Bong (Dept. of Human Intelligence Robot Engineering, Sangmyung University)
Publication Information
The Journal of the Korea Institute of Electronic Communication Sciences, vol. 17, no. 6, 2022, pp. 1137-1144
Abstract
In this study, an artificial intelligence (AI) was developed to assist facial expression practice for expressing emotions. The developed AI fed multimodal inputs, consisting of sentences and facial images, into deep neural networks (DNNs), which computed the similarity between the emotion predicted from the sentence and the emotion predicted from the facial image. The user practiced facial expressions for the situation described by a given sentence, and the AI returned numerical feedback based on that similarity. A ResNet34 model was trained on the public FER2013 dataset to predict emotions from facial images. To predict emotions from sentences, a KoBERT model was fine-tuned via transfer learning on the conversational speech dataset for emotion classification released publicly by AIHub. The DNN predicting emotions from facial images achieved 65% accuracy, which is comparable to human emotion-classification ability; the DNN predicting emotions from sentences achieved 90% accuracy. The performance of the developed AI was evaluated through facial expression experiments in which an ordinary person participated.
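The abstract does not specify how the similarity between the two emotion predictions is measured, so the sketch below is only one plausible reading: it compares the softmax output distributions of a hypothetical sentence classifier and a hypothetical face classifier with cosine similarity and scales the result to a 0-100 practice score. All names (`EMOTIONS`, `similarity_feedback`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical 7-class emotion set, matching FER2013-style labels.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def softmax(logits):
    """Convert raw classifier logits into a probability distribution."""
    z = np.exp(logits - np.max(logits))
    return z / z.sum()

def similarity_feedback(sentence_logits, face_logits):
    """Cosine similarity between the sentence-predicted and
    face-predicted emotion distributions, scaled to a 0-100 score."""
    p = softmax(np.asarray(sentence_logits, dtype=float))
    q = softmax(np.asarray(face_logits, dtype=float))
    cos = float(np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)))
    return round(100 * cos, 1)

# Matching emotions ("happy" sentence, "happy" face) score high;
# mismatched emotions ("happy" sentence, "sad" face) score lower.
score_match = similarity_feedback([0, 0, 0, 5, 0, 0, 0], [0, 0, 0, 5, 0, 0, 0])
score_mismatch = similarity_feedback([0, 0, 0, 5, 0, 0, 0], [0, 0, 0, 0, 5, 0, 0])
print(score_match, score_mismatch)
```

In practice the logits would come from the trained ResNet34 and KoBERT models; other distance measures (e.g. KL divergence) would serve the same feedback role.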
Keywords
Emotion Classification; Facial Expression Practice; Facial Image Processing; Natural Language Processing