• Title/Summary/Keyword: Facial Emotions

Search results: 159

Transfer Learning for Face Emotions Recognition in Different Crowd Density Situations

  • Amirah Alharbi
    • International Journal of Computer Science & Network Security / v.24 no.4 / pp.26-34 / 2024
  • Most human emotions are conveyed through facial expressions, which represent the predominant source of emotional data. This research investigates the impact of crowds on human emotions by analysing facial expressions. It examines how crowd behaviour, face recognition technology, and deep learning algorithms contribute to understanding how emotions change at different levels of crowding. The study identifies common emotions expressed during congestion, differences between crowded and less crowded areas, and changes in facial expressions over time. The findings can inform urban planning and crowd event management by providing insights for developing coping mechanisms for affected individuals. Limitations and challenges of reliable facial expression analysis, including age- and context-related differences, are also discussed.
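As a rough illustration of the transfer-learning approach named in the title above (and not the paper's exact pipeline), the sketch below reuses a pretrained image backbone and fine-tunes only a newly attached emotion-classification head. The ResNet-18 backbone, the seven-class label set, and the training-step shape are illustrative assumptions.

```python
# Minimal transfer-learning sketch for facial emotion recognition
# (assumed setup for illustration; not the paper's exact pipeline).
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 7  # e.g., anger, disgust, fear, happiness, sadness, surprise, neutral

# Start from an ImageNet-pretrained backbone and freeze it.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the classifier head with a new, trainable emotion head.
model.fc = nn.Linear(model.fc.in_features, NUM_EMOTIONS)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(face_batch: torch.Tensor, labels: torch.Tensor) -> float:
    """One fine-tuning step on a batch of face crops and emotion labels."""
    optimizer.zero_grad()
    loss = criterion(model(face_batch), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```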

Developmental Changes in Emotional-States and Facial Expression (정서 상태와 얼굴표정간의 연결 능력의 발달)

  • Park, Soo-Jin;Song, In-Hae;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility / v.10 no.1 / pp.127-133 / 2007
  • The present study investigated whether the ability to read emotional states from facial expressions changes with age (3-year-olds, 5-year-olds, and university students), sex (male, female), the facial area presented (whole face, eyes only), and the type of emotion (basic, complex). Thirty-two facial expressions of emotional states, each relatively strongly linked to an emotion vocabulary item, were used as stimuli; the stimuli were photographs of professional actors performing the expressions. Each participant was presented with stories designed to evoke particular emotions and was then asked to choose the facial expression the main character would have made in the situation described. The results showed that the ability to read facial expressions improves with age. Participants also performed better with whole faces than with eyes alone, and with basic emotions than with complex emotions. While females showed no performance difference between the presentation areas, males performed better in the whole-face condition than in the eyes-only condition. The results demonstrate that age, the facial area presented, and the type of emotion affect how people estimate others' emotions from facial expressions.


Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification (감정 분류를 이용한 표정 연습 보조 인공지능)

  • Dong-Kyu, Kim;So Hwa, Lee;Jae Hwan, Bong
    • The Journal of the Korea institute of electronic communication sciences / v.17 no.6 / pp.1137-1144 / 2022
  • In this study, an artificial intelligence (AI) system was developed to help users practice facial expressions for conveying emotions. The AI takes multimodal inputs consisting of sentences and facial images and feeds them to deep neural networks (DNNs). The DNNs calculate the similarity between the emotion predicted from the sentence and the emotion predicted from the facial image. The user practices facial expressions for the situation given by a sentence, and the AI provides numerical feedback based on the similarity between the two predictions. A ResNet34 model was trained on the public FER2013 dataset to predict emotions from facial images. To predict emotions from sentences, a KoBERT model was fine-tuned via transfer learning on the public conversational speech dataset for emotion classification released by AIHub. The DNN that predicts emotions from facial images achieved 65% accuracy, comparable to human emotion classification ability, and the DNN that predicts emotions from sentences achieved 90% accuracy. The performance of the developed AI was evaluated through experiments in which an ordinary participant changed facial expressions.
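The feedback step described above compares the emotion predicted from the prompt sentence with the emotion predicted from the user's face. A minimal sketch is shown below, assuming both classifiers output probability distributions over the same emotion labels; the cosine-similarity metric and the 0-100 scaling are illustrative choices, as the abstract does not state how the similarity score is computed.

```python
# Minimal sketch: score how well a practiced facial expression matches the
# emotion implied by a prompt sentence. Assumes both classifiers output
# probabilities over the same labels; cosine similarity is an assumption.
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def cosine_similarity(p, q) -> float:
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(p @ q / (np.linalg.norm(p) * np.linalg.norm(q) + 1e-12))

def expression_feedback(sentence_probs, face_probs) -> float:
    """Return a 0-100 score for how closely the two emotion predictions agree."""
    return round(100 * cosine_similarity(sentence_probs, face_probs), 1)

# Example: the sentence implies happiness and the face is mostly happy,
# so the score is high.
sentence_probs = [0.02, 0.01, 0.02, 0.85, 0.03, 0.05, 0.02]
face_probs     = [0.05, 0.02, 0.03, 0.70, 0.05, 0.10, 0.05]
print(expression_feedback(sentence_probs, face_probs))
```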

Alexithymia and the Recognition of Facial Emotion in Schizophrenic Patients (정신분열병 환자에서의 감정표현불능증과 얼굴정서인식결핍)

  • Noh, Jin-Chan;Park, Sung-Hyouk;Kim, Kyung-Hee;Kim, So-Yul;Shin, Sung-Woong;Lee, Koun-Seok
    • Korean Journal of Biological Psychiatry / v.18 no.4 / pp.239-244 / 2011
  • Objectives: Schizophrenic patients have been shown to be impaired both in emotional self-awareness and in recognizing others' facial emotions. Alexithymia refers to deficits in emotional self-awareness. The relationship between alexithymia and recognition of others' facial emotions needs to be explored to better understand the characteristics of emotional deficits in schizophrenic patients. Methods: Thirty control subjects and 31 schizophrenic patients completed the Toronto Alexithymia Scale-20-Korean version (TAS-20K) and a facial emotion recognition task. The stimuli in the facial emotion recognition task consisted of six emotion categories (happiness, sadness, anger, fear, disgust, and neutral). Recognition accuracy was calculated within each emotion category, and correlations between TAS-20K scores and recognition accuracy were analyzed. Results: The schizophrenic patients showed higher TAS-20K scores and lower recognition accuracy than the control subjects. Unlike the control subjects, the schizophrenic patients showed no significant correlations between TAS-20K scores and recognition accuracy. Conclusions: The data suggest that, although schizophrenia may impair both emotional self-awareness and recognition of others' facial emotions, the degree of deficit can differ between the two. This indicates that the emotional deficits in schizophrenia may take more complex forms.

Effects of the facial expression presenting types and facial areas on the emotional recognition (얼굴 표정의 제시 유형과 제시 영역에 따른 정서 인식 효과)

  • Lee, Jung-Hun;Park, Soo-Jin;Han, Kwang-Hee;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility / v.10 no.1 / pp.113-125 / 2007
  • The aim of the experimental studies described in this paper is to investigate how face, eye, and mouth areas in dynamic and static facial expressions affect emotion recognition. Using seven-second displays, Experiment 1 examined basic emotions and Experiment 2 examined complex emotions. The results of both experiments showed that dynamic facial expressions support emotion recognition better than static ones, and that in dynamic images the eye area contributes more to recognition than the mouth area. These results suggest that dynamic properties should be considered in facial expression studies of both basic and complex emotions. However, the properties of each emotion must also be considered, because the benefit of dynamic images was not equal across emotions. Furthermore, the study indicates that which facial area conveys an emotional state most accurately depends on the particular emotion.


Affective Computing in Education: Platform Analysis and Academic Emotion Classification

  • So, Hyo-Jeong;Lee, Ji-Hyang;Park, Hyun-Jin
    • International journal of advanced smart convergence / v.8 no.2 / pp.8-17 / 2019
  • The main purpose of this study is to explore the potential of affective computing (AC) platforms in education through two phases of research: Phase I - platform analysis, and Phase II - classification of academic emotions. In Phase I, the results indicate that existing affective analysis platforms can be largely classified into four types according to their emotion detection methods: (a) facial expression-based platforms, (b) biometric-based platforms, (c) text/verbal tone-based platforms, and (d) mixed-method platforms. In Phase II, we conducted an in-depth analysis of the emotional experiences that a learner encounters in online video-based learning, in order to establish the basis for a new classification system of online learners' emotions. Overall, positive emotions appeared more frequently and lasted longer than negative emotions. Based on the facial expression data, we categorized positive emotions into three groups: (a) confidence; (b) excitement, enjoyment, and pleasure; and (c) aspiration, enthusiasm, and expectation. The same method was used to categorize negative emotions into four groups: (a) fear and anxiety, (b) embarrassment and shame, (c) frustration and alienation, and (d) boredom. Drawing on these results, we propose a new classification scheme that can be used to measure and analyze how learners in online learning environments experience various positive and negative emotions via indicators of facial expression.
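The proposed classification scheme lends itself to a simple lookup structure; a sketch is given below. The emotion labels follow the groups listed in the abstract, while the group names and dictionary layout are illustrative assumptions.

```python
# Sketch of the academic-emotion classification scheme described above.
# Group keys are illustrative names; the emotion labels follow the abstract.
ACADEMIC_EMOTIONS = {
    "positive": {
        "confidence": ["confidence"],
        "enjoyment_cluster": ["excitement", "enjoyment", "pleasure"],
        "aspiration_cluster": ["aspiration", "enthusiasm", "expectation"],
    },
    "negative": {
        "fear_anxiety": ["fear", "anxiety"],
        "embarrassment_shame": ["embarrassment", "shame"],
        "frustration_alienation": ["frustration", "alienation"],
        "boredom": ["boredom"],
    },
}

def valence_of(emotion: str) -> str:
    """Return 'positive', 'negative', or 'unknown' for a detected emotion label."""
    for valence, groups in ACADEMIC_EMOTIONS.items():
        if any(emotion in members for members in groups.values()):
            return valence
    return "unknown"

print(valence_of("boredom"))  # -> negative
```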

3D Emotional Avatar Creation and Animation using Facial Expression Recognition (표정 인식을 이용한 3D 감정 아바타 생성 및 애니메이션)

  • Cho, Taehoon;Jeong, Joong-Pill;Choi, Soo-Mi
    • Journal of Korea Multimedia Society / v.17 no.9 / pp.1076-1083 / 2014
  • We propose an emotional facial avatar that portrays the user's facial expressions with an emotional emphasis, while achieving visual and behavioral realism. This is achieved by unifying automatic analysis of facial expressions and animation of realistic 3D faces with details such as facial hair and hairstyles. To augment facial appearance according to the user's emotions, we use emotional templates representing typical emotions in an artistic way, which can be easily combined with the skin texture of the 3D face at runtime. Hence, our interface gives the user vision-based control over facial animation of the emotional avatar, easily changing its moods.
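The abstract above mentions emotional templates that are combined with the 3D face's skin texture at runtime. The sketch below illustrates one simple way such a combination could work, plain alpha blending of two textures; the blending formula and the strength parameter are assumptions, not the paper's actual compositing method.

```python
# Sketch: overlay an "emotional template" onto a skin texture.
# Plain alpha blending is an assumed stand-in for the paper's compositing.
import numpy as np

def apply_emotion_template(skin_texture: np.ndarray,
                           emotion_template: np.ndarray,
                           strength: float = 0.4) -> np.ndarray:
    """Blend an RGB emotion template over an RGB skin texture.

    Both arrays are expected to be HxWx3 with values in [0, 255];
    `strength` controls how strongly the emotional emphasis is applied.
    """
    skin = skin_texture.astype(np.float32)
    template = emotion_template.astype(np.float32)
    blended = (1.0 - strength) * skin + strength * template
    return np.clip(blended, 0, 255).astype(np.uint8)
```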

Emotional Expression of the Virtual Influencer "Luo Tianyi(洛天依)" in Digital'

  • Guangtao Song;Albert Young Choi
    • International Journal of Advanced Culture Technology / v.12 no.2 / pp.375-385 / 2024
  • In the context of contemporary digital media, virtual influencers have become an increasingly important form of socialization and entertainment, in which emotional expression is a key factor in attracting viewers. In this study, we take Luo Tianyi, a Chinese virtual influencer, as an example to explore how emotions are expressed and perceived through facial expressions in different types of videos. Using Paul Ekman's Facial Action Coding System (FACS) and six basic emotion classifications, the study systematically analyzes Luo Tianyi's emotional expressions in three types of videos, namely Music show, Festivals and Brand Cooperation. During the study, Luo Tianyi's facial expressions and emotional expressions were analyzed through rigorous coding and categorization, as well as matching the context of the video content. The results show that Enjoyment is the most frequently expressed emotion by Luo Tianyi, reflecting the centrality of positive emotions in content creation. Meanwhile, the presence of other emotion types reveals the virtual influencer's efforts to create emotionally rich and authentic experiences. The frequency and variety of emotions expressed in different video genres indicate Luo Tianyi's diverse strategies for communicating and connecting with viewers in different contexts. The study provides an empirical basis for understanding and utilizing virtual influencers' emotional expressions, and offers valuable insights for digital media content creators to design emotional expression strategies. Overall, this study is valuable for understanding the complexity of virtual influencer emotional expression and its importance in digital media strategy.
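The analysis described above comes down to coding each expression into an emotion category and tallying frequencies per video genre. A minimal sketch of that tallying step follows; the coded records in the example are hypothetical, and the FACS-based coding itself is assumed to have been done beforehand.

```python
# Sketch: tally coded emotions per video genre, as in the analysis above.
# The records below are hypothetical placeholders for FACS-coded expressions.
from collections import Counter, defaultdict

coded_expressions = [
    ("Music show", "enjoyment"),
    ("Music show", "surprise"),
    ("Festivals", "enjoyment"),
    ("Brand Cooperation", "enjoyment"),
    ("Brand Cooperation", "sadness"),
]

by_genre: dict[str, Counter] = defaultdict(Counter)
for genre, emotion in coded_expressions:
    by_genre[genre][emotion] += 1

for genre, counts in by_genre.items():
    emotion, freq = counts.most_common(1)[0]
    print(f"{genre}: most frequent emotion = {emotion} ({freq})")
```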

Mood Suggestion Framework Using Emotional Relaxation Matching Based on Emotion Meshes

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.23 no.8 / pp.37-43 / 2018
  • In this paper, we propose a framework that automatically suggests emotions using an emotion analysis method based on facial expression changes. We use Microsoft's Emotion API to calculate and analyze emotion values in facial expressions in order to recognize emotions that change over time. In this step, we use standard deviations based on peak analysis to measure and classify emotional changes. The difference between the classified emotion and the normal emotion is calculated, and this difference is used to detect emotional abnormality. We then match the user's emotions to relatively relaxed emotions using histograms and emotion meshes, and provide the relaxed emotions to the user through images. The proposed framework helps users recognize emotional changes easily and train their emotions through emotional relaxation.
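A minimal sketch of the deviation-based detection step is shown below: per-frame emotion scores (for example, values returned by an emotion API) are compared against a baseline, and frames whose deviation exceeds a multiple of the standard deviation are flagged as emotional changes. The use of the series mean as the "normal emotion" and the z-score threshold are assumptions, since the abstract does not detail the peak-analysis procedure.

```python
# Sketch: flag emotional changes in a time series of per-frame emotion scores.
# The mean-as-baseline and z-score threshold are illustrative assumptions.
import numpy as np

def detect_emotion_changes(scores, threshold: float = 2.0) -> list[int]:
    """Return frame indices whose emotion score deviates strongly from the
    baseline (series mean), measured in standard deviations."""
    scores = np.asarray(scores, dtype=float)
    baseline, std = scores.mean(), scores.std()
    if std == 0:
        return []
    z = np.abs(scores - baseline) / std
    return np.flatnonzero(z > threshold).tolist()

# Example: a mostly calm 'happiness' track with one sharp spike at frame 5.
happiness = [0.10, 0.12, 0.11, 0.09, 0.10, 0.85, 0.12, 0.11]
print(detect_emotion_changes(happiness))  # -> [5]
```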

Facial Emotion Recognition in Older Adults With Cognitive Complaints

  • YongSoo Shim
    • Dementia and Neurocognitive Disorders / v.22 no.4 / pp.158-168 / 2023
  • Background and Purpose: Facial emotion recognition deficits impact daily life, particularly for Alzheimer's disease patients. We aimed to assess these deficits in the following three groups: subjective cognitive decline (SCD), mild cognitive impairment (MCI), and mild Alzheimer's dementia (AD). Additionally, we explored the associations between facial emotion recognition and cognitive performance. Methods: We used the Korean version of the Florida Facial Affect Battery (K-FAB) in 72 SCD, 76 MCI, and 76 mild AD subjects. Group comparisons were conducted using analysis of covariance (ANCOVA), with adjustments for age and sex. The Mini-Mental State Examination (MMSE) was used to gauge overall cognitive status, while the Seoul Neuropsychological Screening Battery (SNSB) was employed to evaluate performance in five cognitive domains: attention, language, visuospatial abilities, memory, and frontal executive functions. Results: The ANCOVA showed significant group differences in K-FAB subtests 3, 4, and 5 (p=0.001, p=0.003, and p=0.004, respectively), especially for the anger and fear emotions. Recognition of 'anger' in K-FAB subtest 5 declined from SCD to MCI to mild AD. Performance correlated with age and education, and after controlling for these factors, MMSE score and frontal executive function were associated with the K-FAB tests, particularly subtest 5 (r=0.507, p<0.001 and r=-0.288, p=0.026, respectively). Conclusions: Emotion recognition deficits worsened from SCD to MCI to mild AD, especially for negative emotions. More complex tasks, such as matching, selection, and naming, showed greater deficits, and these deficits were connected to cognitive impairment, especially frontal executive dysfunction.