• Title/Summary/Keyword: Facial emotional stimuli


P3 Elicited by the Positive and Negative Emotional Stimuli (긍정적, 부정적 정서 자극에 의해 유발된 P3)

  • An, Suk-Kyoon;Lee, Soo-Jung;NamKoong, Kee;Lee, Chang-Il;Lee, Eun;Kim, The-Hoon;Roh, Kyo-Sik;Choi, Hye-Won;Park, Jun-Mo
    • Korean Journal of Psychosomatic Medicine / v.9 no.2 / pp.143-152 / 2001
  • Objectives: The aim of this study was to determine whether the P3 elicited by negative emotional stimuli differs from that elicited by positive stimuli. Methods: We measured event-related potentials, especially the P3 elicited by facial photographs, in 12 healthy subjects. Subjects were instructed to feel and respond to rare target facial photographs embedded in frequent non-target checkerboards. Results: We found that the amplitude of the P3 elicited by negative emotional photographs was significantly larger than that elicited by positive stimuli in healthy subjects. Conclusion: These findings suggest that the P3 elicited by facial stimuli may be used as a psychophysiological variable of emotional processing.
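As an illustration of the measure this abstract relies on, here is a minimal sketch (not the authors' code) of how a P3 mean amplitude could be computed from one baseline-corrected epoch. The 300-500 ms window, the 500 Hz sampling rate, and the synthetic Gaussian waveform are all illustrative assumptions.

```python
import numpy as np

def p3_mean_amplitude(voltages, times, window=(0.300, 0.500)):
    """Mean amplitude (microvolts) within an assumed P3 latency window.

    voltages: 1-D array of baseline-corrected EEG samples for one epoch.
    times: matching array of sample times in seconds (stimulus onset at 0).
    """
    mask = (times >= window[0]) & (times <= window[1])
    return float(voltages[mask].mean())

# Synthetic epoch: 1 s sampled at 500 Hz, with a Gaussian "P3" peaking
# near 400 ms post-stimulus (amplitude 5 uV, width 50 ms).
sfreq = 500
times = np.arange(-0.2, 0.8, 1 / sfreq)
voltages = 5.0 * np.exp(-((times - 0.4) ** 2) / (2 * 0.05 ** 2))

amp = p3_mean_amplitude(voltages, times)
```

Comparing `amp` between negative and positive stimulus epochs is the kind of contrast the study reports.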


Emotion Recognition using Facial Thermal Images

  • Eom, Jin-Sup;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.3 / pp.427-435 / 2012
  • The aim of this study is to investigate facial temperature changes induced by facial expression and emotional state in order to recognize a person's emotion from facial thermal images. Background: Facial thermal images have two advantages over visual images. First, facial temperature measured by a thermal camera does not depend on skin color, darkness, or lighting conditions. Second, facial thermal images change not only with facial expression but also with emotional state. To our knowledge, no study has concurrently investigated these two sources of facial temperature change. Method: 231 students participated in the experiment. Four kinds of stimuli inducing anger, fear, boredom, and a neutral state were presented to participants, and facial temperatures were measured with an infrared camera. Each stimulus consisted of a baseline period and an emotion period; the baseline period lasted 1 min and the emotion period 1~3 min. In the data analysis, the temperature differences between the baseline and emotional states were analyzed. The eyes, mouth, and glabella were selected as facial expression features, and the forehead, nose, and cheeks as emotional state features. Results: The temperatures of the eye, mouth, glabella, forehead, and nose areas decreased significantly during the emotional experience, and the changes differed significantly by the kind of emotion. Linear discriminant analysis for emotion recognition yielded a correct classification rate of 62.7% across the four emotions when using both facial expression features and emotional state features. The accuracy dropped slightly but significantly to 56.7% when using only facial expression features, and to 40.2% when using only emotional state features. Conclusion: Facial expression features are essential for emotion recognition, but emotional state features are also important for classifying the emotion.
    Application: The results of this study can be applied to human-computer interaction systems in workplaces or automobiles.
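The discriminant-analysis step described above can be sketched as follows. This is a hypothetical reconstruction, not the authors' code: it fits a minimal equal-prior, pooled-covariance linear discriminant (the classical assumptions of discriminant function analysis) to synthetic baseline-subtracted temperature features, with six regions of interest and four emotion classes standing in for the study's feature sets.

```python
import numpy as np

def fit_lda(X, y):
    """Fit a minimal multi-class linear discriminant classifier.

    Assumes equal priors and a shared (pooled) within-class covariance,
    as in classical discriminant function analysis.
    """
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    # Pooled within-class covariance, lightly regularised for stability.
    cov = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1)
              for c in classes) / (len(X) - len(classes))
    cov += 1e-6 * np.eye(X.shape[1])
    return classes, means, np.linalg.inv(cov)

def predict_lda(model, X):
    classes, means, icov = model
    # Linear discriminant score per class: x' S^-1 m - 0.5 m' S^-1 m.
    scores = X @ icov @ means.T - 0.5 * np.einsum("ij,jk,ik->i", means, icov, means)
    return classes[np.argmax(scores, axis=1)]

# Synthetic baseline-subtracted temperatures for 6 facial regions,
# 40 participants per emotion, with artificially separated class means.
rng = np.random.default_rng(0)
emotions = ["anger", "fear", "boredom", "neutral"]
X = np.vstack([rng.normal(loc=i * 0.5, scale=0.3, size=(40, 6)) for i in range(4)])
y = np.repeat(emotions, 40)

model = fit_lda(X, y)
acc = float(np.mean(predict_lda(model, X) == y))
```

On real temperature data the classes overlap far more, which is why the reported accuracy is in the 40-63% range rather than near ceiling as in this toy example.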

Effect of Depressive Mood on Identification of Emotional Facial Expression (우울감이 얼굴 표정 정서 인식에 미치는 영향)

  • Ryu, Kyoung-Hi;Oh, Kyung-Ja
    • Science of Emotion and Sensibility / v.11 no.1 / pp.11-21 / 2008
  • This study was designed to examine the effect of depressive mood on the identification of emotional facial expressions. Participants were screened from 305 college students on the basis of BDI-II scores. Students with BDI-II scores higher than 14 (upper 20%) were assigned to the Depression Group, and those with scores lower than 5 (lower 20%) to the Control Group. A final sample of 20 students in the Depression Group and 20 in the Control Group was presented with facial expression stimuli of increasing emotional intensity, slowly changing from a neutral expression to a full-intensity happy, sad, angry, or fearful expression. The results showed a significant Group × Emotion interaction (especially for happy and sad expressions), which suggests that depressive mood affects the processing of emotional stimuli such as facial expressions. Implications of this result for mood-congruent information processing are discussed.


Developmental Changes in Emotional-States and Facial Expression (정서 상태와 얼굴표정간의 연결 능력의 발달)

  • Park, Soo-Jin;Song, In-Hae;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility / v.10 no.1 / pp.127-133 / 2007
  • The present study investigated whether the ability to read emotional states from facial expressions changes with age (3-year-olds, 5-year-olds, and university students), sex (male, female), the presented facial area (whole face, eyes only), and the type of emotion (basic, complex). Thirty-two facial expressions of emotional states that are linked relatively strongly with emotion vocabulary were used as stimuli; the stimuli were collected by photographing professional actors performing the expressions. Each participant was presented with stories designed to evoke certain emotions and then asked to choose the facial expression that the main character would have made in the situation described. The results showed that the ability to read facial expressions improves with age. Participants also performed better with the whole face than with the eyes alone, and with basic emotions than with complex emotions. Whereas females showed no performance difference between the presented areas, males performed better with the whole face than with the eyes alone. These results demonstrate that age, the presented facial area, and the type of emotion affect the estimation of other people's emotions from facial expressions.


Alexithymia and the Recognition of Facial Emotion in Schizophrenic Patients (정신분열병 환자에서의 감정표현불능증과 얼굴정서인식결핍)

  • Noh, Jin-Chan;Park, Sung-Hyouk;Kim, Kyung-Hee;Kim, So-Yul;Shin, Sung-Woong;Lee, Koun-Seok
    • Korean Journal of Biological Psychiatry / v.18 no.4 / pp.239-244 / 2011
  • Objectives: Schizophrenic patients have been shown to be impaired in both emotional self-awareness and recognition of others' facial emotions. Alexithymia refers to deficits in emotional self-awareness. The relationship between alexithymia and recognition of others' facial emotions needs to be explored to better understand the characteristics of emotional deficits in schizophrenic patients. Methods: Thirty control subjects and 31 schizophrenic patients completed the Toronto Alexithymia Scale-20-Korean version (TAS-20K) and a facial emotion recognition task. The stimuli in the facial emotion recognition task consisted of six emotions (happiness, sadness, anger, fear, disgust, and neutral). Recognition accuracy was calculated within each emotion category, and correlations between TAS-20K scores and recognition accuracy were analyzed. Results: The schizophrenic patients showed higher TAS-20K scores and lower recognition accuracy than the control subjects. Unlike the control subjects, the schizophrenic patients showed no significant correlations between TAS-20K scores and recognition accuracy. Conclusions: The data suggest that, although schizophrenia may impair both emotional self-awareness and recognition of others' facial emotions, the degree of deficit can differ between the two. This indicates that the emotional deficits in schizophrenia may assume more complex features.

Clinical Convergence Study on Attention Processing of Individuals with Social Anxiety Tendency : Focusing on Positive Stimulation in Emotional Context (사회불안성향자의 주의 과정에 관한 임상 융합 연구 : 정서맥락에서 긍정 자극을 중심으로)

  • Park, Ji-Yoon;Yoon, Hyae-Young
    • Journal of the Korea Convergence Society / v.9 no.3 / pp.79-90 / 2018
  • The purpose of this study was to investigate how individuals with a social anxiety tendency differ from normal controls in attentional processing of positive facial stimuli, depending on whether an emotional context is present. To do this, we examined attentional processing of positive face stimuli in conditions without and with an emotional context. The SADS and CES-D were administered to 800 undergraduate students in D city, and a social anxiety group (SA, n=24) and a normal control group (NC, n=24) were selected. To measure the two components of the attention process (attentional engagement and attentional disengagement), first gaze direction and first gaze time were measured through eye-movement tracking. The results show that the SA group disengaged attention from positive face stimuli faster than the NC group in the condition without context. However, when a positive context was presented with the positive face stimuli, there was no difference between the SA and NC groups. This result suggests that a positive background affects emotional processing in social anxiety disorder.
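The two gaze measures named above can be illustrated with a small sketch. This is a hypothetical reconstruction, not the authors' analysis code: it takes a stream of (timestamp, area-of-interest) gaze samples for one trial and returns whether the first fixated area was the target face (first gaze direction, i.e. engagement) and how long that first visit lasted (first gaze time, a proxy for disengagement latency).

```python
def first_fixation_metrics(samples, target="face"):
    """First-gaze direction and first-gaze dwell time for one trial.

    samples: sequence of (timestamp_ms, aoi) pairs, where aoi is an
    area-of-interest label such as "face" or "neutral", or None when
    the gaze is on neither region.
    Returns (captured, dwell_ms): whether the first fixated AOI was the
    target, and the duration of that first visit.
    """
    fixated = [(t, a) for t, a in samples if a is not None]
    if not fixated:
        return False, 0
    t0, first = fixated[0]
    if first != target:
        return False, 0
    end = t0
    for t, a in fixated[1:]:
        if a != target:
            break  # gaze left the target: first visit is over
        end = t
    return True, end - t0

# Hypothetical trial: gaze lands on the face at 50 ms, leaves at 100 ms.
trial = [(0, None), (50, "face"), (66, "face"), (83, "face"), (100, "neutral")]
captured, dwell = first_fixation_metrics(trial)
```

Averaging `dwell` over trials per group would give the kind of disengagement contrast the abstract reports.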

Attentional Bias to Emotional Stimuli and Effects of Anxiety on the Bias in Neurotypical Adults and Adolescents

  • Mihee Kim;Jejoong Kim;So-Yeon Kim
    • Science of Emotion and Sensibility / v.25 no.4 / pp.107-118 / 2022
  • Humans can rapidly detect and deal with dangerous elements in their environment, and these generally manifest as attentional bias toward threat. Past studies have reported that this attentional bias is affected by anxiety level. Other studies, however, have argued that children and adolescents show attentional bias toward threatening stimuli regardless of their anxiety levels. Few studies have directly compared the two age groups in terms of attentional bias to threat, and most previous studies have focused on attentional capture and the early stages of attention without investigating subsequent attentional holding by the stimuli. In this study, we investigated both patterns of attentional bias (attentional capture and holding) with respect to negative emotional stimuli in neurotypical adults and adolescents, and also examined the effects of anxiety level on attentional bias. In adult participants, the abrupt onset of a distractor delayed attentional capture of the target regardless of distractor type (angry or neutral face), while it had no effect on attentional holding. In adolescents, on the other hand, only the angry-face distractor resulted in a longer reaction time for detecting the target. Regarding anxiety, state anxiety showed a significant positive correlation with attentional capture by a face distractor in adult participants but not in adolescents. Overall, this is the first study to investigate developmental tendencies in attentional bias to negative facial emotion in both adults and adolescents, providing novel evidence on attentional bias to threat at different ages. Our results can be applied to understanding attentional mechanisms in people with emotion-related developmental disorders, as well as in typical development.

Study on Emotional Words and Favorableness Associated with the Faces of Women in Their 60s

  • Kim, Ae Kyung;Oh, Yun Kyoung
    • Fashion & Textile Research Journal / v.16 no.6 / pp.995-1000 / 2014
  • Using the free language association method, this study examined the characteristics of the emotional words produced by respondents who were shown facial photos of women in their 60s, along with the favorableness of the faces and the styles respondents favored. To analyze the mood characteristics attributed to the faces, the words were divided into positive and negative mood words and, following previous research, into introverted, extraverted, and ambiverted words. It was found that 37% of the emotional words respondents used were positive and 63% were negative, demonstrating that respondents were more likely than not to form negative impressions of the faces of their contemporaries. The words were 38% introverted, 47% extraverted, and 14% ambiverted. Respondents used words like 'beautiful' and 'good-looking' for the stimuli they found favorable, and 'ill-tempered' and 'stubborn' for the stimuli they found unfavorable. The style most favored by both male and female respondents in their 60s was sentimental and good-mannered: they generally favored women who appeared soft and caring, and disliked women who appeared talkative, snobbish, or heavily made up. The results of this analysis may help with image making and personal relations. Further study should expand the survey area to better capture influences on the social life and interpersonal relationships of senior citizens.

Improvement of a Context-aware Recommender System through User's Emotional State Prediction (사용자 감정 예측을 통한 상황인지 추천시스템의 개선)

  • Ahn, Hyunchul
    • Journal of Information Technology Applications and Management / v.21 no.4 / pp.203-223 / 2014
  • This study proposes a novel context-aware recommender system designed to recommend items according to the customer's responses to the previously recommended item. Specifically, the proposed system predicts the user's emotional state from his or her responses (such as facial expressions and movements) to the previously recommended item, and then recommends items similar to the previous one when the estimated emotional state is positive. If the customer's emotional state toward the previously recommended item is judged negative, the system recommends items whose characteristics are opposite to those of the previous item. The proposed system consists of two submodules: (1) an emotion prediction module and (2) a responsive recommendation module. The emotion prediction module contains a model that predicts a customer's arousal level (a physiological and psychological state of being awake or reactive to stimuli) from the customer's reaction data, including facial expressions and body movements, which can be measured with Microsoft's Kinect sensor. The responsive recommendation module generates a recommendation list using the results from the emotion prediction module: if a customer shows a high arousal level for the previously recommended item, the module recommends the items most similar to it; otherwise, it recommends the items most dissimilar to it. To validate the performance and usefulness of the proposed recommender system, we conducted an empirical validation in which 30 undergraduate students participated. We used 100 trailers of Korean movies released from 2009 to 2012 as the items for recommendation, and manually constructed a Korean movie trailer DB containing fields such as release date, genre, director, writer, and actors. To check whether recommendation based on customers' responses outperforms recommendation based on their demographic information, we compared the two. Recommendation performance was measured using two metrics: satisfaction and arousal levels. Experimental results showed that recommendation using customers' responses (i.e., the proposed system) outperformed recommendation using demographic information with statistical significance.
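The arousal-gated recommendation rule described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: items are represented by feature vectors (e.g. genre one-hots, which could come from fields like those in the trailer DB), cosine similarity stands in for the unspecified similarity measure, and the predicted arousal level decides whether the most similar or the most dissimilar items are returned.

```python
import numpy as np

def recommend(prev_vec, catalog, arousal, threshold=0.5, k=3):
    """Pick the next items from the customer's reaction to the last one.

    catalog: dict mapping item id -> feature vector.
    arousal: predicted arousal level in [0, 1] for the previous item.
    High arousal -> most similar items; low arousal -> most dissimilar.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    sims = {item: cosine(prev_vec, vec) for item, vec in catalog.items()}
    # Descending similarity when aroused (engaged), ascending otherwise.
    ranked = sorted(sims, key=sims.get, reverse=arousal >= threshold)
    return ranked[:k]

# Hypothetical mini-catalog with [action, comedy, drama] genre vectors.
catalog = {
    "action_sequel": np.array([1.0, 0.0, 0.0]),
    "comedy": np.array([0.0, 1.0, 0.0]),
    "drama": np.array([0.0, 0.0, 1.0]),
    "action_drama": np.array([1.0, 0.0, 1.0]),
}
last_watched = np.array([1.0, 0.0, 0.0])  # a pure action trailer

similar = recommend(last_watched, catalog, arousal=0.9)     # engaged viewer
dissimilar = recommend(last_watched, catalog, arousal=0.1)  # disengaged viewer
```

An engaged viewer gets more action titles; a disengaged one is steered toward the opposite end of the catalog, mirroring the positive/negative branching in the abstract.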

Classification of Three Different Emotion by Physiological Parameters

  • Jang, Eun-Hye;Park, Byoung-Jun;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.2 / pp.271-279 / 2012
  • Objective: This study classified three different emotional states (boredom, pain, and surprise) using physiological signals. Background: Emotion recognition studies have tried to recognize human emotion from physiological signals; such recognition is important for applying emotion detection to human-computer interaction systems. Method: 122 college students participated in this experiment. Three different emotional stimuli were presented to participants, and physiological signals, i.e., EDA (electrodermal activity), SKT (skin temperature), PPG (photoplethysmogram), and ECG (electrocardiogram), were measured for 1 minute as a baseline and for 1~1.5 minutes during the emotional state. The obtained signals were analyzed over 30 seconds of the baseline and of the emotional state, and 27 features were extracted from them. Statistical analysis for emotion classification was performed by discriminant function analysis (DFA; SPSS 15.0) using the difference values obtained by subtracting the baseline values from the emotional-state values. Results: Physiological responses during the emotional states differed significantly from those during the baseline, and the accuracy rate of emotion classification was 84.7%. Conclusion: Our study showed that emotions can be classified from various physiological signals. However, future work is needed to obtain additional signals from other modalities, such as facial expression, face temperature, or voice, to improve the classification rate, and to examine the stability and reliability of this result compared with the accuracy of emotion classification using other algorithms. Application: This work could help emotion recognition studies recognize various human emotions from physiological signals and can be applied to human-computer interaction systems for emotion recognition.
    It can also be useful in developing an emotion theory or profiling emotion-specific physiological responses, as well as in establishing the basis for emotion recognition systems in human-computer interaction.
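The baseline-subtraction step described in the Method can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the particular summary statistics (difference in mean, standard deviation, and range) are assumptions standing in for the 27 features actually used in the study.

```python
import numpy as np

def difference_features(baseline, emotion):
    """Baseline-subtracted summary features for one physiological channel.

    baseline, emotion: 1-D arrays of samples (e.g. 30 s of EDA).
    Returns differences in level, variability, and range between the
    emotional state and the baseline, the kind of derived values that
    would be fed into the discriminant function analysis.
    """
    return {
        "mean_diff": float(emotion.mean() - baseline.mean()),
        "std_diff": float(emotion.std(ddof=1) - baseline.std(ddof=1)),
        "range_diff": float(np.ptp(emotion) - np.ptp(baseline)),
    }

# Synthetic channel: a slowly rising baseline and a stronger rise
# during the emotional state (values are arbitrary units).
baseline = np.linspace(1.0, 1.2, 300)
emotion = np.linspace(1.5, 2.1, 300)

feats = difference_features(baseline, emotion)
```

Stacking such difference features across channels and participants yields the design matrix on which the reported DFA classification would operate.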