• Title/Summary/Keyword: Facial emotion


Design of the emotion expression in multimodal conversation interaction of companion robot (컴패니언 로봇의 멀티 모달 대화 인터랙션에서의 감정 표현 디자인 연구)

  • Lee, Seul Bi;Yoo, Seung Hun
    • Design Convergence Study
    • /
    • v.16 no.6
    • /
    • pp.137-152
    • /
    • 2017
  • This research aims to develop a companion robot experience design for the elderly in Korea, based on a needs-function deployment matrix and on research into robot emotion expression in multimodal interaction. First, elderly users' main needs were categorized into four groups based on ethnographic research. Second, the robot's functional elements and physical actuators were mapped to user needs in the needs-function deployment matrix. The final UX design prototype was implemented as a robot with a verbal, non-touch multimodal interface and emotional facial expressions based on Ekman's Facial Action Coding System (FACS). The prototype was validated in a user test session that analyzed the influence of the robot interaction on users' cognition and emotion, using a Story Recall Test and facial emotion analysis software (Emotion API), under two conditions: when the robot's facial expression changed to match the emotion of the information it delivered, and when the robot initiated the interaction cycle voluntarily. The group with the emotional robot showed a relatively high recall rate in the delayed recall test, and the facial expression analysis showed that the robot's facial expression and interaction initiation affected the emotion and preference of the elderly participants.
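
As background for the FACS-based expression design described above, the mapping from basic emotions to FACS action units (AUs) can be sketched as a simple lookup table. The AU sets below are commonly cited prototypes from the FACS literature, not the paper's actual design.

```python
# Prototype action-unit (AU) sets for Ekman's basic emotions.
# These combinations are widely cited in the FACS literature;
# a real robot face would drive actuators from such AU sets.
EMOTION_TO_AUS = {
    "happiness": {6, 12},             # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},          # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},       # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},       # brow lowerer + lid raiser/tightener + lip tightener
    "fear":      {1, 2, 4, 5, 20, 26},
    "disgust":   {9, 15},             # nose wrinkler + lip corner depressor
}

def aus_for(emotion):
    """Return the prototype action-unit set for a basic emotion."""
    return EMOTION_TO_AUS[emotion.lower()]
```

A display controller would then activate the facial actuators corresponding to each AU in the returned set.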

An Intelligent Emotion Recognition Model Using Facial and Bodily Expressions

  • Jae Kyeong Kim;Won Kuk Park;Il Young Choi
    • Asia Pacific Journal of Information Systems
    • /
    • v.27 no.1
    • /
    • pp.38-53
    • /
    • 2017
  • As sensor and image processing technologies make it easy to collect information on users' behavior, many researchers have examined automatic emotion recognition based on facial expressions, body expressions, and tone of voice, among others. In the multimodal case of facial and body expressions, most studies have used normal cameras, which generally produce only two-dimensional images and thus provide a limited amount of information. In the present research, we propose an artificial neural network-based model that uses a high-definition webcam and a Kinect to recognize users' emotions from facial and bodily expressions while they watch a movie trailer. We validate the proposed model in a naturally occurring field environment rather than in an artificially controlled laboratory environment. The results of this research should help broaden the use of emotion recognition models in advertisements, exhibitions, and interactive shows.
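
The kind of early-fusion neural network the abstract describes could be sketched as follows. The toy forward pass, weight layout, and function names are illustrative assumptions, not the authors' model, which would be trained on real webcam and Kinect features.

```python
import math

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_emotion(facial_feats, body_feats, hidden_w, out_w):
    """Forward pass of a small feed-forward network on concatenated
    facial and bodily feature vectors (hypothetical toy weights).
    Returns the index of the highest-scoring emotion class."""
    x = facial_feats + body_feats             # early fusion by concatenation
    h = [sigmoid(dot(w, x)) for w in hidden_w]  # hidden layer activations
    scores = [dot(w, h) for w in out_w]         # one score per emotion class
    return max(range(len(scores)), key=lambda i: scores[i])
```

Here, concatenating the two modalities before the hidden layer is one simple fusion choice; late fusion of per-modality networks is an equally plausible design.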

Alexithymia and the Recognition of Facial Emotion in Schizophrenic Patients (정신분열병 환자에서의 감정표현불능증과 얼굴정서인식결핍)

  • Noh, Jin-Chan;Park, Sung-Hyouk;Kim, Kyung-Hee;Kim, So-Yul;Shin, Sung-Woong;Lee, Koun-Seok
    • Korean Journal of Biological Psychiatry
    • /
    • v.18 no.4
    • /
    • pp.239-244
    • /
    • 2011
  • Objectives Schizophrenic patients have been shown to be impaired both in emotional self-awareness and in recognizing others' facial emotions. Alexithymia refers to deficits in emotional self-awareness. The relationship between alexithymia and recognition of others' facial emotions needs to be explored to better understand the characteristics of emotional deficits in schizophrenic patients. Methods Thirty control subjects and 31 schizophrenic patients completed the Toronto Alexithymia Scale-20-Korean version (TAS-20K) and a facial emotion recognition task. The stimuli in the task consisted of six categories (happiness, sadness, anger, fear, disgust, and neutral), and recognition accuracy was calculated within each category. Correlations between TAS-20K scores and recognition accuracy were then analyzed. Results The schizophrenic patients showed higher TAS-20K scores and lower recognition accuracy than the control subjects. Unlike the control subjects, the schizophrenic patients showed no significant correlations between TAS-20K scores and recognition accuracy. Conclusions The data suggest that, although schizophrenia may impair both emotional self-awareness and recognition of others' facial emotions, the degrees of deficit can differ between the two. This indicates that the emotional deficits in schizophrenia may have more complex features.
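
Scoring recognition accuracy within each emotion category, as done in this task, amounts to a simple tally over (stimulus, response) trial pairs; a minimal sketch:

```python
from collections import defaultdict

def accuracy_by_emotion(trials):
    """Per-category recognition accuracy from (true_emotion, response)
    pairs, as used when scoring a facial emotion recognition task."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for true_emotion, response in trials:
        total[true_emotion] += 1
        if response == true_emotion:
            correct[true_emotion] += 1
    return {e: correct[e] / total[e] for e in total}
```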

Analysis of Facial Movement According to Opposite Emotions (상반된 감성에 따른 안면 움직임 차이에 대한 분석)

  • Lee, Eui Chul;Kim, Yoon-Kyoung;Bea, Min-Kyoung;Kim, Han-Sol
    • The Journal of the Korea Contents Association
    • /
    • v.15 no.10
    • /
    • pp.1-9
    • /
    • 2015
  • In this paper, facial movements under opposite emotion stimuli are analyzed by image processing of Kinect facial images. To induce the two opposite emotion pairs "Sad - Excitement" and "Contentment - Angry", which are oppositely positioned on Russell's 2-D emotion model, both visual and auditory stimuli were given to subjects. First, 31 main points were chosen among the 121 facial feature points of the active appearance model obtained from the Kinect Face Tracking SDK. Then, pixel changes around the 31 main points were analyzed; a local minimum shift matching method was used to handle the non-linearity of facial movement. As a result, right-side and left-side facial movements occurred in the "Sad" and "Excitement" conditions, respectively. Left-side facial movement was comparatively more frequent in the "Contentment" condition. In contrast, both left- and right-side movements occurred in the "Angry" condition.
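
A heavily simplified stand-in for the left/right movement comparison (not the paper's local minimum shift matching) could aggregate per-point displacements on each side of the face midline:

```python
def side_movement(points_before, points_after, face_center_x):
    """Sum per-point displacement magnitudes separately for the left and
    right half of the face, given (x, y) positions of tracked feature
    points before and after an emotion stimulus. A toy stand-in for the
    paper's 31-point Kinect analysis."""
    left = right = 0.0
    for (x0, y0), (x1, y1) in zip(points_before, points_after):
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # Euclidean shift
        if x0 < face_center_x:
            left += d
        else:
            right += d
    return left, right
```

Comparing the two sums across emotion conditions would reproduce the kind of lateralized-movement contrast the abstract reports.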

Facial Expression Recognition with Fuzzy C-Means Clustering Algorithm and Neural Network Based on Gabor Wavelets

  • Youngsuk Shin;Chansup Chung;Lee, Yillbyung
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 2000.04a
    • /
    • pp.126-132
    • /
    • 2000
  • This paper presents a facial expression recognition method based on Gabor wavelets that uses a fuzzy C-means (FCM) clustering algorithm and a neural network. Features of facial expressions are extracted in two steps. In the first step, the Gabor wavelet representation provides edge extraction of the major face components, using the average value of the image's 2-D Gabor wavelet coefficient histogram. In the next step, sparse features of facial expressions are extracted from the edge information using the FCM clustering algorithm. The facial expression recognition results are compared with dimensional values of internal states derived from semantic ratings of emotion-related words. The dimensional model can recognize not only the six facial expressions related to Ekman's basic emotions, but also expressions of various internal states.
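
The FCM step can be sketched as a minimal one-dimensional fuzzy C-means. The initialization and fixed iteration count are simplifications, and the paper clusters Gabor-based edge features rather than raw scalars.

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy C-means (c >= 2 clusters, fuzzifier m).
    Returns the cluster centers and the membership matrix u[i][k]
    of point i in cluster k."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (hi - lo) * k / (c - 1) for k in range(c)]
    u = [[0.0] * c for _ in xs]
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        for i, x in enumerate(xs):
            dists = [abs(x - ck) or 1e-12 for ck in centers]  # avoid /0
            for k in range(c):
                u[i][k] = 1.0 / sum(
                    (dists[k] / dj) ** (2.0 / (m - 1.0)) for dj in dists)
        # Center update: membership-weighted mean of the points
        for k in range(c):
            w = [u[i][k] ** m for i in range(len(xs))]
            centers[k] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centers, u
```

The soft memberships (rather than hard cluster labels) are what make the extracted features "sparse" yet graded, which suits the dimensional emotion model the paper compares against.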


Differentiation of Facial EMG Responses Induced by Positive and Negative Emotions in Children (긍정정서와 부정정서에 따른 아동의 안면근육반응 차이)

  • Jang Eun-Hye;Lim Hye-Jin;Lee Young-Chang;Chung Soon-Cheol;Sohn Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.8 no.2
    • /
    • pp.161-167
    • /
    • 2005
  • This study examines how facial EMG responses change when children experience a positive emotion (happiness) and a negative emotion (fear), and tests whether the two emotions can be distinguished by EMG responses. Audiovisual film clips were used to evoke the positive emotion (happiness) and the negative emotion (fear). Forty-seven children (11-13 years old; 23 boys and 24 girls) participated in the study. Facial EMG (right corrugator and orbicularis oris) was measured while the children experienced the positive or negative emotion, and an emotional assessment scale measured their psychological responses. The clips showed more than 85% appropriateness and effectiveness ratings of 3.15 and 4.04 (on a 5-point scale) for happiness and fear, respectively. Facial EMG responses differed significantly between the resting state and the emotional state for both happiness and fear (p < .001). The results suggest that the two emotions can be distinguished by corrugator and orbicularis oris responses: the corrugator was more activated in the positive emotion (happiness) than in the negative emotion (fear), whereas the orbicularis oris was more activated in the negative emotion (fear) than in the positive emotion (happiness).
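
The rest-versus-emotion comparison behind such results can be sketched as a percentage change in mean rectified EMG amplitude; this is a common convention, and the study's actual preprocessing may differ.

```python
def emg_response(rest_samples, emotion_samples):
    """Mean rectified EMG amplitude change from rest to an emotional
    state, expressed as a percentage of the resting level."""
    def mean(xs):
        return sum(xs) / len(xs)
    rest = mean([abs(v) for v in rest_samples])     # rectify, then average
    emo = mean([abs(v) for v in emotion_samples])
    return 100.0 * (emo - rest) / rest
```

Computing this separately for the corrugator and orbicularis oris channels in each emotion condition yields the muscle-by-emotion pattern the abstract reports.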


Gender Differences in Empathic Ability and Facial Emotion Recognition of Schizophrenic Patients (성별에 따른 조현병 환자의 공감 능력 및 얼굴 정서 인식 능력의 차이)

  • Kim, Ki-Chang;Son, Jung-Woo;Ghim, Hei-Rhee;Lee, Sang-Ick;Shin, Chul-Gin;Kim, Sie-Kyeong;Ju, Gawon;Eom, Jin-Sup;Jung, Myung-Sook;Park, Min;Moon, Eunok;Cheon, Young-Un
    • Korean Journal of Biological Psychiatry
    • /
    • v.21 no.1
    • /
    • pp.21-27
    • /
    • 2014
  • Objectives The aim of the present study was to investigate gender differences in empathic ability and recognition of facial emotion expressions in schizophrenic patients. Methods Twenty-two schizophrenic outpatients (11 men and 11 women) and 22 controls (10 men and 12 women) performed both the Empathic Quotient (EQ) scale and a facial emotion recognition test. We compared EQ and facial emotion recognition scores across groups by diagnosis and gender. Results We found a significant sex difference in EQ and facial emotion recognition scores in the schizophrenic patients. There was also a significant negative correlation between facial emotion recognition scores and Positive and Negative Syndrome Scale (PANSS) scores in the female schizophrenic patients, whereas in the male schizophrenic patients neither test score correlated significantly with PANSS scores. Conclusions This study suggests that sex differences in empathic ability and facial emotion recognition are very important in chronic schizophrenic patients. Investigating sex effects on empathic ability and facial emotion recognition in chronic schizophrenic patients could inform the construction of optimal rehabilitation programs.
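
The correlation reported here is a standard Pearson correlation between test scores and symptom scores; a minimal sketch:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient, as used to relate facial emotion
    recognition scores to PANSS symptom scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

A negative r, as found in the female patient group, means higher symptom scores went with lower recognition scores.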

A Study on Emotion Recognition Systems based on the Probabilistic Relational Model Between Facial Expressions and Physiological Responses (생리적 내재반응 및 얼굴표정 간 확률 관계 모델 기반의 감정인식 시스템에 관한 연구)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.6
    • /
    • pp.513-519
    • /
    • 2013
  • Current vision-based approaches to emotion recognition, such as facial expression analysis, have many technical limitations in real circumstances and are not suitable as the sole basis of practical applications. In this paper, we propose an approach to emotion recognition that combines the extrinsic representations and intrinsic activities found among the natural responses of humans given specific stimuli that induce emotional states. The intrinsic activities can compensate for the uncertainty of extrinsic representations of emotional states. The combination is done using Probabilistic Relational Models (PRMs), which are extended versions of Bayesian networks, learned with greedy-search and expectation-maximization algorithms. Extrinsic emotion features from previous facial expression research and intrinsic emotion features from physiological signals are combined into the attributes of the PRMs in the emotion recognition domain. Maximum likelihood estimation with the given dependency structure and the estimated parameter set is used to classify the target emotional states.
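
A heavily simplified stand-in for the PRM-based fusion is a naive-Bayes combination of the two evidence sources under a conditional-independence assumption; the actual PRMs learn a richer dependency structure, so this sketch only illustrates the underlying probabilistic idea.

```python
def fuse_emotion_evidence(p_prior, p_face_given_e, p_physio_given_e):
    """Posterior over emotion labels from a prior, a facial-expression
    likelihood, and a physiological-signal likelihood, assuming the two
    evidence sources are conditionally independent given the emotion
    (a naive-Bayes simplification of the PRM fusion)."""
    unnorm = {e: p_prior[e] * p_face_given_e[e] * p_physio_given_e[e]
              for e in p_prior}
    z = sum(unnorm.values())            # normalizing constant
    return {e: v / z for e, v in unnorm.items()}
```

When both modalities weakly favor the same label, the fused posterior favors it more strongly than either modality alone, which is the compensation effect the abstract describes.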

Effects of Working Memory Load on Negative Facial Emotion Processing: an ERP study (작업기억 부담이 부적 얼굴정서 처리에 미치는 영향: ERP 연구)

  • Park, Taejin;Kim, Junghee
    • Korean Journal of Cognitive Science
    • /
    • v.29 no.1
    • /
    • pp.39-59
    • /
    • 2018
  • To elucidate the effect of working memory (WM) load on negative facial emotion processing, we examined ERP components (P1 and N170) elicited by fearful and neutral expressions, each presented during a 0-back (low WM load) or 2-back (high WM load) task. During the N-back tasks, visual objects were presented one by one as targets, and each facial expression was presented as a passively observed stimulus in the intervals between targets. Behavioral results showed faster and more accurate responses in the low-load condition than in the high-load condition. Mean P1 amplitudes over the occipital region showed a significant WM load effect (high load > low load) but no significant facial emotion effect. Mean N170 amplitudes over the posterior occipito-temporal region showed a significant overall facial emotion effect (fearful > neutral); in detail, the effect was observed only in the low-load condition on the left hemisphere, but in both the low- and high-load conditions on the right hemisphere. In summary, the facial emotion effect in N170 amplitudes was modulated by WM load only on the left hemisphere. These results show that early emotional processing of negative facial expressions can be eliminated or reduced by high WM load on the left hemisphere but not on the right, and suggest right-hemispheric lateralization of negative facial emotion processing.
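
The mean-amplitude measure used to quantify components such as P1 and N170 can be sketched as follows; the sampling rate and window bounds below are hypothetical, not the study's parameters.

```python
def mean_amplitude(samples, srate_hz, t0_ms, t1_ms):
    """Mean ERP amplitude within a latency window [t0_ms, t1_ms), given a
    single-channel epoch sampled at srate_hz with sample 0 at stimulus
    onset. This is the standard way component mean amplitudes are taken."""
    i0 = int(t0_ms / 1000.0 * srate_hz)   # window start in samples
    i1 = int(t1_ms / 1000.0 * srate_hz)   # window end in samples
    window = samples[i0:i1]
    return sum(window) / len(window)
```

Comparing this value across conditions (emotion x WM load x hemisphere) is what the statistical analyses in the abstract operate on.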

Study on the Relationship Between 12Meridians Flow and Facial Expressions by Emotion (감정에 따른 얼굴 표정변화와 12경락(經絡) 흐름의 상관성 연구)

  • Park, Yu-Jin;Moon, Ju-Ho;Choi, Su-Jin;Shin, Seon-Mi;Kim, Ki-Tae;Ko, Heung
    • Journal of Physiology & Pathology in Korean Medicine
    • /
    • v.26 no.2
    • /
    • pp.253-258
    • /
    • 2012
  • Facial expression is an important means of communication. In oriental medicine, the shape of the face changes with emotion, and corresponding differences arise in physiology and pathology. To examine this theory, we studied the correlation between emotional facial expressions and meridian and collateral flow. The facial regions divided by meridian are: outer brow, Gallbladder meridian; inner brow, Bladder meridian; medial canthus, Bladder meridian; lateral canthus, Gallbladder meridian; upper eyelid, Bladder meridian; lower eyelid, Stomach meridian; central cheeks, Stomach meridian; lateral cheeks, Small Intestine meridian; and upper and lower lips, lip corners, and chin, Small and Large Intestine meridians. Six meridians and collaterals were associated with happiness, indicating the high importance of happiness in facial expression. Five were associated with anger, and four each with fear and sadness, suggesting that fear and sadness are less important in facial expression than the other emotions. Based on the yang meridians, which originally flow downward in the body, the ratios of anterograde to retrograde flow were happiness 3:4, anger 2:5, sadness 5:3, and fear 4:1. Based on the meridian flow of the face, the ratios were happiness 5:2, anger 3:4, sadness 3:5, and fear 4:1. We found that the practical meridian and collateral flow changes by emotion did not correspond to the expected flow changes by emotion.