• Title/Summary/Keyword: Human emotion


Face Emotion Recognition by Fusion Model based on Static and Dynamic Image (정지영상과 동영상의 융합모델에 의한 얼굴 감정인식)

  • Lee Dae-Jong;Lee Kyong-Ah;Go Hyoun-Joo;Chun Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems / v.15 no.5 / pp.573-580 / 2005
  • In this paper, we propose an emotion recognition method using static and dynamic facial images to effectively design a human interface. The proposed method combines an HMM (Hidden Markov Model), PCA (Principal Component Analysis), and the wavelet transform. The facial database covers six basic human emotions, happiness, sadness, anger, surprise, fear, and disgust, which are known to be common across nations and cultures. Emotion recognition in static images is performed using the discrete wavelet transform, with feature vectors extracted by PCA. Emotion recognition in dynamic images is performed using the wavelet transform and PCA, and the resulting features are modeled by the HMM. Finally, better performance is obtained by merging the recognition results for the static and dynamic images.
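
The paper's final fusion step, merging per-emotion scores from the static-image recognizer and the dynamic-image recognizer, can be sketched as below. The equal 0.5/0.5 weighting and the sum-to-one normalization are illustrative assumptions, not the authors' published rule:

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def fuse_scores(static_scores, dynamic_scores, w_static=0.5):
    """Merge two per-emotion confidence vectors into one decision.

    Each input is a length-6 vector of classifier confidences
    (e.g. PCA-feature-based for a static image, HMM-likelihood-based
    for an image sequence), normalized here to sum to 1.
    """
    s = np.asarray(static_scores, dtype=float)
    d = np.asarray(dynamic_scores, dtype=float)
    s = s / s.sum()
    d = d / d.sum()
    fused = w_static * s + (1.0 - w_static) * d  # simple weighted sum
    return EMOTIONS[int(np.argmax(fused))], fused

label, fused = fuse_scores([0.1, 0.1, 0.5, 0.1, 0.1, 0.1],
                           [0.2, 0.1, 0.3, 0.2, 0.1, 0.1])
```

In this toy example both recognizers lean toward "anger", so the fused decision keeps that label even though the two channels disagree on its strength.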

The study on emotion recognition by time-dependent parameters of autonomic nervous response (TDP(time-dependent parameters)를 적용하여 분석한 자율신경계 반응에 의한 감성인식에 대한 연구)

  • Kim, Jong-Hwa;Whang, Min-Cheol;Kim, Young-Joo;Woo, Jin-Cheol
    • Science of Emotion and Sensibility / v.11 no.4 / pp.637-644 / 2008
  • Efforts have been made to recognize human emotion from physiological measurements in order to develop emotion-aware machines that can understand and react to a user's emotion. This study seeks time-dependent physiological measurements, and their variation characteristics, for discriminating emotions according to a dimensional emotion model. Ten university students watched sixteen prepared images selected to evoke different emotions. Their subjective emotions and autonomic nervous responses, namely ECG (electrocardiogram), PPG (photoplethysmogram), GSR (galvanic skin response), RSP (respiration), and SKT (skin temperature), were measured during the experiment. These responses were analyzed into HR (heart rate), respiration rate, average GSR amplitude, average SKT amplitude, PPG amplitude, and PTT (pulse transit time). TDPs (time-dependent parameters), defined in this study as the delay, the activation, the half recovery, and the full recovery of each physiological signal, were determined and statistically compared across emotions. Significant tendencies in TDP were found between emotions; therefore, TDP may provide useful measurements for emotion recognition.
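
Extracting the four TDPs from one physiological response can be sketched as follows. The onset and recovery thresholds (10% and 50% of the peak deviation) are illustrative assumptions; the paper defines the parameters conceptually without these exact cutoffs:

```python
import numpy as np

def time_dependent_parameters(signal, t, baseline, onset_frac=0.1):
    """Estimate the four TDPs of one physiological response.

    delay         : time at which the response departs from baseline
                    by onset_frac of the peak deviation,
    activation    : time of peak deviation from baseline,
    half_recovery : first time after the peak at which the deviation
                    falls back to 50% of the peak,
    full_recovery : first time after the peak at which it falls back
                    to onset_frac of the peak.
    """
    dev = np.abs(np.asarray(signal, dtype=float) - baseline)
    peak_i = int(np.argmax(dev))
    peak = dev[peak_i]
    onset_i = int(np.argmax(dev >= onset_frac * peak))
    after = dev[peak_i:]
    half_i = peak_i + int(np.argmax(after <= 0.5 * peak))
    full_i = peak_i + int(np.argmax(after <= onset_frac * peak))
    return {"delay": float(t[onset_i]), "activation": float(t[peak_i]),
            "half_recovery": float(t[half_i]), "full_recovery": float(t[full_i])}

t = np.linspace(0, 10, 101)
sig = np.exp(-((t - 3.0) ** 2))  # synthetic GSR-like bump peaking at t = 3 s
tdp = time_dependent_parameters(sig, t, baseline=0.0)
```

On this synthetic bump the parameters come out in their expected order: delay, then activation, then half recovery, then full recovery.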


The Effect of Emotional Experience with Korea's Low-Price Cosmetic Brands on Brand Relationship (국내 저가 화장품 브랜드에 대한 감성적 경험이 브랜드 관계에 미치는 영향)

  • Kim, Sung-Eun;Chung, Myung-Sun
    • The Research Journal of the Costume Culture / v.19 no.3 / pp.565-578 / 2011
  • The purpose of this study was to identify the dimensions of emotional experience with low-price cosmetic brands and to empirically investigate the effect of emotional experience on the relationship between customers and the brand. Data were collected for 10 days starting on March 2nd, 2009. Questionnaires were distributed to 517 female college students who had purchased products of low-price cosmetic brands, and their answers were collected. Analysis of the collected data showed that emotional experience comprises sense, spatial environmental emotion, sales promotion emotion, salesperson emotion, and visual/verbal identity, and that emotional experience with a low-price cosmetic brand has a positive effect on the customer-brand relationship. Specifically, emotional experience significantly affected customer satisfaction, brand trust, and brand attachment. For customer satisfaction, the factors in decreasing order of influence were sense, spatial environmental emotion, sales promotion emotion, and salesperson emotion. For brand trust, the order was sense, salesperson emotion, sales promotion emotion, and spatial environmental emotion. For brand attachment, the order was sense, spatial environmental emotion, salesperson emotion, visual/verbal identity, and sales promotion emotion.

Emotion Recognition Method of Competition-Cooperation Using Electrocardiogram (심전도를 이용한 경쟁-협력의 감성 인식 방법)

  • Park, Sangin;Lee, Don Won;Mun, Sungchul;Whang, Mincheol
    • Science of Emotion and Sensibility / v.21 no.3 / pp.73-82 / 2018
  • Attempts have been made to recognize social emotions, including competition and cooperation, when designing interaction in workplaces. This study aimed to determine the cardiac responses that classify the social emotions of competition and cooperation. Sixty students from Sangmyung University participated in the study and played a pattern game to experience the social emotions associated with competition and cooperation. Electrocardiograms were measured during the task and analyzed to obtain time-domain indicators, such as RRI, SDNN, and pNN50, and frequency-domain indicators, such as VLF, LF, HF, VLF/HF, LF/HF, lnVLF, lnLF, lnHF, and lnVLF/lnHF. The significance of the indicators for classifying social emotions was assessed using an independent t-test. The rule base for classification was determined using the significant parameters of 30 participants and verified on data from the other 30 participants. As a result, 91.67% of participants were correctly classified. This study proposes a new method of classifying the social emotions of competition and cooperation and provides objective data for designing social interaction.
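
The time-domain indicators named in the abstract have standard HRV definitions and can be computed from successive R-R intervals as below; the example interval values are invented, and the paper's rule base for the actual competition/cooperation classification is not reproduced here:

```python
import numpy as np

def time_domain_hrv(rr_ms):
    """Compute time-domain ECG indicators from R-R intervals (ms).

    RRI   : mean R-R interval,
    SDNN  : sample standard deviation of the intervals,
    pNN50 : percentage of successive-interval differences > 50 ms.
    """
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.abs(np.diff(rr))
    return {
        "RRI": float(rr.mean()),
        "SDNN": float(rr.std(ddof=1)),
        "pNN50": float(np.mean(diffs > 50.0) * 100.0),  # percent
    }

hrv = time_domain_hrv([800, 810, 790, 860, 805, 795])
```

A rule base like the paper's would then threshold indicators such as these, keeping only the ones the t-test found significant between the two social emotions.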

Development of Emotion Contents Recommender System for Improvement of Sentimental Status (감정 및 정서상태 전이를 위한 감성 컨텐츠 추천 시스템 개발)

  • Park, Myon-Woong;Ha, Sung-Do;Jeong, Do-Un;Lyoo, In-Kyoon;Ahn, Seong-Min
    • Science of Emotion and Sensibility / v.10 no.1 / pp.1-11 / 2007
  • An infotainment service intended to enhance human emotion is introduced in this paper. The service is to be installed on a robot that helps elderly persons live a comfortable and enjoyable life. The research started by defining undesirable emotional states in everyday life, and psychological skills to cope with those states were sought. A methodology for providing emotion contents reflecting the coping skills was then suggested. Based on Cognitive Behavior Therapy, the coping skills are used to edit animation clips. A movie recommendation system utilizing the edited animation clips is under development. The development process is described, in which emotion elements are considered in addition to user preference as criteria for recommendation.
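
A recommender that weighs an emotion-transition goal alongside user preference, as the abstract describes, could look like the sketch below. The clip attributes, the state labels, and the 0.5/0.5 weighting are all illustrative assumptions; the paper does not publish its scoring rule:

```python
def recommend(clips, user_pref, current_state, target_state):
    """Rank animation clips by a weighted mix of user preference and
    how well a clip's coping skill moves the user from the current
    emotional state toward the target one."""
    def transition_fit(clip):
        # 1.0 if the clip was edited for exactly this emotional transition
        return 1.0 if (clip["from"] == current_state
                       and clip["to"] == target_state) else 0.0

    scored = [(0.5 * user_pref.get(clip["genre"], 0.0)
               + 0.5 * transition_fit(clip), clip["title"])
              for clip in clips]
    return [title for score, title in sorted(scored, reverse=True)]

clips = [
    {"title": "A", "genre": "comedy", "from": "depressed", "to": "calm"},
    {"title": "B", "genre": "drama", "from": "anxious", "to": "calm"},
]
ranking = recommend(clips, {"comedy": 0.9, "drama": 0.4},
                    current_state="depressed", target_state="calm")
```

Here clip A wins on both criteria, so it is ranked first; a pure preference-based recommender would ignore the second term entirely.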


Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification (감정 분류를 이용한 표정 연습 보조 인공지능)

  • Dong-Kyu, Kim;So Hwa, Lee;Jae Hwan, Bong
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.17 no.6 / pp.1137-1144 / 2022
  • In this study, an artificial intelligence (AI) system was developed to help users practice facial expressions for conveying emotions. The developed AI feeds multimodal inputs, consisting of sentences and facial images, into deep neural networks (DNNs). The DNNs calculate the similarity between the emotion predicted from a sentence and the emotion predicted from a facial image. The user practices a facial expression for the situation given by a sentence, and the AI provides numerical feedback based on that similarity. A ResNet34 network was trained on the public FER2013 dataset to predict emotions from facial images. To predict emotions from sentences, a KoBERT model was fine-tuned via transfer learning on the conversational speech dataset for emotion classification released publicly by AIHub. The DNN that predicts emotions from facial images achieved 65% accuracy, comparable to human emotion classification ability, and the DNN that predicts emotions from sentences achieved 90% accuracy. The performance of the developed AI was evaluated through experiments in which an ordinary participant changed facial expressions.
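
The numerical feedback step, comparing the sentence-side and face-side emotion predictions, can be sketched as a similarity between the two classifiers' probability vectors. Cosine similarity and this particular label set are assumptions; the paper only states that a similarity score is returned:

```python
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def expression_feedback(sentence_probs, face_probs):
    """Practice feedback as the cosine similarity between the emotion
    distribution predicted from the sentence (KoBERT-style classifier)
    and the one predicted from the face (ResNet34-style classifier)."""
    s = np.asarray(sentence_probs, dtype=float)
    f = np.asarray(face_probs, dtype=float)
    return float(np.dot(s, f) / (np.linalg.norm(s) * np.linalg.norm(f)))

# matching expression vs. a happy face for an angry sentence
perfect = expression_feedback([1, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0])
mismatch = expression_feedback([1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0])
```

A score near 1 tells the user the practiced expression matches the situation's emotion; a score near 0 tells them it does not.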

The Emotional Boundary Decision in a Linear Affect-Expression Space for Effective Robot Behavior Generation (효과적인 로봇 행동 생성을 위한 선형의 정서-표정 공간 내 감정 경계의 결정 -비선형의 제스처 동기화를 위한 정서, 표정 공간의 영역 결정)

  • Jo, Su-Hun;Lee, Hui-Sung;Park, Jeong-Woo;Kim, Min-Gyu;Chung, Myung-Jin
    • Proceedings of the Korean HCI Society Conference (한국HCI학회 학술대회논문집) / 2008.02a / pp.540-546 / 2008
  • In the near future, robots should be able to understand humans' emotional states and exhibit appropriate behaviors accordingly. In human-human interaction, as much as 93% of communication consists of the speaker's nonverbal behavior, and bodily movements convey the intensity of emotion. Recent personal robots can interact with humans through multiple modalities such as facial expression, gesture, LED, sound, and sensors. However, a posture needs only a position and an orientation, whereas facial expressions and gestures involve movement, and verbal, vocal, musical, and color expressions need timing information. Because synchronization among modalities is a key problem, emotion expression needs a systematic approach. Moreover, at a low intensity of surprise the face can express the emotion but a gesture cannot, because gesture expression is not linear. Emotional boundaries therefore need to be decided for effective robot behavior generation and for synchronization with the other expressive channels. If so, how can we define emotional boundaries, and how can the modalities be synchronized with each other?
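
The boundary-decision idea, each expression channel switching on only past its own intensity boundary, can be sketched as below. The threshold values are invented for illustration; determining the real boundaries is exactly the problem the paper poses:

```python
def active_modalities(emotion_intensity, thresholds):
    """Decide which expression channels a robot should drive at a
    given emotion intensity in [0, 1]. Each modality activates only
    past its own boundary, so e.g. low-intensity surprise moves the
    face but not the body."""
    return [m for m, th in sorted(thresholds.items(), key=lambda kv: kv[1])
            if emotion_intensity >= th]

# hypothetical per-modality boundaries for "surprise"
SURPRISE_BOUNDS = {"face": 0.1, "led": 0.2, "sound": 0.4, "gesture": 0.6}

low = active_modalities(0.3, SURPRISE_BOUNDS)   # face and LED only
high = active_modalities(0.8, SURPRISE_BOUNDS)  # all four channels
```

Synchronization then reduces to driving only the channels whose boundaries the current intensity has crossed, which matches the paper's observation that mild surprise shows on the face before it shows in gesture.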


The study on Quantitative Analysis of Emotional Reaction Related with Step and Sound (스텝과 사운드의 정량적 감성반응 분석에 관한 연구)

  • Jeong, Jae-Wook
    • Archives of Design Research / v.18 no.2 s.60 / pp.211-218 / 2005
  • As digital information appliances emerge, a new paradigm in which "function exists but form does not" is needed in the field of design. Design activity therefore focuses on the relationship between human and machine rather than on visual form, which brings emotional factors into that relationship and opens a new field, the emotional interface. The goal of this paper is to suggest a way of applying emotional interfaces to searching multimedia data. The main targets are effect sounds and human steps, and the main method is visualization after numerically measuring and analyzing the similarity levels among emotion words. This paper presents the theoretical background, such as personal opinion, the characteristics of auditory information, and human steps, together with case studies on emotion research. The experimental content about sound is drawn from my previous research, and the experimental content about human steps is modeled with a regression expression, substituting values for the stimuli using Quantification Method I. A realistic prototype applying the results will be suggested in follow-up research after studying the search environment.
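
Quantification Method I, the regression technique the abstract names, scores a numeric response against categorical stimuli via dummy coding. A minimal sketch on invented data, assuming a "pleasantness" rating per step-sound category:

```python
import numpy as np

def quantification_type1(categories, scores):
    """Hayashi's Quantification Method I: regress a numeric emotion
    score on a categorical stimulus via one-hot dummy variables,
    yielding one category score per stimulus level."""
    levels = sorted(set(categories))
    X = np.array([[1.0 if c == lv else 0.0 for lv in levels]
                  for c in categories])
    coef, *_ = np.linalg.lstsq(X, np.asarray(scores, dtype=float), rcond=None)
    return dict(zip(levels, coef))

# hypothetical pleasantness ratings for two step-sound categories
weights = quantification_type1(
    ["soft", "soft", "hard", "hard"], [4.0, 5.0, 1.0, 2.0])
```

With one-hot coding and no intercept, each fitted coefficient is simply the mean rating of its category, which is the "value substituted for the stimulus" the regression expression then uses.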


Difference of Autonomic Nervous System Responses among Boredom, Pain, and Surprise (무료함, 통증, 그리고 놀람 정서 간 자율신경계 반응의 차이)

  • Jang, Eun-Hye;Eum, Yeong-Ji;Park, Byoung-Jun;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Science of Emotion and Sensibility / v.14 no.4 / pp.503-512 / 2011
  • Recently in HCI research, emotion recognition has become one of the core processes for implementing emotional intelligence. Many studies use bio-signals to recognize human emotions, but most address only the basic emotions and very few consider others. The purpose of the present study is to confirm the differences in autonomic nervous system (ANS) responses across three emotions: boredom, pain, and surprise. A total of 217 participants took part (96 male, 121 female); audio-visual stimuli were presented to induce boredom and surprise, and pressure from a sphygmomanometer was used for pain. While the emotional stimuli were presented, electrodermal activity (EDA), skin temperature (SKT), electrocardiogram (ECG), and photoplethysmogram (PPG) were measured, and participants classified their present emotion and its intensity on an emotion assessment scale. Evaluation of the stimuli showed a mean relevance of 92.5% and an efficiency of 5.43, indicating that each stimulus induced its intended emotion quite effectively. Analysis of the measured ANS responses showed significant differences between the baseline and emotional states in skin conductance response, SKT, heart rate, low-frequency power, and blood volume pulse amplitude, and the ANS responses also differed significantly among the emotions. These results can be used to extend emotion theory, to develop algorithms for recognizing the three emotions (boredom, surprise, and pain) from physiological indicators, and to build applications that differentiate various human emotions in computer systems.
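
The study's core comparison, each ANS measure against its own baseline, can be sketched as a simple baseline correction per feature. The feature names follow the abstract; the numeric values are synthetic examples, not the study's data:

```python
def baseline_corrected(features_rest, features_emotion):
    """Express each ANS measure as its change from baseline, the
    comparison the study tests (baseline vs. emotional state) for
    SCR, SKT, heart rate, LF power, and BVP amplitude."""
    return {name: features_emotion[name] - features_rest[name]
            for name in features_rest}

# synthetic resting-state and pain-state feature values
rest = {"SCR": 0.2, "SKT": 33.1, "HR": 72.0, "LF": 540.0, "BVP": 1.10}
pain = {"SCR": 0.9, "SKT": 32.4, "HR": 81.0, "LF": 610.0, "BVP": 0.85}
delta = baseline_corrected(rest, pain)
```

An emotion-recognition algorithm of the kind the authors envision would then compare such delta vectors across the three emotions rather than the raw values, since baselines differ between individuals.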
