• Title/Abstract/Keyword: Human emotion

Search results: 1,204 (processing time: 0.024 seconds)

얼굴영상과 음성을 이용한 멀티모달 감정인식 (Multimodal Emotion Recognition using Face Image and Speech)

  • 이현구;김동주
    • 디지털산업정보학회논문지
    • /
    • Vol. 8, No. 1
    • /
    • pp.29-40
    • /
    • 2012
  • Endowing a machine with emotional intelligence has become a challenging research issue of growing importance to those working in human-computer interaction. Emotion recognition technology therefore plays an important role in human-computer interaction research, as it allows more natural, human-like communication between human and computer. In this paper, we propose a multimodal emotion recognition system that uses face and speech to improve recognition performance. For face-based emotion recognition, a distance measure is computed by applying 2D-PCA to the MCS-LBP image and using a nearest-neighbor classifier; for speech-based emotion recognition, a likelihood measure is obtained from a Gaussian mixture model built on pitch and mel-frequency cepstral coefficient features. The individual matching scores obtained from face and speech are combined by a weighted summation, and the fused score is used to classify the human emotion. In experiments, the proposed method improves recognition accuracy by about 11.25% to 19.75% over the best uni-modal approach. These results confirm that the proposed approach achieves a significant performance improvement and is very effective.
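The weighted-summation fusion described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-class scores, the min-max normalization, and the weight `w` are all assumed for the example.

```python
import numpy as np

# Emotion classes are illustrative; the paper does not list them here.
EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def normalize(scores):
    """Min-max normalize scores so the face and speech scales are comparable."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min() + 1e-12)

def fuse_scores(face_scores, speech_scores, w=0.6):
    """Weighted summation: w * face + (1 - w) * speech, per emotion class."""
    return w * normalize(face_scores) + (1 - w) * normalize(speech_scores)

def classify(face_scores, speech_scores, w=0.6):
    """Pick the emotion with the highest fused score."""
    fused = fuse_scores(face_scores, speech_scores, w)
    return EMOTIONS[int(np.argmax(fused))]

face = [0.2, 0.9, 0.3, 0.1]    # e.g. similarities from 2D-PCA + nearest neighbor
speech = [0.1, 0.6, 0.8, 0.2]  # e.g. GMM likelihoods from pitch/MFCC features
print(classify(face, speech))  # both modalities favor the second class
```

Normalizing before fusing matters because distance-based and likelihood-based scores live on different scales; the weight `w` would normally be tuned on validation data.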

Emotion Recognition Method for Driver Services

  • Kim, Ho-Duck;Sim, Kwee-Bo
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • Vol. 7, No. 4
    • /
    • pp.256-261
    • /
    • 2007
  • Electroencephalography (EEG) has been used in psychology for many years to record the activity of the human brain. As technology has developed, the neural basis of the functional areas of emotion processing has gradually been revealed, so we use EEG to measure the fundamental areas of the human brain that control emotion. Hand gestures such as shaking and head gestures such as nodding are often used as body language for communication, and their recognition is important because they are a useful communication medium between humans and computers; gesture recognition is typically studied with computer vision methods. Most existing research uses either EEG signals or gestures alone for emotion recognition. In this paper, we use EEG signals and gestures together for human emotion recognition, and we select driver emotion as the specific target. The experimental results show that using both EEG signals and gestures achieves higher recognition rates than using EEG signals or gestures alone. For both EEG signals and gestures, features are selected with Interactive Feature Selection (IFS), a method based on reinforcement learning.

칠정(七情)으로 유발되는 병증(病證)의 유형 연구 (A Study on Symptoms Derived from Seven Emotions on DongUiBoGam)

  • 이병희;유승연;박영배;박영재;오환섭;김민용
    • 대한한의진단학회지
    • /
    • Vol. 14, No. 2
    • /
    • pp.13-24
    • /
    • 2010
  • Background and purpose: The Seven Emotions consist of Joy(喜), Anger(怒), Anxiety(憂), Thought(思), Sorrow(悲), Fear(恐), and Fright(驚). When one of the Seven Emotions is excessive, the extreme mental stimulation causes physical illness. There has been no detailed study of Seven Emotion diseases until now, so the purpose of this study is to classify them. Methods: We extracted the sentences about the Seven Emotions and related words in Donguibogam, classified the sentences into Joy(喜), Anger(怒), Anxiety(憂), Thought(思), Sorrow(悲), Fear(恐), Fright(驚), Frustration, Mental Exhaustion, and Character, and analyzed the patterns of symptoms derived from the Seven Emotions. Results and conclusions: The Seven Emotions give rise to various types of symptoms; in particular, Anger causes more illness than the other emotions.

가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석 (Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans)

  • 김진옥
    • 인터넷정보학회논문지
    • /
    • Vol. 13, No. 5
    • /
    • pp.9-19
    • /
    • 2012
  • Virtual humans used for HCI in digital content express various emotions through modalities such as facial expression and body posture, but there has been little research on combinations of nonverbal multimodal cues. To build a virtual human that expresses emotion, the computational engine model must consider how combinations of nonverbal modalities such as facial expression and body posture are perceived by users, so this study analyzes and presents the influence of the nonverbal multimodal cues needed for designing a virtual human's emotion expression. First, we evaluated emotion recognition for each modality of the virtual human and analyzed the relative influence of the different modalities. We then evaluated the influence of congruent face and posture modalities on the recognition of basic emotions, valence, and arousal, and, using multimodal cues with incongruent emotions, observed the degree of recognition of the superposed emotions that frequently appear in everyday life. The experimental results confirm that emotion recognition is easier when the virtual human's facial expression and body posture are congruent, that the emotion category is judged from the facial expression, and that the posture modality is preferred for judging the activation (arousal) dimension of the emotion. These results can be used to implement behavior synchronization and animation engine systems for virtual humans that display emotion.

어머니의 정서표현과 아동의 기질 및 자아존중감이 정서조절능력에 미치는 영향 (The Effects of Maternal Emotion Expression, Temperament and Self-Esteem on Emotion Regulation among Children)

  • 이경님
    • 한국생활과학회지
    • /
    • Vol. 18, No. 6
    • /
    • pp.1209-1219
    • /
    • 2009
  • This study examined a path model of the effects of maternal emotional expression, temperament, and self-esteem on emotion regulation among children. The subjects were 487 5th and 6th graders. Data were gathered through questionnaires completed by the children and their mothers and analyzed by structural equation modeling. The results showed that children's 'activity level' temperament and maternal negative emotional expression directly affected maladaptive emotion regulation, while children's 'emotionality' temperament and maternal positive emotional expression directly affected adaptive emotion regulation. Children's 'approach-flexibility' temperament and self-esteem directly affected both maladaptive and adaptive emotion regulation. Maternal emotional expression and children's self-esteem mediated between children's temperament and emotion regulation. Additionally, the most important variable predicting children's maladaptive emotion regulation was the 'activity level' temperament, and the most important variable for adaptive emotion regulation was the 'emotionality' temperament.

비젼에 의한 감성인식 (Emotion Recognition by Vision System)

  • 이상윤;오재흥;주영훈;심귀보
    • 한국지능시스템학회: Conference Proceedings
    • /
    • 한국퍼지및지능시스템학회 2001 Fall Conference Proceedings
    • /
    • pp.203-207
    • /
    • 2001
  • In this paper, we propose a neural-network-based emotion recognition method for intelligently recognizing human emotion from CCD color images. We first acquire a color image from the CCD camera and then propose a method for recognizing the expression represented by the structural correlation of the facial feature points (eyebrows, eyes, nose, mouth); the central technology is the process of extracting, separating, and recognizing the correct data in the image. In the proposed method, human emotion is divided into four emotions (surprise, anger, happiness, sadness). The complexion (skin-color) region is separated using the color difference of the color space, a method that separates the background from the face robustly against changes such as external illumination. For this, we propose an algorithm that extracts the four feature points from the face image acquired by the color CCD camera and obtains a normalized face image and feature vectors from them. We then apply the back-propagation algorithm to the secondary feature vector. Finally, we show the practical applicability of the proposed method.
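The color-difference skin segmentation step mentioned in the abstract can be sketched with Cb/Cr thresholding in the YCbCr space. The threshold ranges below are a widely used heuristic, not the values from this paper, and the test pixels are made up.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert RGB values in [0, 255] (shape (..., 3)) to full-range
    YCbCr using the ITU-R BT.601 coefficients."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb):
    """Keep pixels whose chrominance falls in a common skin-tone box.
    Illumination mostly affects Y, so thresholding Cb/Cr is relatively
    robust to lighting changes, as the abstract describes."""
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)

img = np.array([[[200, 150, 120], [0, 255, 0]]])  # a skin-like and a green pixel
print(skin_mask(img))  # the first pixel is kept, the second is not
```

The resulting binary mask would feed the feature-point extraction stage; the neural classifier itself is a separate step.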


로봇의 인간과 유사한 행동을 위한 2차원 무드 모델 제안 (Proposal of 2D Mood Model for Human-like Behaviors of Robot)

  • 김원화;박정우;김우현;이원형;정명진
    • 로봇학회논문지
    • /
    • Vol. 5, No. 3
    • /
    • pp.224-230
    • /
    • 2010
  • As robots are no longer just laborers in industrial fields but are stepping into humans' daily lives, interaction and communication between human and robot are becoming essential. For this social interaction with humans, emotion generation by a robot has become necessary, and it is the result of a very complicated process. The concept of mood has been considered in psychology as a factor that affects emotion generation; it is similar to emotion but not the same. In this paper, mood factors for a robot, considering not only the conditions of the robot itself but also its circumstances, are listed, chosen, and finally used as the elements defining a 2-dimensional mood space. Moreover, an architecture that combines the proposed mood model with an emotion generation module is given at the end.
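A 2D mood space of this kind can be sketched as a slowly drifting state driven by mood factors. This is a hypothetical illustration: the axes, factors, weights, and drift rate are all assumptions, not the paper's model.

```python
import numpy as np

class Mood2D:
    """Hypothetical sketch of a 2D mood space (e.g. valence x arousal):
    the mood drifts slowly toward a target computed from mood factors,
    so it changes more gradually than a momentary emotion would."""

    def __init__(self, rate=0.05):
        self.state = np.zeros(2)   # current (valence, arousal) in [-1, 1]
        self.rate = rate           # how quickly the mood follows its target

    def update(self, factors, weights):
        """factors: length-n vector of internal/external conditions;
        weights: (2, n) matrix mapping the factors into the mood space."""
        target = np.asarray(weights, dtype=float) @ np.asarray(factors, dtype=float)
        self.state += self.rate * (np.clip(target, -1.0, 1.0) - self.state)
        return self.state.copy()

mood = Mood2D()
factors = [1.0, 0.5]                # e.g. battery level, ambient noise (made up)
weights = [[0.8, 0.2], [0.1, 0.9]]  # maps the two factors to the two mood axes
for _ in range(200):
    state = mood.update(factors, weights)
print(state)  # converges toward the weighted target
```

The slow first-order drift is the point of modeling mood separately from emotion: an emotion module could read the current mood as a bias rather than reacting instantly to every stimulus.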

AE-Artificial Emotion

  • Xuyan, Tu;Liqun, Han
    • 제어로봇시스템학회: Conference Proceedings
    • /
    • 제어로봇시스템학회 2003 ICCAS
    • /
    • pp.146-149
    • /
    • 2003
  • This paper proposes the concept of "Artificial Emotion" (AE). The goal of AE is the simulation, extension, and expansion of natural emotion, especially human emotion. The objects of AE are machine emotion and emotion machines. The contents of AE are emotion recognition, emotion measurement, emotion understanding, emotion representation, emotion generation, emotion processing, emotion control, and emotion communication. The methodology, technology, scientific significance, and application value of artificial emotion are discussed.


Emotion Modeling for Emotion-based Personalization Service

  • Kim, Tae Yeun;Bae, Sang Hyun
    • 통합자연과학논문집
    • /
    • Vol. 13, No. 3
    • /
    • pp.97-104
    • /
    • 2020
  • This study suggests emotion space modeling and emotion inference methods suitable for personalized services, based on psychological and emotional models. For personalized emotion space modeling that takes subjective disposition into account, the Analytic Hierarchy Process (AHP) was used as a decision-support tool, based on empirical assessment of the personal emotions felt during the personalization of the emotion space. This confirmed that dedicated learning is needed to perform personalized emotion space modeling when subjective tendencies are not considered. In particular, we verified that inference is possible based on fuzzy emotion space modeling, which quantifies vague human emotion and sensitivity on the basis of inherent human sensibility.
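The AHP prioritization step named in this abstract can be sketched as the principal-eigenvector computation over a pairwise-comparison matrix. The eigenvector method is the standard AHP procedure; the criteria and the 1-9 scale judgments below are made up for illustration, not the paper's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise-comparison matrix via the
    principal eigenvector (the standard AHP prioritization step)."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    # Take the eigenvector of the largest (real) eigenvalue and normalize it.
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return v / v.sum()

# Illustrative reciprocal matrix comparing three emotion criteria:
# entry A[i][j] says how much more important criterion i is than j.
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_weights(A)
print(w)  # weights sum to 1, ordered by importance
```

In a personalization setting, each user would supply their own comparison matrix, yielding per-user weights over the emotion dimensions.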

얼굴근전도와 얼굴표정으로 인한 감성의 정성적 평가에 대한 연구 (A Study on the Qualitative Assessment of Emotion from Facial EMG and Facial Expressions)

  • 황민철;김지은;김철중
    • 대한인간공학회: Conference Proceedings
    • /
    • 대한인간공학회 1996 Spring Conference Proceedings
    • /
    • pp.264-269
    • /
    • 1996
  • Facial expression is an innate communication skill of humans, whose psychological state can be recognized from facial parameters including surface movement, color, humidity, and so on. This study aims to quantify or qualify human emotion by measuring facial electromyography (EMG) and facial movement. Measurements were taken over the frontalis and zygomaticus areas of the face. The results indicate that positive and negative emotional responses can be discriminated and that parameters sensitive to positive and negative facial expressions can be extracted. The facial movement reflected in the EMG shows the possibility of a non-invasive technique for assessing human emotion.
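A minimal version of the positive/negative discrimination described here could compare signal energy across the two measured muscle sites. The channels follow the study's measurement sites (frontalis and zygomaticus), but the rule and the synthetic samples are only an illustration, not the study's analysis.

```python
import numpy as np

def rms(signal):
    """Root-mean-square amplitude of an EMG segment."""
    return float(np.sqrt(np.mean(np.square(np.asarray(signal, dtype=float)))))

def emg_valence(zygomaticus, frontalis):
    """Toy rule: relatively stronger zygomaticus (smiling-muscle) activity
    is read as a positive response, otherwise negative. A real analysis
    would normalize per subject and use calibrated baselines."""
    return "positive" if rms(zygomaticus) > rms(frontalis) else "negative"

smile_zyg = [0.9, -1.1, 1.0, -0.8]   # synthetic EMG samples (illustrative)
smile_fro = [0.1, -0.2, 0.15, -0.1]
print(emg_valence(smile_zyg, smile_fro))
```

RMS is a common amplitude feature for surface EMG; any threshold separating the two channels would serve the same role in this sketch.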
