• Title/Summary/Keyword: Human emotion

Emotion Recognition by CCD Color Image

  • Joo, Young-Hoon;Lee, Sang-Yoon;Oh, Jae-Heung;Sim, Kwee-Bo
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 ICCAS 2001 / pp.138.2-138 / 2001
  • This paper proposes a technique for recognizing human emotion from CCD color images. To do this, we first acquire a color image from a CCD camera and then recognize the facial expression represented by the structural correlation of the facial feature points (eyebrows, eyes, nose, mouth). In the proposed method, human emotion is divided into four categories (surprise, anger, happiness, sadness). Finally, we demonstrate the effectiveness of the proposed method through experiments.
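
The abstract does not specify the classifier itself; as a rough illustration of how structural relations among facial feature points could drive a four-way emotion decision, here is a minimal nearest-prototype sketch. The feature names, geometric descriptors, and prototype values are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical geometric descriptors derived from facial feature points,
# normalized by the inter-ocular distance so the descriptor is scale invariant.
def describe(p: dict) -> np.ndarray:
    iod = np.linalg.norm(p["right_eye"] - p["left_eye"])
    brow_raise = np.linalg.norm(p["left_brow"] - p["left_eye"]) / iod
    eye_open = np.linalg.norm(p["left_eye_top"] - p["left_eye_bottom"]) / iod
    mouth_open = np.linalg.norm(p["mouth_top"] - p["mouth_bottom"]) / iod
    return np.array([brow_raise, eye_open, mouth_open])

# Illustrative prototype descriptors for the four emotions (made-up values).
PROTOTYPES = {
    "surprise": np.array([0.45, 0.30, 0.40]),
    "anger": np.array([0.25, 0.20, 0.10]),
    "happiness": np.array([0.35, 0.22, 0.30]),
    "sadness": np.array([0.30, 0.15, 0.08]),
}

def classify(points: dict) -> str:
    # Assign the emotion whose prototype is closest to the observed descriptor.
    d = describe(points)
    return min(PROTOTYPES, key=lambda e: np.linalg.norm(d - PROTOTYPES[e]))
```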

성격과 친밀도를 지닌 로봇의 일반화된 상황 입력에 기반한 감정 생성 (Robot's Emotion Generation Model based on Generalized Context Input Variables with Personality and Familiarity)

  • 권동수;박종찬;김영민;김형록;송현수
    • 대한임베디드공학회논문지 / Vol. 3, No. 2 / pp.91-101 / 2008
  • For friendly interaction between humans and robots, emotional interchange has recently become more important. Many researchers investigating emotion generation models have therefore tried to make the robot's emotional state more natural and to improve the usability of the model for robot designers. Varied emotion generation is also needed to increase the believability of the robot. In this paper we use a hybrid emotion generation architecture and define generalized context inputs for the emotion generation model so that designers can easily apply it to a robot. We also develop a personality and loyalty model, based on psychology, for varied emotion generation. The robot's personality is implemented with the emotional-stability dimension of the Big Five, and loyalty is built from familiarity generation, expression, and learning procedures grounded in human-human social relationship theories such as balance theory and social exchange theory. We verify this emotion generation model by implementing it in a 'user calling and scheduling' scenario.
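
The paper's actual model is not reproduced here; the sketch below only illustrates, under stated assumptions, how generalized context inputs might be turned into emotion intensities modulated by a Big-Five emotional-stability trait and an evolving familiarity value. All field names, context keys, and update rules are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Personality:
    emotional_stability: float  # Big-Five trait in [0, 1]; higher = calmer

@dataclass
class RobotEmotionModel:
    personality: Personality
    familiarity: float = 0.0  # grows with positive exchanges (cf. social exchange theory)
    emotions: dict = field(default_factory=lambda: {"joy": 0.0, "anger": 0.0, "sadness": 0.0})

    def appraise(self, context: dict) -> None:
        """Map generalized context inputs (illustrative keys) to emotion updates."""
        gain = 1.0 - 0.5 * self.personality.emotional_stability  # calmer -> smaller swings
        self.emotions["joy"] += gain * context.get("goal_achieved", 0.0)
        self.emotions["anger"] += gain * context.get("goal_blocked", 0.0)
        self.emotions["sadness"] += gain * context.get("loss", 0.0)
        # Familiarity rises slowly with positive interactions.
        self.familiarity = min(1.0, self.familiarity + 0.1 * context.get("positive_interaction", 0.0))

robot = RobotEmotionModel(Personality(emotional_stability=0.8))
robot.appraise({"goal_achieved": 1.0, "positive_interaction": 1.0})
print(robot.emotions, robot.familiarity)
```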

Efficient Emotional Relaxation Framework with Anisotropic Features Based Dijkstra Algorithm

  • Kim, Jong-Hyun
    • 한국컴퓨터정보학회논문지 / Vol. 25, No. 4 / pp.79-86 / 2020
  • In this paper, we propose an efficient emotional relaxation framework that uses an anisotropic-feature-based Dijkstra algorithm. Relaxing emotion is as important as emotion analysis, and a framework that can automatically relieve a person's depression or loneliness is also highly significant from the perspective of human-computer interaction (HCI). In this paper, we 1) compute changing emotion values from facial expressions using Microsoft's Emotion API, 2) detect abnormal emotions such as depression or loneliness from the differences in these emotion values, and 3) finally present the user with images containing relaxed emotions through a matching process based on an emotion histogram and an emotion mesh that takes anisotropic features into account. The proposed technique is a system in which the user can easily perceive emotional changes from facial images and train their emotions toward a relaxed state.
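
As a concrete reminder of the graph step named in the abstract, here is a minimal Dijkstra sketch over a toy "emotion mesh". The node names and edge weights are invented, and the anisotropic costs are simply assumed to be encoded in the weights.

```python
import heapq

def dijkstra(graph, source):
    """Shortest emotional-transition distances on a weighted emotion mesh.

    graph: {node: [(neighbor, weight), ...]} where the weight is assumed to
    encode the (anisotropic) cost of moving between two emotional states.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy emotion mesh: nodes are emotional states, weights are hypothetical
# transition costs; distances from "depressed" toward "calm" could guide the
# selection of images with progressively more relaxed emotion values.
mesh = {
    "depressed": [("sad", 1.0), ("lonely", 0.8)],
    "lonely": [("sad", 0.6)],
    "sad": [("neutral", 0.7)],
    "neutral": [("calm", 0.5)],
    "calm": [],
}
print(dijkstra(mesh, "depressed"))
```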

Emotion Detecting Method Based on Various Attributes of Human Voice

  • MIYAJI Yutaka;TOMIYAMA Ken
    • 감성과학 / Vol. 8, No. 1 / pp.1-7 / 2005
  • This paper reports several emotion detection methods based on various attributes of the human voice. These methods have been developed at our Engineering Systems Laboratory. It is noted that, in all of the proposed methods, only prosodic information in the voice is used for emotion recognition; semantic information is not used. Different types of neural networks (NNs) are used for detection depending on the type of voice parameters. Earlier approaches used linear prediction coefficients (LPCs) and time-series pitch data separately, but they were combined in later studies. The proposed methods are explained first, and then evaluation experiments of the individual methods and their emotion detection performance are presented and compared.
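
The exact network architectures are not given in the abstract; the sketch below shows, on synthetic data, one plausible way to combine LPC coefficients with a fixed-length summary of the pitch time series and feed them to a small neural network. scikit-learn's MLPClassifier stands in for the NNs used in the paper, and all data and dimensions are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical training data: each row concatenates LPC coefficients with a
# fixed-length summary of the pitch time series (e.g., mean, std, slope, range).
rng = np.random.default_rng(0)
X_lpc = rng.normal(size=(200, 12))   # 12 LPC coefficients per utterance
X_pitch = rng.normal(size=(200, 4))  # pitch summary statistics
X = np.hstack([X_lpc, X_pitch])      # combined prosodic feature vector
y = rng.integers(0, 4, size=200)     # synthetic labels for 4 emotion classes

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, y)
print(clf.predict(X[:5]))
```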

Emotion Recognition Method Based on Multimodal Sensor Fusion Algorithm

  • Moon, Byung-Hyun;Sim, Kwee-Bo
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 8, No. 2 / pp.105-110 / 2008
  • Human beings recognize emotion by fusing information from another person's speech, facial expression, gesture, and bio-signals. Computers likewise need technologies that recognize emotion from combined information, as humans do. In this paper, we recognize five emotions (normal, happiness, anger, surprise, sadness) from speech signals and facial images, and we propose a multimodal method that fuses the individual recognition results into a single emotion decision. Emotion recognition from the speech signal and from the facial image is performed using Principal Component Analysis (PCA), and the multimodal stage fuses the two results by applying a fuzzy membership function. In our experiments, the average emotion recognition rate was 63% using speech signals and 53.4% using facial images; that is, the speech signal offers a better recognition rate than the facial image. To raise the recognition rate further, we propose a decision fusion method using an S-type membership function. With the proposed method, the average recognition rate is 70.4%, showing that the decision fusion method offers a better emotion recognition rate than either the facial image or the speech signal alone.
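
The precise fusion rule is not spelled out in the abstract; the following sketch implements a standard S-shaped (Zadeh S-function) membership and uses it to weight the per-class scores of the two modalities before taking the arg-max. The thresholds a, b and the example score vectors are illustrative assumptions, not values from the paper.

```python
import numpy as np

def s_membership(x, a, b):
    """Standard S-shaped (Zadeh S-function) membership on [a, b]."""
    m = (a + b) / 2.0
    x = np.asarray(x, dtype=float)
    return np.where(x <= a, 0.0,
           np.where(x <= m, 2 * ((x - a) / (b - a)) ** 2,
           np.where(x <= b, 1 - 2 * ((x - b) / (b - a)) ** 2, 1.0)))

def fuse(speech_scores, face_scores, a=0.2, b=0.8):
    """Decision-level fusion: weight each modality's per-class score by its
    S-type membership value and pick the class with the largest fused score."""
    ws = s_membership(speech_scores, a, b)
    wf = s_membership(face_scores, a, b)
    fused = ws * speech_scores + wf * face_scores
    return int(np.argmax(fused))

# Per-class confidence scores (normal, happiness, anger, surprise, sadness).
speech = np.array([0.10, 0.55, 0.15, 0.10, 0.10])
face = np.array([0.20, 0.30, 0.35, 0.05, 0.10])
print(fuse(speech, face))  # index of the fused emotion decision
```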

개인성을 고려한 지식-감정 반응 모델의 설계 (The Design of Knowledge-Emotional Reaction Model considering Personality)

  • 심정연
    • 전자공학회논문지CI / Vol. 47, No. 1 / pp.116-122 / 2010
  • With the rapid development of computer technology, the importance of human-computer interaction (HCI) has increased, and the demand for more human-friendly system design has grown accordingly. To build a human-friendly system, personality and emotional factors must first be considered. Many attempts have been made to realize this in each of the areas of knowledge, emotion, and personality, but attempts to connect these three elements are still insufficient. When knowledge is memorized, emotion is often memorized together with it, and the emotional state has an enormous influence on the thinking process and on decision making. Therefore, to implement a more human-friendly, delicate, and efficient intelligent system, a system considering these three elements must be modeled and designed. In this paper, we design a knowledge-emotional reaction model that takes personality into account; to implement personality, we define five personality types, compute the emotion vector of the thought thread extracted by a type-matching selection mechanism, and propose a method for reacting to stimuli. We also apply the proposed system to a virtual memory and simulate the emotional reactions of each personality type.
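
The model itself is only described at a high level; as a loose illustration of the idea of attaching an emotion vector to a thought thread and modulating it by personality type, here is a minimal sketch. The emotion dimensions, personality profiles, and weights are all hypothetical.

```python
import numpy as np

# Each knowledge node in a "thought thread" is assumed to carry an emotion
# vector (here [joy, sadness, anger]); the personality types below are
# hypothetical weighting profiles over those emotion dimensions.
PERSONALITY_WEIGHTS = {
    "stable": np.array([1.0, 0.5, 0.5]),
    "sensitive": np.array([1.0, 1.5, 1.5]),
}

def thread_emotion(thread, personality="stable"):
    """Average the emotion vectors along the selected thought thread and
    modulate the result by the personality-type weights."""
    vectors = np.array([node["emotion"] for node in thread])
    return vectors.mean(axis=0) * PERSONALITY_WEIGHTS[personality]

thread = [
    {"concept": "exam", "emotion": [0.1, 0.6, 0.2]},
    {"concept": "failure", "emotion": [0.0, 0.8, 0.4]},
]
print(thread_emotion(thread, "sensitive"))
```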

Discrimination of Three Emotions using Parameters of Autonomic Nervous System Response

  • Jang, Eun-Hye;Park, Byoung-Jun;Eum, Yeong-Ji;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • 대한인간공학회지 / Vol. 30, No. 6 / pp.705-713 / 2011
  • Objective: The aim of this study is to compare the results of emotion recognition by several algorithms that classify three different emotional states (happiness, neutral, and surprise) using physiological features. Background: Recent emotion recognition studies have tried to detect human emotion using physiological signals; such recognition is important for applying emotion detection to human-computer interaction systems. Method: 217 students participated in this experiment. While three kinds of emotional stimuli were presented to participants, ANS responses (EDA, SKT, ECG, RESP, and PPG) were measured as physiological signals in two intervals: first for 60 seconds as the baseline, and then for 60 to 90 seconds during the emotional states. The signals obtained from the baseline session and the emotional-state session were each analyzed over 30 seconds. Participants rated their own feelings toward the emotional stimuli on an emotional assessment scale after presentation of the stimuli. Emotion classification was performed by Linear Discriminant Analysis (LDA, SPSS 15.0), Support Vector Machine (SVM), and Multilayer Perceptron (MLP) using difference values obtained by subtracting the baseline from the emotional state. Results: The emotional stimuli had 96% validity and 5.8-point efficiency on average. There were statistically significant differences in ANS responses among the three emotions. LDA classified the three emotions with an accuracy of 83.4%, compared with 75.5% for SVM and 55.6% for MLP. Conclusion: This study confirmed that the three emotions can be classified better by LDA using various physiological features than by SVM or MLP. Further study is needed to establish the stability and reliability of this result by comparing the classification accuracy with that of other algorithms. Application: This work could improve the chances of recognizing various human emotions from physiological signals and be applied to human-computer interaction systems for recognizing human emotions.
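
On synthetic stand-in data, the comparison described above can be reproduced in outline with scikit-learn. The feature matrix below merely mimics baseline-subtracted ANS features; the printed accuracies reflect random data, not the study's results.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for baseline-subtracted ANS features
# (e.g., deltas of EDA, SKT, ECG, RESP, and PPG measures per trial).
rng = np.random.default_rng(1)
X = rng.normal(size=(217, 10))    # difference features (emotion minus baseline)
y = rng.integers(0, 3, size=217)  # 3 emotions: happiness, neutral, surprise

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf")),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```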

생체 신호와 몸짓을 이용한 감정인식 방법 (Emotion Recognition Method using Physiological Signals and Gestures)

  • 김호덕;양현창;심귀보
    • 한국지능시스템학회논문지 / Vol. 17, No. 3 / pp.322-327 / 2007
  • Researchers in psychology have long used electroencephalography (EEG) to measure and record the activity of the human brain. As science has advanced, the basic regions of the human brain that regulate emotion have gradually been identified, so we measured the brain activity regions that regulate human emotion using EEG. Hand gestures and head movements are used as human body language for conversation between people, and their recognition is very important as a useful means of communication between computers and humans. Research on gestures has mainly focused on vision-based recognition methods, and most previous studies used only one of physiological signals or gestures for emotion recognition. In this paper, we recognize human emotion using EEG signals and gestures together, and we conducted experiments with drivers as the specific target group. The experimental results show that using physiological signals and gestures together yields a higher recognition rate than using either one alone. Features of the physiological signals and gestures were selected using IFS (Interactive Feature Selection), which employs the concept of reinforcement learning.
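
The paper's IFS (Interactive Feature Selection) algorithm is not reproduced here; the sketch below is a generic reward-driven greedy wrapper that plays a similar role, repeatedly adding the feature whose inclusion most improves cross-validated accuracy. The classifier, data, and stopping rule are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def greedy_feature_selection(X, y, k=5):
    """Reward-driven greedy wrapper: add the feature whose inclusion most
    improves cross-validated accuracy (a rough stand-in for IFS)."""
    selected, remaining = [], list(range(X.shape[1]))
    best = 0.0
    while remaining and len(selected) < k:
        scores = {}
        for f in remaining:
            cols = selected + [f]
            scores[f] = cross_val_score(KNeighborsClassifier(), X[:, cols], y, cv=3).mean()
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best:
            break  # no remaining feature improves the reward
        best = scores[f_best]
        selected.append(f_best)
        remaining.remove(f_best)
    return selected, best

# Synthetic EEG + gesture feature matrix (columns are hypothetical features).
rng = np.random.default_rng(2)
X = rng.normal(size=(120, 12))
y = rng.integers(0, 3, size=120)
print(greedy_feature_selection(X, y))
```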

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun;Sohn, Sun-Ju;Sohn, Jin-Hun
    • 대한인간공학회지 / Vol. 31, No. 5 / pp.617-627 / 2012
  • Objective: The aim of the study was to develop a dictionary of the most widely used Korean emotion-expressing idioms. This is anticipated to assist the development of software technology that recognizes and responds to verbally expressed human emotions. Method: Through rigorous and strategic classification processes, the idiomatic expressions included in this dictionary were rated in terms of nine different emotions (i.e., happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The Korean dictionary of emotion-expressing idioms includes 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study primarily rests in the development of a practical language tool that contains Korean idiomatic expressions of emotions, the provision of information on meaning and strength, and the identification of idioms connoting two or more emotions. Application: The findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions as well as understanding the intensity associated with various idioms used in emotion expression. In clinical settings, information provided by this research may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
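
The published dictionary itself is not reproduced here; the sketch below only illustrates one way such idiom-to-emotion intensity ratings could be stored and queried. The two example idioms and their ratings are invented placeholders, not actual entries from the dictionary.

```python
# Map each idiom to per-emotion intensity ratings (placeholder values).
IDIOM_DICT = {
    "가슴이 뛰다": {"happiness": 0.7, "interest": 0.5},
    "눈앞이 캄캄하다": {"fear": 0.8, "sadness": 0.6},
}

def primary_emotion(idiom: str):
    """Return the (emotion, intensity) pair with the highest rating,
    or None if the idiom is not in the dictionary."""
    ratings = IDIOM_DICT.get(idiom)
    if not ratings:
        return None
    emotion = max(ratings, key=ratings.get)
    return emotion, ratings[emotion]

print(primary_emotion("눈앞이 캄캄하다"))  # -> ('fear', 0.8)
```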