• Title/Summary/Keyword: Human emotion

Search Results: 1,204

Emotion Recognition by CCD Color Image

  • Joo, Young-Hoon;Lee, Sang-Yoon;Oh, Jae-Heung;Sim, Kwee-Bo
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.138.2-138 / 2001
  • This paper proposes a technique for recognizing human emotion from CCD color images. A color image is first acquired from a CCD camera, and a method is then proposed for recognizing facial expressions from the structural correlation of facial feature points (eyebrows, eyes, nose, mouth). In the proposed method, human emotion is divided into four categories (surprise, anger, happiness, sadness). Finally, the effectiveness of the proposed method is demonstrated experimentally.
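
As an illustration only (not the paper's actual classifier), the structural-correlation idea can be sketched as a rule over normalized feature-point displacements; the four input measures and all thresholds here are assumptions.

```python
# Hypothetical sketch: map geometric relations among facial feature points
# (eyebrows, eyes, mouth) to the paper's four emotion categories.
# Thresholds are illustrative, not taken from the paper.

def classify_emotion(eyebrow_raise, eye_open, mouth_open, mouth_corner):
    """Each input is a normalized displacement in [0, 1] from a neutral face."""
    if eyebrow_raise > 0.7 and eye_open > 0.7 and mouth_open > 0.5:
        return "surprise"   # raised brows, wide eyes, open mouth
    if eyebrow_raise < 0.3 and mouth_corner < 0.3:
        return "anger"      # lowered brows, downturned mouth corners
    if mouth_corner > 0.7:
        return "happiness"  # raised mouth corners
    return "sadness"

print(classify_emotion(0.9, 0.8, 0.6, 0.5))  # -> surprise
```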

Robot's Emotion Generation Model based on Generalized Context Input Variables with Personality and Familiarity (성격과 친밀도를 지닌 로봇의 일반화된 상황 입력에 기반한 감정 생성)

  • Kwon, Dong-Soo;Park, Jong-Chan;Kim, Young-Min;Kim, Hyoung-Rock;Song, Hyunsoo
    • IEMEK Journal of Embedded Systems and Applications / v.3 no.2 / pp.91-101 / 2008
  • For friendly human-robot interaction, emotional exchange has recently become more important. Researchers investigating emotion generation models have therefore tried to make the robot's emotional state more natural and to improve the model's usability for robot designers; varied emotion generation is also needed to increase the robot's believability. In this paper we adopt a hybrid emotion generation architecture and define generalized context inputs to the model so that designers can implement it on a robot easily. We also develop psychologically grounded personality and familiarity models for varied emotion generation. The robot's personality is implemented with the emotional-stability dimension of the Big Five, and familiarity is modeled through generation, expression, and learning procedures based on human social relationships, drawing on balance theory and social exchange theory. We verify the emotion generation model by implementing it in a 'user calling and scheduling' scenario.
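
The abstract does not specify the update equations, so the following is only a loose sketch of how personality and familiarity might modulate an emotion state; the decay/gain constants and the modulation scheme are assumptions for illustration.

```python
# Hypothetical emotion-state update: low emotional stability (Big-Five)
# amplifies the response to a stimulus, and familiarity scales it further.
# All constants are invented for this sketch.

def update_emotion(state, stimulus, stability, familiarity, decay=0.9):
    """state, stimulus: dicts mapping emotion name -> intensity in [0, 1]."""
    gain = (1.0 - stability) * (0.5 + 0.5 * familiarity)
    new_state = {}
    for emotion, value in state.items():
        v = decay * value + gain * stimulus.get(emotion, 0.0)
        new_state[emotion] = min(1.0, max(0.0, v))  # clamp to [0, 1]
    return new_state

state = {"joy": 0.2, "anger": 0.0}
state = update_emotion(state, {"joy": 1.0}, stability=0.5, familiarity=1.0)
```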

Efficient Emotional Relaxation Framework with Anisotropic Features Based Dijkstra Algorithm

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.25 no.4 / pp.79-86 / 2020
  • In this paper, we propose an efficient emotional relaxation framework using a Dijkstra algorithm based on anisotropic features. Emotional relaxation is as important as emotion analysis: it is a framework that can automatically alleviate a person's depression or loneliness, which is very important for HCI (Human-Computer Interaction). In this paper, 1) emotion values are computed from facial expressions using Microsoft's Emotion API; 2) differences in these emotion values are used to recognize abnormal feelings such as depression or loneliness; and 3) an emotional-mesh matching process that considers the emotion histogram and anisotropic characteristics is proposed, which suggests emotional relaxation to the user. The resulting system recognizes changes of emotion easily from face images and trains personal emotion through emotional relaxation.
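
For reference, the Dijkstra step at the core of the framework can be sketched as follows; the graph and its weights here are plain stand-ins for the paper's emotional mesh with anisotropic edge costs.

```python
# Minimal Dijkstra shortest-path sketch over a weighted directed graph.
import heapq

def dijkstra(graph, start):
    """graph: dict node -> list of (neighbor, weight). Returns distances."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

graph = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 1.5)], "c": []}
print(dijkstra(graph, "a"))  # {'a': 0.0, 'b': 1.0, 'c': 2.5}
```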

Emotion Detecting Method Based on Various Attributes of Human Voice

  • MIYAJI Yutaka;TOMIYAMA Ken
    • Science of Emotion and Sensibility / v.8 no.1 / pp.1-7 / 2005
  • This paper reports several emotion detection methods based on various attributes of the human voice, developed at our Engineering Systems Laboratory. In all of the proposed methods, only prosodic information in the voice is used for emotion recognition; semantic information is not used. Different types of neural networks (NNs) are used for detection depending on the type of voice parameters. Earlier approaches used linear prediction coefficients (LPCs) and pitch time-series data separately, but later studies combined them. The proposed methods are explained first, and then evaluation experiments and the emotion detection performance of the individual methods are presented and compared.
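
The prosodic (non-semantic) features the paper relies on can be illustrated with simple summary statistics over a pitch contour; this feature set is a hedged example, not the paper's actual LPC-plus-pitch input to the NNs.

```python
# Illustrative prosodic feature extraction from a per-frame pitch contour.
# Unvoiced frames are conventionally marked with pitch 0.

def prosodic_features(pitch_hz):
    """pitch_hz: list of per-frame pitch values in Hz (0 = unvoiced frame)."""
    voiced = [p for p in pitch_hz if p > 0]
    if not voiced:
        return {"mean": 0.0, "range": 0.0, "voiced_ratio": 0.0}
    return {
        "mean": sum(voiced) / len(voiced),          # average pitch
        "range": max(voiced) - min(voiced),         # pitch excursion
        "voiced_ratio": len(voiced) / len(pitch_hz) # proportion voiced
    }

features = prosodic_features([0, 200, 220, 0, 180, 240])
```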

Emotion Recognition Method Based on Multimodal Sensor Fusion Algorithm

  • Moon, Byung-Hyun;Sim, Kwee-Bo
    • International Journal of Fuzzy Logic and Intelligent Systems / v.8 no.2 / pp.105-110 / 2008
  • Humans recognize emotion by fusing information from speech, facial expression, gesture, and bio-signals; computers need technologies that recognize emotion from such combined information as humans do. In this paper, we recognize five emotions (neutral, happiness, anger, surprise, sadness) from speech signals and facial images, and propose a multimodal method that fuses the individual recognition results into one emotion decision. Emotion recognition from both the speech signal and the facial image uses Principal Component Analysis (PCA), and the multimodal fusion applies a fuzzy membership function to the individual results. In our experiments, the average emotion recognition rate was 63% using speech signals and 53.4% using facial images; that is, the speech signal offers a better recognition rate than the facial image. To raise the recognition rate further, we propose a decision fusion method using an S-type membership function. With the proposed method, the average recognition rate is 70.4%, showing that decision fusion offers a better emotion recognition rate than the facial image or the speech signal alone.
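
A minimal sketch of S-type decision fusion, assuming the standard fuzzy S-function; the breakpoints and the weighted combination of the two modality scores are illustrative assumptions, not the paper's tuned parameters.

```python
# Decision-level fusion of speech and face confidence scores through a
# standard fuzzy S-type membership function.

def s_membership(x, a=0.0, b=1.0):
    """Fuzzy S-function: rises smoothly from 0 at a to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    mid = (a + b) / 2.0
    if x <= mid:
        return 2.0 * ((x - a) / (b - a)) ** 2
    return 1.0 - 2.0 * ((x - b) / (b - a)) ** 2

def fuse(speech_score, face_score, speech_weight=0.6):
    """Combine per-emotion scores after S-type reshaping; weight is assumed."""
    return (speech_weight * s_membership(speech_score)
            + (1.0 - speech_weight) * s_membership(face_score))

decision = fuse(0.8, 0.4)
```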

The Design of Knowledge-Emotional Reaction Model considering Personality (개인성을 고려한 지식-감정 반응 모델의 설계)

  • Shim, Jeong-Yon
    • Journal of the Institute of Electronics Engineers of Korea CI / v.47 no.1 / pp.116-122 / 2010
  • As the importance of HCI (Human-Computer Interface), driven by rapidly developing computer technology, grows, so does the requirement for designing human-friendly systems. Above all, personality and emotional factors should be considered when implementing more human-friendly systems. Many studies on knowledge, emotion, and personality have been made, but combined methods connecting these three factors have not yet been investigated much. It is known that the memorizing process includes not only knowledge but also emotion, and that the emotional state strongly affects reasoning and decision making. Accordingly, implementing a more human-friendly, efficient, sophisticated intelligent system requires modeling and designing a system that considers all three factors. In this paper, a knowledge-emotion reaction model is designed: five types are defined to represent personality, and an emotion reaction mechanism is proposed that calculates an emotion vector from the thought threads extracted by type-matching selection. The system is applied to a virtual memory and its emotional reactions are simulated.
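
The abstract leaves the mechanism abstract, so the following is only a guessed illustration of "type-matching selection": a personality type selects matching thought threads from memory, and their emotion weights are averaged into an emotion vector. Types, threads, and weights are all hypothetical.

```python
# Hypothetical emotion-vector computation from type-matched thought threads.

def emotion_vector(threads, personality_type):
    """threads: list of (type, {emotion: weight}) memory entries."""
    vector = {}
    matched = [emotions for t, emotions in threads if t == personality_type]
    for emotions in matched:
        for name, w in emotions.items():
            vector[name] = vector.get(name, 0.0) + w
    n = len(matched)
    # Average the accumulated weights over the matched threads.
    return {name: v / n for name, v in vector.items()} if n else {}

threads = [("A", {"joy": 0.8}), ("B", {"anger": 0.6}), ("A", {"joy": 0.4, "fear": 0.2})]
vec = emotion_vector(threads, "A")
```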

Discrimination of Three Emotions using Parameters of Autonomic Nervous System Response

  • Jang, Eun-Hye;Park, Byoung-Jun;Eum, Yeong-Ji;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.30 no.6 / pp.705-713 / 2011
  • Objective: The aim of this study is to compare the results of several emotion recognition algorithms that classify three emotional states (happiness, neutral, and surprise) using physiological features. Background: Recent emotion recognition studies have tried to detect human emotion from physiological signals; such emotion detection is important for application in human-computer interaction systems. Method: 217 students participated in the experiment. While three kinds of emotional stimuli were presented to participants, ANS responses (EDA, SKT, ECG, RESP, and PPG) were measured as physiological signals twice: first for 60 seconds as the baseline, and then from 60 to 90 seconds during the emotional state. The signals obtained from the baseline and emotional-state sessions were each analyzed over 30 seconds. Participants rated their own feelings toward the emotional stimuli on an emotional assessment scale after stimulus presentation. Emotion classification was performed with Linear Discriminant Analysis (LDA; SPSS 15.0), Support Vector Machines (SVM), and a Multilayer Perceptron (MLP), using difference values obtained by subtracting the baseline from the emotional state. Results: The emotional stimuli had 96% validity and 5.8-point efficiency on average. Statistical analysis showed significant differences in ANS responses among the three emotions. LDA classified the three emotions with an accuracy of 83.4%, versus 75.5% for SVM and 55.6% for MLP. Conclusion: This study confirmed that the three emotions are better classified by LDA using various physiological features than by SVM or MLP. Further study is needed to establish the stability and reliability of this result by comparison with the classification accuracy of other algorithms.
    Application: This work could improve the recognition of various human emotions from physiological signals and be applied to human-computer interaction systems for recognizing human emotions.
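
The study's classifiers (LDA, SVM, MLP) are standard library fare; as a dependency-free stand-in, this sketch shows only the baseline-subtraction step plus a nearest-centroid classifier. The feature values and centroids are fabricated for illustration.

```python
# Baseline-subtracted physiological features classified by nearest centroid.
# (The paper uses LDA/SVM/MLP; nearest-centroid is a simplified substitute.)

def difference_features(baseline, emotional):
    """Subtract the baseline measurement from the emotional-state one."""
    return [e - b for b, e in zip(baseline, emotional)]

def nearest_centroid(x, centroids):
    """centroids: dict label -> feature vector. Returns the closest label."""
    def sq_dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda label: sq_dist(x, centroids[label]))

centroids = {"happiness": [0.5, -0.2], "neutral": [0.0, 0.0], "surprise": [0.9, 0.6]}
x = difference_features([1.0, 2.0], [1.4, 1.9])  # approximately [0.4, -0.1]
label = nearest_centroid(x, centroids)
```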

Emotion Recognition Method using Physiological Signals and Gestures (생체 신호와 몸짓을 이용한 감정인식 방법)

  • Kim, Ho-Duck;Yang, Hyun-Chang;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.3 / pp.322-327 / 2007
  • Researchers in psychology have used electroencephalography (EEG) to record the activity of the human brain for many years. As technology develops, the neural basis of the functional areas of emotion processing is gradually being revealed, so we use EEG to measure the fundamental areas of the human brain that control emotion. Hand gestures such as shaking and head gestures such as nodding are often used as body language for communication, and their recognition is important as a useful communication medium between humans and computers; gesture recognition research commonly uses computer vision. Existing studies use either physiological signals or gestures for emotion recognition. In this paper, we use physiological signals and gestures together for human emotion recognition, selecting driver emotion as the specific target. The experimental results show that using both physiological signals and gestures achieves higher recognition rates than using physiological signals or gestures alone. For both modalities, features are selected with Interactive Feature Selection (IFS), a method based on reinforcement learning.
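
IFS itself is reinforcement-learning based and not specified in the abstract; as a simplified stand-in, the following greedy forward selection adds, at each step, the feature that most improves a caller-supplied score. Feature names and merits below are invented.

```python
# Greedy forward feature selection: a simplified substitute for the paper's
# RL-based Interactive Feature Selection (IFS).

def greedy_select(features, score, k):
    """features: list of names; score(subset) -> float to maximize."""
    selected = []
    while len(selected) < k:
        best = max((f for f in features if f not in selected),
                   key=lambda f: score(selected + [f]))
        if score(selected + [best]) <= score(selected):
            break  # no candidate improves the score; stop early
        selected.append(best)
    return selected

weights = {"eeg": 0.5, "gesture": 0.3, "skt": 0.1}  # hypothetical feature merits
score = lambda subset: sum(weights[f] for f in subset)
chosen = greedy_select(list(weights), score, 2)
```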

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun;Sohn, Sun-Ju;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.5 / pp.617-627 / 2012
  • Objective: The aim of the study was to develop a widely usable Korean dictionary of emotion-expressing idioms, anticipated to assist the development of software that recognizes and responds to verbally expressed human emotions. Method: Through a rigorous, systematic classification process, the idiomatic expressions included in the dictionary were rated for meaning and intensity against nine emotions (happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain). Result: The dictionary includes 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool containing Korean idiomatic expressions of emotion, the provision of information on meaning and strength, and the identification of idioms connoting two or more emotions. Application: The findings can be used in emotion recognition research, particularly for identifying primary and secondary emotions and understanding the intensity associated with various idioms used in emotion expression. In clinical settings, this information may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
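
As a sketch of how such a dictionary could be consumed by emotion-recognition software, each entry can map an idiom to its rated emotions with intensities; the entries and intensity values below are invented placeholders, not items from the actual 427-expression dictionary.

```python
# Hypothetical idiom-to-emotion lookup over a rated dictionary.
# Placeholder keys stand in for real Korean idiomatic expressions.

IDIOM_DICT = {
    "idiom_a": {"sadness": 0.8, "anger": 0.3},  # idiom connoting two emotions
    "idiom_b": {"happiness": 0.9},
}

def primary_emotion(idiom):
    """Return the (emotion, intensity) pair with the highest rated intensity."""
    ratings = IDIOM_DICT.get(idiom)
    if not ratings:
        return None
    emotion = max(ratings, key=ratings.get)
    return emotion, ratings[emotion]
```

This shape directly supports the study's stated application of separating a primary emotion (highest intensity) from secondary ones.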