• Title/Summary/Keyword: emotion of fear and anger

Search results: 87

Sound-based Emotion Estimation and Growing HRI System for an Edutainment Robot (에듀테인먼트 로봇을 위한 소리기반 사용자 감성추정과 성장형 감성 HRI시스템)

  • Kim, Jong-Cheol; Park, Kui-Hong
    • The Journal of Korea Robotics Society, v.5 no.1, pp.7-13, 2010
  • This paper presents a sound-based emotion estimation method and a growing HRI (human-robot interaction) system for the Mon-E edutainment robot. The estimation method draws on musical elements grounded in the rules of harmony and counterpoint: emotion is estimated from sound using information about chord, tempo, volume, harmonics, and compass (pitch range). The estimated emotions cover a standard set of 12, comprising Ekman's 6 emotions (anger, disgust, fear, happiness, sadness, surprise) and their 6 opposites (calmness, love, confidence, unhappiness, gladness, comfortableness). The growing HRI system analyzes sensing information, the estimated emotion, and the service log of the robot, and on that basis commands the robot's behavior. It consists of an emotion client and an emotion server. The emotion client estimates emotion from sound; it transmits the estimated emotion and sensing information to the emotion server and delivers the server's responses to the robot's main program. The emotion server updates the HRI rule table using the information transmitted from the client and sends HRI responses back to the client. The proposed system was applied to a Mon-E robot and can provide friendly HRI service to users.
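The abstract does not spell out the rule table that maps musical elements to emotion labels, so the sketch below only illustrates the general idea: the `MusicalFeatures` fields, the branching structure, and every threshold are assumptions for demonstration, not the paper's rules.

```python
# Hypothetical sketch of rule-based emotion estimation from musical
# features. The feature names and all thresholds are assumed; the
# paper's actual rule table is not given in the abstract.

from dataclasses import dataclass

@dataclass
class MusicalFeatures:
    chord: str        # e.g. "major" or "minor"
    tempo_bpm: float  # estimated tempo
    volume: float     # normalized loudness, 0..1

def estimate_emotion(f: MusicalFeatures) -> str:
    """Map musical elements to one of the 12 emotion labels above."""
    if f.chord == "major":
        if f.tempo_bpm > 120:
            return "happiness" if f.volume > 0.6 else "gladness"
        return "calmness" if f.volume < 0.4 else "confidence"
    else:  # minor chord
        if f.tempo_bpm > 120:
            return "anger" if f.volume > 0.6 else "fear"
        return "sadness" if f.volume < 0.4 else "unhappiness"

print(estimate_emotion(MusicalFeatures("minor", 140.0, 0.8)))  # -> "anger"
```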

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu; Lee, Hui-Sung; Park, Jeong-Woo; Jo, Su-Hun; Chung, Myung-Jin
    • Proceedings of the Korean HCI Society Conference, 2008.02a, pp.547-552, 2008
  • Humans and robots will have a closer relationship in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate communication abilities, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, making the result applicable to LEDs. Color and emotion are clearly related; however, previous results are difficult to implement because of a lack of quantitative data. In this paper, we determined the color and blinking period to express the 6 basic emotions (anger, sadness, disgust, surprise, happiness, fear). The mapping was implemented on an avatar, and the intensities of the emotions were evaluated through a survey. We found that color and blinking helped express the intensity of sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking.
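As a rough illustration of the abstract's idea, the sketch below maps an emotion and an intensity value to an RGB color and a blinking period. The specific colors, the brightness scaling, and the period range are assumptions for demonstration, not the paper's measured values.

```python
# Illustrative emotion -> (color, blink period) mapping for an LED.
# Base colors and period range are assumed, not taken from the paper.

EMOTION_COLOR = {
    "anger":     (255, 0, 0),
    "sadness":   (0, 0, 255),
    "disgust":   (0, 128, 0),
    "surprise":  (255, 255, 0),
    "happiness": (255, 165, 0),
    "fear":      (128, 0, 128),
}

def led_signal(emotion: str, intensity: float):
    """Return (rgb, blink_period_s) for intensity in [0, 1].

    Higher intensity -> brighter color and faster blinking.
    """
    r, g, b = EMOTION_COLOR[emotion]
    scale = 0.3 + 0.7 * intensity          # keep the LED dimly visible at 0
    rgb = (int(r * scale), int(g * scale), int(b * scale))
    blink_period = 2.0 - 1.5 * intensity   # 2.0 s down to 0.5 s
    return rgb, blink_period

print(led_signal("anger", 0.9))  # bright red, fast blink
```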


An Exploratory Investigation on Visual Cues for Emotional Indexing of Image (이미지 감정색인을 위한 시각적 요인 분석에 관한 탐색적 연구)

  • Chung, SunYoung; Chung, EunKyung
    • Journal of the Korean Society for Library and Information Science, v.48 no.1, pp.53-73, 2014
  • Given that emotion-based computing environments have grown in recent years, it is necessary to focus on emotional access to and use of multimedia resources, including images. This study aims to identify the visual cues for emotion in images. To that end, it selected five basic emotions (love, happiness, sadness, fear, and anger) and interviewed twenty participants about the visual cues associated with each. A total of 620 visual cues mentioned by the participants were collected from the interviews and coded into five categories and 18 sub-categories. The findings show that facial expressions, actions/behaviors, and syntactic features are significant for perceiving a specific emotion in an image, and each emotion exhibited distinctive cue characteristics: love was strongly related to actions and behaviors, happiness to facial expressions, sadness primarily to actions and behaviors, fear largely to facial expressions, and anger to syntactic features such as lines, shapes, and sizes. These findings imply that emotional indexing can be effective when content-based features are considered in combination with concept-based features.

Recognizing Six Emotional States Using Speech Signals

  • Kang, Bong-Seok; Han, Chul-Hee; Youn, Dae-Hee; Lee, Chungyong
    • Proceedings of the Korean Society for Emotion and Sensibility Conference, 2000.04a, pp.366-369, 2000
  • This paper examines three algorithms for recognizing a speaker's emotion from speech signals. The target emotions are happiness, sadness, anger, fear, boredom, and the neutral state. MLB (Maximum-Likelihood Bayes), NN (Nearest Neighbor), and HMM (Hidden Markov Model) algorithms are used as the pattern-matching techniques. In all cases, pitch and energy are used as the features. The feature vectors for MLB and NN are composed of the pitch mean, pitch standard deviation, energy mean, energy standard deviation, etc. For the HMM, vectors of delta pitch with delta-delta pitch and delta energy with delta-delta energy are used. We recorded a corpus of emotional speech data and performed a subjective evaluation of the data. The subjective recognition rate was 56%, which was compared with the classifiers' recognition rates: the MLB, NN, and HMM classifiers achieved 68.9%, 69.3%, and 89.1%, respectively, for speaker-dependent, context-independent classification.
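A minimal sketch of the MLB/NN-style feature vector and a nearest-neighbor decision, using only the statistics the abstract names (pitch mean/std, energy mean/std). The toy pitch and energy tracks and the two training utterances are fabricated for illustration.

```python
# Utterance-level pitch/energy statistics + nearest-neighbor classification,
# following the feature vector described in the abstract. Toy data only.

import numpy as np

def utterance_features(pitch: np.ndarray, energy: np.ndarray) -> np.ndarray:
    """Pitch mean/std and energy mean/std, as in the MLB/NN vectors."""
    return np.array([pitch.mean(), pitch.std(), energy.mean(), energy.std()])

def nearest_neighbor(x: np.ndarray, train_X: np.ndarray, train_y: list) -> str:
    """Label x with the class of its closest training vector."""
    dists = np.linalg.norm(train_X - x, axis=1)
    return train_y[int(np.argmin(dists))]

# Two fabricated labeled training utterances and one query.
train_X = np.array([[220.0, 40.0, 0.6, 0.2],    # "anger"-like: high pitch/energy
                    [150.0, 15.0, 0.3, 0.1]])   # "sadness"-like: low pitch/energy
train_y = ["anger", "sadness"]
query = utterance_features(np.array([210.0, 230.0, 200.0]),   # pitch track (Hz)
                           np.array([0.55, 0.65, 0.60]))      # energy track
print(nearest_neighbor(query, train_X, train_y))  # -> "anger"
```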


The Effects of Color Hue-Tone on Recognizing Emotions of Characters in the Film, Les Misérables

  • Kim, Yu-Jin
    • Science of Emotion and Sensibility, v.18 no.1, pp.67-78, 2015
  • This study investigated, through three practical experiments, whether people experience a correspondence between color hue-tone and the main characters' emotions in the 2012 British musical drama film Les Misérables. Six screen images representing the characters' different emotions (Parrott's six primary types: love, joy, surprise, anger, sadness, and fear) were selected. For each screen image, participants judged the degree of the character's dominant emotion evoked across 17 variants, consisting of the original chromatic image and an achromatized image as well as 15 color-filtered images (5 hues × 3 tones of the IRI color system). These tasks revealed that a chromatic color scheme is more effective than an achromatic one at conveying the characters' positive emotions (i.e., love and joy). They also showed that the hue and tone dimensions partially influence the relationships between the character emotions and colors.
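For illustration, the sketch below generates a set of 17 stimulus variants like those the abstract describes: the original image, an achromatized copy, and 5 × 3 hue/tone filtered copies. The blend-based filtering, the hue colors, and the tone strengths are all assumptions; the study used the IRI color system, whose exact parameters are not given here.

```python
# Illustrative generation of 17 stimulus variants: original, achromatic,
# and 5 hues x 3 tones. Hue/tone values and blend strengths are assumed.

from PIL import Image, ImageOps

HUES = {"red": (255, 0, 0), "yellow": (255, 255, 0), "green": (0, 255, 0),
        "blue": (0, 0, 255), "purple": (128, 0, 128)}
TONE_ALPHA = {"light": 0.2, "medium": 0.4, "strong": 0.6}  # assumed strengths

def make_stimuli(path: str) -> dict:
    original = Image.open(path).convert("RGB")
    stimuli = {"original": original,
               "achromatic": ImageOps.grayscale(original).convert("RGB")}
    for hue_name, rgb in HUES.items():
        for tone_name, alpha in TONE_ALPHA.items():
            overlay = Image.new("RGB", original.size, rgb)
            stimuli[f"{hue_name}-{tone_name}"] = Image.blend(original, overlay, alpha)
    return stimuli  # 2 + 5*3 = 17 images

# stimuli = make_stimuli("screen_image.png")  # hypothetical file name
```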

An EEG Study of Emotion Using the International Affective Picture System (국제정서사진체계 ( IAPS ) 를 사용하여 유발된 정서의 뇌파 연구)

  • 이임갑; 김지은; 이경화; 손진훈
    • Proceedings of the Korean Society for Emotion and Sensibility Conference, 1997.11a, pp.224-227, 1997
  • The International Affective Picture System (IAPS), developed by Lang and colleagues[1], is a widely adopted tool in studies relating a variety of physiological indices to subjective emotions induced by the presentation of standardized pictures, whose subjective ratings are well established along the three dimensions of pleasure, arousal, and dominance. In the present study, we investigated whether distinctive EEG characteristics can be discerned for six discrete emotions, using the 12 IAPS pictures that scored the highest subjective ratings for one of the six categorical emotions, i.e., happiness, sadness, fear, anger, disgust, and surprise (two slides per emotion). These pictures were presented as visual stimuli, in random order, to 38 right-handed college students (20-26 years old), with 30 s of exposure time and a 30 s inter-stimulus interval per picture, while EEG signals were recorded from F3, F4, O1, and O2, referenced to linked ears. The FFT technique was used to analyze the acquired EEG data. There were significant differences in the relative power (RP) of the EEG bands, most prominently in theta, between positive and negative emotions, and partially also among negative emotions. This result is in agreement with previous studies[2, 3]. However, further study is required to decide whether the IAPS can be a useful tool for categorical approaches to emotion in addition to its traditional use, namely dimensional approaches.
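A minimal sketch of the relative-power (RP) computation the abstract relies on, using an FFT-based (Welch) power spectral density on a synthetic theta-dominant signal. The band limits are conventional values, not necessarily those used in the study.

```python
# Relative band power for one EEG channel via Welch's FFT-based PSD.
# Band limits are conventional; the input here is a synthetic signal.

import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def relative_band_power(eeg: np.ndarray, fs: float) -> dict:
    """Return each band's share of total 1-30 Hz power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    total = psd[(freqs >= 1) & (freqs < 30)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

fs = 256.0
t = np.arange(0, 30, 1 / fs)                 # 30 s epoch, as in the study
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)  # 6 Hz + noise
print(relative_band_power(eeg, fs))          # theta share should dominate
```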


Non-verbal Emotional Expressions for Social Presence of Chatbot Interface (챗봇의 사회적 현존감을 위한 비언어적 감정 표현 방식)

  • Kang, Minjeong
    • The Journal of the Korea Contents Association, v.21 no.1, pp.1-11, 2021
  • The users of a chatbot messenger can be better engaged in a conversation if they feel intimacy with the chatbot, which can be achieved through the chatbot's effective expression of human emotions. Thus motivated, this study aims to identify the emotional expressions of a chatbot that make people feel its social presence. The background research showed that facial expression is the most effective channel for emotion and that movement is important for relational immersion. In a survey, we prepared moving text, moving gestures, and still emoticons representing five emotions (happiness, sadness, surprise, fear, and anger), and asked which form best conveyed the chatbot's social presence for each emotion. We found that for a high-arousal, pleasant emotion such as happiness, people most prefer moving gestures and text, while for unpleasant emotions such as sadness and anger, people prefer emoticons. For the more neutral emotions of surprise and fear, people tend to select moving text that delivers a clear meaning. We expect these results to be useful for developing emotional chatbots that enable more effective conversations with users.

An Emotion Recognition Method using Facial Expression and Speech Signal (얼굴표정과 음성을 이용한 감정인식)

  • 고현주; 이대종; 전명근
    • Journal of KIISE: Software and Applications, v.31 no.6, pp.799-807, 2004
  • In this paper, we deal with an emotion recognition method using facial images and speech signals. Six basic human emotions are investigated: happiness, sadness, anger, surprise, fear, and dislike. Emotion recognition from facial expressions is performed with a multi-resolution analysis based on the discrete wavelet transform, after which the feature vectors are extracted by linear discriminant analysis. Emotion recognition from the speech signal, in turn, runs the recognition algorithm independently for each wavelet subband, and the final result is obtained from a multi-decision-making scheme.
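A brief sketch of the facial-expression pipeline's two stages as described: multi-resolution features from a discrete wavelet transform, followed by linear discriminant analysis. The wavelet choice, decomposition level, subband-energy features, and toy data are illustrative assumptions, not the paper's settings.

```python
# DWT multi-resolution features + LDA, sketching the two-stage pipeline.
# Wavelet, level, and toy training data are assumed for illustration.

import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def dwt_features(signal: np.ndarray, wavelet: str = "db4", level: int = 3):
    """Energy of each wavelet subband as a compact multi-resolution feature."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Toy training set: 20 random "signals" with binary emotion labels.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=20)
X = np.array([dwt_features(rng.standard_normal(256) * (1 + y)) for y in labels])

lda = LinearDiscriminantAnalysis().fit(X, labels)
print(lda.predict(dwt_features(rng.standard_normal(256)).reshape(1, -1)))
```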

Development of Deep Learning Models for Multi-class Sentiment Analysis (딥러닝 기반의 다범주 감성분석 모델 개발)

  • Syaekhoni, M. Alex; Seo, Sang Hyun; Kwon, Young S.
    • Journal of Information Technology Services, v.16 no.4, pp.149-160, 2017
  • Sentiment analysis is the process of determining whether a piece of a document, text, or conversation is positive, negative, neutral, or expresses some other emotion. It has been applied in several real-world applications, such as chatbots, whose practical use has spread across many fields of industry over the last five years. In chatbot applications, sentiment analysis must be performed first in order to recognize the user's emotion and understand the speaker's intent; a specific emotion conveys more than a positive or negative label. In this context, we propose deep learning models for multi-class sentiment analysis that identify a speaker's emotion as one of joy, fear, guilt, sadness, shame, disgust, or anger. We develop convolutional neural network (CNN), long short-term memory (LSTM), and multi-layer neural network models for detecting emotion in a sentence, with a word-embedding step applied beforehand. In our experiments, the LSTM model performed best, ahead of the convolutional and multi-layer neural networks. We also show the practical applicability of the deep learning models to sentiment analysis for chatbots.
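As a sketch of the best-performing architecture named above, the following builds a small embedding + LSTM classifier over the seven emotion classes. The vocabulary size, sequence length, and layer sizes are placeholder assumptions, not the paper's hyperparameters.

```python
# Minimal embedding + LSTM classifier for seven emotion classes.
# All sizes below are placeholder assumptions, not the paper's settings.

import tensorflow as tf

NUM_CLASSES = 7       # joy, fear, guilt, sadness, shame, disgust, anger
VOCAB_SIZE = 10_000   # assumed tokenizer vocabulary
MAX_LEN = 50          # assumed padded sentence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_LEN,)),       # integer token ids
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),    # learned word embeddings
    tf.keras.layers.LSTM(64),                      # sentence encoding
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ids, train_labels, epochs=5)  # token ids + integer labels
```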

Responding to negative emotions in COVID-19 (코로나 19상황에 발생하는 부정적 정서에 대한 대처)

  • Yang, Hye-Jin
    • The Journal of the Convergence on Culture Technology, v.6 no.3, pp.135-143, 2020
  • This study attempted to identify the negative emotions of college freshmen in the COVID-19 situation and how they cope with them. The survey was conducted online from April 6 to 13, 2020, and the results were analyzed using 220 questionnaires, excluding incomplete responses. Frustration showed the highest mean among the subjects' negative emotions, followed by helplessness, anxiety, and anger. Testing differences by gender showed that women experienced anxiety, lethargy, anger, fear, and confusion statistically significantly more than men. An analysis of the correlation between negative emotions and the methods used to resolve them found that helplessness had a significant negative correlation with talking with family, friends, and acquaintances, while a statistically significant positive correlation was also found. Based on these results, some practical methods that can be applied in universities are suggested.