• Title/Summary/Keyword: Recognizing Emotion


Emotion Recognition Based on Human Gesture (인간의 제스쳐에 의한 감정 인식)

  • Song, Min-Kook;Park, Jin-Bae;Joo, Young-Hoon
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.1 / pp.46-51 / 2007
  • This paper presents gesture analysis for human-robot interaction. Understanding human emotions through gesture is one of the skills computers need in order to interact intelligently with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. For efficient operation, gestures are recognized with a Hidden Markov Model (HMM). We constructed a large gesture database, with which we verified our method. As a result, our method was successfully integrated and operated in a mobile system.
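The HMM step in the abstract above can be illustrated with a toy sketch: one discrete HMM per emotion scores a sequence of quantized hand-motion codes, and the highest-likelihood model wins. All states, symbols, and parameter values below are invented for illustration and are not from the paper.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Forward algorithm: log P(obs | model) for a discrete HMM."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]  # initial step
    for t in range(1, len(obs)):
        # Induction: sum over predecessor states, then emit obs[t].
        alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return math.log(sum(alpha))

# Two hypothetical 2-state models over a 3-symbol alphabet of motion codes.
models = {
    "happiness": ([0.6, 0.4],
                  [[0.7, 0.3], [0.4, 0.6]],
                  [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]),
    "anger":     ([0.5, 0.5],
                  [[0.6, 0.4], [0.3, 0.7]],
                  [[0.1, 0.2, 0.7], [0.6, 0.3, 0.1]]),
}

def recognize(obs):
    # Pick the emotion whose HMM assigns the sequence the highest likelihood.
    return max(models, key=lambda e: forward_log_likelihood(obs, *models[e]))
```

In a real system the per-emotion HMMs would be trained (e.g. with Baum-Welch) on the gesture database the paper describes.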

Development of Emotion-Based Human Interaction Method for Intelligent Robot (지능형 로봇을 위한 감성 기반 휴먼 인터액션 기법 개발)

  • Joo, Young-Hoon;So, Jea-Yun;Sim, Kee-Bo;Song, Min-Kook;Park, Jin-Bae
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.5 / pp.587-593 / 2006
  • This paper presents gesture analysis for human-robot interaction. Understanding human emotions through gesture is one of the skills computers need in order to interact intelligently with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. For efficient operation, gestures are recognized with a Hidden Markov Model (HMM). We constructed a large gesture database, with which we verified our method. As a result, our method was successfully integrated and operated in a mobile system.

The Effects of Color Hue-Tone on Recognizing Emotions of Characters in the Film, Les Misérables

  • Kim, Yu-Jin
    • Science of Emotion and Sensibility / v.18 no.1 / pp.67-78 / 2015
  • This study investigated, through three practical experiments, whether people experience a correspondence between color hue-tone and the main characters' emotions in the 2012 British musical drama film Les Misérables. Six screen images representing the characters' different emotions (Parrott's six primary types: love, joy, surprise, anger, sadness, and fear) were selected. For each screen image, participants were asked to judge the degree of the character's dominant emotion evoked by 17 varied screen images, consisting of the original chromatic and achromatized images as well as 15 color-filtered images (5 hues × 3 tones of the IRI color system). These tasks revealed that a chromatic color scheme is more effective than an achromatic one for conveying the characters' positive emotions (i.e., love and joy). In addition, they showed that the hue and tone dimensions partially influence the relationships between the character emotions and colors.

RECOGNIZING SIX EMOTIONAL STATES USING SPEECH SIGNALS

  • Kang, Bong-Seok;Han, Chul-Hee;Youn, Dae-Hee;Lee, Chungyong
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2000.04a / pp.366-369 / 2000
  • This paper examines three algorithms for recognizing a speaker's emotion from speech signals. Target emotions are happiness, sadness, anger, fear, boredom, and the neutral state. MLB (Maximum-Likelihood Bayes), NN (Nearest Neighbor), and HMM (Hidden Markov Model) algorithms are used as the pattern-matching techniques. In all cases, pitch and energy are used as the features. The feature vectors for MLB and NN are composed of the pitch mean, pitch standard deviation, energy mean, energy standard deviation, etc. For HMM, vectors of delta pitch with delta-delta pitch and delta energy with delta-delta energy are used. We recorded a corpus of emotional speech data and performed a subjective evaluation of the data. The subjective recognition rate was 56% and was compared with the classifiers' recognition rates. The MLB, NN, and HMM classifiers achieved recognition rates of 68.9%, 69.3%, and 89.1%, respectively, for speaker-dependent, context-independent classification.

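The MLB/NN feature set described above (per-utterance pitch and energy statistics) can be sketched with a toy nearest-neighbour rule. The reference prototype vectors below are invented toy values, not the paper's data.

```python
import math
from statistics import mean, pstdev

def features(pitch, energy):
    """[pitch mean, pitch std, energy mean, energy std] for one utterance."""
    return [mean(pitch), pstdev(pitch), mean(energy), pstdev(energy)]

def nn_classify(vec, references):
    """1-nearest-neighbour rule over labelled reference feature vectors."""
    return min(references, key=lambda label: math.dist(vec, references[label]))

references = {  # hypothetical per-emotion prototypes (Hz, Hz, rel., rel.)
    "anger":   [220.0, 40.0, 0.8, 0.2],
    "sadness": [150.0, 10.0, 0.3, 0.05],
}
```

A production system would normalize the feature dimensions before taking distances, since pitch (Hz) and energy live on very different scales.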

The Accuracy of Recognizing Emotion From Korean Standard Facial Expression (한국인 표준 얼굴 표정 이미지의 감성 인식 정확률)

  • Lee, Woo-Ri;Whang, Min-Cheol
    • The Journal of the Korea Contents Association / v.14 no.9 / pp.476-483 / 2014
  • The purpose of this study was to create suitable images for Korean emotional expressions. KSFI (Korean Standard Facial Image) AUs were produced from standard Korean appearance data and FACS (Facial Action Coding System) AUs. To establish the objectivity of the KSFI, a survey examined the emotion recognition rate and the contribution of facial elements to emotion recognition for the six basic emotional expression images (sadness, happiness, disgust, fear, anger, and surprise). The experiment showed that the images of happiness, surprise, sadness, and anger had higher accuracy. Also, the emotion recognition rate was mainly determined by the facial elements of the eyes and the mouth. Based on these results, KSFI content that can combine AU images is proposed. In the future, the KSFI should be helpful content for improving emotion recognition rates.

Emotion Recognition Using Template Vector and Neural-Network (형판 벡터와 신경망을 이용한 감성인식)

  • Joo, Young-Hoon;Oh, Jae-Heung
    • Journal of the Korean Institute of Intelligent Systems / v.13 no.6 / pp.710-715 / 2003
  • In this paper, we propose a new method for intelligently recognizing human emotion using template vectors and a neural network. In the proposed method, emotion is divided into four categories (surprise, anger, happiness, sadness). The method is based on template vector extraction and template location recognition using color differences. It is not easy to extract the skin color area correctly using a single color space. To solve this problem, we propose an extraction method that uses several color spaces and their respective template vectors. We then apply the back-propagation algorithm to the template vectors among the feature points. Finally, we show the practical applicability of the proposed method.
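The final classification step above (template-derived feature vectors mapped to four emotions by a network trained with gradient descent) can be sketched with a single-layer softmax classifier, a simplified stand-in for the paper's back-propagation network. Feature values, labels, and hyperparameters are all invented toy choices.

```python
import math

EMOTIONS = ["surprise", "anger", "happiness", "sadness"]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]  # shift by max for numerical stability
    s = sum(e)
    return [v / s for v in e]

def train(samples, labels, n_in, n_out, lr=0.3, epochs=500):
    """SGD on cross-entropy for a linear softmax classifier."""
    W = [[0.0] * n_in for _ in range(n_out)]
    b = [0.0] * n_out
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = softmax([sum(w * v for w, v in zip(W[k], x)) + b[k]
                         for k in range(n_out)])
            for k in range(n_out):
                g = p[k] - (1.0 if k == y else 0.0)  # dLoss/dlogit_k
                b[k] -= lr * g
                for i in range(n_in):
                    W[k][i] -= lr * g * x[i]
    return W, b

def predict(W, b, x):
    p = softmax([sum(w * v for w, v in zip(W[k], x)) + b[k]
                 for k in range(len(W))])
    return EMOTIONS[p.index(max(p))]
```

The real feature vectors would come from the template-vector extraction stage (e.g. distances between template locations), and the paper's multi-layer network would replace this linear model.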

Speaker-Dependent Emotion Recognition For Audio Document Indexing

  • Hung LE Xuan;QUENOT Georges;CASTELLI Eric
    • Proceedings of the IEEK Conference / summer / pp.92-96 / 2004
  • Emotion is currently of great interest in speech processing as well as in the human-machine interaction domain. In recent years, more and more research on emotion synthesis and emotion recognition has been carried out for different purposes, each approach using its own methods and its own parameters measured on the speech signal. In this paper, we propose using a short-time parameter, MFCC (Mel-Frequency Cepstrum Coefficients), and a simple but efficient classification method, Vector Quantization (VQ), for speaker-dependent emotion recognition. Many other features (energy, pitch, zero crossing, phonetic rate, LPC, ...) and their derivatives were also tested and combined with the MFCC coefficients in order to find the best combination. Other models, GMM and HMM (discrete and continuous Hidden Markov Models), were studied as well, in the hope that continuous distributions and the temporal behavior of this feature set would improve the quality of emotion recognition. The maximum accuracy in recognizing five different emotions exceeds 88% using only MFCC coefficients with the VQ model. This simple but efficient approach even outperforms human evaluation on the same database, where listeners judged sentences without being able to replay or compare them [8], and the result compares favorably with other approaches.

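The VQ classification idea above can be sketched as follows: each emotion keeps a codebook of prototype MFCC vectors (in practice trained with k-means/LBG on that emotion's frames), and an utterance is assigned to the codebook with the least average distortion. The 2-D codebooks and frames below are invented toy values, not real MFCCs.

```python
import math

# Hypothetical per-emotion codebooks of prototype feature vectors.
codebooks = {
    "happiness": [[1.0, 2.0], [1.5, 2.5]],
    "anger":     [[-1.0, 0.0], [-1.5, 0.5]],
}

def distortion(frames, codebook):
    """Average distance from each frame to its nearest codeword."""
    return sum(min(math.dist(f, c) for c in codebook)
               for f in frames) / len(frames)

def vq_classify(frames):
    # Least average distortion over all frames decides the emotion.
    return min(codebooks, key=lambda e: distortion(frames, codebooks[e]))
```

Real MFCC vectors are typically 12-13 dimensional per frame; the frame-wise averaging is what makes the method robust to utterance length.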

Vision System for NN-based Emotion Recognition (신경회로망 기반 감성 인식 비젼 시스템)

  • Lee, Sang-Yun;Kim, Sung-Nam;Joo, Young-Hoon;Park, Chang-Hyun;Sim, Kwee-Bo
    • Proceedings of the KIEE Conference / 2001.07d / pp.2036-2038 / 2001
  • In this paper, we propose a neural-network-based method for intelligently recognizing human emotion using a vision system. In the proposed method, emotion is divided into four categories (surprise, anger, happiness, sadness). We use R, G, B (red, green, blue) color image data together with gray image data to achieve highly reliable feature-point extraction. For this, we propose an algorithm that extracts four feature points (eyebrow, eye, nose, mouth) from the face image acquired by a color CCD camera and derives feature vectors from them. We then apply the back-propagation algorithm to the secondary feature vector (positions of, and distances among, the feature points). Finally, we show the practical applicability of the proposed method.


Korean Facial Expression Emotion Recognition based on Image Meta Information (이미지 메타 정보 기반 한국인 표정 감정 인식)

  • Hyeong Ju Moon;Myung Jin Lim;Eun Hee Kim;Ju Hyun Shin
    • Smart Media Journal / v.13 no.3 / pp.9-17 / 2024
  • Due to the recent pandemic and the development of ICT, the use of non-face-to-face and unmanned systems is expanding, and understanding emotions in non-face-to-face communication has become very important. Since emotion recognition methods for various facial expressions are required, artificial-intelligence-based research is being conducted to improve facial expression emotion recognition on image data. However, existing research on facial expression emotion recognition requires high computing power and long training times because it uses large amounts of data to improve accuracy. To address these limitations, this paper proposes a method of recognizing facial expressions with only a small amount of data by using age and gender, which are image meta information. For facial expression emotion recognition, faces were detected in the original image data with the Yolo Face model, age and gender were classified with a VGG model based on the image meta information, and seven emotions were then recognized with an EfficientNet model. Comparing the proposed meta-information-based data classification model with a model trained on all the data showed higher accuracy for the proposed model.
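The meta-information routing in the pipeline above can be sketched as follows: classify age group and gender first, then hand the face crop to an emotion model trained only on that subgroup. The group boundaries and model-key naming here are invented stand-ins for the paper's trained networks.

```python
def age_group(age):
    """Hypothetical coarse age buckets for subgroup routing."""
    return "young" if age < 30 else "middle" if age < 60 else "senior"

def select_model(age, gender):
    """Pick the per-subgroup emotion model key (stand-in for a trained net)."""
    return f"efficientnet_{age_group(age)}_{gender}"
```

Routing to a smaller per-subgroup model is what lets the approach train on less data than a single model covering every demographic at once.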

Development of Bio-sensor-Based Feature Extraction and Emotion Recognition Model (바이오센서 기반 특징 추출 기법 및 감정 인식 모델 개발)

  • Cho, Ye Ri;Pae, Dong Sung;Lee, Yun Kyu;Ahn, Woo Jin;Lim, Myo Taeg;Kang, Tae Koo
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.11 / pp.1496-1505 / 2018
  • Emotion recognition technology is necessary for human-computer interaction, since in many cases one cannot communicate without considering emotion. As such, emotion recognition technology is an essential element in the field of communication and is highly applicable in various fields. Various bio-sensors can be used to measure human emotions. This paper proposes a system for recognizing human emotions using two physiological sensors. For emotion classification, Russell's two-dimensional emotional model was used, and a personality-based classification method was proposed by extracting sensor-specific features. In addition, the emotional model was divided into four emotions using the Support Vector Machine classification algorithm. Finally, the proposed emotion recognition system was evaluated through a practical experiment.
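The four-way split of Russell's valence-arousal model mentioned above can be illustrated with a simple quadrant rule. The paper trains an SVM on bio-sensor features; here a threshold rule on hypothetical (valence, arousal) scores in [-1, 1] merely illustrates the label space, and the quadrant names are illustrative.

```python
def russell_quadrant(valence, arousal):
    """Map a (valence, arousal) point to one of Russell's four quadrants."""
    if valence >= 0 and arousal >= 0:
        return "happy/excited"      # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry/stressed"     # low valence, high arousal
    if valence < 0:
        return "sad/depressed"      # low valence, low arousal
    return "calm/relaxed"           # high valence, low arousal
```

In the paper's setting, the SVM learns the (generally non-axis-aligned) decision boundaries between these quadrants from the extracted sensor features rather than using fixed thresholds.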