• Title/Summary/Keyword: Emotion Classification


Measurement of Human Sensibility by Bio-Signal Analysis (생체신호 분석을 통한 인간감성의 측정)

  • Park, Joon-Young;Park, Jahng-Hyon;Park, Ji-Hyoung;Park, Dong-Soo
    • Proceedings of the KSME Conference
    • /
    • 2003.04a
    • /
    • pp.935-939
    • /
    • 2003
  • Emotion recognition is one of the most significant interface technologies for enabling high-level human-machine communication. The central nervous system, when stimulated by emotional stimuli, affects the autonomic nervous system, including the heart, blood vessels, and endocrine organs. Therefore, bio-signals such as HRV, ECG, and EEG can reflect one's emotional state. This study investigates the correlation between emotional states and bio-signals in order to realize emotion recognition. It also covers the classification of human emotional states, the selection of effective bio-signals, and signal processing. The experimental results presented in this paper show the feasibility of emotion recognition.


Analyzing and classifying emotional flow of story in emotion dimension space (정서 차원 공간에서 소설의 지배 정서 분석 및 분류)

  • Rhee, Shin-Young;Ham, Jun-Seok;Ko, Il-Ju
    • Korean Journal of Cognitive Science
    • /
    • v.22 no.3
    • /
    • pp.299-326
    • /
    • 2011
  • Texts such as stories, blogs, chats, messages, and reviews have an overall emotional flow. If we compare the similarity of emotional flows between texts, texts with similar flows can be grouped together, which can be used for applications such as recommendation and opinion collection. In this paper, we extract emotion terms from a text sequentially and analyze them in the pleasantness-unpleasantness and activation dimensions in order to identify the text's emotional flow. To analyze the 'dominant emotion', the overall emotional flow of the text, we add time as a dimension representing the text's sequential flow and analyze the emotional flow in a three-dimensional space: pleasantness-unpleasantness, activation, and time. We also propose a classification method that computes the similarity of emotional flows using the Euclidean distance in this three-dimensional space. With the proposed method, we analyze the dominant emotion in Korean modern short stories and classify stories with similar dominant emotions.

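The three-dimensional distance comparison described above can be sketched as follows; the trajectories, function name, and values are illustrative assumptions, not the authors' data:

```python
import math

# Each story's emotional flow: a sequence of (pleasantness, activation, time)
# points in the paper's three-dimensional emotion space.
def flow_distance(flow_a, flow_b):
    """Mean Euclidean distance between two equal-length emotional flows."""
    assert len(flow_a) == len(flow_b)
    total = 0.0
    for (p1, a1, t1), (p2, a2, t2) in zip(flow_a, flow_b):
        total += math.sqrt((p1 - p2)**2 + (a1 - a2)**2 + (t1 - t2)**2)
    return total / len(flow_a)

story_x = [(0.8, 0.2, 0.0), (0.1, 0.7, 0.5), (-0.6, 0.9, 1.0)]  # joy -> tension
story_y = [(0.7, 0.3, 0.0), (0.2, 0.6, 0.5), (-0.5, 0.8, 1.0)]  # similar flow
story_z = [(-0.9, 0.1, 0.0), (-0.4, 0.2, 0.5), (0.6, 0.3, 1.0)] # opposite flow

print(flow_distance(story_x, story_y) < flow_distance(story_x, story_z))  # True
```

Classification then amounts to grouping texts whose pairwise flow distances are small.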

Emotion Recognition using Facial Thermal Images

  • Eom, Jin-Sup;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.3
    • /
    • pp.427-435
    • /
    • 2012
  • The aim of this study is to investigate facial temperature changes induced by facial expression and emotional state in order to recognize a person's emotion using facial thermal images. Background: Facial thermal images have two advantages over visual images. First, facial temperature measured by a thermal camera does not depend on skin color, darkness, or lighting conditions. Second, facial thermal images change not only with facial expression but also with emotional state. To our knowledge, no previous study has concurrently investigated these two sources of facial temperature change. Method: 231 students participated in the experiment. Four kinds of stimuli inducing anger, fear, boredom, and a neutral state were presented to participants, and facial temperatures were measured with an infrared camera. Each stimulus consisted of a baseline period and an emotion period; the baseline period lasted 1 min and the emotion period 1~3 min. In the data analysis, the temperature differences between the baseline and emotion states were analyzed. The eyes, mouth, and glabella were selected as facial expression features, and the forehead, nose, and cheeks as emotional state features. Results: The temperatures of the eye, mouth, glabella, forehead, and nose areas decreased significantly during the emotional experience, and the changes differed significantly by the kind of emotion. Linear discriminant analysis for emotion recognition showed a correct classification rate of 62.7% across the four emotions when using both facial expression features and emotional state features. The accuracy decreased slightly but significantly to 56.7% when using only facial expression features, and to 40.2% when using only emotional state features. Conclusion: Facial expression features are essential for emotion recognition, but emotional state features are also important for classifying emotion.
Application: The results of this study can be applied to human-computer interaction systems in workplaces or automobiles.
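
As a rough illustration of classifying emotions from temperature-change features, the sketch below uses a nearest-centroid rule in place of the paper's linear discriminant analysis; the region order, centroid values, and sample are invented:

```python
import math

# Hypothetical per-emotion mean temperature changes (degC) over six facial
# regions: eyes, mouth, glabella (expression features) and forehead, nose,
# cheek (emotional-state features). All values are invented for illustration.
centroids = {
    "anger":   [-0.30, -0.25, -0.20, -0.15, -0.35, -0.10],
    "fear":    [-0.10, -0.40, -0.05, -0.30, -0.20, -0.25],
    "boredom": [-0.05, -0.10, -0.02, -0.05, -0.08, -0.03],
}

def classify(sample):
    """Assign the emotion whose centroid is nearest in Euclidean distance
    (a simpler stand-in for the paper's linear discriminant analysis)."""
    def dist(c):
        return math.sqrt(sum((s - v)**2 for s, v in zip(sample, c)))
    return min(centroids, key=lambda e: dist(centroids[e]))

print(classify([-0.28, -0.24, -0.18, -0.14, -0.33, -0.12]))  # anger-like sample
```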

Acoustic parameters for induced emotion categorizing and dimensional approach (자연스러운 정서 반응의 범주 및 차원 분류에 적합한 음성 파라미터)

  • Park, Ji-Eun;Park, Jeong-Sik;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.16 no.1
    • /
    • pp.117-124
    • /
    • 2013
  • This study examined how precisely the MFCC, LPC, energy, and pitch-related parameters of speech data, which have mainly been used in voice recognition systems, can predict vocal emotion categories as well as the dimensions of vocal emotion. 110 college students participated in the experiment. For more realistic emotional responses, we used well-defined emotion-inducing stimuli. The study analyzed the relationship between the MFCC, LPC, energy, and pitch parameters of the speech data and four emotional dimensions (valence, arousal, intensity, and potency), because the dimensional approach is more useful for realistic emotion classification. The best vocal cue parameters for predicting each dimension were identified by stepwise multiple regression analysis. Emotion categorization accuracy analyzed by LDA was 62.7%, and all four dimensional regression models were statistically significant, p<.001. Consequently, these results show that the parameters could also be applied to spontaneous vocal emotion recognition.

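A minimal sketch of the dimensional-prediction idea: fitting an emotion dimension from acoustic features with ordinary least squares. The feature values and weights are synthetic; the paper instead selects predictors by stepwise multiple regression over MFCC, LPC, energy, and pitch parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                 # 4 acoustic features per utterance
true_w = np.array([0.8, -0.5, 0.3, 0.0])      # last feature carries no signal
y = X @ true_w + rng.normal(scale=0.05, size=100)   # e.g. rated arousal

# Ordinary least squares with an intercept column.
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w[:4])   # close to true_w; the irrelevant feature's weight is near zero
```

A stepwise procedure would additionally drop the near-zero predictor before refitting.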

Robust Real-time Tracking of Facial Features with Application to Emotion Recognition (안정적인 실시간 얼굴 특징점 추적과 감정인식 응용)

  • Ahn, Byungtae;Kim, Eung-Hee;Sohn, Jin-Hun;Kweon, In So
    • The Journal of Korea Robotics Society
    • /
    • v.8 no.4
    • /
    • pp.266-272
    • /
    • 2013
  • Facial feature extraction and tracking are essential steps in human-robot interaction (HRI) applications such as face recognition, gaze estimation, and emotion recognition. The active shape model (ASM) is one of the most successful generative models for extracting facial features. However, ASM alone is not adequate for modeling a face in real applications, because the positions of facial features are extracted unstably due to the limited number of iterations in the ASM fitting algorithm. Inaccurate facial feature positions decrease the performance of emotion recognition. In this paper, we propose a real-time facial feature extraction and tracking framework for emotion recognition that combines ASM with LK optical flow, which is well suited to estimating time-varying geometric parameters in sequential face images. In addition, we introduce a straightforward method to avoid tracking failure caused by partial occlusions, which can be a serious problem for tracking-based algorithms. Emotion recognition experiments with k-NN and SVM classifiers show over 95% classification accuracy for three emotions: "joy", "anger", and "disgust".
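
The k-NN classification step can be sketched over toy feature vectors; the three-dimensional features and training samples below are invented stand-ins for the geometric features that the ASM + LK optical flow pipeline would actually produce:

```python
import math
from collections import Counter

# Toy k-NN over facial-feature vectors (e.g., normalized distances between
# tracked landmarks). Features and labels are invented for illustration.
train = [
    ([0.9, 0.1, 0.2], "joy"), ([0.8, 0.2, 0.1], "joy"),
    ([0.1, 0.9, 0.3], "anger"), ([0.2, 0.8, 0.4], "anger"),
    ([0.3, 0.2, 0.9], "disgust"), ([0.2, 0.3, 0.8], "disgust"),
]

def knn_predict(x, k=3):
    """Majority vote among the k nearest training samples."""
    nearest = sorted(train, key=lambda s: math.dist(x, s[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict([0.85, 0.15, 0.15]))  # "joy"
```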

Emotion Prediction of Document using Paragraph Analysis (문단 분석을 통한 문서 내의 감정 예측)

  • Kim, Jinsu
    • Journal of Digital Convergence
    • /
    • v.12 no.12
    • /
    • pp.249-255
    • /
    • 2014
  • Recently, the creation and sharing of information has progressed actively through SNS (Social Network Services) such as Twitter and Facebook. It is necessary to extract knowledge from this aggregated information, and data mining is one knowledge-based approach. In particular, emotion analysis is a recent subdiscipline of text classification concerned with extracting massive collective intelligence about opinions, policies, propensities, and sentiment. In this paper, we propose an emotion prediction method that extracts significant keywords and related keywords from SNS paragraphs and then predicts the emotion using these extracted emotion features.

A Study on the Emotion Analysis of Instagram Using Images and Hashtags (이미지와 해시태그를 이용한 인스타그램의 감정 분석 연구)

  • Jeong, Dahye;Gim, Jangwon
    • The Journal of Korean Institute of Information Technology
    • /
    • v.17 no.9
    • /
    • pp.123-131
    • /
    • 2019
  • Social network service users actively express and share their feelings about social issues and content of interest through postings. As a result, the sharing of emotions among individuals and community members on social networks is spreading rapidly, leading to active research on emotion analysis of user postings. However, research on emotion analysis of postings containing multiple emotions remains insufficient. In this paper, we propose a method that analyzes the emotions of Instagram posts using hashtags and images. The method extracts the representative emotion from user posts containing multiple emotions with 66.4% accuracy and 81.7% recall, improving emotion classification performance compared to the previous method.

Speech emotion recognition through time series classification (시계열 데이터 분류를 통한 음성 감정 인식)

  • Kim, Gi-duk;Kim, Mi-sook;Lee, Hack-man
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2021.07a
    • /
    • pp.11-13
    • /
    • 2021
  • In this paper, we propose speech emotion recognition through time-series classification. Features are extracted from speech files using mel-spectrograms and converted into multivariate time-series data, which are then used to train a deep learning model combining Conv1D, GRU, and Transformer layers. Applying this model to the speech emotion recognition datasets TESS, SAVEE, RAVDESS, and EmoDB produced higher speech emotion classification accuracy than existing models on each dataset: 99.60%, 99.32%, 97.28%, and 99.86%, respectively.

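The waveform-to-multivariate-time-series step can be sketched as below; the coarse band-pooling filterbank is an illustrative stand-in for a true mel filterbank, and the frame parameters are assumptions:

```python
import numpy as np

# Turn a waveform into a multivariate time series of spectral features, as
# the paper does with mel-spectrograms before the Conv1D/GRU/Transformer model.
def spectrogram_series(signal, frame_len=256, hop=128, n_bands=8):
    frames = [signal[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    mags = np.abs(np.fft.rfft(frames, axis=1))   # (n_frames, frame_len//2 + 1)
    # Pool FFT bins into n_bands coarse bands: one feature vector per frame.
    bands = np.array_split(mags, n_bands, axis=1)
    return np.stack([b.mean(axis=1) for b in bands], axis=1)

sr = 8000
t = np.arange(sr) / sr
wave = np.sin(2 * np.pi * 440 * t)   # 1 s of a 440 Hz tone
series = spectrogram_series(wave)
print(series.shape)                  # (61, 8): one 8-band vector per frame
```

Each row is one frame's feature vector, so the result drops straight into a sequence model such as the Conv1D/GRU/Transformer stack described above.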

Classification of Human Sense Indexes Based on G7 HAN Project (G7 감성공학기반사업에 기초한 감성지표 분류체계에 관한 연구)

  • 이지혜;김진호;박수찬;이상태
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 2001.05a
    • /
    • pp.304-307
    • /
    • 2001
  • In this study, the 231 indexes produced by the G7 human sensibility ergonomics research were classified and organized into a systematic structure so that users can access them easily, and the resulting classification system was made available on the web, so that even users with little knowledge of sensibility ergonomics can search for and use the indexes according to the classification system. The results of this study are expected to improve the usability of searching and browsing sensibility indexes on the sensibility ergonomics web site currently in service (http://www.gamsung.or.kr).


CLASSIFICATION OF BINARY DECISION RESPONSES USING EEG (뇌파를 이용한 양분법적 판단반응의 분류)

  • 문성실;최상섭;류창수;손진훈
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 1999.03a
    • /
    • pp.281-284
    • /
    • 1999
  • This study was conducted as basic research toward implementing a brain-wave interface by developing a technique for identifying simple decision responses from human EEG. Participants viewed a problem on a computer screen and, when presented with an answer, made a binary decision response as to whether it was correct or incorrect, while their EEG was recorded simultaneously. Comparing the EEG of affirmative ('correct') and negative ('incorrect') responses, the relative power of theta and fast alpha waves at the frontal sites fp1, f3, and f4 was statistically significantly greater for negative responses than for affirmative ones.
