• Title/Summary/Keyword: Emotion Classification

Comparison of EEG Feature Vector for Emotion Classification according to Music Listening (음악에 따른 감정분류을 위한 EEG특징벡터 비교)

  • Lee, So-Min; Byun, Sung-Woo; Lee, Seok-Pil
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.5 / pp.696-702 / 2014
  • Recently, research using EEG to analyze the relationship between emotional states and musical stimuli has been increasing. The selection of feature vectors is very important for the performance of EEG pattern classifiers. This paper compares EEG feature vectors for emotion classification according to music listening. We extract feature vectors such as DAMV, IAV, LPC, and LPCC from EEG signals in each class related to music listening, and compare the separability of the extracted feature vectors using the Bhattacharyya distance. Based on this comparison, more effective feature vectors are recommended for emotion classification according to music listening.
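A minimal sketch, assuming per-class feature vectors are modeled as multivariate Gaussians (the paper's exact settings are not given here): two of the cited time-domain features, DAMV and IAV, in their standard definitions, plus the Bhattacharyya distance used to rank separability.

```python
import numpy as np

def damv(x):
    # Difference Absolute Mean Value: mean absolute first difference
    return np.mean(np.abs(np.diff(x)))

def iav(x):
    # Integrated Absolute Value: sum of absolute amplitudes
    return np.sum(np.abs(x))

def bhattacharyya(mu1, cov1, mu2, cov2):
    # Bhattacharyya distance between two multivariate Gaussians:
    # D = 1/8 (m1-m2)^T S^-1 (m1-m2) + 1/2 ln(det S / sqrt(det S1 det S2)),
    # with S = (S1 + S2) / 2; larger D means better class separability.
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    _, logdet = np.linalg.slogdet(cov)
    _, logdet1 = np.linalg.slogdet(cov1)
    _, logdet2 = np.linalg.slogdet(cov2)
    term2 = 0.5 * (logdet - 0.5 * (logdet1 + logdet2))
    return term1 + term2
```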

Emotion Classification based on EEG signals with LSTM deep learning method (어텐션 메커니즘 기반 Long-Short Term Memory Network를 이용한 EEG 신호 기반의 감정 분류 기법)

  • Kim, Youmin; Choi, Ahyoung
    • Journal of Korea Society of Industrial Information Systems / v.26 no.1 / pp.1-10 / 2021
  • This study proposed a Long Short-Term Memory network to consider changes in emotion over time, and applied an attention mechanism to weight the emotion states that appear at specific moments. We used 32-channel EEG data from the DEAP database. A 2-level classification experiment (Low, High) and a 3-level classification experiment (Low, Middle, High) were performed on the Valence and Arousal dimensions of the emotion model. The accuracy of the 2-level classification was 90.1% for Valence and 88.1% for Arousal; the accuracy of the 3-level classification was 83.5% for Valence and 82.5% for Arousal.
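A minimal sketch, not the authors' code, of an attention-weighted LSTM classifier; the input shape (batch, time, 32 channels) follows the DEAP setup described above, while the hidden size and the score-based attention layer are assumptions.

```python
import torch
import torch.nn as nn

class AttnLSTM(nn.Module):
    def __init__(self, in_dim=32, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)        # scores each time step
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                        # x: (batch, time, in_dim)
        h, _ = self.lstm(x)                      # h: (batch, time, hidden)
        w = torch.softmax(self.score(h), dim=1)  # attention weights over time
        context = (w * h).sum(dim=1)             # weighted sum of hidden states
        return self.head(context)                # class logits

logits = AttnLSTM()(torch.randn(8, 128, 32))     # e.g. 8 trials, 128 time steps
```

Setting n_classes=3 would give the Low/Middle/High variant of the experiment.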

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun; Sohn, Sun-Ju; Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.5 / pp.617-627 / 2012
  • Objective: The aim of the study was to develop a widely usable Korean dictionary of emotion-expressing idioms, anticipated to assist the development of software that recognizes and responds to verbally expressed human emotions. Method: Through a rigorous and strategic classification process, the idiomatic expressions included in this dictionary were rated in terms of nine different emotions (happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The dictionary included 427 expressions, with approximately two-thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool that contains Korean idiomatic expressions of emotions, provides information on meaning and intensity, and identifies idioms connoting two or more emotions. Application: The findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions and in understanding the intensity associated with various idioms used in emotion expression. In clinical settings, this information may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
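A hypothetical sketch of how one entry of such a dictionary might be represented in software; the idiom, rating scale, and scores below are invented for illustration and are not taken from the published dictionary.

```python
from dataclasses import dataclass, field

EMOTIONS = ("happiness", "sadness", "fear", "anger", "surprise",
            "disgust", "interest", "boredom", "pain")

@dataclass
class IdiomEntry:
    idiom: str
    # assumed intensity rating per emotion; several nonzero entries model
    # idioms that connote two or more emotions at once
    intensity: dict = field(default_factory=dict)

    def primary_emotion(self):
        return max(self.intensity, key=self.intensity.get)

entry = IdiomEntry("가슴이 내려앉다", {"surprise": 6, "fear": 4})
print(entry.primary_emotion())  # -> surprise (secondary: fear)
```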

Hybrid-Feature Extraction for the Facial Emotion Recognition

  • Byun, Kwang-Sub; Park, Chang-Hyun; Sim, Kwee-Bo; Jeong, In-Cheol; Ham, Ho-Sang
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2004.08a / pp.1281-1285 / 2004
  • There are numerous emotions in the human world, and humans express and recognize emotion through various channels, for example the eyes, nose, and mouth. In particular, emotion recognition from facial expressions can be very flexible and robust because it utilizes these various channels. The hybrid-feature extraction algorithm is based on this human process: it uses geometric feature extraction and a color-distribution histogram, and the input emotion is then classified through independent, parallel learning of neural networks. For natural classification of emotion, an advancing two-dimensional emotion space is also introduced in this paper, which enables flexible and smooth classification of emotion.
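A rough sketch of the two feature channels the abstract names, under assumptions about the inputs (RGB face pixels and a handful of landmark points); the landmark names are hypothetical and the paper's actual geometry is not reproduced here.

```python
import numpy as np

def color_histogram(face_pixels, bins=8):
    # face_pixels: (N, 3) RGB values in [0, 255]
    hist, _ = np.histogramdd(face_pixels, bins=(bins,) * 3,
                             range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / hist.sum()                 # normalized color distribution

def geometric_features(landmarks):
    # landmarks: dict mapping a point name to an (x, y) coordinate
    def dist(a, b):
        return float(np.hypot(*np.subtract(landmarks[a], landmarks[b])))
    return np.array([dist("left_eye", "right_eye"),
                     dist("mouth_left", "mouth_right"),
                     dist("nose_tip", "mouth_left")])
```

Each feature vector would then feed its own network, per the parallel-learning scheme described above.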

Emotion Classification Method Using Various Ocular Features (다양한 눈의 특징 분석을 통한 감성 분류 방법)

  • Kim, Yoonkyoung; Won, Myoung Ju; Lee, Eui Chul
    • The Journal of the Korea Contents Association / v.14 no.10 / pp.463-471 / 2014
  • In this paper, emotion classification was performed using four ocular features extracted from near-infrared camera images. Compared with previous work, the proposed method uses more ocular features, and each feature was validated as significant for emotion classification. To minimize the side effects that visual stimuli would cause on ocular features, auditory stimuli inducing two opposite emotion pairs, "positive-negative" and "arousal-relaxation", were used. The four features adopted for emotion classification were pupil size, pupil accommodation rate, blink frequency, and eye-closed duration, all automatically extracted by lab-made image processing software. As a result, pupil accommodation rate and blink frequency were statistically significant features for classifying arousal-relaxation, and eye-closed duration was the most significant feature for classifying positive-negative.
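A rough sketch, assuming the software emits a per-frame eye-open boolean series; it derives two of the four features (blink frequency and eye-closed duration), with the frame rate as an assumed parameter.

```python
import numpy as np

def blink_stats(eye_open, fps=30.0):
    # eye_open: per-frame booleans, True while the eye is open
    closed = ~np.asarray(eye_open, dtype=bool)
    # a blink starts at each open -> closed transition
    blink_starts = np.flatnonzero(np.diff(closed.astype(int)) == 1)
    duration_s = len(closed) / fps
    blink_freq = len(blink_starts) / duration_s   # blinks per second
    eye_closed_s = closed.sum() / fps             # total eye-closed time
    return blink_freq, eye_closed_s
```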

2D Emotion Classification using Short-Time Fourier Transform of Pupil Size Variation Signals and Convolutional Neural Network (동공크기 변화신호의 STFT와 CNN을 이용한 2차원 감성분류)

  • Lee, Hee-Jae; Lee, David; Lee, Sang-Goog
    • Journal of Korea Multimedia Society / v.20 no.10 / pp.1646-1654 / 2017
  • Pupil size variation cannot be controlled intentionally by the user and includes features such as blink frequency and blink duration, so it is well suited to understanding the user's emotional state. In addition, ocular-feature-based emotion classification methods should be studied for virtual and augmented reality, which are expected to be applied in various fields. In this paper, we propose a novel CNN-based emotion classification method using pupil size variation signals, which carry not only various ocular features but also temporal information. As a result, compared to previous studies using the same database, the proposed method improved arousal and valence classification by 5.99% and 12.98%, respectively.
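A minimal sketch of the pipeline shape, not the paper's network: a pupil-size signal is converted to an STFT magnitude image and passed through a small CNN. The sampling rate, window length, and layer sizes are assumptions.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import stft

fs = 60                                     # assumed eye-tracker sampling rate
pupil = np.random.randn(10 * fs)            # placeholder 10 s pupil-size signal
_, _, Z = stft(pupil, fs=fs, nperseg=64)    # time-frequency representation
spec = torch.tensor(np.abs(Z), dtype=torch.float32)[None, None]  # (1,1,F,T)

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),                        # e.g. 2-class arousal output
)
logits = cnn(spec)
```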

A Study on Emotion Classification using 4-Channel EEG Signals (4채널 뇌파 신호를 이용한 감정 분류에 관한 연구)

  • Kim, Dong-Jun; Lee, Hyun-Min
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.2 / pp.23-28 / 2009
  • This study describes an emotion classification method using two different feature parameters of four-channel EEG signals. One parameter set is the linear prediction coefficients based on AR modeling; the other is the cross-correlation coefficients of the θ, α, and β bands of the FFT spectra. Using the linear prediction coefficients and the cross-correlation coefficients, a classification test for four emotions (anger, sadness, joy, and relaxation) is performed with an artificial neural network. The results showed that the linear prediction coefficients produced better emotion classification than the cross-correlation coefficients of the FFT spectra.
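A sketch of the first feature set under assumptions: linear prediction (AR-model) coefficients computed by the autocorrelation method, with the model order as an assumed parameter.

```python
import numpy as np

def lpc(x, order=8):
    # autocorrelation of the signal up to lag `order`
    x = np.asarray(x, dtype=float)
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
    # solve the Yule-Walker normal equations R a = r for the coefficients
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])
    return np.linalg.solve(R, r[1:])
```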

A Study on the Emotion State Classification using Multi-channel EEG (다중채널 뇌파를 이용한 감정상태 분류에 관한 연구)

  • Kang, Dong-Kee; Kim, Heung-Hwan; Kim, Dong-Jun; Lee, Byung-Chae; Ko, Han-Woo
    • Proceedings of the KIEE Conference / 2001.07d / pp.2815-2817 / 2001
  • This study describes emotion classification using two different feature extraction methods for four-channel EEG signals. One method is linear prediction analysis based on the AR model; the other is cross-correlation coefficients of the θ, α, and β frequency bands. Using the linear prediction coefficients and the cross-correlation coefficients, a classification test for four emotions (anger, sadness, joy, and relaxation) is performed with a neural network. Comparing the two methods, the linear prediction coefficients appear to produce better results for emotion classification than the cross-correlation coefficients of the frequency bands.
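A complementary sketch of the second feature set both of these studies use, under assumptions: cross-correlation coefficients of θ/α/β FFT magnitudes between a pair of simultaneously recorded channels, with band edges and sampling rate assumed.

```python
import numpy as np

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz, assumed

def band_mag(x, fs, lo, hi):
    # FFT magnitudes restricted to one frequency band
    f = np.fft.rfftfreq(len(x), 1 / fs)
    mag = np.abs(np.fft.rfft(x))
    return mag[(f >= lo) & (f < hi)]

def band_xcorr(ch_a, ch_b, fs=256):
    # correlation coefficient of the two channels' spectra per band
    return {name: np.corrcoef(band_mag(ch_a, fs, lo, hi),
                              band_mag(ch_b, fs, lo, hi))[0, 1]
            for name, (lo, hi) in BANDS.items()}
```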

Music Emotion Classification Based On Three-Level Structure (3 레벨 구조 기반의 음악 무드분류)

  • Kim, Hyoung-Gook; Jeong, Jin-Guk
    • The Journal of the Acoustical Society of Korea / v.26 no.2E / pp.56-62 / 2007
  • This paper presents automatic music emotion classification on acoustic data using a three-level structure. The low level extracts timbre and rhythm features. The middle level estimates indication functions that represent the emotion probability of a single analysis unit. The high level predicts the emotion result from the indication function values. Experiments are carried out on 695 homogeneous music pieces labeled with four emotions: pleasant, calm, sad, and excited. Three machine learning methods, GMM, MLP, and SVM, are compared at the high level; the best result, 90.16%, is obtained with the MLP.
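A sketch of the high-level step under assumptions: per-unit indication-function values (class probabilities from the middle level) are aggregated over a track and the emotion with the highest aggregate value is predicted. Simple averaging stands in here for the GMM/MLP/SVM models the paper compares.

```python
import numpy as np

EMOTIONS = ["pleasant", "calm", "sad", "excited"]

def predict_track(unit_probs):
    # unit_probs: (n_units, 4) middle-level emotion probabilities
    track_level = unit_probs.mean(axis=0)         # aggregate over units
    return EMOTIONS[int(np.argmax(track_level))]

units = np.random.dirichlet(np.ones(4), size=20)  # placeholder middle level
print(predict_track(units))
```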