• Title/Summary/Keyword: Emotion Classification

Classification Scheme using Emotional Elements for Abstract Computer-Generated Images (감성 요소에 기반한 추상 CGI의 분류)

  • Seo, Dong-Su; Choi, Min-Young
    • Science of Emotion and Sensibility / v.14 no.2 / pp.293-300 / 2011
  • CGI (Computer-Generated Image) techniques provide designers with an effective means of creating design artifacts automatically. Two important activities in applying CGI techniques are image generation and the management of the generated images. Automatic generation lets designers produce free-style results in a simple way, but along with this benefit it is important to establish well-defined mechanisms for storing vast quantities of auto-generated CGIs. Assigning keywords to abstract images and classifying them is problematic, however, mainly because they lack an analogy to real-world entities. This paper presents a classification scheme for abstract CGIs that applies classification and description criteria from the viewpoint of both design elements and emotional elements. Effective classification and specification can help designers build and retrieve desired images easily, and make the management process simpler and more effective.

Classification between Intentional and Natural Blinks in Infrared Vision Based Eye Tracking System

  • Kim, Song-Yi; Noh, Sue-Jin; Kim, Jin-Man; Whang, Min-Cheol; Lee, Eui-Chul
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.601-607 / 2012
  • Objective: The aim of this study is to classify intentional and natural blinks in a vision-based eye tracking system. With this classification method, we expect that an eye tracking method can be designed that performs well for both navigation and selection interactions. Background: Eye tracking is widely used to increase user immersion and interest by supporting natural user interfaces. Although conventional eye tracking systems handle navigation interaction well by tracking pupil movement, there is no breakthrough method for selection interaction. Method: To determine a classification threshold between intentional and natural blinks, we captured eye images, including intentional and natural blinks, from 12 subjects. By analyzing successive eye images, two features were collected: eye-closed duration and pupil size variation after eye opening. The classification threshold was then determined by SVM (Support Vector Machine) training. Results: The average detection accuracy for intentional blinks was 97.4% in a wearable eye tracking environment; detection accuracy in a non-wearable camera environment was 92.9% with the same SVM classifier. Conclusion: By combining the two features with an SVM, we could implement an accurate selection interaction method in a vision-based eye tracking system. Application: These results may help improve the efficiency and usability of vision-based eye tracking by supporting a reliable selection interaction scheme.
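
    The two-feature SVM decision described in this abstract can be sketched as follows. This is a minimal illustration with synthetic feature values (the paper's actual thresholds and data are not reproduced here); it assumes only that intentional blinks tend to have longer eye-closed durations and larger pupil size variation.

    ```python
    # Sketch of a two-feature SVM blink classifier: eye-closed duration (s)
    # and pupil size variation after eye opening. All values are synthetic.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Assumption: natural blinks are short with small pupil variation,
    # intentional blinks are longer with larger variation.
    natural = np.column_stack([rng.normal(0.15, 0.05, 50), rng.normal(0.05, 0.02, 50)])
    intentional = np.column_stack([rng.normal(0.60, 0.15, 50), rng.normal(0.20, 0.05, 50)])

    X = np.vstack([natural, intentional])
    y = np.array([0] * 50 + [1] * 50)  # 0 = natural, 1 = intentional

    clf = SVC(kernel="linear").fit(X, y)

    # A long closure with large pupil variation classifies as intentional.
    print(clf.predict([[0.7, 0.25]]))  # expect [1]
    ```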

Music classification system through emotion recognition based on regression model of music signal and electroencephalogram features (음악신호와 뇌파 특징의 회귀 모델 기반 감정 인식을 통한 음악 분류 시스템)

  • Lee, Ju-Hwan; Kim, Jin-Young; Jeong, Dong-Ki; Kim, Hyoung-Gook
    • The Journal of the Acoustical Society of Korea / v.41 no.2 / pp.115-121 / 2022
  • In this paper, we propose a music classification system driven by user emotion, using electroencephalogram (EEG) features that appear while listening to music. In the proposed system, the relationship between emotional EEG features extracted from EEG signals and auditory features extracted from music signals is learned by a deep regression neural network. Based on this regression model, the system automatically generates EEG features mapped to the auditory characteristics of the input music and classifies the music by applying these features to an attention-based deep neural network. Experimental results show the music classification accuracy of the proposed automatic music classification framework.
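
    The regression stage described here, mapping auditory features to EEG features so that EEG features can be generated from audio alone at inference time, can be illustrated with a toy linear least-squares fit. The paper uses a deep regression neural network; this sketch, with synthetic feature matrices, only shows the idea of learning such a mapping.

    ```python
    # Toy stand-in for the audio-to-EEG regression stage: fit a linear
    # mapping from auditory features to EEG features with least squares.
    import numpy as np

    rng = np.random.default_rng(1)

    n_clips, n_audio, n_eeg = 200, 8, 4
    A = rng.normal(size=(n_audio, n_eeg))          # unknown "true" mapping (synthetic)
    X_audio = rng.normal(size=(n_clips, n_audio))  # auditory features per music clip
    Y_eeg = X_audio @ A + 0.01 * rng.normal(size=(n_clips, n_eeg))  # observed EEG features

    # Learn the mapping; at inference time, EEG features are generated
    # from the music signal alone and passed to the downstream classifier.
    W, *_ = np.linalg.lstsq(X_audio, Y_eeg, rcond=None)
    eeg_pred = X_audio @ W

    print(np.allclose(eeg_pred, Y_eeg, atol=0.1))  # small residual: True
    ```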

A Study on the Image Scale through the Classification of Emotion in Web Site (웹사이트 사용자 감성유형 분류를 통한 감성척도 연구)

  • Hong, Soo-Youn; Lee, Hyun-Ju; Jin, Ki-Nam
    • Science of Emotion and Sensibility / v.12 no.1 / pp.1-10 / 2009
  • The purpose of this study is to find the relationship between design factors and sensibility in web sites. The classification of sensibility types was based on a literature review, a survey, review by language specialists, and factor analysis. The Image Scale was then derived from the analysis of these sensibility types. The major findings are summarized as follows. Web page sensibility types are classified into 7 types: 'refreshment', 'calm', 'refinement', 'strongness', 'youth', 'uniqueness', and 'futurity'. From the analysis of similarity between the adjectives on multiple criteria, the web site Image Scale space consists of a 'heavy-light' axis and a 'soft-hard' axis. Relating web site design factors to emotion, color and layout strongly influenced 'soft-hard', while light and color strongly influenced 'heavy-light'.

Development of Personalized Media Contents Curation System based on Emotional Information (감성 정보 기반 맞춤형 미디어콘텐츠 큐레이션 시스템 개발)

  • Im, Ji-Hui; Chang, Du-Seong; Choe, Ho-Seop; Ock, Cheol-Young
    • The Journal of the Korea Contents Association / v.16 no.12 / pp.181-191 / 2016
  • We analyzed the search terms for media content in an IPTV service and found that, in customers' media content selection criteria, important factors include not only general meta information but also content (material, plot, etc.) and emotion information. Therefore, to efficiently provide IPTV's diverse media contents to users, we designed an emotion classification system that utilizes the emotion information of media content. We then proposed a 'personalized media contents curation system based on emotion information' that organizes media contents through several processing steps. Finally, to demonstrate the effectiveness of the system, we conducted a user satisfaction survey (72.0 points). In addition, compared with popularity-based recommendations, the proposed system's results led to actual viewing behavior at a 10 times higher rate.

A Comparison of Effective Feature Vectors for Speech Emotion Recognition (음성신호기반의 감정인식의 특징 벡터 비교)

  • Shin, Bo-Ra; Lee, Soek-Pil
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.10 / pp.1364-1369 / 2018
  • Speech emotion recognition, which aims to classify a speaker's emotional state from speech signals, is one of the essential tasks for making human-machine interaction (HMI) more natural and realistic. Voice expression is one of the main information channels in interpersonal communication. However, existing speech emotion recognition technology has not achieved satisfactory performance, probably because of the lack of effective emotion-related features. This paper surveys the various features used for speech emotion recognition and discusses which features, or which combinations of features, are valuable and meaningful for emotion classification. The main aim is to discuss and compare various approaches to feature extraction and to propose a basis for extracting useful features in order to improve SER performance.

Speech Emotion Recognition on a Simulated Intelligent Robot (모의 지능로봇에서의 음성 감정인식)

  • Jang, Kwang-Dong; Kim, Nam; Kwon, Oh-Wook
    • MALSORI / no.56 / pp.173-183 / 2005
  • We propose a speech emotion recognition method for an affective human-robot interface. In the proposed method, emotion is classified into 6 classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from microphones in 5 different directions. Experimental results show that the proposed method yields 48% classification accuracy while human classifiers give 71% accuracy.
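
    The classifier stage in this abstract, a Gaussian (RBF-kernel) SVM over utterance-level statistics of phonetic and prosodic features, can be sketched as below. The feature vectors are synthetic placeholders (the real features would be statistics of log energy, pitch, jitter, shimmer, and so on); only the six emotion classes follow the paper.

    ```python
    # Sketch of an utterance-level Gaussian (RBF) SVM emotion classifier.
    # Each utterance is summarized by 8 synthetic feature statistics.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    classes = ["angry", "bored", "happy", "neutral", "sad", "surprised"]

    # Assumption: each emotion class clusters around its own feature center.
    centers = rng.normal(scale=3.0, size=(6, 8))
    X = np.vstack([c + rng.normal(scale=0.5, size=(30, 8)) for c in centers])
    y = np.repeat(np.arange(6), 30)  # 30 training utterances per class

    clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

    # An utterance near the third class center classifies as "happy".
    print(classes[clf.predict([centers[2]])[0]])  # expect "happy"
    ```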

Speech Emotion Recognition by Speech Signals on a Simulated Intelligent Robot (모의 지능로봇에서 음성신호에 의한 감정인식)

  • Jang, Kwang-Dong; Kwon, Oh-Wook
    • Proceedings of the KSPS conference / 2005.11a / pp.163-166 / 2005
  • We propose a speech emotion recognition method for a natural human-robot interface. In the proposed method, emotion is classified into 6 classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from microphones in 5 different directions. Experimental results show that the proposed method yields 59% classification accuracy while human classifiers give about 50% accuracy, which confirms that the proposed method achieves performance comparable to a human.

Extracting and Clustering of Story Events from a Story Corpus

  • Yu, Hye-Yeon; Cheong, Yun-Gyung; Bae, Byung-Chull
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.10 / pp.3498-3512 / 2021
  • This article describes how events that make up text stories can be represented and extracted. We also address the results from our simple experiment on extracting and clustering events in terms of emotions, under the assumption that different emotional events can be associated with the classified clusters. Each emotion cluster is based on Plutchik's eight basic emotion model, and the attributes of the NLTK-VADER are used for the classification criterion. While comparisons of the results with human raters show less accuracy for certain emotion types, emotion types such as joy and sadness show relatively high accuracy. The evaluation results with NRC Word Emotion Association Lexicon (aka EmoLex) show high accuracy values (more than 90% accuracy in anger, disgust, fear, and surprise), though precision and recall values are relatively low.

The study of Emotion Traits in Sasang Constitution by Several Mood scale (정서 관련 척도를 이용한 사상체질의 감정 특성 요인 연구)

  • Kim, Woo-Chul; Kim, Kyeong-Su; Kim, Kyeong-Ok
    • Journal of Oriental Neuropsychiatry / v.22 no.4 / pp.63-75 / 2011
  • Objectives: One's emotional state is affected by environment and personal relationships; in Oriental Medicine these emotions are called Chiljung. Sasang Constitution sorts each emotion by Nature & Emotion (性情). This study aimed to examine the relations between Sasang Constitution and the emotional traits of oriental medicine students using the EEQ and CISS (the Mood scales). Methods: 199 students of Dongshin University oriental medicine were tested with the Questionnaire for Sasang Constitution Classification II (QSCC II) and the Mood scales. Data from 156 students were used, excluding 43 students' data from the analysis. The 156 students were classified into four groups by the QSCC II, and the degree of emotion was measured by the Mood scales. The data were analyzed by frequency, t-test, ANOVA, multiple comparison, correlation, and regression with SPSS for Windows 15.0. Results: 1. Soeumin scored higher on the EEQ than Soyangin. 2. Sasang constitution made no difference on the CISS, except for emotion-oriented coping in the unclassified group. 3. Task-oriented coping, the EEQ, and the CISS influenced emotional expression by Sasang constitution. Conclusions: Sasang constitution shows significant differences in emotional expression.