• Title/Summary/Keyword: anger and sadness

Search results: 147

A Study on the Human Sensibility Evaluation Technique using 10-channel EEG (10채널 뇌파를 이용한 감성평가 기술에 관한 연구)

  • Kim, Heung-Hwan; Lee, Sang-Han; Kang, Dong-Kee; Kim, Dong-Jun; Ko, Han-Woo
    • Proceedings of the KIEE Conference / 2002.07d / pp.2690-2692 / 2002
  • This paper describes a technique for human sensibility evaluation using 10-channel EEG (electroencephalogram). The proposed method uses linear predictor coefficients as EEG feature parameters and a neural network as the sensibility pattern classifier. For a subject-independent system, multiple templates are stored and the most similar template is selected. EEG signals corresponding to four emotions (relaxation, joy, sadness, and anger) were collected from five amateur performers. The states of relaxation and joy are considered positive sensibility and those of sadness and anger negative. The classification performance of the proposed method is about 72.6%, which is promising for human sensibility evaluation. A minimal code sketch of such a pipeline follows this entry.

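The entry above (and the journal version that follows) describes a pipeline of linear predictor coefficients (LPC) extracted per EEG channel as features and a neural network as the classifier, with multiple stored templates for subject independence. Below is a minimal sketch of such a pipeline, not the authors' implementation; the channel count, epoch length, LPC order, network size, and the synthetic data are all assumptions for illustration.

```python
# Minimal sketch of an LPC-feature + neural-network emotion classifier.
# The 10-channel layout, epoch length, LPC order, network size and the
# synthetic data are assumptions for illustration, not the authors' settings.
import numpy as np
from scipy.linalg import solve_toeplitz
from sklearn.neural_network import MLPClassifier

def lpc_coefficients(x, order=10):
    """Autocorrelation-method LPC coefficients for one EEG channel."""
    x = x - x.mean()
    # Biased autocorrelation up to the model order.
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)])
    # Solve the symmetric Toeplitz normal equations R a = r[1:].
    return solve_toeplitz((r[:order], r[:order]), r[1:order + 1])

def epoch_features(epoch, order=10):
    """Concatenate per-channel LPC coefficients (epoch: channels x samples)."""
    return np.concatenate([lpc_coefficients(ch, order) for ch in epoch])

# Hypothetical data: 200 epochs of 10-channel EEG, 512 samples each,
# labeled with four emotions (0=relaxation, 1=joy, 2=sadness, 3=anger).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 10, 512))
labels = rng.integers(0, 4, size=200)

X = np.array([epoch_features(e) for e in epochs])   # 200 x (10 channels * 10 coeffs)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

The subject-independent, multiple-template idea from the abstracts would sit on top of this: keep one trained model (or reference feature template) per reference subject and select the closest one for a new user.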

A Study on the Human Sensibility Evaluation Technique Using EEGs of 4 Emotions (4가지 감정의 뇌파를 이용한 감성평가 기술에 관한 연구)

  • Kim, Dong-Jun; Kang, Dong-Kee; Kim, Heung-Hwan; Yi, Sang-Han; Go, Han-Woo; Park, Se-Jin
    • The Transactions of the Korean Institute of Electrical Engineers D / v.51 no.11 / pp.528-534 / 2002
  • This paper describes a technique for human sensibility evaluation using EEGs of four emotions. The proposed method uses linear predictor coefficients as EEG feature parameters and a neural network as the sensibility pattern classifier. For a subject-independent system, multiple templates are stored and the most similar template is selected. EEG signals corresponding to four emotions (relaxation, joy, sadness, and anger) were collected from five amateur performers. The states of relaxation and joy are considered positive sensibility and those of sadness and anger negative. The classification performance using the proposed method is about 72.6%, which may be a promising performance for human sensibility evaluation.

DIFFERENTIATION OF BASIC EMOTIONS BY EEG AND AUTONOMIC RESPONSES (뇌파 및 자율신경계 반응특성에 의한 기본정서의 구분)

  • 이경화; 이임갑; 손진훈
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 1999.03a / pp.11-15 / 1999
  • The discrete-state theory of emotion postulates that there exist discrete emotions, such as happiness, anger, fear, disgust, and so forth. Many investigators who emphasize the discreteness of emotions have suggested that discrete emotions entail specific activity patterns in the autonomic nervous system. The purpose of this study was to develop a model of emotion-specific physiological response patterns. The study postulated six emotions (happiness, sadness, anger, disgust, fear, and surprise) as the basic discrete emotions. Thirty-eight college students participated in the study. Twelve slides (2 for each emotion category) were presented to the subjects in random order. During a 30-s resting period prior to the presentation of each slide, four physiological measures (EEG, ECG, EDA, and respiration) were recorded to establish a baseline. The same physiological measures were recorded while each slide was presented for 60 s (producing an emotional state). The subjects were then asked to rate the degree of emotion induced by the slide on semantic differential scales. This procedure was repeated for every slide. Based on the results, a model of emotion-specific physiological response patterns was developed: four emotions (fear, disgust, sadness, and anger) were classified according to the characteristics of EEG and autonomic responses. However, the emotions of happiness and surprise were not distinguished by any combination of the physiological measures employed in this study, suggesting that another appropriate measure should be adopted for their differentiation.


A Convergence Study on Music-color Association Responses of People with Visual Impairment Mediated by Emotion (시각장애인의 정서 기반 음악-색채 연합에 대한 융복합적 연구)

  • Park, Hye-Young
    • Journal of the Korea Convergence Society / v.10 no.5 / pp.313-321 / 2019
  • The purpose of this study was to examine the music-color association responses (MCAR) of people with visual impairment through a music-emotion scale and a music-color scale. The study was conducted on 60 participants (30 with congenital and 30 with adventitious visual impairment) using the services of two welfare centers in cities S and B. Four basic emotions mediated by music (happiness, sadness, anger, and fear) were selected, and MCAR to emotion-inducing music were analyzed through a self-report method. As a result, first, MCAR contrasted between happiness and sadness according to the type of emotion, but were similar for anger and fear. Second, among the three variables of the music-emotion scale (valence, arousal, and intensity), valence was congruent with MCAR according to the type of emotion, arousal scored high for the negative emotions, and intensity scores for happiness and sadness were higher than those for anger and fear. Third, there were no significant differences between the congenital and adventitious groups. This study is meaningful in showing that MCAR can be mediated by music, as investigated in people with visual impairment.

The Congruent Effects of Gesture and Facial Expression of Virtual Character on Emotional Perception: What Facial Expression is Significant? (가상 캐릭터의 몸짓과 얼굴표정의 일치가 감성지각에 미치는 영향: 어떤 얼굴표정이 중요한가?)

  • Ryu, Jeeheon; Yu, Seungbeom
    • The Journal of the Korea Contents Association / v.16 no.5 / pp.21-34 / 2016
  • In the design and development of a virtual character, it is important to correctly deliver target emotions generated by the combination of facial expression and gesture. The purpose of this study was to examine the effect of congruence/incongruence between gesture and facial expression on the target emotion. Four emotions were applied: joy, sadness, fear, and anger. The results showed that the sadness emotion was incorrectly perceived; moreover, it was perceived as anger instead of sadness. Sadness can easily be confused when facial expression and gesture are presented simultaneously. In the other emotional states, however, the intended emotional expressions were correctly perceived. The overall evaluation of the virtual character's emotional expression was significantly low when a joy gesture was combined with a sad facial expression. The results suggest that emotional gestures are more influential in correctly delivering target emotions to users. The study also suggests that social cues such as the gender or age of the virtual character should be studied further.

Modulation of the Time Course of Cardiac Chronotropic Responses during Exposure to Affective Pictures

  • Estate M. Sokhadze; Lee, Kyung-Hwa; Lee, Jong-Mee; Oh, Jong-In; Sohn, Jin-Hun
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2000.04a / pp.290-300 / 2000
  • One of the most important topics in the attentional and emotional modulation of cardiac responses is the time course of the cardiac chronotropic response. The reason lies in the dual innervation of the heart, which leads to several phases of cardiac response during exposure to affective stimuli, determined by the balance of sympathetic and parasympathetic influences. Cardiac chronotropic reactivity thus represents a quite effective measure capable of tracing the moment when attending and orienting processes (i.e., sensory intake of the stimulus) prime the relevant behavioral response (i.e., emotion with approach or avoidance tendencies). The aim of this study was to find the time course of heart rate (HR) responses typical for negative (disgust, surprise, fear, anger) and positive (happiness, pleasant erotic) affective pictures, and to identify cardiac response dissociation for emotions with different action tendencies, such as "approach" (surprise, anger, happiness) and "avoidance" (fear, sadness, disgust). Forty college students participated in this study, in which cardiac responses were recorded to slides from the IAPS intended to evoke basic emotions (surprise, fear, anger, sadness, disgust, happiness, pleasant-erotic). Inter-beat intervals of HR were analyzed in 10-s windows during the 60-s exposure to the affective visual stimuli. The results demonstrated that differentiation was observed in the very first 10 s of exposure (anger-fear, surprise-sad, surprise-erotic, surprise-happiness pairs), reached the peak of dissociation at 30 s (the same pairs plus surprise-disgust and surprise-fear), and was still effective for some pairs (surprise-erotic, surprise-sad) even at 50 s and 60 s. Discussed are the potential cardiac autonomic mechanisms underlying the attention and emotion processes evoked by affective stimulation, and theoretical considerations implicated in understanding the role of differential cardiac reactivity in the behavioral context (e.g., approach-avoidance tendencies, orienting-defense responses). A minimal sketch of the 10-s binning of inter-beat intervals follows this entry.

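The abstract above analyzes heart-rate inter-beat intervals in 10-s windows over a 60-s stimulus. Below is a minimal sketch of that binning step only, assuming inter-beat intervals in milliseconds; the synthetic IBI sequence and window settings are illustrative, not the study's data or exact procedure.

```python
# Minimal sketch of binning heart-rate responses into 10-s windows over a
# 60-s stimulus. The inter-beat intervals (IBIs, in ms) are synthetic.
import numpy as np

def mean_hr_per_window(ibis_ms, window_s=10.0, total_s=60.0):
    """Mean heart rate (bpm) in consecutive windows, from a sequence of IBIs."""
    ibis_ms = np.asarray(ibis_ms, dtype=float)
    beat_times = np.cumsum(ibis_ms) / 1000.0        # time of each beat (s)
    hr = 60000.0 / ibis_ms                          # instantaneous HR (bpm)
    edges = np.arange(0.0, total_s + window_s, window_s)
    means = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_win = (beat_times >= lo) & (beat_times < hi)
        means.append(hr[in_win].mean() if in_win.any() else np.nan)
    return np.array(means)

# Hypothetical 60-s recording: IBIs around 800 ms (~75 bpm) with some jitter.
rng = np.random.default_rng(1)
ibis = rng.normal(800.0, 40.0, size=80)
print(mean_hr_per_window(ibis))   # six values, one per 10-s window
```

Window means relative to a pre-stimulus baseline could then be compared across emotion conditions (e.g., surprise vs. sadness) at 10 s, 20 s, ..., 60 s, which is the kind of comparison the abstract reports.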

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun; Sohn, Sun-Ju; Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.5 / pp.617-627 / 2012
  • Objective: The aim of the study was to develop a dictionary of the most widely used Korean emotion-expressing idioms. This is anticipated to assist the development of software technology that recognizes and responds to verbally expressed human emotions. Method: Through rigorous and strategic classification processes, the idiomatic expressions included in the dictionary were rated in terms of nine different emotions (happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The Korean dictionary of emotion-expressing idioms included 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool that contains Korean idiomatic expressions of emotions, provides information on meaning and intensity, and identifies idioms connoting two or more emotions. Application: The findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions as well as understanding the intensity associated with various idioms used in emotion expression. In clinical settings, the information provided by this research may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
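
The dictionary described above rates each idiom on nine emotions for meaning and intensity, and flags idioms connoting two or more emotions. Below is a minimal sketch of the kind of lookup structure this implies; the entry shown, the rating scale, and the threshold are hypothetical placeholders, not items or values from the actual 427-expression dictionary.

```python
# Minimal sketch of an idiom lookup structure with per-emotion intensity
# ratings. The entry, scale and threshold below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class IdiomEntry:
    idiom: str
    ratings: dict  # emotion label -> rated intensity (scale assumed here)

    def primary_emotion(self):
        """Emotion with the highest rated intensity."""
        return max(self.ratings, key=self.ratings.get)

    def secondary_emotions(self, threshold=3.0):
        """Other emotions rated above the threshold (idioms connoting 2+ emotions)."""
        primary = self.primary_emotion()
        return [e for e, v in self.ratings.items() if e != primary and v >= threshold]

entry = IdiomEntry(idiom="<placeholder idiom>",
                   ratings={"anger": 5.2, "sadness": 3.4, "fear": 1.1})
print(entry.primary_emotion(), entry.secondary_emotions())
# -> anger ['sadness']
```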

An acoustical analysis of emotional speech using close-copy stylization of intonation curve (억양의 근접복사 유형화를 이용한 감정음성의 음향분석)

  • Yi, So Pae
    • Phonetics and Speech Sciences / v.6 no.3 / pp.131-138 / 2014
  • A close-copy stylization of the intonation curve was used for an acoustic analysis of emotional speech. For the analysis, 408 utterances of five emotions (happiness, anger, fear, neutral, and sadness) were processed to extract acoustic feature values. The results show that certain pitch point features (pitch point movement time and pitch point distance within a sentence) and sentence-level features (pitch range of the final pitch point, pitch range of the sentence, and pitch slope of the sentence) are affected by emotion. Pitch point movement time, pitch point distance within a sentence, and pitch slope of the sentence show no significant difference between male and female participants. The emotions with high arousal (happiness and anger) are consistently distinguished from the emotion with low arousal (sadness) in terms of these acoustic features. Emotions with higher arousal show a steeper sentence pitch slope, a steeper pitch slope at the end of the sentence, and a wider sentence pitch range. The analysis implies that measurements of these acoustic features can be used to cluster and identify emotions in speech.
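
The analysis above derives features such as the pitch slope of a sentence and pitch ranges from a close-copy stylized intonation curve, i.e., a small set of time/F0 pitch points. Below is a minimal sketch of computing a few such sentence-level features; the pitch points and feature definitions are illustrative assumptions, not the study's exact measures.

```python
# Minimal sketch of sentence-level intonation features from a close-copy
# stylized contour, i.e., a short sequence of (time, F0) pitch points.
# The pitch points below are illustrative, not data from the study.
import numpy as np

def sentence_features(times_s, f0_hz):
    t = np.asarray(times_s, dtype=float)
    f0 = np.asarray(f0_hz, dtype=float)
    pitch_range = f0.max() - f0.min()                    # Hz
    slope, _ = np.polyfit(t, f0, 1)                      # overall pitch slope, Hz/s
    final_slope = (f0[-1] - f0[-2]) / (t[-1] - t[-2])    # slope of the last movement
    return {"pitch_range_hz": pitch_range,
            "sentence_slope_hz_per_s": slope,
            "final_slope_hz_per_s": final_slope}

# Hypothetical stylized contour for one utterance.
print(sentence_features([0.10, 0.45, 0.90, 1.30, 1.70],
                        [210.0, 250.0, 230.0, 205.0, 160.0]))
```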

A comparison between affective prosodic characteristics observed in children with cochlear implant and normal hearing (인공와우 이식 아동과 정상 청력 아동의 정서적 운율 특성 비교)

  • Oh, Yeong Geon; Seong, Cheoljae
    • Phonetics and Speech Sciences / v.8 no.3 / pp.67-78 / 2016
  • This study examined the affective prosodic characteristics observed in children with cochlear implants (CI) and with normal hearing (NH), along with listeners' perception of them. Speech samples were acquired from 15 NH and 15 CI children. Eight SLPs (speech-language pathologists) perceptually evaluated affective types using Praat's ExperimentMFC. In the acoustic results, there were statistically significant differences between the two groups across affective types: joy (discriminated by intensity deviation), anger (dominantly by intensity-related variables and partly by duration-related variables), and sadness (by all aspects of the prosodic variables). The CI group's utterances were much louder when expressing joy, louder and slower when expressing anger, and higher, louder, and slower for sadness than those of the NH group. The listeners showed a much higher correlation when evaluating NH children than the CI group (p<.001). Chi-square results revealed that listeners did not show coherence for the CI group's utterances but did for those of the NH group (CI: p<.01, NH: p=.48). When CI utterances were discriminated into three emotion types by DA (discriminant analysis) using 8 acoustic variables, speed-related variables such as articulation rate played the primary role.
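
The last step described above discriminates CI utterances into three emotion types by discriminant analysis over eight acoustic variables. Below is a minimal sketch of that kind of analysis using scikit-learn's LinearDiscriminantAnalysis; the synthetic feature matrix, class labels, and the reading of coefficient magnitudes are assumptions for illustration, not the study's data or procedure.

```python
# Minimal sketch of discriminant analysis over acoustic variables. The
# eight feature columns and the synthetic samples are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_per_class = 30
# Hypothetical 8 acoustic variables (e.g., articulation rate, F0 and
# intensity statistics, durations) for three emotion classes.
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, 8))
               for c in (0.0, 0.7, 1.4)])
y = np.repeat(["joy", "anger", "sadness"], n_per_class)

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())

# A rough view of which variables drive the separation: magnitude of the
# per-class discriminant coefficients after fitting.
lda.fit(X, y)
print(np.abs(lda.coef_).round(2))
```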

A Study on Emotional Recognition and Expressive Behavior of Children Aged Four in Institutional Care Through a Role Play Program Using Dolls (인형매체 역할놀이 프로그램에서 나타나는 4세 시설보호유아의 정서인식 및 표현행동 탐색)

  • Yang, Sim-Young
    • Korean Journal of Child Studies / v.33 no.1 / pp.93-109 / 2012
  • The purpose of this study was to explore how children aged four in institutional care perceive and express basic emotions, such as happiness, sadness, anger, and surprise, through a role play program using dolls. The study selected two children, both aged four, currently in institutional care. The children were observed during six role play trials using dolls and were questioned after the trials. The results were: 1) The children best perceived and expressed the emotion of happiness. 2) The children were able to positively change their feelings of sadness and expressive behavior through the intimate relationships they formed with the researcher. 3) The children expressed the emotion of anger throughout the entire role play using dolls and were positively changed by the instructions and coaching given by the researcher. 4) The children had the most difficulty expressing the emotion of surprise. The results of this study could be used as basic data for creating a program intended to help children aged four in institutional care develop their emotions.