• Title/Summary/Keyword: Discrete Emotions

Korean Emotion Vocabulary: Extraction and Categorization of Feeling Words (한국어 감정표현단어의 추출과 범주화)

  • Sohn, Sun-Ju;Park, Mi-Sook;Park, Ji-Eun;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.15 no.1
    • /
    • pp.105-120
    • /
    • 2012
  • This study aimed to develop a Korean emotion vocabulary list that functions as an important tool for understanding human feelings. The focus was on careful extraction of the most widely used feeling words, as well as their categorization into emotion groups according to their meaning in real-life use. A total of 12 professionals (including graduate students majoring in Korean) took part in the study. Using the Korean 'word frequency list' developed by Yonsei University and various sorting processes, the study condensed the original 64,666 emotion words into a final 504 words. In the next step, 80 social work students evaluated each word and classified it into whichever of the following categories seemed most appropriate: 'happiness', 'sadness', 'fear', 'anger', 'disgust', 'surprise', 'interest', 'boredom', 'pain', 'neutral', and 'other'. Findings showed that, of the 504 feeling words, 426 expressed a single emotion, 72 reflected two emotions (i.e., the same word indicating two distinct emotions), and 6 showed three emotions. Among the 426 single-emotion words, 'sadness' was predominant, followed by 'anger' and 'happiness'. The 72 two-emotion words were mostly combinations of 'anger' and 'disgust', followed by 'sadness' and 'fear', and 'happiness' and 'interest'. The significance of the study lies in the development of a highly adaptable list of Korean feeling words that can be combined with other emotion signals, such as facial expressions, to optimize emotion recognition research, particularly in the Human-Computer Interface (HCI) area. The identification of feeling words that connote more than one emotion is also noteworthy.
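
The vote-based categorization step described above (raters assigning each word to a category, with some words ending up in two or three categories) can be sketched as follows; the vote counts and the agreement threshold are hypothetical, chosen only for illustration:

```python
from collections import Counter

def categorize(votes, threshold=0.25):
    """Return the emotion categories supported by at least `threshold`
    of the rater votes for a single word (illustrative rule only)."""
    counts = Counter(votes)
    total = len(votes)
    return sorted(c for c, n in counts.items() if n / total >= threshold)

# A word whose votes split between two categories counts as a
# two-emotion word, like the 72 such words reported in the study.
votes = ["anger"] * 40 + ["disgust"] * 30 + ["sadness"] * 10
print(categorize(votes))  # ['anger', 'disgust']
```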

A Multimodal Emotion Recognition Using the Facial Image and Speech Signal

  • Go, Hyoun-Joo;Kim, Yong-Tae;Chun, Myung-Geun
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.5 no.1
    • /
    • pp.1-6
    • /
    • 2005
  • In this paper, we propose an emotion recognition method using facial images and speech signals. Six basic emotions, including happiness, sadness, anger, surprise, fear, and dislike, are investigated. Facial expression recognition is performed using multi-resolution analysis based on the discrete wavelet transform, with feature vectors obtained through ICA (Independent Component Analysis). For the speech signal, the recognition algorithm is run independently for each wavelet subband, and the final result is obtained from a multi-decision-making scheme. After merging the facial and speech emotion recognition results, we obtained better performance than previous methods.
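
The multi-resolution wavelet step named above can be illustrated with a minimal sketch (a plain Haar decomposition in Python, not the authors' implementation, and without the ICA stage):

```python
import numpy as np

def haar_dwt(x, levels=3):
    """Multi-resolution decomposition of a 1-D signal with the Haar
    wavelet: returns the final approximation and per-level details."""
    x = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass half
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass half
        details.append(detail)
        x = approx
    return x, details

signal = np.sin(np.linspace(0, 8 * np.pi, 64))
approx, details = haar_dwt(signal, levels=3)
# 64 samples halve at each level: details of length 32, 16, 8; approx 8
print(approx.shape, [d.shape for d in details])
```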

Recognition of Emotion and Emotional Speech Based on Prosodic Processing

  • Kim, Sung-Ill
    • The Journal of the Acoustical Society of Korea
    • /
    • v.23 no.3E
    • /
    • pp.85-90
    • /
    • 2004
  • This paper presents two new approaches: one concerned with recognizing speech uttered in emotional states such as anger, happiness, normal, sadness, or surprise, and the other with recognizing the emotion itself from speech. For the proposed speech recognition system handling speech with emotional states, nine kinds of prosodic features were first extracted and then fed to a prosodic identifier. In evaluation, the recognition rates on emotional speech increased more than with the existing speech recognizer. For emotion recognition, on the other hand, four prosodic parameters, namely pitch, energy, and their derivatives, were proposed and trained with discrete duration continuous hidden Markov models (DDCHMM). In this approach, the emotion models were adapted to a specific speaker's speech using maximum a posteriori (MAP) estimation. In evaluation, the recognition rates on vocal emotions gradually increased with the number of adaptation samples.
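
Of the four prosodic parameters mentioned (pitch, energy, and their derivatives), energy and its derivative are simple to sketch; the frame length below is an assumption for illustration, not the paper's setting:

```python
import numpy as np

def frame_energy(signal, frame_len=160):
    """Short-time log energy per frame, a basic prosodic feature
    (160 samples per frame is an illustrative choice)."""
    n = len(signal) // frame_len
    frames = np.asarray(signal[:n * frame_len], dtype=float).reshape(n, frame_len)
    return np.log((frames ** 2).sum(axis=1) + 1e-10)

def delta(track):
    """First-order derivative of a prosodic track (e.g. pitch or energy)."""
    return np.gradient(track)

sig = np.ones(1600)            # placeholder signal: 10 frames of 160 samples
e = frame_energy(sig)
print(e.shape, np.allclose(delta(e), 0.0))
```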

Face Emotion Recognition by Fusion Model based on Static and Dynamic Image (정지영상과 동영상의 융합모델에 의한 얼굴 감정인식)

  • Lee Dae-Jong;Lee Kyong-Ah;Go Hyoun-Joo;Chun Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.5
    • /
    • pp.573-580
    • /
    • 2005
  • In this paper, we propose an emotion recognition method using static and dynamic facial images to effectively design human interfaces. The proposed method combines HMM (Hidden Markov Model), PCA (Principal Component Analysis), and the wavelet transform. The facial database consists of six basic human emotions, including happiness, sadness, anger, surprise, fear, and dislike, which are known to be common regardless of nation and culture. Emotion recognition in the static images is performed using the discrete wavelet transform, with feature vectors extracted by PCA. Emotion recognition in the dynamic images is performed using the wavelet transform and PCA, and the results are then modeled by the HMM. Finally, we obtained better performance by merging the recognition results for the static and dynamic images.

A Comparative Analysis on Facial Expression in Advertisements -By Utilising Facial Action Coding System(FACS) (광고 속의 얼굴 표정에 따른 비교 연구 -FACS를 활용하여)

  • An, Kyoung Hee
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.61-71
    • /
    • 2019
  • Given the limited running time of an advertisement, facial expressions, among the types of nonverbal communication, are especially expressive and convincing in appealing to customers. The purpose of this paper is to investigate not only how facial expressions are portrayed but also how they convey emotion in TV advertisements. The research subjects are two TV advertisements widely known among customers as some of the most touching commercials. The research method is the Facial Action Coding System (FACS), which is grounded in the theoretical perspective of discrete emotions and designed to measure specific facial muscle movements. This research analyses the implications of the facial expressions in both TV ads using FACS, drawing on psychology as well as anatomy. The results show that facial expressions portraying conflicting emotional states and the heroine's dramatic emotional relief could move customers' emotions more.
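
FACS codes facial muscle movements as action units (AUs); a toy illustration of matching observed AUs against commonly cited prototype combinations follows (the prototypes and the scoring rule are simplified stand-ins, far coarser than actual FACS-based analysis):

```python
# Simplified, illustrative prototype AU combinations for a few discrete
# emotions (EMFACS-style prototypes are commonly cited in this form,
# but real FACS coding also scores intensity and timing).
AU_PROTOTYPES = {
    "happiness": {6, 12},          # cheek raiser + lip corner puller
    "sadness": {1, 4, 15},         # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 5, 26},     # brow raisers + upper lid raiser + jaw drop
    "anger": {4, 5, 7, 23},        # brow lowerer + lid tighteners + lip tightener
}

def score_emotions(observed_aus):
    """Pick the emotion whose prototype AUs best match the observed set."""
    scores = {emo: len(aus & observed_aus) / len(aus)
              for emo, aus in AU_PROTOTYPES.items()}
    return max(scores, key=scores.get)

print(score_emotions({6, 12}))  # happiness
```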

The Influence of Suppressing Guilt and Shame on Moral Judgment, Intention, and Behavior (죄책감과 수치심의 억제가 도덕적 판단, 의도, 행동에 미치는 영향)

  • Han, Kyueun;Kim, Min Young;Sohn, Young Woo
    • Science of Emotion and Sensibility
    • /
    • v.19 no.3
    • /
    • pp.121-132
    • /
    • 2016
  • Emotion is considered to be involved in the moral decision-making process, which consists of moral judgment, moral intention, and moral behavior. This research investigated the distinct roles of two specific moral emotions, guilt and shame, when suppressed, on moral judgment, moral intention, and moral behavior through an online experiment. Moral emotion (guilt vs. shame) as well as suppression of these emotions (suppressing vs. control) was manipulated to infer the causal link between suppressed moral emotions and the moral decision-making process. The results suggest that suppressing guilt was involved in moral judgment and moral intention, but not in moral behavior. In particular, participants who maintained guilt evaluated moral vignettes as more moral and were more likely to report that they would follow the behavior described in the vignettes than participants who suppressed their guilt. On the other hand, our data showed that suppressing shame was not involved in moral judgment and intention but was involved in behavior: participants who maintained shame engaged in moral behavior more than participants who suppressed shame. We delineate the different mechanisms of guilt and shame in the moral decision-making process within discrete emotion theory.

Development of Psychophysiological Indices for Discrete Emotions (정서의 심리적.생리적 측정 및 지표개발: 기본정서 구분 모델)

  • 이경화;이임갑;손진훈
    • Science of Emotion and Sensibility
    • /
    • v.2 no.2
    • /
    • pp.43-52
    • /
    • 1999
  • Emotion is a subjective experience accompanied by physiological responses. No study has yet reported discriminating the basic emotions by differences in EEG and autonomic nervous system responses. This study aimed to 1) select emotional stimuli that distinctly elicit the six basic emotions and, using them, 2) develop a composite psychophysiological index model capable of discriminating the basic emotions. Six pairs of slides reliably eliciting each of the six basic emotions (happiness, sadness, anger, disgust, fear, surprise) were selected from the International Affective Picture System. EEG, ECG, respiration, and electrodermal activity evoked by slide presentation were recorded, analyzed, and compared. The main results are as follows. First, the relative power of EEG bands, heart rate, respiration rate, and skin conductance response differed significantly between the resting and emotional states. Second, EEG analysis showed significant differences between some emotions in the change of relative power of the theta (F4, O1), slow alpha (F3, F4), fast alpha (O2), and fast beta (F4, O2) bands. Third, autonomic analysis showed significant differences between some emotions in heart rate, respiration rate, and skin conductance response. Based on these results, a composite psychophysiological index model for discriminating the basic emotions was constructed. Four basic emotions (fear, disgust, sadness, anger) could be discriminated by EEG and autonomic response patterns, but happiness and surprise could not be conclusively discriminated with the psychophysiological indices used in this study. Follow-up research is needed to find appropriate indices that can discriminate all six basic emotions.
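
The EEG index used above, relative band power (e.g. the theta or alpha share of total spectral power), can be sketched as follows; the sampling rate and test signal are illustrative only:

```python
import numpy as np

def relative_band_power(eeg, fs, band):
    """Fraction of total spectral power (DC excluded) falling in a
    frequency band, e.g. theta = (4, 8) Hz."""
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum() / power[1:].sum()

fs = 256
t = np.arange(fs * 4) / fs
eeg = np.sin(2 * np.pi * 6 * t)      # pure 6 Hz (theta-band) oscillation
theta = relative_band_power(eeg, fs, (4, 8))
print(theta > 0.99)                  # nearly all power lies in theta
```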

An Emotion Recognition Method using Facial Expression and Speech Signal (얼굴표정과 음성을 이용한 감정인식)

  • 고현주;이대종;전명근
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.6
    • /
    • pp.799-807
    • /
    • 2004
  • In this paper, we deal with an emotion recognition method using facial images and speech signals. Six basic human emotions, including happiness, sadness, anger, surprise, fear, and dislike, are investigated. Emotion recognition from facial expressions is performed using multi-resolution analysis based on the discrete wavelet transform, and the feature vectors are then extracted with linear discriminant analysis. For the speech signal, the recognition algorithm is run independently for each wavelet subband, and the final recognition result is obtained from a multi-decision-making scheme.
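
The linear discriminant analysis step can be illustrated with a minimal two-class Fisher discriminant (a generic sketch on synthetic data, not the authors' pipeline):

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher discriminant direction w ∝ Sw^{-1} (m1 - m2),
    a minimal stand-in for the LDA feature-extraction step."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = np.cov(X1.T) + np.cov(X2.T)      # within-class scatter
    w = np.linalg.solve(Sw, m1 - m2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
X1 = rng.normal(size=(100, 2)) + np.array([2.0, 0.0])   # class 1 cloud
X2 = rng.normal(size=(100, 2)) - np.array([2.0, 0.0])   # class 2 cloud
w = fisher_lda_direction(X1, X2)
# projections onto w separate the two class means
print((X1 @ w).mean() > (X2 @ w).mean())
```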

Patterns of Autonomic Responses to Affective Visual Stimulation: Skin Conductance Response, Heart Rate and Respiration Rate Vary Across Discrete Elicited-Emotions (정서시각자극에 의해 유발된 자율신경계 반응패턴: 유발정서에 따른 피부전도반응, 심박률 및 호흡률 변화)

  • ;Estate M. Sokhadze
    • Science of Emotion and Sensibility
    • /
    • v.1 no.1
    • /
    • pp.79-91
    • /
    • 1998
  • The purpose of this study was to determine whether autonomic nervous system responses specific to each subjective emotional state elicited by IAPS (International Affective Picture System) pictures exist. Heart rate, respiration rate, and skin conductance responses were measured while IAPS pictures eliciting negative emotions (anger, sadness, surprise) and positive emotions (happiness, excitement) were each presented for 60 seconds. During the first 30 seconds of visual stimulation, statistically significant heart rate deceleration and decreases in respiration rate were observed, and distinct skin conductance responses appeared. Heart rate deceleration was greater for excitement than for disgust, and the amplitude of the skin conductance response was larger for excitement than for disgust. Meanwhile, the rise time of the skin conductance response tended to be shorter for disgust than for sadness, happiness, and surprise. These autonomic responses (heart rate, respiration rate, skin conductance) showed clear differences between emotional states, and, despite individual differences, the autonomic responses in specific emotional states showed, overall, highly typical response patterns. The results suggest that emotion-specific autonomic responses may exist; follow-up research comprehensively measuring and analyzing diverse autonomic indices of emotion is required to construct a template for determining psychological emotion from physiological signal analysis.

Genetic correlations between behavioural responses and performance traits in laying hens

  • Rozempolska-Rucinska, Iwona;Zieba, Grzegorz;Kibala, Lucyna;Prochniak, Tomasz;Lukaszewicz, Marek
    • Asian-Australasian Journal of Animal Sciences
    • /
    • v.30 no.12
    • /
    • pp.1674-1678
    • /
    • 2017
  • Objective: The aim of the study was to evaluate genetic correlations between the behavioural profile and performance in laying hens, as an indirect answer to the question of whether the observed behavioural responses are associated with increased stress levels in these birds. Methods: The birds' temperament was assessed using the novel object test. The behavioural test was conducted in two successive generations comprising 9,483 Rhode Island White (RIW) birds (approx. 4,700 individuals per generation) and 4,326 Rhode Island Red (RIR) birds (approx. 2,100 individuals per generation). Based on the recorded responses, the birds were divided into two groups: a fearful profile (1,418 RIW hens and 580 RIR hens) and a brave/curious profile (8,065 RIW hens and 3,746 RIR hens). The birds were subjected to standard assessment of their performance traits, including SM, age at sexual maturity; ST, shell thickness; SG, egg specific gravity; EW, mean egg weight; IP, initial egg production; and HC, number of hatched chicks. The pedigree was three generations deep (including the two behaviour-recorded generations). Estimation of the (co)variance components was performed with the Gibbs sampling method, which accounts for the discrete character of the behavioural profile denotation. Results: The analyses revealed negative correlations between the performance traits of the laying hens and the behavioural profile defined as fearful. In the group of fearful RIW birds, delayed sexual maturation (0.22) as well as decreases in initial egg production (-0.30), egg weight (-0.54), egg specific gravity (-0.331), shell thickness (-0.11), and the number of hatched chicks (-0.24) could be expected. These correlations were less pronounced in the RIR breed, in which the fearful birds exhibited a decline in hatchability (-0.37), egg specific gravity (-0.11), and the number of hatched chicks (-0.18). For the other traits there were no correlations, or they were positive but carried a substantial standard error, as for the egg weight. Conclusion: To sum up, behavioural responses indicating fearfulness, i.e., escape, avoidance, and approach-avoidance, may reflect negative emotions experienced by the birds. The negative correlations with performance in the group of fearful hens may indirectly indicate a high level of stress in these birds, especially in the white-feathered birds, where stronger performance-fearfulness correlations were found. Fearful birds should be eliminated from breeding by including the behavioural profile in the selection criterion for laying hens.
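
The genetic correlations reported above come from additive-genetic (co)variance components; the defining formula is simple to sketch (the numbers below are illustrative, not the paper's estimates):

```python
import numpy as np

def genetic_correlation(cov_a, var_a1, var_a2):
    """Genetic correlation between two traits from additive-genetic
    (co)variance components: r_g = cov(a1, a2) / sqrt(var(a1) * var(a2))."""
    return cov_a / np.sqrt(var_a1 * var_a2)

# Illustrative components only: a negative additive covariance between
# the fearfulness liability and a performance trait yields a negative
# genetic correlation of the kind reported above.
r = genetic_correlation(-0.6, 1.5, 1.2)
print(round(r, 3))  # -0.447
```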