• Title/Summary/Keyword: Emotion Model

The Relationships between Emotion Display Rule and Nurse's Job Attitudes;Mediation Effects of Negative Emotion Suppress and Inauthenticity (정서표현규범과 간호사의 직무태도의 관계;부정정서억제와 가식의 매개효과)

  • Park, Jung-Ae; Han, Tae-Young
    • Journal of Korean Academy of Nursing Administration, v.12 no.2, pp.213-224, 2006
  • Purpose: This study examined Korean nurses' job attitudes (i.e., satisfaction and burnout), focusing on the extent to which an organization's emotion display rule influences job attitudes through emotion-related variables (i.e., negative emotion suppression and inauthenticity), thereby testing the mediation effects of negative emotion suppression and inauthenticity. Method: Nurses working in various general hospitals in Korea were surveyed, and structural equation modeling was applied. Result: The emotion display rule affected the outcome variables, but these effects were mostly transmitted via the two mediators, negative emotion suppression and inauthenticity, which largely supports the research model. Conclusion: Individuals who perceived a stronger emotion display rule suppressed their negative emotions more and also perceived more inauthenticity. In turn, negative emotion suppression affected only job burnout, positively. Inauthenticity had a negative effect on job satisfaction and a positive effect on burnout. The study provides directions for future research and practical implications for supporting nurses' effective job performance.

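The mediation logic described above (display rule → suppression → burnout) can be sketched with ordinary least squares on synthetic data. The variable names, effect sizes, and single-mediator simplification below are illustrative assumptions, not the study's actual SEM:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical synthetic data mirroring the mediation structure:
# display rule -> negative emotion suppression -> burnout
display_rule = rng.normal(size=n)
suppression = 0.6 * display_rule + rng.normal(scale=0.5, size=n)
burnout = 0.5 * suppression + rng.normal(scale=0.5, size=n)

def ols(y, *predictors):
    """Least-squares coefficients [intercept, b1, b2, ...]."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(suppression, display_rule)[1]                 # path a: rule -> mediator
b = ols(burnout, display_rule, suppression)[2]        # path b: mediator -> outcome
c_prime = ols(burnout, display_rule, suppression)[1]  # direct effect of the rule
indirect = a * b                                      # mediated (indirect) effect
```

With data generated this way, the indirect effect recovers roughly 0.6 × 0.5 while the direct effect is near zero, which is the "mostly mediated" pattern the abstract reports.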

Dimensionality of emotion suppression and psychosocial adaptation: Based on the cognitive process model of emotion processing (정서 처리의 인지 평가모델을 기반으로 한 정서 억제의 차원성과 심리 사회적 적응)

  • Woo, Sungbum
    • Korean Journal of Culture and Social Issues, v.27 no.4, pp.475-503, 2021
  • The purpose of this study was to clarify the constructs of emotion suppression and to aid understanding of its multidimensional nature by classifying suppression constructs according to the KMW model; gender differences in emotion suppression were also examined. To this end, 657 adult men and women completed measures of attitudes toward emotions and difficulty in emotion regulation, as well as depression, state anger, and daily stress scales. An exploratory factor analysis of the suppression-related scales yielded factors corresponding to the stages of the KMW model: 'distraction from emotional information', 'difficulty in understanding and interpreting emotions', 'emotion control beliefs', and 'beliefs about vulnerability in emotional expression'. Next, participants were classified by a cluster analysis based on these suppression factors. Four clusters were extracted and named the 'emotion control belief cluster', 'emotional expression cluster', 'emotional attention failure cluster', and 'general emotional suppression cluster'. Examining mean differences in depression, state anger, and daily stress across clusters revealed significant differences on all dependent variables. Examining cluster frequencies by gender showed that the general emotional suppression cluster was more frequent among men, whereas the emotional expression cluster was proportionally more frequent among women. Finally, gender differences in the effect of suppression cluster membership on psychosocial adaptation were analyzed, and implications were discussed based on the results.
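The two-step analysis (exploratory factor analysis, then clustering on the factor scores) can be sketched as follows. The factor scores here are random stand-ins rather than the study's data, and k-means is one plausible choice of clustering algorithm; the paper does not specify which was used:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical factor scores for the four suppression factors identified
# via EFA (attention distraction, interpretation difficulty, control
# beliefs, expression-vulnerability beliefs), one row per respondent.
scores = rng.normal(size=(657, 4))  # 657 respondents, as in the study

# Cluster respondents into four groups based on their factor profiles.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores)
labels = km.labels_  # cluster assignment per respondent
```

Each cluster's mean scores on depression, anger, and stress could then be compared (e.g., with ANOVA) as the abstract describes.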

A Study on the Emotion-Responsive Interior Design centered on a Color Coordinate Digital Wall (감성반응형 실내디자인에 관한 연구 - 감성어휘별 색채배색에 의한 디지털 벽면을 중심으로 -)

  • 김주연; 이현수
    • Science of Emotion and Sensibility, v.6 no.2, pp.1-7, 2003
  • The objective of this study is to develop an adaptable digital wall model whose color can change dynamically according to the identified emotional state of a user. The study addresses how to capture a specific user's emotion through the web and use it to modify a VR model, mainly through color adaptation. This adaptation process consists of three phases: 1) identifying the user's emotional state as projected onto a list of emotional keywords, 2) translating the captured emotional keywords into a pertinent set of color coordinations, and 3) automatically adapting the colors of the given model. The process derives an on-line viewer's emotional state, which can then be used to find a new color scheme reflecting the identified emotion.

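Phases 2 and 3 of the adaptation process, translating an emotional keyword into a color coordination and applying it to the wall, can be sketched as a table lookup. The keywords and RGB palettes below are invented for illustration; the paper's actual emotion-to-color mappings are not given in the abstract:

```python
# Hypothetical emotion-keyword -> color-coordination table (RGB triples).
EMOTION_PALETTES = {
    "calm":     [(173, 216, 230), (224, 255, 255), (240, 248, 255)],
    "cheerful": [(255, 223, 0), (255, 165, 0), (255, 105, 180)],
    "cozy":     [(210, 180, 140), (255, 228, 196), (188, 143, 143)],
}

def recolor_wall(emotion_keyword, panels=3):
    """Map a captured emotion keyword to its color coordination and
    assign one color per wall panel, cycling through the palette."""
    palette = EMOTION_PALETTES[emotion_keyword]
    return [palette[i % len(palette)] for i in range(panels)]
```

A rendering layer would then push the returned colors onto the digital wall's panels whenever a new keyword arrives from the web interface.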

Analysis of Facial Movement According to Opposite Emotions (상반된 감성에 따른 안면 움직임 차이에 대한 분석)

  • Lee, Eui Chul; Kim, Yoon-Kyoung; Bea, Min-Kyoung; Kim, Han-Sol
    • The Journal of the Korea Contents Association, v.15 no.10, pp.1-9, 2015
  • In this paper, facial movements are analyzed in response to opposite emotion stimuli through image processing of Kinect facial images. To induce two opposite emotion pairs, "Sad-Excitement" and "Contentment-Angry", which lie at opposite positions on Russell's 2D emotion model, both visual and auditory stimuli were given to subjects. First, 31 main points were chosen from the 121 facial feature points of the active appearance model obtained from the Kinect Face Tracking SDK. Then, pixel changes around the 31 main points were analyzed; a local minimum shift matching method was used to handle the non-linearity of facial movement. In the results, right-side and left-side facial movements occurred for the "Sad" and "Excitement" emotions, respectively. Left-side facial movement was comparatively more frequent for the "Contentment" emotion, while both left- and right-side movements occurred for the "Angry" emotion.
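The pixel-change analysis around a feature point can be sketched as below. The exact form of the paper's local minimum shift matching is not given, so this sketch searches a small neighborhood of shifts for the one minimizing mean absolute difference between frames (an assumed interpretation):

```python
import numpy as np

def local_min_shift_diff(prev, curr, pt, win=5, search=2):
    """Around landmark pt=(row, col), search small shifts of the window
    in the current frame for the one minimizing mean absolute difference
    to the previous frame -- a sketch of local minimum shift matching."""
    r, c = pt
    ref = prev[r - win:r + win + 1, c - win:c + win + 1].astype(float)
    best_diff, best_shift = None, None
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = curr[r + dr - win:r + dr + win + 1,
                        c + dc - win:c + dc + win + 1].astype(float)
            diff = np.abs(ref - cand).mean()
            if best_diff is None or diff < best_diff:
                best_diff, best_shift = diff, (dr, dc)
    return best_diff, best_shift

# Synthetic check: shift a random frame by one pixel in each direction.
rng = np.random.default_rng(0)
frame0 = rng.integers(0, 256, size=(60, 60))
frame1 = np.roll(frame0, shift=(1, 1), axis=(0, 1))
residual, shift = local_min_shift_diff(frame0, frame1, pt=(30, 30))
```

Aggregating such per-landmark residuals over the left and right halves of the face would give the side-wise movement comparison the study reports.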

Efficient Emotion Classification Method Based on Multimodal Approach Using Limited Speech and Text Data (적은 양의 음성 및 텍스트 데이터를 활용한 멀티 모달 기반의 효율적인 감정 분류 기법)

  • Mirr Shin; Youhyun Shin
    • The Transactions of the Korea Information Processing Society, v.13 no.4, pp.174-180, 2024
  • In this paper, we explore an emotion classification method based on multimodal learning with the wav2vec 2.0 and KcELECTRA models. Multimodal learning that leverages both speech and text data is known to significantly enhance emotion classification performance compared to methods relying on speech data alone. To select the text processing model, our study conducts a comparative analysis of BERT and its derivative models, known for their strong performance in natural language processing, for effective feature extraction from text data. The results confirm that the KcELECTRA model performs best on the emotion classification task. Furthermore, experiments using datasets provided by AI-Hub demonstrate that including text data achieves superior performance with less data than using speech data alone, with the KcELECTRA model reaching the highest accuracy of 96.57%. This indicates that multimodal learning can offer meaningful performance improvements in complex natural language processing tasks such as emotion classification.
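The core multimodal idea, fusing speech and text embeddings before classification, can be sketched with random stand-in embeddings in place of actual wav2vec 2.0 and KcELECTRA outputs. The 768-dimensional sizes match those models' base configurations, but the data, fusion-by-concatenation scheme, and classifier are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n, d_audio, d_text = 200, 768, 768  # wav2vec 2.0 / KcELECTRA base hidden sizes

labels = rng.integers(0, 2, size=n)  # two emotion classes for brevity
# Stand-in embeddings: shift the class means slightly so the task is learnable.
audio_emb = rng.normal(size=(n, d_audio)) + labels[:, None] * 0.2
text_emb = rng.normal(size=(n, d_text)) + labels[:, None] * 0.2

# Late fusion: concatenate per-utterance audio and text features.
fused = np.concatenate([audio_emb, text_emb], axis=1)
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
acc = clf.score(fused, labels)
```

In a real pipeline the two embedding matrices would come from the pretrained encoders' pooled outputs rather than a random generator.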

Human emotional elements and external stimulus information-based Artificial Emotion Expression System for HRI (HRI를 위한 사람의 내적 요소 기반의 인공 정서 표현 시스템)

  • Oh, Seung-Won; Hahn, Min-Soo
    • Proceedings of the HCI Society of Korea Conference, 2008.02a, pp.7-12, 2008
  • In human-robot interaction, the role of emotion is becoming more important; therefore, robots need an emotion expression mechanism similar to that of humans. In this paper, we propose a new emotion expression system grounded in psychological studies. It consists of five affective elements: the emotion, the mood, the personality, the tendency, and the machine rhythm. Each element influences the emotion expression pattern in its own way according to its characteristics, so even when robots are exposed to the same external stimuli, each robot can show a different expression pattern. The proposed system may contribute to more natural and human-friendly human-robot interaction and to more intimate relationships between people and robots.

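One way the five affective elements might combine into a robot-specific expression is sketched below. The weights and the blending formula are invented for illustration, since the abstract does not specify how the elements interact:

```python
from dataclasses import dataclass

@dataclass
class AffectState:
    """The five affective elements of the proposed system."""
    emotion: float      # short-term reaction to the current stimulus
    mood: float         # slowly varying background affect
    personality: float  # fixed trait bias (e.g., expressiveness)
    tendency: float     # learned disposition toward this stimulus type
    rhythm: float       # machine rhythm: periodic internal modulation

def expression_intensity(s: AffectState, stimulus: float) -> float:
    """Blend the elements so that identical stimuli yield
    robot-specific expression intensities (illustrative weights)."""
    base = stimulus * (0.5 + 0.5 * s.personality)
    return base + 0.3 * s.emotion + 0.2 * s.mood + 0.1 * s.tendency + 0.05 * s.rhythm
```

Two robots differing only in `personality` produce different intensities for the same stimulus, which is the behavior the abstract highlights.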

Robust Real-time Tracking of Facial Features with Application to Emotion Recognition (안정적인 실시간 얼굴 특징점 추적과 감정인식 응용)

  • Ahn, Byungtae; Kim, Eung-Hee; Sohn, Jin-Hun; Kweon, In So
    • The Journal of Korea Robotics Society, v.8 no.4, pp.266-272, 2013
  • Facial feature extraction and tracking are essential steps in human-robot interaction (HRI) applications such as face recognition, gaze estimation, and emotion recognition. The active shape model (ASM) is one of the successful generative models for extracting facial features. However, ASM alone is not adequate for modeling a face in practical applications, because the positions of facial features are extracted unstably due to the limited number of iterations in the ASM fitting algorithm, and these inaccurate positions degrade emotion recognition performance. In this paper, we propose a real-time facial feature extraction and tracking framework that combines ASM with Lucas-Kanade (LK) optical flow for emotion recognition. LK optical flow is well suited to estimating time-varying geometric parameters in sequential face images. In addition, we introduce a straightforward method to avoid tracking failure caused by partial occlusions, which can be a serious problem for tracking-based algorithms. Emotion recognition experiments with k-NN and SVM classifiers show over 95% classification accuracy for three emotions: "joy", "anger", and "disgust".
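The final classification step, k-NN over geometric features derived from the tracked points, can be sketched as follows. The two-dimensional features and class centers are synthetic stand-ins for the paper's actual feature-point measurements:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)

# Hypothetical geometric features (e.g., displacements of tracked points)
# for three emotion classes: joy=0, anger=1, disgust=2.
centers = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
X = np.vstack([c + 0.2 * rng.normal(size=(50, 2)) for c in centers])
y = np.repeat([0, 1, 2], 50)

# k-NN classifier, as in the paper's experiments (k chosen arbitrarily here).
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
acc = knn.score(X, y)
```

With well-separated classes like these, k-NN approaches the >95% accuracy range the abstract reports; real ASM-derived features would be higher-dimensional and noisier.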

Design of an Artificial Emotion for visualizing emotion (감정의 시각화를 위한 인공감정 설계)

  • Ham, Jun-Seok; Son, Chung-Yeon; Jeong, Chan-Sun; Park, Jun-Hyeong; Yeo, Ji-Hye; Go, Il-Ju
    • Proceedings of the Korean Society for Emotion and Sensibility Conference, 2009.11a, pp.91-94, 2009
  • Most previous research on artificial emotion has focused on emotion recognition and physical expression. However, emotions are expressed differently depending on personality and change over time, and the emotion to be expressed also depends on the emotional state that precedes a new emotional stimulus. This paper proposes an artificial emotion model that manages emotions according to personality, time, and the relationships among emotions, and visualizes the emotion currently to be expressed. To visualize emotions, the proposed artificial emotion has an emotion graph and an emotion field. The emotion graph is a two-dimensional graph that represents a specific emotion as a function of personality and time. The emotion field is a three-dimensional model that visualizes the different emotions represented in the emotion graph according to time and the relationships among emotions. To demonstrate emotion visualization with the proposed artificial emotion, it was applied to a simulator in which emotion recognition and physical expression were simplified to a text-based interface.

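The emotion graph, a curve of emotion intensity over time shaped by personality, might be sketched as a rise-and-decay function. The functional form and parameters below are assumptions; the abstract states only that the graph depends on personality and time:

```python
import math

def emotion_graph(t, peak=1.0, rise=0.5, decay=2.0):
    """Emotion intensity over time: a fast onset followed by exponential
    decay. The rise/decay constants stand in for personality parameters
    (e.g., a volatile personality would have a small decay constant)."""
    if t < 0:
        return 0.0
    return peak * (1 - math.exp(-t / rise)) * math.exp(-t / decay)
```

An emotion field could then stack such curves for several emotions along a third axis and attenuate each according to its relationships with the others.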