• Title/Abstract/Keywords: Emotion Model


An Authoring Framework for Emotion-Aware User Interface of Mobile Applications (모바일 어플리케이션의 감정 적응형 사용자 인터페이스 저작 프레임워크)

  • Lee, Eunjung;Kim, Gyu-Wan;Kim, Woo-Bin
    • Journal of Korea Multimedia Society
    • /
    • v.18 no.3
    • /
    • pp.376-386
    • /
    • 2015
  • Since affective computing was introduced in the 1990s, affect recognition technology has made substantial progress. However, the application of user emotion recognition to software user interfaces is still in its early stages. In this paper, we describe a new approach for developing mobile user interfaces that react differently depending on the user's emotional state. First, an emotion reaction model is presented that determines the user interface reaction for each emotional state. We introduce a pair of mappings from user states to different user interface versions; the reacting versions are implemented as a set of variations of a view. Further, we present an authoring framework that helps developers and designers create emotion-aware reactions based on the proposed emotion reaction model. The authoring framework alleviates the burden of creating and handling multiple versions of views during development. A prototype implementation is presented as an extension of the existing authoring tool DAT4UX, and a proof-of-concept application featuring an emotion-aware interface is developed using the tool.
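
The emotion reaction model described above, a mapping from recognized user emotion states to alternative versions of a view, might be sketched roughly as follows. The state names and view identifiers are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical sketch of an emotion reaction model: a lookup from a
# recognized user emotion state to one variant of a view. The state
# and view names below are illustrative, not from the paper.
REACTION_MODEL = {
    "neutral":    "view_default",
    "frustrated": "view_simplified",  # e.g. fewer controls, larger targets
    "happy":      "view_rich",        # e.g. more detail, vivid colors
}

def select_view(emotion_state: str) -> str:
    """Return the view variant for the recognized emotion state,
    falling back to the default view for unknown states."""
    return REACTION_MODEL.get(emotion_state, "view_default")
```

An authoring tool in this framework would then mainly be a way to create and manage the view variants that such a table points to.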

DIFFERENTIATION OF BASIC EMOTIONS BY EEG AND AUTONOMIC RESPONSES (뇌파 및 자율신경계 반응특성에 의한 기본정서의 구분)

  • 이경화;이임갑;손진훈
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 1999.03a
    • /
    • pp.11-15
    • /
    • 1999
  • The discrete state theory of emotion postulates that there exist discrete emotions such as happiness, anger, fear, and disgust. Many investigators who emphasize the discreteness of emotions have suggested that discrete emotions entail specific activities in the autonomic nervous system. The purpose of this study was to develop a model of emotion-specific physiological response patterns. The study postulated six basic discrete emotions: happiness, sadness, anger, disgust, fear, and surprise. Thirty-eight college students participated in the present study. Twelve slides (two for each emotion category) were presented to the subjects in random order. During a resting period of 30 s prior to the presentation of each slide, four physiological measures (EEG, ECG, EDA, and respiration) were recorded to establish a baseline. The same physiological measures were recorded while each slide was presented for 60 s (producing an emotional state). The subjects were then asked to rate the degree of emotion induced by the slide on semantic differential scales. This procedure was repeated for every slide. Based on the results, a model of emotion-specific physiological response patterns was developed: four emotions (fear, disgust, sadness, and anger) were classified according to the characteristics of EEG and autonomic responses. However, happiness and surprise were not distinguished by any combination of the physiological measures employed in this study, suggesting that other measures should be adopted to differentiate them.
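
The baseline-corrected scoring implied by this protocol (a 30-s resting baseline before each 60-s slide) is commonly computed as a simple change score per measure; the paper's exact scoring may differ.

```python
def response_change(baseline, stimulus):
    """Change score for one physiological measure: mean level during
    the slide presentation minus mean level during the preceding
    resting baseline. Standard psychophysiology practice, shown here
    only as an illustration of the protocol described above."""
    return sum(stimulus) / len(stimulus) - sum(baseline) / len(baseline)
```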


Nurses' Colleague Solidarity and Job Performance: Mediating Effect of Positive Emotion and Turnover Intention

  • Jizhe Wang;Shao Liu;Xiaoyan Qu;Xingrong He;Laixiang Zhang;Kun Guo;Xiuli Zhu
    • Safety and Health at Work
    • /
    • v.14 no.3
    • /
    • pp.309-316
    • /
    • 2023
  • Background: Job performance is an essential reflection of nursing quality. Colleague solidarity, positive emotion, and turnover intention all operate in the clinical working environment, but their impacts on job performance are unclear. Investigating the association between nurses' colleague solidarity and job performance may therefore be valuable, both directly and through the mediating roles of positive emotion and turnover intention. Methods: In this cross-sectional study, a total of 324 Chinese nurses were recruited by convenience sampling from July 2016 to January 2017. Descriptive analysis, Spearman's correlation analysis, and structural equation modeling were applied using SPSS 26.0 and AMOS 24.0. Results: A total of 49.69% of participants were under 30 years old, and 90.12% were female. Colleague solidarity and positive emotion were positively associated with job performance. The results indicated mediating effects of positive emotion and turnover intention in this relationship, respectively, as well as a chain mediating effect of positive emotion and turnover intention. Conclusions: Dynamic and multiple supportive strategies are needed for nurse managers to improve nursing job performance by strengthening colleague solidarity and positive emotion and decreasing turnover intention, based on the job demands-resources model.
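
As a small illustration of one analysis method mentioned above, Spearman's rank correlation (for data without ties) can be computed with the classic shortcut formula. This is a generic sketch, not the authors' code.

```python
def spearman_rho(x, y):
    """Spearman rank correlation for tie-free data: the Pearson
    correlation of the ranks, via rho = 1 - 6*sum(d^2)/(n*(n^2-1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

In practice a library routine that handles ties and reports significance would be used instead.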

Emotion recognition from speech using Gammatone auditory filterbank

  • Le, Ba-Vui;Lee, Young-Koo;Lee, Sung-Young
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2011.06a
    • /
    • pp.255-258
    • /
    • 2011
  • This paper describes an application of a Gammatone auditory filterbank, a bank of Gammatone filters, to emotion recognition from speech. The filterbank is used as a preprocessing stage before feature extraction to obtain the most relevant features for emotion recognition. In the feature extraction step, the energy of each filter's output signal is computed and combined with those of all the other filters to produce a feature vector for the learning step. A feature vector is estimated over a short time window of the input speech signal to exploit its time-domain dependence. Finally, in the learning step, a Hidden Markov Model (HMM) is trained for each emotion class and used to recognize a particular input emotional speech signal. In the experiments, feature extraction based on the Gammatone filterbank (GTF) shows better results than features based on Mel-Frequency Cepstral Coefficients (MFCC), a well-known feature extraction method for speech recognition as well as for emotion recognition from speech.
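
A minimal sketch of the per-band energy features described above, assuming the common ERB-based gammatone impulse response. The filter order, window length, and center frequencies are illustrative choices, not the paper's exact configuration.

```python
import math

def gammatone_ir(fc, fs, duration=0.02, order=4):
    """Impulse response of a gammatone filter centered at fc (Hz),
    using the common ERB-based bandwidth approximation."""
    b = 1.019 * (24.7 + 0.108 * fc)  # equivalent rectangular bandwidth
    return [(t / fs) ** (order - 1)
            * math.exp(-2 * math.pi * b * t / fs)
            * math.cos(2 * math.pi * fc * t / fs)
            for t in range(int(duration * fs))]

def band_energies(frame, fs, centers):
    """Feature vector for one short frame: the energy of the frame
    filtered by each gammatone band (direct convolution, which is
    fine for short illustrative frames)."""
    feats = []
    for fc in centers:
        h = gammatone_ir(fc, fs)
        y = [0.0] * (len(frame) + len(h) - 1)
        for i, x in enumerate(frame):
            for j, hj in enumerate(h):
                y[i + j] += x * hj
        feats.append(sum(v * v for v in y))
    return feats
```

A frame containing a 500 Hz tone should yield far more energy in the band centered at 500 Hz than in a band centered well away from it.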

Research of Emotion Model on Disaster and Safety based on Analyzing Social Media (소셜미디어 분석기반 재난안전 감성모델 연구)

  • Choi, Seon Hwa
    • Journal of the Korean Society of Safety
    • /
    • v.31 no.6
    • /
    • pp.113-120
    • /
    • 2016
  • People use social media platforms such as Twitter to leave traces of their personal thoughts and opinions. In other words, social media platforms retain the emotions of the public as they are, and accurately understanding these emotions through social media can serve as a significant index for disaster management. In this research, an emotion-type modeling method and an emotional-quotient quantification method are proposed to capture the emotions present on social media platforms. Emotion types are first analyzed in terms of three major categories: affirmation, caution, and observation. Then, to understand the public's emotional progress in detail as a disaster or accident unfolds and the government responds, negative emotions are broken down into anxiety, seriousness, sadness, and complaint. Positive emotions are likewise broken down into three further emotions, and the Russell emotion model is used as a reference to develop a model of eight primary emotions that gives an overall picture of the public's emotions. The emotional quotient of each emotion is then quantified. Based on the results, the overall emotional status of the public can be monitored, and in the event of a disaster, the public's emotional fluctuation can be observed quantitatively.
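
A toy version of the emotional-quotient idea, quantifying how strongly one emotion is present in a set of posts, might look like this. The lexicon and the formula are illustrative only; the paper's quantification method is more elaborate.

```python
def emotion_quotient(posts, lexicon):
    """Toy emotional quotient for one emotion: the share of posts
    containing at least one term from that emotion's lexicon.
    Both the lexicon and this formula are hypothetical stand-ins
    for the paper's actual quantification."""
    hits = sum(any(term in post for term in lexicon) for post in posts)
    return hits / len(posts)
```

Tracking such a quotient per emotion over time would then show the fluctuations the paper aims to monitor during a disaster.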

Emotion-based music visualization using LED lighting control system (LED조명 시스템을 이용한 음악 감성 시각화에 대한 연구)

  • Nguyen, Van Loi;Kim, Donglim;Lim, Younghwan
    • Journal of Korea Game Society
    • /
    • v.17 no.3
    • /
    • pp.45-52
    • /
    • 2017
  • This paper proposes a new strategy for emotion-based music visualization. An emotional LED lighting control system is suggested to help audiences enhance their musical experience. In the system, the emotion in music is recognized by a proposed algorithm using a dimensional approach. The algorithm uses music emotion variation detection to overcome some weaknesses of Thayer's model in detecting emotion within a one-second music segment. In addition, the IRI color model is combined with Thayer's model to determine LED light colors corresponding to 36 different music emotions, which are represented on the LED lighting control system through colors and animations. The accuracy of the music emotion visualization reached over 60%.
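
A dimensional (Thayer-style) emotion-to-color mapping can be sketched as below: a point in valence-arousal space picks a hue and brightness. This mapping is purely illustrative; the paper's IRI-based assignment of 36 emotion colors is different.

```python
def emotion_to_color(valence, arousal):
    """Map a (valence, arousal) point in [-1, 1]^2 to an RGB triple:
    arousal drives brightness, valence picks a warm (positive) or
    cool (negative) hue. An illustrative stand-in for the paper's
    IRI-model color assignment."""
    arousal = max(-1.0, min(1.0, arousal))
    brightness = int(128 + 127 * arousal)
    if valence >= 0:
        return (brightness, int(brightness * 0.6), 0)  # warm orange
    return (0, int(brightness * 0.6), brightness)      # cool blue
```

An LED controller would then send such RGB values, possibly animated, to the fixtures for each one-second music segment.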

Emotion Detection Model based on Sequential Neural Networks in Smart Exhibition Environment (스마트 전시환경에서 순차적 인공신경망에 기반한 감정인식 모델)

  • Jung, Min Kyu;Choi, Il Young;Kim, Jae Kyeong
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.1
    • /
    • pp.109-126
    • /
    • 2017
  • Many studies on emotion detection are in progress across various kinds of intelligent services. In the exhibition field in particular, emotion recognition at particular moments has been studied in order to provide personalized experiences to the audience, since facial expressions change over time. The aim of this paper is therefore to build a model that predicts the audience's emotion from changes in facial expressions while watching an exhibit. The proposed model is based on both a sequential neural network and the Valence-Arousal model. To validate its usefulness, we performed an experiment comparing the proposed model with a standard neural-network-based model. The results confirmed that the proposed model, which considers the time sequence, had better prediction accuracy.
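
The temporal pooling that a sequential network performs over an expression sequence can be illustrated, in a much simpler form, as exponential smoothing of per-frame valence-arousal estimates. This is a minimal stand-in, not the paper's network.

```python
def smooth_sequence(frames, alpha=0.3):
    """Exponentially weighted smoothing of per-frame (valence, arousal)
    estimates -- a minimal stand-in for the temporal aggregation a
    sequential (recurrent) network performs over a facial-expression
    sequence. alpha is an illustrative smoothing factor."""
    v, a = frames[0]
    out = [(v, a)]
    for fv, fa in frames[1:]:
        v = alpha * fv + (1 - alpha) * v
        a = alpha * fa + (1 - alpha) * a
        out.append((v, a))
    return out
```

The point of either approach is the same: a judgment at time t depends on the history of frames, not on a single snapshot.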

Dynamic Emotion Classification through Facial Recognition (얼굴 인식을 통한 동적 감정 분류)

  • Han, Wuri;Lee, Yong-Hwan;Park, Jeho;Kim, Youngseop
    • Journal of the Semiconductor & Display Technology
    • /
    • v.12 no.3
    • /
    • pp.53-57
    • /
    • 2013
  • Human emotions are expressed in various ways: through language, facial expressions, and gestures. Facial expressions in particular carry a great deal of information about human emotion. These vague human emotions appear not as a single emotion but as a combination of various emotions. This paper proposes an emotional expression algorithm using the Active Appearance Model (AAM) and the fuzzy k-Nearest Neighbor classifier, which yields facial-expression classifications that reflect such vague, mixed emotions. Applying the Mahalanobis distance to the class centers, the algorithm determines the inclusion level between the center class and each class, and this inclusion level in turn expresses the intensity of each emotion. Our emotion recognition system can thus recognize complex emotions using the fuzzy k-NN classifier.
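
The fuzzy k-NN idea of graded class membership can be sketched with the standard inverse-distance weighting; the paper uses Mahalanobis distances to class centers, while plain numbers stand in for them here.

```python
def fuzzy_knn_memberships(distances_by_class, m=2.0):
    """Fuzzy k-NN membership: each class gets a weight proportional to
    1 / d^(2/(m-1)) over its neighbor distances, normalized to sum
    to 1, so one face can belong partly to several emotion classes.
    (The paper uses Mahalanobis distances; plain distances here.)"""
    weights = {}
    for label, dists in distances_by_class.items():
        weights[label] = sum(1.0 / (d ** (2.0 / (m - 1)) + 1e-12)
                             for d in dists)
    total = sum(weights.values())
    return {label: w / total for label, w in weights.items()}
```

The resulting membership values play the role of the "intensity" of each emotion in a mixed expression.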

Visualizing Emotions with an Artificial Emotion Model Based on Psychology -Focused on Characters in Hamlet- (심리학 기반 인공감정모델을 이용한 감정의 시각화 -햄릿의 등장인물을 중심으로-)

  • Ham, Jun-Seok;Ryeo, Ji-Hye;Ko, Il-Ju
    • Science of Emotion and Sensibility
    • /
    • v.11 no.4
    • /
    • pp.541-552
    • /
    • 2008
  • We cannot express emotions correctly with speech alone, because it is hard to estimate the kind, size, and amount of emotions from speech. Hamlet, the protagonist of Shakespeare's 'Hamlet', experiences emotions that cannot be expressed through speech alone, since he is placed in various dramatic situations. We therefore propose an artificial emotion model that, instead of expressing emotion through speech, expresses and visualizes current emotions with color and location, and we use it to visualize the emotions of the characters in 'Hamlet'. The artificial emotion model is designed in four steps that reflect the peculiarities of emotion. First, it analyzes an incoming emotional stimulus as a relationship between causes and effects, determining its kind and amount. Second, we propose an Emotion Graph Unit to express the generation, maintenance, and decay of a single analyzed emotional stimulus output by the first step, according to its characteristics. Third, using the Emotion Graph Unit, we propose an Emotion Graph that expresses continual stimuli of the same emotion; an Emotion Graph is built for each emotion, managing the generation and decay of each emotion individually. Last, we propose an Emotion Field that expresses the current combined value of the Emotion Graphs according to the co-relations of the various emotions, and visualizes the current emotion as a color and a location in the Emotion Field. We applied the artificial emotion model to the play 'Hamlet' to test it and to visualize the changing emotions of Hamlet and his mother, Gertrude.
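
The generate/maintain/decay behavior of a single Emotion Graph can be sketched as an intensity that jumps on each stimulus and otherwise decays exponentially. The decay constant is an illustrative choice, not a value from the paper.

```python
import math

def emotion_graph(stimuli, decay=0.5, dt=1.0):
    """Trace one emotion's intensity over time: each incoming stimulus
    adds to the current level, which otherwise decays exponentially --
    a sketch of the generation/maintenance/decay behavior of the
    paper's Emotion Graph. The decay rate is an assumed constant."""
    level, trace = 0.0, []
    for s in stimuli:
        level = level * math.exp(-decay * dt) + s
        trace.append(level)
    return trace
```

Running one such trace per emotion and combining their current values would then give the coordinates visualized in the Emotion Field.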


Emotion recognition modeling in considering physical and cognitive factors (물리적 인지적 상황을 고려한 감성 인식 모델링)

  • Song S.H.;Park H.H.;Ji Y.K.;Park J.H.;Park J.H.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.1937-1943
    • /
    • 2005
  • Emotion recognition is a crucial technology in the era of ubiquitous computing, as it enables various intelligent services for humans. This paper builds a system that recognizes human emotions on a two-dimensional model using two bio-signals, GSR and HRV. Since it is very difficult to model the human biological system analytically, a statistical method, the Hidden Markov Model (HMM), is used, which relies on transition probabilities among states and measurable observations. In experiments for each emotion, we obtained average recognition rates of 64% for the first HMM results and 55% for the second HMM results.
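
Classification with one HMM per emotion reduces to picking the model under which the observed (quantized) GSR/HRV sequence is most likely. Below is a minimal discrete-observation forward algorithm; the toy parameters in the usage are made up, not taken from the paper.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    with initial probabilities pi, transition matrix A, and emission
    matrix B, computed with the scaled forward algorithm."""
    alpha = [pi[i] * B[i][obs[0]] for i in range(len(pi))]
    loglik = 0.0
    for t in range(1, len(obs)):
        s = sum(alpha)                 # scale to avoid underflow
        loglik += math.log(s)
        alpha = [a / s for a in alpha]
        alpha = [sum(alpha[i] * A[i][j] for i in range(len(pi)))
                 * B[j][obs[t]] for j in range(len(pi))]
    return loglik + math.log(sum(alpha))
```

Given one trained HMM per emotion, the recognized emotion is the one whose model assigns the sequence the highest log-likelihood.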
