• Title/Summary/Keyword: facial emotional expression

Search results: 125

The effects of the usability of products on user's emotions - with emphasis on suggestion of methods for measuring user's emotions expressed while using a product -

  • Jeong, Sang-Hoon
    • Archives of design research / v.20 no.2 s.70 / pp.5-16 / 2007
  • The main objective of our research is to analyze users' emotional changes while using a product, in order to reveal the influence of usability on human emotions. In this study we extracted, through three methods, emotional words that can arise during user interaction with a product and that reveal emotional changes. In total, we extracted 88 emotional words for measuring users' emotions expressed while using products, and categorized them into six groups using factor analysis. These six categories were found to represent users' characteristic emotions expressed while using products. We expect that the emotional words and representative emotions extracted in this study will serve as subjective evaluation data for measuring users' emotional changes while using a product. We also propose effective methods for measuring users' emotions in an environment that is natural and accessible for the field of design, using the emotion mouse and the Eyegaze. An examinee performs several tasks with the emotion mouse through a mobile phone simulator on a computer monitor connected to the Eyegaze. During testing, the emotion mouse senses the user's EDA and PPG and transmits the data to the computer; the Eyegaze additionally records changes in pupil size, and a video camera records the user's facial expressions. After each test, the user performs a subjective evaluation of his or her emotional changes using the emotional words extracted in the study above. We aim to evaluate the satisfaction level of the product's usability and compare it with the actual experimental results. Through continuous studies based on this research, we hope to supply a basic framework for developing interfaces that take users' emotions into consideration.

  • PDF

Attentional Bias to Emotional Stimuli and Effects of Anxiety on the Bias in Neurotypical Adults and Adolescents

  • Mihee Kim;Jejoong Kim;So-Yeon Kim
    • Science of Emotion and Sensibility / v.25 no.4 / pp.107-118 / 2022
  • Humans can rapidly detect and deal with dangerous elements in their environment, and this generally manifests as an attentional bias toward threat. Past studies have reported that this attentional bias is affected by anxiety level. Other studies, however, have argued that children and adolescents show attentional bias toward threatening stimuli regardless of their anxiety levels. Few studies have directly compared the two age groups in terms of attentional bias to threat, and most previous studies have focused on attentional capture and the early stages of attention without investigating subsequent attentional holding by the stimuli. In this study, we investigated both attentional bias patterns (attentional capture and holding) with respect to negative emotional stimuli in neurotypical adults and adolescents, and also examined the effects of anxiety level on attentional bias. For adult participants, the abrupt onset of a distractor delayed attentional capture of the target regardless of distractor type (angry or neutral faces), while it had no effect on attentional holding. In adolescents, on the other hand, only the angry-face distractor lengthened reaction times for detecting a target. Regarding anxiety, state anxiety showed a significant positive correlation with attentional capture by a face distractor in adults but not in adolescents. Overall, this is the first study to investigate developmental tendencies in attentional bias to negative facial emotion in both adults and adolescents, providing novel evidence on attentional bias to threat at different ages. Our results can be applied to understanding attentional mechanisms in people with emotion-related developmental disorders, as well as in typical development.

Power affects emotional awareness: The moderating role of emotional intelligence and goal-relevance (정서인식과 권력의 관계: 정서지능과 목표관련성의 조절효과 검증)

  • Lee, Suran;Lee, Won Pyo;Kim, Kaeun;Youm, Joon-Kyoo;Sohn, Young Woo
    • Science of Emotion and Sensibility / v.16 no.4 / pp.433-444 / 2013
  • The purpose of this study is to investigate the moderating role of emotional intelligence (EI) and goal-relevance in the relationship between power and emotional awareness. In Study 1, participants were asked to correctly identify presented facial expressions of others after completing an EI survey. Half of the participants were randomly assigned to the "power" condition and the other half to the "powerless" condition. In Study 2, the goal-relevance of the expressed emotion was manipulated. The results showed that EI moderated the relationship between power and emotion-decoding ability: while participants with high and low levels of EI were not significantly affected by the power condition, participants with a middle level of EI were strongly influenced by power. In addition, goal-relevance significantly moderated the relationship between power and emotional awareness. When correctly identifying others' emotions became important, and emotional awareness was thus strongly associated with participants' goals, those who had power performed better than before.

A neural network model for recognizing facial expressions based on perceptual hierarchy of facial feature points (얼굴 특징점의 지각적 위계구조에 기초한 표정인식 신경망 모형)

  • 반세범;정찬섭
    • Korean Journal of Cognitive Science / v.12 no.1_2 / pp.77-89 / 2001
  • Applying a perceptual hierarchy of facial feature points, a neural network model for recognizing facial expressions was designed. The input data were the convolution values of 150 facial expression pictures with Gabor filters of 5 sizes and 8 orientations at each of 39 mesh points defined by MPEG-4 SNHC (Synthetic/Natural Hybrid Coding). A set of multiple regression analyses was performed between the rated affective states for each facial expression and the Gabor-filtered values of the 39 feature points. The results show that the pleasure-displeasure dimension of affective states is mainly related to the feature points around the mouth and eyebrows, while the arousal-sleep dimension is closely related to the feature points around the eyes. Among filter sizes, affective states were mostly related to low spatial frequencies; among filter orientations, to the oblique orientations. On the basis of these results, an optimized neural network model was designed by reducing the original 1560 (39x5x8) input elements to 400 (25x2x8). The optimized model predicted human affective ratings up to a correlation of 0.886 for pleasure-displeasure and 0.631 for arousal-sleep. Mapping the optimized model's results onto the six basic emotional categories (happy, sad, fear, angry, surprised, disgusted) matched 74% of human responses. These results imply that, using human principles of recognizing facial expressions, a facial expression recognition system can be optimized even with a relatively small amount of information.

  • PDF
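The Gabor-filter front end described above (multi-scale, multi-orientation responses sampled at facial feature points) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the filter sizes, wavelengths, and sigma values below are placeholder assumptions, and `points` stands in for the 39 MPEG-4 mesh points.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real-valued Gabor filter: Gaussian envelope times cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)          # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

def feature_vector(img, points, sizes=(7, 11, 15, 19, 23), n_orient=8):
    """Gabor responses at each feature point, over all sizes/orientations.
    With 39 points, 5 sizes, and 8 orientations this yields 39x5x8 = 1560
    features, matching the input dimensionality quoted in the abstract."""
    feats = []
    for (r, c) in points:
        for s in sizes:
            for k in range(n_orient):
                theta = k * np.pi / n_orient
                ker = gabor_kernel(s, wavelength=s / 2, theta=theta, sigma=s / 4)
                h = s // 2
                patch = img[r - h:r + h + 1, c - h:c + h + 1]
                feats.append(float(np.sum(patch * ker)))  # local convolution value
    return np.array(feats)
```

Each feature point contributes 5x8 = 40 responses; stacking the points gives the full input vector fed to the network.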

A Study on Visual Perception based Emotion Recognition using Body-Activity Posture (사용자 행동 자세를 이용한 시각계 기반의 감정 인식 연구)

  • Kim, Jin-Ok
    • The KIPS Transactions:PartB / v.18B no.5 / pp.305-314 / 2011
  • Research into the visual perception of human emotion for recognizing intention has traditionally focused on facial expressions of emotion. Recently, researchers have turned to the more challenging field of emotional expression through body posture or activity. The proposed work approaches the recognition of basic emotional categories from body postures using a neural model that applies the visual-perception principles of neurophysiology. In keeping with information-processing models of the visual cortex, this work constructs a biologically plausible hierarchy of neural detectors that can discriminate six basic emotional states from static views of the associated activity postures. The proposed model, which is tolerant to parameter variations, demonstrates its feasibility in an evaluation against human test subjects on a set of activity postures.

Face Recognition using Emotional Face Images and Fuzzy Fisherface (감정이 있는 얼굴영상과 퍼지 Fisherface를 이용한 얼굴인식)

  • Koh, Hyun-Joo;Chun, Myung-Geun;Paliwal, K.K.
    • Journal of Institute of Control, Robotics and Systems / v.15 no.1 / pp.94-98 / 2009
  • In this paper, we deal with a face recognition method for emotional face images. Since face recognition is one of the most natural and straightforward biometric methods, there has been a wide range of research; however, most of it focuses on expressionless face images and runs into serious difficulty when facial expressions are considered. In real situations, emotional face images must be considered, so three basic human emotions, happiness, sadness, and anger, are investigated here for face recognition. Because this setting requires a robust face recognition algorithm, we use a fuzzy Fisher's Linear Discriminant (FLD) algorithm with the wavelet transform. The fuzzy Fisherface is a statistical method that maximizes the ratio of the between-class scatter matrix to the within-class scatter matrix and also handles fuzzy class information. Experimental results on the CBNU face databases reveal that the approach presented in this paper yields better recognition performance than other recognition methods.
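The Fisher criterion the abstract refers to, maximizing between-class scatter relative to within-class scatter, can be sketched for the two-class case as follows. This is a simplified illustration: it shows only the crisp FLD core and omits the fuzzy membership weighting and wavelet preprocessing that the paper adds on top.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Two-class Fisher linear discriminant: the direction w that maximizes
    between-class scatter relative to within-class scatter."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter Sw: sum of the per-class scatter matrices.
    Sw = np.cov(X1, rowvar=False) * (len(X1) - 1) \
       + np.cov(X2, rowvar=False) * (len(X2) - 1)
    # Closed-form optimum: w proportional to Sw^{-1} (m1 - m2);
    # a tiny ridge term keeps the solve stable if Sw is near-singular.
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m2)
    return w / np.linalg.norm(w)
```

Projecting samples onto `w` yields one-dimensional scores with maximal class separation; Fisherface-style methods apply this criterion to face images after a dimensionality-reduction step.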

Adaptive Facial Expression Recognition System based on Gabor Wavelet Neural Network (가버 웨이블릿 신경망 기반 적응 표정인식 시스템)

  • Lee, Sang-Wan;Kim, Dae-Jin;Kim, Yong-Soo;Bien, Zeungnam
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.1 / pp.1-7 / 2006
  • In this paper, an adaptive facial emotion recognition system based on a Gabor wavelet neural network is proposed, which considers six feature points in the face image to extract features specific to each facial expression. A Levenberg-Marquardt-based training methodology is used to formulate the initial network, including the feature extraction stage; heuristics in the feature extraction process can therefore be excluded. Moreover, to adapt the network to a new user, Q-learning with an enhanced reward function and an unsupervised fuzzy neural network model are used. Q-learning enables the system to obtain optimal sets of Gabor filters capable of producing separable features, and the fuzzy neural network enables it to adapt to changes in the user. The proposed system therefore has good on-line adaptation capability, meaning that it can continuously track changes in the user's face.

Emotional Expression Technique using Facial Recognition in User Review (사용자 리뷰에서 표정 인식을 이용한 감정 표현 기법)

  • Choi, Wongwan;Hwang, Mansoo;Kim, Neunghoe
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.22 no.5 / pp.23-28 / 2022
  • Today, the online market has grown rapidly due to the development of digital platforms and the pandemic. Unlike in the existing offline market, the distinctiveness of the online market has prompted users to check online reviews, and several prior studies have established that reviews play a significant part in shaping users' purchase intentions. However, the current way of writing reviews makes it difficult for other users to understand the writer's emotions, which are conveyed only through elements such as tone and word choice; when writers wanted to emphasize something, it was also very cumbersome to bold passages or change their colors to reflect their emotions. Therefore, in this paper, we propose a technique that detects the user's emotions through camera-based facial expression recognition, automatically assigns a color to each emotion based on existing research on emotion and color, and applies those colors according to the user's intention.
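The emotion-to-color assignment described above could look something like the sketch below. The emotion labels, hex values, and HTML output format are all illustrative assumptions, not the paper's actual mapping.

```python
# Hypothetical emotion-to-color mapping for highlighting review text.
# Labels and hex values are illustrative, not taken from the paper.
EMOTION_COLORS = {
    "happiness": "#F5A623",  # warm orange
    "sadness": "#4A90D9",    # cool blue
    "anger": "#D0021B",      # red
    "neutral": "#777777",    # gray
}

def colorize(text: str, emotion: str) -> str:
    """Wrap a span of review text in a color chosen by the detected
    emotion; unknown emotions fall back to neutral."""
    color = EMOTION_COLORS.get(emotion, EMOTION_COLORS["neutral"])
    return f'<span style="color:{color}">{text}</span>'
```

In the proposed pipeline, the emotion label would come from the facial expression recognizer while the review is being written, so the highlighting requires no manual formatting by the writer.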

Classification of Three Different Emotion by Physiological Parameters

  • Jang, Eun-Hye;Park, Byoung-Jun;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.2 / pp.271-279 / 2012
  • Objective: This study classified three different emotional states (boredom, pain, and surprise) using physiological signals. Background: Emotion recognition studies have tried to recognize human emotions from physiological signals; such recognition is important for applying emotion detection in human-computer interaction systems. Method: 122 college students participated in this experiment. Three different emotional stimuli were presented to participants, and physiological signals, i.e., EDA (Electrodermal Activity), SKT (Skin Temperature), PPG (Photoplethysmogram), and ECG (Electrocardiogram), were measured for 1 minute as a baseline and for 1-1.5 minutes during the emotional state. The obtained signals were analyzed over 30 seconds of the baseline and of the emotional state, and 27 features were extracted from them. Statistical analysis for emotion classification was done by discriminant function analysis (DFA; SPSS 15.0), using the difference values obtained by subtracting baseline values from emotional-state values. Results: Physiological responses during emotional states differed significantly from baseline, and the accuracy rate of emotion classification was 84.7%. Conclusion: This study showed that emotions can be classified from various physiological signals. However, future work is needed to obtain additional signals from other modalities, such as facial expression, face temperature, or voice, to improve the classification rate, and to examine the stability and reliability of this result compared with the accuracy of emotion classification using other algorithms. Application: This could give emotion recognition studies a better chance of recognizing various human emotions from physiological signals, and it can be applied to human-computer interaction systems for emotion recognition. It can also be useful for developing emotion theory and profiling emotion-specific physiological responses, as well as for establishing the basis of emotion recognition systems in human-computer interaction.
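The analysis pipeline above (per-feature baseline subtraction followed by a discriminant classifier) can be sketched as follows. The paper used discriminant function analysis in SPSS; the nearest-class-mean classifier here is a simplified stand-in for that step, and the feature shapes and class labels are illustrative.

```python
import numpy as np

def baseline_correct(emotion_feats, baseline_feats):
    """Difference values: emotional-state features minus baseline features,
    as used in the paper's discriminant analysis."""
    return np.asarray(emotion_feats) - np.asarray(baseline_feats)

class NearestMeanClassifier:
    """Simplified stand-in for discriminant analysis: assign each sample
    to the emotion class whose mean feature vector is closest."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        # Squared Euclidean distance from every sample to every class mean.
        d = ((X[:, None, :] - self.means_[None, :, :]) ** 2).sum(axis=-1)
        return self.classes_[d.argmin(axis=1)]
```

In use, each participant's 27-feature vector would be baseline-corrected first, then fed to the fitted classifier to predict one of the three emotional states.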

Effects of Working Memory Load on Negative Facial Emotion Processing: an ERP study (작업기억 부담이 부적 얼굴정서 처리에 미치는 영향: ERP 연구)

  • Park, Taejin;Kim, Junghee
    • Korean Journal of Cognitive Science / v.29 no.1 / pp.39-59 / 2018
  • To elucidate the effect of working memory (WM) load on negative facial emotion processing, we examined ERP components (P1 and N170) elicited by fearful and neutral expressions, each of which was presented during a 0-back (low WM load) or 2-back (high WM load) task. During the N-back tasks, visual objects were presented one by one as targets, and each facial expression was presented as a passively observed stimulus during the intervals between targets. Behavioral results showed more accurate and faster responses in the low-load condition than in the high-load condition. Analysis of mean P1 amplitudes over the occipital region showed a significant WM load effect (high load > low load) but no significant facial emotion effect. Analysis of mean N170 amplitudes over the posterior occipito-temporal region showed a significant overall facial emotion effect (fearful > neutral); in detail, however, a significant facial emotion effect was observed only in the low-load condition on the left hemisphere, whereas on the right hemisphere it was observed in the high-load as well as the low-load condition. In summary, the facial emotion effect indexed by N170 amplitude was modulated by WM load only on the left hemisphere. These results show that early emotional processing of negative facial expressions can be eliminated or reduced by high WM load on the left hemisphere but not on the right hemisphere, suggesting right-hemispheric lateralization of negative facial emotion processing.