• Title/Summary/Keyword: expression of emotion

Search Results: 603

Effects of users and interface agents' gender on users' assessment of the agent (사용자 및 인터페이스 에이전트의 성별이 사용자의 평가에 미치는 효과)

  • Chung, Duk-Hwan;Cho, Kyung-Ja;Han, Kwang-Hee
    • Science of Emotion and Sensibility / v.10 no.4 / pp.523-538 / 2007
  • This study examined the effects of the gender and empathic emotional expression of an anthropomorphic interface agent on users' assessment of the agent. It also examined whether these effects hold regardless of the agent's visual fidelity. In Study 1, the agents were represented by photographs of human faces. The agent expressed empathic emotion by making an other-oriented emotional response congruent with the other's perceived welfare. Participants performed a task with the agent and then assessed it on an interpersonal assessment scale. The results showed a preference for the female agent; participants also tended to assess the agent of the opposite gender more positively. In Study 2, the gender and expressed emotion of a low-fidelity agent were manipulated. Participants performed a task with the agent and then assessed it on the same interpersonal assessment scale as in Study 1. The results again showed a preference for the female agent, and participants preferred the agent expressing empathic emotion to the agent expressing self-oriented emotion or no emotion. Even with low visual fidelity, the agent's gender and expressed empathic emotion had a significant effect on users' assessment.

A Study on the Characteristics of Nature-friendly Expressing the Hotel Lobby Space (호텔 로비공간의 자연친화적 표현 특성에 관한 연구)

  • Hong, So-Jung;Ahn, Hee-Young
    • Proceedings of the Korean Institute of Interior Design Conference / 2008.05a / pp.196-202 / 2008
  • Interest in the natural environment as a new living culture is turning toward programs that give us direct rather than indirect experience. This change demands a new kind of spatial analysis in order to design hotel lobby spaces that satisfy contemporary sensibilities. Accordingly, this article studies the characteristics of nature-friendly expression as a means of approaching the design of hotel lobby spaces in which humans and nature can coexist and people can be emotionally enriched. Moreover, by tracing the flow and types of nature-friendly design, this study emphasizes the importance of introducing natural elements and suggests the possibility of diversifying nature-friendly expression.

A study on the Interactive Expression of Human Emotions in Typography

  • Lim, Sooyeon
    • International Journal of Advanced Culture Technology / v.10 no.1 / pp.122-130 / 2022
  • In modern times, text has become image, and typography, a combination of image and text, is a style easily encountered in everyday life. It is developing not only to convey meaning in communication but also to bring joy and beauty to our lives as a medium with an aesthetic format. Through case analysis, this study shows that typography is a tool for expressing human emotions and investigates how its characteristics change along with the media. In particular, the interactive communication tools and methods that interactive typography uses to express viewers' emotions are described in detail. We created interactive typography using entered text, music selected by the viewer, and the viewer's movement. Applying it in an exhibition confirmed that, when combined with the audience's intentional motion, interactive typography can function as an effective communication medium that exploits both the iconography of letter signs and their cognitive function.

Life-like Facial Expression of Mascot-Type Robot Based on Emotional Boundaries (감정 경계를 이용한 로봇의 생동감 있는 얼굴 표정 구현)

  • Park, Jeong-Woo;Kim, Woo-Hyun;Lee, Won-Hyong;Chung, Myung-Jin
    • The Journal of Korea Robotics Society / v.4 no.4 / pp.281-288 / 2009
  • Nowadays, many robots have evolved to imitate human social skills so that sociable interaction with humans is possible. Socially interactive robots require abilities different from those of conventional robots. For instance, human-robot interaction is accompanied by emotion, much like human-human interaction, so robotic emotional expression is very important. This is particularly true for facial expressions, which play an important role among the non-verbal channels of communication. In this paper, we introduce a method of creating lifelike facial expressions in robots by varying the affect values that constitute the robot's emotions, based on emotional boundaries. The proposed method was examined in experiments with two facial robot simulators.
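The abstract's core idea, varying affect values smoothly without abrupt expression flips at category boundaries, can be illustrated with a toy sketch. Everything below (the single affect dimension, the 0.5 boundary, the damping factor) is a hypothetical illustration, not the paper's actual model:

```python
# Hypothetical sketch: smoothly moving an affect value toward a target,
# damping the step near an illustrative emotion-category boundary at 0.5
# so the displayed expression does not flip abruptly between categories.
def step_affect(current, target, rate=0.1, boundary=0.5):
    """Move the affect value a fraction of the way toward the target,
    halving the step when it would cross the category boundary."""
    delta = (target - current) * rate
    lo, hi = min(current, current + delta), max(current, current + delta)
    if lo < boundary < hi:   # step crosses the boundary: slow down
        delta *= 0.5
    return current + delta

value = 0.2                  # start in one emotion region
for _ in range(30):          # animate 30 frames toward the target affect
    value = step_affect(value, 0.9)
```

After 30 frames the affect value has closed most of the gap to the target while passing the boundary gently rather than jumping over it.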

Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.4 / pp.562-567 / 2009
  • As a key mechanism of human emotional interaction, facial expression is a powerful tool in HRI (Human-Robot Interface) as well as HCI (Human-Computer Interaction). Facial expressions let a system produce reactions corresponding to the user's emotional state, and service agents such as intelligent robots can infer which services to offer the user. In this article, we address expressive face modeling using an advanced Active Appearance Model (AAM) for facial emotion recognition. We consider the six universal emotion categories defined by Ekman. In the human face, emotions are most strongly conveyed by the eyes and mouth, so recognizing emotion from a facial image requires extracting feature points such as Ekman's Action Units (AUs). The AAM is one of the most commonly used methods for facial feature extraction and can be applied to construct AUs. Because the traditional AAM depends on the setting of the model's initial parameters, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian network. First, we obtain the reconstruction parameters of a new gray-scale image by sample-based learning, use them to reconstruct the shape and texture of the new image, and calculate the initial AAM parameters from the reconstructed facial model. We then reduce the distance error between the model and the target contour by adjusting the model parameters. Finally, after several iterations, we obtain a model matched to the facial feature outline and use it to recognize the facial emotion with a Bayesian network.
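The final classification stage described above can be sketched in miniature. The snippet below substitutes a Gaussian naive Bayes classifier as a simple stand-in for the paper's Bayesian network, and the AU names and activation values are made up for illustration:

```python
import math
from collections import defaultdict

# Stand-in sketch: classify Action Unit (AU) activations extracted by an
# AAM fit, using Gaussian naive Bayes in place of the paper's Bayesian network.

def train(samples):
    """samples: list of (label, feature_vector); returns per-class mean/variance."""
    groups = defaultdict(list)
    for label, vec in samples:
        groups[label].append(vec)
    params = {}
    for label, vecs in groups.items():
        n, dim = len(vecs), len(vecs[0])
        means = [sum(v[i] for v in vecs) / n for i in range(dim)]
        varis = [sum((v[i] - means[i]) ** 2 for v in vecs) / n + 1e-6
                 for i in range(dim)]          # small floor avoids zero variance
        params[label] = (means, varis)
    return params

def classify(params, vec):
    """Pick the class with the highest Gaussian log-likelihood."""
    def loglik(means, varis):
        return sum(-0.5 * math.log(2 * math.pi * s) - (x - m) ** 2 / (2 * s)
                   for x, m, s in zip(vec, means, varis))
    return max(params, key=lambda c: loglik(*params[c]))

# Toy AU vectors: [AU12 (lip corner puller), AU4 (brow lowerer)]
data = [("happy", [0.9, 0.1]), ("happy", [0.8, 0.2]),
        ("angry", [0.1, 0.9]), ("angry", [0.2, 0.8])]
model = train(data)
```

A strong lip-corner pull with relaxed brows would then classify as "happy", and the reverse pattern as "angry".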

Effects of the facial expression presenting types and facial areas on the emotional recognition (얼굴 표정의 제시 유형과 제시 영역에 따른 정서 인식 효과)

  • Lee, Jung-Hun;Park, Soo-Jin;Han, Kwang-Hee;Ghim, Hei-Rhee;Cho, Kyung-Ja
    • Science of Emotion and Sensibility / v.10 no.1 / pp.113-125 / 2007
  • The experimental studies described in this paper investigate the effects of the face, eye, and mouth areas, presented as dynamic or static facial expressions, on emotion recognition. Using seven-second displays, Experiment 1 examined basic emotions and Experiment 2 examined complex emotions. The results of both experiments supported the conclusion that dynamic facial expressions are recognized better than static ones, and showed stronger recognition effects for the eye area than for the mouth area in dynamic images. These results suggest that dynamic properties should be considered in emotion research with facial expressions, for complex as well as basic emotions. However, the properties of each emotion must be taken into account, because not all emotions benefited equally from dynamic presentation. Furthermore, this study shows that which facial area conveys an emotional state most accurately depends on the particular emotion.

Emotion Recognition by Vision System (비젼에 의한 감성인식)

  • 이상윤;오재흥;주영훈;심귀보
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2001.12a / pp.203-207 / 2001
  • In this paper, we propose a neural-network-based method for recognizing human emotion from CCD color images. We first acquire a color image from a CCD camera and then recognize the expression from the structural correlation of facial feature points (eyebrows, eyes, nose, mouth); the central technology is the process of extracting, separating, and recognizing the relevant data in the image. In the proposed method, emotion is divided into four categories: surprise, anger, happiness, and sadness. To separate the face robustly against changes such as external illumination, the skin region is segmented from the background using color differences in the color space. We propose an algorithm that extracts the four feature points from the face image acquired by the color CCD camera and derives a normalized face image and feature vectors from them, and we then apply the backpropagation algorithm to the secondary feature vectors. Finally, we show the practical applicability of the proposed method.
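The classification step the abstract describes, backpropagation applied to facial feature vectors with four emotion classes, can be sketched with a minimal stdlib-only network. The toy two-dimensional features and all hyperparameters below are assumptions for illustration, not the paper's actual setup:

```python
import math, random

# Minimal sketch (not the paper's network): a one-hidden-layer perceptron
# trained by backpropagation, mapping toy facial-feature vectors
# (e.g. normalized eyebrow height, mouth openness) to four emotions.
random.seed(0)
EMOTIONS = ["surprise", "anger", "happiness", "sadness"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    def __init__(self, n_in, n_hid, n_out):
        r = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[r() for _ in range(n_in + 1)] for _ in range(n_hid)]   # +1 bias
        self.w2 = [[r() for _ in range(n_hid + 1)] for _ in range(n_out)]  # +1 bias

    def forward(self, x):
        self.x = list(x) + [1.0]                                # bias input
        self.h = [sigmoid(sum(w * v for w, v in zip(row, self.x)))
                  for row in self.w1] + [1.0]                   # bias unit
        self.o = [sigmoid(sum(w * v for w, v in zip(row, self.h)))
                  for row in self.w2]
        return self.o

    def backward(self, target, lr=0.5):
        # Output deltas (squared-error loss), then hidden deltas (skip bias unit).
        do = [(o - t) * o * (1 - o) for o, t in zip(self.o, target)]
        dh = [self.h[j] * (1 - self.h[j]) *
              sum(do[k] * self.w2[k][j] for k in range(len(do)))
              for j in range(len(self.h) - 1)]
        for k, row in enumerate(self.w2):
            for j in range(len(row)):
                row[j] -= lr * do[k] * self.h[j]
        for j, row in enumerate(self.w1):
            for i in range(len(row)):
                row[i] -= lr * dh[j] * self.x[i]

data = [([1.0, 1.0], 0), ([1.0, 0.0], 1), ([0.0, 1.0], 2), ([0.0, 0.0], 3)]
net = MLP(2, 8, 4)
for _ in range(5000):
    for x, label in data:
        net.forward(x)
        net.backward([1.0 if i == label else 0.0 for i in range(4)])

def predict(x):
    out = net.forward(x)
    return EMOTIONS[out.index(max(out))]
```

After training, the network reproduces the toy feature-to-emotion mapping it was fit on.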

Facial expression recognition based on pleasure and arousal dimensions (쾌 및 각성차원 기반 얼굴 표정인식)

  • 신영숙;최광남
    • Korean Journal of Cognitive Science / v.14 no.4 / pp.33-42 / 2003
  • This paper presents a new system for facial expression recognition based on a dimension model of internal states. Facial expression information is extracted in three steps. First, a Gabor wavelet representation extracts the edges of the face components. Second, sparse features of facial expressions are extracted by applying the fuzzy C-means (FCM) clustering algorithm to neutral faces, and third, by applying a Dynamic Model (DM) to the expression images. Finally, we demonstrate facial expression recognition based on the dimension model of internal states using a multilayer perceptron. The two-dimensional structure of emotion makes it possible to recognize not only facial expressions related to basic emotions but also expressions of various other emotions.
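The FCM clustering step mentioned in the second stage can be illustrated with a minimal one-dimensional sketch using the standard FCM update equations. The data points, cluster count, and initialization below are toy values, not taken from the paper:

```python
# Minimal fuzzy C-means sketch: soft cluster memberships and
# membership-weighted center updates, on 1-D toy data.
def fcm(points, c=2, m=2.0, iters=50):
    # Deterministic init: spread initial centers across the data range.
    lo, hi = min(points), max(points)
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for p in points:
            d = [abs(p - ctr) + 1e-9 for ctr in centers]   # avoid divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        # Center update: membership-weighted mean with exponent m.
        centers = [sum((u[k][i] ** m) * points[k] for k in range(len(points)))
                   / sum(u[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return sorted(centers)

# Two well-separated groups of toy feature values.
centers = fcm([0.0, 0.1, 0.2, 4.0, 4.1, 4.2])
```

The soft memberships are what make FCM attractive for sparse facial features: each candidate point carries a graded degree of belonging to every cluster rather than a hard assignment.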

A Multimodal Emotion Recognition Using the Facial Image and Speech Signal

  • Go, Hyoun-Joo;Kim, Yong-Tae;Chun, Myung-Geun
    • International Journal of Fuzzy Logic and Intelligent Systems / v.5 no.1 / pp.1-6 / 2005
  • In this paper, we propose an emotion recognition method that uses facial images and speech signals. Six basic emotions are investigated: happiness, sadness, anger, surprise, fear, and dislike. Facial expression recognition is performed with a multi-resolution analysis based on the discrete wavelet transform, and feature vectors are obtained through Independent Component Analysis (ICA). For the speech signal, the recognition algorithm is run independently on each wavelet subband, and the final result is obtained from a multi-decision-making scheme. After merging the facial and speech recognition results, we obtained better performance than previous methods.
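The merging step described above can be sketched as a simple weighted score fusion. The per-modality scores, the modality weight, and the fusion rule below are illustrative assumptions; the paper's actual decision scheme may differ:

```python
# Hypothetical decision-fusion sketch: each modality yields a score per
# emotion, and the final label comes from the weighted merged scores.
EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "dislike"]

def fuse(face_scores, speech_scores, w_face=0.6):
    """Weighted sum of normalized per-modality scores (weight is illustrative)."""
    def norm(scores):
        total = sum(scores.values())
        return {k: v / total for k, v in scores.items()}
    f, s = norm(face_scores), norm(speech_scores)
    merged = {e: w_face * f[e] + (1 - w_face) * s[e] for e in EMOTIONS}
    return max(merged, key=merged.get)

# Made-up per-modality scores for one observation.
face = {"happiness": 0.5, "sadness": 0.1, "anger": 0.1,
        "surprise": 0.1, "fear": 0.1, "dislike": 0.1}
speech = {"happiness": 0.3, "sadness": 0.4, "anger": 0.05,
          "surprise": 0.1, "fear": 0.1, "dislike": 0.05}
label = fuse(face, speech)
```

Here the face channel's confident "happiness" outweighs the speech channel's mild preference for "sadness", so the fused decision is "happiness".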

Context Modulation Effect by Affective Words Influencing on the Judgment of Facial Emotion (얼굴정서 판단에 미치는 감정단어의 맥락조절효과)

  • Lee, Jeongsoo;Yang, Hyeonbo;Lee, Donghoon
    • Science of Emotion and Sensibility / v.22 no.2 / pp.37-48 / 2019
  • This research explores the effect of language on the perception of facial emotion, as suggested by the psychological construction theory of emotion, using a psychophysical method. We hypothesize that the perception of a facial expression may be influenced if the observer is shown an affective word before judging the expression, and that the observer's reading of the facial emotion will align with the conceptual context the word denotes. In the two experiments conducted for this project, a control stimulus or a word denoting either an angry or a happy emotion was briefly presented to participants before a target face. The target faces were randomly selected from seven faces gradually morphed from neutral to angry (Experiment 1) or from neutral to happy (Experiment 2). Participants performed a two-alternative forced choice (2AFC) task judging the emotion of the target face (angry vs. neutral, or happy vs. neutral). Compared with the control condition, Experiment 1 showed that words denoting anger decreased the point of subjective equality (PSE) for judging the target as angry, whereas words denoting happiness increased the PSE. Experiment 2, in which participants judged expressions from happy to neutral, produced the contrasting pattern of results. These outcomes support the claim of the psychological construction theory of emotion that perceiving facial emotion is an active construction process that can be influenced by information, such as affective words, that provides conceptual context.
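The PSE measure used in both experiments can be illustrated with a small sketch: given the proportion of "angry" responses at each morph level, the PSE is the level at which that proportion crosses 50%, found here by linear interpolation. The response data below are made up for illustration and are not the paper's results:

```python
# Illustrative sketch of extracting the point of subjective equality (PSE)
# from 2AFC data: the morph level where "angry" is chosen half the time.
def pse(levels, p_angry, threshold=0.5):
    """Linearly interpolate the level where choice probability crosses 0.5."""
    pairs = list(zip(levels, p_angry))
    for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
        if y0 <= threshold <= y1:
            return x0 + (threshold - y0) / (y1 - y0) * (x1 - x0)
    raise ValueError("threshold not crossed")

levels = [1, 2, 3, 4, 5, 6, 7]                       # neutral -> angry morph steps
baseline = [0.02, 0.05, 0.2, 0.5, 0.8, 0.95, 0.99]   # control condition
primed   = [0.05, 0.15, 0.45, 0.7, 0.9, 0.97, 0.99]  # after an anger word

# A positive shift means the anger word lowered the PSE: milder morphs
# were already judged as angry, matching the direction reported in Experiment 1.
shift = pse(levels, baseline) - pse(levels, primed)
```

With these toy curves the baseline PSE sits at morph level 4.0 and the primed PSE at 3.2, so the anger word shifts the crossover point 0.8 levels toward neutral.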