• Title/Summary/Keyword: facial emotional expression

Search Results: 126

Emotional Contagion as an Eliciting Factor of Altruistic Behavior: Moderating Effects by Culture (이타행동의 유발요인으로서 정서전염: 문화변인의 조절효과)

  • Jungsik Kim;Wan-Suk Gim
    • Korean Journal of Culture and Social Issue
    • /
    • v.13 no.2
    • /
    • pp.55-76
    • /
    • 2007
  • This study investigated the relationship between emotional contagion and altruistic behavior and examined the moderating effect of self-construal (independent and interdependent self) on this relationship. It was hypothesized that the emotional expressions of people in need are caught by others through automatic mimicry, that the emotional information is internalized through the facial-feedback process, and that the transferred emotion eventually produces a motive for altruistic behavior. In Study 1, participants watched a video clip of a disabled student reporting difficulties in school life while showing facial expressions opposite to the content of the message, in order to separate emotional contagion from empathy. Participants' decisions to volunteer to help the disabled student were measured. The more participants experienced emotional contagion, the more they engaged in altruistic behavior. Study 2 measured vulnerability to emotional contagion, actual experiences of altruistic behavior, and self-construals. Hierarchical regression showed that the interdependent self moderated the influence of emotional contagion on altruistic behavior, whereas the independent self moderated the relationship in the opposite direction. The implications of emotion and altruistic behavior for the process of human evolution are discussed.

  • PDF
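The moderation analysis described in Study 2 can be sketched as a two-step (hierarchical) regression in which an interaction term is added on top of the main effects. The data below are synthetic and the coefficients illustrative; the sketch only shows the structure of the test, not the paper's actual data or results.

```python
# Hierarchical regression with a moderation (interaction) term, on synthetic
# data: altruistic behavior predicted from emotional-contagion vulnerability,
# interdependent self-construal, and their product. All numbers illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
contagion = rng.normal(size=n)   # vulnerability to emotional contagion
interdep = rng.normal(size=n)    # interdependent self-construal score

# Assumed data-generating process: interdependent self strengthens the
# contagion -> altruism link (positive interaction coefficient of 0.4).
altruism = (0.5 * contagion + 0.2 * interdep
            + 0.4 * contagion * interdep
            + rng.normal(scale=0.3, size=n))

# Step 1: main effects only; Step 2: add the interaction (moderation) term.
X1 = np.column_stack([np.ones(n), contagion, interdep])
X2 = np.column_stack([X1, contagion * interdep])
b1, res1, *_ = np.linalg.lstsq(X1, altruism, rcond=None)
b2, res2, *_ = np.linalg.lstsq(X2, altruism, rcond=None)

# The interaction coefficient recovers ~0.4, and the residual sum of squares
# drops when the moderation term is added.
print("interaction coefficient:", round(b2[3], 2))
print("residual SS:", round(res1[0], 1), "->", round(res2[0], 1))
```

A significant drop in residual variance at step 2 is what licenses the moderation claim; with a reversed sign on the interaction term the same structure models the independent-self result.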

Efficient Emotional Relaxation Framework with Anisotropic Features Based Dijkstra Algorithm

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.25 no.4
    • /
    • pp.79-86
    • /
    • 2020
  • In this paper, we propose an efficient emotional relaxation framework using the Dijkstra algorithm with anisotropic features. Emotional relaxation, a framework that can automatically alleviate a person's depression or loneliness, is as important for HCI (Human-Computer Interaction) as emotional analysis. In this paper, 1) the emotion value of a changing facial expression is calculated using Microsoft's Emotion API; 2) differences in these emotion values are used to recognize abnormal feelings such as depression or loneliness; and 3) an emotional-mesh matching process that considers the emotion histogram and anisotropic characteristics is proposed, which suggests emotional relaxation to the user. The proposed system can easily recognize changes of emotion from face images and train on personal emotions through emotional relaxation.
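The detection in step 2) can be sketched as follows. The emotion labels, per-frame scores, and threshold here are illustrative assumptions, not values from the paper; the actual framework obtains per-frame emotion values from Microsoft's Emotion API.

```python
# Sketch of step 2: flagging abnormal feelings (e.g., depression) from
# frame-to-frame changes in emotion values. Labels and threshold are
# illustrative; the paper derives per-frame scores from an emotion API.

def flag_abnormal(frames, negative=("sadness", "fear"), threshold=0.5):
    """Return indices of frames where negative-emotion values jump sharply."""
    flagged = []
    for i in range(1, len(frames)):
        delta = sum(frames[i][k] - frames[i - 1][k] for k in negative)
        if delta > threshold:
            flagged.append(i)
    return flagged

frames = [
    {"happiness": 0.8, "sadness": 0.1, "fear": 0.0},
    {"happiness": 0.7, "sadness": 0.2, "fear": 0.1},
    {"happiness": 0.1, "sadness": 0.7, "fear": 0.2},  # sharp negative jump
]
print(flag_abnormal(frames))  # -> [2]
```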

Study of expression in virtual character of facial smile by emotion recognition (감성인식에 따른 가상 캐릭터의 미소 표정변화에 관한 연구)

  • Lee, Dong-Yeop
    • Cartoon and Animation Studies
    • /
    • s.33
    • /
    • pp.383-402
    • /
    • 2013
  • In this study, we apply the Facial Action Coding System (FACS), an anatomical approach to coding the facial muscular system, to the facial expressions displayed in response to emotional change, and verify it by applying the Duchenne smile to a virtual character. Duchenne smiles were extracted through an emotion-induction experiment with trained theater-department students (two men, two women). Based on the extracted expressions, data on the facial muscles were collected, and the frequency of movement of the muscles around the mouth and lips and of other facial muscles was calculated and applied to the virtual character. Contraction of the Zygomatic Major pulls the corners of the lips upward and raises the cheeks, and with the movement of the Orbicularis Oculi the outer lower eyelid rises, producing the look of a smile. Movement of the Zygomatic Major was observed together with the muscles around the nose (AU9) and the muscles around the mouth associated with openness (AU25, AU26, AU27). The Duchenne smile occurred when the Orbicularis Oculi and the Zygomatic Major moved at the same time. Based on this, by separating the Orbicularis Oculi, which appears in spontaneous laughter and emotional sympathy, from the Zygomatic Major, which can be moved at will, and applying both to the virtual character, we examine whether the virtual character's expressions can be distinguished in the same way human expressions are.
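The distinction the study draws can be sketched as a check over FACS action-unit activations. In standard FACS coding the Orbicularis Oculi corresponds to AU6 (cheek raiser) and the Zygomatic Major to AU12 (lip corner puller); the intensity values and threshold below are illustrative, not from the paper.

```python
# Sketch of the Duchenne-smile criterion: the involuntary Orbicularis Oculi
# (FACS AU6) must fire together with the voluntary Zygomatic Major (AU12).
# AU intensities and the threshold are illustrative values.

def is_duchenne(aus, threshold=0.3):
    """True if both AU6 (orbicularis oculi) and AU12 (zygomatic major) are
    active, i.e., a spontaneous smile rather than a posed one."""
    return aus.get("AU6", 0.0) > threshold and aus.get("AU12", 0.0) > threshold

posed = {"AU12": 0.8}                  # lip corners pulled up, eyes uninvolved
duchenne = {"AU6": 0.6, "AU12": 0.8}   # cheeks raise, outer eyelid moves too
print(is_duchenne(posed), is_duchenne(duchenne))  # -> False True
```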

Affective Computing in Education: Platform Analysis and Academic Emotion Classification

  • So, Hyo-Jeong;Lee, Ji-Hyang;Park, Hyun-Jin
    • International journal of advanced smart convergence
    • /
    • v.8 no.2
    • /
    • pp.8-17
    • /
    • 2019
  • The main purpose of this study is to explore the potential of affective computing (AC) platforms in education through two phases of research: Phase I, platform analysis, and Phase II, classification of academic emotions. In Phase I, the results indicate that existing affective analysis platforms can be largely classified into four types according to their emotion-detection methods: (a) facial expression-based platforms, (b) biometric-based platforms, (c) text/verbal tone-based platforms, and (d) mixed-method platforms. In Phase II, we conducted an in-depth analysis of the emotional experiences a learner encounters in online video-based learning in order to establish the basis for a new classification system of online learners' emotions. Overall, positive emotions appeared more frequently and lasted longer than negative emotions. We categorized positive emotions into three groups based on the facial expression data: (a) confidence; (b) excitement, enjoyment, and pleasure; and (c) aspiration, enthusiasm, and expectation. The same method was used to categorize negative emotions into four groups: (a) fear and anxiety, (b) embarrassment and shame, (c) frustration and alienation, and (d) boredom. Drawing on these results, we propose a new classification scheme that can be used to measure and analyze how learners in online learning environments experience various positive and negative emotions, with facial expressions as indicators.
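The classification scheme in Phase II can be sketched as a lookup from detected emotion labels to the reported valence groups. The group membership follows the abstract; the short group names ("enjoyment", "anxiety", etc.) are illustrative shorthand, not the paper's terminology.

```python
# Sketch of the proposed academic-emotion scheme: detected facial-expression
# labels map to the positive/negative groups reported in Phase II. Group
# names are illustrative shorthand; membership follows the abstract.
ACADEMIC_EMOTION_GROUPS = {
    "positive": {
        "confidence": ["confidence"],
        "enjoyment": ["excitement", "enjoyment", "pleasure"],
        "aspiration": ["aspiration", "enthusiasm", "expectation"],
    },
    "negative": {
        "anxiety": ["fear", "anxiety"],
        "shame": ["embarrassment", "shame"],
        "frustration": ["frustration", "alienation"],
        "boredom": ["boredom"],
    },
}

def classify(emotion):
    """Return (valence, group) for a detected emotion label, or None."""
    for valence, groups in ACADEMIC_EMOTION_GROUPS.items():
        for group, members in groups.items():
            if emotion in members:
                return valence, group
    return None

print(classify("enthusiasm"))  # -> ('positive', 'aspiration')
print(classify("boredom"))     # -> ('negative', 'boredom')
```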

Difference in reading facial expressions as the empathy-systemizing type - focusing on emotional recognition and emotional discrimination - (공감-체계화 유형에 따른 얼굴 표정 읽기의 차이 - 정서읽기와 정서변별을 중심으로 -)

  • Tae, Eun-Ju;Cho, Kyung-Ja;Park, Soo-Jin;Han, Kwang-Hee;Ghim, Hei-Rhee
    • Science of Emotion and Sensibility
    • /
    • v.11 no.4
    • /
    • pp.613-628
    • /
    • 2008
  • Mind reading is an essential part of normal social functioning, and empathy plays a key role in social understanding. This study investigated how individual differences affect reading emotions in facial expressions, focusing on empathizing and systemizing. Two experiments were conducted. In study 1, participants performed an emotion recognition test using facial expressions to investigate how emotion recognition differs by empathy-systemizing type, facial area, and emotion type. Study 2 examined the same questions with an emotion discrimination test instead, with every other condition the same as in study 1. Study 2 mostly replicated study 1: there were significant differences among facial areas and emotion types, as well as an interaction effect between facial area and emotion type. In addition, study 2 found an interaction effect between empathy-systemizing type and emotion type: how much people empathize and systemize makes a difference in emotional discrimination. These results suggest that the empathy-systemizing type is more appropriate for explaining emotion discrimination than emotion recognition.

  • PDF

A Portable Mediate Interface 'Handybot' for the Rich Human-Robot Interaction (인간과 로봇의 다양한 상호작용을 위한 휴대 매개인터페이스 '핸디밧')

  • Hwang, Jung-Hoon;Kwon, Dong-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.8
    • /
    • pp.735-742
    • /
    • 2007
  • The importance of a robot's interaction capability increases as robots are applied to humans' daily lives. In this paper, a portable mediating interface, Handybot, is developed with various interaction channels to be used with an intelligent home service robot. The Handybot has a task-oriented channel with an icon language as well as a verbal interface. It also has an emotional interaction channel that recognizes a user's emotional state from facial expression and speech, transmits that state to the robot, and expresses the robot's emotional state to the user. It is expected that the Handybot will reduce spatial problems that may exist in human-robot interaction, propose a new interaction method, and help create rich and continuous interactions between human users and robots.

Emotional Recognition According to General Characteristics of Stroke Patients (뇌졸중 환자의 일반적 특성에 따른 정서인식의 차이)

  • Park, Sungho;Kim, Minho
    • Journal of The Korean Society of Integrative Medicine
    • /
    • v.3 no.1
    • /
    • pp.63-69
    • /
    • 2015
  • Purpose: The purpose of this study was to investigate differences in emotion recognition according to the general characteristics of stroke patients. Method: The subjects were 38 stroke patients receiving rehabilitation at S Hospital in Busan. The eMETT program was used to assess emotion recognition. Result: Age and duration of disease showed statistically significant differences in emotion recognition ability scores, and gender and lesion showed statistically significant differences for some emotions (p<.05). Conclusion: The results show that emotion recognition ability differs according to the general characteristics of stroke patients. A variety of future research is needed, including standardized studies and interventions targeting stroke patients and normal controls.

Emotion Recognition and Expression System of User using Multi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 사용자의 감정 인식 및 표현 시스템)

  • Yeom, Hong-Gi;Joo, Jong-Tae;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.1
    • /
    • pp.20-26
    • /
    • 2008
  • As intelligent robots and computers become more common, the interaction between them and humans becomes more important, and emotion recognition and expression are indispensable to that interaction. In this paper, we first extract emotional features from speech signals and facial images. Second, we apply both BL (Bayesian Learning) and PCA (Principal Component Analysis) to classify five emotion patterns (normal, happy, angry, surprised, and sad). We also experiment with decision fusion and feature fusion to improve the emotion recognition rate. In the decision fusion method, the output values of each recognition system are combined through a fuzzy membership function. The feature fusion method selects superior features through SFS (Sequential Forward Selection) and applies them to a neural network based on an MLP (Multi-Layer Perceptron) to classify the five emotion patterns. The recognized result is then applied to a 2D facial shape to express the emotion.
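The SFS step in the feature-fusion method can be sketched as a greedy search that repeatedly adds the feature giving the biggest score improvement. The feature names, usefulness scores, and redundancy penalty below are toy assumptions; the paper scores feature subsets by downstream MLP recognition rate.

```python
# Sketch of SFS (Sequential Forward Selection): greedily add the feature
# that most improves a scoring function. The toy score below is illustrative;
# the paper evaluates subsets by MLP emotion-recognition performance.

def sfs(features, score, k):
    """Greedily select k feature names maximizing score(subset)."""
    selected = []
    remaining = list(features)
    while len(selected) < k and remaining:
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy score: per-feature usefulness, with a penalty for redundant pairs
# (e.g., two pitch-based speech features carrying the same information).
USEFUL = {"pitch": 0.5, "energy": 0.4, "mouth_width": 0.3, "pitch_var": 0.45}
REDUNDANT = {frozenset(["pitch", "pitch_var"])}

def score(subset):
    s = sum(USEFUL[f] for f in subset)
    for pair in REDUNDANT:
        if pair <= set(subset):
            s -= 0.4  # redundancy penalty
    return s

# pitch_var is individually strong but redundant with pitch, so SFS skips it.
print(sfs(USEFUL, score, 3))  # -> ['pitch', 'energy', 'mouth_width']
```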

Functions and Driving Mechanisms for Face Robot Buddy (얼굴로봇 Buddy의 기능 및 구동 메커니즘)

  • Oh, Kyung-Geune;Jang, Myong-Soo;Kim, Seung-Jong;Park, Shin-Suk
    • The Journal of Korea Robotics Society
    • /
    • v.3 no.4
    • /
    • pp.270-277
    • /
    • 2008
  • The development of a face robot basically targets natural human-robot interaction (HRI), especially emotional interaction. So does the face robot introduced in this paper, named Buddy. Since Buddy was developed for a mobile service robot, it does not have a living-being-like face such as a human's or an animal's, but a typically robot-like face with hard skin, which may be suitable for mass production. Moreover, its structure and mechanism should be simple and its production cost low. This paper introduces the mechanisms and functions of the mobile face robot Buddy, which can take on natural and precise facial expressions and make dynamic gestures, all driven by one laptop PC. Buddy can also perform lip-sync, eye contact, and face tracking for lifelike interaction. By adopting a customized emotional reaction decision model, Buddy can create its own personality, emotions, and motives using various sensor inputs. Based on this model, Buddy can interact appropriately with users and perform real-time learning using personality factors. The interaction performance of Buddy is successfully demonstrated by experiments and simulations.

  • PDF

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu;Lee, Hui-Sung;Park, Jeong-Woo;Jo, Su-Hun;Chung, Myung-Jin
    • Korea HCI Society: Conference Proceedings (한국HCI학회: 학술대회논문집)
    • /
    • 2008.02a
    • /
    • pp.547-552
    • /
    • 2008
  • Humans and robots will have a closer relationship in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate ability to communicate, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, so that the results can be applied to LEDs. Color and emotion are definitely related; however, previous results are difficult to implement due to a lack of quantitative data. In this paper, we determined the color and blinking period for expressing the six basic emotions (anger, sadness, disgust, surprise, happiness, and fear). The mapping was implemented on an avatar, and the intensities of the emotions were evaluated through a survey. We found that color and blinking helped express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking.
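The kind of mapping the paper studies can be sketched as a table from each basic emotion to an LED color and a base blinking period, with intensity modulating the blink rate. The RGB values, periods, and the intensity formula below are illustrative assumptions, not the paper's measured parameters.

```python
# Sketch of an emotion -> LED mapping: each of the six basic emotions gets a
# color and a base blinking period, and intensity speeds up the blinking.
# All values and the modulation formula are illustrative assumptions.
LED_EMOTION = {
    "anger":     {"rgb": (255, 0, 0),   "blink_s": 0.3},
    "sadness":   {"rgb": (0, 0, 255),   "blink_s": 2.0},
    "disgust":   {"rgb": (0, 128, 0),   "blink_s": 1.5},
    "surprise":  {"rgb": (255, 255, 0), "blink_s": 0.5},
    "happiness": {"rgb": (255, 165, 0), "blink_s": 1.0},
    "fear":      {"rgb": (128, 0, 128), "blink_s": 0.4},
}

def blink_period(emotion, intensity):
    """Shorter (faster) blinking for stronger emotion; intensity in [0, 1]."""
    base = LED_EMOTION[emotion]["blink_s"]
    return base * (1.5 - intensity)  # intensity 1.0 -> half the base period

print(round(blink_period("anger", 1.0), 2))  # -> 0.15
print(round(blink_period("anger", 0.0), 2))  # -> 0.45
```

The survey result in the abstract suggests such a mapping is perceptually effective for sadness, disgust, and anger, while fear, happiness, and surprise would need retuned colors or blink rates.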

  • PDF