• Title/Summary/Keyword: expression of emotion

Search Results: 599

Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • Kim, Jin Ok
    • Journal of Internet Computing and Services, v.13 no.5, pp.9-19, 2012
  • A virtual human used as an HCI agent in digital content expresses various emotions across modalities such as facial expression and body posture. However, few studies have considered how combinations of such nonverbal modalities affect emotion perception. To implement an emotional virtual human, computational engine models must consider how a combination of nonverbal modalities, such as facial expression and body posture, will be perceived by users. This paper analyzes the impact of nonverbal multimodality on the design of emotion-expressing virtual humans. First, the relative impact of the different modalities is analyzed by exploring emotion recognition for each modality of the virtual human. An experiment then evaluates the contribution of congruent facial and postural expressions to the recognition of basic emotion categories as well as the valence and activation dimensions. The impact of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life, is also measured. Experimental results show that congruence between the facial and postural expressions of a virtual human facilitates the perception of emotion categories; categorical recognition is influenced mainly by the facial modality, while the postural modality is preferred for judging the level of the activation dimension. These results will be used in implementing an animation engine and behavior synchronization for emotion-expressing virtual humans.
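A toy Python sketch of the fusion idea described above: the facial modality drives the category judgement while the postural modality drives the activation judgement. The class, weights, and fusion rule are illustrative assumptions, not the paper's engine.

```python
# Illustrative only: a toy rule reflecting the reported finding that the face dominates
# categorical recognition while body posture dominates the activation judgement.
from dataclasses import dataclass

@dataclass
class ModalityJudgement:
    category: str      # e.g. "joy", "anger"
    valence: float     # -1.0 .. 1.0
    activation: float  # 0.0 .. 1.0

def fuse_nonverbal(face: ModalityJudgement, posture: ModalityJudgement,
                   face_weight: float = 0.7) -> dict:
    """Combine facial and postural cues of a virtual human into one percept."""
    congruent = face.category == posture.category
    return {
        # Category follows the face; a congruent posture raises confidence.
        "category": face.category,
        "confidence": 0.9 if congruent else 0.5,
        # Valence blends both channels, weighted toward the face.
        "valence": face_weight * face.valence + (1 - face_weight) * posture.valence,
        # Activation is judged mainly from body posture.
        "activation": posture.activation,
        "congruent": congruent,
    }

print(fuse_nonverbal(ModalityJudgement("joy", 0.8, 0.6),
                     ModalityJudgement("joy", 0.6, 0.9)))
```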

Statistical Hierarchical Analysis of Children Emotional Intelligence's Effects on Mural Preference, Emotion Cultivation, and Community Connection

  • Lee, Kang Il;Ko, Young Chun
    • Journal of Integrative Natural Science, v.7 no.1, pp.50-56, 2014
  • To explore the effects of emotional awareness, emotional expression, emotional empathy, and emotional regulation (the sub-factors of children's emotional intelligence) on mural preference, emotion cultivation, and community connection, hierarchical multiple regression analyses were performed (Tables 1, 2, and 3). The analyses yielded the following equations for children's mural preference, emotion cultivation, and community connection, respectively. Mural Preference = .170 × [Emotional Awareness] (t = 2.118, p = .036*) − .025 × [Emotional Expression] (t = −.275, p = .783) + .088 × [Emotional Empathy] (t = .938, p = .350) + .139 × [Emotional Regulation] (t = 1.529, p = .128). Mural Emotion Cultivation = −.021 × [Emotional Awareness] (t = −.294, p = .769) − .205 × [Emotional Expression] (t = −2.573, p = .011*) + .265 × [Emotional Empathy] (t = 3.156, p = .002*) + .192 × [Emotional Regulation] (t = 2.361, p = .019*). Mural Community Connection = −.001 × [Emotional Awareness] (t = −.007, p = .995) − .132 × [Emotional Expression] (t = −1.478, p = .141) + .172 × [Emotional Empathy] (t = 1.732, p = .027*) + .098 × [Emotional Regulation] (t = 1.072, p = .285).
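A hierarchical multiple regression of this kind can be reproduced in outline with pandas and statsmodels. The file name, column names, and block structure below are assumed placeholders for the study's measured scales, not its actual data.

```python
# Minimal sketch of a hierarchical (blockwise) multiple regression.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("children_emotion.csv")  # hypothetical data file

# Step 1: enter emotional awareness only.
step1 = smf.ols("preference ~ awareness", data=df).fit()
# Step 2: add the remaining sub-factors of emotional intelligence.
step2 = smf.ols("preference ~ awareness + expression + empathy + regulation",
                data=df).fit()

print(step1.rsquared, step2.rsquared)       # R^2 for each block
print(step2.rsquared - step1.rsquared)      # R^2 change from adding the block
# To match standardized coefficients like those reported above, z-score the columns first.
print(step2.params)
print(step2.tvalues, step2.pvalues)         # t and p values as in the equations
```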

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun;Sohn, Sun-Ju;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea, v.31 no.5, pp.617-627, 2012
  • Objective: The aim of the study was to develop a dictionary of the most widely used Korean emotion-expressing idioms. This is anticipated to assist the development of software technology that recognizes and responds to verbally expressed human emotions. Method: Through rigorous and strategic classification processes, the idiomatic expressions included in this dictionary were rated in terms of nine different emotions (i.e., happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The Korean dictionary of emotion-expressing idioms included 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool that contains Korean idiomatic expressions of emotions, provides information on meaning and strength, and identifies idioms connoting two or more emotions. Application: The findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions as well as understanding the intensity associated with various idioms used in emotion expression. In clinical settings, information provided by this research may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
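A minimal sketch of how such an idiom dictionary could be represented in code; the example idioms, intensity scale, and ratings below are invented for illustration and are not entries from the published dictionary.

```python
# Illustrative data structure: each idiom is rated for one or more of the nine emotions
# with an intensity score (scale assumed here). Entries are invented examples.
from typing import Dict, List, Tuple

EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise",
            "disgust", "interest", "boredom", "pain"]

idiom_dict: Dict[str, Dict[str, float]] = {
    "하늘을 나는 기분이다": {"happiness": 6.5},          # single-emotion idiom (invented rating)
    "가슴이 철렁하다": {"surprise": 5.0, "fear": 4.5},   # idiom connoting two emotions
}

def lookup(idiom: str) -> List[Tuple[str, float]]:
    """Return (emotion, intensity) pairs sorted by intensity, strongest first."""
    ratings = idiom_dict.get(idiom, {})
    return sorted(ratings.items(), key=lambda kv: kv[1], reverse=True)

print(lookup("가슴이 철렁하다"))   # primary emotion first, secondary emotion after
```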

Research on Micro-Movement Responses of Facial Muscles by Intimacy, Empathy, Valence (친밀도, 공감도, 긍정도에 따른 얼굴 근육의 미세움직임 반응 차이)

  • Cho, Ji Eun;Park, Sang-In;Won, Myoung Ju;Park, Min Ji;Whang, Min-Cheol
    • The Journal of the Korea Contents Association, v.17 no.2, pp.439-448, 2017
  • Facial expression is an important factor in social interaction. Facial muscle movement provides emotional information for developing social networks, yet facial movement has rarely been used to recognize social emotion. This study analyzes facial micro-movements to recognize social emotions such as intimacy, empathy, and valence. Seventy-six university students were presented with stimuli for social emotions while their facial expressions were recorded with a camera. Facial micro-movements showed significant differences across social emotions. After extracting the amount of movement of 3 unconscious muscles and 18 conscious muscles, the dominant frequency band was identified. Muscles around the nose and cheek differed significantly with intimacy, those around the mouth with empathy, and those around the jaw with valence. The results propose new facial movement features for expressing social emotion in virtual avatars and for recognizing social emotion.
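As a hedged illustration of the dominant-frequency analysis mentioned above, the following numpy sketch estimates the dominant frequency of a facial micro-movement signal; the sampling rate, band limits, and synthetic signal are assumptions, not the authors' pipeline.

```python
# Minimal sketch: dominant frequency of a micro-movement signal (e.g., frame-to-frame
# displacement of a tracked muscle region at an assumed 30 fps).
import numpy as np

def dominant_frequency(movement: np.ndarray, fs: float = 30.0,
                       band: tuple = (0.2, 5.0)) -> float:
    """Return the frequency (Hz) with the largest spectral power inside `band`."""
    movement = movement - movement.mean()            # remove DC offset
    spectrum = np.abs(np.fft.rfft(movement)) ** 2    # power spectrum
    freqs = np.fft.rfftfreq(len(movement), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(freqs[mask][np.argmax(spectrum[mask])])

# Synthetic example: a 1 Hz micro-movement plus noise.
t = np.arange(0, 10, 1 / 30.0)
signal = 0.1 * np.sin(2 * np.pi * 1.0 * t) + 0.02 * np.random.randn(t.size)
print(dominant_frequency(signal))   # approximately 1.0 Hz
```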

Personality-Culture Interaction as a Predictor of Emotion Suppression on Facebook

  • Kim, Jinhee;Stavrositu, Carmen D.
    • Science of Emotion and Sensibility, v.24 no.4, pp.91-106, 2021
  • Although personality and culture have been employed as independent predictors of emotion regulation, less is known about the interplay between them. The present study therefore tests their interaction by focusing on the match between personality (public self-consciousness) and culture (valuing independence vs. interdependence) in modulating an emotion regulation strategy, namely emotion suppression, on Facebook. Furthermore, relationship concern related to the expression of positive and negative emotions on Facebook is explored as a potential underlying mechanism. An online survey of Facebook users in the United States (n = 320) and South Korea (n = 336) was conducted through two professional survey companies. The results revealed that the positive association between public self-consciousness and emotion suppression was stronger among respondents who value interdependence (vs. independence), yielding a significant interaction between the two predictors. In addition, public self-consciousness was associated with emotion suppression through relationship concern for the expression of positive, but not negative, emotions, and this mediated relationship was stronger among respondents who value interdependence (vs. independence). Lastly, the study discusses the importance of exploring the interplay between personality and culture and the implications of dialectical emotions.
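The moderation test described above amounts to an interaction term in a regression model. The sketch below shows the general pattern with statsmodels; the variable names, the coding of culture, and the data file are assumptions, not the study's dataset.

```python
# Minimal sketch of a moderation (interaction) test.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("facebook_survey.csv")   # hypothetical combined US/Korea sample

# Mean-center the continuous predictor before forming the interaction term.
df["psc_c"] = df["public_self_consciousness"] - df["public_self_consciousness"].mean()
# interdependence: 1 = values interdependence, 0 = values independence (assumed coding)

model = smf.ols("suppression ~ psc_c * interdependence", data=df).fit()
print(model.summary())   # a significant psc_c:interdependence term indicates moderation
```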

The Moderating effect of Maternal Emotion-Related Socialization Behaviors on the Relations Between Preschooler's Effortful control and Prosocial behavior (유아의 의도적 통제와 친사회적 행동 간의 관계에서 어머니 정서사회화 행동의 조절효과)

  • Lee, Yoon-jeong;Lim, Ji-young
    • Korean Journal of Human Ecology, v.23 no.6, pp.1141-1154, 2014
  • The purpose of this study was to examine the moderating effect of maternal emotion-related socialization behaviors on the relation between preschoolers' effortful control and prosocial behavior. The subjects were 153 preschoolers and their mothers. The major results were as follows: maternal emotion-related socialization behaviors moderated the relationship between preschoolers' effortful control and prosocial behavior. Specifically, maternal negative emotion expression and responses to the preschooler's positive emotion moderated the effect of effortful control on prosocial behavior. In conclusion, the impact of effortful control on preschoolers' prosocial behavior was significant, and maternal emotion-related socialization behaviors (i.e., negative emotion expression and responses to the child's positive emotion) affected preschoolers' prosocial behavior. The findings may help inform mothers' emotional interactions with their preschool-aged children.

KOBIE: A Pet-type Emotion Robot (KOBIE: 애완형 감성로봇)

  • Ryu, Joung-Woo;Park, Cheon-Shu;Kim, Jae-Hong;Kang, Sang-Seung;Oh, Jin-Hwan;Sohn, Joo-Chan;Cho, Hyun-Kyu
    • The Journal of Korea Robotics Society, v.3 no.2, pp.154-163, 2008
  • This paper presents the concept for the development of a pet-type robot with an emotion engine. The pet-type robot, named KOBIE (KOala roBot with Intelligent Emotion), interacts with a person through touch. KOBIE is equipped with tactile sensors on its body and recognizes touching behaviors such as "Stroke", "Tickle", and "Hit". KOBIE is covered with synthetic fur fabric so that people can feel affection toward it. KOBIE can also express an emotional status that varies with the circumstances. The emotion engine of KOBIE's emotion expression system generates an emotional status in an emotion vector space associated with predefined needs and mood models; a minimal sketch of such an emotion-vector update appears below. To examine the feasibility of the emotion expression system, we verified that touching behaviors change the emotional status within the emotion vector space. We also examined the reactions of children who interacted with three pet-type robots (KOBIE, PARO, and AIBO) for roughly 10 minutes each, in order to investigate children's preferences for pet-type robots.

  • PDF
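The emotion-vector update referred to in the abstract might look roughly like the following toy engine, in which recognized touch behaviors move a state vector in a 2-D valence-arousal space with simple decay. The touch effects and decay constant are invented for illustration and are not KOBIE's actual needs/mood models.

```python
# Illustrative toy emotion engine: touch events shift a (valence, arousal) state.
import numpy as np

TOUCH_EFFECTS = {          # assumed mapping of recognized touch behaviors to state deltas
    "Stroke": np.array([+0.3, +0.1]),   # pleasant, calming
    "Tickle": np.array([+0.2, +0.4]),   # pleasant, arousing
    "Hit":    np.array([-0.5, +0.5]),   # unpleasant, arousing
}

class EmotionEngine:
    def __init__(self, decay: float = 0.95):
        self.state = np.zeros(2)   # (valence, arousal) in [-1, 1]^2
        self.decay = decay

    def on_touch(self, behavior: str) -> np.ndarray:
        # Decay toward neutral, then apply the effect of the recognized behavior.
        self.state = self.state * self.decay + TOUCH_EFFECTS.get(behavior, np.zeros(2))
        self.state = np.clip(self.state, -1.0, 1.0)
        return self.state

engine = EmotionEngine()
for event in ["Stroke", "Stroke", "Hit"]:
    print(event, engine.on_touch(event))
```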

A Study on the Qualitative Evaluation of Emotion from Facial EMG and Facial Expression (얼굴근전도와 얼굴표정으로 인한 감성의 정성적 평가에 대한 연구)

  • 황민철;김지은;김철중
    • Proceedings of the ESK Conference, 1996.04a, pp.264-269, 1996
  • Facial expression is an innate human communication skill. A person's psychological state can be recognized from facial parameters such as surface movement, color, and humidity. This study quantifies and qualifies human emotion by measuring facial electromyography (EMG) and facial movement. Measurements were taken over the frontalis and zygomaticus areas of the face. The results make it possible to discriminate positive from negative emotional responses and to extract the parameters sensitive to positive and negative facial expressions; a sketch of the underlying EMG amplitude measurement is given below. Facial movement measured together with EMG shows the possibility of a non-invasive technique for assessing human emotion.

  • PDF
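As sketched below under assumed recording parameters, the basic EMG measurement idea is a windowed RMS amplitude per muscle site; this is only an illustration of the measurement, not the 1996 study's analysis.

```python
# Minimal sketch: windowed RMS amplitude of EMG from two facial sites
# (zygomaticus activity is commonly associated with positive affect).
import numpy as np

def rms(emg: np.ndarray, fs: int = 1000, window_s: float = 0.5) -> np.ndarray:
    """Windowed root-mean-square amplitude of an EMG signal (fs assumed 1 kHz)."""
    win = int(fs * window_s)
    n = len(emg) // win
    trimmed = emg[: n * win].reshape(n, win)
    return np.sqrt(np.mean(trimmed ** 2, axis=1))

# Synthetic two-channel recording (frontalis, zygomaticus) for demonstration.
frontalis = 0.05 * np.random.randn(5000)
zygomaticus = 0.15 * np.random.randn(5000)   # stronger activity, e.g. while smiling
print("frontalis RMS:", rms(frontalis).mean())
print("zygomaticus RMS:", rms(zygomaticus).mean())
```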

Facial Expression Recognition with Fuzzy C-Means Clustering Algorithm and Neural Network Based on Gabor Wavelets

  • Shin, Youngsuk;Chung, Chansup;Lee, Yillbyung
    • Proceedings of the Korean Society for Emotion and Sensibility Conference, 2000.04a, pp.126-132, 2000
  • This paper presents a facial expression recognition method based on Gabor wavelets that uses a fuzzy C-means (FCM) clustering algorithm and a neural network. Features of facial expressions are extracted in two steps; a toy illustration of the two steps follows below. In the first step, the Gabor wavelet representation provides edge extraction of major face components using the average value of the image's 2-D Gabor wavelet coefficient histogram. In the next step, sparse features of facial expressions are extracted from the edge information using the FCM clustering algorithm. The facial expression recognition results are compared with dimensional values of internal states derived from semantic ratings of emotion-related words. The dimensional model can recognize not only the six facial expressions related to Ekman's basic emotions but also expressions of various internal states.

  • PDF
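The two-step pipeline described above (Gabor wavelet responses, then FCM clustering) can be illustrated with the following sketch, which uses scikit-image's gabor filter and a minimal from-scratch FCM loop; the filter frequencies, cluster count, stand-in image, and subsampling are assumptions, not the authors' system.

```python
# Toy illustration: Gabor filtering of an image, then fuzzy C-means over the responses.
import numpy as np
from skimage import data
from skimage.filters import gabor

def fcm(X, c=3, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0, keepdims=True)                       # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)  # fuzzy-weighted centroids
        dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
        U = 1.0 / (dist ** (2.0 / (m - 1.0)))               # standard FCM membership update
        U /= U.sum(axis=0, keepdims=True)
    return centers, U

face = data.camera().astype(float) / 255.0                  # stand-in grayscale image
# Gabor responses at two frequencies form a per-pixel feature vector.
real1, _ = gabor(face, frequency=0.1)
real2, _ = gabor(face, frequency=0.3)
features = np.stack([real1.ravel(), real2.ravel()], axis=1)

centers, U = fcm(features[::50], c=3)                        # subsample pixels for speed
print("cluster centers:\n", centers)
```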