• Title/Abstract/Keywords: expression of emotion

Impact Analysis of Nonverbal Multimodals for Recognition of Emotion Expressed by Virtual Humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • 김진옥
    • 인터넷정보학회논문지 / Vol. 13, No. 5 / pp. 9-19 / 2012
  • Virtual humans used as HCI agents in digital content express a variety of emotions through modalities such as facial expression and body posture, but research on combinations of nonverbal multimodal cues is scarce. Because the computational engine for an emotion-expressing virtual human must take into account how users perceive combinations of nonverbal modalities such as facial expression and body posture, this study analyzes and presents the influence of nonverbal multimodal cues needed to design emotion expression for virtual humans. First, emotion recognition was evaluated for each modality of the virtual human to analyze the relative influence of the different modalities. Then, the effect of congruent face and posture modalities on the recognition of basic emotions and of valence and arousal was evaluated, and the degree of recognition of blended emotions, which frequently appear in everyday life, was observed using incongruent multimodal cues. The experimental results confirm that emotion recognition is easier when the virtual human's facial expression and body posture are congruent, that the emotion category is judged from the facial expression, and that the posture modality is preferred when judging the arousal dimension of the emotion. These results can be applied to implementing behavior synchronization and animation engine systems for virtual humans that express emotions.

Statistical Hierarchical Analysis of the Effects of Children's Emotional Intelligence on Mural Preference, Emotion Cultivation, and Community Connection

  • Lee, Kang Il;Ko, Young Chun
    • 통합자연과학논문집 / Vol. 7, No. 1 / pp. 50-56 / 2014
  • To explore the effects of each sub-factor of children's emotional intelligence (emotional awareness, emotional expression, emotional empathy, and emotional regulation) on mural preference, emotion cultivation, and community connection, hierarchical multiple regression analyses were performed (Tables 1, 2, and 3). The analyses yielded the following equations for children's mural preference, emotion cultivation, and community connection, respectively. Mural Preference = .170 × [Emotional Awareness] (t = 2.118, p = .036*) − .025 × [Emotional Expression] (t = −.275, p = .783) + .088 × [Emotional Empathy] (t = .938, p = .350) + .139 × [Emotional Regulation] (t = 1.529, p = .128). Mural Emotion Cultivation = −.021 × [Emotional Awareness] (t = −.294, p = .769) − .205 × [Emotional Expression] (t = −2.573, p = .011*) + .265 × [Emotional Empathy] (t = 3.156, p = .002*) + .192 × [Emotional Regulation] (t = 2.361, p = .019*). Mural Community Connection = −.001 × [Emotional Awareness] (t = −.007, p = .995) − .132 × [Emotional Expression] (t = −1.478, p = .141) + .172 × [Emotional Empathy] (t = 1.732, p = .027*) + .098 × [Emotional Regulation] (t = 1.072, p = .285).
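
A rough Python sketch of the kind of regression reported above is given below for orientation; the data file, column names, and standardization step are assumptions introduced for the example, not the authors' materials.

```python
# A minimal sketch, assuming a hypothetical data file and column names, of a
# regression predicting mural preference from the four emotional-intelligence
# sub-factors. Not the authors' original analysis code.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("children_ei_survey.csv")   # hypothetical data file

cols = ["emotional_awareness", "emotional_expression",
        "emotional_empathy", "emotional_regulation", "mural_preference"]

# Standardize so coefficients are comparable to the beta weights quoted above
z = (df[cols] - df[cols].mean()) / df[cols].std()

X = sm.add_constant(z[cols[:-1]])            # four predictors plus an intercept
model = sm.OLS(z["mural_preference"], X).fit()
print(model.summary())                       # coefficients, t statistics, p values
```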

Classification and Intensity Assessment of Korean Emotion Expressing Idioms for Human Emotion Recognition

  • Park, Ji-Eun;Sohn, Sun-Ju;Sohn, Jin-Hun
    • 대한인간공학회지 / Vol. 31, No. 5 / pp. 617-627 / 2012
  • Objective: The aim of the study was to develop a Korean dictionary of the most widely used emotion-expressing idioms. This is anticipated to assist the development of software technology that recognizes and responds to verbally expressed human emotions. Method: Through rigorous and strategic classification processes, the idiomatic expressions included in this dictionary were rated in terms of nine different emotions (i.e., happiness, sadness, fear, anger, surprise, disgust, interest, boredom, and pain) for the meaning and intensity associated with each expression. Result: The Korean dictionary of emotion-expressing idioms includes 427 expressions, with approximately two thirds classified under the 'happiness' (n=96), 'sadness' (n=96), and 'anger' (n=90) emotions. Conclusion: The significance of this study rests primarily in the development of a practical language tool that contains Korean idiomatic expressions of emotions, provides information on meaning and strength, and identifies idioms connoting two or more emotions. Application: The findings can be utilized in emotion recognition research, particularly in identifying primary and secondary emotions and in understanding the intensity associated with various idioms used in emotion expression. In clinical settings, the information provided by this research may also enhance helping professionals' competence in verbally communicating patients' emotional needs.
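
One plausible way to organize such a lexicon for use by an emotion-recognition component is sketched below; the entry, field layout, and lookup function are illustrative assumptions, not the published dictionary format.

```python
# Illustrative sketch (not the published dictionary): idioms stored with their
# rated emotion category and intensity, plus a naive substring lookup.
from dataclasses import dataclass

EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise",
            "disgust", "interest", "boredom", "pain"]    # the nine categories

@dataclass
class IdiomEntry:
    idiom: str          # the Korean idiomatic expression
    emotion: str        # one of the nine categories above
    intensity: float    # rated strength of the expressed emotion

lexicon: list[IdiomEntry] = [
    IdiomEntry("<idiom text>", "happiness", 0.0),        # placeholder entry
]

def find_emotions(sentence: str) -> list[tuple[str, float]]:
    """Return (emotion, intensity) pairs for every idiom found in a sentence."""
    return [(e.emotion, e.intensity) for e in lexicon if e.idiom in sentence]
```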

Research on Micro-Movement Responses of Facial Muscles by Intimacy, Empathy, and Valence (친밀도, 공감도, 긍정도에 따른 얼굴 근육의 미세움직임 반응 차이)

  • 조지은;박상인;원명주;박민지;황민철
    • 한국콘텐츠학회논문지 / Vol. 17, No. 2 / pp. 439-448 / 2017
  • Facial expressions carry important meaning in mutual communication. Movements of the facial muscles provide emotional information, which plays an important role in improving social relationships. However, simple facial movements alone are not sufficient to recognize complex social emotions accurately. The purpose of this study is to analyze facial micro-movements in order to recognize social emotions such as intimacy, empathy, and valence. Stimuli eliciting these social emotions were presented to 76 participants, and their facial expressions were recorded with a camera. As a result, facial micro-movements differed across the social emotions of intimacy, empathy, and valence. After extracting the amount of movement of 3 involuntary and 18 voluntary muscles out of 44 facial muscles, the dominant frequency band was identified using the Fast Fourier Transform (FFT). Independent t-tests showed significant differences in the muscles around the nose and cheeks for intimacy, around the mouth for empathy, and around the chin for valence. These results suggest new expressive elements for facial expression, for example in virtual avatars for animation, and can serve as a basis for recognizing social emotions from facial muscles.
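
A bare-bones version of the frequency analysis described above might look like the following; the sampling rate, placeholder signals, and grouping are assumptions for illustration, not the study's actual pipeline.

```python
# A minimal sketch, assuming 30 fps landmark-tracking signals: estimate the
# dominant frequency of a muscle-region micro-movement signal with an FFT and
# compare two conditions with an independent t-test.
import numpy as np
from scipy import stats

def dominant_frequency(movement: np.ndarray, fs: float) -> float:
    """Return the frequency (Hz) with the largest spectral magnitude."""
    movement = movement - movement.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(movement))
    freqs = np.fft.rfftfreq(movement.size, d=1.0 / fs)
    return float(freqs[np.argmax(spectrum[1:]) + 1])   # skip the zero-frequency bin

# Placeholder signals standing in for tracked cheek-region movement
rng = np.random.default_rng(0)
intimacy_signals = rng.standard_normal((10, 300))
neutral_signals = rng.standard_normal((10, 300))

intimacy = [dominant_frequency(s, fs=30.0) for s in intimacy_signals]
neutral = [dominant_frequency(s, fs=30.0) for s in neutral_signals]
t_value, p_value = stats.ttest_ind(intimacy, neutral)  # independent t-test
```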

Personality-Culture Interaction as a Predictor of Emotion Suppression on Facebook

  • Kim, Jinhee;Stavrositu, Carmen D.
    • 감성과학 / Vol. 24, No. 4 / pp. 91-106 / 2021
  • Although personality and culture have been employed as independent predictors of emotion regulation, less is known about the interplay between them. The present study therefore tests their interaction by focusing on the match between personality (public self-consciousness) and culture (valuing independence vs. interdependence) in modulating an emotion regulation strategy, namely emotion suppression, on Facebook. In addition, relationship concern related to the expression of positive and negative emotions on Facebook is explored as a potential underlying mechanism. An online survey of Facebook users in the United States (n = 320) and South Korea (n = 336) was conducted through two professional survey companies. The results revealed that the positive association between public self-consciousness and emotion suppression was stronger among respondents who value interdependence (vs. independence), yielding a significant interaction between the two predictors. Furthermore, public self-consciousness was associated with emotion suppression through relationship concern for the expression of positive, but not negative, emotions, and this mediated relationship was stronger among respondents who value interdependence (vs. independence). Lastly, the study discusses the importance of exploring the interplay between personality and culture and the implications of dialectical emotions.
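
The moderation test described above could be prototyped roughly as follows; the variable names, data file, and coding of the culture variable are assumptions rather than the study's actual dataset or script.

```python
# A rough sketch, not the study's analysis code: test whether culture moderates
# the association between public self-consciousness and emotion suppression
# using an interaction term. Variable names and coding are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("facebook_survey.csv")       # hypothetical combined US/KR data

# 'culture' coded 0 = values independence, 1 = values interdependence
model = smf.ols("suppression ~ public_sc * culture", data=df).fit()

print(model.params["public_sc:culture"])      # personality x culture interaction
print(model.pvalues["public_sc:culture"])
```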

The Moderating Effect of Maternal Emotion-Related Socialization Behaviors on the Relation Between Preschoolers' Effortful Control and Prosocial Behavior (유아의 의도적 통제와 친사회적 행동 간의 관계에서 어머니 정서사회화 행동의 조절효과)

  • 이윤정;임지영
    • 한국생활과학회지 / Vol. 23, No. 6 / pp. 1141-1154 / 2014
  • This study examined the moderating effect of maternal emotion-related socialization behaviors on the relation between preschoolers' effortful control and prosocial behavior. The subjects were 153 preschoolers and their mothers. The major results were as follows: maternal emotion-related socialization behaviors moderated the relationship between preschoolers' effortful control and prosocial behavior. Specifically, maternal negative emotion expression and responses to the preschooler's positive emotion moderated the effect of the preschooler's effortful control on prosocial behavior. In conclusion, the impact of effortful control on preschoolers' prosocial behavior was significant, and maternal emotion-related socialization behaviors (i.e., negative emotion expression and responses to the child's positive emotion) affected preschoolers' prosocial behavior. The findings of the study can help inform mothers' emotional interactions with their preschool-aged children.

KOBIE: A Pet-Type Emotion Robot (KOBIE: 애완형 감성로봇)

  • 류정우;박천수;김재홍;강상승;오진환;손주찬;조현규
    • 로봇학회논문지 / Vol. 3, No. 2 / pp. 154-163 / 2008
  • This paper presents the concept for the development of a pet-type robot with an emotion engine. The pet-type robot, named KOBIE (KOala roBot with Intelligent Emotion), can interact with a person through touch. KOBIE is equipped with tactile sensors on its body so that it can recognize touching behaviors such as "Stroke", "Tickle", and "Hit". We covered KOBIE with synthetic fur fabric so that people can also feel affection for it. KOBIE can express an emotional status that varies according to the circumstances in which it is placed. The emotion engine of KOBIE's emotion expression system generates an emotional status in an emotion vector space associated with predefined needs and mood models. To examine the feasibility of the emotion expression system, we verified that touching behaviors change the emotional status in the emotion vector space. We also examined the reactions of children who interacted with three kinds of pet-type robots, KOBIE, PARO, and AIBO, for roughly 10 minutes each in order to investigate children's preferences for pet-type robots.
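
A heavily simplified sketch of a touch-driven emotion engine of this kind is given below; the two-dimensional state, the touch-to-delta mapping, and the thresholds are assumptions for illustration and do not reproduce KOBIE's actual needs and mood models.

```python
# A minimal sketch, assuming a 2-D (valence, arousal) emotion vector: touch
# events nudge the state, the state decays toward neutral, and its current
# region selects an expression. Mappings and thresholds are illustrative.
import numpy as np

TOUCH_EFFECTS = {              # (valence change, arousal change) per behavior
    "stroke": (+0.2, -0.1),
    "tickle": (+0.1, +0.3),
    "hit":    (-0.4, +0.2),
}

class EmotionEngine:
    def __init__(self) -> None:
        self.state = np.zeros(2)               # (valence, arousal)

    def on_touch(self, behavior: str) -> None:
        self.state = np.clip(self.state + TOUCH_EFFECTS[behavior], -1.0, 1.0)

    def decay(self, rate: float = 0.05) -> None:
        self.state *= 1.0 - rate               # drift back toward neutral

    def expression(self) -> str:
        valence, arousal = self.state
        if valence > 0.3:
            return "happy" if arousal > 0 else "content"
        if valence < -0.3:
            return "angry" if arousal > 0 else "sad"
        return "neutral"

engine = EmotionEngine()
engine.on_touch("stroke")
engine.on_touch("stroke")
print(engine.expression())                     # "content" after two strokes
```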

A Study on the Qualitative Evaluation of Emotion Based on Facial EMG and Facial Expression (얼굴근전도와 얼굴표정으로 인한 감성의 정성적 평가에 대한 연구)

  • 황민철;김지은;김철중
    • 대한인간공학회:학술대회논문집 / 대한인간공학회 1996년도 춘계학술대회논문집 / pp. 264-269 / 1996
  • Facial expression is an innate communication skill of humans. Humans can recognize psychological states from facial parameters such as surface movement, color, and humidity. This study aims to quantify and qualitatively evaluate human emotion by measuring facial electromyography (EMG) and facial movement. Measurements were taken over the frontalis and zygomaticus regions of the face. The results indicate that positive and negative emotional responses can be discriminated and that parameters sensitive to positive and negative facial expressions can be extracted. The correspondence between facial movement and EMG suggests the possibility of a non-invasive technique for assessing human emotion.
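
A bare-bones version of such an EMG comparison might look like the sketch below; the channel names, RMS measure, decision rule, and placeholder signals are assumptions for illustration, not the original analysis.

```python
# A minimal sketch, assuming EMG segments from the two recorded sites: compare
# zygomaticus (smile-related) and frontalis activity with an RMS amplitude
# measure and apply a simple illustrative decision rule.
import numpy as np

def rms(signal: np.ndarray) -> float:
    """Root-mean-square amplitude of an EMG segment."""
    signal = signal - signal.mean()
    return float(np.sqrt(np.mean(signal ** 2)))

def classify_valence(zygomaticus: np.ndarray, frontalis: np.ndarray) -> str:
    """Illustrative rule: label a trial positive if zygomaticus activity dominates."""
    return "positive" if rms(zygomaticus) > rms(frontalis) else "negative"

# Placeholder signals standing in for recorded EMG (e.g., 2 s at 1 kHz)
rng = np.random.default_rng(0)
print(classify_valence(rng.standard_normal(2000) * 1.5, rng.standard_normal(2000)))
```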

Facial Expression Recognition with Fuzzy C-Means Clustering Algorithm and Neural Network Based on Gabor Wavelets

  • Youngsuk Shin;Chansup Chung;Lee, Yillbyung
    • 한국감성과학회:학술대회논문집 / Proceedings of the 2000 Spring Conference of KOSES and the International Sensibility Ergonomics Symposium (한국감성과학회 2000년도 춘계 학술대회 및 국제 감성공학 심포지움 논문집) / pp. 126-132 / 2000
  • This paper presents a facial expression recognition method based on Gabor wavelets that uses a fuzzy C-means (FCM) clustering algorithm and a neural network. Features of facial expressions are extracted in two steps. In the first step, the Gabor wavelet representation provides edge extraction of the major face components using the average value of the image's 2-D Gabor wavelet coefficient histogram. In the next step, sparse features of facial expressions are extracted from the edge information using the FCM clustering algorithm. The result of facial expression recognition is compared with dimensional values of internal states derived from semantic ratings of words related to emotion. The dimensional model can recognize not only the six facial expressions related to Ekman's basic emotions but also expressions of various internal states.
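
The two-step feature extraction could be sketched roughly as follows; skimage's gabor filter stands in for the paper's Gabor wavelet representation, the fuzzy C-means loop is a textbook implementation, and the image, threshold, and cluster count are assumptions.

```python
# A rough sketch under stated assumptions (not the paper's implementation):
# Gabor filtering for edge-like responses, then fuzzy C-means to reduce the
# strongest responses to a small set of sparse feature points.
import numpy as np
from skimage.filters import gabor

def gabor_magnitude(image: np.ndarray, frequency: float = 0.3) -> np.ndarray:
    """Magnitude of one Gabor response; a full bank would vary theta and frequency."""
    real, imag = gabor(image, frequency=frequency)
    return np.hypot(real, imag)

def fuzzy_c_means(points: np.ndarray, c: int = 5, m: float = 2.0, iters: int = 50):
    """Textbook fuzzy C-means over (x, y) coordinates of strong responses."""
    rng = np.random.default_rng(0)
    u = rng.random((c, len(points)))
    u /= u.sum(axis=0)                                    # fuzzy memberships
    for _ in range(iters):
        w = u ** m
        centers = (w @ points) / w.sum(axis=1, keepdims=True)
        d = np.linalg.norm(points[None, :, :] - centers[:, None, :], axis=2) + 1e-9
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=0)
    return centers, u

image = np.random.default_rng(1).random((64, 64))          # placeholder face image
mag = gabor_magnitude(image)
ys, xs = np.nonzero(mag > np.quantile(mag, 0.99))          # strongest 1% of responses
centers, memberships = fuzzy_c_means(np.column_stack([xs, ys]).astype(float))
print(centers)                                             # sparse feature locations
```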
