• Title/Summary/Keyword: expression of emotion

Search Result 603

Emotion Recognition of Facial Expression using the Hybrid Feature Extraction (혼합형 특징점 추출을 이용한 얼굴 표정의 감성 인식)

  • Byun, Kwang-Sub;Park, Chang-Hyun;Sim, Kwee-Bo
    • Proceedings of the KIEE Conference
    • /
    • 2004.05a
    • /
    • pp.132-134
    • /
    • 2004
  • Emotion recognition between humans is performed compositely, using various cues such as the face, voice, and gestures. Among these, the face reveals emotional expression most clearly. Humans express and recognize emotion using the complex and varied features of the face. This paper proposes hybrid feature extraction for emotion recognition from facial expressions. Hybrid feature extraction imitates the human emotion recognition system by combining geometry-based feature extraction with a color-distribution histogram. By extracting many features of a facial expression, it can perform emotion recognition robustly.

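
The hybrid feature extraction the abstract describes, geometry-based features combined with a color-distribution histogram, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; the landmark set, histogram bins, and function names are all hypothetical.

```python
import numpy as np

def geometric_features(landmarks):
    """Geometry-based features: pairwise distances between facial landmarks."""
    pts = np.asarray(landmarks, dtype=float)
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(pts), k=1)       # keep each pair once
    return dists[iu]

def color_histogram(region_pixels, bins=8):
    """Color-distribution histogram over a facial region (intensities 0..255)."""
    hist, _ = np.histogram(region_pixels, bins=bins, range=(0, 256), density=True)
    return hist

def hybrid_features(landmarks, region_pixels):
    """Hybrid feature vector: geometric distances concatenated with the histogram."""
    return np.concatenate([geometric_features(landmarks), color_histogram(region_pixels)])

# Toy example: 4 landmarks and a flat list of pixel intensities.
feats = hybrid_features([(0, 0), (1, 0), (0, 1), (1, 1)],
                        [10, 200, 30, 120, 120, 250])
```

Concatenating the two feature families into one vector is the point of the "hybrid" design: a downstream classifier sees both shape and color information at once.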

Design of the emotion expression in multimodal conversation interaction of companion robot (컴패니언 로봇의 멀티 모달 대화 인터랙션에서의 감정 표현 디자인 연구)

  • Lee, Seul Bi;Yoo, Seung Hun
    • Design Convergence Study
    • /
    • v.16 no.6
    • /
    • pp.137-152
    • /
    • 2017
  • This research aims to develop a companion robot experience design for the elderly in Korea, based on a needs-function deploy matrix of the robot and research on robot emotion expression in multimodal interaction. First, elderly users' main needs were categorized into 4 groups based on ethnographic research. Second, the functional elements and physical actuators of the robot were mapped to user needs in a needs-function deploy matrix. The final UX design prototype was implemented as a robot with a verbal, non-touch multimodal interface and emotional facial expressions based on Ekman's Facial Action Coding System (FACS). The prototype was validated in a user test session that analyzed the influence of robot interaction on users' cognition and emotion, using a Story Recall Test and the face-emotion analysis software Emotion API, both when the robot's facial expression changed to match the emotion of the information it delivered and when the robot initiated the interaction cycle voluntarily. The group with the emotional robot showed a relatively high recall rate in the delayed recall test, and in the facial expression analysis, the robot's facial expression and interaction initiation affected the emotion and preference of the elderly participants.

The Relation between Preschoolers' Emotion Understanding and Parents' Emotion Expressiveness and Attitude Toward Children's Emotion Expressiveness (학령전 아동의 정서이해와 부모의 정서표현성 및 아동정서 수용태도와의 관계)

  • 이혜련;최보가
    • Journal of the Korean Home Economics Association
    • /
    • v.40 no.10
    • /
    • pp.103-112
    • /
    • 2002
  • This study investigated the relation between preschoolers' emotion understanding and parents' emotion expressiveness and attitude toward children's emotion expressiveness. Subjects were ninety 3- to 5-year-old children and their parents. Parents' emotion socialization was measured with the PACES developed by Saarni (1989) and the FEQ developed by Halberstadt (1986), and preschoolers' identification of basic emotional expressions and expression of their own and others' feelings in various situations were measured. Results revealed that 5-year-old children understood emotion better than 3-year-old children, and that mothers' positive emotion expression influenced children's emotion understanding. The results are consistent with recent research showing that parents' emotion socialization may be important for preschoolers' emotion understanding.

Emotion Recognition and Expression System of Robot Based on 2D Facial Image (2D 얼굴 영상을 이용한 로봇의 감정인식 및 표현시스템)

  • Lee, Dong-Hoon;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.4
    • /
    • pp.371-376
    • /
    • 2007
  • This paper presents an emotion recognition and expression system for an intelligent robot such as a home or service robot. Emotion is recognized from a facial image, using the motion and position of many facial features. A tracking algorithm recognizes a moving user from the mobile robot, and a face-region detection algorithm eliminates the skin color of the hands and the background outside the facial region in the acquired user image. After normalization, which enlarges or reduces the image according to the distance of the detected face region and rotates it by the angle of the face, the mobile robot obtains a facial image of fixed size. A multi-feature selection algorithm then enables the robot to recognize the user's emotion. A multilayer perceptron (an Artificial Neural Network, ANN) serves as the pattern recognizer, trained with the Back Propagation (BP) algorithm. The emotion the robot recognizes is expressed on a graphic LCD: two coordinates change according to the emotion output of the ANN, and the parameters of the facial elements (eyes, eyebrows, mouth) change with those coordinates. With this system, the robot expresses complex human emotions as an avatar on the LCD.
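
The pipeline described, an MLP whose emotion output drives the parameters of an avatar's facial elements, might look roughly like this. The weights here are random stand-ins for a BP-trained network, and the emotion labels and parameter names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["neutral", "happy", "sad", "angry"]  # hypothetical label set

# A tiny multilayer perceptron: facial feature vector -> emotion probabilities.
# In the paper the weights would come from back-propagation training.
W1, b1 = rng.normal(size=(16, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, len(EMOTIONS))), np.zeros(len(EMOTIONS))

def mlp_forward(x):
    h = np.tanh(x @ W1 + b1)             # hidden layer
    scores = h @ W2 + b2
    e = np.exp(scores - scores.max())    # softmax over emotion classes
    return e / e.sum()

def avatar_params(probs):
    """Map emotion probabilities to two 'emotion coordinates', then to
    parameters of the facial elements (eyes, eyebrows, mouth) for the avatar."""
    valence = probs[EMOTIONS.index("happy")] - probs[EMOTIONS.index("sad")]
    arousal = probs[EMOTIONS.index("angry")] + probs[EMOTIONS.index("happy")]
    return {"mouth_curve": valence,
            "eyebrow_raise": arousal,
            "eye_open": 0.5 + arousal / 2}

probs = mlp_forward(rng.normal(size=16))
params = avatar_params(probs)
```

The two intermediate coordinates (here a valence/arousal pair, an assumption) decouple recognition from rendering, so blended outputs of the network produce blended, "complex" avatar expressions rather than a single discrete face.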

Developmental trends of children's emotional intelligence (유아 정서지능 발달에 관한 연구)

  • Kim, Kyoung Hoe;Kim, Kyoung Hee
    • Korean Journal of Child Studies
    • /
    • v.21 no.4
    • /
    • pp.21-34
    • /
    • 2000
  • This investigation of developmental trends in children's emotional intelligence used the Emotional Intelligence Rating Scale for Preschool Children (Kim, 1998) to study 973 children. Significant age differences were found in 5 factors: 'utilization of emotion', 'regulation of emotion', 'handling of relationship between child and teacher' and 'handling of relationship with peers'. Children's emotional intelligence scores increased with age in 3 factors: 'utilization of emotion', 'empathy', and 'regulation of emotion'. Sex differences were found in 5 factors: 'utilization of emotion', 'empathy', 'appraisal and expression of self emotion', 'regulation of emotion', and 'handling of relationship between child and teacher'. In all factors, the scores of girls were higher than those of boys.


The Mediating Effects of Difficulties in Emotion Regulation between Anger Expression and Interpersonal Problems of College Students (분노표현과 대인관계문제에서 정서조절곤란의 매개효과)

  • Lee, Myung-In;Seo, Hye-Young;Hwang, Soon-Jung
    • Journal of Digital Convergence
    • /
    • v.19 no.1
    • /
    • pp.295-305
    • /
    • 2021
  • This study identifies the relationship between anger expression and interpersonal problems among college students, and confirms difficulties in emotion regulation as a mediating effect. It was conducted to understand college students who report interpersonal problems and to provide basic data for developing programs that can address those problems. Anger expression showed a positive correlation with interpersonal problems and with difficulties in emotion regulation, and interpersonal problems showed a somewhat high positive correlation with emotion-regulation difficulty. In addition, emotion-regulation difficulty partially mediated the relationship between anger expression and interpersonal problems among college students. In conclusion, future research is needed to develop educational programs that reduce emotion-regulation difficulty and lead to positive anger expression among college students, as well as intervention programs that can improve interpersonal relationships.
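
The partial mediation reported here is the classic three-regression pattern: the direct effect of anger expression on interpersonal problems shrinks, but does not vanish, once emotion-regulation difficulty enters the model. A sketch with simulated data (all coefficients and the data itself are invented for illustration, not the study's results):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
anger = rng.normal(size=n)                                # X: anger expression
dysreg = 0.6 * anger + rng.normal(size=n)                 # M: emotion-regulation difficulty
problems = 0.3 * anger + 0.5 * dysreg + rng.normal(size=n)  # Y: interpersonal problems

def slopes(predictors, y):
    """OLS with intercept; returns the slope for each predictor."""
    A = np.column_stack([np.ones(len(y))] + list(predictors))
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[1:]

c_total = slopes([anger], problems)[0]            # total effect of X on Y
c_direct, b = slopes([anger, dysreg], problems)   # direct effect, controlling for M
a = slopes([anger], dysreg)[0]
indirect = a * b                                  # mediated (indirect) effect
# Partial mediation: c_direct is smaller than c_total but still nonzero.
```

In OLS with a single mediator, the total effect decomposes exactly as c_total = c_direct + a*b, which is why a positive indirect path necessarily shrinks the direct coefficient.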

The interaction between emotion recognition through facial expression based on cognitive user-centered television (이용자 중심의 얼굴 표정을 통한 감정 인식 TV의 상호관계 연구 -인간의 표정을 통한 감정 인식기반의 TV과 인간의 상호 작용 연구)

  • Lee, Jong-Sik;Shin, Dong-Hee
    • Journal of the HCI Society of Korea
    • /
    • v.9 no.1
    • /
    • pp.23-28
    • /
    • 2014
  • In this study we focus on the effect of interaction between humans and a reactive television that uses emotion recognition through facial expressions. Most of today's user interfaces in electronic products are passive and do not properly fit users' needs. In terms of user-centered devices, we propose that an emotion-based reactive television is more effective in interaction than passive input products. We have developed and researched next-generation user-centered cognitive TV models. This paper presents the results of an experiment performed with the Fraunhofer IIS SHORE™ demo software to measure emotion recognition. This new approach was based on real-time cognitive TV models, through which we studied the relationship between humans and cognitive TV. The study follows these steps: 1) the cognitive TV system can switch ON/OFF automatically in response to people's motion; 2) the cognitive TV can select channels directly as the face changes (e.g., Neutral, Happy, Sad, and Angry modes); 3) the cognitive TV detects the emotion from a person's facial expression within a fixed time, and if Happy mode is detected the TV shifts to funny or interesting shows, while if Angry mode is detected it changes to moving or touching shows. In addition, we focus on improving emotion recognition through facial expressions. Furthermore, improving the cognitive TV according to personal characteristics is needed to account for the different personalities of users in human-computer interaction. In this manner, how people feel and how the cognitive TV responds accordingly, plus the effects of media as a cognitive mechanism, are thoroughly discussed.
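
The three steps in the abstract amount to a small state machine: motion toggles power, and a detected emotion maps to a program genre. A sketch of that control logic, with a hypothetical emotion-to-genre mapping (the actual mapping and class names are the paper's, not reproduced here):

```python
# Hypothetical mapping for step 3: detected emotion -> program genre.
GENRE_BY_EMOTION = {"happy": "comedy",
                    "angry": "touching drama",
                    "sad": "touching drama",
                    "neutral": None}          # neutral: leave the channel alone

class CognitiveTV:
    def __init__(self):
        self.on = False
        self.genre = None

    def on_motion(self, motion_detected: bool):
        """Step 1: automatic ON/OFF responding to people's motion."""
        self.on = motion_detected

    def on_emotion(self, emotion: str):
        """Steps 2-3: react to the facial emotion detected within the fixed window."""
        if not self.on:
            return
        genre = GENRE_BY_EMOTION.get(emotion)
        if genre is not None:
            self.genre = genre

tv = CognitiveTV()
tv.on_motion(True)      # viewer walks in -> TV turns on
tv.on_emotion("happy")  # happy face -> switch to a funny show
```

Guarding the emotion handler on the power state keeps step 1 authoritative: a face detected while the set is off cannot change channels.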

Weighted Soft Voting Classification for Emotion Recognition from Facial Expressions on Image Sequences (이미지 시퀀스 얼굴표정 기반 감정인식을 위한 가중 소프트 투표 분류 방법)

  • Kim, Kyeong Tae;Choi, Jae Young
    • Journal of Korea Multimedia Society
    • /
    • v.20 no.8
    • /
    • pp.1175-1186
    • /
    • 2017
  • Human emotion recognition is one of the promising applications in the era of artificial super intelligence. Thus far, facial expression traits are considered the most widely used information cues for realizing automated emotion recognition. This paper proposes a novel facial expression recognition (FER) method that works well for recognizing emotion from image sequences. To this end, we develop the so-called weighted soft voting classification (WSVC) algorithm. In the proposed WSVC, a number of classifiers are first constructed using different and multiple feature representations. Next, these classifiers generate a recognition result (namely, a soft vote) for each face image within a face sequence, yielding multiple soft voting outputs. Finally, the soft voting outputs are combined through a weighted combination to decide the emotion class (e.g., anger) of a given face sequence. The combination weights are determined by measuring the quality of each face image, namely its "peak expression intensity" and "frontal-pose degree". To test the proposed WSVC, the CK+ FER database was used to perform extensive and comparative experiments. The feasibility of our WSVC algorithm has been successfully demonstrated through comparison with recently developed FER algorithms.
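
The final combination step of WSVC, a quality-weighted average of per-frame soft votes followed by an argmax, can be sketched as below. The per-frame weights stand in for the paper's "peak expression intensity" and "frontal-pose degree" quality measures; their actual computation is not reproduced here.

```python
import numpy as np

def wsvc(frame_probs, weights):
    """Weighted soft voting over a face sequence.

    frame_probs: (n_frames, n_classes) soft (probability) outputs, one row per face image.
    weights:     (n_frames,) per-frame quality scores (e.g. expression intensity, pose).
    Returns the index of the predicted emotion class for the whole sequence."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                         # normalize quality weights
    combined = (w[:, None] * np.asarray(frame_probs)).sum(axis=0)
    return int(np.argmax(combined))

# Three frames scored over classes [anger, happiness, sadness]; the last,
# highest-quality frame dominates the vote.
pred = wsvc([[0.5, 0.3, 0.2],
             [0.4, 0.4, 0.2],
             [0.1, 0.8, 0.1]],
            weights=[0.2, 0.2, 0.6])
# pred == 1 (happiness)
```

Weighting by frame quality lets near-neutral or off-pose frames contribute little, so a single clear peak-expression frame can decide the sequence label.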

Effects of body image and emotional expression beliefs on the communication competence of alcohol-dependent patients admitted to a psychiatric hospital (정신전문병원에 입원한 알코올 의존 환자의 신체상, 정서표현 신념이 의사소통 능력에 미치는 영향)

  • Ahn, Seong-Ah;Lee, Kyoung-Sook
    • Journal of Digital Convergence
    • /
    • v.17 no.12
    • /
    • pp.289-296
    • /
    • 2019
  • This study was done to explore the relationships among alcoholics' body image, beliefs about emotion expression, and communication competence, and their related factors. Body image, beliefs about emotion expression, and communication competence scales were administered to a sample of 151 patients from 2 hospitals in J-city and S-city. T-test, ANOVA, Pearson correlation, and stepwise multiple regression with SPSS/WIN version 21.0 were used to analyze the data. Alcoholics' body image was negatively correlated with beliefs about emotion expression and was also negatively correlated with communication competence. Beliefs about emotion expression were negatively correlated with communication competence. Hierarchical regression analysis showed that body image, education level, and beliefs about emotion expression explained 24.9% of communication competence. The results of this study can be used as basic data to improve the communication competence of alcoholics.