• Title/Summary/Keyword: Facial emotion


Emotion Recognition Based on Facial Expression by using Context-Sensitive Bayesian Classifier (상황에 민감한 베이지안 분류기를 이용한 얼굴 표정 기반의 감정 인식)

  • Kim, Jin-Ok
    • The KIPS Transactions:PartB
    • /
    • v.13B no.7 s.110
    • /
    • pp.653-662
    • /
    • 2006
  • In ubiquitous computing, which aims to build environments that provide appropriate services according to the user's context, emotion recognition based on facial expression is an essential means of HCI, making human-machine interaction more efficient and enabling user context-awareness. This paper addresses the problem of recognizing basic emotions in context-sensitive facial expressions through a new Bayesian classifier. The emotion recognition task consists of two steps: a facial-feature extraction step based on a color-histogram method, and a classification step that employs a new Bayesian learning algorithm for efficient training and testing. A new context-sensitive Bayesian learning algorithm, EADF (Extended Assumed-Density Filtering), is proposed to recognize emotions more accurately by using different classifier complexities for different contexts. Experimental results show an expression classification accuracy of over 91% on the test database, and an error rate of 10.6% when facial expression is modeled as hidden context.
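
The idea of fitting different classifiers for different contexts can be sketched in a much-simplified form as a Gaussian naive Bayes model trained separately per context. This is an illustrative stand-in, not the paper's EADF algorithm, and all names are hypothetical:

```python
import math

# Hypothetical sketch: one Gaussian naive Bayes model per context, loosely
# mirroring the idea of context-dependent classifier parameters.
class ContextSensitiveNB:
    def __init__(self):
        # stats[context][label] -> list of (mean, variance) per feature
        self.stats = {}

    def fit(self, samples):
        # samples: list of (context, label, feature_vector)
        grouped = {}
        for ctx, label, vec in samples:
            grouped.setdefault((ctx, label), []).append(vec)
        for (ctx, label), vecs in grouped.items():
            n, dims = len(vecs), len(vecs[0])
            means = [sum(v[d] for v in vecs) / n for d in range(dims)]
            vars_ = [max(sum((v[d] - means[d]) ** 2 for v in vecs) / n, 1e-6)
                     for d in range(dims)]
            self.stats.setdefault(ctx, {})[label] = list(zip(means, vars_))

    def predict(self, ctx, vec):
        best, best_lp = None, float("-inf")
        for label, params in self.stats[ctx].items():
            # log-likelihood under independent Gaussians per feature
            lp = 0.0
            for x, (m, v) in zip(vec, params):
                lp += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
            if lp > best_lp:
                best, best_lp = label, lp
        return best
```

Training on toy color-histogram feature vectors and predicting within the same context illustrates the per-context separation; a real system would add priors and the assumed-density filtering updates the paper describes.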

Dynamic Emotion Model in 3D Affect Space for a Mascot-Type Facial Robot (3차원 정서 공간에서 마스코트 형 얼굴 로봇에 적용 가능한 동적 감정 모델)

  • Park, Jeong-Woo;Lee, Hui-Sung;Jo, Su-Hun;Chung, Myung-Jin
    • The Journal of Korea Robotics Society
    • /
    • v.2 no.3
    • /
    • pp.282-287
    • /
    • 2007
  • Humanoid and android robots are emerging as the trend shifts from industrial robots to personal robots, so human-robot interaction will increase. The ultimate objective of humanoid and android robots is a robot like a human; in this respect, implementing facial expressions is necessary for making a human-like robot. This paper proposes a dynamic emotion model for a mascot-type robot to display human-like and more recognizable facial expressions.

Hybrid-Feature Extraction for the Facial Emotion Recognition

  • Byun, Kwang-Sub;Park, Chang-Hyun;Sim, Kwee-Bo;Jeong, In-Cheol;Ham, Ho-Sang
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.1281-1285
    • /
    • 2004
  • There are numerous emotions in the human world. Humans express and recognize emotion through various channels, for example the eyes, nose, and mouth. In particular, emotion recognition from facial expressions can be very flexible and robust because it utilizes these various channels. The hybrid-feature extraction algorithm is based on this human process: it uses geometric feature extraction together with a color-distribution histogram, and then classifies the input emotion through independently and in-parallel trained neural networks. For natural classification of emotion, an advanced two-dimensional emotion space is also introduced and used in this paper, enabling flexible and smooth classification of emotion.
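
Combining the two feature channels described above could look roughly like the following sketch, which concatenates a few geometric landmark distances with a normalized intensity histogram. The landmark names and bin count are hypothetical placeholders, not the paper's actual features:

```python
import math

def geometric_features(landmarks):
    # landmarks: dict of (x, y) points, e.g. eye and mouth corners
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return [
        dist(landmarks["left_eye"], landmarks["right_eye"]),
        dist(landmarks["mouth_left"], landmarks["mouth_right"]),
        dist(landmarks["left_eye"], landmarks["mouth_left"]),
    ]

def color_histogram(pixels, bins=4):
    # pixels: iterable of intensities in [0, 255]; returns a normalized histogram
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = max(sum(hist), 1)
    return [h / total for h in hist]

def hybrid_feature(landmarks, pixels):
    # concatenate the geometric channel and the color-distribution channel
    return geometric_features(landmarks) + color_histogram(pixels)
```

The resulting vector would then be fed to the parallel neural networks the abstract mentions; those networks are omitted here.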

Mood Suggestion Framework Using Emotional Relaxation Matching Based on Emotion Meshes

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.8
    • /
    • pp.37-43
    • /
    • 2018
  • In this paper, we propose a framework that automatically suggests emotions using an emotion analysis method based on facial expression changes. We use Microsoft's Emotion API to calculate and analyze emotion values in facial expressions, recognizing emotions that change over time. In this step, we use standard deviations based on peak analysis to measure and classify emotional changes. The difference between the classified emotion and the normal emotion is calculated and used to recognize emotional abnormality. We match the user's emotions to relatively relaxed emotions using histograms and emotion meshes, and as a result provide relaxed emotions to users through images. The proposed framework helps users recognize emotional changes easily and train their emotions through emotional relaxation.

Dynamic Facial Expression of Fuzzy Modeling Using Probability of Emotion (감정확률을 이용한 동적 얼굴표정의 퍼지 모델링)

  • Kang, Hyo-Seok;Baek, Jae-Ho;Kim, Eun-Tai;Park, Mignon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.1
    • /
    • pp.1-5
    • /
    • 2009
  • This paper suggests applying a 2D emotion recognition database, based on a mirror-reflected method, to a 3D application, and generates facial expressions via fuzzy modeling using emotion probabilities. The suggested facial expression function applies fuzzy theory to three basic movements for facial expressions. The method carries feature vectors for emotion recognition from the 2D application, obtained using mirror-reflected multi-images, over to the 3D application. Thus, we can build a fuzzy nonlinear facial expression model of a 3D model from the 2D model of a real subject. We use average values of the probabilities of the six basic expressions: happiness, sadness, disgust, anger, surprise, and fear. Furthermore, dynamic facial expressions are generated via fuzzy modeling. This paper compares and analyzes the feature vectors of the real model with those of a 3D human-like avatar.
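
One simple way to turn emotion probabilities into a dynamic expression, in the spirit of the abstract above, is to blend per-emotion movement parameters by their probabilities. The three movement values per emotion below (e.g. brow, eyelid, mouth) are hypothetical placeholders, not the paper's fuzzy model:

```python
# Illustrative per-emotion movement parameters (brow, eyelid, mouth); values
# are invented for the sketch, not taken from the paper.
EXPRESSION_PARAMS = {
    "happy":    [0.0,  0.2,  0.9],
    "sad":      [-0.5, -0.3, -0.6],
    "surprise": [0.9,  0.8,  0.4],
}

def blend_expression(probabilities):
    # probabilities: dict emotion -> probability (should sum to ~1);
    # returns the probability-weighted mix of movement parameters
    blended = [0.0, 0.0, 0.0]
    for emotion, p in probabilities.items():
        for i, v in enumerate(EXPRESSION_PARAMS[emotion]):
            blended[i] += p * v
    return blended
```

Because the output varies continuously with the probabilities, an expression morphs smoothly as the estimated emotion mix changes over time.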

Discriminative Effects of Social Skills Training on Facial Emotion Recognition among Children with Attention-Deficit/Hyperactivity Disorder and Autism Spectrum Disorder

  • Lee, Ji-Seon;Kang, Na-Ri;Kim, Hui-Jeong;Kwak, Young-Sook
    • Journal of the Korean Academy of Child and Adolescent Psychiatry
    • /
    • v.29 no.4
    • /
    • pp.150-160
    • /
    • 2018
  • Objectives: This study investigated the effect of social skills training (SST) on facial emotion recognition and discrimination in children with attention-deficit/hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). Methods: Twenty-three children aged 7 to 10 years participated in our SST: 15 diagnosed with ADHD and 8 with ASD. The participants' parents completed the Korean version of the Child Behavior Checklist (K-CBCL), the ADHD Rating Scale, and the Conners Scale at baseline and post-treatment. The participants completed the Korean Wechsler Intelligence Scale for Children-IV (K-WISC-IV) and the Advanced Test of Attention at baseline, and the Penn Emotion Recognition and Discrimination Tasks at baseline and post-treatment. Results: No significant changes in facial emotion recognition and discrimination occurred in either group between baseline and post-treatment. However, when controlling for the processing speed of the K-WISC and the social subscale of the K-CBCL, the ADHD group showed more improvement than the ASD group in total (p=0.049), female (p=0.039), sad (p=0.002), mild (p=0.015), female extreme (p=0.005), male mild (p=0.038), and Caucasian (p=0.004) facial expressions. Conclusion: SST improved facial expression recognition more effectively for children with ADHD than for children with ASD, who need additional training to support emotion recognition and discrimination.

A Study on the Applicability of Facial Action Coding System for Product Design Process (제품 디자인 프로세스를 위한 표정 부호화 시스템(FACS) 적용성에 대한 연구)

  • Huang, Chao;Go, Jung-Wook
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.80-88
    • /
    • 2019
  • With more emphasis on emotional communication with users in the product design field, designers' clear and prompt grasp of users' emotions has become a core activity in product design research. To increase the flexibility of applying emotion measurement in the product design process, this study used the Facial Action Coding System (FACS), a behavioral emotion measurement method, in product design evaluation. To select specimens, it flexibly used an emotional product image map. The study then selected six product stimuli inducing positive, negative, and neutral emotions, conducted a FACS experiment with ordinary product users in their 20s as subjects, and analyzed the users' emotional states in response to the stimuli through their facial expressions. It also analyzes the advantages and disadvantages of FACS in the product design process, such as "recording users' unconscious facial expressions," and puts forward applicable schemes, such as "choosing a product stimulus with high user response." This paper is expected to help establish FACS as a flexible method for predicting users' emotions at the trial stage of product design, before products are launched to the market.

The Effect of Impulsivity and the Ability to Recognize Facial Emotion on the Aggressiveness of Children with Attention-Deficit Hyperactivity Disorder (주의력결핍 과잉행동장애 아동에서 감정인식능력 및 충동성이 공격성에 미치는 영향)

  • Bae, Seung-Min;Shin, Dong-Won;Lee, Soo-Jung
    • Journal of the Korean Academy of Child and Adolescent Psychiatry
    • /
    • v.20 no.1
    • /
    • pp.17-22
    • /
    • 2009
  • Objectives: A higher level of aggression has been reported for children with attention-deficit/hyperactivity disorder (ADHD) than for non-ADHD children, and aggression has been shown to have a negative effect on the social functioning of children with ADHD. The ability to recognize facial emotion expressions has also been related to aggression. In this study, we examined whether impulsivity and dysfunctional recognition of facial emotion expressions could explain the aggressiveness of children with ADHD. Methods: 67 children with ADHD participated in this study. We measured the ability to recognize facial emotion expressions using the Emotion Recognition Test (ERT), and aggression using the T score of the aggression subscale of the Child Behavior Checklist (CBCL). Impulsivity was measured with the ADHD Diagnostic System (ADS). Results: The teacher-rated level of aggression was related to the score for recognizing negative affect. After controlling for the effect of impulsivity, this relationship was not significant; only the score for visual commission errors explained the level of aggression of children with ADHD. Conclusion: Impulsivity seems to play a major role in explaining the aggression of children with ADHD. The clinical implication of this study is that effective intervention for controlling impulsivity may be expected to reduce the aggression of children with ADHD.

Development of Emotion Recognition System Using Facial Image (얼굴 영상을 이용한 감정 인식 시스템 개발)

  • Kim, M.H.;Joo, Y.H.;Park, J.B.;Lee, J.;Cho, Y.J.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.2
    • /
    • pp.191-196
    • /
    • 2005
  • Although emotion recognition technology is important and in demand in various fields, it still remains an unsolved problem. In particular, there is growing demand for emotion recognition technology based on facial images. A facial-image-based emotion recognition system is a complex system comprising various technologies; techniques such as facial image analysis, feature vector extraction, and pattern recognition are needed to develop it. In this paper, we propose a new emotion recognition system based on a previously studied facial image analysis technique. The proposed system recognizes emotion using a fuzzy classifier. A facial image database is built, and the performance of the proposed system is verified using this database.
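
A fuzzy classifier of the kind mentioned above might, in its simplest form, score each emotion by triangular membership functions over a couple of facial features and pick the emotion with the highest combined membership. The features, rule parameters, and min-combination below are illustrative assumptions, not the paper's actual rules:

```python
def triangular(x, left, peak, right):
    # triangular membership function: 0 outside [left, right], 1 at peak
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

# Each rule combines memberships of mouth curvature and eye openness
# (hypothetical features, normalized to roughly [-1, 1] and [0, 1]).
RULES = {
    "happy":    lambda m, e: min(triangular(m, 0.2, 0.8, 1.0),
                                 triangular(e, 0.3, 0.6, 1.0)),
    "sad":      lambda m, e: min(triangular(m, -1.0, -0.7, -0.1),
                                 triangular(e, 0.0, 0.3, 0.7)),
    "surprise": lambda m, e: min(triangular(m, -0.3, 0.0, 0.3),
                                 triangular(e, 0.6, 0.9, 1.2)),
}

def classify(mouth_curvature, eye_openness):
    # return the emotion whose fuzzy rule fires most strongly
    scores = {emo: rule(mouth_curvature, eye_openness)
              for emo, rule in RULES.items()}
    return max(scores, key=scores.get)
```

The appeal of the fuzzy approach is that inputs near a rule boundary degrade gracefully instead of flipping abruptly between classes.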

Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.4
    • /
    • pp.562-567
    • /
    • 2009
  • As a key mechanism of human emotional interaction, facial expression is a powerful tool in HRI (Human-Robot Interaction) as well as HCI (Human-Computer Interaction). By using facial expressions, we can produce various reactions corresponding to the emotional state of the user, and service agents such as intelligent robots can infer suitable services to supply. This article addresses the issue of expressive face modeling using an advanced Active Appearance Model for facial emotion recognition. We consider the six universal emotion categories defined by Ekman. In the human face, emotions are most widely represented by eye and mouth expressions; to recognize emotion from a facial image, we need to extract feature points such as Ekman's Action Units (AUs). The Active Appearance Model (AAM) is one of the commonly used methods for facial feature extraction, and it can be applied to construct AUs. Because the traditional AAM depends on the setting of the model's initial parameters, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian network. First, we obtain the reconstruction parameters of a new gray-scale image by sample-based learning, use them to reconstruct the shape and texture of the new image, and calculate the initial parameters of the AAM from the reconstructed facial model. We then reduce the distance error between the model and the target contour by adjusting the model parameters. Finally, after several iterations, we obtain a model matched to the facial feature outline and use it to recognize the facial emotion with a Bayesian network.
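
The final inference step, mapping detected Action Units to emotions with a Bayesian network, can be sketched with a tiny discrete model in which AU activations are assumed conditionally independent given the emotion (a naive-Bayes simplification of a full network). The priors, conditional probabilities, and the two AUs below are illustrative values, not taken from the paper:

```python
# Illustrative priors and conditionals; AU12 = lip-corner puller,
# AU26 = jaw drop (two of Ekman's Action Units).
PRIOR = {"happy": 0.5, "surprise": 0.5}
AU_GIVEN_EMOTION = {
    "happy":    {"AU12": 0.9, "AU26": 0.1},
    "surprise": {"AU12": 0.1, "AU26": 0.9},
}

def infer_emotion(active_aus):
    # active_aus: set of AU names detected in the face (e.g. via AAM fitting);
    # returns the normalized posterior over emotions
    posterior = {}
    for emotion, prior in PRIOR.items():
        p = prior
        for au, p_active in AU_GIVEN_EMOTION[emotion].items():
            p *= p_active if au in active_aus else (1 - p_active)
        posterior[emotion] = p
    total = sum(posterior.values())
    return {e: p / total for e, p in posterior.items()}
```

Observing a lip-corner puller without a jaw drop pushes the posterior strongly toward "happy"; the reverse observation favors "surprise". A real Bayesian network would also model dependencies among AUs rather than assuming independence.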