• Title/Summary/Keyword: Facial Emotions

Search Results: 159

A Study on Effective Facial Expression of 3D Character through Variation of Emotions (Model using Facial Anatomy) (감정변화에 따른 3D캐릭터의 표정연출에 관한 연구 (해부학적 구조 중심으로))

  • Kim, Ji-Ae
    • Journal of Korea Multimedia Society
    • /
    • v.9 no.7
    • /
    • pp.894-903
    • /
    • 2006
  • Rapid growth in hardware technology has brought about the development and expansion of many forms of digital moving-image content, including 3D. 3D digital techniques are used widely in animation, virtual reality, film, advertising, games, and more. 3D characters in digital motion pictures play a core role in communicating emotions and information to users through sound, facial expression, and characteristic motion. Interest in 3D motion and facial expression is growing as 3D character design is used more frequently and across a wider range of applications. This study examines facial expression as an effective means of conveying implicit emotions, investigates 3D characters' facial expressions and muscle movements based on human anatomy, and seeks an effective method of facial expression direction. Finally, it also examines the differences between 2D and 3D characters, building on the author's preceding research.


Computer-Based Training Program to Facilitate Learning of the Relationship between Facial-Based and Situation-Based Emotions and Prosocial Behaviors

  • Takezawa, Tomohiro;Ogoshi, Sakiko;Ogoshi, Yasuhiro;Mitsuhashi, Yoshinori;Hiratani, Michio
    • Industrial Engineering and Management Systems
    • /
    • v.11 no.2
    • /
    • pp.142-147
    • /
    • 2012
  • Individuals with autistic spectrum disorders (ASD) have difficulty inferring other people's feelings from their facial expressions and/or from situational cues, and are therefore less able to respond with prosocial behavior. We developed a computer-based training program to teach the connection between facial-based or situation-based emotions and prosocial behavioral responses. An 8-year-old male schoolchild with ASD participated in the study. In this program, he was trained to identify persons in need of help and the appropriate prosocial responses using novel photo-based scenarios. When he misidentified emotions from photographs of another person's face, the program highlighted the parts of the face that most effectively communicate emotion. To increase the likelihood that he would learn a generalized repertoire of emotional understanding, multiple examples of emotional expressions and situations were provided. When he misidentified persons expressing a need for help, or failed to identify appropriate helping behaviors, role playing was used to help him appreciate the state of mind of a person in need. The results of the training showed increases in prosocial behaviors during a laboratory task requiring collaborative work, and his homeroom teacher, using a behavioral rating scale, reported that he understood others' emotions and situations better than before training. These findings indicate that the effects of the training were not limited to the artificial experimental situation but also carried over to his school life.

A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education
    • /
    • v.4 no.2
    • /
    • pp.171-175
    • /
    • 2012
  • Recently, research on emotional ICT (Information and Communication Technology), which recognizes and communicates human emotion through information technology, has been increasing. For instance, there is research on phones and services that perceive users' emotions by detecting voices, facial expressions, and biometric data. In short, emotions that once could be read only by humans are now inferred by digital equipment. Among the many areas of emotional ICT research, emotion recognition from the face is expected to provide the most effective and natural human interface. This paper studies emotional ICT and examines the mechanism of a facial expression recognition system as a representative example.


A Study on Facial Expression Acting in Genre Drama - with Focus on K-Drama Voice2 - (장르 드라마에서의 표정연기연구 - 드라마 '보이스2'를 중심으로 -)

  • Oh, Youn-Hong
    • Journal of Korea Entertainment Industry Association
    • /
    • v.13 no.8
    • /
    • pp.313-323
    • /
    • 2019
  • For screen actors, facial expression acting can easily become 'forced expression' or 'overacting'; conversely, if self-restraint is overemphasized, it becomes 'flat acting' lacking in emotion. Raising questions about such facial acting methods, this study analyzed the facial expression acting of actors in genre dramas with strong commercial appeal. In conclusion, the actors' facial acting in genre dramas followed typical patterns, meaning that within the visual conventions of screen acting, aesthetic standards have become the key criterion for facial expression acting. In genre dramas, characters' emotions are often revealed in close-up shots. Within a close-up, the most important expressive medium of the 'zoomed-in face' is the 'pupil of the eye': emotions are mostly expressed through the movements of the eyes and the muscles around them. The second most important medium is the 'mouth'; differences in how far it opens or closes convey diverse emotions alongside the expression of the eyes. In addition, tension in the facial muscles greatly hinders emotional expression, so facial muscle movement must be minimized to prevent excessive wrinkles from forming on the surface of the face. Facial expressions are not completed by muscle movement alone; ultimately, muscle movement is the result of emotion, and facial expression acting takes place only after the emotion is felt. For this, the actor must go through the process of 'personalizing' a character using Stanislavsky's psychological acting techniques such as 'emotional memory', 'concentration', and 'relaxation', and must understand the characteristics of close-up shots that visually reveal the 'inner world'. The study also found that facial expression acting serves as reaction acting at key points in the unfolding of the narrative, and that the required methods of facial expression and shot sizes differ between main and supporting roles.

The Congruent Effects of Gesture and Facial Expression of Virtual Character on Emotional Perception: What Facial Expression is Significant? (가상 캐릭터의 몸짓과 얼굴표정의 일치가 감성지각에 미치는 영향: 어떤 얼굴표정이 중요한가?)

  • Ryu, Jeeheon;Yu, Seungbeom
    • The Journal of the Korea Contents Association
    • /
    • v.16 no.5
    • /
    • pp.21-34
    • /
    • 2016
  • In designing and developing a virtual character, it is important to correctly deliver target emotions generated by the combination of facial expression and gesture. The purpose of this study was to examine the effect of congruence/incongruence between gesture and facial expression on the perceived target emotion. Four emotions were applied: joy, sadness, fear, and anger. The results showed that the sadness emotion was incorrectly perceived; it was perceived as anger instead. Sadness was easily confused when facial expression and gesture were presented simultaneously, whereas the other intended emotional expressions were correctly perceived. The overall evaluation of the virtual character's emotional expression was significantly lower when a joy gesture was combined with a sad facial expression. The results suggest that the emotional gesture is the more influential channel for correctly delivering target emotions to users, and that social cues such as the virtual character's gender or age should be studied further.

Video-based Facial Emotion Recognition using Active Shape Models and Statistical Pattern Recognizers (Active Shape Model과 통계적 패턴인식기를 이용한 얼굴 영상 기반 감정인식)

  • Jang, Gil-Jin;Jo, Ahra;Park, Jeong-Sik;Seo, Yong-Ho
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.14 no.3
    • /
    • pp.139-146
    • /
    • 2014
  • This paper proposes an efficient method for automatically distinguishing various facial expressions. To recognize emotions from facial expressions, facial images are captured by digital cameras and a number of feature points are extracted. The extracted feature points are then transformed into 49-dimensional feature vectors that are robust to scale and translational variations, and the facial emotions are recognized by statistical pattern classifiers such as Naive Bayes, MLP (multi-layer perceptron), and SVM (support vector machine). In experiments with 5-fold cross-validation, the SVM performed best among the classifiers, achieving 50.8% accuracy for six-emotion classification and 78.0% for three emotions.
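
The classification stage this abstract describes can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, not the authors' code: it substitutes synthetic 49-dimensional feature vectors for the Active Shape Model features and compares Naive Bayes, an MLP, and an SVM by 5-fold cross-validation, mirroring the paper's experimental setup.

```python
# Sketch of the classifier comparison (synthetic data, not the paper's code).
# Real input would be 49-dim shape features from Active Shape Model fitting.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_classes, dim = 40, 6, 49   # 6 emotions, 49-dim features
X = np.vstack([rng.normal(loc=c, scale=2.0, size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

classifiers = {
    "NaiveBayes": GaussianNB(),
    "MLP": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                         random_state=0),
    "SVM": SVC(kernel="rbf"),
}
# Mean accuracy over 5 stratified folds for each classifier.
scores = {name: cross_val_score(clf, X, y, cv=5).mean()
          for name, clf in classifiers.items()}
```

With well-separated synthetic clusters all three classifiers score highly; on real expression data the paper reports the SVM ahead of the other two.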

The Effects of the Emotion Regulation Strategy to the Disgust Stimulus on Facial Expression and Emotional Experience (혐오자극에 대한 정서조절전략이 얼굴표정 및 정서경험에 미치는 영향)

  • Jang, Sung-Lee;Lee, Jang-Han
    • Korean Journal of Health Psychology
    • /
    • v.15 no.3
    • /
    • pp.483-498
    • /
    • 2010
  • This study examines the effects of emotion regulation strategies on facial expressions and emotional experiences, comparing groups using antecedent-focused and response-focused regulation. Fifty female undergraduate students were instructed to use different emotion regulation strategies while viewing a disgust-inducing film, and their facial expressions and emotional experiences were measured during viewing. Participants in the EG (expression group) showed the highest frequency of disgust-related action units, followed in order by the DG (expressive dissonance group), CG (cognitive reappraisal group), and SG (expressive suppression group). The upper region of the face reflected genuine emotions: in this region, the frequency of disgust-related action units was lower in the CG than in the EG or DG. PANAS results indicated the largest decrease in positive emotion in the DG, but an increase in positive emotion in the CG. These findings suggest that cognitive reappraisal of an event is a more functional emotion regulation strategy, in terms of facial expression and emotional experience, than the other strategies examined.

Core Affect Dimensional Structures Derived from Facial Expressions of Older Adults (고령자의 연령별 얼굴 정서 차원)

  • Jongwan Kim
    • Science of Emotion and Sensibility
    • /
    • v.27 no.3
    • /
    • pp.51-60
    • /
    • 2024
  • Previous research has reported a decline in facial emotion recognition with aging, but whether this reflects a genuine decline in recognition ability or an own-age face recognition bias remains unclear, as most studies used stimuli from younger models. This study therefore recruited older adults as participants and used stimuli identical to those of Kim's (2021) study for direct comparison. Participants rated the similarity of pairs of facial expressions representing six emotions (anger, disgust, fear, happiness, neutrality, and sadness) across three model age groups (young, middle-aged, and old). Multidimensional scaling confirmed all three core affect dimensions regardless of model age, indicating similar representations of facial emotion across age groups. The older participants assigned lower arousal and dominance to younger faces expressing disgust and higher arousal and dominance to younger faces expressing fear; that is, they rated younger faces' disgust expressions less strongly and overestimated their fear expressions. These findings suggest that the own-age bias in facial expression perception may be emotion-specific rather than universal across all emotions.
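
The multidimensional scaling analysis described above can be sketched as follows. This is an illustrative reconstruction under assumptions, not the study's code: the 6x6 similarity matrix is invented, ratings are assumed to lie on a 1-7 scale, and similarities are converted to dissimilarities before embedding into three dimensions (one per hypothesized core affect dimension).

```python
# Sketch: derive a 3-D core-affect embedding from pairwise similarity
# ratings of six facial expressions. The rating matrix is hypothetical.
import numpy as np
from sklearn.manifold import MDS

emotions = ["anger", "disgust", "fear", "happiness", "neutrality", "sadness"]
# Hypothetical mean similarity ratings, 1 (dissimilar) to 7 (identical).
sim = np.array([
    [7.0, 5.5, 4.8, 1.5, 2.8, 3.9],
    [5.5, 7.0, 4.2, 1.7, 2.9, 4.1],
    [4.8, 4.2, 7.0, 1.9, 2.5, 4.6],
    [1.5, 1.7, 1.9, 7.0, 3.8, 1.6],
    [2.8, 2.9, 2.5, 3.8, 7.0, 3.2],
    [3.9, 4.1, 4.6, 1.6, 3.2, 7.0],
])
dissim = sim.max() - sim          # higher rating -> smaller distance
np.fill_diagonal(dissim, 0.0)

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)  # one 3-D point per emotion
```

The recovered axes are then interpreted post hoc (e.g., as valence, arousal, and dominance) by inspecting where the six emotions fall along each dimension.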

Emotion Training: Image Color Transfer with Facial Expression and Emotion Recognition (감정 트레이닝: 얼굴 표정과 감정 인식 분석을 이용한 이미지 색상 변환)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society
    • /
    • v.24 no.4
    • /
    • pp.1-9
    • /
    • 2018
  • We propose an emotional training framework that can detect initial symptoms of schizophrenia through emotion analysis of changes in facial expression. We use Microsoft's Emotion API to obtain facial expression and emotion values at the current moment, analyze these values, and recognize subtle facial expressions that change over time. To measure the emotions appearing in facial expressions over time, emotional states are classified using a peak-analysis-based variance method. The proposed method analyzes deficits in emotion recognition and expressive ability using characteristics that differ from the emotional-state changes classified according to Ekman's six basic emotions. Finally, the analyzed values are integrated into an image color transfer framework so that users can easily recognize and train their own emotional changes.
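
The variance-based change-detection idea can be illustrated with a short sketch. Everything here is an assumption for illustration: the `change_windows` helper, the window size, the threshold, and the synthetic emotion-score time series are all invented (the paper obtains its per-frame emotion scores from Microsoft's Emotion API).

```python
# Sketch: flag moments of emotional-state change by sliding a window over
# an emotion-score time series and marking windows of high local variance.
import numpy as np

def change_windows(scores, win=5, threshold=0.02):
    """Return start indices of windows whose variance exceeds threshold."""
    flagged = []
    for start in range(len(scores) - win + 1):
        window = scores[start:start + win]
        if np.var(window) > threshold:
            flagged.append(start)
    return flagged

# Mostly flat "happiness" signal with one abrupt change at sample 10.
t = np.linspace(0, 1, 20)
scores = np.where(np.arange(20) < 10, 0.1, 0.8) + 0.01 * np.sin(10 * t)
print(change_windows(scores))  # windows spanning the jump: [6, 7, 8, 9]
```

Only the windows straddling the abrupt transition exceed the variance threshold; flat stretches (even with small noise) are not flagged.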

A Comparative Analysis on Facial Expression in Advertisements -By Utilising Facial Action Coding System(FACS) (광고 속의 얼굴 표정에 따른 비교 연구 -FACS를 활용하여)

  • An, Kyoung Hee
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.61-71
    • /
    • 2019
  • Due to the limited running time of advertisements, facial expressions, among the types of nonverbal communication, are especially expressive and convincing in appealing to customers. The purpose of this paper is to investigate both how facial expressions are portrayed and how they convey emotion in TV advertisements. The research subjects are the TV advertisements of and , which enjoyed wide popularity with customers and are known as among the most touching commercials. The research method is the Facial Action Coding System (FACS), which is grounded in a discrete-emotions theoretical perspective and designed to measure specific facial muscle movements. This research analyzes the implications of the facial expressions in both TV ads using FACS, drawing on psychology as well as anatomy. The results show that facial expressions portraying conflicting emotional states, together with the heroine's dramatic emotional relief, could move customers' emotions more strongly.