• Title/Summary/Keyword: Emotional Facial Expression

A Study on The Expression of Digital Eye Contents for Emotional Communication (감성 커뮤니케이션을 위한 디지털 눈 콘텐츠 표현 연구)

  • Lim, Yoon-Ah;Lee, Eun-Ah;Kwon, Jieun
    • Journal of Digital Convergence / v.15 no.12 / pp.563-571 / 2017
  • The purpose of this paper is to establish emotional expression factors for digital eye contents that can be applied to digital environments. The emotions that can be applied to a smart doll are derived, and we suggest guidelines for the expressive factors of each emotion. For this paper, first, we researched the concepts and characteristics of emotional expression shown in the eyes in publications, animation, and actual video. Second, we identified six emotions -Happy, Angry, Sad, Relaxed, Sexy, Pure- and extracted the emotional expression factors. Third, we analyzed the extracted factors to establish guidelines for the emotional expression of digital eyes. As a result, this study found that the factors that distinguish and represent each emotion fall into four categories: eye shape, gaze, iris size, and effect. These can be used to enhance emotional communication in digital contents such as animations, robots, and smart toys.
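A guideline of this shape is naturally represented as a lookup table keyed by emotion. The sketch below is illustrative only: the four factor categories (eye shape, gaze, iris size, effect) and the six emotion names come from the abstract, but every concrete factor value is a hypothetical placeholder, not data from the paper.

```python
# Illustrative sketch: a guideline table mapping the six emotions from the
# study to the four expression-factor categories it identifies.
# All concrete values below are assumed placeholders, not the paper's results.
from dataclasses import dataclass


@dataclass
class EyeExpression:
    eye_shape: str   # lid/contour shape
    gaze: str        # gaze direction
    iris_size: str   # relative iris size
    effect: str      # overlay effect (sparkle, tears, highlight, ...)


GUIDELINE = {
    "Happy":   EyeExpression("curved lids",   "forward",      "large",  "sparkle"),
    "Angry":   EyeExpression("slanted lids",  "fixed",        "small",  "none"),
    "Sad":     EyeExpression("drooping lids", "downward",     "medium", "tears"),
    "Relaxed": EyeExpression("half-closed",   "soft forward", "medium", "none"),
    "Sexy":    EyeExpression("narrowed",      "sideways",     "medium", "highlight"),
    "Pure":    EyeExpression("round",         "upward",       "large",  "sparkle"),
}


def factors_for(emotion: str) -> EyeExpression:
    """Look up the expression factors for one of the six emotions."""
    return GUIDELINE[emotion]
```

A renderer for a smart doll or animated character could consume such a table directly, swapping in its own factor values per emotion.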

Design and Implementation of a Real-Time Emotional Avatar (실시간 감정 표현 아바타의 설계 및 구현)

  • Jung, Il-Hong;Cho, Sae-Hong
    • Journal of Digital Contents Society / v.7 no.4 / pp.235-243 / 2006
  • This paper presents an efficient method for expressing the emotion of an avatar based on facial expression recognition. Rather than changing the avatar's facial expression manually, the method changes it in real time based on recognition of facial patterns captured by a webcam. It provides a tool for recognizing parts of the images captured by the webcam. Because it uses a model-based approach, this tool recognizes images faster than alternatives such as template-based or network-based approaches. It extracts the shape of the user's lips after detecting the eyes using the model-based approach. From changes in lip patterns, we define 6 avatar facial expressions based on 13 standard lip patterns. The avatar changes its facial expression quickly by switching to a pre-defined avatar with the corresponding expression.
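The core classification step the abstract describes (13 standard lip patterns grouped into 6 avatar expressions, with a captured lip shape matched to the nearest standard pattern) can be sketched as a nearest-neighbor lookup. The feature encoding, pattern vectors, and pattern-to-expression grouping below are all assumptions for illustration, not the paper's actual data.

```python
# Minimal sketch of nearest-pattern lip classification: a captured lip shape
# (summarized here as an assumed width/openness feature pair) is matched to
# the nearest of 13 standard patterns, each mapped to one of 6 expressions.
# All numeric values and the grouping are hypothetical.
import math

# (pattern id) -> (lip width, lip openness), normalized -- assumed values.
STANDARD_PATTERNS = {i: (0.3 + 0.05 * i, (i % 4) * 0.25) for i in range(13)}

# Each of the 13 standard patterns selects one of 6 avatar expressions
# (0..5) -- the grouping here is an assumed example.
PATTERN_TO_EXPRESSION = {i: i % 6 for i in range(13)}


def classify(width: float, openness: float) -> int:
    """Return the avatar expression index for a captured lip shape."""
    nearest = min(
        STANDARD_PATTERNS,
        key=lambda i: math.dist(STANDARD_PATTERNS[i], (width, openness)),
    )
    return PATTERN_TO_EXPRESSION[nearest]
```

In a real pipeline the feature vector would come from the model-based lip extraction step, and each expression index would select a pre-defined avatar face.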

Korean Emotional Speech and Facial Expression Database for Emotional Audio-Visual Speech Generation (대화 영상 생성을 위한 한국어 감정음성 및 얼굴 표정 데이터베이스)

  • Baek, Ji-Young;Kim, Sera;Lee, Seok-Pil
    • Journal of Internet Computing and Services / v.23 no.2 / pp.71-77 / 2022
  • In this paper, a database is collected for extending a speech synthesis model to one that synthesizes speech according to emotion and generates facial expressions. The database is divided into male and female data and consists of emotional speech and facial expressions. Two professional actors of different genders speak sentences in Korean. The sentences are divided into four emotions: happiness, sadness, anger, and neutrality. Each actor performs about 3300 sentences per emotion. The 26468 sentences collected by filming do not overlap and contain expressions matching the corresponding emotion. Since building a high-quality database is important for the performance of future research, the database is assessed on emotional category, intensity, and genuineness. To determine accuracy according to the modality of the data, the database is divided into audio-video data, audio data, and video data.

The Effect of Young Children's Emotional Reading Ability on Prosocial Behavior: Centered on Facial Expression (유아의 정서읽기능력이 친사회적 행동에 미치는 영향: 얼굴표정을 중심으로)

  • Go, Jeong-Wan
    • Journal of Digital Convergence / v.17 no.6 / pp.433-438 / 2019
  • This study investigated the effects of young children's emotional reading ability on prosocial behavior. The participants were 192 young children. From December 17 to December 27, 2018, after conducting a survey on the emotional reading ability and prosocial behavior of the children, the data were analyzed using the SPSS WIN 22.0 program for Pearson correlation analysis and regression analysis. The results suggest the following: First, there were significant relationships between young children's emotional reading ability and prosocial behavior. Second, young children's emotional reading ability affected prosocial behavior. In conclusion, this study is believed to provide a basis for the development of programs to improve emotional reading ability and promote prosocial behavior.

A Comic Facial Expression Method for Intelligent Avatar Communications in the Internet Cyberspace (인터넷 가상공간에서 지적 아바타 통신을 위한 코믹한 얼굴 표정의 생성법)

  • Lee, Yong-Hoo;Kim, Sang-Woon;Aoki, Yoshinao
    • Journal of the Institute of Electronics Engineers of Korea CI / v.40 no.1 / pp.59-73 / 2003
  • As a means of overcoming the linguistic barrier between different languages on the Internet, a new sign-language communication system using CG animation techniques has been developed and proposed. In the system, the joint angles of the arms and hands corresponding to gestures are considered as a non-verbal communication tool. Emotional expression, however, can also play an important role in communication. In particular, a comic expression is more efficient than a realistic facial expression, and the movements of the cheeks and jaws are more important action units (AUs) than those of the eyebrows, eyes, and mouth. Therefore, in this paper, we designed a 3D emotion editor using a 2D model, and we extract the AUs (called PAUs here) that play a principal role in expressing emotions. We also proposed a method for generating universal emotional expressions with avatar models that have different vertex structures, employing a method of dynamically adjusting the AU movements according to emotional intensity. The proposed system is implemented with Visual C++ and Open Inventor on Windows platforms. Experimental results show that the system could be used as a non-verbal communication means to overcome the linguistic barrier.

Research about the Abstraction of Area Typicality of Emotions for Systematization of Human's Sensitivity Symbol (인간의 감성기호 체계화를 위한 감정영역범주화에 관한 연구)

  • Yun Bong-Shik
    • The Journal of the Korea Contents Association / v.5 no.2 / pp.137-145 / 2005
  • This study is a model of research for developing 3D character contents for facial expression as a sort of non-linguistic sign, focusing on the expression of a person's emotion factors. It contributes a framework for the symbolic analysis of human emotions along with a general review of expression. The human face is the most complex and versatile of all species. For humans, the face is a rich and versatile instrument serving many different functions. It serves as a window to display one's motivational state, which makes one's behavior more predictable and understandable to others and improves communication. The face can be used to supplement verbal communication: a prompt facial display can reveal the speaker's attitude about the information being conveyed. Alternatively, the face can complement verbal communication, such as a lifting of the eyebrows to lend additional emphasis to a stressed word. Facial expression plays an important role in the digital visual context. This study presents a frame of facial expression categories for the effective production of cartoons and animation that appeal to human visual emotion.

A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education / v.4 no.2 / pp.171-175 / 2012
  • Recently, research on emotional ICT (Information and Communication Technology), which recognizes and communicates human emotion through information technology, is increasing. For instance, there are studies on phones and services that perceive users' emotions by detecting voices, facial expressions, and biometric data. In short, emotions that used to be predicted only by humans are now predicted by digital equipment instead. Among the many ICT research areas, emotion recognition from the face is expected to be the most effective and natural human interface. This paper studies emotional ICT and examines the mechanism of a facial expression recognition system as an example.

The Development of Children's Emotional and Cognitive Perspective-taking Ability (아동의 정서적, 인지적 조망수용능력의 발달에 관한 연구)

  • Kim, Jung Jin;Choi, Kyoung Sook
    • Korean Journal of Child Studies / v.12 no.1 / pp.5-20 / 1991
  • The purpose of this study was to investigate developmental tendencies and age-related differences in the relationship between children's cognitive and emotional perspective-taking ability. The subjects were 4-year-old (N=60), 6-year-old (N=60) and 8-year-old (N=60) children. In each group, there were an equal number of boys and girls. Feshbach & Roe's child perspective-taking ability test was modified for this study. The test included four facial expression cards and six different stories inducing three types of emotion: happy, sad and angry. This experiment consisted of a 3 (age) by 3 (emotional stories: happy, sad and angry) factorial design. The dependent measures were two response types: emotional and cognitive perspective-taking ability. The results showed that both cognitive and emotional perspective-taking ability increased with age. Happy emotional perspective-taking ability developed earlier than sad and angry perspective-taking ability. The correlation between cognitive and emotional perspective-taking ability increased with age.

Emotional Behavior in Preschoolers' Peer Conflict: The Role of Peer Conflict Situation and Age (3세 및 5세 유아의 또래 갈등 상황에 따른 정서표현 행동)

  • Kim, Ji-Hyun;Lee, Soon-Hyung
    • Journal of the Korean Home Economics Association / v.42 no.4 / pp.29-43 / 2004
  • The purpose of the current study was to investigate differences in preschoolers' emotional behavior of happiness, sadness, and anger by peer conflict situation and age. Participants were twenty-two 3-year-olds and twenty 5-year-olds; each pair of the same age interacted in two standardized conflict situations: object possession conflict and behavioural/interpersonal conflict. Participants' emotional behaviors of happiness, sadness, and anger were observationally coded through facial expression, verbal intonation, gesture, and physical contact. Preschoolers expressed more sadness and anger in object possession conflict than in behavioural/interpersonal conflict. In object possession conflict, 3-year-olds expressed more anger than 5-year-olds did. In behavioural/interpersonal conflict, 5-year-olds expressed more happiness than 3-year-olds did.

Study on the Relationship Between 12Meridians Flow and Facial Expressions by Emotion (감정에 따른 얼굴 표정변화와 12경락(經絡) 흐름의 상관성 연구)

  • Park, Yu-Jin;Moon, Ju-Ho;Choi, Su-Jin;Shin, Seon-Mi;Kim, Ki-Tae;Ko, Heung
    • Journal of Physiology & Pathology in Korean Medicine / v.26 no.2 / pp.253-258 / 2012
  • Facial expression is an important communication method. In oriental medicine, the face changes shape according to emotion, and differences arise in physiology and pathology. To verify this theory, we studied the correlation between emotional facial expressions and meridian and collateral flow. The facial regions are divided by meridian: the outer brow belongs to the Gallbladder meridian, the inner brow to the Bladder meridian, the medial canthus to the Bladder meridian, the lateral canthus to the Gallbladder meridian, the upper eyelid to the Bladder meridian, the lower eyelid to the Stomach meridian, the central cheeks to the Stomach meridian, the lateral cheeks to the Small Intestine meridian, and the upper and lower lips, lip corners, and chin to the Small and Large Intestine meridians. Six meridians and collaterals are associated with happiness, suggesting happiness has high importance in facial expression. Five are associated with anger, and four each with fear and sadness, suggesting fear and sadness have lower importance in facial expression than the other emotions. Based on the yang meridians, which originally flow downward in the body, the ratios of anterograde to retrograde flow were happiness 3:4, anger 2:5, sadness 5:3, and fear 4:1. Based on the meridian flow of the face, the ratios were happiness 5:2, anger 3:4, sadness 3:5, and fear 4:1. We found that the actual change of meridian and collateral flow by emotion does not correspond to the expected change.