• Title/Summary/Keyword: Emotional Facial Expression

126 search results, processing time 0.028 seconds

Dynamic Facial Image-Graphic Representation of Multi-Dimensional Data (다차원 데이터의 동적 얼굴 이미지그래픽 표현)

  • 최철재;최진식;조규천;차홍준
    • Journal of the Korea Computer Industry Society
    • /
    • v.2 no.10
    • /
    • pp.1291-1300
    • /
    • 2001
  • This article studies a visualization technique that maps multi-dimensional data onto graphic elements of the human face, based on dynamic graphics that can change in real time. The key idea is as follows: the feature points of the human face, together with parameter control values obtained from existing image-recognition algorithms, are mapped to the multi-dimensional data; the image is then synthesized, creating a virtual image whose emotional expression changes as the data change. The proposed DyFIG system is implemented as a complete module, and we present a human-face graphics module that can express emotions through manipulation and experimentation, realizing the representation and description of emotional data.
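
The mapping described above can be sketched in the spirit of Chernoff-face-style visualization: each data dimension drives one facial control parameter. The parameter names, dimensions, and value ranges below are illustrative assumptions, not taken from the paper.

```python
# Sketch: mapping a multi-dimensional data point onto facial control
# parameters, so changes in the data appear as changes in expression.

def normalize(value, lo, hi):
    """Clamp and scale a raw data value into the [0, 1] range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def data_to_face_params(data_point):
    """Map one multi-dimensional data point to facial feature parameters.

    Each data dimension drives one control value of the face graphic.
    The dimension names and ranges are hypothetical.
    """
    return {
        "eyebrow_raise": normalize(data_point["dim1"], 0, 100),
        "eye_openness":  normalize(data_point["dim2"], 0, 100),
        "mouth_curve":   normalize(data_point["dim3"], -50, 50),  # frown..smile
    }

params = data_to_face_params({"dim1": 75, "dim2": 40, "dim3": 20})
print(params)
```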

Quantified Lockscreen: Integration of Personalized Facial Expression Detection and Mobile Lockscreen application for Emotion Mining and Quantified Self (Quantified Lockscreen: 감정 마이닝과 자기정량화를 위한 개인화된 표정인식 및 모바일 잠금화면 통합 어플리케이션)

  • Kim, Sung Sil;Park, Junsoo;Woo, Woontack
    • Journal of KIISE
    • /
    • v.42 no.11
    • /
    • pp.1459-1466
    • /
    • 2015
  • Lockscreen is one of the most frequently encountered interfaces by smartphone users. Although users perform unlocking actions every day, there are no benefits in using lockscreens apart from security and authentication purposes. In this paper, we replace the traditional lockscreen with an application that analyzes facial expressions in order to collect facial expression data and provide real-time feedback to users. To evaluate this concept, we have implemented Quantified Lockscreen application, supporting the following contributions of this paper: 1) an unobtrusive interface for collecting facial expression data and evaluating emotional patterns, 2) an improvement in accuracy of facial expression detection through a personalized machine learning process, and 3) an enhancement of the validity of emotion data through bidirectional, multi-channel and multi-input methodology.

Effects of the facial expression's presenting type and areas on emotional recognition (얼굴 표정의 제시 유형과 제시 영역에 따른 정서 인식 효과)

  • Lee, Jung-Hun;Kim, Hyuk;Han, Kwang-Hee
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2006.02a
    • /
    • pp.1393-1400
    • /
    • 2006
  • As technologies for measuring and expressing emotion develop, the need for research on facial expressions, which are culturally universal, is growing. Most facial expression studies to date, however, have used static face photographs. In reality, people infer emotional states not from a single momentary expression but from subtle changes in expression and movements of the facial muscles. This study aimed to show that dynamic facial expressions convey emotional states more effectively than static ones, and to compare the emotion-recognition effects of the eyes and the mouth in dynamic expressions. Facial expressions matching 15 adjective words were presented as video clips and still photographs at three levels: the whole face, the eyes, and the mouth. Measuring the accuracy of emotion judgments, video clips yielded significantly higher emotion recognition than still photographs at all three levels, indicating that dynamic facial expressions carry more internal information. Recognition also differed significantly in the order whole face, eyes, mouth, and negative emotions were better recognized from the eyes while positive emotions were better recognized from the mouth. Thus, emotion recognition from the eyes versus the mouth varies along the positive-negative dimension of emotion.

Brain Activation to Facial Expressions Among Alcoholics (알코올 중독자의 얼굴 표정 인식과 관련된 뇌 활성화 특성)

  • Park, Mi-Sook;Lee, Bae Hwan;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.20 no.4
    • /
    • pp.1-14
    • /
    • 2017
  • The purpose of this study was to investigate the neural substrates for recognizing facial expressions among alcoholics by using functional magnetic resonance imaging (fMRI). Abstinent inpatient alcoholics (n=18 males) and demographically similar social drinkers (n=16 males) participated in the study. The participants viewed pictures from the Japanese Female Facial Expression (JAFFE) database and evaluated the intensity of the facial expressions. The alcoholics showed reduced activation in limbic areas, including the amygdala and hippocampus, while recognizing emotional facial expressions compared to the nonalcoholic controls. On the other hand, the alcoholics showed greater brain activation than the controls in the left lingual (BA 19)/fusiform gyrus, the left middle frontal gyrus (BA 8/9/46), and the right superior parietal lobule (BA 7) while viewing emotional faces. In sum, specific brain regions associated with the recognition of facial expressions among alcoholics were identified. The findings could be used in developing interventions for alcoholism.

Analysis of Visual Attention in Negative Emotional Expression Emoticons using Eye-Tracking Device (시선추적 장치를 활용한 부정적 감정표현 이모티콘의 시각적 주의집중도 분석)

  • Park, Minhee;Kwon, Mahnwoo;Hwang, Mikyung
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.11
    • /
    • pp.1580-1587
    • /
    • 2021
  • Although the development and sale of diverse emoticons has given users a wider range of choices, a systematic, specific understanding of how actual users perceive and use emoticons is lacking. This study therefore investigated users' subjective perceptions of, and visual attention to, negative emotional expression emoticons through a survey and an eye-tracking experiment. First, the subjective perception analysis showed that emoticons are used frequently because their appearance matters and because they can express various emotions in a fun and interesting way. In particular, emoticons expressing negative emotions are often used because their varied, concretely rendered visual elements allow negative emotions to be expressed indirectly. Next, the eye-tracking experiment showed that for negative emotional expression emoticons, attention concentrated on large elements that visually emphasize the emotional expression, and focused not only on facial expressions but also on bodily behavioral responses and written expressions of emotion. These results can serve as basic data for understanding users' perceptions and use of the diversifying range of emoticons. For the long-term growth and vitality of the emoticon industry, continued research is needed to understand the diverse emotions of real users and to develop differentiated emoticons that maximize the empathy effect appropriate to each situation.

Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification (감정 분류를 이용한 표정 연습 보조 인공지능)

  • Dong-Kyu, Kim;So Hwa, Lee;Jae Hwan, Bong
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.17 no.6
    • /
    • pp.1137-1144
    • /
    • 2022
  • In this study, an artificial intelligence (AI) system was developed to help users practice facial expressions for conveying emotions. The developed AI feeds multimodal inputs, consisting of sentences and facial images, to deep neural networks (DNNs), which compute the similarity between the emotion predicted from the sentence and the emotion predicted from the facial image. The user practices a facial expression for the situation given by a sentence, and the AI provides numerical feedback based on that similarity. A ResNet34 network was trained on the public FER2013 data to predict emotions from facial images. To predict emotions from sentences, a KoBERT model was trained via transfer learning on the conversational speech dataset for emotion classification released publicly by AIHub. The DNN that predicts emotions from facial images achieved 65% accuracy, comparable to human emotion-classification ability; the DNN that predicts emotions from sentences achieved 90% accuracy. The performance of the developed AI was evaluated through facial-expression experiments in which an ordinary person participated.
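
The feedback step described above can be sketched as comparing two probability distributions over the same emotion classes, one from the sentence model and one from the face model. The abstract only states that a similarity is computed; the emotion label set and the choice of cosine similarity below are assumptions.

```python
# Sketch: numerical feedback as the similarity between the emotion
# distribution predicted from a sentence and the one predicted from
# the user's facial image.
import math

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def cosine_similarity(p, q):
    """Cosine similarity between two probability vectors."""
    dot = sum(a * b for a, b in zip(p, q))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_q = math.sqrt(sum(b * b for b in q))
    return dot / (norm_p * norm_q)

# Hypothetical model outputs (both strongly predict "happiness").
sentence_pred = [0.05, 0.02, 0.03, 0.80, 0.04, 0.04, 0.02]
face_pred     = [0.10, 0.05, 0.05, 0.60, 0.05, 0.10, 0.05]

score = cosine_similarity(sentence_pred, face_pred)
print(round(score, 3))  # high score: the practiced expression matches
```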

A Study on Interactive Avatar in Mobile device using facial expression of Animation Character (모바일 기기에서 애니메이션 캐릭터의 얼굴표현을 이용한 인터랙티브 아바타에 관한 연구)

  • Oh Jeong-Seok;Youn Ho-Chang;Jeon Hong-Jun
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2005.05a
    • /
    • pp.229-236
    • /
    • 2005
  • This paper studies an emotional interactive avatar on a mobile phone. When the user asks the avatar a question, it answers with a facial expression based on an animation character, so that the user can engage with the avatar in a friendlier way.

A Study on Customer Feedback using Facial Expression (표정인식을 활용한 고객피드백에 관한 연구)

  • Song, Eun-Jee;Kang, Min-Shik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2015.10a
    • /
    • pp.685-686
    • /
    • 2015
  • Recently, the emotional ICT industry has drawn attention as a key industry that can give the maturing IT industry a new leap forward, and it is being applied in various fields. In particular, IT technology that recognizes human facial expressions is evolving into user-friendly solutions based on understanding emotion, in line with the human-centered paradigm shift in future life. For efficient management, it is important for companies to grasp customers' requirements accurately. As a new communication case using such emotional ICT technology, this study proposes an algorithm that measures favorability from customers' facial expressions so that customer-centered, customized services can be provided. It measures customer satisfaction using an existing 7-class facial expression recognition algorithm.
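
The idea above, deriving one favorability score from a standard 7-class expression recognizer, can be sketched as a weighted sum of class probabilities. The class list and the weights below are illustrative assumptions; the abstract does not specify them.

```python
# Sketch: collapsing a 7-class facial expression distribution into a
# single customer favorability score in [-1, 1].

WEIGHTS = {
    "happiness": 1.0,
    "surprise":  0.5,
    "neutral":   0.0,
    "sadness":  -0.5,
    "fear":     -0.5,
    "disgust":  -1.0,
    "anger":    -1.0,
}

def favorability(expression_probs):
    """Weighted sum of expression probabilities; positive means favorable."""
    return sum(WEIGHTS[label] * p for label, p in expression_probs.items())

# Hypothetical recognizer output for one customer face.
probs = {"happiness": 0.6, "surprise": 0.1, "neutral": 0.2,
         "sadness": 0.05, "fear": 0.0, "disgust": 0.0, "anger": 0.05}
print(favorability(probs))
```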

On the Implementation of a Facial Animation Using the Emotional Expression Techniques (FAES : 감성 표현 기법을 이용한 얼굴 애니메이션 구현)

  • Kim Sang-Kil;Min Yong-Sik
    • The Journal of the Korea Contents Association
    • /
    • v.5 no.2
    • /
    • pp.147-155
    • /
    • 2005
  • In this paper, we present FAES (Facial Animation with Emotion and Speech), a system for speech-driven face animation with emotions. We animate cartoon faces not only from the input speech but also from emotions derived from the speech signal, and the system ensures smooth transitions and exact representation in the animation. After collecting training data, we built a database using a Support Vector Machine (SVM) to recognize four categories of emotion: neutral, dislike, fear, and surprise, enabling speech-driven animation with emotions. The system was trained on young Korean speakers and focuses on Korean emotional facial expressions. Experimental results show that more emotional areas are covered and that the accuracies of emotion recognition and continuous speech recognition increase by 7% and 5%, respectively, compared with the previous method.
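
The SVM classification step named above can be sketched as training a classifier over the four emotion categories the abstract lists. The 2-D synthetic features below stand in for the speech-derived features, which the abstract does not detail; this is a minimal illustration, not the paper's pipeline.

```python
# Sketch: SVM recognition of the four FAES emotion categories
# (neutral, dislike, fear, surprise) over synthetic feature clusters.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
labels = ["neutral", "dislike", "fear", "surprise"]
centers = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], dtype=float)

# 40 noisy samples per class around each class center.
X = np.vstack([c + rng.normal(0, 0.5, size=(40, 2)) for c in centers])
y = np.repeat(labels, 40)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([[0.2, 4.8]])[0])  # a point near the "fear" center
```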

A Generation Methodology of Facial Expressions for Avatar Communications (아바타 통신에서의 얼굴 표정의 생성 방법)

  • Kim Jin-Yong;Yoo Jae-Hwi
    • Journal of the Korea Society of Computer and Information
    • /
    • v.10 no.3 s.35
    • /
    • pp.55-64
    • /
    • 2005
  • In cyberspace, the avatar can serve as an auxiliary method for text and image communications. An intelligent communication method can also be used to achieve real-time communication, in which intelligently coded data (joint angles for arm gestures and action units for facial emotions) are transmitted instead of real or compressed pictures. In this paper, to complement arm and leg gestures, a method of generating facial expressions that can represent the sender's emotions is provided. Facial expressions can be represented by Action Units (AUs); we suggest a methodology for finding appropriate AUs in avatar models of various shapes and structures. To maximize the efficiency of emotional expression, a comic-style facial model having only eyebrows, eyes, a nose, and a mouth is employed. The generation of facial emotion animation with these parameters is also investigated.
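
The AU-based approach above can be sketched as encoding each transmitted emotion as a small set of Action Units, then keeping only the AUs the receiver's comic-style avatar can render. The AU numbers follow common Facial Action Coding System (FACS) associations; the emotion-to-AU table and the renderable set are assumptions, not the paper's own tables.

```python
# Sketch: transmit an emotion label, render it with only those FACS
# Action Units a simplified comic-style face supports.

EMOTION_TO_AUS = {
    "happiness": [6, 12],        # cheek raiser, lip corner puller
    "surprise":  [1, 2, 5, 26],  # brow raisers, upper lid raiser, jaw drop
    "anger":     [4, 5, 7, 23],  # brow lowerer, lid tightener, lip tightener
    "sadness":   [1, 4, 15],     # inner brow raiser, lip corner depressor
}

# AUs a comic-style face (eyebrows, eyes, mouth) can express;
# cheek-related AUs such as AU 6 are not renderable on this model.
RENDERABLE_AUS = {1, 2, 4, 5, 7, 12, 15, 23, 26}

def aus_for_avatar(emotion):
    """Return the renderable AU subset encoding the given emotion."""
    return [au for au in EMOTION_TO_AUS[emotion] if au in RENDERABLE_AUS]

print(aus_for_avatar("happiness"))  # AU 6 (cheek raiser) is dropped
```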
