• Title/Summary/Keyword: facial expression change (표정 변화)


Emotion Recognition Method of Facial Image using PCA (PCA을 이용한 얼굴 표정의 감정 인식 방법)

  • Kim, Ho-Duck;Yang, Hyun-Chang;Park, Chang-Hyun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.6 / pp.772-776 / 2006
  • Most research on facial image recognition has been carried out on full-face images. The features that most strongly affect facial image recognition are the eyes and the mouth, so researchers have concentrated on the eyes, eyebrows, and mouth regions of facial images. In everyday life, however, the rapid changes of a person's pupils are difficult to capture with a camera, and many people wear glasses. In this paper, we therefore apply Principal Component Analysis (PCA) to facial image recognition for the case where the eye region is covered.
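The abstract names PCA-based recognition but gives no implementation details. A minimal eigenface-style sketch in NumPy (all function names and the nearest-neighbour matching step are illustrative assumptions, not the authors' code) might look like:

```python
import numpy as np

def pca_features(train_imgs, n_components):
    """Learn a PCA basis (eigenfaces) from a stack of flattened face images."""
    X = train_imgs.reshape(len(train_imgs), -1).astype(float)
    mean = X.mean(axis=0)
    # SVD of the centered data gives the principal axes directly.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]          # each row of the basis is one eigenface

def project(img, mean, basis):
    """Project one image onto the learned eigenface basis."""
    return basis @ (img.ravel().astype(float) - mean)

def recognize(img, gallery_feats, labels, mean, basis):
    """Nearest-neighbour matching in the reduced PCA space."""
    f = project(img, mean, basis)
    dists = np.linalg.norm(gallery_feats - f, axis=1)
    return labels[int(np.argmin(dists))]
```

A gallery is built by projecting every enrolled face; a probe image is then assigned the label of its nearest gallery vector.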

Facial Expression Recognition with Instance-based Learning Based on Regional-Variation Characteristics Using Models-based Feature Extraction (모델기반 특징추출을 이용한 지역변화 특성에 따른 개체기반 표정인식)

  • Park, Mi-Ae;Ko, Jae-Pil
    • Journal of Korea Multimedia Society / v.9 no.11 / pp.1465-1473 / 2006
  • In this paper, we present an approach to facial expression recognition in image sequences using Active Shape Models (ASM) and a state-based model. Given an image frame, we use ASM to obtain the shape parameter vector of the model while locating facial feature points. We thus obtain the shape parameter vector set for all frames of an image sequence, which the state-based model converts into a state vector taking one of three states. In the classification step, we use k-NN with a proposed similarity measure motivated by the observation that the variation regions of an expression sequence differ from those of other expression sequences. In experiments on the public database KCFD, we demonstrate that the proposed measure slightly outperforms the binary measure: with k = 1, the recognition rates of k-NN with the proposed measure and with the existing binary measure are 89.1% and 86.2%, respectively.
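The abstract compares two similarity measures inside a k-NN classifier but does not define them precisely. A sketch of k-NN over expression state vectors with a pluggable similarity function (the baseline binary measure is shown; the authors' proposed region-weighted measure is not reproduced here) could be:

```python
import numpy as np

def knn_classify(query, gallery, labels, similarity, k=1):
    """k-NN over state vectors, using any pairwise similarity measure."""
    sims = np.array([similarity(query, g) for g in gallery])
    top = np.argsort(sims)[::-1][:k]          # indices of the k most similar sequences
    vals, counts = np.unique(labels[top], return_counts=True)
    return vals[int(np.argmax(counts))]       # majority vote among the k neighbours

def binary_similarity(a, b):
    """Baseline binary measure: fraction of positions whose states agree."""
    return float(np.mean(np.asarray(a) == np.asarray(b)))
```

Swapping `binary_similarity` for a measure that weights the regions where each expression actually varies is the change the paper evaluates.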


Study of expression in virtual character of facial smile by emotion recognition (감성인식에 따른 가상 캐릭터의 미소 표정변화에 관한 연구)

  • Lee, Dong-Yeop
    • Cartoon and Animation Studies / s.33 / pp.383-402 / 2013
  • In this study, we apply the Facial Action Coding System (FACS), an anatomical approach that codes the facial muscular system, to the expressions displayed in response to emotional change, and verify the Duchenne smile by applying it to a virtual character. Duchenne smiles were extracted through an emotion-induction experiment with theater and film students trained for the experiment (two men, two women). Based on the extracted expressions, data on the facial muscles were collected: the frequency of movement of the muscles around the mouth and lips and in other parts of the face was calculated and applied to the virtual character. Contraction of the Zygomatic Major pulls the corners of the lips upward and raises the cheeks, while the Orbicularis Oculi raises the lower eyelid, producing the appearance of a smile. Movements of the muscles around the nose (AU9) and the muscles around the mouth associated with openness (AU25, AU26, AU27) were observed together with the Zygomatic Major. The Duchenne smile occurred when the Orbicularis Oculi and the Zygomatic Major moved at the same time. Based on this, by separating the Orbicularis Oculi, which reflects genuine emotional feeling expressed as laughter and sympathy, from the Zygomatic Major, which can be moved at will, and applying them to the virtual character, we examine whether the virtual character's expressions can be distinguished in the same way as human expressions.
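The distinction the study relies on can be stated compactly in FACS terms: the Orbicularis Oculi corresponds to AU6 (cheek raiser) and the Zygomatic Major to AU12 (lip corner puller), and a Duchenne smile requires both to fire together. A toy classifier over a set of active action units (the function name and return labels are illustrative):

```python
def classify_smile(active_aus):
    """Label a smile from its active FACS action units.

    AU12 = lip corner puller (Zygomatic Major, voluntary)
    AU6  = cheek raiser (Orbicularis Oculi, tied to felt emotion)
    """
    if 12 not in active_aus:
        return "no smile"
    # Duchenne smile: AU6 and AU12 contract simultaneously.
    return "Duchenne smile" if 6 in active_aus else "social smile"
```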

A Study on Effective Facial Expression of 3D Character through Variation of Emotions (Model using Facial Anatomy) (감정변화에 따른 3D캐릭터의 표정연출에 관한 연구 (해부학적 구조 중심으로))

  • Kim, Ji-Ae
    • Journal of Korea Multimedia Society / v.9 no.7 / pp.894-903 / 2006
  • Rapid growth in hardware technology has brought about the development and expansion of various kinds of digital moving-picture information, including 3D. 3D digital techniques are used in animation, virtual reality, movies, advertisement, games, and so on. 3D characters in digital motion pictures play a core role in communicating emotions and information to users through sound, facial expression, and characteristic motions. Interest in 3D motion and facial expression is rising with the growing frequency and range of use of 3D character design. In this study, facial expression is examined as an effective means of conveying implicit emotions: 3D characters' facial expressions and muscle movements are analyzed on the basis of human anatomy in search of effective methods of expression. Finally, the differences between 2D and 3D characters are also examined in light of the preceding research.


A Face Recognition Method Robust to Variations in Lighting and Facial Expression (조명 변화, 얼굴 표정 변화에 강인한 얼굴 인식 방법)

  • Yang, Hui-Seong;Kim, Yu-Ho;Lee, Jun-Ho
    • Journal of KIISE: Software and Applications / v.28 no.2 / pp.192-200 / 2001
  • This paper proposes an efficient face recognition method that is robust to face images with lighting variation, expression variation, and partial occlusion, and that requires little memory and computation. The method, named SKKUface (Sungkyunkwan University face), first applies PCA (principal component analysis) to the training images to reduce dimensionality, then generates a new feature vector space from which the subspaces corresponding to lighting and facial expression variation are excluded as far as possible. Because this feature vector space mainly contains the intrinsic features of the face, applying the Fisher linear discriminant to it separates the classes more effectively and dramatically improves the recognition rate. The SKKUface method also proposes and applies a technique that dramatically reduces the memory and computation time needed to compute the between-class covariance and within-class covariance matrices, which are otherwise problematic. To evaluate recognition performance, the proposed SKKUface method was compared with the widely known Eigenface and Fisherface methods on the YALE, SKKU, and ORL (Olivetti Research Laboratory) face databases. Experimental results show that SKKUface achieves considerably better recognition rates than Eigenface and Fisherface on face images with lighting variation and partial occlusion.
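The PCA-then-Fisher-linear-discriminant pipeline the abstract builds on (the classical Fisherface scheme, not SKKUface's specific subspace pruning or memory optimizations) can be sketched in NumPy as follows; function and variable names are illustrative:

```python
import numpy as np

def fisherfaces(X, y, n_pca):
    """PCA to n_pca dims, then Fisher linear discriminant in the reduced space.

    X: (n_samples, n_pixels) flattened face images; y: class labels.
    Returns the data mean and a combined projection matrix W.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W_pca = Vt[:n_pca].T                       # (n_pixels, n_pca) PCA basis
    Z = Xc @ W_pca                             # samples in PCA space
    mu = Z.mean(axis=0)
    classes = np.unique(y)
    Sw = np.zeros((n_pca, n_pca))              # within-class scatter
    Sb = np.zeros((n_pca, n_pca))              # between-class scatter
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        d = (mc - mu)[:, None]
        Sb += len(Zc) * (d @ d.T)
    # Fisher directions: leading eigenvectors of Sw^-1 Sb (pinv for stability).
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-vals.real)
    W_fld = vecs.real[:, order[:len(classes) - 1]]
    return mean, W_pca @ W_fld                 # maps raw faces into Fisher space
```

Recognition then proceeds by projecting a probe face with `(img - mean) @ W` and matching against projected gallery faces.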


Facial Expression Analysis Framework (표정 분석 프레임워크)

  • Ji, Eun-Mi
    • Journal of the Korea Computer Industry Society / v.8 no.3 / pp.187-196 / 2007
  • Human beings express emotion through facial expressions, both deliberately and unconsciously. Psychologists began the research on the analysis of facial expressions, and over the last decade many computer scientists have become interested in it as well. Facial expression recognition is a promising research area applicable to many fields based on human-computer interfaces. Despite much study, however, practical systems are hard to find because of variations in illumination and face scale and the high-dimensional information to be processed. In this paper, I describe a generic framework for facial expression analysis, the need for each of its levels, and international research trends. I also analyze case studies of facial expression research in Korea. I expect this to be helpful for scientists who wish to contribute to facial expression research.


Recent Research Trends of Facial Expression Recognition (얼굴표정 인식 기법의 최신 연구 동향)

  • Lee, Min Kyu;Song, Byung Cheol
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2019.11a / pp.128-130 / 2019
  • With the recent rapid development of deep learning, facial expression recognition technology has made considerable progress. Facial expression recognition receives continuous attention in computer vision and is used in various fields such as infotainment systems and human-robot interaction. Nevertheless, the field still faces many problems, such as a shortage of training data and variations in facial pose and occlusion. This paper surveys research trends from classical techniques to the state of the art, including work that addresses these problems specific to facial expression recognition.


Facial Expression Animation Using Anatomy-Based 3D Face Modelling (해부학 기반의 3차원 얼굴 모델링을 이용한 얼굴 표정 애니메이션)

  • 김형균;오무송
    • Journal of the Korea Institute of Information and Communication Engineering / v.7 no.2 / pp.328-333 / 2003
  • This paper combines the motions of 18 anatomically grounded muscle pairs that influence facial expression change for facial animation. After building a standard model by deforming the mesh to fit an individual's image, the front and side images of the individual's face are mapped onto the mesh to increase realism. The muscle model that drives expression animation is a modified version of Waters' muscle model. Using these methods, a textured, deformable face was created, and the six facial expressions proposed by Ekman were animated.
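Waters' linear muscle model, which the abstract takes as its starting point, displaces mesh vertices toward the muscle's bony attachment point with an angular and radial falloff inside a zone of influence. A simplified sketch in that spirit (the falloff constants and function signature are assumptions, not the paper's modified model):

```python
import numpy as np

def waters_linear_muscle(verts, head, tail, contraction,
                         fall_start=0.3, fall_end=1.0):
    """Pull vertices toward the attachment point `head` of a linear muscle
    running from `head` to `tail`, with cosine angular/radial falloff."""
    axis = tail - head
    length = np.linalg.norm(axis)
    axis = axis / length
    out = verts.copy().astype(float)
    for i, v in enumerate(verts):
        d = v - head
        r = np.linalg.norm(d)
        if r == 0 or r > fall_end * length:
            continue                          # outside the zone of influence
        cos_a = float(d @ axis) / r           # angular falloff off the muscle axis
        if cos_a <= 0:
            continue
        if r < fall_start * length:           # inner radial zone
            radial = np.cos((1 - r / (fall_start * length)) * np.pi / 2)
        else:                                 # outer radial zone, fading to the edge
            radial = np.cos((r - fall_start * length)
                            / (length * (fall_end - fall_start)) * np.pi / 2)
        out[i] = v - contraction * cos_a * radial * (d / r)  # move toward head
    return out
```

Each of the 18 muscle pairs would be applied in turn with its own contraction value to pose an expression.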

A Study on Customer Feedback using Facial Expression (표정인식을 활용한 고객피드백에 관한 연구)

  • Song, Eun-Jee;Kang, Min-Shik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2015.10a / pp.685-686 / 2015
  • Recently, the emotional ICT industry has drawn attention as a key industry capable of taking the mature IT industry to a new level, and it is being applied in various fields. In particular, IT technology that recognizes human facial expressions is developing into user-friendly solutions based on understanding emotion, as the paradigm of human-centered future life changes. For efficient management, it is important for a company to grasp customer requirements accurately. As a case of new communication using emotional ICT technology, this study proposes an algorithm that measures customer preference from facial expressions, among other customer emotions, so that customer-centered, customized services can be provided. It uses an existing seven-expression recognition algorithm to measure customer satisfaction.


Anatomy-Based Face Animation for Virtual Reality (가상현실을 위한 해부학에 기반한 얼굴 애니메이션)

  • 김형균;오무송;고석만;김장형
    • Proceedings of the Korean Information Science Society Conference / 2003.04c / pp.280-282 / 2003
  • In this paper, for the animation of human models in a virtual reality environment, the motions of 18 anatomically based muscle pairs that influence facial expression change can be combined for facial expression animation. After building a standard model by deforming the mesh to fit an individual's image, two images of the individual's face (front and side) are mapped onto the mesh to increase realism. The muscle model that drives expression generation is a modified version of Waters' muscle model.
