• Title/Summary/Keyword: Facial Information

A Three-Dimensional Facial Modeling and Prediction System (3차원 얼굴 모델링과 예측 시스템)

  • Gu, Bon-Gwan;Jeong, Cheol-Hui;Cho, Sun-Young;Lee, Myeong-Won
    • Journal of the Korea Computer Graphics Society
    • /
    • v.17 no.1
    • /
    • pp.9-16
    • /
    • 2011
  • In this paper, we describe the development of a system for generating a 3-dimensional human face and predicting its appearance as it ages over subsequent years, using 3D scanned facial data and photo images. It is composed of 3-dimensional texture mapping functions, a facial definition parameter input tool, and 3-dimensional facial prediction algorithms. With the texture mapping functions, we can generate a new model of a given face at a specified age using a scanned facial model and photo images. The texture mapping is done using three photo images: a front and two side images of a face. The facial definition parameter input tool is a user interface necessary for texture mapping and is used to match facial feature points between the photo images and a 3D scanned facial model in order to obtain material values in high resolution. We have calculated material values for future facial models and predicted future facial models in high resolution with a statistical analysis using 100 scanned facial models.
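
The entry above describes mapping photographs onto a scanned 3D face model to obtain per-vertex material values. As a rough illustration of the front-view part of that idea only, here is a minimal sketch, assuming a simple orthographic projection and made-up calibration parameters (`scale`, `offset`); the paper's actual feature-point-driven mapping is more involved.

```python
import numpy as np

def sample_front_texture(vertices, front_image, scale, offset):
    """Assign each 3D vertex a color sampled from a front-view photo.

    Assumes a simple orthographic projection: image (u, v) coordinates come
    from the vertex (x, y) coordinates only.  `scale` and `offset` are
    illustrative calibration parameters (pixels per model unit and an
    image-space translation) that a feature-point matching step would
    normally provide.
    """
    h, w, _ = front_image.shape
    colors = np.zeros((len(vertices), 3), dtype=front_image.dtype)
    for i, (x, y, _z) in enumerate(vertices):
        u = int(round(x * scale + offset[0]))
        v = int(round(-y * scale + offset[1]))  # image y axis points down
        u = np.clip(u, 0, w - 1)
        v = np.clip(v, 0, h - 1)
        colors[i] = front_image[v, u]
    return colors

# Illustrative usage with random data standing in for a scanned model and photo.
verts = np.random.uniform(-1.0, 1.0, size=(100, 3))
photo = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
vertex_colors = sample_front_texture(verts, photo, scale=200.0, offset=(320, 240))
```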

Facial Expression Animation Using Anatomy-Based 3D Face Modeling (해부학 기반의 3차원 얼굴 모델링을 이용한 얼굴 표정 애니메이션)

  • 김형균;오무송
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.7 no.2
    • /
    • pp.328-333
    • /
    • 2003
  • This paper performs facial animation using the 18 anatomically defined muscle pairs that influence facial expression change, combining the motions of these muscles. After fitting and deforming a standard mesh model to an individual's images, front and side photographs of the face are mapped onto the mesh to improve realism. The muscle model that drives the animation generates facial expressions with a corrected version of Waters' muscle model. Deformed, textured faces are created with this method, and the six facial expressions proposed by Ekman are animated.
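
The animation in this entry is driven by a corrected version of Waters' muscle model. The sketch below is a simplified, illustrative linear-muscle deformation in that spirit; the cone-of-influence test and the cosine falloffs are assumptions, not the published formulation.

```python
import numpy as np

def waters_linear_muscle(vertices, head, tail, contraction, influence_angle_deg=40.0):
    """Pull mesh vertices toward a muscle's attachment point (its head).

    A simplified Waters-style linear muscle: vertices inside a cone around
    the muscle axis (head -> tail) are displaced toward the head, scaled by
    an angular falloff and a radial falloff.  The exact falloff functions in
    Waters' model differ; this only sketches the structure.
    """
    axis = tail - head
    length = np.linalg.norm(axis)
    axis = axis / length
    cos_limit = np.cos(np.radians(influence_angle_deg))

    displaced = vertices.copy()
    for i, v in enumerate(vertices):
        d = v - head
        dist = np.linalg.norm(d)
        if dist == 0 or dist > length:
            continue
        cos_a = np.dot(d / dist, axis)
        if cos_a < cos_limit:
            continue  # outside the muscle's cone of influence
        angular = (cos_a - cos_limit) / (1.0 - cos_limit)   # fades toward the cone edge
        radial = np.cos((dist / length) * np.pi / 2.0)      # fades toward the tail
        displaced[i] = v - contraction * angular * radial * d
    return displaced

# Illustrative usage: a made-up cheek muscle acting on a random vertex cloud.
verts = np.random.uniform(-1.0, 1.0, size=(200, 3))
head_pt = np.array([0.3, 0.1, 0.8])    # attachment near the cheekbone (illustrative)
tail_pt = np.array([0.1, -0.2, 0.9])   # insertion near the mouth corner (illustrative)
smiling = waters_linear_muscle(verts, head_pt, tail_pt, contraction=0.5)
```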

Automatic Estimation of 2D Facial Muscle Parameter Using Neural Network (신경회로망을 이용한 2D 얼굴근육 파라메터의 자동인식)

  • 김동수;남기환;한준희;배철수;권오흥;나상동
    • Proceedings of the IEEK Conference
    • /
    • 1999.06a
    • /
    • pp.1029-1032
    • /
    • 1999
  • Muscle-based face image synthesis is one of the most realistic approaches to realizing a life-like agent in a computer. A facial muscle model is composed of facial tissue elements and muscles. In this model, the forces acting on each facial tissue element are calculated from the contraction strength of each muscle, so the combination of muscle parameters determines a specific facial expression. Currently, each muscle parameter is decided by a trial-and-error procedure, comparing a sample photograph with the image generated by our Muscle-Editor in order to produce a specific face image. In this paper, we propose a strategy for the automatic estimation of facial muscle parameters from 2D marker movement using a neural network. We also estimate 3D motion from 2D point or flow information in the captured image under the constraints of a physics-based face model.
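
The entry proposes estimating muscle parameters from 2D marker movement with a neural network, but the abstract does not give the architecture. The sketch below assumes a small one-hidden-layer regressor trained with plain batch gradient descent on synthetic data; the dimensions and training procedure are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 10 tracked markers (x, y displacements) -> 18 muscle parameters.
n_markers, n_muscles, n_hidden = 10, 18, 32

W1 = rng.normal(0, 0.1, (2 * n_markers, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_muscles))
b2 = np.zeros(n_muscles)

def predict(marker_disp):
    """Map a flattened vector of 2D marker displacements to muscle parameters."""
    h = np.tanh(marker_disp @ W1 + b1)
    return h @ W2 + b2

# Synthetic data standing in for (marker movement, known muscle parameter) pairs.
X = rng.normal(size=(200, 2 * n_markers))
Y = rng.uniform(0, 1, size=(200, n_muscles))

lr = 0.01
for _ in range(500):                      # plain batch gradient descent on MSE
    H = np.tanh(X @ W1 + b1)
    err = (H @ W2 + b2) - Y
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    gH = err @ W2.T * (1 - H ** 2)        # backpropagate through tanh
    gW1 = X.T @ gH / len(X)
    gb1 = gH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

muscle_params = predict(X[0])
```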

Recognition of Facial Expressions Using Muscle-Based Feature Models (근육기반의 특징모델을 이용한 얼굴표정인식에 관한 연구)

  • 김동수;남기환;한준희;박호식;차영석;최현수;배철수;권오홍;나상동
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 1999.11a
    • /
    • pp.416-419
    • /
    • 1999
  • We present a technique for recognizing facial expressions from image sequences. The technique uses muscle-based feature models for tracking facial features. Since the feature models are constructed with a small number of parameters and are deformable only within a limited range and set of directions, the search space for each feature can be limited. The technique estimates muscular contractile degrees for classifying the six principal facial expressions. The contractile vectors are obtained from the deformations of the facial muscle models. Similarities are defined between those vectors and the representative vectors of the principal expressions and are used to determine facial expressions.
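
The classification step described above compares contractile vectors with representative vectors of the principal expressions. Below is a minimal sketch, assuming cosine similarity and invented 18-dimensional prototype vectors; the paper's own similarity definition may differ.

```python
import numpy as np

def classify_expression(contractile_vec, prototypes):
    """Pick the expression whose representative contractile vector is most similar.

    `prototypes` maps expression names to representative muscle-contraction
    vectors.  Cosine similarity is used here as one plausible measure.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    scores = {name: cosine(contractile_vec, proto) for name, proto in prototypes.items()}
    return max(scores, key=scores.get), scores

# Illustrative prototypes for the six principal expressions (values invented).
rng = np.random.default_rng(1)
prototypes = {name: rng.uniform(0, 1, 18)
              for name in ["happiness", "sadness", "surprise", "anger", "disgust", "fear"]}
observed = prototypes["surprise"] + rng.normal(0, 0.05, 18)
label, scores = classify_expression(observed, prototypes)
```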

Local Feature Based Facial Expression Recognition Using Adaptive Decision Tree (적응형 결정 트리를 이용한 국소 특징 기반 표정 인식)

  • Oh, Jihun;Ban, Yuseok;Lee, Injae;Ahn, Chunghyun;Lee, Sangyoun
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.39A no.2
    • /
    • pp.92-99
    • /
    • 2014
  • This paper proposes a method for facial expression recognition based on a decision tree structure. From a facial expression image, local features are extracted with an ASM (Active Shape Model) and LBP (Local Binary Pattern). The discriminant features obtained from these local features are used to classify every pairwise combination of facial expressions. Based on the accumulated number of correct classifications, the combination of facial expression and local region used at each branch is decided, and the integration of these branch classifiers generates the decision tree. Facial expression recognition based on this decision tree shows better recognition performance than a method that does not use it.
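
The recognition pipeline above extracts LBP features from ASM-localized regions. Below is a generic sketch of a basic 8-neighbour LBP histogram for one such region; the exact LBP variant and region layout used in the paper are not specified here.

```python
import numpy as np

def lbp_histogram(region):
    """Compute an 8-neighbour LBP code for each interior pixel of a grayscale
    region and return a normalized 256-bin histogram."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = region.shape
    hist = np.zeros(256, dtype=np.int64)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = region[y, x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if region[y + dy, x + dx] >= center:
                    code |= 1 << bit
            hist[code] += 1
    return hist / hist.sum()

# Illustrative usage on a random patch standing in for an ASM-localized region.
patch = np.random.randint(0, 256, size=(32, 32))
features = lbp_histogram(patch)
```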

Real-Time Face Avatar Creation and Warping Algorithm Using Local Mean Method and Facial Feature Point Detection

  • Lee, Eung-Joo;Wei, Li
    • Journal of Korea Multimedia Society
    • /
    • v.11 no.6
    • /
    • pp.777-786
    • /
    • 2008
  • A human face avatar is important information nowadays, for example for representing real people in a virtual world. In this paper, we present a face avatar creation and warping algorithm based on face feature analysis. To detect facial features, we use a local mean method based on facial feature appearance and face geometric information, and detect facial candidates by their characteristics in the $YC_bC_r$ color space. We also define rules based on face geometric information to limit the search range. To analyze facial features, we use facial feature points to describe each feature and analyze the geometric relationships among these points to create the face avatar. We have carried out simulations on a PC and on embedded mobile devices such as a PDA and a mobile phone to evaluate the efficiency of the proposed algorithm. The simulation results confirm that the proposed algorithm performs well and that its execution speed is acceptable.
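
Face candidates are detected above by their characteristics in the YCbCr color space. The sketch below shows one common way such a step can look, using the ITU-R BT.601 conversion and a widely used Cb/Cr skin-tone box; the paper's actual thresholds and local mean method are not reproduced here.

```python
import numpy as np

def skin_candidates(rgb_image, cb_range=(77, 127), cr_range=(133, 173)):
    """Mark pixels whose Cb/Cr values fall inside a typical skin-tone box.

    The RGB -> YCbCr conversion uses the common BT.601 coefficients; the
    threshold box is a widely used heuristic, not the rule from the paper.
    """
    rgb = rgb_image.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    mask = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
    return mask

# Illustrative usage on a random image standing in for a camera frame.
frame = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
candidate_mask = skin_candidates(frame)
```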

Electrophysiologic Examination and Physiotherapy for Facial Nerve Palsy (안면신경 마비의 전기생리학적 검사 및 물리치료)

  • Ryoo, Jae-Kwan;Kim, Jong-Soon
    • Journal of Korean Physical Therapy Science
    • /
    • v.4 no.3
    • /
    • pp.499-509
    • /
    • 1997
  • The facial nerve has a long pathway, so facial nerve fibers can easily be involved at any point along their course, leading to facial palsy of the lower motor neuron type or the upper motor neuron type. Electrophysiologic examination can evaluate and anticipate the prognosis of facial nerve palsy. The electrophysiologic examinations include the Nerve Excitability Test (NET), Electroneurography (ENG), Electromyography (EMG), the Blink Reflex, and Electrogustometry, among others. The NET is a very useful method for assessing prognosis and for distinguishing between nerve degeneration and physiological block as early as 72 hours after onset of the facial palsy. The other examinations also give objective information about the facial nerve for prognosis and treatment. The treatment goals of physiotherapy are to prevent contracture and disuse atrophy of the facial muscles through muscle re-education and strengthening, and to maintain symmetric facial motion. Treatment should start as early as possible.

Risk Situation Recognition Using Facial Expression Recognition of Fear and Surprise Expression (공포와 놀람 표정인식을 이용한 위험상황 인지)

  • Kwak, Nae-Jong;Song, Teuk Seob
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.19 no.3
    • /
    • pp.523-528
    • /
    • 2015
  • This paper proposes an algorithm for risk situation recognition using facial expressions. The proposed method recognizes the surprise and fear expressions among a person's various emotional expressions in order to recognize a risk situation. The method first extracts the facial region from the input and detects the eye and lip regions from the extracted face. It then applies Uniform LBP to each region, discriminates the facial expression, and recognizes the risk situation. The proposed method is evaluated on images from the Cohn-Kanade database. The database contains six kinds of basic human facial expressions: smile, sadness, surprise, anger, disgust, and fear. The proposed method produces good facial expression recognition results and discriminates risk situations well.
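
The risk decision above hinges on detecting fear or surprise among the recognized expressions. A minimal sketch of such a decision rule follows; the expression scores, threshold, and rule are illustrative, not taken from the paper.

```python
def risk_situation(expression_scores, threshold=0.6):
    """Flag a risk situation when fear or surprise dominates the scores.

    `expression_scores` is assumed to map expression names to confidence
    values in [0, 1]; the threshold and the exact decision rule are
    illustrative.
    """
    risky = {"fear", "surprise"}
    best = max(expression_scores, key=expression_scores.get)
    return best in risky and expression_scores[best] >= threshold

# Illustrative usage with made-up classifier outputs.
scores = {"smile": 0.05, "sadness": 0.10, "surprise": 0.70,
          "anger": 0.05, "disgust": 0.03, "fear": 0.07}
print(risk_situation(scores))  # True: surprise dominates above the threshold
```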

3-D Facial Animation on the PDA via Automatic Facial Expression Recognition (얼굴 표정의 자동 인식을 통한 PDA 상에서의 3차원 얼굴 애니메이션)

  • Lee Don-Soo;Choi Soo-Mi;Kim Hae-Hwang;Kim Yong-Guk
    • The KIPS Transactions:PartB
    • /
    • v.12B no.7 s.103
    • /
    • pp.795-802
    • /
    • 2005
  • In this paper, we present a facial expression recognition and synthesis system that automatically recognizes 7 basic emotions and renders the face in a non-photorealistic style on a PDA. For the recognition of facial expressions, we first detect the face area within the image acquired from the camera. Then a normalization procedure is applied for geometric and illumination corrections. To classify a facial expression, we have found that combining Gabor wavelets with an enhanced Fisher model gives the best result. In our case, the output is a set of 7 emotional weights. This weighting information, transmitted to the PDA via a mobile network, is used for non-photorealistic facial expression animation. To render a 3-D avatar with a unique facial character, we adopted a cartoon-like shading method. We found that facial expression animation using emotional curves is more effective in expressing the timing of an expression than linear interpolation.
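
The PDA-side animation above is driven by 7 emotional weights and by emotional (non-linear) timing curves. The sketch below assumes a simple blend-shape rig and a smoothstep curve standing in for an emotional curve; the paper's avatar representation and its actual curves may differ.

```python
import numpy as np

def blend_face(neutral, expression_offsets, weights):
    """Blend a neutral face mesh with per-emotion vertex offsets.

    `expression_offsets` maps each emotion to a (V, 3) array of vertex
    displacements, and `weights` holds the recognized emotional weighting
    for the current frame.  This blend-shape formulation is an assumption.
    """
    face = neutral.copy()
    for emotion, w in weights.items():
        face += w * expression_offsets[emotion]
    return face

def ease_in_out(t):
    """A simple smoothstep 'emotional curve', contrasted with linear timing."""
    return t * t * (3 - 2 * t)

emotions = ["happiness", "sadness", "surprise", "anger", "disgust", "fear", "neutral"]
rng = np.random.default_rng(2)
neutral = rng.normal(size=(500, 3))
offsets = {e: rng.normal(scale=0.05, size=(500, 3)) for e in emotions}

# Animate toward a recognized 'surprise' expression over 30 frames.
for frame in range(30):
    t = ease_in_out(frame / 29)            # non-linear timing instead of plain t
    weights = {e: (t if e == "surprise" else 0.0) for e in emotions}
    current = blend_face(neutral, offsets, weights)
```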

Face classification and analysis based on geometrical feature of face (얼굴의 기하학적 특징정보 기반의 얼굴 특징자 분류 및 해석 시스템)

  • Jeong, Kwang-Min;Kim, Jung-Hoon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.16 no.7
    • /
    • pp.1495-1504
    • /
    • 2012
  • This paper proposes an algorithm to classify and analyze facial features such as the eyebrows, eyes, mouth, and chin based on the geometric features of the face. As a preprocessing step, the algorithm extracts facial features such as the eyebrows, eyes, nose, mouth, and chin. From the extracted features, it detects shape and form information and the ratios of distances between the features, and formulates them into evaluation functions that classify 12 eyebrow types, 3 eye types, 9 mouth types, and 4 chin types. Using these facial features, it analyzes a face. The face analysis algorithm uses information about the pixel distribution and gradient of each feature; in other words, it analyzes a face by comparing such information across the features.
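
The classification above relies on evaluation functions built from distance ratios between facial features. A minimal sketch of one such ratio-and-threshold rule follows, with invented landmarks, thresholds, and a three-way split; the paper defines many more types with its own evaluation functions.

```python
import numpy as np

def mouth_width_ratio(landmarks):
    """Ratio of mouth width to inter-eye distance, one plausible geometric cue."""
    eye_dist = np.linalg.norm(landmarks["right_eye"] - landmarks["left_eye"])
    mouth_w = np.linalg.norm(landmarks["mouth_right"] - landmarks["mouth_left"])
    return mouth_w / eye_dist

def classify_mouth(landmarks, thresholds=(0.55, 0.75)):
    """Bucket the mouth into narrow / average / wide using the ratio above.

    The specific ratio and thresholds are illustrative; the paper's 9 mouth
    types come from its own evaluation functions.
    """
    r = mouth_width_ratio(landmarks)
    if r < thresholds[0]:
        return "narrow"
    if r < thresholds[1]:
        return "average"
    return "wide"

# Illustrative usage with made-up landmark coordinates (x, y in pixels).
pts = {"left_eye": np.array([100.0, 120.0]), "right_eye": np.array([180.0, 120.0]),
       "mouth_left": np.array([115.0, 200.0]), "mouth_right": np.array([168.0, 200.0])}
print(classify_mouth(pts))
```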