• Title/Summary/Keyword: Facial appearance

Robust Facial Expression Recognition Based on Local Directional Pattern

  • Jabid, Taskeed;Kabir, Md. Hasanul;Chae, Oksam
    • ETRI Journal
    • /
    • v.32 no.5
    • /
    • pp.784-794
    • /
    • 2010
  • Automatic facial expression recognition has many potential applications in different areas of human-computer interaction. However, these applications are not yet fully realized due to the lack of an effective facial feature descriptor. In this paper, we present a new appearance-based feature descriptor, the local directional pattern (LDP), to represent facial geometry and analyze its performance in expression recognition. An LDP feature is obtained by computing the edge response values in 8 directions at each pixel and encoding them into an 8-bit binary number using the relative strength of these edge responses. The LDP descriptor, a distribution of LDP codes within an image or image patch, is used to describe each expression image. The effectiveness of dimensionality reduction techniques, such as principal component analysis and AdaBoost, is also analyzed in terms of computational cost savings and classification accuracy. Two well-known machine learning methods, template matching and support vector machine, are used for classification on the Cohn-Kanade and Japanese Female Facial Expression databases. The higher classification accuracy demonstrates the superiority of the LDP descriptor over other appearance-based feature descriptors.
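
A minimal sketch of the LDP encoding described above, assuming Kirsch compass masks and the top k = 3 edge responses (common choices in the LDP literature; the abstract does not specify the exact masks or value of k):

```python
import numpy as np
from scipy.ndimage import convolve

def kirsch_masks():
    """Eight 3x3 Kirsch compass masks, one per edge direction."""
    pos = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    base = np.array([5, 5, 5, -3, -3, -3, -3, -3], dtype=float)
    masks = []
    for r in range(8):
        m = np.zeros((3, 3))
        for (i, j), v in zip(pos, np.roll(base, r)):
            m[i, j] = v
        masks.append(m)
    return masks

def ldp_codes(gray, k=3):
    """Encode each pixel into an 8-bit code from its k strongest edge responses."""
    responses = np.stack([np.abs(convolve(gray.astype(float), m))
                          for m in kirsch_masks()])
    order = np.argsort(responses, axis=0)      # directions sorted by response strength
    codes = np.zeros(gray.shape, dtype=np.uint8)
    for bit in order[-k:]:                     # the k most prominent directions per pixel
        codes |= (1 << bit).astype(np.uint8)
    return codes

def ldp_descriptor(gray, k=3):
    """LDP descriptor: histogram of LDP codes over an image or image patch."""
    return np.bincount(ldp_codes(gray, k).ravel(), minlength=256)
```

In practice the descriptor is usually built per image patch and the patch histograms are concatenated before classification.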

Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.4
    • /
    • pp.562-567
    • /
    • 2009
  • As a key mechanism of human emotional interaction, facial expression is a powerful tool in human-robot interfaces (HRI) and human-computer interaction (HCI). By recognizing facial expressions, a system can respond to the emotional state of the user and infer which services a service agent, such as an intelligent robot, should provide. This article addresses expressive face modeling using an advanced active appearance model (AAM) for facial emotion recognition, considering the six universal emotion categories defined by Ekman. In the human face, emotions are expressed most strongly around the eyes and mouth, so recognizing emotion from a facial image requires extracting feature points such as Ekman's Action Units (AUs). The AAM is one of the most commonly used methods for facial feature extraction and can be applied to construct AUs. Because the performance of the traditional AAM depends on the initial parameter settings of the model, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian network. First, the reconstructive parameters of a new gray-scale image are obtained by sample-based learning and used to reconstruct the shape and texture of the image and to calculate the initial AAM parameters from the reconstructed facial model. The distance error between the model and the target contour is then reduced by adjusting the model parameters. After several iterations, the model matched to the facial feature outline is obtained, and its features are used to recognize the facial emotion with a Bayesian network.
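
The iterative fitting step described above (adjusting model parameters until the model matches the facial feature outline) follows the general AAM fitting pattern. The sketch below is a minimal illustration of that loop, assuming a hypothetical model object with a precomputed linear update matrix R and render/sample routines; it is not the paper's combined Advanced AAM / Bayesian network implementation.

```python
import numpy as np

def fit_aam(image, model, p_init, max_iters=30, tol=1e-3):
    """Iteratively refine AAM parameters p until the texture residual stops improving."""
    p = p_init.copy()
    prev_error = np.inf
    for _ in range(max_iters):
        synthesized = model.render(p)        # model texture for the current parameters
        sampled = model.sample(image, p)     # image texture under the current shape
        residual = sampled - synthesized
        error = np.linalg.norm(residual)
        if abs(prev_error - error) < tol:    # converged: contour no longer improves
            break
        p -= model.R @ residual              # precomputed linear parameter update
        prev_error = error
    return p, error
```

The fitted shape parameters (or the AUs derived from them) would then be passed to the Bayesian network classifier as evidence.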

The Effect of Nonverbal Communication on Trust, Switching Barrier and Repurchase Intention (서비스제공자의 비언어적 커뮤니케이션이 신뢰와 전환장벽 및 재구매의도에 미치는 영향)

  • Lee, Ok-Hee
    • Fashion & Textile Research Journal
    • /
    • v.14 no.5
    • /
    • pp.803-810
    • /
    • 2012
  • This study investigates the effect of nonverbal communication on trust, switching barriers, and repurchase intention. The subjects were customers of a fashion shop in Sunchon, and questionnaires were collected by convenience sampling from July 2010 to August 2010. Questionnaire data from 335 customers of a national brand were analyzed through reliability analysis, factor analysis, and multiple regression analysis. The results of this study are as follows. First, nonverbal communication by the service provider was divided into three types: physical appearance and paralanguage, postures and proxemics, and facial expressions. Second, physical appearance and paralanguage, postures and proxemics, and facial expressions all had a significant impact on customer trust. Third, regarding the relationship between nonverbal communication and the switching barrier, postures and proxemics and facial expressions (but not physical appearance and paralanguage) had a significantly positive influence on the switching barrier. Fourth, physical appearance/paralanguage, postures/proxemics, and facial expressions all had a positive influence on repurchase intention. Fifth, both trust and the switching barrier had a significantly positive influence on repurchase intention. According to these results, the more positive the nonverbal communication of the service provider, the higher the customers' trust, switching barrier, and repurchase intention.

Facial Feature Extraction with Its Applications

  • Lee, Minkyu;Lee, Sangyoun
    • Journal of International Society for Simulation Surgery
    • /
    • v.2 no.1
    • /
    • pp.7-9
    • /
    • 2015
  • Purpose: In many face-related applications, such as head pose estimation, 3D face modeling, and facial appearance manipulation, robust and fast facial feature extraction is necessary. We present a facial feature extraction method based on shape regression and feature selection for real-time facial feature extraction. Materials and Methods: The facial features are initialized by a statistical shape model, and the shape of the facial features is then deformed iteratively according to the texture pattern selected from the feature pool. Results: We obtain fast and robust facial feature extraction with an error of less than 4% and a processing time of less than 12 ms. The alignment error is measured as the average ratio of the pixel difference to the inter-ocular distance. Conclusion: The accuracy and processing time of the method are sufficient for facial feature based applications, and it can be used for face beautification or 3D face modeling.
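
The normalized alignment error quoted above (average landmark pixel distance divided by the inter-ocular distance) can be computed as in the short sketch below; the landmark array layout and eye-center indices are assumptions for illustration.

```python
import numpy as np

def normalized_alignment_error(pred, gt, left_eye_idx, right_eye_idx):
    """Mean point-to-point error divided by the ground-truth inter-ocular distance.

    pred, gt: (N, 2) arrays of predicted and ground-truth landmark coordinates.
    left_eye_idx, right_eye_idx: indices of the eye-center landmarks (assumed layout).
    """
    inter_ocular = np.linalg.norm(gt[left_eye_idx] - gt[right_eye_idx])
    point_errors = np.linalg.norm(pred - gt, axis=1)   # per-landmark pixel distances
    return point_errors.mean() / inter_ocular
```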

What Do We See When We Look at Faces? (우리는 얼굴을 어떻게 평가하는가?)

  • Evans, Carla A.
    • The korean journal of orthodontics
    • /
    • v.33 no.5 s.100
    • /
    • pp.319-322
    • /
    • 2003
  • Recent scientific findings on the perception of facial attractiveness, coupled with technological advances in computer imaging, make it possible to measure the facial characteristics that may be associated with specific judgments of facial appearance. These new methods can be used to produce psychometric norms of facial attractiveness that could potentially supplement the conventional population norms or averages currently used in orthodontic treatment planning. It is hypothesized that consideration of psychometric norms will enhance doctor-patient communication and lead to greater patient satisfaction at the completion of orthodontic treatment.

Markerless Image-to-Patient Registration Using Stereo Vision : Comparison of Registration Accuracy by Feature Selection Method and Location of Stereo Vision System (스테레오 비전을 이용한 마커리스 정합 : 특징점 추출 방법과 스테레오 비전의 위치에 따른 정합 정확도 평가)

  • Joo, Subin;Mun, Joung-Hwan;Shin, Ki-Young
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.53 no.1
    • /
    • pp.118-125
    • /
    • 2016
  • This study evaluates the performance of an image-to-patient registration algorithm using stereo vision and CT images for surgical navigation of the facial region. For image-to-patient registration, feature extraction and 3D coordinate calculation are performed first, followed by registration of the 3D CT image to the 3D coordinates. Of the five combinations that can be generated from three facial feature extraction methods and three registration methods on stereo vision images, this study identifies the one with the highest registration accuracy. In addition, image-to-patient registration accuracy was compared while changing the facial rotation angle. The experiments show that when the facial rotation angle is within 20 degrees, registration using the Active Appearance Model and Pseudo Inverse Matching has the highest accuracy, and when the facial rotation angle exceeds 20 degrees, registration using Speeded Up Robust Features and Iterative Closest Point has the highest accuracy. These results indicate that Active Appearance Model and Pseudo Inverse Matching should be used to reduce registration error when the facial rotation angle is within 20 degrees, and Speeded Up Robust Features and Iterative Closest Point should be used when the facial rotation angle is over 20 degrees.
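
The angle-dependent recommendation above amounts to a simple dispatch rule, sketched below. The two pipeline callables are hypothetical placeholders standing in for AAM + pseudo-inverse matching and SURF + ICP; they are not implementations of the paper's algorithms.

```python
def register_face(stereo_image, ct_model, rotation_deg,
                  aam_pseudoinverse_pipeline, surf_icp_pipeline):
    """Dispatch to the registration pipeline reported as most accurate for the pose."""
    if abs(rotation_deg) <= 20:
        # Frontal to moderately rotated faces: AAM features + pseudo-inverse matching.
        return aam_pseudoinverse_pipeline(stereo_image, ct_model)
    # Strongly rotated faces: SURF features + Iterative Closest Point.
    return surf_icp_pipeline(stereo_image, ct_model)
```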

Face Detection and Recognition with Multiple Appearance Models for Mobile Robot Application

  • Lee, Taigun;Park, Sung-Kee;Kim, Munsang
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.100.4-100
    • /
    • 2002
  • For visual navigation, a mobile robot can use a stereo camera with a large field of view. In this paper, we propose an algorithm to detect and recognize human faces on the basis of such a camera system, using a new coarse-to-fine detection scheme. For coarse detection, roughly face-like areas are found in the entire image using dual ellipse templates. Detailed alignment of the facial outline and features is then performed on the basis of a view-based multiple appearance model. Because it is hard to finely align the facial features in this case, the most closely resembling face image area is selected from multiple face appearances using the most distinctive facial features - the two eye...

Facial Flap Repositioning in Posttraumatic Facial Asymmetry

  • Byun, Il Hwan;Byun, Dahn;Baek, Woo Yeol
    • Archives of Craniofacial Surgery
    • /
    • v.17 no.4
    • /
    • pp.240-243
    • /
    • 2016
  • Perfect facial and body symmetry is an important aesthetic concept that is very difficult, if not impossible, to achieve. Yet facial asymmetries are commonly encountered by plastic and reconstructive surgeons. Here, we present a case of posttraumatic facial asymmetry successfully treated with a unique concept of facial flap repositioning. A 25-year-old male patient visited our department with severe posttraumatic facial asymmetry. The nasal bone and an implant were deviated to the right, and the actual facial asymmetry was much more severe than suggested by computed tomography, with the face generally shifted to the right. After corrective rhinoplasty, we approached through an intraoral incision and noted extensive adhesion from previous surgeries. We meticulously elevated the facial flap on both sides, mainly involving the cheeks. The elevated facial flap was shifted to the left, and after finding the appropriate location, we sutured the middle portion of the flap to the periosteum of the anterior nasal spine for fixation. We successfully freed the deviated facial tissues and repositioned them to improve symmetry in a single-stage operation. We conclude that facial flap repositioning is an effective technique for patients with a history of multiple operations, and that this method can be successfully applied to other body parts with decreased tissue laxity.

The analysis of relationships between facial impressions and physical features (얼굴 인상과 물리적 특징의 관계 구조 분석)

  • 김효선;한재현
    • Korean Journal of Cognitive Science
    • /
    • v.14 no.4
    • /
    • pp.53-63
    • /
    • 2003
  • We analyzed the relationships between facial impressions and physical features, and investigated the effects of impressions on facial similarity judgments. Using 79 faces extracted from a face database, we collected impression ratings along four dimensions (mild-fierce, bright-dull, feminine-manly, and youthful-mature) and measurements of 41 physical features. Multiple regression analyses showed that the impression ratings and the feature measurements are closely related. Our experiments on facial similarity judgments confirmed the possibility that facial impressions are used in the processing of facial information: people tend to perceive faces as more similar when they share the same impression than when the impression is neutral, even though the faces are physically alike. These results imply that facial impressions serve as a psychological structure representing facial appearance, and that facial processing includes impression information.
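
As a rough illustration of the multiple regression analysis described above, the sketch below fits one impression dimension against the physical feature measurements with ordinary least squares; the variable layout and the plain least-squares formulation are assumptions for illustration, not the study's actual analysis.

```python
import numpy as np

def fit_impression_regression(features, ratings):
    """OLS fit of impression rating ~ physical features (with intercept).

    features: (n_faces, n_features) array of physical feature measurements.
    ratings:  (n_faces,) array of ratings on one impression dimension.
    Returns the coefficient vector and the R-squared of the fit.
    """
    X = np.column_stack([np.ones(len(features)), features])   # add intercept column
    coef, _, _, _ = np.linalg.lstsq(X, ratings, rcond=None)
    predicted = X @ coef
    ss_res = np.sum((ratings - predicted) ** 2)
    ss_tot = np.sum((ratings - ratings.mean()) ** 2)
    return coef, 1 - ss_res / ss_tot
```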
