• Title/Summary/Keyword: Facial Behavior

Facial Behavior Recognition for Driver's Fatigue Detection (운전자 피로 감지를 위한 얼굴 동작 인식)

  • Park, Ho-Sik;Bae, Cheol-Soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.35 no.9C
    • /
    • pp.756-760
    • /
    • 2010
  • This paper proposes a novel facial behavior recognition system for driver's fatigue detection. Facial behavior is revealed in various facial features such as facial expression, head pose, gaze, and wrinkles, but it is very difficult to clearly discriminate a particular behavior from the extracted features, because human behavior is complicated and the face alone is often too ambiguous to provide enough information. The proposed system first performs facial feature detection, including eye tracking, facial feature tracking, furrow detection, head orientation estimation, and head motion detection, and encodes the extracted features as action units (AUs) of the Facial Action Coding System (FACS). On the basis of the observed AUs, it infers the probability of each state occurring through a Bayesian network.
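
The abstract's pipeline maps observed FACS action units (AUs) to a fatigue state by probabilistic inference. As a rough illustration only, the sketch below uses a naive-Bayes simplification of that idea; the AU labels, prior, and likelihood values are invented assumptions, not parameters from the paper's Bayesian network.

```python
# Illustrative sketch (not the paper's model): naive-Bayes-style inference of a
# driver "fatigued" state from binary FACS action-unit observations.
# All AU labels and probabilities below are assumed values for demonstration.

def posterior_fatigue(observed_aus, prior=0.2):
    # (P(AU present | fatigued), P(AU present | alert)) -- assumed likelihoods
    likelihoods = {
        "AU43_eye_closure":   (0.80, 0.10),
        "AU45_blink":         (0.70, 0.30),
        "AU26_jaw_drop_yawn": (0.60, 0.05),
        "head_nod":           (0.50, 0.10),
    }
    p_fatigued, p_alert = prior, 1.0 - prior
    for au, (p_if_fatigued, p_if_alert) in likelihoods.items():
        # Missing AUs are treated as observed-absent -- a simplification.
        present = observed_aus.get(au, False)
        p_fatigued *= p_if_fatigued if present else (1.0 - p_if_fatigued)
        p_alert    *= p_if_alert    if present else (1.0 - p_if_alert)
    # Normalize over the two states to get the posterior probability of fatigue.
    return p_fatigued / (p_fatigued + p_alert)

if __name__ == "__main__":
    obs = {"AU43_eye_closure": True, "AU26_jaw_drop_yawn": True}
    print(f"P(fatigued | observed AUs) = {posterior_fatigue(obs):.2f}")
```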

Effects of Facial Expression of Others on Moral Judgment (타인의 얼굴 표정이 도덕적 판단에 미치는 영향)

  • Lee, WonSeob;Kim, ShinWoo
    • Korean Journal of Cognitive Science
    • /
    • v.30 no.2
    • /
    • pp.85-104
    • /
    • 2019
  • Past research showed that the presence of others induces morally desirable behavior and stricter judgments; that is, the presence of others makes people behave as moral beings. On the other hand, little research has tested what effect the facial expressions of others have on moral judgments. In this research, we tested the effects of emotion conveyed by facial expression on moral judgments. To this end, we presented descriptions of immoral or prosocial behavior along with facial expressions of various emotions (in particular, disgust and happiness) and asked participants to make moral judgments on the behavior in the descriptions. In Experiment 1, facial expression did not affect moral judgments, but the variability of judgments increased when descriptions and facial expressions were incongruent. In Experiment 2, we addressed potential reasons for the null effect and conducted the experiment using the same procedure. Subjects in Experiment 2 made stricter judgments of immoral behavior with disgusted faces than with happy faces, but the effect did not occur for prosocial behavior. In Experiment 3, we repeated the same experiment after having subjects consider themselves the actor in the descriptions. The results replicated the effects of facial expression found in Experiment 2, but there was no effect of the actor on moral judgments. This research showed that the facial expressions of others specifically affect moral judgments of immoral behavior but not of prosocial behavior. In the general discussion, we provide further discussion of the results and the limitations of this research.

Relationship Between Morphologic measurement of Facial Feature and Eating Behavior During a Meal (얼굴생김새와 식사행동과의 관련성)

  • Kim, Gyeong-Eup;Kim, Seok-Young
    • Journal of the Korean Society of Food Culture
    • /
    • v.16 no.2
    • /
    • pp.109-117
    • /
    • 2001
  • Judging from the studies carried out by Dr. Jo, Yong Jin on Koreans' faces, Koreans are divided into two constitutions according to their facial features and heritage. One population is the Northern lineage, whose ancestors migrated from Siberia during the ice age. In order to survive in a cold climate, they developed a high level of metabolic heat production; cold adaptation for preventing heat loss resulted in a reduction of the facial surface area, with small eyes, nose, and lips. The other population is the Southern lineage, descended from natives of the Korean peninsula; they have big eyes with double-edged eyelids, a broad nose, and thick lips. It is generally believed that both genetic and environmental factors influence eating behavior. Although we cannot determine the heritage that may contribute to metabolism and eating behavior, we commonly recognize physiological heritage according to facial features. In order to investigate the relationship among the size and shape of facial features, eating behavior, and anthropometric measurements in female college students, eating behavior was measured during an instant-noodle lunch eaten in a laboratory setting at an ambient temperature of 23°C. The anterior surface area of the left eye and the length of the right eye were positively correlated with the difference between the peak postprandial and meal-start core temperatures. The surface area of the lower lip was also negatively correlated with the meal-start core temperature and meal duration. In addition, the total lip area was positively correlated with the difference between the peak postprandial and meal-start core temperatures and negatively correlated with meal duration. However, anthropometric measurements were not related to the size of facial features.

Image Recognition based on Adaptive Deep Learning (적응적 딥러닝 학습 기반 영상 인식)

  • Kim, Jin-Woo;Rhee, Phill-Kyu
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.18 no.1
    • /
    • pp.113-117
    • /
    • 2018
  • Human emotions are revealed by various factors: words, actions, facial expressions, attire, and so on. However, people know how to hide their feelings, so emotion cannot easily be inferred from any single factor. We therefore decided to pay attention to behavior and facial expressions, which cannot be easily concealed without constant effort and training. In this paper, we propose an algorithm that estimates human emotion by combining two results, obtained by gradually learning human behavior and facial expression from small amounts of data with a deep learning method. Through this algorithm, we can grasp human emotions more comprehensively.
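
The abstract combines the outputs of a behavior model and a facial-expression model into a single emotion estimate. The paper does not specify the combination rule, so the sketch below shows one common choice, a weighted average (late fusion) of two softmax outputs; the label set and weight are hypothetical.

```python
# Illustrative sketch only: late fusion of two emotion classifiers, one for
# facial expression and one for body behavior. The weighting scheme and the
# class labels are assumptions, not taken from the paper.
import numpy as np

EMOTIONS = ["anger", "happiness", "sadness", "surprise", "neutral"]  # assumed label set

def fuse_predictions(face_probs, behavior_probs, face_weight=0.6):
    """Weighted average of two softmax outputs over the same label set."""
    face_probs = np.asarray(face_probs, dtype=float)
    behavior_probs = np.asarray(behavior_probs, dtype=float)
    fused = face_weight * face_probs + (1.0 - face_weight) * behavior_probs
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: the face model is confident about happiness, the behavior model is not.
label, fused = fuse_predictions([0.1, 0.7, 0.1, 0.05, 0.05],
                                [0.2, 0.3, 0.2, 0.1, 0.2])
print(label, fused.round(2))
```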

Make-up Behavior Differences Depending on Make-up Preference Image (화장추구이미지에 따른 화장행동의 차이)

  • Lee Hyun-Jung;Kim Mi-Young
    • Journal of the Korean Society of Clothing and Textiles
    • /
    • v.30 no.5 s.153
    • /
    • pp.800-806
    • /
    • 2006
  • The purpose of this study was to investigate the differences in make-up behavior influenced by make-up preference images and the order of importance of the make-up behaviors. Questionnaires were given to female residents aged 20 to 45 in Seoul and Kyung-gi province during October 2004, and 322 questionnaires were used for data analysis. The collected data were analyzed using SPSS 10.0 with techniques such as ANOVA, Duncan's test, and paired t-tests. The make-up behaviors were divided into three main categories: facial make-up behavior, color make-up behavior, and additional make-up behavior. The results showed that skin care behavior was considered the most important of the make-up behaviors. In general, facial make-up was thought to be most important, followed in order by color make-up and additional make-up. Differences in make-up behavior by make-up preference image were also found: the natural make-up preference image group considered all make-up behaviors less important than the other make-up preference image groups, while the modern make-up preference image group considered additional make-up behavior more important.

The Effects of Skin Recognition on the Purchasing behavior and Propensity to buy Facial Cleanser (피부인식이 세안제 구매행동 및 구매성향에 미치는 영향)

  • Han, Yu-Ree;Kim, Min-Kyoung;Li, Shun-Hua
    • Journal of Digital Convergence
    • /
    • v.16 no.10
    • /
    • pp.465-477
    • /
    • 2018
  • The purpose of this study was to examine the effect of skin recognition on the purchasing behavior of, and propensity to buy, facial cleanser among 311 women in their 20s to 50s. Skin recognition was analyzed in terms of importance, interest, and satisfaction, and purchasing propensity in terms of impulse buying, brand-dependent buying, and planned buying. The group with high interest in their skin spent a longer time cleansing. Regarding purchasing information, the group with low knowledge obtained information from people around them, whereas the group with high knowledge obtained it from the internet. In terms of purchasing propensity, women highly interested in their skin tended toward both impulse buying and planned buying, while women who placed high importance on their skin were less inclined toward impulse buying. In conclusion, skin recognition influences the purchasing behavior and propensity to buy facial cleanser.

An Intelligent Emotion Recognition Model Using Facial and Bodily Expressions

  • Jae Kyeong Kim;Won Kuk Park;Il Young Choi
    • Asia pacific journal of information systems
    • /
    • v.27 no.1
    • /
    • pp.38-53
    • /
    • 2017
  • As sensor technologies and image processing technologies make it easy to collect information on users' behavior, many researchers have examined automatic emotion recognition based on facial expressions, body expressions, and tone of voice, among others. Specifically, many studies of the multimodal case using facial and body expressions have relied on normal cameras. Thus, previous studies used a limited amount of information, because normal cameras generally produce only two-dimensional images. In the present research, we propose an artificial neural network-based model that uses a high-definition webcam and a Kinect to recognize users' emotions from facial and bodily expressions while they watch a movie trailer. We validate the proposed model in a naturally occurring field environment rather than in an artificially controlled laboratory environment. The results of this research will be helpful for the wide use of emotion recognition models in advertisements, exhibitions, and interactive shows.
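
The abstract describes an artificial neural network that takes facial features (from a webcam) and bodily features (from a Kinect) and outputs an emotion. The sketch below is a minimal feed-forward version of that idea under assumed feature dimensions, layer sizes, and emotion count; it is not the architecture reported in the paper.

```python
# Sketch under stated assumptions: a small feed-forward network over the
# concatenation of facial-expression features and body (skeleton) features.
# Feature dimensions, layer sizes, and the number of emotions are illustrative.
import torch
import torch.nn as nn

class FaceBodyEmotionNet(nn.Module):
    def __init__(self, face_dim=128, body_dim=60, num_emotions=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(face_dim + body_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Linear(64, num_emotions),
        )

    def forward(self, face_feat, body_feat):
        x = torch.cat([face_feat, body_feat], dim=1)  # fuse the two modalities
        return self.net(x)                            # logits; softmax for probabilities

model = FaceBodyEmotionNet()
logits = model(torch.randn(1, 128), torch.randn(1, 60))
print(logits.softmax(dim=1))
```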

Facial Point Classifier using Convolution Neural Network and Cascade Facial Point Detector (컨볼루셔널 신경망과 케스케이드 안면 특징점 검출기를 이용한 얼굴의 특징점 분류)

  • Yu, Je-Hun;Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.22 no.3
    • /
    • pp.241-246
    • /
    • 2016
  • Nowadays many people have an interest in facial expressions and human behavior; among them, human-robot interaction (HRI) researchers utilize digital image processing, pattern recognition, and machine learning in their studies. Facial feature point detection algorithms are very important for face recognition, gaze tracking, and expression and emotion recognition. In this paper, a cascade facial feature point detector is used for finding facial feature points such as the eyes, nose, and mouth. However, the detector has difficulty extracting the feature points from some images, because images differ in conditions such as size, color, and brightness. Therefore, in this paper, we propose an algorithm using a modified cascade facial feature point detector combined with a convolutional neural network. The structure of the convolutional neural network is based on Yann LeCun's LeNet-5. As input data for the convolutional neural network, color and gray image outputs from the cascade facial feature point detector were used; the images were resized to 32×32, and the gray images were converted to the YUV format. The gray and color images form the basis of the convolutional neural network. We then classified about 1,200 test images showing subjects. This research found that the proposed method is more accurate than a cascade facial feature point detector alone, because the algorithm refines the results produced by the cascade detector.
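
The abstract states that the network follows LeNet-5 and takes 32×32 patches returned by the cascade detector. The following sketch shows a LeNet-5-style classifier of that shape; the input channel count, class list, and activation choices are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal LeNet-5-style sketch: classify 32x32 patches from a cascade facial
# feature point detector into point categories (e.g., eye / nose / mouth).
# Channel counts, class count, and activations are assumed for illustration.
import torch
import torch.nn as nn

class FacialPointLeNet(nn.Module):
    def __init__(self, in_channels=3, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 6, kernel_size=5),  # 32x32 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                           # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),           # 14x14 -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                           # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x):                              # x: (batch, channels, 32, 32)
        return self.classifier(self.features(x))

model = FacialPointLeNet()
print(model(torch.randn(1, 3, 32, 32)).shape)          # torch.Size([1, 3])
```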

Difficulty in Facial Emotion Recognition in Children with ADHD (주의력결핍 과잉행동장애의 이환 여부에 따른 얼굴표정 정서 인식의 차이)

  • An, Na Young;Lee, Ju Young;Cho, Sun Mi;Chung, Young Ki;Shin, Yun Mi
    • Journal of the Korean Academy of Child and Adolescent Psychiatry
    • /
    • v.24 no.2
    • /
    • pp.83-89
    • /
    • 2013
  • Objectives : It is known that children with attention-deficit hyperactivity disorder (ADHD) experience significant difficulty in recognizing facial emotion, which involves processing of emotional facial expressions rather than speech, compared to children without ADHD. The objective of this study is to investigate the differences in facial emotion recognition between children with ADHD and normal children used as controls. Methods : The children for our study were recruited from the Suwon Project, a cohort comprising a non-random convenience sample of 117 nine-year-old ethnic Koreans. The parents of the study participants completed study questionnaires such as the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Kiddie-Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version. The Facial Expression Recognition Test of the Emotion Recognition Test was used for the evaluation of facial emotion recognition, and the ADHD Rating Scale was used for the assessment of ADHD. Results : Children with ADHD (N=10) were found to have impaired recognition in Emotional Differentiation and Contextual Understanding compared with normal controls (N=24). We found no statistically significant difference in the recognition of positive facial emotions (happiness and surprise) or negative facial emotions (anger, sadness, disgust, and fear) between the children with ADHD and the normal children. Conclusion : The results of our study suggest that facial emotion recognition may be closely associated with ADHD, after controlling for covariates, although more research is needed.

Difference of Facial Emotion Recognition and Discrimination between Children with Attention-Deficit Hyperactivity Disorder and Autism Spectrum Disorder (주의력결핍과잉행동장애 아동과 자폐스펙트럼장애 아동에서 얼굴 표정 정서 인식과 구별의 차이)

  • Lee, Ji-Seon;Kang, Na-Ri;Kim, Hui-Jeong;Kwak, Young-Sook
    • Journal of the Korean Academy of Child and Adolescent Psychiatry
    • /
    • v.27 no.3
    • /
    • pp.207-215
    • /
    • 2016
  • Objectives: This study aimed to investigate the differences in facial emotion recognition and discrimination ability between children with attention-deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). Methods: Fifty-three children aged 7 to 11 years participated in this study. Among them, 43 were diagnosed with ADHD and 10 with ASD. The parents of the participants completed the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Conners scale. The participants completed the Korean Wechsler Intelligence Scale for Children-fourth edition, the Advanced Test of Attention (ATA), the Penn Emotion Recognition Task, and the Penn Emotion Discrimination Task. The group differences in facial emotion recognition and discrimination ability were analyzed using analysis of covariance, controlling for the visual omission error index of the ATA. Results: The children with ADHD showed better recognition of happy and sad faces and fewer false positive neutral responses than those with ASD. Also, the children with ADHD recognized emotions better than those with ASD on female faces and in extreme facial expressions, but not on male faces or in mild facial expressions. We found no differences in facial emotion discrimination between the children with ADHD and ASD. Conclusion: Our results suggest that children with ADHD recognize facial emotions better than children with ASD, but they still have deficits. Interventions which consider their different emotion recognition and discrimination abilities are needed.