• Title/Summary/Keyword: Facial Behavior

110 search results

운전자 피로 감지를 위한 얼굴 동작 인식 (Facial Behavior Recognition for Driver's Fatigue Detection)

  • 박호식;배철수
    • 한국통신학회논문지 / Vol.35 No.9C / pp.756-760 / 2010
  • This paper proposes a method for effectively recognizing facial behavior for driver fatigue detection. Facial behavior manifests in facial features such as expression, face pose, gaze, and furrows. However, it is very difficult to identify a single behavioral state unambiguously from facial features, because human behavior is composite and the face expressing it is too ambiguous to provide sufficient information. The proposed facial behavior recognition system first senses facial features with an infrared camera, performing eye detection, head-pose estimation, head-movement estimation, face tracking, and furrow detection, and represents the acquired features as Action Units (AUs) of the Facial Action Coding System (FACS). Based on the acquired AUs, the probability of each state is inferred through a dynamic Bayesian network.
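
The pipeline above ends with a dynamic Bayesian network inferring state probabilities from observed AUs. A minimal sketch of one such filtering step, reduced to a two-state hidden Markov model; the states, AUs, and all probabilities below are hypothetical placeholders, not values from the paper:

```python
# One DBN filtering step over a driver-state variable, given observed
# FACS Action Units. Transition and emission probabilities are made up.

STATES = ["alert", "fatigued"]

# P(state_t | state_{t-1}) -- hypothetical transition model
TRANSITION = {
    "alert":    {"alert": 0.9, "fatigued": 0.1},
    "fatigued": {"alert": 0.2, "fatigued": 0.8},
}

# P(AU observed | state) -- hypothetical emission model for two AUs:
# AU43 (eye closure) and AU2 (outer brow raise)
EMISSION = {
    "alert":    {"AU43": 0.05, "AU2": 0.3},
    "fatigued": {"AU43": 0.60, "AU2": 0.1},
}

def filter_step(prior, observed_aus):
    """Predict with the transition model, then weight by the
    likelihood of the observed AUs (treated as independent)."""
    predicted = {
        s: sum(prior[p] * TRANSITION[p][s] for p in STATES) for s in STATES
    }
    unnorm = {}
    for s in STATES:
        lik = 1.0
        for au in observed_aus:
            lik *= EMISSION[s][au]
        unnorm[s] = predicted[s] * lik
    z = sum(unnorm.values())
    return {s: unnorm[s] / z for s in STATES}

belief = {"alert": 0.5, "fatigued": 0.5}
belief = filter_step(belief, ["AU43"])   # prolonged eye closure observed
```

After observing eye closure, the belief shifts heavily toward the fatigued state, which is the kind of inference the paper's network performs over a richer state and AU set.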

타인의 얼굴 표정이 도덕적 판단에 미치는 영향 (Effects of Facial Expression of Others on Moral Judgment)

  • 이원섭;김신우
    • 인지과학 / Vol.30 No.2 / pp.85-104 / 2019
  • Previous studies have shown that the presence of others induces desirable behavior and stricter moral judgments; that is, the mere presence of others makes people more moral. In contrast, there is little research on how moral judgment changes with the facial expressions of others. This study examined the effect of the emotion revealed by others' facial expressions on moral judgment. Participants were shown vignettes describing immoral or prosocial acts together with face stimuli expressing various emotions (in particular, disgust and happiness) and were asked to rate the morality of the described behavior. In Experiment 1, others' facial expressions did not affect moral judgment, but the variance of the morality ratings increased when the vignette content and the expression were incongruent. Experiment 2 followed the same procedure after correcting potential causes of the null result of Experiment 1. As a result, participants judged immoral behavior more harshly with a disgusted face than with a happy face, whereas facial expressions had no effect for prosocial behavior. In Experiment 3, the same experiment was run after inducing participants to regard themselves as the actor in the vignette; the results of Experiment 2 were replicated, but no difference by agent was found. This study shows that others' facial expressions specifically affect judgments of immoral acts but not judgments of prosocial acts. The General Discussion considers the results and presents limitations.

얼굴생김새와 식사행동과의 관련성 (Relationship Between Morphologic measurement of Facial Feature and Eating Behavior During a Meal)

  • 김경업;김석영
    • 한국식생활문화학회지 / Vol.16 No.2 / pp.109-117 / 2001
  • Judging from studies carried out by Dr. Jo, Yong Jin on Korean faces, Koreans can be divided into two populations according to their facial features and heritage. One is the Northern lineage, whose ancestors migrated from Siberia during the ice age. To survive the cold climate, they developed a high level of metabolic heat production; cold adaptation for preventing heat loss resulted in a reduced facial surface area with small eyes, nose, and lips. The other is the Southern lineage, descended from natives of the Korean peninsula, who have big eyes with double-edged eyelids, a broad nose, and thick lips. It is generally believed that both genetic and environmental factors influence eating behavior. Although we cannot determine the heritage that may contribute to metabolism and eating behavior, we commonly infer physiological heritage from facial features. To investigate the relationships among the size and shape of facial features, eating behavior, and anthropometric measurements in female college students, eating behavior was measured during an instant-noodle lunch eaten in a laboratory setting at an ambient temperature of 23°C. The anterior surface area of the left eye and the length of the right eye were positively correlated with the difference between the peak postprandial and meal-start core temperatures. The surface area of the lower lip was negatively correlated with the meal-start core temperature and meal duration. In addition, the total lip area was positively correlated with the difference between the peak postprandial and meal-start core temperatures, and negatively correlated with meal duration. However, anthropometric measurements were not related to the size of facial features.

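
The correlations reported in the abstract above are plain Pearson coefficients. A minimal pure-Python Pearson r, run on made-up numbers; the lip-area and meal-duration values below are illustrative, not the study's data:

```python
# Pearson correlation coefficient between two equal-length samples.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example mimicking the reported direction of effect:
# larger total lip area, shorter meal duration.
lip_area = [1.0, 1.5, 2.0, 2.5, 3.0]
meal_minutes = [14.0, 13.0, 11.5, 10.0, 9.0]
r = pearson_r(lip_area, meal_minutes)   # strongly negative
```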

적응적 딥러닝 학습 기반 영상 인식 (Image Recognition based on Adaptive Deep Learning)

  • 김진우;이필규
    • 한국인터넷방송통신학회논문지 / Vol.18 No.1 / pp.113-117 / 2018
  • Human emotion is revealed by many factors: speech, behavior, facial expression, attire, and so on. People, however, know how to hide their emotions, so no single cue by itself easily reveals them. To address this problem and grasp a person's emotion more truthfully, we focus on behavior and facial expression, because these are hard to conceal without constant effort and training. This paper proposes an algorithm that incrementally learns human behavior and facial expressions from small amounts of data using deep learning, and estimates a person's emotion by combining the two results. With this algorithm, human emotion can be assessed more comprehensively.
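
The abstract above combines the outputs of two separately trained recognizers (behavior and facial expression) into one emotion estimate. One simple way to do this is weighted late fusion of the two class-probability distributions; the emotion classes, weights, and probabilities below are illustrative, not from the paper:

```python
# Weighted late fusion of two per-class probability dictionaries.

EMOTIONS = ["happy", "sad", "angry"]

def fuse(p_behavior, p_expression, w_behavior=0.4, w_expression=0.6):
    """Combine two classifiers' probability outputs with fixed weights
    and renormalize so the result sums to 1."""
    fused = {
        e: w_behavior * p_behavior[e] + w_expression * p_expression[e]
        for e in EMOTIONS
    }
    z = sum(fused.values())
    return {e: v / z for e, v in fused.items()}

# The expression model is fairly sure about "happy"; the behavior
# model is uncertain. Fusion keeps "happy" on top.
p_beh = {"happy": 0.4, "sad": 0.3, "angry": 0.3}
p_exp = {"happy": 0.7, "sad": 0.2, "angry": 0.1}
fused = fuse(p_beh, p_exp)
best = max(fused, key=fused.get)
```

Late fusion keeps the two models independent, which matches the paper's setup of learning behavior and expression separately and combining the results.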

화장추구이미지에 따른 화장행동의 차이 (Make-up Behavior Differences Depending on Make-up Preference Image)

  • 이현정;김미영
    • 한국의류학회지 / Vol.30 No.5 / pp.800-806 / 2006
  • The purpose of this study was to investigate differences in make-up behavior by make-up preference image, and the relative importance of the make-up behaviors. Questionnaires were given to female residents aged 20 to 45 in Seoul and Kyung-gi province during October 2004, and 322 questionnaires were used for data analysis. The collected data were analyzed with SPSS 10.0 using ANOVA, Duncan tests, and paired t-tests. The make-up behaviors were divided into three main categories: facial make-up, color make-up, and additional make-up behavior. The results showed that skin care was considered the most important make-up behavior; in general, facial make-up was thought most important, followed by color make-up and then additional make-up. Make-up behavior also differed by make-up preference image: the natural-image group considered all make-up behaviors less important than the other preference-image groups, while the modern-image group considered additional make-up behavior more important.
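
The paired t-tests mentioned above compare importance ratings for two make-up categories within the same respondents. A minimal paired t statistic in pure Python, on made-up ratings (a significance decision would still need a t-distribution lookup):

```python
# Paired t statistic: mean of the within-subject differences divided
# by the standard error of those differences.
from math import sqrt

def paired_t(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / sqrt(var / n)

# Hypothetical ratings from six respondents (1-5 importance scale):
facial = [5, 4, 5, 4, 5, 4]   # facial (base) make-up importance
color = [3, 3, 4, 2, 3, 3]    # color make-up importance
t = paired_t(facial, color)   # positive: facial rated higher
```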

피부인식이 세안제 구매행동 및 구매성향에 미치는 영향 (The Effects of Skin Recognition on the Purchasing behavior and Propensity to buy Facial Cleanser)

  • 한유리;김민경;리순화
    • 디지털융복합연구 / Vol.16 No.10 / pp.465-477 / 2018
  • This study investigated the effect of skin perception on facial-cleanser purchasing behavior and purchase propensity in 311 women in their 20s to 50s, using a questionnaire in which skin perception was divided into the subfactors importance, interest, and satisfaction, and purchase propensity into impulse buying, brand dependence, and planned buying. The group with high interest in their skin spent longer washing their face, and the group with low skin knowledge obtained purchase information from people around them, while the high-knowledge group obtained it from the internet. Regarding purchase propensity, women with high skin interest showed strong impulse-buying and planned-buying tendencies, whereas women who rated skin importance highly showed a low impulse-buying tendency. In conclusion, because perceived skin interest and importance affect cleanser purchasing behavior and propensity, it seems necessary to understand customers' skin perception and apply it to facial-cleanser marketing.

An Intelligent Emotion Recognition Model Using Facial and Bodily Expressions

  • Jae Kyeong Kim;Won Kuk Park;Il Young Choi
    • Asia pacific journal of information systems / Vol.27 No.1 / pp.38-53 / 2017
  • As sensor and image-processing technologies make it easy to collect information on users' behavior, many researchers have examined automatic emotion recognition based on facial expressions, body expressions, and tone of voice, among others. In the multimodal case using facial and body expressions, many studies have relied on normal cameras and thus worked with limited information, because normal cameras generally produce only two-dimensional images. In the present research, we propose an artificial neural network-based model that uses a high-definition webcam and a Kinect to recognize users' emotions from facial and bodily expressions while they watch a movie trailer. We validate the proposed model in a naturally occurring field environment rather than in an artificially controlled laboratory environment. The results of this research should be helpful for the wide use of emotion recognition models in advertisements, exhibitions, and interactive shows.
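
The model above feeds facial and bodily features into an artificial neural network. A toy forward pass with early fusion (concatenating the two feature vectors) and hand-picked illustrative weights, not the paper's learned model:

```python
# Two-layer feed-forward pass over concatenated facial + bodily
# features, producing a single "positive emotion" score in (0, 1).
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def forward(face_feats, body_feats, w_hidden, w_out):
    x = face_feats + body_feats           # early fusion: concatenate
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# 2 facial + 2 bodily features -> 2 hidden units -> 1 output score.
# Weights are arbitrary illustrative values, not trained parameters.
w_hidden = [[1.0, -1.0, 0.5, 0.0],
            [0.0, 0.5, -1.0, 1.0]]
w_out = [1.5, -1.5]
score = forward([0.9, 0.1], [0.2, 0.8], w_hidden, w_out)
```

In the actual study the inputs would be features extracted from the webcam and Kinect streams and the weights would be learned from labeled data.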

컨볼루셔널 신경망과 케스케이드 안면 특징점 검출기를 이용한 얼굴의 특징점 분류 (Facial Point Classifier using Convolution Neural Network and Cascade Facial Point Detector)

  • 유제훈;고광은;심귀보
    • 제어로봇시스템학회논문지 / Vol.22 No.3 / pp.241-246 / 2016
  • Nowadays many people are interested in facial expressions and human behavior, and human-robot interaction (HRI) researchers study them using digital image processing, pattern recognition, and machine learning. Facial feature point detection algorithms are very important for face recognition, gaze tracking, and expression and emotion recognition. In this paper, a cascade facial feature point detector is used to find facial feature points such as the eyes, nose, and mouth. However, the detector has difficulty extracting feature points from some images, because images vary in size, color, brightness, etc. We therefore propose an algorithm that augments the cascade facial feature point detector with a convolutional neural network. The structure of the network is based on Yann LeCun's LeNet-5. As input to the network, color and gray outputs from the cascade facial feature point detector were used; the images were resized to 32×32, and the gray images were converted to the YUV format. We then classified about 1,200 test images of subjects. This research found that the proposed method is more accurate than the cascade facial feature point detector alone, because the algorithm refines the detector's results.
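
The classifier above is a LeNet-5-style CNN over 32×32 inputs. Its basic building block, a valid 2-D convolution, can be sketched in pure Python on a tiny image (sizes shrunk for readability; this is not the paper's trained network):

```python
# Valid (no-padding) 2-D convolution of a single-channel image with a
# small kernel, returning the feature map as nested lists.

def conv2d_valid(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge kernel applied to a 4x4 image whose right half is
# bright: the response peaks exactly at the edge column.
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
kernel = [[-1, 1],
          [-1, 1]]
feature_map = conv2d_valid(image, kernel)   # 3x3 output
```

In LeNet-5, stacks of such convolutions (with learned kernels) alternate with subsampling layers before the fully connected classifier.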

주의력결핍 과잉행동장애의 이환 여부에 따른 얼굴표정 정서 인식의 차이 (Difficulty in Facial Emotion Recognition in Children with ADHD)

  • 안나영;이주영;조선미;정영기;신윤미
    • Journal of the Korean Academy of Child and Adolescent Psychiatry / Vol.24 No.2 / pp.83-89 / 2013
  • Objectives : It is known that children with attention-deficit hyperactivity disorder (ADHD) experience significant difficulty in recognizing facial emotion, which involves processing of emotional facial expressions rather than speech, compared to children without ADHD. The objective of this study is to investigate the differences in facial emotion recognition between children with ADHD and normal children used as controls. Methods : The children for our study were recruited from the Suwon Project, a cohort comprising a non-random convenience sample of 117 nine-year-old ethnic Koreans. The parents of the study participants completed study questionnaires such as the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Kiddie-Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version. The Facial Expression Recognition Test of the Emotion Recognition Test was used for the evaluation of facial emotion recognition, and the ADHD Rating Scale was used for the assessment of ADHD. Results : Children with ADHD (N=10) were found to have impaired Emotional Differentiation and Contextual Understanding compared with normal controls (N=24). We found no statistically significant difference in the recognition of positive facial emotions (happy and surprise) and negative facial emotions (anger, sadness, disgust and fear) between the children with ADHD and normal children. Conclusion : The results of our study suggest that facial emotion recognition may be closely associated with ADHD, after controlling for covariates, although more research is needed.

주의력결핍과잉행동장애 아동과 자폐스펙트럼장애 아동에서 얼굴 표정 정서 인식과 구별의 차이 (Difference of Facial Emotion Recognition and Discrimination between Children with Attention-Deficit Hyperactivity Disorder and Autism Spectrum Disorder)

  • 이지선;강나리;김희정;곽영숙
    • Journal of the Korean Academy of Child and Adolescent Psychiatry / Vol.27 No.3 / pp.207-215 / 2016
  • Objectives: This study aimed to investigate the differences in the facial emotion recognition and discrimination ability between children with attention-deficit hyperactivity disorder (ADHD) and autism spectrum disorder (ASD). Methods: Fifty-three children aged 7 to 11 years participated in this study. Among them, 43 were diagnosed with ADHD and 10 with ASD. The parents of the participants completed the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Conners' scale. The participants completed the Korean Wechsler Intelligence Scale for Children-fourth edition, the Advanced Test of Attention (ATA), the Penn Emotion Recognition Task, and the Penn Emotion Discrimination Task. The group differences in facial emotion recognition and discrimination ability were analyzed by using analysis of covariance, controlling for the visual omission error index of the ATA. Results: The children with ADHD showed better recognition of happy and sad faces and fewer false positive neutral responses than those with ASD. The children with ADHD also recognized emotions better than those with ASD on female faces and in extreme facial expressions, but not on male faces or in mild facial expressions. We found no differences in facial emotion discrimination between the children with ADHD and ASD. Conclusion: Our results suggest that children with ADHD recognize facial emotions better than children with ASD, but they still have deficits. Interventions which consider these different emotion recognition and discrimination abilities are needed.
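
Both ADHD studies above compare groups with analysis of covariance. The core of the adjustment can be sketched as regressing the outcome on the covariate and then comparing groups on the residuals; the scores, covariate values, and group labels below are made up for illustration and are not the studies' data:

```python
# Covariate-adjusted group comparison: fit a pooled simple regression
# of the outcome on the covariate, then average residuals per group.

def ols_slope_intercept(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def adjusted_group_means(scores, covariate, groups):
    slope, intercept = ols_slope_intercept(covariate, scores)
    resid = [y - (slope * x + intercept) for x, y in zip(covariate, scores)]
    means = {}
    for g in set(groups):
        vals = [r for r, gg in zip(resid, groups) if gg == g]
        means[g] = sum(vals) / len(vals)
    return means

# Hypothetical emotion-recognition scores, an attention covariate
# (standing in for the ATA omission index), and diagnosis labels.
score = [8, 7, 9, 4, 5, 3]
attention = [6, 5, 7, 3, 4, 2]
group = ["ADHD", "ADHD", "ADHD", "ASD", "ASD", "ASD"]
adj = adjusted_group_means(score, attention, group)
```

A full ANCOVA additionally tests whether the adjusted group difference is significant; this sketch shows only the adjustment step.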