• Title/Summary/Keyword: Facial development


A Simple Way to Find Face Direction (간단한 얼굴 방향성 검출방법)

  • Park Ji-Sook;Ohm Seong-Yong;Jo Hyun-Hee;Chung Min-Gyo
    • Journal of Korea Multimedia Society / v.9 no.2 / pp.234-243 / 2006
  • The recent rapid development of HCI and surveillance technologies has generated great interest in application systems that process faces. Much of the research effort on these systems has focused on areas such as face recognition, facial expression analysis, and facial feature extraction; however, relatively few approaches to face direction detection have been reported. This paper proposes a method to detect the direction of a face using a facial feature called the facial triangle, which is formed by the two eyebrows and the lower lip. Specifically, based on a single monocular view of the face, the proposed method introduces very simple formulas to estimate the horizontal or vertical rotation angle of the face. The horizontal rotation angle is calculated from the ratio between the areas of the left and right facial triangles, while the vertical angle is obtained from the ratio between the base and height of the facial triangle (a minimal sketch of these ratio computations follows this entry). Experimental results showed that the method obtains the horizontal angle within an error tolerance of ±1.68°, and that it performs better as the magnitude of the vertical rotation angle increases.

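The abstract above reduces face direction to two ratios taken from the facial triangle (the two eyebrow points and the lower lip). The sketch below is a minimal, hypothetical reading of that idea: it splits the facial triangle at the foot of the perpendicular dropped from the lower lip onto the eyebrow line to obtain "left" and "right" sub-triangles, and returns the left/right area ratio and the base/height ratio. The paper's exact construction, and the simple formulas that map these ratios to rotation angles, are not reproduced here because the abstract does not state them.

```python
import numpy as np

def triangle_area(p1, p2, p3):
    """Area of a 2-D triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))

def facial_triangle_ratios(left_brow, right_brow, lower_lip):
    """Return (left/right area ratio, base/height ratio) of the facial triangle.

    The split point between the 'left' and 'right' sub-triangles is taken as
    the projection of the lower lip onto the eyebrow line -- an assumption,
    since the paper's exact construction is not given in the abstract.
    """
    a = np.asarray(left_brow, dtype=float)
    b = np.asarray(right_brow, dtype=float)
    lip = np.asarray(lower_lip, dtype=float)

    base_vec = b - a
    base = np.linalg.norm(base_vec)

    # Foot of the perpendicular from the lower lip onto the eyebrow line.
    t = np.dot(lip - a, base_vec) / base**2
    foot = a + t * base_vec

    left_area = triangle_area(a, foot, lip)
    right_area = triangle_area(foot, b, lip)

    # Height of the whole facial triangle (lip-to-eyebrow-line distance).
    height = np.linalg.norm(lip - foot)

    area_ratio = left_area / right_area   # paper uses this for the horizontal angle
    base_height_ratio = base / height     # paper uses this for the vertical angle
    return area_ratio, base_height_ratio
```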

STUDIES ON THE SKIN TROUBLE AND THE FACIAL COLOR CHANGE DUE TO HORMONAL CYCLE IN FEMALE

  • Lee, Kun-Kook;Shin, Lee-Young;Gung, Ju-Nam;Kim, Jung-Hang
    • Journal of the Society of Cosmetic Scientists of Korea / v.22 no.2 / pp.141-152 / 1996
  • Many East Asian women are concerned about the condition and color of their skin. The purpose of the present study is to classify skin trouble and facial color change due to the hormonal cycle in women. We surveyed the actual circumstances by questionnaire, performed patch tests with methyl nicotinate, a representative rubefacient, to estimate the epidermal penetration rate, and measured facial color change over the menstrual cycle to investigate the correlation between cosmetic-related skin trouble and facial color change. Fifty-two percent of subjects had skin trouble related to cosmetics, and half of these subjects complained that their symptoms changed with the menstrual cycle, mainly in the premenstrual period. Skin trouble developed chiefly in the first trimester of pregnancy. In the methyl nicotinate patch test, most cases showed a decreased reaction threshold during menstruation, and the remaining cases showed increased skin reactivity during menstruation. The facial color measurements showed the appearance of red spots, darkening, an increase in value, and a shift of hue toward yellow; premenstrually, hue turned red and value decreased, while during the period the facial color turned pale and hue progressed toward yellow. These measurements coincide with the questionnaire results, which, once quantified, demonstrate that among internal environmental factors the hormonal cycle influences facial color change and cosmetic-related skin trouble. These findings suggest that more finely segmented make-up and skin care products should be developed to meet the interest of female consumers.


Does risk of obstructive sleep apnea have interaction with chronic facial pain?

  • Kang, Jeong-Hyun;Lee, Jeong Keun
    • Journal of the Korean Association of Oral and Maxillofacial Surgeons / v.48 no.5 / pp.277-283 / 2022
  • Objectives: The main purpose of the present study was to investigate the associations between the risk of obstructive sleep apnea (OSA) and chronic orofacial pain in a nationally representative sample of the Korean population. Materials and Methods: Data from the 8th wave of the Korea National Health and Nutrition Examination Survey, conducted from 2019 to 2020, were analyzed. This study included 5,780 Koreans (2,503 males, 3,277 females) over 40 years of age. The presence of subjective chronic facial pain lasting more than 3 months was evaluated based on a self-reported questionnaire. The risk of OSA was determined using the STOP-BANG questionnaire (a minimal scoring sketch follows this entry). Data related to anthropometric and sociodemographic factors; diagnostic history of hypertension, depression, and OSA; level of health-related quality of life and stress awareness; health-related behaviors, including smoking and alcohol drinking; and sleep duration were collected. The participants were classified into two groups according to the presence of chronic facial pain. Results: The level of health-related quality of life and stress awareness showed significant differences between the two groups, as did sleep duration on weekends. No significant differences were observed in the presence of snoring and observed apnea, while participants with chronic facial pain reported significantly higher levels of tiredness. The risk of OSA evaluated by the STOP-BANG questionnaire showed significant differences between groups; however, the risk of OSA appeared to be higher in participants without chronic facial pain. Conclusion: The participants with chronic facial pain demonstrated decreased sleep duration, lower health-related quality of life, and increased stress and tiredness. Although the role of OSA in the development of chronic facial pain remains inconclusive from this study, it is possible that ethnicity plays a role in the relationship between OSA and chronic facial pain.
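
For reference, the STOP-BANG instrument mentioned above scores eight yes/no items (Snoring, Tiredness, Observed apnea, blood Pressure, BMI, Age, Neck circumference, Gender). The minimal scoring sketch below uses the commonly cited cut-offs (BMI > 35 kg/m², age > 50 years, neck circumference > 40 cm; score 0-2 low, 3-4 intermediate, 5-8 high risk); the exact thresholds and risk categories applied in the paper may differ.

```python
def stop_bang_score(snoring, tiredness, observed_apnea, high_blood_pressure,
                    bmi, age, neck_circumference_cm, male):
    """Return the STOP-BANG score (0-8) and a coarse OSA-risk label.

    Thresholds follow the commonly cited version of the questionnaire;
    the cut-offs used in the paper may differ.
    """
    items = [
        bool(snoring),               # S: loud snoring
        bool(tiredness),             # T: daytime tiredness
        bool(observed_apnea),        # O: observed pauses in breathing
        bool(high_blood_pressure),   # P: treated or diagnosed hypertension
        bmi > 35,                    # B: body-mass index above 35 kg/m^2
        age > 50,                    # A: age above 50 years
        neck_circumference_cm > 40,  # N: neck circumference above 40 cm
        bool(male),                  # G: male gender
    ]
    score = sum(items)
    risk = "low" if score <= 2 else "intermediate" if score <= 4 else "high"
    return score, risk
```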

Development of Character Input System using Facial Muscle Signal and Minimum List Keyboard (안면근 신호를 이용한 최소 자판 문자 입력 시스템의 개발)

  • Kim, Hong-Hyun;Kim, Eung-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.14 no.6 / pp.1338-1344 / 2010
  • People communicate with each other using language, but a person with a severe disability may be unable to convey ideas through writing or gestures. In this paper, we therefore implemented a communication system based on facial muscle signals so that such users can communicate. Specifically, after extracting features from the EEG recording that includes facial muscle activity, the facial muscle signal is converted into a control signal, which is then used to select characters for communication on a minimum-list keyboard (an illustrative signal-thresholding sketch follows this entry).
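
The abstract does not detail the feature extraction, so the sketch below is only an illustrative analog: it rectifies and smooths a one-dimensional facial-muscle signal, thresholds the envelope to detect activation events, and debounces them; such events could then drive scanning and selection on a minimum-list keyboard. The threshold, window length, and keyboard mapping are all hypothetical.

```python
import numpy as np

def detect_muscle_activations(signal, fs, threshold, min_gap_s=0.3):
    """Detect facial-muscle activation events in a raw 1-D signal.

    A simple envelope (rectify + moving average) is thresholded; this is an
    illustrative stand-in for the feature extraction described in the paper.
    """
    window = max(1, int(0.05 * fs))  # 50 ms smoothing window (assumed)
    envelope = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")
    above = envelope > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising edges

    # Debounce: drop onsets closer than min_gap_s to the previous one.
    events, last = [], -np.inf
    for idx in onsets:
        if (idx - last) / fs >= min_gap_s:
            events.append(idx)
            last = idx
    return events

# Hypothetical use on the keyboard side: one activation advances the
# highlighted key group, two quick activations select the current key.
```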

Atayal Facial Tattoo Patterns and Traditional Costumes in Taiwan (대만 태아족(泰雅族)의 경면문양(黥面紋樣)과 전통복식)

  • Cui, Yu-Hua;Park, Ga-Young
    • Journal of the Korea Fashion and Costume Design Association / v.12 no.4 / pp.89-102 / 2010
  • The Atayal studied in this paper are one of the indigenous tribes of Taiwan and are gradually receiving considerable publicity. The Atayal have quite unique traditional dress and the custom of facial tattooing. The study was limited to the traditional culture of body adornment of the Atayal, including clothing that is now poorly preserved and rarely practiced, and the origin and historical development of these practices fall only briefly within its scope. Through this study, we can learn about the tribe's cultural background, such as its way of life, customs, and religion, and their influence on traditional costumes. As for the research method, I examined the Atayal's traditional costumes and clothing through related books, magazines, research papers, internet sites, and other sources. I also examined the common ground between facial tattoo patterns and clothing using reference books and the official web-site. Traditional clothing materials, basic forms of dress, and the patterns and techniques of facial tattooing were examined in order to deepen appreciation of the cultural heritage of the Atayal. In this way, I hope this study will contribute to the Korean fashion industry as it seeks to enter the Taiwanese market.


Integral Regression Network for Facial Landmark Detection (얼굴 특징점 검출을 위한 적분 회귀 네트워크)

  • Kim, Do Yeop;Chang, Ju Yong
    • Journal of Broadcast Engineering / v.24 no.4 / pp.564-572 / 2019
  • With the development of deep learning, the performance of facial landmark detection methods has been greatly improved. The heat map regression method, which is a representative facial landmark detection method, is widely used as an efficient and robust method. However, the landmark coordinates cannot be directly obtained through a single network, and the accuracy is reduced in determining the landmark coordinates from the heat map. To solve these problems, we propose to combine integral regression with the existing heat map regression method. Through experiments using various datasets, we show that the proposed integral regression network significantly improves the performance of facial landmark detection.
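
Integral regression, as used in the paper, replaces the hard argmax over a heatmap with a differentiable expectation, so landmark coordinates can be supervised directly. A minimal NumPy sketch of that soft-argmax step is given below; the network that produces the heatmaps is not reproduced, and `beta` is a hypothetical temperature parameter.

```python
import numpy as np

def soft_argmax_2d(heatmap, beta=1.0):
    """Integral (soft-argmax) coordinates from a single 2-D heatmap.

    The heatmap is softmax-normalized and the (x, y) coordinate is the
    expected location under that distribution, so the operation is
    differentiable, unlike a hard argmax.
    """
    h, w = heatmap.shape
    flat = heatmap.reshape(-1) * beta
    probs = np.exp(flat - flat.max())     # numerically stable softmax
    probs /= probs.sum()
    probs = probs.reshape(h, w)

    xs = np.arange(w, dtype=np.float64)
    ys = np.arange(h, dtype=np.float64)
    x = (probs.sum(axis=0) * xs).sum()    # expectation over columns
    y = (probs.sum(axis=1) * ys).sum()    # expectation over rows
    return x, y
```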

Development of a Real-time Facial Expression Recognition Model using Transfer Learning with MobileNet and TensorFlow.js (MobileNet과 TensorFlow.js를 활용한 전이 학습 기반 실시간 얼굴 표정 인식 모델 개발)

  • Cha Jooho
    • Journal of Korea Society of Digital Industry and Information Management / v.19 no.3 / pp.245-251 / 2023
  • Facial expression recognition plays a significant role in understanding human emotional states. With the advancement of AI and computer vision technologies, extensive research has been conducted in various fields, including improving customer service, medical diagnosis, and assessing learners' understanding in education. In this study, we develop a model that can infer emotions in real-time from a webcam using transfer learning with TensorFlow.js and MobileNet. While existing studies focus on achieving high accuracy using deep learning models, these models often require substantial resources due to their complex structure and computational demands. Consequently, there is a growing interest in developing lightweight deep learning models and transfer learning methods for restricted environments such as web browsers and edge devices. By employing MobileNet as the base model and performing transfer learning, our study develops a deep learning transfer model utilizing JavaScript-based TensorFlow.js, which can predict emotions in real-time using facial input from a webcam. This transfer model provides a foundation for implementing facial expression recognition in resource-constrained environments such as web and mobile applications, enabling its application in various industries.
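
The paper builds its transfer model in JavaScript with TensorFlow.js; the sketch below is a Python/Keras analog of the same pattern, not the authors' code: a frozen ImageNet-pretrained MobileNet backbone with a small trainable classification head. The class count and head layers are hypothetical.

```python
import tensorflow as tf

NUM_CLASSES = 7  # e.g. seven basic facial expressions (hypothetical)

# Pretrained MobileNet backbone without its ImageNet classification head.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for transfer learning

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```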

The Comparison of Influence of Difficulties in Nasal Breathing on Dentition between Different Facial Types (비호흡 장애가 치열에 미치는 영향에 관한 안모 형태별 비교 연구)

  • Lee, Myeong-Jin;Lee, Chang-Kon;Kim, Jong-Sup;Park, Jin-Ho;Chin, Byung-Rho;Lee, Hee-Kyung
    • Journal of Yeungnam Medical Science / v.10 no.1 / pp.37-47 / 1993
  • It is commonly assumed that nasorespiratory function can exert a dramatic effect on the development of the dentofacial complex. In particular, it has been stated that chronic nasal obstruction leads to mouth breathing, which causes altered tongue and mandibular positions. If this occurs during a period of active growth, the outcome is development of the "adenoid facies". Such patients characteristically manifest a vertically long lower third facial height, narrow alar bases, lip incompetence, a long and narrow maxillary arch, and a greater than normal mandibular plane angle. However, several authors have reported that the so-called adenoid facies is not always associated with adenoids and mouth breathing, and that a particular type of dentition is not always found in mouth breathers with or without adenoids. Some authors believe that adenoids lead to mouth breathing in cases with particular facial characteristics and types of dentition. We assumed that individuals vary in how well their neuromuscular complex can adapt, and therefore compared the influence of mouth breathing among children with different facial types. This study included 60 patients, divided into three groups by Ricketts' facial type, whose dentition and tongue position were compared. The results are as follows. 1. There is a significant difference in upper molar arch width between the facial types; in particular, dolichofacial patients have the narrowest arch width. 2. There is a significant difference in tongue position between the facial types; in particular, dolichofacial patients have the lowest-positioned tongue.


Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.4 / pp.562-567 / 2009
  • As a key mechanism of human emotional interaction, facial expression is a powerful tool in HRI (Human-Robot Interface), as it is in HCI (Human-Computer Interaction). By using facial expressions, a system can produce reactions appropriate to the emotional state of the user, and service agents such as intelligent robots can infer which services to provide. In this article, we address the issue of expressive face modeling using an advanced active appearance model for facial emotion recognition. We consider the six universal emotional categories defined by Ekman. In the human face, emotions are expressed most prominently through the eyes and mouth, so recognizing emotion from a facial image requires extracting feature points such as Ekman's Action Units (AUs). The Active Appearance Model (AAM) is one of the most commonly used methods for facial feature extraction and can be applied to construct AUs. Because the traditional AAM depends on the setting of the initial model parameters, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian Network. First, we obtain the reconstructive parameters of a new gray-scale image by sample-based learning, use them to reconstruct the shape and texture of the new image, and calculate the initial AAM parameters from the reconstructed facial model. We then reduce the distance error between the model and the target contour by adjusting the model parameters. Finally, after several iterations we obtain a model matched to the facial feature outline and use it to recognize the facial emotion with the Bayesian Network (an illustrative classification sketch follows this entry).
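
The abstract does not specify the structure of the Bayesian Network, so the sketch below substitutes the simplest member of that family, a naive Bayes classifier over binary Action Unit activations, purely as an illustration of the final classification step. The paper's actual network is presumably richer, and the AU features here are assumed to come from the fitted AAM.

```python
import numpy as np

class NaiveBayesAU:
    """Naive-Bayes stand-in for the paper's Bayesian Network: each binary
    Action Unit (AU) feature is treated as conditionally independent given
    the emotion class (e.g. Ekman's six categories)."""

    def fit(self, X, y, alpha=1.0):
        # X: (n_samples, n_aus) 0/1 AU activations; y: integer emotion labels.
        self.classes_ = np.unique(y)
        self.prior_ = np.array([(y == c).mean() for c in self.classes_])
        # Laplace-smoothed estimate of P(AU = 1 | class).
        self.p_au_ = np.array([
            (X[y == c].sum(axis=0) + alpha) / ((y == c).sum() + 2 * alpha)
            for c in self.classes_])
        return self

    def predict(self, X):
        # Log-posterior up to a constant: log prior + sum of Bernoulli log-likelihoods.
        log_p = np.log(self.prior_) + (
            X @ np.log(self.p_au_.T) + (1 - X) @ np.log(1 - self.p_au_.T))
        return self.classes_[np.argmax(log_p, axis=1)]
```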

Differences in the mandibular premolar positions in Angle Class I subjects with different vertical facial types: A cone-beam computed tomography study

  • Duan, Jun;Deng, Feng;Li, Wan-Shan;Li, Xue-Lei;Zheng, Lei-Lei;Li, Gui-Yuan;Bai, Yan-Jie
    • The Korean Journal of Orthodontics / v.45 no.4 / pp.180-189 / 2015
  • Objective: To compare the positions of the mandibular premolars in Angle Class I subjects according to vertical facial type. The results will provide a theoretical basis for predicting effective tooth movement in orthodontic treatment. Methods: Cephalometric parameters were determined using cone-beam computed tomography in 120 Angle Class I subjects. Subjects were categorized as short, normal, and long face types according to the Frankfort mandibular angle. Parameters indicating the position of the mandibular right premolars and the mandible were also measured. Results: The angle between the mandibular first premolar axis and the buccal cortex, the distance between the root apex and the buccal cortex, the angle of vestibularization, the arc of vestibularization, and the root apex maximum movable distance were significantly greater in the short face type than in the long and normal face types. The angle between the mandibular second premolar axis and the buccal cortex, the distance from the root apex to the buccal cortex, and the arc of vestibularization were significantly greater in the short face type than in the normal face type. Conclusions: There are significant differences in the mandibular premolar positions in Class I subjects according to vertical facial type.