• Title/Summary/Keyword: Emotional Expression Recognition

102 search results

A Study on Visual Perception based Emotion Recognition using Body-Activity Posture (사용자 행동 자세를 이용한 시각계 기반의 감정 인식 연구)

  • Kim, Jin-Ok
    • The KIPS Transactions: Part B, v.18B no.5, pp.305-314, 2011
  • Research into the visual perception of human emotion for recognizing intention has traditionally focused on facial expressions. Recently, researchers have turned to the more challenging field of emotional expression through body posture or activity. The proposed work approaches recognition of basic emotional categories from body postures using a neural model that applies the visual perception principles of neurophysiology. In keeping with information-processing models of the visual cortex, it constructs a biologically plausible hierarchy of neural detectors that can discriminate six basic emotional states from static views of the associated body postures. The proposed model, which is tolerant to parameter variations, demonstrates its feasibility by evaluation against human test subjects on a set of activity-related body postures.
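
The abstract describes a hierarchy of neural detectors that maps static body postures to six basic emotions. As a rough illustration only (not the authors' biologically detailed model), the minimal Keras sketch below classifies a flattened vector of hypothetical body-joint keypoints through a small stack of increasingly abstract layers; the input size, layer widths, and emotion set are all assumptions.

```python
# Minimal sketch (not the paper's model): a small layered network mapping a
# flattened body-posture keypoint vector to one of 6 basic emotion categories.
import tensorflow as tf

NUM_KEYPOINTS = 18   # hypothetical number of body joints
NUM_EMOTIONS = 6     # e.g. anger, disgust, fear, happiness, sadness, surprise

model = tf.keras.Sequential([
    tf.keras.Input(shape=(NUM_KEYPOINTS * 2,)),                  # (x, y) per joint
    tf.keras.layers.Dense(64, activation="relu"),                # low-level detectors
    tf.keras.layers.Dense(32, activation="relu"),                # mid-level combinations
    tf.keras.layers.Dense(NUM_EMOTIONS, activation="softmax"),   # emotion categories
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```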

Development of a Real-time Facial Expression Recognition Model using Transfer Learning with MobileNet and TensorFlow.js (MobileNet과 TensorFlow.js를 활용한 전이 학습 기반 실시간 얼굴 표정 인식 모델 개발)

  • Cha Jooho
    • Journal of Korea Society of Digital Industry and Information Management, v.19 no.3, pp.245-251, 2023
  • Facial expression recognition plays a significant role in understanding human emotional states. With the advancement of AI and computer vision technologies, extensive research has been conducted in various fields, including improving customer service, medical diagnosis, and assessing learners' understanding in education. In this study, we develop a model that can infer emotions in real-time from a webcam using transfer learning with TensorFlow.js and MobileNet. While existing studies focus on achieving high accuracy using deep learning models, these models often require substantial resources due to their complex structure and computational demands. Consequently, there is a growing interest in developing lightweight deep learning models and transfer learning methods for restricted environments such as web browsers and edge devices. By employing MobileNet as the base model and performing transfer learning, our study develops a deep learning transfer model utilizing JavaScript-based TensorFlow.js, which can predict emotions in real-time using facial input from a webcam. This transfer model provides a foundation for implementing facial expression recognition in resource-constrained environments such as web and mobile applications, enabling its application in various industries.
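
The abstract outlines the standard transfer-learning recipe: keep a pretrained MobileNet as a frozen feature extractor and train only a small classification head on facial expression data. The paper itself does this in TensorFlow.js in the browser; the sketch below shows the same idea in Python/Keras for brevity, with the seven-class emotion set, image size, and dataset pipeline as assumptions.

```python
# Transfer-learning sketch in Python/Keras; the paper uses TensorFlow.js, but the
# idea is the same: freeze a pretrained MobileNet base and train a small head.
import tensorflow as tf

NUM_EMOTIONS = 7   # assumed FER-style set: angry, disgust, fear, happy, sad, surprise, neutral

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                      # keep the pretrained features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_EMOTIONS, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds / val_ds would be tf.data.Dataset objects of (face_image, label) pairs,
# e.g. built with tf.keras.utils.image_dataset_from_directory("faces/")  # hypothetical path
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```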

Difficulty in Facial Emotion Recognition in Children with ADHD (주의력결핍 과잉행동장애의 이환 여부에 따른 얼굴표정 정서 인식의 차이)

  • An, Na Young; Lee, Ju Young; Cho, Sun Mi; Chung, Young Ki; Shin, Yun Mi
    • Journal of the Korean Academy of Child and Adolescent Psychiatry, v.24 no.2, pp.83-89, 2013
  • Objectives: It is known that children with attention-deficit hyperactivity disorder (ADHD) experience significant difficulty in recognizing facial emotion, which involves processing of emotional facial expressions rather than speech, compared to children without ADHD. The objective of this study was to investigate differences in facial emotion recognition between children with ADHD and normal children used as controls. Methods: The children in our study were recruited from the Suwon Project, a cohort comprising a non-random convenience sample of 117 nine-year-old ethnic Koreans. The parents of the study participants completed study questionnaires such as the Korean version of the Child Behavior Checklist, the ADHD Rating Scale, and the Kiddie-Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version. The Facial Expression Recognition Test of the Emotion Recognition Test was used for the evaluation of facial emotion recognition, and the ADHD Rating Scale was used for the assessment of ADHD. Results: Children with ADHD (N=10) were found to have impaired recognition in Emotional Differentiation and Contextual Understanding compared with normal controls (N=24). We found no statistically significant difference in the recognition of positive facial emotions (happiness and surprise) and negative facial emotions (anger, sadness, disgust, and fear) between the children with ADHD and the normal children. Conclusion: The results of our study suggest that facial emotion recognition may be closely associated with ADHD, after controlling for covariates, although more research is needed.

Representation of Dynamic Facial Image Graphic for Multi-Dimensional Data (다차원 데이터의 동적 얼굴 이미지그래픽 표현)

  • 최철재; 최진식; 조규천; 차홍준
    • Journal of the Korea Computer Industry Society, v.2 no.10, pp.1291-1300, 2001
  • This article studies a visualization technique, grounded in human visual perception, based on dynamic graphics that can change in real time by manipulating an image as graphic elements of multi-dimensional data. The key idea in this realization is to map the multi-dimensional data to the feature points of a human face and to the parameter control values obtained from existing image recognition algorithms, synthesize an image, and thereby create a virtual image whose emotional expression changes as the data change. The proposed DyFIG system is implemented as a complete module, and through manipulation and experiments we present a human face graphics module capable of expressing emotion, resulting in a description technique and technology for emotional data expression.
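
The core idea, mapping each dimension of a data record onto a facial control parameter so that a rendered face "expresses" the data, can be sketched as follows. This is an illustration only, not the DyFIG implementation; the parameter names and ranges are hypothetical.

```python
# Illustrative sketch only (not the DyFIG implementation): map each dimension of
# a data record to a facial control parameter, which a renderer could animate.
import numpy as np

# Hypothetical facial control parameters and their value ranges.
FACE_PARAMS = {
    "mouth_curvature": (-1.0, 1.0),   # frown ... smile
    "eye_openness":    (0.1, 1.0),
    "brow_angle":      (-0.5, 0.5),
    "head_tilt":       (-0.3, 0.3),
}

def data_to_face_params(record, data_min, data_max):
    """Linearly rescale each data dimension into its facial parameter range."""
    record = np.asarray(record, dtype=float)
    normalized = (record - data_min) / (data_max - data_min + 1e-9)  # -> [0, 1]
    params = {}
    for value, (name, (lo, hi)) in zip(normalized, FACE_PARAMS.items()):
        params[name] = lo + value * (hi - lo)
    return params

# Example: a 4-dimensional data record mapped to face parameters.
print(data_to_face_params([3.0, 0.8, 12.0, 45.0],
                          data_min=np.array([0, 0, 0, 0]),
                          data_max=np.array([10, 1, 20, 90])))
```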


The Effects of Parent-Adolescent Communication, Emotional Intelligence and Parentification on the Psychological Well-being of Adolescents (청소년의 부모-자녀의사소통, 정서지능 및 부모화경험이 심리적 안녕감에 미치는 영향)

  • Kim, Jung-Min; Lee, Yu-Ri
    • Journal of Families and Better Life, v.28 no.3, pp.13-26, 2010
  • This study investigates the effects of parent-adolescent communication, emotional intelligence, and parentification on the psychological well-being of adolescents. Participants were 712 middle and high school students from Seoul. The collected data were analyzed using Cronbach's α, two-way ANOVA, Pearson's correlations, and stepwise multiple regression. The results are as follows: 1) While parent-adolescent communication, parentification, and psychological well-being differed by grade, emotional intelligence did not. 2) Father-adolescent communication, regulation of emotion, expression of emotion, practice of emotion, recognition of emotion, and mother-adolescent communication were significant predictors of the psychological well-being of middle school students. 3) Recognition of emotion, father-adolescent communication, practice of emotion, regulation of emotion, and mother-adolescent communication were significant predictors of the psychological well-being of high school students. 4) Emotional intelligence played a partially mediating role in the relationship between parent-adolescent communication and psychological well-being.

Robust Facial Expression Recognition using PCA Representation (PCA 표상을 이용한 강인한 얼굴 표정 인식)

  • Shin Young-Suk
    • Korean Journal of Cognitive Science, v.16 no.4, pp.323-331, 2005
  • This paper proposes an improved system for recognizing facial expressions across various internal states that is illumination-invariant and does not require a detectable cue such as a neutral expression. As preprocessing to extract the facial expression information, a whitening step was applied. In the whitening step, the mean of the images is set to zero and the variances are equalized to unit variance, which removes much of the variability due to lighting. After the whitening step, we used facial expression information based on a principal component analysis (PCA) representation that excludes the first principal component. It is therefore possible to extract features from facial expression images without a detectable cue of a neutral expression. From the experimental results, we can also implement varied and natural facial expression recognition, because the recognition is based on a dimensional model of internal states applied to images selected randomly from facial expression images corresponding to 83 internal emotional states.
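
A minimal sketch of the representation the abstract describes: whiten the images via PCA (zero mean, unit-variance components) and drop the first principal component, which mostly captures global lighting, before classification. The image size, component count, and nearest-neighbor classifier are assumptions; random arrays stand in for face data.

```python
# Sketch of a whitened-PCA representation with the first principal component
# dropped; the classifier choice and data shapes are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X_train = rng.random((200, 64 * 64))   # stand-in for flattened face images
y_train = rng.integers(0, 6, 200)      # stand-in emotion labels

pca = PCA(n_components=50, whiten=True)        # zero-mean, unit-variance components
Z_train = pca.fit_transform(X_train)[:, 1:]    # exclude the 1st principal component

clf = KNeighborsClassifier(n_neighbors=3).fit(Z_train, y_train)

X_test = rng.random((5, 64 * 64))
Z_test = pca.transform(X_test)[:, 1:]
print(clf.predict(Z_test))
```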


Risk Situation Recognition Using Facial Expression Recognition of Fear and Surprise Expression (공포와 놀람 표정인식을 이용한 위험상황 인지)

  • Kwak, Nae-Jong; Song, Teuk Seob
    • Journal of the Korea Institute of Information and Communication Engineering, v.19 no.3, pp.523-528, 2015
  • This paper proposes an algorithm for risk situation recognition using facial expressions. The proposed method recognizes the surprise and fear expressions, among the various human emotional expressions, in order to recognize risk situations. It first extracts the facial region from the input image and detects the eye and lip regions within the extracted face. It then applies uniform LBP to each region, discriminates the facial expression, and recognizes the risk situation. The proposed method is evaluated on images from the Cohn-Kanade database. The database contains the six basic human facial expressions: smile, sadness, surprise, anger, disgust, and fear. The proposed method produces good facial expression recognition results and discriminates risk situations well.
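
A minimal sketch of the per-region feature step the abstract describes: compute a uniform LBP histogram for each cropped region (eye, lip) and concatenate the histograms into a feature vector for a downstream classifier. The crop sizes and LBP parameters are assumptions, and random arrays stand in for the detected regions.

```python
# Sketch of the uniform-LBP histogram feature used per facial region (eye, lip);
# the crops below are placeholders and the classifier step is omitted.
import numpy as np
from skimage.feature import local_binary_pattern

P, R = 8, 1                 # 8 neighbors at radius 1
N_BINS = P + 2              # P+1 uniform patterns + one "non-uniform" bin

def lbp_histogram(gray_region):
    """Uniform LBP histogram of a grayscale image region."""
    lbp = local_binary_pattern(gray_region, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=N_BINS, range=(0, N_BINS), density=True)
    return hist

# Placeholder grayscale crops standing in for detected eye and lip regions.
eye_region = np.random.default_rng(1).random((32, 64))
lip_region = np.random.default_rng(2).random((32, 64))

# Feature vector = concatenated per-region histograms, which would be fed to a
# classifier (e.g. an SVM) separating fear/surprise from the other expressions.
features = np.concatenate([lbp_histogram(eye_region), lbp_histogram(lip_region)])
print(features.shape)   # (20,)
```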

Comparative Study of Abused Children and General Children's Emotional Intelligence and Emotion Regulation (학대받은 아동과 일반 아동의 정서지능과 정서조절 비교연구)

  • Choi, Ji-Kyung; Han, You-Jin
    • Journal of Families and Better Life, v.31 no.3, pp.49-62, 2013
  • The purpose of this study was to compare the emotional abilities of abused children and general children by examining their emotional intelligence and emotion regulation. Participants were 17 abused children who had been separated from their abusers and 17 general children, all elementary school students. Responses to the questionnaire items on emotional intelligence and on emotion-regulation situations were analyzed with the Mann-Whitney U test. The results of this study were as follows: First, the difference in emotional intelligence between abused children and general children was statistically significant. Abused children received lower scores than general children in emotional recognition, emotional expression, empathy, and emotional regulation, the sub-domains of emotional intelligence. Second, the difference in emotion regulation strategies between abused children and general children was statistically significant. Abused children presented more negative responses and used positive, inhibitory-avoidance, and alternative strategies less frequently than general children. Third, the difference in emotion regulation motivation between abused children and general children was statistically significant. Abused children presented less prosocial motivation, self-preservation motivation, and normative motivation than general children.

Development of an Emotional Education Program for Young Children (유아용 감성교육 프로그램 개발 연구)

  • Lee, Seung Eun; Lee, Yeung Suk
    • Korean Journal of Child Studies, v.25 no.6, pp.171-189, 2004
  • Children develop emotional intelligence during the early years of life, and according to experts, emotional intelligence (EI) is a more reliable predictor of academic achievement than IQ. However, children nowadays appear to be low in emotional well-being, which has potentially negative consequences not only for academic achievement but also for personal relationships. The purpose of this study was to develop an emotional education program for young children (EEPYC). In this study, EI is defined as the capacity to reason about emotions and to use emotion to enhance thought. Designed to facilitate the development of young children's EI, EEPYC is based on the four-branch model, a mental-ability model of EI, and on the guiding principles of the Collaborative to Advance Social and Emotional Learning. The sub-programs (curricula) that compose EEPYC are emotional perception, appraisal, and expression; self-recognition; self-esteem; emotional stress regulation; and emotional problem solving and conflict resolution. EEPYC has the potential to foster emotional intelligence. Moreover, EEPYC can promote motivation, prosocial activity, and stress regulation, helping young children to develop cognition and emotion in a harmonious fashion.


A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education, v.4 no.2, pp.171-175, 2012
  • Recently, research on emotional ICT (Information and Communication Technology), which recognizes and communicates human emotion through information technology, has been increasing. For instance, there is research on phones and services that perceive users' emotions by detecting voice, facial expression, and biometric data. In short, emotions that could once be read only by humans are now predicted by digital equipment. Among the many areas of emotional ICT research, emotion recognition from the face is expected to provide the most effective and natural human interface. This paper studies emotional ICT and examines the mechanism of a facial expression recognition system as an example of emotional ICT.
