• Title/Abstract/Keywords: Facial Expression Practice

Facial Data Visualization for Improved Deep Learning Based Emotion Recognition

  • Lee, Seung Ho
    • Journal of Information Science Theory and Practice / Vol.7 No.2 / pp.32-39 / 2019
  • A convolutional neural network (CNN) has been widely used in facial expression recognition (FER) because it can automatically learn discriminative appearance features from an expression image. To make full use of its discriminating capability, this paper suggests a simple but effective method for CNN based FER. Specifically, instead of an original expression image that contains facial appearance only, the expression image with facial geometry visualization is used as input to CNN. In this way, geometric and appearance features could be simultaneously learned, making CNN more discriminative for FER. A simple CNN extension is also presented in this paper, aiming to utilize geometric expression change derived from an expression image sequence. Experimental results on two public datasets (CK+ and MMI) show that CNN using facial geometry visualization clearly outperforms the conventional CNN using facial appearance only.
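The paper's core idea — feeding the CNN an expression image with facial geometry drawn on top of the appearance — can be sketched minimally as below. This is an illustrative reconstruction, not the authors' code; it assumes landmark coordinates have already been produced by some external detector:

```python
import numpy as np

def visualize_geometry(face: np.ndarray, landmarks: list[tuple[int, int]],
                       radius: int = 1) -> np.ndarray:
    """Overlay facial landmark points on a grayscale face image.

    The returned image carries both appearance (original pixels) and
    geometry (drawn landmarks), so a CNN trained on it can learn both cues.
    """
    out = face.copy()
    h, w = out.shape
    for x, y in landmarks:
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        out[y0:y1, x0:x1] = 255  # draw a white dot at each landmark
    return out

# toy 48x48 face with three hypothetical landmarks (two eyes, mouth)
face = np.zeros((48, 48), dtype=np.uint8)
marked = visualize_geometry(face, [(15, 18), (32, 18), (24, 35)])
print(int(marked.sum()))  # nonzero: landmarks were drawn onto the copy
```

The marked image replaces the plain appearance image as CNN input, letting geometric and appearance features be learned jointly.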

Artificial Intelligence for Assistance of Facial Expression Practice Using Emotion Classification

  • 김동규;이소화;봉재환
    • 한국전자통신학회논문지 / Vol.17 No.6 / pp.1137-1144 / 2022
  • In this study, an artificial intelligence that assists facial expression practice for expressing emotions was developed. The AI feeds a multimodal input, consisting of a descriptive sentence and a facial expression image, into deep neural networks and outputs the similarity between the emotion predicted from the sentence and the emotion predicted from the image. The user practices a facial expression matching the situation given by the descriptive sentence, and the AI provides feedback by outputting the similarity between the sentence and the user's expression as a numeric score. To predict emotion from facial images, a ResNet34 architecture was trained on the public FER2013 dataset. To predict emotion from descriptive sentences in natural language, a KoBERT model was fine-tuned via transfer learning on AIHub's conversational speech dataset for emotion classification. The network predicting emotion from facial images achieved 65% accuracy, demonstrating human-level emotion classification ability, and the network predicting emotion from descriptive sentences achieved 90% accuracy. The performance of the developed AI was verified through facial expression practice experiments with participants who had no difficulty expressing emotions.
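The feedback score described above can be illustrated with a minimal sketch. The abstract does not specify the similarity measure, so cosine similarity between the two models' emotion-probability vectors is used here as an assumption, with the seven FER2013 emotion classes:

```python
import math

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def cosine_similarity(p: list[float], q: list[float]) -> float:
    """Cosine similarity between two emotion-probability vectors."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm

# hypothetical outputs: text model predicts "happy", image model mostly agrees
p_text  = [0.02, 0.01, 0.02, 0.85, 0.03, 0.05, 0.02]
p_image = [0.05, 0.02, 0.03, 0.70, 0.05, 0.10, 0.05]
score = cosine_similarity(p_text, p_image)
print(f"expression feedback score: {score:.2f}")
```

A score near 1.0 would tell the user their expression matches the emotion implied by the sentence; a low score signals a mismatch to practice on.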

The Effect of Self-Expression on Stress with Clinical Dental Practice among Students in the Department of Dental Hygiene

  • 전주연;이현옥;김진
    • 치위생과학회지 / Vol.7 No.2 / pp.89-96 / 2007
  • The purpose of this study was to examine the relationship between the self-expression level of dental hygiene students, as related to communicative competence, and their stress during clinical practice, and what affected that stress. The subjects were 125 dental hygiene students at W College, surveyed from September 18 through 30, 2006. After the collected data were analyzed with the SPSS WIN 10.0 program, the following findings were acquired: 1. A factor analysis of the students' self-expression yielded three categories: voice/content, facial expression/attitude, and sentiment. The three made a 58.1% prediction of self-expression, and overall reliability was high (Cronbach's α = .881). 2. The students got a mean of 3.58 out of a possible five points in self-expression, indicating that they expressed themselves relatively well. Concerning their general characteristics, those who were inactive during clinical practice got a mean of 3.28, whereas those who were active got 3.85, implying that a more active attitude toward clinical practice went with better self-expression (p < .01). The person with whom they found it hard to get along made a statistically significant difference to self-expression (p < .05): students who did not get along with dental hygienists scored best (3.70), followed by those who did not get along with dentists (3.53), assistant nurses (3.46), and patients/caregivers (3.31). As for desired career, students who hoped to work or study overseas (4.21) surpassed in self-expression those who wanted to be hired at a general hospital, to pursue further education, or to work at a public dental clinic (p < .05). Among the general characteristics, satisfaction with one's major, health status, and motivation for choosing dental hygiene made no statistically significant differences to self-expression. 3. Regarding relations between self-expression level and stress about clinical practice, those who did not express themselves properly in terms of sentiment scored higher in stress (3.65), and stress differed statistically significantly according to self-expression level (p < .05). 4. As for the influence of self-expression and general characteristics on stress with clinical practice, sentiment was selected from among the self-expression categories as a decisive factor affecting stress, with a statistically significant difference (p < .05). In contrast, the demographic variables made no statistically significant difference, and the analysis made a 79.2% prediction of stress.
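The reliability figure quoted above follows the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σσᵢ²/σ²_total). A small self-contained sketch with toy data (not the study's data) shows the computation:

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for k items, each a list of one score per respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs: list[float]) -> float:
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # total scale score per respondent (sum across items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# toy example: three survey items answered by five respondents
items = [[1, 2, 3, 4, 5],
         [1, 2, 3, 4, 5],
         [1, 2, 3, 5, 5]]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")  # close to 1: items are highly consistent
```

Values above roughly .8, like the study's .881, are conventionally read as high internal consistency.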

A Study on the Tradition and Identity of the Bodily Expression (āṅgika abhinaya) System in India: Focusing on the Prescriptions of the Nāṭyaśāstra

  • 허동성
    • 공연문화연구 / No.18 / pp.223-255 / 2009
  • This article examines the tradition and identity of āṅgika abhinaya, the bodily expression technique handed down in traditional Indian theatre and dance, focusing on the prescriptions of the Nāṭyaśāstra, the earliest extant Indian treatise on drama. To this end, it considers the stylistic distinctions of mārga and deśī and of nāṭyadharmī and lokadharmī; the classification of abhinaya; the significance and importance of āṅgika abhinaya; its origin and formation process; its categories and types; and its lineages and regional differences.

Development of a 1:1 Presentation Coaching Application

  • 위승현;문미경
    • 전기전자학회논문지 / Vol.22 No.4 / pp.992-998 / 2018
  • Presentation ability is the skill of conveying one's thoughts and arguments logically and confidently in front of others, and it is essential at school and at work. Improving it, however, requires considerable time, money, and effort. This paper describes the development of a presentation coaching application that analyzes practice presentation videos. The application analyzes presentation duration, the speaker's facial expressions, repeated word use, and so on, and helps improve presentation skills by reporting the analysis results to the user.
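One of the analyses described — flagging repeated (filler) words in the presentation — can be sketched as follows. The app's actual implementation is not given, so this is a hypothetical minimal version operating on a transcript string:

```python
import re
from collections import Counter

def repeated_words(transcript: str, min_count: int = 3) -> dict[str, int]:
    """Count words repeated at least `min_count` times in a transcript,
    the kind of signal a presentation-coaching app can report back."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(words)
    return {w: c for w, c in counts.items() if c >= min_count}

talk = "so today, um, we present our system; um, the system, um, learns fast"
print(repeated_words(talk))  # → {'um': 3}
```

In the real application this text would come from speech recognition on the practice video, combined with the timing and facial-expression analyses mentioned above.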

Scientific review of the aesthetic uses of botulinum toxin type A

  • Park, Mee Young;Ahn, Ki Young
    • 대한두개안면성형외과학회지 / Vol.22 No.1 / pp.1-10 / 2021
  • Botulinum toxin type A (BoNT-A; onabotulinumtoxinA, Botox) was approved by the United States Food and Drug Administration in 2002 for the temporary improvement of glabellar lines in patients 65 years and younger, and has since been used widely for aesthetic purposes such as hyperhidrosis, body shape contouring, and other noninvasive facial procedures. BoNT-A inhibits presynaptic exocytosis of acetylcholine (ACh)-containing vesicles into the neuromuscular junction at cholinergic nerve endings of the peripheral nervous system, thereby paralyzing skeletal muscles. ACh is the most broadly used neurotransmitter in the somatic nervous system, in the preganglionic and postganglionic fibers of parasympathetic nerves, and in the preganglionic fibers and postganglionic sudomotor nerves of sympathetic nerves. The scientific basis for using BoNT-A in various cosmetic procedures is that its function goes beyond the dual role of muscle paralysis and neuromodulation achieved by inhibiting the secretion of ACh. Although the major target organs for aesthetic procedures are facial expression muscles, skeletal body muscles, salivary glands, and sweat glands, which are innervated by the somatic or autonomic nerves of the peripheral cholinergic nervous system, few studies have attempted to directly explain the anatomy of the areas targeted for injection by addressing the neural physiology and rationale for specific aesthetic applications of BoNT-A therapy. In this article, we classify the various cosmetic uses of BoNT-A according to the relevant component of the peripheral nervous system and describe scientific theories regarding the anatomy and physiology of the cholinergic nervous system. We also review critical physiological factors and conditions influencing the efficacy of BoNT-A, toward its rational aesthetic use. We hope that this comprehensive review helps promote management policies that support long-term, safe, successful practice, and we look forward to the development and expansion of new advanced indications for the aesthetic use of BoNT-A in the future.