• Title/Abstract/Keyword: Emotional recognition

Search results: 506 items

Discrimination of Emotional States In Voice and Facial Expression

  • Kim, Sung-Ill; Yasunari Yoshitomi; Chung, Hyun-Yeol
    • The Journal of the Acoustical Society of Korea / Vol. 21, No. 2E / pp.98-104 / 2002
  • The present study describes a combination method to recognize human affective states such as anger, happiness, sadness, or surprise. For this, we extracted emotional features from voice signals and facial expressions, and then trained them to recognize emotional states using a hidden Markov model (HMM) and a neural network (NN). For voices, we used prosodic parameters such as pitch signals, energy, and their derivatives, which were then trained by the HMM for recognition. For facial expressions, on the other hand, we used feature parameters extracted from thermal and visible images, and these feature parameters were then trained by the NN for recognition. The recognition rates for the combined parameters obtained from voice and facial expressions showed better performance than either of the two isolated sets of parameters. The simulation results were also compared with human questionnaire results.
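
A minimal sketch of the kind of combination described above, assuming hmmlearn and scikit-learn as stand-in tools: per-emotion HMMs score prosodic sequences, an MLP classifies facial feature vectors, and a simple late-fusion rule (not specified in the abstract) combines the two. Data layouts and the fusion rule are illustrative assumptions.

```python
# Sketch: voice HMMs + face NN, late fusion (illustrative, not the paper's exact pipeline).
import numpy as np
from hmmlearn import hmm                            # per-emotion HMMs on prosodic sequences
from sklearn.neural_network import MLPClassifier    # NN on facial feature vectors

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def train_voice_hmms(voice_data):
    """voice_data: dict emotion -> list of (T_i, 4) arrays per utterance
    [pitch, energy, delta-pitch, delta-energy] per frame (hypothetical layout)."""
    models = {}
    for emo, seqs in voice_data.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        m = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=100)
        m.fit(X, lengths)
        models[emo] = m
    return models

def train_face_nn(face_features, labels):
    """face_features: (n, d) vectors from thermal/visible images; labels: emotion indices."""
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    clf.fit(face_features, labels)
    return clf

def combined_decision(models, face_clf, voice_seq, face_vec):
    """Late fusion: min-max normalize HMM log-likelihoods, add facial class probabilities."""
    v = np.array([models[e].score(voice_seq) for e in EMOTIONS])
    v = (v - v.min()) / (v.max() - v.min() + 1e-9)
    f = face_clf.predict_proba(face_vec.reshape(1, -1))[0]
    return EMOTIONS[int(np.argmax(v + f))]
```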

순서화 로짓모형을 이용한 농협의 선호도 분석: 충남지역 주민을 대상으로 (Analysis of Consumer Preference of Nonghyup by Ordered Logit Model in the Chungnam Province)

  • 우재영
    • 농촌지도와개발 / Vol. 16, No. 2 / pp.405-438 / 2009
  • This study analyses consumer and regional residents' preferences for Nonghyup as influenced by its socio-economic contributions, using an ordered logit model. The survey data were obtained from 225 adults in Chungnam province (cross-sectional data, 2007). The paper estimates the impact of socio-economic characteristics such as sex, occupation, and educational attainment, together with emotional and subjective recognition of Nonghyup's contributions to regional socio-economic and cultural development and social welfare. It also examines the impact of perceived cooperation with local government policy, customer satisfaction ratings, and degree of business ethics. The main results are as follows: consumer and regional residents' preference for Nonghyup is not affected by sex, occupation, or educational attainment. However, it is influenced by emotional and subjective recognition of Nonghyup's contributions to regional socio-economic and cultural development and social welfare, as well as by perceived cooperation with local government policy, customer satisfaction ratings, and degree of business ethics.
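
An illustrative ordered logit specification of the kind used in this study, assuming statsmodels as the tool: an ordered preference rating regressed on respondent characteristics and perception variables. Column names and the DataFrame are hypothetical; categorical predictors are assumed to be already dummy-coded.

```python
# Sketch of an ordered logit fit with statsmodels (hypothetical column names).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_ordered_logit(df: pd.DataFrame):
    # preference: ordered categories, e.g. 1 = very low ... 5 = very high
    endog = pd.Categorical(df["preference"], ordered=True)
    exog = df[[
        "sex", "occupation", "education",           # socio-demographics (dummy-coded)
        "perceived_regional_contribution",           # socio-economic / cultural contribution
        "perceived_policy_cooperation",              # cooperation with local government
        "customer_satisfaction", "business_ethics",
    ]]
    model = OrderedModel(endog, exog, distr="logit")
    return model.fit(method="bfgs", disp=False)

# result.summary() reports the coefficients; a positive coefficient raises the odds
# of a higher preference category, as in the reported findings for perceived
# contribution, policy cooperation, satisfaction, and business ethics.
```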


Emotion Recognition using Facial Thermal Images

  • Eom, Jin-Sup; Sohn, Jin-Hun
    • 대한인간공학회지 / Vol. 31, No. 3 / pp.427-435 / 2012
  • The aim of this study is to investigate facial temperature changes induced by facial expression and emotional state in order to recognize a person's emotion using facial thermal images. Background: Facial thermal images have two advantages over visual images. First, facial temperature measured by a thermal camera does not depend on skin color, darkness, or lighting conditions. Second, facial thermal images change not only with facial expression but also with emotional state. To our knowledge, no study has concurrently investigated these two sources of facial temperature change. Method: 231 students participated in the experiment. Four kinds of stimuli inducing anger, fear, boredom, and a neutral state were presented to participants, and facial temperatures were measured by an infrared camera. Each stimulus consisted of a baseline period and an emotion period; the baseline period lasted 1 min and the emotion period 1~3 min. In the data analysis, the temperature differences between the baseline and the emotional state were analyzed. Eyes, mouth, and glabella were selected as facial expression features, and forehead, nose, and cheeks were selected as emotional state features. Results: The temperatures of the eyes, mouth, glabella, forehead, and nose areas decreased significantly during the emotional experience, and the changes differed significantly by the kind of emotion. Linear discriminant analysis for emotion recognition showed that the correct classification rate across the four emotions was 62.7% when using both facial expression features and emotional state features. The accuracy decreased slightly but significantly to 56.7% when using only facial expression features, and was 40.2% when using only emotional state features. Conclusion: Facial expression features are essential in emotion recognition, but emotional state features are also important for classifying emotions. Application: The results of this study can be applied to human-computer interaction systems in workplaces or automobiles.
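
A minimal sketch of the linear discriminant analysis step described above, assuming scikit-learn: temperature differences (emotion period minus baseline) at the facial regions serve as features for classifying the four induced emotions. The region ordering and feature layout are assumptions, not the paper's exact pipeline.

```python
# Sketch: LDA over per-region temperature changes (illustrative feature layout).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

REGIONS = ["eyes", "mouth", "glabella",      # facial-expression features
           "forehead", "nose", "cheeks"]     # emotional-state features
EMOTIONS = ["anger", "fear", "boredom", "neutral"]

def classify_emotions(delta_temps: np.ndarray, labels: np.ndarray) -> float:
    """delta_temps: (n_participants, 6) temperature changes per region;
    labels: emotion index per participant. Returns cross-validated accuracy."""
    lda = LinearDiscriminantAnalysis()
    return cross_val_score(lda, delta_temps, labels, cv=5).mean()

# Restricting delta_temps to the first three columns (expression features) or the
# last three (state features) mirrors the paper's comparison of the two feature sets.
```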

자연어 처리를 이용한 감정 스트레스 인지 및 관리 챗봇 개발 (A Development of Chatbot for Emotional Stress Recognition and Management using NLP)

  • 박종진
    • 전기학회논문지 / Vol. 67, No. 7 / pp.954-961 / 2018
  • In this paper, a chatbot for emotional stress recognition and management is designed and developed using a rule-based method and NLP to address people's various emotional stresses through a questionnaire. For this, Dialogflow is used as the open chatbot development platform and Facebook Messenger as the chatting platform. Natural and resourceful conversational experiences can be built from predefined questions using the tools of Dialogflow, and the developed chatbot can be used on the Facebook page Messenger. The developed chatbot perceives the user's emotional stress from user input, which is either free text or a choice among predefined answers. It also asks the user questions according to the user's feelings, assesses the strength of the emotional stress, and provides a solution to the user. Further research can improve the developed chatbot by using an open Korean NLP library and a database of emotions and stresses.
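
A minimal sketch of the Dialogflow detect-intent call such a chatbot relies on, using the google-cloud-dialogflow client. The project ID, intent names, and stress-scoring rule are hypothetical; the Facebook Messenger webhook and the paper's questionnaire flow are not reproduced here.

```python
# Sketch: send one user utterance to a Dialogflow agent and read the matched intent.
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

def detect_stress_intent(project_id: str, session_id: str, user_text: str,
                         language_code: str = "ko"):
    """Returns the matched intent name and the agent's predefined reply text."""
    client = dialogflow.SessionsClient()
    session = client.session_path(project_id, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=user_text, language_code=language_code))
    response = client.detect_intent(
        request={"session": session, "query_input": query_input})
    result = response.query_result
    return result.intent.display_name, result.fulfillment_text

# A rule-based layer could then map matched intents (e.g. "stress.work.high",
# hypothetical) to a stress-strength score and a suggested coping strategy.
```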

우뇌 기능을 활용한 정서조절 미술수업의 효과성 (Effectiveness of emotional regulation art class using right brain function)

  • 김희주; 허윤정
    • 한국융합학회논문지 / Vol. 12, No. 6 / pp.119-125 / 2021
  • Elementary school students are developmentally immature in the area of emotional regulation, so cultivating emotional regulation ability is necessary. Because integrating emotional regulation into art classes can be expected to have a positive effect on students' character development, this study conducted emotional regulation art classes that used right-brain functions to promote emotional regulation, and analyzed their effectiveness through pre- and post-questionnaires and post-interviews. The pre-post analysis showed statistically significant improvements after the classes in the emotional regulation sub-factors "recognition and expression of one's own emotions", "recognition of and consideration for others' emotions", and "interpersonal relationships". The interview analysis showed that all students experienced positive effects on the emotional regulation sub-items. In conclusion, after the classes the students recognized and understood their own emotions better than before, and were able to express their emotions by transforming negative emotions into positive ones. It is suggested that programs applying various teaching and learning methods for emotional regulation art classes be developed in the future.

Emotional Recognition of speech signal using Recurrent Neural Network

  • Park, Chang-Hyun; Sim, Kwee-Bo
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2002년도 ICCAS / pp.81.2-81 / 2002
  • Outline: Introduction (concept and meaning of emotional recognition); features of the four emotions; pitch (approach); simulator (structure, RNN learning algorithm, evaluation function, and solution search method); results
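
A small recurrent classifier of the kind outlined above, assuming PyTorch: a pitch (and energy) contour per utterance is fed to an RNN whose final hidden state is mapped to one of four emotions. This is a generic sketch, not the paper's specific network or learning algorithm.

```python
# Sketch: RNN over per-frame prosodic features -> 4-way emotion logits.
import torch
import torch.nn as nn

class PitchEmotionRNN(nn.Module):
    def __init__(self, n_features: int = 2, hidden: int = 32, n_emotions: int = 4):
        super().__init__()
        self.rnn = nn.RNN(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_emotions)

    def forward(self, x):                 # x: (batch, time, n_features)
        _, h_n = self.rnn(x)              # h_n: (1, batch, hidden)
        return self.out(h_n.squeeze(0))   # emotion logits

# Usage: logits = PitchEmotionRNN()(torch.randn(8, 120, 2))  # 8 utterances, 120 frames each
```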


A Study on Image Recommendation System based on Speech Emotion Information

  • Kim, Tae Yeun; Bae, Sang Hyun
    • 통합자연과학논문집 / Vol. 11, No. 3 / pp.131-138 / 2018
  • In this paper, we implement an image matching and recommendation system that utilizes the emotional information in the user's speech. To classify the emotional information of speech, emotional features are extracted from the user's speech and classified using the PLP algorithm, and a speech emotion database is then constructed. In addition, emotional colors and emotional vocabulary are mapped to a single space through factor analysis in order to classify the emotional information of images. A standardized image recommendation system then matches each keyword using the BM-GA algorithm, combining the speech emotion data and the image emotion data to select images appropriate to the emotional information in the user's speech. In the performance evaluation, the recognition rate of the standardized vocabulary across the four stages according to speech was 80.48% on average, and user satisfaction with the system was 82.4%. Therefore, classification of images according to the user's speech information is expected to be helpful for the study of emotional exchange between users and computers.
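
One simple way to realize the "single emotion space" matching described above, shown here as a sketch: the recognized speech emotion and each candidate image are represented as vectors over the same emotional-vocabulary axes, and images are ranked by cosine similarity. The axes, vectors, and ranking rule are illustrative; this is not the paper's BM-GA procedure.

```python
# Sketch: rank images by similarity of their emotion vector to the speech emotion vector.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def recommend_images(speech_emotion_vec: np.ndarray,
                     image_emotion_vecs: dict,
                     top_k: int = 3):
    """image_emotion_vecs: {image_name: emotion vector}. Returns the top_k image names."""
    scored = sorted(image_emotion_vecs.items(),
                    key=lambda kv: cosine(speech_emotion_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_k]]
```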

인간의 감정 인식을 위한 신경회로망 기반의 휴먼과 컴퓨터 인터페이스 구현 (Implementation of Human and Computer Interface for Detecting Human Emotion Using Neural Network)

  • 조기호; 최호진; 정슬
    • 제어로봇시스템학회논문지 / Vol. 13, No. 9 / pp.825-831 / 2007
  • In this paper, an interface between a human and a computer is presented. The human-computer interface (HCI) serves as another area of human-machine interfaces. The methods used for the HCI are voice recognition and image recognition for detecting the human's emotional state. The idea is that the computer can recognize the present emotional state of the human operator and amuse him/her in various ways, such as playing music, searching the web, and talking. For the image recognition process, the human face is captured, and the eyes and mouth are selected from the facial image for recognition. To train the mouth images, we use the Hopfield net. The results show 88%~92% recognition of emotion. For vocal recognition, the neural network shows 80%~98% recognition of voice.
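
A bare-bones Hopfield network of the kind used for the mouth images above: stored patterns are bipolar (+1/-1) vectors, weights come from the Hebbian outer-product rule, and recall iterates a sign update until the state stops changing. Image size and patterns are hypothetical; the paper's full pipeline (face capture, eye/mouth extraction, voice NN) is not shown.

```python
# Sketch: Hebbian Hopfield network for recalling stored binary (bipolar) patterns.
import numpy as np

class HopfieldNet:
    def __init__(self, n_units: int):
        self.W = np.zeros((n_units, n_units))

    def store(self, patterns: np.ndarray):
        """patterns: (n_patterns, n_units) with entries in {+1, -1}."""
        for p in patterns:
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0)          # no self-connections
        self.W /= patterns.shape[0]

    def recall(self, state: np.ndarray, max_iters: int = 50) -> np.ndarray:
        """Synchronous sign updates from a noisy probe until a fixed point (or max_iters)."""
        s = state.copy()
        for _ in range(max_iters):
            new_s = np.sign(self.W @ s)
            new_s[new_s == 0] = 1
            if np.array_equal(new_s, s):
                break
            s = new_s
        return s
```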

의료기관 종사자의 라이프케어 감정노동과 진료비 삭감 인식도가 삭감률에 미치는 영향 (The Impacts of Emotional Labor and The Recognition Level of Medical Service Fee Reduction of Medical Institution Workers Influencing Reduction Rate)

  • 양유정; 이혜승
    • 한국엔터테인먼트산업학회논문지 / Vol. 14, No. 8 / pp.345-352 / 2020
  • This study surveyed 414 workers at medical institutions in Korea to examine how the emotional labor of long-term care institution workers and their level of recognition of medical service fee reduction affect the reduction rate, with the following results. First, examining differences in reduction rates by socio-demographic characteristics, the inpatient and outpatient reduction rates differed significantly by work type, length of service at the current hospital, and number of licensed beds. Second, correlation analysis of emotional labor, recognition of fee reduction, and reduction rates showed a significant positive correlation between emotional labor and the outpatient reduction rate, a significant negative correlation between recognition of fee reduction and the inpatient reduction rate, and a significant negative correlation between recognition of fee reduction and the outpatient reduction rate. Third, emotional labor had a significant positive effect on the inpatient reduction rate, while recognition of fee reduction had a significant negative effect on it; emotional labor also had a significant positive effect on the outpatient reduction rate, and recognition of fee reduction had a significant negative effect on it.
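
An illustrative regression of the kind reported above, assuming statsmodels: the inpatient or outpatient reduction rate regressed on emotional labor and recognition of fee reduction. The DataFrame and column names are hypothetical.

```python
# Sketch: OLS regression of a reduction rate on emotional labor and reduction recognition.
import pandas as pd
import statsmodels.formula.api as smf

def fit_reduction_model(df: pd.DataFrame, outcome: str = "outpatient_reduction_rate"):
    # Expected signs per the reported findings: emotional_labor positive,
    # reduction_recognition negative.
    model = smf.ols(f"{outcome} ~ emotional_labor + reduction_recognition", data=df)
    return model.fit()
```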

Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min; Tang, Jun
    • Journal of Information Processing Systems / Vol. 17, No. 4 / pp.754-771 / 2021
  • In the task of continuous dimensional emotion recognition, the parts that highlight emotional expression are not the same in each modality, and the influence of different modalities on the emotional state also differs. Therefore, this paper studies the fusion of the two most important modalities in emotion recognition (voice and visual expression) and proposes a dual-modal emotion recognition method that combines an attention mechanism with an improved AlexNet network. After simple preprocessing of the audio signal and the video signal, respectively, the first step is to use prior knowledge to extract the audio features. Then, facial expression features are extracted by the improved AlexNet network. Finally, a multimodal attention mechanism is used to fuse the facial expression features and the audio features, and an improved loss function is used to mitigate the modality-missing problem, so as to improve the robustness of the model and the performance of emotion recognition. The experimental results show that the concordance correlation coefficients of the proposed model in the arousal and valence dimensions were 0.729 and 0.718, respectively, which are superior to those of several comparison algorithms.
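
For reference, the concordance correlation coefficient (CCC) used to report the arousal and valence results above is a standard metric; a minimal NumPy implementation, computed per dimension between predicted and ground-truth continuous labels, is sketched below.

```python
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
import numpy as np

def ccc(pred: np.ndarray, true: np.ndarray) -> float:
    """Concordance correlation coefficient between two 1-D label sequences."""
    mp, mt = pred.mean(), true.mean()
    vp, vt = pred.var(), true.var()
    cov = ((pred - mp) * (true - mt)).mean()
    return float(2 * cov / (vp + vt + (mp - mt) ** 2))
```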