• Title/Summary/Keyword: emotional recognition

Discrimination of Emotional States In Voice and Facial Expression

  • Kim, Sung-Ill;Yasunari Yoshitomi;Chung, Hyun-Yeol
    • The Journal of the Acoustical Society of Korea
    • /
    • v.21 no.2E
    • /
    • pp.98-104
    • /
    • 2002
  • The present study describes a combination method for recognizing human affective states such as anger, happiness, sadness, and surprise. We extracted emotional features from voice signals and facial expressions, and trained models to recognize emotional states using a hidden Markov model (HMM) and a neural network (NN). For voice, we used prosodic parameters such as pitch, energy, and their derivatives, which were modeled by the HMM for recognition. For facial expressions, on the other hand, we used feature parameters extracted from thermal and visible images, which were trained by the NN. The recognition rates for the combined voice and facial-expression parameters outperformed either set of parameters in isolation. The simulation results were also compared with human questionnaire results.
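A minimal late-fusion sketch of the combination idea this abstract describes, assuming each modality has already been scored separately; the per-emotion HMM log-likelihoods, the NN softmax outputs, and the weighting below are hypothetical stand-ins, not the authors' trained models:

```python
import numpy as np

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

def fuse_and_classify(voice_loglik, face_logprob, w_voice=0.5):
    """Late fusion: weighted sum of per-emotion log scores from each modality.

    voice_loglik -- log-likelihood of the utterance under each emotion's HMM
    face_logprob -- log of the face network's softmax output per emotion
    """
    combined = w_voice * np.asarray(voice_loglik) + (1 - w_voice) * np.asarray(face_logprob)
    return EMOTIONS[int(np.argmax(combined))]

# Hypothetical scores for one utterance/image pair
voice_scores = [-120.3, -118.9, -125.1, -119.5]     # one HMM per emotion
face_scores = np.log([0.15, 0.55, 0.10, 0.20])      # NN softmax over emotions
print(fuse_and_classify(voice_scores, face_scores))  # -> "happiness"
```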

Analysis of Consumer Preference of Nonghyup by Ordered Logit Model in the Chungnam Province (순서화 로짓모형을 이용한 농협의 선호도 분석: 충남지역 주민을 대상으로)

  • Woo, Jae-Young
    • Journal of Agricultural Extension & Community Development
    • /
    • v.16 no.2
    • /
    • pp.405-438
    • /
    • 2009
  • This study analyzes consumer and regional residents' preferences for Nonghyup as influenced by its socio-economic contributions, using an ordered logit model. The survey data were obtained from 225 adults in Chungnam province as cross-sectional data in 2007. The paper estimates the impact of socio-economic characteristics, such as sex, occupation, and educational background, together with emotional and subjective recognition of Nonghyup's contributions to regional socio-economic and cultural development and to social welfare. It also examines the impact of the perceived level of cooperation with local government policy, customer satisfaction ratings, and degree of business ethics. The main results are as follows: consumer and resident preferences for Nonghyup are not affected by sex, occupation, or educational background, but they are influenced by emotional and subjective recognition of contributions to regional socio-economic and cultural development and social welfare, as well as by perceived policy cooperation with local government, customer satisfaction ratings, and degree of business ethics.
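The ordered logit analysis can be reproduced in outline with statsmodels' OrderedModel; the variable names and the synthetic data below are hypothetical, standing in for the 2007 Chungnam survey:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical survey data: 5-point preference scale plus a few predictors
rng = np.random.default_rng(0)
n = 225
df = pd.DataFrame({
    "preference": rng.integers(1, 6, n),          # ordered outcome (1..5)
    "female": rng.integers(0, 2, n),
    "perceived_contribution": rng.normal(3, 1, n),
    "satisfaction": rng.normal(3, 1, n),
})

model = OrderedModel(
    df["preference"],
    df[["female", "perceived_contribution", "satisfaction"]],
    distr="logit",                                # ordered logit
)
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```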

Emotion Recognition using Facial Thermal Images

  • Eom, Jin-Sup;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.3
    • /
    • pp.427-435
    • /
    • 2012
  • The aim of this study is to investigate facial temperature changes induced by facial expression and emotional state in order to recognize a person's emotion from facial thermal images. Background: Facial thermal images have two advantages over visual images. First, facial temperature measured by a thermal camera does not depend on skin color, darkness, or lighting conditions. Second, facial thermal images change not only with facial expression but also with emotional state. To our knowledge, no study has concurrently investigated these two sources of facial temperature change. Method: 231 students participated in the experiment. Four kinds of stimuli inducing anger, fear, boredom, and a neutral state were presented to participants, and facial temperatures were measured by an infrared camera. Each stimulus consisted of a baseline period and an emotion period; the baseline period lasted 1 min and the emotion period 1 to 3 min. In the data analysis, the temperature differences between the baseline and emotion states were analyzed. The eyes, mouth, and glabella were selected as facial expression features, and the forehead, nose, and cheeks as emotional state features. Results: The temperatures of the eye, mouth, glabella, forehead, and nose areas decreased significantly during the emotional experience, and the changes differed significantly by kind of emotion. Linear discriminant analysis for emotion recognition showed a correct classification rate of 62.7% across the four emotions when using both facial expression features and emotional state features. The accuracy was slightly but significantly lower, at 56.7%, when using only facial expression features, and 40.2% when using only emotional state features. Conclusion: Facial expression features are essential for emotion recognition, but emotional state features are also important for classifying emotion. Application: The results of this study can be applied to human-computer interaction systems in workplaces or automobiles.
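A minimal sketch of the linear discriminant analysis step, assuming the baseline-to-emotion temperature differences have already been assembled into a feature matrix; the random data and column layout below are placeholders, not the study's measurements:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical features: baseline-to-emotion temperature differences (deg C);
# columns ~ [eyes, mouth, glabella, forehead, nose, left_cheek, right_cheek]
rng = np.random.default_rng(0)
X = rng.normal(0.0, 0.3, size=(231, 7))
y = rng.integers(0, 4, size=231)   # 0=anger, 1=fear, 2=boredom, 3=neutral

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # cross-validated accuracy
print(f"mean classification accuracy: {scores.mean():.3f}")
```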

A Development of Chatbot for Emotional Stress Recognition and Management using NLP (자연어 처리를 이용한 감정 스트레스 인지 및 관리 챗봇 개발)

  • Park, Jong-Jin
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.67 no.7
    • /
    • pp.954-961
    • /
    • 2018
  • In this paper, a chatbot for emotional stress recognition and management is designed and developed using a rule-based method and NLP, to address people's various emotional stresses through a questionnaire. Dialogflow is used as the open chatbot development platform and Facebook Messenger as the chat platform. Dialogflow's tools allow natural, resourceful conversational experiences to be built from predefined questions, and the developed chatbot can be used in the Facebook page Messenger. The chatbot perceives the user's emotional stress from user input, which is either free text or a choice among predefined answers. It also asks the user questions according to their feelings, assesses the strength of the emotional stress, and provides a solution to the user. Further research can improve the chatbot by using open Korean NLP libraries and a database of emotions and stresses.
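For readers unfamiliar with Dialogflow fulfillment, a minimal webhook in the Dialogflow ES request/response format is sketched below; the stress_level parameter and the coping suggestions are hypothetical, not the paper's actual intents:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical mapping from detected stress level to a coping suggestion
SUGGESTIONS = {
    "low": "Try a short breathing exercise before your next task.",
    "high": "Consider taking a break and talking to someone you trust.",
}

@app.route("/webhook", methods=["POST"])
def webhook():
    req = request.get_json(silent=True, force=True)
    # Dialogflow ES sends the matched intent and extracted parameters here
    params = req["queryResult"]["parameters"]
    level = params.get("stress_level", "low")   # hypothetical parameter name
    reply = SUGGESTIONS.get(level, SUGGESTIONS["low"])
    return jsonify({"fulfillmentText": reply})  # ES fulfillment response field

if __name__ == "__main__":
    app.run(port=5000)
```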

Effectiveness of emotional regulation art class using right brain function (우뇌 기능을 활용한 정서조절 미술수업의 효과성)

  • Kim, Hee-Ju;Huh, Yoon-Jung
    • Journal of the Korea Convergence Society
    • /
    • v.12 no.6
    • /
    • pp.119-125
    • /
    • 2021
  • In the elementary school period, the developmental stage of emotional regulation is immature, so emotional regulation ability needs to be developed. To promote emotional regulation, this study provides an emotional regulation art class that utilizes right-brain function. Results were derived through pre- and post-questionnaires and post-interviews. The pre-post analysis showed that, among the sub-elements of emotional regulation, 'Self-Emotion Recognition and Expression', 'Emotional Recognition and Consideration of Others', and 'Interpersonal Relationships' were statistically significantly higher after the class. The interview analysis found that the class had a positive effect on all students across the emotional regulation sub-items: students recognized and understood their emotions better after the class than before, and the class helped them express emotions by transforming negative emotions into positive ones. It is suggested that programs applying various teaching and learning methods for emotional regulation art classes be developed in the future.

Emotional Recognition of speech signal using Recurrent Neural Network

  • Park, Chang-Hyun;Sim, Kwee-Bo
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.81.2-81
    • /
    • 2002
  • Contents: introduction (concept and meaning of emotional recognition); features of the four emotions; pitch (approach); simulator (structure, RNN learning algorithm, evaluation function, solution search method); results
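The entry only outlines the talk, but a generic recurrent classifier over frame-level prosodic features, in the spirit of the RNN it names, might look like the following; this is a modern GRU stand-in, not the authors' simulator or learning algorithm:

```python
import torch
import torch.nn as nn

class EmotionRNN(nn.Module):
    """Recurrent classifier over frame-level prosodic features (pitch, energy)."""
    def __init__(self, n_features=2, hidden=32, n_emotions=4):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_emotions)

    def forward(self, x):                 # x: (batch, frames, n_features)
        _, h = self.rnn(x)                # h: (1, batch, hidden), last hidden state
        return self.head(h.squeeze(0))    # per-emotion logits

# Hypothetical batch: 8 utterances, 100 frames of [pitch, energy] each
model = EmotionRNN()
logits = model(torch.randn(8, 100, 2))
print(logits.argmax(dim=1))               # predicted emotion index per utterance
```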

Implementation of Human and Computer Interface for Detecting Human Emotion Using Neural Network (인간의 감정 인식을 위한 신경회로망 기반의 휴먼과 컴퓨터 인터페이스 구현)

  • Cho, Ki-Ho;Choi, Ho-Jin;Jung, Seul
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.9
    • /
    • pp.825-831
    • /
    • 2007
  • In this paper, an interface between a human and a computer is presented. The human-computer interface (HCI) serves as another area of human-machine interfaces. The methods used for the HCI are voice recognition and image recognition for detecting the human's emotional state. The idea is that the computer can recognize the present emotional state of the human operator and amuse him or her in various ways, such as playing music, searching the web, and talking. For the image recognition process, the human face is captured, and the eyes and mouth are selected from the facial image for recognition. To train the mouth images, we use the Hopfield net. The results show 88~92% recognition of emotion from images and, for the vocal recognition, the neural network shows 80~98% recognition of voice.
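A minimal sketch of Hopfield-net training and recall as used for the mouth images; the Hebbian rule and synchronous update below are the textbook algorithm, with tiny random patterns standing in for the binarized mouth templates:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian training: W is the averaged outer product of bipolar (+/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)               # no self-connections
    return W / len(patterns)

def recall(W, state, steps=10):
    """Synchronous update until convergence; returns the recovered pattern."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Hypothetical binarized mouth-region templates (flattened 5x5 = 25 pixels)
rng = np.random.default_rng(0)
templates = rng.choice([-1, 1], size=(2, 25))
W = train_hopfield(templates)
noisy = templates[0].copy()
noisy[:5] *= -1                           # corrupt 5 pixels
print(np.array_equal(recall(W, noisy), templates[0]))  # typically recovers pattern 0
```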

A Study on Image Recommendation System based on Speech Emotion Information

  • Kim, Tae Yeun;Bae, Sang Hyun
    • Journal of Integrative Natural Science
    • /
    • v.11 no.3
    • /
    • pp.131-138
    • /
    • 2018
  • In this paper, we implemented a system that matches and recommends images based on the emotional information in the user's speech. To classify the emotional information of the user's speech, emotional features are extracted and classified using the PLP algorithm, after which an emotional speech DB is constructed. Emotional colors and emotional vocabulary are then matched into one space through factor analysis in order to classify the emotional information of images. A standardized image recommendation system matches each keyword with the BM-GA algorithm over the speech and image emotion data, selecting images appropriate to the emotional information of the user's speech. In the performance evaluation, the recognition rate of standardized vocabulary across four stages of speech averaged 80.48%, and user satisfaction with the system was 82.4%. The classification of images according to the user's speech information is therefore expected to aid the study of emotional exchange between users and computers.
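The BM-GA matching step is not specified in detail here; as a stand-in, a simple cosine-similarity ranking of images by their emotion-space coordinates illustrates the recommendation idea (the 4-dimensional vectors and file names below are hypothetical):

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(speech_vec, image_vecs, top_k=3):
    """Rank images by cosine similarity of their emotion vectors to the speech emotion."""
    ranked = sorted(image_vecs.items(),
                    key=lambda kv: cosine(speech_vec, kv[1]), reverse=True)
    return [name for name, _ in ranked[:top_k]]

# Hypothetical 4-dim emotion coordinates (e.g. joy, sadness, anger, calm)
speech = np.array([0.7, 0.1, 0.0, 0.2])          # from the speech classifier
images = {
    "sunrise.jpg": np.array([0.8, 0.0, 0.0, 0.2]),
    "rain.jpg":    np.array([0.1, 0.7, 0.0, 0.2]),
    "storm.jpg":   np.array([0.0, 0.2, 0.8, 0.0]),
}
print(recommend(speech, images, top_k=2))        # -> ['sunrise.jpg', 'rain.jpg']
```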

The Impacts of Emotional Labor and The Recognition Level of Medical Service Fee Reduction of Medical Institution Workers Influencing Reduction Rate (의료기관 종사자의 라이프케어 감정노동과 진료비 삭감 인식도가 삭감률에 미치는 영향)

  • Yang, Yu-Jeong;Lee, Hye-Seung
    • Journal of Korea Entertainment Industry Association
    • /
    • v.14 no.8
    • /
    • pp.345-352
    • /
    • 2020
  • This study surveyed 414 medical institution workers to identify how their emotional labor and their recognition level of medical service fee reduction influence the reduction rate. The results were as follows. First, a review of differences in the reduction rate by socio-demographic characteristics revealed that both inpatient and outpatient reduction rates differ significantly by occupation, working history at the current hospital, and number of approved beds. Second, emotional labor, the recognition level of medical service fee reduction, and the reduction rate are correlated: emotional labor correlates significantly and positively with the outpatient reduction rate, while the recognition level of medical service fee reduction correlates significantly and negatively with both the inpatient and outpatient reduction rates. Third, emotional labor has a significant positive effect on the inpatient reduction rate, while the recognition level of medical service fee reduction has a negative effect on it; emotional labor likewise has a significant positive effect on the outpatient reduction rate, and the recognition level of medical service fee reduction a significant negative effect.
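The correlation and regression analysis follows a standard pattern; a sketch with statsmodels is shown below, with synthetic data and hypothetical variable names standing in for the 414 survey responses:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey responses from 414 workers
rng = np.random.default_rng(0)
n = 414
df = pd.DataFrame({
    "emotional_labor": rng.normal(3, 1, n),
    "reduction_awareness": rng.normal(3, 1, n),
})
df["inpatient_reduction_rate"] = (
    0.3 * df["emotional_labor"] - 0.4 * df["reduction_awareness"]
    + rng.normal(0, 1, n)
)

print(df.corr())                                  # pairwise Pearson correlations
X = sm.add_constant(df[["emotional_labor", "reduction_awareness"]])
print(sm.OLS(df["inpatient_reduction_rate"], X).fit().summary())
```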

Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min;Tang, Jun
    • Journal of Information Processing Systems
    • /
    • v.17 no.4
    • /
    • pp.754-771
    • /
    • 2021
  • In the task of continuous dimensional emotion recognition, the parts that highlight emotional expression are not the same in each modality, and the influences of different modalities on the emotional state also differ. This paper therefore studies the fusion of the two most important modalities in emotion recognition (voice and visual expression) and proposes a dual-modal emotion recognition method combining an improved AlexNet network with an attention mechanism. After simple preprocessing of the audio and video signals, prior knowledge is first used to extract audio features. Facial expression features are then extracted by the improved AlexNet network. Finally, a multimodal attention mechanism fuses the facial expression and audio features, and an improved loss function mitigates the missing-modality problem, improving the robustness of the model and its emotion recognition performance. Experimental results show that the proposed model's concordance correlation coefficients in the arousal and valence dimensions were 0.729 and 0.718, respectively, which are superior to several comparison algorithms.
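A minimal sketch of attention-based fusion of two modality feature vectors for arousal/valence regression; this is a generic attention head, not the paper's improved AlexNet or its loss function:

```python
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    """Weight audio and visual feature vectors with learned attention, then
    regress continuous arousal/valence."""
    def __init__(self, dim=256):
        super().__init__()
        self.score = nn.Linear(dim, 1)            # one attention score per modality
        self.head = nn.Linear(dim, 2)             # arousal, valence

    def forward(self, audio_feat, face_feat):     # each: (batch, dim)
        stacked = torch.stack([audio_feat, face_feat], dim=1)  # (batch, 2, dim)
        alpha = torch.softmax(self.score(stacked), dim=1)      # modality weights
        fused = (alpha * stacked).sum(dim=1)                   # weighted sum
        return self.head(fused)

fusion = AttentionFusion()
out = fusion(torch.randn(4, 256), torch.randn(4, 256))
print(out.shape)                                  # torch.Size([4, 2])
```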