• Title/Summary/Keyword: Emotional engineering

Recognition of Emotion and Emotional Speech Based on Prosodic Processing

  • Kim, Sung-Ill
    • The Journal of the Acoustical Society of Korea
    • /
    • v.23 no.3E
    • /
    • pp.85-90
    • /
    • 2004
  • This paper presents two new approaches: one concerns the recognition of speech uttered in emotional states such as anger, happiness, neutrality, sadness, or surprise; the other concerns the recognition of emotion from speech. For the proposed speech recognition system, which handles human speech produced under emotional states, a total of nine prosodic features were first extracted and then fed to a prosodic identifier. In evaluation, the recognition rates on emotional speech increased more with the proposed method than with the existing speech recognizer. For emotion recognition, on the other hand, four prosodic parameters, namely pitch, energy, and their derivatives, were proposed and then modeled with discrete duration continuous hidden Markov models (DDCHMM). In this approach, the emotion models were adapted to a specific speaker's speech using maximum a posteriori (MAP) estimation. In evaluation, the recognition rates on the vocal emotions gradually increased as the number of adaptation samples increased.
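As a rough illustration of the second approach described above, the sketch below extracts frame-level pitch and energy with their derivatives and scores them against one Gaussian HMM per emotion. It is a minimal stand-in only: hmmlearn's GaussianHMM replaces the paper's DDCHMM, the pitch estimate is a crude autocorrelation, and the MAP speaker-adaptation step is not shown.

```python
# Minimal sketch, not the paper's pipeline: prosodic features (pitch, energy,
# deltas) classified with one Gaussian HMM per emotion (stand-in for DDCHMM).
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed available (pip install hmmlearn)

def prosodic_features(signal, sr=16000, frame=400, hop=160):
    """Return [pitch, energy, delta-pitch, delta-energy] per frame."""
    feats = []
    for start in range(0, len(signal) - frame, hop):
        x = signal[start:start + frame] * np.hanning(frame)
        energy = float(np.sum(x ** 2))
        # crude autocorrelation pitch estimate (60-400 Hz search range)
        ac = np.correlate(x, x, mode="full")[frame - 1:]
        lo, hi = sr // 400, sr // 60
        lag = lo + int(np.argmax(ac[lo:hi]))
        pitch = sr / lag if ac[lag] > 0 else 0.0
        feats.append([pitch, energy])
    feats = np.asarray(feats)
    deltas = np.vstack([np.zeros((1, 2)), np.diff(feats, axis=0)])
    return np.hstack([feats, deltas])          # shape: (n_frames, 4)

def train_models(utterances_by_emotion):
    """One HMM per emotion, trained on that emotion's utterances."""
    models = {}
    for emotion, utterances in utterances_by_emotion.items():
        seqs = [prosodic_features(u) for u in utterances]
        X, lengths = np.vstack(seqs), [len(s) for s in seqs]
        models[emotion] = GaussianHMM(n_components=3, covariance_type="diag",
                                      n_iter=20).fit(X, lengths)
    return models

def classify(models, utterance):
    """Pick the emotion whose model gives the highest log-likelihood."""
    X = prosodic_features(utterance)
    return max(models, key=lambda e: models[e].score(X))
```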

Development of Bio-sensor-Based Feature Extraction and Emotion Recognition Model (바이오센서 기반 특징 추출 기법 및 감정 인식 모델 개발)

  • Cho, Ye Ri;Pae, Dong Sung;Lee, Yun Kyu;Ahn, Woo Jin;Lim, Myo Taeg;Kang, Tae Koo
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.67 no.11
    • /
    • pp.1496-1505
    • /
    • 2018
  • Emotion recognition technology is necessary for human-computer interaction and communication. There are many cases in which people cannot communicate without considering each other's emotions; as such, emotion recognition technology is an essential element in the field of communication and is widely used in various fields. Various bio-sensors can be used to measure and recognize human emotions. This paper proposes a system for recognizing human emotions using two physiological sensors. For emotion classification, Russell's two-dimensional emotion model was used, and a classification method based on personality was proposed by extracting sensor-specific features. In addition, the emotion model was divided into four emotions using the Support Vector Machine (SVM) classification algorithm. Finally, the proposed emotion recognition system was evaluated through a practical experiment.
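As a hedged sketch of the classification step described above, the example below maps toy feature vectors to the four quadrants of Russell's valence-arousal model with an SVM. The feature dimensions, labels, and data are invented; the paper's actual bio-sensor pipeline and personality-based method are not reproduced.

```python
# Illustrative sketch: SVM classification into the four quadrants of Russell's
# valence-arousal model. Features and ratings are random toy data, so the
# reported accuracy will be near chance; it only shows the plumbing.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def quadrant_label(valence, arousal):
    """Map a (valence, arousal) rating in [-1, 1]^2 to one of four emotions."""
    if arousal >= 0:
        return 0 if valence >= 0 else 1   # 0: excited/happy, 1: angry/tense
    return 3 if valence >= 0 else 2       # 3: calm/relaxed,  2: sad/bored

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                 # toy stand-in for sensor features
ratings = rng.uniform(-1, 1, size=(200, 2))   # toy valence/arousal self-ratings
y = np.array([quadrant_label(v, a) for v, a in ratings])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```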

Exploration on Interpersonal Problems, Emotional Clarity, and Empathic Ability in Engineering Students (공과대학생의 대인관계문제, 정서인식명확성, 공감능력 탐색)

  • Choi, Jung Ah
    • Journal of Engineering Education Research
    • /
    • v.22 no.6
    • /
    • pp.64-73
    • /
    • 2019
  • The purpose of this study is to explore the interpersonal problems, emotional clarity, and empathic ability of engineering students compared with humanities and social sciences students. A total of 739 college students participated in the study (459 engineering students and 280 humanities and social sciences students). The research questions were tested using t-tests. The results showed that engineering students have higher levels of emotional clarity, perspective taking, and empathic concern, and lower levels of attention to feelings and personal distress, than humanities and social sciences students. Moreover, engineering students showed lower levels of cold, socially avoidant, and exploitable interpersonal problems than humanities and social sciences students. We discuss the need for programs aimed at developing engineering students' emotional awareness and improving their interpersonal relationships.
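The group comparison described above can be illustrated with an independent-samples t-test; the sketch below uses invented scores for the two groups (459 and 280 participants) on a single hypothetical scale. Welch's correction is a choice made here, not necessarily the paper's.

```python
# Toy sketch of the group comparison: independent-samples t-test between
# engineering and humanities/social-science students on one invented scale.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
engineering = rng.normal(loc=3.6, scale=0.5, size=459)   # e.g. emotional clarity
humanities  = rng.normal(loc=3.4, scale=0.5, size=280)

t, p = stats.ttest_ind(engineering, humanities, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")
```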

Analysis of Team Interaction Changes in Capstone-Design Activities by MBTI Modes (Capstone-Design 활동에서 MBTI 성격유형에 따른 팀 상호작용 변화 분석)

  • Lee, Tae-Ho;Kim, Taehoon
    • Journal of Engineering Education Research
    • /
    • v.17 no.1
    • /
    • pp.57-64
    • /
    • 2014
  • The main purpose of this study is to analyze how team interaction changes over time during Capstone-Design activities according to MBTI personality types. The subjects were four mechanical engineering students at C University in Daejeon, and changes in team interaction were analyzed using the Interaction Process Analysis (IPA) method. The results were as follows. First, the ESTP student showed increasing interaction across the initial, middle, and late periods in 'social-emotional area: positive' and 'task area: question', and decreasing interaction in 'task area: solution'; there was no change in 'social-emotional area: negative' because no interaction occurred there. Second, the ESFJ student showed decreasing interaction across the same periods in 'social-emotional area: positive' and 'task area: question', and increasing interaction in 'task area: solution' and 'social-emotional area: negative'. Third, the ISTJ student showed decreasing interaction in 'social-emotional area: positive', 'task area: question', and 'social-emotional area: negative', and increasing interaction in 'task area: solution'. Fourth, the ENFP student showed decreasing interaction in 'social-emotional area: positive', 'task area: solution', and 'social-emotional area: negative', and increasing interaction in 'task area: question'.
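As a minimal illustration of how IPA-style coding yields the period-by-area counts summarized above, the sketch below tallies invented coded utterances into the four interaction areas across the initial, middle, and late periods. The coded data are made up; the paper's actual observation records are not reproduced.

```python
# Hypothetical sketch: tallying IPA-coded utterances by period and area.
from collections import Counter

AREAS = ["social-emotional: positive", "task: question",
         "task: solution", "social-emotional: negative"]
PERIODS = ("initial", "mid", "late")

# (period, area) codes as an observer might record them for one team member
coded = [("initial", "task: solution"), ("initial", "social-emotional: positive"),
         ("mid", "task: question"), ("mid", "task: solution"),
         ("late", "task: question"), ("late", "social-emotional: positive")]

counts = {p: Counter() for p in PERIODS}
for period, area in coded:
    counts[period][area] += 1

for period in PERIODS:
    print(period, {a: counts[period][a] for a in AREAS})
```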

Emotional Correlation Test from Binary Gender Perspective using Kansei Engineering Approach on IVML Prototype

  • Nur Faraha Mohd, Naim;Mintae, Hwang
    • Journal of information and communication convergence engineering
    • /
    • v.21 no.1
    • /
    • pp.68-74
    • /
    • 2023
  • This study examines users' emotional responses, from a gender perspective, toward interactive video mobile learning (IVML). An IVML prototype was developed for the Android platform, allowing users to install the app and use it for m-learning purposes. The study aims to measure the level of feelings toward the IVML prototype, examine differences between gender perspectives, identify the most prominent feelings among male and female users, and measure the correlation between the user-friendly feeling trait, treated as an independent variable, and gender attributes. The emotional responses were extracted from the user experience, user interface, and human-computer interaction from a gender perspective, using the Kansei engineering approach as the measurement method. The statistical results showed that the different emotional reactions of male and female users toward the IVML prototype may or may not correlate with the user-friendly trait, and that the two groups may show similar emotional responses to one another.
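As a rough sketch of the correlation analysis described above, the example below computes a Pearson correlation between a "user-friendly" rating and another Kansei rating, separately for male and female respondents. The rating data, scale, and second Kansei word are invented for illustration.

```python
# Hypothetical sketch: Pearson correlation of Kansei ratings by gender group.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def group_correlation(n):
    """Correlate two invented 5-point Kansei ratings for n respondents."""
    user_friendly = rng.integers(1, 6, size=n).astype(float)
    enjoyable = np.clip(user_friendly + rng.normal(0, 1.2, n), 1, 5)
    return stats.pearsonr(user_friendly, enjoyable)

for gender, n in (("male", 40), ("female", 40)):
    r, p = group_correlation(n)
    print(f"{gender}: r = {r:.2f}, p = {p:.3f}")
```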

Design of Model to Recognize Emotional States in a Speech

  • Kim Yi-Gon;Bae Young-Chul
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.1
    • /
    • pp.27-32
    • /
    • 2006
  • Verbal communication is the most commonly used means of communication. A spoken word carries a great deal of information about speakers and their emotional states. In this paper we designed a model to recognize emotional states in speech, the first of two phases in developing a toy machine that recognizes emotional states in speech. We conducted an experiment to extract and analyze the emotional state of a speaker from speech. To analyze the signal output, we used three characteristics of sound as vector inputs: frequency, intensity, and tone period. We also used eight basic emotional parameters: surprise, anger, sadness, expectancy, acceptance, joy, hate, and fear, which were portrayed by five selected students. To facilitate differentiation of the spectral features, we used wavelet transform analysis. We applied an ANFIS (Adaptive Neuro-Fuzzy Inference System) to design the emotion recognition model from speech. The inference error was about 10%, and the experimental results indicate that the applied model is about 85% effective and reliable.
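A hedged sketch of the wavelet feature step described above is shown below: a speech frame is decomposed into sub-bands with PyWavelets and per-band energies are taken as an input vector. The frame here is synthetic, and the ANFIS inference stage has no standard library implementation, so it is not reproduced.

```python
# Illustrative sketch: wavelet sub-band energies as a feature vector for one
# speech frame (PyWavelets assumed installed as 'pywt'). The ANFIS stage that
# the paper uses on top of such features is not shown here.
import numpy as np
import pywt

def wavelet_band_energies(frame, wavelet="db4", level=4):
    """Energy of each sub-band from a multilevel DWT of one speech frame."""
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    return np.array([float(np.sum(c ** 2)) for c in coeffs])

sr = 16000
t = np.linspace(0, 0.05, int(sr * 0.05), endpoint=False)   # 50 ms synthetic frame
frame = np.sin(2 * np.pi * 220 * t) + 0.1 * np.random.default_rng(3).normal(size=t.size)
print(wavelet_band_energies(frame))   # 5 values: approximation + 4 detail bands
```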

An Emotional Communication System Using Emotion Recognition of Users (사용자의 감성인식을 통한 감성통신 시스템)

  • Cho, Myeon-gyun
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.6 no.4
    • /
    • pp.201-207
    • /
    • 2011
  • This paper introduces the novel concept of 'Emotional Communication' for future smartphones. While traditional information-based communication technologies focus on how to transmit the content of a message precisely, emotional communication is intended to support and augment social relationships among people and to comfort users. In this paper, we propose future communication services and core technologies that can estimate users' emotional desires and respond to those desires with connectedness and consolation from other people. First, we introduce emotion recognition techniques for estimating users' emotional desires. Second, the emotional responding services are categorized into four parts and described in detail. Finally, we propose a process for implementing an emotional communication system and the main techniques needed to fulfill the system requirements for future smartphone services.

A Novel Method for Modeling Emotional Dimensions using Expansion of Russell's Model (러셀 모델의 확장을 통한 감정차원 모델링 방법 연구)

  • Han, Eui-Hwan;Cha, Hyung-Tai
    • Science of Emotion and Sensibility
    • /
    • v.20 no.1
    • /
    • pp.75-82
    • /
    • 2017
  • We propose a novel method for modeling emotional dimensions by expanding Russell's (1980) emotional dimensions (the Circumplex Model). The Circumplex Model represents emotional words on two axes (arousal and valence). However, other researchers have argued that the location of a word in Russell's model, expressed as a single point, cannot represent its exact position. Consequently, it is difficult to apply this model in engineering fields (such as the science of emotion and sensibility, human-computer interaction, and ergonomics). We therefore propose a new modeling method that expresses each emotional word not as a single point but as a region. We conducted a survey to obtain empirical data and derived equations based on the ellipse formula to represent each emotional region. Furthermore, we mapped ANEW and IAPS, which are commonly used in many studies, onto our emotional model using a pattern recognition algorithm. Our method resolves the problems with Russell's model, and the resulting model is easily applicable to engineering fields.
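As a minimal sketch of the region-based representation described above, the example below models an emotional word as a (possibly rotated) ellipse in the valence-arousal plane and tests whether a rated point falls inside it. The ellipse parameters are invented; the paper derives its own equations from survey data.

```python
# Hypothetical sketch: an emotional word as an elliptical region in the
# valence-arousal plane, with a point-in-region test.
import numpy as np

def in_emotion_region(point, center, semi_axes, angle_deg):
    """True if `point` lies inside the ellipse rotated by `angle_deg`."""
    theta = np.radians(angle_deg)
    # rotate the point into the ellipse's axis-aligned frame
    rot = np.array([[np.cos(theta), np.sin(theta)],
                    [-np.sin(theta), np.cos(theta)]])
    u, v = rot @ (np.asarray(point, float) - np.asarray(center, float))
    a, b = semi_axes
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0

# e.g. an invented "excited" region: center, semi-axes, rotation in V-A space
print(in_emotion_region(point=(0.55, 0.70), center=(0.6, 0.6),
                        semi_axes=(0.25, 0.15), angle_deg=30))
```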

Frontal Gamma-band Hypersynchronization in Response to Negative Emotion Elicited by Films (영상에 의해 유발된 부정적 감정 상태에 따른 전두엽 감마대역 신경동기화)

  • Kim, Hyun;Choi, Jongdoo;Choi, Jeong Woo;Yeo, Donghoon;Seo, Pukyeong;Her, Seongjin;Kim, Kyung Hwan
    • Journal of Biomedical Engineering Research
    • /
    • v.39 no.3
    • /
    • pp.124-133
    • /
    • 2018
  • We investigated changes in cortical activity according to emotional valence while participants watched video clips. We examined the neural basis of two emotional states (positive and negative) using spectral power analysis and brain functional connectivity analysis of cortical current density time series reconstructed from high-density electroencephalograms (EEGs). Fifteen healthy participants viewed a series of thirty-two 2-minute emotional video clips while sixty-four-channel EEGs were recorded. Distributed cortical sources were reconstructed using weighted minimum norm estimation. We examined the temporal and spatial characteristics of the spectral source powers showing significant differences between positive and negative emotion, and determined correlations between gamma-band activity and affective valence ratings. We observed changes in the cortical current density time series according to the emotional states modulated by the video clips. Gamma-band activity showed significant differences between emotional states for thirty seconds in the middle and the latter half of the video clips, mainly in the prefrontal area, and was significantly anti-correlated with self-ratings of emotional valence. In addition, gamma-band activities in frontal and temporal areas were strongly phase-synchronized, more strongly during negative emotional states. Cortical activities in frontal and temporal areas thus showed high spectral power and inter-regional phase synchronization in the gamma band during negative emotional states. We infer that the higher amygdala activation induced by negative stimuli produced strong emotional effects and caused strong local and global synchronization of gamma-band neural activity in frontal and temporal areas.
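As a rough sketch of the gamma-band measures described above, the example below band-pass filters two synthetic signals to 30-50 Hz and computes band power and the phase-locking value (PLV) between them. The frequency band, signals, and channel names are assumptions; the paper analyzes source-reconstructed cortical time series rather than raw channels.

```python
# Illustrative sketch: gamma-band power and phase-locking value (PLV) between
# two synthetic "frontal" and "temporal" signals.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def gamma_band(x, fs, lo=30.0, hi=50.0, order=4):
    """Band-pass filter a signal to the gamma band (zero-phase)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def band_power(x):
    return float(np.mean(x ** 2))

def plv(x, y):
    """Phase-locking value between two narrow-band signals."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * phase_diff))))

fs = 500
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(4)
frontal = np.sin(2 * np.pi * 40 * t) + rng.normal(0, 0.5, t.size)
temporal = np.sin(2 * np.pi * 40 * t + 0.3) + rng.normal(0, 0.5, t.size)

gf, gt = gamma_band(frontal, fs), gamma_band(temporal, fs)
print("gamma power (frontal):", band_power(gf))
print("frontal-temporal PLV:", plv(gf, gt))
```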