• Title/Summary/Keyword: Emotional States

A Music Recommendation Method Using Emotional States by Contextual Information

  • Kim, Dong-Joo;Lim, Kwon-Mook
    • Journal of the Korea Society of Computer and Information
    • /
    • v.20 no.10
    • /
    • pp.69-76
    • /
    • 2015
  • A user's selection of music is largely influenced by personal taste as well as emotional state, and can be seen as an unconscious projection of the user's emotions. We therefore treat the selected music itself as a proxy for the user's emotional state. In this paper, we infer users' emotional states from the music they select in a specific context and analyze the correlation between that context and the emotional state. To extract emotional states from music, the proposed method applies morphological analysis to the lyrics of user-selected songs to extract emotional words that represent each song, and learns the weights of a linear classifier over the emotional features of those words. The regularities learned by the classifier are used to compute predictive weights for unseen music from the weights of music chosen by other users in contexts similar to the active user's. Finally, we propose a method to recommend pieces of music matching the user's context and emotional state. Experimental results show that the proposed method is more accurate than traditional collaborative filtering.
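
The abstract does not give implementation details, so the following is only a minimal sketch of the general idea, assuming a bag-of-emotional-words representation of lyrics and scikit-learn's logistic regression as the linear classifier; the vocabulary, lyrics, and labels are invented for illustration and are not the paper's data.

```python
# Minimal sketch (not the paper's implementation): score candidate tracks by the
# emotional-word weights learned from lyrics of songs chosen in similar contexts.
# EMOTION_WORDS, the lyrics, and the labels are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

EMOTION_WORDS = ["love", "tears", "joy", "lonely", "dance"]   # hypothetical vocabulary

def lyric_features(lyrics: str) -> np.ndarray:
    """Bag-of-emotional-words counts extracted from a lyric string."""
    tokens = lyrics.lower().split()                            # stand-in for morphological analysis
    return np.array([tokens.count(w) for w in EMOTION_WORDS], dtype=float)

# Toy training data: lyrics of songs users picked, labelled with the context's emotional state.
train_lyrics = ["love joy dance dance", "tears lonely tears", "joy love dance"]
train_state  = [1, 0, 1]                                       # 1 = positive, 0 = negative (assumed)

X = np.vstack([lyric_features(s) for s in train_lyrics])
clf = LogisticRegression().fit(X, train_state)                 # linear classifier over emotional features

# Recommend: rank unseen tracks by how well they match the state inferred from the user's context.
candidates = {"song_a": "lonely tears lonely", "song_b": "dance joy love"}
target_state = 1
scores = {name: clf.predict_proba(lyric_features(lyr).reshape(1, -1))[0, target_state]
          for name, lyr in candidates.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))           # highest-scoring tracks first
```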

A Study on the Effects of Cyber Bullying on Cognitive Processing Ability and the Emotional States: Moderating Effect of Social Support of Friends and Parents

  • Yituo Feng;Sundong Kwon
    • Asia pacific journal of information systems
    • /
    • v.30 no.1
    • /
    • pp.167-187
    • /
    • 2020
  • College students experience more cyber bullying than adolescents, and cyber bullying may be more harmful to them; yet most studies of cyber bullying have focused on adolescents, and little research has examined college students. This study therefore investigated the negative effects of college students' experience of cyber bullying on their cognitive processing ability and emotional states. The social support of friends is known to buffer stress and reduce the impact of external harm in stressful situations, but the effect of parental social support is controversial: traditionally it has been claimed to mitigate the negative effects of external harm, whereas recent work argues that parental support that disregards college students' need for independence and autonomy does not alleviate those effects. Accordingly, this study examined how the social support of friends and of parents moderates the negative impact of cyber bullying. The results show that the more college students experience cyber bullying, the lower their cognitive processing ability and the worse their emotional states. Higher social support from friends reduced the harmful impact of cyber bullying on cognitive processing ability and emotional states, whereas higher social support from parents increased it.
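
A moderating effect of this kind is usually tested as an interaction term in a regression model. The sketch below is a generic, hedged illustration of that idea using simulated data and statsmodels; the variables and coefficients are invented and are not the study's model or data.

```python
# Hedged sketch of a moderation analysis like the one described above: a buffering
# (or amplifying) effect of social support shows up as the sign of an interaction term.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "bullying": rng.normal(size=n),          # cyber-bullying experience (standardized, simulated)
    "support_friends": rng.normal(size=n),   # social support of friends (simulated)
})
# Simulated outcome: bullying lowers cognitive processing; friends' support buffers it.
df["cognition"] = (-0.5 * df.bullying
                   + 0.3 * df.support_friends
                   + 0.2 * df.bullying * df.support_friends
                   + rng.normal(scale=0.5, size=n))

model = smf.ols("cognition ~ bullying * support_friends", data=df).fit()
print(model.params)   # a positive bullying:support_friends coefficient indicates buffering
```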

Design of Model to Recognize Emotional States in a Speech

  • Kim Yi-Gon;Bae Young-Chul
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.1
    • /
    • pp.27-32
    • /
    • 2006
  • Verbal communication is the most commonly used means of communication, and a spoken word carries a great deal of information about the speaker and his or her emotional state. In this paper, we designed a model to recognize emotional states in speech, the first of two phases in developing a toy machine that recognizes emotional states from speech. We conducted an experiment to extract and analyze a speaker's emotional state in relation to the speech signal. To analyze the signal, we used three characteristics of sound as vector inputs: the frequency, intensity, and period of tones. We also used eight basic emotional categories (surprise, anger, sadness, expectancy, acceptance, joy, hate, and fear), portrayed by five selected students. To facilitate differentiation of the spectral features, we applied the wavelet transform, and we used ANFIS (Adaptive Neuro-Fuzzy Inference System) to design the emotion recognition model. The inference error was about 10%, and the results of the experiment indicate that the model is about 85% effective and reliable.
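
ANFIS has no widely used standard Python implementation, so the sketch below only illustrates the wavelet-based feature extraction step with PyWavelets and substitutes a generic k-NN classifier for the neuro-fuzzy model; the signals and labels are synthetic stand-ins for the recorded utterances.

```python
# Illustrative sketch only: sub-band wavelet energies as spectrum features for
# emotion classification, with a generic classifier standing in for ANFIS.
import numpy as np
import pywt                                   # PyWavelets
from sklearn.neighbors import KNeighborsClassifier

def wavelet_features(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Energy of each wavelet sub-band, a rough stand-in for the spectrum features."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Toy 'utterances': sine bursts with different pitch/intensity standing in for emotions.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 8000)
X, y = [], []
for label, (f0, amp) in enumerate([(120, 0.5), (300, 1.0)]):   # e.g. 'sad' vs 'surprise' (toy)
    for _ in range(10):
        sig = amp * np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)
        X.append(wavelet_features(sig)); y.append(label)

clf = KNeighborsClassifier(n_neighbors=3).fit(np.array(X), y)
print(clf.score(np.array(X), y))              # sanity check on the toy data
```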

Classification of Emotional States of Interest and Neutral Using Features from Pulse Wave Signal

  • Phongsuphap, Sukanya;Sopharak, Akara
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.682-685
    • /
    • 2004
  • This paper investigates a method for classifying emotional states using the pulse wave signal, focusing on finding effective features for emotional state classification. The emotional states considered are interest and neutral. The classification experiments used 65 samples of the interest state and 60 of the neutral state. We investigated 19 features derived from pulse wave signals using both time-domain and frequency-domain analysis, with two classifiers: minimum distance (normalized Euclidean distance) and k-Nearest Neighbour. Leave-one-out cross-validation was used for evaluation. The experimental results show that the most effective feature set was a combination of four features: (i) the mean of the first differences of the smoothed pulse-rate time series, (ii) the mean of the absolute values of the second differences of the normalized interbeat intervals, (iii) the root mean square of successive differences, and (iv) the power in the high-frequency range in normalized units, which provided 80.8% average accuracy with the k-Nearest Neighbour classifier.
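
As a rough illustration of the feature set and evaluation protocol described above, the sketch below computes three of the four time-domain features from simulated interbeat intervals (the high-frequency power feature is omitted) and runs leave-one-out k-NN classification with scikit-learn; it is not the authors' code or data.

```python
# Minimal sketch, assuming interbeat intervals (IBIs, seconds) were already extracted
# from the pulse wave. Data are simulated; the HF-power feature is left out.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

def ibi_features(ibi: np.ndarray) -> np.ndarray:
    pulse_rate = 60.0 / ibi                                 # beats per minute
    d1_rate = np.diff(pulse_rate)                           # first differences of pulse rate
    ibi_n = (ibi - ibi.mean()) / ibi.std()                  # normalized IBIs
    d2_ibi = np.diff(ibi_n, n=2)                            # second differences
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))             # root mean square successive difference
    return np.array([d1_rate.mean(), np.abs(d2_ibi).mean(), rmssd])

rng = np.random.default_rng(2)
X, y = [], []
for label, base in [(0, 0.85), (1, 0.75)]:                  # 'neutral' vs 'interest' (toy)
    for _ in range(30):
        ibi = np.clip(base + 0.03 * rng.normal(size=60), 0.4, 1.5)
        X.append(ibi_features(ibi)); y.append(label)

acc = cross_val_score(KNeighborsClassifier(n_neighbors=3), np.array(X), y,
                      cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```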

Inferring Pedestrians' Emotional States through Physiological Responses to Measure Subjective Walkability Indices

  • Kim, Taeeun;Lee, Meesung;Hwang, Sungjoo
    • International conference on construction engineering and project management
    • /
    • 2022.06a
    • /
    • pp.1245-1246
    • /
    • 2022
  • Walkability indicates how willing pedestrians are to walk and how well a walking environment has been created. Because walking can promote pedestrians' mental and physical health, there has been increasing focus on improving walkability in various ways, and a great deal of research has been undertaken to measure it. Walkability involves both objective and subjective variables. Subjective variables include feelings of safety, pleasure, or comfort, which can significantly affect perceived walkability; however, these subjective factors are difficult to measure, which makes walkability indices more reliant on objective, physical factors. Because many subjective variables are associated with human emotional states, understanding pedestrians' emotional states provides an opportunity to measure the subjective walkability variables more quantitatively. Pedestrians' emotions can be examined through surveys, but conducting surveys involves social and economic difficulties. Recently, an increasing number of studies have employed physiological data to measure pedestrians' stress responses when navigating unpleasant environmental barriers along their walking paths, but studies investigating pedestrians' emotional states in the walking environment, including positive emotions such as pleasure, have rarely been conducted. Using wearable devices, this study examined the various emotional states of pedestrians as affected by the walking environment. Specifically, it aimed to demonstrate the feasibility of monitoring biometric data, such as electrodermal activity (EDA) and heart rate variability (HRV), with wearable devices as indicators of pedestrians' emotional states along both the pleasant-unpleasant and the aroused-relaxed dimensions. To this end, walking environments with different characteristics were set up to collect and analyze the pedestrians' biometric data, and subjects wearing the devices walked the experimental paths as they normally would. After the experiment, the pedestrians' valence (pleasant or unpleasant) and arousal (activated or relaxed) were assessed with a bipolar dimensional survey, and the survey results were compared with a number of potentially relevant EDA and HRV signal features. The results reveal the potential of physiological responses to indicate pedestrians' emotional states, but further investigation is warranted. The results are expected to provide a way to measure the subjective factors of walkability by monitoring pedestrians' positive or negative feelings while walking, in order to improve the walking environment. However, owing to the small sample and to other internal and external factors influencing emotions that require further study, it cannot be conclusively established that the pedestrians' emotional states were affected by the walking environment.
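
A hedged sketch of the kind of comparison described, correlating per-segment EDA/HRV features with post-walk valence and arousal ratings; all values below are simulated placeholders for wearable-device recordings, and the feature names are illustrative assumptions.

```python
# Sketch only: correlate simple per-segment EDA/HRV features with survey-based
# valence/arousal scores. Simulated data stand in for wearable recordings.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_segments = 12
mean_eda = rng.uniform(0.1, 2.0, n_segments)        # mean skin conductance (uS), simulated
rmssd = rng.uniform(20, 80, n_segments)             # HRV time-domain feature (ms), simulated
valence = rng.integers(1, 10, n_segments)           # survey: unpleasant(1) .. pleasant(9)
arousal = rng.integers(1, 10, n_segments)           # survey: relaxed(1) .. activated(9)

for name, feature, rating in [("EDA vs arousal", mean_eda, arousal),
                              ("RMSSD vs valence", rmssd, valence)]:
    r, p = pearsonr(feature, rating)
    print(f"{name}: r={r:+.2f}, p={p:.3f}")
```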

Recognition of Emotion and Emotional Speech Based on Prosodic Processing

  • Kim, Sung-Ill
    • The Journal of the Acoustical Society of Korea
    • /
    • v.23 no.3E
    • /
    • pp.85-90
    • /
    • 2004
  • This paper presents two new approaches: one concerns the recognition of emotional speech such as anger, happiness, normal, sadness, or surprise, and the other concerns the recognition of emotion from speech. For the proposed speech recognition system handling speech with emotional states, nine kinds of prosodic features were first extracted and then given to a prosodic identifier. In evaluation, the recognition results on emotional speech showed greater improvement with the proposed method than with the existing speech recognizer. For emotion recognition, on the other hand, four kinds of prosodic parameters (pitch, energy, and their derivatives) were proposed and trained with discrete-duration continuous hidden Markov models (DDCHMM). In this approach, the emotion models were adapted to a specific speaker's speech using maximum a posteriori (MAP) estimation. In evaluation, the recognition rates on vocal emotions gradually increased with the number of adaptation samples.
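
The sketch below covers only the prosodic front end (pitch, energy, and their derivatives) using librosa; the DDCHMM training and MAP adaptation described in the abstract are not shown, and the bundled example clip is merely a stand-in for an emotional utterance.

```python
# Sketch of a prosodic front end: pitch, energy, and their derivatives as the
# four parameter streams that would feed a recognizer. Not the authors' system.
import numpy as np
import librosa

y, sr = librosa.load(librosa.example("trumpet"))       # stand-in clip; replace with a recording
f0, voiced, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                             fmax=librosa.note_to_hz("C7"), sr=sr)
energy = librosa.feature.rms(y=y)[0]

f0 = np.nan_to_num(f0)                                 # unvoiced frames -> 0
delta_f0 = np.diff(f0, prepend=f0[0])                  # pitch derivative
delta_energy = np.diff(energy, prepend=energy[0])      # energy derivative

frames = min(len(f0), len(energy))
features = np.stack([f0[:frames], delta_f0[:frames],
                     energy[:frames], delta_energy[:frames]], axis=1)
print(features.shape)                                  # (n_frames, 4) feature matrix
```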

A Study of F0 Realization in Emotional Speech (감정에 따른 음성의 기본주파수 실현 연구)

  • Park, Mi-Young;Park, Mi-Kyoung
    • Proceedings of the KSPS conference
    • /
    • 2005.11a
    • /
    • pp.79-85
    • /
    • 2005
  • In this paper, we compare normal speech with emotional speech (happy, sad, and angry states) through changes in the fundamental frequency. The distribution charts of normal and emotional speech show distinctive cues such as the range of the distribution, the average, the maximum, and the minimum. On the whole, the range of the fundamental frequency is extended in the happy and angry states, whereas the sad state narrows it; nevertheless, the F0 range in the sad state is still wider than in normal speech. In addition, we verify that ending boundary tones reflect information about the whole utterance.
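
For illustration only, the sketch below computes the kind of distributional statistics compared above (mean, minimum, maximum, range) from simulated F0 contours; the values are invented and are not the study's measurements.

```python
# Summary statistics of F0 contours per emotional state, on simulated data.
import numpy as np

rng = np.random.default_rng(4)
f0_by_state = {
    "normal": rng.normal(200, 20, 300),     # Hz, toy contours
    "happy":  rng.normal(240, 45, 300),
    "angry":  rng.normal(230, 50, 300),
    "sad":    rng.normal(190, 15, 300),
}
for state, f0 in f0_by_state.items():
    print(f"{state:>6}: mean={f0.mean():6.1f}  min={f0.min():6.1f}  "
          f"max={f0.max():6.1f}  range={np.ptp(f0):6.1f}")
```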

Frontal Gamma-band Hypersynchronization in Response to Negative Emotion Elicited by Films (영상에 의해 유발된 부정적 감정 상태에 따른 전두엽 감마대역 신경동기화)

  • Kim, Hyun;Choi, Jongdoo;Choi, Jeong Woo;Yeo, Donghoon;Seo, Pukyeong;Her, Seongjin;Kim, Kyung Hwan
    • Journal of Biomedical Engineering Research
    • /
    • v.39 no.3
    • /
    • pp.124-133
    • /
    • 2018
  • We investigated the changes in cortical activity according to emotional valence while participants watched video clips. We examined the neural basis of two emotional states (positive and negative) using spectral power analysis and brain functional connectivity analysis of cortical current-density time series reconstructed from high-density electroencephalograms (EEGs). Fifteen healthy participants viewed a series of thirty-two 2-minute emotional video clips while sixty-four-channel EEGs were recorded. Distributed cortical sources were reconstructed using weighted minimum-norm estimation, and the temporal and spatial characteristics of spectral source powers showing significant differences between positive and negative emotion were examined, along with the correlations between gamma-band activity and affective valence ratings. We observed changes in the cortical current-density time series according to the emotional states modulated by the video clips. Gamma-band activity differed significantly between emotional states for thirty seconds in the middle and the latter half of the clips, mainly in the prefrontal area, and was significantly anti-correlated with the self-ratings of emotional valence. In addition, gamma-band activity in the frontal and temporal areas was strongly phase-synchronized, more strongly for negative emotional states. Cortical activity in the frontal and temporal areas thus showed high gamma-band spectral power and inter-regional phase synchronization during negative emotional states. We infer that the stronger amygdala activation induced by negative stimuli produced strong emotional effects and caused strong local and global gamma-band synchronization of neural activity in the frontal and temporal areas.
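
Inter-regional phase synchronization in a frequency band is commonly quantified with the phase-locking value (PLV). The sketch below shows a generic gamma-band PLV computation on simulated signals with SciPy; it is not the authors' pipeline, and the channel names are placeholders.

```python
# Gamma-band phase-locking value (PLV) between two simulated 'cortical' signals.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                          # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
common = np.sin(2 * np.pi * 40 * t)                 # shared 40 Hz (gamma) component
x = common + 0.5 * rng.normal(size=t.size)          # 'frontal' signal (simulated)
y = np.roll(common, 3) + 0.5 * rng.normal(size=t.size)   # 'temporal' signal, small lag

b, a = butter(4, [30 / (fs / 2), 50 / (fs / 2)], btype="band")   # 30-50 Hz band-pass
phase_x = np.angle(hilbert(filtfilt(b, a, x)))
phase_y = np.angle(hilbert(filtfilt(b, a, y)))

plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"gamma-band PLV: {plv:.2f}")                 # near 1 = strong phase synchronization
```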

Effects of emotional states on information search pattern on small display (감정 상태가 작은 디스플레이의 정보 탐색에 미치는 효과)

  • Kim, Hyuk;Han, Kwang-Hee
    • Science of Emotion and Sensibility
    • /
    • v.9 no.4
    • /
    • pp.321-329
    • /
    • 2006
  • Recently there has been growing interest in the role of emotion in human-computer interaction. The present study investigated whether a user's emotional state affects the information-search pattern used for decision making on a small-screen display. In the experiment, to induce specific emotional states (positive and negative), participants were asked to listen to music and to imagine autobiographical events with different emotional impacts. They then performed time-limited search tasks on three sets of travel information on a small-screen display, and their search patterns were recorded in real time. The results indicate that a positive emotional state led to a wider and faster information-search pattern than neutral and negative emotional states, whereas neutral and negative emotional states devoted more cognitive resources to details than a positive emotional state.

Discrimination of Three Emotions using Parameters of Autonomic Nervous System Response

  • Jang, Eun-Hye;Park, Byoung-Jun;Eum, Yeong-Ji;Kim, Sang-Hyeob;Sohn, Jin-Hun
    • Journal of the Ergonomics Society of Korea
    • /
    • v.30 no.6
    • /
    • pp.705-713
    • /
    • 2011
  • Objective: The aim of this study is to compare the results of emotion recognition by several algorithms that classify three different emotional states (happiness, neutral, and surprise) using physiological features. Background: Recent emotion recognition studies have tried to detect human emotion from physiological signals; reliable emotion detection is important for applying emotion recognition to human-computer interaction systems. Method: 217 students participated in the experiment. While three kinds of emotional stimuli were presented, ANS responses (EDA, SKT, ECG, RESP, and PPG) were measured as physiological signals twice: for 60 seconds as a baseline and for 60 to 90 seconds during the emotional state. Signals from the baseline and the emotional-state sessions were analyzed over equal 30-second windows. After stimulus presentation, participants rated their own feelings on an emotional assessment scale. Emotion classification was performed with Linear Discriminant Analysis (LDA, SPSS 15.0), a Support Vector Machine (SVM), and a Multilayer Perceptron (MLP), using difference values obtained by subtracting the baseline from the emotional state. Results: The emotional stimuli had 96% validity and 5.8-point efficiency on average. Statistical analysis showed significant differences in ANS responses among the three emotions. LDA classified the three emotions with an accuracy of 83.4%, compared with 75.5% for the SVM and 55.6% for the MLP. Conclusion: This study confirms that the three emotions are better classified by LDA using various physiological features than by the SVM or MLP. Further studies are needed to establish the stability and reliability of this result by comparison with the classification accuracy of other algorithms. Application: These findings could improve the recognition of various human emotions from physiological signals and could be applied to human-computer interaction systems for emotion recognition.
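
As a generic illustration of the comparison reported above, the sketch below evaluates LDA, an SVM, and an MLP on simulated (emotion minus baseline) feature differences with scikit-learn; the data, feature count, and hyperparameters are assumptions, not the study's.

```python
# Compare LDA, SVM, and MLP on simulated physiological difference features for
# three emotion classes (happiness / neutral / surprise stand-ins).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_per_class, n_features = 60, 5                       # e.g. one feature per signal type (assumed)
X = np.vstack([rng.normal(loc=mu, size=(n_per_class, n_features))
               for mu in (-0.5, 0.0, 0.5)])           # toy class-dependent shifts
y = np.repeat([0, 1, 2], n_per_class)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf")),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")                       # cross-validated accuracy on toy data
```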