• Title/Summary/Keyword: 슬픔감정 (sadness emotion)


An Expansion of Affective Image Access Points Based on Users' Response on Image (이용자 반응 기반 이미지 감정 접근점 확장에 관한 연구)

  • Chung, Eun Kyung
    • Journal of the Korean BIBLIA Society for Library and Information Science / v.25 no.3 / pp.101-118 / 2014
  • In a rapidly developing ubiquitous computing environment, users increasingly need to search for and use images by their affective meaning. Indexing such meaning is difficult, however, because the emotions an image evokes are highly subjective and abstract, and low-level image features capture high-level affective concepts only to a limited degree. To expand affective access points for images, this study draws on user-provided responses. Emotion words were collected and cleaned from twenty participants who each viewed a set of fifteen images, three for each of five basic emotions: love, sadness, fear, anger, and happiness. A total of 399 unique emotion words appeared 1,093 times in the data set. Through co-word and network analysis of the words in users' responses, the study derives expanded word sets for the five basic emotions, characterized by adjective expressions and action/behavior expressions.
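
The co-word and network analysis described above can be illustrated with a minimal sketch: hypothetical per-image response word lists are turned into a weighted co-occurrence graph, and a seed emotion's neighbours form its expanded word set. The response data, weighting, and the networkx-based implementation below are illustrative assumptions, not the study's actual procedure.

    # Hypothetical response data: each inner list is one user's emotion words
    # for one image (the study's 20-participant / 15-image set is not used here).
    from itertools import combinations
    from collections import Counter
    import networkx as nx

    responses = [
        ["sad", "lonely", "gloomy"],
        ["sad", "grief", "tearful"],
        ["happy", "warm", "smiling"],
    ]

    # Count how often two words co-occur in the same response.
    pair_counts = Counter()
    for words in responses:
        for a, b in combinations(sorted(set(words)), 2):
            pair_counts[(a, b)] += 1

    # Build a weighted co-word graph and expand a seed emotion via its neighbours.
    G = nx.Graph()
    for (a, b), w in pair_counts.items():
        G.add_edge(a, b, weight=w)

    seed = "sad"
    expanded = sorted(G[seed], key=lambda n: G[seed][n]["weight"], reverse=True)
    print(f"expanded word set for '{seed}':", expanded)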

Emotion recognition in speech using hidden Markov model (은닉 마르코프 모델을 이용한 음성에서의 감정인식)

  • 김성일;정현열
    • Journal of the Institute of Convergence Signal Processing / v.3 no.3 / pp.21-26 / 2002
  • This paper presents a new approach to identifying human emotional states such as anger, happiness, normal, sadness, or surprise using discrete-duration continuous hidden Markov models (DDCHMM). Emotional feature parameters are first extracted from the input speech signals; prosodic parameters such as pitch, energy, and their derivatives are used to train an HMM for recognition. Speaker-adapted emotion models based on maximum a posteriori (MAP) estimation are also considered for speaker adaptation. Simulation results show that the vocal emotion recognition rate increases gradually as the number of adaptation samples grows. A simplified HMM sketch appears after this entry.

  • PDF
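
As a rough illustration of the prosodic-feature HMM approach above, the sketch below trains one Gaussian HMM per emotion on toy pitch/energy sequences and classifies an utterance by the highest log-likelihood. It substitutes hmmlearn's plain GaussianHMM for the paper's discrete-duration continuous HMMs and omits MAP speaker adaptation; all data and parameters are placeholders.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM   # third-party package: hmmlearn

    rng = np.random.default_rng(0)
    EMOTIONS = ["anger", "happiness", "normal", "sadness", "surprise"]

    def prosodic_features(frames):
        """Pitch, energy, and their frame-to-frame derivatives (toy values)."""
        pitch = rng.normal(150.0, 20.0, frames)
        energy = rng.normal(60.0, 5.0, frames)
        return np.column_stack([pitch, energy, np.gradient(pitch), np.gradient(energy)])

    # Train one HMM per emotion on (toy) sequences of that emotion.
    models = {}
    for emo in EMOTIONS:
        model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=20)
        model.fit(prosodic_features(200))
        models[emo] = model

    # Classify an utterance by the model giving the highest log-likelihood.
    test = prosodic_features(80)
    scores = {emo: m.score(test) for emo, m in models.items()}
    print("recognized emotion:", max(scores, key=scores.get))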

Recognizing Five Emotional States Using Speech Signals (음성 신호를 이용한 화자의 5가지 감성 인식)

  • Kang Bong-Seok;Han Chul-Hee;Woo Kyoung-Ho;Yang Tae-Young;Lee Chungyong;Youn Dae-Hee
    • Proceedings of the Acoustical Society of Korea Conference / autumn / pp.101-104 / 1999
  • In this paper, three systems for recognizing a speaker's emotion from speech signals were built and their performance compared. The target emotions were joy, sadness, anger, fear, boredom, and a neutral state, and an emotional speech database was constructed for each emotion. Pitch and energy information were used as features for emotion recognition, and the recognition algorithms were an MLB (Maximum-Likelihood Bayes) classifier, an NN (Nearest Neighbor) classifier, and an HMM (Hidden Markov Model) classifier. The MLB and NN classifiers used statistical feature vectors such as the mean, standard deviation, and maximum of pitch and energy, while the HMM classifier used temporal information such as the delta and delta-delta pitch and the delta and delta-delta energy of each frame. Experiments were run in a speaker-dependent, sentence-independent manner, and recognition rates of 68.9% with MLB, 66.7% with NN, and 89.3% with HMM were obtained. A minimal sketch of the statistical-feature classification appears after this entry.

  • PDF
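
The sketch below illustrates the statistical-feature classification compared in the paper, restricted to the nearest-neighbour case: mean, standard deviation, and maximum of pitch and energy form the feature vector, and scikit-learn's 1-NN classifier assigns an emotion. The toy contours and class offsets are assumptions; the paper's speaker-dependent emotional speech database is not reproduced.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)
    EMOTIONS = ["joy", "sadness", "anger", "fear", "boredom", "neutral"]

    def stat_features(pitch, energy):
        """Mean, standard deviation, and maximum of the pitch and energy contours."""
        return np.array([pitch.mean(), pitch.std(), pitch.max(),
                         energy.mean(), energy.std(), energy.max()])

    # Toy training utterances: each emotion gets a slightly shifted pitch/energy mean.
    X_train, y_train = [], []
    for label, emo in enumerate(EMOTIONS):
        for _ in range(10):
            pitch = rng.normal(140 + 10 * label, 15, 120)
            energy = rng.normal(55 + 2 * label, 4, 120)
            X_train.append(stat_features(pitch, energy))
            y_train.append(emo)

    clf = KNeighborsClassifier(n_neighbors=1).fit(np.array(X_train), y_train)

    test = stat_features(rng.normal(160, 15, 120), rng.normal(59, 4, 120))
    print("NN prediction:", clf.predict(test.reshape(1, -1))[0])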

Healing Function of the Sijo "High Peak of Chullyoung" (시조 「철령 높은 봉」의 치유적 기능)

  • Jeon, Yangwoo
    • The Journal of the Convergence on Culture Technology / v.5 no.2 / pp.39-43 / 2019
  • Interest in literary therapy has been increasing recently, although reading therapy is sometimes conflated with it. A literary therapist, however, needs to understand the emotions that literary texts convey. To this end, Park Inkwa has long studied the relationship between sentences and the emotions they ignite in the human body, and this study likewise takes the view that the human body can be moved and healed by literary text. It examines Lee Hangbok's sijo "High Peak of Chullyoung," which is well suited to discussing the emotions at work in literary therapy. The study finds that the poem's grief produces a transference in the reader, which accounts for its therapeutic effect. It is hoped that such research will continue and further develop the techniques of literary therapy.

Neural-network based Computerized Emotion Analysis using Multiple Biological Signals (다중 생체신호를 이용한 신경망 기반 전산화 감정해석)

  • Lee, Jee-Eun;Kim, Byeong-Nam;Yoo, Sun-Kook
    • Science of Emotion and Sensibility / v.20 no.2 / pp.161-170 / 2017
  • Emotion affects many aspects of human life, such as learning ability, behavior, and judgment, and is therefore important for understanding human nature. Yet what a person actually feels can only be inferred indirectly, for example from facial expressions or gestures. Emotion is also difficult to classify, both because individuals experience the same stimulus differently and because visually induced emotion does not persist over the whole testing period. To address this, bio-signals were acquired and features extracted from them to provide objective information about the emotion stimulus. The emotion pattern classifier combined an unsupervised learning stage with hidden nodes and feature vectors: a restricted Boltzmann machine (RBM) based on probability estimation mapped the emotion features to a transformed representation, and the emotion was then characterized by a non-linear classifier with hidden nodes in a multi-layer neural network, a deep belief network (DBN). The accuracy of the DBN (about 94%) was better than that of a back-propagation neural network (about 40%), showing that the DBN performs well as an emotion pattern classifier.
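
The RBM-plus-classifier idea above can be sketched roughly as follows: an unsupervised Bernoulli RBM maps (toy) bio-signal feature vectors to a hidden representation that a supervised classifier then labels. This uses scikit-learn's BernoulliRBM with logistic regression as a simplified stand-in for the paper's multi-layer deep belief network; the data and dimensions are placeholders.

    import numpy as np
    from sklearn.neural_network import BernoulliRBM
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import MinMaxScaler

    rng = np.random.default_rng(2)

    # Toy feature vectors from multiple bio-signals (e.g. HR, GSR, EEG band powers).
    X = rng.normal(size=(200, 16))
    y = rng.integers(0, 4, size=200)          # four hypothetical emotion classes

    model = Pipeline([
        ("scale", MinMaxScaler()),            # RBM expects inputs in [0, 1]
        ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20)),
        ("clf", LogisticRegression(max_iter=500)),
    ])
    model.fit(X, y)
    print("training accuracy on toy data:", model.score(X, y))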

A Study on the Performance of Music Retrieval Based on the Emotion Recognition (감정 인식을 통한 음악 검색 성능 분석)

  • Seo, Jin Soo
    • The Journal of the Acoustical Society of Korea / v.34 no.3 / pp.247-255 / 2015
  • This paper studies the performance of music search based on automatically recognized music-emotion labels. As with other media such as speech, images, and video, a song can evoke particular emotions in listeners, and those emotions can be an important criterion when people look for songs to listen to. However, little work has examined how well music-emotion labels perform in music search. Here, the three axes of human music perception (valence, activity, tension) and five basic emotion labels (happiness, sadness, tenderness, anger, fear) are used to measure music similarity for search. Experiments were conducted on both genre and singer datasets. The search accuracy of the proposed emotion-based search reached up to 75% of that of a conventional feature-based search, and combining the two methods improved search accuracy by up to 14%.
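
A minimal sketch of emotion-based music similarity follows: each song is represented by a vector of valence/activity/tension plus five basic-emotion scores, and candidates are ranked by cosine similarity to the query. The descriptor values are hypothetical; the paper derives them from an automatic music-emotion recognizer.

    import numpy as np

    # [valence, activity, tension, happy, sad, tender, anger, fear] per song (toy values).
    songs = {
        "song_a": [0.8, 0.6, 0.2, 0.7, 0.0, 0.2, 0.0, 0.1],
        "song_b": [-0.5, 0.2, 0.6, 0.1, 0.7, 0.1, 0.0, 0.1],
        "song_c": [0.7, 0.5, 0.3, 0.6, 0.1, 0.3, 0.0, 0.0],
    }

    def cosine(u, v):
        u, v = np.asarray(u), np.asarray(v)
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Rank candidates by similarity of their emotion descriptors to the query song.
    query = songs["song_a"]
    ranking = sorted(((cosine(query, vec), name) for name, vec in songs.items()), reverse=True)
    for score, name in ranking:
        print(f"{name}: similarity {score:.3f}")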

Recognition of the emotional state through the EEG (뇌파를 통한 감정 상태 인식에 관한 연구)

  • Ji, Hoon;Lee, Chung-heon;Park, Mun-Kyu;An, Young-jun;Lee, Dong-hoon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2015.05a / pp.958-961 / 2015
  • Emotional expression is universal, and emotional state affects important areas of our lives. Until now, efforts to analyze EEG signals acquired under emotion-evoking conditions and to define the corresponding emotional states have been made mainly by psychologists interpreting the results. Recently, however, research has shown that mental activity can be identified by measuring and analyzing EEG signals. This study therefore compared and analyzed human emotional expression using brain waves. To obtain EEG differences for particular emotions, subjects were shown images selected to induce emotions such as peace, joy, sadness, and stress. The measured EEG signals were converted to the frequency domain by FFT, and analysis of the power spectra of the delta, theta, alpha, beta, and gamma bands showed how the EEG changes with emotion. A minimal band-power sketch appears after this entry.

  • PDF
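
The band-power analysis described above can be sketched as follows: a (synthetic) EEG trace is converted to the frequency domain with Welch's method and power is summed in the delta, theta, alpha, beta, and gamma bands. The sampling rate, signal, and band edges are assumptions; real recordings and artifact handling are omitted.

    import numpy as np
    from scipy.signal import welch

    fs = 256                                   # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # alpha-dominant toy signal

    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    df = freqs[1] - freqs[0]

    bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        print(f"{name:5s} band power: {psd[mask].sum() * df:.4f}")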

Emotional User Experiences on Narrative-Based Social Issue Serious Game: Focused on <나누별 이야기> (내러티브 기반 소셜 이슈 기능성 게임의 사용자 감정 경험 연구 : <나누별 이야기>를 중심으로)

  • Lim, Su-Jin;Doh, Young-Yim;Ryu, Seoung-Ho
    • Journal of Korea Game Society / v.12 no.6 / pp.131-144 / 2012
  • To affect users' attitudes toward social issues, some serious games use narrative, which can be an effective tool for provoking emotions. In this paper, a focus group interview about play experiences was conducted to determine whether the parts of the narrative that carry the educational goals actually provoke particular emotions. Words with emotional meaning were extracted from the interviews and matched against the Korean Emotion Terms Database; the matched terms were then placed on Russell's schematic map of core affect and categorized. The results showed that users mostly felt unpleasant emotions during play, and these unpleasant emotions helped achieve the game's goal of conveying the tragedy of war.
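
As a small illustration of placing emotion terms on Russell's valence/arousal plane, the sketch below assigns each extracted term to a quadrant and tallies pleasant versus unpleasant responses. The term coordinates are illustrative placeholders, not values from the Korean Emotion Terms Database used in the study.

    from collections import Counter

    # term: (valence, arousal), both in [-1, 1]; illustrative placeholder values.
    term_coords = {
        "sad": (-0.7, -0.3),
        "angry": (-0.6, 0.7),
        "anxious": (-0.5, 0.5),
        "relieved": (0.5, -0.4),
        "joyful": (0.8, 0.6),
    }

    def quadrant(valence, arousal):
        v = "pleasant" if valence >= 0 else "unpleasant"
        a = "activated" if arousal >= 0 else "deactivated"
        return f"{v}/{a}"

    extracted_terms = ["sad", "angry", "anxious", "sad", "relieved"]
    counts = Counter(quadrant(*term_coords[t]) for t in extracted_terms)
    print(counts.most_common())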

Dynamic Facial Expression of Fuzzy Modeling Using Probability of Emotion (감정확률을 이용한 동적 얼굴표정의 퍼지 모델링)

  • Kang, Hyo-Seok;Baek, Jae-Ho;Kim, Eun-Tai;Park, Mignon
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.1 / pp.1-5 / 2009
  • This paper proposes applying a 2D emotion recognition database built with a mirror-reflected multi-image method to a 3D application, and models facial expressions with fuzzy logic using emotion probabilities. The proposed expression function applies fuzzy theory to three basic movements for facial expressions, mapping the feature vectors used for emotion recognition from the 2D mirror-reflected setting to the 3D application, so that a fuzzy non-linear facial-expression model of a real person can be derived from the 2D model. Average probabilities of the six basic expressions (happy, sad, disgust, angry, surprise, and fear) are used, and dynamic facial expressions are generated through fuzzy modeling. The paper compares and analyzes the feature vectors of a real model with those of a 3D human-like avatar.
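
A simplified stand-in for the expression synthesis above: each basic emotion carries a parameter vector for the three facial movements, and the output expression is blended using the (hypothetical) emotion probabilities. The paper's actual fuzzy membership functions and mirror-reflected 2D-to-3D mapping are not reproduced; the numbers below are placeholders.

    import numpy as np

    # Rows: happy, sad, disgust, angry, surprise, fear.
    # Columns: three basic facial-movement parameters (toy values).
    expression_params = np.array([
        [0.1, 0.2, 0.9],
        [-0.4, -0.1, -0.6],
        [-0.2, 0.3, -0.5],
        [-0.6, 0.5, -0.3],
        [0.7, 0.8, 0.4],
        [0.5, 0.6, -0.7],
    ])

    # Hypothetical recognized probabilities for the six basic emotions.
    emotion_probs = np.array([0.05, 0.70, 0.05, 0.10, 0.05, 0.05])

    blended = emotion_probs @ expression_params   # probability-weighted blend
    print("blended movement parameters:", np.round(blended, 3))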

Study on Heart Rate Variability and PSD Analysis of PPG Data for Emotion Recognition (감정 인식을 위한 PPG 데이터의 심박변이도 및 PSD 분석)

  • Choi, Jin-young;Kim, Hyung-shin
    • Journal of Digital Contents Society / v.19 no.1 / pp.103-112 / 2018
  • In this paper, we propose a method of recognizing emotions using a PPG sensor, which measures blood flow as it changes with emotion. From the PPG signal, positive and negative emotions are distinguished in the frequency domain through PSD (power spectral density) analysis. Based on James A. Russell's two-dimensional model of affect, emotions are classified as joy, sadness, irritation, or calmness, and their association with the magnitude of energy in the frequency domain is examined. Notably, the study used the same kind of PPG sensor found in wearable devices to measure these four emotions in the frequency domain through image-viewing experiments. A questionnaire collected accuracy, each participant's level of immersion, emotional changes, and biofeedback on the images. The proposed method is expected to enable various developments, such as commercial application services using PPG and mobile prediction services that merge PPG with the context information already available on smartphones.
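
The HRV/PSD pipeline described above can be sketched as follows: beats are detected in a synthetic PPG trace, inter-beat intervals are derived and resampled evenly, and LF/HF power is estimated with Welch's method. The sampling rates, thresholds, and signal are assumptions; real PPG preprocessing and the study's emotion mapping are omitted.

    import numpy as np
    from scipy.signal import find_peaks, welch

    fs = 100                                        # assumed PPG sampling rate (Hz)
    t = np.arange(0, 120, 1 / fs)
    ppg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)  # ~72 bpm toy pulse

    peaks, _ = find_peaks(ppg, distance=fs * 0.5)   # at most one peak per half second
    ibi = np.diff(peaks) / fs                       # inter-beat intervals in seconds
    beat_times = peaks[1:] / fs

    # Resample the IBI series evenly (4 Hz) so a PSD can be estimated.
    fs_ibi = 4
    even_t = np.arange(beat_times[0], beat_times[-1], 1 / fs_ibi)
    ibi_even = np.interp(even_t, beat_times, ibi)

    freqs, psd = welch(ibi_even - ibi_even.mean(), fs=fs_ibi, nperseg=128)
    df = freqs[1] - freqs[0]
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum() * df   # low-frequency power
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum() * df   # high-frequency power
    print(f"LF/HF ratio: {lf / hf:.2f}")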