• Title/Abstract/Keyword: Control Emotion

Search results: 595 items (processing time: 0.025 seconds)

모의 지능로봇에서 음성신호에 의한 감정인식 (Speech Emotion Recognition by Speech Signals on a Simulated Intelligent Robot)

  • 장광동; 권오욱
    • 대한음성학회 2005년도 추계 학술대회 발표논문집, pp. 163-166, 2005
  • We propose a speech emotion recognition method for a natural human-robot interface. In the proposed method, emotion is classified into six classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from the microphones in 5 different directions. Experimental results show that the proposed method yields 59% classification accuracy while human classifiers give about 50% accuracy, which confirms that the proposed method achieves performance comparable to that of humans.

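The classification step described in this abstract (utterance-level feature statistics fed to an SVM with a Gaussian kernel) can be illustrated with a minimal Python sketch. The feature extraction, hyperparameters, and label names below are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch: classify utterance-level feature vectors with an RBF-kernel SVM.
# Assumes prosodic/phonetic statistics (pitch, energy, jitter, shimmer, ...) have
# already been aggregated into fixed-length vectors; hyperparameters are illustrative.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical label set matching the six classes named in the abstract.
EMOTIONS = ["angry", "bored", "happy", "neutral", "sad", "surprised"]

def train_emotion_svm(X_train: np.ndarray, y_train: np.ndarray):
    """X_train: (n_utterances, n_features) statistics; y_train: labels drawn from EMOTIONS."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X_train, y_train)
    return clf

# Usage with hypothetical pre-extracted features:
# clf = train_emotion_svm(X_train, y_train)
# print(clf.predict(X_test[:1]))  # e.g. ['happy']
```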

프로야구경기장 서비스스케이프와 통제지각, 소비감정, 재방문 의도의 관계 (Relationship among professional baseball stadium servicescape, control perception, consumer emotion, and revisit intention)

  • 마윤성; 고경진; 이광용
    • 디지털융복합연구, Vol. 17, No. 1, pp. 389-401, 2019
  • The purpose of this study is to examine the relationships among the servicescape experience of spectators visiting professional baseball stadiums, perceived control at the venue, consumption emotion, and revisit intention. A total of 273 questionnaires were analyzed using SPSS 20.0 and AMOS 20.0. The validity of the data was verified through frequency analysis, reliability analysis, confirmatory factor analysis, and correlation analysis, and the hypotheses were tested with structural equation modeling. First, the servicescape had a statistically significant effect on perceived control. Second, perceived control in the servicescape had a significant effect on consumption emotion. Third, the servicescape affected consumption emotion. Fourth, consumption emotion had a significant effect on revisit intention. These results suggest that positive servicescape experiences can induce spectators to revisit the baseball stadium. Specific discussion and implications are described in the main text.
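The analysis above was run in SPSS/AMOS. Purely as an illustration, the sketch below shows how the same hypothesized paths (servicescape → perceived control → consumption emotion → revisit intention) could be specified in Python with the open-source semopy library; the variable names are placeholders for the study's composite scores and are not taken from the paper.

```python
# Hedged sketch: structural equation model mirroring the four hypotheses,
# written in semopy's lavaan-like syntax. Column names are hypothetical
# composite scores; data loading is left to the caller.
import pandas as pd
from semopy import Model

MODEL_DESC = """
control_perception ~ servicescape
consumer_emotion ~ control_perception + servicescape
revisit_intention ~ consumer_emotion
"""

def fit_sem(data: pd.DataFrame):
    """data: one row per respondent, columns matching the variable names above."""
    model = Model(MODEL_DESC)
    model.fit(data)
    return model.inspect()  # path coefficients, standard errors, p-values
```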

제스처와 EEG 신호를 이용한 감정인식 방법 (Emotion Recognition Method using Gestures and EEG Signals)

  • 김호덕; 정태민; 양현창; 심귀보
    • 제어로봇시스템학회논문지, Vol. 13, No. 9, pp. 832-837, 2007
  • Electroencephalography (EEG) has been used in psychology for many years to record the activity of the human brain. As technology develops, the neural basis of the functional areas involved in emotion processing is gradually being revealed, so we use EEG to measure the fundamental areas of the human brain that control emotion. Hand gestures such as shaking and head gestures such as nodding are often used as body language for communication, and their recognition is important because it provides a useful communication medium between humans and computers; gesture recognition is typically studied with computer vision methods. Most existing research uses either EEG signals or gestures alone for emotion recognition. In this paper, we use EEG signals and gestures together for human emotion recognition, and we select driver emotion as the specific target. The experimental results show that using both EEG signals and gestures yields higher recognition rates than using EEG signals or gestures alone. For both EEG signals and gestures, features are selected with Interactive Feature Selection (IFS), a method based on reinforcement learning.
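The abstract describes feature-level fusion of two modalities plus IFS-based feature selection, but gives no algorithmic detail. The sketch below shows the fusion step and uses a simple greedy forward selection as a stand-in for IFS; the stand-in, the classifier, and all parameters are assumptions for illustration.

```python
# Sketch: concatenate per-sample EEG and gesture feature vectors, then select a
# feature subset with greedy forward selection (a stand-in for the paper's
# reinforcement-learning-based Interactive Feature Selection, which is not detailed here).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fuse_features(eeg_feats: np.ndarray, gesture_feats: np.ndarray) -> np.ndarray:
    """Stack EEG and gesture features column-wise: one fused vector per sample."""
    return np.hstack([eeg_feats, gesture_feats])

def greedy_select(X: np.ndarray, y: np.ndarray, k: int = 10) -> list:
    """Greedily add the feature that most improves cross-validated accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    clf = KNeighborsClassifier(n_neighbors=3)
    while remaining and len(selected) < k:
        scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=3).mean()
                  for f in remaining}
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected
```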

생리적 내재반응 및 얼굴표정 간 확률 관계 모델 기반의 감정인식 시스템에 관한 연구 (A Study on Emotion Recognition Systems based on the Probabilistic Relational Model Between Facial Expressions and Physiological Responses)

  • 고광은; 심귀보
    • 제어로봇시스템학회논문지, Vol. 19, No. 6, pp. 513-519, 2013
  • Current vision-based approaches to emotion recognition, such as facial expression analysis, have many technical limitations in real circumstances and are not suitable for applications that rely on them alone in practical environments. In this paper, we propose an approach to emotion recognition that combines extrinsic representations and intrinsic activities among the natural responses of humans who are given specific stimuli to induce emotional states. The intrinsic activities can be used to compensate for the uncertainty of the extrinsic representations of emotional states. This combination is performed using PRMs (Probabilistic Relational Models), which are an extended version of Bayesian networks and are learned by greedy-search and expectation-maximization algorithms. Extrinsic emotion features related to facial expressions from previous research and intrinsic emotion features based on physiological signals are combined into the attributes of the PRMs in the emotion recognition domain. Maximum likelihood estimation with the given dependency structure and the estimated parameter set is used to classify the label of the target emotional state.
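The paper's PRM machinery (greedy structure search plus EM) is not reproduced here. As a much-simplified stand-in for the same fusion idea, the sketch below models facial (extrinsic) and physiological (intrinsic) features with class-conditional Gaussians and picks the emotion label by maximum likelihood, assuming conditional independence between the modalities; everything about it is illustrative.

```python
# Simplified probabilistic fusion: per-class Gaussians for each modality,
# combined log-likelihoods, maximum-likelihood label decision.
import numpy as np
from scipy.stats import multivariate_normal

def fit_class_gaussians(X: np.ndarray, y: np.ndarray) -> dict:
    """Return {label: (mean, covariance)} for feature matrix X and labels y."""
    params = {}
    for label in np.unique(y):
        Xc = X[y == label]
        cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularize
        params[label] = (Xc.mean(axis=0), cov)
    return params

def classify(face_x, physio_x, face_params, physio_params):
    """Combine log-likelihoods from both modalities and return the best label."""
    best, best_ll = None, -np.inf
    for label in face_params:
        ll = (multivariate_normal.logpdf(face_x, *face_params[label])
              + multivariate_normal.logpdf(physio_x, *physio_params[label]))
        if ll > best_ll:
            best, best_ll = label, ll
    return best
```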

Emotion Detection Algorithm Using Frontal Face Image

  • Kim, Moon-Hwan; Joo, Young-Hoon; Park, Jin-Bae
    • 제어로봇시스템학회 ICCAS 2005, pp. 2373-2378, 2005
  • An emotion detection algorithm using frontal facial images is presented in this paper. The algorithm is composed of three main stages: image processing, facial feature extraction, and emotion detection. In the image processing stage, the face region and facial components are extracted by using a fuzzy color filter, a virtual face model, and histogram analysis. The features for emotion detection are extracted from the facial components in the facial feature extraction stage. In the emotion detection stage, a fuzzy classifier is adopted to recognize emotion from the extracted features. Experimental results show that the proposed algorithm detects emotion well.

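To make the first stage concrete, the sketch below isolates a face-like skin region with a fixed-range color filter (a crude stand-in for the paper's fuzzy color filter) and shows a triangular fuzzy membership function of the kind a fuzzy classifier might score features with. The thresholds and ranges are common heuristics, not the authors' values.

```python
# Sketch: rough skin segmentation in YCrCb space plus a triangular fuzzy
# membership function; all numeric ranges are illustrative heuristics.
import cv2
import numpy as np

def skin_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Return a binary mask of skin-colored pixels (rough face-region candidate)."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    return cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

def triangular_membership(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership, e.g. for scoring mouth openness or brow height."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
```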

VISCOSITY RESISTANCE CONTROL OF INTELLIGENT PROSTHETIC-LEGS

  • Hashimoto, Minoru; Ono, Kenji
    • Proceedings of the 2000 Spring Conference of KOSES and International Sensibility Ergonomics Symposium (한국감성과학회), pp. 328-329, 2000
  • A viscosity resistance control method for intelligent prosthetic legs is studied using optimal control theory. The simulation results suggest that controlling the viscosity of the prosthetic knee joint within one walking period is important for improving usability. In this paper we describe the modeling of the thigh prosthetic leg, the optimal control design, and the simulation results.

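The abstract does not give the plant model or cost function, so the sketch below just illustrates the optimal-control ingredient: an LQR gain for a generic linearized knee joint with viscous damping, computed from the continuous-time Riccati equation. The model structure and all parameter values are assumptions.

```python
# Sketch: LQR state feedback for a linearized knee joint
#   J * theta_ddot = -b * theta_dot + u,
# where the damping b plays the role of the adjustable "viscosity".
import numpy as np
from scipy.linalg import solve_continuous_are

J, b = 0.35, 0.8                       # inertia [kg m^2], nominal damping [N m s/rad] (illustrative)
A = np.array([[0.0, 1.0],
              [0.0, -b / J]])          # state x = [angle, angular velocity]
B = np.array([[0.0],
              [1.0 / J]])
Q = np.diag([10.0, 1.0])               # penalize angle error and angular velocity
R = np.array([[0.1]])                  # penalize the knee resistance torque

P = solve_continuous_are(A, B, Q, R)   # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)        # optimal gain: u = -K x
print("LQR gain:", K)
```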

세대간 원예활동 프로그램이 노인과 유아의 정서와 자아존중감에 미치는 영향 (Effects of the Intergenerational Horticultural Activity Program on Emotion and Self-esteem of the Elderly and Young Children)

  • 이은숙; 박현구; 김미옥; 박천호
    • 원예과학기술지, Vol. 28, No. 3, pp. 484-491, 2010
  • This study investigated the effects of an intergenerational horticultural activity program on the emotion and self-esteem of the elderly and of young children. Comparing the emotion of the elderly before and after the program, neither the control group nor the experimental group showed a significant difference. For the self-esteem of the elderly, the control group showed no significant difference, whereas the experimental group did ($p$<0.01). For the emotional intelligence of the young children, the control group showed no significant difference before and after the program, whereas the experimental group did ($p$<0.001). For the self-esteem of the young children, neither group showed a significant difference. These results indicate that the intergenerational horticultural activity program can contribute to improving the self-esteem of the elderly and the emotional intelligence of young children.
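The abstract reports pre/post significance tests without naming the procedure. As one common way to run such a comparison, the sketch below applies a paired t-test to placeholder scores; the data and the choice of test are assumptions, not taken from the study.

```python
# Generic pre/post comparison of the kind reported above: a paired t-test on
# scores measured before and after the program. Scores below are placeholders.
import numpy as np
from scipy.stats import ttest_rel

pre  = np.array([28, 31, 25, 30, 27, 29, 26, 32])   # hypothetical pre-test scores
post = np.array([31, 33, 27, 34, 29, 33, 28, 35])   # hypothetical post-test scores

t_stat, p_value = ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")        # "significant" if p < 0.05
```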

Robot behavior decision based on Motivation and Hierarchicalized Emotions

  • Ahn, Hyoung-Chul; Park, Myoung-Soo; Choi, Jin-Young
    • 제어로봇시스템학회 ICCAS 2004, pp. 1776-1780, 2004
  • In this paper, we propose a new emotion model and a robot behavior decision model based on it. As in humans, emotions are organized hierarchically in four levels (momentary emotions, mood, attitude, and personality) and are determined from the robot's behavior and human responses. They are combined with motivation, which is determined from external stimuli, to decide the robot's behavior.

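One way to read the architecture described above is as emotion levels with progressively slower dynamics whose aggregate is combined with a motivation value to score behaviors. The sketch below is a minimal interpretation of that idea; the update rule, weights, and behavior set are all illustrative assumptions, not the authors' model.

```python
# Sketch: four emotion levels with different update rates, combined with a
# motivation value to pick a behavior. All numbers are illustrative.
from dataclasses import dataclass, field

@dataclass
class EmotionState:
    levels: dict = field(default_factory=lambda: {
        "momentary": 0.0, "mood": 0.0, "attitude": 0.0, "personality": 0.2})
    rates: dict = field(default_factory=lambda: {
        "momentary": 0.9, "mood": 0.3, "attitude": 0.05, "personality": 0.0})

    def update(self, stimulus_valence: float) -> None:
        """Faster levels track the stimulus closely; slower levels barely move."""
        for name, rate in self.rates.items():
            self.levels[name] += rate * (stimulus_valence - self.levels[name])

    def overall(self) -> float:
        return sum(self.levels.values()) / len(self.levels)

def choose_behavior(emotion: EmotionState, motivation: float, behaviors: dict) -> str:
    """behaviors maps a name to (emotion weight, motivation weight); pick the top score."""
    score = lambda w: w[0] * emotion.overall() + w[1] * motivation
    return max(behaviors, key=lambda b: score(behaviors[b]))

# Usage with hypothetical behaviors:
# choose_behavior(EmotionState(), 0.8, {"approach": (0.5, 0.5), "rest": (-0.5, -0.5)})
```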

Emotion Recognition by CCD Color Image

  • Joo, Young-Hoon; Lee, Sang-Yoon; Oh, Jae-Heung; Sim, Kwee-Bo
    • 제어로봇시스템학회 ICCAS 2001, pp. 138.2-138, 2001
  • This paper proposes a technique for recognizing human emotion from CCD color images. To do this, we first acquire a color image from the CCD camera and then propose a method for recognizing the expression represented by the structural correlation of facial feature points (eyebrows, eyes, nose, mouth). In the proposed method, human emotion is divided into four classes (surprise, anger, happiness, sadness). Finally, we demonstrate the effectiveness of the proposed method through experiments.

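The "structural correlation of feature points" idea can be sketched as geometric ratios between landmark positions followed by simple rules for the four emotion classes. The landmark names, features, and thresholds below are purely illustrative and are not taken from the paper.

```python
# Sketch: simple structural features from facial landmarks (eyebrows, eyes,
# nose, mouth) and a hand-set rule mapping to four emotions.
import numpy as np

def structural_features(pts: dict) -> dict:
    """pts maps names like 'left_brow', 'left_eye', 'mouth_top', 'mouth_bottom',
    'mouth_left', 'mouth_right' to (x, y) numpy arrays in pixel coordinates."""
    eye_brow_gap = np.linalg.norm(pts["left_brow"] - pts["left_eye"])
    mouth_open   = np.linalg.norm(pts["mouth_top"] - pts["mouth_bottom"])
    mouth_width  = np.linalg.norm(pts["mouth_left"] - pts["mouth_right"])
    return {"eye_brow_gap": eye_brow_gap,
            "mouth_open": mouth_open / (mouth_width + 1e-6)}  # normalized openness

def classify_emotion(f: dict) -> str:
    """Toy rules: raised brows + open mouth -> surprise; lowered brows -> anger, etc."""
    if f["mouth_open"] > 0.6 and f["eye_brow_gap"] > 20:
        return "surprise"
    if f["eye_brow_gap"] < 10:
        return "anger"
    if f["mouth_open"] > 0.3:
        return "happiness"
    return "sadness"
```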