• Title/Abstract/Keyword: Structure of Emotion

Search results: 290

이종 음성 DB 환경에 강인한 감성 분류 체계에 대한 연구 (A Study on Robust Emotion Classification Structure Between Heterogeneous Speech Databases)

  • 윤원중;박규식
    • 한국음향학회지, Vol. 28, No. 5, pp. 477-482, 2009
  • Emotion recognition systems in corporate environments such as call centers suffer considerable performance degradation and instability because of differences in recording conditions between the speech used for emotion training and the query speech of unspecified customers. To overcome this problem, this paper extends the conventional neutral/anger classification scheme into a two-stage classification structure by applying the changes in emotional characteristics associated with male and female speakers. Experimental results show that the proposed method not only removes the system instability caused by differences in recording environments but also yields close to a 25% improvement in recognition performance.
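As a rough illustration of the two-stage idea above (gender first, then a per-gender emotion model), the following Python sketch uses synthetic feature vectors and scikit-learn SVMs; the feature dimensionality, classifier choice, and data are assumptions for illustration, not the authors' implementation.

```python
# Minimal two-stage (gender -> emotion) classification sketch on synthetic data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 13))           # pre-extracted acoustic features (assumed 13-dim)
gender = rng.integers(0, 2, size=400)    # 0 = male, 1 = female
emotion = rng.integers(0, 2, size=400)   # 0 = neutral, 1 = angry

# Stage 1: gender classifier trained on all utterances.
gender_clf = SVC().fit(X, gender)

# Stage 2: a separate emotion classifier per gender group.
emotion_clf = {g: SVC().fit(X[gender == g], emotion[gender == g]) for g in (0, 1)}

def classify(x):
    """Route an utterance through the gender stage, then the matching emotion stage."""
    g = int(gender_clf.predict(x.reshape(1, -1))[0])
    return g, int(emotion_clf[g].predict(x.reshape(1, -1))[0])

print(classify(X[0]))
```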

연령별 향 감성구조 및 향 감성에 따른 자율신경계 반응 (Psychological Structure and ANS Response by Odor Induced Emotion)

  • 박미경;정희윤;이경화;최정인;이배환;손진훈
    • 감성과학, Vol. 4, No. 2, pp. 39-45, 2001
  • This study was conducted to identify the age-specific emotional structure of odors and to examine autonomic nervous system (ANS) responses according to odor-induced emotion. Participants were 72 people in total, 24 in each age group (teens, twenties, forties), and the stimulus odors were cedarwood, grapefruit, teebaum, peppermint, and rose. While each odor was presented, blood volume pulse, skin temperature, electrodermal activity, and electrocardiogram responses were recorded, and subjective emotional responses to the odor were measured. The psychological structure of odor-induced emotion comprised five factors, namely aesthetics, intensity, naturalness, individuality, and romance, and did not differ across age groups. The emotional factors predicting odor preference differed partly by age group. Odors rated as "bad" elicited larger ANS responses driven by sympathetic nervous system activity than odors rated as "good".

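The reported ANS finding (larger sympathetic responses to "bad" odors than to "good" odors) is the kind of result that can be checked with a simple within-subject comparison. The sketch below is only illustrative: the skin-conductance values are synthetic and the paired t-test is an assumed analysis, not necessarily the one used in the paper.

```python
# Paired comparison of a sympathetic ANS measure between disliked and liked odors.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
scr_good = rng.normal(0.8, 0.2, size=72)   # per-subject mean skin conductance, liked odors
scr_bad = rng.normal(1.0, 0.2, size=72)    # per-subject mean skin conductance, disliked odors

t, p = stats.ttest_rel(scr_bad, scr_good)  # within-subject (paired) test
print(f"t = {t:.2f}, p = {p:.4f}")
```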

현대패션에 대한 감성과 감정의 관계 연구(제1보) (A Study of the relationship between Fashion Sensibility and Emotion(Part II))

  • 김유진;이경희
    • 한국의류학회지, Vol. 27, No. 3/4, pp. 418-428, 2003
  • The purpose of this study was to provide guidance for more objective and appropriate clothing design reflecting today's consumers' modes of value consumption, by identifying the meaning structure of, and relationship between, fashion sensibility and emotion. The stimuli were 54 photographs of contemporary costume representing Izard's DES. A questionnaire consisting of a 25-pair bipolar adjective scale of fashion sensibility and an 18-noun scale of emotion was distributed to 970 males and females living in the Pusan area. The data were analyzed by factor analysis, correlation analysis, and regression analysis using the SPSS statistical package. The major findings of this research were as follows. 1. Fashion sensibility consists of estheticism, maturity, character, and femininity, representing 57.17% of total variance. 2. Emotion consists of negative emotion, distress afraid, arousal, shame, and enjoyment, representing 70.84% of total variance. 3. Regarding the relation between fashion sensibility and emotion, most factors showed significant relationships.
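The factor/correlation/regression pipeline described above can be sketched as follows; the ratings are synthetic placeholders for the 25 bipolar adjective pairs and one emotion scale, and scikit-learn stands in for the SPSS procedures actually used.

```python
# Correlate sensibility ratings with an emotion score, then fit a regression.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
sensibility = rng.normal(size=(970, 25))   # 25 bipolar adjective ratings per respondent
emotion = rng.normal(size=970)             # score on one emotion-noun scale

corr = np.array([np.corrcoef(sensibility[:, j], emotion)[0, 1] for j in range(25)])
reg = LinearRegression().fit(sensibility, emotion)
print(corr.round(2), round(reg.score(sensibility, emotion), 3))
```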

패션컬렉션에 나타난 Head Image 연구 (A Study Regarding Head Image's Through Fashion Collection)

  • 김애경;이경희
    • 한국의류학회지, Vol. 27, No. 8, pp. 904-912, 2003
  • This study analyzes the formative features, fashion sensibility, emotion, and meaning structure of the head image, which shapes an individual's image, through fashion collections, and examines the correlations among them, in order to offer fundamental data for image making. First, since personality and charm operate as important factors in the fashion sensibility of the head image, careful consideration of the head image is very effective for creating a charming and individual image. Second, emotion scores were higher than fashion sensibility scores for head images, indicating that the head image has a stronger influence on emotion than on fashion sensibility. Third, the correlation between fashion sensibility and emotion in head images showed that an individual head image is effective for attracting public gaze by eliciting negative emotion, while an attractive head image is effective for giving a pleasant feeling by eliciting positive emotion. Fourth, among the head image types, avant-garde, punk, and kitsch images were evaluated as the most individual, romantic and ethnic images as the most attractive, the natural image as the most feminine, and the elegant image as the most mature. Fifth, comparing the evaluations of experts and non-experts regarding the fashion sensibility and emotion of head images, experts are accustomed to peculiar and strong head images, whereas non-experts responded to them more sensitively and evaluated them more highly.

호르몬 모델에 기반한 안드로이드의 감정모델 (Emotional Model for an Android based on Hormone Model)

  • 이동욱;이태근;정준영;소병록;손웅희;백문홍;김홍석;이호길
    • 로봇학회논문지, Vol. 2, No. 4, pp. 341-345, 2007
  • This paper proposes an emotional interaction model between humans and robots using an android. An android is a kind of humanoid robot whose outward appearance is almost the same as that of a human; it serves as a robot platform for implementing and testing emotional expressions and human interaction. For the android to behave like a human, the structure of its internal emotion system is very important. In this research, we propose a novel emotional model for an android based on biological hormones and an emotion space. The proposed emotion model has the advantage that it can represent emotional change over time through hormone dynamics.

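A toy sketch of the hormone-based idea: hormone levels are pulsed by stimuli, decay over time, and an emotion intensity is read off the hormone state. The two-dimensional hormone vector, rate constants, and read-out below are illustrative assumptions rather than the paper's model.

```python
# First-order hormone dynamics: a stimulus pulse followed by exponential decay.
import numpy as np

dt, decay = 0.1, 0.5
hormone = np.zeros(2)                       # e.g. arousal-related and valence-related levels
stimulus = {10: np.array([0.8, -0.4])}      # stimulus pulse injected at step 10

trace = []
for step in range(100):
    hormone += stimulus.get(step, 0.0) - decay * hormone * dt   # pulse plus exponential decay
    trace.append(hormone.copy())

emotion_intensity = float(np.linalg.norm(trace[-1]))  # emotion read-out from the hormone state
print(round(emotion_intensity, 3))
```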

Causal relationship study of human sense for odor

  • Kaneki, N.;Shimada, K.;Yamada, H.;Miura, T.;Kamimura, H.;Tanaka, H.
    • 한국감성과학회 2002년도 춘계학술대회 논문집, pp. 257-260, 2002
  • The impressions of odors are subjective and show individual differences. In this study, impressions of odors were investigated by covariance structure analysis. Forty-six subjects (men in their twenties) recorded their reactions to ten odorants by grading them on a seven-point scale in terms of twelve adjective pairs. Their reactions were quantified using factor analysis and covariance structure analysis. The extracted factors were "preference", "arousal", and "persistency". The subjects were classified into three groups according to the best-fitting causal models (structural equation models). Each group had a different causal relationship and a different impression structure for odors. This suggests that subjective impressions of odor can be evaluated using covariance structure analysis.

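Only the exploratory factor step is easy to sketch generically; a full covariance structure (SEM) analysis needs a dedicated tool. Below, synthetic seven-point ratings on twelve adjective pairs are reduced to three factors with scikit-learn; the data shape and factor count mirror the abstract, but nothing else is taken from the paper.

```python
# Extract three factors from 12 adjective-pair ratings (synthetic 7-point data).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
ratings = rng.integers(1, 8, size=(46 * 10, 12)).astype(float)  # 46 subjects x 10 odorants

fa = FactorAnalysis(n_components=3)        # e.g. "preference", "arousal", "persistency"
scores = fa.fit_transform(ratings)
print(fa.components_.round(2))             # loadings of each adjective pair on the factors
```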

데이터 표준화를 위한 패션 감성 분류 체계 (Classification System of Fashion Emotion for the Standardization of Data)

  • 박낭희;최윤미
    • 한국의류학회지, Vol. 45, No. 6, pp. 949-964, 2021
  • Accumulation of high-quality data is crucial for AI learning. The goal of using AI in fashion services is to propose creative, personalized solutions that come close to the know-how of a human operator. Such customized solutions require an understanding of both fashion products and emotions; it is therefore necessary to accumulate data on the attributes of fashion products and on fashion emotion. The first step in accumulating fashion data is to standardize the attributes with a coherent system. The purpose of this study is to propose a classification system for fashion emotion. To this end, images of fashion products were collected, and metadata were obtained by allowing consumers to freely describe their emotions about the fashion images. An emotional classification system with a hierarchical structure was then constructed by performing frequency and CONCOR analyses on the metadata. A final classification system was proposed by supplementing attribute values with reference to findings from previous studies and SNS data.
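The CONCOR step mentioned above (convergence of iterated correlations) can be sketched compactly: correlate the correlation matrix with itself until every entry converges to ±1, then split items by sign. The emotion-term co-occurrence matrix below is random placeholder data; the real analysis would start from the consumer metadata.

```python
# CONCOR sketch: iterate correlations of the correlation matrix, then block by sign.
import numpy as np

rng = np.random.default_rng(4)
cooccurrence = rng.integers(0, 5, size=(12, 30)).astype(float)  # 12 emotion terms x 30 images

m = np.corrcoef(cooccurrence)               # term-by-term correlation matrix
for _ in range(100):
    if np.allclose(np.abs(m), 1.0, atol=1e-6):
        break                               # converged: every entry is +1 or -1
    m = np.corrcoef(m)

blocks = (m[0] > 0).astype(int)             # sign pattern relative to the first term
print(blocks)                               # 0/1 block membership for each emotion term
```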

A Multimodal Emotion Recognition Using the Facial Image and Speech Signal

  • Go, Hyoun-Joo;Kim, Yong-Tae;Chun, Myung-Geun
    • International Journal of Fuzzy Logic and Intelligent Systems, Vol. 5, No. 1, pp. 1-6, 2005
  • In this paper, we propose an emotion recognition method using facial images and speech signals. Six basic emotions, including happiness, sadness, anger, surprise, fear, and dislike, are investigated. Facial expression recognition is performed using multi-resolution analysis based on the discrete wavelet transform, with feature vectors obtained through ICA (Independent Component Analysis). For emotion recognition from the speech signal, the recognition algorithm is run independently for each wavelet subband and the final decision is obtained from a multi-decision making scheme. After merging the facial and speech emotion recognition results, we obtained better performance than previous methods.
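A decision-level fusion step like the one described can be sketched by multiplying per-modality class probabilities; the toy features and logistic-regression classifiers below replace the paper's wavelet/ICA front end and are assumptions for illustration only.

```python
# Late fusion of facial and speech classifiers over six basic emotions (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
emotions = ["happiness", "sadness", "anger", "surprise", "fear", "dislike"]
y = rng.integers(0, 6, size=300)
face_feat = rng.normal(size=(300, 20)) + y[:, None] * 0.1   # stand-in facial features
speech_feat = rng.normal(size=(300, 16)) + y[:, None] * 0.1 # stand-in speech features

face_clf = LogisticRegression(max_iter=1000).fit(face_feat, y)
speech_clf = LogisticRegression(max_iter=1000).fit(speech_feat, y)

# Combine the two modality-wise posteriors and take the best fused class.
fused = face_clf.predict_proba(face_feat) * speech_clf.predict_proba(speech_feat)
print(emotions[int(np.argmax(fused[0]))])
```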

생리적 내재반응 및 얼굴표정 간 확률 관계 모델 기반의 감정인식 시스템에 관한 연구 (A Study on Emotion Recognition Systems based on the Probabilistic Relational Model Between Facial Expressions and Physiological Responses)

  • 고광은;심귀보
    • 제어로봇시스템학회논문지, Vol. 19, No. 6, pp. 513-519, 2013
  • Current vision-based approaches to emotion recognition, such as facial expression analysis, have many technical limitations in real circumstances and are not suitable for applications that rely on them alone in practical environments. In this paper, we propose an approach to emotion recognition that combines extrinsic representations and intrinsic activities among the natural responses of humans who are given specific stimuli for inducing emotional states. The intrinsic activities can be used to compensate for the uncertainty of extrinsic representations of emotional states. This combination is performed using PRMs (Probabilistic Relational Models), an extended version of Bayesian networks, which are learned by greedy-search and expectation-maximization algorithms. Extrinsic emotion features related to facial expressions and intrinsic emotion features based on physiological signals, both drawn from previous research, are combined into the attributes of the PRMs in the emotion recognition domain. Maximum likelihood estimation with the given dependency structure and estimated parameter set is used to classify the label of the target emotional states.
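PRMs themselves require a dedicated probabilistic-modeling framework, so the sketch below only illustrates the final maximum-likelihood classification step with a drastically simplified model: extrinsic (facial) and intrinsic (physiological) features are concatenated and modeled with per-class Gaussians. Everything here, including the feature sizes and the four-state label set, is an assumption for illustration.

```python
# Maximum-likelihood classification of emotional states from combined features.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(6)
labels = rng.integers(0, 4, size=200)                        # four target emotional states
facial = rng.normal(size=(200, 6)) + labels[:, None] * 0.2   # extrinsic (expression) features
physio = rng.normal(size=(200, 4)) + labels[:, None] * 0.2   # intrinsic (physiological) features

X = np.hstack([facial, physio])
clf = GaussianNB().fit(X, labels)       # class-conditional Gaussians fit by maximum likelihood
print(clf.predict(X[:5]))
```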

독일어 감정음성에서 추출한 포먼트의 분석 및 감정인식 시스템과 음성인식 시스템에 대한 음향적 의미 (An Analysis of Formants Extracted from Emotional Speech and Acoustical Implications for the Emotion Recognition System and Speech Recognition System)

  • 이서배
    • 말소리와 음성과학, Vol. 3, No. 1, pp. 45-50, 2011
  • Formant structure of speech associated with five different emotions (anger, fear, happiness, neutral, sadness) was analysed. Acoustic separability of vowels (or emotions) associated with a specific emotion (or vowel) was estimated using F-ratio. According to the results, neutral showed the highest separability of vowels followed by anger, happiness, fear, and sadness in descending order. Vowel /A/ showed the highest separability of emotions followed by /U/, /O/, /I/ and /E/ in descending order. The acoustic results were interpreted and explained in the context of previous articulatory and perceptual studies. Suggestions for the performance improvement of an automatic emotion recognition system and automatic speech recognition system were made.

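The F-ratio used above as a separability index is essentially a between-class to within-class variance ratio. The sketch below computes it for one formant across the five emotions on synthetic values; degrees-of-freedom corrections and the real formant measurements are omitted.

```python
# Between-class / within-class variance ratio for F1 across five emotions (synthetic Hz values).
import numpy as np

rng = np.random.default_rng(7)
emotions = ["anger", "fear", "happiness", "neutral", "sadness"]
f1 = {e: rng.normal(700 + 40 * i, 60, size=50) for i, e in enumerate(emotions)}

grand_mean = np.mean(np.concatenate(list(f1.values())))
between = np.mean([(v.mean() - grand_mean) ** 2 for v in f1.values()])
within = np.mean([v.var() for v in f1.values()])
print(round(between / within, 3))            # higher ratio = emotions more separable on F1
```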