• Title/Summary/Keyword: 감정 (感情, emotion)

Design of an Artificial Emotion Model (인공 감정 모델의 설계)

  • Lee, In-K.;Seo, Suk-T.;Jeong, Hye-C.;Kwon, Soon-H.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.17 no.5
    • /
    • pp.648-653
    • /
    • 2007
  • Research on artificial emotion, which generates emotion artificially from various external excitations by imitating human emotion, has begun in recent years. However, conventional studies, in which the emotion state changes exponentially or linearly with an external emotion excitation, have the drawback that the emotion state varies rapidly and abruptly. In this paper, we propose an artificial emotion generation model that reflects not only the strength and frequency of external emotion excitations but also their period, and that represents the emotion state as a sigmoid curve with respect to time. We also propose an artificial emotion system that generates emotion in situations with no external emotional excitation by recollecting past emotional excitations, and we show its effectiveness through computer simulation results.
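
The sigmoid-shaped emotion trajectory described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' model; the steepness `k` and midpoint delay `t0` are assumed parameters:

```python
import math

def emotion_state(stimuli, t, k=1.0, t0=5.0):
    """Sigmoid-shaped emotion state at time t.

    stimuli: list of (time, strength) external emotional excitations.
    Each excitation contributes an S-shaped rise centred t0 seconds
    after it occurs, so the state changes smoothly rather than with
    the abrupt exponential or linear jump the abstract criticizes.
    """
    state = 0.0
    for ts, strength in stimuli:
        if t >= ts:
            state += strength / (1.0 + math.exp(-k * ((t - ts) - t0)))
    return state
```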

The Changing Trace of Emotional state by Memory retrieval and Knowledge Reasoning process (기억회상과 지식추론에 따른 감정 상태 변화의 추이)

  • Shim, JeongYon
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.4
    • /
    • pp.83-88
    • /
    • 2013
  • Many studies adopting brain functions into engineering systems have been made in recent years as brain science has developed. If we investigate the parts that take part in memory and emotional processing, we find that the hippocampus, the center of memory, and the amygdala, the center of emotion, cooperate closely with each other. Knowledge affects emotion, and emotion affects knowledge. During human decision making, emotional factors have an important effect on the decision-making process. To implement a more delicate intelligent system, a knowledge base coupled to emotional factors should be designed. Accordingly, in this paper, starting from the idea of a cooperating system between the hippocampus and the amygdala, we design a Knowledge Emotion Binding System and propose an emotional-change mechanism based on memory retrieval and the knowledge reasoning process.

Emotion Recognition using Robust Speech Recognition System (강인한 음성 인식 시스템을 사용한 감정 인식)

  • Kim, Weon-Goo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.5
    • /
    • pp.586-591
    • /
    • 2008
  • This paper studied an emotion recognition system combined with a robust speech recognition system in order to improve emotion recognition performance. For this purpose, the effect of emotional variation on the speech recognition system and robust feature parameters for speech recognition were studied using a speech database containing various emotions. Final emotion recognition is performed using the input utterance and its emotion model according to the result of speech recognition. In the experiments, the robust speech recognition system is an HMM-based speaker-independent word recognizer using RASTA mel-cepstral coefficients and their derivatives, with cepstral mean subtraction (CMS) for signal bias removal. Experimental results showed that the emotion recognizer combined with the speech recognition system performs better than the emotion recognizer alone.
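
Cepstral mean subtraction, the signal-bias-removal step named in the abstract, amounts to subtracting the utterance-level mean from every cepstral frame, so a constant channel bias cancels out. A minimal sketch (the list-of-frames layout is an assumption):

```python
def cepstral_mean_subtraction(frames):
    """Remove the per-utterance channel bias from cepstral features.

    frames: list of cepstral coefficient vectors, one per frame.
    Returns the frames with the utterance-level mean of each
    coefficient subtracted.
    """
    n = len(frames)
    dim = len(frames[0])
    mean = [sum(f[d] for f in frames) / n for d in range(dim)]
    return [[f[d] - mean[d] for d in range(dim)] for f in frames]
```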

An Emotion Recognition Method using Facial Expression and Speech Signal (얼굴표정과 음성을 이용한 감정인식)

  • 고현주;이대종;전명근
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.6
    • /
    • pp.799-807
    • /
    • 2004
  • In this paper, we deal with an emotion recognition method using facial images and speech signals. Six basic human emotions, including happiness, sadness, anger, surprise, fear, and dislike, are investigated. Emotion recognition from facial expressions is performed using a multi-resolution analysis based on the discrete wavelet transform, and feature vectors are then extracted with the linear discriminant analysis method. The emotion recognition method for the speech signal, on the other hand, runs the recognition algorithm independently for each wavelet subband and obtains the final recognition from a multi-decision-making scheme.
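
Multi-resolution analysis with the discrete wavelet transform repeatedly splits a signal into low-frequency (approximation) and high-frequency (detail) subbands. A one-level sketch using the Haar wavelet, the simplest choice; the paper does not state which wavelet it uses:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Splits an even-length signal into an approximation (low-pass)
    and a detail (high-pass) subband; applying this recursively to
    the approximation yields a multi-resolution analysis.
    """
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail
```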

Emotion Recognition from Natural Language Text Using Predicate Logic Form (Predicate Logic Form을 이용한 자연어 텍스트로부터의 감정인식)

  • Seol, Yong-Soo;Kim, Dong-Joo;Kim, Han-Woo;Park, Jung-Ki
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2010.07a
    • /
    • pp.411-412
    • /
    • 2010
  • Research on emotion recognition from natural-language text has traditionally been based on emotion keywords. Using emotion keywords alone, however, loses the syntactic and semantic information the original sentence carries. To overcome this, this paper converts natural-language text into Predicate Logic form and uses it as the base data for emotional information processing. A dependency-grammar parser is used for the conversion. Among the generated predicates, only those carrying emotional information must be identified; for this purpose we built an Emotional Predicate Dictionary, in which each predicate carries information for mapping it to a predefined concept class. A concept class indicates whether emotional information is present, which emotion it is, and in what situation the emotion arises. Once the text has been converted to predicates and mapped to concept classes, Lazarus's emotion generation rules, implemented with KBANN, are applied to determine the finally recognized emotion. Experiments show that the implemented system produces recognition results that agree with human-perceived emotions in about 70% of cases or more.
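
The dictionary lookup described above can be sketched as follows. The predicate strings and concept-class fields are hypothetical illustrations, not entries from the authors' Emotional Predicate Dictionary:

```python
# Hypothetical entries illustrating the idea of an Emotional Predicate
# Dictionary: each predicate maps to a concept class that says whether
# it carries emotion, which emotion, and the situation that evokes it.
EMOTIONAL_PREDICATE_DICTIONARY = {
    "win(agent, prize)":  {"emotional": True,  "emotion": "joy",
                           "situation": "goal achieved"},
    "lose(agent, match)": {"emotional": True,  "emotion": "sadness",
                           "situation": "goal blocked"},
    "walk(agent, path)":  {"emotional": False, "emotion": None,
                           "situation": None},
}

def emotional_predicates(predicates):
    """Keep only predicates whose concept class carries emotional information."""
    return [(p, EMOTIONAL_PREDICATE_DICTIONARY[p]["emotion"])
            for p in predicates
            if EMOTIONAL_PREDICATE_DICTIONARY.get(p, {}).get("emotional")]
```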

Relationship among Emotional Labor, Emotional Leadership and Burnout in Emergency Room Nurses - Comparison of employee-focused emotional labor and job-focused emotional labor - (응급실 간호사의 감정노동과 감성리더십, 소진의 관계 -직원중심 감정노동과 직무중심 감정노동 비교-)

  • Eo, Yong-Sook;Kim, Myo-Sung
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.18 no.9
    • /
    • pp.136-145
    • /
    • 2017
  • This study aimed to identify the relationships among employee-focused and job-focused emotional labor, emotional leadership, and burnout among emergency room nurses. Data were collected from 168 emergency nurses working at general hospitals in one metropolitan city and analyzed with descriptive statistics, t-tests, ANOVA, and Pearson's correlation coefficients using the SPSS/WIN program. According to the results, the mean scores on the employee-focused emotional labor scale were 3.51 points for surface acting and 3.26 for deep acting. The mean scores on the job-focused scale were 3.73 for frequency of emotional display, 3.36 for duration of interpersonal interactions, and 3.46 for variety of emotional expressions. The mean score on the emotional leadership scale was 3.57, and the mean scores for burnout were 4.59 for emotional exhaustion, 4.13 for depersonalization, and 3.60 for diminished personal accomplishment. Job-focused emotional labor scores were higher than employee-focused scores. Surface acting and frequency of interactions were the forms of emotional labor most frequently performed by emergency nurses. Employee-focused and job-focused emotional labor showed significant partial correlations with emotional leadership and burnout. Based on these results, future research needs to develop effective strategies for managing the emotional labor and burnout of nurses working in emergency rooms.

Engine of computational Emotion model for emotional interaction with human (인간과 감정적 상호작용을 위한 '감정 엔진')

  • Lee, Yeon Gon
    • Science of Emotion and Sensibility
    • /
    • v.15 no.4
    • /
    • pp.503-516
    • /
    • 2012
  • According to research on robots and software agents to date, computational emotion models are system-dependent, so it is hard to separate an emotion model from an existing system and reuse it in a new one. Therefore, I introduce the Engine of computational Emotion model (hereafter EE), which can be integrated with any robot or agent. The EE is an engine, i.e., software whose form is independent of inputs and outputs: it handles only the generation and processing of emotions, without the input (perception) and output (expression) phases. The EE can be interfaced with any inputs and outputs, and produces emotions not only from emotion itself but also from personality and a person's emotions. In addition, the EE can reside in any robot or agent as a software library, or be used as a separate system that communicates with it. In the EE, the emotions are the primary emotions: joy, surprise, disgust, fear, sadness, and anger. Each emotion is a vector consisting of a string and a coefficient; the EE receives these vectors from the input interface and sends them to the output interface. Each emotion is connected to a list of emotional experiences, and these lists, consisting of a string and a coefficient for each experience, are used to generate and process emotional states. The emotional experiences consist of emotion vocabulary covering the variety of human emotional experience. The EE can be used to build interactive products that respond appropriately to human emotions. The significance of the study lies in developing a system that induces a person to feel that a product sympathizes with them. The EE can therefore help provide emotionally sympathetic services in products in the HRI and HCI areas.
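
The emotion representation the abstract describes, a string plus a coefficient per primary emotion, can be sketched as a small data structure. The `normalize` helper is an illustrative addition, not part of the EE:

```python
from dataclasses import dataclass

# Primary emotions handled by the engine, as listed in the abstract.
PRIMARY_EMOTIONS = ("joy", "surprise", "disgust", "fear", "sadness", "anger")

@dataclass
class EmotionVector:
    """An emotion as the abstract describes it: a string plus a coefficient."""
    label: str
    coefficient: float

def normalize(vectors):
    """Scale coefficients so they sum to 1, keeping relative strengths."""
    total = sum(v.coefficient for v in vectors)
    if total == 0:
        return vectors
    return [EmotionVector(v.label, v.coefficient / total) for v in vectors]
```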

Speech emotion recognition for affective human robot interaction (감성적 인간 로봇 상호작용을 위한 음성감정 인식)

  • Jang, Kwang-Dong;Kwon, Oh-Wook
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2006.02a
    • /
    • pp.555-558
    • /
    • 2006
  • Speech containing emotion is one of the cues that lets a listener infer the speaker's psychological state. To enable smooth affective interaction between humans and robots, we present a method for extracting features from, and classifying, the emotion contained in speech signals. Basic acoustic and prosodic features are extracted from the speech signal, and feature vectors of statistics computed from them are fed to a support vector machine (SVM) based pattern classifier, which classifies six emotions: angry, bored, happy, neutral, sad, and surprised. Recognition experiments with the SVM gave a recognition rate of 51.4%, while human judgment gave 60.4%. Furthermore, when the emotion labels of the database, originally assigned by the speakers, were replaced with the emotion states judged by multiple listeners, SVM classification still reached 51.2% accuracy, showing that the basic features used for emotion recognition are effective.
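
The statistics-of-base-features step can be sketched as follows. Which statistics the authors actually compute is not stated, so mean, standard deviation, minimum, and maximum are assumptions:

```python
import math

def feature_statistics(frames):
    """Utterance-level statistics of frame-level acoustic/prosodic features.

    frames: list of per-frame values (e.g. pitch or energy).
    Returns (mean, std, min, max) -- the kind of statistics vector
    a classifier such as an SVM would consume.
    """
    n = len(frames)
    mean = sum(frames) / n
    var = sum((x - mean) ** 2 for x in frames) / n
    return (mean, math.sqrt(var), min(frames), max(frames))
```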

An Emotion Recognition and Expression Method using Facial Image and Speech Signal (음성 신호와 얼굴 표정을 이용한 감정인식 및 표현 기법)

  • Ju, Jong-Tae;Mun, Byeong-Hyeon;Seo, Sang-Uk;Jang, In-Hun;Sim, Gwi-Bo
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2007.04a
    • /
    • pp.333-336
    • /
    • 2007
  • In this paper, speech signals and facial images, the modalities most widely used in emotion recognition, are used to recognize four emotions (happiness, sadness, anger, surprise), and the recognition results obtained from each modality are fused with a multi-modal technique. For emotion recognition from facial images, feature vectors are extracted using principal component analysis (PCA); for the speech signal, acoustic features excluding linguistic characteristics are used. The extracted features are each fed to a neural network to classify patterns by emotion, and the recognized result drives an emotion expression system that expresses the emotion.
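
The multi-modal fusion step can be sketched as a weighted combination of the per-modality recognition scores. The abstract does not specify the fusion rule, so the weighted sum and the equal default weights are assumptions:

```python
def fuse_scores(face_scores, speech_scores, w_face=0.5):
    """Weighted-sum fusion of per-emotion scores from two modalities.

    face_scores, speech_scores: dicts mapping emotion name -> score
    (e.g. neural network outputs). Returns the emotion with the
    highest combined score.
    """
    fused = {e: w_face * face_scores[e] + (1 - w_face) * speech_scores[e]
             for e in face_scores}
    return max(fused, key=fused.get)
```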

Personalized Emotional Engine for effective contents providing (컨텐츠의 효과적인 공급을 위한 개인화된 감성엔진)

  • Ham, Jun-Seok;Ko, Il-Ju
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2007.02c
    • /
    • pp.152-157
    • /
    • 2007
  • To provide content that matches an individual's sensibility, one must know the individual's tastes and what emotion they are feeling when they use the content. If the individual's taste is known, content can be reclassified to match it; and if the individual's emotional state at the moment of use is known, content matching that sensibility can be delivered quickly from among the classified content. This paper proposes an emotion engine that recognizes an individual's taste, infers their emotional state, and outputs results according to that emotional state to a connected platform. Since people with similar personalities tend to have similar tastes, individual personality is used to distinguish taste, with the MBTI as the classification tool. To infer emotion, information about the surrounding environment is analyzed based on the OCC model to compute the type and magnitude of emotional stimuli. An emotion graph manages the generation, maintenance, and decay of emotions from these stimuli, and output matching the resulting emotional state is produced.
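
The generation, maintenance, and decay of emotions managed by the emotion graph can be sketched with a simple decay model. The exponential form and the half-life parameter are assumptions, not the paper's actual graph mechanism:

```python
import math

def emotion_level(stimuli, t, half_life=10.0):
    """Emotion intensity at time t from OCC-style stimuli with decay.

    stimuli: list of (time, magnitude) emotional stimuli.
    Each stimulus is generated at its time, then decays with the
    given half-life, so an emotion is created, maintained while
    stimuli keep arriving, and eventually dies out.
    """
    decay = math.log(2.0) / half_life
    return sum(m * math.exp(-decay * (t - ts))
               for ts, m in stimuli if t >= ts)
```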