• Title/Summary/Keyword: Artificial Emotion Model

Design of an Artificial Emotion Model (인공 감정 모델의 설계)

  • Lee, In-K.; Seo, Suk-T.; Jeong, Hye-C.; Kwon, Soon-H.
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.5 / pp.648-653 / 2007
  • Research on artificial emotion, which imitates human emotion by generating emotion artificially from various external excitations, has begun in recent years. However, conventional studies, in which the emotion state changes exponentially or linearly with an external emotional excitation, have the drawback that the emotion state varies rapidly and abruptly. In this paper, we propose an artificial emotion generation model that reflects not only the strength and frequency of external emotional excitations but also their period, and that represents the emotion state as a sigmoid curve with respect to time. We also propose an artificial emotion system that generates emotion even in the absence of external emotional excitation by recollecting past excitations, and we show its effectiveness through computer simulation results.
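The sigmoid-shaped response to repeated excitations described in the abstract can be sketched roughly as follows. The parameters (`strength`, `t0`, `slope`) and the way stimuli are summed are illustrative assumptions, not the paper's actual formulation.

```python
import math

def emotion_state(t, strength=1.0, t0=0.0, slope=1.0):
    """Sigmoid-shaped emotion intensity over time after one stimulus.

    Hypothetical parameters: strength scales the asymptote, t0 shifts
    the onset, and slope controls how gradually the state rises.
    """
    return strength / (1.0 + math.exp(-slope * (t - t0)))

# Repeated stimuli raise the asymptote gradually, so the combined
# state grows smoothly instead of jumping linearly or exponentially.
stimuli = [(0.0, 0.4), (3.0, 0.3), (6.0, 0.3)]  # (onset time, strength)

def combined_state(t):
    return sum(emotion_state(t, s, t0) for t0, s in stimuli)
```

Under this sketch the state rises smoothly toward the sum of the stimulus strengths rather than changing abruptly at each excitation.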

Behavior Decision Model Based on Emotion and Dynamic Personality

  • Yu, Chan-Woo; Choi, Jin-Young
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2005.06a / pp.101-106 / 2005
  • In this paper, we propose a behavior decision model for a robot based on artificial emotion, various motivations, and a dynamic personality. Our goal is to build a robot that can express its emotions in a human-like way. To achieve this goal, we applied several theories of emotion and personality from psychology. In particular, we introduced the concept of a dynamic personality model for a robot. Drawing on this concept, we built a behavior decision model in which the robot's emotion expression adapts to various environments through interaction between the human and the robot.

Visualizing Emotions with an Artificial Emotion Model Based on Psychology -Focused on Characters in Hamlet- (심리학 기반 인공감정모델을 이용한 감정의 시각화 -햄릿의 등장인물을 중심으로-)

  • Ham, Jun-Seok; Ryeo, Ji-Hye; Ko, Il-Ju
    • Science of Emotion and Sensibility / v.11 no.4 / pp.541-552 / 2008
  • Speech alone cannot express emotions accurately, because the kind, size, and amount of an emotion are hard to estimate from speech. Hamlet, the protagonist of Shakespeare's 'Hamlet', experiences emotions that cannot be expressed through speech alone, because he passes through many dramatic situations. We therefore propose an artificial emotion model that, instead of expressing emotion with speech, expresses and visualizes the current emotion with color and location, and we use it to visualize the emotions of characters in 'Hamlet'. We designed the artificial emotion in four steps that reflect the characteristics of emotion. First, the model analyzes an incoming emotional stimulus as a relationship between causes and effects, and determines its kind and amount. Second, we propose the Emotion Graph Unit, which expresses the generation, maintenance, and decay of a single analyzed stimulus from the first step according to its characteristics. Third, using the Emotion Graph Unit, we propose the Emotion Graph, which expresses repeated stimuli of the same emotion; one Emotion Graph is kept per emotion, so the generation and decay of each emotion are managed individually. Last, we propose the Emotion Field, which combines the current values of the Emotion Graphs according to the inter-relations among emotions and visualizes the current emotion as a color and a location in the field. We applied the artificial emotion to the play 'Hamlet' to visualize the emotional changes of Hamlet and his mother, Gertrude.
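The last two steps, per-emotion decay combined into one position in an Emotion Field, can be sketched loosely as below. The anchor coordinates, the exponential decay, and the half-life are hypothetical stand-ins; the paper's actual Emotion Graph shapes and field layout are not reproduced here.

```python
# Hypothetical 2D anchor points for a few emotions in an "Emotion Field";
# the paper's real layout and color mapping are not reproduced.
ANCHORS = {"joy": (1.0, 1.0), "anger": (-1.0, 1.0), "grief": (-1.0, -1.0)}

def decayed(amount, age, half_life=5.0):
    """Exponential decay stands in for a per-emotion Emotion Graph."""
    return amount * 0.5 ** (age / half_life)

def field_position(stimuli, now):
    """Blend the decayed per-emotion amounts into one field position.

    stimuli: list of (emotion, amount, onset_time) tuples.
    """
    weights = {}
    for emotion, amount, t0 in stimuli:
        weights[emotion] = weights.get(emotion, 0.0) + decayed(amount, now - t0)
    total = sum(weights.values()) or 1.0
    x = sum(w * ANCHORS[e][0] for e, w in weights.items()) / total
    y = sum(w * ANCHORS[e][1] for e, w in weights.items()) / total
    return x, y
```

For example, equal amounts of joy and anger would pull the position to the midpoint between their anchors, which could then be mapped to a color.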

Implementation of Intelligent Virtual Character Based on Reinforcement Learning and Emotion Model (강화학습과 감정모델 기반의 지능적인 가상 캐릭터의 구현)

  • Woo, Jong-Ha; Park, Jung-Eun; Oh, Kyung-Whan
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.3 / pp.259-265 / 2006
  • Learning and emotion are essential for implementing intelligent robots. In this paper, we implement an intelligent virtual character that interacts with the user through reinforcement learning and has an internal emotion model. The virtual character acts autonomously in a 3D virtual environment according to its internal state, and the user can teach it specific behaviors through repeated directions. Mouse gestures, recognized with an artificial neural network, are used to perceive these directions. An Emotion-Mood-Personality model is proposed to express emotions, and we examine how the character's emotions and learned behaviors change as it interacts with the user.
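Learning behaviors from repeated user directions could follow a standard tabular Q-learning update; this is a generic sketch, not the paper's exact learning rule, and the behavior set is hypothetical. User praise or scolding after a direction would supply the reward.

```python
ACTIONS = ["approach", "sit", "bark"]  # hypothetical behavior set

def q_update(q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step.

    q maps (state, action) pairs to values; repeated positive rewards
    for the same direction gradually reinforce that behavior.
    """
    best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
```

Repeating the same rewarded direction raises the corresponding Q-value step by step, which matches the abstract's idea of teaching behaviors through repetition.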

An Intelligent Emotion Recognition Model Using Facial and Bodily Expressions

  • Kim, Jae Kyeong; Park, Won Kuk; Choi, Il Young
    • Asia Pacific Journal of Information Systems / v.27 no.1 / pp.38-53 / 2017
  • As sensor and image processing technologies make it easy to collect information on users' behavior, many researchers have examined automatic emotion recognition based on facial expressions, body expressions, and tone of voice, among others. In the multimodal case combining facial and body expressions, many studies have relied on ordinary cameras, which produce only two-dimensional images and therefore provide limited information. In this paper, we propose an artificial neural network-based model that uses a high-definition webcam and a Kinect to recognize users' emotions from facial and bodily expressions while they watch a movie trailer. We validate the proposed model in a naturally occurring field environment rather than in an artificially controlled laboratory environment. The results of this research should support the wider use of emotion recognition models in advertisements, exhibitions, and interactive shows.

A Design of Artificial Emotion Model (인공 감정 모델의 설계)

  • Lee, In-Geun; Seo, Seok-Tae; Jeong, Hye-Cheon; Gwon, Sun-Hak
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2007.04a / pp.58-62 / 2007
  • Alongside research on recognizing a person's emotional state from speech, facial images, and text, studies are under way on artificial emotion, which imitates human emotion by generating emotion from various external stimuli. However, existing artificial emotion research changes the emotion state linearly or exponentially in response to an external emotional stimulus, so the emotion state changes abruptly. In this paper, we propose an emotion generation model that reflects not only the strength and frequency of external emotional stimuli but also their repetition period, and that represents the change of emotion over time as a sigmoid curve. We also propose an artificial emotion system that can generate emotion even in the absence of external emotional stimuli through recollection of past stimuli.

Design of an Artificial Emotion for visualizing emotion (감정의 시각화를 위한 인공감정 설계)

  • Ham, Jun-Seok; Son, Chung-Yeon; Jeong, Chan-Sun; Park, Jun-Hyeong; Yeo, Ji-Hye; Go, Il-Ju
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2009.11a / pp.91-94 / 2009
  • Most existing research on artificial emotion has focused on recognizing emotion and expressing it physically. However, emotion is expressed differently depending on personality, changes its aspect over time, and depends on the emotional state that existed before a new stimulus arrived. This paper proposes an artificial emotion model that manages emotion according to personality, time, and the relations among emotions, and visualizes the emotion to be expressed at the current moment. To visualize emotion, the proposed model has an Emotion Graph and an Emotion Field. The Emotion Graph is a two-dimensional graph that expresses a specific emotion according to personality and time. The Emotion Field is a three-dimensional model that visualizes the different emotions expressed by the Emotion Graphs according to time and the relations among emotions. To test the visualization, we applied the proposed artificial emotion to a simulator in which emotion recognition and physical expression are simplified to a text-based form.

Human emotional elements and external stimulus information-based Artificial Emotion Expression System for HRI (HRI를 위한 사람의 내적 요소 기반의 인공 정서 표현 시스템)

  • Oh, Seung-Won; Hahn, Min-Soo
    • Proceedings of the Korea HCI Society Conference / 2008.02a / pp.7-12 / 2008
  • In human-robot interaction, the role of emotion is becoming more important, so robots need an emotion expression mechanism similar to that of humans. In this paper, we propose a new emotion expression system based on psychological studies. It consists of five affective elements: the emotion, the mood, the personality, the tendency, and the machine rhythm. Each element influences the emotion expression pattern in its own way according to its characteristics. As a result, even robots exposed to the same external stimuli can show different emotion expression patterns. The proposed system may contribute to more natural and human-friendly human-robot interaction and to more intimate relationships between people and robots.

Developing and Adopting an Artificial Emotion by Technological Approaching Based on Psychological Emotion Model (심리학 기반 감정 모델의 공학적 접근에 의한 인공감정의 제안과 적용)

  • Ham, Jun-Seok; Ryeo, Ji-Hye; Ko, Il-Ju
    • Proceedings of the Korean Society of Computer Information Conference / 2008.06a / pp.331-336 / 2008
  • Even in the same situation, different people feel different emotions, so there is a limit to generalizing emotion and expressing the current emotional state quantitatively. To represent the current emotional state, this paper takes an engineering approach to psychological models of human emotion and proposes a psychology-based artificial emotion. The proposed artificial emotion reflects, on a psychological basis, the cause-and-effect relations of emotion generation and the differences in emotion due to personality, time, successive emotional stimuli, and the inter-relations among emotions. To represent the current emotional state as a position, we propose the Emotion Field, and we express the current state by a location in the field and the color associated with that location. To visualize changes of emotional state with the proposed artificial emotion, we visualized the emotional changes of Hamlet, a character in Shakespeare's 'Hamlet'.

Multimodal Attention-Based Fusion Model for Context-Aware Emotion Recognition

  • Vo, Minh-Cong; Lee, Guee-Sang
    • International Journal of Contents / v.18 no.3 / pp.11-20 / 2022
  • Human emotion recognition is an exciting topic that has attracted many researchers for a long time. In recent years there has been increasing interest in exploiting contextual information for emotion recognition. Explorations in psychology show that emotional perception is affected not only by facial expressions but also by contextual information from the scene, such as human activities, interactions, and body poses. These findings initiated a trend in computer vision of treating contexts as additional modalities, alongside facial expressions, for inferring the predicted emotion. However, contextual information has not been fully exploited: the scene emotion created by the surrounding environment can shape how people perceive emotion. Moreover, simple additive fusion in multimodal training is impractical, because the modalities do not contribute equally to the final prediction. This paper contributes to this growing area of research by exploring the effectiveness of the emotional scene gist of the input image, which includes the emotions, emotional feelings, and actions or events that directly trigger emotional reactions, for inferring the emotional state of the primary target. We also present an attention-based fusion network that combines multimodal features according to their impact on the target emotional state. We demonstrate the effectiveness of the method through a significant improvement on the EMOTIC dataset.
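The core idea of attention-based fusion, weighting each modality's features by a learned relevance score instead of adding them equally, can be sketched as follows. In the paper the scores would come from a trained attention network; here they are plain inputs, and the whole sketch is an assumption rather than the authors' implementation.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scalar scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_fusion(features, scores):
    """Combine modality features by their attention weights.

    features: one equal-length vector per modality (e.g. face, body,
    scene context); scores: one relevance scalar per modality.
    Returns the weighted sum of the feature vectors.
    """
    weights = softmax(scores)
    dim = len(features[0])
    return [sum(w * f[i] for w, f in zip(weights, features)) for i in range(dim)]
```

With equal scores this reduces to simple averaging; a high score for one modality lets it dominate the fused representation, which is the behavior additive fusion cannot express.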