• Title/Summary/Keyword: Emotion Engine


KOBIE: A Pet-type Emotion Robot (KOBIE: 애완형 감성로봇)

  • Ryu, Joung-Woo;Park, Cheon-Shu;Kim, Jae-Hong;Kang, Sang-Seung;Oh, Jin-Hwan;Sohn, Joo-Chan;Cho, Hyun-Kyu
    • The Journal of Korea Robotics Society / v.3 no.2 / pp.154-163 / 2008
  • This paper presents a concept for the development of a pet-type robot with an emotion engine. The pet-type robot, named KOBIE (KOala roBot with Intelligent Emotion), can interact with a person through touch. KOBIE is equipped with tactile sensors on its body so that it can recognize touching behaviors such as "Stroke", "Tickle", and "Hit". We have also covered KOBIE with synthetic fur fabric so that people can feel affection for it. KOBIE can likewise express an emotional status that varies with the circumstances in which it finds itself. The emotion engine of KOBIE's emotion expression system generates an emotional status in an emotion vector space associated with predefined needs and mood models. To examine the feasibility of our emotion expression system, we verified that the emotional status in the emotion vector space changes in response to touching behavior. We also observed the reactions of children who interacted with three kinds of pet-type robots, KOBIE, PARO, and AIBO, for roughly 10 minutes each, to investigate children's preferences among pet-type robots.
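The abstract above describes touch behaviors driving a state in an emotion vector space tied to needs and mood models, but does not give the model's equations. The following is a minimal illustrative sketch only: the 2-D (valence, arousal) space, the per-touch deltas, and the decay-toward-baseline rule are all assumptions, not KOBIE's published model.

```python
# Hypothetical per-touch deltas in a (valence, arousal) emotion space.
TOUCH_EFFECTS = {
    "stroke": (0.3, -0.1),
    "tickle": (0.2, 0.4),
    "hit":    (-0.5, 0.5),
}

class EmotionVector:
    """Toy emotion-vector state nudged by touch and decaying toward a mood baseline."""

    def __init__(self, baseline=(0.0, 0.0), decay=0.9):
        self.state = list(baseline)
        self.baseline = baseline
        self.decay = decay

    def on_touch(self, behavior):
        # Shift the state by the touch delta, clamped to [-1, 1].
        dv, da = TOUCH_EFFECTS[behavior]
        self.state[0] = max(-1.0, min(1.0, self.state[0] + dv))
        self.state[1] = max(-1.0, min(1.0, self.state[1] + da))

    def tick(self):
        # Each time step, drift back toward the mood baseline.
        for i, b in enumerate(self.baseline):
            self.state[i] = b + self.decay * (self.state[i] - b)

robot = EmotionVector()
robot.on_touch("stroke")
robot.on_touch("tickle")
print(robot.state)
```

The expression layer would then map regions of this space to behaviors; that mapping is outside this sketch.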

Engine of computational Emotion model for emotional interaction with human (인간과 감정적 상호작용을 위한 '감정 엔진')

  • Lee, Yeon Gon
    • Science of Emotion and Sensibility / v.15 no.4 / pp.503-516 / 2012
  • According to research on robots and software agents to date, computational emotion models are system-dependent, so it is hard to separate an emotion model from its existing system and reuse it in a new one. We therefore introduce the Engine of computational Emotion model (hereafter, EE), which can be integrated with any robot or agent. The EE is a software engine independent of inputs and outputs: it handles only the generation and processing of emotions, without the Input (Perception) and Output (Expression) phases. The EE can be interfaced with any inputs and outputs, and it produces emotions not only from emotions themselves but also from a person's personality and emotions. In addition, the EE can reside in any robot or agent as a kind of software library, or run as a separate system that communicates with them. In the EE, the emotions are the primary emotions: Joy, Surprise, Disgust, Fear, Sadness, and Anger. Each emotion is a vector consisting of a string and a coefficient; the EE receives these vectors from the input interface and sends them to the output interface. Each emotion is connected to a list of emotional experiences, and these lists, each entry consisting of a string and a coefficient, are used to generate and process emotional states. The emotional experiences are built from an emotion vocabulary covering the diverse emotional experiences of humans. The EE can be used to build interactive products that respond appropriately to human emotions. The significance of this study lies in developing a system that induces people to feel that a product sympathizes with them. The EE can therefore help provide emotionally sympathetic services in products in the HRI and HCI areas.
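The abstract specifies the EE's interface: (string, coefficient) emotion vectors over six primary emotions, exchanged with arbitrary perception and expression layers. A minimal sketch of that contract follows; the blending rule (a weighted moving average) and the personality-bias form are our assumptions, not the paper's.

```python
PRIMARY = ("joy", "surprise", "disgust", "fear", "sadness", "anger")

class EmotionEngine:
    """Input/output-agnostic emotion core: takes (label, coefficient) pairs,
    returns a ranked emotional state. Blending rule is illustrative only."""

    def __init__(self, personality=None, inertia=0.7):
        # Assumed personality form: a per-emotion sensitivity multiplier.
        self.personality = personality or {e: 1.0 for e in PRIMARY}
        self.inertia = inertia          # how strongly the old state persists
        self.state = {e: 0.0 for e in PRIMARY}

    def process(self, inputs):
        """inputs: list of (emotion_label, coefficient) pairs from any sensor."""
        for label, coeff in inputs:
            if label not in self.state:
                continue                # engine ignores labels it doesn't model
            biased = coeff * self.personality[label]
            self.state[label] = (self.inertia * self.state[label]
                                 + (1 - self.inertia) * biased)
        # Emit the state as ranked (label, coefficient) vectors for any expressor.
        return sorted(self.state.items(), key=lambda kv: -kv[1])

ee = EmotionEngine(personality={**{e: 1.0 for e in PRIMARY}, "fear": 0.5})
print(ee.process([("joy", 0.8), ("fear", 0.8)]))
```

Because `process` neither reads sensors nor renders expressions, the same object could be linked into a robot as a library or wrapped in a network service, which is the decoupling the paper argues for.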

Emotion Engine for Digital Character (디지털 캐릭터를 위한 감성엔진)

  • Kim Ji-Hwan;Cho Sung-Hyun;Choi Jong-Hak;Yang Jung-Jin
    • Proceedings of the Korean Information Science Society Conference / 2006.06b / pp.208-210 / 2006
  • Recently, characters have come to play a central role in virtual reality applications such as online games, films, and animation, creating a need for more active, human-like characters. Among these demands, this paper focuses on emotion-based characters and presents the architecture of an Emotion Engine that reflects each character's traits and derives emotions from inter-character interaction, based on Emotion AI's Artificial Emotion Engine Model and the OCC Model.

Emotional Head Robot System Using 3D Character (3D 캐릭터를 이용한 감정 기반 헤드 로봇 시스템)

  • Ahn, Ho-Seok;Choi, Jung-Hwan;Baek, Young-Min;Shamyl, Shamyl;Na, Jin-Hee;Kang, Woo-Sung;Choi, Jin-Young
    • Proceedings of the KIEE Conference / 2007.04a / pp.328-330 / 2007
  • Emotion is becoming one of the important elements of intelligent service robots. Emotional communication can make the relationship between humans and robots more comfortable. We developed an emotional head robot system using a 3D character and designed an emotion engine to generate the robot's emotion. The results of face recognition and hand recognition are used as the input data of the emotion engine. The 3D character expresses nine emotions and speaks about its own emotional status. The head robot also keeps a memory of its degree of attraction, which can be changed by the input data. We tested the head robot and confirmed its functions.

Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • Kim, Jin Ok
    • Journal of Internet Computing and Services / v.13 no.5 / pp.9-19 / 2012
  • A virtual human used as an HCI element in digital contents expresses various emotions across modalities such as facial expression and body posture. However, few studies have considered combinations of such nonverbal multimodal cues in emotion perception. To implement an emotional virtual human, computational engine models have to consider how a combination of nonverbal modalities such as facial expression and body posture will be perceived by users. This paper examines the impact of nonverbal multimodal cues in the design of emotion-expressing virtual humans. First, the relative impacts of the different modalities are analysed by exploring emotion recognition for virtual humans. An experiment then evaluates the contribution of congruent facial and postural expressions to recognizing the basic emotion categories, as well as the valence and activation dimensions. Measurements also cover the impact of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life. Experimental results show that congruence of a virtual human's facial and postural expressions facilitates the perception of emotion categories; categorical recognition is influenced by the facial expression modality, whereas the postural modality is preferred for judging the level of the activation dimension. These results will be used in the implementation of an animation engine system and behavior synchronization for emotion-expressing virtual humans.

An Ontology-based Emotion Engine for Digital Characters (디지털 캐릭터를 위한 온톨로지 기반의 감성엔진)

  • Kim Ji-Hwan;Cho Sung-Hyun;Choi Jong-Hak;Yang Jung-Jin
    • Proceedings of the Korea Intelligent Information System Society Conference / 2006.06a / pp.255-261 / 2006
  • Digital characters have come to play a central role in many fields, creating a need for more active, human-like characters. Among these demands, this paper focuses on emotion-based characters and presents the architecture of an Emotion Engine that, based on the OCC Model and the AEE Model, uses an ontology and an inference engine to convert context information into emotions and can reflect each character's traits.

Emotion Based Gesture Animation Generation Mobile System (감정 기반 모바일 손제스쳐 애니메이션 제작 시스템)

  • Lee, Jung-Suk;Byun, Hae-Won
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.129-134 / 2009
  • Recently, the percentage of people using SMS services has been increasing. However, it is difficult to express one's complicated emotions with the text and emoticons of existing SMS services. This paper focuses on that point and makes practical use of character animation to convey emotion and nuance correctly and entertainingly. It proposes an emotion-based gesture animation generation system that uses a character's facial expressions and gestures to deliver emotion more vividly and clearly than speech alone. Michel [1] investigated interview videos of a person whose gesturing style was to be animated and proposed a gesture generation graph for stylized gesture animation. Extending Michel's [1] research, we analyze and abstract the emotional gestures of Disney animation characters and model these emotional gestures in 3D. To express a person's emotion, we propose an emotion gesture generation graph that reflects an emotion flow graph, which expresses the flow of emotions probabilistically. We investigated user reactions to assess the suitability of the proposed system.
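The abstract's emotion flow graph expresses the flow of emotions probabilistically; one natural reading is a Markov-chain-style walk over emotion states, each state selecting a gesture clip. The sketch below illustrates only that reading: the states, transition probabilities, and gesture names are invented, not taken from the paper.

```python
import random

# Hypothetical emotion-flow graph: state -> [(next_state, probability), ...]
FLOW = {
    "neutral": [("joy", 0.5), ("sadness", 0.3), ("neutral", 0.2)],
    "joy":     [("joy", 0.6), ("neutral", 0.4)],
    "sadness": [("sadness", 0.5), ("neutral", 0.5)],
}
# Hypothetical mapping from emotion state to an animation clip.
GESTURES = {"neutral": "idle", "joy": "wave", "sadness": "slump"}

def next_emotion(current, rng=random):
    # Sample the next emotion by inverse-CDF over the outgoing edges.
    r, acc = rng.random(), 0.0
    for emotion, p in FLOW[current]:
        acc += p
        if r < acc:
            return emotion
    return current  # numerical-safety fallback

def animate(steps, start="neutral", rng=random):
    # Walk the flow graph and emit one gesture clip per step.
    state, clips = start, []
    for _ in range(steps):
        state = next_emotion(state, rng)
        clips.append(GESTURES[state])
    return clips

print(animate(5))
```

A gesture generation graph in the paper's sense would additionally pick among several stylized clips per emotion; collapsing that to one clip per state keeps the sketch short.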

An Artificial Emotion Model for Expression of Game Character (감정요소가 적용된 게임 캐릭터의 표현을 위한 인공감정 모델)

  • Kim, Ki-Il;Yoon, Jin-Hong;Park, Pyoung-Sun;Kim, Mi-Jin
    • Proceedings of the HCI Society of Korea Conference / 2008.02b / pp.411-416 / 2008
  • The development of games has brought about the birth of game characters that are visually very realistic. At present, there is much enthusiasm for giving characters emotions through devices such as avatars and emoticons. However, in the freely changing environment of a game, such devices merely express the value derived from an initial input rather than creating expressions of emotion that actively respond to their surroundings. As such, game characters do not yet display deep emotions. In light of this, the present article proposes the 'CROSS (Character Reaction On Specific Situation) Model AE Engine' for game characters, in order to develop characters that actively express action and emotion within the changing environment of games. This is accomplished by classifying the emotional components applicable to game characters based on the OCC model, one of the best-known cognitive psychological models, and then systematizing, with an ontology, a game-play situation analysis of a commercial RPG.

Emotion Recognition Method of Competition-Cooperation Using Electrocardiogram (심전도를 이용한 경쟁-협력의 감성 인식 방법)

  • Park, Sangin;Lee, Don Won;Mun, Sungchul;Whang, Mincheol
    • Science of Emotion and Sensibility / v.21 no.3 / pp.73-82 / 2018
  • Attempts have been made to recognize social emotions, including competition and cooperation, when designing interaction in workplaces. This study aimed to determine the cardiac responses associated with classifying the competition-cooperation dimension of social emotion. Sixty students from Sangmyung University participated in the study and were asked to play a pattern game to experience the social emotions associated with competition and cooperation. Electrocardiograms were measured during the task and analyzed to obtain time-domain indicators, such as RRI, SDNN, and pNN50, and frequency-domain indicators, such as VLF, LF, HF, VLF/HF, LF/HF, lnVLF, lnLF, lnHF, and lnVLF/lnHF. The significance of classifying social emotions was assessed using an independent t-test. The rule base for the classification was determined using significant parameters from 30 participants and verified on data obtained from another 30 participants. As a result, 91.67% of participants were correctly classified. This study proposes a new method of classifying the social emotions of competition and cooperation and provides objective data for designing social interaction.
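The time-domain indicators named above have standard HRV definitions over a sequence of RR intervals: RRI is the mean interval, SDNN the standard deviation of intervals, and pNN50 the percentage of successive-interval differences exceeding 50 ms. A short sketch using illustrative interval values (the study's actual preprocessing and windowing are not described here):

```python
from statistics import mean, pstdev

def hrv_time_domain(rr_ms):
    """Compute basic time-domain HRV indicators from RR intervals in ms."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    return {
        "RRI":   mean(rr_ms),    # mean RR interval (ms)
        "SDNN":  pstdev(rr_ms),  # standard deviation of RR intervals (ms)
        # share of successive differences greater than 50 ms, as a percentage
        "pNN50": 100 * sum(d > 50 for d in diffs) / len(diffs),
    }

rr = [812, 790, 845, 860, 795, 805, 870]  # illustrative RR intervals (ms)
print(hrv_time_domain(rr))
```

The study's frequency-domain indicators (VLF, LF, HF and their ratios and logs) would additionally require resampling the RR series and estimating its power spectrum, which is beyond this sketch.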

The Design and Implementation of a Driver's Emotion Estimation based Application/Service Framework for Connected Cars (커넥티드 카를 위한 운전자 감성추론 기반의 차량 제어 및 애플리케이션/서비스 프레임워크)

  • Kook, Joongjin
    • The Transactions of the Korean Institute of Electrical Engineers P / v.67 no.2 / pp.100-105 / 2018
  • In this paper, we determine a driver's stress and fatigue level from the driver's physiological signals in a connected-car environment, and accordingly design and implement the architecture of a connected-car platform that provides services to make the driving environment comfortable and reduce the driver's fatigue. It includes a gateway between the AVN and the ECU for vehicle control, a framework for AVN-based native applications and web applications, and a sensing device and an emotion estimation engine for application services. This paper provides the element technologies for connected-car-based convergence services, their implementation methods, and reference models for service design.