• Title/Abstract/Keyword: Facial Expressions of Robots

Search results: 15 items

감정 경계를 이용한 로봇의 생동감 있는 얼굴 표정 구현 (Life-like Facial Expression of Mascot-Type Robot Based on Emotional Boundaries)

  • 박정우;김우현;이원형;정명진
    • 로봇학회논문지, Vol. 4 No. 4, pp.281-288, 2009
  • Nowadays, many robots have evolved to imitate human social skills so that sociable interaction with humans is possible. Socially interactive robots require abilities different from those of conventional robots. For instance, human-robot interactions are accompanied by emotion, similar to human-human interactions, so robot emotional expression is very important for humans. This is particularly true for facial expressions, which play an important role in communication among other non-verbal forms. In this paper, we introduce a method of creating lifelike facial expressions in robots using variations in the affect values that constitute the robot's emotions, based on emotional boundaries. The proposed method was examined through experiments with two facial robot simulators.

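As a rough illustration of the emotional-boundary idea in the abstract above, the sketch below only re-triggers a facial expression when a continuously varying affect value crosses a boundary between emotion regions. The region labels, thresholds, and update loop are illustrative assumptions, not the authors' actual model.

```python
# Minimal sketch of boundary-triggered facial expression updates.
# Emotion regions and thresholds are illustrative assumptions.

EMOTION_BOUNDARIES = [   # (lower bound of affect value, emotion label)
    (-1.0, "sad"),
    (-0.2, "neutral"),
    (0.2, "happy"),
]

def classify(affect: float) -> str:
    """Return the emotion region that the affect value falls into."""
    label = EMOTION_BOUNDARIES[0][1]
    for lower, name in EMOTION_BOUNDARIES:
        if affect >= lower:
            label = name
    return label

def run(affect_stream):
    """Re-render the face only when the affect value crosses a boundary."""
    current = None
    for affect in affect_stream:
        emotion = classify(affect)
        if emotion != current:          # boundary crossed -> new expression
            current = emotion
            print(f"affect={affect:+.2f} -> show '{emotion}' expression")

if __name__ == "__main__":
    run([-0.5, -0.3, -0.1, 0.1, 0.25, 0.4, 0.1, -0.3])
```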

소셜 로봇의 표정 커스터마이징 구현 및 분석 (The Implementation and Analysis of Facial Expression Customization for a Social Robot)

  • 이지연;박하은;김병헌;이희승
    • 로봇학회논문지, Vol. 18 No. 2, pp.203-215, 2023
  • Social robots, which are mainly used by individuals, place greater emphasis on human-robot relationships (HRR) than other types of robots do. Emotional expression is one of the key factors that give HRR its value, and emotions are mainly expressed through the face. However, because of cultural and preference differences, the desired robot facial expressions differ subtly from user to user. We expected that a robot facial expression customization tool could mitigate such difficulties and consequently improve HRR. To verify this, we created a robot facial expression customization tool and a prototype robot, and implemented an emotion engine suitable for generating robot facial expressions in a dynamic human-robot interaction setting. In our experiments, users agreed that a customized version of the robot has a more positive effect on HRR than a predefined version. We also suggest recommendations for future improvements of the robot facial expression customization process.
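
A minimal sketch of what a facial-expression customization layer could look like: per-user overrides merged over default expression parameters. The parameter names and values are assumptions for illustration; the paper's actual tool and emotion engine are not reproduced here.

```python
# Sketch of per-user facial-expression customization: user overrides are
# merged over a default expression table. Parameter names are assumptions.

DEFAULT_EXPRESSIONS = {
    "happy": {"mouth_curve": 0.8, "eye_openness": 0.6, "brow_raise": 0.2},
    "sad":   {"mouth_curve": -0.6, "eye_openness": 0.4, "brow_raise": -0.3},
}

class ExpressionCustomizer:
    def __init__(self):
        self.user_overrides = {}   # user_id -> {emotion -> partial parameter dict}

    def customize(self, user_id, emotion, **params):
        """Record the parameters this user wants to change for an emotion."""
        self.user_overrides.setdefault(user_id, {}).setdefault(emotion, {}).update(params)

    def expression_for(self, user_id, emotion):
        """Default parameters with this user's overrides applied on top."""
        merged = dict(DEFAULT_EXPRESSIONS[emotion])
        merged.update(self.user_overrides.get(user_id, {}).get(emotion, {}))
        return merged

if __name__ == "__main__":
    tool = ExpressionCustomizer()
    tool.customize("user_a", "happy", mouth_curve=0.5)   # prefers a subtler smile
    print(tool.expression_for("user_a", "happy"))
    print(tool.expression_for("user_b", "happy"))        # untouched default
```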

Facial Actions 과 애니메이션 원리에 기반한 로봇의 얼굴 제스처 생성 (Generation of Robot Facial Gestures based on Facial Actions and Animation Principles)

  • 박정우;김우현;이원형;이희승;정명진
    • 제어로봇시스템학회논문지, Vol. 20 No. 5, pp.495-502, 2014
  • This paper proposes a method to generate diverse robot facial expressions and facial gestures in order to support long-term HRI. First, nine basic dynamics for diverse robot facial expressions are determined based on the dynamics of human facial expressions and on animation principles, so that even identical emotions can be expressed in different ways. In the second stage, facial actions are added to express facial gestures, such as sniffling or wailing loudly for sadness, or laughing aloud or smiling for happiness. To evaluate the effectiveness of our approach, we compared the facial expressions of the developed robot with and without the proposed method. The survey results showed that the proposed method helps robots generate more realistic facial expressions.
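
As a toy illustration of layering facial gestures over a base expression so that the same emotion is not always displayed identically, the sketch below picks a gesture variant at random. The variant lists follow the examples in the abstract; everything else is assumed.

```python
# Sketch: one of several gesture variants is layered on top of a base
# expression so repeated displays of the same emotion differ.
# Base expressions and timings are made up for illustration.

import random

BASE_EXPRESSION = {"sadness": "frown", "happiness": "smile"}

GESTURE_VARIANTS = {
    "sadness":   ["sniffle", "wail_loudly", "none"],
    "happiness": ["laugh_aloud", "broad_smile", "none"],
}

def render(emotion: str) -> str:
    """Compose the base expression with a randomly chosen gesture variant."""
    base = BASE_EXPRESSION[emotion]
    gesture = random.choice(GESTURE_VARIANTS[emotion])   # varies over repetitions
    return f"{base} + {gesture}"

if __name__ == "__main__":
    for _ in range(3):
        print(render("sadness"), "|", render("happiness"))
```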

3차원 정서 공간에서 마스코트 형 얼굴 로봇에 적용 가능한 동적 감정 모델 (Dynamic Emotion Model in 3D Affect Space for a Mascot-Type Facial Robot)

  • 박정우;이희승;조수훈;정명진
    • 로봇학회논문지, Vol. 2 No. 3, pp.282-287, 2007
  • Humanoid and android robots are emerging as the trend shifts from industrial robots to personal robots, so human-robot interaction will increase. The ultimate objective of humanoid and android robots is a robot that is like a human, and in this respect the implementation of facial expressions is necessary for making a human-like robot. This paper proposes a dynamic emotion model for a mascot-type robot that displays facial expressions which are both similar to human expressions and easy to recognize.

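The 3D affect-space formulation suggests a simple sketch: the emotion state is a point in a three-dimensional space that is pulled toward stimulus-driven targets and decays back toward neutral. The axis names (valence, arousal, stance) and the leaky-integrator dynamics below are common choices assumed for illustration, not the paper's specific model.

```python
# Sketch of a dynamic emotion state in a 3D affect space.  The axis names
# and the leaky-integrator dynamics are illustrative assumptions.

import numpy as np

class DynamicEmotion3D:
    def __init__(self, decay=1.0):
        self.state = np.zeros(3)   # current point: [valence, arousal, stance]
        self.decay = decay         # pull back toward the neutral origin

    def step(self, stimulus=None, dt=0.1):
        """Move toward the stimulus while decaying toward neutral (origin)."""
        if stimulus is None:
            stimulus = np.zeros(3)
        self.state += dt * (stimulus - self.decay * self.state)
        return self.state.copy()

if __name__ == "__main__":
    model = DynamicEmotion3D()
    joy = np.array([0.8, 0.6, 0.2])   # a stimulus pushing toward a 'joy' region
    for t in range(5):
        print(t, np.round(model.step(stimulus=joy), 3))
```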

인공근육을 이용한 얼굴로봇 (A Face Robot Actuated with Artificial Muscle)

  • 곽종원;지호준;정광목;남재도;전재욱;최혁렬
    • 제어로봇시스템학회논문지, Vol. 10 No. 11, pp.991-999, 2004
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between humans and machines. In this paper, we present a face robot actuated with artificial muscle based on a dielectric elastomer. By exploiting the properties of the polymer, it is possible to actuate the covering skin and eyes, as well as to provide human-like expressivity, without employing complicated mechanisms. The robot is driven by seven types of actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw and neck modules, corresponding to movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. Each module communicates with the others via the CAN communication protocol, and according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.
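
A sketch of how an expression could be composed from per-module motion commands and serialized as bus frames, in the spirit of the module-combination scheme described above. The CAN IDs, position encoding, and expression tables are invented for illustration.

```python
# Sketch of composing an emotional expression from per-module commands
# that are then sent over a bus.  Module names follow the abstract; the
# CAN IDs and position values are invented for illustration.

MODULE_CAN_ID = {"eye": 0x10, "eyebrow": 0x11, "eyelid": 0x12,
                 "brow": 0x13, "cheek": 0x14, "jaw": 0x15, "neck": 0x16}

# Each expression is a combination of target positions (0-255) per module.
EXPRESSIONS = {
    "surprise":  {"eyebrow": 230, "eyelid": 250, "jaw": 180},
    "happiness": {"cheek": 200, "eyelid": 120, "jaw": 90},
}

def frames_for(expression: str):
    """Build (can_id, payload) frames, one per actuator module involved."""
    frames = []
    for module, position in EXPRESSIONS[expression].items():
        frames.append((MODULE_CAN_ID[module], bytes([position])))
    return frames

if __name__ == "__main__":
    for can_id, payload in frames_for("surprise"):
        print(f"send id=0x{can_id:02X} data={payload.hex()}")
```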

로봇과 인간의 상호작용을 위한 얼굴 표정 인식 및 얼굴 표정 생성 기법 (Recognition and Generation of Facial Expression for Human-Robot Interaction)

  • 정성욱;김도윤;정명진;김도형
    • 제어로봇시스템학회논문지, Vol. 12 No. 3, pp.255-263, 2006
  • In the last decade, face analysis, e.g. face detection, face recognition, and facial expression recognition, has been a very lively and expanding research field. As computer-animated agents and robots bring a social dimension to human-computer interaction, interest in this field is increasing rapidly. In this paper, we introduce an artificial emotion mimic system that can recognize human facial expressions and also generate the recognized facial expression. To recognize human facial expressions in real time, we propose a facial expression classification method performed by weak classifiers obtained using new rectangular feature types. In addition, we generate artificial facial expressions with the developed robotic system based on biological observations. Finally, experimental results of facial expression recognition and generation are shown to demonstrate the validity of our robotic system.
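
For readers unfamiliar with rectangular-feature weak classifiers, the sketch below evaluates a two-rectangle (Haar-like) feature on an integral image and thresholds it. The specific feature geometry, threshold, and polarity are illustrative assumptions, not the classifiers trained in the paper.

```python
# Sketch of a threshold weak classifier over a rectangular (Haar-like)
# feature, the general family the abstract refers to.

import numpy as np

def integral_image(img):
    """Cumulative sums so any rectangle sum costs O(1)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h rectangle with top-left corner (x, y)."""
    total = ii[y + h - 1, x + w - 1]
    if x > 0: total -= ii[y + h - 1, x - 1]
    if y > 0: total -= ii[y - 1, x + w - 1]
    if x > 0 and y > 0: total += ii[y - 1, x - 1]
    return total

def two_rect_feature(ii, x, y, w, h):
    """Left-half sum minus right-half sum (a vertical edge response)."""
    return rect_sum(ii, x, y, w // 2, h) - rect_sum(ii, x + w // 2, y, w // 2, h)

def weak_classify(ii, threshold=0.0, polarity=1):
    # Feature location, threshold and polarity are illustrative; a boosting
    # stage would normally select them from training data.
    value = two_rect_feature(ii, x=2, y=2, w=8, h=8)
    return 1 if polarity * value < polarity * threshold else 0

if __name__ == "__main__":
    patch = np.random.rand(16, 16)
    print(weak_classify(integral_image(patch)))
```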

감정표현을 위한 FACS 기반의 안드로이드 헤드의 개발 (Development of FACS-based Android Head for Emotional Expressions)

  • 최동운;이덕연;이동욱
    • 방송공학회논문지, Vol. 25 No. 4, pp.537-544, 2020
  • In this paper, we propose a method of emotional expression through the development of an android robot head based on FACS (Facial Action Coding System). An android robot is a robot with a human-like appearance that has artificial skin and artificial muscles. To implement emotional expressions on a robot, the number and placement of the artificial muscles must be determined; to this end, human facial movements were analyzed anatomically. FACS is a system, grounded in anatomy, that analyzes the facial movements involved in producing expressions. In FACS, an expression is formed as a combination of AUs (Action Units), and the number and positions of the robot's artificial muscles are determined on the basis of these AUs. The developed android head has motors and wires corresponding to 30 artificial muscles, as well as artificial skin for realizing expressions. To mount many motors in the limited head space, a miniature eyeball module was developed using a spherical joint and springs, and the 30 motors were arranged based on an efficient design of the wire routing. The fabricated android head has 30 degrees of freedom and can realize 13 basic emotional expressions, and its recognition rate was evaluated with general visitors at an exhibition.
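
A minimal sketch of the FACS-style mapping described above: an expression is a set of Action Units, and each AU drives one or more artificial-muscle motors. The AU numbers follow standard FACS conventions, but the motor indices and intensities are invented for illustration (the actual head uses 30 motors).

```python
# Sketch of a FACS-style expression pipeline: expression -> Action Units
# -> motor commands.  Motor indices and intensities are assumptions.

AU_TO_MOTORS = {
    1:  [(0, 0.6), (1, 0.6)],    # AU 1, inner brow raiser -> two brow motors
    6:  [(10, 0.8)],             # AU 6, cheek raiser
    12: [(14, 0.9), (15, 0.9)],  # AU 12, lip corner puller (smile)
}

EXPRESSION_TO_AUS = {
    "happiness": [6, 12],
    "surprise":  [1],            # subset only; real surprise uses more AUs
}

def motor_commands(expression: str):
    """Combine the motor targets of every AU active in the expression."""
    commands = {}
    for au in EXPRESSION_TO_AUS[expression]:
        for motor, intensity in AU_TO_MOTORS[au]:
            commands[motor] = max(commands.get(motor, 0.0), intensity)
    return commands

if __name__ == "__main__":
    print(motor_commands("happiness"))   # e.g. {10: 0.8, 14: 0.9, 15: 0.9}
```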

A Face Robot Actuated With Artificial Muscle Based on Dielectric Elastomer

  • Kwak Jong Won;Chi Ho June;Jung Kwang Mok;Koo Ja Choon;Jeon Jae Wook;Lee Youngkwan;Nam Jae-do;Ryew Youngsun;Choi Hyouk Ryeol
    • Journal of Mechanical Science and Technology, Vol. 19 No. 2, pp.578-588, 2005
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between humans and machines. In this paper, we present a face robot actuated with artificial muscle based on a dielectric elastomer. By exploiting the properties of the dielectric elastomer, it is possible to actuate the covering skin and eyes, as well as to provide human-like expressivity, without employing complicated mechanisms. The robot is driven by seven actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw and neck modules, corresponding to movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. In the robot, each module communicates with the others via the CAN communication protocol, and according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.

Development of Face Robot Actuated by Artificial Muscle

  • Choi, H.R.;Kwak, J.W.;Chi, H.J.;Jung, K.M.;Hwang, S.H.
    • 제어로봇시스템학회 학술대회논문집 (ICCAS 2004), pp.1229-1234, 2004
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between humans and machines. In this paper, we present a face robot actuated with artificial muscle based on a dielectric elastomer. By exploiting the properties of polymers, it is possible to actuate the covering skin and provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven types of actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw and neck modules, corresponding to movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. Each module communicates with the others via the CAN communication protocol, and according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.


음악요법을 이용한 노인의 우울증 완화 로봇 'BOOGI'의 콘텐츠 개발 (Development of Content for the Robot that Relieves Depression in the Elderly Using Music Therapy)

  • 정유화;정성원
    • 한국콘텐츠학회논문지, Vol. 15 No. 2, pp.74-85, 2015
  • Based on the finding that the positive effects of percussion instruments can lead to increased self-esteem and reduced depression in elderly people suffering from depression, we combined music therapy with a robot and developed content for a percussion robot that lets the elderly play music themselves. As content for a music-therapy robot for relieving depression in the elderly, the interaction elements between the elderly and the robot were derived. Songs that can stimulate reminiscence were selected, and a scoring system for striking the percussion instrument along with each song was developed. In addition, among the interaction elements, robot facial expression elements that can serve as an emotional stimulus for the elderly were designed. The expressions are divided into neutral, happy, and very happy, and they change in real time according to the degree of participation in playing the percussion instrument, thereby encouraging active participation. If the elderly play along with the music produced by the robot and the robot interacts with them according to their degree of participation, we expect the robot's effectiveness to be maximized, not as a simple music game machine but as a robot that can relieve depression.
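
As a rough sketch of the two interaction elements described above, the code below scores each percussion hit by its timing error against the target beat and maps the resulting participation level to one of the three facial expressions (neutral, happy, very happy). The timing windows and thresholds are assumptions.

```python
# Sketch of a timing-based score per drum hit and a facial expression
# driven by the participation level.  Windows and thresholds are assumed.

def hit_score(hit_time: float, beat_time: float) -> int:
    """Score one percussion hit by how close it lands to the target beat."""
    error = abs(hit_time - beat_time)
    if error < 0.10:        # within 100 ms: full score
        return 2
    if error < 0.25:        # within 250 ms: partial score
        return 1
    return 0

def expression_for(participation: float) -> str:
    """Map the fraction of beats the user actually played to an expression."""
    if participation >= 0.7:
        return "very happy"
    if participation >= 0.3:
        return "happy"
    return "neutral"

if __name__ == "__main__":
    beats = [1.0, 2.0, 3.0, 4.0]
    hits = [1.05, 2.3, 3.9]   # on time, too late, slightly early; beat 3.0 missed
    scores = [max(hit_score(h, b) for b in beats) for h in hits]
    participation = sum(1 for s in scores if s > 0) / len(beats)
    print(scores, expression_for(participation))   # [2, 0, 1] happy
```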