• Title/Abstract/Keyword: robot face

Search results: 186

로봇 환경에서 텐서 부공간 분석기법을 이용한 얼굴인식 (Face Recognition Using Tensor Subspace Analysis in Robot Environments)

  • 김승석;곽근창
    • 로봇학회논문지 / Vol. 3, No. 4 / pp.300-307 / 2008
  • This paper is concerned with face recognition for human-robot interaction (HRI) in robot environments. For this purpose, we use Tensor Subspace Analysis (TSA) to recognize the user's face through the robot's camera while the robot performs various services in home environments. Because TSA treats an image as a second-order tensor, the spatial correlation between the pixels in an image is naturally preserved. We use a face database collected in the u-robot test-bed environment at ETRI. The presented method can serve as a core technique for HRI that enables natural interaction between humans and robots in home robot applications. Experimental results on the face database show that the presented method outperforms well-known methods such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) in distance-varying environments.

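As a rough illustration of the bilinear projection behind tensor subspace analysis, the sketch below fits row- and column-mode bases from a stack of 2D images and projects a new image, GLRAM-style. The data, dimensions, and fitting procedure are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def fit_tensor_subspace(images, d1, d2):
    """Fit row- and column-mode projection bases for a stack of 2D images
    (a GLRAM-style simplification of tensor subspace analysis)."""
    mean = images.mean(axis=0)
    centered = images - mean
    # Mode-wise covariance matrices accumulated over the training images.
    cov_rows = sum(x @ x.T for x in centered)
    cov_cols = sum(x.T @ x for x in centered)
    # eigh sorts eigenvalues ascending, so the last columns are the leading ones.
    U = np.linalg.eigh(cov_rows)[1][:, -d1:]
    V = np.linalg.eigh(cov_cols)[1][:, -d2:]
    return mean, U, V

def project(x, mean, U, V):
    # Bilinear projection Y = U^T (X - mean) V keeps the 2D pixel layout,
    # so spatial correlation between pixels survives the reduction.
    return U.T @ (x - mean) @ V

rng = np.random.default_rng(0)
faces = rng.normal(size=(20, 32, 32))  # 20 toy 32x32 "face" images
mean, U, V = fit_tensor_subspace(faces, d1=4, d2=4)
feature = project(faces[0], mean, U, V)
print(feature.shape)  # (4, 4)
```

In a recognizer, gallery images would be projected the same way and matched by nearest neighbor in the reduced feature space.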

얼굴로봇 Buddy의 기능 및 구동 메커니즘 (Functions and Driving Mechanisms for Face Robot Buddy)

  • 오경균;장명수;김승종;박신석
    • 로봇학회논문지 / Vol. 3, No. 4 / pp.270-277 / 2008
  • The development of a face robot basically targets natural human-robot interaction (HRI), especially emotional interaction, and so does the face robot introduced in this paper, named Buddy. Since Buddy was developed for a mobile service robot, it does not have a lifelike face such as a human's or an animal's, but a typically robot-like face with hard skin, which may be suitable for mass production. In addition, its structure and mechanism are kept simple and its production cost low. This paper introduces the mechanisms and functions of the mobile face robot Buddy, which can take on natural and precise facial expressions and make dynamic gestures, all driven by a single laptop PC. Buddy can also perform lip-sync, eye contact, and face tracking for lifelike interaction. By adopting a customized emotional reaction decision model, Buddy can form its own personality, emotion, and motives from various sensor inputs. Based on this model, Buddy can interact properly with users and perform real-time learning using personality factors. The interaction performance of Buddy is demonstrated by experiments and simulations.

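The abstract does not specify Buddy's emotional reaction decision model; the toy sketch below only illustrates the general idea of personality factors weighting sensor stimuli into an emotional state. The emotion names, the personality vector, and the update rule are all invented for illustration.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "surprise"]

class EmotionModel:
    """Toy emotional-reaction decision model (hypothetical; not the
    paper's actual model)."""
    def __init__(self, personality, decay=0.9):
        # Per-emotion gains: how strongly this robot reacts to each stimulus.
        self.personality = np.asarray(personality, dtype=float)
        self.state = np.zeros(len(EMOTIONS))
        self.decay = decay

    def react(self, stimulus):
        # Old emotions fade (decay); new stimuli are scaled by personality.
        self.state = self.decay * self.state + self.personality * np.asarray(stimulus)
        return EMOTIONS[int(np.argmax(self.state))]

robot = EmotionModel(personality=[1.2, 0.8, 0.5, 1.0])
print(robot.react([1.0, 0.0, 0.0, 0.3]))  # happiness
```

Different personality vectors make two robots with the same sensors react differently to the same input, which is the point of personality factors in such a model.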

Human Robot Interaction Using Face Direction Gestures

  • Kwon, Dong-Soo;Bang, Hyo-Choong
    • 제어로봇시스템학회:학술대회논문집 / ICCAS 2001 / pp.171.4-171 / 2001
  • This paper proposes a method of human-robot interaction (HRI) using face-directional gestures. A single CCD color camera captures the face region, and the robot recognizes the face-directional gesture from the positions of the facial features. A user can give commands such as stop, go, left turn, and right turn to the robot using these gestures. Since the robot also has ultrasonic sensors, it can detect obstacles and determine a safe direction at its current position. By combining the user's command with the sensed obstacle configuration, the robot selects a safe and efficient motion direction. Simulation results show that the robot with HRI navigates more reliably.

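One way to combine a face-gesture command with sensed obstacles, as described above, is to score candidate headings by closeness to the commanded direction while excluding headings near obstacles. The cost function and 15-degree discretization below are hypothetical; the paper does not state its exact fusion rule.

```python
import math

def select_direction(command_deg, obstacle_dirs_deg, safe_margin=30.0):
    """Pick the candidate heading closest to the user's face-gesture
    command that stays clear of sensed obstacle directions."""
    candidates = range(-180, 180, 15)

    def cost(heading):
        # Angular difference wrapped into [-180, 180).
        diff = lambda a, b: abs((a - b + 180) % 360 - 180)
        # Headings too close to an obstacle direction are forbidden.
        if any(diff(heading, o) < safe_margin for o in obstacle_dirs_deg):
            return math.inf
        return diff(heading, command_deg)

    return min(candidates, key=cost)

# User gestures "left" (90 deg) but the sonar senses an obstacle at 90 deg,
# so the robot deviates to the nearest safe heading.
print(select_direction(90.0, [90.0]))
```

With no obstacles the commanded heading wins outright; the margin trades obedience to the user against clearance.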

Speaker Detection and Recognition for a Welfare Robot

  • Sugisaka, Masanori;Fan, Xinjian
    • 제어로봇시스템학회:학술대회논문집 / ICCAS 2003 / pp.835-838 / 2003
  • Computer vision and natural-language dialogue play an important role in friendly human-machine interfaces for service robots. In this paper we describe an integrated face detection and face recognition system for a welfare robot, combined with the robot's speech interface. Our approach to face detection combines a neural network (NN) and a genetic algorithm (GA): the NN serves as a face filter while the GA searches the image efficiently. Once a face is detected, an embedded Hidden Markov Model (EHMM) determines its identity. A real-time system has been created by combining the face detection and recognition techniques. When triggered by the speaker's voice commands, it takes an image from the camera, finds the face in the image, and recognizes it. Experiments in an indoor environment with complex backgrounds showed that a recognition rate of more than 88% can be achieved.

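The NN-as-filter, GA-as-search division of labor can be sketched as a genetic search over candidate window positions, with a toy score function standing in for the neural-network face filter. Population size, genetic operators, and the score function are illustrative assumptions.

```python
import random

def ga_search(score, width, height, pop=30, gens=40, seed=1):
    """Genetic search over candidate window centers; score(x, y) stands in
    for the neural-network face filter applied at that position."""
    rnd = random.Random(seed)
    popn = [(rnd.randrange(width), rnd.randrange(height)) for _ in range(pop)]
    for _ in range(gens):
        # Keep the better half as elites (higher score = more face-like).
        popn.sort(key=lambda p: -score(*p))
        elite = popn[: pop // 2]
        children = []
        for _ in range(pop - len(elite)):
            # Crossover: average two elite parents; mutation: small jitter.
            (x1, y1), (x2, y2) = rnd.sample(elite, 2)
            x = min(max((x1 + x2) // 2 + rnd.randint(-3, 3), 0), width - 1)
            y = min(max((y1 + y2) // 2 + rnd.randint(-3, 3), 0), height - 1)
            children.append((x, y))
        popn = elite + children
    return max(popn, key=lambda p: score(*p))

# Toy "face filter": score peaks at the true face location (120, 80).
score = lambda x, y: -((x - 120) ** 2 + (y - 80) ** 2)
best = ga_search(score, 320, 240)
print(best)
```

The point of the GA is to call the (expensive) filter far fewer times than an exhaustive sliding-window scan would.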

얼굴 인증을 이용한 무인 접수 로봇 개발 (Unattended Reception Robot using Face Identification)

  • 박세현;류정탁;문병현;차경애
    • 한국산업정보학회논문지 / Vol. 19, No. 5 / pp.33-37 / 2014
  • With the growing use of personal information, reliable means of authentication are in demand. Face authentication, which uses the features of an individual's face, is widely adopted because facial feature points are easy to extract. In this paper, we implement a face-authentication robot for unattended reception. The robot authenticates individual users by means of face recognition. By applying the face authentication system to an unattended reception robot, we demonstrate its usefulness.

AdaBoost 알고리즘을 이용한 얼굴인식 및 선박용 감시로봇 개발 (Face Recognition using AdaBoost Algorithm and Development of Surveillance Robot for a Ship)

  • 고석조;박장식;장용석;최문호
    • 로봇학회논문지 / Vol. 3, No. 3 / pp.219-225 / 2008
  • This study developed a surveillance robot for a ship. The robot consists of ultrasonic sensors, an actuator, a lighting fixture, and a camera. The ultrasonic sensors are used to avoid collisions with obstacles in the environment. The actuator is a servo motor system, and the robot drives on four wheels. The lighting fixture guides the robot in dark environments. To transmit images, a pan-tilt camera is mounted on the upper part of the robot. The AdaBoost algorithm, trained with 15 features, is used for face recognition. Experiments were performed to evaluate the face recognition performance of the developed robot.

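A minimal sketch of the AdaBoost training loop the abstract refers to, using 1D threshold stumps instead of the paper's 15 image features: each round picks the weak classifier with the lowest weighted error, weights it by that error, and reweights the samples so misclassified ones matter more next round.

```python
import math

def adaboost(samples, labels, stumps, rounds):
    """Minimal AdaBoost over a fixed pool of weak classifiers (stumps),
    with labels in {-1, +1}."""
    n = len(samples)
    w = [1.0 / n] * n
    strong = []  # list of (alpha, stump)
    for _ in range(rounds):
        # Weighted error of each weak classifier under current weights.
        errs = [sum(wi for wi, x, y in zip(w, samples, labels) if h(x) != y)
                for h in stumps]
        e, h = min(zip(errs, stumps), key=lambda t: t[0])
        e = max(e, 1e-10)  # avoid log(0) for a perfect stump
        alpha = 0.5 * math.log((1 - e) / e)
        strong.append((alpha, h))
        # Upweight misclassified samples, then renormalize.
        w = [wi * math.exp(-alpha * y * h(x)) for wi, x, y in zip(w, samples, labels)]
        s = sum(w)
        w = [wi / s for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in strong) > 0 else -1

# Toy 1D data: positives above 0.5, threshold stumps as weak learners.
samples = [0.1, 0.3, 0.4, 0.6, 0.8, 0.9]
labels = [-1, -1, -1, 1, 1, 1]
stumps = [lambda x, t=t: 1 if x > t else -1 for t in (0.2, 0.5, 0.7)]
clf = adaboost(samples, labels, stumps, rounds=3)
print([clf(x) for x in samples])  # [-1, -1, -1, 1, 1, 1]
```

In face detection the stumps would be thresholds on image features (e.g. Haar-like responses) rather than raw scalars, but the boosting loop is the same.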

인공근육을 이용한 얼굴로봇 (A Face Robot Actuated with Artificial Muscle)

  • 곽종원;지호준;정광목;남재도;전재욱;최혁렬
    • 제어로봇시스템학회논문지 / Vol. 10, No. 11 / pp.991-999 / 2004
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between humans and machines. In this paper, we present a face robot actuated with artificial muscle based on a dielectric elastomer. By exploiting the properties of polymers, it is possible to actuate the covering skin and eyes and to provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven types of actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw, and neck modules, corresponding to the movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate the six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. Each module communicates with the others via the CAN communication protocol; for a desired emotional expression, the facial motion is generated by combining the motions of the actuator modules. A prototype of the robot has been developed, and several experiments have been conducted to validate its feasibility.
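The module-combination idea above, where a target expression is realized by combining per-module motions, can be sketched as a lookup that expands an expression into one command per actuator module. The module names follow the abstract; the expression-to-command values are invented for illustration.

```python
# Actuator modules named in the abstract; command values are hypothetical
# normalized activations, with 0.0 meaning the neutral position.
MODULES = ["eye", "eyebrow", "eyelid", "brow", "cheek", "jaw", "neck"]

EXPRESSIONS = {
    "surprise":  {"eyebrow": 1.0, "eyelid": 1.0, "jaw": 0.8},
    "happiness": {"cheek": 0.9, "jaw": 0.4, "eye": 0.2},
    "anger":     {"brow": 1.0, "eyebrow": -0.8, "jaw": 0.3},
}

def module_commands(expression):
    """Expand an expression into a full command vector, one value per
    actuator module; unused modules stay neutral."""
    target = EXPRESSIONS[expression]
    return {m: target.get(m, 0.0) for m in MODULES}

print(module_commands("surprise"))
```

In the real robot each command would be sent to its module over the CAN bus, and intermediate expressions could be produced by blending two such vectors.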

다양한 조명 환경에서의 실시간 사용자 검출을 위한 압축 영역에서의 색상 조절을 사용한 얼굴 검출 방법 (Face detection in compressed domain using color balancing for various illumination conditions)

  • 민현석;이영복;신호철;임을균;노용만
    • 한국HCI학회:학술대회논문집 / HCI Korea 2009 / pp.140-145 / 2009
  • In this paper, we propose a face detection method that operates in the compressed domain and is robust to changes in illumination. Existing image-based face detection methods have mostly operated in the pixel domain. However, pixel-domain processing is not suitable for robot environments with limited computing power and storage. Variation in illumination has also long been recognized as a problem that must be solved for stable face detection. To address these issues, this paper proposes a face detection method that adjusts color information in the compressed domain through illumination compensation and color-temperature conversion. With this color adjustment, the proposed method detects faces more robustly than existing methods under various illumination conditions.

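A simplified sketch of the compressed-domain idea: operate only on per-block DC chrominance values (available without fully decoding the image), shift the frame's average chrominance to a neutral reference to compensate for a colored illuminant, then apply a skin-tone test. The neutral reference and all thresholds below are illustrative assumptions, not the paper's values.

```python
def balance_and_detect_skin(dc_cb, dc_cr, ref_cb=128.0, ref_cr=128.0):
    """Color balancing on per-block DC chrominance (a stand-in for the
    paper's compressed-domain color-temperature conversion), followed by
    a simple skin-tone threshold per 8x8 block."""
    n = len(dc_cb)
    # Shift the frame's average chrominance to the neutral reference,
    # compensating for the illuminant's color cast.
    shift_cb = ref_cb - sum(dc_cb) / n
    shift_cr = ref_cr - sum(dc_cr) / n
    skin = []
    for cb, cr in zip(dc_cb, dc_cr):
        cb, cr = cb + shift_cb, cr + shift_cr
        # A commonly used YCbCr skin-tone box.
        skin.append(77 <= cb <= 127 and 133 <= cr <= 173)
    return skin

# Three 8x8 blocks under a colored illuminant; after balancing, the first
# two fall inside the skin-tone box.
print(balance_and_detect_skin([130, 125, 145], [150, 160, 120]))  # [True, True, False]
```

Working on DC coefficients means one value per 8x8 block instead of 64 pixels, which is what makes this attractive on a resource-limited robot platform.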

서비스 로봇을 위한 감성인터페이스 기술 (Emotional Interface Technologies for Service Robot)

  • 양현승;서용호;정일웅;한태우;노동현
    • 로봇학회논문지 / Vol. 1, No. 1 / pp.58-65 / 2006
  • An emotional interface is essential for a robot to provide proper service to the user. In this research, we developed emotional components for a service robot: a neural-network-based facial expression recognizer, emotion expression technologies based on 3D graphical facial expressions and joint movements that consider the user's reaction, and behavior selection technology for emotion expression. We used our humanoid robots AMI and AMIET as the test-beds of our emotional interface, and studied the emotional interaction between a service robot and a user by integrating the developed technologies. The emotional interface enhances the friendliness of interaction with the service robot and increases the diversity and added value of its services; it can thereby promote market growth and contribute to the popularization of robots.
