• Title/Abstract/Keyword: Robot Eyes


로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석 (Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character)

  • 장세윤;조혜경
    • 로봇학회논문지 (The Journal of Korea Robotics Society) / Vol. 14, No. 1 / pp. 74-79 / 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module recording which body part of the robot character, such as the eyes, mouth, or arms, the user is looking at, regardless of whether the robot is stationary or moving. To verify that the results acquired in this virtual environment are aligned with those of physically existing robots, we performed robot-guided quiz sessions with 120 participants and compared the participants' gaze patterns with those in previous works. The results are as follows. First, when interacting with the robot character, the user's gaze pattern showed statistics similar to those of conversations between humans. Second, an animated mouth of the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective in the interaction with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in the users' gaze were observed, especially in the frequency of mutual gaze.
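
As a rough illustration of what such a gaze processing module does, the hedged Python sketch below assigns each gaze sample to a body-part region and accumulates dwell time per part; the region names and coordinates are hypothetical, and unlike the paper's module the regions here are static rather than following the animated character.

```python
# Hedged sketch: accumulate gaze dwell time per body-part region.
# Region names/coordinates are assumptions; a real module would update the
# regions every frame to track the moving character.
from dataclasses import dataclass

@dataclass
class Box:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

PARTS = {                       # normalized screen coordinates (assumed)
    "eyes":  Box(0.40, 0.20, 0.20, 0.08),
    "mouth": Box(0.42, 0.32, 0.16, 0.06),
    "arms":  Box(0.25, 0.45, 0.50, 0.20),
}

def gaze_distribution(samples, dt=1 / 60):
    """samples: iterable of (x, y) gaze points; dt: sampling interval in seconds."""
    dwell = {name: 0.0 for name in PARTS}
    dwell["other"] = 0.0
    for x, y in samples:
        hit = next((n for n, b in PARTS.items() if b.contains(x, y)), "other")
        dwell[hit] += dt
    return dwell

print(gaze_distribution([(0.45, 0.22), (0.50, 0.35), (0.10, 0.90)]))
```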

Development of a Biped Walking Robot Actuated by a Closed-Chain Mechanism

  • Choi, Hyeung-Sik;Oh, Jung-Min;Baek, Chang-Yul;Chung, Kyung-Sik
    • 제어로봇시스템학회 학술대회논문집 (ICROS Conference Proceedings), 2003 ICCAS / pp. 209-214 / 2003
  • We developed a new type of human-sized BWR (biped walking robot), named KUBIR1, which is driven by a closed-chain type of actuator. A new closed-chain actuator for the robot was developed, composed of a four-bar-link mechanism driven by a ball screw, which provides high strength and a high gear ratio. Each leg of the robot has six D.O.F. joints: three pitch joints and one roll joint at the ankle for forward walking, plus one yaw joint for direction change and another roll joint for balancing the body. The robot also has two D.O.F. joints in each hand and three D.O.F. for eye motion, with three actuating motors for the stereo-camera eyes. In all, an 18 degree-of-freedom robot was developed. KUBIR1 was designed to walk autonomously by adopting small 90 W DC motors as the robot actuators, with batteries and controllers on board. The total weight of KUBIR1 is over 90 kg, and its height is 167 cm. In this paper, the performance tests of KUBIR1 are presented.
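
The closed-chain actuation can be pictured as a ball screw changing the length of one side of a linkage triangle around a joint. The sketch below is an assumption for illustration only, not KUBIR1's actual geometry or link lengths: it maps actuator length to joint angle with the law of cosines.

```python
# Hedged sketch: joint angle of a closed-chain joint whose ball-screw actuator
# of length L spans two fixed links a and b (meters). Geometry is assumed.
import math

def joint_angle(L, a=0.08, b=0.12):
    """Joint angle (rad) for actuator length L closing the a-b triangle."""
    cos_t = (a * a + b * b - L * L) / (2 * a * b)
    return math.acos(max(-1.0, min(1.0, cos_t)))

def actuator_length(theta, a=0.08, b=0.12):
    """Inverse map: required ball-screw length for a desired joint angle."""
    return math.sqrt(a * a + b * b - 2 * a * b * math.cos(theta))

print(math.degrees(joint_angle(actuator_length(math.radians(60)))))  # ≈ 60
```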


이동로봇을 이용한 원전 내부 감시점검에 관한 연구 (A Study of Nuclear Power Plant Inspection Tasks Using A Mobile Robot)

  • 김창회;서용칠;조재완;최영수;김승호
    • 대한전자공학회 2002년도 하계종합학술대회 논문집(5) (IEEK 2002 Summer Conference Proceedings, Vol. 5) / pp. 193-196 / 2002
  • In this paper, we present remote inspection activities with a mobile robot at the calandria face areas of PHWR (pressurized heavy water reactor) nuclear power plants during full-power plant operation. A tele-operated mobile robot has been developed for this task. A four-wheeled mechanism with a dual reconfigurable crawler arm was adopted for easy access to the high-radiation area of the calandria face. A specially designed extendable long-reach mast attached to the mobile platform and a thermal image monitoring system enable human eyes to look into the calandria face. Application of the robot will keep human workers from high radiation exposure and enhance the reliability of nuclear power plants.


A Study on Real Time Control of Moving Stuff Action Through Iterative Learning for Mobile-Manipulator System

  • Kim, Sang-Hyun;Kim, Du-Beum;Kim, Hui-Jin;Im, O-Duck;Han, Sung-Hyun
    • 한국산업융합학회 논문집 (Journal of the Korean Society of Industry Convergence) / Vol. 22, No. 4 / pp. 415-425 / 2019
  • This study proposes a new approach to controlling object-moving actions through iterative learning with a dual-arm robot for the smart factory. When a robot moves an object with dual arms, not only the position of each hand but also the contact force at the surface of the object must be considered. However, it is not easy to determine every parameter for planning the object's trajectory and grasping the object across a variety of compliant environments. On the other hand, humans know how to move an object gracefully by using their eyes and the feel of their hands, which means that a robot could learn position and force from human demonstration and reuse the learned task in a variety of cases. This paper suggests a way to learn a dynamic equation that accounts for both position and path.
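
A minimal sketch of the iterative-learning idea, under assumed learning gains and a toy plant rather than the paper's formulation: after each trial, the commanded trajectory is corrected by the previous trial's position and force tracking errors.

```python
# Hedged sketch: trial-by-trial iterative learning update for a position/force
# trajectory. Gains l_p, l_f and the plant model are assumptions.
import numpy as np

def ilc_update(u, pos_err, force_err, l_p=0.5, l_f=0.05):
    """u, pos_err, force_err: arrays of shape (T,) over one trial."""
    return u + l_p * pos_err + l_f * force_err

# Toy demonstration target: the command converges toward tracking it.
target = np.sin(np.linspace(0, np.pi, 50))
u = np.zeros(50)
for _ in range(20):
    pos = 0.8 * u                                   # crude plant model (assumption)
    u = ilc_update(u, target - pos, np.zeros(50))   # force error zero in this toy run
print(float(np.abs(target - 0.8 * u).max()))        # residual tracking error
```

With this toy plant the error contracts by a constant factor each trial, which is the behavior the update rule is meant to illustrate.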

Development of Pose-Invariant Face Recognition System for Mobile Robot Applications

  • Lee, Tai-Gun;Park, Sung-Kee;Kim, Mun-Sang;Park, Mig-Non
    • 제어로봇시스템학회 학술대회논문집 (ICROS Conference Proceedings), 2003 ICCAS / pp. 783-788 / 2003
  • In this paper, we present a new approach to detecting and recognizing human faces in images from a vision camera mounted on a mobile robot platform. Due to the mobility of the camera platform, the obtained facial images are small and vary in pose. Under these conditions, a new algorithm must cope with these constraints and detect and recognize faces in nearly real time. In the detection step, a ‘coarse-to-fine’ strategy is used. First, the region boundary containing the face is roughly located by dual ellipse templates of facial color, and within this region the locations of the three main facial features, the two eyes and the mouth, are estimated. For this, simplified facial feature maps based on characteristic chrominance are constructed, and candidate pixels are segmented into eye or mouth pixel groups. These candidate facial features are then verified by checking whether the length and orientation of feature pairs are consistent with face geometry. In the recognition step, a pseudo-convex-hull area of the gray face image is defined that includes the feature triangle connecting the two eyes and the mouth. A random lattice line set is composed and laid on this convex hull area, and the 2D appearance of the area is thereby represented. From these procedures, facial information of the detected face is obtained, and face DB images are processed in the same way for each person class. Based on the facial information of these areas, a distance measure of the lattice-line match is calculated, and the face image is recognized using this measure as a classifier. The proposed detection and recognition algorithms overcome the constraints of a previous approach [15], make real-time face detection and recognition possible, and guarantee correct recognition regardless of some pose variation of the face. Their usefulness in mobile robot applications is demonstrated.
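
The geometric verification of candidate features can be sketched as below; the tilt and distance-ratio thresholds are hypothetical rather than the paper's values, and the code only illustrates checking that two eye candidates and a mouth candidate form a plausible face triangle.

```python
# Hedged sketch: verify a candidate eye-eye-mouth triangle against simple
# face-geometry constraints. Thresholds are assumptions for illustration.
import math

def plausible_face(left_eye, right_eye, mouth,
                   max_tilt_deg=30.0, ratio_range=(0.6, 1.6)):
    ex = right_eye[0] - left_eye[0]
    ey = right_eye[1] - left_eye[1]
    eye_dist = math.hypot(ex, ey)
    if eye_dist == 0:
        return False
    tilt = abs(math.degrees(math.atan2(ey, ex)))          # eye-line orientation
    mid = ((left_eye[0] + right_eye[0]) / 2, (left_eye[1] + right_eye[1]) / 2)
    mouth_dist = math.hypot(mouth[0] - mid[0], mouth[1] - mid[1])
    ratio = mouth_dist / eye_dist                          # mouth drop vs. eye spacing
    return tilt <= max_tilt_deg and ratio_range[0] <= ratio <= ratio_range[1]

print(plausible_face((40, 50), (80, 52), (60, 95)))        # True for a typical layout
```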


A Face Robot Actuated With Artificial Muscle Based on Dielectric Elastomer

  • Kwak Jong Won;Chi Ho June;Jung Kwang Mok;Koo Ja Choon;Jeon Jae Wook;Lee Youngkwan;Nam Jae-do;Ryew Youngsun;Choi Hyouk Ryeol
    • Journal of Mechanical Science and Technology / Vol. 19, No. 2 / pp. 578-588 / 2005
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between the human and the machine. In this paper, we present a face robot actuated with artificial muscle based on dielectric elastomer. By exploiting the properties of dielectric elastomer, it is possible to actuate the covering skin and eyes and to provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven actuator modules, namely the eye, eyebrow, eyelid, brow, cheek, jaw, and neck modules, corresponding to movements of the facial muscles. Although these are only part of the whole set of facial motions, our approach is sufficient to generate the six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. In the robot, each module communicates with the others via the CAN communication protocol, and according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed, and several experiments have been conducted to validate its feasibility.
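
A hedged sketch of how generating an expression by combining module motions might look; the module displacement values are invented for illustration, and the CAN transmission is only stubbed out rather than using the robot's real message format.

```python
# Hedged sketch: map a desired expression to target commands for the
# actuator modules. Values and module names are assumptions.
EXPRESSIONS = {
    "surprise":  {"eyebrow": 1.0, "eyelid": 1.0, "jaw": 0.8},
    "happiness": {"cheek": 0.9, "jaw": 0.3, "eyelid": 0.2},
    "anger":     {"brow": 1.0, "eyebrow": -0.6, "jaw": 0.1},
}

def send_can(module: str, value: float) -> None:
    # Placeholder for sending one CAN message to an actuator module.
    print(f"CAN -> {module}: {value:+.2f}")

def express(emotion: str) -> None:
    for module, value in EXPRESSIONS.get(emotion, {}).items():
        send_can(module, value)

express("surprise")
```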

지네를 모방한 수직 장애물 극복방법 (The Method of Vertical Obstacle Negotiation Inspired from a Centipede)

  • 윤병호;정태일;고두열;김수현
    • 제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems) / Vol. 18, No. 3 / pp. 193-200 / 2012
  • Mobility is one of the most important issues for search and rescue robots. To increase the mobility of a small robot, we have focused on a mechanism and algorithm inspired by the centipede. In spite of its small size, a centipede can overcome high obstacles and move stably over rough terrain using its many legs and long, flexible body. This research focuses on those points and imitates the legs and body that are well suited to obstacle negotiation. Based on the similarity between a centipede's legs and tracks, serially connected tracks are used for climbing obstacles higher than the robot itself. A centipede also perceives its environment using the antennae on its head instead of eyes; inspired by this, three IR sensors are attached to the front, top, and bottom of the first module to imitate the antennae. Using the information obtained from the sensors, the robot decides its next behavior automatically. In experiments, the robot climbed a vertical wall up to 45 cm high, which is 600% of the robot's height and 58% of its length.
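
A minimal sketch of the antenna-like decision logic, assuming hypothetical distance thresholds and behavior names (the paper does not give these): the three IR readings on the first module select the next climbing behavior.

```python
# Hedged sketch: choose the next behavior from the three IR sensors on the
# first module (front, top, bottom). Thresholds/behaviors are assumptions.
def next_behavior(front_cm: float, top_cm: float, bottom_cm: float) -> str:
    WALL_NEAR, FLOOR_LOST = 10.0, 15.0
    if front_cm < WALL_NEAR and top_cm < WALL_NEAR:
        return "raise_front_module"      # wall ahead and still blocked above
    if front_cm < WALL_NEAR:
        return "climb_forward"           # wall ahead but clearance above
    if bottom_cm > FLOOR_LOST:
        return "lower_front_module"      # front module has passed the edge
    return "drive_forward"

print(next_behavior(front_cm=6.0, top_cm=8.0, bottom_cm=4.0))  # raise_front_module
```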

모션센서를 이용한 로봇의 디지털 영상 보정 기술 (Digital Image Stabilization Technique of Robot using Motion Sensor)

  • 오정석;심귀보
    • 한국지능시스템학회논문지 (Journal of Korean Institute of Intelligent Systems) / Vol. 19, No. 3 / pp. 317-322 / 2009
  • Since a robot inevitably vibrates when it moves above a certain speed, the image from an onboard camera shakes and can no longer serve as the robot's eyes. A method for stabilizing the shaky video is therefore needed. One existing approach stabilizes the video by computing global motion vectors from the images, but this increases the amount of data the processor must handle, and the limited specifications of an embedded robot make it very difficult to transmit the video in real time. To address this, we propose image stabilization based on a motion sensor instead of motion vectors: the motion sensor is used to extract movements unrelated to the robot's intended motion, and the extracted movements are removed from the image.
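
A minimal sketch of sensor-based stabilization under an assumed pinhole camera model and gyro interface (not the paper's implementation): the measured rotation rate is converted into a pixel offset and the frame is cropped to cancel it.

```python
# Hedged sketch: cancel camera shake using motion-sensor rates instead of
# image motion vectors. Focal length, margins, and interface are assumptions.
import numpy as np

def stabilize(frame: np.ndarray, roll_rate: float, pitch_rate: float,
              dt: float, focal_px: float = 700.0, margin: int = 40) -> np.ndarray:
    """frame: HxW(xC) image; roll/pitch rate in rad/s from the motion sensor."""
    dx = int(round(focal_px * roll_rate * dt))    # horizontal shake in pixels
    dy = int(round(focal_px * pitch_rate * dt))   # vertical shake in pixels
    dx = max(-margin, min(margin, dx))
    dy = max(-margin, min(margin, dy))
    h, w = frame.shape[:2]
    # Crop a window shifted opposite to the measured motion.
    return frame[margin - dy:h - margin - dy, margin - dx:w - margin - dx]

out = stabilize(np.zeros((480, 640, 3), dtype=np.uint8), 0.05, -0.02, 1 / 30)
print(out.shape)  # (400, 560, 3)
```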

각막 압평을 이용한 로봇 바늘 삽입법: 심부표층각막이식수술에의 적용 (Robotic Needle Insertion Using Corneal Applanation for Deep Anterior Lamellar Keratoplasty)

  • 박익종;신형곤;김기훈;김홍균;정완균
    • 로봇학회논문지 (The Journal of Korea Robotics Society) / Vol. 16, No. 1 / pp. 64-71 / 2021
  • This paper describes a robotic teleoperation system for performing accurate needle insertion into the cornea to separate the stromal layer from Descemet's membrane during deep anterior lamellar keratoplasty (DALK). The system reduces the hand tremor of the surgeon by scaling the input motion, which serves as the control input of the slave robot. Moreover, we utilize corneal applanation to estimate the insertion depth. The proposed system was validated by performing the layer separation on 25 porcine eyes. The average needle insertion depth was 742 ± 39.8 ㎛, while the target insertion depth was 750 ㎛. Tremor error was reduced from 402 ± 248 ㎛ at the master device to 28.5 ± 21.0 ㎛ at the slave robot. The rates of complete success, partial success, and failure were 60%, 28%, and 12%, respectively. The experimental results showed that the proposed system can reduce the hand tremor of surgeons and perform precise needle insertion during DALK.
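
A hedged sketch of the motion-scaling idea, with an invented scale factor and filter constant rather than the authors' controller: each master-hand increment is scaled down and low-pass filtered before it moves the slave, which attenuates tremor.

```python
# Hedged sketch: scaled, filtered master-to-slave teleoperation.
# Scale factor and filter coefficient are assumptions for illustration.
class ScaledTeleop:
    def __init__(self, scale=0.1, alpha=0.2):
        self.scale = scale      # motion-scaling factor, master -> slave
        self.alpha = alpha      # first-order low-pass filter coefficient
        self.filtered = 0.0
        self.slave_pos = 0.0

    def step(self, master_delta_um: float) -> float:
        """Feed one master-hand increment (micrometers); return slave position."""
        cmd = self.scale * master_delta_um
        self.filtered += self.alpha * (cmd - self.filtered)
        self.slave_pos += self.filtered
        return self.slave_pos

teleop = ScaledTeleop()
for d in [50, -30, 60, -40, 55]:        # noisy hand motion (hypothetical)
    print(round(teleop.step(d), 2))
```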

Hyundai 8608 Robot 제어기 파라미터 튜닝 방안 연구 (A Study on the Control Parameter Tuning Method of the Hyundai 8608 Robot)

  • 김미경;윤천석;강희준;서영수;노영식;손홍래
    • 한국정밀공학회 2005년도 춘계학술대회 논문집 (KSPE 2005 Spring Conference Proceedings) / pp. 1836-1840 / 2005
  • This work proposes a controller tuning method for a Hyundai 8608 robot in order to improve its performance. For this, we analyzed the control structure of the robot and the functions of all the adjustable parameters in the robot controller with reference to the 'NACHI Technical Report'. Through the analysis, we found that three important parameters (VRRL, VRF, VRGIN) act like conventional PID gains and that the other parameters are closely related to these three. Consequently, tuning these three parameters is sufficient in most applications, with the other parameters fixed. Conventional PID tuning is performed for each joint of the test robot with the Robot Performance Evaluation System (described in our companion paper), so that the acceptable gain ranges for each joint are determined; the robot performance tests are then repeated with combinations of the acceptable gains. Finally, the best combination is selected for the best performance. To show the effectiveness of the proposed method, it was implemented on a Hyundai 8608 robot, and its results are compared with the results of NACHI's Semi-Auto Tuning Method and with the results obtained by a tuning expert tuning by eye.
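
A minimal sketch of the tuning procedure's spirit, with a toy joint model and invented gain ranges (VRRL, VRF, and VRGIN are treated only abstractly as P/I/D-like gains): every combination from the acceptable ranges is simulated and the best-performing one is kept.

```python
# Hedged sketch: grid search over assumed PID-like gain ranges on a toy joint
# model; neither the model nor the ranges come from the paper or the controller.
import itertools

def step_response_error(kp, ki, kd, steps=300, dt=0.01):
    pos = vel = integ = 0.0
    prev_err = 1.0                            # error at t = 0 for a unit step
    iae = 0.0
    for _ in range(steps):
        err = 1.0 - pos
        integ += err * dt
        torque = kp * err + ki * integ + kd * (err - prev_err) / dt
        prev_err = err
        vel += (torque - 2.0 * vel) * dt      # toy joint: unit inertia, damping 2
        pos += vel * dt
        iae += abs(err) * dt
    return iae                                # integral of absolute error

RANGES = {"kp": [20, 40, 80], "ki": [0, 5, 10], "kd": [1, 2, 4]}
best = min(itertools.product(*RANGES.values()),
           key=lambda gains: step_response_error(*gains))
print("best (kp, ki, kd):", best)
```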
