• Title/Abstract/Keyword: Human robot

Search results: 1,374

로봇의 신뢰회복 행동이 인간-로봇 상호작용에 미치는 영향 (The effect of trust repair behavior on human-robot interaction)

  • 맹호영;김환이;박재은;한소원
    • 인지과학 / Vol. 33, No. 4 / pp.205-228 / 2022
  • This study examined how the type of social, relational behavior a robot displays affects human perception in human-robot interaction. In the experiment, participants watched videos in which the robot NAO, while interacting with a person, made an error and then performed a trust repair behavior, after which they rated their trust in the robot. The trust repair behaviors comprised four conditions: an internal-attribution condition in which the robot acknowledged the error and apologized, an external-attribution condition in which it apologized for the error but attributed it to outside causes, a denial condition in which it denied the error itself, and a no-action condition in which it did nothing after the error. Human evaluations of the robot were then analyzed in three respects: first, trust based on the robot's competence and honesty; second, the perceived competence and honesty of the robot; and third, how severe the error underlying the trust violation was perceived to be. In all three respects, participants perceived the error as less severe and rated the robot's competence higher when the robot apologized than when it did not. These findings provide evidence that human attitudes toward a robot are sensitive to the robot's behavior type and the way it handles its errors, and suggest that human perception of a robot can change accordingly. In particular, the finding that acknowledging its own error and apologizing increases trust shows that a robot can promote positive human-robot interaction through human-like social, well-mannered behavior.

보행자의 영상정보를 이용한 인간추종 이동로봇의 위치 개선 (Position Improvement of a Human-Following Mobile Robot Using Image Information of Walking Human)

  • 진태석;이동희;이장명
    • 제어로봇시스템학회논문지 / Vol. 11, No. 5 / pp.398-405 / 2005
  • The intelligent robots that will be needed in the near future are human-friendly robots able to coexist with humans and support them effectively. To realize this, robots need to recognize their position and posture in both known and unknown environments, and their localization must occur naturally. Estimating the robot's position while resolving uncertainty is one of the most important problems in mobile robot navigation. In this paper, we describe a method for localizing a mobile robot using image information of a moving object. The method combines the position observed from dead-reckoning sensors with the position estimated from images captured by a fixed camera. Using the a priori known path of a moving object in world coordinates and a perspective camera model, we derive geometric constraint equations that relate the image-frame coordinates of the moving object to the estimated robot position. A control method is also proposed to estimate the relative position and direction between the walking human and the mobile robot, and a Kalman filter scheme is used for the estimation of the mobile robot's localization. Its performance is verified by computer simulation and experiment.
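As a rough illustration of the kind of sensor fusion this abstract describes, the sketch below runs one predict/update cycle of a linear Kalman filter that corrects a dead-reckoned 2-D position with a camera-derived position fix. The state layout, noise covariances, and measurement model are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal linear Kalman filter sketch: fuse dead-reckoned odometry with
# position fixes derived from a fixed camera. State x = [px, py] (robot
# position in world coordinates). All noise values are illustrative.

Q = np.diag([0.02, 0.02])   # process noise (odometry drift), assumed
R = np.diag([0.10, 0.10])   # measurement noise (camera-derived fix), assumed
H = np.eye(2)               # camera observes position directly in this sketch

def predict(x, P, odom_delta):
    """Propagate the state with the dead-reckoned displacement."""
    x_pred = x + odom_delta          # simple additive motion model
    P_pred = P + Q                   # uncertainty grows with each step
    return x_pred, P_pred

def update(x_pred, P_pred, z_cam):
    """Correct the prediction with a camera-derived position measurement."""
    y = z_cam - H @ x_pred                  # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Example: one predict/update cycle
x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, odom_delta=np.array([0.05, 0.00]))
x, P = update(x, P, z_cam=np.array([0.07, 0.01]))
print(x)
```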

Effects of LED on Emotion-Like Feedback of a Single-Eyed Spherical Robot

  • Onchi, Eiji;Cornet, Natanya;Lee, SeungHee
    • 감성과학 / Vol. 24, No. 3 / pp.115-124 / 2021
  • Non-verbal communication is important in human interaction. It provides a layer of information that complements the message being transmitted, and this type of information is not limited to human speakers. In human-robot communication, increasing the animacy of the robotic agent by using non-verbal cues can aid the expression of abstract concepts such as emotions. Considering the physical limitations of artificial agents, robots can use light and movement to express equivalent emotional feedback. This study analyzes the effects of the LED and motion animations of a spherical robot on the emotion expressed by the robot. A within-subjects experiment was conducted at the University of Tsukuba, where participants rated 28 video samples of a robot interacting with a person. The robot displayed different motions with and without light animations. The results indicated that adding LED animations changes the emotional impression of the robot along the valence, arousal, and dominance dimensions. Furthermore, people associated various situations with the robot's behavior. These stimuli can be used to modulate the intensity of the expressed emotion and enhance the interaction experience. This paper opens up the possibility of designing more affective robots in the future using simple feedback.

지능형 표정로봇, 휴머노이드 ICHR (Intelligent Countenance Robot, Humanoid ICHR)

  • 변상준
    • 대한전기학회 학술대회논문집 / 2006 Conference Proceedings, Junior College Education Committee / pp.175-180 / 2006
  • In this paper, we develop a humanoid robot that can express its emotions in response to human actions. To interact with humans, the developed robot has several abilities for expressing emotion: verbal communication with humans through voice/image recognition, motion tracking, and facial expression using fourteen servo motors. The proposed humanoid robot system consists of a control board designed with an AVR90S8535 to control the servo motors, a framework equipped with fourteen servo motors and two CCD cameras, and a personal computer to monitor its operation. The results of this research show that our intelligent emotional humanoid robot is intuitive and friendly, so that humans can interact with it very easily.
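The abstract gives no implementation detail beyond the controller board and the fourteen servo channels, so the sketch below is purely illustrative: it maps a named facial expression to a full set of servo angle commands and clamps them to a safe range. The expression presets, channel layout, and angle limits are all hypothetical, not taken from the paper.

```python
# Illustrative only: map a facial expression label to target angles for a
# bank of servo channels and clamp them to a mechanical range. The angle
# tables and channel layout are hypothetical, not from the paper.

NUM_SERVOS = 14
ANGLE_MIN, ANGLE_MAX = 0, 180  # degrees, assumed servo travel

# Hypothetical expression presets: channel index -> target angle
EXPRESSIONS = {
    "neutral":  {i: 90 for i in range(NUM_SERVOS)},
    "smile":    {0: 120, 1: 120, 2: 60, 3: 60},   # mouth-corner channels, assumed
    "surprise": {4: 140, 5: 140, 6: 30},          # brow/eyelid channels, assumed
}

def expression_to_angles(name: str) -> list[int]:
    """Return a full 14-channel angle command for the given expression."""
    angles = [90] * NUM_SERVOS                    # default to a neutral pose
    for channel, angle in EXPRESSIONS.get(name, {}).items():
        angles[channel] = max(ANGLE_MIN, min(ANGLE_MAX, angle))
    return angles

print(expression_to_angles("smile"))
```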


인간 팔의 형태학적·신경학적 분석 기법에 기반한 휴머노이드 로봇 팔 설계 (The Design of Humanoid Robot Arm based on the Morphological and Neurological Analysis of Human Arm)

  • 최형윤;배영철;문용선
    • 제어로봇시스템학회논문지 / Vol. 13, No. 6 / pp.555-559 / 2007
  • There are a few representative humanoid robots, including Honda's ASIMO from Japan and KAIST's HUBO. However, ASIMO and HUBO cannot yet be considered perfect humanoid robots. A basic principle in developing a humanoid robot is to make its arm move in a way similar to the human arm. In this paper, we propose a method of designing a humanoid robot arm based on a morphological and neurological analysis of the human arm so that the robot arm moves in a way similar to the human arm, and we also implemented an arm movement control system on a humanoid robot using SERCOS communication.

Kinect 센서를 이용한 효율적인 사람 추종 로봇의 예측 제어 (Predictive Control of an Efficient Human Following Robot Using Kinect Sensor)

  • 허신녕;이장명
    • 제어로봇시스템학회논문지 / Vol. 20, No. 9 / pp.957-963 / 2014
  • This paper proposes a predictive control scheme for an efficient human-following robot using a Kinect sensor. In particular, this research focuses on detecting the foot end-point and foot vector instead of the human body, which can easily be occluded by obstacles. Recognition of the foot end-point by the Kinect sensor is reliable since images of both feet can be utilized, which increases the probability of detecting the human motion. Depth-image features and a decision tree have been utilized to estimate the foot end-point precisely. A tracking point average algorithm is also adopted to estimate the location of the foot accurately. Using the continuous foot locations, the human motion trajectory is estimated to guide the mobile robot along a smooth path toward the human. It is verified through experiments that detecting the foot end-point is more reliable and efficient than detecting the human body. Finally, the tracking performance of the mobile robot is demonstrated with a human walking along an 'L'-shaped course.
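One plausible reading of the "tracking point average" step, sketched below under stated assumptions: smooth the noisy foot end-point detections with a sliding-window mean and reject implausible jumps before estimating the walking trajectory. The window length and jump threshold are assumed values, not the paper's.

```python
from collections import deque
import numpy as np

# Sliding-window average of recent foot end-point detections, with a simple
# outlier gate for mis-detections (e.g. brief occlusions). Parameters assumed.

class TrackingPointAverage:
    def __init__(self, window: int = 5, max_jump: float = 0.3):
        self.points = deque(maxlen=window)   # recent foot end-points (metres)
        self.max_jump = max_jump             # reject implausible jumps, assumed

    def update(self, point_xy):
        point = np.asarray(point_xy, dtype=float)
        if self.points and np.linalg.norm(point - self.points[-1]) > self.max_jump:
            # Likely a mis-detection; keep the previous smoothed estimate.
            return self.estimate()
        self.points.append(point)
        return self.estimate()

    def estimate(self):
        return np.mean(self.points, axis=0) if self.points else None

tracker = TrackingPointAverage()
for z in [(0.10, 1.00), (0.12, 0.98), (0.90, 0.10), (0.14, 0.96)]:
    print(tracker.update(z))   # the third sample is rejected as an outlier
```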

공간지능화에서 다중카메라를 이용한 이동로봇의 인간추적행위 (Human-Tracking Behavior of Mobile Robot Using Multi-Camera System in a Networked ISpace)

  • 진태석;하시모토 히데키
    • 로봇학회논문지 / Vol. 2, No. 4 / pp.310-316 / 2007
  • This paper proposes a human-following behavior for a mobile robot; an intelligent space (ISpace) is used to achieve this goal. An ISpace is a 3-D environment in which many sensors and intelligent devices are distributed. Mobile robots exist in this space as physical agents providing humans with services. A mobile robot is controlled to track a walking human using distributed intelligent sensors as stably and precisely as possible. The moving object is assumed to be a point object and is projected onto an image plane to form a geometric constraint equation that provides position data for the object based on the kinematics of the intelligent space. Uncertainties in the position estimation caused by the point-object assumption are compensated for using a Kalman filter. To generate the shortest-time trajectory for tracking the walking human, the human's linear and angular velocities are estimated and utilized. Computer simulation and experimental results of estimating and tracking the walking human with the mobile robot are presented.
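A minimal sketch of the point-object projection idea, assuming a calibrated overhead camera and an object constrained to the floor plane: a pixel is back-projected by intersecting its viewing ray with z = 0. The camera intrinsics, pose, and mounting height below are assumptions, not the paper's calibration or its exact constraint equations.

```python
import numpy as np

# Back-project a pixel of a fixed, calibrated ISpace camera onto the floor
# plane z = 0, under the point-object assumption. All parameters assumed.

K = np.array([[600.0,   0.0, 320.0],   # pinhole intrinsics, assumed
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.diag([1.0, -1.0, -1.0])         # optical axis pointing straight down, assumed
C = np.array([0.0, 0.0, 3.0])          # camera centre 3 m above the floor, assumed

def pixel_to_floor(u, v):
    """Intersect the viewing ray of pixel (u, v) with the floor plane z = 0."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera frame
    ray_world = R.T @ ray_cam                           # rotate into world frame
    s = -C[2] / ray_world[2]                            # scale so the ray hits z = 0
    return C + s * ray_world

print(pixel_to_floor(400, 300))   # approximate world (x, y, 0) of the tracked point
```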


인간손의 동작과 모양을 모방한 휴머노이드 로봇손 설계 (Design of a Humanoid Robot Hand by Mimicking Human Hand's Motion and Appearance)

  • 안상익;오용환;권상주
    • 제어로봇시스템학회논문지 / Vol. 14, No. 1 / pp.62-69 / 2008
  • A specialized anthropomorphic robot hand that can be attached to the biped humanoid robot MAHRU-R at KIST has been developed. This built-in hand consists of three fingers and a thumb with a total of four DOF (degrees of freedom), and its finger mechanism is designed to stably grasp the sphere- and cylinder-shaped objects typical of human daily activities. The restriction of possible motions and the limitation on graspable objects arising from the reduced DOF are overcome by reflecting a typical human finger's motion profile in the design procedure. As a result, the developed hand can imitate not only the human hand's shape but also its motion in a compact and efficient manner. This novel robot hand can also perform various human hand gestures naturally and grasp ordinary objects with both power and precision grasping capability.

실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구 (Vision-based Human-Robot Motion Transfer in Tangible Meeting Space)

  • 최유경;나성권;김수환;김창환;박성기
    • 로봇학회논문지 / Vol. 2, No. 2 / pp.143-151 / 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It focuses on a new method that makes a robot imitate human arm motions captured in a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. In the capturing part, we propose a modified metaball potential function for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method for resolving the structural differences between a human and a robot. Using our method, we implemented a tangible interface and evaluated its speed and accuracy.
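A minimal sketch of one way such a geometric scaling could work, under assumed link lengths: the captured wrist position is expressed relative to the shoulder and scaled by the ratio of robot arm length to human arm length before being re-anchored at the robot's shoulder. The lengths and frame choices are illustrative, not taken from the paper.

```python
import numpy as np

# Uniformly scale a captured shoulder-relative wrist position so the motion
# fits the robot's arm length. Link lengths below are assumed values.

HUMAN_ARM_LEN = 0.60   # shoulder-to-wrist length of the captured person (m), assumed
ROBOT_ARM_LEN = 0.45   # shoulder-to-wrist length of the robot arm (m), assumed

def retarget_wrist(human_shoulder, human_wrist, robot_shoulder):
    """Map a captured wrist position onto the robot's workspace."""
    offset = np.asarray(human_wrist) - np.asarray(human_shoulder)
    scaled = offset * (ROBOT_ARM_LEN / HUMAN_ARM_LEN)      # geometric scaling
    return np.asarray(robot_shoulder) + scaled

# Example: a wrist captured 0.5 m in front of and 0.2 m above the shoulder
print(retarget_wrist(human_shoulder=(0, 0, 1.4),
                     human_wrist=(0.5, 0.0, 1.6),
                     robot_shoulder=(0, 0, 1.0)))
```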
