• Title/Abstract/Keyword: HRI (Human Robot Interaction)

Search results: 77 items

아동과 홈 로봇의 심리적·교육적 상호작용 분석 (Analysis on Psychological and Educational Effects in Children and Home Robot Interaction)

  • 김병준;한정혜
    • 정보교육학회논문지 / Vol. 9, No. 3 / pp.501-510 / 2005
  • For a home robot to interact smoothly with humans, research on human-robot interaction (HRI) is urgently needed. In this study, we examined how interaction with the recently developed home robot 'iRobi' affected children's psychological perceptions and how effective learning with the home robot was. In terms of psychological perception, interaction with the home robot led children to perceive it as a friendly partner capable of interaction and was found to relieve children's anxiety. In terms of learning effects, learning with the home robot produced higher concentration, interest, and academic achievement than other learning media (books, WBI). The home robot therefore appears to have positive value as a tool for children's emotional and educational interaction.


로봇 환경에서 텐서 부공간 분석기법을 이용한 얼굴인식 (Face Recognition Using Tensor Subspace Analysis in Robot Environments)

  • 김승석;곽근창
    • 로봇학회논문지 / Vol. 3, No. 4 / pp.300-307 / 2008
  • This paper is concerned with face recognition for human-robot interaction (HRI) in robot environments. For this purpose, we use Tensor Subspace Analysis (TSA) to recognize the user's face through the robot camera while the robot performs various services in home environments; the spatial correlation between the pixels in an image can thus be naturally characterized by TSA. Here we utilize a face database collected in the u-robot test bed environment at ETRI. The presented method can be used as a core technique, in conjunction with HRI, for natural interaction between humans and robots in home robot applications. The experimental results on the face database revealed that the presented method performs well in comparison with well-known methods such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) under varying distances.

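The abstract above keeps face images as 2D arrays and projects them onto a low-dimensional tensor subspace instead of vectorizing them as PCA/LDA do. As a rough illustration of that idea only, the sketch below learns row and column projection matrices by alternating eigendecomposition; this is a simplified two-sided projection, not the full locality-preserving TSA of the paper, and the image sizes, subspace dimensions, and random stand-in data are assumptions.

```python
import numpy as np

def two_sided_projection(faces, r_rows=10, r_cols=10, n_iter=5):
    """Learn row/column projections U, V so that U.T @ X @ V is a small
    'core' feature matrix for each face image X (kept as a 2D array).
    Simplified alternating scheme; the neighbourhood-preserving weighting
    of full TSA is omitted."""
    X = faces - faces.mean(axis=0)            # center the image stack (n, h, w)
    n, h, w = X.shape
    U = np.eye(h)[:, :r_rows]                 # initial row projection
    V = np.eye(w)[:, :r_cols]                 # initial column projection
    for _ in range(n_iter):
        # Fix V, update U from the row-row covariance of the projected images
        Su = sum(x @ V @ V.T @ x.T for x in X)
        U = np.linalg.eigh(Su)[1][:, -r_rows:]
        # Fix U, update V from the column-column covariance
        Sv = sum(x.T @ U @ U.T @ x for x in X)
        V = np.linalg.eigh(Sv)[1][:, -r_cols:]
    return U, V

def features(faces, U, V):
    """Project each 2D face image into the learned subspace."""
    return np.stack([U.T @ x @ V for x in faces]).reshape(len(faces), -1)

# Toy usage with random data standing in for a (hypothetical) face set
train = np.random.rand(40, 32, 32)
U, V = two_sided_projection(train)
feats = features(train, U, V)                 # (40, 100) compact descriptors
```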

Safe and Reliable Intelligent Wheelchair Robot with Human Robot Interaction

  • Hyuk, Moon-In;Hyun, Joung-Sang;Kwang, Kum-Young
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2001년도 ICCAS / pp.120.1-120 / 2001
  • This paper proposes a prototype of a safe and reliable wheelchair robot with Human Robot Interaction (HRI). Since wheelchair users are usually disabled, the wheelchair robot must guarantee safe and reliable motion while considering the user's intention. A single color CCD camera is mounted to input the user's commands in the form of human-friendly gestures, and an ultrasonic sensor array is used for sensing the external environment. We use face and hand directional gestures as the user's commands. By combining the user's command with the sensed environment configuration, the planner of the wheelchair robot selects an optimal motion. We implemented a prototype wheelchair robot, MR. HURI (Mobile Robot with Human Robot Interaction) ...

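The prototype above chooses a wheelchair motion by combining the gesture-derived user command with the environment sensed by the ultrasonic array. The fragment below is a minimal rule-based sketch of that command/environment fusion; the command names, sensor directions, and the 0.5 m safety threshold are invented for illustration and are not the paper's planner.

```python
# Hypothetical fusion of a gesture command with ultrasonic range readings.
SAFE_DISTANCE_M = 0.5                              # assumed safety threshold

def select_motion(gesture_cmd: str, sonar_m: dict) -> str:
    """Pick a motion that honours the user's gesture unless it would
    drive the wheelchair toward a nearby obstacle."""
    blocked = {d for d, r in sonar_m.items() if r < SAFE_DISTANCE_M}
    if gesture_cmd == "forward" and "front" in blocked:
        # Prefer a free side over stopping outright, if one exists
        for alt in ("left", "right"):
            if alt not in blocked:
                return alt
        return "stop"
    if gesture_cmd in blocked:
        return "stop"
    return gesture_cmd

# Example: the user gestures "forward" but the front sonar reports 0.3 m
print(select_motion("forward", {"front": 0.3, "left": 1.2, "right": 0.4}))  # -> "left"
```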

강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식 (A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction)

  • 이래경;안수용;오세영
    • 제어로봇시스템학회논문지 / Vol. 18, No. 4 / pp.328-336 / 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust YCbCr skin color model with Haar-like-feature-based AdaBoost. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using the extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize the pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region, with an accurately extracted fingertip and its angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
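Two stages of the pipeline above, skin-color segmentation of hand candidates and CAMSHIFT tracking, can be sketched directly with OpenCV. The snippet below does only that: a YCrCb skin mask seeds the track window, then cv2.CamShift runs on a hue back-projection. The fingertip voting, Haar/AdaBoost detector, and CONDENSATION recognizer are not reproduced, and the skin thresholds and camera index are assumed values.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # default camera (assumption)
ok, frame = cap.read()                          # assumes a hand is visible here

# 1) Skin-colour candidates in YCrCb (threshold values are rough assumptions)
ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

# 2) Take the largest skin blob as the initial hand window
cnts, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
x, y, w, h = cv2.boundingRect(max(cnts, key=cv2.contourArea))
track_window = (x, y, w, h)

# 3) Hue histogram of that window, used for back-projection
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv[y:y+h, x:x+w]], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

# 4) CAMSHIFT tracking loop
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, criteria)
    cv2.polylines(frame, [np.int32(cv2.boxPoints(rot_rect))], True, (0, 255, 0), 2)
    cv2.imshow("hand", frame)
    if cv2.waitKey(30) & 0xFF == 27:            # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```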

사용자 적응 인터페이스를 사용한 이동로봇의 원격제어 (Remote Control of a Mobile Robot Using Human Adaptive Interface)

  • 황창순;이상룡;박근영;이춘영
    • 제어로봇시스템학회논문지 / Vol. 13, No. 8 / pp.777-782 / 2007
  • Human Robot Interaction (HRI) through a haptic interface plays an important role in controlling robot systems remotely. The augmented use of bio-signals in the haptic interface is an emerging research area. To consider the operator's state in HRI, we used bio-signals such as ECG and blood pressure in our proposed force reflection interface. The variation of the operator's state is checked through information processing of the bio-signals. The statistical standard deviation of the R-R intervals and the blood pressure were used to adaptively adjust the force reflection generated from the environmental condition. Our main idea is to change the pattern of force reflection according to the state of the human operator. A set of experiments shows promising results for our concept of a human adaptive interface.
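The interface above adjusts force reflection using the standard deviation of the operator's R-R intervals. A minimal sketch of such an adaptation rule is shown below; the baseline SDNN, gain bounds, and scaling law are assumptions made for illustration and do not reproduce the paper's exact mapping.

```python
import numpy as np

def rr_intervals(r_peak_times_s):
    """R-R intervals (s) from the timestamps of detected ECG R peaks."""
    return np.diff(np.asarray(r_peak_times_s))

def adaptive_force_gain(r_peak_times_s, baseline_sdnn_s=0.05,
                        k_min=0.3, k_max=1.0):
    """Scale the force-reflection gain down as the R-R standard deviation
    (SDNN) deviates from an assumed resting baseline. All constants are
    illustrative, not values from the paper."""
    sdnn = float(np.std(rr_intervals(r_peak_times_s)))
    deviation = abs(sdnn - baseline_sdnn_s) / baseline_sdnn_s
    return float(np.clip(k_max / (1.0 + deviation), k_min, k_max))

# Example: environment force scaled by the operator-dependent gain
r_peaks = [0.00, 0.82, 1.60, 2.47, 3.21, 4.15]        # seconds (made up)
f_env = 4.0                                           # N, from the remote side
print(adaptive_force_gain(r_peaks) * f_env)           # reflected force in N
```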

그림모델과 파티클필터를 이용한 인간 정면 상반신 포즈 인식 (Pictorial Model of Upper Body based Pose Recognition and Particle Filter Tracking)

  • 오치민;;김민욱;이칠우
    • 한국HCI학회:학술대회논문집 / 한국HCI학회 2009년도 학술대회 / pp.186-192 / 2009
  • This paper describes a vision-based method for recognizing frontal upper-body poses of a human. In HCI (Human Computer Interaction) and HRI (Human Robot Interaction), people commonly communicate with facial and hand gestures while facing forward, so we restrict recognition to the frontal upper body. Human pose recognition presents two main difficulties. First, because the human body is an articulated object with many joints, poses have a high degree of freedom and are hard to model. Second, matching the modeled information against images is difficult. To address these problems, we model the human body as a set of rectangular parts using the easily modeled Pictorial Model and build a database of key upper-body poses for recognition. Detailed poses not covered by the pose database are determined by updating the recognized key pose with the particle that has the highest posterior probability among the particles predicted by a particle filter from the recognized key-pose parameters. Through these two stages, recognizing key poses and then tracking detailed poses based on them, frontal upper-body poses can be recognized accurately.

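The refinement step above propagates particles around a recognized key pose and keeps the one with the highest posterior. The sketch below shows a generic particle filter of that form over a toy two-parameter pose; the likelihood function, noise level, and parameter layout are placeholders rather than the paper's image-matching score.

```python
import numpy as np

rng = np.random.default_rng(0)

def refine_pose(key_pose, likelihood, n_particles=200, sigma=0.05, n_steps=5):
    """Generic particle filter: spread particles around the recognised key
    pose, weight them with a likelihood function, resample, and return the
    highest-weight particle as the refined pose."""
    particles = key_pose + rng.normal(0.0, sigma, size=(n_particles, len(key_pose)))
    for _ in range(n_steps):
        weights = np.array([likelihood(p) for p in particles])
        weights /= weights.sum()
        idx = rng.choice(n_particles, size=n_particles, p=weights)   # resample
        particles = particles[idx] + rng.normal(0.0, sigma, size=particles.shape)
    weights = np.array([likelihood(p) for p in particles])
    return particles[np.argmax(weights)]

# Toy likelihood standing in for the image-matching score of a pose
true_pose = np.array([0.30, -0.10])
toy_likelihood = lambda p: np.exp(-20.0 * np.sum((p - true_pose) ** 2))

key_pose = np.array([0.20, 0.00])             # coarse pose from the DB lookup
print(refine_pose(key_pose, toy_likelihood))  # drifts toward [0.30, -0.10]
```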

객체 지향적 슬레이브 로봇들로 구성된 홈서비스 로봇 시스템의 구현 (Implementation of Home Service Robot System consisting of Object Oriented Slave Robots)

  • 고창건;고대건;권혜진;박정일;이석규
    • 대한전기학회:학술대회논문집 / 대한전기학회 2007년도 심포지엄 논문집 정보 및 제어부문 / pp.337-339 / 2007
  • This paper proposes a new paradigm for cooperation in a multi-robot system for home service. For localization of each robot, the master robot collects location information for every robot based on communication between RFID tags on the floor and an RFID reader attached to the bottom of each robot. The master robot communicates with the slave robots via wireless LAN to monitor their motion and issue commands based on the information received from the slaves. The operator may also send commands to the slave robots through the HRI (Human-Robot Interaction) screen on the master robot, which displays the information from the slaves. The cooperation of multiple robots is expected to enhance performance compared with a single robot.

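In the system above, floor RFID tags give each slave robot a coarse position that the master aggregates over the wireless LAN. The sketch below only illustrates the tag-to-coordinate lookup and the master's bookkeeping; the tag map, robot names, and command format are invented for the example.

```python
# Hypothetical tag map: RFID tag ID -> (x, y) floor position in metres
TAG_MAP = {
    "tag-0017": (0.0, 0.0),
    "tag-0018": (0.5, 0.0),
    "tag-0023": (0.5, 1.0),
}

class Master:
    """Master robot keeping the latest reported position of each slave."""
    def __init__(self):
        self.positions = {}

    def on_report(self, robot_id: str, tag_id: str) -> None:
        """Handle a 'tag seen' report received from a slave over the LAN."""
        if tag_id in TAG_MAP:
            self.positions[robot_id] = TAG_MAP[tag_id]

    def command(self, robot_id: str, goal: tuple) -> str:
        """Build a simple move command string for a slave robot."""
        x, y = self.positions.get(robot_id, (None, None))
        return f"{robot_id}: move from ({x}, {y}) to {goal}"

master = Master()
master.on_report("cleaner-1", "tag-0018")      # slave drove over a floor tag
print(master.command("cleaner-1", (2.0, 1.5)))
```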

An analysis of the component of Human-Robot Interaction for Intelligent room

  • Park, Jong-Chan;Kwon, Dong-Soo
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2005년도 ICCAS / pp.2143-2147 / 2005
  • Human-Robot Interaction (HRI) has recently become one of the most important issues in the field of robotics. Understanding and predicting the intentions of human users is a major difficulty for robotic programs. In this paper, we suggest an interaction method that allows a robot to carry out the human user's desires in an intelligent-room domain, even when the user does not give a specific command for the action. To achieve this, we constructed a full system architecture for the intelligent room in which the following components are present and sequentially interconnected: decision-making based on a Bayesian belief network, responding to human commands, and generating queries to remove ambiguities. The robot obtains all the necessary information by analyzing the user's condition and the environmental state of the room. This information is then used to evaluate the probabilities at the output nodes of the Bayesian belief network, which is composed of nodes covering several states and the causal relationships between them. Our study shows that the suggested system and proposed method improve a robot's ability to understand human commands, intuit human desires, and predict human intentions, resulting in a comfortable intelligent room for the human user.

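The room controller above scores candidate services with a Bayesian belief network over the user's condition and the room state. The hand-rolled two-parent network below, with made-up variables and probabilities, shows how such evidence updates the probability of a desired action by simple enumeration.

```python
from itertools import product

# Hypothetical CPTs: P(tired), P(dark), and P(wants_light_on | tired, dark)
P_TIRED = {True: 0.3, False: 0.7}
P_DARK = {True: 0.4, False: 0.6}
P_WANTS_LIGHT = {                      # keyed by (tired, dark)
    (True, True): 0.9, (True, False): 0.2,
    (False, True): 0.7, (False, False): 0.05,
}

def p_wants_light(evidence=None):
    """P(user wants the light on | evidence) by brute-force enumeration
    over the parent variables. 'evidence' may fix 'tired' and/or 'dark'."""
    evidence = evidence or {}
    num = den = 0.0
    for tired, dark in product([True, False], repeat=2):
        if "tired" in evidence and tired != evidence["tired"]:
            continue
        if "dark" in evidence and dark != evidence["dark"]:
            continue
        joint = P_TIRED[tired] * P_DARK[dark]
        num += joint * P_WANTS_LIGHT[(tired, dark)]
        den += joint
    return num / den

print(p_wants_light())                                # prior belief, no evidence
print(p_wants_light({"dark": True}))                  # room sensor says it is dark
print(p_wants_light({"dark": True, "tired": True}))   # plus the user looks tired
```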

진동감지를 이용한 사용자 걸음걸이 인식 (Estimating Human Walking Pace and Direction Using Vibration Signals)

  • 정은석;김대은
    • 제어로봇시스템학회논문지 / Vol. 20, No. 5 / pp.481-485 / 2014
  • In service robots, various human movements are analyzed using a variety of sensors. Vibration signals from the walking movements of a human provide useful information about the distance and the movement direction of the human. In this paper, we measure the intensity of vibrations and detect both the human walking pace and direction. In our experiments, vibration signals detected by microphone sensors provided good estimates of the distance and direction of human movement. This can be applied to HRI (Human-Robot Interaction) technology.
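The study above infers walking pace and direction from vibrations picked up by microphones. The snippet below sketches one plausible signal path on synthetic data: detect footstep impulses by thresholding the signal envelope, take the pace from inter-step intervals, and call the walker approaching or receding from the amplitude trend. The sampling rate, thresholds, and synthetic signal are assumptions.

```python
import numpy as np

FS = 1000                                   # assumed sampling rate, Hz

def footstep_analysis(signal, fs=FS, threshold=0.3, refractory_s=0.25):
    """Return (steps per second, 'approaching'/'receding') from a 1D
    vibration signal, using simple envelope peak picking."""
    env = np.abs(signal)
    steps, peaks = [], []
    i = 0
    while i < len(env):
        if env[i] > threshold:
            win = env[i:i + int(refractory_s * fs)]
            steps.append((i + int(np.argmax(win))) / fs)    # step time, s
            peaks.append(float(win.max()))                  # step amplitude
            i += int(refractory_s * fs)                     # skip refractory period
        else:
            i += 1
    pace = 1.0 / float(np.mean(np.diff(steps)))             # steps per second
    trend = "approaching" if peaks[-1] > peaks[0] else "receding"
    return pace, trend

# Synthetic test: 6 footsteps, 0.6 s apart, growing louder (walker approaches)
t = np.arange(0, 4, 1 / FS)
sig = np.zeros_like(t)
for k in range(6):
    onset = int((0.5 + 0.6 * k) * FS)
    sig[onset:onset + 30] = (0.4 + 0.1 * k) * np.exp(-np.arange(30) / 10.0)

print(footstep_analysis(sig))    # roughly (1.67 steps/s, 'approaching')
```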