• Title/Summary/Keyword: Human Robot Interaction system

Deep Level Situation Understanding for Casual Communication in Humans-Robots Interaction

  • Tang, Yongkang;Dong, Fangyan;Yamazaki, Yoichi;Shibata, Takanori;Hirota, Kaoru
    • International Journal of Fuzzy Logic and Intelligent Systems / v.15 no.1 / pp.1-11 / 2015
  • A concept of Deep Level Situation Understanding is proposed to realize human-like natural communication (called casual communication) among multiple agents (e.g., humans and robots/machines). Deep level situation understanding consists of surface level understanding (such as gesture/posture understanding, facial expression understanding, and speech/voice understanding), emotion understanding, intention understanding, and atmosphere understanding, achieved by applying customized knowledge of each agent and by taking thoughtfulness into consideration. The proposal aims to reduce the burden on humans in human-robot interaction and to realize harmonious communication by excluding unnecessary trouble and misunderstanding among agents, ultimately helping to create a peaceful, happy, and prosperous human-robot society. A simulated experiment validates the deep level situation understanding system on a scenario in which a meeting-room reservation is made between a human employee and a secretary robot. The proposed system is intended for service robot systems, to smooth communication and avoid misunderstanding among agents.
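
The abstract describes an architecture rather than a concrete algorithm. As a purely illustrative sketch, the Python below shows one way such a layered pipeline (surface cues → emotion → intention → atmosphere) could be organized; all class names, labels, and rules are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a layered "deep level situation understanding" pipeline:
# surface cues -> emotion -> intention -> atmosphere. Not the paper's implementation.
from dataclasses import dataclass

@dataclass
class SurfaceCues:
    gesture: str      # e.g. "nod", "wave"
    expression: str   # e.g. "smile", "frown"
    utterance: str    # recognized speech text

def estimate_emotion(cues: SurfaceCues) -> str:
    # Surface-level cues are mapped to a coarse emotion label.
    if cues.expression == "smile":
        return "positive"
    if cues.expression == "frown":
        return "negative"
    return "neutral"

def estimate_intention(cues: SurfaceCues, emotion: str) -> str:
    # Intention is inferred from the utterance, moderated by emotion.
    if "reserve" in cues.utterance.lower():
        return "request_reservation"
    return "small_talk" if emotion == "positive" else "unknown"

def estimate_atmosphere(emotions: list[str]) -> str:
    # Atmosphere aggregates the emotions of all agents in the conversation.
    positives = emotions.count("positive")
    return "relaxed" if positives > len(emotions) / 2 else "tense"

if __name__ == "__main__":
    cues = SurfaceCues(gesture="nod", expression="smile",
                       utterance="Could you reserve the meeting room?")
    emotion = estimate_emotion(cues)
    intention = estimate_intention(cues, emotion)
    atmosphere = estimate_atmosphere([emotion, "neutral"])
    print(emotion, intention, atmosphere)
```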

A Human-Robot Interaction Entertainment Pet Robot (HRI 엔터테인먼트 애완 로봇)

  • Lee, Heejin
    • Journal of the Korean Institute of Intelligent Systems / v.24 no.2 / pp.179-185 / 2014
  • In this paper, a quadruped walking pet robot for human-robot interaction, a robot controller implemented as a smartphone application, and a home smart control system using sensor information provided by the robot are described. The robot has 20 degrees of freedom and carries various sensors, including a Kinect sensor, an infrared sensor, a 3-axis motion sensor, a temperature/humidity sensor, a gas sensor, and a graphic LCD module. We propose algorithms for robot entertainment: a walking algorithm, motion and voice recognition algorithms using the Kinect sensor, an emotional expression algorithm, a smartphone application algorithm for remote control of the robot, and a home smart control algorithm for controlling home appliances. The experiments in this paper show that the proposed algorithms, applied to the pet robot, smartphone, and computer, operate as intended.
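
As a hedged illustration of the home smart control idea mentioned above, the sketch below maps robot sensor readings to appliance commands with simple threshold rules; the thresholds, field names, and commands are hypothetical, not from the paper.

```python
# Illustrative rule-based home smart control loop driven by the pet robot's
# sensor readings. Thresholds and device names are hypothetical.
def home_control(readings: dict) -> list:
    """Map sensor readings reported by the robot to appliance commands."""
    commands = []
    if readings.get("gas_ppm", 0) > 200:          # gas leak -> ventilate and alert
        commands += ["open_ventilator", "send_alert_to_phone"]
    if readings.get("temperature_c", 20) > 28:    # too warm -> air conditioner on
        commands.append("aircon_on")
    if readings.get("humidity_pct", 50) < 30:     # too dry -> humidifier on
        commands.append("humidifier_on")
    return commands

if __name__ == "__main__":
    print(home_control({"gas_ppm": 350, "temperature_c": 30, "humidity_pct": 25}))
```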

Human Robot Interaction via Evolutionary Network Intelligence

  • Yamaguchi, Toru
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2002.10a / pp.49.2-49 / 2002
  • This paper describes the configuration of a multi-agent system that can recognize human intentions. The system constructs ontologies of human intentions and enables knowledge acquisition and sharing between intelligent agents operating in different environments, which is achieved using a bi-directional associative memory network. Intention recognition is based on fuzzy association inference. The paper also shows the process of information sharing using ontologies. The purpose of this research is to create human-centered systems that provide a natural interface in their interaction with people.
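
The abstract names a bi-directional associative memory (BAM) as the mechanism for associating and sharing patterns between agents. The following minimal BAM sketch (Hebbian storage of bipolar pattern pairs with iterative recall) illustrates that general technique; the toy patterns are invented for the example and are not the paper's data.

```python
# Minimal bidirectional associative memory (BAM) sketch: Hebbian storage of
# bipolar pattern pairs and iterative recall. Toy data only.
import numpy as np

def bipolar(v):
    """Threshold to {-1, +1}; ties (zeros) resolve to +1."""
    return np.where(v >= 0, 1.0, -1.0)

def train_bam(pairs):
    """Hebbian weight matrix W = sum_i x_i y_i^T over bipolar pattern pairs."""
    W = np.zeros((len(pairs[0][0]), len(pairs[0][1])))
    for x, y in pairs:
        W += np.outer(x, y)
    return W

def recall(W, x, steps=5):
    """Bounce the pattern back and forth (x -> y -> x) until it settles."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        y = bipolar(W.T @ x)
        x = bipolar(W @ y)
    return x, y

if __name__ == "__main__":
    # Toy association: a gesture-cue pattern paired with an intention pattern.
    pairs = [
        (np.array([1, -1, 1, -1]), np.array([1, 1, -1])),
        (np.array([-1, 1, -1, 1]), np.array([-1, 1, 1])),
    ]
    W = train_bam(pairs)
    x_rec, y_rec = recall(W, [1, -1, 1, 1])   # noisy version of the first cue
    print(x_rec, y_rec)
```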

Virtual Space Calibration for Laser Vision Sensor Using Circular Jig (원형 지그를 이용한 레이저-비젼 센서의 가상 공간 교정에 관한 연구)

  • 김진대;조영식;이재원
    • Journal of the Korean Society for Precision Engineering / v.20 no.12 / pp.73-79 / 2003
  • Recently, tele-robot operation in unstructured environments has been widely researched. Human interaction with the tele-robot system can be used to improve robot operation and performance in an unknown environment. Exact modeling based on the real environment is a fundamental and important step for this interaction. In this paper, we propose an extrinsic parameter calibration and data augmentation method that uses only a circular jig in the hand-eye laser virtual environment. Compared to other methods, this algorithm allows easier estimation and overlay. Experimental results using synthetic graphics demonstrate the usefulness of the proposed algorithm.
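
One typical ingredient of calibrating against a circular jig is fitting a circle to measured points. The least-squares (Kåsa) circle fit below is a sketch of that step only, under the assumption of 2D point samples; it is not the paper's full extrinsic-parameter calibration procedure.

```python
# Least-squares (Kasa) circle fit: one illustrative ingredient of jig-based
# calibration, not the paper's complete method.
import numpy as np

def fit_circle(points):
    """Least-squares circle fit: returns (center_x, center_y, radius)."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, radius

if __name__ == "__main__":
    # Noisy samples of a circle centered at (3, -2) with radius 5.
    theta = np.linspace(0, 2 * np.pi, 50)
    pts = np.column_stack([3 + 5 * np.cos(theta), -2 + 5 * np.sin(theta)])
    pts += np.random.default_rng(0).normal(scale=0.05, size=pts.shape)
    print(fit_circle(pts))
```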

A study on an error recovery expert system in the advanced teleoperator system (지적 원격조작시스템의 일환으로서 에러회복 전문가 시스템에 관한 연구)

  • 이순요;염준규;오제상;이창민
    • Journal of the Ergonomics Society of Korea / v.6 no.2 / pp.19-28 / 1987
  • If an error occurs in the automatic mode while the advanced teleoperator system performs a task in a hostile environment, the system switches to the manual mode, where operation by program and operation by human recover the error; the system then resumes the automatic mode and continues the given task. To use inverse kinematics as the means of operation by program in the manual mode, Lee and Nagamachi determined the end point of the robot trajectory plan, which varied with the height of the task object recognized through a TV monitor, solved the end point by fuzzy set theory, and controlled the position of the robot hand by inverse kinematics and its posture by operation by human. However, operation by human took considerable task time because the position and the posture of the robot hand were controlled separately. To reduce this task time, this paper develops an error recovery expert system (ERES). The position of the robot hand is controlled by inverse kinematics in the Cartesian coordinate system toward the end point determined by fuzzy set theory, and the posture of the robot hand is controlled according to the modality of the robot hand's motion derived from the posture of the task object. The knowledge base and the inference engine of the ERES are developed in the muLISP-86 language. The experimental results show that the average human task time with the ERES, which integrates position and posture control of the robot hand, is shorter than that of the preliminary experiment, in which position and posture control were separated. A further study is expected to investigate an even more intelligent robot control system using a superimposed display and digitizer that can present the two-dimensional coordinates of the work space for the convenience of human interaction.
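
The ERES drives the robot hand to a fuzzy-determined end point via inverse kinematics. As a self-contained illustration of that kind of position control, here is the closed-form IK of a standard two-link planar arm; the paper's actual manipulator and link lengths are unknown, so the values below are assumptions.

```python
# Closed-form inverse kinematics of a two-link planar arm (elbow-down solution).
# Link lengths are assumed values for illustration only.
import math

def two_link_ik(x, y, l1=0.4, l2=0.3):
    """Return joint angles (theta1, theta2) that place the end effector at (x, y)."""
    d2 = x * x + y * y
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(cos_t2)
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

if __name__ == "__main__":
    print(two_link_ik(0.5, 0.2))
```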

Hardware Solutions for Interactive Robotic Cane (인터액티브 로봇 지팡이)

  • Shim, Inbo;Yoon, Joongsun
    • Proceedings of the Korean Society of Precision Engineering Conference / 2002.05a / pp.338-341 / 2002
  • A human-friendly interactive system, based on the harmonious symbiotic coexistence of humans and robots, is explored. Based on this interactive technology paradigm, a robotic cane is designed to help blind or visually impaired travelers navigate safely and quickly among obstacles and other hazards faced by blind pedestrians. We outline a set of hardware solutions and working methodologies that can be used to implement and extend the interactive technology to complex environments, robots, and humans. The issues discussed include the interaction of human and robot, design issues of the robotic cane, and hardware requirements for efficient human-robot interaction.

A Study of Localization Algorithm of HRI System based on 3D Depth Sensor through Capstone Design (캡스톤 디자인을 통한 3D Depth 센서 기반 HRI 시스템의 위치추정 알고리즘 연구)

  • Lee, Dong Myung
    • Journal of Engineering Education Research / v.19 no.6 / pp.49-56 / 2016
  • The Human Robot Interface (HRI) system based on a 3D depth sensor for a docent robot is developed, and a localization algorithm based on the extended Kalman filter (EKFLA) is proposed, through a capstone design project by graduate students; the performance of the proposed EKFLA is also analyzed. The developed HRI system consists of route generation and localization algorithms, a user behavior pattern awareness algorithm, map data generation and building algorithms, and an obstacle detection and avoidance algorithm, on top of the robot control modules that govern the robot's overall behavior. It is confirmed that EKFLA reduces the localization error compared with the Kalman-filter-based localization algorithm (KFLA) by 21.96%, 25.81%, and 15.03% in scenarios 1-3, respectively.
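
Since the entry compares Kalman-filter and extended-Kalman-filter localization, a minimal EKF predict/update step for a planar robot is sketched below. The unicycle motion model, direct (x, y) position measurement, and noise values are assumptions for illustration; the paper's actual state and measurement models are not given here.

```python
# Minimal EKF localization step for a planar robot with state (x, y, heading),
# an assumed unicycle motion model, and an assumed direct position measurement.
import numpy as np

def ekf_step(state, P, v, w, z, dt=0.1, Q=None, R=None):
    """One EKF predict/update step. state=(x, y, theta), z=(x_meas, y_meas)."""
    Q = np.diag([0.01, 0.01, 0.005]) if Q is None else Q   # process noise
    R = np.diag([0.05, 0.05]) if R is None else R          # measurement noise
    x, y, th = state

    # Predict with the (nonlinear) unicycle motion model.
    pred = np.array([x + v * dt * np.cos(th),
                     y + v * dt * np.sin(th),
                     th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])                               # motion Jacobian
    P = F @ P @ F.T + Q

    # Update with a linear position measurement z = H x + noise.
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
    innovation = np.asarray(z) - H @ pred
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                          # Kalman gain
    state_new = pred + K @ innovation
    P_new = (np.eye(3) - K @ H) @ P
    return state_new, P_new

if __name__ == "__main__":
    state, P = np.array([0.0, 0.0, 0.0]), np.eye(3) * 0.1
    state, P = ekf_step(state, P, v=1.0, w=0.1, z=[0.11, 0.01])
    print(state)
```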

A Human Robot Interactive System "RoJi"

  • Shim, Inbo;Yoon, Joongsun;Yoh, Myeungsook
    • International Journal of Control, Automation, and Systems / v.2 no.3 / pp.398-405 / 2004
  • A human-friendly interactive system based on the harmonious symbiotic coexistence of humans and robots is explored. Based on the interactive technology paradigm, a robotic cane is proposed for blind or visually impaired pedestrians to navigate safely and quickly through obstacles and other hazards. Robotic aids such as robotic canes require cooperation between humans and robots. Various methods for implementing the appropriate cooperative recognition, planning, and acting have been investigated. The issues discussed include the interaction between humans and robots, design issues of an interactive robotic cane, and behavior arbitration methodologies for navigation planning.
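
The RoJi entry mentions behavior arbitration methodologies for navigation planning. A simple priority-based (subsumption-style) arbitration sketch is shown below; the behaviors, priorities, and sensor fields are hypothetical and not taken from the paper.

```python
# Priority-based (subsumption-style) behavior arbitration sketch for a robotic
# cane. Behaviors, priorities, and sensor fields are hypothetical.
def avoid_obstacle(sensors):
    if sensors["front_range_m"] < 0.5:
        return "steer_away"
    return None

def follow_user_heading(sensors):
    return "align_to_user" if abs(sensors["user_heading_err"]) > 0.2 else None

def cruise(_sensors):
    return "go_straight"

# The highest-priority behavior that produces a command wins.
BEHAVIORS = [avoid_obstacle, follow_user_heading, cruise]

def arbitrate(sensors):
    for behavior in BEHAVIORS:
        command = behavior(sensors)
        if command is not None:
            return command

if __name__ == "__main__":
    print(arbitrate({"front_range_m": 0.3, "user_heading_err": 0.0}))   # steer_away
    print(arbitrate({"front_range_m": 2.0, "user_heading_err": 0.5}))   # align_to_user
```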

Immersive user interfaces for visual telepresence in human-robot interaction (사람과 로봇간 원격작동을 위한 몰입형 사용자 인터페이스)

  • Jang, Su-Hyeong
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.406-410 / 2009
  • As studies on more realistic human-robot interfaces are being actively carried out, interest in telepresence, in which a robot is controlled remotely and environmental information is obtained through a video display, is increasing. In order to provide natural telepresence services by moving a remote robot, the user's behaviors must be recognized. The recognition of user movements in previous telepresence systems was difficult and costly to implement, which limited its application to human-robot interaction. In this paper, using the Nintendo Wii controller, which has recently attracted much attention, together with infrared LEDs, we propose an immersive user interface that easily recognizes the user's position and gaze direction and provides remote video information through an HMD.

User Interfaces for Visual Telepresence in Human-Robot Interaction Using Wii Controller (WII 컨트롤러를 이용한 사람과 로봇간 원격작동 사용자 인터페이스)

  • Jang, Su-Hyung;Yoon, Jong-Won;Cho, Sung-Bae
    • Journal of the HCI Society of Korea / v.3 no.1 / pp.27-32 / 2008
  • As studies on more realistic human-robot interfaces are being actively carried out, interest in telepresence, in which a robot is controlled remotely and environmental information is obtained through a video display, is increasing. In order to provide natural telepresence services by moving a remote robot, the user's behaviors must be recognized. The recognition of user movements in previous telepresence systems was difficult and costly to implement, which limited its application to human-robot interaction. In this paper, using the Nintendo Wii controller, which has recently attracted much attention, together with infrared LEDs, we propose an immersive user interface that easily recognizes the user's position and gaze direction and provides remote video information through an HMD.
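
Both Wii-controller telepresence entries above estimate the user's position and gaze direction from infrared LEDs tracked by the Wii Remote's IR camera. The sketch below shows the basic pinhole geometry such a setup can rely on; the camera field of view, resolution constant, and LED spacing are assumptions for illustration, not values from the papers.

```python
# Pinhole-geometry sketch: estimate range and horizontal viewing angle of a pair
# of IR LEDs from their image coordinates. FOV, resolution, and LED spacing are
# assumed values for illustration.
import math

IMG_WIDTH_PX = 1024           # assumed IR camera horizontal resolution
FOV_H_RAD = math.radians(41)  # assumed horizontal field of view
LED_SPACING_M = 0.20          # assumed physical distance between the two IR LEDs

FOCAL_PX = (IMG_WIDTH_PX / 2) / math.tan(FOV_H_RAD / 2)  # focal length in pixels

def estimate_pose(led1_px, led2_px):
    """Estimate (distance_m, yaw_rad) of the LED pair from its two image points."""
    (x1, y1), (x2, y2) = led1_px, led2_px
    pixel_sep = math.hypot(x2 - x1, y2 - y1)
    distance = FOCAL_PX * LED_SPACING_M / pixel_sep        # similar-triangles range
    mid_x = (x1 + x2) / 2 - IMG_WIDTH_PX / 2               # offset from image center
    yaw = math.atan2(mid_x, FOCAL_PX)                      # horizontal viewing angle
    return distance, yaw

if __name__ == "__main__":
    print(estimate_pose((480, 384), (540, 384)))
```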
