• Title/Summary/Keyword: Robot Interaction

Human Centered Robot for Mutual Interaction in Intelligent Space

  • Jin Tae-Seok;Hashimoto Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems / v.5 no.3 / pp.246-252 / 2005
  • Intelligent Space is a space in which many sensors and intelligent devices are distributed. Mobile robots exist in this space as physical agents that provide humans with services. To realize this, humans and mobile robots must be able to approach each other closely and interact naturally, so it is desirable for a mobile robot to move in a human-affinitive way. In this research, a mobile robot is controlled by the Intelligent Space through its resources and is made to follow a walking human as stably and precisely as possible. The control law for following is derived by assuming that the human and the mobile robot are connected by a virtual spring: the input velocity to the mobile robot is generated from the elastic force of this virtual spring. The performance of the method is verified by computer simulation and experiment.
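
The virtual spring control law can be illustrated with a minimal sketch: the distance between human and robot stretches a virtual spring, and the resulting elastic force is read directly as the robot's input velocity. The gains, rest length, and the added damping term below are illustrative assumptions, not the paper's parameters.

```python
import math

# Illustrative virtual-spring follower: the robot's velocity command is
# derived from the elastic force of a spring connecting robot and human.
K_SPRING = 1.2   # spring constant (assumed value)
B_DAMP   = 0.4   # damping coefficient (assumed; suppresses oscillation)
REST_LEN = 1.0   # natural length of the virtual spring [m]

def follow_velocity(robot_xy, human_xy, robot_vel_xy):
    """Return a (vx, vy) velocity command from the virtual spring force."""
    dx, dy = human_xy[0] - robot_xy[0], human_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return (0.0, 0.0)
    # Elastic force along the line to the human: F = k * stretch.
    stretch = dist - REST_LEN
    fx = K_SPRING * stretch * dx / dist - B_DAMP * robot_vel_xy[0]
    fy = K_SPRING * stretch * dy / dist - B_DAMP * robot_vel_xy[1]
    return (fx, fy)  # interpreted directly as the input velocity

print(follow_velocity((0.0, 0.0), (2.0, 0.0), (0.0, 0.0)))  # -> (1.2, 0.0)
```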

Fuzzy Neural Network Based Sensor Fusion and It's Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems / v.6 no.4 / pp.293-298 / 2006
  • In this paper, a sensor-fusion-based navigation method for the autonomous control of a miniature human-interaction robot is presented. The method blends the optimality of a Fuzzy Neural Network (FNN)-based control algorithm with the knowledge-representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy combines fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the data of ultrasonic sensors and a vision sensor are fused in the identification process. Preliminary experiments and results demonstrate the merit of the introduced navigation control algorithm.
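
As a rough illustration of how rules tuned for goal approach and obstacle avoidance can be combined, the sketch below weights two steering behaviors by the fuzzy membership of "obstacle is near". The membership shape, thresholds, and behavior outputs are assumptions for illustration, not the paper's FNN.

```python
# Illustrative fuzzy blending of goal-approach and obstacle-avoidance.

def mu_near(d, lo=0.3, hi=1.5):
    """Membership of 'obstacle is near', falling linearly from lo to hi [m]."""
    if d <= lo:
        return 1.0
    if d >= hi:
        return 0.0
    return (hi - d) / (hi - lo)

def blended_turn(goal_turn, avoid_turn, obstacle_dist):
    """Weight the two behaviors by fuzzy memberships and defuzzify."""
    w_avoid = mu_near(obstacle_dist)
    w_goal = 1.0 - w_avoid            # 'far' is the complement here
    return (w_goal * goal_turn + w_avoid * avoid_turn) / (w_goal + w_avoid)

print(blended_turn(goal_turn=0.2, avoid_turn=-0.8, obstacle_dist=0.9))  # -0.3
```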

Pictorial Model of Upper Body based Pose Recognition and Particle Filter Tracking (그림모델과 파티클필터를 이용한 인간 정면 상반신 포즈 인식)

  • Oh, Chi-Min;Islam, Md. Zahidul;Kim, Min-Wook;Lee, Chil-Woo
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.186-192 / 2009
  • In this paper, we present a recognition method for human frontal upper-body poses. In HCI (Human Computer Interaction) and HRI (Human Robot Interaction), a person who is interacting usually faces the robot or computer and uses hand gestures, so we focus on frontal upper-body poses. There are two main difficulties: first, the human pose consists of many parts, so it has a high DOF (Degree Of Freedom) and is hard to model; second, matching image features to the model information is difficult. We therefore use a Pictorial Model to model the main poses, which cover most of the space of frontal upper-body poses, and recognize them against a database of main poses. The parameters of the recognized main pose initialize a particle filter, which predicts the posterior distribution of the pose parameters and determines a more specific pose by updating the model parameters from the particle with the maximum likelihood. By recognizing the main poses and then tracking the specific pose, we recognize human frontal upper-body poses.
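
The tracking stage can be sketched as a generic particle filter step over pose parameters: perturb each particle, weight it by an image likelihood, keep the maximum-likelihood particle as the specific pose, and resample. The noise model and the stand-in likelihood below are assumptions, not the paper's observation model.

```python
import random

# Minimal particle-filter step over pose parameters.

def pf_step(particles, likelihood, noise=0.05):
    """particles: list of pose-parameter vectors (lists of floats)."""
    # Predict: perturb each particle's pose parameters with Gaussian noise.
    predicted = [[p + random.gauss(0.0, noise) for p in part] for part in particles]
    # Update: weight each particle by how well it matches the image.
    weights = [likelihood(part) for part in predicted]
    best = predicted[weights.index(max(weights))]  # maximum-likelihood pose
    # Resample proportionally to the weights.
    resampled = random.choices(predicted, weights=weights, k=len(particles))
    return resampled, best

def lik(pose):
    """Dummy likelihood preferring poses near the origin (placeholder)."""
    return 1.0 / (1.0 + pose[0] ** 2 + pose[1] ** 2)

parts = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(100)]
parts, best = pf_step(parts, lik)
print("ML pose estimate:", best)
```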

Hardware Solutions for Interactive Robotic Cane (인터액티브 로봇 지팡이)

  • 심인보;윤중선
    • Proceedings of the Korean Society of Precision Engineering Conference / 2002.05a / pp.338-341 / 2002
  • A human-friendly interactive system, based on the harmonious symbiotic coexistence of humans and robots, is explored. Based on this interactive technology paradigm, a robotic cane is designed to help blind or visually impaired travelers navigate safely and quickly among the obstacles and other hazards faced by blind pedestrians. We outline a set of hardware solutions and working methodologies that can be used to implement the interactive technology and extend it to complex environments, robots, and humans. The issues discussed include the interaction of humans and robots, the design of the robotic cane, and the hardware requirements for efficient human-robot interaction.

Recognition and Generation of Facial Expression for Human-Robot Interaction (로봇과 인간의 상호작용을 위한 얼굴 표정 인식 및 얼굴 표정 생성 기법)

  • Jung Sung-Uk;Kim Do-Yoon;Chung Myung-Jin;Kim Do-Hyoung
    • Journal of Institute of Control, Robotics and Systems / v.12 no.3 / pp.255-263 / 2006
  • In the last decade, face analysis, e.g. face detection, face recognition, and facial expression recognition, has been a lively and expanding research field. As computer-animated agents and robots bring a social dimension to human-computer interaction, interest in this field is increasing rapidly. In this paper, we introduce an artificial emotion mimic system that can recognize human facial expressions and also generate the recognized facial expression. To recognize human facial expressions in real time, we propose a facial expression classification method performed by weak classifiers obtained using new rectangular feature types. In addition, we generate artificial facial expressions using the developed robotic system, based on biological observation. Finally, experimental results of facial expression recognition and generation are shown to validate our robotic system.
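
A minimal sketch of classification by thresholded weak classifiers over rectangular features is given below; the feature definition, thresholds, and weights are placeholders, not the paper's new rectangular feature types.

```python
# Sketch of a weighted vote of threshold weak classifiers over
# rectangle features, in the spirit of real-time expression recognition.

def rect_feature(image, rect):
    """Sum of pixels in rect minus twice the sum in its right half (placeholder)."""
    x, y, w, h = rect
    total = sum(image[r][c] for r in range(y, y + h) for c in range(x, x + w))
    right = sum(image[r][c] for r in range(y, y + h) for c in range(x + w // 2, x + w))
    return total - 2 * right

def classify(image, weak_classifiers):
    """weak_classifiers: list of (rect, threshold, polarity, alpha)."""
    score = 0.0
    for rect, thresh, polarity, alpha in weak_classifiers:
        vote = 1 if polarity * rect_feature(image, rect) < polarity * thresh else -1
        score += alpha * vote
    return score > 0.0  # expression present if the weighted vote is positive

img = [[1, 2], [3, 4]]  # toy 2x2 "image"
print(classify(img, [((0, 0, 2, 2), 0.0, 1, 1.0)]))  # -> True
```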

Recognizing User Engagement and Intentions based on the Annotations of an Interaction Video (상호작용 영상 주석 기반 사용자 참여도 및 의도 인식)

  • Jang, Minsu;Park, Cheonshu;Lee, Dae-Ha;Kim, Jaehong;Cho, Young-Jo
    • Journal of Institute of Control, Robotics and Systems / v.20 no.6 / pp.612-618 / 2014
  • A pattern-classifier-based approach for recognizing the internal states of human participants in interactions is presented along with its experimental results. The approach includes collecting video recordings of human-human or human-robot interactions and then analyzing the videos using human-coded annotations. The annotations include the social signals directly observed in the recordings and the internal states of the participants indirectly inferred from those signals. A pattern classifier is then trained on the annotation data and tested. In our experiments on human-robot interaction, 7 video recordings were collected and annotated with 20 social signals and 7 internal states. Several experiments with a C4.5-based decision tree classifier obtained a recall rate of 84.83% for interaction engagement, 93% for concentration intention, and 81% for task comprehension level.
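
A minimal sketch of the classifier-training step is shown below, with scikit-learn's CART decision tree standing in for C4.5; the social-signal features and labels are fabricated placeholders, not the paper's annotation scheme.

```python
# Train a decision tree on annotated social signals to predict an
# internal state; scikit-learn's CART stands in for C4.5 here.
from sklearn.tree import DecisionTreeClassifier

# Each row: observed social signals (e.g., gaze-on-robot, nodding, smiling).
X = [[1, 0, 1],
     [0, 0, 0],
     [1, 1, 1],
     [0, 1, 0]]
# Annotated internal state: 1 = engaged, 0 = not engaged.
y = [1, 0, 1, 0]

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[1, 1, 0]]))  # predicted engagement for a new clip
```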

Robot Mobile Control Technology Using Robot Arm as Haptic Interface (로봇의 팔을 햅틱 인터페이스로 사용하여 로봇의 이동을 제어하는 기술)

  • Jung, Yu Chul;Lee, Seongsoo
    • Journal of IKEEE / v.17 no.1 / pp.44-50 / 2013
  • This paper proposes the implementation of a haptic-based robot that follows a human using only the fundamental sensors in the robot arms, without additional sensors. The joints of the robot arms are driven by motors whose angles can be read out when a human pushes or pulls the arms, so the arms themselves can serve as haptic sensors. The implemented robot follows a human through the interaction of its arms with the human's hands, much as one person leads another by the hand.
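
The arm-as-haptic-sensor idea can be sketched as reading joint deflections from a rest pose and mapping them to a base velocity command. The joint names, rest pose, and gains below are assumptions for illustration.

```python
# Sketch of using arm-joint deflections as a haptic input: when a person
# pushes or pulls the arm, the joint angles deviate from their rest pose,
# and the deviation is mapped to a base velocity command.

REST_POSE = {"shoulder_pitch": 0.0, "shoulder_yaw": 0.0}  # assumed joints
GAIN = {"shoulder_pitch": 0.8, "shoulder_yaw": 0.5}       # assumed gains

def base_command(joint_angles):
    """Map joint deflections (rad) to (forward, turn) velocities."""
    forward = GAIN["shoulder_pitch"] * (joint_angles["shoulder_pitch"]
                                        - REST_POSE["shoulder_pitch"])
    turn = GAIN["shoulder_yaw"] * (joint_angles["shoulder_yaw"]
                                   - REST_POSE["shoulder_yaw"])
    return forward, turn

print(base_command({"shoulder_pitch": 0.2, "shoulder_yaw": -0.1}))
```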

A Case Study on the Nonverbal Immediacy of the Robot (로봇의 비언어적 즉시성에 대한 사례연구)

  • Jeong, Seongmi;Shin, Dong-Hee;Gu, Jihyang
    • The Journal of the Korea Contents Association / v.15 no.7 / pp.181-192 / 2015
  • Nonverbal immediacy plays a key role in interpersonal communication, inducing closeness and further interaction. This case study investigates nonverbal behaviors in Human-Robot Interaction (HRI) with a focus on immediacy, and how they affect the perception of a robot. The results show that nonverbal immediacy cues, such as nodding and leaning forward, affect perceived interactivity. Their meaning is interpreted clearly when verbal feedback or another communication channel reinforces them. Touch is also found to evoke a sense of affinity with the robot, because it is associated with a concrete and discrete context. People tend to apply social rules to the robot, yet they are more open to an unfamiliar robot than to human strangers. The findings provide heuristic implications for future HRI studies by showing a direction and clarifying principles for composing verbal and nonverbal expressions.

An Emotional Gesture-based Dialogue Management System using Behavior Network (행동 네트워크를 이용한 감정형 제스처 기반 대화 관리 시스템)

  • Yoon, Jong-Won;Lim, Sung-Soo;Cho, Sung-Bae
    • Journal of KIISE:Software and Applications / v.37 no.10 / pp.779-787 / 2010
  • As robots have come into wide use, research on human-robot communication has become active. Typically, natural language processing or gesture generation has been applied to human-robot interaction, but existing methods support only static communication, so a method for more natural and realistic interaction is required. In this paper, an emotional gesture-based dialogue management system is proposed for sophisticated human-robot communication. The proposed system conducts dialogue using Bayesian networks and pattern matching, and generates the robot's emotional gestures in real time while the user communicates with it. Through emotional gestures, the robot can communicate with the user more efficiently and realistically. We use behavior networks for gesture generation to handle dialogue situations that change dynamically. Finally, we designed a usability test to confirm the usefulness of the proposed system by comparison with an existing dialogue system.
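
As a loose illustration of behavior-network-style gesture selection, the toy sketch below accumulates activation for each gesture from the current dialogue state and fires the most activated gesture above a threshold. The states, behaviors, weights, and threshold are invented for illustration and greatly simplify an actual behavior network.

```python
# Toy gesture selection: each gesture behavior gains activation from the
# dialogue states it is linked to; the strongest behavior above a
# threshold is executed.

BEHAVIORS = {
    "nod":        {"greeting": 0.2, "agreement": 0.9},
    "wave":       {"greeting": 0.9, "farewell": 0.8},
    "head_shake": {"disagreement": 0.9},
}
THRESHOLD = 0.5

def select_gesture(state_activations):
    """state_activations: dict of dialogue-state name -> strength in [0, 1]."""
    scores = {
        name: sum(w * state_activations.get(s, 0.0) for s, w in links.items())
        for name, links in BEHAVIORS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] >= THRESHOLD else None

print(select_gesture({"greeting": 1.0}))  # -> 'wave'
```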