• Title/Summary/Keyword: Robot Gestures

Search results: 40

Generation of Robot Facial Gestures based on Facial Actions and Animation Principles (Facial Actions 과 애니메이션 원리에 기반한 로봇의 얼굴 제스처 생성)

  • Park, Jeong Woo;Kim, Woo Hyun;Lee, Won Hyong;Lee, Hui Sung;Chung, Myung Jin
    • Journal of Institute of Control, Robotics and Systems / v.20 no.5 / pp.495-502 / 2014
  • This paper proposes a method to generate diverse robot facial expressions and facial gestures to support long-term HRI. First, nine basic dynamics for diverse robot facial expressions, even for identical emotions, are determined based on the dynamics of human facial expressions and on animation principles. In the second stage, facial actions are added to express facial gestures such as sniffling or wailing loudly for sadness, or laughing aloud or smiling for happiness. To evaluate the effectiveness of the approach, we compared the facial expressions of the developed robot with and without the proposed method. Survey results showed that the proposed method helps robots generate more realistic facial expressions.

Safe and Reliable Intelligent Wheelchair Robot with Human Robot Interaction

  • Hyuk, Moon-In;Hyun, Joung-Sang;Kwang, Kum-Young
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.120.1-120 / 2001
  • This paper proposes a prototype of a safe and reliable wheelchair robot with Human Robot Interaction (HRI). Since wheelchair users are usually physically disabled, the wheelchair robot must guarantee safe and reliable motion while considering the user's intention. A single color CCD camera is mounted to input the user's commands via human-friendly gestures, and an ultrasonic sensor array senses the external environment. Face and hand directional gestures serve as the user's commands. By combining the user's command with the sensed environment configuration, the planner of the wheelchair robot selects an optimal motion. We implemented a prototype wheelchair robot, MR. HURI (Mobile Robot with Human Robot Interaction) ...


An Extraction Method of Meaningful Hand Gesture for a Robot Control (로봇 제어를 위한 의미 있는 손동작 추출 방법)

  • Kim, Aram;Rhee, Sang-Yong
    • Journal of the Korean Institute of Intelligent Systems / v.27 no.2 / pp.126-131 / 2017
  • In this paper, we propose a method to extract the meaningful motion from the various hand gestures used when giving commands to a robot. When commanding a robot, a person's hand gestures can be divided into a preparation motion, a main motion, and a finishing motion. The main motion is the one that carries the command; the others are meaningless auxiliary movements surrounding it. Therefore, only the main motion must be extracted from the continuous hand gestures. In addition, people may move their hands unconsciously, and the robot must also judge such actions as meaningless. In this study, we extract human skeleton data from a depth image obtained with a Kinect v2 sensor and derive hand position data from it. Using a Kalman filter, we track the hand position and distinguish whether a motion is meaningful or meaningless, and we recognize the gesture with a hidden Markov model.
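
The Kalman-filter step of the pipeline above (skeleton → hand position → tracking) can be sketched with a minimal constant-velocity filter for a 2D hand position. The state layout, noise variances, and frame rate below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

class HandKalmanFilter:
    """Constant-velocity Kalman filter for a 2-D hand position.
    State: [x, y, vx, vy]; measurement: [x, y]. All tuning values are
    placeholders, not taken from the paper."""
    def __init__(self, dt=1/30, process_var=1e-2, meas_var=1e-1):
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt            # position += velocity * dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0           # we observe position only
        self.Q = process_var * np.eye(4)            # process noise
        self.R = meas_var * np.eye(2)               # measurement noise
        self.x = np.zeros(4)                        # state estimate
        self.P = np.eye(4)                          # state covariance

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z = [x, y]
        y = np.asarray(z) - self.H @ self.x         # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                           # smoothed position
```

The filtered velocity estimate (`self.x[2:]`) could then serve as one cue for separating the main motion from slow preparation or finishing movements, before the HMM stage.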

Design and implement of the Educational Humanoid Robot D2 for Emotional Interaction System (감성 상호작용을 갖는 교육용 휴머노이드 로봇 D2 개발)

  • Kim, Do-Woo;Chung, Ki-Chull;Park, Won-Sung
    • Proceedings of the KIEE Conference / 2007.07a / pp.1777-1778 / 2007
  • In this paper, we design and implement a humanoid robot for educational purposes that can collaborate and communicate with humans. We present an affective human-robot communication system for the humanoid robot D2, which we designed to communicate with a human through dialogue. D2 communicates by understanding and expressing emotion using facial expressions, voice, gestures, and posture. Interaction between a human and the robot is made possible through our affective communication framework, which enables the robot to catch the emotional status of the user and respond appropriately; as a result, the robot can engage in a natural dialogue with a human. To support interaction through voice, gestures, and posture, the developed educational humanoid robot consists of an upper body, two arms, a wheeled mobile platform, and control hardware including vision and speech capability and various control boards, such as motion control boards and a signal processing board handling several types of sensors. Using D2, we have presented successful demonstrations consisting of manipulation tasks with two arms, object tracking using the vision system, and communication with humans via the emotional interface, synthesized speech, and recognition of speech commands.


Design and Control of Wire-driven Flexible Robot Following Human Arm Gestures (팔 동작 움직임을 모사하는 와이어 구동 유연 로봇의 설계 및 제어)

  • Kim, Sanghyun;Kim, Minhyo;Kang, Junki;Son, SeungJe;Kim, Dong Hwan
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.50-57 / 2019
  • This work presents a design and control method for a flexible, wire-driven robot arm that follows human gestures. When moving the arm to a desired position, the necessary wire lengths are calculated and the motors are rotated accordingly. The arm is composed of two modules to mimic real human motion. Two wires form a closed loop in each module, and universal joints attached to each disk create up, down, left, and right movements. For motor control, an anti-windup PID is applied to limit the sudden changes usually caused by accumulated error in the integral term. In addition, a master/slave communication protocol and an operation program linking six motors to MYO and IMU sensor outputs were developed. This makes it possible to receive image information from a camera attached to the arm while simultaneously sending control commands to the robot at high speed.
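
Anti-windup PID is a standard technique; the abstract does not say which variant the authors use, so below is a minimal sketch of one common scheme, back-calculation, in which the integrator is bled whenever the output saturates. All gains and limits are hypothetical:

```python
class AntiWindupPID:
    """PID controller with back-calculation anti-windup: when the output
    saturates, the integrator is driven back toward the unsaturated range.
    Gains and limits here are illustrative, not the paper's values."""
    def __init__(self, kp, ki, kd, out_min, out_max, kaw, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.kaw = kaw              # back-calculation (anti-windup) gain
        self.dt = dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        u_sat = min(max(u, self.out_min), self.out_max)
        # Anti-windup: the (u_sat - u) term is negative while saturated,
        # so the integral stops accumulating runaway error
        self.integral += (err + self.kaw * (u_sat - u)) * self.dt
        return u_sat
```

Without the `kaw * (u_sat - u)` term this reduces to a plain PID, whose integral keeps growing during saturation and causes the sudden overshoot the paper is guarding against.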

Development of a Cost-Effective Tele-Robot System Delivering Speaker's Affirmative and Negative Intentions (화자의 긍정·부정 의도를 전달하는 실용적 텔레프레즌스 로봇 시스템의 개발)

  • Jin, Yong-Kyu;You, Su-Jeong;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society / v.10 no.3 / pp.171-177 / 2015
  • A telerobot offers a more engaging and enjoyable interaction with people at a distance by communicating via audio, video, expressive gestures, body pose, and proxemics. To provide these benefits at a reasonable cost, this paper presents a telepresence robot system for video communication that can deliver a speaker's head motion through its display stanchion. Head gestures such as nodding and head-shaking convey crucial information during conversation, and a speaker's eye gaze, one of the key non-verbal signals for interaction, can be inferred from head pose. To develop an efficient head tracking method, a 3D cylinder-like head model is employed, and the Harris corner detector is combined with Lucas-Kanade optical flow, which is well suited to extracting the model's 3D motion information. In particular, a skin color-based face detection algorithm is proposed to achieve robust performance under varying head orientations while maintaining reasonable computational cost. The performance of the proposed head tracking algorithm is verified through experiments on BU's standard data sets. The robot platform design is also described, along with supporting systems such as the video transmission and robot control interfaces.
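
As one illustration of the skin color-based detection idea, a fixed-threshold rule in YCbCr space is a common textbook heuristic; the paper's actual color model and thresholds are not given here, so the ranges below are assumptions:

```python
import numpy as np

def skin_mask_ycbcr(img):
    """Boolean skin mask via fixed Cb/Cr thresholds (a widely used rule of
    thumb; not the paper's actual thresholds). img is an RGB uint8 array."""
    img = img.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    # RGB -> Cb/Cr (ITU-R BT.601 conversion)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)
```

The resulting mask would restrict where the Harris corners are sought, which is what keeps the per-frame cost low.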

Human-like Arm Movement Planning for Humanoid Robots Using Motion Capture Database (모션캡쳐 데이터베이스를 이용한 인간형 로봇의 인간다운 팔 움직임 계획)

  • Kim, Seung-Su;Kim, Chang-Hwan;Park, Jong-Hyeon;You, Bum-Jae
    • The Journal of Korea Robotics Society / v.1 no.2 / pp.188-196 / 2006
  • When communicating and interacting with a human through motions or gestures, a humanoid robot needs not only to look like a human but also to behave like one so that the meanings of its motions are clear. Among various human-like behaviors, the arm motions of a humanoid robot are essential for communicating with people through motion. In this work, a mathematical representation for characterizing human arm motions is first proposed. Human arm motions are characterized by the elbow elevation angle, which is determined from the position and orientation of the hand; the representation is obtained using an approximation tool, the Response Surface Method (RSM). A method to generate human-like arm motions in real time using the proposed representation is then presented. The method was evaluated by having the humanoid robot move its arm from one point to another, including a rotation of its hand. The example motion was performed on the KIST humanoid robot MAHRU.
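
An RSM fit of this kind amounts to least-squares fitting of a low-order polynomial surface to sampled data. A generic second-order sketch is shown below; the actual inputs (hand position and orientation from motion capture) and the paper's experimental design are not reproduced:

```python
import numpy as np

def fit_response_surface(X, y):
    """Fit a second-order response surface
        y ≈ b0 + Σ bi·xi + Σ bij·xi·xj
    by ordinary least squares — the core computation of RSM.
    X: (n, d) input samples, y: (n,) responses."""
    n, d = X.shape
    def design(Xm):
        cols = [np.ones(len(Xm))]
        cols += [Xm[:, i] for i in range(d)]                               # linear
        cols += [Xm[:, i] * Xm[:, j] for i in range(d) for j in range(i, d)]  # quadratic/cross
        return np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    def predict(Xq):
        return design(np.atleast_2d(Xq)) @ beta
    return predict
```

Once fitted offline, `predict` is a cheap polynomial evaluation, which is what makes real-time generation of the elbow elevation angle feasible.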

  • PDF

The Virtual Robot Arm Control Method by EMG Pattern Recognition using the Hybrid Neural Network System (혼합형 신경회로망을 이용한 근전도 패턴 분류에 의한 가상 로봇팔 제어 방식)

  • Jung, Kyung-Kwon;Kim, Joo-Woong;Eom, Ki-Hwan
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.10 / pp.1779-1785 / 2006
  • This paper presents a method of virtual robot arm control by EMG pattern recognition using the proposed hybrid system. The hybrid system is composed of an LVQ network and a SOFM, with the SOFM serving as a preprocessor for the LVQ: it converts the high-dimensional EMG signals into 2-dimensional data. The EMG measurement system uses three surface electrodes to acquire the EMG signal from the operator. Six hand gestures can be classified reliably by the proposed hybrid system. Experimental results demonstrate the effectiveness of virtual robot arm control using the proposed hybrid classifier for recognizing hand gestures from EMG signal patterns.
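
The SOFM-to-LVQ pipeline can be sketched in miniature: a small self-organizing map sends each high-dimensional feature vector to the 2-D grid coordinate of its best-matching unit, and LVQ1 then classifies those coordinates. Grid size, learning rates, and the feature vectors themselves are placeholder assumptions, not the paper's configuration:

```python
import numpy as np

def train_som(X, grid=(8, 8), epochs=20, lr=0.5, sigma=2.0, seed=0):
    """Minimal SOFM: learns a map from high-dimensional vectors to a 2-D grid."""
    rng = np.random.default_rng(seed)
    h, w = grid
    W = rng.normal(size=(h * w, X.shape[1]))              # unit weight vectors
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for e in range(epochs):
        a = 1 - e / epochs                                # decays toward 0
        for x in X:
            bmu = np.argmin(((W - x) ** 2).sum(1))        # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(1)     # grid distance to BMU
            nb = np.exp(-d2 / (2 * (sigma * a + 1e-3) ** 2))
            W += (lr * a) * nb[:, None] * (x - W)         # pull neighbourhood toward x
    return W, coords

def som_project(W, coords, X):
    """Replace each vector by the 2-D grid coordinate of its BMU."""
    return np.array([coords[np.argmin(((W - x) ** 2).sum(1))] for x in X])

def lvq1(Z, labels, classes, epochs=30, lr=0.1, seed=0):
    """LVQ1 on the 2-D SOM outputs: one prototype per class, pulled toward
    same-class samples and pushed away from other-class samples."""
    protos = np.array([Z[labels == c][0] for c in classes], float)
    for e in range(epochs):
        a = lr * (1 - e / epochs)
        for z, y in zip(Z, labels):
            k = np.argmin(((protos - z) ** 2).sum(1))
            protos[k] += a * (z - protos[k]) if classes[k] == y else -a * (z - protos[k])
    return protos
```

At run time a new EMG feature vector is projected through `som_project` and assigned the class of the nearest LVQ prototype.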

Implementation of a Spring Backboned Soft Arm Emulating Human Gestures (인간 동작 표현용 스프링 백본 구조 소프트 암의 구현)

  • Yoon, Hyun-Soo;Choi, Jae-Yeon;Oh, Se-Min;Lee, Byeong-Ju;Yoon, Ho-Sup;Cho, Young-Jo
    • The Journal of Korea Robotics Society / v.7 no.2 / pp.65-75 / 2012
  • This study deals with the design of a soft arm with a spring backbone, employed to generate human gestures as an effective means of human-robot interaction. The special features of the proposed mechanism are its light weight and the flexibility afforded by the spring backbone; even in a collision with a human, the device can absorb the impact structurally. The kinematics and design of the soft arm are introduced. The performance of the mechanism was demonstrated in experiments emulating several human gestures expressing emotion, along with some service content. Finally, the soft arm was implemented as the wing mechanism of a penguin robot.

Improvement of Gesture Recognition using 2-stage HMM (2단계 히든마코프 모델을 이용한 제스쳐의 성능향상 연구)

  • Jung, Hwon-Jae;Park, Hyeonjun;Kim, Donghan
    • Journal of Institute of Control, Robotics and Systems / v.21 no.11 / pp.1034-1037 / 2015
  • In recent years, various methods have been developed in robotics to create an intimate relationship between people and robots, including speech, vision, and biometrics recognition as well as gesture-based interaction. These recognition technologies are used in wearable devices, smartphones, and other electronic devices for convenience; among them, gesture recognition is the most common and the most appropriate for wearable devices. Gesture recognition can be classified as contact or non-contact. This paper proposes contact gesture recognition with IMU and EMG sensors by applying a hidden Markov model (HMM) twice. Several simple motions form main gestures through the first-stage HMM — the standard HMM process well known for pattern recognition. The sequence of main gestures output by the first stage then forms higher-order gestures through the second-stage HMM. In this way, more natural and intelligent gestures can be implemented from simple ones, and the process can play a larger role in gesture-recognition-based UX for many wearable and smart devices.
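
The two-stage decoding idea — raw motion symbols decoded into main gestures, whose sequence is then decoded into higher-order gestures — can be sketched with a plain Viterbi decoder applied twice. The transition/emission matrices in the usage below are toy values, not the paper's trained models:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state sequence for a discrete-observation HMM
    (log-space Viterbi). pi: initial probs, A: transitions, B: emissions."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])        # log-probs at t = 0
    back = np.zeros((T, N), int)                    # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)          # scores[prev, next]
        back[t] = scores.argmax(0)                  # best predecessor per state
        logd = scores.max(0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):                   # backtrack
        path.append(int(back[t][path[-1]]))
    return path[::-1]

def two_stage_decode(raw_obs, hmm1, hmm2):
    """Stage 1: sensor symbols -> main gestures.
    Stage 2: main-gesture sequence -> higher-order gestures."""
    main = viterbi(raw_obs, *hmm1)
    return viterbi(main, *hmm2)
```

The key structural point is that the stage-1 state sequence is fed to stage 2 as its observation sequence, which is how simple motions compose into higher-order gestures.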