• Title/Summary/Keyword: Robot Interface

Search results: 444

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / v.3 no.4 / pp.229-235 / 2001
  • Recently, the service sector has emerged as a field of robotic applications. Although assistant robots play an important role for the disabled and the elderly, users still struggle to operate them with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface based on a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. Because the camera and the receiver of the magnetic sensor are stationary with respect to the head, the measured data are sufficient to describe both eye and head movement. The proposed system can therefore obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the practical validity of the proposed system and verify its feasibility as a new computer interface for the disabled.
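The abstract does not reproduce the system's geometry; the following is a minimal sketch of the head-compensation idea it describes, assuming the magnetic sensor reports head orientation as Z-Y-X Euler angles and the eye tracker gives a gaze vector in the head frame (function names and conventions are illustrative, not from the paper):

```python
import numpy as np

def head_rotation(yaw, pitch, roll):
    """Rotation matrix from head frame to world frame (Z-Y-X Euler angles, radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def world_gaze(eye_gaze_head, head_pose):
    """Combine the eye-in-head gaze vector with the head orientation
    measured by the magnetic sensor to get the world gaze direction."""
    R = head_rotation(*head_pose)
    v = R @ np.asarray(eye_gaze_head, dtype=float)
    return v / np.linalg.norm(v)
```

With a zero head pose the gaze vector passes through unchanged; rotating the head simply rotates the gaze by the same amount, which is why the system remains valid under head movement.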


Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.4 / pp.562-567 / 2009
  • As a key mechanism of human emotional interaction, facial expression is a powerful tool in Human-Robot Interface (HRI) as well as in Human-Computer Interface. By using facial expressions, a system can produce reactions corresponding to the emotional state of the user in Human-Computer Interaction (HCI), and a service agent such as an intelligent robot can infer which services are suitable to offer the user. This article addresses expressive face modeling using an advanced Active Appearance Model (AAM) for facial emotion recognition. We consider the six universal emotional categories defined by Ekman. In the human face, emotions are most widely expressed through the eyes and mouth, so recognizing emotion from a facial image requires extracting feature points such as Ekman's Action Units (AU). The AAM is one of the most commonly used methods for facial feature extraction and can be applied to construct AUs. Because the traditional AAM depends on the setting of the model's initial parameters, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian Network. First, we obtain the reconstruction parameters of a new gray-scale image by sample-based learning, use them to reconstruct the shape and texture of the image, and compute the initial AAM parameters from the reconstructed facial model. We then reduce the distance error between the model and the target contour by adjusting the model parameters. After several iterations we obtain a model matched to the facial feature outline and use it to recognize the facial emotion with the Bayesian Network.
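The abstract does not give the structure of its Bayesian network; as an illustration of the final classification step only, here is a toy naive-Bayes stand-in over binary AU activations (the AU encoding, class set, and Laplace smoothing are assumptions, not the paper's model):

```python
import numpy as np

class NaiveBayesAU:
    """Toy discrete Bayesian classifier over binary Action Unit activations.
    A simplified stand-in for the paper's Bayesian network."""

    def fit(self, X, y, alpha=1.0):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_ = np.unique(y)
        self.prior_ = np.array([(y == c).mean() for c in self.classes_])
        # P(AU_j = 1 | class c), Laplace-smoothed to avoid zero probabilities
        self.p_ = np.array([(X[y == c].sum(0) + alpha) / ((y == c).sum() + 2 * alpha)
                            for c in self.classes_])
        return self

    def predict(self, x):
        """Pick the class maximizing log P(class) + sum_j log P(AU_j | class)."""
        x = np.asarray(x)
        loglik = (np.log(self.p_) * x + np.log(1 - self.p_) * (1 - x)).sum(1)
        return self.classes_[np.argmax(loglik + np.log(self.prior_))]
```

In the paper's setting the classes would be the six Ekman emotions and the features the AUs extracted by the AAM fit; here both are left generic.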

Gesture Interface for Controlling Intelligent Humanoid Robot (지능형 로봇 제어를 위한 제스처 인터페이스)

  • Bae Ki Tae;Kim Man Jin;Lee Chil Woo;Oh Jae Yong
    • Journal of Korea Multimedia Society / v.8 no.10 / pp.1337-1346 / 2005
  • In this paper, we describe an algorithm that can automatically recognize human gestures for Human-Robot interaction. Many earlier gesture recognition systems operate only under heavily restricted conditions. To remove these restrictions, we previously proposed APM, a method that can represent 3D and 2D gesture information simultaneously and is less sensitive to noise and appearance variation. First, feature vectors are extracted using APM. Next, a gesture space is constructed by analyzing the statistical information of the training images with PCA. Input images are then compared to the model and individually symbolized to one region of the model space. In the last step, the symbolized image sequences are recognized with HMMs as one of the model gestures. The experimental results indicate that the proposed algorithm is effective for gesture recognition and convenient to apply to humanoid robots or intelligent interface systems.
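The pipeline in the abstract (PCA gesture space → symbolization → HMM scoring) can be outlined as follows; the APM feature extraction itself is omitted, and the model sizes, codebook, and HMM parameters are placeholders rather than the paper's values:

```python
import numpy as np

def pca_fit(X, k):
    """Build a k-dimensional gesture space from training feature vectors (rows of X)."""
    mean = X.mean(0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]  # principal components as rows

def symbolize(X, mean, components, codebook):
    """Project frames into the gesture space, then map each to its nearest symbol."""
    Z = (X - mean) @ components.T
    d = ((Z[:, None, :] - codebook[None]) ** 2).sum(-1)
    return d.argmin(1)

def hmm_loglik(symbols, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a symbol sequence under one
    gesture HMM (pi: initial probs, A: transitions, B: emission probs)."""
    alpha = pi * B[:, symbols[0]]
    c = alpha.sum()
    ll, alpha = np.log(c), alpha / c
    for s in symbols[1:]:
        alpha = (alpha @ A) * B[:, s]
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return ll
```

Recognition then amounts to scoring the symbol sequence against one trained HMM per model gesture and taking the argmax.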


Color Vision System for Intelligent Rehabilitation Robot mounted on the Wheelchair (휠체어 장착형 지능형 재활 로봇을 위한 칼라 비전 시스템)

  • Song, Won-Kyung;Lee, He-Young;Kim, Jong-Sung;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics S / v.35S no.11 / pp.75-87 / 1998
  • KARES (KAIST Rehabilitation Engineering System) is a rehabilitation robot system consisting of a 6-degree-of-freedom robot arm mounted on a wheelchair, intended to support the independent daily living of the disabled and the elderly. An interface device for programming and controlling the robot arm is essential in a rehabilitation robotic system. In particular, manual operation of the robot arm imposes a cognitive burden on the user and is difficult to perform. As a remedy, a color vision system for the autonomous execution of jobs is proposed, and four basic target jobs are specified. By mounting the camera in an eye-in-hand configuration, the color vision system for KARES is set up. The desired jobs of picking up a target and moving it to the user's face for drinking were successfully performed in real time in an indoor environment.
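As a rough illustration of the color-based target detection such a vision system performs (not the paper's actual algorithm), a threshold-and-centroid step might look like:

```python
import numpy as np

def find_colored_target(rgb, lo, hi):
    """Locate a target by color: keep pixels whose RGB channels all fall
    within [lo, hi], then return the centroid (row, col) of the matching
    region as a grasp-point candidate. The bounds are illustrative."""
    mask = np.all((rgb >= lo) & (rgb <= hi), axis=-1)
    if not mask.any():
        return None  # no target of that color in view
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

In an eye-in-hand setup this image-plane centroid would then drive a visual servoing loop toward the object; that control step is outside the scope of this sketch.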


Implementation of Home Monitoring System Using a Vacuum Robot with Wireless Router (유무선공유기와 청소로봇을 이용한 홈 모니터링 시스템의 구현)

  • Jeon, Byung-Chan;Choi, Gyoo-Seok;Kang, Jeong-Jin
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.8 no.5 / pp.73-80 / 2008
  • The recent trend in home network systems is toward intelligent home environments in which remote monitoring and control services are available without restriction by device type, time, or place. The use of vacuum robots in the home is also becoming common because of their convenience. In this paper, we propose and realize a new home-monitoring system that employs a self-moving robot as one step toward an intelligent home in a home network environment. The proposed system can monitor anywhere in the home, because it effectively overcomes the surveillance limitations of existing monitoring systems by attaching a wireless router and a webcam to a commercial vacuum robot. Outdoor users can readily monitor any place they want to watch by controlling the vacuum robot with a mobile telecommunication device such as a PDA. The wireless router, running the Linux-based "OpenWrt" operating system, allows users to receive transmitted images and to control the vacuum robot over RS-232 communication.


Study of Educational Insect Robot that Utilizes Mobile Augmented Reality Digilog Book (모바일 증강현실 Digilog Book을 활용한 교육용 곤충로봇 콘텐츠)

  • Park, Young-Sook;Park, Dea-Woo
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.6 / pp.1355-1360 / 2014
  • In this paper, we apply a mobile augmented reality Digilog Book to insect robot learning. In the electronic era, books once confined to paper have moved into virtual-reality space. Virtual reality is a technology that lets a user indirectly experience, through an immersive interface, situations that spatial and physical constraints prevent from being experienced directly in the real world. The Digilog Book, which fuses analog paper with digital content, applies augmented reality technology to robot learning so that learners can experience various interactions. Moving elements, three-dimensional images, and animation are applied to enrich the learning and, for easier block assembly, to make the assembly order of the blocks easier to grasp. In particular, because the Digilog Book runs on a mobile phone, robot learning is possible anywhere at any time.

Vibration Control of Working Booms on Articulated Bridge Inspection Robots (교량검사 굴절로봇 작업붐의 진동제어)

  • Hwang, In-Ho;Lee, Hu-Seok;Lee, Jong-Seh
    • Journal of the Computational Structural Engineering Institute of Korea / v.21 no.5 / pp.421-427 / 2008
  • A robot crane truck has been developed by the Bridge Inspection Robot Development Interface (BRIDI) for automated and/or teleoperated bridge inspection. The crane truck looks similar to a conventional bucket crane but is much smaller and lighter. At the end of the 12 m telescoping boom, a robot platform is mounted that allows the operator to scan the bridge structure under the deck through a camera. Boom vibration induced by wind and deck movement can cause serious problems in this scanning system. This paper presents a control system to mitigate such vibration of the robot boom. In the proposed control system, an actuator is installed at the end of the working boom. The control system is studied through mathematical model analysis with an LQ control algorithm and a scaled model test in the laboratory. The study indicates that the proposed system is effective for vibration control of the robot booms, demonstrating its immediate applicability in the field.
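The LQ control design mentioned in the abstract can be sketched generically; the single-mode boom model, discretization, and weights below are illustrative assumptions, not the paper's identified parameters:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQ regulator gain via fixed-point iteration of the
    Riccati equation; returns K such that u = -K x minimizes sum(x'Qx + u'Ru)."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Illustrative single-mode boom model (mass-normalized spring-damper):
dt, wn, zeta = 0.01, 2 * np.pi * 1.0, 0.02    # 1 Hz bending mode, light damping
Ac = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
Bc = np.array([[0.0], [1.0]])
A = np.eye(2) + dt * Ac                        # simple Euler discretization
B = dt * Bc
K = dlqr(A, B, Q=np.diag([10.0, 1.0]), R=np.array([[0.1]]))
```

The state here is (tip displacement, tip velocity) and the input is the actuator force at the boom end; the resulting feedback adds damping to the lightly damped mode, which is the essence of the vibration-mitigation scheme.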

Study of Educational Insect Robot that Utilizes Mobile Augmented Reality Digilog Book (모바일 증강현실 Digilog Book을 활용한 교육용 곤충로봇 콘텐츠)

  • Park, Young-sook;Park, Dea-woo
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2014.05a / pp.241-244 / 2014
  • In this paper, we apply a mobile augmented reality Digilog Book to insect robot learning. In the electronic era, books once confined to paper have moved into virtual-reality space. Virtual reality is a technology that lets a user indirectly experience, through an immersive interface, situations that spatial and physical constraints prevent from being experienced directly in the real world. The Digilog Book, which fuses analog paper with digital content, applies augmented reality technology to robot learning so that learners can experience various interactions. Moving elements, three-dimensional images, and animation are applied to enrich the learning and, for easier block assembly, to make the assembly order of the blocks easier to grasp. In particular, because the Digilog Book runs on a mobile phone, robot learning is possible anywhere at any time.


A Study on High Speed Laser Welding by using Scanner and Industrial Robot (스캐너와 산업용 로봇을 이용한 고속 레이저 용접에 관한 연구)

  • Kang, Hee-Shin;Suh, Jeong;Kim, Jong-Su;Kim, Jeng-O;Cho, Taik-Dong
    • Proceedings of the KWS Conference / 2009.11a / pp.29-29 / 2009
  • In this research, laser welding technology for manufacturing automobile bodies is studied. Laser welding is one of the key technologies for producing lighter, safer automotive bodies at high productivity; leading automotive manufacturers have replaced spot welding with laser welding in car body assembly. Korean auto manufacturers are developing and applying laser welding technology using a high-output Nd:YAG laser and a 6-axis industrial robot. In addition, a robot-based remote laser welding system was equipped with a long-focal-length laser scanner as the robot end-effector. The laser, robot, and scanner systems together realize the high-speed laser welding system. The robot-based remote laser welding system is flexible and improves welding speed compared with traditional methods such as spot welding and conventional laser welding. The systems used in this study were Trumpf's 4 kW Nd:YAG laser (HL4006D) and IPG's 1.6 kW fiber laser (YLR-1600), with ABB's IRB6400R (payload: 120 kg) and Hyundai Heavy Industry's HX130-02 (payload: 130 kg) robots. In addition, quality evaluation and monitoring technology for remote laser welding was studied. The welding joints of steel plate and zinc-coated steel plate were butt and lap joints. Weld quality was tested by observing the bead shape on the plate and the cross-section of the welded parts, analyzing the results of mechanical tension tests, and monitoring the plasma intensity and temperature with UV and IR detectors. In past years, Trumpf's 4 kW Nd:YAG laser and ABB's IRB6400R robot system were used; nowadays, a new laser source, robot, and laser scanner system are used to increase processing speed and improve process efficiency. This paper proposes the robot-based remote laser welding system as a means of resolving the limited welding speed and accuracy of conventional laser welding systems.


EEG Analysis Following Change in Hand Grip Force Level for BCI Based Robot Arm Force Control (BCI 기반 로봇 손 제어를 위한 악력 변화에 따른 EEG 분석)

  • Kim, Dong-Eun;Lee, Tae-Ju;Park, Seung-Min;Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.23 no.2 / pp.172-177 / 2013
  • With a Brain-Computer Interface (BCI) system, a person with a disabled limb can use direct brain signals such as electroencephalography (EEG) to control a device like an artificial arm, and precise force control is necessary for such an artificial limb system. To understand the relationship between the control EEG signal and hand grip force, we conducted a study measuring EEG changes at three grades (25%, 50%, 75%) of hand grip Maximal Voluntary Contraction (MVC). The acquired EEG signal was filtered to obtain the power of three wave bands (alpha, beta, gamma) using the fast Fourier transform (FFT), and the power spectrum was computed. The power spectra of the three bands for the three classes (MVC 25%, 50%, 75%) were then classified using Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). The results showed that the power spectrum of the EEG is greater at MVC 75% than at MVC 25%, and the correct classification rate was 52.03% for the left hand and 77.7% for the right hand.
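The band-power feature extraction described above (FFT, then alpha/beta/gamma power) can be sketched as follows; the band edges and single-channel framing are common conventions assumed here, not taken verbatim from the paper:

```python
import numpy as np

def band_powers(eeg, fs):
    """Average power of the alpha (8-13 Hz), beta (13-30 Hz) and gamma
    (30-50 Hz) bands of a single-channel EEG segment, computed from the
    FFT power spectrum. fs is the sampling rate in Hz."""
    n = len(eeg)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2 / n
    bands = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 50)}
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}
```

In the paper's pipeline, these per-band powers (across channels and trials) would form the feature vectors fed to PCA for dimensionality reduction and then to LDA for the three-class MVC-level classification.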