• Title/Summary/Keyword: Robot Interface


Teleoperation System of a Mobile Robot over the Internet (인터넷을 이용한 이동로봇의 원격 운용 시스템)

  • Park, Taehyun;Gang, Geun-Taek;Lee, Wonchang
    • Journal of Institute of Control, Robotics and Systems / v.8 no.3 / pp.270-274 / 2002
  • This paper presents a teleoperation system that combines a computer network and an autonomous mobile robot. We remotely control an autonomous mobile robot equipped with vision over the Internet, guiding it through unknown environments in real time. The main feature of this system is that operators need only a web browser and a computer connected to the network, so they can command the robot at a remote location through its home page. The hardware architecture consists of an autonomous mobile robot, a workstation, and local computers. The software architecture includes a client part for the user interface and robot control, and a server part for communication between users and the robot. The server and client systems are developed in the Java language, which is well suited to Internet applications and supports multiple platforms. Furthermore, this system offers an image compression method based on JPEG, which reduces the large time delay that occurs in the network during image transmission.
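The compress-before-transmit idea behind the JPEG scheme above can be sketched as follows. This is a minimal illustration, with zlib standing in for a JPEG codec (zlib is lossless, unlike JPEG) and the frame dimensions chosen arbitrarily:

```python
import zlib

def compress_frame(frame: bytes, level: int = 6) -> bytes:
    """Compress a raw camera frame before sending it over the network.
    zlib stands in here for the paper's JPEG-based codec."""
    return zlib.compress(frame, level)

def decompress_frame(payload: bytes) -> bytes:
    """Recover the frame on the operator's side."""
    return zlib.decompress(payload)

# A synthetic 320x240 grayscale frame with large uniform regions --
# the kind of redundancy that image compression exploits.
frame = bytes([128]) * (320 * 240)
payload = compress_frame(frame)
```

Sending `payload` instead of `frame` is what cuts the transmission delay; the operator's client decompresses before display.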

Stairs Walking of a Biped Robot (2족 보행 로봇의 계단 보행)

  • 성영휘;안희욱
    • Journal of the Institute of Convergence Signal Processing / v.5 no.1 / pp.46-52 / 2004
  • In this paper, we introduce a case study of developing a miniature humanoid robot with 16 degrees of freedom, a height of 42 cm, and a weight of 1.5 kg. For easy implementation, integrated RC-servo motors are adopted as actuators, and a digital camera mounted on its head transmits vision data to a remote host computer via a wireless modem. The robot can perform staircase walking as well as straight walking and turning in any direction. The user-interface program running on the host computer contains a robot graphic simulator and a motion editor, which are used to generate and verify the robot's walking motions. The experimental results show that the robot has various walking capabilities, including straight walking, turning, and stair walking.
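A motion editor like the one described typically stores walking motions as joint-angle keyframes and interpolates servo targets between them. A minimal sketch, where the joint layout, angles, and tick count are hypothetical rather than taken from the paper:

```python
def interpolate_pose(key_a, key_b, t):
    """Linearly interpolate between two joint-angle keyframes.
    t runs from 0.0 (pose key_a) to 1.0 (pose key_b)."""
    return [a + t * (b - a) for a, b in zip(key_a, key_b)]

# Two hypothetical keyframes for a few of the robot's 16 joints (degrees).
stand = [0.0, 10.0, -20.0, 10.0]
step_up = [5.0, 35.0, -60.0, 25.0]

# Generate intermediate servo targets for a 5-tick transition.
trajectory = [interpolate_pose(stand, step_up, i / 4) for i in range(5)]
```

Chaining such transitions between verified keyframes is what lets a graphic simulator preview a stair-climbing gait before it runs on hardware.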


Development of a Real-time OS Based Control System for Laparoscopic Surgery Robot (복강경 수술로봇을 위한 실시간 운영체제 기반 제어 시스템의 개발)

  • Song, Seung-Joon;Park, Jun-Woo;Shin, Jung-Wook;Kim, Yun-Ho;Lee, Duk-Hee;Jo, Yung-Ho;Choi, Jae-Seoon;Sun, Kyung
    • Journal of Biomedical Engineering Research / v.29 no.1 / pp.32-39 / 2008
  • This paper reports on a real-time OS based master-slave robot control system for a laparoscopic surgery robot, which enables telesurgery and overcomes shortcomings of conventional laparoscopic surgery. A surgical robot system requires a control system that can process large volumes of information, such as medical image data and the video signal from the endoscope, in real time, as well as precisely control the robot with high reliability. To meet these complex requirements, the use of a high-level real-time OS (Operating System) in the surgery robot controller is a must, as is common in many modern robot controllers that adopt a real-time OS as the base system software on which specific functional modules are implemented for a more reliable and stable system. The control system consists of joint controllers, host controllers, and user interface units. The robot features a compact slave robot with 5 DOF (Degrees Of Freedom), expanding the workspace of each tool and increasing the number of tools operating simultaneously. Each master, slave, and GUI (Graphical User Interface) host runs a dedicated RTOS (Real-time OS), RTLinux-Pro (FSMLabs Inc., U.S.A.), on which functional modules such as motion control, communication, and video signal integration are implemented, and all the hosts are connected by a gigabit Ethernet network for inter-host communication. Each master/slave controller set has a dedicated CAN (Controller Area Network) channel for control and monitoring signal communication with the joint controllers. A total of four pairs of master/slave manipulators are currently controlled by one host controller. The system showed satisfactory performance in both position control precision and master-slave motion synchronization in bench tests and animal experiments, and is now under further development for better safety and control fidelity toward a clinically applicable prototype.
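The master-slave position synchronization reported above can be illustrated with a simple motion-scaled tracking loop. The scaling factor and control gain below are illustrative assumptions, not values from the paper:

```python
def slave_setpoint(master_pos, scale=0.5):
    """Map master joint positions to slave targets; scale < 1 gives the
    surgeon finer motion on the slave side (motion scaling)."""
    return [scale * p for p in master_pos]

def control_step(slave_pos, target, gain=0.3):
    """One control tick: move the slave a fraction of the remaining error."""
    return [s + gain * (t - s) for s, t in zip(slave_pos, target)]

master = [10.0, -4.0, 2.5, 0.0, 7.0]   # hypothetical 5-DOF master reading (degrees)
target = slave_setpoint(master)
slave = [0.0] * 5
for _ in range(50):                     # iterate until the slave converges
    slave = control_step(slave, target)
```

In the real system this loop runs per control period on the RTOS host, with the joint-level commands carried over the dedicated CAN channel.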

Design of a Compact Laparoscopic Assistant Robot;KaLAR

  • Lee, Yun-Ju;Kim, Jona-Than;Ko, Seong-Young;Lee, Woo-Jung;Kwon, Dong-Soo
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2003.10a / pp.2648-2653 / 2003
  • This paper describes the development of a 3-DOF laparoscopic assistant robot system with motor-controlled bending and zooming mechanisms using voice-command motion control and auto-tracking control. The system is designed with two major criteria: safety and adaptability. To satisfy the safety criterion, we designed the robot with an optimized range of motion. For adaptability, the robot is designed to be compact to minimize interference with the staff in the operating room. The required external motions were replaced by a bending mechanism within the abdomen using a flexible laparoscope. Zooming is achieved through in-and-out motion at the port where the laparoscope is inserted. The robot is attachable to the bedside using a conventional laparoscope holder with multiple-DOF joints and is compact enough for hand-carry. The voice-controlled command input and auto-tracking control are expected to enhance the overall performance of the system while reducing the control load imposed on the surgeon during laparoscopic surgery. The proposed system is expected to have sufficient safety features and an easy-to-use interface to enhance the overall performance of current laparoscopy.
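A voice-command interface of this kind reduces to a small dispatch table from recognized words to motion primitives, with each motion clamped to a safety-limited range. The command vocabulary, step sizes, and limits below are illustrative assumptions, not KaLAR's actual values:

```python
class KaLARSketch:
    """Hypothetical state sketch of the bending/zooming mechanisms."""
    def __init__(self):
        self.bend_deg = 0.0   # bending angle inside the abdomen
        self.zoom_mm = 0.0    # in/out travel at the insertion port

    def bend(self, delta):
        # Clamp to an assumed safety-limited range of motion.
        self.bend_deg = max(-60.0, min(60.0, self.bend_deg + delta))

    def zoom(self, delta):
        self.zoom_mm = max(0.0, min(50.0, self.zoom_mm + delta))

COMMANDS = {
    "left":  lambda r: r.bend(-10.0),
    "right": lambda r: r.bend(+10.0),
    "in":    lambda r: r.zoom(+5.0),
    "out":   lambda r: r.zoom(-5.0),
}

def on_voice_command(robot, word):
    """Route a recognized word to its motion primitive; ignore unknown words."""
    action = COMMANDS.get(word)
    if action:
        action(robot)

robot = KaLARSketch()
for word in ["right", "right", "in", "unknown"]:
    on_voice_command(robot, word)
```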


A Development of Intelligent Service Robot System for Store Management in Unmanned Environment (무인화 환경 기반의 상점 자동 관리를 위한 지능형 서비스 로봇 시스템)

  • Ahn, Ho-Seok;Sa, In-Kyu;Baek, Young-Min;Lee, Dong-Wook
    • Journal of Institute of Control, Robotics and Systems / v.17 no.6 / pp.539-545 / 2011
  • This paper describes an intelligent service robot system for managing a store in an unmanned environment. The robot can be a good replacement for humans because it can work all day and retain large amounts of information. We design a system architecture for configuring the many intelligent functions of the intelligent service robot system, which consists of four layers: a User Interaction Layer, a Behavior Scheduling Layer, an Intelligent Module Layer, and a Hardware Layer. We develop an intelligent service robot, 'Part Timer', based on the designed system architecture. The 'Part Timer' has many intelligent function modules, such as a face detection/recognition/tracking module, a speech recognition module, a navigation module, a manipulator module, and an appliance control module. The 'Part Timer' can also answer the phone, a function that gives users a convenient interface.
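The layered architecture above can be sketched as a behavior scheduler routing events toward the function modules; the event names and module methods below are simplified assumptions, not the paper's actual interfaces:

```python
class IntelligentModuleLayer:
    """Holds the functional modules (speech, navigation, phone, ...)."""
    def __init__(self):
        self.log = []  # record of module invocations, for inspection

    def speech_recognition(self, event):
        self.log.append(("speech", event))

    def navigation(self, event):
        self.log.append(("navigate", event))

    def answer_phone(self, event):
        self.log.append(("phone", event))

class BehaviorSchedulingLayer:
    """Maps events from the User Interaction Layer to module calls."""
    def __init__(self, modules):
        self.routes = {
            "voice":      modules.speech_recognition,
            "goto":       modules.navigation,
            "phone_ring": modules.answer_phone,
        }

    def dispatch(self, event_type, event):
        handler = self.routes.get(event_type)
        if handler:
            handler(event)

modules = IntelligentModuleLayer()
scheduler = BehaviorSchedulingLayer(modules)
scheduler.dispatch("phone_ring", "incoming call")
scheduler.dispatch("voice", "where is aisle 3?")
```

Keeping routing in a middle layer is what lets modules like phone answering be added without touching the user-interaction or hardware layers.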

Development of a Hexapod Robot for Multi-terrain Reconnaissance (다양한 험지 정찰을 위한 6족 보행 로봇 개발)

  • Lim, Seoung-Yong;Kim, Jong-Hyeong;Kim, Hyeong-Gik
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.24 no.6 / pp.667-674 / 2015
  • This paper describes the development of a prototype hexapod robot with six circular legs designed to overcome a variety of challenging terrains. The legs are critical to stability during walking and were analyzed with CAE tools to determine the optimal design parameters. The control system consists of three types of sensors, microprocessors, and communication modules for the PC interface. The entire operation of the robot can be controlled and monitored using a PC. Experimental operation on three different road types verified the feasibility of the prototype robot for carrying out reconnaissance on multiple terrains. In the near future, the prototype robot could be used for the military purpose of detecting potential risks and reporting them in advance.

Development of Advanced Robot System for Bridge Inspection and Monitoring (교량유지관리 자동화를 위한 첨단 로봇 시스템 개발)

  • Lee, Jong-Seh;Hwang, In-Ho;Kim, Dong-Woo;Lee, Hu-Seok
    • Proceedings of the Computational Structural Engineering Institute Conference / 2007.04a / pp.90-95 / 2007
  • Conventional bridge inspection involves the physical positioning of an inspector by the hydraulic telescoping boom of a "snooper truck", thereby providing visual access to bridge components. The process is time consuming, hazardous, and may be affected by lighting conditions. Therefore, it is of great interest to develop an automated and/or teleoperated inspection robot to replace the manual inspection procedure. This paper describes the advanced bridge inspection robot system under development and other related activities currently under way at the Bridge Inspection Robot Development Interface (BIRDI). BIRDI is a research consortium with its home in the Department of Civil and Environmental System Engineering at Hanyang University in Ansan. Its primary goal is to develop advanced robot systems for bridge inspection and monitoring for immediate field application and commercialization. The research program includes areas such as advanced inspection robot and motion control systems, sensing technologies for monitoring and assessment, and integrated systems for bridge maintenance. The center comprises 12 institutions: 7 universities, 2 research institutes, and 3 private enterprises. Research projects are cross-disciplinary and include experts from structural engineering, mechanical engineering, and electronic and control engineering. This research project will contribute to the advancement of infrastructure maintenance technology, the enhancement of construction industry competitiveness, and the promotion of national capacity for technology innovation.


Health Monitoring and Efficient Data Management Method for the Robot Software Components (로봇 소프트웨어 컴포넌트의 실행 모니터링/효율적인 데이터 관리방안)

  • Kim, Jong-Young;Yoon, Hee-Byung
    • Journal of Institute of Control, Robotics and Systems / v.17 no.11 / pp.1074-1081 / 2011
  • As robotic systems become more complex, there is a need to promote component-based robot development, where systems can be constructed as the composition and integration of reusable building blocks. One of the most important challenges facing component-based robot development is safeguarding against software component failures and malfunctions. Health monitoring of the robot software is fundamental, not only for managing the system at runtime but also for analyzing software component information in the design phase of a robot application. Moreover, because many monitoring events occur during the execution of robot software components, simple data handling and an efficient memory management method are required. In this paper, we propose an efficient event monitoring and data management method by modeling robot software components and monitoring factors based on a robot software framework. The monitoring factors, such as component runtime exceptions, input/output data, execution time, and checkpoint-rollback, are deduced, and the detailed monitoring events are defined. Furthermore, we define an event record and a monitor record pool suitable for robot software components and propose an efficient data management method. To verify the effectiveness and usefulness of the proposed approach, a monitoring module and user interface have been implemented using the OPRoS robot software framework. The proposed monitoring module can be used as a tool to analyze software components in the robot design phase, and can be plugged into a self-healing system to monitor system health status at runtime.
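The event record and monitor record pool could look like the following. A fixed-capacity ring buffer is one plausible realization of the memory-management goal; the field names and capacity are assumptions, not the paper's definitions:

```python
from collections import deque
from dataclasses import dataclass
from typing import Any

@dataclass
class EventRecord:
    component: str    # software component instance name
    event_type: str   # e.g. "exception", "io_data", "exec_time", "checkpoint"
    payload: Any      # event-specific data
    timestamp: float

class MonitorRecordPool:
    """Bounded pool: oldest records are evicted, so long-running
    monitoring cannot exhaust memory."""
    def __init__(self, capacity: int = 1000):
        self._records = deque(maxlen=capacity)

    def record(self, event: EventRecord):
        self._records.append(event)

    def by_component(self, name: str):
        return [e for e in self._records if e.component == name]

# With capacity 3, only the newest 3 of 5 events survive.
pool = MonitorRecordPool(capacity=3)
for i in range(5):
    pool.record(EventRecord("motor_driver", "exec_time", i, float(i)))
```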

Design of the emotion expression in multimodal conversation interaction of companion robot (컴패니언 로봇의 멀티 모달 대화 인터랙션에서의 감정 표현 디자인 연구)

  • Lee, Seul Bi;Yoo, Seung Hun
    • Design Convergence Study / v.16 no.6 / pp.137-152 / 2017
  • This research aims to develop a companion robot experience design for the elderly in Korea, based on a needs-function deploy matrix and research on robot emotion expression in multimodal interaction. First, elderly users' main needs were categorized into four groups based on ethnographic research. Second, the functional elements and physical actuators of the robot were mapped to user needs in a function-needs deploy matrix. The final UX design prototype was implemented as a robot with a verbal, non-touch, multimodal interface and emotional facial expressions based on Ekman's Facial Action Coding System (FACS). The proposed robot prototype was validated through a user test session analyzing the influence of robot interaction on users' cognition and emotion, using the Story Recall Test and face-emotion analysis software (Emotion API), in conditions where the robot's facial expression corresponded to the emotion of the information it delivered and where the robot initiated the interaction cycle voluntarily. The group with the emotional robot showed a relatively high recall rate in the delayed recall test, and in the facial expression analysis, the robot's facial expression and interaction initiation affected the emotion and preference of the elderly participants.
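A FACS-based expression module like the one above boils down to a mapping from emotions to sets of facial action units (AUs) that the face display activates. The AU combinations below follow commonly cited FACS prototypes, and the lookup structure is a simplified assumption about the robot's implementation:

```python
# Prototype emotion -> FACS action-unit sets (commonly cited combinations).
EMOTION_AUS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
}

def express(emotion):
    """Return the set of action units the face display should activate.
    Unknown emotions map to a neutral (empty) face."""
    return EMOTION_AUS.get(emotion, set())
```

Driving each AU with its own actuator channel is what lets the face match the emotion of the information being delivered.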

Speech Emotion Recognition on a Simulated Intelligent Robot (모의 지능로봇에서의 음성 감정인식)

  • Jang Kwang-Dong;Kim Nam;Kwon Oh-Wook
    • MALSORI / no.56 / pp.173-183 / 2005
  • We propose a speech emotion recognition method for an affective human-robot interface. In the proposed method, emotion is classified into six classes: angry, bored, happy, neutral, sad, and surprised. Features for an input utterance are extracted from statistics of phonetic and prosodic information. Phonetic information includes log energy, shimmer, formant frequencies, and Teager energy; prosodic information includes pitch, jitter, duration, and rate of speech. Finally, a pattern classifier based on Gaussian-kernel support vector machines decides the emotion class of the utterance. We recorded speech commands and dialogs uttered 2 m away from microphones in five different directions. Experimental results show that the proposed method yields 48% classification accuracy, while human classifiers achieve 71% accuracy.
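The pipeline above, per-utterance statistics of prosodic and phonetic tracks feeding a classifier, can be sketched as follows. A nearest-centroid rule stands in for the paper's Gaussian-kernel SVM, and the feature subset and toy values are illustrative, not from the dataset:

```python
import statistics

def utterance_features(pitch, energy):
    """Per-utterance statistics over prosodic (pitch) and phonetic (energy)
    tracks; the paper uses a richer feature set (jitter, shimmer, formants...)."""
    return [statistics.mean(pitch), statistics.pstdev(pitch),
            statistics.mean(energy), statistics.pstdev(energy)]

def classify(features, centroids):
    """Nearest-centroid stand-in for the Gaussian-kernel SVM classifier."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(features, centroids[label]))

# Toy class centroids in the 4-D feature space (illustrative only).
centroids = {
    "angry":   [220.0, 40.0, 0.8, 0.20],
    "neutral": [140.0, 15.0, 0.4, 0.10],
}
feat = utterance_features(pitch=[210, 230, 225], energy=[0.75, 0.85, 0.8])
label = classify(feat, centroids)
```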
