• Title/Summary/Keyword: Robot Interface


A Portable Mediate Interface 'Handybot' for the Rich Human-Robot Interaction (인간과 로봇의 다양한 상호작용을 위한 휴대 매개인터페이스 '핸디봇')

  • Hwang, Jung-Hoon;Kwon, Dong-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.8
    • /
    • pp.735-742
    • /
    • 2007
  • The importance of the interaction capability of a robot increases as the application of robots extends into humans' daily lives. In this paper, a portable mediate interface, Handybot, is developed with various interaction channels to be used with an intelligent home service robot. The Handybot has a task-oriented channel of an icon language as well as a verbal interface. It also has an emotional interaction channel that recognizes a user's emotional state from facial expression and speech, transmits that state to the robot, and expresses the robot's emotional state to the user. It is expected that the Handybot will reduce spatial problems that may exist in human-robot interactions, propose a new interaction method, and help create rich and continuous interactions between human users and robots.
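
The task-oriented icon-language channel can be pictured as mapping a sequence of icons to a structured task request. Below is a minimal sketch of such a channel; the icon names, roles, and task fields are hypothetical, not taken from the paper.

```python
# A minimal sketch of a task-oriented icon-language channel. The icon
# vocabulary and task structure below are made up for illustration.

# Each icon carries a semantic role; a sequence of icons forms a task request.
ICON_ROLES = {
    "cup": ("object", "cup"),
    "kitchen": ("place", "kitchen"),
    "fetch": ("action", "fetch"),
    "go": ("action", "go"),
}

def parse_icon_sequence(icons):
    """Combine the roles of the tapped icons into a single task dictionary."""
    task = {}
    for icon in icons:
        if icon not in ICON_ROLES:
            raise ValueError(f"unknown icon: {icon}")
        role, value = ICON_ROLES[icon]
        task[role] = value
    return task

print(parse_icon_sequence(["fetch", "cup", "kitchen"]))
# → {'action': 'fetch', 'object': 'cup', 'place': 'kitchen'}
```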

Robot vision interface (로보트와 Vision System Interface)

  • 김선일;여인택;박찬웅
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 1987.10b
    • /
    • pp.101-104
    • /
    • 1987
  • This paper describes a robot-vision system consisting of a robot, a vision system, a single-board computer, and an IBM-PC. The IBM-PC based system has great flexibility for expanding the vision-system interface; easy human interfacing and strong computing ability are further benefits of this system. Interfacing between the components was carried out, and the calibration between the two coordinate systems is studied. The robot language for the robot-vision system was written in C, and users can also write job programs in C, with the robot- and vision-related functions residing in a library.

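
The calibration between the robot and vision coordinate systems mentioned above is typically posed as estimating a rigid transform from matched point pairs. The sketch below uses the standard SVD-based (Kabsch) least-squares solution in Python rather than the paper's C implementation; the point data is made up.

```python
# A sketch of calibrating between vision and robot coordinate frames:
# estimate a rigid transform (R, t) from matched point pairs with the
# SVD-based (Kabsch) method. The example points below are made up.
import numpy as np

def calibrate(vision_pts, robot_pts):
    """Least-squares rigid transform mapping vision points onto robot points."""
    P = np.asarray(vision_pts, float)
    Q = np.asarray(robot_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    D = np.diag([1.0] * (len(cp) - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Example: the robot frame is the vision frame rotated 90° about z and shifted.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
vision = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 1.0]])
robot = vision @ R_true.T + np.array([5.0, 0.0, 0.0])
R, t = calibrate(vision, robot)
print(np.allclose(R, R_true), np.allclose(t, [5, 0, 0]))  # → True True
```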

Implementation of a 3D Interface System for controlling Mobile Robot (모바일 로봇 제어를 위한 3D 인터페이스 시스템의 구현)

  • Kang, Chang-Hun;Lee, Jong-Jin;Ahn, Hyun-Sik
    • Proceedings of the KIEE Conference
    • /
    • 2001.11c
    • /
    • pp.107-110
    • /
    • 2001
  • Recently, there has been much interest in robot agent systems that work autonomously, following research trends in bio-mimetic systems and intelligent robots. In this paper, a virtual 3D interface system based on the Internet is proposed for the remote control and monitoring of a mobile robot. The proposed system is constructed as a manager-agent model. A worker can order the robot agent to move to a new position by clicking the destination in the virtual space of the manager's 3D graphic interface. The robot agent then moves to that position automatically, avoiding collisions by using range finding and an autonomous control algorithm. The proposed robot agent system allows a remotely located mobile robot to be controlled more conveniently.

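
Clicking a destination in the virtual 3D space amounts to casting a ray from the virtual camera through the clicked pixel and intersecting it with the floor plane. A minimal sketch, assuming a pinhole camera model with made-up intrinsics and pose (the paper does not give these details):

```python
# A minimal sketch of turning a click in a virtual 3D view into a floor
# destination: cast a ray from the camera through the clicked pixel and
# intersect it with the ground plane z = 0. All camera parameters below
# are assumed for illustration.
import numpy as np

def click_to_floor(u, v, cam_pos, fx, fy, cx, cy, R):
    """Intersect the pixel ray (u, v) with the plane z = 0 in the world frame."""
    # Ray direction in the camera frame (pinhole model), rotated to world.
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    d = R @ d_cam
    if abs(d[2]) < 1e-9:
        return None                      # ray parallel to the floor
    s = -cam_pos[2] / d[2]               # cam_pos + s*d hits z = 0
    if s <= 0:
        return None                      # floor is behind the camera
    return cam_pos + s * d

# Camera 2 m above the floor looking straight down (camera z maps to world -z).
R = np.array([[1, 0, 0], [0, -1, 0], [0, 0, -1.0]])
goal = click_to_floor(400, 300, np.array([0, 0, 2.0]), 500, 500, 320, 240, R)
print(goal)
```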

A Graphical User Interface Design for Surveillance and Security Robot (감시경계 로봇의 그래픽 사용자 인터페이스 설계)

  • Choi, Duck-Kyu;Lee, Chun-Woo;Lee, Choonjoo
    • The Journal of Korea Robotics Society
    • /
    • v.10 no.1
    • /
    • pp.24-32
    • /
    • 2015
  • This paper introduces a graphical user interface design intended for the surveillance and security robot, which is the pilot program for the army's unmanned light combat vehicle. It is essential to consider the activities of robot users under a changing security environment in order to design an efficient graphical user interface between user and robot for accomplishing the designated mission. The proposed design approach first identifies the user activities needed to accomplish the mission in standardized scenarios of military surveillance and security operations, and then develops a hierarchy of the interface elements required to execute the tasks in those scenarios. The developed graphical user interface includes an input control component, a navigation component, an information display component, and an accordion, and it was verified by potential users of various skill levels with military backgrounds. The assessment indicated that the newly developed user interface includes all the critical elements needed to execute the mission and is simpler and more intuitive than the legacy interface design, which focused on technical and functional information and was more informative to system development engineers than to field users.

A User Interface for Vision Sensor based Indirect Teaching of a Robotic Manipulator (시각 센서 기반의 다 관절 매니퓰레이터 간접교시를 위한 유저 인터페이스 설계)

  • Kim, Tae-Woo;Lee, Hoo-Man;Kim, Joong-Bae
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.10
    • /
    • pp.921-927
    • /
    • 2013
  • This paper presents a user interface for vision-based indirect teaching of a robotic manipulator with Kinect and IMU (Inertial Measurement Unit) sensors. The user interface system is designed to control the manipulator more easily in joint space, Cartesian space, and the tool frame. We use the user's skeleton data from the Kinect and wrist-mounted IMU sensors to calculate the user's joint angles and wrist movement for robot control. The interface system proposed in this paper allows the user to teach the manipulator without a pre-programming process. This will shorten the teaching time of the robot and eventually enable increased productivity. Simulation and experimental results are presented to verify the performance of the robot control and interface system.
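
Computing a user's joint angle from skeleton data reduces to the angle between the two limb segments meeting at the joint. A minimal sketch, with made-up shoulder/elbow/wrist coordinates standing in for Kinect output:

```python
# A sketch of recovering a joint angle from skeleton data: the angle at a
# joint is the angle between the two adjacent limb segments. The 3-D joint
# positions below are made up, not actual Kinect readings.
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by segments b->a and b->c."""
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))   # clip guards rounding error

shoulder, elbow, wrist = [0, 0, 0], [0.3, 0, 0], [0.3, 0.25, 0]
print(np.degrees(joint_angle(shoulder, elbow, wrist)))  # → 90.0
```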

SITAT: Simulation-based Interface Testing Automation Tool for Robot Software Component (로봇 소프트웨어 컴포넌트를 위한 시뮬레이션 기반 인터페이스 테스팅 자동화 도구)

  • Kang, Jeong-Seok;Choi, Hyeong-Seob;Maeng, Sang-Woo;Kim, Si-Wan;Park, Hong-Seong
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.16 no.6
    • /
    • pp.608-616
    • /
    • 2010
  • Robot software components can be categorized into two types: pure S/W components and H/W-related ones. Since interface testing of robot software components is labour-intensive and complicated work, an effective automated testing tool is necessary. In particular, it is difficult to test all types of H/W-related components because preparing all the H/W modules they depend on is hard work. This paper proposes a new simulation-based interface testing automation tool (SITAT) which automatically generates test cases for interface testing of a robot software component and executes the interface test with the generated test cases, where a simulator imitates the activity of a H/W module in place of the real one. This paper verifies the effectiveness of the suggested SITAT by testing a real H/W-related robot software component.
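
Automatic test-case generation for a component interface can be sketched with simple boundary-value analysis over each parameter's declared range; the interface description format and ranges below are hypothetical, not SITAT's actual input.

```python
# A sketch of automatic test-case generation for a component interface,
# using boundary-value analysis over declared parameter ranges. The
# interface description and its ranges are made up for illustration.
import itertools

def boundary_values(lo, hi):
    """Classic five-point boundary set for an integer range."""
    return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]

def generate_cases(interface):
    """Cross the boundary values of every parameter into test cases."""
    names = list(interface)
    value_sets = [boundary_values(*interface[n]) for n in names]
    return [dict(zip(names, combo)) for combo in itertools.product(*value_sets)]

motor_iface = {"speed": (0, 100), "angle": (-90, 90)}
cases = generate_cases(motor_iface)
print(len(cases))   # → 25 (5 boundary values per parameter)
print(cases[0])     # → {'speed': 0, 'angle': -90}
```

Each generated case would then be fed to the component (or, for H/W-related components, to a simulator standing in for the hardware) and the responses checked against the interface specification.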

Development of a Personal Riding Robot Controlled by a Smartphone Based on Android OS (안드로이드 스마트폰 제어기반의 개인용 탑승로봇 구현)

  • Kim, Yeongyun;Kim, Dong Hun
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.7
    • /
    • pp.592-598
    • /
    • 2013
  • In this paper, a small, lightweight riding robot controlled by a smartphone is developed. A smartphone interface with a jog-shuttle mode is proposed for user convenience, making a small, lightweight riding robot possible. In addition, a compass sensor is used to compensate for the mechanical characteristics of the motors mounted on the riding robot. The riding robot is controlled through a drag-based jog-shuttle interface on the smartphone instead of a mechanical controller. For a personal riding robot, using the smartphone as the controller instead of a handle or a pole greatly reduces its size, weight, and cost. Thus, the riding robot can be used in indoor spaces such as offices for transportation, train or bus stations and airports for scouting, or hospitals for people with disabilities. Experimental results show that the riding robot is easily and conveniently controlled by the proposed Android-based smartphone interface.
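
A drag-based jog-shuttle mapping can be sketched as converting the drag vector into speed and turn-rate commands, with the compass reading correcting heading drift. The gains and sign conventions below are assumptions, not values from the paper.

```python
# A sketch of a drag-based jog-shuttle mapping: the drag length sets the
# speed, the drag direction sets the turn rate, and a compass reading
# corrects heading drift. Gains and sign conventions are assumed.
import math

def drag_to_wheel_speeds(dx, dy, heading, desired_heading,
                         k_speed=0.5, k_turn=1.0, k_compass=0.8):
    """Map a screen drag (dx right, dy up) to (left, right) wheel speeds."""
    speed = k_speed * math.hypot(dx, dy)       # longer drag -> faster
    turn = k_turn * math.atan2(dx, dy)         # drag direction -> turn rate
    # Compass compensation: steer back toward the commanded heading,
    # with the error wrapped into [-pi, pi).
    err = (desired_heading - heading + math.pi) % (2 * math.pi) - math.pi
    turn -= k_compass * err
    return speed + turn, speed - turn          # differential drive

left, right = drag_to_wheel_speeds(0.0, 1.0, heading=0.0, desired_heading=0.0)
print(left, right)  # → 0.5 0.5 (straight-up drag, no heading error)
```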

Real-Time Obstacle Avoidance of Autonomous Mobile Robot and Implementation of User Interface for Android Platform (자율주행 이동로봇의 실시간 장애물 회피 및 안드로이드 인터페이스 구현)

  • Kim, Jun-Young;Lee, Won-Chang
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.9 no.4
    • /
    • pp.237-243
    • /
    • 2014
  • In this paper we present a real-time obstacle avoidance technique for an autonomous mobile robot with a steering system, and an implementation of a user interface for mobile devices on the Android platform. The direction of the autonomous robot is determined by the virtual force field concept, based on distance information acquired from five ultrasonic sensors; each distance is converted into a virtual repulsive force around the robot that is inversely proportional to the distance. The steering system, with a PD (proportional and derivative) controller, moves the mobile robot toward the determined target direction. We also use PSD (position sensitive detector) sensors to supplement the ultrasonic sensors around the dead-angle area. The mobile robot communicates with the Android mobile device and a PC via Ethernet. Video from a CMOS camera mounted on the mobile robot is transmitted to the Android device and PC, and the user can control the mobile robot manually by transmitting commands from the user interface via Ethernet.
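
The virtual force field computation described above can be sketched as summing an attractive pull toward the target with repulsive forces inversely proportional to each ultrasonic distance; the sensor angles and gains below are assumptions, not the paper's values.

```python
# A sketch of the virtual force field idea: each of five ultrasonic
# readings contributes a repulsive force inversely proportional to its
# distance, summed with an attractive pull toward the target. The sensor
# angles and gains are assumed for illustration.
import math

SENSOR_ANGLES = [math.radians(a) for a in (-60, -30, 0, 30, 60)]  # robot frame

def steer_direction(distances, target_angle, k_rep=0.3, k_att=1.0):
    """Steering angle (radians) from the attractive + repulsive force sum."""
    fx = k_att * math.cos(target_angle)       # attraction toward the target
    fy = k_att * math.sin(target_angle)
    for ang, d in zip(SENSOR_ANGLES, distances):
        f = k_rep / max(d, 0.05)              # repulsion ~ 1/distance
        fx -= f * math.cos(ang)               # push away from the obstacle
        fy -= f * math.sin(ang)
    return math.atan2(fy, fx)

# Obstacle close on the +60° side: the resultant direction bends the other way.
print(math.degrees(steer_direction([2.0, 2.0, 2.0, 2.0, 0.4], 0.0)) < 0)  # → True
```

In the full system, the returned angle would be tracked by the PD steering controller rather than applied directly.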

Robust Control of a Haptic Interface Using LQG/LTR (LQG/LTR을 이용한 Haptic Interface의 강인제어)

  • Lee, Sang-Cheol;Park, Heon;Lee, Su-Sung;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.8 no.9
    • /
    • pp.757-763
    • /
    • 2002
  • A newly designed haptic interface enables an operator to control a remote robot precisely. It transmits position information to the remote robot and feeds back the interaction force from it. A control algorithm for the haptic interface has been studied to improve robustness and stability against uncertain dynamic environments, with a proposed contact dynamic model that incorporates human hand dynamics. A simplified hybrid parallel robot dynamic model for a 6-DOF haptic device, which does not include nonlinear components, was proposed to form a real-time control system. The LQG/LTR scheme was adopted in this paper to compensate for un-modeled dynamics. The recovery of the force from the remote robot at the haptic interface was demonstrated through experiments.
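
For reference, the standard LQG/LTR compensator structure (not the paper's specific design or gains) combines a Kalman filter with LQR state feedback:

```latex
% Standard LQG compensator: Kalman filter state estimate + LQR feedback
\dot{\hat{x}} = A\hat{x} + Bu + K_f\,(y - C\hat{x}), \qquad u = -K_c\,\hat{x}
% K_c and K_f are obtained from the control and filter Riccati equations.
% Loop transfer recovery inflates the process-noise model,
%   W_q = W_0 + q^2\,BB^{\mathsf{T}},
% so that as q \to \infty the loop transfer function at the plant input
% approaches the full-state LQR loop K_c\,(sI - A)^{-1}B,
% recovering the LQR gain and phase margins lost to the observer.
```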