• Title/Summary/Keyword: robot's visual system

Development and Test of the Remote Operator Visual Support System Based on Virtual Environment (가상환경기반 원격작업자 시각지원시스템 개발 및 시험)

  • Song, T.G.;Park, B.S.;Choi, K.H.;Lee, S.H.
    • Korean Journal of Computational Design and Engineering / v.13 no.6 / pp.429-439 / 2008
  • With a remote-operated manipulator system, the situation at a remote site is rendered to the operator as a remote visualized image. The operator can then quickly assess the situation and control the slave manipulator by operating a master input device based on the information in the virtual image. In this study, the remote operator visual support system (ROVSS) was developed to support the operator's view so that remote tasks can be performed effectively. A visual support model based on a virtual environment was also designed and used to meet this goal. The framework for the system was built with the Windows API on a PC and the library of a 3D graphic simulation tool, ENVISION. To realize the system, an operation test environment for a limited operating site was constructed using an experimental robot operation. A 3D virtual environment was designed to provide accurate information about the rotation of the robot manipulator and the location and distance of the operation tool through real-time synchronization. To evaluate the effectiveness of the visual support, experiments were conducted with four methods: direct view, camera view, virtual view, and camera view plus virtual view. The experimental results show that camera view plus virtual view is about 30% more efficient than camera view alone.
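
As an editorial illustration of the real-time synchronization described above, here is a minimal Python sketch that periodically copies the real manipulator's joint state into a virtual model; the telemetry function and the VirtualManipulator class are hypothetical placeholders, not interfaces from the paper or from ENVISION.

```python
import time

class VirtualManipulator:
    """Illustrative stand-in for the virtual robot model in the 3D scene."""
    def __init__(self, n_joints: int):
        self.joint_angles = [0.0] * n_joints

    def set_joint_angles(self, angles):
        # In a real system this call would drive the 3D graphics model.
        self.joint_angles = list(angles)

def read_joint_angles_from_robot():
    """Hypothetical telemetry read; replace with the actual robot interface."""
    return [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

def synchronize(virtual_robot, period_s=0.02, duration_s=1.0):
    """Periodically copy the real robot's joint state into the virtual scene."""
    t_end = time.time() + duration_s
    while time.time() < t_end:
        virtual_robot.set_joint_angles(read_joint_angles_from_robot())
        time.sleep(period_s)

if __name__ == "__main__":
    synchronize(VirtualManipulator(n_joints=6))
```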

Dynamic visual servo control of robotic manipulators using neural networks (신경 회로망을 이용한 로보트의 동력학적 시각 서보 제어)

  • 박재석;오세영
    • 제어로봇시스템학회:학술대회논문집 / 1991.10a / pp.1012-1016 / 1991
  • An effective visual servo control system for robotic manipulators based on neural networks is proposed. In this control system, one neural network first learns the mapping between the robot's joint space and the video image space. In the proposed control scheme, however, this network is not used directly; instead, its first and second derivatives are used to generate servo commands for the robot. Secondly, an adaptive Adaline network is used to identify the dynamics of the robot and to generate the proper torque commands. Computer simulations have been performed, indicating superior performance. To the authors' knowledge, this is the first attempt to use neural networks for visual servo control of robots that compensates for their changing dynamics.
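
A minimal sketch of the underlying idea of servoing from the derivative of a learned joint-to-image mapping, assuming a generic mapping f(q) stands in for the trained network; the finite-difference Jacobian, gain, and control law below are illustrative and not the paper's exact second-derivative formulation.

```python
import numpy as np

def numerical_jacobian(f, q, eps=1e-4):
    """Finite-difference Jacobian of the learned mapping f: joint angles -> image features."""
    s0 = f(q)
    J = np.zeros((len(s0), len(q)))
    for i in range(len(q)):
        dq = np.zeros_like(q)
        dq[i] = eps
        J[:, i] = (f(q + dq) - s0) / eps
    return J

def visual_servo_step(f, q, s_desired, gain=0.5):
    """One resolved-rate step: move the joints so the image features approach the target."""
    s = f(q)
    J = numerical_jacobian(f, q)
    # The pseudo-inverse maps the image-space error back into joint space.
    dq = gain * np.linalg.pinv(J) @ (s_desired - s)
    return q + dq

if __name__ == "__main__":
    # Toy mapping standing in for the trained network (two-link arm endpoint).
    f = lambda q: np.array([np.cos(q[0]) + np.cos(q[0] + q[1]),
                            np.sin(q[0]) + np.sin(q[0] + q[1])])
    q = np.array([0.2, 0.3])
    for _ in range(50):
        q = visual_servo_step(f, q, s_desired=np.array([1.2, 1.0]))
```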

Development of Visual Odometry Estimation for an Underwater Robot Navigation System

  • Wongsuwan, Kandith;Sukvichai, Kanjanapan
    • IEIE Transactions on Smart Processing and Computing / v.4 no.4 / pp.216-223 / 2015
  • The autonomous underwater vehicle (AUV) is being widely researched in order to achieve superior performance when working in hazardous environments. This research focuses on using image processing techniques to estimate the AUV's ego-motion and changes in orientation, based on image frames captured at different times by a single high-definition web camera attached to the bottom of the AUV. The visual odometry application is integrated with other sensors. An inertial measurement unit (IMU) is used to select the correct solution of the homography motion equation. A pressure sensor is used to resolve image scale ambiguity. Uncertainty is estimated to correct drift in the system, using a Jacobian method, singular value decomposition, and backward and forward error propagation.
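
A hedged sketch of the homography-based motion estimation the abstract outlines, using standard OpenCV calls; the feature detector, the IMU-based candidate selection, and the altitude-based scaling shown here are assumptions about how such a pipeline is typically assembled, not the paper's implementation.

```python
import cv2
import numpy as np

def estimate_motion(prev_img, cur_img, K, R_imu, altitude_m):
    """Estimate camera motion between two bottom-looking frames via a planar homography.
    R_imu (rotation from the IMU) disambiguates the decomposition; altitude_m
    (from the pressure sensor) fixes the translation scale. Illustrative only."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(cur_img, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    H, _ = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
    _, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)

    # Pick the candidate whose rotation best agrees with the IMU reading.
    best = min(range(len(rotations)),
               key=lambda i: np.linalg.norm(rotations[i] - R_imu))
    R = rotations[best]
    t = translations[best] * altitude_m  # scale-ambiguous t resolved by altitude
    return R, t
```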

Real-Time Control of a SCARA Robot by Visual Servoing with the Stereo Vision

  • S. H. Han;Lee, M. H.;K. Son;Lee, M. C.;Park, J. W.;Lee, J. M.
    • 제어로봇시스템학회:학술대회논문집 / 1998.10a / pp.238-243 / 1998
  • This paper presents a new approach to visual servoing with stereo vision. In order to control the position and orientation of a robot with respect to an object, a new technique is proposed using binocular stereo vision. The stereo vision enables us to calculate an exact image Jacobian not only around the desired location but also at other locations. The suggested technique can guide a robot manipulator to the desired location without a priori knowledge such as the relative distance to the desired location or a model of the object, even if the initial positioning error is large. This paper describes a model of stereo vision and how to generate feedback commands. The performance of the proposed visual servoing system is illustrated by simulation and experimental results and compared with a conventional method for a SCARA robot.
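
The key point above is that stereo supplies the depth needed for an exact image Jacobian. Below is a minimal sketch of classical image-based visual servoing with stereo-derived depth, using the textbook interaction matrix; it is not necessarily the paper's exact derivation, and the gain and geometry values are placeholders.

```python
import numpy as np

def depth_from_stereo(x_left, x_right, focal_px, baseline_m):
    """Depth of a point from its horizontal disparity in a rectified stereo pair."""
    disparity = x_left - x_right
    return focal_px * baseline_m / disparity

def interaction_matrix(x, y, Z):
    """Standard image Jacobian (interaction matrix) of a point feature at
    normalized image coordinates (x, y) and depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def servo_velocity(features, desired, depths, gain=0.5):
    """Camera velocity screw driving the current point features toward the desired ones."""
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    return -gain * np.linalg.pinv(L) @ error
```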

Map-Building and Position Estimation based on Multi-Sensor Fusion for Mobile Robot Navigation in an Unknown Environment (이동로봇의 자율주행을 위한 다중센서융합기반의 지도작성 및 위치추정)

  • Jin, Tae-Seok;Lee, Min-Jung;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.13 no.5 / pp.434-443 / 2007
  • Presently, the exploration of an unknown environment is an important task for the new generation of mobile service robots, and mobile robots are navigated by a number of methods, using navigation systems such as sonar sensing or visual sensing. To fully utilize the strengths of both the sonar and visual sensing systems, this paper presents a technique for localization of a mobile robot using fused data from multiple ultrasonic sensors and a vision system. The mobile robot is designed to operate in a well-structured environment that can be represented by planes, edges, corners and cylinders in terms of structural features. In the case of ultrasonic sensors, these features carry range information in the form of a circular arc, generally called an RCD (Region of Constant Depth). Localization is the continual estimation of position, deduced from the robot's a priori position estimate. The environment of the robot is modeled as a two-dimensional grid map. We define a vision-based environment recognition and a physically-based sonar sensor model, and employ an extended Kalman filter to estimate the position of the robot. The performance and simplicity of the approach are demonstrated with results from sets of experiments using a mobile robot.
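
A minimal sketch of the extended Kalman filter position estimation referred to above, for a unicycle-model robot and a single range measurement to a known feature; the state layout, noise matrices, and measurement model are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Predict step for a unicycle pose x = [px, py, theta] given speed v and turn rate w."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_range(x, P, z, landmark, R):
    """Update step with a range measurement z to a known landmark (e.g. an RCD feature)."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    r = np.hypot(dx, dy)
    H = np.array([[-dx / r, -dy / r, 0.0]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + (K @ np.array([z - r])).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```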

Development of a 3D Off-Line Graphic Simulator for Industrial Robot (산업용 로봇의 3차원 오프라인 그래픽 시뮬레이터 개발)

  • 이병국
    • Proceedings of the Korean Society of Machine Tool Engineers Conference / 1999.10a / pp.565-570 / 1999
  • In this paper, we developed a Windows 95 off-line programming (OLP) system which can simulate a robot model in 3D graphics space. A 4-axis SCARA robot (FARA SM5) was adopted as the target model. Forward kinematics, inverse kinematics and robot dynamics modeling were included in the developed program. The interface between users and the OLP system in the Windows 95 GUI environment was also studied. The development language is Microsoft Visual C++. The OpenGL graphics library by Silicon Graphics, Inc. was utilized for 3D graphics.
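
For reference, a short sketch of the forward and (planar) inverse kinematics of a generic 4-axis SCARA arm of the kind the simulator models; the link lengths and joint conventions are placeholders, not the FARA SM5 parameters, and Python is used here instead of the paper's Visual C++.

```python
import numpy as np

def scara_forward_kinematics(q1, q2, d3, q4, l1=0.3, l2=0.25):
    """Forward kinematics of a generic SCARA arm: two revolute shoulder/elbow joints,
    one prismatic Z joint, one revolute wrist. Link lengths are placeholders."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    z = -d3                      # prismatic joint moves the tool down
    yaw = q1 + q2 + q4           # tool orientation about the vertical axis
    return np.array([x, y, z, yaw])

def scara_inverse_kinematics(x, y, l1=0.3, l2=0.25, elbow_up=True):
    """Planar inverse kinematics for the first two joints (elbow-up/down branches)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    q2 = np.arccos(np.clip(c2, -1.0, 1.0))
    if not elbow_up:
        q2 = -q2
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2
```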

A Study on locomotion of a mobile robot by a visual perception (시각정보에 의한 이동 로보트의 주행에 관한 연구)

  • Shin, J.S.;Jeong, D.M.;Cho, J.M.;Chang, W.S.;Hong, S.H.
    • Proceedings of the KIEE Conference / 1987.07b / pp.1260-1263 / 1987
  • This paper describes a mobile robot system that recognizes a guidance tape and presents the locomotion algorithm. The system converts the video image to a binary image by setting an optimal threshold and obtains the parameters needed to move the robot. The mobile robot moves according to a route programmed in memory. When an obstacle is recognized on the locomotion route, the system constructs a new route and the robot follows it.
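
A minimal sketch of the thresholding and steering idea the abstract describes, using OpenCV; the threshold value, moment-based centroid, and proportional steering gain are illustrative assumptions, not the paper's algorithm.

```python
import cv2
import numpy as np

def tape_offset(gray_frame, threshold=200):
    """Binarize a grayscale frame and return the tape's horizontal offset from the
    image center (negative = tape is to the left). The threshold value is illustrative."""
    _, binary = cv2.threshold(gray_frame, threshold, 255, cv2.THRESH_BINARY)
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None  # no tape visible
    cx = m["m10"] / m["m00"]
    return cx - gray_frame.shape[1] / 2.0

def steering_command(offset, gain=0.005):
    """Simple proportional steering from the pixel offset."""
    return 0.0 if offset is None else -gain * offset
```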

A Development of The Remote Robot Control System with Virtual Reality Interface System (가상현실과 결합된 로봇제어 시스템의 구현방법)

  • 김우경;김훈표;현웅근
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2003.10a / pp.320-324 / 2003
  • Recently, virtual reality has been applied in various fields of industry. In this paper, the motion of a real robot is controlled through interface manipulation in a virtual world. A virtual robot was created using a 3D graphics tool, and an image resembling the real robot was reproduced by applying textures with Direct3D components. Both the real robot and the virtual robot are controlled by a joystick. The developed system consists of a robot controller with a vision system and a host PC program. The robot and camera can each move with 2 degrees of freedom by independently remote-controlling them with a user-friendly joystick. The environment is recognized by the vision system and ultrasonic sensors. The visual image and command data are transmitted through 900 MHz and 447 MHz RF modules, respectively. When the user sends robot control commands through the simulator, the transmitter/receiver operates outdoors at distances of up to 500 m at 4800 bps in half-duplex mode via a 447 MHz radio frequency module.
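
As a rough illustration of sending joystick commands over a 4800 bps half-duplex serial RF link, here is a hedged Python sketch using pyserial; the frame layout, port name, and gains are invented for illustration, since the paper does not specify its protocol.

```python
import struct
import serial  # pyserial

def make_command_frame(forward, turn):
    """Pack two joystick axes (-1.0..1.0) into a tiny framed packet with a checksum.
    The frame layout is purely illustrative; the paper does not describe its protocol."""
    payload = struct.pack("bb", int(forward * 100), int(turn * 100))
    checksum = sum(payload) & 0xFF
    return b"\x02" + payload + bytes([checksum]) + b"\x03"

def send_command(port_name, forward, turn):
    """Send one command frame over the RF modem's serial port at 4800 bps."""
    with serial.Serial(port_name, baudrate=4800, timeout=0.1) as port:
        port.write(make_command_frame(forward, turn))

# Example: drive forward at half speed with a slight right turn.
# send_command("/dev/ttyUSB0", forward=0.5, turn=-0.2)
```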

Mobility Improvement of an Internet-based Robot System Using the Position Prediction Simulator

  • Lee Kang Hee;Kim Soo Hyun;Kwak Yoon Keun
    • International Journal of Precision Engineering and Manufacturing / v.6 no.3 / pp.29-36 / 2005
  • With the rapid growth of the Internet, Internet-based robots have been realized by connecting off-line robots to the Internet. However, because the Internet is often irregular and unreliable, the varying time delay in data transmission is a significant problem in building an Internet-based robot system. This paper is therefore concerned with the development of an Internet-based robot system that is insensitive to Internet time delay. For this purpose, the PPS (Position Prediction Simulator) is suggested and implemented in the system. The PPS consists of two parts: robot position prediction and a projective virtual scene. In the position prediction part, the robot position is predicted for more accurate operation of the mobile robot, based on the time at which the user's command reaches the robot system. The projective virtual scene part shows 3D visual information of the remote site, obtained through image processing and position prediction. To verify the proposed PPS, the robot was moved along a planned path under various network traffic conditions. The simulation and experimental results showed that the path error of the robot motion could be reduced using the developed PPS.
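
A minimal sketch of the position-prediction idea behind the PPS: dead-reckon the robot pose forward by the estimated one-way command delay so commands are issued for where the robot will be when they arrive. The constant-velocity assumption and the delay estimator below are illustrative, not the paper's exact simulator.

```python
import numpy as np

def predict_pose(pose, v, w, delay_s):
    """Dead-reckon the pose [x, y, theta] of a differential-drive robot forward by the
    estimated one-way command delay. Constant (v, w) over the delay is an assumption."""
    x, y, th = pose
    if abs(w) < 1e-6:
        return np.array([x + v * delay_s * np.cos(th),
                         y + v * delay_s * np.sin(th),
                         th])
    r = v / w
    th_new = th + w * delay_s
    return np.array([x + r * (np.sin(th_new) - np.sin(th)),
                     y - r * (np.cos(th_new) - np.cos(th)),
                     th_new])

def estimate_delay(send_times, ack_times):
    """Rough one-way delay estimate from round-trip time samples."""
    rtts = np.asarray(ack_times) - np.asarray(send_times)
    return float(np.median(rtts)) / 2.0
```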

An Aerial Robot System Tracking a Moving Object

  • Ogata, Takehito;Tan, Joo Kooi;Ishikawa, Seiji
    • 제어로봇시스템학회:학술대회논문집 / 2003.10a / pp.1917-1920 / 2003
  • Automatic tracking of a moving object such as a person is a technique in demand, especially in surveillance. This paper describes an experimental system for tracking a moving object on the ground using a visually controlled aerial robot. A blimp is used as the aerial robot in the proposed system because it can operate within a local area and because of its silent nature. The developed blimp is equipped with a camera for taking downward images and four rotors for controlling its progression. Once the camera takes an image of the specified moving object on the ground, the blimp is controlled so that it follows the object using this visual information. Experimental results show satisfactory performance of the system. Advantages of the present system are that images taken from the air often avoid occlusion among objects on the ground, and that the blimp's progression is much less restricted in the air than that of, e.g., a mobile robot running on the ground.
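
A hedged sketch of the kind of image-based following loop the abstract describes, with a simple color-gate detector and proportional thrust commands; the detector, gains, and command mapping are illustrative assumptions, not the system's actual tracking and control method.

```python
import cv2
import numpy as np

def object_offset(frame_bgr, lower_hsv, upper_hsv):
    """Locate a color-marked target in a downward-looking frame and return its pixel
    offset from the image center. The HSV color gate is an illustrative detector."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    h, w = mask.shape
    return np.array([cx - w / 2.0, cy - h / 2.0])

def rotor_commands(offset, gain=0.002):
    """Proportional forward/lateral thrust commands that re-center the target."""
    if offset is None:
        return 0.0, 0.0
    lateral = gain * offset[0]   # image x error -> sideways thrust
    forward = gain * offset[1]   # image y error -> forward/backward thrust
    return forward, lateral
```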
