• Title/Summary/Keyword: robot vision system

A Study on the Rotation Angle Estimation of HMD for the Tele-operated Vision System (원격 비전시스템을 위한 HMD의 방향각 측정 알고리즘에 관한 연구)

  • Ro, Young-Shick;Yoon, Seung-Jun;Kang, Hee-Jun;Suh, Young-Soo
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.58 no.3
    • /
    • pp.605-613
    • /
    • 2009
  • In this paper, we study the real-time azimuthal measurement of an HMD (Head Mounted Display) used to control a tele-operated vision system on a mobile robot. In the existing tele-operated vision system, a joystick was used to control the pan-tilt unit of the remote camera. To give the tele-operator a sense of presence, we use an HMD to display the remote scene, measure the rotation angles of the HMD in real time, and transmit them to the mobile robot controller so that the pan-tilt angles of the remote camera stay synchronized with the HMD. We propose an algorithm for real-time estimation of the HMD rotation angles using feature points extracted from a PC-camera image, and a simple experiment demonstrates its feasibility.
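
The abstract does not give the exact estimation procedure; the sketch below shows one common way to recover rotation from tracked feature points with OpenCV (track corners between consecutive frames, recover the relative rotation from the essential matrix, convert it to Euler angles for the pan-tilt command). The camera intrinsics `K` and all parameter values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: estimate HMD rotation between two camera frames
# from tracked feature points (not the paper's exact algorithm).
import cv2
import numpy as np

K = np.array([[700.0, 0.0, 320.0],      # assumed camera intrinsics
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

def estimate_rotation(prev_gray, curr_gray):
    """Return (yaw, pitch, roll) in degrees between two grayscale frames."""
    # Detect corners in the previous frame and track them with optical flow.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                 qualityLevel=0.01, minDistance=7)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    good0 = p0[status.flatten() == 1]
    good1 = p1[status.flatten() == 1]

    # Recover the relative camera rotation from the essential matrix.
    E, mask = cv2.findEssentialMat(good0, good1, K, method=cv2.RANSAC, threshold=1.0)
    _, R, _, _ = cv2.recoverPose(E, good0, good1, K, mask=mask)

    # Convert the rotation matrix to Euler angles (ZYX convention).
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return yaw, pitch, roll
```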

A Stereo-Vision System for 3D Position Recognition of Cow Teats on Robot Milking System (로봇 착유시스템의 3차원 유두위치인식을 위한 스테레오비젼 시스템)

  • Kim, Woong;Min, Byeong-Ro;Lee, Dea-Weon
    • Journal of Biosystems Engineering
    • /
    • v.32 no.1 s.120
    • /
    • pp.44-49
    • /
    • 2007
  • A stereo vision system using two monochromatic cameras was developed for a robot milking system (RMS). An inverse perspective transformation algorithm was developed to acquire 3-D information for all teats. To verify the performance of the algorithm in the stereo vision system, indoor tests were carried out using a test board and model teats, and a real cow and a model cow were used to measure distance errors. The maximum distance errors for the test board, model teats, and real teats were 0.5 mm, 4.9 mm, and 6 mm, respectively, and the average distance errors for model teats and real teats were 2.9 mm and 4.43 mm, respectively. It was therefore concluded that the algorithm is accurate enough to be applied to the RMS.
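
The inverse perspective transformation itself is not spelled out in the abstract; below is a minimal sketch of standard stereo triangulation for a rectified two-camera rig, one common way to recover a 3-D teat position from a matched left/right pixel pair. The baseline, focal length, and principal point are illustrative assumptions.

```python
# Hypothetical sketch: 3-D point from a matched pixel pair in a
# rectified (parallel) stereo rig; not the paper's exact transform.
import numpy as np

def triangulate(xl, yl, xr, baseline_mm=100.0, focal_px=800.0,
                cx=320.0, cy=240.0):
    """Return (X, Y, Z) in mm for a feature seen at (xl, yl) in the left
    image and (xr, yl) in the right image, assuming rectified images."""
    disparity = xl - xr                      # pixel disparity
    if disparity <= 0:
        raise ValueError("non-positive disparity: point not triangulable")
    Z = focal_px * baseline_mm / disparity   # depth from similar triangles
    X = (xl - cx) * Z / focal_px             # lateral offset
    Y = (yl - cy) * Z / focal_px             # vertical offset
    return X, Y, Z

# Example: a teat tip seen 25 px apart in the two images
print(triangulate(350.0, 260.0, 325.0))
```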

Real-time Omni-directional Distance Measurement with Active Panoramic Vision

  • Yi, Soo-Yeong;Choi, Byoung-Wook;Ahuja, Narendra
    • International Journal of Control, Automation, and Systems
    • /
    • v.5 no.2
    • /
    • pp.184-191
    • /
    • 2007
  • Autonomous navigation of a mobile robot requires a ranging system that measures the distance to surrounding objects. Clearly, wider and faster distance measurement gives a mobile robot more freedom in trajectory planning and control. The active omni-directional ranging system proposed in this paper can obtain distances in all 360° directions in real time, owing to its omni-directional mirror and structured light. Distance computation, including a sensitivity analysis, and experiments on omni-directional ranging are presented to verify the effectiveness of the proposed system.
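
The abstract describes the sensor geometry only at a high level. The sketch below reduces the idea to a single vertical plane: a structured-light plane at a known vertical offset is seen through the omni-directional mirror, the image radius of the stripe is mapped to an elevation angle, and the horizontal distance follows by triangulation. The radius-to-angle mapping is mirror-specific and is only stubbed here as an assumed calibration; this is not the paper's exact sensor model.

```python
# Hypothetical sketch: simplified structured-light triangulation for an
# omni-directional ranging setup; not the paper's exact sensor model.
import numpy as np

def radius_to_elevation(radius_px, coeffs=(0.002, 0.0)):
    """Assumed calibration: pixel radius of the stripe -> elevation angle [rad]."""
    return coeffs[0] * radius_px + coeffs[1]

def distance_from_stripe(radius_px, plane_offset_mm=150.0):
    """Horizontal distance to the object hit by the light plane, given the
    image radius at which the stripe appears in that azimuth direction."""
    theta = radius_to_elevation(radius_px)
    if theta <= 0:
        raise ValueError("stripe at or above the horizon: no intersection")
    return plane_offset_mm / np.tan(theta)   # triangulation in a vertical plane

# Example: one radius sample per degree of azimuth
radii = np.linspace(80, 200, 360)
distances = [distance_from_stripe(r) for r in radii]
```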

DEVELOPMENT OF A 3-DOF ROBOT FOR HARVESTING LETTUCE USING MACHINE VISION AND FUZZY LOGIC CONTROL

  • S. I. Cho;S. J. Chang;Kim, Y. Y.
    • Proceedings of the Korean Society for Agricultural Machinery Conference
    • /
    • 2000.11b
    • /
    • pp.354-362
    • /
    • 2000
  • In Korea, research on year-round leaf vegetable production systems is in progress, most of it focused on environmental control, so automation technologies for harvesting, transporting, and grading are in great demand. A robot system for harvesting lettuce was developed, composed of a 3-DOF (degree-of-freedom) manipulator, an end-effector, a lettuce feeding conveyor, an air blower, a machine vision system, six photoelectric sensors, and a fuzzy logic controller. Fuzzy logic control was applied to determine the appropriate grip force on the lettuce, with leaf area index and height as input variables and voltage as the output variable. The success rate of lettuce harvesting was 94.12%, and the average harvesting time was approximately 5 seconds per lettuce.
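
The abstract names the fuzzy inputs (leaf area index and height) and output (voltage) but not the rule base. The sketch below is a minimal Mamdani-style controller with made-up triangular membership functions, rules, and ranges, only to illustrate how such a grip-force controller is structured.

```python
# Hypothetical sketch: Mamdani-style fuzzy controller for grip voltage.
# Membership functions, rule base, and ranges are illustrative assumptions.
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def grip_voltage(leaf_area_index, height_cm):
    # Fuzzify the two inputs (small/medium/large lettuce).
    lai = {"small": tri(leaf_area_index, 0, 1, 2),
           "medium": tri(leaf_area_index, 1, 2, 3),
           "large": tri(leaf_area_index, 2, 3, 4)}
    h = {"low": tri(height_cm, 5, 10, 15),
         "mid": tri(height_cm, 10, 15, 20),
         "high": tri(height_cm, 15, 20, 25)}

    # Rule base: bigger lettuce -> higher grip voltage (rule centroids in volts).
    rules = [(min(lai["small"], h["low"]), 2.0),
             (min(lai["medium"], h["mid"]), 3.5),
             (min(lai["large"], h["mid"]), 4.5),
             (min(lai["large"], h["high"]), 5.0)]

    # Weighted-average defuzzification.
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

print(grip_voltage(2.4, 16.0))   # example: a medium-large lettuce
```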

Development and Performance Evaluation of Hull Blasting Robot for Surface Pre-Preparation for Painting Process (도장전처리 작업을 위한 블라스팅 로봇 시스템 개발 및 성능평가)

  • Lee, JunHo;Jin, Taeseok
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.26 no.5
    • /
    • pp.383-389
    • /
    • 2016
  • In this paper, we present a hull blasting machine with a vision-based weld bead recognition device for cleaning ship exterior walls. The purpose of this study is to introduce the mechanical design of a high-efficiency hull blasting machine that uses a vision system to recognize weld beads. We developed the robot mechanism and the drive controller of the hull blasting robot, and its main characteristics, such as the climbing mechanism, vision system, remote controller, and CAN communication, are discussed and compared with experimental data. The hull blasting robot can remove rust and paint while the ship is at anchor, so re-docking is unnecessary; this saves the time and cost of the re-docking process and allows more vessels to be built instead. The robot uses sensors to navigate safely around the hull and has a filter system to collect the removed fouling. We completed a pilot test of the robot and demonstrated its drive control and CAN communication performance.

Target Tracking Control of Mobile Robots with Vision System in the Absence of Velocity Sensors (속도센서가 없는 비전시스템을 이용한 이동로봇의 목표물 추종)

  • Cho, Namsub;Kwon, Ji-Wook;Chwa, Dongkyoung
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.62 no.6
    • /
    • pp.852-862
    • /
    • 2013
  • This paper proposes a target tracking control method for wheeled mobile robots with nonholonomic constraints, based on a backstepping-like feedback linearization. For target tracking, a vision system mounted on the mobile robot provides the relative posture between the robot and the target. The robot carries no velocity sensors, so the velocities of both the mobile robot and the target are assumed unknown; only their maximum velocities are used. First, pseudo commands for the forward linear velocity and the heading direction angle are designed from the kinematics using the obtained image information. Then, the actual control inputs are designed so that the actual forward linear velocity and heading direction angle follow the pseudo commands. Simulations and experiments with the mobile robot confirm that the proposed method can track the target even though no velocity sensors are used.
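
The control law itself is not given in the abstract; below is a minimal kinematic sketch in the same spirit: from the vision-derived target position in the robot frame, compute pseudo forward-velocity and heading-rate commands for a unicycle-type robot, saturated by an assumed maximum velocity. The gains and limits are assumptions, and this is plain kinematic tracking, not the authors' backstepping-like design.

```python
# Hypothetical sketch: kinematic pseudo-commands for a nonholonomic
# (unicycle) robot tracking a target whose relative position comes from
# a vision system. Gains and limits are assumptions, not the paper's design.
import math

V_MAX = 0.5      # assumed maximum forward velocity [m/s]
W_MAX = 1.5      # assumed maximum turn rate [rad/s]

def pseudo_commands(dx, dy, k_v=0.8, k_w=2.0):
    """dx, dy: target position in the robot frame (x forward, y left), meters.
    Returns (v, w): forward velocity and heading-rate commands."""
    rho = math.hypot(dx, dy)            # range to the target
    alpha = math.atan2(dy, dx)          # bearing error (desired heading change)

    v = k_v * rho * math.cos(alpha)     # drive forward only when roughly aligned
    w = k_w * alpha                     # turn to reduce the bearing error

    # Respect the known maximum velocities (the only velocity information used).
    v = max(-V_MAX, min(V_MAX, v))
    w = max(-W_MAX, min(W_MAX, w))
    return v, w

print(pseudo_commands(1.2, 0.4))        # target 1.2 m ahead, 0.4 m to the left
```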

Terrain Exploration Using a Mobile Robot with Stereo Cameras (스테레오 카메라를 장착한 주행 로봇의 야외 탐사)

  • Yoon, Suk-June;Park, Sung-Kee;Kim, Soo-Hyun;Kwak, Yoon-Keun
    • Proceedings of the KSME Conference
    • /
    • 2004.11a
    • /
    • pp.766-771
    • /
    • 2004
  • In this paper, a new exploration mobile robot is presented. This mobile robot, called Robhaz-6W, can overcome hazardous terrain, recognize three-dimensional terrain information, and generate a path toward its destination by itself. We developed a passive four-bar linkage mechanism that adapts to such terrain without any active control, a real-time stereo vision system for obstacle avoidance, a remote control scheme, and a path planning method. The geometric information is transmitted to the operator at the remote site via wireless LAN. Finally, experimental results for the passive mechanism, the real-time stereo vision system, and the path planning are reported, showing the versatility of the proposed mobile robot system in carrying out its tasks.

A User Interface for Vision Sensor based Indirect Teaching of a Robotic Manipulator (시각 센서 기반의 다 관절 매니퓰레이터 간접교시를 위한 유저 인터페이스 설계)

  • Kim, Tae-Woo;Lee, Hoo-Man;Kim, Joong-Bae
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.10
    • /
    • pp.921-927
    • /
    • 2013
  • This paper presents a user interface for vision-based indirect teaching of a robotic manipulator with Kinect and IMU (Inertial Measurement Unit) sensors. The interface is designed to control the manipulator more easily in joint space, Cartesian space, and the tool frame. Skeleton data from the Kinect and wrist-mounted IMU sensors are used to compute the user's joint angles and wrist movement for robot control. The proposed interface allows the user to teach the manipulator without a pre-programming process, which shortens robot teaching time and ultimately increases productivity. Simulation and experimental results are presented to verify the performance of the robot control and the interface system.
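
The abstract says the user's joint angles are computed from Kinect skeleton data but omits the formula; the sketch below shows the usual vector-angle computation for, e.g., an elbow angle from three skeleton joints. The joint names and coordinates are illustrative, and the wrist-IMU fusion is not shown.

```python
# Hypothetical sketch: joint angle from three Kinect skeleton points
# (e.g., shoulder-elbow-wrist); not the paper's full interface.
import numpy as np

def joint_angle(p_prox, p_joint, p_dist):
    """Angle (degrees) at p_joint formed by the segments to p_prox and p_dist."""
    u = np.asarray(p_prox, dtype=float) - np.asarray(p_joint, dtype=float)
    v = np.asarray(p_dist, dtype=float) - np.asarray(p_joint, dtype=float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example skeleton points in meters (camera frame): shoulder, elbow, wrist
shoulder, elbow, wrist = (0.0, 0.4, 2.0), (0.25, 0.2, 2.0), (0.45, 0.35, 1.9)
elbow_angle = joint_angle(shoulder, elbow, wrist)   # map this to a robot joint
print(elbow_angle)
```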

Dynamic Visual Servoing of Robot Manipulators (로봇 메니퓰레이터의 동력학 시각서보)

  • Baek, Seung-Min;Im, Gyeong-Su;Han, Ung-Gi;Guk, Tae-Yong
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.49 no.1
    • /
    • pp.41-47
    • /
    • 2000
  • Better tracking performance can be achieved when visual sensors such as CCD cameras are used to control a robot manipulator than when only relative sensors such as encoders are used. However, precise visual servoing of a robot manipulator requires an expensive vision system with a fast sampling rate, and even if such a fast vision system is available, reliable performance cannot be obtained without a robust and stable inner joint servo loop. In this paper, we propose a dynamic control scheme for robot manipulators with an eye-in-hand camera configuration, in which a dynamic learning controller is designed to improve the tracking performance of the robotic system. The proposed control scheme is implemented for tasks of tracking moving objects and is shown to be robust to parameter uncertainty, disturbances, a low sampling rate, and so on.
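
The abstract does not reproduce the dynamic learning controller itself; as background, the sketch below shows the classical image-based visual servoing law v = -λ L⁺ e for point features with an eye-in-hand camera, the kinematic core that such schemes build on. The depths and gain are assumptions, and this is textbook IBVS, not the authors' controller.

```python
# Hypothetical sketch: classical image-based visual servoing (IBVS) for
# point features, v = -lambda * pinv(L) @ e. Not the paper's dynamic
# learning controller; depths and gain are assumed.
import numpy as np

def interaction_matrix(x, y, Z):
    """2x6 interaction (image Jacobian) matrix for a normalized image point."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, lam=0.5):
    """Camera twist (vx, vy, vz, wx, wy, wz) driving points toward desired."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    e = (np.asarray(points) - np.asarray(desired)).reshape(-1)
    return -lam * np.linalg.pinv(L) @ e

# Example: four normalized image points and their desired locations
pts = [(0.12, 0.10), (-0.08, 0.11), (-0.09, -0.12), (0.10, -0.09)]
des = [(0.10, 0.10), (-0.10, 0.10), (-0.10, -0.10), (0.10, -0.10)]
print(ibvs_velocity(pts, des, depths=[0.8, 0.8, 0.8, 0.8]))
```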

2D Map Generation Using an Omnidirectional Image Sensor and Stereo Vision for the Mobile Robot MAIRO (자율이동로봇MAIRO의 전방향 이미지센서와 스테레오 비전 시스템을 이용한 2차원 지도 생성)

  • Kim, Kyung-Ho;Lee, Hyung-Kyu;Son, Young-Jun;Song, Jae-Keun
    • Proceedings of the KIEE Conference
    • /
    • 2002.11c
    • /
    • pp.495-500
    • /
    • 2002
  • Recently, the service robot industry has emerged as a promising industry of the next generation, and there is a great deal of research on self-steering movement (SSM). To implement SSM, a robot must effectively recognize its surroundings, detect objects, and build a map of its environment with sensors, so many robots carry sonar and infrared sensors. However, these sensors give only the distance between the robot and an object, and their resolution is poor. In this paper, we introduce a new algorithm that recognizes objects around the robot and builds a two-dimensional map of the surroundings using an omni-directional vision camera and two stereo vision cameras.
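
The abstract does not detail how the range data become a 2-D map; a minimal sketch of the common approach is given below: project detected 3-D obstacle points onto the ground plane and mark the corresponding cells of an occupancy grid. The grid size, resolution, and height filter are assumptions.

```python
# Hypothetical sketch: build a 2-D occupancy grid from 3-D obstacle points
# (e.g., from stereo vision); grid size and resolution are assumptions.
import numpy as np

GRID_SIZE = 200          # 200 x 200 cells
RESOLUTION = 0.05        # 5 cm per cell -> 10 m x 10 m map, robot at the center

def build_map(points_xyz, max_obstacle_height=1.5):
    """points_xyz: iterable of (x, y, z) in meters in the robot frame."""
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)
    for x, y, z in points_xyz:
        if not (0.05 < z < max_obstacle_height):   # ignore floor and ceiling
            continue
        col = int(x / RESOLUTION) + GRID_SIZE // 2
        row = int(y / RESOLUTION) + GRID_SIZE // 2
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row, col] = 1                     # mark the cell occupied
    return grid

# Example: two obstacle points, 1 m ahead and 2 m to the left
occupancy = build_map([(1.0, 0.0, 0.4), (0.0, 2.0, 0.7)])
print(occupancy.sum(), "occupied cells")
```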
