• Title/Summary/Keyword: Robot Vision


Visual Servoing of a Mobile Manipulator Based on Stereo Vision (스테레오 영상을 이용한 이동형 머니퓰레이터의 시각제어)

  • Lee Hyun Jeong;Park Min Gyu;Lee Min Cheol
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.5
    • /
    • pp.411-417
    • /
    • 2005
  • In this study, a stereo vision system is applied to a mobile manipulator for effective task execution. The robot can recognize a target and compute the position of the target using the stereo vision system. While a monocular vision system needs prior properties such as the geometric shape of a target, a stereo vision system enables the robot to find the position of a target without additional information. Many algorithms have been studied and developed for object recognition. However, most of these approaches suffer from computational complexity and are inadequate for real-time visual servoing. Color information is useful for simple recognition in real-time visual servoing. This paper addresses object recognition using color, a stereo matching method with reduced calculation time, recovery of 3D position, and visual servoing.
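The 3D recovery step the abstract mentions can be sketched as standard stereo triangulation. This is a minimal illustration assuming a rectified rig; the focal length, baseline, principal point, and pixel values below are hypothetical, not from the paper.

```python
# Minimal sketch of stereo triangulation on a rectified camera pair.
# f (focal length, px), B (baseline, m), cx/cy (principal point) are
# assumed example values, not the paper's calibration.

def stereo_to_3d(u_left, v, disparity, f=700.0, B=0.12, cx=320.0, cy=240.0):
    """Triangulate a matched point: Z = f*B/d, X = (u-cx)*Z/f, Y = (v-cy)*Z/f."""
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    Z = f * B / disparity            # depth from disparity
    X = (u_left - cx) * Z / f        # lateral offset
    Y = (v - cy) * Z / f             # vertical offset
    return X, Y, Z

# Example: a matched feature at (400, 260) with 21 px disparity.
X, Y, Z = stereo_to_3d(400.0, 260.0, 21.0)
```

The key point of the abstract is visible here: depth falls directly out of the disparity, so no prior geometric model of the target is required.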

A Study on Development and Application of Real Time Vision Algorithm for Inspection Process Automation (검사공정 자동화를 위한 실시간 비전알고리즘 개발 및 응용에 관한 연구)

  • Back, Seung-Hak;Hwang, Won-Jun;Shin, Haeng-Bong;Choi, Young-Sik;Park, Dae-Yeong
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.19 no.1
    • /
    • pp.42-49
    • /
    • 2016
  • This study proposes a robot vision system based on non-contact inspection technology for detecting faulty welding states and part shapes. The main focus is real-time implementation of automatic inspection of machined parts by a moving robot. For this purpose, the automatic test instrument inspects precision components designated by the vision system, using pattern-recognition techniques to distinguish good parts from defective ones based on their machining status and appearance. To realize a real-time integrated automation system for the precision-parts manufacturing process, a robot vision system was designed as the integrated system controller and its reliability was verified through experiments. The robot vision technology presented here for non-contact inspection of precision components and machine parts is useful for factory automation (FA).

Automatic Extraction of Stable Visual Landmarks for a Mobile Robot under Uncertainty (이동로봇의 불확실성을 고려한 안정한 시각 랜드마크의 자동 추출)

  • Moon, In-Hyuk
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.7 no.9
    • /
    • pp.758-765
    • /
    • 2001
  • This paper proposes a method to automatically extract stable visual landmarks from sensory data. Given a 2D occupancy map, a mobile robot first extracts vertical line features which are distinct and on vertical planar surfaces, because they are expected to be observed reliably from various viewpoints. Since the feature information such as position and length includes uncertainty due to errors of vision and motion, the robot then reduces the uncertainty by matching the planar surface containing the features to the map. As a result, the robot obtains modeled stable visual landmarks from the extracted features. This extraction process is performed on-line to adapt to actual changes of lighting and scene depending on the robot’s view. Experimental results in various real scenes show the validity of the proposed method.

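The uncertainty reduction described in the landmark-extraction abstract above can be pictured as a minimum-variance fusion of two noisy estimates. The 1D Kalman-style update below is an illustrative simplification with made-up numbers, not the paper's exact model.

```python
# Sketch: fuse an observed feature position (variance from vision/motion
# error) with the matched map-surface position. 1-D simplification with
# illustrative values, not the paper's formulation.

def fuse(feature_pos, feature_var, map_pos, map_var):
    """Minimum-variance (Kalman-style) fusion of two noisy 1D estimates."""
    k = feature_var / (feature_var + map_var)   # weight toward the map estimate
    pos = feature_pos + k * (map_pos - feature_pos)
    var = (1.0 - k) * feature_var               # fused variance is always smaller
    return pos, var

# A feature seen at 2.30 m (var 0.04) matched to a map surface at 2.20 m (var 0.01).
pos, var = fuse(feature_pos=2.30, feature_var=0.04, map_pos=2.20, map_var=0.01)
```

Because the fused variance is strictly smaller than the feature's own variance, repeating this matching step is what makes the extracted landmarks "stable" in the abstract's sense.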

Odor Source Tracking of Mobile Robot with Vision and Odor Sensors (비전과 후각 센서를 이용한 이동로봇의 냄새 발생지 추적)

  • Ji, Dong-Min;Lee, Jeong-Jun;Kang, Geun-Taek;Lee, Won-Chang
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.16 no.6
    • /
    • pp.698-703
    • /
    • 2006
  • This paper proposes an approach to searching for an odor source using an autonomous mobile robot equipped with vision and odor sensors. The robot initially navigates around the specific area with its vision system until it finds an object in the camera image. The robot approaches the object found in the field of view and checks with the odor sensors whether it is releasing an odor. If so, the odor is classified and localized with a classification algorithm based on a neural network. The AMOR (Autonomous Mobile Olfactory Robot) was built and used for the experiments. Experimental results on the classification and localization of odor sources show the validity of the proposed algorithm.

Control Strategy for Obstacle Avoidance of an Agricultural Robot (농용 로봇의 장애물 회피알고리즘)

  • 류관희;김기영;박정인;류영선
    • Journal of Biosystems Engineering
    • /
    • v.25 no.2
    • /
    • pp.141-150
    • /
    • 2000
  • This study was carried out to develop a control strategy for a fruit-harvesting redundant robot. A method of generating a safe trajectory that avoids collisions with obstacles, such as branches or immature fruits, in 3D (three-dimensional) space using the artificial potential field technique and a virtual-plane concept was proposed. Also proposed was a method of setting reference velocity vectors to follow the trajectory and avoid obstacles in 3D space. The developed methods were verified with computer simulations and with actual robot tests. For the actual robot tests, a machine vision system was used for detecting fruits and obstacles. Results showed that the developed control method could reduce the occurrences of the robot manipulator entering the possible collision distance. With 10 virtual obstacles generated randomly in 3D space, the maximum rates of occurrence of the manipulator within the possible collision distance of 0.03 m from the obstacles were 8% with 5 degrees of freedom (DOF), 8% with 6 DOF, and 4% with 7 DOF, respectively.

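The artificial potential field technique named in the obstacle-avoidance abstract above can be sketched in a few lines. This is a 2D illustration with assumed gains and geometry (the paper works in 3D with a virtual-plane concept); it only shows the attractive/repulsive field idea.

```python
import math

# Illustrative 2-D artificial-potential-field step: the robot descends the
# gradient of an attractive potential toward the goal plus repulsive
# potentials around obstacles. Gains and positions are assumed examples.

def potential_step(pos, goal, obstacles, k_att=1.0, k_rep=0.05, d0=1.0, step=0.1):
    """Take one fixed-length step along the net force direction."""
    fx = k_att * (goal[0] - pos[0])            # attractive force
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:                          # repulsion only inside radius d0
            mag = k_rep * (1.0 / d - 1.0 / d0) / d ** 3
            fx += mag * dx
            fy += mag * dy
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)

# Drive toward (3, 0) past an obstacle at (1.5, 0.5).
p = (0.0, 0.0)
for _ in range(80):
    p = potential_step(p, goal=(3.0, 0.0), obstacles=[(1.5, 0.5)])
```

Following the negative gradient this way is what produces the "reference velocity vectors" the abstract describes; the safe trajectory bends around each obstacle's influence region.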

Vision-based Autonomous Semantic Map Building and Robot Localization (영상 기반 자율적인 Semantic Map 제작과 로봇 위치 지정)

  • Lim, Joung-Hoon;Jeong, Seung-Do;Suh, Il-Hong;Choi, Byung-Uk
    • Proceedings of the KIEE Conference
    • /
    • 2005.10b
    • /
    • pp.86-88
    • /
    • 2005
  • An autonomous semantic-map building method is proposed, with the robot localized in the semantic map. Our semantic map is organized by objects represented as SIFT features, and vision-based relative localization is employed as a process model to implement extended Kalman filters. Thus, we expect that robust SLAM performance can be obtained even under poor conditions in which localization cannot be achieved by classical odometry-based SLAM.

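Recognizing map objects by SIFT features, as in the semantic-map abstract above, boils down to nearest-neighbour descriptor matching. The sketch below uses Lowe's distance-ratio test on toy 4-D vectors standing in for real 128-D SIFT descriptors; all values are hypothetical.

```python
import math

# Sketch of SIFT-style descriptor matching with the nearest-neighbour
# distance-ratio test. The 4-D toy descriptors below stand in for real
# 128-D SIFT vectors; data and the 0.8 ratio are assumed examples.

def match_descriptors(query, database, ratio=0.8):
    """Return (query_idx, db_idx) pairs that pass the distance-ratio test."""
    matches = []
    for qi, q in enumerate(query):
        dists = sorted((math.dist(q, d), di) for di, d in enumerate(database))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:   # unambiguous nearest neighbour
            matches.append((qi, best[1]))
    return matches

db = [(0.0, 0.0, 1.0, 0.0), (1.0, 0.0, 0.0, 0.0), (0.0, 1.0, 0.0, 1.0)]
query = [(0.1, 0.0, 0.9, 0.0), (0.5, 0.5, 0.5, 0.5)]
matches = match_descriptors(query, db)
```

The ratio test rejects the second query descriptor, which is equidistant from every database entry; discarding such ambiguous matches is what keeps object landmarks reliable enough to feed an EKF update.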

A Study on the Development of a Robot Vision Control Scheme Based on the Newton-Raphson Method for the Uncertainty of Circumstance (불확실한 환경에서 N-R방법을 이용한 로봇 비젼 제어기법 개발에 대한 연구)

  • Jang, Min Woo;Jang, Wan Shik;Hong, Sung Mun
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.40 no.3
    • /
    • pp.305-315
    • /
    • 2016
  • This study aims to develop a robot vision control scheme using the Newton-Raphson (N-R) method for the circumstantial uncertainty caused by the appearance of obstacles during robot movement. The vision system model used in this study involves six camera parameters (C1-C6). First, an estimation scheme for the six camera parameters is developed. Then, based on the six parameters estimated for three of the cameras, a scheme for computing the robot's joint angles is developed for the placement of a slender bar. In particular, for placing the slender bar under uncertain circumstances, the discontinuous robot trajectory caused by obstacles is divided into three obstacle regions: the beginning region, the middle region, and the near-target region. The effects of obstacles when using the proposed robot vision control scheme are then investigated in each obstacle region through slender-bar placement experiments.
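The Newton-Raphson iteration at the core of the parameter-estimation scheme above can be sketched in one dimension: x_{k+1} = x_k - f(x_k)/f'(x_k). The paper applies the same idea to a six-parameter camera model with a Jacobian in place of the scalar derivative; the example equation here is just an illustration.

```python
# 1-D sketch of the Newton-Raphson update; for the six-parameter camera
# model the scalar derivative df becomes a Jacobian. The example function
# (solving x**2 = 2) is an assumed illustration, not the paper's model.

def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x <- x - f(x)/df(x) until the step is below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```

Near a good initial guess the iteration converges quadratically, which is why N-R suits on-line re-estimation when obstacles perturb the trajectory.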

The Tip Position Measurement of a Flexible Robot Arm Using a Vision Sensor (비전 센서를 이용한 유연한 로봇팔의 끝점 위치 측정)

  • Shin, Hyo-Pil;Lee, Jong-Kwang;Kang, E-Sok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.6 no.8
    • /
    • pp.682-688
    • /
    • 2000
  • To improve the performance of a flexible robot arm, one of the important tasks is measuring the vibration displacement of the flexible arm. Many types of sensors have been used to measure it. The most popular has been strain gauges, which measure the deflection of the beam. Photo sensors have also been used for detecting beam displacement, and accelerometers are often used to measure beam vibration. However, the vibration displacement can only be obtained indirectly from these sensors. In this article, a vision sensor is used as a displacement sensor to measure the vibration displacement of a flexible robot arm. Several schemes are proposed to reduce the image-processing time and increase its accuracy. The experimental results show that the vision sensor can be an alternative sensor for measuring the vibration displacement and has potential for on-line tip-position control of flexible robot systems.

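One common way to turn a camera into the displacement sensor described in the abstract above is to track the centroid of a bright marker on the arm tip. The sketch below is an assumed illustration on a tiny synthetic frame, not the paper's measurement scheme.

```python
# Illustrative sketch: locate a bright tip marker by intensity-weighted
# centroid in a grayscale frame. The 2-D list is a stand-in for a real
# camera image; threshold and values are assumed examples.

def marker_centroid(image, threshold=128):
    """Return the (column, row) centroid of pixels at or above threshold."""
    total = sx = sy = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += c * v
                sy += r * v
    if total == 0:
        return None          # no marker visible in this frame
    return sx / total, sy / total

frame = [[0,   0,   0,   0],
         [0, 200, 200,   0],
         [0, 200, 200,   0],
         [0,   0,   0,   0]]
centroid = marker_centroid(frame)
```

Tracking this centroid frame by frame gives the tip's vibration displacement directly, rather than inferring it from strain or acceleration as the indirect sensors do.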