• Title/Abstract/Keyword: real-time vision

Search results: 848 items (processing time 0.027 sec)

소형 이동 로봇의 실시간 경로계획과 영상정보에 의한 추적제어 (A study on real-time path planning and visual tracking of the micro mobile robot)

  • 김은희;오준호
    • 한국정밀공학회:학술대회논문집
    • /
    • 한국정밀공학회 1997년도 춘계학술대회 논문집
    • /
    • pp.25-29
    • /
    • 1997
  • In this thesis, we construct a microrobot soccer system and perform real-time path planning and visual tracking for each robot. The system consists of the robots, a vision system, and a host computer. Because the robots are free-ranging mobile robots, paths must be generated and followed. The path is planned and controlled by the host computer, i.e., a supervisory control system. For path planning, we suggest a cost function consisting of three terms: the smoothness of the path, the total distance or time, and a term for obstacle avoidance. To minimize the cost function, we choose a parametric cubic spline and update its coefficients in real time. We present simulations of path planning and obstacle avoidance, and real experiments on visual tracking. (A sketch of such a cost function follows this entry.)

  • PDF
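
The path-planning cost described in the abstract above combines a smoothness term, a total distance/time term, and an obstacle term, minimized over the coefficients of a parametric cubic spline. As a rough, hypothetical sketch of such a cost (the discretization, weights, and circular-obstacle penalty below are assumptions, not taken from the paper):

```python
import numpy as np

def spline_point(coeffs, t):
    """Evaluate a parametric cubic x(t), y(t); coeffs has shape (2, 4) for x and y."""
    basis = np.array([1.0, t, t**2, t**3])
    return coeffs @ basis                                    # -> (x, y)

def path_cost(coeffs, obstacles, w_smooth=1.0, w_length=1.0, w_obst=10.0, n=50):
    """Assumed cost: smoothness + path length + penalty for grazing obstacles."""
    ts = np.linspace(0.0, 1.0, n)
    pts = np.array([spline_point(coeffs, t) for t in ts])    # sampled path points
    d1 = np.diff(pts, axis=0)                                # first differences
    d2 = np.diff(pts, 2, axis=0)                             # second differences
    smooth = np.sum(d2**2)                                   # bending-energy proxy
    length = np.sum(np.linalg.norm(d1, axis=1))              # total path length
    obst = 0.0
    for cx, cy, r in obstacles:                              # circular obstacles (x, y, radius)
        dist = np.linalg.norm(pts - np.array([cx, cy]), axis=1)
        obst += np.sum(np.maximum(0.0, r + 0.05 - dist))     # penalty inside a safety margin
    return w_smooth * smooth + w_length * length + w_obst * obst

# Example: x(t) = t, y(t) = 0.5 t^2 - 0.5 t^3, one obstacle at (0.5, 0.1) with radius 0.1
coeffs = np.array([[0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 0.5, -0.5]])
print(path_cost(coeffs, obstacles=[(0.5, 0.1, 0.1)]))
```

In a setup like the one described, the host computer could re-adjust the spline coefficients each control cycle to reduce such a cost as new vision data arrives.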

대두의 자동 선별을 위한 컬러 기계시각장치의 설계 (Design of a Color Machine Vision System for the Automatic Sorting of Soybeans)

  • 김태호;문창수;박수우;정원교;도용태
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2003년도 학술회의 논문집 정보 및 제어부문 A
    • /
    • pp.231-234
    • /
    • 2003
  • This paper describes the structure, operation, image processing, and decision-making techniques of a color machine vision system designed for the automatic sorting of soybeans. The system consists of a feeder, a conveyor belt, a line-scan camera, lights, an ejector, and a PC. Unlike manufactured goods, agricultural products such as soybeans have quite uneven features, and the criteria for sorting good and bad beans also vary depending on the inspector. We tackle these problems by letting the system learn the inspection parameters from good samples selected manually by the machine user before the system is run for sorting. Real-time processing is another important consideration in the design; four parallel DSPs are employed to increase the processing speed. The designed system was tested with real soybeans, and the results were successful. (A sketch of learning color limits from good samples follows this entry.)

  • PDF
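
The abstract mentions that the system learns its inspection parameters from good samples selected by the user before sorting begins. A minimal, hypothetical illustration of that idea (the per-channel mean/standard-deviation limits and the tolerance factor k are assumptions, not the paper's actual decision rule):

```python
import numpy as np

def learn_color_limits(good_pixels, k=3.0):
    """Learn per-channel RGB limits from pixels of manually selected good beans."""
    mean = good_pixels.mean(axis=0)
    std = good_pixels.std(axis=0)
    return mean - k * std, mean + k * std       # lower/upper bound per channel

def is_good_bean(bean_pixels, lower, upper, max_bad_ratio=0.05):
    """Accept a bean if at most a small fraction of its pixels fall outside the limits."""
    outside = np.any((bean_pixels < lower) | (bean_pixels > upper), axis=1)
    return outside.mean() <= max_bad_ratio

# Example with synthetic data: yellowish good samples, one darker candidate bean
good = np.random.normal([200, 180, 90], [10, 10, 8], size=(5000, 3))
lower, upper = learn_color_limits(good)
candidate = np.random.normal([120, 100, 60], [10, 10, 8], size=(400, 3))
print(is_good_bean(candidate, lower, upper))    # likely False (too dark)
```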

산업용 비젼 시스템을 위한 체인코더의 제작 (Development of A Hardware Chain Coder for Industrial Vision Systems)

  • 이병일;신유식;임준홍;서일홍;변증남
    • 대한전기학회논문지
    • /
    • Vol. 38, No. 8
    • /
    • pp.629-639
    • /
    • 1989
  • One of the major issues in industrial vision systems is the lengthy processing time caused by the bulky data involved in image coding. To reduce the processing time, a modified chain coding algorithm is proposed that is more suitable for hardware implementation. A hardware chain coder is developed and used for learning and recognizing objects by extracting several features. It is shown that the designed vision system is much faster than a typical software-based system, so that it may be applicable to real-time industrial operations. (A sketch of boundary chain coding follows this entry.)

  • PDF
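
A chain coder encodes an object boundary as a sequence of direction codes between successive boundary pixels, which is far more compact than storing the pixels themselves. The sketch below shows standard 8-direction (Freeman) chain coding in software; the paper's modified, hardware-oriented algorithm is not reproduced here:

```python
# 8-direction Freeman codes in image coordinates (y grows downward):
# 0=east, 1=up-right, 2=up, 3=up-left, 4=west, 5=down-left, 6=down, 7=down-right
DIRECTIONS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code(boundary):
    """Encode an ordered, 8-connected boundary (list of (x, y) pixels) as Freeman codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:]):
        codes.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return codes

# Example: the closed boundary of a 2x2 square traversed in image coordinates
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(chain_code(square))   # [0, 6, 4, 2]
```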

Identified Contract Net 프로토콜 기반의 유비쿼터스 시각시스템 (A Ubiquitous Vision System based on the Identified Contract Net Protocol)

  • 김치호;유범재;김학배
    • 대한전기학회논문지:시스템및제어부문D
    • /
    • Vol. 54, No. 10
    • /
    • pp.620-629
    • /
    • 2005
  • In this paper, a new protocol-based approach is proposed for the development of a ubiquitous vision system. The approach regards the ubiquitous vision system as a multi-agent system, so that each vision sensor can be treated as an agent (vision agent). Each vision agent independently performs exact segmentation of a target using color and motion information, visual tracking of multiple targets in real time, and location estimation by a simple perspective transform. The matching problem for the identity of a target during handover between vision agents is solved by the Identified Contract Net (ICN) protocol implemented for this approach. The protocol-based approach using the ICN protocol is independent of the number of vision agents and, moreover, does not require calibration or overlapped regions between vision agents. The ICN protocol therefore improves the speed, scalability, and modularity of the system. The approach was successfully applied to our ubiquitous vision system and operated well in several experiments.
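
In a contract-net style handover, one agent announces a task, candidate agents return bids, and the task is awarded to the best bidder. A minimal, hypothetical sketch along those lines (the bid based on how far the predicted target position lies inside each camera's coverage is an assumption, not the ICN protocol itself):

```python
from dataclasses import dataclass

@dataclass
class VisionAgent:
    name: str
    fov: tuple          # (x_min, x_max, y_min, y_max) coverage on the ground plane

    def bid(self, target_pos):
        """Return a score for taking over the target, or None if it is out of view."""
        x, y = target_pos
        x0, x1, y0, y1 = self.fov
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            return None
        # Prefer agents that keep the target far from their coverage boundary.
        return min(x - x0, x1 - x, y - y0, y1 - y)

def handover(manager, agents, target_id, predicted_pos):
    """Manager announces the target; the best-bidding agent is awarded the track."""
    bids = {a.name: a.bid(predicted_pos) for a in agents if a is not manager}
    valid = {n: b for n, b in bids.items() if b is not None}
    if not valid:
        return None                       # no agent can take over
    winner = max(valid, key=valid.get)
    return (target_id, winner)            # the identity travels with the award message

agents = [VisionAgent("cam1", (0, 5, 0, 5)), VisionAgent("cam2", (4, 10, 0, 5))]
print(handover(agents[0], agents, target_id=7, predicted_pos=(6.0, 2.5)))   # cam2 wins
```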

THE DEVELOPMENT OF THE NARROW GAP MULTI-PASS WELDING SYSTEM USING LASER VISION SYSTEM

  • Park, Hee-Chang;Park, Young-Jo;Song, Keun-Ho;Lee, Jae-Woong;Jung, Yung-Hwa;Luc Didier
    • 대한용접접합학회:학술대회논문집
    • /
    • 대한용접접합학회 2002년도 Proceedings of the International Welding/Joining Conference-Korea
    • /
    • pp.706-713
    • /
    • 2002
  • In the multi-pass welding of pressure vessels or ships, a mechanical touch sensor is generally used together with a manipulator to measure the gap and depth of the narrow gap and to perform seam tracking. Unfortunately, such mechanical touch sensors may commit measuring errors caused by deterioration of the measuring device. An automation system for narrow gap multi-pass welding has been developed using a laser vision system which can track the seam line of the narrow gap and control the welding power. The joint profile of the narrow gap, with 250 mm depth and 28 mm width, is captured by a laser vision camera, and the image is processed to define the tracking positions of the torch during welding. The lateral and vertical position of the torch can then be corrected in real time by the laser vision system. Adaptive control of welding conditions such as welding currents and welding speeds can also be performed by the laser vision system, which cannot be done by conventional mechanical touch systems. The developed automation system will be adopted to reduce the idle time of welders, which occurs frequently in conventional long welding processes, and to improve the reliability of the weld quality. (A sketch of laser-profile seam correction follows this entry.)

  • PDF
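
Laser vision seam tracking generally extracts the groove geometry from each laser stripe profile and corrects the torch toward the groove center and measured depth. A much-simplified, hypothetical sketch (the profile format and the depth-threshold groove detection are assumptions, not the developed system's processing):

```python
import numpy as np

def groove_correction(profile_x, profile_z, torch_x, torch_z, depth_thresh=5.0):
    """Find the groove in one laser stripe profile and return torch corrections.

    profile_x, profile_z: lateral position and depth (mm) along the stripe.
    Returns (lateral_correction, vertical_correction) in mm.
    """
    surface = np.median(profile_z)                     # nominal plate surface height
    in_groove = profile_z < surface - depth_thresh     # points clearly below the surface
    if not np.any(in_groove):
        return 0.0, 0.0                                # no groove found; hold position
    groove_center = profile_x[in_groove].mean()        # lateral center of the gap
    groove_bottom = profile_z[in_groove].min()         # deepest point of the gap
    return groove_center - torch_x, groove_bottom - torch_z

# Example: a 28 mm wide groove centered at x = 2 mm, 20 mm deep, torch at (0, -15)
x = np.linspace(-50, 50, 201)
z = np.where(np.abs(x - 2.0) < 14.0, -20.0, 0.0)
print(groove_correction(x, z, torch_x=0.0, torch_z=-15.0))
```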

The Development of the Narrow Gap Multi-Pass Welding System Using Laser Vision System

  • Park, H.C.;Park, Y.J.;Song, K.H.;Lee, J.W.;Jung, Y.H.;Didier, L.
    • International Journal of Korean Welding Society
    • /
    • Vol. 2, No. 1
    • /
    • pp.45-51
    • /
    • 2002
  • In the multi-pass welding of pressure vessels or ships, a mechanical touch sensor is generally used together with a manipulator to measure the gap and depth of the narrow gap and to perform seam tracking. Unfortunately, such mechanical touch sensors may commit measuring errors caused by the deterioration of the measuring device. An automation system for narrow gap multi-pass welding has been developed using a laser vision system which can track the seam line of the narrow gap and control the welding power. The joint profile of the narrow gap, with 250 mm depth and 28 mm width, is captured by a laser vision camera, and the image is processed to define the tracking positions of the torch during welding. The lateral and vertical position of the torch can then be corrected in real time by the laser vision system. Adaptive control of welding conditions such as welding currents and welding speeds can also be performed by the laser vision system, which cannot be done by conventional mechanical touch systems. The developed automation system will be adopted to reduce the idle time of welders, which occurs frequently in conventional long welding processes, and to improve the reliability of the weld quality.

  • PDF

한 이미지 평면에서 다물체 위치의 실시간 화상처리 알고리즘 개발 (Development of Real-Time Image Processing Algorithm on the Positions of Multi-Object in an Image Plane)

  • 장완식;김경석;이성민
    • 비파괴검사학회지
    • /
    • Vol. 22, No. 5
    • /
    • pp.523-531
    • /
    • 2002
  • This study develops a real-time image processing algorithm for multiple objects with processing speed in mind. In recent years, the use of vision systems has been increasing rapidly in areas such as inspection and robot position control. To apply such a vision system, the coordinates of objects in three-dimensional space must be related to the image information obtained by a CCD camera, and for inspection and robot position control tasks the center position of each object in the image plane must be known. In particular, for a rigid body whose shape is represented by several cues, the position of each cue must be determined simultaneously in a single image plane. To address this problem, this paper presents the development of an image processing algorithm for multiple cues (multiple objects) and demonstrates the validity of the developed algorithm.
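
A common way to obtain the positions of several cues in one image plane is to threshold the image, label connected components, and take each component's centroid. The sketch below uses that standard approach purely as an illustration; it is not the algorithm developed in the paper:

```python
import numpy as np
from scipy import ndimage   # standard connected-component labeling

def multi_object_centroids(gray, thresh=128, min_area=20):
    """Return (row, col) centroids of all bright blobs larger than min_area pixels."""
    binary = gray > thresh
    labels, n = ndimage.label(binary)                 # label connected components
    centroids = []
    for i in range(1, n + 1):
        mask = labels == i
        if mask.sum() >= min_area:                    # ignore small noise blobs
            centroids.append(ndimage.center_of_mass(mask))
    return centroids

# Example: two bright square cues on a dark background
img = np.zeros((100, 100), dtype=np.uint8)
img[10:20, 10:20] = 255
img[60:75, 40:55] = 255
print(multi_object_centroids(img))   # approximately [(14.5, 14.5), (67.0, 47.0)]
```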

퍼지 제어기를 이용한 실시간 이동 물체 추적에 관한 연구 (Study on the Real-Time Moving Object Tracking using Fuzzy Controller)

  • 김관형;강성인;이재현
    • 한국정보통신학회논문지
    • /
    • Vol. 10, No. 1
    • /
    • pp.191-196
    • /
    • 2006
  • This paper proposes a method for tracking a moving object using a vision system. To track a moving object continuously, the image of the object must be kept near the center of the frame. A fuzzy controller was therefore implemented to drive a pan/tilt camera module so that the image of the moving object stays near the center of the frame. To apply the system to a mobile robot in the future, an image processing board for the vision system was designed and built, and the system was configured around a StrongARM board so that the camera module can track the object with the fuzzy controller after identifying the color and shape of the target object. Experiments confirmed that the proposed fuzzy controller can be applied to a real-time moving object tracking system.
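
A fuzzy controller for this kind of tracking typically fuzzifies the pixel error between the object center and the image center, applies a small rule base, and defuzzifies to a pan/tilt speed command. The following one-axis sketch uses triangular membership functions and assumed output gains; it is not the paper's actual rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

# Fuzzy sets over normalized pixel error in [-1, 1] and their output speeds (assumed)
ERROR_SETS = {"NL": (-1.5, -1.0, -0.3), "NS": (-0.8, -0.3, 0.0),
              "ZE": (-0.2, 0.0, 0.2), "PS": (0.0, 0.3, 0.8), "PL": (0.3, 1.0, 1.5)}
OUTPUT = {"NL": -1.0, "NS": -0.4, "ZE": 0.0, "PS": 0.4, "PL": 1.0}   # deg/s (assumed)

def fuzzy_axis_speed(error_px, half_width_px):
    """Map one axis of pixel error to a pan (or tilt) speed command."""
    e = np.clip(error_px / half_width_px, -1.0, 1.0)    # fuzzification input
    weights = {name: tri(e, *abc) for name, abc in ERROR_SETS.items()}
    num = sum(w * OUTPUT[name] for name, w in weights.items())
    den = sum(weights.values())
    return num / den if den > 0 else 0.0                # centroid defuzzification

# Example: object center at (400, 300) in a 640x480 image, image center at (320, 240)
pan = fuzzy_axis_speed(400 - 320, 320)    # positive -> pan right (sign convention assumed)
tilt = fuzzy_axis_speed(300 - 240, 240)   # positive -> tilt down (sign convention assumed)
print(pan, tilt)
```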

이동 로봇의 실시간 자세 추정을 위한 센서 시스템의 개발 (Development of a Sensor System for Real-Time Posture Measurement of Mobile Robots)

  • 이상룡;권승만
    • 대한기계학회논문집
    • /
    • Vol. 17, No. 9
    • /
    • pp.2191-2204
    • /
    • 1993
  • A sensor system has been developed to measure the posture (position and orientation) of mobile robots working in industrial environments. The proposed sensor system consists of a CCD camera, retro-reflective landmarks, a strobe unit, and an image processing board. The proposed hardware system can be built at an economical price compared to commercial vision systems. The system can measure the posture of a mobile robot within 60 msec when a 386 personal computer is used as the host computer. The experimental results demonstrated the remarkable performance of the proposed sensor system in the posture measurement of mobile robots: the average position error is less than 3 mm and the average orientation error is less than 1.5 degrees.
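
If the robot carries two retro-reflective landmarks whose world positions are measured by the camera system, its planar posture can be recovered from the landmark pair: the midpoint gives position and the direction between the landmarks gives heading. The two-landmark layout below is an assumption used only to illustrate the geometry, not the paper's actual method:

```python
import math

def posture_from_landmarks(front, rear):
    """Planar posture from two landmark positions measured in world coordinates.

    front, rear: (x, y) of landmarks at the robot's front and rear (assumed layout).
    Returns (x, y, heading_deg): midpoint position and heading from rear to front.
    """
    x = (front[0] + rear[0]) / 2.0
    y = (front[1] + rear[1]) / 2.0
    heading = math.degrees(math.atan2(front[1] - rear[1], front[0] - rear[0]))
    return x, y, heading

# Example: robot centered at (1.0, 2.0), heading about 30 degrees
print(posture_from_landmarks(front=(1.173, 2.100), rear=(0.827, 1.900)))
```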

컬러 정보를 이용한 무인항공기에서 실시간 이동 객체의 카메라 추적 (The Camera Tracking of Real-Time Moving Object on UAV Using the Color Information)

  • 홍승범
    • 한국항공운항학회지
    • /
    • Vol. 18, No. 2
    • /
    • pp.16-22
    • /
    • 2010
  • This paper proposes a real-time moving object tracking system for a UAV using color information. Previous work on object tracking has mostly dealt with recognizing one or more moving objects with a fixed camera, including objects in complex background environments. In contrast, this paper implements a moving object tracking system that uses the pan/tilt function of the camera after extracting the object's region. First, the system detects the moving object using an RGB/HSI color model and obtains the object coordinates in the acquired image with a compact bounding box. Second, the camera origin coordinates are aligned to the top-left coordinates of the object's compact bounding box, and the moving object is tracked using the pan/tilt function of the camera. The system was implemented with LabVIEW 8.6 and NI Vision Builder AI from National Instruments, and it showed good camera tracking performance in a laboratory environment.
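
Color-based detection of this kind usually thresholds pixels in a color space, takes the bounding box of the matching region, and steers the camera from the box position. A minimal, hypothetical sketch using OpenCV rather than the LabVIEW/NI Vision tools of the paper (the HSV range for a red target is an assumption):

```python
import cv2
import numpy as np

def detect_color_bbox(frame_bgr, lower_hsv=(0, 120, 80), upper_hsv=(10, 255, 255)):
    """Return the (x, y, w, h) bounding box of the largest blob in the given HSV range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(largest)           # compact bounding box of the target

def pan_tilt_error(bbox):
    """Pixel error between the bounding-box top-left corner and the image origin."""
    if bbox is None:
        return 0, 0
    x, y, _, _ = bbox
    return x, y    # steer the camera so this corner moves toward the image origin

# Example with a synthetic frame containing one red rectangle
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:180, 200:300] = (0, 0, 255)          # BGR red patch
bbox = detect_color_bbox(frame)
print(bbox, pan_tilt_error(bbox))              # roughly (200, 100, 100, 80)
```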