• Title/Abstract/Keywords: Vision-Guided

가상계측기반 실시간 영상유도 자동비행 시스템 구현 및 무인 로터기를 이용한 비행시험 (Implementation of Virtual Instrumentation based Realtime Vision Guided Autopilot System and Onboard Flight Test using Rotary UAV)

  • 이병진;윤석창;이영재;성상경
    • 제어로봇시스템학회논문지
    • /
    • Vol. 18, No. 9
    • /
    • pp.878-886
    • /
    • 2012
  • This paper investigates the implementation and flight test of a realtime vision-guided autopilot system based on a virtual instrumentation platform. A graphical design process on the virtual instrumentation platform is used throughout for image processing, inter-system communication, vehicle dynamics control, and vision-coupled guidance algorithms. A significant objective of the algorithm is to achieve an environment-robust autopilot despite wind and irregular image acquisition conditions. For robust vision-guided path tracking and hovering performance, the flight path guidance logic is combined, on a multi-conditional basis, with a position estimation algorithm coupled with the vehicle attitude dynamics. An onboard flight test of the developed realtime vision-guided autopilot system was conducted using a rotary UAV with full attitude control capability. The outdoor flight test demonstrated that the designed vision-guided autopilot system succeeded in hovering the UAV over a ground target to within several meters in a generally windy environment.
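
As a rough illustration of the guidance idea in the abstract above (not the authors' implementation), the sketch below maps a vision-derived target offset to hover velocity commands with a simple PD law; the gains, the pixel-to-meter scale, and the interface are assumptions.

```python
import numpy as np

KP, KD = 0.4, 0.1          # assumed proportional/derivative gains
PIXELS_PER_METER = 120.0   # assumed image scale at the hovering altitude

def hover_command(target_px, image_center_px, prev_err, dt):
    """Map the pixel offset of the detected ground target to body-frame
    velocity commands (vx, vy) for hovering over the target."""
    err = (np.asarray(target_px, dtype=float) -
           np.asarray(image_center_px, dtype=float)) / PIXELS_PER_METER
    d_err = (err - prev_err) / dt if dt > 0 else np.zeros(2)
    vx_cmd, vy_cmd = KP * err + KD * d_err   # commanded velocities in m/s
    return (float(vx_cmd), float(vy_cmd)), err
```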

비전 기반 무인반송차의 효과적인 운행을 위한 가상추적륜 기반 유도선 추적 기법 (A Guideline Tracing Technique Based on a Virtual Tracing Wheel for Effective Navigation of Vision-based AGVs)

  • 김민환;변성민
    • 한국멀티미디어학회논문지
    • /
    • Vol. 19, No. 3
    • /
    • pp.539-547
    • /
    • 2016
  • Automated guided vehicles (AGVs) are widely used in industry. Several types of vision-based AGVs have been studied in order to reduce the cost of building infrastructure on the workspace floor and to increase the flexibility of changing the navigation path layout. A practical vision-based guideline tracing method is proposed in this paper. A virtual tracing wheel is introduced and adopted in this method, which enables a vision-based AGV to trace a guideline in diverse ways. The method is also useful for preventing damage to the guideline, since it forces the AGV's real steering wheel to stay off the guideline. The usefulness of the virtual tracing wheel is analyzed through computer simulations. Several navigation tests with a commercial AGV were also performed on a typical guideline layout, and we confirmed that the virtual-tracing-wheel-based tracing method works well in practice.
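
The virtual tracing wheel is only described conceptually in the abstract above; the following speculative sketch shows how a tracing error measured at such a virtual wheel might drive steering, with the offset, gain, and control law chosen purely for illustration.

```python
import math

VIRTUAL_OFFSET = 0.5   # assumed distance of the virtual wheel ahead of the real wheel (m)
K_STEER = 1.2          # assumed gain pulling the virtual wheel back onto the line

def steering_angle(lateral_err, heading_err):
    """lateral_err: guideline offset measured at the virtual tracing wheel (m);
    heading_err: angle between the vehicle heading and the guideline (rad)."""
    # Correct the heading, plus a term that steers the virtual wheel back onto
    # the guideline; a Stanley-style law is used here only for illustration.
    return heading_err + math.atan2(K_STEER * lateral_err, VIRTUAL_OFFSET)
```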

Vision Sensor-Based Driving Algorithm for Indoor Automatic Guided Vehicles

  • Quan, Nguyen Van;Eum, Hyuk-Min;Lee, Jeisung;Hyun, Chang-Ho
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • Vol. 13, No. 2
    • /
    • pp.140-146
    • /
    • 2013
  • In this paper, we describe a vision sensor-based driving algorithm for indoor automatic guided vehicles (AGVs) that facilitates path tracking using two mono cameras for navigation. One camera is mounted on the vehicle to observe the environment and to detect markers in front of the vehicle. The other camera is attached so that its view is perpendicular to the floor, which compensates for the distance between the wheels and the markers. The angle and distance from the center of the two wheels to the center of a marker are also obtained using these two cameras. We propose five movement patterns for AGVs to guarantee smooth performance during path tracking: starting, moving straight, pre-turning, left/right turning, and stopping. This driving algorithm based on two vision sensors gives AGVs greater flexibility, including easy layout changes, autonomy, and economy. The algorithm was validated in an experiment using a two-wheeled mobile robot.
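
The five movement patterns read naturally as a finite state machine; the sketch below is an illustrative transition function under assumed marker semantics, not the authors' code.

```python
from enum import Enum, auto

class Mode(Enum):
    STARTING = auto()
    STRAIGHT = auto()
    PRE_TURNING = auto()
    TURNING = auto()
    STOPPING = auto()

def next_mode(mode, marker):
    """marker: dict with assumed keys 'type' in {'turn', 'stop', None} and
    'distance' (meters to the marker), produced by the front camera."""
    if mode is Mode.STARTING:
        return Mode.STRAIGHT
    if marker.get("type") == "stop":
        return Mode.STOPPING
    if mode is Mode.STRAIGHT and marker.get("type") == "turn":
        return Mode.PRE_TURNING
    if mode is Mode.PRE_TURNING and marker.get("distance", 1.0) < 0.2:
        return Mode.TURNING
    if mode is Mode.TURNING and marker.get("type") is None:
        return Mode.STRAIGHT
    return mode
```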

A real-time vision system for SMT automation

  • Hwang, Shin-Hwan;Kim, Dong-Sik;Yun, Il-Dong;Choi, Jin-Woo;Lee, Sang-Uk;Choi, Jong-Soo
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 1990년도 한국자동제어학술회의논문집(국제학술편); KOEX, Seoul; 26-27 Oct. 1990
    • /
    • pp.923-928
    • /
    • 1990
  • This paper describes the design and implementation of a real-time, high-precision vision system and its application to SMT (surface mounting technology) automation. The vision system employs a 32-bit MC68030 as the main processor and consists of an image acquisition unit, a DSP56001-based vision processor, and several algorithmically dedicated hardware modules. The image acquisition unit provides 512*480*8-bit images for high-precision vision tasks. The DSP vision processor and the hardware modules, such as the histogram extractor and the feature extractor, are designed for real-time execution of vision algorithms. In particular, the multi-processing architecture based on DSP vision processors allows us to employ more sophisticated and flexible vision algorithms in real-time operation. The developed vision system is combined with an Adept robot system to form a complete SMD assembly system. It has been found that the vision-guided SMD assembly system provides satisfactory performance for SMD automation.

Research on Local and Global Infrared Image Pre-Processing Methods for Deep Learning Based Guided Weapon Target Detection

  • Jae-Yong Baek;Dae-Hyeon Park;Hyuk-Jin Shin;Yong-Sang Yoo;Deok-Woong Kim;Du-Hwan Hur;SeungHwan Bae;Jun-Ho Cheon;Seung-Hwan Bae
    • 한국컴퓨터정보학회논문지
    • /
    • Vol. 29, No. 7
    • /
    • pp.41-51
    • /
    • 2024
  • In this paper, we study how to improve the target detection accuracy of guided weapons by using deep learning-based object detection on infrared images. Because the characteristics of infrared images are affected by factors such as time and temperature, it is important to represent the features of target objects consistently across diverse environments when training a model. A simple way to address this problem is to emphasize the features of target objects in infrared images and reduce noise through appropriate pre-processing techniques. However, previous studies have not sufficiently discussed pre-processing techniques for training deep learning models on infrared imagery. Therefore, this paper aims to investigate the impact of image pre-processing techniques on infrared image-based training for target object detection. To this end, we analyze results for pre-processing methods that exploit the global or local information of the image: Min-max normalization, Z-score normalization, histogram equalization, and CLAHE (Contrast Limited Adaptive Histogram Equalization). In addition, to examine how the images transformed by each pre-processing technique affect object detector training, we train a YOLOX target detector on images processed with the various pre-processing methods and analyze the results. Through the experiments and analysis, we find that the pre-processing techniques affect object detector accuracy. In particular, among the pre-processing techniques, the experiment using CLAHE achieved the highest detection accuracy, recording an mAP (mean average precision) of 81.9%.
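
The four pre-processing methods named in the abstract are standard operations; a minimal illustration with OpenCV/NumPy on an 8-bit single-channel infrared frame is given below, with parameter values (e.g., the CLAHE clip limit and tile size) assumed rather than taken from the paper.

```python
import cv2
import numpy as np

def min_max_norm(img):
    # Global: stretch intensities to the full 0-255 range.
    img = img.astype(np.float32)
    return ((img - img.min()) / (img.max() - img.min() + 1e-6) * 255).astype(np.uint8)

def z_score_norm(img):
    # Global: standardize, then rescale to 0-255 for the detector input.
    z = (img.astype(np.float32) - img.mean()) / (img.std() + 1e-6)
    return cv2.normalize(z, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def hist_eq(img):
    # Global histogram equalization.
    return cv2.equalizeHist(img)

def clahe(img, clip=2.0, tiles=(8, 8)):
    # Local: contrast-limited adaptive histogram equalization.
    return cv2.createCLAHE(clipLimit=clip, tileGridSize=tiles).apply(img)
```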

가상 포토센서 배열을 탑재한 항만 자동화 자율 주행 차량 (The Vision-based Autonomous Guided Vehicle Using a Virtual Photo-Sensor Array (VPSA) for a Port Automation)

  • 김수용;박영수;김상우
    • 제어로봇시스템학회논문지
    • /
    • Vol. 16, No. 2
    • /
    • pp.164-171
    • /
    • 2010
  • We have studied port automation systems, which are demanded by the steep increase in the cost and complexity of freight handling. This paper introduces a new algorithm for navigating and controlling an autonomous guided vehicle (AGV). A camera inherently suffers from optical distortion and is sensitive to external light, weather, and shadows, but it is inexpensive and flexible enough for building a port automation system. We therefore apply a CCD camera to the AGV for detecting and tracking the lane. To make the tracking error stable and exact, this paper proposes a new concept and algorithm in which the error is generated by a Virtual Photo-Sensor Array (VPSA). VPSAs are implemented purely in software and are easy to use in various autonomous systems. Because the computational load is light, the AGV exploits the full performance of the CCD camera and leaves the CPU free for multitasking. We experimented with the proposed algorithm on a mobile robot and confirmed stable and exact lane-tracking performance.
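
A minimal sketch of the VPSA idea as described in the abstract, assuming a binarized frame and an arbitrary sensor count and row position; it only illustrates how a row of virtual photo-sensors can yield a lane-tracking error.

```python
import numpy as np

N_SENSORS = 15      # assumed number of virtual photo-sensors in the array
SENSOR_ROW = 400    # assumed image row at which the array is placed

def vpsa_error(binary_img):
    """binary_img: 0/255 image with the lane marked white; returns the signed
    offset of the lane from the array center, in sensor units, or None."""
    h, w = binary_img.shape
    row = min(SENSOR_ROW, h - 1)
    xs = np.linspace(0, w - 1, N_SENSORS).astype(int)
    hits = binary_img[row, xs] > 0                 # sensors that "see" the lane
    if not hits.any():
        return None                                # lane lost
    positions = np.arange(N_SENSORS) - (N_SENSORS - 1) / 2.0
    return float(positions[hits].mean())
```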

4세대 LCD Cassette 자동 반송 이동로봇 (A Clean Mobile Robot for 4th Generation LCD Cassette transfer)

  • 김진기;성학경;김성권
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 2000년도 제15차 학술회의논문집
    • /
    • pp.249-249
    • /
    • 2000
  • This paper introduces a clean mobile robot for 4th-generation LCD cassettes, which is guided by optical sensors with position compensation using a vision module. The mobile robot for LCD cassette transfer is controlled by an AGV controller with powerful algorithms, which provides optimal routes to the robot's destination by using a dynamic dispatch algorithm and map data. The clean mobile robot is equipped with a 4-axis fork-type manipulator providing a repeatability of ±0.05 mm.

야간 조명 아래 스테레오 비전의 반사 제거 (Reflection Removal in Stereo Vision Under Night Illumination)

  • 사이라 나비드;이상웅
    • 한국멀티미디어학회:학술대회논문집
    • /
    • 한국멀티미디어학회 2012년도 춘계학술발표대회논문집
    • /
    • pp.26-27
    • /
    • 2012
  • Reflection is considered view-disturbing noise in optical systems such as the stereo cameras of autonomous vehicles, especially at night. Reflections are caused by street lights or by rainwater under adverse weather conditions. The camera then captures a blurred image, which misguides the vehicle when it tries to detect its track. A vehicle guidance approach based on stereo vision can be the same in the daytime and at night; however, the same image analysis cannot be used because of the diverse illumination conditions. We develop a technique and demonstrate its efficacy with illustrations of reflection removal from the camera lens and of vehicle tracking control.

물체 잡기를 위한 비전 기반의 로봇 매니퓰레이터 (Vision-Based Robot Manipulator for Grasping Objects)

  • 백영민;안호석;최진영
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2007년도 심포지엄 논문집 정보 및 제어부문
    • /
    • pp.331-333
    • /
    • 2007
  • The robot manipulator is one of the important components in the service robot area. Until now, there has been a lot of research on robot manipulators that can imitate the functions of a human being by recognizing and grasping objects. In this paper, we present a robot arm based on an object-recognition vision system. We implemented closed-loop control that uses feedback from visual information, and used a sonar sensor to improve accuracy. We placed a web camera on top of the hand to recognize objects. We also discuss some vision-based manipulation issues and the features of our system.

Vision을 이용한 자율주행 로봇의 라인 인식 및 주행방향 결정에 관한 연구 (A Study of Line Recognition and Driving Direction Control On Vision based AGV)

  • 김영숙;김태완;이창구
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2002년도 하계학술대회 논문집 D
    • /
    • pp.2341-2343
    • /
    • 2002
  • This paper describes vision-based line recognition and driving-direction control for an AGV (autonomous guided vehicle). A black stripe attached to the corridor floor is used as the navigation guide. A binary image of the guide stripe captured by a CCD camera is used. To detect the guideline quickly and exactly, we use a variable thresholding algorithm. This low-cost line-tracking system runs efficiently on PC-based real-time vision processing. Steering control is studied with a controller driven by the guideline angle error. The method is tested on a typical AGV with a single camera in a laboratory environment.
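
Variable (adaptive) thresholding and the guideline angle error can be illustrated with standard OpenCV calls; the sketch below uses assumed block-size and offset parameters and a simple line fit, and is not the paper's implementation.

```python
import cv2
import numpy as np

def guideline_angle(gray):
    """gray: grayscale corridor image with a dark guide stripe on a brighter floor.
    Returns the stripe's deviation from the image's vertical axis in degrees."""
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, blockSize=51, C=10)
    ys, xs = np.nonzero(binary)
    if len(xs) < 100:
        return None                                # guideline not found
    vx, vy, _, _ = cv2.fitLine(np.column_stack([xs, ys]).astype(np.float32),
                               cv2.DIST_L2, 0, 0.01, 0.01)
    return float(np.degrees(np.arctan2(vx[0], vy[0])))
```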
