• Title/Abstract/Keyword: Vision-aided Navigation

Search results: 10

Development of a Test Environment for Performance Evaluation of the Vision-aided Navigation System for VTOL UAVs

  • 박세빈;신현철;정철주
    • Journal of Advanced Navigation Technology / Vol. 27, No. 6 / pp.788-797 / 2023
  • This paper introduces the development of a test environment for evaluating a vision-aided navigation system intended as an alternative navigation solution for vertical take-off and landing (VTOL) unmanned aerial systems when GPS (global positioning system) is unavailable. Although a virtual environment is an efficient way to test and evaluate the vision-aided navigation system under development, no suitable equipment currently exists in Korea. The proposed test environment therefore models and simulates the operational environment of the equipment under test, generates its input signals, and evaluates its performance by observing its output signals. The development process is described comprehensively, from requirements generation and overall design through the hardware and software design and fabrication of each component. The resulting test environment was used for performance evaluation of the vision-aided navigation algorithm under development and for simulation-based pre-flight testing.

SLAM Aided GPS/INS/Vision Navigation System for Helicopter

  • 김재형;유준;곽휘권
    • Journal of Institute of Control, Robotics and Systems / Vol. 14, No. 8 / pp.745-751 / 2008
  • This paper presents a framework for a GPS/INS/Vision-based navigation system for helicopters. A coupled GPS/INS algorithm is vulnerable to GPS blockage and jamming, and a helicopter is a fast, highly dynamic vehicle prone to losing the GPS signal. A vision sensor, by contrast, is unaffected by signal jamming, and its navigation error does not accumulate. We therefore implemented a GPS/INS/Vision-aided navigation system that provides robust localization for helicopters operating in various environments. The core algorithm is vision-based simultaneous localization and mapping (SLAM). To verify the SLAM algorithm, we performed flight tests, which confirmed that the developed system remains robust under GPS blockage. The system design, software algorithm, and flight test results are described.

Visual Target Tracking and Relative Navigation for Unmanned Aerial Vehicles in a GPS-Denied Environment

  • Kim, Youngjoo;Jung, Wooyoung;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences / Vol. 15, No. 3 / pp.258-266 / 2014
  • We present a system for the real-time visual relative navigation of a fixed-wing unmanned aerial vehicle in a GPS-denied environment. An extended Kalman filter is used to construct a vision-aided navigation system by fusing the image processing results with barometer and inertial sensor measurements. Using a mean-shift object tracking algorithm, an onboard vision system provides pixel measurements to the navigation filter. The filter is slightly modified to deal with delayed measurements from the vision system. The image processing algorithm and the navigation filter are verified by flight tests. The results show that the proposed aerial system is able to maintain circling around a target without using GPS data.
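
The delayed-measurement handling mentioned above can be sketched in a generic form. One common approach (an assumption here; the abstract does not specify the paper's exact modification) is to keep a short history of filter states, roll the filter back to the timestamp of the late vision measurement, apply the update there, and re-propagate to the current time.

```python
import numpy as np

# Sketch of one common way to handle a delayed measurement in a Kalman
# filter: keep a short state history, roll back to the measurement's
# timestamp, update there, then re-propagate to the current time.
# The 1-D constant-velocity model and noise values are illustrative
# assumptions, not the filter from the paper.

F = np.array([[1.0, 0.1],
              [0.0, 1.0]])              # state transition, dt = 0.1 s
H = np.array([[1.0, 0.0]])              # position-only measurement
Q = np.eye(2) * 1e-4                    # process noise
R = np.array([[0.01]])                  # measurement noise

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(2) - K @ H) @ P

# Propagate 5 steps while logging states, then receive a position
# measurement that actually belongs 2 steps in the past.
x, P = np.array([0.0, 1.0]), np.eye(2)
history = []
for _ in range(5):
    history.append((x.copy(), P.copy()))
    x, P = predict(x, P)

x_d, P_d = history[3]                   # filter state at the delayed stamp
x_d, P_d = update(x_d, P_d, np.array([0.31]))
for _ in range(2):                      # re-propagate up to current time
    x_d, P_d = predict(x_d, P_d)
x, P = x_d, P_d
print(x)                                # corrected state, position near 0.51
```

A real filter would also replay any intermediate measurements during re-propagation; this sketch assumes none arrived in the delay window.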

Loosely-Coupled Vision/INS Integrated Navigation System

  • Kim, Youngsun;Hwang, Dong-Hwan
    • Journal of Positioning, Navigation, and Timing / Vol. 6, No. 2 / pp.59-70 / 2017
  • Since GPS signals are vulnerable to interference and obstruction, many alternative aiding sensors have been proposed for integration with an inertial navigation system. Among these, vision aiding has become attractive for its advantages in weight, cost, and power consumption. This paper proposes a loosely-coupled vision/INS integrated navigation method that can operate in GPS-denied environments. The proposed method improves navigation accuracy by correcting INS navigation and sensor errors using the position and attitude outputs of a landmark-based vision navigation system. It also has the advantage of providing a redundant navigation output independent of the INS output. Computer simulations and van tests were carried out to show the validity of the proposed method. The results show that the proposed method works well and provides reliable navigation outputs with improved performance.
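
A minimal sketch of the loosely-coupled structure described above: the measurement fed to the error filter is the difference between the INS-indicated position and the vision system's landmark-based position output, and the estimated error is fed back to the INS. The fixed gain below stands in for the Kalman gain of a full error-state filter; all numbers are made-up assumptions.

```python
import numpy as np

# Loosely-coupled correction sketch: the residual between INS and vision
# position outputs drives the correction. The fixed gain K is a placeholder
# for a full error-state Kalman gain; values are illustrative only.

def loosely_coupled_correction(p_ins, p_vis, K=0.8):
    """Correct an INS position with a vision position fix."""
    residual = p_ins - p_vis            # measurement for the error filter
    return p_ins - K * residual         # feed estimated error back to INS

p_ins = np.array([100.0, 50.0, -20.0])  # drifted INS position (m, NED)
p_vis = np.array([ 98.5, 50.4, -19.8])  # landmark-based vision fix (m, NED)
print(loosely_coupled_correction(p_ins, p_vis))
```

In the paper's design the same structure also carries attitude differences; only the position channel is sketched here.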

Integrated Navigation Design Using a Gimbaled Vision/LiDAR System with an Approximate Ground Description Model

  • Yun, Sukchang;Lee, Young Jae;Kim, Chang Joo;Sung, Sangkyung
    • International Journal of Aeronautical and Space Sciences / Vol. 14, No. 4 / pp.369-378 / 2013
  • This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation over a general ground surface in GNSS-denied environments. The ground surface overflown is approximated as a piecewise-continuous model with flat and sloped profiles. The system consists of a strapdown IMU and an aiding sensor block comprising a vision sensor and a LiDAR on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and LiDAR range-to-ground measurements are used to overcome the performance limits of a tactical-grade inertial navigation solution without GNSS signals. The filter employs the INS error model, with a measurement vector containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way from two bisectional LiDAR signals, under a practical assumption representing a general ground profile. The overall integrated system is implemented in an extended Kalman filter framework, and its performance is demonstrated in a simulation study with an aircraft flight trajectory scenario.
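
The two-beam slope idea can be illustrated geometrically: ranges taken at two depression angles give two ground hit points, and the ground slope is the inclination of the line through them. The beam angles, the test geometry, and the ordering assumption (the steeper beam hits the nearer point) are illustrative, not the paper's exact formulation.

```python
import math

# Illustration of recovering the ground slope from two LiDAR beams: two
# ranges at different depression angles give two ground hit points; the
# slope is the inclination of the line through them. Assumes beam 2 is
# steeper (theta2 > theta1), so it hits the nearer point.

def ground_slope(r1, theta1, r2, theta2):
    """Slope angle (rad) of the ground line through two beam hit points.
    theta: depression angle below horizontal; requires theta2 > theta1."""
    x1, z1 = r1 * math.cos(theta1), -r1 * math.sin(theta1)
    x2, z2 = r2 * math.cos(theta2), -r2 * math.sin(theta2)
    return math.atan2(z1 - z2, x1 - x2)  # far point minus near point

# Flat ground 10 m below the sensor: both hit points lie at z = -10 m,
# so the recovered slope should be ~0.
t1, t2 = math.radians(30.0), math.radians(45.0)
r1, r2 = 10.0 / math.sin(t1), 10.0 / math.sin(t2)
print(math.degrees(ground_slope(r1, t1, r2, t2)))  # ≈ 0.0
```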

Mobile Robot Destination Generation by Tracking a Remote Controller Using a Vision-aided Inertial Navigation Algorithm

  • Dang, Quoc Khanh;Suh, Young-Soo
    • Journal of Electrical Engineering and Technology / Vol. 8, No. 3 / pp.613-620 / 2013
  • A new remote control algorithm for a mobile robot is proposed, in which the remote controller consists of a camera and inertial sensors. Initially, the relative position and orientation of the robot are estimated by capturing four circular landmarks on a plate mounted on the robot. As the remote controller moves to point at the destination, the camera's pointing trajectory is estimated using an inertial navigation algorithm. The destination is transmitted wirelessly to the robot, which is then controlled to move there. Because the destination is estimated with inertial sensors, quick movements of the remote controller are possible, and unlike vision-only control, the robot may leave the camera's field of view.

A Real-Time NDGPS/INS Navigation System Based on Artificial Vision for Helicopter

  • 김재형;유준;곽휘권
    • Journal of the Korea Institute of Military Science and Technology / Vol. 11, No. 3 / pp.30-39 / 2008
  • An artificial-vision-aided NDGPS/INS system was developed and tested in the dynamic environments of ground and flight vehicles to evaluate overall system performance. The results show significant advantages in position accuracy and situational awareness. The accuracy of the NDGPS/INS integration meets CAT-I precision approach and landing requirements. We also confirm that the proposed system is effective in improving flight safety through artificial vision. The system design, software algorithm, and flight test results are presented in detail.

Observability Analysis of a Vision-INS Integrated Navigation System Using Landmark

  • 원대희;천세범;성상경;조진수;이영재
    • Journal of the Korean Society for Aeronautical and Space Sciences / Vol. 38, No. 3 / pp.236-242 / 2010
  • A navigation system integrating GNSS and INS cannot provide navigation information when no satellites are available. A navigation system incorporating a vision sensor is a common alternative, but because navigation is generally performed using feature points alone, it suffers from insufficient observability. Adding landmarks whose positions are known in advance can improve observability compared with using feature points only. This paper analyzes the degree of observability improvement when additional landmarks are used, via TOM/SOM (total/stripped observability matrix) analysis and eigenvalue analysis. Simulation results show that with feature points alone the system is always observability-deficient, whereas with landmarks it becomes fully observable after the second measurement update. Using landmarks therefore improves observability and, in turn, overall system performance.
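
The observability comparison can be mimicked with a toy linear system by stacking the observability matrix O = [H; HF; ...; HF^(n-1)] and checking its rank. The 2-state model below is an analogy chosen for illustration only (the paper applies TOM/SOM and eigenvalue analysis to the full INS error model): a derivative-type measurement leaves absolute position unobservable, while a landmark-like absolute position fix restores full rank.

```python
import numpy as np

# Toy observability check: build O = [H; HF; ...; HF^(n-1)] and compute
# its rank. The 2-state position-velocity model and the two measurement
# models are illustrative assumptions, not the paper's INS error model.

def obs_rank(F, H):
    n = F.shape[0]
    O = np.vstack([H @ np.linalg.matrix_power(F, k) for k in range(n)])
    return np.linalg.matrix_rank(O)

F = np.array([[1.0, 0.1],
              [0.0, 1.0]])            # position-velocity dynamics
H_vel = np.array([[0.0, 1.0]])        # velocity-only measurement
H_pos = np.array([[1.0, 0.0]])        # absolute position fix (landmark-like)

print(obs_rank(F, H_vel), obs_rank(F, H_pos))  # rank-deficient vs full rank
```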

Multiple Templates and Weighted Correlation Coefficient-based Object Detection and Tracking for Underwater Robots

  • 김동훈;이동화;명현;최현택
    • The Journal of Korea Robotics Society / Vol. 7, No. 2 / pp.142-149 / 2012
  • A camera suffers poor visibility underwater due to the limited light source and the medium noise of the environment, but many studies, especially in navigation, have proven its usefulness at close range. This paper therefore studies vision-based object detection and tracking techniques using artificial objects for underwater robots. Template matching and mean-shift algorithms are employed for object detection and tracking. We also propose an adaptive-threshold-based, color-region-aided weighted correlation coefficient to enhance detection performance under various illumination conditions. Color information is incorporated into the template-matched area, and the template's features are used to compute correlation coefficients robustly. Objects are recognized using a multiple-template matching approach. Water-basin experiments with the underwater robot platform yShark, built by KORDI, demonstrate the performance of the proposed techniques.
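
A weighted correlation coefficient of the general kind described can be sketched as a pixel-weighted normalized cross-correlation. The uniform weight mask and the function shape below are assumptions, not the paper's exact coefficient; in its spirit, the weights would emphasize e.g. the color-matched region of the template.

```python
import numpy as np

# Pixel-weighted normalized correlation coefficient for template matching.
# The uniform weight mask is a placeholder assumption; the paper's scheme
# derives weights from adaptive thresholding and color regions.

def weighted_ncc(patch, template, w):
    """Weighted correlation coefficient in [-1, 1]."""
    w = w / w.sum()
    dp = patch - (w * patch).sum()      # weighted de-meaning
    dt = template - (w * template).sum()
    num = (w * dp * dt).sum()
    den = np.sqrt((w * dp**2).sum() * (w * dt**2).sum())
    return num / den if den > 0 else 0.0

rng = np.random.default_rng(0)
template = rng.random((8, 8))
weights = np.ones((8, 8))               # placeholder: uniform weights
print(weighted_ncc(template, template, weights))  # identical patch → ~1.0
```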

Development of Real-Time Vision Aided Navigation Using EO/IR Image Information of Tactical Unmanned Aerial System in GPS Denied Environment

  • 최승기;조신제;강승모;이길태;이원근;정길순
    • Journal of the Korean Society for Aeronautical and Space Sciences / Vol. 48, No. 6 / pp.401-410 / 2020
  • This paper describes a real-time image-based aircraft position correction system developed to mitigate the vulnerability of a tactical UAV's position information under GPS signal interference and jamming/spoofing attacks. When GPS is lost, the tactical UAV's navigation unit switches from GPS/INS integrated navigation to DR/AHRS mode and can continue automatic flight; however, because position is maintained by dead reckoning (DR) using airspeed and heading, errors accumulate over time, making it difficult to locate the aircraft and to auto-track the data-link antenna. To minimize this accumulated position error, we developed a system that computes the aircraft position from the aircraft attitude, the image sensor's azimuth/elevation angles, and digital terrain elevation data (DTED), using reference correction points in specific areas observed by the image sensor, and corrects the navigation unit in real time. The function and performance of the system were verified through ground tests with a GPS simulator and flight tests in dead-reckoning mode.
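
The geometry of such a position fix can be sketched under a flat-earth assumption: knowing a reference point's coordinates and DTED elevation, and the sensor line-of-sight azimuth and depression angle (from aircraft attitude and gimbal angles), the height above the point gives the ground range, from which the aircraft position is back-solved. All coordinates, angles, and altitudes below are made-up values.

```python
import math

# Simplified flat-earth sketch of the position fix: the EO/IR sensor sees
# a reference point with known coordinates and DTED elevation; the height
# above the point and the line-of-sight angles recover the horizontal
# offset back to the aircraft. Values are illustrative assumptions.

def aircraft_position(ref_north, ref_east, ref_elev_dted,
                      aircraft_alt, az, depression):
    """Aircraft (north, east) from a sighted reference point (m, rad)."""
    h = aircraft_alt - ref_elev_dted         # height above reference point
    ground_range = h / math.tan(depression)  # horizontal distance to point
    # the aircraft lies opposite the line-of-sight direction
    return (ref_north - ground_range * math.cos(az),
            ref_east - ground_range * math.sin(az))

# Reference point 500 m below the aircraft, sighted due north, 45 deg down:
print(aircraft_position(1000.0, 2000.0, 100.0, 600.0,
                        0.0, math.radians(45.0)))   # ≈ (500.0, 2000.0)
```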