• Title/Summary/Keyword: Vision Navigation System


Autonomous Navigation of KUVE (KIST Unmanned Vehicle Electric) (KUVE (KIST 무인 주행 전기 자동차)의 자율 주행)

  • Chun, Chang-Mook;Suh, Seung-Beum;Lee, Sang-Hoon;Roh, Chi-Won;Kang, Sung-Chul;Kang, Yeon-Sik
    • Journal of Institute of Control, Robotics and Systems, v.16 no.7, pp.617-624, 2010
  • This article describes the system architecture of KUVE (KIST Unmanned Vehicle Electric) and its unmanned autonomous navigation on the KIST campus. KUVE, an electric light-duty vehicle, is equipped with two laser range finders, a vision camera, a differential GPS (DGPS) system, an inertial measurement unit (IMU), odometers, and control computers for autonomous navigation. KUVE estimates and tracks road boundaries such as curbs and lane lines using a laser range finder and a vision camera. When no road boundary is detectable, it follows a predetermined trajectory using the DGPS, IMU, and odometers. KUVE achieves a success rate of over 80% for autonomous navigation at KIST.
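As a rough illustration of the curb-tracking idea in the abstract above, a curb can be found in a single laser-range-finder scan as a step in the road-surface height profile. This is a minimal sketch under assumed units and thresholds, not the paper's actual method:

```python
import numpy as np

def find_curb(lateral, height, step=0.10):
    """Return the lateral position (m) of the first height jump >= step,
    interpreted as a curb edge, or None if the profile is flat.

    lateral: sorted lateral offsets (m) of scan points;
    height: road-surface heights (m) at those offsets.
    """
    dh = np.abs(np.diff(height))
    idx = np.argmax(dh >= step)   # first index where the jump exceeds `step`
    if dh[idx] < step:            # argmax of all-False is 0, so re-check
        return None
    return float(lateral[idx + 1])
```

A vision lane-line detector would supply an analogous lateral boundary when no curb is present.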

RESEARCH ON AUTONOMOUS LAND VEHICLE FOR AGRICULTURE

  • Matsuo, Yosuke;Yukumoto, Isamu
    • Proceedings of the Korean Society for Agricultural Machinery Conference, 1993.10a, pp.810-819, 1993
  • An autonomous land vehicle for agriculture (ALVA-II) was developed. A prototype vehicle was made by modifying a commercial tractor. A navigation sensor system with a geomagnetic sensor performed the autonomous operations of ALVA-II, such as rotary tilling with headland turnings. A navigation sensor system with a machine vision system was also investigated to control ALVA-II following a work boundary.


Design of Multisensor Navigation System for Autonomous Precision Approach and Landing

  • Soon, Ben K.H.;Scheding, Steve;Lee, Hyung-Keun;Lee, Hung-Kyu
    • Proceedings of the Korean Institute of Navigation and Port Research Conference, v.1, pp.377-382, 2006
  • Autonomous precision approach and landing of aircraft in a remote landing zone presents several challenges. First, the exact location, orientation, and elevation of the landing zone are not always known; second, the accuracy of the navigation solution is not always sufficient for this type of precision maneuver when no DGPS is available in close proximity. This paper explores an alternative approach for estimating the navigation parameters of the aircraft relative to the landing area using only time-differenced GPS carrier phase measurements and range measurements from a vision system. Distinct ground landmarks are marked ahead of the landing zone. The positions of these landmarks are extracted from the vision system, and the ranges to these locations are then used as measurements for an extended Kalman filter (EKF) in addition to the precise time-differenced GPS carrier phase measurements. The performance of this navigation algorithm is demonstrated using simulation.
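To make the EKF measurement step in the abstract above concrete, here is a minimal sketch of an EKF position update from range measurements to known landmark positions. It is a generic range-update illustration (2D state, illustrative noise values), not the paper's filter, and the GPS carrier-phase measurements are omitted:

```python
import numpy as np

def ekf_range_update(x, P, landmarks, ranges, r_var=0.25):
    """Sequentially apply EKF updates for ranges to known 2D landmarks.

    x: position estimate [px, py]; P: 2x2 covariance;
    landmarks: (N, 2) known positions; ranges: (N,) measured distances;
    r_var: range measurement noise variance (illustrative).
    """
    for lm, z in zip(landmarks, ranges):
        d = x - lm
        pred = np.linalg.norm(d)          # predicted range h(x)
        H = (d / pred).reshape(1, 2)      # Jacobian of h with respect to x
        S = H @ P @ H.T + r_var           # innovation covariance
        K = P @ H.T / S                   # Kalman gain, shape (2, 1)
        x = x + (K * (z - pred)).ravel()  # state correction
        P = (np.eye(2) - K @ H) @ P       # covariance update
    return x, P
```

With three landmarks in view, a poor prior estimate is pulled close to the true position in a single pass, and the covariance shrinks accordingly.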


A Vision Based Guideline Interpretation Technique for AGV Navigation (AGV 운행을 위한 비전기반 유도선 해석 기술)

  • Byun, Sungmin;Kim, Minhwan
    • Journal of Korea Multimedia Society, v.15 no.11, pp.1319-1329, 2012
  • AGVs are increasingly utilized nowadays, and magnetically guided AGVs are the most widely used because of their low cost and high speed. However, this type of AGV requires a high infrastructure-building cost and offers poor flexibility when the navigation path layout changes, so it is hard to apply to small-quantity batch production systems or to cooperative production systems with many AGVs. In this paper, we propose a vision-based guideline interpretation technique that uses cheap, easily installed and changed color tapes (or paint) as a guideline, making a vision-based AGV effectively applicable to such production systems. For easy setup and modification of AGV navigation paths, we suggest an automatic method for interpreting a complex guideline layout including multiple branches and joins of branches. We also suggest a trace-direction decision method for stable navigation of AGVs. Through several real-time navigation tests with an industrial AGV equipped with the suggested technique, we confirmed that the technique is practically and stably applicable in real industrial fields.
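The core of any color-tape guideline follower is segmenting tape-colored pixels and turning them into a steering direction. A minimal sketch, assuming a simple per-channel RGB tolerance and a straight-line fit (the paper's actual interpretation of branches and joins is far richer):

```python
import numpy as np

def guideline_direction(img, target=(255, 200, 0), tol=40):
    """Estimate guideline heading from per-row centroids of tape-colored pixels.

    img: (H, W, 3) uint8 image; target: nominal tape RGB color (illustrative);
    tol: per-channel tolerance. Returns the line slope in columns per row.
    """
    # Boolean mask of pixels within `tol` of the tape color on every channel
    mask = np.all(np.abs(img.astype(int) - target) <= tol, axis=2)
    rows = np.nonzero(mask.any(axis=1))[0]
    # Centroid column of the tape in each row where it appears
    cols = np.array([mask[r].nonzero()[0].mean() for r in rows])
    # Least-squares fit col = a*row + b; the slope 'a' gives the trace direction
    a, b = np.polyfit(rows, cols, 1)
    return a
```

Branch handling would require detecting rows with multiple disjoint runs of tape pixels and choosing one run per the layout interpretation.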

Estimation of Precise Relative Position using INS/Vision Sensor Integrated System (INS/비전 센서 통합 시스템을 이용한 정밀 상대 위치 추정)

  • Chun, Se-Bum;Won, Dae-Hee;Kang, Tae-Sam;Sung, Sang-Kyung;Lee, Eun-Sung;Cho, Jin-Soo;Lee, Young-Jae
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.36 no.9, pp.891-897, 2008
  • GPS can provide precise relative navigation information, but it needs a reference station in close range and is affected by the satellite observation environment. In this paper, we propose an INS and vision sensor integrated system with a known landmark geometry, which is intended to overcome the problems of a GPS-only system. Using the proposed method, relative navigation is available without a GPS reference station; the only requirement is a landmark image drawn on the ground. We conduct a simple simulation to check the performance of this method and confirm that it improves the relative navigation information.

A Study on Development of Visual Navigation System based on Neural Network Learning

  • Shin, Suk-Young;Lee, Jang-Hee;You, Yang-Jun;Kang, Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems, v.2 no.1, pp.1-8, 2002
  • Visual sensing has been integrated into several navigation systems. This paper shows that the proposed system recognizes difficult indoor roads without any specific marks such as painted guide lines or tape. In this method the robot navigates with visual sensors, using visual information to guide itself along the road. A neural network system was used to learn the driving pattern and decide where to move. We present a vision-based process for an AMR (Autonomous Mobile Robot) that is able to navigate on indoor roads with simple computation. We used a single USB-type web camera instead of an expensive CCD camera to construct a smaller and cheaper navigation system.
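Learning a driving pattern from image features, as described above, can be sketched with a tiny fully connected network. The toy setup below (steer toward the brighter image half, with a two-value brightness feature) is entirely illustrative and is not the paper's data or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: feature = mean brightness of the (left, right) image
# halves; label = 1 for "steer right", 0 for "steer left" (toward brighter).
X = rng.uniform(0, 1, size=(200, 2))
y = (X[:, 1] > X[:, 0]).astype(float)

# One-hidden-layer MLP trained by plain full-batch gradient descent.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    p = sig(h @ W2 + b2).ravel()          # predicted steering probability
    g = (p - y)[:, None] / len(X)         # cross-entropy output error
    gh = (g @ W2.T) * (1 - h ** 2)        # backprop through tanh
    W2 -= h.T @ g;  b2 -= g.sum(0)
    W1 -= X.T @ gh; b1 -= gh.sum(0)

acc = ((p > 0.5) == (y > 0.5)).mean()     # training accuracy
```

In a real system the input would be the camera image (or features extracted from it) rather than two hand-picked brightness values.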

Particle Filter Based Feature Points Tracking for Vision Based Navigation System (영상기반항법을 위한 파티클 필터 기반의 특징점 추적 필터 설계)

  • Won, Dae-Hee;Sung, Sang-Kyung;Lee, Young-Jae
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.40 no.1, pp.35-42, 2012
  • In this study, a feature-point tracking algorithm using a particle filter is suggested for a vision-based navigation system. By applying a dynamic model of the feature point, tracking performance is improved in high-dynamic conditions where the conventional KLT (Kanade-Lucas-Tomasi) tracker cannot give a solution. Furthermore, the particle filter is introduced to cope with the irregular characteristics of vision data. Post-processing of recorded vision data shows that the tracking performance of the suggested algorithm is more robust than that of KLT in high-dynamic conditions.
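The combination described above, a particle filter driven by a feature-point dynamic model, can be sketched in one dimension with a constant-velocity model. All noise parameters and the 1D simplification are assumptions for illustration, not the paper's design:

```python
import numpy as np

def particle_track(measurements, n=500, q=0.5, r=2.0, seed=0):
    """Track a 1D feature coordinate with a constant-velocity particle filter.

    measurements: noisy observed positions, one per frame. Each particle
    carries [position, velocity]; q is the process noise std-dev and r the
    measurement noise std-dev. Returns the filtered position per frame.
    """
    rng = np.random.default_rng(seed)
    parts = np.zeros((n, 2))
    parts[:, 0] = measurements[0] + rng.normal(0, r, n)   # init near first meas.
    est = []
    for z in measurements:
        parts[:, 0] += parts[:, 1] + rng.normal(0, q, n)  # propagate positions
        parts[:, 1] += rng.normal(0, q, n)                # random-walk velocity
        w = np.exp(-0.5 * ((z - parts[:, 0]) / r) ** 2)   # Gaussian likelihood
        w /= w.sum()
        est.append(float(w @ parts[:, 0]))                # weighted-mean estimate
        parts = parts[rng.choice(n, n, p=w)]              # multinomial resampling
    return np.array(est)
```

Because the velocity state lets particles anticipate motion between frames, the filter keeps tracking a fast-moving feature where a purely local template search would drift.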

Vehicular Cooperative Navigation Based on H-SPAWN Using GNSS, Vision, and Radar Sensors (GNSS, 비전 및 레이더를 이용한 H-SPAWN 알고리즘 기반 자동차 협력 항법시스템)

  • Ko, Hyunwoo;Kong, Seung-Hyun
    • The Journal of Korean Institute of Communications and Information Sciences, v.40 no.11, pp.2252-2260, 2015
  • In this paper, we propose a vehicular cooperative navigation system using GNSS, a vision sensor, and a radar sensor, all of which are frequently found in mass-produced cars. The proposed system is a variant of the Hybrid Sum-Product Algorithm over Wireless Network (H-SPAWN), in which vision and radar sensors are used instead of radio ranging (i.e., UWB). Performance is compared and analyzed with respect to the sensors; in particular, the position estimation error decreased by about fifty percent when using radar rather than vision or radio ranging. In conclusion, the proposed system with these popular sensors can improve position accuracy over the conventional cooperative navigation system (H-SPAWN) and decrease implementation costs.

A Study on Obstacle Detection for Mobile Robot Navigation (이동형 로보트 주행을 위한 장애물 검출에 관한 연구)

  • Yun, Ji-Ho;Woo, Dong-Min
    • Proceedings of the KIEE Conference, 1995.11a, pp.587-589, 1995
  • The safe navigation of a mobile robot requires recognition of the environment through vision processing. To be guided along a given path, the robot should acquire information about where walls and corridors are located. Unexpected obstacles should also be detected as rapidly as possible for safe obstacle avoidance. In this paper, we assume that the mobile robot navigates on a flat surface. Under this assumption, we simplify the correspondence problem by matching features in the coordinate system of the free navigation surface. The vision processing system adopts edge line segments as features; the line segments extracted from both images are matched on the free navigation surface. According to the matching result, each line segment is labeled with attributes regarding obstacle or free surface, and the 3D shape of the obstacle is interpreted. The proposed vision processing method is verified through various simulations and experiments using real images.
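The flat-surface assumption above is what makes the geometry tractable: any pixel believed to lie on the floor can be back-projected to a unique ground-plane point. A minimal sketch with an assumed pinhole camera (focal length, principal point, height, and pitch are all illustrative):

```python
import numpy as np

def image_to_ground(u, v, f=500.0, cu=320.0, cv=240.0, h=1.0, tilt=0.3):
    """Map pixel (u, v) to ground-plane (forward, lateral) metres, flat floor.

    Camera frame: x right, y down, z forward; the camera sits h metres above
    the ground, pitched down by `tilt` radians; f is focal length in pixels.
    """
    ray = np.array([(u - cu) / f, (v - cv) / f, 1.0])  # viewing ray, camera frame
    c, s = np.cos(tilt), np.sin(tilt)
    R = np.array([[1, 0, 0],
                  [0, c, s],
                  [0, -s, c]])                          # pitch about the x-axis
    d = R @ ray                                         # ray in the world frame
    t = h / d[1]                                        # intersect the plane y = h
    return t * d[2], t * d[0]                           # (forward, lateral)
```

Matched edge segments whose two back-projections disagree cannot lie on the floor, which is one way such a system separates obstacles from free surface.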


Design of Navigation Algorithm for Mobile Robot using Sensor fusion (센서 합성을 이용한 자율이동로봇의 주행 알고리즘 설계)

  • Kim Jung-Hoon;Kim young-Joong;Lim Myo-Teag
    • The Transactions of the Korean Institute of Electrical Engineers D, v.53 no.10, pp.703-713, 2004
  • This paper presents a new obstacle avoidance method based on vision and sonar sensors, together with a navigation algorithm. Sonar sensors alone provide poor information because the angular resolution of each sonar sensor is not exact, so they are not suitable for detecting the relative direction of obstacles. In addition, it is not easy to detect obstacles with vision sensors alone because of image disturbance. This paper therefore introduces a new obstacle direction measurement method that combines sonar sensors, for exact distance information, with vision sensors, for rich information. A modified splitting/merging algorithm is proposed; it is more robust to image disturbance than the edge detecting algorithm and is efficient for grouping obstacles. To verify the proposed algorithm, we compare it with the edge detecting algorithm via experiments. The obstacle direction and relative distance are used as inputs to a fuzzy controller. We design angular velocity controllers for obstacle avoidance and for navigating along the corridor center, respectively. To verify the stability and effectiveness of the proposed method, it is applied to a vision and sonar based mobile robot navigation system.
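A fuzzy controller of the kind mentioned above maps the fused obstacle bearing to an angular velocity through membership functions and a weighted-average defuzzification. The membership breakpoints, rule outputs, and sign convention below are illustrative assumptions, not the paper's tuned rules:

```python
def tri(x, a, b, c):
    """Triangular membership with peak at b and feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_turn(bearing_deg):
    """Angular-velocity command steering away from an obstacle.

    bearing_deg: obstacle bearing in degrees, negative = left of heading.
    Output convention: positive command = turn right.
    """
    mu_left = tri(bearing_deg, -90, -45, 0)    # obstacle on the left
    mu_center = tri(bearing_deg, -20, 0, 20)   # obstacle dead ahead
    mu_right = tri(bearing_deg, 0, 45, 90)     # obstacle on the right
    # Rules: left -> turn right; center -> turn hard right (pick a side);
    # right -> turn left. Defuzzify by the weighted average of rule outputs.
    rules = [(mu_left, 0.5), (mu_center, 1.0), (mu_right, -0.5)]
    num = sum(mu * w for mu, w in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den > 0 else 0.0
```

A second controller with corridor-wall distances as inputs would be combined with this one for the center-of-corridor behavior the abstract describes.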