Title/Summary/Keyword: Vision-based Landing

Vision-based Obstacle State Estimation and Collision Prediction using LSM and CPA for UAV Autonomous Landing (무인항공기의 자동 착륙을 위한 LSM 및 CPA를 활용한 영상 기반 장애물 상태 추정 및 충돌 예측)

  • Seongbong Lee;Cheonman Park;Hyeji Kim;Dongjin Lee
    • Journal of Advanced Navigation Technology / v.25 no.6 / pp.485-492 / 2021
  • Vision-based autonomous precision landing of UAVs requires precise position estimation and landing guidance. For a safe landing, the system must also assess the safety of the landing point with respect to ground obstacles and guide the landing only when safety is ensured. In this paper, we propose vision-based navigation and an algorithm for determining the safety of the landing point for autonomous precision landing. For vision-based navigation, a CNN is used to detect the landing pad, and the detection information is used to derive an integrated navigation solution; a Kalman filter is designed and applied to improve position estimation performance. To determine the safety of the landing point, obstacles are detected and their positions estimated in the same manner, and the obstacle velocity is estimated using the least squares method (LSM). Whether a collision with the obstacle will occur is determined from the closest point of approach (CPA), computed from the estimated obstacle state. Finally, flight tests are performed to verify the proposed algorithm.
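
The LSM/CPA pipeline described above is easy to prototype. The sketch below is not the authors' code: it assumes a constant-velocity obstacle, fits the velocity to timestamped position samples by least squares, and computes the closest point of approach to a stationary landing point at the origin; the 1 m safety radius is illustrative.

    import numpy as np

    def lsm_velocity(times, positions):
        # Fit p(t) = p0 + v*t by least squares (constant-velocity assumption).
        A = np.column_stack([np.ones_like(times), times])
        coeffs, *_ = np.linalg.lstsq(A, positions, rcond=None)
        return coeffs[0], coeffs[1]              # initial position p0, velocity v

    def cpa(rel_pos, rel_vel):
        # Time and miss distance of closest approach for linear relative motion.
        denom = rel_vel @ rel_vel
        t_cpa = 0.0 if denom < 1e-9 else max(0.0, -(rel_pos @ rel_vel) / denom)
        d_cpa = np.linalg.norm(rel_pos + rel_vel * t_cpa)
        return t_cpa, d_cpa

    # Obstacle track (2-D ground positions) vs. a landing point at the origin.
    t = np.array([0.0, 0.5, 1.0, 1.5])
    p = np.array([[4.0, 3.1], [3.5, 2.6], [3.0, 2.1], [2.5, 1.6]])
    p0, v = lsm_velocity(t, p)
    t_cpa, d_cpa = cpa(p0, v)
    print(f"t_CPA = {t_cpa:.2f} s, miss distance = {d_cpa:.2f} m")
    if d_cpa < 1.0:                              # illustrative safety radius
        print("landing point unsafe: predicted conflict")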

Design and Fabrication of Multi-rotor system for Vision based Autonomous Landing (영상 기반 자동 착륙용 멀티로터 시스템 설계 및 개발)

  • Kim, Gyou-Beom;Song, Seung-Hwa;Yoon, Kwang-Joon
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.6 / pp.141-146 / 2012
  • This paper introduces the development of a multi-rotor system and a vision-based autonomous landing system. The multi-rotor platform is modeled as a rigid body using the Newton-Euler formulation, and is simulated and tuned with an LQR control algorithm. The vision-based autonomous landing system uses a single camera mounted on the multi-rotor. An augmented-reality algorithm is used for marker detection, and the autonomous landing code is tested with the GCS for precision landing.
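
For readers unfamiliar with LQR tuning of a hover-linearized multi-rotor, a minimal sketch follows. It is not the authors' model: the double-integrator altitude axis, the weights Q and R, and the use of SciPy's Riccati solver are all assumptions.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Hover-linearized altitude axis: state x = [z, z_dot], input u = thrust accel.
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])   # illustrative state weights
    R = np.array([[0.5]])      # illustrative input weight

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.inv(R) @ B.T @ P        # optimal state feedback: u = -K x
    print("LQR gain K =", K)

    # Closed-loop check: eigenvalues of (A - B K) should have negative real parts.
    print("closed-loop poles:", np.linalg.eigvals(A - B @ K))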

Guidance Law for Vision-Based Automatic Landing of UAV

  • Min, Byoung-Mun;Tahk, Min-Jea;Shim, Hyun-Chul David;Bang, Hyo-Choong
    • International Journal of Aeronautical and Space Sciences / v.8 no.1 / pp.46-53 / 2007
  • In this paper, a guidance law for vision-based automatic landing of unmanned aerial vehicles (UAVs) is proposed. Automatic landing is a challenging but crucial capability for UAVs to achieve fully autonomous flight. In an autonomous landing maneuver, deciding where to land and generating the guidance command that achieves a successful landing are significant problems. This paper focuses on the design of a guidance law applicable to the automatic landing problem of both fixed-wing and rotary-wing UAVs. The proposed guidance law generates an acceleration command as the control input, derived from a specified time-to-go ($t_{go}$) polynomial function whose coefficients are determined to satisfy terminal constraints. Nonlinear simulation results using fixed-wing and rotary-wing UAV models are presented.
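
The t_go-polynomial family the paper draws on can be illustrated with its best-known member: the energy-optimal command a = 6*(p_f - p - v*t_go)/t_go^2 - 2*(v_f - v)/t_go, whose coefficients follow from the terminal position and velocity constraints. The sketch below is a generic instance under that assumption, not the paper's specific coefficients.

    import numpy as np

    def tgo_poly_accel(p, v, p_f, v_f, t_go):
        # Energy-optimal linear-in-time acceleration profile evaluated at "now";
        # the coefficients are fixed by the terminal position/velocity constraints.
        dp = p_f - p - v * t_go          # position to make up beyond coasting
        dv = v_f - v                     # velocity to make up
        return 6.0 * dp / t_go**2 - 2.0 * dv / t_go

    # Closed-loop check on one axis: integrate toward touchdown at p_f = 0, v_f = 0.
    dt, t_go = 0.02, 10.0
    p, v = np.array([50.0]), np.array([-2.0])
    while t_go > dt:
        a = tgo_poly_accel(p, v, np.zeros(1), np.zeros(1), t_go)
        v += a * dt
        p += v * dt
        t_go -= dt
    print(f"terminal position {p[0]:.3f} m, velocity {v[0]:.3f} m/s")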

Vision-based Autonomous Landing System of an Unmanned Aerial Vehicle on a Moving Vehicle (무인 항공기의 이동체 상부로의 영상 기반 자동 착륙 시스템)

  • Jung, Sungwook;Koo, Jungmo;Jung, Kwangyik;Kim, Hyungjin;Myung, Hyun
    • The Journal of Korea Robotics Society / v.11 no.4 / pp.262-269 / 2016
  • The flight of an autonomous unmanned aerial vehicle (UAV) generally consists of four steps: take-off, ascent, descent, and finally landing. Among them, autonomous landing is a challenging task due to high risk and reliability requirements. When the landing site is moving or oscillating, the situation becomes far less predictable, and landing is much more difficult than on a stationary site. For these reasons, accurate and precise control is required of an autonomous landing system for a UAV landing on top of a moving vehicle that rolls or oscillates while moving. In this paper, a vision-only landing algorithm using dynamic gimbal control is proposed. The camera systems applied in previous studies are fixed, facing downward or forward; their main disadvantage is a narrow field of view (FOV). By controlling the gimbal to track the target dynamically, this problem is ameliorated, and the system also helps the UAV follow the target faster than a fixed camera allows. With an artificial tag on the landing pad, the relative position and orientation of the UAV are acquired, and the estimated poses are used for gimbal control and UAV control for a safe and stable landing on a moving vehicle. Outdoor experimental results show that this vision-based algorithm performs well and can be applied to real situations.
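
A minimal version of the dynamic gimbal-tracking idea, re-centering the detected tag to work around the narrow FOV, is a proportional pointing loop on the pixel error. The camera parameters and gain below are assumptions, not the paper's values.

    import numpy as np

    def gimbal_rate_cmd(tag_px, img_size, fov_deg, k_p=1.5):
        # Proportional pan/tilt rate command that re-centers the detected tag.
        # Pixel error is converted to an angular error via the per-pixel FOV.
        err_px = np.asarray(tag_px, float) - np.asarray(img_size, float) / 2.0
        rad_per_px = np.deg2rad(fov_deg) / np.asarray(img_size, float)
        ang_err = err_px * rad_per_px           # [pan, tilt] error in radians
        return -k_p * ang_err                   # rad/s command (k_p is assumed)

    # Tag detected at (500, 210) in a 640x480 image with an assumed 90x60 deg FOV.
    cmd = gimbal_rate_cmd((500, 210), (640, 480), (90.0, 60.0))
    print("pan/tilt rate cmd [rad/s]:", cmd)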

Vision Processing for Precision Autonomous Landing Approach of an Unmanned Helicopter (무인헬기의 정밀 자동착륙 접근을 위한 영상정보 처리)

  • Kim, Deok-Ryeol;Kim, Do-Myoung;Suk, Jin-Young
    • Journal of Institute of Control, Robotics and Systems / v.15 no.1 / pp.54-60 / 2009
  • In this paper, a precision landing approach is implemented based on real-time image processing. A full-scale landmark for automatic landing is used. Canny edge detection is applied to identify the outer quadrilateral, while a circular Hough transform is used to recognize the inner circle. Position information on the ground landmark is uplinked to the unmanned helicopter via the ground control computer in real time so that the unmanned helicopter can steer the air vehicle through an accurate landing approach. A ground test and a couple of flight tests for the autonomous landing approach show that the image processing and automatic landing operation system perform well during the landing approach phase at altitudes from 20 m down to 1 m above ground level.
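
The two-stage landmark detector described here maps directly onto standard OpenCV calls: Canny edges plus a 4-vertex contour for the outer quadrilateral, then a circular Hough transform for the inner circle. A sketch with illustrative thresholds (not the paper's tuning):

    import cv2
    import numpy as np

    def detect_landmark(gray):
        # Stage 1: Canny edges, then the largest 4-vertex contour (quadrilateral).
        edges = cv2.Canny(gray, 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        quad = None
        for c in sorted(contours, key=cv2.contourArea, reverse=True):
            approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
            if len(approx) == 4:
                quad = approx.reshape(4, 2)
                break
        # Stage 2: circular Hough transform for the inner circle.
        blur = cv2.medianBlur(gray, 5)
        circles = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1.2, minDist=50,
                                   param1=150, param2=40, minRadius=10, maxRadius=0)
        circle = None if circles is None else circles[0][0]   # (cx, cy, radius)
        return quad, circle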

Performance Comparison of Depth Map Based Landing Methods for a Quadrotor in Unknown Environment (미지 환경에서의 깊이지도를 이용한 쿼드로터 착륙방식 성능 비교)

  • Choi, Jong-Hyuck;Park, Jongho;Lim, Jaesung
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.50 no.9 / pp.639-646 / 2022
  • Landing-site search algorithms are developed for a quadrotor using a depth map in an unknown environment. The guidance and control system of the unmanned aerial vehicle (UAV) consists of a trajectory planner, a position controller, and an attitude controller. The landing site is selected based on information from the depth map, which is acquired by a stereo vision sensor mounted on a downward-pointing gimbal. Flatness is measured by the maximum depth difference within a predefined depth-map region, and the distance from the UAV is also considered. This study proposes three landing methods and compares their performance using indices such as UAV travel distance, map accuracy, and obstacle response time.
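
The flatness criterion, the maximum depth difference within a candidate window traded off against distance from the UAV, is straightforward to prototype on a depth array. The window size, the 0.15 m flatness threshold, and the synthetic scene below are assumptions, not the paper's parameters.

    import numpy as np

    def best_landing_cell(depth, win=16, max_diff=0.15):
        # Score each win x win window: flat if (max depth - min depth) < max_diff;
        # among flat windows, prefer the one closest to the image center.
        h, w = depth.shape
        cy, cx = h / 2.0, w / 2.0
        best, best_cost = None, np.inf
        for r in range(0, h - win, win):
            for c in range(0, w - win, win):
                patch = depth[r:r + win, c:c + win]
                if patch.max() - patch.min() >= max_diff:
                    continue                      # not flat enough
                dist = np.hypot(r + win / 2 - cy, c + win / 2 - cx)
                if dist < best_cost:
                    best, best_cost = (r + win // 2, c + win // 2), dist
        return best                               # pixel of best site, or None

    rng = np.random.default_rng(1)
    depth = rng.uniform(4.95, 5.05, (120, 160))       # synthetic near-flat scene
    depth[40:80, 60:120] -= np.linspace(0.0, 1.0, 60) # sloped obstacle in center
    print("landing cell (row, col):", best_landing_cell(depth))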

Autonomous Landing on Small Bodies based on Discrete Sliding Mode Control (이산 슬라이딩 모드 제어를 이용한 소천체 자율 착륙 기법)

  • Lee, Juyoung
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.45 no.8 / pp.647-661 / 2017
  • This paper presents a robust method for autonomous landing on small bodies. Autonomous landing is accomplished by generating and following reference position and attitude profiles. The position and attitude tracking controllers are based on discrete sliding mode control, which explicitly treats the discrete and impulsive nature of thruster operation. Vision-based inertial navigation is used for autonomous navigation during landing. Numerical simulation is carried out to evaluate the performance of the proposed method in a realistic situation with environmental uncertainties.
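
As a rough illustration of discrete sliding mode control (not the paper's spacecraft model), the sketch below tracks a setpoint with a 1-D double integrator using Gao's discrete reaching law and a boundary-layer saturation in place of sign() to soften chattering; all gains are assumed.

    import numpy as np

    def dsmc_step(x, x_ref, dt, lam=1.0, eta=0.5, q=5.0, phi=0.05):
        # Sliding surface s = e_dot + lam * e on tracking error e = x - x_ref.
        e, e_dot = x[0] - x_ref[0], x[1] - x_ref[1]
        s = e_dot + lam * e
        # Gao's discrete reaching law: s_{k+1} = (1 - q*dt) s_k - eta*dt*sat(s_k).
        sat = np.clip(s / phi, -1.0, 1.0)        # boundary layer instead of sign()
        s_next = (1.0 - q * dt) * s - eta * dt * sat
        # Control for double-integrator dynamics x_ddot = u (assumed plant).
        return (s_next - s) / dt - lam * e_dot

    # Track a fixed setpoint from x = 1 with x_ddot = u, integrated at 100 Hz.
    x, x_ref, dt = np.array([1.0, 0.0]), np.array([0.0, 0.0]), 0.01
    for _ in range(800):
        u = dsmc_step(x, x_ref, dt)
        x = np.array([x[0] + x[1] * dt, x[1] + u * dt])
    print("final tracking error:", x[0])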

Vision-based Navigation for VTOL Unmanned Aerial Vehicle Landing (수직이착륙 무인항공기 자동 착륙을 위한 영상기반 항법)

  • Lee, Sang-Hoon;Song, Jin-Mo;Bae, Jong-Sue
    • Journal of the Korea Institute of Military Science and Technology / v.18 no.3 / pp.226-233 / 2015
  • Pose estimation is an important operation for many vision tasks. This paper presents a method of estimating the camera pose using a known landmark, for the purpose of landing an autonomous vertical take-off and landing (VTOL) unmanned aerial vehicle (UAV). The proposed method combines extrinsic parameters from known and unknown 3-D (three-dimensional) feature points and an inertial estimate of the camera's 6-DOF (degree-of-freedom) pose into one linear inhomogeneous equation. This allows singular value decomposition (SVD) to be used to solve the resulting optimization problem neatly. Experimental results demonstrate the ability of the proposed method to estimate the camera's 6-DOF pose with ease of implementation.
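
The core numerical step, stacking the constraints into one linear inhomogeneous system A x = b and solving it by SVD, can be outlined as follows. The random 30x6 system stands in for the paper's measurement model, which is not reproduced here.

    import numpy as np

    def solve_via_svd(A, b):
        # Minimum-norm least-squares solution of A x = b via the pseudoinverse,
        # built explicitly from the SVD with small singular values truncated.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        tol = max(A.shape) * np.finfo(float).eps * s[0]
        s_inv = np.where(s > tol, 1.0 / s, 0.0)
        return Vt.T @ (s_inv * (U.T @ b))

    # Stand-in system: 30 stacked constraints on a 6-DOF parameter vector.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(30, 6))
    x_true = rng.normal(size=6)
    b = A @ x_true + 0.01 * rng.normal(size=30)    # lightly noisy measurements
    x_hat = solve_via_svd(A, b)
    print("estimation error:", np.linalg.norm(x_hat - x_true))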

Research of the Delivery Autonomy and Vision-based Landing Algorithm for Last-Mile Service using a UAV (무인기를 이용한 Last-Mile 서비스를 위한 배송 자동화 및 영상기반 착륙 알고리즘 연구)

  • Hanseob Lee;Hoon Jung
    • Journal of Korean Society of Industrial and Systems Engineering / v.46 no.2 / pp.160-167 / 2023
  • This study focuses on the development of a Last-Mile delivery service in which drones deliver goods directly to the end consumer, performing autonomous delivery missions and using an image-based precision landing algorithm to hand the goods off to a robot at an intermediate facility. As the logistics market continues to grow rapidly, parcel volumes increase exponentially each year; however, due to low delivery fees, the workload of delivery personnel is increasing, resulting in a decrease in the quality of delivery services. To address this issue, the research team studied a Last-Mile delivery service using unmanned vehicles and, in this paper, investigated the technologies required for drone-based goods transportation. The flight scenario begins with the drone carrying the goods from a pickup location to the rooftop of the building containing the final delivery destination. A handoff facility sits on the rooftop, and the drone must land accurately on a marker there. The mission is complete once the goods are delivered and the drone returns to its original location. The research team developed a mission-planning algorithm to perform this scenario automatically and an algorithm that recognizes the marker through a camera sensor to achieve a precision landing. The performance of the developed system has been verified through multiple trial operations within ETRI.
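
The recognize-marker-then-land step can be sketched with OpenCV's ArUco module (OpenCV >= 4.7 API). The paper does not state its marker family or camera model, so the dictionary, pinhole intrinsics, and the returned lateral-offset convention below are assumptions.

    import cv2
    import numpy as np

    def marker_landing_offset(frame_gray, marker_size_m, fx, fy, cx, cy):
        # Detect an ArUco marker and return the lateral camera-frame offset (m)
        # that the position controller should null before descending.
        detector = cv2.aruco.ArucoDetector(
            cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
        corners, ids, _ = detector.detectMarkers(frame_gray)
        if ids is None:
            return None                     # no marker: hold position, re-search
        c = corners[0].reshape(4, 2)
        u, v = c.mean(axis=0)               # marker center in pixels
        side_px = np.linalg.norm(c[0] - c[1])
        depth = fx * marker_size_m / side_px        # pinhole range estimate
        return np.array([(u - cx) * depth / fx,     # right (+) offset in meters
                         (v - cy) * depth / fy])    # down (+) offset in meters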

Development of a Hovering Robot System for Calamity Observation

  • Kang, M.S.;Park, S.;Lee, H.G.;Won, D.H.;Kim, T.J.
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2005.06a / pp.580-585 / 2005
  • A QRT (Quad-Rotor Type) hovering robot system is developed for quick detection and observation of circumstances in calamity environments such as indoor fire spots. The UAV (Unmanned Aerial Vehicle) is equipped with four propellers, each driven by an electric motor; an embedded DSP controller; an INS (Inertial Navigation System) using 3-axis rate gyros; a CCD camera with a wireless transmitter for observation; and an ultrasonic range sensor for height control. The developed hovering robot shows stable flight performance through RIC (Robust Internal-loop Compensator) based disturbance compensation and a vision-based localization method. The UAV can also avoid obstacles using eight IR and four ultrasonic range sensors. The VTOL (Vertical Take-Off and Landing) vehicle flies into indoor fire spots and sends the images captured by the CCD camera to the operator. Such small UAVs can be widely used in various calamity-observation applications without endangering humans in harmful environments.
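
The ultrasonic height-control loop mentioned here is, at its simplest, feedback on the measured range. The RIC disturbance compensator is beyond a short sketch, so the code below substitutes a plain PID height hold with assumed gains and update rate.

    class HeightPID:
        # Simple PID height hold on an ultrasonic range measurement (stand-in
        # for the paper's RIC-compensated controller; gains are illustrative).
        def __init__(self, kp=2.0, ki=0.3, kd=0.8, dt=0.02):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_err = 0.0, 0.0

        def update(self, z_ref, z_meas):
            err = z_ref - z_meas
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            # Output is a thrust correction around hover trim (normalized units).
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = HeightPID()
    print(pid.update(z_ref=1.5, z_meas=1.2))   # one control step at 50 Hz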
