• Title/Summary/Keyword: ego-motion


Fire Detection Algorithm for a Quad-rotor using Ego-motion Compensation (Ego-Motion 보정기법을 적용한 쿼드로터의 화재 감지 알고리즘)

  • Lee, Young-Wan;Kim, Jin-Hwang;Oh, Jeong-Ju;Kim, Hakil
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.21 no.1
    • /
    • pp.21-27
    • /
    • 2015
  • Conventional fire detection algorithms have been developed for images captured from a fixed camera, so they are difficult to apply to a flying quad-rotor. To solve this problem, we propose a fire detection algorithm for a quad-rotor that uses ego-motion compensation. The proposed algorithm consists of color detection, motion detection, and fire determination using a randomness test. Color detection and the randomness test are adopted from an existing algorithm with little change, but in the motion detection step, ego-motion compensation is applied to cancel the degree of the quad-rotor's motion using a planar projective transformation based on optical flow, the RANSAC algorithm, and a homography. With ego-motion compensation applied to the motion detection step, the proposed algorithm detected fires 83% of the time in hovering mode.
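
The motion-detection step hinges on estimating the camera's ego-motion from matched feature points while ignoring outliers such as the moving fire region itself. As a minimal sketch of the RANSAC idea (using a simplified translation-only motion model in place of the full planar homography the abstract describes; the point data and thresholds are invented for illustration):

```python
import random

def ransac_translation(matches, thresh=2.0, iters=100, seed=0):
    """Estimate a dominant 2D translation (ego-motion) from point matches
    ((x1, y1), (x2, y2)) with RANSAC, ignoring outlier correspondences
    such as those on independently moving regions."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.choice(matches)
        tx, ty = x2 - x1, y2 - y1          # candidate translation
        inliers = [m for m in matches
                   if abs((m[1][0] - m[0][0]) - tx) < thresh
                   and abs((m[1][1] - m[0][1]) - ty) < thresh]
        if len(inliers) > len(best_inliers):
            best_t, best_inliers = (tx, ty), inliers
    return best_t, best_inliers

# Synthetic example: camera shifted by (5, -3); two outliers from a moving region.
matches = [((x, y), (x + 5, y - 3)) for x in range(10) for y in range(5)]
matches += [((0, 0), (20, 20)), ((1, 1), (30, 5))]
(tx, ty), inliers = ransac_translation(matches)
```

Subtracting the recovered translation from the flow field leaves only the independently moving pixels, which then feed the randomness test.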

Fine-Motion Estimation Using Ego/Exo-Cameras

  • Uhm, Taeyoung;Ryu, Minsoo;Park, Jong-Il
    • ETRI Journal
    • /
    • v.37 no.4
    • /
    • pp.766-771
    • /
    • 2015
  • Robust motion estimation for human-computer interaction plays an important role in novel methods of interacting with electronic devices. Existing pose estimation using a monocular camera employs either ego-motion or exo-motion, neither of which is sufficiently accurate for estimating fine motion due to the motion ambiguity between rotation and translation. This paper presents a hybrid vision-based pose estimation method for fine-motion estimation that is specifically capable of extracting human body motion accurately. The method uses an ego-camera attached to a point of interest and exo-cameras located in the immediate surroundings of the point of interest. The exo-cameras can easily track the exact position of the point of interest by triangulation. Once the position is given, the ego-camera can accurately obtain the point of interest's orientation. In this way, any ambiguity between rotation and translation is eliminated, and the exact motion of a target point (that is, the ego-camera) can be obtained. The proposed method is expected to provide a practical solution for robustly estimating fine motion in a non-contact manner, such as in interactive games designed for special purposes (for example, remote rehabilitation care systems).
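
The division of labor above (exo-cameras fix the position by triangulation, the ego-camera supplies orientation) can be illustrated with the planar case of triangulation: each exo-camera contributes a bearing ray toward the point of interest, and the position is the rays' intersection. A minimal 2D sketch (camera placements and the target are invented for illustration):

```python
import math

def triangulate_2d(cam1, ang1, cam2, ang2):
    """Intersect two bearing rays (origin, heading angle in radians)
    from exo-cameras to recover a target position in the plane."""
    # Ray i: p = cam_i + t_i * (cos ang_i, sin ang_i); solve for t_1 by Cramer's rule.
    d1 = (math.cos(ang1), math.sin(ang1))
    d2 = (math.cos(ang2), math.sin(ang2))
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-9:
        raise ValueError("rays are parallel; target not observable")
    bx, by = cam2[0] - cam1[0], cam2[1] - cam1[1]
    t1 = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (cam1[0] + t1 * d1[0], cam1[1] + t1 * d1[1])

# Two exo-cameras at known positions both observe a target at (2, 3).
target = (2.0, 3.0)
c1, c2 = (0.0, 0.0), (6.0, 0.0)
a1 = math.atan2(target[1] - c1[1], target[0] - c1[0])
a2 = math.atan2(target[1] - c2[1], target[0] - c2[0])
x, y = triangulate_2d(c1, a1, c2, a2)
```

With the position pinned down this way, the ego-camera's view only has to resolve orientation, which removes the rotation/translation ambiguity.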

A Study on Vehicle Ego-motion Estimation by Optimizing a Vehicle Platform (차량 플랫폼에 최적화한 자차량 에고 모션 추정에 관한 연구)

  • Song, Moon-Hyung;Shin, Dong-Ho
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.21 no.9
    • /
    • pp.818-826
    • /
    • 2015
  • This paper presents a novel methodology for estimating vehicle ego-motion, i.e. the tri-axis linear and angular velocities, using a stereo vision sensor and a 2G1Y sensor (longitudinal acceleration, lateral acceleration, and yaw rate). The estimated ego-motion can be used to predict the future ego-path and to improve the accuracy of obstacles' 3D coordinates by compensating for disturbances from vehicle movement, notably for collision avoidance systems. To incorporate vehicle dynamic characteristics into the ego-motion estimation, the state evolution model of the Kalman filter is augmented with lateral vehicle dynamics, and vanishing point estimation is also taken into account, because the optical flow radiates from a vanishing point that may vary with vehicle pitch motion. Experimental results based on real-world data show the effectiveness of the proposed methodology in terms of accuracy.
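
The Kalman-filter structure at the core of such an estimator can be illustrated with a scalar predict/update cycle. A minimal sketch (a 1D state with invented noise variances, not the paper's augmented lateral-dynamics model):

```python
def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle of a scalar Kalman filter.
    x: state estimate (e.g. yaw rate), p: estimate variance,
    z: new measurement, q: process noise, r: measurement noise."""
    # Predict: the model keeps x and inflates the uncertainty.
    p = p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Noisy measurements of a true yaw rate of 1.0 rad/s.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 0.95, 1.05, 1.0]:
    x, p = kalman_step(x, p, z)
```

The paper's filter replaces the scalar model with a state vector that includes the lateral dynamics, but the predict/update rhythm is the same.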

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.1
    • /
    • pp.70-77
    • /
    • 2014
  • This paper proposes a novel localization algorithm based on ego-motion, using Lucas-Kanade optical flow and warped images obtained through fish-eye lenses mounted on the robot. An omnidirectional image sensor is desirable for real-time view-based recognition because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether obtained with a camera viewing a reflective mirror or by stitching multiple camera images, is essential because it is difficult to extract information from the raw image. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through downward-facing fish-eye lenses. Second, we extract motion vectors from the preprocessed images using Lucas-Kanade optical flow. Third, we estimate the robot's position and angle with an ego-motion method that uses the vector directions and a vanishing point obtained by RANSAC. We confirmed the reliability of the proposed localization algorithm by comparing the positions and angles it produced in experiments with those measured by a Global Vision Localization System.
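
The third step relies on recovering a vanishing point from the flow field: each flow vector defines a line, and the vanishing point is the point closest to all of them. A minimal least-squares sketch (synthetic flow vectors; a real pipeline would wrap this in RANSAC as the abstract describes):

```python
import math

def vanishing_point(flows):
    """Least-squares intersection of flow lines.
    flows: list of ((x, y), (dx, dy)) point/direction pairs."""
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (dx, dy) in flows:
        n = math.hypot(dx, dy)
        dx, dy = dx / n, dy / n
        # Projector onto the line's normal space: M = I - d d^T.
        m11, m12, m22 = 1 - dx * dx, -dx * dy, 1 - dy * dy
        a11 += m11; a12 += m12; a22 += m22
        b1 += m11 * x + m12 * y
        b2 += m12 * x + m22 * y
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Synthetic flow: vectors at several image points all aimed at (4, 5).
vp = (4.0, 5.0)
pts = [(0.0, 0.0), (1.0, 3.0), (6.0, 1.0), (2.0, 7.0)]
flows = [((x, y), (vp[0] - x, vp[1] - y)) for x, y in pts]
x, y = vanishing_point(flows)
```

The direction of the recovered point relative to the image center then constrains the robot's heading change between frames.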

Image-Based Ego-Motion Detect of the Unmanned Helicopter using Adaptive weighting (적응형 가중치를 사용한 영상기반 무인 헬리콥터의 Ego-Motion)

  • Chon, Jea-Choon;Chae, Hee-Sung;Shin, Chang-Wan;Kim, Hyong-Suk
    • Proceedings of the KIEE Conference
    • /
    • 1999.07b
    • /
    • pp.653-655
    • /
    • 1999
  • We propose a new ego-motion detection technique that uses adaptive weighting to estimate the motion of an unmanned helicopter from camera images. Because the dynamics of an unmanned helicopter are nonlinear and its severe vibration blurs the images, matching methods that consider only correlation values frequently produce errors. This paper proposes a technique that computes accurate ego-motion by using weights to fuse the position estimate derived from the acceleration, angular acceleration, and control inputs with the correlation value and edge strength. Furthermore, since the degree of image blur varies with the helicopter's acceleration, angular acceleration, and vertical velocity, an adaptive weight-determination algorithm is applied that weights the position estimate more heavily when these values are large and relies more on the correlation value when they are small. Experiments with the proposed adaptive weighting technique on an unmanned helicopter showed that the helicopter's motion could be estimated accurately from the images captured by the camera.
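
The fusion rule described above can be sketched as a convex combination whose weight grows with a blur proxy derived from the inertial signals. A minimal illustration (the weighting function, gain, and positions are invented; the paper's actual rule also folds in edge strength):

```python
def fuse_position(pred, matched, accel_mag, blur_gain=0.5):
    """Blend the inertial position prediction with the image-matching
    result; the more blur (proxied by acceleration magnitude), the
    less the correlation match is trusted."""
    w = min(1.0, blur_gain * accel_mag)   # weight on the inertial prediction
    return tuple(w * p + (1 - w) * m for p, m in zip(pred, matched))

# Low acceleration: trust the image match; high acceleration: the prediction.
calm = fuse_position((10.0, 10.0), (12.0, 8.0), accel_mag=0.2)
shaky = fuse_position((10.0, 10.0), (12.0, 8.0), accel_mag=5.0)
```

Under calm flight the result stays near the correlation match; under heavy vibration it collapses onto the inertial prediction, which is the adaptive behavior the abstract describes.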


Robust Features and Accurate Inliers Detection Framework: Application to Stereo Ego-motion Estimation

  • MIN, Haigen;ZHAO, Xiangmo;XU, Zhigang;ZHANG, Licheng
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.1
    • /
    • pp.302-320
    • /
    • 2017
  • In this paper, an innovative robust feature detection and matching strategy for visual odometry based on stereo image sequences is proposed. First, AKAZE, a sparse multiscale 2D local invariant feature detection and description algorithm, is adopted to extract interest points, and a robust matching strategy is introduced to match the AKAZE descriptors. To remove outliers, which are mismatched features or features on dynamic objects, an improved random sample consensus (RANSAC) outlier rejection scheme is presented, so the proposed method can be applied to dynamic environments. Then, geometric constraints are incorporated into the motion estimation without time-consuming 3D scene reconstruction. Finally, an iterated sigma-point Kalman filter is adopted to refine the motion estimate. The presented ego-motion scheme is applied to benchmark datasets and compared with state-of-the-art approaches on data captured on campus in a considerably cluttered environment, demonstrating its superiority.
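
AKAZE produces binary descriptors, so the matching stage reduces to Hamming distances plus a distinctiveness check to keep only unambiguous matches. A minimal sketch of one such matching strategy (toy 8-bit descriptors and a Lowe-style ratio test; the paper's actual criteria may differ):

```python
def hamming(a, b):
    """Bit-level Hamming distance between two integer descriptors."""
    return bin(a ^ b).count("1")

def ratio_match(desc1, desc2, ratio=0.7):
    """Match each descriptor in desc1 against desc2, keeping a match
    only if the best distance is clearly smaller than the second best,
    which rejects ambiguous (repetitive) features."""
    matches = []
    for i, d in enumerate(desc1):
        dists = sorted((hamming(d, e), j) for j, e in enumerate(desc2))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((i, best[1]))
    return matches

# Descriptor 0b11110000 has one clear match; 0b10101010 is ambiguous.
d1 = [0b11110000, 0b10101010]
d2 = [0b11110001, 0b10101011, 0b10101000]
matches = ratio_match(d1, d2)
```

The surviving matches then go to the RANSAC stage, which removes the remaining geometric outliers.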

The study of the Image Geometric Transforms for Moving Object Detection (움직이는 물체검출을 위한 영상 좌표계 변환에 관한 연구)

  • Kim Yong-Jin;Lee Yill-Byung
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2006.06b
    • /
    • pp.322-324
    • /
    • 2006
  • Images are acquired from a camera whose background moves, feature points are extracted from them, and the parameters of an image coordinate transformation are estimated from those feature points. This paper introduces an image coordinate transformation method that uses the estimated parameters to compensate for the camera's ego-motion in order to detect moving objects in the image, and reports experiments in which moving objects are detected between two consecutive frames through ego-motion compensation.
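
The compensate-then-difference idea can be sketched on toy frames: warp the previous frame by the estimated ego-motion, and the differences that remain belong to independently moving objects. A minimal illustration (a pure integer translation stands in for the full coordinate transform, and the frames are synthetic):

```python
def shift(frame, tx, ty):
    """Warp a 2D grid of pixels by an integer translation (the
    estimated ego-motion), padding newly exposed pixels with 0."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y - ty, x - tx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = frame[sy][sx]
    return out

def moving_pixels(prev, curr, tx, ty, thresh=10):
    """Compensate prev for the camera's estimated motion, then flag
    pixels whose difference against curr exceeds thresh."""
    comp = shift(prev, tx, ty)
    return [(x, y)
            for y in range(len(curr)) for x in range(len(curr[0]))
            if abs(curr[y][x] - comp[y][x]) > thresh]

# Static scene shifted right by 1 px by camera motion, plus one new moving blob.
prev = [[0, 100, 0, 0],
        [0, 100, 0, 0],
        [0,   0, 0, 0]]
curr = [[0, 0, 100, 0],
        [0, 0, 100, 0],
        [0, 0,   0, 200]]
detected = moving_pixels(prev, curr, tx=1, ty=0)
```

Without the compensation step, the whole shifted background would be flagged; with it, only the genuinely moving pixel survives the differencing.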


Improved View-Based Navigation for Obstacle Avoidance using Ego-Motion

  • Hagiwara, Yoshinobu;Suzuki, Akimasa;Kim, Youngbok;Choi, Yongwoon
    • Journal of Power System Engineering
    • /
    • v.17 no.5
    • /
    • pp.112-120
    • /
    • 2013
  • In this study, we propose an improved view-based navigation method for obstacle avoidance and evaluate its effectiveness in real environments with real obstacles. The proposed method can estimate the position and rotation of a mobile robot even if the robot strays from the recorded path in order to avoid an obstacle. To achieve this, ego-motion estimation is incorporated into an existing view-based navigation system; the ego-motion is calculated from SURF points matched between the current view and a recorded view using a Kinect sensor. In conventional view-based navigation systems, it is difficult to generate alternative paths to avoid obstacles, whereas the proposed method allows a mobile robot greater flexibility in planning paths around the humans and objects expected in real environments. Based on experiments performed in an indoor environment with a mobile robot, we evaluated the measurement accuracy of the proposed method and confirmed its feasibility for robot navigation in museums and shopping malls.

Real-Time Precision Vehicle Localization Using Numerical Maps

  • Han, Seung-Jun;Choi, Jeongdan
    • ETRI Journal
    • /
    • v.36 no.6
    • /
    • pp.968-978
    • /
    • 2014
  • Autonomous vehicle technology based on information technology and software will lead the automotive industry in the near future. Vehicle localization is a core technology for developing autonomous vehicles, providing the location information needed for control and decision-making. This paper proposes an effective vision-based localization technology for autonomous vehicles. In particular, it makes use of numerical maps that are widely used in the field of geographic information systems and have already been built in advance. Optimal vehicle ego-motion estimation and road-marking feature extraction techniques are adopted and then combined by an extended Kalman filter and a particle filter to form the localization system. The implementation shows remarkable results: an 18 ms mean processing time and a 10 cm location error. In addition, autonomous driving and parking were successfully completed with an unmanned vehicle within a 300 m × 500 m space.
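
The particle-filter half of such a localization pipeline can be sketched as a propagate/weight/resample loop over candidate poses. A minimal 1D illustration (the Gaussian likelihood, noise levels, and measurements are invented; the paper combines this with an extended Kalman filter over ego-motion and road-marking features):

```python
import math
import random

def pf_step(particles, z, motion=1.0, noise=0.2, r=0.5, seed=0):
    """One particle-filter cycle for 1D localization: propagate each
    particle by the ego-motion, weight it by a Gaussian likelihood of
    the map-based measurement z, then resample by the weights."""
    rng = random.Random(seed)
    # Predict: apply the ego-motion estimate with process noise.
    particles = [p + motion + rng.gauss(0, noise) for p in particles]
    # Update: Gaussian likelihood of the measurement for each particle.
    weights = [math.exp(-0.5 * ((p - z) / r) ** 2) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights.
    return rng.choices(particles, weights=weights, k=len(particles))

# Particles spread around x=0; the vehicle moves +1 per step and map
# matches report positions 1, 2, 3.
particles = [i * 0.1 for i in range(-10, 11)]
for z in (1.0, 2.0, 3.0):
    particles = pf_step(particles, z, motion=1.0)
est = sum(particles) / len(particles)
```

After a few cycles the particle cloud concentrates around the measured trajectory; in the paper's setting the same loop runs over 2D pose hypotheses against the numerical map.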

Development of Simulation Environment for Autonomous Driving Algorithm Validation based on ROS (ROS 기반 자율주행 알고리즘 성능 검증을 위한 시뮬레이션 환경 개발)

  • Kwak, Jisub;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association
    • /
    • v.14 no.1
    • /
    • pp.20-25
    • /
    • 2022
  • This paper presents a simulation environment for validating autonomous driving (AD) algorithms based on the Robot Operating System (ROS), one of the frameworks commonly used to control autonomous vehicles. For the evaluation of AD algorithms, a 3D autonomous driving simulator has been developed based on LGSVL. Two additional sensors are implemented on the simulated vehicle: a lidar sensor mounted on the ego vehicle for real-time perception of the driving environment, and a GPS sensor for estimating the ego vehicle's position. With this sensor configuration, the AD algorithm can perceive the local environment and determine control commands through motion planning. The simulation environment has been evaluated with lane-changing and lane-keeping scenarios, and the results show that the proposed 3D simulator successfully imitates the operation of a real-world vehicle.