Title/Summary/Keyword: Local Sensor Track Fusion


A Study on Multi Sensor Track Fusion Algorithm for Naval Combat System (함정 전투체계 표적 융합 정확도 향상을 위한 알고리즘 연구)

  • Jung, Young-Ran
    • Journal of the Korea Institute of Military Science and Technology / v.10 no.3 / pp.34-42 / 2007
  • It is very important for a combat system to process extensive data accurately in a short time for better situational awareness against today's threats. This paper suggests adding radial velocity as a decision factor to the existing sensor data fusion algorithm to enhance the fusion accuracy of the combat system.
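The abstract's core idea, using radial velocity as an extra decision factor when associating sensor data, can be sketched as a weighted gating test. This is a hypothetical illustration: the field names, weights, and gate value below are assumptions, not the paper's actual formulation.

```python
import math

def association_score(track, measurement, w_pos=1.0, w_vel=1.0):
    """Combine a position residual and a radial-velocity residual into one
    association score (hypothetical weighting; the paper's exact decision
    factors are not given in the abstract)."""
    dx = track["x"] - measurement["x"]
    dy = track["y"] - measurement["y"]
    pos_residual = math.hypot(dx, dy)
    # Radial-velocity residual: the extra decision factor the paper proposes.
    vel_residual = abs(track["radial_velocity"] - measurement["radial_velocity"])
    return w_pos * pos_residual + w_vel * vel_residual

def fuse_decision(track, measurement, gate=5.0):
    """Accept the measurement-to-track pairing only if the combined score
    falls inside the gate."""
    return association_score(track, measurement) < gate
```

A measurement that is close in position but far off in radial velocity would now fail the gate, which is the kind of ambiguity the added factor is meant to resolve.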

Vision and Lidar Sensor Fusion for VRU Classification and Tracking in the Urban Environment (카메라-라이다 센서 융합을 통한 VRU 분류 및 추적 알고리즘 개발)

  • Kim, Yujin;Lee, Hojun;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association / v.13 no.4 / pp.7-13 / 2021
  • This paper presents a vulnerable road user (VRU) classification and tracking algorithm using a vision and LiDAR sensor fusion method for urban autonomous driving. Classification and tracking of vulnerable road users such as pedestrians, bicycles, and motorcycles are essential for autonomous driving in complex urban environments. In this paper, a real-time image object detection algorithm, YOLO, and an object tracking algorithm based on the LiDAR point cloud are fused at a high level. The proposed algorithm consists of four parts. First, the object bounding boxes in pixel coordinates, obtained from YOLO, are transformed into the local coordinate frame of the subject vehicle using a homography matrix. Second, the LiDAR point cloud is clustered based on Euclidean distance and the clusters are associated using GNN. In addition, the states of the clusters, including position, heading angle, velocity, and acceleration, are estimated in real time using a geometric model-free approach (GMFA). Finally, each LiDAR track is matched with a vision track using the angle information of the transformed vision track and assigned a classification ID. The proposed fusion algorithm is evaluated via real-vehicle tests in an urban environment.
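Two steps of the pipeline above, projecting a YOLO box into the vehicle frame via a homography and matching vision to LiDAR tracks by azimuth, can be sketched as follows. The function names and the angle gate are illustrative assumptions; the paper's actual matrices and thresholds are not given in the abstract.

```python
import numpy as np

def pixel_to_local(homography, u, v):
    """Project a pixel point (e.g. the bottom-center of a YOLO bounding
    box) onto the subject vehicle's local coordinate frame using a 3x3
    homography matrix."""
    p = homography @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # normalize homogeneous coordinates

def match_vision_to_lidar(vision_angle, lidar_tracks, angle_gate=0.05):
    """Assign the vision classification to the LiDAR track whose azimuth
    (radians) is closest to the transformed vision track's azimuth, if it
    lies inside a hypothetical angle gate; otherwise leave it unmatched."""
    best = min(lidar_tracks, key=lambda t: abs(t["angle"] - vision_angle))
    if abs(best["angle"] - vision_angle) < angle_gate:
        return best
    return None
```

Matching on angle alone reflects the high-level fusion described in the abstract: the camera supplies the class label, while range and kinematics come from the LiDAR track.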

The Performance Analysis of MPDA in Out of Sequence Measurement Environment (Out of Sequence Measurement 환경에서의 MPDA 성능 분석)

  • Seo, Il-Hwan;Lim, Young-Taek;Song, Taek-Lyul
    • The Transactions of the Korean Institute of Electrical Engineers D / v.55 no.9 / pp.401-408 / 2006
  • In a multi-sensor multi-target tracking system, the local sensors track the targets and transfer their measurements to the fusion center. Measurements from the same target can arrive out of order; these are called out-of-sequence measurements (OOSMs). OOSMs can arise at the fusion center due to communication delays and varying preprocessing times on different sensor platforms. In general, track fusion is performed at the fusion center to enhance tracking performance using the measurements from the sensors. In a cluttered environment, target information can arrive at the fusion center mixed with clutter. In this paper, an OOSM update step with MPDA (Most Probable Data Association) is introduced and tested in several cases with various clutter densities through Monte Carlo simulation. The performance of MPDA with the OOSM update step is compared with the existing NN, PDA, and PDA-AI filters for air target tracking in a cluttered, out-of-sequence measurement environment. Simulation results show that MPDA with the OOSM update achieves root mean square errors comparable to the out-of-sequence PDA-AI filter and that MPDA is suitable for use in an out-of-sequence environment.
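The OOSM update step described above can be illustrated with a minimal sketch: roll the track history back to the late measurement's timestamp, apply the measurement there, and re-propagate the correction forward. This is a deliberately simplified stand-in, the MPDA association probabilities and proper Kalman retrodiction are omitted, and the fixed blending `gain` is an assumption.

```python
def apply_oosm(track_history, oosm_time, oosm_value, gain=0.5):
    """Sketch of an out-of-sequence-measurement (OOSM) update on a scalar
    track. track_history is a list of (timestamp, state) pairs in time
    order; 'gain' is a fixed hypothetical blending weight standing in for
    a real filter gain."""
    # Locate the stored state the late measurement belongs to.
    idx = max(i for i, (t, _) in enumerate(track_history) if t <= oosm_time)
    t0, x0 = track_history[idx]
    # Retrodicted update at the measurement time.
    x0 = x0 + gain * (oosm_value - x0)
    track_history[idx] = (t0, x0)
    # Re-propagate the correction forward through the later stored states.
    for i in range(idx + 1, len(track_history)):
        t, x = track_history[i]
        x = x + gain * (x0 - x)  # simplistic forward blending
        track_history[i] = (t, x)
        x0 = x
    return track_history
```

The point of the sketch is the control flow the abstract describes: a delayed measurement is not discarded but folded back into the track at its true time, so the current estimate still benefits from it.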