• Title/Summary/Keyword: Model based Object Tracking


Target Modeling with Color Arrangement for Region-Based Object Tracking (영역 기반 물체 추적에서 색상 배치를 고려한 표적 모델링)

  • Kim, Dae-Hwan;Lee, Seung-Jun;Ko, Sung-Jea
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.49 no.1
    • /
    • pp.1-10
    • /
    • 2012
  • In this paper, we propose a new class of color histogram model suitable for object tracking. In addition to the pixel count, each bin of the proposed model also contains the spatial mean and the average value of the pixels located at a certain distance from the mean location of the bin. Using the proposed color histogram model, we derive a mean shift procedure using the modified Bhattacharyya distance. Unlike most mean shift based methods, our algorithm performs well even when the object being tracked shares similar colors with the background. Experimental results demonstrate improved tracking performance over existing methods.
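
The core comparison in such histogram-based mean-shift trackers is the Bhattacharyya coefficient between a target model and a candidate histogram. Below is a minimal Python sketch of a plain joint RGB histogram and the standard coefficient; it does not reproduce the paper's spatially augmented bins or its modified distance, and the function names and bin count are purely illustrative.

```python
import numpy as np

def color_histogram(patch, bins=16):
    """Quantize an HxWx3 patch into a normalized joint RGB histogram."""
    idx = (patch // (256 // bins)).reshape(-1, 3)
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    hist = np.bincount(flat, minlength=bins ** 3).astype(float)
    return hist / max(hist.sum(), 1.0)

def bhattacharyya_coefficient(p, q):
    """Similarity between candidate histogram p and target model q (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

# Toy usage with random patches standing in for the target and a candidate region.
rng = np.random.default_rng(0)
target = rng.integers(0, 256, (32, 32, 3))
candidate = rng.integers(0, 256, (32, 32, 3))
q = color_histogram(target)
p = color_histogram(candidate)
print("Bhattacharyya coefficient:", bhattacharyya_coefficient(p, q))
```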

Design of A Moving Object Management System for Tracking Vehicle Location (차량 위치 추적을 위한 이동 객체 관리 시스템의 설계)

  • Ahn, Yoon-Ae;Kim, Dong-Ho;Ryu, Keun-Ho
    • The KIPS Transactions:PartD
    • /
    • v.9D no.5
    • /
    • pp.827-836
    • /
    • 2002
  • Moving object management systems manage spatiotemporal data on entities such as people, animals, and cars, whose locations change over time. Such systems can be applied to vehicle location tracking, the digital battlefield, location-based services, and so on. Existing moving object management systems manage only past or only future locations of moving objects, handled separately; consequently, they cannot estimate uncertain past or future locations. In this paper, we propose a moving object management system that not only manages the historical data of moving objects but also predicts their past and future locations from the historical data stored in the database. We define the moving objects needed for vehicle location tracking and propose a moving object database structure. Finally, we present an execution model for the proposed system and apply it to a virtual vehicle-tracking scenario.
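
To illustrate the general idea of estimating an uncertain past or future position from stored history (not the paper's specific execution model), the sketch below interpolates between, or extrapolates from, timestamped position samples; the function names and the linear-motion assumption are illustrative only.

```python
from datetime import datetime, timedelta

def estimate_position(history, query_time):
    """Estimate a vehicle's (x, y) at query_time from timestamped samples.

    Interpolates linearly between bracketing samples; for times outside the
    recorded interval, extrapolates from the nearest pair of samples.
    """
    history = sorted(history)  # list of (datetime, x, y)
    for (t0, x0, y0), (t1, x1, y1) in zip(history, history[1:]):
        if t0 <= query_time <= t1:
            r = (query_time - t0) / (t1 - t0)
            return x0 + r * (x1 - x0), y0 + r * (y1 - y0)
    if query_time > history[-1][0]:
        (t0, x0, y0), (t1, x1, y1) = history[-2], history[-1]
    else:
        (t0, x0, y0), (t1, x1, y1) = history[0], history[1]
    r = (query_time - t0) / (t1 - t0)
    return x0 + r * (x1 - x0), y0 + r * (y1 - y0)

t = datetime(2002, 1, 1, 12, 0)
samples = [(t, 0.0, 0.0), (t + timedelta(minutes=1), 60.0, 0.0)]
print(estimate_position(samples, t + timedelta(minutes=2)))  # extrapolated future position
```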

Vision and Lidar Sensor Fusion for VRU Classification and Tracking in the Urban Environment (카메라-라이다 센서 융합을 통한 VRU 분류 및 추적 알고리즘 개발)

  • Kim, Yujin;Lee, Hojun;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association
    • /
    • v.13 no.4
    • /
    • pp.7-13
    • /
    • 2021
  • This paper presents a vulnerable road user (VRU) classification and tracking algorithm that fuses vision and LiDAR sensors for urban autonomous driving. Classification and tracking of vulnerable road users such as pedestrians, bicycles, and motorcycles are essential for autonomous driving in complex urban environments. In this paper, a real-time image-based object detector (YOLO) and an object tracking algorithm operating on the LiDAR point cloud are fused at a high level. The proposed algorithm consists of four parts. First, the object bounding boxes in pixel coordinates obtained from YOLO are transformed into the local coordinate frame of the subject vehicle using a homography matrix. Second, the LiDAR point cloud is clustered based on Euclidean distance and the clusters are associated using GNN. In addition, the states of the clusters, including position, heading angle, velocity, and acceleration, are estimated in real time using a geometric model-free approach (GMFA). Finally, each LiDAR track is matched with a vision track using the angle of the transformed vision track and is assigned a classification ID. The proposed fusion algorithm is evaluated in real-vehicle tests in an urban environment.
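
One step described in the abstract is mapping YOLO bounding boxes from pixel coordinates into the subject vehicle's local frame with a homography. A minimal sketch of that projection follows; the calibration matrix, the choice of the box's bottom-center point, and the function names are assumptions made for illustration, not the paper's exact procedure.

```python
import numpy as np

def pixel_to_local(H, u, v):
    """Map a pixel (u, v) to local vehicle coordinates using a 3x3 homography H."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

def bbox_bottom_center_to_local(H, bbox):
    """Project the bottom-center of a (u_min, v_min, u_max, v_max) box;
    that point is commonly used because it lies near the ground plane."""
    u_min, v_min, u_max, v_max = bbox
    return pixel_to_local(H, (u_min + u_max) / 2.0, v_max)

# Hypothetical calibration matrix; a real H comes from camera extrinsic calibration.
H = np.array([[0.02, 0.0, -6.4],
              [0.0, -0.05, 24.0],
              [0.0, 0.001, 1.0]])
print(bbox_bottom_center_to_local(H, (300, 180, 340, 260)))
```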

Position Improvement of a Mobile Robot by Real Time Tracking of Multiple Moving Objects (실시간 다중이동물체 추적에 의한 이동로봇의 위치개선)

  • Jin, Tae-Seok;Lee, Min-Jung;Tack, Han-Ho;Lee, In-Yong;Lee, Joon-Tark
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.2
    • /
    • pp.187-192
    • /
    • 2008
  • The Intelligent Space (ISpace) provides challenging research fields for surveillance, human-computer interfacing, networked camera conferencing, industrial monitoring, and service and training applications. ISpace is a space in which many intelligent devices, such as computers and sensors, are distributed. Because useful services depend on the cooperation of these devices, it is very important that the system know the location of objects in the environment. To achieve this, we present a method for representing, tracking, and following humans by fusing distributed multiple vision systems in ISpace, with application to pedestrian tracking in a crowd. This paper describes appearance-based tracking of unknown objects with the distributed vision system in intelligent space. First, we discuss how object color information is obtained and how the color appearance-based model is constructed from these data. Then, we discuss the global color model built from the local color information. The process of learning within the global model and the experimental results are also presented.
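
As a rough illustration of combining local, per-camera color information into a global color model (not the paper's actual learning procedure), the sketch below fuses normalized per-camera histograms into a single normalized model; the single-channel histogram and uniform camera weights are simplifying assumptions.

```python
import numpy as np

def local_color_model(patch, bins=8):
    """Per-camera appearance model: a normalized histogram of one color channel."""
    hist, _ = np.histogram(patch[..., 0], bins=bins, range=(0, 256))
    h = hist.astype(float)
    return h / max(h.sum(), 1.0)

def global_color_model(local_models, weights=None):
    """Fuse local models from several cameras into one normalized global model."""
    models = np.asarray(local_models, dtype=float)
    w = np.ones(len(models)) if weights is None else np.asarray(weights, float)
    fused = (w[:, None] * models).sum(axis=0)
    return fused / max(fused.sum(), 1.0)

rng = np.random.default_rng(1)
views = [rng.integers(0, 256, (24, 24, 3)) for _ in range(3)]  # three camera views
print(global_color_model([local_color_model(v) for v in views]))
```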

Real-time Moving Object Tracking from a Moving Camera (이동 카메라 영상에서 이동물체의 실시간 추적)

  • Chun, Quan;Lee, Ju-Shin
    • The KIPS Transactions:PartB
    • /
    • v.9B no.4
    • /
    • pp.465-470
    • /
    • 2002
  • This paper presents a new model-based method for tracking a moving object from a moving camera. In the proposed method, a binary model is derived from detected object regions, and the Hausdorff distance between the model and the edge image is used as the similarity measure to cope with changes in the target's shape. A novel search algorithm and several optimization methods are also proposed to enable real-time processing. Experimental results on our test sequences demonstrate the high efficiency and accuracy of the approach.
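
The similarity measure named here, the Hausdorff distance between a binary model and an edge image, can be written compactly over two point sets. The sketch below is a brute-force version suitable only for small point sets; it omits the paper's search algorithm and optimizations, and the sample points are made up.

```python
import numpy as np

def directed_hausdorff(A, B):
    """h(A, B) = max over a in A of the distance from a to its nearest point in B."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return d.min(axis=1).max()

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A and B."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

model_pts = np.array([[0, 0], [0, 5], [5, 0], [5, 5]], float)  # binary model points
image_pts = np.array([[1, 1], [1, 6], [6, 1], [6, 6]], float)  # edge pixels in the frame
print(hausdorff(model_pts, image_pts))
```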

Positive Random Forest based Robust Object Tracking (Positive Random Forest 기반의 강건한 객체 추적)

  • Cho, Yunsub;Jeong, Soowoong;Lee, Sangkeun
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.52 no.6
    • /
    • pp.107-116
    • /
    • 2015
  • With the growth of digital devices, the proliferation of high-performance computers, and the availability of high-quality, inexpensive video cameras, the demand for automated video analysis is increasing, especially in intelligent monitoring systems, video compression, and robot vision. This is why object tracking has come into the spotlight in computer vision. Tracking is the process of locating a moving object over time using a camera, and handling the object's scale, rotation, and shape deformation is the most important consideration in robust tracking. In this paper, we propose a robust object tracking scheme using a random forest. Specifically, an object detection scheme based on region covariance and ZNCC (zero-mean normalized cross-correlation) is adopted to estimate an accurate object location. Next, the detected region is divided into five regions for random forest-based learning, and the five regions are verified by the random forest. The verified regions are added to the model pool. Finally, the model is updated to correct the object location when a region does not contain the object. Experiments show that the proposed method localizes the object more accurately than existing methods.
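
ZNCC, one of the two detection cues used above, is a standard template-matching score. A minimal sketch follows; it covers only the correlation score itself, not the region-covariance features or the random forest, and the test patches are synthetic.

```python
import numpy as np

def zncc(template, patch, eps=1e-8):
    """Zero-mean normalized cross-correlation; 1.0 indicates a perfect match."""
    t = template.astype(float) - template.mean()
    p = patch.astype(float) - patch.mean()
    return float((t * p).sum() / (np.linalg.norm(t) * np.linalg.norm(p) + eps))

rng = np.random.default_rng(2)
template = rng.integers(0, 256, (16, 16))
noisy = np.clip(template + rng.normal(0, 5, template.shape), 0, 255)
print(zncc(template, noisy))                            # close to 1 for a similar patch
print(zncc(template, rng.integers(0, 256, (16, 16))))   # near 0 for unrelated content
```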

Asynchronous Sensor Fusion using Multi-rate Kalman Filter (다중주기 칼만 필터를 이용한 비동기 센서 융합)

  • Son, Young Seop;Kim, Wonhee;Lee, Seung-Hi;Chung, Chung Choo
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.63 no.11
    • /
    • pp.1551-1558
    • /
    • 2014
  • We propose a multi-rate sensor fusion of vision and radar using a Kalman filter to solve the problems of asynchronous, multi-rate sampling periods in object vehicle tracking. Model-based prediction of object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve position prediction, different weights are applied to each sensor's predicted object position from the multi-rate Kalman filter. The proposed method provides estimated positions of the object vehicles at every ECU sampling time. The Mahalanobis distance is used to establish correspondence between measured and predicted objects. Experimental results validate that the post-processed fusion data improve tracking performance: compared with a single-sensor method (camera or radar), the proposed method achieved a twofold improvement in object tracking performance in terms of root mean square error.
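
Two generic ingredients mentioned in this abstract, Mahalanobis-distance association and weighted combination of per-sensor position estimates, are sketched below. This is not the paper's decentralized multi-rate filter; the covariance-weighted (information-form) fusion and the example numbers are illustrative assumptions.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Distance used to associate a measurement with a predicted track."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

def fuse(x_cam, P_cam, x_radar, P_radar):
    """Covariance-weighted average of two position estimates (information form)."""
    W_cam, W_radar = np.linalg.inv(P_cam), np.linalg.inv(P_radar)
    P = np.linalg.inv(W_cam + W_radar)
    return P @ (W_cam @ x_cam + W_radar @ x_radar), P

x_cam, P_cam = np.array([10.2, 3.1]), np.diag([0.8, 0.8])
x_radar, P_radar = np.array([10.0, 3.4]), np.diag([0.2, 1.5])
print(mahalanobis(x_radar, x_cam, P_cam + P_radar))  # gate the pairing before fusing
print(fuse(x_cam, P_cam, x_radar, P_radar)[0])       # fused position estimate
```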

Tracking Moving Object using Hausdorff Distance (Hausdorff 거리를 이용한 이동물체 추적)

  • Kim, Tea-Sik;Lee, Ju-Shin
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.37 no.3
    • /
    • pp.79-87
    • /
    • 2000
  • In this paper, we propose a model-based moving object tracking algorithm for dynamic scenes. To adapt to shape changes of the moving object, the Hausdorff distance is used as the measure of similarity between the model and the image. To reduce processing time, a 2D logarithmic search is applied to locate the moving object. In experiments on a running vehicle and a motorcycle, the mean square error between the real position and the tracking result was 1150 and 1845, and the number of matching operations was reduced by an average of 1125 and 523 relative to the existing algorithm for the vehicle and motorcycle sequences, respectively. The results show that the proposed algorithm can track the moving object accurately.
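
The 2D logarithmic search used here to speed up localization can be sketched independently of the Hausdorff matching: probe the center and its four step-neighbors, move to the best, and halve the step whenever the center wins. The quadratic cost function below is only a stand-in for the actual model-to-edge matching cost.

```python
def two_d_log_search(cost, start, step=16):
    """2-D logarithmic search: greedily follow the cheapest of the centre and its
    four step-neighbours, halving the step size whenever the centre is cheapest."""
    x, y = start
    while step >= 1:
        candidates = [(x, y), (x + step, y), (x - step, y), (x, y + step), (x, y - step)]
        best = min(candidates, key=cost)
        if best == (x, y):
            step //= 2
        else:
            x, y = best
    return x, y

true_pos = (37, -12)  # hypothetical best-match location
cost = lambda p: (p[0] - true_pos[0]) ** 2 + (p[1] - true_pos[1]) ** 2
print(two_d_log_search(cost, start=(0, 0), step=16))
```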


Structurally Enhanced Correlation Tracking

  • Parate, Mayur Rajaram;Bhurchandi, Kishor M.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.10
    • /
    • pp.4929-4947
    • /
    • 2017
  • In visual object tracking, correlation filter-based tracking (CFT) systems have recently emerged as among the most accurate and efficient methods. A CFT circularly shifts a large search window to find the most likely position of the target. The need for a large search window covering both the background and the object makes such algorithms sensitive to background clutter and target occlusion. Further, the use of fixed-size training windows makes them incapable of handling scale variations during tracking. To address these problems, we propose a two-layer target representation in which both the global and local appearance of the target are considered. Multiple local patches in the local layer provide robustness to background changes and target occlusion. The target representation is enhanced by employing additional reversed RGB channels to prevent the loss of black objects against the background during tracking. The final target position is obtained by an adaptive weighted average of the confidence maps from the global and local layers. Furthermore, target scale variation is handled by a statistical model governed by adaptive constraints to ensure reliable and accurate scale estimation. The proposed structural enhancement is tested on the VTBv1.0 benchmark for accuracy and robustness.
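
Two simple mechanisms from this abstract, the reversed (complemented) RGB channels and the weighted averaging of global- and local-layer confidence maps, are sketched below. The fixed weight and random maps are placeholders; the paper uses adaptive weights computed during tracking.

```python
import numpy as np

def reversed_rgb(image):
    """Append complemented channels (255 - I) so that dark objects still
    produce strong features against a dark background."""
    return np.concatenate([image, 255 - image], axis=2)

def fused_response(global_map, local_map, w_global=0.6):
    """Weighted average of global- and local-layer confidence maps; the peak
    of the fused map gives the estimated target position."""
    fused = w_global * global_map + (1.0 - w_global) * local_map
    return fused, np.unravel_index(np.argmax(fused), fused.shape)

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (50, 50, 3))
print(reversed_rgb(img).shape)            # (50, 50, 6): original plus complemented channels
g, l = rng.random((50, 50)), rng.random((50, 50))
print(fused_response(g, l)[1])            # peak location of the fused confidence map
```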

Mobile Object Tracking Algorithm Using Particle Filter (Particle filter를 이용한 이동 물체 추적 알고리즘)

  • Kim, Se-Jin;Joo, Young-Hoon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.4
    • /
    • pp.586-591
    • /
    • 2009
  • In this paper, we propose a mobile object tracking algorithm based on feature vectors and a particle filter. First, we detect the movement area of the mobile object using the RGB color model and extract feature vectors from the input image using the KLT algorithm; the first set of feature vectors is obtained by matching the extracted feature vectors to the detected movement area. Second, we detect a new movement area of the mobile object using the RGB and HSI color models and refine the feature vectors with the snake algorithm; the second set of feature vectors is then obtained by applying them to the new movement area. The mobile object tracking algorithm is completed by feeding the second set of feature vectors into a particle filter. Finally, we validate the applicability of the proposed method through experiments in a complex environment.
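
For readers unfamiliar with the particle filter at the core of the proposed tracker, the sketch below shows one generic predict-weight-resample cycle for 2-D positions; the random-walk motion model and Gaussian measurement likelihood are illustrative assumptions, not the paper's feature-vector-based observation model.

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, motion_std=2.0, meas_std=5.0):
    """One predict-weight-resample cycle for 2-D position particles."""
    rng = np.random.default_rng()
    # Predict: random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Weight: Gaussian likelihood of the (x, y) measurement.
    d2 = ((particles - measurement) ** 2).sum(axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights /= weights.sum()
    # Resample: multinomial resampling to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(4)
particles = rng.uniform(0, 100, (500, 2))
weights = np.full(500, 1.0 / 500)
for z in [np.array([40.0, 60.0]), np.array([42.0, 61.0])]:
    particles, weights = particle_filter_step(particles, weights, z)
print(particles.mean(axis=0))  # estimated object position after two measurements
```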