• Title/Summary/Keyword: Multi-filter fusion

Particle Filter Based Robust Multi-Human 3D Pose Estimation for Vehicle Safety Control (차량 안전 제어를 위한 파티클 필터 기반의 강건한 다중 인체 3차원 자세 추정)

  • Park, Joonsang; Park, Hyungwook
    • Journal of Auto-vehicle Safety Association / v.14 no.3 / pp.71-76 / 2022
  • In autonomous vehicles, 3D pose estimation can be an effective way to enhance safety control for OOP (Out of Position) passengers. There have been many studies on camera-based human pose estimation, but previous methods have limitations in automotive applications: CNN methods are unreliable because of unexplainable failures, and other methods perform poorly. This paper proposes a robust, real-time multi-human 3D pose estimation architecture for in-vehicle use with a monocular RGB camera. Using a particle filter, the approach integrates CNN 2D/3D pose measurements with other information available in the vehicle. Computer simulations confirm the accuracy and robustness of the proposed algorithm.
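
A minimal sketch of the kind of particle-filter fusion the abstract describes, tracking a single 3D joint position from noisy CNN keypoint measurements. The random-walk motion model, noise levels, and simulated measurements are illustrative assumptions, not the authors' in-vehicle models.

```python
# Bootstrap particle filter for one 3D joint position, fusing noisy CNN
# keypoint measurements over time. All models here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                     # number of particles
particles = rng.normal(0.0, 0.5, (N, 3))    # initial guess around origin [m]
weights = np.full(N, 1.0 / N)

def predict(particles, motion_std=0.02):
    """Random-walk prediction step (passenger joints move slowly)."""
    return particles + rng.normal(0.0, motion_std, particles.shape)

def update(particles, weights, z, meas_std=0.10):
    """Weight particles by the likelihood of the CNN 3D keypoint measurement z."""
    d2 = np.sum((particles - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_std ** 2)
    weights += 1e-300                        # avoid an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    """Resample to fight particle degeneracy."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

true_pos = np.array([0.3, -0.1, 0.9])
for _ in range(50):
    particles = predict(particles)
    z = true_pos + rng.normal(0.0, 0.10, 3)  # simulated noisy CNN keypoint
    weights = update(particles, weights, z)
    particles, weights = resample(particles, weights)

print("estimate:", particles.mean(axis=0), "truth:", true_pos)
```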

Integrated Automatic Pre-Processing for Change Detection Based on SURF Algorithm and Mask Filter (변화탐지를 위한 SURF 알고리즘과 마스크필터 기반 통합 자동 전처리)

  • Kim, Taeheon; Lee, Won Hee; Yeom, Junho; Han, Youkyung
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.37 no.3 / pp.209-219 / 2019
  • Satellite imagery contains geometric and radiometric errors caused by external environmental factors at acquisition time, which in turn produce false alarms in change detection. These errors should be eliminated by geometric and radiometric correction. In this study, we propose a methodology that automatically and simultaneously performs both corrections by using the SURF (Speeded-Up Robust Features) algorithm and a mask filter. The MPs (Matching Points), which show invariant properties between multi-temporal images and are extracted by the SURF algorithm, are used for automatic geometric correction. Using the properties of the extracted MPs, PIFs (Pseudo Invariant Features) used for relative radiometric correction are selected. Subsequently, secondary PIFs are extracted with mask filters generated around the selected PIFs. After performing automatic geometric correction using the extracted MPs and relative radiometric correction using the PIFs on the geo-rectified images, we confirmed that the geometric and radiometric errors were eliminated.
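
The pre-processing chain can be sketched roughly as follows: feature matching for geometric correction, then a linear relative radiometric normalization that treats matched points as pseudo-invariant features. ORB is used here as a freely available stand-in for SURF (SURF needs an OpenCV build with the non-free contrib module), and the file names, match count, and thresholds are assumptions.

```python
import cv2
import numpy as np

ref = cv2.imread("t1_band.tif", cv2.IMREAD_GRAYSCALE)   # reference image
tgt = cv2.imread("t2_band.tif", cv2.IMREAD_GRAYSCALE)   # image to correct

detector = cv2.ORB_create(5000)
k1, d1 = detector.detectAndCompute(ref, None)
k2, d2 = detector.detectAndCompute(tgt, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:500]

pts_ref = np.float32([k1[m.queryIdx].pt for m in matches])
pts_tgt = np.float32([k2[m.trainIdx].pt for m in matches])

# Geometric correction: warp the target onto the reference geometry.
H, inliers = cv2.findHomography(pts_tgt, pts_ref, cv2.RANSAC, 3.0)
registered = cv2.warpPerspective(tgt, H, (ref.shape[1], ref.shape[0]))

# Relative radiometric correction: fit gain/offset from pixel values at the
# inlier matching points (treated here as pseudo-invariant features).
mask = inliers.ravel().astype(bool)
ref_vals = ref[pts_ref[mask, 1].astype(int), pts_ref[mask, 0].astype(int)]
tgt_vals = registered[pts_ref[mask, 1].astype(int), pts_ref[mask, 0].astype(int)]
gain, offset = np.polyfit(tgt_vals.astype(float), ref_vals.astype(float), 1)
normalized = np.clip(gain * registered + offset, 0, 255).astype(np.uint8)
```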

Real time orbit estimation using asynchronous multiple RADAR data fusion (비동기 다중 레이더 융합을 통한 실시간 궤도 추정 알고리즘)

  • Song, Ha-Ryong; Moon, Byoung-Jin; Cho, Dong-Hyun
    • Aerospace Engineering and Technology / v.13 no.2 / pp.66-72 / 2014
  • This paper introduces an asynchronous multiple-radar fusion algorithm for space object tracking. To estimate the orbital motion of a space object, a multiple-radar scenario in which several radars jointly measure a single object at different sampling times is described. STK/ODTK is utilized to generate a realization of the orbital motion and the joint coverage of the multiple radars. An asynchronous fusion algorithm is then adopted to enhance the estimation performance over the intervals in which multiple radars observe the object. Monte Carlo simulation results demonstrate that the proposed asynchronous multi-sensor fusion scheme outperforms a single linearized Kalman filter in terms of root mean square error.
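
A toy illustration of asynchronous fusion with a single filter: measurements from two radars with different rates and accuracies are merged by timestamp and processed sequentially, with the state propagated over each variable interval. A 1D constant-velocity model stands in for the orbital dynamics; every number here is assumed.

```python
import numpy as np

x = np.array([0.0, 1.0])          # [position, velocity]
P = np.eye(2)
q, H = 0.01, np.array([[1.0, 0.0]])

def step(x, P, z, dt, r):
    """Propagate to the measurement time, then update with one range measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Two radars with different rates and accuracies; merge their measurement
# streams by timestamp and update sequentially.
rng = np.random.default_rng(1)
radar_a = [(t, t * 1.0 + rng.normal(0, 0.5), 0.25) for t in np.arange(0.0, 10.0, 1.0)]
radar_b = [(t, t * 1.0 + rng.normal(0, 0.2), 0.04) for t in np.arange(0.3, 10.0, 0.7)]
t_prev = 0.0
for t, z, r in sorted(radar_a + radar_b):
    x, P = step(x, P, z, t - t_prev, r)
    t_prev = t
print("fused state:", x)
```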

Accurate Vehicle Positioning on a Numerical Map

  • Laneurit, Jean; Chapuis, Roland; Chausse, Frédéric
    • International Journal of Control, Automation, and Systems / v.3 no.1 / pp.15-31 / 2005
  • Road safety is an important research field, and one of its principal topics is vehicle localization in the road network. This article presents a multi-sensor fusion approach able to locate a vehicle with decimeter precision. The information used in this method comes from the following sensors: a low-cost GPS, a digital camera, an odometer, and a steering angle sensor. Taking into account a complete model of the errors on the GPS data (position bias and non-white errors), together with the data provided by an original approach coupling a vision algorithm with a precise numerical map, allows this precision to be reached.
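
One way to handle the non-white GPS errors mentioned in the abstract is to augment the filter state with a GPS position bias modeled as a first-order Gauss-Markov process. The sketch below shows this for a 1D position state with assumed parameters; it is not the authors' full error model.

```python
import numpy as np

dt, tau = 0.1, 60.0                 # sample time, bias correlation time [s] (assumed)
phi = np.exp(-dt / tau)             # Gauss-Markov transition for the bias
F = np.array([[1.0, dt, 0.0],       # state: [position, velocity, gps_bias]
              [0.0, 1.0, 0.0],
              [0.0, 0.0, phi]])
Q = np.diag([1e-4, 1e-3, 1e-2])     # process noise (assumed)
H = np.array([[1.0, 0.0, 1.0]])     # GPS measures position plus its own bias
R = np.array([[4.0]])               # GPS noise variance [m^2] (assumed)

x, P = np.zeros(3), np.eye(3) * 10.0

def kf_step(x, P, z_gps):
    """One predict/update cycle with a GPS position fix."""
    x, P = F @ x, F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z_gps - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P

x, P = kf_step(x, P, 12.3)          # example GPS fix at 12.3 m
print("state [pos, vel, gps_bias]:", x)
```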

Map-Building and Position Estimation based on Multi-Sensor Fusion for Mobile Robot Navigation in an Unknown Environment (이동로봇의 자율주행을 위한 다중센서융합기반의 지도작성 및 위치추정)

  • Jin, Tae-Seok; Lee, Min-Jung; Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.13 no.5 / pp.434-443 / 2007
  • Exploration of unknown environments is an important task for the new generation of mobile service robots, and mobile robots are navigated by a number of methods using sensing systems such as sonar or vision. To fully exploit the strengths of both sonar and vision, this paper presents a technique for localizing a mobile robot by fusing data from multiple ultrasonic sensors and a vision system. The mobile robot is designed to operate in a well-structured environment that can be represented by structural features such as planes, edges, corners, and cylinders. For ultrasonic sensors, these features appear as range information in the form of circular arcs, generally called RCDs (Regions of Constant Depth). Localization is the continual provision of knowledge of position deduced from the a priori position estimate. The robot's environment is modeled as a two-dimensional grid map. We define a vision-based environment recognition method and a physically based sonar sensor model, and employ an extended Kalman filter to estimate the position of the robot. The performance and simplicity of the approach are demonstrated with results from sets of experiments using a mobile robot.
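
A compact EKF localization sketch in the spirit of this abstract: a robot pose [x, y, heading] is predicted from odometry and corrected with a range-bearing measurement to a known map feature. The motion model, measurement model, and noise values are generic assumptions rather than the paper's sonar RCD and vision models.

```python
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Odometry prediction with a unicycle motion model."""
    th = x[2]
    x = x + np.array([v * dt * np.cos(th), v * dt * np.sin(th), w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0,  1]])
    return x, F @ P @ F.T + Q

def update(x, P, z, landmark, R):
    """Correction with a range-bearing measurement to a known landmark."""
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx**2 + dy**2
    h = np.array([np.sqrt(q), np.arctan2(dy, dx) - x[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0],
                  [ dy / q,          -dx / q,         -1]])
    y = z - h
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi        # wrap bearing residual
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3) * 0.1
Q, R = np.diag([1e-3, 1e-3, 1e-4]), np.diag([0.05, 0.02])
x, P = predict(x, P, v=0.5, w=0.1, dt=0.1, Q=Q)
x, P = update(x, P, z=np.array([2.0, 0.3]), landmark=(2.0, 0.5), R=R)
print("pose estimate:", x)
```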

Road Recognition based Extended Kalman Filter with Multi-Camera and LRF (다중카메라와 레이저스캐너를 이용한 확장칼만필터 기반의 노면인식방법)

  • Byun, Jae-Min; Cho, Yong-Suk; Kim, Sung-Hoon
    • The Journal of Korea Robotics Society / v.6 no.2 / pp.182-188 / 2011
  • This paper describes a method of road tracking that uses vision and a laser scanner to extract the road boundary (lane and curb) for navigation of an intelligent transport robot in structured road environments. Road boundary information plays a major role in developing such a robot. Global navigation relies on a global positioning system and a global planner, while local navigation is accomplished by recognizing the lane and curb that form the road boundary and estimating their location relative to the robot with an EKF (Extended Kalman Filter), assuming prior information about the road is available. The complete system has been tested on electric vehicles equipped with cameras, laser scanners, and GPS. Experimental results demonstrate the effectiveness of the combined laser and vision approach for curb and lane boundary detection.
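
A hypothetical sketch of fusing two road-boundary observations in one Kalman update: a camera lane-offset measurement and a laser curb measurement (converted to a lane offset via an assumed lane-to-curb distance) are stacked into a single measurement vector. All geometry and noise values are invented for illustration and are not the paper's models.

```python
import numpy as np

x = np.array([0.0])                      # lateral offset from the lane centre [m]
P = np.array([[0.5]])
lane_to_curb = 1.8                       # assumed lane-centre-to-curb distance [m]

H = np.array([[1.0], [1.0]])             # both sensors observe the same offset
R = np.diag([0.05**2, 0.10**2])          # camera vs. laser noise (assumed)

z_cam = 0.12                             # camera lane-offset measurement [m]
z_laser = lane_to_curb - 1.65            # laser curb range (1.65 m) converted to a lane offset [m]
z = np.array([z_cam, z_laser])

S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)
x = x + K @ (z - H @ x)
P = (np.eye(1) - K @ H) @ P
print("fused lateral offset:", x[0])
```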

Dual Foot-PDR System Considering Lateral Position Error Characteristics

  • Lee, Jae Hong; Cho, Seong Yun; Park, Chan Gook
    • Journal of Positioning, Navigation, and Timing / v.11 no.1 / pp.35-44 / 2022
  • In this paper, a dual-foot (DF)-PDR system is proposed that fuses integration (IA)-based PDR systems applied independently to each shoe. The horizontal positions of the two shoes estimated by each PDR system are fused with a particle filter. The proposed method bounds the position error as walking time increases, without any additional sensor. The particle distribution is non-Gaussian so that it can express the lateral error caused by systematic drift. Assuming that the shoe position is the pedestrian position, the multi-modal position distribution can be fused into one using a Gaussian sum. The fused pedestrian position is then used as a measurement for each particle filter so that the position error is corrected. Experimental results show that the pedestrian position can be effectively estimated using only the inertial sensors attached to both shoes.
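
A rough sketch of the fusion step described above: each foot's particle cloud is summarized as a Gaussian, the two Gaussians are combined by information-weighted fusion, and the fused pedestrian position is fed back to a foot's particle filter as a pseudo-measurement. Positions are 2D and all numbers are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)
left = rng.normal([0.0, 0.2], 0.15, (400, 2))    # left-shoe particles (x, y)
right = rng.normal([0.1, -0.2], 0.20, (400, 2))  # right-shoe particles

def moments(p):
    """Summarize a particle cloud by its mean and covariance."""
    return p.mean(axis=0), np.cov(p.T)

m_l, C_l = moments(left)
m_r, C_r = moments(right)

# Information-weighted fusion of the two per-foot Gaussians.
I_l, I_r = np.linalg.inv(C_l), np.linalg.inv(C_r)
C_f = np.linalg.inv(I_l + I_r)
m_f = C_f @ (I_l @ m_l + I_r @ m_r)              # fused pedestrian position

def reweight(particles, z, meas_cov):
    """Use the fused position as a measurement to reweight one foot's particles."""
    diff = particles - z
    w = np.exp(-0.5 * np.einsum("ni,ij,nj->n", diff, np.linalg.inv(meas_cov), diff))
    return w / w.sum()

w_left = reweight(left, m_f, C_f + np.eye(2) * 0.01)
print("fused position:", m_f,
      "left-foot weighted mean:", (w_left[:, None] * left).sum(axis=0))
```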

A Neural Network and Kalman Filter Hybrid Approach for GPS/INS Integration

  • Wang, Jianguo Jack; Wang, Jinling; Sinclair, David; Watts, Leo
    • Proceedings of the Korean Institute of Navigation and Port Research Conference / v.1 / pp.277-282 / 2006
  • It is well known that Kalman filtering is an optimal real-time data fusion method for GPS/INS integration. However, it has limitations in terms of stability, adaptability, and observability. A Kalman filter can perform optimally only when its dynamic model is correctly defined and the noise statistics of the measurement and process are completely known. The estimated Kalman filter states can be influenced by several factors that are difficult to model, including vehicle dynamic variations, filter tuning results, and environment changes. Neural networks can map input-output relationships without a priori knowledge about them; hence a properly designed neural network is capable of learning and extracting these complex relationships with enough training. This paper presents a GPS/INS integrated system that combines Kalman filtering and neural network algorithms to improve navigation solutions during GPS outages. An extended Kalman filter estimates the filter states, including INS measurement errors and position, velocity, and attitude errors, and gives precise navigation solutions while GPS signals are available. At the same time, a multi-layer neural network is trained to map the vehicle dynamics to the corresponding Kalman filter states at the measurement update rate. Once the output of the neural network meets a similarity threshold, it can be used to correct INS measurements when no GPS measurements are available. Selecting suitable inputs and outputs for the neural network is critical for this hybrid method. Detailed analysis reveals that some Kalman filter states are highly correlated with vehicle dynamic variations; the filter states that most heavily impact the navigation solution are selected as the neural network outputs. The principle of this hybrid method and the neural network design are presented, and field test data are processed to evaluate the performance of the proposed method.
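
A conceptual sketch of the hybrid scheme: while GPS is available, a small neural network is trained to map vehicle-dynamics features to the Kalman filter's estimated INS error states; during a GPS outage the network's prediction substitutes for the filter correction. The data here is synthetic, and the feature set and network size are assumptions, not the authors' design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Synthetic training set: inputs are vehicle-dynamics features (e.g. windows of
# IMU specific force / angular rate); targets are the KF-estimated INS error
# states logged at the same rate.
X_train = rng.normal(size=(2000, 12))                 # assumed 12 dynamics features
true_map = rng.normal(size=(12, 3)) * 0.1
y_train = X_train @ true_map + rng.normal(0, 0.01, (2000, 3))  # 3 error states

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# During a GPS outage, predict the INS error states from the current dynamics
# and use them to correct the INS solution in place of the KF update.
x_now = rng.normal(size=(1, 12))
predicted_ins_errors = net.predict(x_now)[0]
print("predicted INS error states:", predicted_ins_errors)
```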


Design of Multi-Sensor-Based Open Architecture Integrated Navigation System for Localization of UGV

  • Choi, Ji-Hoon; Oh, Sang Heon; Kim, Hyo Seok; Lee, Yong Woo
    • Journal of Positioning, Navigation, and Timing / v.1 no.1 / pp.35-43 / 2012
  • The UGV is a special field robot developed for mine detection, surveillance, and transportation. To successfully achieve the missions of the UGV, accurate and reliable navigation data should be provided. This paper presents the design and implementation of a multi-sensor-based open-architecture integrated navigation system for localization of a UGV. The presented architecture hierarchically classifies the integrated system into four layers, and data communication between layers is based on distributed object-oriented middleware. The navigation manager determines the navigation mode from the QoS information of each navigation sensor, and the integrated filter performs navigation-mode-based data fusion in the filtering process. All navigation variables, including the filter parameters and the QoS of the navigation data, can be modified in the GUI, so the user can operate the integrated navigation system more flexibly. A conventional GPS/INS integrated system does not guarantee long-term localization reliability when a GPS solution is unavailable due to signal blockage or intentional jamming in outdoor environments. The presented integration algorithm, however, based on an adaptive federated filter structure with an FDI algorithm, can effectively integrate the outputs of multiple sensors such as 3D LADAR, vision, odometer, magnetic compass, and zero velocity to enhance localization accuracy when GPS is unavailable. A field test was carried out with the UGV, and the results show that the presented integrated navigation system provides more robust and accurate localization than a conventional GPS/INS integrated system in outdoor environments.
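
A heavily simplified federated-filter sketch: two local filters estimate the same quantity from different sensor sets, a residual check in the spirit of FDI gates out an inconsistent local estimate, and a master filter fuses the survivors by information weighting. The 1D state, numbers, and threshold are illustrative assumptions, not the UGV system's design.

```python
import numpy as np

def fuse(estimates, covariances):
    """Master-filter fusion: information-weighted combination of local estimates."""
    infos = [1.0 / c for c in covariances]
    P_master = 1.0 / sum(infos)
    x_master = P_master * sum(i * x for i, x in zip(infos, estimates))
    return x_master, P_master

# Local filter outputs (position estimate, variance) for one fusion epoch:
x_gps_ins, P_gps_ins = 10.2, 0.25       # local filter 1: GPS/INS
x_odo_ins, P_odo_ins = 10.6, 0.60       # local filter 2: odometer/INS

# A simple FDI-style consistency check before fusion: reject the fusion if the
# residual between local estimates exceeds a chi-square-like gate (assumed).
if (x_gps_ins - x_odo_ins) ** 2 / (P_gps_ins + P_odo_ins) < 9.0:
    x_m, P_m = fuse([x_gps_ins, x_odo_ins], [P_gps_ins, P_odo_ins])
else:
    x_m, P_m = x_gps_ins, P_gps_ins     # fall back to the more trusted local filter
print("master estimate:", x_m, "variance:", P_m)
```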

Robust Maneuvering Target Tracking Applying the Concept of Multiple Model Filter and the Fusion of Multi-Sensor (다중센서 융합 및 다수모델 필터 개념을 적용한 강인한 기동물체 추적)

  • Hyun, Dae-Hwan; Yoon, Hee-Byung
    • Journal of Intelligence and Information Systems / v.15 no.1 / pp.51-64 / 2009
  • Location tracking sensors such as GPS, INS, radar, and optical equipment are used in multi-sensor maneuvering-target tracking, and such systems are used to detect, track, and control UAVs, guided missiles, and spacecraft. Until now, most studies on tracking maneuvering targets have fused multiple radars or added a supplementary sensor to INS and GPS. However, because the system and error properties differ between sensors, the degree to which each sensor contributes to the fusion needs to be adjusted. In this paper, we analyze the error properties of the sensors when a ground radar is added to GPS and INS to improve tracking performance through multi-sensor fusion, and we propose a tracking algorithm that improves precision and stability by changing each sensor's weight according to its error. For evaluation, we extract altitude values from a simulated UAV trajectory and apply the proposed algorithm to analyze its performance. By changing the weights of the estimates according to the degree of error between each sensor's navigation information, the precision of the navigation information is improved, and tracking remains robust against external environmental changes and disturbances.
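
A small sketch of error-adaptive sensor weighting in the spirit of this abstract: each sensor's altitude estimate is weighted inversely to its recent mean-squared residual against the fused solution, so a drifting sensor gradually loses influence. The sensor set, window length, drift rate, and noise levels are all assumed.

```python
import numpy as np

rng = np.random.default_rng(4)
true_alt = 1000.0
window = 20
residuals = {"gps": [], "ins": [], "radar": []}

def fused_estimate(measurements, residuals):
    """Weight each sensor inversely to its recent mean-squared residual."""
    weights = {}
    for name in measurements:
        mse = np.mean(np.square(residuals[name])) if residuals[name] else 1.0
        weights[name] = 1.0 / (mse + 1e-6)
    total = sum(weights.values())
    return sum(weights[n] * z for n, z in measurements.items()) / total

for k in range(100):
    meas = {
        "gps": true_alt + rng.normal(0.0, 3.0),
        "ins": true_alt + 0.2 * k + rng.normal(0.0, 1.0),   # INS altitude drifts over time
        "radar": true_alt + rng.normal(0.0, 5.0),
    }
    est = fused_estimate(meas, residuals)
    for name, z in meas.items():                            # update sliding residual windows
        residuals[name] = (residuals[name] + [z - est])[-window:]

print("final fused altitude:", est, "(true altitude 1000.0)")
```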
