• Title/Abstract/Keyword: Multi-filter fusion


Out of Sequence Measurement 환경에서의 MPDA 성능 분석 (The Performance Analysis of MPDA in Out of Sequence Measurement Environment)

  • 서일환;임영택;송택열
    • 대한전기학회논문지:시스템및제어부문D / Vol. 55, No. 9 / pp.401-408 / 2006
  • In multi-sensor multi-target tracking systems, the local sensors track the targets and transfer their measurements to the fusion center. Measurements from the same target can arrive out of order; these are called out-of-sequence measurements (OOSMs). OOSMs arise at the fusion center because of communication delays and varying preprocessing times on different sensor platforms. In general, track fusion is performed at the fusion center to enhance tracking performance using the measurements received from the sensors. In a cluttered environment, target information arrives at the fusion center together with clutter information. In this paper, an OOSM update step with MPDA (Most Probable Data Association) is introduced and tested in several cases with various clutter densities through Monte Carlo simulation. The performance of MPDA with the OOSM update step is compared with the existing NN, PDA, and PDA-AI filters for air target tracking in a cluttered, out-of-sequence measurement environment. Simulation results show that MPDA with the OOSM update achieves root mean square errors comparable to the out-of-sequence PDA-AI filter and that MPDA is adequate for use in an out-of-sequence environment.
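
The abstract does not reproduce the MPDA or OOSM equations, so the following is only a minimal illustration of the general idea behind handling a late measurement: buffer the time-stamped measurements and, when one arrives out of order, re-run a plain Kalman filter over the buffer in time order. The constant-velocity model, noise matrices, and buffer-reprocessing strategy are illustrative assumptions, not the paper's MPDA filter.

```python
import numpy as np

# Buffer-based OOSM handling sketch: when a delayed measurement arrives, the
# filter re-processes all stored measurements in time order from the initial
# state.  Constant-velocity model; measurement times assumed multiples of dt.

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity transition
H = np.array([[1.0, 0.0]])                 # position-only measurement
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(2) - K @ H) @ P

class BufferedKF:
    def __init__(self, x0, P0):
        self.history = [(0.0, x0, P0, None)]      # (time, x, P, measurement)

    def process(self, t, z):
        """Insert measurement z taken at time t, re-filtering if it is late."""
        records = [r for r in self.history if r[3] is not None]
        records.append((t, None, None, z))
        records.sort(key=lambda r: r[0])          # restore time order
        t0, x, P, _ = self.history[0]
        self.history = [self.history[0]]
        for tm, _, _, zm in records:              # re-run the filter in order
            for _ in range(int(round((tm - t0) / dt))):
                x, P = predict(x, P)
            x, P = update(x, P, zm)
            t0 = tm
            self.history.append((tm, x, P, zm))
        return x, P

kf = BufferedKF(np.zeros(2), np.eye(2))
kf.process(1.0, np.array([1.1]))
kf.process(3.0, np.array([3.2]))
x, P = kf.process(2.0, np.array([2.0]))           # late (out-of-sequence) measurement
print(x)
```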

UTV Localization from Fusion of Dead-Reckoning and LBL System

  • Woon, Jeon-Sang;Jung Sul;Cheol, Won-Moon;Hong Sup
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2001년도 ICCAS / pp.64.4-64 / 2001
  • Localization plays a key role in controlling a mobile robot. In this paper, the development of a sensor fusion algorithm for controlling a UTV (Unmanned Tracked Vehicle) is presented. A multi-sensor dead-reckoning subsystem is established based on optimal filtering, first fusing the heading-angle readings from a magnetic compass, a rate gyro, and two encoders mounted on the robot wheels to compute the dead-reckoned location. These data and the position data provided by the LBL system are then fused together by means of an extended Kalman filter. The algorithm is verified by simulation studies.
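
As a rough illustration of the fusion structure described above (not the paper's actual filter), the sketch below propagates a planar pose from dead-reckoning inputs and corrects it with LBL position fixes in an extended Kalman filter; the motion model, sensor rates, noise values, and synthetic fixes are all assumptions.

```python
import numpy as np

# EKF sketch: dead-reckoning (encoders + compass/gyro) drives the prediction;
# an acoustic LBL position fix drives the update.  State: [x, y, heading].

Q = np.diag([0.02, 0.02, 0.01])      # process noise (dead-reckoning drift)
R = np.diag([0.5, 0.5])              # LBL fix noise
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])      # LBL measures position only

def dr_predict(x, P, v, omega, dt):
    """Propagate the pose with wheel speed v and heading rate omega."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + omega * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])               # motion-model Jacobian
    return x_pred, F @ P @ F.T + Q

def lbl_update(x, P, z):
    """Correct the dead-reckoned pose with an LBL (x, y) fix."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3)
for k in range(50):                  # 10 Hz dead-reckoning
    x, P = dr_predict(x, P, v=0.5, omega=0.02, dt=0.1)
    if k % 10 == 9:                  # 1 Hz LBL fix (synthetic placeholder)
        x, P = lbl_update(x, P, z=np.array([0.05 * (k + 1), 0.001 * k]))
print(x)
```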


비전 센서와 자이로 센서의 융합을 통한 보행 로봇의 자세 추정 (Attitude Estimation for the Biped Robot with Vision and Gyro Sensor Fusion)

  • 박진성;박영진;박윤식;홍덕화
    • 제어로봇시스템학회논문지 / Vol. 17, No. 6 / pp.546-551 / 2011
  • A tilt sensor is required to control the attitude of a biped robot walking on uneven terrain. A vision sensor, normally used for recognizing people or detecting obstacles, can also serve as a tilt-angle sensor by comparing the current image with a reference image. However, a vision sensor alone has significant technological limitations for biped control, such as a low sampling frequency and estimation time delay. To verify these limitations, an experimental inverted pendulum representing the pitch motion of a walking or running robot is used, and it is shown that the vision sensor alone cannot control the pendulum, mainly because of the time delay. In this paper, to overcome the limitations of the vision sensor, a Kalman filter implementing a multi-rate sensor fusion algorithm is applied together with a low-quality gyro sensor. The fusion resolves the limitations of the vision sensor and also eliminates the drift of the gyro sensor. Inverted pendulum control experiments show that the tilt estimation performance of the fused sensors improves enough to control the attitude of the inverted pendulum.
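
A minimal sketch of the kind of multi-rate fusion the abstract describes, under assumed rates and noise levels: the gyro drives a fast prediction of tilt and gyro bias, and the slower vision-based tilt measurement corrects both, which removes the gyro drift. This is not the authors' filter; every numeric value is illustrative.

```python
import numpy as np

# Multi-rate fusion sketch: 200 Hz gyro prediction of [tilt, gyro_bias],
# 10 Hz vision tilt correction that also observes the bias.

dt = 0.005                            # 200 Hz gyro
A = np.array([[1.0, -dt], [0.0, 1.0]])
B = np.array([dt, 0.0])
H = np.array([[1.0, 0.0]])            # vision measures tilt only
Q = np.diag([1e-5, 1e-7])
R = np.array([[1e-3]])

x = np.zeros(2)                       # [tilt, gyro bias]
P = np.eye(2)

def step(gyro_rate, vision_tilt=None):
    """One gyro step; pass vision_tilt when a (slow) vision sample arrives."""
    global x, P
    x = A @ x + B * gyro_rate         # integrate the biased gyro rate
    P = A @ P @ A.T + Q
    if vision_tilt is not None:       # low-rate vision correction
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([vision_tilt]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
    return x[0]

for k in range(400):                  # 2 s of synthetic data
    vis = 0.1 if k % 20 == 19 else None   # vision at 10 Hz, gyro at 200 Hz
    tilt = step(gyro_rate=0.02, vision_tilt=vis)
print(tilt)
```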

Similarity Measurement using Gabor Energy Feature and Mutual Information for Image Registration

  • Ye, Chul-Soo
    • 대한원격탐사학회지 / Vol. 27, No. 6 / pp.693-701 / 2011
  • Image registration is an essential step in analyzing time series of satellite images for image fusion and change detection. Mutual Information (MI) is commonly used as the similarity measure for image registration because of its robustness to noise. Due to radiometric differences, however, it is not easy to apply MI to multi-temporal satellite images using pixel intensities directly. Richer image features for MI can be obtained by employing Gabor filters whose characteristics, such as filter size, frequency, and orientation, vary adaptively for each pixel. In this paper we employ the Bidirectional Gabor Filter Energy (BGFE), defined from Gabor filter features, and apply the BGFE as the image feature in the MI calculation. The experimental results show that the proposed method is more robust than the conventional MI method computed on intensity or gradient magnitude.
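
The BGFE definition is not given in the abstract, so the sketch below only illustrates the general recipe: compute a Gabor-energy feature image for each scene and score their alignment with histogram-based mutual information. The kernel parameters, the two fixed orientations, and the 32-bin histogram are assumptions rather than the paper's settings.

```python
import numpy as np
from scipy.signal import fftconvolve

# Replace raw intensities by a Gabor-energy feature image, then score a
# candidate alignment with mutual information from the joint histogram.

def gabor_kernel(size=15, freq=0.2, theta=0.0, sigma=3.0):
    """Real Gabor kernel: cosine carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def gabor_energy(img, thetas=(0.0, np.pi / 2)):
    """Sum of squared Gabor responses over the chosen orientations."""
    return sum(fftconvolve(img, gabor_kernel(theta=t), mode="same") ** 2
               for t in thetas)

def mutual_information(a, b, bins=32):
    """MI of two equally sized feature images via their joint histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz]))

rng = np.random.default_rng(0)
ref = rng.random((128, 128))
tgt = ref * 0.7 + 0.3 * rng.random((128, 128))   # radiometrically different copy
score = mutual_information(gabor_energy(ref), gabor_energy(tgt))
print(score)
```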

SWT-based Wavelet Filter Application for De-noising of Remotely Sensed Imageries

  • Yoo Hee-Young;Lee Kiwon;Kwon Byung-Doo
    • 대한원격탐사학회:학술대회논문집 / 대한원격탐사학회 2005년도 Proceedings of ISRS 2005 / pp.505-508 / 2005
  • Wavelet schemes can be applied to various remote sensing problems: conventional multi-resolution image analysis, compression of large image sets, fusion of heterogeneous sensor images, and feature segmentation. In this study, we attempted wavelet-based filtering and its analysis. Traditionally, statistical methods and adaptive filters are used to handle noise in image processing. Here we filtered random noise from optical and radar images using the Discrete Wavelet Transform (DWT) and the Stationary Wavelet Transform (SWT) and compared the results with existing methods such as the median filter and the adaptive filter. As a result, SWT preserved boundaries and reduced noise most effectively. If appropriate thresholds are used, wavelet filtering can be applied not only to noise filtering but also to detecting road boundaries, buildings, cars, and other complex features in high-resolution imagery of urban environments.
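
A minimal sketch of SWT-based de-noising with the PyWavelets package, under assumed settings (db4 wavelet, two levels, soft universal threshold); the paper's actual wavelet choices and thresholds are not stated in the abstract.

```python
import numpy as np
import pywt

# Stationary wavelet transform de-noising: decompose, soft-threshold the
# detail coefficients, reconstruct.  Threshold is the universal threshold
# with the noise level estimated from a diagonal detail band.

def swt_denoise(img, wavelet="db4", level=2):
    coeffs = pywt.swt2(img, wavelet, level=level)        # [(cA, (cH, cV, cD)), ...]
    sigma = np.median(np.abs(coeffs[-1][1][2])) / 0.6745  # finest diagonal band
    thr = sigma * np.sqrt(2 * np.log(img.size))           # universal threshold
    denoised = [(cA, tuple(pywt.threshold(d, thr, mode="soft") for d in details))
                for cA, details in coeffs]
    return pywt.iswt2(denoised, wavelet)

rng = np.random.default_rng(0)
clean = np.kron(rng.random((16, 16)), np.ones((8, 8)))    # blocky 128x128 "scene"
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
print(np.abs(swt_denoise(noisy) - clean).mean(), np.abs(noisy - clean).mean())
```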


An integrated visual-inertial technique for structural displacement and velocity measurement

  • Chang, C.C.;Xiao, X.H.
    • Smart Structures and Systems / Vol. 6, No. 9 / pp.1025-1039 / 2010
  • Measuring the displacement response of civil structures is very important for assessing their performance, safety, and integrity. Recently, video-based techniques that utilize low-cost high-resolution digital cameras have been developed for this application. These techniques, however, have a relatively low sampling frequency, and the results are usually contaminated with noise. In this study, an integrated visual-inertial measurement method that combines a monocular videogrammetric displacement measurement technique with a collocated accelerometer is proposed for displacement and velocity measurement of civil engineering structures. The monocular videogrammetric technique extracts the three-dimensional translation and rotation of a planar target from an image sequence recorded by one camera. The obtained displacement is then fused with the acceleration measured by the collocated accelerometer using a multi-rate Kalman filter with a smoothing technique. This data fusion not only improves the accuracy and frequency bandwidth of the displacement measurement but also provides an estimate of velocity. The proposed measurement technique is illustrated by a shake table test and a pedestrian bridge test. Results show that the fusion of displacement and acceleration mitigates their respective limitations and produces more accurate displacement and velocity responses over a broader frequency bandwidth.
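
A compact sketch of the displacement/acceleration fusion idea, under assumed sampling rates and noise levels: the accelerometer drives a fast prediction of displacement and velocity, the slower vision-based displacement corrects it, and a Rauch-Tung-Striebel pass provides the smoothing. This illustrates the general multi-rate Kalman-plus-smoothing structure, not the authors' implementation.

```python
import numpy as np

# State [displacement, velocity]; 200 Hz acceleration input, 10 Hz
# displacement measurement, RTS smoother over the whole record.

dt = 0.005
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([0.5 * dt**2, dt])
H = np.array([[1.0, 0.0]])                    # vision measures displacement
Q = np.diag([1e-8, 1e-6])
R = np.array([[1e-4]])

def fuse(acc, disp, vision_every=20):
    """acc: high-rate acceleration; disp: displacement sampled at the same instants."""
    x, P = np.zeros(2), np.eye(2)
    xs, Ps, xps, Pps = [], [], [], []
    for k, a in enumerate(acc):
        xp, Pp = A @ x + B * a, A @ P @ A.T + Q            # predict with accel
        x, P = xp, Pp
        if k % vision_every == vision_every - 1:            # slow vision update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (np.array([disp[k]]) - H @ x)
            P = (np.eye(2) - K @ H) @ P
        xs.append(x); Ps.append(P); xps.append(xp); Pps.append(Pp)
    for k in range(len(acc) - 2, -1, -1):                    # RTS smoothing pass
        G = Ps[k] @ A.T @ np.linalg.inv(Pps[k + 1])
        xs[k] = xs[k] + G @ (xs[k + 1] - xps[k + 1])
        Ps[k] = Ps[k] + G @ (Ps[k + 1] - Pps[k + 1]) @ G.T
    return np.array(xs)                                      # smoothed [disp, vel]

t = np.arange(0, 2, dt)
acc = -np.sin(2 * np.pi * t)                                 # synthetic motion
disp = np.sin(2 * np.pi * t) / (2 * np.pi) ** 2
print(fuse(acc, disp)[:3])
```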

Quasi real-time and continuous non-stationary strain estimation in bottom-fixed offshore structures by multimetric data fusion

  • Palanisamy, Rajendra P.;Jung, Byung-Jin;Sim, Sung-Han;Yi, Jin-Hak
    • Smart Structures and Systems / Vol. 23, No. 1 / pp.61-69 / 2019
  • Offshore structures are generally exposed to harsh environments such as strong tidal currents and wind loading. Monitoring the structural soundness and integrity of offshore structures is crucial to prevent catastrophic collapses and to prolong their lifetime; however, it is intrinsically challenging because of the difficulty of accessing the critical structural members located under water to install and repair sensors and data acquisition systems. Virtual sensing technologies can alleviate such difficulties by estimating the unmeasured structural responses at the desired locations from other measured responses. Despite its usefulness, the performance and applicability of virtual sensing to the structural health monitoring of offshore structures have not been fully studied to date. This study investigates virtual sensing for offshore structures. A Kalman filter-based virtual sensing algorithm is developed to estimate responses at the locations of interest. Further, this algorithm performs multi-sensor data fusion to improve the estimation accuracy under non-stationary tidal loading. Numerical analysis and laboratory experiments are conducted to verify the performance of the virtual sensing strategy using a bottom-fixed offshore structural model. Numerical and experimental results show that the unmeasured responses can be reasonably recovered from the measured responses.
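
As an illustration of Kalman-filter virtual sensing (not the paper's algorithm or model), the sketch below runs a filter on an assumed two-degree-of-freedom structural model, updates it with the single measured DOF, and reads the unmeasured DOF off the filtered state.

```python
import numpy as np
from scipy.linalg import expm

# Virtual sensing sketch: a Kalman filter on a known 2-DOF shear model is
# updated with the measured DOF only; the filtered state is then mapped to
# the unmeasured DOF.  Model matrices, noise, and loading are illustrative.

m, k, c = 1.0, 100.0, 0.4
M = np.diag([m, m])
K = np.array([[2 * k, -k], [-k, k]])
C = np.array([[2 * c, -c], [-c, c]])
Ac = np.block([[np.zeros((2, 2)), np.eye(2)],
               [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
dt = 0.01
A = expm(Ac * dt)                       # discrete-time state transition
H = np.array([[1.0, 0.0, 0.0, 0.0]])    # only DOF 1 displacement is measured
Q, R = 1e-4 * np.eye(4), np.array([[1e-6]])

x_true, x_hat, P = np.zeros(4), np.zeros(4), np.eye(4)
rng = np.random.default_rng(0)
virtual = []                            # estimate of the unmeasured DOF 2
for _ in range(500):
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(4), Q)  # ambient loading
    z = H @ x_true + rng.normal(0.0, 1e-3, size=1)
    x_hat, P = A @ x_hat, A @ P @ A.T + Q
    S = H @ P @ H.T + R
    Kg = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + Kg @ (z - H @ x_hat)
    P = (np.eye(4) - Kg @ H) @ P
    virtual.append(x_hat[1])            # virtual sensor: displacement of DOF 2
print(virtual[-1], x_true[1])
```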

Simultaneous Localization and Mobile Robot Navigation using a Sensor Network

  • Jin Tae-Seok;Hashimoto Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 6, No. 2 / pp.161-166 / 2006
  • Localization of a mobile agent within a sensor network is a fundamental requirement for many applications using networked navigation systems such as sonar-sensing or visual-sensing systems. To fully utilize the strengths of both sonar and visual sensing, this paper describes a networked sensor-based navigation method for an autonomous mobile robot that can navigate and avoid obstacles in an indoor environment. In this method, the self-localization of the robot is performed with a model-based vision system using networked sensors, and nonstop navigation is realized by a Kalman filter-based STSF (Space and Time Sensor Fusion) method. Stationary and moving obstacles are avoided using networked sensor data such as CCD camera images and sonar-ring readings. We report on experiments in a hallway using a Pioneer-DX robot. Because localization is subject to inevitable uncertainties in the features and in the robot position estimate, a Kalman filter scheme is used for estimating the mobile robot's location. Extensive experiments with a robot and a sensor network confirm the validity of the approach.
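
The STSF formulation itself is not given in the abstract; the sketch below only shows the generic space/time idea of fusing time-stamped position reports from different networked sensors (here a hypothetical camera and sonar node with different noise levels) in a single constant-velocity Kalman filter.

```python
import numpy as np

# Fuse asynchronous networked-sensor observations of a robot in time order.
# Each sensor reports a time-stamped (x, y) with its own noise covariance.

Q_rate = 0.05
H = np.hstack([np.eye(2), np.zeros((2, 2))])      # all sensors report (x, y)

def predict(x, P, dt):
    F = np.block([[np.eye(2), dt * np.eye(2)],
                  [np.zeros((2, 2)), np.eye(2)]])
    Q = Q_rate * dt * np.eye(4)
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

# (time stamp, sensor name, observed position); synthetic placeholder data
observations = [
    (0.10, "camera", np.array([0.11, 0.02])),
    (0.15, "sonar",  np.array([0.18, 0.05])),
    (0.30, "camera", np.array([0.29, 0.01])),
]
R_by_sensor = {"camera": 0.01 * np.eye(2), "sonar": 0.09 * np.eye(2)}

x, P, t_prev = np.zeros(4), np.eye(4), 0.0
for t, name, z in sorted(observations, key=lambda o: o[0]):
    x, P = predict(x, P, t - t_prev)                  # bring track to report time
    x, P = update(x, P, z, R_by_sensor[name])         # sensor-specific update
    t_prev = t
print(x)
```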

융합된 다중 센서와 EKF 기반의 무인잠수정의 항법시스템 설계 (Navigation System of UUV Using Multi-Sensor Fusion-Based EKF)

  • 박영식;최원석;한성익;이장명
    • 제어로봇시스템학회논문지 / Vol. 22, No. 7 / pp.562-569 / 2016
  • This paper proposes a navigation system with a robust localization method for an unmanned underwater vehicle. For robust localization with an IMU (Inertial Measurement Unit), a DVL (Doppler Velocity Log), and a depth sensor, an EKF (Extended Kalman Filter) is used to fuse the multiple nonlinear data. Note that GPS (Global Positioning System), which provides the absolute coordinates of the vehicle, cannot be used under water. The DVL is therefore used to measure the relative velocity of the underwater vehicle; it measures the velocity of an object by exploiting the Doppler effect, in which the relative velocity between a sound source and an observer shifts the sound frequency. When the vehicle is moving, the motion trajectory to a target position can be recorded by the sensors attached to the vehicle. The performance of the proposed navigation system is verified through real experiments in which the unmanned underwater vehicle reached a target position using the IMU as the primary sensor and the DVL as the secondary sensor.
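
A simplified sketch of the sensor roles described above, with an assumed linear model (so the EKF reduces to a plain Kalman filter) and placeholder measurements: IMU acceleration drives the prediction, the DVL supplies velocity updates, and the depth sensor supplies z-position updates. Frame transformations and the paper's actual noise parameters are omitted.

```python
import numpy as np

# State [position(3), velocity(3)]: IMU-driven prediction, DVL velocity
# updates at 10 Hz, depth (z) updates at 1 Hz.  All values are placeholders.

dt = 0.01
A = np.block([[np.eye(3), dt * np.eye(3)],
              [np.zeros((3, 3)), np.eye(3)]])
B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
Q = 1e-4 * np.eye(6)
H_dvl = np.hstack([np.zeros((3, 3)), np.eye(3)])        # DVL observes velocity
H_depth = np.array([[0, 0, 1, 0, 0, 0]], dtype=float)   # depth sensor observes z
R_dvl, R_depth = 1e-3 * np.eye(3), np.array([[1e-4]])

def kf_update(x, P, z, H, R):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(6) - K @ H) @ P

x, P = np.zeros(6), np.eye(6)
for k in range(1000):                                    # 10 s segment at 100 Hz
    acc = np.array([0.01, 0.0, 0.0])                     # IMU acceleration (placeholder)
    x, P = A @ x + B @ acc, A @ P @ A.T + Q              # IMU-driven prediction
    if k % 10 == 0:                                      # 10 Hz DVL velocity fix
        x, P = kf_update(x, P, np.array([0.1, 0.0, 0.0]), H_dvl, R_dvl)
    if k % 100 == 0:                                     # 1 Hz depth fix
        x, P = kf_update(x, P, np.array([0.5]), H_depth, R_depth)
print(x[:3])                                             # estimated position
```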

다중센서 오차특성을 고려한 융합 알고리즘 (A Fusion Algorithm considering Error Characteristics of the Multi-Sensor)

  • 현대환;윤희병
    • 한국정보과학회논문지:시스템및이론 / Vol. 36, No. 4 / pp.274-282 / 2009
  • Various position-tracking sensors such as GPS, INS, radar, and electro-optical equipment are used to track maneuvering objects, and an effective method for fusing these heterogeneous sensors is needed to maintain robust tracking performance. Previous work on improving tracking with heterogeneous multi-sensors fused the measurements by treating each sensor with a separate model that reflects its error characteristics. However, in intervals where one sensor's error grows sharply, the error of the estimates based on the other sensor also increases, and the sensor probability (the probability that each sensor's measurement represents the true value) could not reflect changes in the sensor measurements in real time. In this paper, the sensor probability is obtained by comparing the RMSE (Root Mean Square Error) between each sensor's Kalman filter updated estimate and its measurements, and the step that feeds the combined estimate back into each sensor's Kalman filter is removed, so that sensor measurements are reflected in real time and tracking performance is improved in intervals where a sensor's performance degrades sharply. The proposed algorithm guarantees robustness by incorporating each sensor's error characteristics as a conditional probability and by performing track fusion centered on the sensor showing the best performance according to its sensor probability. In the experiments, a UAV maneuvering trajectory is generated, the proposed algorithm is applied, and its performance is analyzed against other fusion algorithms.
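
A minimal sketch of the weighting idea described in this abstract, under assumed models and window length: one Kalman filter per sensor, a rolling RMSE between each filter's updated estimate and its measurements, inverse-RMSE weights normalized into per-sensor probabilities, and a weighted track-level combination. It is an illustration, not the authors' algorithm.

```python
import numpy as np

# Per-sensor Kalman filters are scored by the recent RMSE between their
# updated estimates and their measurements; the scores become normalized
# weights ("sensor probabilities") used for track-level fusion.

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)

class SensorKF:
    def __init__(self, r, window=10):
        self.x, self.P, self.R = np.zeros(2), np.eye(2), np.array([[r]])
        self.errors = []                  # recent |measurement - updated estimate|
        self.window = window

    def step(self, z):
        self.x, self.P = F @ self.x, F @ self.P @ F.T + Q
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.array([z]) - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P
        self.errors = (self.errors + [float(z - self.x[0])])[-self.window:]
        return self.x

    def rmse(self):
        return np.sqrt(np.mean(np.square(self.errors)))

rng = np.random.default_rng(0)
radar, optical = SensorKF(r=1.0), SensorKF(r=0.05)
for k in range(100):
    truth = 0.5 * k
    sigma_radar = 1.0 if k < 60 else 10.0          # radar degrades sharply
    x_r = radar.step(truth + rng.normal(0, sigma_radar))
    x_o = optical.step(truth + rng.normal(0, 0.2))
    w = np.array([1.0 / max(radar.rmse(), 1e-6),
                  1.0 / max(optical.rmse(), 1e-6)])
    w /= w.sum()                                    # sensor probabilities
    fused = w[0] * x_r + w[1] * x_o                 # track-level fusion
print(fused, truth)
```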