• Title/Summary/Keyword: Multisensor Fusion


Obstacle Avoidance and Planning using Optimization of Cost Function based Distributed Control Command (분산제어명령 기반의 비용함수 최소화를 이용한 장애물회피와 주행기법)

  • Bae, Dongseog;Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.21 no.3
    • /
    • pp.125-131
    • /
    • 2018
  • In this paper, we propose a homogeneous multisensor-based navigation algorithm for a mobile robot that intelligently searches for the goal location in unknown dynamic environments with moving obstacles, using multiple ultrasonic sensors. Instead of a "sensor fusion" method, which generates the trajectory of the robot from an environment model and sensory data, a "command fusion" method based on fuzzy inference is used to govern the robot's motions. The major factors for robot navigation are represented as a cost function. Using data on the robot's state and the environment, the weight of each factor is determined by fuzzy inference to obtain an optimal trajectory in dynamic environments. To evaluate the proposed algorithm, we performed simulations on a PC as well as real experiments with the mobile robot AmigoBot. The results show that the proposed algorithm identifies obstacles in unknown environments and guides the robot safely to the goal location.
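
As a rough illustration of the cost-function idea in this abstract, the sketch below scores candidate headings with a weighted sum of goal deviation and obstacle proximity. All names, weights, and the obstacle-distance function are hypothetical; the paper's fuzzy inference step for choosing the weights is not reproduced here.

```python
def command_fusion(candidate_headings, goal_heading, obstacle_dist_fn,
                   w_goal, w_obst):
    """Pick the heading minimizing a weighted navigation cost (sketch):
    cost(h) = w_goal * |h - goal| + w_obst / obstacle_distance(h)."""
    best, best_cost = None, float("inf")
    for h in candidate_headings:
        cost = w_goal * abs(h - goal_heading) + w_obst / obstacle_dist_fn(h)
        if cost < best_cost:
            best, best_cost = h, cost
    return best

# Toy scene: obstacles close on the left and straight ahead (headings in rad)
dist = lambda h: 0.3 if h < 0.2 else 2.0
print(command_fusion([-0.5, 0.0, 0.5], 0.0, dist, w_goal=1.0, w_obst=0.5))  # → 0.5
```

Raising `w_obst` makes the robot favor clearance over heading toward the goal, which is the trade-off the paper tunes via fuzzy inference.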

Recognition of contact surfaces using optical tactile and F/T sensors integrated by fuzzy fusion algorithm (광촉각 센서와 힘/역학센서의 퍼지융합을 통한 접촉면의 인식)

  • 고동환;한헌수
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1996.10b
    • /
    • pp.628-631
    • /
    • 1996
  • This paper proposes a surface recognition algorithm that determines the type of contact surface by fusing the information collected by a multisensor system consisting of optical tactile and force/torque (F/T) sensors. Since the image shape measured by the optical tactile sensor, which is used for determining the surface type, varies with the force applied at the measuring moment, the force information measured by the F/T sensor plays an important role. In this paper, an image contour is represented by its long and short axes, which are fuzzified individually by membership functions formulated by observing how the lengths of the axes vary with the applied force. The fuzzified values of the long and short axes are fused using the average Minkowski distance. Compared to the case where only the contour information is used, the proposed algorithm shows about a 14% improvement in the recognition ratio. In particular, when applying the optimal force determined by the experiments, the recognition ratio exceeds 91%.
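
A minimal sketch of fusing the two fuzzified axis values with an average Minkowski distance, as the abstract describes. The prototype membership vectors and surface-type names below are invented for illustration; the paper's actual membership functions depend on the applied force.

```python
def avg_minkowski(a, b, p=2):
    """Average Minkowski distance of order p between two equal-length vectors."""
    return (sum(abs(x - y) ** p for x, y in zip(a, b)) / len(a)) ** (1.0 / p)

def classify_surface(fuzzified, prototypes, p=2):
    """Return the surface type whose prototype membership vector is nearest
    to the fuzzified [long-axis, short-axis] values."""
    return min(prototypes, key=lambda k: avg_minkowski(fuzzified, prototypes[k], p))

# Hypothetical prototypes: membership values for (long axis, short axis)
prototypes = {"flat": [0.9, 0.9], "edge": [0.9, 0.2], "corner": [0.3, 0.3]}
print(classify_surface([0.85, 0.25], prototypes))  # → "edge"
```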


Accurate Vehicle Positioning on a Numerical Map

  • Laneurit Jean;Chapuis Roland;Chausse Frédéric
    • International Journal of Control, Automation, and Systems
    • /
    • v.3 no.1
    • /
    • pp.15-31
    • /
    • 2005
  • Road safety is an important research field today, and one of its principal topics is vehicle localization in the road network. This article presents a multisensor fusion approach able to locate a vehicle with decimeter precision. The information used in this method comes from the following sensors: a low-cost GPS, a digital camera, an odometer, and a steering angle sensor. Taking into account a complete model of the errors on GPS data (bias on position and non-white errors), as well as the data provided by an original approach coupling a vision algorithm with a precise numerical map, allows us to achieve this precision.
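
The core of this kind of multisensor localization is a Kalman-filter measurement update. The scalar sketch below shows only that single fusion step under simple assumptions (one-dimensional position, white noise); the paper's full filter, its GPS error model, and the vision/map coupling are considerably more elaborate.

```python
def kalman_fuse(x, P, z, R):
    """One scalar Kalman measurement update: fuse a predicted estimate
    (x, variance P) with a sensor reading z of variance R."""
    K = P / (P + R)            # Kalman gain: trust the less uncertain source
    x_new = x + K * (z - x)    # corrected position estimate
    P_new = (1 - K) * P        # uncertainty shrinks after fusion
    return x_new, P_new

# Odometry-predicted position (variance 4 m^2) fused with a GPS fix (variance 1 m^2)
x, P = kalman_fuse(10.0, 4.0, 12.0, 1.0)
print(x, P)  # estimate moves toward the more accurate GPS reading
```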

Hierarchical Clustering Approach of Multisensor Data Fusion: Application of SAR and SPOT-7 Data on Korean Peninsula

  • Lee, Sang-Hoon;Hong, Hyun-Gi
    • Proceedings of the KSRS Conference
    • /
    • 2002.10a
    • /
    • pp.65-65
    • /
    • 2002
  • In remote sensing, images are acquired over the same area by sensors of different spectral ranges (from the visible to the microwave) and/or with different numbers, positions, and widths of spectral bands. These images are generally partially redundant, as they represent the same scene, and partially complementary. For many image classification applications, the information provided by a single sensor is often incomplete or imprecise, resulting in misclassification. Fusion with redundant data can draw more consistent inferences for the interpretation of the scene and can thereby improve classification accuracy. The common approach to classifying multisensor data, a data fusion scheme at the pixel level, is to concatenate the data into one vector as if they were measurements from a single sensor. However, the multiband data acquired by a single multispectral sensor or by two or more different sensors are not completely independent, and a certain degree of informative overlap may exist between the observation spaces of the different bands. This dependence may make the data less informative and should be properly modeled in the analysis so that its effect can be eliminated. To model and eliminate the effect of such dependence, this study employs a strategy using self and conditional information variation measures. The self information variation reflects the self-certainty of the individual bands, while the conditional information variation reflects the degree of dependence between the different bands. One data set might be far less reliable than the others and may even degrade the classification results; such an unreliable data set should be excluded from the analysis. To account for this, the self information variation is used to measure the degree of reliability. A team of positively dependent bands can jointly gather more information than a team of independent ones, but when bands are negatively dependent, their combined analysis may yield worse information. Using the conditional information variation measure, the multiband data are split into two or more subsets according to the dependence between the bands. Each subset is classified separately, and a data fusion scheme at the decision level is applied to integrate the individual classification results. In this study, a two-level algorithm using a hierarchical clustering procedure is used for unsupervised image classification. The hierarchical clustering algorithm is based on similarity measures between all pairs of candidates being considered for merging. In the first level, the image is partitioned into regions, which are sets of spatially contiguous pixels such that no union of adjacent regions is statistically uniform. The regions resulting from the first level are then clustered into a parsimonious number of groups according to their statistical characteristics. The algorithm has been applied to satellite multispectral data and airborne SAR data.
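
The paper's self and conditional information variation measures are its own constructs; a closely related, standard way to quantify informative overlap between two bands is mutual information, sketched here on toy quantized band values. This is only an illustration of measuring band dependence, not the paper's exact measure.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a discrete sample."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): informative overlap between two bands."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy quantized pixel values from three bands
b1 = [0, 0, 1, 1, 0, 1, 0, 1]
b2 = [0, 0, 1, 1, 0, 1, 1, 0]  # mostly follows b1 -> strong dependence
b3 = [1, 0, 0, 1, 1, 0, 0, 1]  # statistically unrelated to b1
print(mutual_information(b1, b2), mutual_information(b1, b3))
```

Bands with high mutual information carry overlapping evidence and, per the abstract, would be grouped into the same subset before decision-level fusion.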


Precise Geometric Registration of Aerial Imagery and LIDAR Data

  • Choi, Kyoung-Ah;Hong, Ju-Seok;Lee, Im-Pyeong
    • ETRI Journal
    • /
    • v.33 no.4
    • /
    • pp.506-516
    • /
    • 2011
  • In this paper, we develop a registration method to eliminate the geometric inconsistency between stereo-images and light detection and ranging (LIDAR) data obtained by an airborne multisensor system. This method consists of three steps: registration primitive extraction, correspondence establishment, and exterior orientation parameter (EOP) adjustment. As the primitives, we employ object points and linked edges from the stereo-images and planar patches and intersection edges from the LIDAR data. After extracting these primitives, we establish the correspondences between them, classified into vertical and horizontal groups. These corresponding pairs are simultaneously incorporated as stochastic constraints into aerial triangulation based on the bundle block adjustment. Finally, the EOPs of the images are adjusted to minimize the inconsistency. The results from applying our method to real data demonstrate that the inconsistency between the two data sets is significantly reduced, from a range of 0.5 m to 2 m down to less than 0.05 m. Hence, the proposed method is useful for the fusion of aerial images and LIDAR data.
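
One kind of constraint such an adjustment minimizes is the distance between a photogrammetric object point and its corresponding LIDAR planar patch. The residual below is a generic point-to-plane distance, shown only to make the idea concrete; the paper's actual stochastic constraints and bundle block adjustment are far more involved.

```python
def point_to_plane_residual(point, plane):
    """Signed distance from an object point (x, y, z) to a planar patch
    given as (a, b, c, d) with unit normal, i.e. ax + by + cz + d = 0."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

# A stereo-derived point 0.05 m above a horizontal LIDAR patch (the plane z = 0):
# the adjustment would perturb the image EOPs to drive this residual toward zero.
print(point_to_plane_residual((1.0, 2.0, 0.05), (0.0, 0.0, 1.0, 0.0)))
```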

Dempster-Shafer Fusion of Multisensor Imagery Using Gaussian Mass Function (Gaussian분포의 질량함수를 사용하는 Dempster-Shafer영상융합)

  • Lee Sang-Hoon
    • Korean Journal of Remote Sensing
    • /
    • v.20 no.6
    • /
    • pp.419-425
    • /
    • 2004
  • This study proposes a data fusion method based on the Dempster-Shafer evidence theory. The Dempster-Shafer fusion uses mass functions obtained under a class-independent Gaussian assumption. In the Dempster-Shafer approach, uncertainty is represented by the 'belief interval', equal to the difference between the values of the 'belief' and 'plausibility' functions, which measure imprecision and uncertainty. By utilizing the Dempster-Shafer scheme to fuse data from multiple sensors, the classification results can be improved, and users can consider regions with mixed classes in the training process; in most practical cases, it is hard to find regions with a pure class. In this study, the proposed method was applied to the KOMPSAT-EOC panchromatic image and LANDSAT ETM+ NDVI data acquired over the Yongin/Nuengpyung area of Kyunggi-do. The results show that the method has potential for effective fusion of multisensor imagery.
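
The combination step the abstract relies on is Dempster's rule, sketched below for two sensors over a tiny two-class frame. The class names and mass values are invented for illustration; in the paper the masses come from Gaussian class models, which this sketch does not reproduce.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same
    frame. Masses map frozenset hypotheses to belief mass; conflicting mass
    (empty intersections) is removed by normalization."""
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

water, land = frozenset({"water"}), frozenset({"land"})
both = water | land                       # total ignorance: either class
m_pan = {water: 0.6, both: 0.4}           # evidence from the panchromatic image
m_ndvi = {land: 0.3, both: 0.7}           # evidence from the NDVI data
print(dempster_combine(m_pan, m_ndvi))
```

Mass assigned to the whole frame (`both`) is exactly how mixed or uncertain training regions are accommodated, as the abstract notes.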

Matching Points Extraction Between Optical and TIR Images by Using SURF and Local Phase Correlation (SURF와 지역적 위상 상관도를 활용한 광학 및 열적외선 영상 간 정합쌍 추출)

  • Han, You Kyung;Choi, Jae Wan
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.23 no.1
    • /
    • pp.81-88
    • /
    • 2015
  • Satellite sensors covering the visible, infrared, and thermal wavelength ranges have been launched as satellite sensor hardware has improved, and with this development the fusion and integration of multisensor images have advanced. Image matching is an essential step for applications of multisensor images. Algorithms such as SIFT and SURF have been proposed to co-register satellite images. However, when the existing algorithms are applied to extract matching points between optical and thermal images, high co-registration accuracy might not be guaranteed, because these images have different spectral and spatial characteristics. In this paper, the locations of control points in a reference image are extracted by SURF, and the locations of their corresponding pairs are then estimated from a local similarity measure. For the local similarity, the phase correlation method, which is based on the Fourier transform, is applied. In experiments with simulated, Landsat-8, and ASTER datasets, the proposed algorithm extracted more reliable matching points than the existing SURF-based method.
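
The local phase correlation step can be sketched as follows (assuming NumPy is available): the normalized cross-power spectrum of two same-size patches yields a correlation surface whose peak gives the translation between them. This is a generic integer-shift version; the paper's pipeline around it (SURF control points, subpixel refinement) is not shown.

```python
import numpy as np

def phase_correlation_shift(ref, tgt):
    """Estimate the integer translation of tgt relative to ref via the
    normalized cross-power spectrum (phase correlation)."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(tgt)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real         # peak marks the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape                        # wrap large shifts to negative offsets
    if dy > h // 2: dy -= h
    if dx > w // 2: dx -= w
    return dy, dx

ref = np.zeros((32, 32)); ref[10:14, 10:14] = 1.0
tgt = np.roll(np.roll(ref, 3, axis=0), -2, axis=1)   # shift by (+3, -2)
print(phase_correlation_shift(ref, tgt))
```

Because phase correlation uses only spectral phase, it tolerates the intensity differences between optical and thermal patches better than direct intensity correlation, which is why the paper pairs it with SURF.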

Design of a vehicle navigation system using the federated kalman filter (연합형 칼만 필터를 이용한 차량항법시스템의 설계)

  • 김진원;지규인;이장규
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1997.10a
    • /
    • pp.1348-1351
    • /
    • 1997
  • The federated Kalman filter (FKF) is widely used in many multisensor navigation systems. It is known that the FKF has advantages of simplicity and fault tolerance over other decentralized filter techniques. In this paper, optimal and suboptimal FKF configurations are discussed, and a covariance analysis technique for the suboptimal FKF is newly presented. The suboptimal FKF configuration, known as the no-reset (NR) mode, has better fault-tolerance capability than the optimal FKF configuration. In the suggested technique, a suboptimal fusion process of the FKF is considered as well as suboptimal gains of the local filters. An upper bound on the error covariance of the suboptimal FKF is derived, and it is shown mathematically that this bound is smaller than the existing bound in the literature. A vehicle navigation system is designed using the FKF; in this system, a map constraint equation is introduced and used as a measurement equation of the Kalman filter. Performance analysis is carried out with the suggested covariance analysis technique.
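
The master fusion step of a federated Kalman filter combines local filter outputs in information form. The scalar sketch below shows that step only, under simple assumptions (scalar state, uncorrelated local estimates); the paper's NR-mode information-sharing and covariance bound are not reproduced.

```python
def fkf_fuse(estimates):
    """Federated Kalman filter master fusion (information form, scalar state):
    1/P_f = sum(1/P_i),  x_f = P_f * sum(x_i / P_i)."""
    info = sum(1.0 / P for _, P in estimates)
    P_f = 1.0 / info
    x_f = P_f * sum(x / P for x, P in estimates)
    return x_f, P_f

# Local filters: e.g. a GPS/INS filter (estimate 100.0, var 4) and a
# map-matching filter (estimate 101.0, var 1); names are illustrative
print(fkf_fuse([(100.0, 4.0), (101.0, 1.0)]))
```

Because each local filter runs independently, a faulty one can simply be dropped from `estimates`, which is the fault-tolerance property the abstract highlights.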


Effectiveness Analysis of Multistatic Sonar Network (Multistatic 소나망의 효과도 분석)

  • Goo Bonhwa;Hong Wooyoung;Ko Hanseok
    • Proceedings of the Acoustical Society of Korea Conference
    • /
    • autumn
    • /
    • pp.475-478
    • /
    • 2004
  • In this paper, the effectiveness of a multistatic sonar network is analyzed; in particular, its utility is examined through an analysis of its detection performance. A multistatic sonar network is a kind of distributed multi-sensor system in which transmitters and receivers are separated, and achieving optimal detection performance requires an appropriate fusion rule and sensor placement. A fusion method based on Bayesian decision theory is applied as the distributed sensor fusion technique, and, to analyze detection performance under real ocean conditions, a multistatic sonar network detection model using an improved bistatic target strength model and a range-dependent transmission loss model is proposed. The superiority of the multistatic sonar network over a conventional sonar network is demonstrated through simulated comparisons.
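
A common Bayesian rule for fusing binary detections from distributed receivers is a log-likelihood-ratio test over the local decisions (in the style of the Chair-Varshney rule). The sketch below assumes each receiver's detection and false-alarm probabilities are known; the paper's sonar-specific target strength and transmission loss models are not included.

```python
import math

def bayes_decision_fusion(decisions, pd, pfa, prior_target=0.5):
    """Fuse binary local detections with a Bayesian log-likelihood-ratio
    test; pd/pfa are each receiver's detection/false-alarm probabilities."""
    llr = math.log(prior_target / (1.0 - prior_target))
    for u, d, f in zip(decisions, pd, pfa):
        if u:   # receiver reported a detection
            llr += math.log(d / f)
        else:   # receiver reported no detection
            llr += math.log((1.0 - d) / (1.0 - f))
    return llr > 0.0   # True -> declare target present

# Three receivers; the two more reliable ones detect, the weakest does not
print(bayes_decision_fusion([1, 1, 0], pd=[0.9, 0.8, 0.6], pfa=[0.05, 0.1, 0.3]))
```

Note that each receiver's vote is weighted by its reliability: a detection from a low-false-alarm receiver moves the ratio far more than one from a noisy receiver.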


A Fusion Algorithm considering Error Characteristics of the Multi-Sensor (다중센서 오차특성을 고려한 융합 알고리즘)

  • Hyun, Dae-Hwan;Yoon, Hee-Byung
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.36 no.4
    • /
    • pp.274-282
    • /
    • 2009
  • Various location tracking sensors, such as GPS, INS, radar, and optical equipment, are used for tracking moving targets, and tracking them effectively requires an effective fusion method for these heterogeneous devices. Previous studies treated the estimated values of the sensors as different models and fused them, considering the sensors' different error characteristics, to improve tracking performance with heterogeneous multi-sensors. However, when one sensor's errors increased sharply, the error in the fused estimates also increased, and the approach of substituting the estimated sensor values according to the sensor probability could not be applied in real time. In this study, the sensor probability is obtained by comparing the RMSE (root mean square error) of the difference between the updated and measured values of the Kalman filter for each sensor, and the step of substituting new combined values for each sensor's Kalman filter inputs is eliminated. This improves both the real-time applicability of the estimated sensor values and the tracking performance in regions where sensor performance drops sharply. The proposed algorithm incorporates the error characteristic of each sensor as a conditional probability value and achieves greater accuracy by performing track fusion with the most reliable sensors. In the experiments, a UAV trajectory is generated and a performance analysis is conducted against other fusion algorithms.
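
One simple way to realize the idea of weighting tracks by per-sensor error, as this abstract describes, is to turn each sensor's innovation RMSE into a normalized fusion weight. This inverse-RMSE weighting is a hypothetical stand-in for the paper's conditional sensor probability, shown only to make the mechanism concrete.

```python
def sensor_weights_from_rmse(rmse):
    """Turn per-sensor innovation RMSEs into fusion weights:
    lower error -> higher weight (inverse-RMSE, normalized to sum to 1)."""
    inv = [1.0 / r for r in rmse]
    total = sum(inv)
    return [v / total for v in inv]

def fuse_tracks(estimates, rmse):
    """Weighted combination of per-sensor track estimates."""
    return sum(w * x for w, x in zip(sensor_weights_from_rmse(rmse), estimates))

# Hypothetical GPS, INS, and radar position estimates with their RMSEs
print(fuse_tracks([100.2, 99.5, 101.0], [0.5, 1.0, 2.0]))
```

When a sensor's RMSE spikes, its weight collapses automatically, so the fused track leans on the sensors that currently perform well, which matches the paper's goal for regions of degraded sensor performance.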