• Title/Summary/Keyword: Multi-sensor fusion

Improvement of Land Cover Classification Accuracy by Optimal Fusion of Aerial Multi-Sensor Data

  • Choi, Byoung Gil;Na, Young Woo;Kwon, Oh Seob;Kim, Se Hun
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.36 no.3 / pp.135-152 / 2018
  • The purpose of this study is to propose an optimal fusion method for aerial multi-sensor data to improve the accuracy of land cover classification. Recently, in environmental impact assessment and land monitoring, high-resolution aerial multi-sensor imagery has been acquired over many regions for quantitative land management, but most of it is used only for the original project. Hyperspectral sensor data, which is mainly used for land cover classification, offers high classification accuracy, but it is difficult to classify the land cover state accurately because only the visible and near-infrared wavelengths are acquired and the spatial resolution is low. There is therefore a need for research that improves land cover classification accuracy by fusing hyperspectral sensor data with multispectral sensor and aerial laser sensor data. As fusion methods for aerial multi-sensor data, we propose a pixel ratio adjustment method, a band accumulation method, and a spectral graph adjustment method. Fusion parameters such as the fusion rate, number of accumulated bands, and spectral graph expansion ratio were selected for each method, and fused data were generated and land cover classification accuracy was computed while incrementally varying these parameters. Optimal fusion parameters for the hyperspectral, multispectral, and aerial laser data were derived by considering the correlation between classification accuracy and the fusion parameters.
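
The band accumulation idea described above (stacking co-registered hyperspectral, multispectral, and LiDAR-derived layers into one classification cube) can be sketched as follows. The array shapes, the per-band normalization, and the assumption of a shared grid are illustrative choices, not the authors' implementation.

```python
import numpy as np

def accumulate_bands(hyper, multi, lidar_dsm):
    """Stack co-registered sensor rasters into one feature cube.

    hyper:     (H, W, Bh) hyperspectral bands
    multi:     (H, W, Bm) multispectral bands
    lidar_dsm: (H, W)     LiDAR-derived surface height
    All inputs are assumed already resampled to a common grid.
    """
    def norm(x):
        # Scale each band to [0, 1] so no sensor dominates the classifier.
        x = x.astype(float)
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else x * 0.0

    layers = [norm(hyper[..., b]) for b in range(hyper.shape[-1])]
    layers += [norm(multi[..., b]) for b in range(multi.shape[-1])]
    layers.append(norm(lidar_dsm))
    return np.stack(layers, axis=-1)

# Toy 4x4 scene: 3 hyperspectral bands, 2 multispectral bands, 1 height layer.
cube = accumulate_bands(np.random.rand(4, 4, 3),
                        np.random.rand(4, 4, 2),
                        np.random.rand(4, 4))
```

The resulting cube can be fed per-pixel to any supervised classifier; the paper's other two methods (pixel ratio adjustment, spectral graph adjustment) would modify the band values rather than simply stacking them.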

An Efficient Outdoor Localization Method Using Multi-Sensor Fusion for Car-Like Robots (다중 센서 융합을 사용한 자동차형 로봇의 효율적인 실외 지역 위치 추정 방법)

  • Bae, Sang-Hoon;Kim, Byung-Kook
    • Journal of Institute of Control, Robotics and Systems / v.17 no.10 / pp.995-1005 / 2011
  • An efficient outdoor localization method using multi-sensor fusion with an MU-EKF (Multi-Update Extended Kalman Filter) is proposed for car-like mobile robots. In outdoor environments, where mobile robots are used for exploration or military service, accurate localization with multiple sensors is indispensable. In this paper, a multi-sensor fusion outdoor localization algorithm is proposed that fuses data from an LRF (Laser Range Finder), encoders, and GPS. First, encoder data is used in the prediction stage of the MU-EKF. The LRF data obtained by scanning the environment is then used to extract objects, and the robot position and orientation are estimated by matching them against map objects, as the first update stage of the MU-EKF. This estimate is finally fused with GPS as the second update stage. The MU-EKF algorithm can efficiently fuse three or more sensor streams even with different sampling periods, and ensures high localization accuracy. The validity of the proposed algorithm is demonstrated via experiments.
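
The predict-then-double-update cycle summarized above can be illustrated with a deliberately simplified one-dimensional Kalman filter: an encoder-driven prediction followed by two sequential measurement updates. The scalar state and all noise values are assumptions for illustration, not the paper's actual MU-EKF.

```python
def mu_kf_step(x, P, u, z_lrf, z_gps, q=0.1, r_lrf=0.05, r_gps=1.0):
    """One predict + two sequential measurement updates (1-D toy MU-EKF).

    x, P  : state estimate and its variance
    u     : encoder displacement (prediction input)
    z_lrf : position inferred from LRF map matching (first update)
    z_gps : GPS position (second update)
    """
    # Prediction with encoder odometry; uncertainty grows by q.
    x, P = x + u, P + q
    # Sequential updates: each measurement is fused in turn.
    for z, r in ((z_lrf, r_lrf), (z_gps, r_gps)):
        k = P / (P + r)          # Kalman gain
        x = x + k * (z - x)      # innovation correction
        P = (1 - k) * P          # variance shrinks after each fusion
    return x, P

x, P = 0.0, 1.0
x, P = mu_kf_step(x, P, u=1.0, z_lrf=1.02, z_gps=0.8)
```

Because each update is applied independently, a third sensor is just another `(z, r)` pair in the loop, which mirrors the paper's claim that more than three sensors with different sampling periods can be fused efficiently.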

Simulation of Mobile Robot Navigation based on Multi-Sensor Data Fusion by Probabilistic Model

  • Jin, Tae-seok
    • Journal of the Korean Society of Industry Convergence / v.21 no.4 / pp.167-174 / 2018
  • The exploration of unknown environments is an important task in the development of mobile robots, and mobile robots are navigated by a number of methods, using sensing systems such as sonar or vision. To fully exploit the strengths of both the sonar and visual sensing systems, multi-sensor data fusion (MSDF) has become a useful method for navigation and collision avoidance in mobile robotics, and its applicability to map building and navigation has been exploited in recent years. This work is a preliminary step toward developing a multi-purpose autonomous carrier mobile robot to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, with ultrasonic and IR sensors, for mobile robot navigation, and to present an experimental mobile robot designed to operate autonomously in indoor environments. Simulation results with the mobile robot demonstrate the effectiveness of the discussed methods.
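
A minimal probabilistic fusion of the two range sensors mentioned above (ultrasonic and IR) is inverse-variance weighting, which yields the maximum-likelihood distance under Gaussian noise. The sensor variances below are assumed values for illustration; the paper's own fusion model may differ.

```python
def fuse_ranges(readings):
    """Probabilistic (inverse-variance) fusion of range readings.

    readings: list of (distance, variance) pairs, e.g. from an
    ultrasonic sensor and an IR sensor aimed at the same obstacle.
    Returns the fused distance and its (smaller) variance.
    """
    inv_vars = [1.0 / var for _, var in readings]
    fused_var = 1.0 / sum(inv_vars)
    fused = fused_var * sum(d / var for d, var in readings)
    return fused, fused_var

# Ultrasonic: 2.0 m but noisy; IR: 1.8 m and more precise.
d, v = fuse_ranges([(2.0, 0.04), (1.8, 0.01)])
```

The fused estimate is pulled toward the more precise IR reading, and the fused variance is smaller than either sensor's alone, which is the basic payoff of MSDF for collision avoidance.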

Visual Control of Mobile Robots Using Multisensor Fusion System

  • Kim, Jung-Ha;Sugisaka, Masanori
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.91.4-91 / 2001
  • In this paper, the development of a sensor fusion algorithm for visual control of a mobile robot is presented. The output data from the visual sensor include a time lag due to the image processing computation, and the sampling rate of the visual sensor is considerably low, so it should be combined with other sensors to control fast motion. The main purpose of this paper is to develop a method that constitutes a sensor fusion system giving optimal state estimates. The proposed sensor fusion system combines the visual sensor and an inertial sensor using a modified Kalman filter. A kind of multi-rate Kalman filter which treats the slow sampling rate ...
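
The multi-rate idea in this (truncated) abstract can be sketched as a filter that propagates with fast inertial data at every step and applies a position correction only when a slow visual fix arrives. The rates, noise values, and scalar state are illustrative assumptions, not the paper's modified Kalman filter.

```python
def multirate_fuse(accel, vision, dt=0.01, vision_every=10,
                   q=1e-4, r=0.01):
    """Toy multi-rate filter: inertial data every step, a visual
    position fix only every `vision_every` steps (slow camera rate).

    accel : list of accelerations (fast inertial stream)
    vision: list of position fixes (slow visual stream)
    """
    x, v, P = 0.0, 0.0, 1.0
    fixes = iter(vision)
    for i, a in enumerate(accel):
        v += a * dt                      # inertial propagation
        x += v * dt
        P += q                           # uncertainty grows between fixes
        if (i + 1) % vision_every == 0:  # slow visual update arrives
            z = next(fixes)
            k = P / (P + r)
            x += k * (z - x)
            P *= (1 - k)
    return x, P

# 20 inertial samples at 100 Hz, 2 visual fixes at 10 Hz.
x, P = multirate_fuse([1.0] * 20, [0.005, 0.02])
```

Handling the visual time lag mentioned in the abstract would additionally require applying each fix against the state at its capture time, which this sketch omits.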

A Study on a Multi-sensor Information Fusion Architecture for Avionics (항공전자 멀티센서 정보 융합 구조 연구)

  • Kang, Shin-Woo;Lee, Seoung-Pil;Park, Jun-Hyeon
    • Journal of Advanced Navigation Technology / v.17 no.6 / pp.777-784 / 2013
  • The synthesis of data produced by different types of sensors into a single piece of information is being studied and used on a variety of platforms under the name of multi-sensor data fusion. Heterogeneous sensors have been integrated into various aircraft, and modern avionic systems manage them. As the performance of aircraft sensors increases, the integration of sensor information is increasingly required from the avionics viewpoint. Information fusion has not been widely studied from the software perspective of providing a pilot with fused information, derived from sensor data, in the form of symbology on a display device. The purpose of information fusion is to assist pilots in decision-making during missions by presenting the correct combat situation from the aircraft's avionics, and consequently to minimize their workload. For aircraft avionics equipped with different types of sensors, this paper presents a software architecture that produces comprehensive information for the user from sensor data through a multi-sensor data fusion process.

Sensor fault diagnosis for bridge monitoring system using similarity of symmetric responses

  • Xu, Xiang;Huang, Qiao;Ren, Yuan;Zhao, Dan-Yang;Yang, Juan
    • Smart Structures and Systems / v.23 no.3 / pp.279-293 / 2019
  • To ensure that high-quality data are used for data mining and feature extraction in bridge structural health monitoring (SHM) systems, a practical sensor fault diagnosis methodology has been developed based on the similarity of symmetric structural responses. First, the similarity of symmetric responses is discussed using field monitoring data from different sensor types. All sensors are initially paired, and sensor faults are then detected pair by pair to achieve multi-fault diagnosis of the sensor system. To resolve the coupling between structural damage and sensor faults, the similarity for the target zone (where the studied sensor pair is located) is assessed to determine whether localized structural damage or a sensor fault causes the dissimilarity of the studied pair. If a suspected pair is detected with at least one faulty sensor, a field test can be implemented to support regression analysis, based on the monitoring and field test data, for sensor fault isolation and reconstruction. Dasarathy's information fusion model is adopted for multi-sensor information fusion, and Euclidean distance is selected as the index to assess similarity. Finally, a case study demonstrates the effectiveness of the proposed methodology, which is practical for real engineering and ensures the reliability of further analysis based on monitoring data.
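
The pair-wise screening step above (Euclidean distance between the responses of a symmetric sensor pair, flagged against a threshold) can be sketched as follows. The response values and the threshold are assumed for illustration.

```python
import math

def pair_dissimilarity(resp_a, resp_b):
    """Euclidean distance between the response histories of a
    symmetric sensor pair -- the similarity index used here."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(resp_a, resp_b)))

def screen_pairs(pairs, threshold):
    """Flag pairs whose responses diverge beyond the threshold.
    A flag means either a faulty sensor or local structural damage;
    the paper resolves the ambiguity with a follow-up field test."""
    return [i for i, (a, b) in enumerate(pairs)
            if pair_dissimilarity(a, b) > threshold]

healthy = ([1.0, 1.1, 0.9], [1.0, 1.05, 0.95])
faulty  = ([1.0, 1.1, 0.9], [3.0, 0.0, 5.0])
flags = screen_pairs([healthy, faulty], threshold=1.0)
```

Only the second pair is flagged; in practice the threshold would be calibrated from historical monitoring data rather than fixed by hand.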

Design of a Multi-Sensor Data Simulator and Development of Data Fusion Algorithm (다중센서자료 시뮬레이터 설계 및 자료융합 알고리듬 개발)

  • Lee, Yong-Jae;Lee, Ja-Seong;Go, Seon-Jun;Song, Jong-Hwa
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.34 no.5 / pp.93-100 / 2006
  • This paper presents a multi-sensor data simulator and a data fusion algorithm for tracking a highly dynamic flight target from radar and telemetry systems. The designed simulator generates time-asynchronous multi-sensor data with different data rates and communication delays, and measurement noise is incorporated using realistic sensor models. The proposed fusion algorithm is designed as a 21st-order distributed Kalman filter based on the PVA model with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad data and sensor faults. The designed algorithm is verified using both simulation data and real data.
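
The bad-data handling mentioned above is commonly done by innovation gating: a measurement whose innovation exceeds a few standard deviations is rejected before it can corrupt the track. The sketch below reduces the paper's 21-state PVA filter to one dimension; the gate size and noise values are assumptions.

```python
def gated_update(x, P, z, r, gate=3.0):
    """Scalar Kalman update with innovation gating.

    A measurement whose innovation exceeds `gate` standard deviations
    of the innovation variance is treated as bad data and rejected.
    Returns the (possibly unchanged) state, variance, and a flag.
    """
    s = P + r                        # innovation variance
    nu = z - x                       # innovation
    if nu * nu > gate * gate * s:    # chi-square style gate
        return x, P, False           # reject: likely fault or outlier
    k = P / s
    return x + k * nu, (1 - k) * P, True

x, P = 10.0, 0.5
x, P, ok1 = gated_update(x, P, z=10.3, r=0.5)  # plausible measurement
x, P, ok2 = gated_update(x, P, z=50.0, r=0.5)  # gross outlier
```

The plausible measurement is fused and the outlier is dropped with the state left intact; a correction step, as in the paper, could instead substitute a predicted value for the rejected sample.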

Classification of Multi-sensor Remote Sensing Images Using Fuzzy Logic Fusion and Iterative Relaxation Labeling (퍼지 논리 융합과 반복적 Relaxation Labeling을 이용한 다중 센서 원격탐사 화상 분류)

  • Park No-Wook;Chi Kwang-Hoon;Kwon Byung-Doo
    • Korean Journal of Remote Sensing / v.20 no.4 / pp.275-288 / 2004
  • This paper presents a fuzzy relaxation labeling approach incorporated into a fuzzy logic fusion scheme for the classification of multi-sensor remote sensing images. The fuzzy logic fusion and iterative relaxation labeling techniques are adopted, respectively, to effectively integrate multi-sensor remote sensing images and to incorporate spatial neighborhood information into spectral information for contextual classification. In particular, the iterative relaxation labeling approach can provide additional information depicting the spatial distributions of pixels as they are updated by spatial information. Experimental results for supervised land-cover classification using optical and multi-frequency/polarization images indicate that the use of multi-sensor images and spatial information can improve classification accuracy.
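
The two stages described above can be sketched on a toy strip of pixels: per-sensor class memberships are combined by a fuzzy operator, then one relaxation iteration boosts each pixel's classes using its neighbors' memberships. The algebraic-product operator, the 1-D neighborhood, and the compatibility weight are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def fuse_memberships(sensor_memberships):
    """Fuzzy fusion: combine per-sensor class memberships with an
    algebraic product and renormalize each pixel's row to sum to 1."""
    fused = np.prod(sensor_memberships, axis=0)
    return fused / fused.sum(axis=-1, keepdims=True)

def relax_step(labels, compat=0.8):
    """One relaxation-labeling iteration on a 1-D strip of pixels:
    each pixel's class support is raised by its neighbors' current
    memberships, then the result is renormalized."""
    support = labels.copy()
    support[1:] += compat * labels[:-1]    # left neighbor
    support[:-1] += compat * labels[1:]    # right neighbor
    updated = labels * support
    return updated / updated.sum(axis=-1, keepdims=True)

# 3 pixels, 2 classes; memberships from an optical and a second sensor.
optical = np.array([[0.7, 0.3], [0.6, 0.4], [0.2, 0.8]])
second  = np.array([[0.8, 0.2], [0.5, 0.5], [0.4, 0.6]])
fused = fuse_memberships(np.stack([optical, second]))
refined = relax_step(fused)
```

After one iteration, the dominant class of each pixel is reinforced where neighbors agree, which is the contextual effect the paper exploits over several iterations on a 2-D neighborhood.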

Development of Multi-sensor Image Fusion Software (InFusion) for Value-added Applications (고부가 활용을 위한 이종영상 융합 소프트웨어(InFusion) 개발)

  • Choi, Myung-jin;Chung, Inhyup;Ko, Hyeong Ghun;Jang, Sumin
    • Journal of Satellite, Information and Communications / v.12 no.3 / pp.15-21 / 2017
  • Following the successful launches of KOMPSAT-3 in May 2012, KOMPSAT-5 in August 2013, and KOMPSAT-3A in March 2015, Korea has achieved integrated operation of optical, radar, and thermal infrared sensors and established a foundation for exploiting the characteristics of each sensor. To overcome the limitations in application range and accuracy of a single sensor, multi-sensor image fusion techniques have been developed that take advantage of multiple sensors complementing each other. In this paper, we introduce the development of software (InFusion) for multi-sensor image fusion and value-added product generation using the KOMPSAT series. We first describe the characteristics of each sensor and the necessity of fusion software development, and then describe the entire development process. The work aims to increase data utilization of the KOMPSAT series and to demonstrate the quality of domestic software through the creation of high-value-added products.

Motion Estimation of 3D Planar Objects using Multi-Sensor Data Fusion (센서 융합을 이용한 움직이는 물체의 동작예측에 관한 연구)

  • Yang, Woo-Suk
    • Journal of Sensor Science and Technology / v.5 no.4 / pp.57-70 / 1996
  • Motion can be estimated continuously from each sensor through analysis of the instantaneous states of an object. This paper introduces a method for estimating the general 3D motion of a planar object from its instantaneous states using multi-sensor data fusion. The instantaneous states of the object are estimated using a linear feedback estimation algorithm, and the motion estimated from each sensor is fused to provide more accurate and reliable information about the motion of the unknown planar object. We present a fusion algorithm which combines averaging and deciding. Under the assumption that the motion is smooth, the approach can handle data sequences from multiple sensors with different sampling times. Simulation results show that the proposed algorithm is advantageous in terms of accuracy, speed, and versatility.
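
A fusion rule that "combines averaging and deciding", as the abstract puts it, can be sketched as: decide which per-sensor estimates are consistent (here, by distance from the median), then average the survivors with inverse-variance weights. The deviation threshold and variances are illustrative assumptions.

```python
def fuse_motion(estimates, max_dev=0.5):
    """Deciding-then-averaging fusion of per-sensor motion estimates.

    estimates: list of (value, variance) pairs, one per sensor.
    Deciding: discard estimates farther than max_dev from the median.
    Averaging: inverse-variance weighted mean of the survivors.
    """
    vals = sorted(v for v, _ in estimates)
    median = vals[len(vals) // 2]
    kept = [(v, var) for v, var in estimates if abs(v - median) <= max_dev]
    weights = [1.0 / var for _, var in kept]
    return sum(v / var for v, var in kept) / sum(weights)

# Three sensors estimate the same velocity component; one is an outlier.
v = fuse_motion([(1.0, 0.1), (1.1, 0.2), (5.0, 0.1)])
```

The outlier at 5.0 is rejected in the deciding step, so the fused value stays between the two consistent estimates instead of being dragged upward.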