Title/Summary/Keyword: Sensor fusion

Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot

  • Kim, Min-Young;Ahn, Sang-Tae;Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems, v.16 no.4, pp.381-390, 2010
  • This paper describes a map-based localization procedure for mobile robots that uses a sensor fusion technique in structured environments. Combining sensors with different characteristics and limited individual sensing capability is advantageous, since the sensors complement and cooperate with each other to yield better information about the environment. For robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, the environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on a probabilistic reliability function for each sensor, predefined through experiments. For self-localization with the monocular camera, the robot extracts vertical edge lines from the input images and uses them as natural landmark points. With the laser structured light sensor, it instead uses geometric features, corners and planes, extracted from range data taken at a constant height above the navigation floor, as natural landmark shapes. Although either feature group alone can sometimes localize the robot, the features from both sensors are used and fused simultaneously for reliable localization under varied environmental conditions. A series of experiments verifies the advantage of multi-sensor fusion, and the results are discussed in detail.
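
As a minimal sketch of the Bayesian fusion step described above, the snippet fuses two pose estimates as a product of Gaussians, with a fixed covariance per sensor standing in for the paper's experimentally predefined reliability function. The function name `fuse_gaussian` and all numbers are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_gaussian(mu_a, cov_a, mu_b, cov_b):
    """Fuse two independent Gaussian estimates of the same state
    (product of Gaussians): each sensor is weighted by the inverse
    of its covariance, i.e. by its assumed reliability."""
    info_a = np.linalg.inv(cov_a)
    info_b = np.linalg.inv(cov_b)
    cov = np.linalg.inv(info_a + info_b)
    mu = cov @ (info_a @ mu_a + info_b @ mu_b)
    return mu, cov

# Hypothetical (x, y, theta) poses: one from vision edge-line landmarks,
# one from structured-light corner/plane landmarks.
pose_vision = np.array([1.02, 0.48, 0.31])
cov_vision = np.diag([0.04, 0.04, 0.02])   # assumed reliability model
pose_laser = np.array([0.98, 0.52, 0.29])
cov_laser = np.diag([0.01, 0.01, 0.05])    # assumed reliability model

pose, cov = fuse_gaussian(pose_vision, cov_vision, pose_laser, cov_laser)
print(pose, np.diag(cov))
```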

Sensor fault diagnosis for bridge monitoring system using similarity of symmetric responses

  • Xu, Xiang;Huang, Qiao;Ren, Yuan;Zhao, Dan-Yang;Yang, Juan
    • Smart Structures and Systems, v.23 no.3, pp.279-293, 2019
  • To ensure that high-quality data are used for data mining and feature extraction in a bridge structural health monitoring (SHM) system, a practical sensor fault diagnosis methodology has been developed based on the similarity of symmetric structural responses. First, the similarity of symmetric responses is examined using field monitoring data from different sensor types. All sensors are initially paired, and sensor faults are then detected pair by pair to achieve multi-fault diagnosis of the sensor system. Dasarathy's information fusion model is adopted for multi-sensor information fusion, and the Euclidean distance is selected as the index for assessing similarity. To resolve the coupling between structural damage and sensor faults, the similarity within the target zone (where the studied sensor pair is located) is assessed to determine whether localized structural damage or a sensor fault causes the dissimilarity of the pair. If a suspect pair is detected with at least one sensor deemed faulty, a field test can be carried out to support regression analysis on the monitoring and field test data for sensor fault isolation and reconstruction. Finally, a case study demonstrates the effectiveness of the proposed methodology. In conclusion, the proposed method is practical for real engineering projects and ensures the reliability of further analysis based on monitoring data.
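
A hedged illustration of the pairwise screening idea: each symmetric sensor pair is compared with the Euclidean distance index, and a pair is flagged when the distance exceeds a threshold. The data, pair names, and threshold below are hypothetical, not the paper's case study.

```python
import numpy as np

def pair_dissimilarity(resp_a, resp_b):
    """Euclidean distance between two response histories."""
    return float(np.linalg.norm(np.asarray(resp_a) - np.asarray(resp_b)))

def screen_pairs(pairs, threshold):
    """Return the pairs whose dissimilarity exceeds the threshold; each
    flagged pair contains at least one suspect sensor (or local damage),
    to be resolved by a follow-up field test."""
    return [name for name, (a, b) in pairs.items()
            if pair_dissimilarity(a, b) > threshold]

t = np.linspace(0, 10, 500)
healthy = np.sin(t)
pairs = {
    "strain_N1/strain_S1": (healthy, healthy + 0.01 * np.random.randn(t.size)),
    "strain_N2/strain_S2": (healthy, 0.4 * healthy + 0.3),  # drifted sensor
}
print(screen_pairs(pairs, threshold=2.0))   # flags the second pair
```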

An Effective Mapping for a Mobile Robot using Error Backpropagation based Sensor Fusion

  • Kim, Kyoung-Dong;Qu, Xiao-Chuan;Choi, Kyung-Sik;Lee, Suk-Gyu
    • Journal of the Korean Society for Precision Engineering, v.28 no.9, pp.1040-1047, 2011
  • This paper proposes a novel method based on error backpropagation neural networks that fuses laser and ultrasonic sensor data to enhance mapping accuracy. To navigate, a single robot must know its initial position and have accurate information about the environment around it. However, owing to its inherent properties, each sensor has its own advantages and drawbacks. In our system, a robot equipped with seven ultrasonic sensors and a laser sensor navigates and maps two different corridor environments. The experimental results show the effectiveness of heterogeneous sensor fusion using an error backpropagation algorithm for mapping.
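
To make the fusion mechanism concrete, here is a small numpy sketch of an error-backpropagation network that learns to combine a noisy ultrasonic range with a more accurate laser range. The architecture, learning rate, and synthetic training data are assumptions for illustration, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.5, (8, 2)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0.0, 0.5, (1, 8)); b2 = np.zeros(1)   # output layer

def forward(x):
    h = np.tanh(W1 @ x + b1)
    return (W2 @ h + b2)[0], h

# Hypothetical training pairs: (ultrasonic, laser) -> true range.
X = rng.uniform(0.2, 4.0, (200, 1))
inputs = np.hstack([X + rng.normal(0, 0.15, X.shape),   # noisy ultrasonic
                    X + rng.normal(0, 0.03, X.shape)])  # accurate laser
targets = X[:, 0]

lr = 0.01
for _ in range(300):                        # plain stochastic gradient descent
    for x, y in zip(inputs, targets):
        yhat, h = forward(x)
        err = yhat - y                      # output error
        dh = (err * W2[0]) * (1.0 - h**2)   # backpropagate through tanh
        W2 -= lr * err * h; b2 -= lr * err
        W1 -= lr * np.outer(dh, x); b1 -= lr * dh

print(forward(np.array([1.1, 0.97]))[0])    # fused range estimate
```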

A Correction System of Odometry Error for Map Building of Mobile Robot Based on Sensor fusion

  • Hyun, Woong-Keun
    • Journal of Information and Communication Convergence Engineering, v.8 no.6, pp.709-715, 2010
  • This paper presents a map building and localization system for a mobile robot. Map building and navigation are complex problems because map integrity cannot be sustained by odometry alone, owing to errors introduced by wheel slippage, distortion, and the simplified, linearized odometry equations. For accurate localization, we propose a sensor fusion system that uses an encoder as the relative sensor and an indoor GPS module as the absolute sensor. To build maps, we developed a sensor-based navigation algorithm and a grid-based map building algorithm running on an embedded Linux OS. A wall-following decision engine, similar to an expert system, guides the map-building navigation. Field tests demonstrated the validity of the system.
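
A minimal sketch of the relative/absolute fusion idea, assuming a linear Kalman update: encoder odometry predicts the position, and an occasional indoor-GPS fix corrects the accumulated drift. All noise values are invented for illustration.

```python
import numpy as np

x = np.zeros(2)              # robot position (x, y)
P = np.eye(2) * 0.01         # position covariance
Q = np.eye(2) * 0.02         # odometry noise (slippage, linearization)
R = np.eye(2) * 0.10         # indoor-GPS measurement noise

def predict(delta):
    """Encoder (relative sensor) step: dead-reckon and grow uncertainty."""
    global x, P
    x = x + delta
    P = P + Q

def correct(gps_fix):
    """Indoor GPS (absolute sensor) step: Kalman correction of drift."""
    global x, P
    K = P @ np.linalg.inv(P + R)   # Kalman gain
    x = x + K @ (gps_fix - x)
    P = (np.eye(2) - K) @ P

for step in range(50):
    predict(np.array([0.10, 0.0]))                 # commanded 10 cm forward
    if step % 10 == 9:                             # occasional absolute fix
        correct(np.array([(step + 1) * 0.1, 0.02]))
print(x, np.diag(P))
```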

Development of Sensor Fusion-Based Low-Speed Short-Distance Collision Warning Algorithm for Urban Area

  • Jeon, Jong-Ki;Kim, Man-Ho;Lee, Suk;Lee, Kyung-Chang
    • IEMEK Journal of Embedded Systems and Applications, v.6 no.3, pp.157-167, 2011
  • Although vehicles are becoming more intelligent for the convenience and safety of drivers, traffic accidents continue to increase. In particular, single car-to-car rear impacts in urban areas are rising rapidly because of driver inattention. To prevent such rear impacts, commercial automobile vendors apply low-speed short-distance collision warning systems. This paper presents a low-speed short-distance collision warning algorithm for city driving that uses sensor fusion of a laser sensor and an ultrasonic sensor. An experiment on a driving track using an embedded microprocessor demonstrated the feasibility of the collision warning algorithm.
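
One plausible shape for such a warning rule, sketched under assumed constants: the two ranges are blended with fixed weights, and a warning fires when the implied time-to-collision falls below a threshold. This is not the paper's calibrated algorithm.

```python
def fuse_range(laser_m, ultra_m, w_laser=0.8, w_ultra=0.2):
    """Weighted average of the two range readings; weights are assumed
    to reflect the relative accuracy of each sensor."""
    return w_laser * laser_m + w_ultra * ultra_m

def collision_warning(range_m, closing_speed_mps, ttc_threshold_s=1.5):
    """Return True when the fused range implies an imminent rear impact."""
    if closing_speed_mps <= 0:          # gap is opening: no risk
        return False
    ttc = range_m / closing_speed_mps   # time to collision
    return ttc < ttc_threshold_s

r = fuse_range(laser_m=2.4, ultra_m=2.6)
print(collision_warning(r, closing_speed_mps=2.0))   # True: TTC ~ 1.2 s
```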

Design of a Multi-Sensor Data Simulator and Development of Data Fusion Algorithm

  • Lee, Yong-Jae;Lee, Ja-Seong;Go, Seon-Jun;Song, Jong-Hwa
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.34 no.5, pp.93-100, 2006
  • This paper presents a multi-sensor data simulator and a data fusion algorithm for tracking a highly dynamic flight target from radar and a telemetry system. The designed simulator generates time-asynchronous data from multiple sensors with different data rates and communication delays, and measurement noise is incorporated through realistic sensor models. The proposed fusion algorithm is a 21st-order distributed Kalman filter based on a position-velocity-acceleration (PVA) model augmented with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad data and sensor faults. The designed algorithm is verified using both simulated and real data.
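
The following sketch mimics the simulator's behavior as described: two sensors sample the same trajectory at different rates, with communication delays, biases, and noise, and the streams are merged in arrival order as a fusion filter would consume them. Rates, delays, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_position(t):
    """Simple PVA-style trajectory: v = 5 m/s, a = 2 m/s^2."""
    return 5.0 * t + 0.5 * 2.0 * t**2

def simulate(rate_hz, delay_s, bias_m, sigma_m, duration_s=2.0):
    """Return (arrival_time, measurement) pairs for one sensor."""
    out, t = [], 0.0
    while t <= duration_s:
        z = true_position(t) + bias_m + rng.normal(0, sigma_m)
        out.append((t + delay_s, z))     # stamped with delayed arrival time
        t += 1.0 / rate_hz
    return out

radar = simulate(rate_hz=10, delay_s=0.05, bias_m=1.5, sigma_m=2.0)
telemetry = simulate(rate_hz=4, delay_s=0.20, bias_m=-0.8, sigma_m=0.5)

# Merge the two asynchronous streams in arrival order.
stream = sorted(radar + telemetry, key=lambda m: m[0])
print(len(stream), stream[0])
```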

Asynchronous Sensor Fusion using Multi-rate Kalman Filter

  • Son, Young Seop;Kim, Wonhee;Lee, Seung-Hi;Chung, Chung Choo
    • The Transactions of The Korean Institute of Electrical Engineers, v.63 no.11, pp.1551-1558, 2014
  • We propose a multi-rate Kalman-filter-based fusion of vision and radar to handle the asynchronous, multi-rate sampling periods that arise in object vehicle tracking. Model-based prediction of the object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve position prediction, a different weight is applied to each sensor's predicted object position from its multi-rate Kalman filter. The proposed method provides estimated positions of the object vehicles at every ECU sampling instant, and the Mahalanobis distance is used to establish correspondence between measured and predicted objects. Experimental results validate that the post-processed fusion data improve tracking performance: compared with either single sensor (camera or radar), the proposed method achieves a twofold improvement in object tracking performance in terms of root mean square error.
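
As a small illustration of the association step, the snippet below gates measurements to predicted tracks with the Mahalanobis distance; the gate value and covariances are assumptions rather than the paper's tuning.

```python
import numpy as np

def mahalanobis(z, x_pred, S):
    """Distance of measurement z from prediction x_pred with innovation
    covariance S; small values indicate a likely match."""
    d = z - x_pred
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

def associate(z, tracks, gate=3.0):
    """Return the index of the closest track inside the gate, or None."""
    best, best_d = None, gate
    for i, (x_pred, S) in enumerate(tracks):
        d = mahalanobis(z, x_pred, S)
        if d < best_d:
            best, best_d = i, d
    return best

tracks = [(np.array([12.0, 3.1]), np.eye(2) * 0.5),
          (np.array([30.5, -1.0]), np.eye(2) * 0.5)]
print(associate(np.array([12.4, 3.0]), tracks))   # -> 0
```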

An Efficient Outdoor Localization Method Using Multi-Sensor Fusion for Car-Like Robots

  • Bae, Sang-Hoon;Kim, Byung-Kook
    • Journal of Institute of Control, Robotics and Systems, v.17 no.10, pp.995-1005, 2011
  • An efficient outdoor local localization method using multi-sensor fusion with an MU-EKF (Multi-Update Extended Kalman Filter) is suggested for car-like mobile robots. In outdoor environments, where mobile robots are used for exploration or military service, accurate localization with multiple sensors is indispensable. The proposed algorithm fuses sensor data from an LRF (Laser Range Finder), wheel encoders, and GPS. First, encoder data drive the prediction stage of the MU-EKF. The LRF data obtained by scanning the environment are then used to extract objects, and the robot position and orientation are estimated by matching these objects against map objects, forming the first update stage of the MU-EKF. This estimate is finally fused with GPS in the second update stage. The MU-EKF algorithm can efficiently fuse data from three or more sensors, even with different sampling periods, and ensures high localization accuracy. The validity of the proposed algorithm is demonstrated through experiments.
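
A toy sketch of the multi-update pattern, with linear stand-ins for the paper's EKF models: one encoder prediction followed by two sequential corrections (LRF map match, then GPS) in each cycle. All noise matrices and the synthetic measurements are assumptions.

```python
import numpy as np

x = np.zeros(2); P = np.eye(2)
Q = np.eye(2) * 0.05          # encoder prediction noise
R_lrf = np.eye(2) * 0.02      # LRF map-matching noise (accurate)
R_gps = np.eye(2) * 0.50      # GPS noise (coarse)

def update(z, R):
    """One measurement update; called once per sensor per cycle."""
    global x, P
    K = P @ np.linalg.inv(P + R)
    x = x + K @ (z - x)
    P = (np.eye(2) - K) @ P

for _ in range(20):
    x, P = x + np.array([0.2, 0.0]), P + Q            # encoder prediction
    update(np.array([x[0] - 0.01, 0.02]), R_lrf)      # first update: LRF
    update(np.array([x[0] + 0.30, -0.10]), R_gps)     # second update: GPS
print(x, np.diag(P))
```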

Implementation of a Sensor Fusion FPGA for an IoT System

  • Jung, Chang-Min;Lee, Kwang-Yeob;Park, Tae-Ryong
    • Journal of IKEEE, v.19 no.2, pp.142-147, 2015
  • This paper proposes a Kalman-filter-based sensor fusion filter that measures posture by calibrating and combining information obtained from acceleration and gyro sensors. Recent advances in sensor network technology have increased the need for sensor fusion. In the proposed approach, the nonlinear system model of the filter is linearized through a Jacobian matrix operation, and the measurement value is predicted via Euler integration. The filter was implemented at an operating frequency of 74 MHz on a Virtex-6 FPGA board from Xilinx Inc. The accuracy and reliability of the measured posture were validated by comparing the values obtained with the implemented filter against those from existing filters.
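
For intuition, here is a single-axis sketch of the filter structure: the gyro rate is Euler-integrated to predict the angle, and the accelerometer tilt corrects it through a scalar Kalman update. Gains and noise values are illustrative, not those of the FPGA implementation.

```python
import math

angle, P = 0.0, 1.0
Q, R, dt = 0.001, 0.03, 0.01          # process/measurement noise, 100 Hz

def step(gyro_rate, ax, az):
    """One predict/correct cycle of the attitude filter."""
    global angle, P
    angle += gyro_rate * dt            # Euler integration (prediction)
    P += Q
    z = math.atan2(ax, az)             # tilt angle from accelerometer
    K = P / (P + R)                    # scalar Kalman gain
    angle += K * (z - angle)           # correction
    P *= (1.0 - K)

for _ in range(100):
    step(gyro_rate=0.5, ax=0.48, az=0.88)   # roughly consistent inputs
print(angle)                                 # converges near 0.5 rad
```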

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence, v.22 no.4, pp.427-433, 2019
  • This paper proposes a fuzzy inference model for map building and navigation of a mobile robot with an active camera that navigates intelligently to a goal location in unknown environments, using sensor fusion based on situational commands from the active camera sensor. The active camera gives the mobile robot the capability to estimate and track feature images over a hallway field of view. Instead of "physical sensor fusion", which generates the robot trajectory from an environment model and raw sensory data, a command fusion method is used to govern the navigation. The navigation strategy combines fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the sensory data of the active camera collected during the navigation experiments are fused into the identification process. Navigation performance improves on that achieved with fuzzy inference alone and shows significant advantages over conventional fusion techniques. Experimental evidence demonstrates that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.
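
A minimal sketch of command fusion, under assumed membership shapes: the goal-approach and obstacle-avoidance behaviors each propose a steering command, and a fuzzy-style weight derived from obstacle proximity blends them. Membership breakpoints and the example commands are hypothetical.

```python
def obstacle_weight(dist_m, near=0.3, far=1.5):
    """Triangular-style membership: 1 when very close, 0 when far."""
    if dist_m <= near:
        return 1.0
    if dist_m >= far:
        return 0.0
    return (far - dist_m) / (far - near)

def fuse_commands(goal_heading, avoid_heading, obstacle_dist):
    """Blend the two behavior commands by obstacle proximity."""
    w = obstacle_weight(obstacle_dist)
    return w * avoid_heading + (1.0 - w) * goal_heading

# Far obstacle: the goal command dominates; near obstacle: avoidance wins.
print(fuse_commands(goal_heading=0.4, avoid_heading=-0.8, obstacle_dist=1.4))
print(fuse_commands(goal_heading=0.4, avoid_heading=-0.8, obstacle_dist=0.4))
```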