• Title/Summary/Keyword: Sensor Fusion System


Robust Hierarchical Data Fusion Scheme for Large-Scale Sensor Network

  • Song, Il Young
    • Journal of Sensor Science and Technology / v.26 no.1 / pp.1-6 / 2017
  • The advanced driver assistance system (ADAS) requires the collection of a large amount of information, including road conditions, the environment, vehicle status, the driver's condition, and other useful data. Large-scale sensor networks are an appropriate solution here, since they were designed for exactly this purpose. Recent advances in sensor network technology have enabled the management and monitoring of large-scale tasks, such as monitoring road surface temperature along a highway. In this paper, we consider the estimation and fusion problems of large-scale sensor networks used in the ADAS. A hierarchical fusion architecture is proposed for an arbitrary topology of the large-scale sensor network. A robust cluster estimator is proposed to make the network robust against outliers or sensor failures. Lastly, a robust hierarchical data fusion scheme is proposed for the communication channel between the clusters and the fusion center, considering the non-Gaussian channel noise typical of communication systems.
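The two-stage idea in this abstract — robust local estimation inside each cluster, then combination at a fusion center — can be sketched as follows. This is an illustrative toy, not the paper's estimator: the median stands in for the robust cluster estimator, and the fusion weights are assumed equal.

```python
# Illustrative sketch (not the paper's exact scheme): each cluster forms a
# robust local estimate via the median, which resists outlier or failed
# sensors; the fusion center then combines the cluster estimates with
# (here, equal) weights.

def cluster_estimate(readings):
    """Robust local estimate: median of the cluster's sensor readings."""
    s = sorted(readings)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

def fuse_clusters(estimates, weights):
    """Fusion center: weighted combination of cluster estimates."""
    total = sum(weights)
    return sum(w * e for w, e in zip(estimates, weights)) / total

# Two clusters; the second contains one failed (hard-over) sensor.
c1 = cluster_estimate([10.1, 9.9, 10.0])
c2 = cluster_estimate([10.2, 10.0, 95.0])   # median ignores the outlier
fused = fuse_clusters([c1, c2], [1.0, 1.0])
```

A mean-based cluster estimate would be dragged far off by the 95.0 reading; the median keeps the cluster estimate near the true value, which is the point of the robustness requirement.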

Improvement of Control Performance by Data Fusion of Sensors

  • Na, Seung-You;Shin, Dae-Jung
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.1 / pp.63-69 / 2004
  • In this paper, we propose a general framework for sensor data fusion applied to control systems. Since many kinds of disturbances enter a control system, multisensor data fusion is necessary to maintain control performance despite them. Multisensor data fusion for a control system is treated as a sequence of decisions about how to combine sensor data into a proper control input under uncertain disturbance effects on the sensors. The proposed method is applied to a typical flexible-link control system, in which oscillation is reduced using a photo sensor at the tip of the link; the control performance, however, depends heavily on ambient light conditions. To overcome this light disturbance, an accelerometer is used in addition to the existing photo sensor. The improvement in control performance across various output responses demonstrates the feasibility of the proposed method.

Kalman Filter Based Pose Data Fusion with Optical Tracking System and Inertial Navigation System Networks for Image Guided Surgery (영상유도수술을 위한 광학추적 센서 및 관성항법 센서 네트웍의 칼만필터 기반 자세정보 융합)

  • Oh, Hyun Min;Kim, Min Young
    • The Transactions of The Korean Institute of Electrical Engineers / v.66 no.1 / pp.121-126 / 2017
  • A tracking system is essential for image-guided surgery (IGS). The optical tracking system (OTS) is widely used in IGS for its high accuracy and ease of use; however, the OTS fails whenever its markers are occluded. In this paper, sensor data fusion of the OTS and an inertial navigation system (INS) is proposed to solve this problem. The proposed system improves tracking accuracy by eliminating the Gaussian error of the sensors, and the Kalman-filter-based fusion compensates for the individual disadvantages of the OTS and the INS. A sensor calibration method that further improves accuracy is also introduced. The experiments performed verify the effectiveness of the proposed algorithm.
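The OTS/INS complementarity described above can be illustrated with a minimal one-dimensional Kalman filter: the inertial rate drives the prediction step, and an optical measurement, when the markers are visible, drives the correction step. All names and noise values here are illustrative assumptions, not taken from the paper.

```python
# Minimal 1-D Kalman filter sketch of the OTS/INS fusion idea.
# Prediction uses the inertial (INS) rate; correction uses the optical
# (OTS) measurement only when it is available, i.e. when the markers
# are not occluded.

def kalman_step(x, p, ins_rate, dt, q, ots_meas=None, r=None):
    # Predict with the inertial rate; process noise q inflates the
    # variance, modeling INS drift.
    x = x + ins_rate * dt
    p = p + q
    # Correct with the optical measurement when the markers are visible.
    if ots_meas is not None:
        k = p / (p + r)            # Kalman gain
        x = x + k * (ots_meas - x)
        p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0
# Markers visible: OTS measurement pulls the estimate and shrinks p.
x, p = kalman_step(x, p, ins_rate=1.0, dt=0.1, q=0.01, ots_meas=0.12, r=0.05)
# Markers occluded: the filter coasts on the INS alone and p grows.
x2, p2 = kalman_step(x, p, ins_rate=1.0, dt=0.1, q=0.01)
```

During an occlusion the variance `p` grows with each predict-only step, which is exactly the drift the OTS correction keeps in check once the markers reappear.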

Centralized Kalman Filter with Adaptive Measurement Fusion: its Application to a GPS/SDINS Integration System with an Additional Sensor

  • Lee, Tae-Gyoo
    • International Journal of Control, Automation, and Systems / v.1 no.4 / pp.444-452 / 2003
  • An integration system with multiple measurement sets can be realized via combined application of a centralized and a federated Kalman filter. Compared with the federated Kalman filter, it is difficult for the centralized Kalman filter to remove a failed sensor. All varieties of Kalman filter monitor the innovation sequence (residual) for detection and isolation of a failed sensor. The innovation sequence, selected as an indicator of real-time estimation error, plays an important role in adaptive mechanism design. In this study, a centralized Kalman filter with adaptive measurement fusion based on the innovation sequence is introduced. The objectives of adaptive measurement fusion are automatic isolation of and recovery from sensor failures, as well as an inherent monitoring capability. The proposed adaptive filter is applied to a GPS/SDINS integration system with an additional sensor. Simulation studies attest that the proposed adaptive scheme is effective for isolation of and recovery from immediate sensor failures.
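The innovation-monitoring idea can be sketched with a normalized-innovation-squared (NIS) test: a measurement whose squared residual, scaled by its variance, exceeds a chi-square-style threshold is isolated (fusion weight dropped to zero) and recovers automatically once its innovation returns to normal. The threshold and weighting below are illustrative assumptions, not the paper's design.

```python
# Hedged sketch of innovation-based fault isolation for measurement
# fusion: compute each sensor's normalized innovation squared (NIS)
# against the predicted value; sensors failing the test get zero fusion
# weight, the rest get inverse-variance weights.

def fusion_weights(predicted, measurements, variances, threshold=9.0):
    """Return per-sensor fusion weights: 0.0 for measurements flagged
    by the NIS test, inverse-variance otherwise."""
    weights = []
    for z, r in zip(measurements, variances):
        nis = (z - predicted) ** 2 / r
        weights.append(0.0 if nis > threshold else 1.0 / r)
    return weights

# Predicted position 100.0; the third sensor has failed hard-over.
w = fusion_weights(100.0, [100.2, 99.8, 250.0], [1.0, 1.0, 1.0])
```

Because the weight is recomputed every step from the current innovation, a sensor that returns to normal operation is re-admitted automatically — the "recovery" half of the abstract's objective.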

Implementation of a sensor fusion system for autonomous guided robot navigation in outdoor environments (실외 자율 로봇 주행을 위한 센서 퓨전 시스템 구현)

  • Lee, Seung-H.;Lee, Heon-C.;Lee, Beom-H.
    • Journal of Sensor Science and Technology / v.19 no.3 / pp.246-257 / 2010
  • Autonomous guided robot navigation, which consists of following unknown paths and avoiding unknown obstacles, is a fundamental technique for unmanned robots in outdoor environments. Following an unknown path requires techniques such as path recognition, path planning, and robot pose estimation. In this paper, we propose a novel sensor fusion system for autonomous guided robot navigation in outdoor environments. The proposed system consists of three monocular cameras and an array of nine infrared range sensors. The two cameras mounted on the robot's right and left sides recognize unknown paths and estimate the robot's relative pose on these paths through a Bayesian sensor fusion method, while the camera mounted at the front of the robot recognizes abrupt curves and unknown obstacles. The infrared range sensor array improves the robustness of obstacle avoidance; the forward camera and the infrared range sensor array are fused through a rule-based method for this purpose. Experiments in outdoor environments show that a mobile robot with the proposed sensor fusion system successfully performed real-time autonomous guided navigation.

Modeling and Design of a Distributed Detection System Based on Active Sonar Sensor Networks (능동 소나망 분산탐지 체계의 모델링 및 설계)

  • Choi, Won-Yong;Kim, Song-Geun;Hong, Sun-Mog
    • Journal of the Korea Institute of Military Science and Technology / v.14 no.1 / pp.123-131 / 2011
  • In this paper, the modeling and design of a distributed detection system are considered for an active sonar sensor network. The sensor network has a parallel configuration consisting of a fusion center and a set of receiver nodes. A system with two receiver nodes is considered to investigate the theoretical aspects of the design; specifically, the AND rule and the OR rule are considered as the fusion rules of the sensor network. For these fusion rules, it is shown that a threshold rule at each sensor node has uniformly most powerful properties. The optimum threshold for each sensor is obtained that maximizes the probability of detection for a given probability of false alarm. Numerical experiments were also performed to investigate the detection characteristics of a distributed detection system with multiple sensor nodes. The experimental results show how signal strength, false alarm probability, and the distance between nodes in the sensor field affect the system's detection performance.
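For the two-node parallel configuration, the AND and OR fusion rules have simple closed forms when the receiver decisions are conditionally independent, which makes the detection/false-alarm trade-off between the two rules easy to see. The per-node probabilities below are illustrative values, not from the paper.

```python
# Two-node parallel fusion rules under conditional independence:
# AND declares a detection only if both nodes do; OR if either does.
# The same formulas apply to detection and false-alarm probabilities.

def fuse_and(p1, p2):
    return p1 * p2

def fuse_or(p1, p2):
    return p1 + p2 - p1 * p2

# Per-node detection and false-alarm probabilities (illustrative).
pd1, pd2 = 0.9, 0.8
pfa1, pfa2 = 0.01, 0.01

pd_and, pfa_and = fuse_and(pd1, pd2), fuse_and(pfa1, pfa2)
pd_or, pfa_or = fuse_or(pd1, pd2), fuse_or(pfa1, pfa2)
```

The numbers show the expected trade-off: AND sharply suppresses false alarms at the cost of detection probability, while OR boosts detection at the cost of a higher false-alarm rate — which is why the per-node thresholds must be re-optimized for each fusion rule.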

The Sensory-Motor Fusion System for Object Tracking (이동 물체를 추적하기 위한 감각 운동 융합 시스템 설계)

  • Lee, Sang-Hee;Wee, Jae-Woo;Lee, Chong-Ho
    • The Transactions of the Korean Institute of Electrical Engineers D / v.52 no.3 / pp.181-187 / 2003
  • For moving platforms with environmental sensors, such as an object-tracking mobile robot with audio and video sensors, the environmental information acquired from the sensors keeps changing as objects move. In such cases, conventional control schemes show limited performance due to their lack of adaptability and their complexity, so sensory-motor systems, which respond intuitively to various types of environmental information, are desirable. To improve robustness, it is also desirable to fuse two or more types of sensory information simultaneously. In this paper, based on Braitenberg's model, we propose a sensory-motor fusion system that can track moving objects adaptively under environmental changes. Owing to its directly connected structure, the sensory-motor fusion system can control each motor simultaneously, and neural networks are used to fuse the information from the various sensors. Even if the system receives noisy information from one sensor, it still works robustly, because the information from the other sensors compensates for the noise through sensor fusion. To examine its performance, the sensory-motor fusion model is applied to an object-tracking four-legged robot equipped with audio and video sensors. The experimental results show that the sensory-motor fusion system can track moving objects robustly, with a simpler control mechanism than model-based control approaches.

Design of ToF-Stereo Fusion Sensor System for 3D Spatial Scanning (3차원 공간 스캔을 위한 ToF-Stereo 융합 센서 시스템 설계)

  • Yun Ju Lee;Sun Kook Yoo
    • Smart Media Journal / v.12 no.9 / pp.134-141 / 2023
  • In this paper, we propose a ToF-Stereo fusion sensor system for 3D spatial scanning that increases the recognition rate of 3D objects, guarantees object detection quality, and is robust to the environment. The system fuses the sensing values of a ToF sensor and a stereo RGB sensor; even if one sensor stops operating, the other can continue to detect objects. Since the quality of each sensor varies with sensing distance, sensing resolution, light reflectivity, and illuminance, a module is included that adjusts each sensor's operation based on reliability estimation. The system thus estimates the reliability of the two sensing values, adjusts the sensors accordingly, and fuses the values, thereby improving the quality of the 3D spatial scan.
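The reliability-weighted combination described above can be sketched as a weighted average of the two depth readings, where each sensor's estimated reliability becomes its weight. How the reliabilities are estimated (from distance, reflectivity, illuminance, etc.) is the paper's contribution; here they are simply assumed inputs.

```python
# Illustrative reliability-weighted fusion of ToF and stereo depth
# readings. If one sensor drops out (reliability 0), the other carries
# the estimate alone, matching the fail-over behavior in the abstract.

def fuse_depth(tof_depth, tof_rel, stereo_depth, stereo_rel):
    """Weighted average of two depth readings using their estimated
    reliabilities (0..1) as weights."""
    total = tof_rel + stereo_rel
    if total == 0.0:
        raise ValueError("both sensors unavailable")
    return (tof_rel * tof_depth + stereo_rel * stereo_depth) / total

# ToF is trusted more here, so it dominates the fused depth.
d = fuse_depth(2.00, 0.9, 2.10, 0.3)
# ToF dropped out: the stereo reading is used as-is.
d_only = fuse_depth(0.0, 0.0, 2.10, 0.7)
```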

A Correction System of Odometry Error for Map Building of Mobile Robot Based on Sensor fusion

  • Hyun, Woong-Keun
    • Journal of information and communication convergence engineering / v.8 no.6 / pp.709-715 / 2010
  • This paper presents a map building and localization system for a mobile robot. Map building and navigation is a complex problem because map integrity cannot be sustained by odometry alone, due to errors introduced by wheel slippage, distortion, and the simple linearized odometry equation. For accurate localization, we propose a sensor fusion system using an encoder sensor and an indoor GPS module as the relative and absolute sensors, respectively. To build a map, we developed a sensor-based navigation algorithm and a grid-based map building algorithm running on embedded Linux. A wall-following decision engine, similar to an expert system, is proposed for map-building navigation. We proved the system's validity through field tests.
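The relative/absolute pairing above can be illustrated with a simple complementary correction: odometry integrates quickly but drifts, so each absolute indoor-GPS fix pulls the odometry pose partway back toward it. The blending gain is an illustrative assumption, not the paper's correction law.

```python
# Minimal sketch of correcting a drifting odometry pose with an
# absolute indoor-GPS fix: a small gain blends the pose toward the fix,
# bounding the accumulated odometry error between fixes.

def correct_pose(odom_pose, gps_fix, gain=0.2):
    """Blend the drifting odometry (relative) pose toward the absolute
    GPS fix, component-wise."""
    return tuple(o + gain * (g - o) for o, g in zip(odom_pose, gps_fix))

# Odometry has drifted 0.5 m in x relative to the absolute fix.
pose = correct_pose((10.5, 5.0), (10.0, 5.0))
```

A small gain trusts the smooth odometry between fixes and filters the noisy GPS; a gain of 1.0 would snap the pose to every fix, passing the GPS noise straight into the map.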

Sensor Fusion System for Improving the Recognition Performance of 3D Object (3차원 물체의 인식 성능 향상을 위한 감각 융합 시스템)

  • Kim, Ji-Kyoung;Oh, Yeong-Jae;Chong, Kab-Sung;Wee, Jae-Woo;Lee, Chong-Ho
    • Proceedings of the KIEE Conference / 2004.11c / pp.107-109 / 2004
  • In this paper, the authors propose a sensor fusion system that can recognize multiple 3D objects from 2D projection images and tactile information, focusing on improving the recognition performance for 3D objects. Unlike conventional object recognition systems that use an image sensor alone, the proposed method uses tactile sensors in addition to the visual sensor, and a neural network is used to fuse this information. The tactile signals are obtained from the reaction forces measured by pressure sensors at the fingertips when unknown objects are grasped by a four-fingered robot hand. The experiment evaluates the recognition rate and the number of learning iterations for various objects. The merits of the proposed system are not only high learning performance but also reliability: with tactile information, the system can recognize various objects even when the visual information is defective. The experimental results show that the proposed system improves the recognition rate and reduces learning time, verifying the effectiveness of the proposed sensor fusion system as a 3D object recognition scheme.
