• Title/Summary/Keyword: Sensor fusion


A wireless sensor with data-fusion algorithm for structural tilt measurement

  • Dan Li;Guangwei Zhang;Ziyang Su;Jian Zhang
    • Smart Structures and Systems / v.31 no.3 / pp.301-309 / 2023
  • Tilt is a key indicator of structural safety. Real-time monitoring of tilt responses helps to evaluate structural condition, enable cost-effective maintenance, and enhance lifetime resilience. This paper presents a prototype wireless sensing system for structural tilt measurement. Long-range (LoRa) technology is adopted to offer long-range wireless communication with low power consumption. The sensor integrates a gyroscope and an accelerometer as the sensing module. Although tilt can be estimated from either the gyroscope or the accelerometer measurements, these estimates suffer from either drift or high noise. To address this issue and obtain more reliable tilt results, two sensor fusion algorithms, the complementary filter and the Kalman filter, are investigated to fully exploit the advantages of both gyroscope and accelerometer measurements. Numerical simulation is carried out to validate and compare the sensor fusion algorithms. A laboratory experiment is conducted on a simply supported beam under a moving vehicle load to further investigate the performance of the proposed wireless tilt sensing system.
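
The complementary filter named in this abstract can be stated very compactly: the integrated gyroscope rate is trusted at high frequency (little short-term drift), the accelerometer tilt at low frequency (noisy but drift-free). A minimal sketch, assuming a scalar tilt angle and an illustrative crossover weight `alpha` (not a value from the paper):

```python
def complementary_filter(gyro_rates, accel_tilts, dt, alpha=0.98):
    """Fuse gyroscope rates (rad/s) with accelerometer tilt estimates (rad).

    Each step blends the gyro-propagated tilt (high-pass path) with the
    accelerometer tilt (low-pass path); alpha sets the crossover.
    """
    tilt = accel_tilts[0]  # initialize from the drift-free sensor
    estimates = []
    for rate, acc_tilt in zip(gyro_rates, accel_tilts):
        tilt = alpha * (tilt + rate * dt) + (1.0 - alpha) * acc_tilt
        estimates.append(tilt)
    return estimates
```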

Lane Information Fusion Scheme using Multiple Lane Sensors (다중센서 기반 차선정보 시공간 융합기법)

  • Lee, Soomok;Park, Gikwang;Seo, Seung-woo
    • Journal of the Institute of Electronics and Information Engineers / v.52 no.12 / pp.142-149 / 2015
  • Most mono-camera-based lane detection systems are fragile under poor illumination conditions. To compensate for the limitations of a single sensor, a lane information fusion system using multiple lane sensors is an alternative that stabilizes performance and guarantees high precision. However, conventional fusion schemes, which concern only object detection, are not directly applicable to lane information fusion. The few studies that do consider lane information fusion have treated the additional sensor only as a limited back-up or have ignored asynchronous multi-rate operation and differing coverage. In this paper, we propose a lane information fusion scheme that utilizes multiple lane sensors with different coverage and cycles. Precise lane information fusion is achieved by a framework that accounts for the individual ranging capability and processing time of diverse types of lane sensors. In addition, a novel lane estimation model is proposed to precisely synchronize multi-rate sensors by up-sampling sparse lane information signals. Through quantitative vehicle-level experiments with an around-view monitoring system and a frontal camera system, we demonstrate the robustness of the proposed lane fusion scheme.
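
As a rough illustration of the multi-rate synchronization step, the sketch below up-samples the lane description of a slow sensor onto a common fusion clock by interpolating lane-polynomial coefficients. The quadratic lane model, the 10 Hz/30 Hz rates, and all names are assumptions for illustration; the paper's lane estimation model is more elaborate:

```python
import numpy as np

def upsample_lane_coeffs(t_sensor, coeffs, t_common):
    """Interpolate per-frame lane polynomial coefficients
    (y = c0 + c1*x + c2*x**2) from a slow sensor clock onto a
    common fusion clock."""
    coeffs = np.asarray(coeffs)  # shape: (n_frames, n_coeffs)
    return np.column_stack([
        np.interp(t_common, t_sensor, coeffs[:, k])
        for k in range(coeffs.shape[1])
    ])

# Example: a 10 Hz around-view sensor resampled onto a 30 Hz fusion clock.
t_avm = np.arange(0.0, 1.0, 0.1)
avm = np.column_stack([np.linspace(0.10, 0.20, t_avm.size),   # c0 drifting
                       np.full(t_avm.size, 0.01),             # c1 constant
                       np.zeros(t_avm.size)])                 # c2 zero
t_fused = np.arange(0.0, 0.9, 1.0 / 30.0)
synced = upsample_lane_coeffs(t_avm, avm, t_fused)
```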

Aerial Object Detection and Tracking based on Fusion of Vision and Lidar Sensors using Kalman Filter for UAV

  • Park, Cheonman;Lee, Seongbong;Kim, Hyeji;Lee, Dongjin
    • International journal of advanced smart convergence / v.9 no.3 / pp.232-238 / 2020
  • In this paper, we study an aerial object detection and position estimation algorithm for the safety of UAVs flying beyond visual line of sight (BVLOS). We use a vision sensor and a LiDAR to detect objects: the CNN-based YOLOv2 architecture detects objects in the 2D image, and a clustering method detects objects in the point cloud data acquired from the LiDAR. When a single sensor is used, the detection rate can degrade in specific situations depending on the sensor's characteristics. When a single-sensor detection result is missing or false, the detection accuracy needs to be complemented; to do so, we use a Kalman filter and fuse the results of the individual sensors. We estimate the 3D position of the object using its pixel position and the distance measured by the LiDAR. We verified the performance of the proposed fusion algorithm through simulations in the Gazebo simulator.
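
The final fusion step, recovering a 3D position from a pixel detection and a LiDAR range, can be sketched with a pinhole camera model. The intrinsics (fx, fy, cx, cy) are hypothetical placeholders, and the LiDAR range is treated as the z-depth for simplicity; the paper does not spell out its exact projection model:

```python
import numpy as np

def pixel_depth_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project the detected object's pixel centre (u, v) to a 3D
    point in the camera frame, taking the range from the matched LiDAR
    cluster as the z-depth."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Example with made-up intrinsics for a 640x480 image.
p = pixel_depth_to_3d(u=400, v=260, depth=12.5,
                      fx=525.0, fy=525.0, cx=320.0, cy=240.0)
```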

A Study on Mobile Robot Navigation Using a New Sensor Fusion

  • Tack, Han-Ho;Jin, Tae-Seok;Lee, Sang-Bae
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2003.09a / pp.471-475 / 2003
  • This paper proposes a sensor fusion technique in which the data sets from previous instants are properly transformed and fused into the current data set to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
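
The abstract does not detail the transformation itself, but the core idea of re-expressing an earlier measurement in the current robot frame before fusing can be sketched as below. The planar odometry model and all names are assumptions, not the paper's STSF formulation:

```python
import math

def to_current_frame(point_prev, dx, dy, dtheta):
    """Re-express a point observed in the robot frame at time k-1 in the
    robot frame at time k, given the odometry increment (dx, dy, dtheta)
    of the robot between the two instants (stated in the k-1 frame)."""
    px, py = point_prev[0] - dx, point_prev[1] - dy   # undo the translation
    c, s = math.cos(-dtheta), math.sin(-dtheta)       # undo the rotation
    return (c * px - s * py, s * px + c * py)
```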

ACCOUNTING FOR IMPORTANCE OF VARIABLES IN MULTI-SENSOR DATA FUSION USING RANDOM FORESTS

  • Park No-Wook;Chi Kwang-Hoon
    • Proceedings of the KSRS Conference / 2005.10a / pp.283-285 / 2005
  • To account for the importance of variables in multi-sensor data fusion, random forests are applied to supervised land-cover classification. The random forests approach is a non-parametric ensemble classifier based on CART-like trees. Its distinguishing feature is that variable importance can be estimated by randomly permuting the variable of interest in the out-of-bag samples of each tree. Supervised classification with a multi-sensor remote sensing data set including optical and polarimetric SAR data was carried out to illustrate the applicability of random forests. The experimental results show that the random forests approach could extract the variables or bands important for land-cover discrimination and performed well compared with other non-parametric data fusion algorithms.
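
A self-contained way to reproduce this kind of analysis is sketched below, using scikit-learn and synthetic data standing in for stacked optical and polarimetric SAR bands. One hedge: scikit-learn's `permutation_importance` permutes variables on a held-out set rather than on the out-of-bag samples described in the abstract, but the idea is the same:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a stacked multi-sensor data set (6 "bands").
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # classes driven by bands 0 and 3

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Shuffle one band at a time and measure the drop in accuracy;
# important bands cause large drops.
result = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
for band, score in enumerate(result.importances_mean):
    print(f"band {band}: importance {score:.3f}")
```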

Motion and Structure Estimation Using Fusion of Inertial and Vision Data for Helmet Tracker

  • Heo, Se-Jong;Shin, Ok-Shik;Park, Chan-Gook
    • International Journal of Aeronautical and Space Sciences / v.11 no.1 / pp.31-40 / 2010
  • For weapon cueing and head-mounted displays (HMDs), it is essential to continuously estimate the motion of the helmet. The problem of estimating and predicting the position and orientation of the helmet is approached by fusing measurements from inertial sensors and a stereo vision system. The sensor fusion approach in this paper is based on nonlinear filtering, specifically the extended Kalman filter (EKF). To reduce the computation time and improve the performance of the vision processing, structure estimation is separated from motion estimation: the structure estimation tracks features belonging to the helmet model structure in the scene, and the motion estimation filter estimates the position and orientation of the helmet. The algorithm is tested using synthetic and real data, and the results show that the sensor fusion is successful.
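
The abstract gives the filter type but not the helmet's state or measurement models, so the sketch below is only a generic EKF cycle: propagate with the fast inertial data, correct with the slower vision measurement. The model functions `f`, `h` and their Jacobians `F`, `H` are placeholders to be supplied by the application:

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One EKF cycle: predict with inertial input u, update with vision
    measurement z. f/h are the nonlinear process/measurement models;
    F/H return their Jacobians at the given state."""
    # Predict (inertial propagation).
    x_pred = f(x, u)
    Fk = F(x, u)
    P_pred = Fk @ P @ Fk.T + Q
    # Update (vision correction).
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R
    K = P_pred @ Hk.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new
```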

Obstacle Avoidance of Mobile Robot Based on Behavior Hierarchy by Fuzzy Logic

  • Jin, Tae-Seok
    • International Journal of Fuzzy Logic and Intelligent Systems / v.12 no.3 / pp.245-249 / 2012
  • In this paper, we propose a navigation algorithm for a mobile robot that intelligently searches for the goal location in unknown dynamic environments using an ultrasonic sensor. Instead of a "sensor fusion" method, which generates the robot's trajectory from an environment model and sensory data, a "command fusion" method is used to govern the robot's motion. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the sensory data of the ultrasonic sensors and a vision sensor are fused into the identification process.
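
Command fusion differs from sensor fusion in that each behaviour proposes a motion command and the commands, not the measurements, are blended. The paper uses a tuned fuzzy rule base; the sketch below compresses that into a single membership-weighted blend purely for illustration:

```python
def fuse_commands(goal_cmd, avoid_cmd, obstacle_proximity):
    """Blend goal-approach and obstacle-avoidance steering commands.
    obstacle_proximity in [0, 1] acts as a fuzzy membership degree:
    near 1 the avoidance behaviour dominates, near 0 the goal behaviour."""
    w = max(0.0, min(1.0, obstacle_proximity))
    return (1.0 - w) * goal_cmd + w * avoid_cmd

# Example: an obstacle at 70% proximity pulls the heading toward avoidance.
heading = fuse_commands(goal_cmd=0.2, avoid_cmd=-0.5, obstacle_proximity=0.7)
```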

Suboptimal Decision Fusion in Wireless Sensor Networks under Non-Gaussian Noise Channels (비가우시안 잡음 채널을 갖는 무선 센서 네트워크의 준 최적화 결정 융합에 관한 연구)

  • Park, Jin-Tae;Koo, In-Soo;Kim, Ki-Seon
    • Journal of Internet Computing and Services / v.8 no.4 / pp.1-9 / 2007
  • Decision fusion in wireless sensor networks under non-Gaussian noise channels is studied. To capture the tail behavior of noise distributions, exponentially-tailed distributions are used as a wide class of noise models. Based on a canonical parallel fusion model with fading and noise channels, the likelihood-ratio (LR) based fusion rule is taken as the optimal fusion rule under the Neyman-Pearson criterion. By applying both high and low signal-to-noise ratio (SNR) approximations to the optimal rule, several suboptimal fusion rules are obtained, and a simple fusion rule is proposed that provides robust detection performance with minimal prior information. Performance evaluation of the fusion rules is carried out through simulation, and the results show the robustness of the proposed simple fusion rule.
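
For orientation, the classical hard-decision analogue of LR-based parallel fusion (the Chair-Varshney rule) is sketched below. The paper itself works with analog LR statistics over fading, exponentially-tailed noise channels, so this is only the simplest member of the family; the detection and false-alarm probabilities here are illustrative:

```python
import math

def fuse_decisions(decisions, pd, pfa):
    """LR-based fusion of binary local decisions u_i, given each sensor's
    detection probability pd[i] and false-alarm probability pfa[i].
    Returns the global decision for a zero log-LR threshold."""
    llr = 0.0
    for u, d, f in zip(decisions, pd, pfa):
        llr += math.log(d / f) if u == 1 else math.log((1.0 - d) / (1.0 - f))
    return 1 if llr > 0.0 else 0

print(fuse_decisions([1, 1, 0], pd=[0.9, 0.8, 0.7], pfa=[0.1, 0.2, 0.1]))
```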

A Fusion Algorithm considering Error Characteristics of the Multi-Sensor (다중센서 오차특성을 고려한 융합 알고리즘)

  • Hyun, Dae-Hwan;Yoon, Hee-Byung
    • Journal of KIISE: Computer Systems and Theory / v.36 no.4 / pp.274-282 / 2009
  • Various location tracking sensors, such as GPS, INS, radar, and optical equipment, are used to track moving targets. To track moving targets effectively, an effective fusion method for these heterogeneous devices is needed. Previous studies have treated the estimates of the individual sensors as different models and fused them, taking each sensor's error characteristics into account to improve tracking performance with heterogeneous multi-sensor systems. However, when a sensor's errors increase sharply, the error of the fused estimate grows as well, and approaches that recompute the estimated sensor values for the sensor probability could not be applied in real time. In this study, the sensor probability is obtained by comparing, for each sensor, the RMSE (root mean square error) of the difference between the Kalman filter's updated values and the measured values. The step of substituting the newly combined values back in as Kalman filter inputs for each sensor is eliminated. This improves both the real-time applicability of the estimated sensor values and the tracking performance in regions where a sensor's performance drops rapidly. The proposed algorithm incorporates each sensor's error characteristic as a conditional probability value and achieves higher accuracy by performing track fusion weighted toward the sensors with the most reliable performance. In the experiments, a UAV trajectory is generated and a performance analysis is conducted against other fusion algorithms.
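
The weighting idea can be condensed as follows: each sensor's recent innovation RMSE sets its share of the fused track, so reliable sensors dominate. This sketch reduces the paper's conditional-probability formulation to simple inverse-RMSE weights, which is an assumption about, not a restatement of, the exact method:

```python
import numpy as np

def fuse_tracks(estimates, rmses):
    """Weight each sensor's track estimate by the inverse of its recent
    Kalman-filter innovation RMSE and return the weighted combination."""
    estimates = np.asarray(estimates, dtype=float)  # (n_sensors, state_dim)
    w = 1.0 / np.asarray(rmses, dtype=float)
    w /= w.sum()                                    # normalize the weights
    return w @ estimates

# Sensor 2 (RMSE 0.3) gets the largest say in the fused position.
fused = fuse_tracks([[10.2, 5.1], [9.8, 4.9], [11.0, 5.6]],
                    rmses=[0.5, 0.3, 1.2])
```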

Sensor Fusion based Obstacle Avoidance for Terrain-Adaptive Mobile Robot (센서융합을 이용한 부정지형 적응형 이동로봇의 장애물 회피)

  • Yuk, Gyung-Hwan;Yang, Hyun-Seok;Park, Noh-Chul;Lee, Sang-Won
    • Journal of Institute of Control, Robotics and Systems / v.13 no.2 / pp.93-100 / 2007
  • Mobile robots for rescuing lives in disaster areas and for exploring planets demand high mobility as well as recognition of their environment. To avoid unknown obstacles precisely in an unknown environment, accurate sensing is required. This paper proposes a sensor fusion method for recognizing unknown obstacles accurately using low-cost sensors. Ultrasonic sensors and infrared sensors are used to avoid obstacles. If only one of these sensor types is used alone, it is not sufficient for the mobile robot to complete its tasks in the real world, since real environments are complex and composed of many kinds of materials: an infrared sensor may not recognize transparent or reflective obstacles, and an ultrasonic sensor may not recognize narrow obstacles, for example columns of small diameter. Therefore, six ultrasonic sensors and five infrared sensors were selected to detect obstacles, and the two sensor types were fused so that the advantages of each compensate for the disadvantages of the other. In fusing the sensors, a fuzzy algorithm is used to cope with the uncertainties of each sensor. TAMRY, a terrain-adaptive mobile robot, is used for the experiments.
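
The complementary failure modes described above suggest a simple fusion shape: weight each reading by a fuzzy confidence that reflects the sensor's known weaknesses. The sketch below is a minimal version of that idea; the membership design and names are assumptions, not the paper's rule base:

```python
def fuse_range(us_dist, ir_dist, us_conf, ir_conf):
    """Combine ultrasonic and infrared range readings for one bearing.
    Confidences in [0, 1] act as fuzzy memberships, e.g. low infrared
    confidence on reflective surfaces, low ultrasonic confidence on
    narrow obstacles."""
    total = us_conf + ir_conf
    if total == 0.0:
        return None  # neither sensor trusts its reading
    return (us_conf * us_dist + ir_conf * ir_dist) / total
```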