• Title/Summary/Keyword: sensor-fusion technique

Search Results: 109

A Study on the Multi-sensor Data Fusion System for Ground Target Identification (지상표적식별을 위한 다중센서기반의 정보융합시스템에 관한 연구)

  • Gang, Seok-Hun
    • Journal of National Security and Military Science / s.1 / pp.191-229 / 2003
  • Multi-sensor data fusion techniques combine evidence from multiple sensors, through several processing levels, to obtain more accurate and meaningful information than is possible from a single sensor alone. One of the most important parts of a data fusion system is identification fusion, which can be categorized into physical models, parametric classification, and cognitive-based models; parametric classification is the technique most commonly used in multi-sensor data fusion systems because of its characteristics. In this paper, we propose a novel heuristic identification fusion method that adopts desirable properties from both parametric classification and cognitive-based models in order to meet real-time processing requirements.

Control of the Mobile Robot Navigation Using a New Time Sensor Fusion

  • Tack, Han-Ho;Kim, Chang-Geun;Kim, Myeong-Kyu
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.1 / pp.23-28 / 2004
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
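
The core idea of fusing past readings into the current data set can be illustrated with a minimal sketch. This is not the paper's STSF algorithm; the function name, the simple displacement compensation, and the averaging step are illustrative assumptions for a robot approaching a static obstacle along one axis.

```python
def fuse_temporal(prev_ranges, displacements, current_range):
    """Illustrative temporal fusion: each previous range reading to a
    static obstacle is compensated by the robot's displacement since
    that reading (transforming it into the current time frame), then
    averaged with the current reading instead of adding more sensors."""
    compensated = [r - d for r, d in zip(prev_ranges, displacements)]
    samples = compensated + [current_range]
    return sum(samples) / len(samples)

# A wall was seen at 3.0 m and 2.7 m; the robot has since driven
# 0.7 m and 0.2 m toward it, and currently reads 2.3 m.
est = fuse_temporal([3.0, 2.7], [0.7, 0.2], 2.3)
```

Three virtual samples of the same distance are obtained from a single sensor, which is the measurement-improvement effect the abstract describes.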

A Triple Nested PID Controller based on Sensor Fusion for Quadrotor Attitude Stabilization (쿼드로터 자세 안정화를 위한 센서융합 기반 3중 중첩 PID 제어기)

  • Cho, Youngwan
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.7 / pp.871-877 / 2018
  • In this paper, we propose a triple nested PID control scheme for stable hovering of a quadrotor, together with a complementary-filter-based sensor fusion technique that improves the measurement of attitude, altitude, and velocity. The triple nested controller has a structure in which a double nested attitude controller, with an angular-velocity PD controller in the inner loop and an angle PI controller in the outer loop, is itself nested inside a velocity control loop, enabling stable hovering even under disturbance. We also propose a sensor fusion technique that applies a complementary filter to reduce the noise and drift errors of the accelerometer and gyroscope and to measure velocity by fusing image, gyro, and acceleration data. To verify the performance, we applied the proposed control and measurement scheme to hovering control of a quadrotor.
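
The complementary filter mentioned in the abstract has a standard one-line form that is worth sketching. The gain value, the simulated drift, and the function name below are illustrative assumptions, not the paper's tuning.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a complementary filter: the integrated gyro rate
    dominates at high frequency (smooth, but drifts), while the
    accelerometer-derived angle corrects low-frequency drift (noisy,
    but unbiased)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hold a constant 10-degree roll for 5 s with a gyro that reports a
# pure 0.5 deg/s bias; the filter tracks the accelerometer's 10 deg
# instead of integrating the drift without bound.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=10.0, dt=0.01)
```

The residual steady-state offset (here about 0.25 deg) scales with the gyro bias and the blend gain, which is why the abstract pairs the filter with drift-error reduction.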

A Study on the Fail Safety of Electronics Power Steering Using Sensor Fusion (Sensor Fusion을 이용한 전자식 조향장치의 Fail Safety 연구)

  • Kim, Byeong-Woo;Her, Jin;Cho, Hyun-Duck;Lee, Young-Seok
    • The Transactions of The Korean Institute of Electrical Engineers / v.57 no.8 / pp.1371-1376 / 2008
  • A steer-by-wire (SBW) system has so many advantages over a conventional mechanical steering system that it is expected to play a key role in future environmentally friendly vehicles and intelligent transportation systems: the mechanical connection between the hand wheel and the front axle becomes obsolete. An SBW system provides many benefits in terms of functionality, but at the same time presents significant challenges, namely fault tolerance and fail safety. In this paper, a failure analysis of the SBW system is performed, and a sensor fusion technique is then proposed for the fail safety of the SBW system. A sensor fusion logic for the steering angle, using the steering angle sensor, torque sensor, and rack position sensor, is developed and simulated by fault injection simulation.
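
One common pattern for this kind of fail-safe fusion across three redundant angle estimates is a 2-out-of-3 plausibility vote. The sketch below is a generic illustration under that assumption, not the paper's actual logic; the tolerance and channel names are invented.

```python
def vote_2oo3(a, b, c, tol=2.0):
    """Illustrative 2-out-of-3 plausibility check for a redundant
    steering-angle estimate: channels consistent with at least one
    other channel are averaged; a channel that disagrees with both
    others is flagged as failed (fault injection would trip this)."""
    channels = {"a": a, "b": b, "c": c}
    agree = [n for n in channels
             if any(m != n and abs(channels[n] - channels[m]) <= tol
                    for m in channels)]
    if len(agree) < 2:
        raise RuntimeError("no consistent sensor pair: enter fail-safe state")
    fused = sum(channels[n] for n in agree) / len(agree)
    failed = [n for n in channels if n not in agree]
    return fused, failed

# Channels a and b agree near 30 deg; an injected fault drives c to 45.
fused, failed = vote_2oo3(30.1, 29.8, 45.0)
```

A single faulty channel is thus masked while still being reported, which is the essence of fail-operational steering-angle measurement.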

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems / v.10 no.4 / pp.281-286 / 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of sensing systems such as sonar or vision. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in both structured and unstructured environments, and the experimental results demonstrate the performance of the landmark recognition.

Development of a Monitoring and Verification Tool for Sensor Fusion (센서융합 검증을 위한 실시간 모니터링 및 검증 도구 개발)

  • Kim, Hyunwoo;Shin, Seunghwan;Bae, Sangjin
    • Transactions of the Korean Society of Automotive Engineers / v.22 no.3 / pp.123-129 / 2014
  • SCC (Smart Cruise Control) and AEBS (Autonomous Emergency Braking System) use various types of sensor data, so the reliability of those data is an important consideration. In this paper, data from a radar and a vision sensor are fused by applying a Bayesian sensor fusion technique to improve the reliability of the sensor data. The paper then presents a sensor fusion verification tool developed to monitor the acquired sensor data and to verify the sensor fusion results efficiently. A parallel computing method was applied to reduce verification time, and a series of simulation results of this method are discussed in detail.
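
For two independent Gaussian measurements of the same quantity, Bayesian fusion reduces to a precision-weighted average. This is a textbook sketch of that rule, not the tool described in the paper; the radar/vision numbers are invented for illustration.

```python
def bayes_fuse(mu_r, var_r, mu_v, var_v):
    """Fuse two independent Gaussian measurements of the same quantity
    (e.g. radar and vision range to a lead vehicle) into the posterior
    mean and variance. The less noisy sensor dominates, and the fused
    variance is smaller than either input variance."""
    var = 1.0 / (1.0 / var_r + 1.0 / var_v)
    mu = var * (mu_r / var_r + mu_v / var_v)
    return mu, var

# Radar reads 25.0 m (variance 0.25); vision reads 26.0 m (variance 1.0).
mu, var = bayes_fuse(25.0, 0.25, 26.0, 1.0)
```

A verification tool of the kind the abstract describes would monitor the raw inputs and check that fused outputs stay within such posterior bounds.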

A Development of Wireless Sensor Networks for Collaborative Sensor Fusion Based Speaker Gender Classification (협동 센서 융합 기반 화자 성별 분류를 위한 무선 센서네트워크 개발)

  • Kwon, Ho-Min
    • Journal of the Institute of Convergence Signal Processing / v.12 no.2 / pp.113-118 / 2011
  • In this paper, we develop a speaker gender classification technique using collaborative sensor fusion in a wireless sensor network. The distributed sensor nodes remove unwanted input data using BER (Band Energy Ratio) based voice activity detection, process only the relevant data, and transmit hard-labeled decisions to the fusion center, where a global decision fusion is carried out. This approach has advantages for power consumption and network resource management. Bayesian sensor fusion and global weighted decision fusion methods are proposed to achieve the gender classification. As the number of sensor nodes varies, the Bayesian sensor fusion yields the best classification accuracy when using the optimal operating points of the ROC (Receiver Operating Characteristic) curves. For the weights used in the global decision fusion, the BER and MCL (Mutual Confidence Level) are employed to combine the decisions effectively at the fusion center. The simulation results show that as the number of sensor nodes increases, the classification accuracy improves even further under low SNR (Signal-to-Noise Ratio) conditions.
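
The global decision fusion of hard labels can be sketched as a confidence-weighted vote. The label encoding, function name, and weights below are illustrative assumptions; the paper derives its weights from BER and MCL, which are not modeled here.

```python
def weighted_decision_fusion(decisions, weights):
    """Illustrative fusion-center rule: each node sends a hard label
    (+1 or -1, e.g. male/female) and the center combines them with
    per-node confidence weights; the sign of the weighted sum wins."""
    score = sum(w * d for d, w in zip(decisions, weights))
    return 1 if score >= 0 else -1

# Three nodes vote +1, -1, +1 with differing confidences.
label = weighted_decision_fusion([+1, -1, +1], [0.9, 0.4, 0.7])
```

Because only one-bit decisions and scalar weights cross the network, this style of fusion keeps per-node transmission cost low, matching the abstract's power-consumption argument.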

Command Fusion for Navigation of Mobile Robots in Dynamic Environments with Objects

  • Jin, Taeseok
    • Journal of information and communication convergence engineering / v.11 no.1 / pp.24-29 / 2013
  • In this paper, we propose a fuzzy inference model as a navigation algorithm for a mobile robot that intelligently searches for a goal location in unknown dynamic environments. Our model uses sensor fusion based on situational commands from an ultrasonic sensor. Instead of a "physical sensor fusion" method, which generates the robot's trajectory from an environment model and sensory data, a "command fusion" method is used to govern the robot's motions. The navigation strategy combines fuzzy rules tuned for both goal approach and obstacle avoidance within a hierarchical behavior-based control architecture. To identify the environment, a command fusion technique is introduced in which the sensory data of the ultrasonic sensors and a vision sensor are fused in the identification process. The experimental results highlight interesting aspects of the goal-seeking, obstacle-avoiding, and decision-making behavior that arise from the navigation interaction.
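
The difference between physical fusion and command fusion is that the latter blends behavior outputs rather than raw sensor data. The sketch below illustrates that idea with a crisp proximity weight standing in for fuzzy memberships; the function name, angles, and weighting are invented for illustration and are not the paper's rule base.

```python
def fuse_commands(goal_cmd, avoid_cmd, obstacle_proximity):
    """Illustrative command fusion: steering commands (degrees) from
    the goal-seeking and obstacle-avoidance behaviors are blended,
    with the avoidance command weighted up as obstacles get closer
    (proximity in [0, 1], a crude stand-in for fuzzy memberships)."""
    w_avoid = max(0.0, min(1.0, obstacle_proximity))
    return (1.0 - w_avoid) * goal_cmd + w_avoid * avoid_cmd

# The goal lies to the left (-20 deg) but a close obstacle (proximity
# 0.8) pushes the robot to steer right (+30 deg).
steer = fuse_commands(-20.0, 30.0, 0.8)
```

Each behavior stays simple and independently tunable, and the arbitration happens only at the command level, which is the hallmark of behavior-based architectures.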

Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot (이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합)

  • Kim, Min-Young;Ahn, Sang-Tae;Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems / v.16 no.4 / pp.381-390 / 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor fusion technique in structured environments. A combination of sensors with different characteristics and limited sensing capability has advantages in terms of complementarity and cooperation in obtaining better information about the environment. In this paper, for robust self-localization of a mobile robot with a monocular camera and a laser structured light sensor, environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on a probabilistic reliability function for each sensor, predefined through experiments. For self-localization using monocular vision, the robot utilizes image features consisting of vertical edge lines extracted from the input camera images, which serve as natural landmark points in the self-localization process. When using the laser structured light sensor, it utilizes geometrical features composed of corners and planes as natural landmark shapes, extracted from range data taken at a constant height above the navigation floor. Although each feature group alone is sometimes sufficient to localize the mobile robot, all features from the two sensors are used and fused simultaneously for reliable localization under various environmental conditions. To verify the advantage of multi-sensor fusion, a series of experiments is performed, and the experimental results are discussed in detail.

A Study on the Fail Safety Logic of Smart Air Conditioner using Model based Design (모델 기반 설계 기법을 이용한 지능형 공조 장치의 이중 안전성 로직 연구)

  • Kim, Ji-Ho;Kim, Byeong-Woo
    • Journal of the Korean Society for Precision Engineering / v.28 no.12 / pp.1372-1378 / 2011
  • The smart air conditioning system is superior to the conventional air conditioning system in terms of control accuracy and environmental preservation, and it is a foundation for intelligent vehicles such as electric vehicles and fuel cell vehicles. In this paper, failure analyses of the smart air conditioning system are performed, and a sensor fusion technique is then proposed for the fail safety of the system. A sensor fusion logic for the air conditioning system using CO, $CO_2$, VOC, and $NO_x$ sensors is developed and simulated by fault injection simulation. The fusion technique for the smart air conditioning system is evaluated in an experiment, and a performance analysis is conducted with the fusion algorithms. The proposed algorithm incorporates the error characteristic of each sensor as a conditional probability value and ensures greater accuracy by performing track fusion with the most reliable sensors.