Title/Summary/Keyword: time sensor fusion

215 search results

Control of the Mobile Robot Navigation Using a New Time Sensor Fusion

  • Tack, Han-Ho;Kim, Chang-Geun;Kim, Myeong-Kyu
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.1 / pp.23-28 / 2004
  • This paper proposes a sensor-fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurements such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
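
To make the idea concrete, here is a minimal sketch of temporal fusion in the spirit of the STSF scheme, assuming a robot that drives straight toward an obstacle and knows how far it has advanced between sonar readings. The function and variable names are illustrative and are not taken from the paper.

```python
import numpy as np

def temporal_range_fusion(ranges, variances, displacements):
    """Fuse past range readings with the current one by first
    compensating each past reading for the robot's motion since it
    was taken, then combining with inverse-variance weights.

    ranges[i]        : range to the obstacle measured at time i
    variances[i]     : variance of that reading
    displacements[i] : distance the robot has advanced toward the
                       obstacle between time i and the current time
    """
    # Transform each past reading into the current time frame.
    transformed = np.array(ranges) - np.array(displacements)
    weights = 1.0 / np.array(variances)
    fused = np.sum(weights * transformed) / np.sum(weights)
    fused_var = 1.0 / np.sum(weights)
    return fused, fused_var

# Three sonar readings taken while the robot drove 0.4 m, 0.2 m and
# 0.0 m toward the obstacle; the newest reading is last.
r, v = temporal_range_fusion([2.45, 2.24, 2.06],
                             [0.04, 0.04, 0.04],
                             [0.40, 0.20, 0.00])
print(f"fused range: {r:.3f} m, variance: {v:.4f}")
```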

A Novel Clustering Method with Time Interval for Context Inference based on the Multi-sensor Data Fusion (다중센서 데이터융합 기반 상황추론에서 시간경과를 고려한 클러스터링 기법)

  • Ryu, Chang-Keun;Park, Chan-Bong
    • The Journal of the Korea institute of electronic communication sciences / v.8 no.3 / pp.397-402 / 2013
  • Time variation is an essential component of context awareness. For context inference using information from sensor motes, it is beneficial not only to account for time lapse but also to cluster the data by time interval. In this study, we propose a novel clustering-based multi-sensor data fusion method for context inference. Within each time interval, we fuse the sensed signals of each time slot and then fuse the results of this first fusion again. By assessing the signals segmented by time interval with Dempster-Shafer evidence theory based multi-sensor data fusion, we obtain enhanced context inference.
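
Since the fusion step relies on Dempster-Shafer evidence theory, the following sketch shows Dempster's rule of combination applied to the mass assignments of two fused time slots. The hypothesis names and mass values are made-up examples, not data from the study.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to masses) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict, combination undefined")
    # Normalize by the non-conflicting mass.
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Masses from two fused time slots over the context set {walk, run}.
WALK, RUN = frozenset({"walk"}), frozenset({"run"})
EITHER = WALK | RUN
slot1 = {WALK: 0.6, RUN: 0.1, EITHER: 0.3}
slot2 = {WALK: 0.7, RUN: 0.2, EITHER: 0.1}
print(dempster_combine(slot1, slot2))
```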

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems / v.10 no.4 / pp.281-286 / 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm built on it for mobile robot navigation. To fully utilize the information from the sensors, the paper first proposes a new sensor-fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of monitoring systems such as sonar sensing or visual sensing. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in both structured and unstructured environments, and the experimental results demonstrate the landmark recognition performance.
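
As a rough illustration of a landmark recognition step, the sketch below gates a fused (x, y) observation against a small map of known landmarks using a Mahalanobis distance test. The map, gate value, and function name are assumptions for illustration and do not come from the paper.

```python
import numpy as np

def match_landmark(fused_xy, fused_cov, landmark_map, gate=9.21):
    """Match a fused landmark observation (x, y in the robot frame)
    against a map of known landmarks using a Mahalanobis-distance
    gate (9.21 is roughly the chi-square 99% point for 2 dof)."""
    inv_cov = np.linalg.inv(fused_cov)
    best_id, best_d2 = None, gate
    for lm_id, lm_xy in landmark_map.items():
        diff = np.asarray(lm_xy) - np.asarray(fused_xy)
        d2 = float(diff @ inv_cov @ diff)
        if d2 < best_d2:
            best_id, best_d2 = lm_id, d2
    return best_id  # None if no landmark passes the gate

landmarks = {"door": (3.0, 1.0), "pillar": (5.5, -2.0)}
obs = (3.1, 0.9)                      # fused sonar/vision estimate
cov = np.diag([0.05, 0.05])           # covariance after fusion
print(match_landmark(obs, cov, landmarks))
```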

Visual Control of Mobile Robots Using Multisensor Fusion System

  • Kim, Jung-Ha;Sugisaka, Masanori
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2001.10a / pp.91.4-91 / 2001
  • This paper presents the development of a sensor fusion algorithm for visual control of a mobile robot. The output data from the visual sensor include a time lag due to the image-processing computation, and the sampling rate of the visual sensor is considerably low, so it must be used together with other sensors to control fast motion. The main purpose of this paper is to develop a method that constitutes a sensor fusion system giving optimal state estimates. The proposed sensor fusion system combines the visual sensor and an inertial sensor using a modified Kalman filter, a kind of multi-rate Kalman filter which treats the slow sampling rate ...
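
A minimal sketch of the multi-rate idea follows: a 1-D position/velocity Kalman filter that predicts at the fast inertial rate and corrects only on the steps where a slower vision position measurement arrives. The time-lag compensation discussed in the paper is omitted, and all parameter values and names are illustrative.

```python
import numpy as np

def multirate_kf(accels, vision, dt, r_vis=0.01, q=0.05):
    """1-D position/velocity Kalman filter that predicts at the
    inertial-sensor rate and corrects only on the samples where a
    (slower) vision position measurement is available.

    accels : acceleration inputs, one per fast time step
    vision : list of (step_index, measured_position) pairs
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[1.0, 0.0]])
    Q = q * np.outer(B, B)
    x = np.zeros(2)
    P = np.eye(2)
    vis = dict(vision)
    for k, a in enumerate(accels):
        # Predict with the fast inertial input.
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        if k in vis:                      # slow vision update
            y = vis[k] - H @ x
            S = H @ P @ H.T + r_vis
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = multirate_kf([0.1] * 50, [(9, 0.05), (19, 0.21), (29, 0.46)], dt=0.01)
print(x)
```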

Implementation of a Real-time Data fusion Algorithm for Flight Test Computer (비행시험통제컴퓨터용 실시간 데이터 융합 알고리듬의 구현)

  • Lee, Yong-Jae;Won, Jong-Hoon;Lee, Ja-Sung
    • Journal of the Korea Institute of Military Science and Technology / v.8 no.4 s.23 / pp.24-31 / 2005
  • This paper presents an implementation of a real-time multi-sensor data fusion algorithm for a flight test computer. The sensor data consist of positional information on the target from a radar, a GPS receiver, and an INS. The data fusion algorithm is designed as a 21st-order distributed Kalman filter based on the PVA model with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad measurements and sensor faults. The statistical parameters for the states are obtained from Monte Carlo simulations and covariance analysis using test tracking data. The designed filter is verified with real data in both post-processing and real-time processing.
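
The sketch below illustrates two ingredients mentioned in the abstract: a single-axis position-velocity-acceleration (PVA) transition matrix and a simple innovation-gating test of the kind used for fault detection. It is a toy reduction of the 21st-order filter, with invented numbers.

```python
import numpy as np

def pva_transition(dt):
    """State transition for one axis of a position-velocity-
    acceleration (PVA) model; the full filter would stack three
    axes plus sensor-bias states."""
    return np.array([[1.0, dt, 0.5 * dt**2],
                     [0.0, 1.0, dt],
                     [0.0, 0.0, 1.0]])

def gated_update(z, z_pred, S, gate=9.0):
    """Reject a measurement whose normalized innovation squared
    exceeds the gate (a simple fault-detection test)."""
    nu = z - z_pred
    nis = float(nu.T @ np.linalg.inv(S) @ nu)
    return nis <= gate, nis

F = pva_transition(0.02)
ok, nis = gated_update(np.array([105.0]), np.array([100.0]),
                       np.array([[4.0]]))
print(F, ok, nis)
```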

Attitude Estimation for the Biped Robot with Vision and Gyro Sensor Fusion (비전 센서와 자이로 센서의 융합을 통한 보행 로봇의 자세 추정)

  • Park, Jin-Seong;Park, Young-Jin;Park, Youn-Sik;Hong, Deok-Hwa
    • Journal of Institute of Control, Robotics and Systems / v.17 no.6 / pp.546-551 / 2011
  • A tilt sensor is required to control the attitude of a biped robot when it walks on uneven terrain. A vision sensor, normally used for recognizing humans or detecting obstacles, can also serve as a tilt-angle sensor by comparing the current image with a reference image. However, the vision sensor alone has significant technological limitations for controlling a biped robot, such as a low sampling frequency and an estimation time delay. To verify these limitations, an experimental setup with an inverted pendulum, which represents the pitch motion of a walking or running robot, is used, and it is shown that the vision sensor alone cannot control the inverted pendulum, mainly because of the time delay. In this paper, to overcome the limitations of the vision sensor, a Kalman filter for multi-rate sensor fusion is applied together with a low-quality gyro sensor. This resolves the limitations of the vision sensor and also eliminates the drift of the gyro sensor. Through an inverted-pendulum control experiment, it is found that the tilt estimation performance of the fused sensors is improved enough to control the attitude of the inverted pendulum.
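
As a hedged sketch of the vision/gyro fusion, the code below runs a two-state Kalman filter (tilt angle and gyro bias) that integrates the gyro at every step and corrects with a sparse vision angle when one is available. The noise parameters and data are illustrative, not the paper's.

```python
import numpy as np

def tilt_fusion(gyro_rates, vision_angles, dt, q_angle=1e-4,
                q_bias=1e-6, r_vision=1e-2):
    """Estimate tilt angle and gyro bias by integrating the gyro at
    every step and correcting with the slower vision angle when one
    is available (None otherwise)."""
    x = np.zeros(2)                       # [tilt angle, gyro bias]
    P = np.eye(2)
    F = np.array([[1.0, -dt], [0.0, 1.0]])
    Q = np.diag([q_angle, q_bias])
    H = np.array([[1.0, 0.0]])
    estimates = []
    for w, z in zip(gyro_rates, vision_angles):
        x = F @ x + np.array([dt * w, 0.0])   # gyro propagation
        P = F @ P @ F.T + Q
        if z is not None:                      # vision correction
            y = z - x[0]
            S = P[0, 0] + r_vision
            K = P[:, 0] / S
            x = x + K * y
            P = (np.eye(2) - np.outer(K, H[0])) @ P
        estimates.append(x[0])
    return estimates

rates = [0.1] * 20                        # rad/s from the gyro
vision = [None] * 20
vision[9], vision[19] = 0.095, 0.195      # sparse vision angles
print(tilt_fusion(rates, vision, dt=0.01)[-1])
```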

A Study on the Fail Safety of Electronics Power Steering Using Sensor Fusion (Sensor Fusion을 이용한 전자식 조향장치의 Fail Safety 연구)

  • Kim, Byeong-Woo;Her, Jin;Cho, Hyun-Duck;Lee, Young-Seok
    • The Transactions of The Korean Institute of Electrical Engineers / v.57 no.8 / pp.1371-1376 / 2008
  • A steer-by-wire (SBW) system has so many advantages over a conventional mechanical steering system that it is expected to play a key role in future environment-friendly vehicles and intelligent transportation systems. The mechanical connection between the hand wheel and the front axle becomes obsolete. An SBW system provides many benefits in terms of functionality, but at the same time presents significant challenges, namely fault tolerance and fail safety. In this paper, a failure analysis of the SBW system is performed and then a sensor fusion technique is proposed for the fail safety of the SBW system. A sensor fusion logic for the steering angle, using the steering angle sensor, torque sensor, and rack position sensor, is developed and evaluated by fault-injection simulation.
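
One plausible form of such a sensor fusion logic is a 2-out-of-3 plausibility check across the steering angle, torque, and rack position sensor paths. The sketch below is an assumption about how such a voting scheme could look, not the paper's actual implementation.

```python
def fuse_steering_angle(sas_deg, torque_est_deg, rack_est_deg,
                        tol_deg=3.0):
    """Majority-style plausibility check: each estimate of the road-
    wheel angle (from the steering angle sensor, the torque sensor
    path and the rack position sensor) votes; a sensor that disagrees
    with both others by more than tol_deg is flagged as failed and
    excluded from the fused value."""
    readings = {"sas": sas_deg, "torque": torque_est_deg,
                "rack": rack_est_deg}
    valid = {}
    for name, val in readings.items():
        agree = sum(1 for other, v in readings.items()
                    if other != name and abs(val - v) <= tol_deg)
        if agree >= 1:
            valid[name] = val
    if not valid:
        return None, list(readings)   # total failure: fall back to safe state
    fused = sum(valid.values()) / len(valid)
    failed = [n for n in readings if n not in valid]
    return fused, failed

print(fuse_steering_angle(10.2, 9.8, 25.0))   # rack sensor fault injected
```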

A Study on Multi Sensor Track Fusion Algorithm for Naval Combat System (함정 전투체계 표적 융합 정확도 향상을 위한 알고리즘 연구)

  • Jung, Young-Ran
    • Journal of the Korea Institute of Military Science and Technology / v.10 no.3 / pp.34-42 / 2007
  • Given today's threats, it is very important for a combat system to process extensive data accurately in a short time for better situation awareness. This paper suggests adding radial velocity as a decision factor of sensor data fusion to the existing algorithm in order to enhance the accuracy of sensor data fusion in the combat system.
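
A minimal sketch of the suggestion, under the assumption that track association uses a weighted cost: adding a radial-velocity term separates targets that are close in position but differ in closing speed. The weights and track values below are invented.

```python
import numpy as np

def track_fusion_cost(local, remote, w_pos=1.0, w_vr=1.0):
    """Decision score for associating a local sensor track with a
    remote track: position distance plus a radial-velocity term, so
    that two targets at similar positions but with different closing
    speeds are no longer confused.

    Each track is (x, y, radial_velocity)."""
    dp = np.hypot(local[0] - remote[0], local[1] - remote[1])
    dvr = abs(local[2] - remote[2])
    return w_pos * dp + w_vr * dvr

radar_track = (1000.0, 500.0, -120.0)     # closing at 120 m/s
other_track_a = (1010.0, 505.0, -118.0)   # likely the same target
other_track_b = (1015.0, 498.0, 60.0)     # crossing target, opening
print(track_fusion_cost(radar_track, other_track_a),
      track_fusion_cost(radar_track, other_track_b))
```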

Distributed Fusion Moving Average Prediction for Linear Stochastic Systems

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • Journal of Sensor Science and Technology / v.28 no.2 / pp.88-93 / 2019
  • This paper is concerned with distributed fusion moving average prediction for continuous-time linear stochastic systems with multiple sensors. A distributed fusion with a weighted-sum structure is applied to the optimal local moving average predictors. The distributed fusion prediction algorithm represents the optimal linear fusion by weighting matrices under the minimum mean square criterion. The derivation of the equations for the error cross-covariances between the local predictors is the key contribution of this paper. An example demonstrates the effectiveness of the distributed fusion moving average predictor.
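
The weighted-sum structure can be sketched for two local predictors using the standard minimum-mean-square (Bar-Shalom-Campo) weights, which require exactly the kind of error cross-covariance the paper derives. The numbers below are illustrative, not from the paper.

```python
import numpy as np

def fuse_two_predictors(x1, P1, x2, P2, P12):
    """Weighted-sum fusion of two local predictors x1, x2 with error
    covariances P1, P2 and cross-covariance P12, using
        W1 = (P2 - P12.T) @ inv(P1 + P2 - P12 - P12.T),  W2 = I - W1."""
    n = x1.shape[0]
    inv_term = np.linalg.inv(P1 + P2 - P12 - P12.T)
    W1 = (P2 - P12.T) @ inv_term
    W2 = np.eye(n) - W1
    x = W1 @ x1 + W2 @ x2
    # Covariance of the weighted sum of the two (correlated) errors.
    P = (W1 @ P1 @ W1.T + W2 @ P2 @ W2.T
         + W1 @ P12 @ W2.T + W2 @ P12.T @ W1.T)
    return x, P

x1, P1 = np.array([1.0, 0.2]), np.diag([0.5, 0.1])
x2, P2 = np.array([1.1, 0.1]), np.diag([0.3, 0.2])
P12 = np.zeros((2, 2))               # uncorrelated local errors assumed
xf, Pf = fuse_two_predictors(x1, P1, x2, P2, P12)
print(xf, np.diag(Pf))
```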

AGV Navigation Using a Space and Time Sensor Fusion of an Active Camera

  • Jin, Tae-Seok;Lee, Bong-Ki;Lee, Jang-Myung
    • Journal of Navigation and Port Research / v.27 no.3 / pp.273-282 / 2003
  • This paper proposes a sensor-fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurements such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples and the effectiveness is demonstrated through simulation. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an indoor environment and the performance is demonstrated by real experiments.
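
As a small sketch of the "transform the previous data into the current frame" step, the code below re-expresses a point observed in the previous robot frame in the current robot frame using an SE(2) odometry increment. The names and values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def to_current_frame(p_prev, odom_dx, odom_dy, odom_dtheta):
    """Re-express a point observed in the robot frame at the previous
    step in the current robot frame, given the odometry increment
    (dx, dy, dtheta) of the robot between the two steps, expressed in
    the previous frame."""
    c, s = np.cos(odom_dtheta), np.sin(odom_dtheta)
    R = np.array([[c, -s], [s, c]])
    t = np.array([odom_dx, odom_dy])
    # p_prev = R @ p_curr + t  =>  p_curr = R.T @ (p_prev - t)
    return R.T @ (np.asarray(p_prev) - t)

# Obstacle seen 1.5 m ahead at the previous step; the robot has since
# moved 0.2 m forward and turned 5 degrees.
print(to_current_frame([1.5, 0.0], 0.2, 0.0, np.deg2rad(5.0)))
```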