• Title/Summary/Keywords: time sensor fusion

Search results: 215 items (processing time: 0.03 s)

Control of the Mobile Robot Navigation Using a New Time Sensor Fusion

  • Tack, Han-Ho;Kim, Chang-Geun;Kim, Myeong-Kyu
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 4 No. 1 / pp.23-28 / 2004
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in unstructured as well as structured environments.
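A minimal sketch of the temporal-fusion idea described in the abstract above: a past range reading is re-expressed in the current robot frame using the odometry accumulated since it was taken, and then fused with the current reading by inverse-variance weighting. The transform, the fusion rule, and all numbers are assumptions for illustration, not the paper's actual STSF formulation.

```python
import numpy as np

def transform_past_range(past_range, past_bearing, dx, dy, dtheta):
    """Re-express a past (range, bearing) reading of a static obstacle in the
    current robot frame, given the odometry displacement (dx, dy, dtheta) of
    the robot since the reading was taken (expressed in the past frame)."""
    # Obstacle position in the past robot frame.
    px = past_range * np.cos(past_bearing)
    py = past_range * np.sin(past_bearing)
    # Move the point into the current frame: subtract the translation,
    # then undo the rotation.
    qx = np.cos(-dtheta) * (px - dx) - np.sin(-dtheta) * (py - dy)
    qy = np.sin(-dtheta) * (px - dx) + np.cos(-dtheta) * (py - dy)
    return float(np.hypot(qx, qy))

def fuse_ranges(ranges, variances):
    """Inverse-variance weighted fusion of the transformed past readings and
    the current reading (a stand-in for the paper's fusion rule)."""
    w = 1.0 / np.asarray(variances)
    return float(np.sum(w * np.asarray(ranges)) / np.sum(w))

# Example: one sonar, three consecutive readings of the same obstacle while
# the robot drives straight toward it at 0.30 m per sample.
r_now = 2.02                                                   # current [m]
r_prev = transform_past_range(2.31, 0.0, dx=0.30, dy=0.0, dtheta=0.0)
r_prev2 = transform_past_range(2.58, 0.0, dx=0.60, dy=0.0, dtheta=0.0)
print(fuse_ranges([r_now, r_prev, r_prev2], [0.01, 0.02, 0.04]))
```

The effect is visible in the example: a single sonar contributes three consistent measurements of the same obstacle, which is what removes the need for additional sensors.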

다중센서 데이터융합 기반 상황추론에서 시간경과를 고려한 클러스터링 기법 (A Novel Clustering Method with Time Interval for Context Inference based on the Multi-sensor Data Fusion)

  • 유창근;박찬봉
    • 한국전자통신학회논문지 / Vol. 8 No. 3 / pp.397-402 / 2013
  • In context awareness using multiple sensors, the passage of time is a factor that must be considered. When trying to reach a context inference based on the information sensed and reported by the sensors, it is useful to examine the data grouped into fixed time intervals. This paper proposes multi-sensor data fusion using a clustering method that takes elapsed time into account. For each sensor, the sensing information collected and reported during a fixed time interval is grouped and fused in a first stage, and a second-stage fusion is then performed on those results. Multi-sensor data fusion is carried out using Dempster-Shafer theory, and by evaluating the results subdivided by time interval and then fusing them again, improved context information can be inferred.
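A minimal sketch of the two-stage Dempster-Shafer combination described above, under one possible reading: within a time interval, each sensor's reports are combined first, and the per-sensor results are then combined again. The frame of discernment and the mass values are made up for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments,
    given as {frozenset of hypotheses: mass}. Returns the combined BPA."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: masses cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical frame of discernment for the situation: {fire, normal}.
FIRE, NORMAL = frozenset({"fire"}), frozenset({"normal"})
THETA = frozenset({"fire", "normal"})   # ignorance

# Stage 1: combine the reports each sensor produced within one time interval.
temp_interval = dempster_combine({FIRE: 0.6, THETA: 0.4}, {FIRE: 0.5, THETA: 0.5})
smoke_interval = dempster_combine({FIRE: 0.7, THETA: 0.3}, {NORMAL: 0.2, THETA: 0.8})

# Stage 2: combine the per-interval, per-sensor results into a context estimate.
print(dempster_combine(temp_interval, smoke_interval))
```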

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 10 No. 4 / pp.281-286 / 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of monitoring systems such as a sonar-sensing system or a visual-sensing system. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in unstructured as well as structured environments, and the experimental results demonstrate the performance of the landmark recognition.
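A rough sketch of how scans fused over time could feed a landmark detector, assuming the past scans have already been re-expressed in the current robot frame (as in the STSF sketch earlier in this list). The detection rule and thresholds are illustrative stand-ins, not the paper's algorithm.

```python
import numpy as np

def fuse_scans(scans):
    """Average several range scans that have already been re-expressed in the
    current robot frame (e.g. by the transform in the STSF sketch above)."""
    return np.mean(np.asarray(scans), axis=0)

def detect_landmarks(fused_scan, angles, threshold=0.4):
    """Flag bearings where the fused range dips sharply below its neighbours,
    a crude stand-in for a corner/reflector landmark detector."""
    hits = []
    for i in range(1, len(fused_scan) - 1):
        neighbourhood = 0.5 * (fused_scan[i - 1] + fused_scan[i + 1])
        if neighbourhood - fused_scan[i] > threshold:
            hits.append((float(angles[i]), float(fused_scan[i])))
    return hits

angles = np.linspace(-np.pi / 4, np.pi / 4, 9)
scan_now  = np.array([3.0, 3.0, 3.1, 2.1, 3.1, 3.0, 3.0, 2.9, 3.0])
scan_prev = np.array([3.1, 3.0, 3.0, 2.2, 3.0, 3.1, 3.0, 3.0, 3.0])
# Fusing the two scans keeps the consistent dip and averages out noise.
print(detect_landmarks(fuse_scans([scan_now, scan_prev]), angles))
```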

Visual Control of Mobile Robots Using Multisensor Fusion System

  • Kim, Jung-Ha;Sugisaka, Masanori
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2001년도 ICCAS / pp.91.4-91 / 2001
  • In this paper, the development of a sensor fusion algorithm for the visual control of a mobile robot is presented. The output data from the visual sensor include a time lag due to the image-processing computation, and the sampling rate of the visual sensor is considerably low, so it should be used together with other sensors to control fast motion. The main purpose of this paper is to develop a method that constitutes a sensor fusion system giving optimal state estimates. The proposed sensor fusion system combines the visual sensor and an inertial sensor using a modified Kalman filter, a kind of multi-rate Kalman filter which treats the slow sampling rate ...
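The multi-rate structure described above can be sketched as a Kalman filter that predicts at the inertial rate and corrects only when a slow vision measurement arrives. The state model, rates, and noise values below are assumptions, and the compensation of the image-processing time lag mentioned in the abstract is not reproduced here.

```python
import numpy as np

dt, N = 0.01, 10            # inertial step 100 Hz; vision every N steps (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity transition
B = np.array([[0.5 * dt**2], [dt]])      # acceleration (inertial) input
H = np.array([[1.0, 0.0]])               # vision measures position only
Q = 1e-4 * np.eye(2)
R = np.array([[0.05**2]])

x = np.zeros((2, 1))        # state: [position, velocity]
P = np.eye(2)

def step(x, P, accel, z_vision=None):
    """One multi-rate cycle: always predict with the inertial reading;
    correct only when a vision measurement is available."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    if z_vision is not None:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z_vision]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(100):
    accel = 0.2                                           # inertial reading (stub)
    z = 0.5 * 0.2 * (dt * (k + 1))**2 if k % N == N - 1 else None
    x, P = step(x, P, accel, z)
print(x.ravel())
```

A full treatment would also shift each vision measurement back by its processing delay before applying the correction.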


비행시험통제컴퓨터용 실시간 데이터 융합 알고리듬의 구현 (Implementation of a Real-time Data fusion Algorithm for Flight Test Computer)

  • 이용재;원종훈;이자성
    • 한국군사과학기술학회지 / Vol. 8 No. 4 / pp.24-31 / 2005
  • This paper presents the implementation of a real-time multi-sensor data fusion algorithm for a flight test computer. The sensor data consist of positional information on the target from a radar, a GPS receiver, and an INS. The data fusion algorithm is designed as a 21st-order distributed Kalman filter based on a PVA model with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad measurements and sensor faults. The statistical parameters for the states are obtained from Monte Carlo simulations and covariance analysis using test tracking data. The designed filter is verified with real data in both post-processing and real-time processing.
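Two ingredients of the abstract lend themselves to a compact sketch: the PVA (position-velocity-acceleration) transition block and an innovation-based gate that rejects bad measurements, a simple stand-in for the fault detection and correction logic. The 21-state structure with sensor biases is not reproduced; matrices and numbers are illustrative assumptions.

```python
import numpy as np

def pva_transition(dt):
    """Position-velocity-acceleration (PVA) transition block for one axis."""
    return np.array([[1.0, dt, 0.5 * dt**2],
                     [0.0, 1.0, dt],
                     [0.0, 0.0, 1.0]])

def innovation_gate(z, x_pred, P_pred, H, R, gate=9.21):
    """Chi-square test on the innovation; a measurement failing the gate is
    treated as bad and skipped (9.21 is the 99% point for 2 degrees of
    freedom); a simple stand-in for fault detection and correction."""
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    d2 = (y.T @ np.linalg.inv(S) @ y).item()
    return d2 <= gate, d2

dt = 0.5
F = np.kron(np.eye(2), pva_transition(dt))       # x and y axes, PVA each
H = np.zeros((2, 6))
H[0, 0] = H[1, 3] = 1.0                          # the sensor reports x, y position
x = np.array([[100.0], [5.0], [0.0], [200.0], [-3.0], [0.0]])
P = np.diag([4.0, 1.0, 0.1, 4.0, 1.0, 0.1])
R = np.diag([25.0, 25.0])

x_pred = F @ x
P_pred = F @ P @ F.T + 1e-3 * np.eye(6)
print(innovation_gate(np.array([[103.0], [198.0]]), x_pred, P_pred, H, R)[0])  # True
print(innovation_gate(np.array([[180.0], [250.0]]), x_pred, P_pred, H, R)[0])  # False
```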

비전 센서와 자이로 센서의 융합을 통한 보행 로봇의 자세 추정 (Attitude Estimation for the Biped Robot with Vision and Gyro Sensor Fusion)

  • 박진성;박영진;박윤식;홍덕화
    • 제어로봇시스템학회논문지 / Vol. 17 No. 6 / pp.546-551 / 2011
  • A tilt sensor is required to control the attitude of a biped robot when it walks on uneven terrain. A vision sensor, which is used for recognizing humans or detecting obstacles, can also serve as a tilt-angle sensor by comparing the current image with a reference image. However, the vision sensor alone has technological limitations for controlling a biped robot, such as a low sampling frequency and an estimation time delay. To verify these limitations, an experimental setup of an inverted pendulum, which represents the pitch motion of a walking or running robot, is used, and it is shown that the vision sensor alone cannot control the inverted pendulum, mainly because of the time delay. In this paper, to overcome the limitations of the vision sensor, a Kalman filter for a multi-rate sensor fusion algorithm is applied with a low-quality gyro sensor. This resolves the limitations of the vision sensor and also eliminates the drift of the gyro sensor. Through the inverted pendulum control experiment, it is found that the tilt estimation performance of the fused sensors is improved enough to control the attitude of the inverted pendulum.
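A minimal sketch of the vision-gyro fusion described above: the gyro drives a high-rate prediction of the tilt angle, a gyro-bias state absorbs the drift, and the low-rate vision angle provides the correction. The rates, noise values, and simulated drift are assumptions for illustration.

```python
import numpy as np

dt, N = 0.005, 20           # gyro at 200 Hz, vision angle at 10 Hz (assumed)
F = np.array([[1.0, -dt],   # angle integrates (gyro rate - bias)
              [0.0, 1.0]])
B = np.array([[dt], [0.0]])
H = np.array([[1.0, 0.0]])  # vision measures the tilt angle directly
Q = np.diag([1e-6, 1e-8])   # process noise: angle / bias random walk
R = np.array([[1e-3]])

x = np.zeros((2, 1))        # state: [tilt angle, gyro bias]
P = np.eye(2)

for k in range(4000):
    gyro = 0.1 + 0.02       # true rate 0.1 rad/s plus a constant 0.02 drift
    x = F @ x + B * gyro    # high-rate prediction from the gyro
    P = F @ P @ F.T + Q
    if k % N == N - 1:      # low-rate correction from the vision angle
        z = 0.1 * dt * (k + 1)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P

print(x.ravel())   # the angle tracks the vision reference; the bias approaches 0.02
```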

Sensor Fusion을 이용한 전자식 조향장치의 Fail Safety 연구 (A Study on the Fail Safety of Electronics Power Steering Using Sensor Fusion)

  • 김병우;허진;조현덕;이영석
    • 전기학회논문지 / Vol. 57 No. 8 / pp.1371-1376 / 2008
  • A steer-by-wire (SBW) system has so many advantages compared with a conventional mechanical steering system that it is expected to play a key role in future environment-friendly vehicles and intelligent transportation systems: the mechanical connection between the hand wheel and the front axle becomes obsolete. The SBW system provides many benefits in terms of functionality, but at the same time presents significant challenges such as fault tolerance and fail safety. In this paper, a failure analysis of the SBW system is performed, and then a sensor fusion technique is proposed for the fail safety of the SBW system. A sensor fusion logic for the steering angle, using the steering angle sensor, torque sensor, and rack position sensor, is developed and evaluated by fault injection simulation.
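A rough sketch of the analytic-redundancy idea behind such a fail-safety logic: the steering angle is estimated independently from the steering angle sensor, the rack position sensor, and a torque-based model, and a median vote flags the faulty source. The sensor models, ratios, and thresholds are hypothetical, not the paper's design.

```python
import statistics

RACK_RATIO_MM_PER_DEG = 0.35    # assumed rack travel per degree of steering

def angle_estimates(steer_deg, rack_mm, torque_nm, stiffness_nm_per_deg=2.0):
    """Three redundant steering-angle estimates: the angle sensor itself, the
    rack position divided by an assumed ratio, and a torsion-bar model that
    adds the torque-induced twist to the rack-derived angle. All parameter
    values are hypothetical and only illustrate the redundancy idea."""
    from_rack = rack_mm / RACK_RATIO_MM_PER_DEG
    from_torque = from_rack + torque_nm / stiffness_nm_per_deg
    return {"steer_sensor": steer_deg, "rack_sensor": from_rack,
            "torque_model": from_torque}

def check_fail_safety(estimates, tol_deg=3.0):
    """Median-voting fault check: any source deviating from the median by more
    than tol_deg is flagged, and the median is used as the fused angle."""
    med = statistics.median(estimates.values())
    faulty = [name for name, v in estimates.items() if abs(v - med) > tol_deg]
    return med, faulty

# Nominal case, then a stuck steering-angle sensor (fault injection).
print(check_fail_safety(angle_estimates(10.0, 3.5, 1.0)))
print(check_fail_safety(angle_estimates(0.0, 3.5, 1.0)))   # sensor stuck at 0
```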

함정 전투체계 표적 융합 정확도 향상을 위한 알고리즘 연구 (A Study on Multi Sensor Track Fusion Algorithm for Naval Combat System)

  • 정영란
    • 한국군사과학기술학회지 / Vol. 10 No. 3 / pp.34-42 / 2007
  • It is very important for a combat system to process extensive data accurately in a short time in order to maintain good situation awareness against present-day threats. This paper suggests adding radial velocity as a decision factor of sensor data fusion in the existing algorithm, to enhance the accuracy of sensor data fusion in the combat system.
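A minimal sketch of using radial velocity as an additional decision factor in track-to-track association, which is the enhancement the abstract suggests. The score function and its weights are assumptions; the example is built so that the two candidate tracks are nearly indistinguishable by position alone and are separated only by the radial-velocity term.

```python
import math

def association_score(track_a, track_b, pos_sigma=50.0, rdot_sigma=5.0):
    """Normalised distance between two sensor tracks using position [m] and
    radial velocity [m/s]; smaller means a better match. The radial-velocity
    term is the added decision factor; the weights are assumed values."""
    dpos = math.hypot(track_a["x"] - track_b["x"], track_a["y"] - track_b["y"])
    drdot = abs(track_a["rdot"] - track_b["rdot"])
    return dpos / pos_sigma + drdot / rdot_sigma

radar_track = {"x": 1200.0, "y": 300.0, "rdot": -85.0}
candidates = [
    {"id": 1, "x": 1180.0, "y": 310.0, "rdot": -83.0},   # same target
    {"id": 2, "x": 1190.0, "y": 295.0, "rdot": 40.0},    # crossing target, closer in position
]
best = min(candidates, key=lambda c: association_score(radar_track, c))
print(best["id"])   # prints 1: radial velocity separates the two candidates
```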

Distributed Fusion Moving Average Prediction for Linear Stochastic Systems

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • 센서학회지 / Vol. 28 No. 2 / pp.88-93 / 2019
  • This paper is concerned with distributed fusion moving average prediction for continuous-time linear stochastic systems with multiple sensors. A distributed fusion with a weighted-sum structure is applied to the optimal local moving average predictors. The distributed fusion prediction algorithm represents the optimal linear fusion by weighting matrices under the minimum mean square criterion. The derivation of the equations for the error cross-covariances between the local predictors is the key contribution of this paper. An example demonstrates the effectiveness of the distributed fusion moving average predictor.
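A small sketch of a weighted-sum fusion of local moving-average predictors under the minimum-mean-square criterion, where the weights come from the predictors' error covariance and cross-covariance matrix, w = C^-1 1 / (1^T C^-1 1). The moving-average predictor, the covariance values, and the data are stand-ins; only the weighting structure reflects what the abstract describes.

```python
import numpy as np

def fusion_weights(error_cov):
    """Minimum-mean-square weights for a weighted-sum fusion of unbiased local
    predictors: w = C^-1 1 / (1^T C^-1 1), where C holds the error covariances
    and cross-covariances of the local predictors."""
    w = np.linalg.inv(error_cov) @ np.ones(len(error_cov))
    return w / w.sum()

def moving_average_predict(samples, window=5):
    """Local moving-average one-step prediction: mean of the last 'window'
    samples (a simple stand-in for the paper's optimal local predictors)."""
    return float(np.mean(samples[-window:]))

# Two sensors observing the same scalar process with different noise levels.
rng = np.random.default_rng(0)
truth = np.linspace(0.0, 1.0, 50)
y1 = truth + rng.normal(0.0, 0.10, 50)   # accurate sensor
y2 = truth + rng.normal(0.0, 0.30, 50)   # noisy sensor

local = np.array([moving_average_predict(y1), moving_average_predict(y2)])
# Assumed error covariance of the local predictors; the off-diagonal term is
# the cross-covariance whose derivation is the key contribution of the paper.
C = np.array([[0.10**2 / 5, 0.001],
              [0.001, 0.30**2 / 5]])
w = fusion_weights(C)
print(w, float(w @ local))   # the accurate sensor receives most of the weight
```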

AGV Navigation Using a Space and Time Sensor Fusion of an Active Camera

  • Jin, Tae-Seok;Lee, Bong-Ki;Lee, Jang-Myung
    • 한국항해항만학회지 / Vol. 27 No. 3 / pp.273-282 / 2003
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulation. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an indoor environment, and the performance was demonstrated by real experiments.