• Title/Abstract/Keyword: time sensor fusion

216 search results (processing time: 0.026 s)

Development of a Monitoring and Verification Tool for Sensor Fusion

  • 김현우;신승환;배상진
    • Transactions of the Korean Society of Automotive Engineers, Vol. 22, No. 3, pp. 123-129, 2014
  • SCC (Smart Cruise Control) and AEBS (Autonomous Emergency Braking System) use various types of sensor data, so the reliability of that data must be considered. In this paper, data from a radar and a vision sensor are fused with a Bayesian sensor fusion technique to improve reliability. The paper then presents a verification tool developed to monitor the acquired sensor data and to verify the sensor fusion results efficiently. A parallel computing method was applied to reduce verification time, and a series of simulation results of this method are discussed in detail.
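
As a rough illustration of the Bayesian fusion step described above (not the paper's implementation), the following Python sketch fuses a radar range and a vision range modeled as independent Gaussian measurements; the sensor variances are assumed values:

```python
def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity.
    The product of Gaussians is a Gaussian whose precision is the
    sum of the input precisions."""
    var_f = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mu_f = var_f * (mu_a / var_a + mu_b / var_b)
    return mu_f, var_f

# Hypothetical readings: radar range is usually the more precise one.
radar_mu, radar_var = 25.3, 0.1      # meters, m^2
vision_mu, vision_var = 24.8, 1.0

mu, var = fuse_gaussian(radar_mu, radar_var, vision_mu, vision_var)
# The fused variance is smaller than either input variance, which is
# the sense in which fusion improves the reliability of the data.
```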

Vision Sensor and Ultrasonic Sensor Fusion Using Neural Network

  • Baek, Sang-Hoon;Oh, Se-Young
    • Institute of Control, Robotics and Systems Conference Proceedings, ICCAS 2004, pp. 668-671, 2004
  • This paper proposes a new method for fusing an ultrasonic sensor and a vision sensor at the sensor level. In a typical vision system, the system finds the edges of objects, while a typical ultrasonic system measures the absolute distance between the robot and an object. The proposed method therefore integrates two different types of data and produces a complete output for robot control. Beyond merely integrating different kinds of data, the paper also fuses the information received from the different sensors. The method is simple to implement as an algorithm and can control the robot in real time.
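
A minimal sketch of what sensor-level fusion with a neural network can look like, assuming hypothetical edge features from the camera and a single ultrasonic range; the tiny untrained network below only illustrates the data flow, not the authors' architecture:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One hidden tanh layer; the output is a steering command in [-1, 1]."""
    h = np.tanh(W1 @ x + b1)
    return np.tanh(W2 @ h + b2)

rng = np.random.default_rng(0)
vision_edges = rng.normal(size=8)        # hypothetical edge features
ultrasonic_range = np.array([0.42])      # meters to the nearest object

# Fusion at the sensor level: concatenate both modalities into one input.
x = np.concatenate([vision_edges, ultrasonic_range])
W1, b1 = 0.1 * rng.normal(size=(6, 9)), np.zeros(6)
W2, b2 = 0.1 * rng.normal(size=(1, 6)), np.zeros(1)
steer = mlp_forward(x, W1, b1, W2, b2)
```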

Sensor Data Fusion for Navigation of Mobile Robot With Collision Avoidance and Trap Recovery

  • Jeon, Young-Su;Ahn, Byeong-Kyu;Kuc, Tae-Yong
    • Institute of Control, Robotics and Systems Conference Proceedings, ICCAS 2003, pp. 2461-2466, 2003
  • This paper presents a simple neural-network sensor fusion algorithm for mobile robot navigation with obstacle avoidance and trap recovery. The sensors feed their data to the input layer of the neural network, activating the input nodes. The sensors used include optical encoders, ultrasonic sensors, infrared sensors, a magnetic compass, and GPS sensors. The proposed sensor fusion algorithm is combined with the VFH (Vector Field Histogram) algorithm for obstacle avoidance and the AGPM (Adaptive Goal Perturbation Method), which sets adaptive virtual goals to escape trap situations. The experimental results show that the proposed low-level fusion algorithm is effective for real-time navigation of a mobile robot.
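
For context, a bare-bones VFH-style step in Python, assuming range readings already fused from the sensors listed above; the sector count, threshold, and weighting are illustrative, and the AGPM recovery is only stubbed:

```python
import numpy as np

def polar_histogram(ranges, bearings, n_sectors=36, max_range=3.0):
    """Accumulate a polar obstacle density from fused range readings."""
    hist = np.zeros(n_sectors)
    for r, theta in zip(ranges, bearings):
        if r < max_range:
            sector = int((theta % (2 * np.pi)) / (2 * np.pi) * n_sectors)
            hist[sector] += max_range - r    # nearer obstacles weigh more
    return hist

def pick_heading(hist, goal_sector, threshold=0.5):
    """Steer toward the free sector closest to the goal direction."""
    free = np.flatnonzero(hist < threshold)
    if free.size == 0:
        return None   # trapped: a recovery scheme such as AGPM takes over
    return int(free[np.argmin(np.abs(free - goal_sector))])
```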

Cylindrical Object Recognition using Sensor Data Fusion

  • 김동기;윤광익;윤지섭;강이석
    • Journal of Institute of Control, Robotics and Systems, Vol. 7, No. 8, pp. 656-663, 2001
  • This paper presents a sensor fusion method for recognizing a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors mounted on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern onto the object surface, and the 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses a transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, also called a matched filter. The distance is then obtained by multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. To determine the position and the radius of cylindrical objects, a statistical sensor fusion method is used. Experimental results show that the fused data increase the reliability of object recognition.
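
The matched-filter step lends itself to a short sketch; the pulse shape, sampling rate, and noise level below are assumptions, while the correlation and peak-picking are the standard technique the abstract names:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

def time_of_flight(received, template, fs):
    """Matched filter: cross-correlate the received signal with the
    transmitted pulse template; the lag of maximum correlation gives
    the echo delay, and the peak amplitude a confidence measure."""
    corr = np.correlate(received, template, mode="full")
    peak = int(np.argmax(corr))
    lag = peak - (len(template) - 1)      # delay in samples
    return lag / fs, float(corr[peak])

fs = 400_000.0                            # 400 kHz sampling (assumed)
t = np.arange(200) / fs
template = np.sin(2 * np.pi * 40_000 * t) * np.hanning(t.size)  # 40 kHz burst
received = np.zeros(4000)
received[1200:1400] += 0.3 * template     # echo delayed by 1200 samples
received += 0.01 * np.random.default_rng(1).normal(size=received.size)

tof, amp = time_of_flight(received, template, fs)
distance = tof * SPEED_OF_SOUND           # as in the abstract; halve for a
                                          # pulse-echo round trip
```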

Hierarchical Behavior Control of Mobile Robot Based on Space & Time Sensor Fusion (STSF)

  • Han, Ho-Tack
    • International Journal of Fuzzy Logic and Intelligent Systems, Vol. 6, No. 4, pp. 314-320, 2006
  • Navigation in environments densely cluttered with obstacles is still a challenge for Autonomous Ground Vehicles (AGVs), especially when the configuration of obstacles is not known a priori. Reactive local navigation schemes that tightly couple robot actions to sensor information have proved effective in these environments, and because of the environmental uncertainties, STSF (Space and Time Sensor Fusion)-based fuzzy behavior systems have been proposed. Realizing autonomous behavior in mobile robots with STSF control based on spatial data fusion requires formulating rules that are collectively responsible for the necessary levels of intelligence. This collection of rules can be conveniently decomposed and efficiently implemented as a hierarchy of fuzzy behaviors. This paper describes how this can be done using a behavior-based architecture; the approach is motivated by ethological models that suggest hierarchical organizations of behavior. Experimental results show that the proposed method can smoothly and effectively guide a robot through cluttered environments such as dense forests.
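
A toy Python sketch of blending a fuzzy obstacle-avoidance behavior with a goal-seeking behavior, in the hierarchical spirit described; the membership shape and the two behaviors are illustrative assumptions, not the paper's rule base:

```python
def mu_near(d, d_near=0.5, d_far=1.5):
    """Fuzzy degree to which an obstacle at distance d is 'near'."""
    return max(0.0, min(1.0, (d_far - d) / (d_far - d_near)))

def avoid(side_bias):
    return side_bias                      # turn away from the blocked side

def seek_goal(goal_bearing):
    return max(-1.0, min(1.0, goal_bearing))

def blended_steering(front_range, side_bias, goal_bearing):
    """Higher-level behavior: weight low-level commands by fuzzy activation."""
    w = mu_near(front_range)
    return w * avoid(side_bias) + (1.0 - w) * seek_goal(goal_bearing)
```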

Centralized Kalman Filter with Adaptive Measurement Fusion: its Application to a GPS/SDINS Integration System with an Additional Sensor

  • Lee, Tae-Gyoo
    • International Journal of Control, Automation, and Systems, Vol. 1, No. 4, pp. 444-452, 2003
  • An integration system with multiple measurement sets can be realized by combining centralized and federated Kalman filters. Compared with a federated Kalman filter, it is difficult for a centralized Kalman filter to remove a failed sensor. All varieties of Kalman filter monitor the innovation sequence (residual) to detect and isolate a failed sensor. The innovation sequence, which serves as an indicator of the real-time estimation error, plays an important role in adaptive mechanism design. In this study, a centralized Kalman filter with adaptive measurement fusion based on the innovation sequence is introduced. The objectives of adaptive measurement fusion are automatic isolation of and recovery from sensor failures, as well as an inherent monitoring capability. The proposed adaptive filter is applied to a GPS/SDINS integration system with an additional sensor. Simulation studies confirm that the proposed adaptive scheme is effective for isolating and recovering from immediate sensor failures.
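
A compact sketch of innovation-based failure detection in a centralized Kalman filter, assuming a linear measurement model; the gating threshold is an assumption, and isolation/recovery would toggle the sensor's weight based on this test:

```python
import numpy as np

def innovation_test(z, H, x_pred, P_pred, R, gate=3.0):
    """Return (healthy, innovation, innovation covariance).
    A sensor is flagged as failed when its normalized innovation
    squared exceeds the gate; it is recovered once it passes again."""
    nu = z - H @ x_pred                   # innovation (residual)
    S = H @ P_pred @ H.T + R              # innovation covariance
    d2 = float(nu @ np.linalg.solve(S, nu))
    return d2 < gate**2 * z.size, nu, S
```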

Localization of Outdoor Wheeled Mobile Robots using Indirect Kalman Filter Based Sensor Fusion

  • 권지욱;박문수;김태은;좌동경;홍석교
    • Journal of Institute of Control, Robotics and Systems, Vol. 14, No. 8, pp. 800-808, 2008
  • This paper presents a localization algorithm for an outdoor wheeled mobile robot using a sensor fusion method based on an indirect Kalman filter (IKF). The wheeled mobile robot considered in this paper is approximated as a two-wheeled mobile robot. The robot carries an IMU and encoders for inertial positioning, along with GPS. Because the IMU and encoders have bias errors, the estimated position can diverge from the measured data when the robot moves for a long time. Because of many natural and artificial conditions (e.g., the atmosphere or the GPS receiver itself), GPS has a maximum error of about 10~20 m even when the robot moves for a short time. Thus, an algorithm fusing the IMU, encoders, and GPS is needed. For sensor fusion, an IKF that estimates the errors of the robot's position is used. The IKF proposed in this paper can also be applied to other autonomous agents (e.g., UAVs or UGVs), because it works on the position errors of the vehicle. The stability of the proposed sensor fusion method is shown using the fact that the covariance of the IKF's error state is bounded. To evaluate the performance of the proposed algorithm, simulation and experimental results of the IKF for the pose (x position, y position, and yaw angle) of an outdoor wheeled mobile robot are presented.
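
A minimal error-state (indirect) update step in Python, assuming a 3-state error vector [dx, dy, dyaw] and a GPS position fix; the measurement model and noise values are illustrative, not the paper's:

```python
import numpy as np

def ikf_update(x_err, P, dr_pos, gps_pos, R):
    """Indirect KF measurement update: the observation is the difference
    between the dead-reckoned (IMU/encoder) position and the GPS fix,
    so the filter estimates errors rather than the pose itself."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    z = dr_pos - gps_pos                  # observed position error
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_err = x_err + K @ (z - H @ x_err)
    P = (np.eye(3) - K @ H) @ P           # bounded error covariance P is
    return x_err, P                       # what the stability claim rests on
```

The corrected pose is then the dead-reckoned pose minus the estimated error, which is why the same filter transfers to other agents that supply their own dead reckoning.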

Implementation of a Sensor Fusion System for Autonomous Guided Robot Navigation in Outdoor Environments

  • 이승환;이헌철;이범희
    • Journal of Sensor Science and Technology, Vol. 19, No. 3, pp. 246-257, 2010
  • Autonomous guided robot navigation, which consists of following unknown paths and avoiding unknown obstacles, is a fundamental technique for unmanned robots in outdoor environments. Following an unknown path requires techniques such as path recognition, path planning, and robot pose estimation. In this paper, we propose a novel sensor fusion system for autonomous guided robot navigation in outdoor environments. The proposed system consists of three monocular cameras and an array of nine infrared range sensors. The two cameras mounted on the robot's right and left sides are used to recognize unknown paths and to estimate the robot's relative pose on these paths through a Bayesian sensor fusion method, while the camera mounted at the front of the robot is used to recognize abrupt curves and unknown obstacles. The infrared range sensor array improves the robustness of obstacle avoidance; the forward camera and the infrared array are fused through a rule-based method. Experiments in outdoor environments show that a mobile robot with the proposed sensor fusion system successfully performed real-time autonomous guided navigation.
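
The rule-based camera/infrared fusion could look roughly like the sketch below; the thresholds and the nine-element array layout are assumptions:

```python
def avoidance_command(camera_sees_obstacle, ir_ranges,
                      stop_dist=0.3, slow_dist=0.8):
    """Rule-based fusion: the forward camera gives a coarse obstacle flag,
    the nine infrared sensors give ranges; the most cautious rule wins."""
    nearest = min(ir_ranges)
    if nearest < stop_dist:
        return "stop"
    if camera_sees_obstacle or nearest < slow_dist:
        # steer toward the side with more infrared clearance
        left, right = sum(ir_ranges[:4]), sum(ir_ranges[5:])
        return "turn_left" if left > right else "turn_right"
    return "go_straight"
```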

Fusion System of Time-of-Flight Sensor and Stereo Cameras Considering Single Photon Avalanche Diode and Convolutional Neural Network

  • 김동엽;이재민;전세웅
    • The Journal of Korea Robotics Society, Vol. 13, No. 4, pp. 230-236, 2018
  • 3D depth perception has played an important role in robotics, and many sensing methods have been proposed for it. As a photodetector for 3D sensing, the single photon avalanche diode (SPAD) is attractive for its sensitivity and accuracy. We have investigated applying a SPAD chip in a fusion system combining a time-of-flight (ToF) sensor and a stereo camera, with the goal of upsampling the SPAD depth resolution using an RGB stereo camera. Our current SPAD ToF sensor has a resolution of only 64 x 32, whereas higher-resolution depth sensors such as the Kinect V2 and Cube-Eye exist. This could be a weak point of our system, but we bridge the gap as follows: a convolutional neural network (CNN) is designed to upsample our low-resolution depth map, using data from a higher-resolution depth sensor as labels. The upsampled CNN depth data and the stereo camera depth data are then fused using the semi-global matching (SGM) algorithm. We propose a simplified fusion method designed for embedded systems.
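
A minimal PyTorch sketch of the upsampling idea, assuming the 64 x 32 SPAD resolution stated above; the layer sizes and the 4x scale are illustrative, not the paper's network:

```python
import torch
import torch.nn as nn

class DepthUpsampler(nn.Module):
    """Upsample a low-resolution depth map, then refine it with a few
    convolutions; trained against higher-resolution depth as labels."""
    def __init__(self, scale=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Upsample(scale_factor=scale, mode="bilinear",
                        align_corners=False),
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),  # refined depth
        )

    def forward(self, lowres_depth):
        return self.net(lowres_depth)

spad = torch.rand(1, 1, 32, 64)        # batch, channel, H, W
upsampled = DepthUpsampler()(spad)     # 128 x 256 refined depth map
# The result would then be fused with the stereo (SGM) depth map.
```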

Robust Data, Event, and Privacy Services in Real-Time Embedded Sensor Network Systems

  • 정강수;;손상혁;박석
    • Journal of KIISE: Databases, Vol. 37, No. 6, pp. 324-332, 2010
  • Event detection in real-time embedded sensor network systems is mostly based on combinations of sensor data collected from the real world. Recent studies have accordingly proposed a variety of low-level mechanisms for collecting and aggregating sensor data. However, system-level solutions for detecting complex events that occur continuously in real time, and for processing real-time data arriving from many kinds of sensors, require further research. In other words, a real-time event detection technique is needed that supports lightweight data fusion and does not demand large computing resources. An event detection framework has the potential to reduce the resource requirements of an embedded sensor network while preserving timeliness, through real-time monitoring and data fusion mechanisms triggered by the arrival of sensor data. This paper also describes an anonymization technique that guarantees privacy, a foundational technology for making embedded sensor network systems trustworthy.
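
As a lightweight illustration of composite event detection over sensor streams (the sensor names, thresholds, and window length are assumptions, not the paper's design):

```python
from collections import deque
import time

class CompositeEventDetector:
    """Fire an event when readings from several sensors satisfy a
    condition within a sliding time window; a lightweight form of
    data fusion suited to resource-constrained nodes."""

    def __init__(self, window_s=2.0, temp_thresh=60.0, smoke_thresh=0.7):
        self.window_s = window_s
        self.temp_thresh = temp_thresh
        self.smoke_thresh = smoke_thresh
        self.readings = deque()          # (timestamp, sensor_id, value)

    def push(self, sensor_id, value, ts=None):
        ts = ts if ts is not None else time.monotonic()
        self.readings.append((ts, sensor_id, value))
        # Drop readings that have fallen out of the time window.
        while self.readings and ts - self.readings[0][0] > self.window_s:
            self.readings.popleft()
        return self._check()

    def _check(self):
        hot = any(s == "temp" and v > self.temp_thresh
                  for _, s, v in self.readings)
        smoky = any(s == "smoke" and v > self.smoke_thresh
                    for _, s, v in self.readings)
        return hot and smoky             # event only when both agree in-window
```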