• Title/Abstract/Keywords: Sensor-Fusion


Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot

  • 김민영;안상태;조형석
    • Journal of Institute of Control, Robotics and Systems / Vol. 16, No. 4 / pp.381-390 / 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor fusion technique in structured environments. Combining sensors with different characteristics and limited sensing capabilities is advantageous because the sensors complement and cooperate with each other to obtain better information about the environment. For robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on a probabilistic reliability function of each sensor, predefined through experiments. For self-localization using monocular vision, the robot extracts vertical edge lines from input camera images and uses them as natural landmark points. With the laser structured light sensor, it instead uses geometrical features, namely corners and planes extracted from range data at a constant height above the navigation floor, as natural landmark shapes. Although either feature group alone can sometimes localize the robot, all features from the two sensors are used and fused simultaneously for reliable localization under various environmental conditions. A series of experiments verifies the advantage of multi-sensor fusion, and the experimental results are discussed in detail.
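The Bayesian combination of two sensor estimates weighted by per-sensor reliability can be sketched as follows. This is an illustrative one-dimensional reduction, not the paper's formulation: the variances stand in for the experimentally predefined reliability functions, and all numbers are made up.

```python
# Hypothetical sketch: fusing two 1-D position estimates by multiplying
# Gaussian likelihoods, which yields a precision-weighted average.

def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Product of two Gaussian likelihoods -> precision-weighted mean."""
    w_a = 1.0 / var_a          # precision (reliability) of sensor A, e.g. vision
    w_b = 1.0 / var_b          # precision (reliability) of sensor B, e.g. laser
    mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    var = 1.0 / (w_a + w_b)    # fused estimate is tighter than either sensor alone
    return mu, var

# A trusted laser reading (small variance) pulls the fused estimate toward it.
mu, var = fuse_gaussian(2.0, 0.04, 2.3, 0.16)
```

Note that the fused variance is always smaller than either input variance, which is the quantitative sense in which fusing complementary sensors improves the estimate.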

Sensor fault diagnosis for bridge monitoring system using similarity of symmetric responses

  • Xu, Xiang;Huang, Qiao;Ren, Yuan;Zhao, Dan-Yang;Yang, Juan
    • Smart Structures and Systems / Vol. 23, No. 3 / pp.279-293 / 2019
  • To ensure that high-quality data are used for data mining or feature extraction in a bridge structural health monitoring (SHM) system, a practical sensor fault diagnosis methodology has been developed based on the similarity of symmetric structural responses. First, the similarity of symmetric responses is discussed using field monitoring data from different sensor types. All sensors are initially paired, and sensor faults are then detected pair by pair to achieve multi-fault diagnosis of the sensor system. To resolve the coupling between structural damage and sensor faults, the similarity for the target zone (where the studied sensor pair is located) is assessed to determine whether localized structural damage or a sensor fault causes the dissimilarity of the studied pair. If at least one sensor of a suspect pair is found to be faulty, a field test can be carried out to support regression analysis based on the monitoring and field test data for sensor fault isolation and reconstruction. Dasarathy's information fusion model is adopted for multi-sensor information fusion, and the Euclidean distance is selected as the similarity index. Finally, a case study demonstrates the effectiveness of the proposed methodology, which is practical for actual engineering and ensures the reliability of further analysis based on monitoring data.
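The pairwise Euclidean-distance check described above can be sketched in a few lines. This is an illustrative reduction: the response vectors and the threshold are invented, and a real deployment would normalise responses and calibrate the threshold from healthy-state data.

```python
import math

# Hypothetical sketch: responses of two symmetrically placed sensors should
# stay similar, so a large Euclidean distance between their response vectors
# flags the pair as suspect for fault isolation.

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def pair_is_suspect(resp_a, resp_b, threshold=1.0):
    return euclidean_distance(resp_a, resp_b) > threshold

healthy = pair_is_suspect([1.0, 1.2, 0.9], [1.1, 1.1, 1.0])  # small distance
faulty = pair_is_suspect([1.0, 1.2, 0.9], [3.0, 3.5, 2.8])   # drifted sensor
```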

An Effective Mapping for a Mobile Robot using Error Backpropagation based Sensor Fusion

  • 김경동;곡효천;최경식;이석규
    • Journal of the Korean Society for Precision Engineering / Vol. 28, No. 9 / pp.1040-1047 / 2011
  • This paper proposes a novel method based on error backpropagation neural networks to fuse laser sensor data and ultrasonic sensor data to enhance mapping accuracy. To navigate, a robot has to know its initial position and accurate information about the environment around it. However, due to its inherent properties, each sensor has its own advantages and drawbacks. In our system, a robot equipped with seven ultrasonic sensors and a laser sensor navigates and maps two different corridor environments. The experimental results show the effectiveness of heterogeneous sensor fusion using an error backpropagation algorithm for mapping.
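The idea of learning fusion weights by error backpropagation can be illustrated at its smallest scale: a single linear neuron trained by gradient descent to combine an ultrasonic and a laser range reading. The training data, learning rate, and sensor bias are invented for illustration; the paper's network is of course larger.

```python
# Illustrative sketch: gradient descent on squared error (backpropagation
# reduced to one linear neuron) learns how much to trust each range sensor.

def train(samples, lr=0.02, epochs=3000):
    w_us, w_laser = 0.5, 0.5
    for _ in range(epochs):
        for us, laser, target in samples:
            pred = w_us * us + w_laser * laser
            err = pred - target
            w_us -= lr * err * us          # d(err^2)/d(w_us) up to a factor of 2
            w_laser -= lr * err * laser
    return w_us, w_laser

# The laser reading equals the true range; the ultrasonic reading reads long.
# The learned weights should come to favour the laser.
samples = [(1.2, 1.0, 1.0), (2.3, 2.0, 2.0), (3.4, 3.0, 3.0)]
w_us, w_laser = train(samples)
```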

A Correction System of Odometry Error for Map Building of Mobile Robot Based on Sensor Fusion

  • Hyun, Woong-Keun
    • Journal of Information and Communication Convergence Engineering / Vol. 8, No. 6 / pp.709-715 / 2010
  • This paper presents a map building and localization system for a mobile robot. Map building and navigation is a complex problem because map integrity cannot be sustained by odometry alone, due to errors introduced by wheel slippage, distortion, and the simplified linearized odometry equations. For accurate localization, we propose a sensor fusion system using an encoder and an indoor GPS module as the relative and absolute sensors, respectively. To build the map, we developed a sensor-based navigation algorithm and a grid-based map building algorithm running on embedded Linux. A wall-following decision engine, similar to an expert system, was proposed for map building navigation. We proved the system's validity through field tests.
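The encoder-based dead reckoning that such a system corrects with an absolute sensor can be sketched as a standard differential-drive odometry update. The wheel travel distances and wheelbase below are illustrative, not from the paper.

```python
import math

# Hypothetical sketch of encoder odometry: integrate left/right wheel travel
# into a pose update. Errors in d_left/d_right (slippage) accumulate, which
# is why an absolute sensor such as indoor GPS is needed to correct the pose.

def odometry_step(x, y, theta, d_left, d_right, wheelbase):
    d = (d_left + d_right) / 2.0              # distance travelled by the centre
    dtheta = (d_right - d_left) / wheelbase   # heading change
    return (x + d * math.cos(theta + dtheta / 2.0),
            y + d * math.sin(theta + dtheta / 2.0),
            theta + dtheta)

# Equal wheel travel: the robot moves straight ahead along x.
x, y, theta = odometry_step(0.0, 0.0, 0.0, d_left=0.10, d_right=0.10, wheelbase=0.3)
```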

Development of Sensor Fusion-Based Low-Speed Short-Distance Collision Warning Algorithm for Urban Area

  • 전종기;김만호;이석;이경창
    • Journal of the Institute of Embedded Engineering of Korea (IEMEK) / Vol. 6, No. 3 / pp.157-167 / 2011
  • Although vehicles are becoming more intelligent for driver convenience and safety, traffic accidents continue to increase. In particular, single rear-end car-to-car impacts in urban areas are increasing rapidly because of driver inattention. To prevent rear impacts in urban areas, commercial automobile vendors apply low-speed short-distance collision warning systems. This paper presents a low-speed short-distance collision warning algorithm for city driving based on the sensor fusion of a laser sensor and an ultrasonic sensor. An experiment on a driving track using an embedded microprocessor demonstrated the feasibility of the collision warning algorithm.
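A minimal warning rule of the kind such systems build on can be sketched with a time-to-collision threshold. This is a generic illustration, not the paper's algorithm: the 1.5 s threshold and the assumption of a single fused range/closing-speed input are invented.

```python
# Illustrative sketch: warn when the fused range divided by the closing
# speed (time to collision, TTC) falls below a threshold.

def collision_warning(range_m, closing_speed_mps, ttc_threshold_s=1.5):
    if closing_speed_mps <= 0.0:      # opening or stationary: no warning
        return False
    return range_m / closing_speed_mps < ttc_threshold_s

warn = collision_warning(3.0, 4.0)    # 0.75 s to impact -> warn
safe = collision_warning(20.0, 4.0)   # 5 s to impact -> no warning
```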

Design of a Multi-Sensor Data Simulator and Development of Data Fusion Algorithm

  • 이용재;이자성;고선준;송종화
    • Journal of the Korean Society for Aeronautical & Space Sciences / Vol. 34, No. 5 / pp.93-100 / 2006
  • This paper introduces the design of a simulator that generates multi-sensor data as received from radars and a telemetry system, and the development of an algorithm for fusing these data. The designed data simulator produces simulated sensor data with the time asynchrony, communication delays, and multiple update rates encountered with real sensor systems, and generates measurement noise using realistic sensor models. The fusion algorithm is designed as a 21-state decentralized Kalman filter based on a PVA model that includes sensor bias states, and it also includes logic for detecting sensor anomalies and abnormal measurements. The designed algorithm was verified by applying both simulated data generated by the simulator and real data.
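The PVA (position-velocity-acceleration) state model underlying such a filter can be sketched for a single axis. This is an illustrative prediction step only; the time step and initial state are invented, and the paper's filter carries 21 states including sensor biases.

```python
# Hypothetical sketch of one PVA prediction step: constant-acceleration
# kinematics propagate position and velocity over a time step dt.

def pva_predict(state, dt):
    p, v, a = state
    return (p + v * dt + 0.5 * a * dt * dt,  # position
            v + a * dt,                      # velocity
            a)                               # acceleration held constant

state = (0.0, 10.0, 2.0)        # p [m], v [m/s], a [m/s^2]
state = pva_predict(state, 0.1)
```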

Asynchronous Sensor Fusion using Multi-rate Kalman Filter

  • 손영섭;김원희;이승희;정정주
    • The Transactions of the Korean Institute of Electrical Engineers / Vol. 63, No. 11 / pp.1551-1558 / 2014
  • We propose a multi-rate sensor fusion of vision and radar using a Kalman filter to solve the problems of asynchronous, multi-rate sampling periods in object vehicle tracking. Model-based prediction of object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve position prediction performance, different weights are applied to each sensor's predicted object position from the multi-rate Kalman filter. The proposed method provides estimated positions of the object vehicles at every ECU sampling time. The Mahalanobis distance is used to establish correspondence between measured and predicted objects. The experimental results validate that the post-processed fusion data improve tracking performance: in terms of root mean square error, the proposed method achieved a twofold improvement in object tracking compared to a single-sensor method (camera or radar).
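The Mahalanobis-distance gating used for measurement-to-prediction correspondence can be sketched as follows. To stay dependency-free, the example assumes a diagonal covariance; the gate value of 3.0 (roughly a 3-sigma test) and the positions are illustrative.

```python
import math

# Hypothetical sketch: a measured object is associated with a predicted
# object only if their Mahalanobis distance falls inside the gate.

def mahalanobis_diag(meas, pred, variances):
    """Mahalanobis distance with a diagonal covariance matrix."""
    return math.sqrt(sum((m - p) ** 2 / s for m, p, s in zip(meas, pred, variances)))

def associated(meas, pred, variances, gate=3.0):
    return mahalanobis_diag(meas, pred, variances) < gate

near = associated([10.2, 5.1], [10.0, 5.0], [0.25, 0.25])  # inside the gate
far = associated([14.0, 9.0], [10.0, 5.0], [0.25, 0.25])   # a different object
```

Unlike a plain Euclidean gate, this test scales each axis by its uncertainty, so a sloppy sensor axis tolerates larger deviations than a precise one.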

An Efficient Outdoor Localization Method Using Multi-Sensor Fusion for Car-Like Robots

  • 배상훈;김병국
    • Journal of Institute of Control, Robotics and Systems / Vol. 17, No. 10 / pp.995-1005 / 2011
  • An efficient outdoor localization method using multi-sensor fusion with an MU-EKF (Multi-Update Extended Kalman Filter) is suggested for car-like mobile robots. In outdoor environments, where mobile robots are used for exploration or military services, accurate localization with multiple sensors is indispensable. In this paper, a multi-sensor fusion outdoor localization algorithm is proposed that fuses data from an LRF (Laser Range Finder), encoders, and GPS. First, encoder data are used in the prediction stage of the MU-EKF. The LRF data obtained by scanning the environment are then used to extract objects, and the robot's position and orientation are estimated by matching against map objects, as the first update stage of the MU-EKF. This estimate is finally fused with GPS as the second update stage. The MU-EKF algorithm can efficiently fuse three or more sensor data streams, even with different sampling periods, and ensures high localization accuracy. The validity of the proposed algorithm is shown via experiments.
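The multi-update idea, one prediction refined by successive measurement updates, can be sketched on a scalar state. This is an illustrative reduction of the pattern, not the paper's MU-EKF: the two updates stand in for the LRF-based and GPS-based stages, and all numbers are invented.

```python
# Hypothetical scalar sketch: a predicted position is refined by two
# successive Kalman measurement updates with different noise variances.

def kalman_update(x, p, z, r):
    """One scalar Kalman update: state x, variance p, measurement z, noise r."""
    k = p / (p + r)                  # Kalman gain
    return x + k * (z - x), (1 - k) * p

x, p = 5.0, 4.0                      # predicted position and variance
x, p = kalman_update(x, p, 5.6, 1.0)  # first update: LRF-map-based estimate
x, p = kalman_update(x, p, 5.2, 0.5)  # second update: GPS fix
```

Each update shrinks the variance, so chaining updates from several sensors in one cycle is what makes the multi-update structure pay off.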

Implementation of a Sensor Fusion FPGA for an IoT System

  • 정창민;이광엽;박태룡
    • Journal of IKEEE / Vol. 19, No. 2 / pp.142-147 / 2015
  • This paper proposes the design of a Kalman filter based sensor fusion filter that corrects and fuses information obtained from a gyroscope and an accelerometer to estimate attitude. Recent advances in sensor network technology have increased the demand for sensor data fusion techniques. In this work, the filter's nonlinear system model is converted into a linear system model through a Jacobian matrix computation, and the estimate is predicted through Euler integration. The proposed filter was implemented on a Xilinx Virtex-6 FPGA board. The implemented filter operates at a 74 MHz clock frequency, and the accuracy and reliability of the estimated attitude were confirmed by comparing the implemented filter with existing filters.
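The paper's fusion is Kalman-based; as a simpler stand-in that captures the same division of labour, the sketch below uses a complementary filter: the gyro is trusted at short timescales while the accelerometer's gravity direction bounds long-term drift. The blend factor, time step, and sensor values are all invented.

```python
import math

# Illustrative complementary filter (a stand-in, not the paper's Kalman
# design): blend the integrated gyro rate with the tilt angle implied by
# the accelerometer's gravity measurement.

def complementary_step(angle, gyro_rate, ax, az, dt, alpha=0.98):
    accel_angle = math.atan2(ax, az)  # tilt implied by gravity direction
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# A stationary IMU with a biased gyro: pure integration would drift to
# 0.01 rad over one second, but the accelerometer term bounds the error.
angle = 0.0
for _ in range(100):
    angle = complementary_step(angle, gyro_rate=0.01, ax=0.0, az=9.81, dt=0.01)
```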

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • Journal of the Korean Society of Industrial Convergence / Vol. 22, No. 4 / pp.427-433 / 2019
  • This paper proposes a fuzzy inference model for map building and navigation for a mobile robot with an active camera, which navigates intelligently to a goal location in unknown environments using sensor fusion, based on situational commands from the active camera sensor. Active cameras provide a mobile robot with the capability to estimate and track feature images over a hallway field of view. Instead of a "physical sensor fusion" method, which generates the robot's trajectory from an environment model and sensory data, a command fusion method is used to govern robot navigation. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the active camera sensor data from navigation experiments are fused into the identification process. Navigation performance improves on that achieved using fuzzy inference alone. Experimental evidence demonstrates that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.
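The core of command fusion, as opposed to fusing raw sensor data, is that each behaviour proposes a command with a confidence and the commands themselves are blended. A minimal sketch, with made-up commands and confidences standing in for defuzzified fuzzy-rule outputs:

```python
# Illustrative sketch of command fusion: goal-approach and obstacle-avoidance
# behaviours each propose a steering command with a confidence, and the fused
# command is their confidence-weighted average.

def fuse_commands(proposals):
    """proposals: list of (steering_command, confidence) pairs."""
    total = sum(conf for _, conf in proposals)
    return sum(cmd * conf for cmd, conf in proposals) / total

goal_approach = (0.0, 0.3)    # steer straight toward the goal, weak confidence
obstacle_avoid = (0.5, 0.7)   # steer away from an obstacle, strong confidence
steer = fuse_commands([goal_approach, obstacle_avoid])
```

Because fusion happens at the command level, each behaviour can use whichever sensors suit it, and the arbitration reduces to weighting their outputs.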