• Title/Abstract/Keywords: sensor data fusion

Search results: 382 (processing time: 0.032 s)

Sensor Data Fusion for Navigation of Mobile Robot With Collision Avoidance and Trap Recovery

  • Jeon, Young-Su; Ahn, Byeong-Kyu; Kuc, Tae-Yong
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2003 / pp. 2461-2466 / 2003
  • This paper presents a simple sensor fusion algorithm using a neural network for navigation of mobile robots with obstacle avoidance and trap recovery. The multiple sensors feed their data into the input layer of the neural network, activating the input nodes. The sensors used include optical encoders, ultrasonic sensors, infrared sensors, a magnetic compass sensor, and GPS sensors. The proposed sensor fusion algorithm is combined with the VFH (Vector Field Histogram) algorithm for obstacle avoidance and the AGPM (Adaptive Goal Perturbation Method), which sets adaptive virtual goals to escape trap situations. The experimental results show that the proposed low-level fusion algorithm is effective for real-time navigation of a mobile robot.

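The entry above describes feeding readings from encoders, ultrasonic, infrared, compass, and GPS sensors into the input layer of a neural network that outputs motion commands. A minimal sketch of that kind of low-level fusion, assuming a hypothetical fixed-size reading vector and a tiny, untrained fully connected network (not the authors' trained model or their VFH/AGPM stages):

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_sensor_readings(readings, w_hidden, w_out):
    """Map a concatenated vector of sensor readings to a motion command.

    readings : 1-D array, e.g. [enc_l, enc_r, us_1..us_n, ir_1..ir_m,
               compass, gps_x, gps_y] -- this layout is an assumption here.
    """
    x = np.asarray(readings, dtype=float)
    h = np.tanh(w_hidden @ x)          # hidden-layer activation
    v, omega = np.tanh(w_out @ h)      # linear and angular velocity commands
    return v, omega

# Toy dimensions: 12 raw inputs, 8 hidden units, 2 outputs (random weights).
n_in, n_hidden = 12, 8
w_hidden = rng.normal(scale=0.1, size=(n_hidden, n_in))
w_out = rng.normal(scale=0.1, size=(2, n_hidden))

sample = rng.uniform(0.0, 1.0, size=n_in)      # normalized sensor readings
print(fuse_sensor_readings(sample, w_hidden, w_out))
```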

Motion Estimation of 3D Planar Objects using Multi-Sensor Data Fusion

  • 양우석
    • Journal of Sensor Science and Technology / Vol. 5, No. 4 / pp. 57-70 / 1996
  • Motion can be estimated continuously from each sensor through analysis of the instantaneous states of an object. This paper introduces a method to estimate the general 3D motion of a planar object from its instantaneous states using multi-sensor data fusion. The instantaneous states of the object are estimated using a linear feedback estimation algorithm. The motion estimated from each sensor is fused to provide more accurate and reliable information about the motion of an unknown planar object. We present a fusion algorithm that combines averaging and deciding. Under the assumption that the motion is smooth, the approach can handle data sequences from multiple sensors with different sampling times. Simulation results show that the proposed algorithm is advantageous in terms of accuracy, speed, and versatility.

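The entry above fuses per-sensor motion estimates with a rule that "combines averaging and deciding" across sensors with different sampling times. A minimal sketch of one way to realize that idea, assuming each sensor delivers a motion estimate with a scalar variance; the median-based gating and inverse-variance weights are illustrative choices, not the paper's exact rule:

```python
import numpy as np

def fuse_estimates(estimates, variances, gate=9.0):
    """Decide-then-average fusion of per-sensor motion estimates.

    estimates : one equal-length vector (e.g. [vx, vy]) per sensor
    variances : one positive scalar per sensor
    gate      : threshold for rejecting estimates far from the median
    """
    est = np.array(estimates, dtype=float)
    var = np.array(variances, dtype=float)

    # Deciding step: drop estimates that disagree with the robust center.
    center = np.median(est, axis=0)
    d2 = ((est - center) ** 2).sum(axis=1) / var
    keep = d2 < gate

    # Averaging step: inverse-variance weighted mean of the survivors.
    w = 1.0 / var[keep]
    return (w[:, None] * est[keep]).sum(axis=0) / w.sum()

sensor_estimates = [[0.50, 0.10], [0.48, 0.12], [2.00, 1.00]]   # third sensor is off
sensor_variances = [0.01, 0.02, 0.05]
print(fuse_estimates(sensor_estimates, sensor_variances))       # ~[0.49, 0.11]
```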

Efficient Data Transmission Scheme with Data Fusion inside a Smart Vessel

  • 김연근; 이성로; 정민아; 김범무; 민상원
    • The Journal of Korean Institute of Communications and Information Sciences / Vol. 39C, No. 11 / pp. 1146-1150 / 2014
  • Interest in research on smart-vessel technology has recently been increasing. As smart-vessel technology develops, however, the data collected by sensor nodes can overload the network because of the growing number of transmissions. In this paper, therefore, a technique that processes the data using data fusion is applied to reduce the transmission frequency of the sensor nodes inside a smart vessel. The data collected inside the smart vessel is processed into meaningful data before being transmitted to the center, and a scheme is presented that reduces the network load and transmits the data efficiently.
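
The scheme above reduces network load by fusing raw readings at the sensor node and transmitting only the processed result. A minimal sketch of one such node-side rule (window averaging plus a change threshold); the window length, threshold, and `send` callback are illustrative assumptions, not values from the paper:

```python
from statistics import mean

def aggregate_and_send(readings, window=10, threshold=0.5, send=print):
    """Fuse raw readings into windowed averages and transmit only when the
    fused value has changed enough since the last transmission."""
    last_sent = None
    for start in range(0, len(readings), window):
        fused = mean(readings[start:start + window])      # node-side fusion
        if last_sent is None or abs(fused - last_sent) >= threshold:
            send(f"transmit fused value: {fused:.2f}")     # at most one packet per window
            last_sent = fused

# Example: 30 temperature-like samples collapse into two transmissions.
samples = [20.1, 20.2, 20.1, 20.3, 20.2, 20.1, 20.2, 20.3, 20.1, 20.2,
           20.2, 20.1, 20.3, 20.2, 20.2, 20.1, 20.2, 20.1, 20.3, 20.2,
           22.5, 22.6, 22.4, 22.7, 22.5, 22.6, 22.5, 22.4, 22.6, 22.5]
aggregate_and_send(samples)
```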

MULTI-SENSOR DATA FUSION FOR FUTURE TELEMATICS APPLICATION

  • Kim, Seong-Baek; Lee, Seung-Yong; Choi, Ji-Hoon; Choi, Kyung-Ho; Jang, Byung-Tae
    • Journal of Astronomy and Space Sciences / Vol. 20, No. 4 / pp. 359-364 / 2003
  • In this paper, we present multi-sensor data fusion for a telematics application. Successful telematics can be realized through the integration of navigation and spatial information, and well-determined acquisition of the vehicle's position plays a vital role in the application service. GPS is used to provide the navigation data, but its performance is limited in areas with poor satellite visibility. Hence, multi-sensor fusion of an IMU (Inertial Measurement Unit), GPS (Global Positioning System), and DMI (Distance Measurement Indicator) is required to provide the vehicle's position to the service provider and the driver behind the wheel. The multi-sensor fusion is implemented with an algorithm based on the Kalman filtering technique, which enhances navigation accuracy. To verify the fusion approach, a land-vehicle test was performed and the results were discussed. The horizontal position errors were suppressed to around 1 m accuracy under a simulated GPS-denied environment. Under a normal GPS environment, the horizontal position errors were under 40 cm on a curved trajectory and 27 cm on a linear trajectory, depending on the vehicle dynamics.
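
The abstract above fuses IMU, GPS, and DMI data through a Kalman filter so that position accuracy is maintained during GPS outages. A minimal 1-D constant-velocity Kalman filter sketch of that pattern (dead-reckoned speed drives the prediction, a position update is applied only when a GPS fix is available); the state model and noise values are illustrative assumptions, not the paper's filter design:

```python
import numpy as np

def kalman_fuse(dr_speeds, gps_fixes, dt=1.0, q=0.05, r=4.0):
    """1-D position/velocity Kalman filter.

    dr_speeds : dead-reckoned speeds (e.g. from IMU/DMI), one per step
    gps_fixes : GPS positions, or None where GPS is unavailable
    """
    x = np.zeros(2)                          # state: [position, velocity]
    P = np.eye(2)                            # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity model
    Q = q * np.eye(2)
    H = np.array([[1.0, 0.0]])               # GPS measures position only
    R = np.array([[r]])
    track = []
    for v, z in zip(dr_speeds, gps_fixes):
        x[1] = v                             # inject dead-reckoned velocity
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        if z is not None:                    # GPS update when a fix exists
            y = np.array([z]) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        track.append(x[0])
    return track

speeds = [1.0] * 10                                             # 1 m/s from IMU/DMI
fixes = [0.9, 2.1, None, None, 4.8, 6.2, None, 8.1, 9.0, 10.2]  # GPS dropouts
print([round(p, 2) for p in kalman_fuse(speeds, fixes)])
```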

A Study of Observability Analysis and Data Fusion for Bias Estimation in a Multi-Radar System

  • 원건희; 송택렬; 김다솔; 서일환; 황규환
    • Journal of Institute of Control, Robotics and Systems / Vol. 17, No. 8 / pp. 783-789 / 2011
  • Improving target tracking performance using multi-sensor data fusion is a challenging task, and biases in the measurements should be removed before the various data fusion techniques are applied. In this paper, a bias-removal algorithm using measurement data from multi-radar tracking systems is proposed and evaluated by computer simulation. To predict bias estimation performance under various geometric relations between the radar systems and the target, a system observability index is proposed and tested via computer simulation. It is also shown that target tracking that utilizes multi-sensor data fusion with bias-removed measurements gives better performance.
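
The abstract above is about estimating and removing measurement biases between radars before their tracks are fused. A minimal sketch of the underlying idea, assuming two radars observe the same target positions and radar B carries a constant additive offset; this is a plain least-squares fit of the relative bias, not the paper's estimator or its observability index:

```python
import numpy as np

def estimate_relative_bias(meas_a, meas_b):
    """Least-squares estimate of the constant offset between two radars that
    observe the same target, i.e. meas_b ~= meas_a + bias + noise."""
    diff = np.asarray(meas_b, dtype=float) - np.asarray(meas_a, dtype=float)
    return diff.mean(axis=0)                 # LS solution for a constant bias

rng = np.random.default_rng(1)
truth = np.column_stack([np.linspace(0, 100, 50), np.linspace(0, 50, 50)])  # target track
bias = np.array([3.0, -2.0])                                                # radar B offset
radar_a = truth + rng.normal(scale=0.5, size=truth.shape)
radar_b = truth + bias + rng.normal(scale=0.5, size=truth.shape)

est = estimate_relative_bias(radar_a, radar_b)
print("estimated relative bias:", est)       # close to [3.0, -2.0]
radar_b_debiased = radar_b - est             # measurements ready for fusion
```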

A Novel Clustering Method with Time Interval for Context Inference based on the Multi-sensor Data Fusion

  • 유창근; 박찬봉
    • The Journal of the Korea Institute of Electronic Communication Sciences / Vol. 8, No. 3 / pp. 397-402 / 2013
  • In context awareness based on multiple sensors, the passage of time is a factor that must be considered. When context inference is to be reached from the information detected and reported by the sensors, it is useful to examine the data grouped by fixed time intervals. This paper proposes multi-sensor data fusion using a clustering technique that takes elapsed time into account. The sensing information collected and reported by each sensor during a fixed time interval is grouped for a first-stage data fusion, and a second-stage fusion is then performed on the results. Multi-sensor data fusion is carried out using Dempster-Shafer theory and the results are analyzed to infer the context; by evaluating the evidence subdivided by time interval and then fusing it again, improved context information can be inferred.
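
The method above relies on Dempster-Shafer theory to fuse the evidence reported by several sensors within each time interval. A minimal sketch of Dempster's rule of combination for two mass functions over a small frame of discernment; the frame ({fire, normal}) and the mass values are illustrative assumptions, not the paper's sensor model:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts of frozenset -> mass)
    with Dempster's rule: keep non-empty intersections, renormalize."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    k = 1.0 - conflict                      # normalization constant
    return {s: v / k for s, v in combined.items()}

# Toy frame of discernment: {fire, normal}; two sensors report in one interval.
FIRE, NORMAL = frozenset({"fire"}), frozenset({"normal"})
THETA = FIRE | NORMAL                       # total ignorance
m_temp  = {FIRE: 0.6, THETA: 0.4}                 # temperature sensor
m_smoke = {FIRE: 0.7, NORMAL: 0.1, THETA: 0.2}    # smoke sensor

print(dempster_combine(m_temp, m_smoke))    # belief in "fire" rises after fusion
```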

Sensor fault diagnosis for bridge monitoring system using similarity of symmetric responses

  • Xu, Xiang; Huang, Qiao; Ren, Yuan; Zhao, Dan-Yang; Yang, Juan
    • Smart Structures and Systems / Vol. 23, No. 3 / pp. 279-293 / 2019
  • To ensure that high-quality data are used for data mining or feature extraction in a bridge structural health monitoring (SHM) system, a practical sensor fault diagnosis methodology has been developed based on the similarity of symmetric structural responses. First, the similarity of symmetric responses is discussed using field monitoring data from different sensor types. All sensors are initially paired, and sensor faults are then detected pair by pair to achieve multi-fault diagnosis of the sensor system. To resolve the coupling between structural damage and sensor faults, the similarity for the target zone (where the studied sensor pair is located) is assessed to determine whether localized structural damage or a sensor fault causes the dissimilarity of the studied sensor pair. If a suspected pair is detected with at least one faulty sensor, a field test can be implemented to support regression analysis based on the monitoring and field test data for sensor fault isolation and reconstruction. Dasarathy's information fusion model is adopted for multi-sensor information fusion, and Euclidean distance is selected as the index to assess similarity. Finally, a case study demonstrates the effectiveness of the proposed methodology and shows that it is practical for actual engineering, ensuring the reliability of further analysis based on monitoring data.
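
The methodology above pairs symmetrically placed sensors and flags a pair as suspect when the Euclidean-distance-based similarity of their responses drops. A minimal sketch of that pairwise check; the normalization and the 0.6 threshold are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def pair_similarity(resp_a, resp_b):
    """Similarity of two symmetric sensor responses from Euclidean distance,
    mapped to (0, 1]: identical signals give 1, diverging signals approach 0."""
    a, b = np.asarray(resp_a, float), np.asarray(resp_b, float)
    d = np.linalg.norm(a - b) / np.sqrt(len(a))        # RMS Euclidean distance
    scale = 0.5 * (np.std(a) + np.std(b)) + 1e-9       # normalize by signal spread
    return 1.0 / (1.0 + d / scale)

def check_pair(resp_a, resp_b, threshold=0.6):
    """Flag the sensor pair as suspect when similarity falls below threshold."""
    s = pair_similarity(resp_a, resp_b)
    return s, s < threshold

t = np.linspace(0, 10, 500)
healthy = np.sin(t) + 0.02 * np.random.default_rng(2).normal(size=t.size)
faulty = 0.5 * np.sin(t) + 1.5                         # simulated gain/offset fault
print(check_pair(healthy, np.sin(t)))                  # similar    -> not flagged
print(check_pair(healthy, faulty))                     # dissimilar -> flagged
```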

An Efficient Outdoor Localization Method Using Multi-Sensor Fusion for Car-Like Robots

  • 배상훈; 김병국
    • Journal of Institute of Control, Robotics and Systems / Vol. 17, No. 10 / pp. 995-1005 / 2011
  • An efficient outdoor localization method using multi-sensor fusion with an MU-EKF (Multi-Update Extended Kalman Filter) is suggested for car-like mobile robots. In outdoor environments, where mobile robots are used for exploration or military services, accurate localization with multiple sensors is indispensable. The proposed algorithm fuses sensor data from an LRF (Laser Range Finder), an encoder, and GPS. First, the encoder data is used for the prediction stage of the MU-EKF. Then the LRF data obtained by scanning the environment is used to extract objects, and the robot position and orientation are estimated by matching them against map objects, as the first update stage of the MU-EKF. This estimate is finally fused with GPS as the second update stage. The MU-EKF algorithm can efficiently fuse three or more sensors, even with different sampling periods, and ensures high localization accuracy. The validity of the proposed algorithm is demonstrated through experiments.
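
The abstract above describes an MU-EKF in which encoder odometry drives the prediction while LRF map matching and GPS each contribute a separate update. A minimal sketch of that predict-then-multiple-update pattern on a 2-D position state; the direct-position measurement models and the noise levels are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def predict(x, P, u, Q):
    """Prediction from encoder odometry: the state moves by displacement u."""
    return x + u, P + Q

def update(x, P, z, R):
    """One Kalman update with a direct position measurement (H = I)."""
    S = P + R
    K = P @ np.linalg.inv(S)
    x = x + K @ (z - x)
    P = (np.eye(len(x)) - K) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)        # state [x, y] and its covariance
Q = 0.05 * np.eye(2)                 # odometry noise
R_lrf = 0.10 * np.eye(2)             # LRF map-matching noise
R_gps = 4.00 * np.eye(2)             # GPS noise

odometry = [np.array([1.0, 0.0])] * 5
lrf_fixes = [np.array([i + 1.0, 0.05]) for i in range(5)]
gps_fixes = [np.array([i + 0.7, -0.90]) for i in range(5)]

for u, z_lrf, z_gps in zip(odometry, lrf_fixes, gps_fixes):
    x, P = predict(x, P, u, Q)          # stage 1: encoder prediction
    x, P = update(x, P, z_lrf, R_lrf)   # stage 2: first update (LRF map matching)
    x, P = update(x, P, z_gps, R_gps)   # stage 3: second update (GPS)
    print(np.round(x, 2))
```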

Command Fusion for Navigation of Mobile Robots in Dynamic Environments with Objects

  • Jin, Taeseok
    • Journal of Information and Communication Convergence Engineering / Vol. 11, No. 1 / pp. 24-29 / 2013
  • In this paper, we propose a fuzzy inference model for a navigation algorithm that lets a mobile robot intelligently search for a goal location in unknown dynamic environments. Our model uses sensor fusion based on situational commands from an ultrasonic sensor. Instead of the "physical sensor fusion" method, which generates the robot's trajectory from an environment model and sensory data, a "command fusion" method is used to govern the robot's motions. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance within a hierarchical behavior-based control architecture. To identify the environment, a command fusion technique is introduced in which the sensory data from the ultrasonic sensors and a vision sensor are fused into the identification process. The experimental results highlight interesting aspects of the goal-seeking, obstacle-avoiding, and decision-making processes that arise from the navigation interaction.
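
The abstract above fuses behaviors at the command level: a goal-approach command and an obstacle-avoidance command are blended rather than the raw sensor data. A minimal sketch of such command fusion with one hand-written fuzzy membership; the membership shape, gains, and bearings are illustrative assumptions, not the paper's tuned rule base:

```python
import math

def goal_command(robot_heading, goal_bearing, gain=1.0):
    """Steering command that turns the robot toward the goal."""
    return gain * math.atan2(math.sin(goal_bearing - robot_heading),
                             math.cos(goal_bearing - robot_heading))

def avoid_command(obstacle_bearing, gain=1.5):
    """Steering command that turns the robot away from the nearest obstacle."""
    return -gain * math.copysign(1.0, obstacle_bearing) * (math.pi / 4)

def danger(obstacle_range, near=0.5, far=2.0):
    """Fuzzy 'obstacle is dangerous' membership: 1 when near, 0 when far."""
    return min(1.0, max(0.0, (far - obstacle_range) / (far - near)))

def fuse_commands(robot_heading, goal_bearing, obstacle_bearing, obstacle_range):
    """Command fusion: blend the two behaviors by the danger membership."""
    mu = danger(obstacle_range)
    return ((1.0 - mu) * goal_command(robot_heading, goal_bearing)
            + mu * avoid_command(obstacle_bearing))

# Open space: the command points at the goal.  Close obstacle: avoidance wins.
print(round(fuse_commands(0.0, 0.8, obstacle_bearing=0.3, obstacle_range=3.0), 2))
print(round(fuse_commands(0.0, 0.8, obstacle_bearing=0.3, obstacle_range=0.6), 2))
```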

A Study on a Multi-sensor Information Fusion Architecture for Avionics

  • 강신우; 이승필; 박준현
    • Journal of Advanced Navigation Technology / Vol. 17, No. 6 / pp. 777-784 / 2013
  • Multi-sensor data fusion, the process of combining data produced by different kinds of sensors into a single piece of information, has been studied and is used on various platforms. Aircraft also carry many kinds of sensors, which the avionics system integrates and manages. As the performance of aircraft sensors improves, the integration of sensor information from the avionics point of view keeps increasing. However, little research addresses information fusion from the perspective of the software that is responsible for fusing the data produced by the sensors into a single body of information and presenting it to the pilot on the display equipment. The purpose of information fusion in an aircraft is to give the pilot a correct picture of the combat situation, supporting the decisions needed to carry out the mission and minimizing the pilot's workload. This paper presents, from the software point of view, an architecture for multi-sensor data fusion in an avionics system operating various sensors, so that the data the sensors produce is delivered to the user as integrated information.
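
The paper above argues for a software architecture in which heterogeneous avionics sensors feed a fusion component that presents a single integrated picture on the pilot's display. A minimal structural sketch of that idea; the class names (`SensorReport`, `FusionManager`, `DisplayAdapter`) and the simple per-track averaging are illustrative assumptions, not the architecture proposed in the paper:

```python
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class SensorReport:
    sensor: str        # e.g. "radar", "eo_ir", "datalink"
    track_id: int      # identifier of the contact this report refers to
    x: float
    y: float

class FusionManager:
    """Collects reports from all sensor adapters and publishes fused tracks."""
    def __init__(self):
        self._reports = defaultdict(list)

    def ingest(self, report: SensorReport):
        self._reports[report.track_id].append(report)

    def fused_picture(self):
        """One fused track per contact: here, a plain average of all reports."""
        return {tid: (mean(r.x for r in reps), mean(r.y for r in reps))
                for tid, reps in self._reports.items()}

class DisplayAdapter:
    """Stands in for the cockpit display that shows the fused picture."""
    def show(self, picture):
        for tid, (x, y) in sorted(picture.items()):
            print(f"track {tid}: ({x:.1f}, {y:.1f})")

fusion = FusionManager()
fusion.ingest(SensorReport("radar", 1, 10.2, 5.1))
fusion.ingest(SensorReport("eo_ir", 1, 10.0, 5.3))
fusion.ingest(SensorReport("datalink", 2, 40.5, -3.2))
DisplayAdapter().show(fusion.fused_picture())
```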