• Title/Summary/Keyword: Multi-sensor Data Fusion System


Design of a Situation-Awareness Processing System with Autonomic Management based on Multi-Sensor Data Fusion (다중센서 데이터 융합 기반의 자율 관리 능력을 갖는 상황인식처리 시스템의 설계)

  • Young-Gyun Kim;Chang-Won Hyun;Jang Hun Oh;Hyo-Chul Ahn;Young-Soo Kim
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2008.11a
    • /
    • pp.913-916
    • /
    • 2008
  • We study a situation-awareness system with autonomic-management capability based on multi-sensor data fusion. In an environment where sensors of diverse types are connected in a large-scale network, the system fuses the data streamed in real time from the sensors to perform situation-awareness processing, and provides an autonomic-management function that automatically detects and heals faults in the software components installed on each node. The proposed system is applicable to a variety of situation-awareness applications, including ubiquitous computing, surveillance and reconnaissance in defense weapon systems, intelligent autonomous robots, and intelligent vehicles.
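The detect-and-heal loop the abstract describes can be sketched as below; `Component`, `autonomic_sweep`, and the node names are hypothetical stand-ins for illustration, not the paper's implementation.

```python
class Component:
    """Hypothetical node-level software component with a health flag,
    standing in for the fusion components the autonomic manager supervises."""
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def restart(self):
        # "Healing" here is simply a restart that restores the health flag.
        self.healthy = True

def autonomic_sweep(components):
    """One self-management cycle: detect failed components, heal (restart)
    them, and return the names of the components that were repaired."""
    repaired = []
    for c in components:
        if not c.healthy:
            c.restart()
            repaired.append(c.name)
    return repaired

nodes = [Component("tracker"), Component("fuser")]
nodes[1].healthy = False                   # simulate a component fault
repaired = autonomic_sweep(nodes)          # detects and heals "fuser"
```

A real system would drive this loop from heartbeat messages over the sensor network rather than an in-process flag.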

Simultaneous Localization and Mobile Robot Navigation using a Sensor Network

  • Jin Tae-Seok;Hashimoto Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.2
    • /
    • pp.161-166
    • /
    • 2006
  • Localization of a mobile agent within a sensor network is a fundamental requirement for many applications using networked navigation systems such as sonar or visual sensing. To fully utilize the strengths of both the sonar and visual sensing systems, this paper describes a networked sensor-based navigation method for an autonomous mobile robot that can navigate and avoid obstacles in an indoor environment. In this method, self-localization of the robot is performed with a model-based vision system using networked sensors, and nonstop navigation is realized by a Kalman filter-based STSF (Space and Time Sensor Fusion) method. Stationary and moving obstacles are avoided using networked sensor data such as a CCD camera and a sonar ring. Since localization carries inevitable uncertainty in the features and in the robot position estimate, a Kalman filter scheme is used for estimating the mobile robot's position. We report on experiments in a hallway using the Pioneer-DX robot; extensive experiments with a robot and a sensor network confirm the validity of the approach.
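The fusion step at the heart of such Kalman filter-based localization can be sketched as a standard measurement update applied once per sensor; this is a minimal illustration of the principle, not the authors' STSF implementation, and the noise values are assumed.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: fuse state estimate x
    (covariance P) with measurement z (noise covariance R)."""
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Robot state: planar position [x, y]; two sensors observe it directly.
x = np.array([0.0, 0.0])
P = np.eye(2) * 1.0                     # initial uncertainty
H = np.eye(2)
x, P = kalman_update(x, P, np.array([1.0, 0.5]), H, np.eye(2) * 0.2)  # sonar fix
x, P = kalman_update(x, P, np.array([1.1, 0.4]), H, np.eye(2) * 0.1)  # vision fix
```

Each update shrinks the covariance, and the more accurate vision fix pulls the estimate more strongly, which is the basic mechanism by which multi-sensor fusion outperforms either sensor alone.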

Performance enhancement of launch vehicle tracking using GPS-based multiple radar bias estimation and sensor fusion (GPS 기반 추적레이더 실시간 바이어스 추정 및 비동기 정보융합을 통한 발사체 추적 성능 개선)

  • Song, Ha-Ryong
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.20 no.6
    • /
    • pp.47-56
    • /
    • 2015
  • In a multi-sensor system, sensor registration errors such as sensor bias must be corrected so that the individual sensor data are expressed in a common reference frame. If the registration process is not properly executed, large tracking errors or the formation of multiple tracks on the same target can occur. In a launch vehicle tracking system in particular, each observation lies on the same reference frame, so the fused trajectory can serve as the best track for slaving data. Hence, this paper describes on-line bias estimation/correction and asynchronous sensor fusion for launch vehicle tracking. The bias estimation architecture is designed around a pseudo bias measurement derived from the error observed between GPS and radar measurements. Asynchronous sensor fusion is then applied to enhance tracking performance.
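The pseudo bias measurement idea can be sketched as follows: subtracting the GPS reference from each radar measurement yields a noisy observation of the radar bias, which a recursive estimator smooths out. This is an assumed minimal version using exponential smoothing, not the paper's filter design, and all signal values are synthetic.

```python
import numpy as np

def estimate_bias(radar_meas, gps_pos, alpha=0.1):
    """Recursively estimate a constant radar bias from pseudo bias
    measurements (radar minus GPS reference) via exponential smoothing."""
    bias = np.zeros_like(gps_pos[0])
    for r, g in zip(radar_meas, gps_pos):
        pseudo = r - g                       # pseudo bias measurement
        bias = (1 - alpha) * bias + alpha * pseudo
    return bias

rng = np.random.default_rng(0)
true_bias = np.array([5.0, -3.0])            # unknown radar offset [m]
truth = [np.array([t, 2.0 * t]) for t in range(200)]          # true trajectory
gps = [p + rng.normal(0, 0.1, 2) for p in truth]              # GPS reference
radar = [p + true_bias + rng.normal(0, 0.5, 2) for p in truth]  # biased radar
bias = estimate_bias(radar, gps)             # converges toward [5, -3]
```

Once the bias estimate converges, subtracting it from incoming radar measurements registers the sensors in the common frame before fusion.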

Flight trajectory generation through post-processing of launch vehicle tracking data (발사체 추적자료 후처리를 통한 비행궤적 생성)

  • Yun, Sek-Young;Lyou, Joon
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.19 no.6
    • /
    • pp.53-61
    • /
    • 2014
  • To monitor the flight trajectory and status of a launch vehicle, the mission control system at the NARO Space Center processes data acquired from the ground tracking system, which consists of two tracking radars, four telemetry stations, and one electro-optical tracking system. Each tracking unit exhibits its own tracking error, caused mainly by multipath, clutter, and radio refraction, and the actual vehicle trajectory cannot be determined from the information transmitted by any single unit. This paper presents a way of generating the flight trajectory via post-processing of the data received from the ground tracking system. The post-processing algorithm is divided into two parts: compensation for atmospheric radio refraction, and multi-sensor fusion, for which a decentralized Kalman filter based on a constant acceleration model was adopted and implemented. Applying the scheme to real data produced a flight trajectory whose tracking errors were smaller than those of any single sensor.
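Two ingredients named in the abstract can be sketched concretely: the constant-acceleration state transition, and a decentralized (track-to-track) fusion step that combines local estimates by inverse-covariance weighting. This is a generic sketch under an assumed independence of the local tracks, not the implemented filter; the numbers are illustrative.

```python
import numpy as np

def ca_transition(dt):
    """State transition matrix of a 1-axis constant-acceleration model
    (state = [position, velocity, acceleration])."""
    return np.array([[1.0, dt, 0.5 * dt**2],
                     [0.0, 1.0, dt],
                     [0.0, 0.0, 1.0]])

def fuse(x1, P1, x2, P2):
    """Track-to-track fusion by information (inverse-covariance) weighting,
    the core combination rule of a decentralized Kalman fusion step."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    Pf = np.linalg.inv(I1 + I2)
    xf = Pf @ (I1 @ x1 + I2 @ x2)
    return xf, Pf

# Propagate one axis of the vehicle state forward by 0.1 s.
F = ca_transition(0.1)
x = np.array([100.0, 50.0, 9.8])     # pos [m], vel [m/s], acc [m/s^2]
x_next = F @ x                       # pos advances by v*dt + a*dt^2/2

# Fuse two local tracks of the same axis from two tracking units.
x1, P1 = np.array([105.0, 51.0, 9.8]), np.eye(3) * 0.4
x2, P2 = np.array([105.1, 50.9, 9.8]), np.eye(3) * 0.2
xf, Pf = fuse(x1, P1, x2, P2)        # fused covariance below either input
```

The fused estimate sits between the local tracks, weighted toward the more certain one, which is why the fused trajectory can beat every individual sensor.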

Multi-sensor Fusion Based Guidance and Navigation System Design of Autonomous Mine Disposal System Using Finite State Machine (유한 상태 기계를 이용한 자율무인기뢰처리기의 다중센서융합기반 수중유도항법시스템 설계)

  • Kim, Ki-Hun;Choi, Hyun-Taek;Lee, Chong-Moo
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.47 no.6
    • /
    • pp.33-42
    • /
    • 2010
  • This research proposes a practical guidance system that accounts for ocean currents in real sea operation; optimality of the generated path is not an issue in this paper. Way-points from the start point to possible goal positions are selected by experienced human supervisors considering the major ocean current axis. The paper also describes in detail the implementation of a precise underwater navigation solution using multi-sensor fusion based on USBL, GPS, DVL, and AHRS measurements. To obtain a precise, accurate, and frequent underwater navigation solution, three strategies are chosen. The first is heading alignment angle identification, which enhances the performance of the standalone dead-reckoning algorithm. The second is timely fusion of absolute position to prevent accumulation of integration error, where the absolute position source can be selected between USBL and GPS according to sensor status. The third is the introduction of an effective outlier rejection algorithm. The performance of the developed algorithm is verified with experimental data from a mine disposal vehicle and a deep-sea ROV.
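A common form of outlier rejection for absolute fixes like USBL, which are prone to acoustic multipath, is a chi-square innovation gate; the sketch below assumes this standard technique and illustrative numbers, since the abstract does not specify the paper's exact rule.

```python
import numpy as np

def accept_fix(pred, fix, cov, gate=9.21):
    """Chi-square innovation gate: accept an absolute position fix only if
    its Mahalanobis distance from the dead-reckoned prediction is within
    the gate (9.21 is roughly the 99% threshold for 2 degrees of freedom)."""
    d = fix - pred
    m2 = d @ np.linalg.inv(cov) @ d        # squared Mahalanobis distance
    return bool(m2 < gate)

pred = np.array([10.0, 20.0])              # dead-reckoned position [m]
cov = np.eye(2) * 1.0                      # combined prediction/fix covariance
good = accept_fix(pred, np.array([10.5, 20.5]), cov)   # plausible USBL fix
bad = accept_fix(pred, np.array([25.0, 20.0]), cov)    # multipath outlier
```

Rejected fixes are simply skipped, letting dead reckoning carry the solution until the next consistent absolute fix arrives.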

Development of A Multi-sensor Fusion-based Traffic Information Acquisition System with Robust to Environmental Changes using Mono Camera, Radar and Infrared Range Finder (환경변화에 강인한 단안카메라 레이더 적외선거리계 센서 융합 기반 교통정보 수집 시스템 개발)

  • Byun, Ki-hoon;Kim, Se-jin;Kwon, Jang-woo
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.16 no.2
    • /
    • pp.36-54
    • /
    • 2017
  • The purpose of this paper is to develop a multi-sensor fusion-based traffic information acquisition system that is robust to environmental changes. The system combines the characteristics of each sensor and is more robust to environmental changes than a video detector alone. Moreover, it is unaffected by the time of day or night, and has lower maintenance cost than an inductive-loop traffic detector. This is accomplished by synthesizing object tracking information from a radar, vehicle classification information from a video detector, and reliable object detections from an infrared range finder. To prove the effectiveness of the proposed system, experiments were conducted for 6 hours over 5 days during daytime and early evening on a pedestrian-accessible road. According to the experimental results, the system achieves 88.7% classification accuracy and a 95.5% vehicle detection rate. If the parameters of the system are optimized to adapt to changes in the experimental environment, it is expected to contribute to the advancement of ITS.

Map-Building and Position Estimation based on Multi-Sensor Fusion for Mobile Robot Navigation in an Unknown Environment (이동로봇의 자율주행을 위한 다중센서융합기반의 지도작성 및 위치추정)

  • Jin, Tae-Seok;Lee, Min-Jung;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.5
    • /
    • pp.434-443
    • /
    • 2007
  • Presently, the exploration of an unknown environment is an important task for the new generation of mobile service robots, which are navigated by a number of methods using systems such as sonar or visual sensing. To fully utilize the strengths of both the sonar and visual sensing systems, this paper presents a technique for localization of a mobile robot using fused data from multiple ultrasonic sensors and a vision system. The mobile robot is designed to operate in a well-structured environment that can be represented by planes, edges, corners, and cylinders in terms of structural features. For ultrasonic sensors, these features carry range information in the form of an arc of a circle, generally named an RCD (Region of Constant Depth). Localization is the continual provision of knowledge of position, deduced from the robot's a priori position estimate. The environment of the robot is modeled as a two-dimensional grid map. We define a vision-based environment recognition and a physically-based sonar sensor model, and employ an extended Kalman filter to estimate the position of the robot. The performance and simplicity of the approach are demonstrated with results from sets of experiments using a mobile robot.

MULTI-SENSOR INTEGRATION SYSTEM FOR FOREST FIRE PREVENTION

  • Kim Eun Hee;Chi Jeong Hee;Shon Ho Sun;Jung Doo Young;Lee Chung Ho;Ryu Keun Ho
    • Proceedings of the KSRS Conference
    • /
    • 2005.10a
    • /
    • pp.450-453
    • /
    • 2005
  • Forest fires occur mainly due to natural factors such as wind and temperature, or due to human factors. Recently, most forest fire prevention has relied on prediction and early warning using remote sensing technology. However, for forest fire prevention, remote sensing has many limitations, such as high cost and the need for advanced technologies. A multi-sensor integration system is therefore needed that utilizes not only remote sensing but also in-situ sensing, in order to reduce forest fire damage through analysis of fire causes and prediction of the routes of fires that have broken out. In this paper we propose a multi-sensor integration system that offers predictions of forest fire factors and routes by integrating data collected from remote and in-situ sensors. The proposed system is based on a wireless sensor network for collecting observed data from various sensors. It offers high-quality information because it first fuses the differently formatted data collected from remote and in-situ sensors at the raw-data level, and then performs information-level fusion on the result of the first stage. The information offered by the system can support early prevention and early warning of forest fires, delivered via an SMS service or an alert service on the administrator's monitoring interface.


Multi-sensor Fusion Filter for the Flight Safety System of a Space Launch Vehicle (우주발사체 비행안전시스템을 위한 다중센서 융합필터 구현)

  • Ryu, Seong-Sook;Kim, Jeong-Rae;Song, Yong-Kyu;Ko, Jeong-Hwan;Choi, Kyu-Sung
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.37 no.2
    • /
    • pp.156-165
    • /
    • 2009
  • The threat due to a malfunction of a space launch vehicle is significant, since it is larger and flies a longer range than military missiles or scientific rockets. It is necessary to implement a flight safety system to minimize the possible hazard. The design objective of a tracking filter for a flight safety system differs from that of conventional tracking filters, since estimation reliability is emphasized over estimation accuracy. In this paper, a fusion tracking filter is implemented for processing multi-sensor data from a space launch vehicle. Filter performance is evaluated by analyzing the errors of the estimated position and the instantaneous impact point. A fault detection algorithm is also implemented to guarantee the fusion filter's reliability under any sensor failure, and is verified to maintain stability successfully.
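The instantaneous impact point (IIP) mentioned in the abstract is where the vehicle would land if thrust terminated now. As a rough illustration only, a flat-earth, drag-free IIP follows from the ballistic fall time; operational IIP models add drag and a rotating round Earth, so this sketch is a simplification, not the paper's model.

```python
import math

def impact_point(x, y, vx, vy, g=9.81):
    """Flat-earth, drag-free instantaneous impact point: solve
    y + vy*t - g*t^2/2 = 0 for the positive root t, then propagate
    the downrange position x by vx*t."""
    t = (vy + math.sqrt(vy**2 + 2.0 * g * y)) / g   # time to reach y = 0
    return x + vx * t

# Vehicle at 10 km altitude, 100 km downrange, moving 1 km/s horizontally
# and 500 m/s upward: the IIP lands roughly 120 km further downrange.
iip = impact_point(100e3, 10e3, 1000.0, 500.0)
```

The flight safety system compares this predicted impact point against protected zones, which is why IIP error, not just position error, is the figure of merit.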

Image Fusion of High Resolution SAR and Optical Image Using High Frequency Information (고해상도 SAR와 광학영상의 고주파 정보를 이용한 다중센서 융합)

  • Byun, Young-Gi;Chae, Tae-Byeong
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.30 no.1
    • /
    • pp.75-86
    • /
    • 2012
  • A Synthetic Aperture Radar (SAR) imaging system is independent of solar illumination and weather conditions; however, SAR images are difficult to interpret compared with optical images. Interest has therefore grown in multi-sensor fusion techniques that can improve the interpretability of SAR images by fusing in the spectral information of a multispectral (MS) image. In this paper, a multi-sensor fusion method is proposed based on a high-frequency extraction process using the Fast Fourier Transform (FFT) and an outlier elimination process, which maintains the spectral content of the original MS image while retaining the spatial detail of the high-resolution SAR image. We used a TerraSAR-X image, acquired with the same X-band SAR system type as KOMPSAT-5, and a KOMPSAT-2 MS image as the test data set. To evaluate the efficiency of the proposed method, the fusion result was compared visually and quantitatively with results obtained using existing fusion algorithms. The evaluation showed that the proposed method achieved successful fusion of SAR and MS imagery compared with the existing algorithms.
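The high-frequency extraction step can be sketched with an ideal high-pass filter in the FFT domain; this minimal version uses random arrays as stand-ins for the SAR and MS imagery, omits the paper's outlier elimination stage, and the cutoff choice is assumed.

```python
import numpy as np

def highpass_fft(img, cutoff=0.1):
    """Extract the high-frequency component of an image with an ideal
    high-pass filter in the FFT domain (cutoff as a fraction of the
    half-spectrum radius)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)      # radius from spectrum center
    F[r < cutoff * min(h, w) / 2] = 0         # zero out low frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

rng = np.random.default_rng(1)
sar = rng.normal(0, 1, (64, 64))              # stand-in for SAR detail image
ms_band = rng.normal(100, 5, (64, 64))        # stand-in for one MS band
fused = ms_band + highpass_fft(sar)           # inject SAR spatial detail
```

Because the low frequencies, including the DC term, are removed from the SAR component, the injection perturbs spatial detail while leaving the MS band's overall radiometry (its spectral content) essentially intact.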