• Title/Summary/Keyword: Sensor Fusion System

Search results: 435

Sensor Fusion for Underwater Navigation of Unmanned Underwater Vehicle (무인잠수정의 수중항법을 위한 센서융합)

  • Sur, Joo-No
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.8 no.4 s.23
    • /
    • pp.14-23
    • /
    • 2005
  • In this paper we propose a sensor fusion method for the navigation algorithm, which can be used to estimate state vectors such as position and velocity for motion control, using multi-sensor output measurements. The output measurement used in estimating the state is a series of known, asynchronous multi-sensor outputs with measurement noise. This paper investigates the Extended Kalman Filtering method to merge asynchronous heading, heading rate, DVL velocity, and SSBL information into a single state vector. Kalman filters of varying complexity, with biases and measurement noise, are investigated using theoretical data from MOERI's SAUV. All levels of complexity of the Kalman filters are shown to track the real trajectories much more closely and smoothly than the basic underwater acoustic navigation system commonly used aboard underwater vehicles.
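The asynchronous fusion idea in the abstract above can be sketched with a plain linear Kalman filter (a simplification of the EKF the paper uses; the sensor values, noise levels, and rates below are invented):

```python
import numpy as np

def predict(x, P, dt, q=0.01):
    """Constant-velocity prediction of [position, velocity] over dt."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, r):
    """Scalar measurement update: z = H x + noise with variance r."""
    H = H.reshape(1, 2)
    S = float(H @ P @ H.T) + r          # innovation covariance
    K = (P @ H.T).flatten() / S         # Kalman gain
    x = x + K * (z - float(H @ x))
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P

# Asynchronous stream: (timestamp, value, observation row, noise variance).
# Position fixes stand in for SSBL, velocity readings for DVL.
POS, VEL = np.array([1.0, 0.0]), np.array([0.0, 1.0])
stream = [(0.10, 0.21, POS, 0.04),
          (0.15, 2.00, VEL, 0.01),
          (0.30, 0.62, POS, 0.04)]

x, P, t = np.zeros(2), np.eye(2), 0.0
for tz, z, H, r in stream:
    x, P = predict(x, P, tz - t)   # advance the state to this sensor's time
    x, P = update(x, P, z, H, r)   # apply only this sensor's observation row
    t = tz
```

Predicting the state forward to each measurement's own timestamp, then applying only that sensor's observation row, is what lets sensors reporting at different rates share one state vector.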

The Improvement of Target Motion Analysis(TMA) for Submarine with Data Fusion (정보융합 기법을 활용한 잠수함 표적기동분석 성능향상 연구)

  • Lim, Young-Taek;Ko, Soon-Ju;Song, Taek-Lyul
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.12 no.6
    • /
    • pp.697-703
    • /
    • 2009
  • Target Motion Analysis (TMA) refers to estimating target position, velocity, and course using a passive sonar system with bearing-only measurements. In this paper, we apply TMA algorithms to a submarine using Multi-Sensor Data Fusion (MSDF) and determine the best TMA algorithm for a submarine through a series of computer simulation runs.
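As an illustration of the bearing-only geometry behind TMA (not the paper's MSDF algorithm), a stationary target and noise-free bearings reduce the problem to a small linear least-squares fit:

```python
import numpy as np

def locate(own_xy, bearings):
    """Least-squares target fix from bearings taken at known positions.

    Bearings are in radians from the +x axis; each one constrains the
    target to the line sin(b)*xt - cos(b)*yt = sin(b)*xo - cos(b)*yo.
    """
    A = np.column_stack([np.sin(bearings), -np.cos(bearings)])
    b = np.sum(A * own_xy, axis=1)
    fix, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fix

# Ownship maneuvers between bearing cuts (positions and target invented).
own = np.array([[0.0, 0.0], [10.0, -20.0], [30.0, 10.0]])
target = np.array([100.0, 50.0])
bearings = np.arctan2(target[1] - own[:, 1], target[0] - own[:, 0])
fix = locate(own, bearings)
```

The ownship maneuver matters: if all bearing lines were parallel, the system would be rank-deficient and no fix would exist, which is why real TMA requires the observer to change course.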

Bio-inspired neuro-symbolic approach to diagnostics of structures

  • Shoureshi, Rahmat A.;Schantz, Tracy;Lim, Sun W.
    • Smart Structures and Systems
    • /
    • v.7 no.3
    • /
    • pp.229-240
    • /
    • 2011
  • Recent developments in smart structures with very large-scale embedded sensors and actuators have introduced new challenges in terms of data processing and sensor fusion. These smart structures are dynamically classified as large-scale systems with thousands of sensors and actuators that form the musculoskeletal system of the structure, analogous to the human body. In order to develop structural health monitoring and diagnostics with data provided by thousands of sensors, new sensor informatics has to be developed. The focus of our on-going research is to develop techniques and algorithms that would utilize this musculoskeletal system effectively, thus creating the intelligence for such a large-scale autonomous structure. To achieve this level of intelligence, three major research tasks are being conducted: development of bio-inspired data analysis and information extraction from thousands of sensors; development of an analytical technique for an optimal sensory system using structural observability; and creation of a bio-inspired decision-making and control system. This paper focuses on the results of our effort on the first task, namely the development of a neuro-morphic engineering approach, using neuro-symbolic data manipulation inspired by the understanding of human information-processing architecture, for sensor fusion and structural diagnostics.

Development of a system architecture for an advanced autonomous underwater vehicle, ORCA

  • Choi, Hyun-Taek;Lee, Pan-Mook
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 2004.08a
    • /
    • pp.1791-1796
    • /
    • 2004
  • Recently, great improvements have been made in developing autonomous underwater vehicles (AUVs) using state-of-the-art technologies for various kinds of sophisticated underwater missions. To meet the increasing demands posed on AUVs, a powerful on-board computer system and an accurate sensor system with a well-organized control system architecture are needed. In this paper, a new control system architecture is proposed for the AUV ORCA (Oceanic Reinforced Cruising Agent), which is currently being developed by the Korea Research Institute of Ships and Ocean Engineering (KRISO). The proposed architecture is a hybrid that combines a hierarchical architecture and a behavior-based control architecture, with an evaluator coordinating between them. This paper also proposes a sensor fusion structure based on the definition of four sensor categories, called grouping, and a five-step data processing procedure. The development of ORCA, including the system architecture, vehicle layout, and on-board hardware configuration, is described.
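One way to picture the hybrid architecture described above is an evaluator arbitrating between a deliberative command and a reactive behavior. Everything below is a hypothetical sketch of that pattern, not ORCA's actual software:

```python
# Deliberative layer: head toward the mission waypoint (names invented).
def deliberative(goal, pos):
    return goal - pos

# Reactive behavior: back off when an obstacle is close.
def avoid(obstacle_range):
    return -1.0 if obstacle_range < 2.0 else 0.0

# Evaluator: safety wins, so the behavior takes over near obstacles;
# otherwise the hierarchical plan drives the vehicle.
def evaluator(plan_cmd, reactive_cmd, obstacle_range):
    return reactive_cmd if obstacle_range < 2.0 else plan_cmd

cmd = evaluator(deliberative(10.0, 4.0), avoid(1.5), 1.5)
```

The point of the hybrid scheme is exactly this arbitration step: the deliberative layer supplies goal-directed commands, while the behavior layer guarantees a fast reflex path around it.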


Development of In-process Condition Monitoring System on Turning Process using Artificial Neural Network (신경회로망 모델을 이용한 선삭 공정의 실시간 이상진단 시스템의 개발)

    • Journal of the Korean Society of Manufacturing Technology Engineers
    • /
    • v.7 no.3
    • /
    • pp.14-21
    • /
    • 1998
  • The in-process detection of the state of the cutting tool is one of the most important technical problems in an Intelligent Machining System. This paper presents a method of detecting the state of the cutting tool in the turning process using an Artificial Neural Network. In order to sense the state of the cutting tool, sensor fusion of an acoustic emission sensor and a force sensor is applied. It is shown that AErms and the three directional dynamic mean cutting forces are sensitive to tool wear. Therefore, six pattern features, that is, four sensory signal features and two cutting conditions, are selected for the monitoring system with the Artificial Neural Network. The proposed monitoring system shows a good recognition rate for different cutting conditions.
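A minimal stand-in for the monitoring idea is a single logistic unit trained on two synthetic features; the paper uses a larger network and measured AE/force signals, and every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the paper's features: AErms and a mean cutting
# force, both of which the abstract reports rise with tool wear.
fresh = rng.normal([0.2, 100.0], [0.05, 10.0], size=(200, 2))
worn  = rng.normal([0.6, 160.0], [0.05, 10.0], size=(200, 2))
X = np.vstack([fresh, worn])
y = np.r_[np.zeros(200), np.ones(200)]       # 0 = fresh tool, 1 = worn

# Normalize, then train a single logistic unit (the smallest "neural
# network" that shows the idea) by batch gradient descent.
X = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted wear probability
    g = p - y                                # cross-entropy gradient
    w -= 0.1 * X.T @ g / len(y)
    b -= 0.1 * g.mean()

acc = ((1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
```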


Performance enhancement of launch vehicle tracking using GPS-based multiple radar bias estimation and sensor fusion (GPS 기반 추적레이더 실시간 바이어스 추정 및 비동기 정보융합을 통한 발사체 추적 성능 개선)

  • Song, Ha-Ryong
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.20 no.6
    • /
    • pp.47-56
    • /
    • 2015
  • In a multi-sensor system, sensor registration errors such as sensor bias must be corrected so that the individual sensor data are expressed in a common reference frame. If the registration process is not properly executed, large tracking errors or the formation of multiple tracks on the same target can occur. Especially for a launch vehicle tracking system, every observation must lie in the same reference frame so that the fused trajectory can serve as the best track for slaving data. Hence, this paper describes on-line bias estimation/correction and asynchronous sensor fusion for launch vehicle tracking. The bias estimation architecture is designed around a pseudo bias measurement derived from the error observed between GPS and radar measurements. Asynchronous sensor fusion is then adopted to enhance tracking performance.
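The pseudo-bias-measurement idea can be sketched with a one-state filter: differencing the radar against the GPS reference exposes the bias, which a scalar Kalman filter then tracks online. All noise levels and the bias value below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
true_bias = 3.5                                  # metres, constant radar bias
truth = np.linspace(0, 1000, 200)                # stand-in trajectory coordinate
gps   = truth + rng.normal(0, 1.0, 200)          # GPS reference, low noise
radar = truth + true_bias + rng.normal(0, 5.0, 200)

# Pseudo bias measurement: radar minus GPS observes bias + combined noise.
# A one-state Kalman filter (random-walk bias model) refines it online.
b_hat, P = 0.0, 100.0
R = 1.0**2 + 5.0**2            # variance of the pseudo measurement
for z in radar - gps:
    P += 1e-4                  # small process noise keeps the filter adaptive
    K = P / (P + R)
    b_hat += K * (z - b_hat)
    P *= (1.0 - K)
```

Once `b_hat` has converged, subtracting it from the radar stream puts all sensors in the common frame the abstract calls for, before any track fusion happens.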

Precision Analysis of NARX-based Vehicle Positioning Algorithm in GNSS Disconnected Area

  • Lee, Yong;Kwon, Jay Hyoun
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.39 no.5
    • /
    • pp.289-295
    • /
    • 2021
  • Recently, owing to the development of autonomous vehicles, research on precisely determining the position of a moving object has been actively conducted. Previous research mainly used the fusion of GNSS/IMU (Global Navigation Satellite System / Inertial Measurement Unit) and sensors attached to the vehicle through a Kalman filter. In recent years, however, new technologies have been used to determine the location of a moving object, owing to improvements in computing power and the advent of deep learning. Various techniques exist for such learning-based positioning, using RNNs (Recurrent Neural Networks), LSTM (Long Short-Term Memory), and NARX (Nonlinear Auto-Regressive eXogenous model). The purpose of this study is to compare the precision of the existing filter-based sensor fusion technology and the NARX-based method during GNSS signal blockages, using simulation data. When the filter-based sensor integration technology was used, an average horizontal position error of 112.8 m occurred during 60 seconds of GNSS signal outage. The same experiment was performed 100 times using the NARX model; an improvement in precision was confirmed in approximately 20% of the experimental results, with a horizontal position accuracy of 22.65 m, better than that of the filter-based fusion technique.
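A toy version of the NARX setup, kept linear so ordinary least squares can fit it (the coefficients and the IMU-like exogenous input are invented; the paper's model is a trained neural network):

```python
import numpy as np

rng = np.random.default_rng(2)
# NARX structure: the next position depends on the two previous positions
# and an exogenous input u, standing in for an IMU-derived signal.
u = rng.normal(size=300)
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 1.5 * y[t-1] - 0.6 * y[t-2] + 0.5 * u[t]

# Fit the NARX regression on the first 250 samples (GNSS still available).
T = np.arange(2, 250)
A = np.column_stack([y[T-1], y[T-2], u[T]])
coef, *_ = np.linalg.lstsq(A, y[T], rcond=None)

# "GNSS outage": from t = 250 on, roll the model forward on its own
# predictions, driven only by the exogenous input.
pred = y.copy()
for t in range(250, 300):
    pred[t] = coef @ [pred[t-1], pred[t-2], u[t]]
```

The recursive rollout during the outage is the key property of the NARX approach: once trained, it needs only the exogenous sensor stream, not the (blocked) position fixes.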

Road Surface Marking Detection for Sensor Fusion-based Positioning System (센서 융합 기반 정밀 측위를 위한 노면 표시 검출)

  • Kim, Dongsuk;Jung, Hogi
    • Transactions of the Korean Society of Automotive Engineers
    • /
    • v.22 no.7
    • /
    • pp.107-116
    • /
    • 2014
  • This paper presents camera-based road surface marking detection methods suited to a sensor fusion-based positioning system that consists of a low-cost GPS (Global Positioning System), INS (Inertial Navigation System), EDM (Extended Digital Map), and vision system. The proposed vision system consists of two parts: lane marking detection and RSM (Road Surface Marking) detection. The lane marking detection provides ROIs (Regions of Interest) that are highly likely to contain RSM. The RSM detection generates candidates in those regions and classifies their types. The proposed system focuses on detecting RSM without false detections and on real-time operation. In order to ensure real-time operation, the system varies the gating for lane marking detection and switches detection methods according to an FSM (Finite State Machine) representing the driving situation. A single template-matching scheme is used to extract features for both lane marking detection and RSM detection, implemented efficiently with a horizontal integral image. Further, multiple-step verification is performed to minimize false detections.
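The horizontal integral image mentioned above reduces any row-segment sum to a single subtraction, which is what makes repeated template matching along a row cheap. A minimal sketch:

```python
import numpy as np

img = np.arange(20, dtype=float).reshape(4, 5)   # toy grayscale image

# Horizontal integral image: one cumulative sum per row, with a zero
# column prepended so the sum over img[r, x0:x1] needs no special case
# at x0 == 0.
ii = np.zeros((img.shape[0], img.shape[1] + 1))
ii[:, 1:] = np.cumsum(img, axis=1)

def row_segment_sum(r, x0, x1):
    """Sum of img[r, x0:x1] in O(1), independent of segment length."""
    return ii[r, x1] - ii[r, x0]
```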

A Study on Mobile Robot Navigation Using a New Sensor Fusion

  • Tack, Han-Ho;Jin, Tae-Seok;Lee, Sang-Bae
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2003.09a
    • /
    • pp.471-475
    • /
    • 2003
  • This paper proposes a sensor fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets, enabling accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of the data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples and the effectiveness is proved through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
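A simplified reading of the space-and-time fusion idea: past readings, transformed by the known motion since they were taken, become extra estimates of the current quantity, so one sensor behaves like many. The scenario and numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
# A robot approaches a wall at a known speed; each past range reading,
# shifted by the distance travelled since it was taken, is another
# estimate of the *current* range, so averaging N readings cuts the
# variance without adding sensors.
wall, speed, dt = 10.0, 0.5, 0.1
times = np.arange(50) * dt
truth = wall - speed * times                   # true range at each step
meas = truth + rng.normal(0, 0.3, 50)          # one noisy range sensor

# Transform every past reading to "now" (t = times[-1]) and fuse by mean.
travelled = speed * (times[-1] - times)
fused = np.mean(meas - travelled)
```

Averaging the 50 motion-compensated readings shrinks the standard deviation by a factor of about sqrt(50) relative to a single reading, which is the measurement improvement the abstract claims without extra hardware.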


Motion and Structure Estimation Using Fusion of Inertial and Vision Data for Helmet Tracker

  • Heo, Se-Jong;Shin, Ok-Shik;Park, Chan-Gook
    • International Journal of Aeronautical and Space Sciences
    • /
    • v.11 no.1
    • /
    • pp.31-40
    • /
    • 2010
  • For weapon cueing and a Head-Mounted Display (HMD), it is essential to continuously estimate the motion of the helmet. The problem of estimating and predicting the position and orientation of the helmet is approached by fusing measurements from inertial sensors and a stereo vision system. The sensor fusion approach in this paper is based on nonlinear filtering, specifically the extended Kalman filter (EKF). To reduce the computation time and improve performance in vision processing, we separate structure estimation from motion estimation. The structure estimation tracks features that are part of the helmet model structure in the scene, and the motion estimation filter estimates the position and orientation of the helmet. The algorithm is tested using synthetic and real data, and the results show that the sensor fusion is successful.