• Title/Abstract/Keyword: Sensor fusion


항공영상과 라이다 자료를 이용한 이종센서 자료간의 alignment에 관한 연구 (A Study on the Alignment of Different Sensor Data with Aerial Images and LiDAR Data)

  • 곽태석;이재빈;조현기;김용일
    • 한국측량학회:학술대회논문집
    • /
    • 한국측량학회 2004년도 추계학술발표회 논문집
    • /
    • pp.257-262
    • /
    • 2004
  • The purpose of data fusion is to collect maximized information by combining data obtained from two or more sensor systems of the same or different kinds. Data fusion of same-kind sensor systems such as optical imagery has long been the focus, but recently LIDAR has emerged as a new technology for rapidly capturing data on physical surfaces, with highly accurate results derived from the LIDAR data. Considering the nature of aerial imagery and LIDAR data, it is clear that the two systems provide complementary information. Data fusion consists of two steps, alignment and matching; the complementary information can only be fully utilized after successful alignment of the aerial imagery and LIDAR data. In this research, we use the centroids of buildings extracted from LIDAR data as control information for estimating the exterior orientation parameters of aerial imagery relative to the LIDAR reference frame.


유연링크시스템 기반에서 WLAN 방식을 적용한 퓨전 주유시스템의 설계와 구현 (Design and Implementation of a Fusion Oil Lubricator System Using WLAN Based on a Flexible Link System)

  • 김휘영
    • 대한전자공학회:학술대회논문집
    • /
    • 대한전자공학회 2002년도 하계종합학술대회 논문집(1)
    • /
    • pp.283-286
    • /
    • 2002
  • For satisfying performance of an oil lubricator, it is important to design an oil controller that meets the required specifications, together with supporting hardware that keeps it functioning. Among the hardware of a control system, the oil system is the most vulnerable to malfunction, so accurate and reliable oil readings must be tracked for good fusion oil lubricator performance. In the case of oil lubricator data loss or SSR trigger error faults, the faults are detected by examining the data system output values and the major values of the system, and are then recognized by analyzing the fault symptoms. If necessary, electronic-sensor values are synthesized according to the type of fault and used by the controller instead of the raw data. In this paper, a fast 32-bit CPU microprocessor is applied to the control of a flexible link system with sensor fault problems in the error module for exact positioning, to show the applicability. It is shown that the fusion oil lubricator can provide satisfactory loop performance even when sensor faults occur.


Application of Random Forests to Assessment of Importance of Variables in Multi-sensor Data Fusion for Land-cover Classification

  • Park, No-Wook;Chi, Kwang-Hoon
    • 대한원격탐사학회지
    • /
    • Vol. 22, No. 3
    • /
    • pp.211-219
    • /
    • 2006
  • A random forests classifier is applied to multi-sensor data fusion for supervised land-cover classification in order to account for the importance of variables. The random forests approach is a non-parametric ensemble classifier based on CART-like trees. Its distinguishing feature is that the importance of a variable can be estimated by randomly permuting the variable of interest in all the out-of-bag samples for each tree. Two different multi-sensor data sets for supervised classification were used to illustrate the applicability of random forests: one with optical and polarimetric SAR data, and the other with multi-temporal Radarsat-1 and ENVISAT ASAR data sets. From the experimental results, the random forests approach could extract the variables or bands important for land-cover discrimination and showed reasonably good performance in terms of classification accuracy.
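  The permutation-importance idea described in this abstract can be sketched as follows; this is a minimal illustration with made-up band names and synthetic data, not the study's actual data sets:

```python
# Hypothetical sketch: permutation-based variable importance with a random
# forest. Band names and data are fabricated for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Fake multi-sensor stack: two "optical" bands and one "SAR" band.
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.2 * X[:, 2] > 0).astype(int)  # band 0 dominates the class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Permute each variable in turn and measure the drop in accuracy.
imp = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
for name, score in zip(["optical_1", "optical_2", "sar_1"], imp.importances_mean):
    print(f"{name}: {score:.3f}")
```

  The most informative band should show the largest accuracy drop when permuted, mirroring how the paper ranks bands for land-cover discrimination.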

확장 칼만 필터를 이용한 로봇의 실내위치측정 (Indoor Localization for Mobile Robot using Extended Kalman Filter)

  • 김정민;김연태;김성신
    • 한국지능시스템학회논문지
    • /
    • Vol. 18, No. 5
    • /
    • pp.706-711
    • /
    • 2008
  • This paper presents a precise localization system with an error below 100 mm based on the sensor fusion of an Inertial Navigation System (INS) and an Ultrasonic-SATellite (U-SAT) system. The INS consists of a gyroscope and two encoders, and the U-SAT consists of four transmitters and one receiver. These sensors are fused through an Extended Kalman Filter (EKF) for precise localization. To verify the localization performance, experiments were conducted with simulations and with real data (straight and curved paths) collected while the robot drove at 0.5 m/s, comparing a general sensor fusion scheme against a fusion that feeds only the INS data through the Kalman filter. Both the simulation and the real-data results confirmed that the fusion using only the INS data in the Kalman filter was more precise.
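  The EKF-based fusion described above can be sketched as a single predict/update step; the motion model and all noise values here are illustrative assumptions, not the paper's configuration:

```python
# Minimal EKF sketch for fusing dead-reckoning (INS-style) predictions with
# absolute position fixes (U-SAT-style). All noise values are made up.
import numpy as np

def ekf_step(x, P, u, z, Q, R, dt):
    """x = [px, py, theta]; u = [v, omega] from encoders/gyro; z = [px, py] fix."""
    v, w = u
    # Predict with a nonlinear unicycle motion model.
    x_pred = x + dt * np.array([v * np.cos(x[2]), v * np.sin(x[2]), w])
    F = np.array([[1, 0, -dt * v * np.sin(x[2])],
                  [0, 1,  dt * v * np.cos(x[2])],
                  [0, 0, 1]])
    P_pred = F @ P @ F.T + Q
    # Update with the absolute position measurement (linear H).
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(3), np.eye(3) * 0.1
Q, R = np.eye(3) * 1e-3, np.eye(2) * 1e-2
x, P = ekf_step(x, P, u=[0.5, 0.0], z=np.array([0.05, 0.0]), Q=Q, R=R, dt=0.1)
print(x)
```

  In the paper's stronger variant, only the INS drives the filter prediction, with U-SAT fixes arriving as the measurement update, as in `z` here.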

Space and Time Sensor Fusion Using an Active Camera For Mobile Robot Navigation

  • Jin, Tae-Seok;Lee, Bong-Ki;Park, Soo-Min;Lee, Kwon-Soon;Lee, Jang-Myung
    • 한국항해항만학회:학술대회논문집
    • /
    • 한국항해항만학회 2002년도 추계공동학술대회논문집
    • /
    • pp.127-132
    • /
    • 2002
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both unstructured and structured environments.
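  The core space-time idea, reusing a past reading by transforming it through the robot's known motion into the current frame before fusing, can be sketched as follows (all values are made up for illustration):

```python
# Sketch: transform an obstacle point observed in the previous robot frame
# into the current frame using the robot's motion, then fuse it with the
# current noisy reading instead of adding another sensor.
import numpy as np

def to_current_frame(pt_prev, dx, dy, dtheta):
    """Map a point from the previous robot frame into the current one."""
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    shifted = np.array(pt_prev) - np.array([dx, dy])
    return np.array([c * shifted[0] - s * shifted[1],
                     s * shifted[0] + c * shifted[1]])

obstacle_prev = [2.0, 0.0]    # seen one step ago, in the previous frame
obstacle_now = [1.92, 0.0]    # current noisy range reading
# Robot moved 0.1 m forward with no rotation between the two readings.
moved = to_current_frame(obstacle_prev, dx=0.1, dy=0.0, dtheta=0.0)
fused = 0.5 * (moved + np.array(obstacle_now))
print(fused)
```

  Averaging the motion-compensated past reading with the current one is the simplest possible fusion rule; the paper's STSF scheme generalizes this to weighted temporal sequences.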


Fusion of DEMs Generated from Optical and SAR Sensor

  • Jin, Kyeong-Hyeok;Yeu, Yeon;Hong, Jae-Min;Yoon, Chang-Rak;Yeu, Bock-Mo
    • 대한공간정보학회지
    • /
    • Vol. 10, No. 5
    • /
    • pp.53-65
    • /
    • 2002
  • The most widespread techniques for DEM generation are stereoscopy for optical sensor images and SAR interferometry (InSAR) for SAR images. These techniques suffer from certain sensor and processing limitations, which can be overcome by the synergetic use of both sensors and of the respective DEMs. This study is concerned with improving accuracy while keeping the image characteristics consistent between two different DEMs, one from stereoscopy of the optical images and one from interferometry of the SAR images. MWD (Multiresolution Wavelet Decomposition) and HPF (High-Pass Filtering), which take advantage of the complementary properties of SAR and stereo-optical DEMs, are applied in the fusion process. DEM fusion is tested with two sets of SPOT and ERS-1/-2 satellite imagery, and a DEM generated from a 1:5,000 digital topographic map is used for the analysis of the results. As a result of the integration, the fused DEM portrays topographic slopes and tilts more clearly when the strengths of the SAR-image DEM are applied to the optical-satellite-image DEM; in the case of HPF, the resulting DEM.
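  The HPF-style fusion can be sketched on 1-D elevation profiles: keep the smooth regional trend of the optical DEM and inject only the high-frequency detail of the SAR DEM. The kernel size and the synthetic profiles are assumptions, not values from the study:

```python
# Sketch of high-pass-filter DEM fusion on 1-D profiles (pure NumPy).
import numpy as np

def hpf_fuse(dem_optical, dem_sar, size=5):
    """Add the high-pass component of the SAR DEM to the optical DEM."""
    kernel = np.ones(size) / size
    # Moving-average low-pass of the SAR profile ('same' mode keeps length).
    sar_low = np.convolve(dem_sar, kernel, mode="same")
    return dem_optical + (dem_sar - sar_low)

x = np.arange(200)
optical = 0.05 * x                        # smooth regional trend
sar = 0.05 * x + 0.5 * np.sin(x / 3.0)    # same trend plus fine texture
fused = hpf_fuse(optical, sar)
print(fused.shape)
```

  A wavelet decomposition (the paper's MWD variant) replaces the single moving-average cut-off with a multiresolution split, but the keep-trend/inject-detail principle is the same.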


Data Alignment for Data Fusion in Wireless Multimedia Sensor Networks Based on M2M

  • Cruz, Jose Roberto Perez;Hernandez, Saul E. Pomares;Cote, Enrique Munoz De
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 6, No. 1
    • /
    • pp.229-240
    • /
    • 2012
  • Advances in MEMS and CMOS technologies have motivated the development of low-cost, low-power sensors and wireless multimedia sensor networks (WMSN), created to ubiquitously harvest multimedia content. Such networks have allowed researchers and engineers to glimpse new Machine-to-Machine (M2M) systems, such as the remote monitoring of biosignals for telemedicine networks. These systems require the acquisition of a large number of data streams that are simultaneously generated by multiple distributed devices, a paradigm of data generation and transmission known as event-streaming. In order to be useful to the application, the collected data require a preprocessing step called data fusion, which entails the temporal alignment of the multimedia data. A practical way to perform this task is in a centralized manner, assuming that the network nodes function only as collector entities; however, under this scheme a considerable amount of redundant information is transmitted to the central entity. To decrease such redundancy, data fusion must be performed collaboratively. In this paper, we propose a collaborative data alignment approach for event-streaming that identifies temporal relationships by translating timeline-based temporal dependencies into causal dependencies among the media involved.

Precision Analysis of NARX-based Vehicle Positioning Algorithm in GNSS Disconnected Area

  • Lee, Yong;Kwon, Jay Hyoun
    • 한국측량학회지
    • /
    • Vol. 39, No. 5
    • /
    • pp.289-295
    • /
    • 2021
  • Recently, owing to the development of autonomous vehicles, research on precisely determining the position of a moving object has been actively conducted. Previous research mainly fused GNSS/IMU (Global Navigation Satellite System / Inertial Measurement Unit) data with sensors attached to the vehicle through a Kalman filter. In recent years, however, improved computing power and the advent of deep learning have brought new techniques for determining the location of a moving object, including learning-based positioning methods built on RNN (Recurrent Neural Network), LSTM (Long Short-Term Memory), and NARX (Nonlinear Auto-Regressive eXogenous) models. The purpose of this study is to compare the precision of the existing filter-based sensor fusion technology and the NARX-based method during GNSS signal blockages using simulation data. When the filter-based sensor integration was used, an average horizontal position error of 112.8 m occurred during 60 seconds of GNSS signal outage. The same experiment was performed 100 times using the NARX model, and an improvement in precision was confirmed in approximately 20% of the runs, with a horizontal position error of 22.65 m, better than that of the filter-based fusion technique.
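  A NARX-style predictor can be sketched with lagged position terms (auto-regressive) and lagged exogenous inputs; here a linear least-squares fit stands in for the neural network, on a synthetic 1-D trajectory rather than the paper's simulation data:

```python
# Minimal NARX-style sketch: predict the next position from lagged positions
# and lagged exogenous inputs (a velocity-like signal). Linear least squares
# replaces the neural network for brevity; everything is synthetic.
import numpy as np

t = np.arange(300) * 0.1
pos = np.sin(t)                  # "true" 1-D position
vel = np.gradient(pos, 0.1)      # exogenous input, e.g. from an IMU

lags = 3
rows, targets = [], []
for k in range(lags, len(pos)):
    # Feature vector: lagged positions followed by lagged exogenous inputs.
    rows.append(np.concatenate([pos[k - lags:k], vel[k - lags:k]]))
    targets.append(pos[k])
X, y = np.array(rows), np.array(targets)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"fit RMSE: {rmse:.2e}")
```

  During a GNSS outage the model would be iterated on its own predictions, which is where the error growth compared in the paper comes from.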

카메라-라이다 센서 융합을 통한 VRU 분류 및 추적 알고리즘 개발 (Vision and Lidar Sensor Fusion for VRU Classification and Tracking in the Urban Environment)

  • 김유진;이호준;이경수
    • 자동차안전학회지
    • /
    • Vol. 13, No. 4
    • /
    • pp.7-13
    • /
    • 2021
  • This paper presents a vulnerable road user (VRU) classification and tracking algorithm using a vision and LiDAR sensor fusion method for urban autonomous driving. Classification and tracking of vulnerable road users such as pedestrians, bicycles, and motorcycles are essential for autonomous driving in complex urban environments. In this paper, a real-time image object detection algorithm (YOLO) and an object tracking algorithm on the LiDAR point cloud are fused at a high level. The proposed algorithm consists of four parts. First, the object bounding boxes in pixel coordinates obtained from YOLO are transformed into the local coordinates of the subject vehicle using a homography matrix. Second, the LiDAR point cloud is clustered based on Euclidean distance and the clusters are associated using GNN. In addition, the states of the clusters, including position, heading angle, velocity, and acceleration, are estimated in real time using a geometric model-free approach (GMFA). Finally, each LiDAR track is matched with a vision track using the angle of the transformed vision track and is assigned a classification ID. The proposed fusion algorithm is evaluated via real-vehicle tests in an urban environment.
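  The first step above, mapping a detected bounding box from pixel coordinates to the vehicle's local ground plane, can be sketched as follows; the 3x3 homography here is a made-up example, not a calibrated value:

```python
# Sketch: project a bounding-box foot point from pixel coordinates onto the
# ground plane of the subject vehicle via a planar homography.
import numpy as np

def pixel_to_local(u, v, H):
    """Apply homography H to pixel (u, v); returns (x, y) on the ground plane."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]   # homogeneous normalization

# Fabricated homography for illustration; a real one comes from calibration.
H = np.array([[0.02, 0.0, -6.4],
              [0.0, -0.05, 24.0],
              [0.0, 0.001, 1.0]])
x, y = pixel_to_local(320.0, 400.0, H)
print(x, y)
```

  The resulting local-frame point is what gets angle-matched against the LiDAR tracks in the final association step.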

Tracking of ARPA Radar Signals Based on UK-PDAF and Fusion with AIS Data

  • Chan Woo Han;Sung Wook Lee;Eun Seok Jin
    • 한국해양공학회지
    • /
    • Vol. 37, No. 1
    • /
    • pp.38-48
    • /
    • 2023
  • To maintain the existing systems of ships while introducing autonomous operation technology, it is necessary to improve situational awareness through sensor fusion of the automatic identification system (AIS) and the automatic radar plotting aid (ARPA), which are already installed on board. This study proposes an algorithm for determining in real time whether AIS and ARPA signals originate from the same ship. To minimize errors caused by the time series and abnormal phenomena of the heterogeneous signals, a tracking method combining the unscented Kalman filter and the probabilistic data association filter is applied to the ARPA radar signals, and a position prediction method is applied to the AIS signals. In particular, the proposed algorithm determines whether the signals belong to the same vessel by comparing motion-related components of the heterogeneous signal data to which these methods have been applied. Finally, a measurement test is conducted on a training ship, in which the proposed algorithm is validated using AIS and ARPA signal data received by the voyage data recorder for the same ship and verified by comparing the test results with those obtained from raw data. The results recommend a sensor fusion algorithm that considers the characteristics of the sensors to improve the situational-awareness accuracy of existing ship systems.
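  The same-ship decision, comparing motion-related components of the two tracks under gates, can be sketched as follows; the thresholds and track values are illustrative assumptions, not the paper's parameters:

```python
# Toy sketch of the same-ship test: accept an ARPA/AIS pairing when both the
# position and velocity differences fall under assumed gates.
import numpy as np

def same_ship(arpa, ais, pos_gate=50.0, vel_gate=1.0):
    """Each track: dict with 'pos' (m, local frame) and 'vel' (m/s)."""
    dp = np.linalg.norm(np.array(arpa["pos"]) - np.array(ais["pos"]))
    dv = np.linalg.norm(np.array(arpa["vel"]) - np.array(ais["vel"]))
    return bool(dp < pos_gate and dv < vel_gate)

arpa_track = {"pos": [1200.0, 300.0], "vel": [5.1, 0.2]}   # UKF-PDAF output
ais_report = {"pos": [1215.0, 290.0], "vel": [5.0, 0.3]}   # predicted AIS fix
print(same_ship(arpa_track, ais_report))
```

  In the paper the compared states come from the UKF-PDAF tracker and the AIS position predictor respectively; this sketch only shows the gating comparison itself.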