• Title/Abstract/Keywords: sensor data fusion

Search results: 382 (processing time: 0.026 s)

A Study on Mobile Robot Navigation Using a New Sensor Fusion

  • Tack, Han-Ho;Jin, Tae-Seok;Lee, Sang-Bae
    • 한국지능시스템학회:학술대회논문집 / 한국퍼지및지능시스템학회 ISIS 2003 / pp.471-475 / 2003
  • This paper proposes a sensor-fusion technique in which data sets from previous moments are appropriately transformed and fused with the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
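The central STSF idea in this abstract, transforming measurements from previous moments so they can be fused with the current ones, can be sketched in two pieces: a motion-compensation step and a variance-weighted fusion step. The odometry-based transform and the weighting below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def motion_compensate(prev_point, dx, dy, dtheta):
    """Re-express a point observed from the previous robot pose in the
    current robot frame, given odometry (dx, dy, dtheta)."""
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    x, y = prev_point[0] - dx, prev_point[1] - dy
    return np.array([c * x - s * y, s * x + c * y])

def fuse(z_prev, var_prev, z_curr, var_curr):
    """Variance-weighted fusion of two estimates of the same quantity;
    the fused variance is smaller than either input variance."""
    w = var_curr / (var_prev + var_curr)
    z = w * z_prev + (1.0 - w) * z_curr
    var = var_prev * var_curr / (var_prev + var_curr)
    return z, var
```

Fusing a motion-compensated previous range with the current one tightens the estimate without adding a physical sensor, which is the point the abstract makes.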


오류 역전파 신경망 기반의 센서융합을 이용한 이동로봇의 효율적인 지도 작성 (An Effective Mapping for a Mobile Robot using Error Backpropagation based Sensor Fusion)

  • 김경동;곡효천;최경식;이석규
    • 한국정밀공학회지 / Vol. 28, No. 9 / pp.1040-1047 / 2011
  • This paper proposes a novel method based on error-backpropagation neural networks that fuses laser and ultrasonic sensor data to enhance mapping accuracy. To navigate, a robot must know its initial position and have accurate information about the environment around it. However, owing to its inherent properties, each sensor has its own advantages and drawbacks. In our system, a robot equipped with seven ultrasonic sensors and one laser sensor navigates and maps two different corridor environments. The experimental results show the effectiveness of heterogeneous sensor fusion using an error-backpropagation algorithm for mapping.
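As a rough illustration of the fusion idea (not the authors' network), the sketch below trains a one-hidden-layer network by plain error backpropagation to map a noisy ultrasonic reading and a more precise laser reading to a fused range estimate. The architecture, noise levels, and learning rate are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: a true range observed by a noisy ultrasonic
# sensor and a more precise laser sensor (noise levels are assumptions).
true_r = rng.uniform(0.5, 5.0, size=(200, 1))
X = np.hstack([true_r + rng.normal(0.0, 0.15, true_r.shape),   # ultrasonic
               true_r + rng.normal(0.0, 0.03, true_r.shape)])  # laser
y = true_r

# One hidden layer, trained by error backpropagation (plain gradient descent).
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

mse_before = float(((forward(X)[1] - y) ** 2).mean())

lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                        # gradient of 0.5 * squared error
    dh = (err @ W2.T) * (1.0 - h ** 2)    # backpropagate through tanh
    W2 -= lr * (h.T @ err) / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * (X.T @ dh) / len(X); b1 -= lr * dh.mean(0)

mse_after = float(((forward(X)[1] - y) ** 2).mean())
```

After training, `mse_after` is well below `mse_before`: the network has learned to weight the two heterogeneous readings into one range estimate.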

Simulation of Mobile Robot Navigation based on Multi-Sensor Data Fusion by Probabilistic Model

  • Jin, Tae-seok
    • 한국산업융합학회 논문집 / Vol. 21, No. 4 / pp.167-174 / 2018
  • Presently, the exploration of unknown environments is an important task in the development of mobile robots, and mobile robots are navigated by a number of methods using sensing systems such as sonar or vision. To fully exploit the strengths of both the sonar and visual sensing systems, multi-sensor data fusion (MSDF) has become a useful method in mobile robotics for navigation and collision avoidance, and its applicability to map building and navigation has been exploited in recent years. This paper is a preliminary step toward developing a multi-purpose autonomous carrier mobile robot to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. Its aim is to present the use of multi-sensor data fusion, with sensors such as ultrasonic and IR, for mobile robot navigation, and it presents an experimental mobile robot designed to operate autonomously within indoor environments. Simulation results with a mobile robot demonstrate the effectiveness of the discussed methods.
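The abstract does not spell out the probabilistic model, but a standard building block for this kind of MSDF is the Bayesian log-odds update, which fuses independent per-sensor occupancy probabilities (e.g. one ultrasonic and one IR inverse-sensor-model output for the same grid cell) into a single posterior. The sketch assumes each sensor probability is stated against an uninformative 0.5 baseline:

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

def fuse_occupancy(prior, sensor_probs):
    """Posterior occupancy of one grid cell after fusing independent
    per-sensor probabilities (inverse sensor models) via the standard
    log-odds update."""
    l = logodds(prior) + sum(logodds(p) for p in sensor_probs)
    return 1.0 / (1.0 + np.exp(-l))
```

Two sensors that each report 0.8 occupancy reinforce one another (posterior ≈ 0.94), while readings of 0.8 and 0.2 cancel back to 0.5, which is why fusing heterogeneous sensors helps with both detection and false-alarm suppression.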

항공영상과 라이다 자료를 이용한 이종센서 자료간의 alignment에 관한 연구 (A Study on the Alignment of Heterogeneous Sensor Data Using Aerial Images and LIDAR Data)

  • 곽태석;이재빈;조현기;김용일
    • 한국측량학회:학술대회논문집 / 한국측량학회 2004 Fall Conference Proceedings / pp.257-262 / 2004
  • The purpose of data fusion is to extract the maximum amount of information by combining data acquired from two or more sensor systems of the same or different kinds. Data fusion of same-kind sensor systems, such as optical imagery, has long been the focus, but LIDAR has recently emerged as a technology for rapidly capturing data on physical surfaces, with highly accurate results derived from the LIDAR data. Considering the nature of aerial imagery and LIDAR data, it is clear that the two systems provide complementary information. Data fusion consists of two steps, alignment and matching; the complementary information can be fully utilized only after successful alignment of the aerial imagery and LIDAR data. This research uses the centroids of buildings extracted from LIDAR data as control information for estimating the exterior orientation parameters of aerial imagery relative to the LIDAR reference frame.


역공학에서 센서융합에 의한 효율적인 데이터 획득 (Efficient Digitizing in Reverse Engineering By Sensor Fusion)

  • 박영근;고태조;김희술
    • 한국정밀공학회지 / Vol. 18, No. 9 / pp.61-70 / 2001
  • This paper introduces a new digitization method with sensor fusion for shape measurement in reverse engineering. Digitization can be classified into contact and non-contact types according to the measurement device: the latter is excellent in speed, while the former is better in accuracy. Sensor fusion in digitization incorporates the merits of both types so that the system can be automated. First, a non-contact vision system rapidly acquires coarse 3D point data; this step is needed to identify and localize an object placed at an unknown position on the table. Second, accurate 3D point data are obtained automatically with a scanning probe, guided by the previously measured coarse 3D point data. In this research, a large number of equally spaced measuring points were commanded along the lines acquired by the vision system. Finally, the digitized 3D point data are approximated with a rational B-spline surface equation, and the free-form surface information can be transferred to a commercial CAD/CAM system via IGES translation in order to machine the modeled geometric shape.


정보융합 기법을 활용한 잠수함 표적기동분석 성능향상 연구 (The Improvement of Target Motion Analysis(TMA) for Submarine with Data Fusion)

  • 임영택;고순주;송택렬
    • 한국군사과학기술학회지 / Vol. 12, No. 6 / pp.697-703 / 2009
  • Target Motion Analysis (TMA) is the estimation of a target's position, velocity, and course using a passive sonar system with bearing-only measurements. In this paper, we apply TMA algorithms for a submarine with multi-sensor data fusion (MSDF), and we select the best TMA algorithm for a submarine through a series of computer simulation runs.
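Full TMA estimates position, velocity, and course from bearings alone. As a minimal illustration of why own-ship motion makes the bearing-only problem solvable, the sketch below computes a least-squares position fix for a *stationary* target from bearing lines, a deliberate simplification of the problem the abstract addresses (angles in radians, measured from the +x axis):

```python
import numpy as np

def fix_stationary_target(own_xy, bearings):
    """Least-squares intersection of bearing lines: each measurement
    constrains the target to the line through own-ship position (x_i, y_i)
    with direction (cos b_i, sin b_i)."""
    A = np.column_stack([np.sin(bearings), -np.cos(bearings)])
    b = A[:, 0] * own_xy[:, 0] + A[:, 1] * own_xy[:, 1]
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Own ship runs along the x-axis while taking bearings to a target at (3, 4).
own = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
brg = np.arctan2(4.0 - own[:, 1], 3.0 - own[:, 0])
```

With noise-free bearings from three distinct own-ship positions, the fix recovers (3, 4) exactly; a moving target adds velocity and course unknowns, which is where the algorithm comparisons in the paper come in.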

ACC/AEBS 시스템용 센서퓨전을 통한 주행경로 추정 알고리즘 (Development of the Driving path Estimation Algorithm for Adaptive Cruise Control System and Advanced Emergency Braking System Using Multi-sensor Fusion)

  • 이동우;이경수;이재완
    • 자동차안전학회지 / Vol. 3, No. 2 / pp.28-33 / 2011
  • This paper presents a driving-path estimation algorithm for an adaptive cruise control system and an advanced emergency braking system using multi-sensor fusion. Through data collection, the characteristics of the yaw-rate-filtering-based road curvature and the vision-sensor road curvature are analyzed. The two curvature estimates are fused into a single curvature by a weighting factor that accounts for the characteristics of each. The proposed driving-path estimation algorithm has been investigated via simulations performed with the vehicle dynamics package CarSim and MATLAB/Simulink, which show that it improves the primary target detection rate.
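The fusion step the abstract describes reduces to a convex combination of the two curvature estimates; the helper below also shows the standard yaw-rate-to-curvature conversion (curvature = yaw rate / speed). How the weight is chosen is application-specific and not given in the abstract:

```python
def curvature_from_yaw_rate(yaw_rate, speed):
    """Road curvature implied by vehicle motion: kappa = yaw_rate / speed
    (rad/s divided by m/s gives 1/m)."""
    return yaw_rate / speed

def fuse_curvature(kappa_yaw, kappa_vision, w_yaw):
    """Convex combination of the two estimates; w_yaw in [0, 1] encodes
    how much the yaw-rate-based curvature is trusted relative to the
    vision-based one."""
    assert 0.0 <= w_yaw <= 1.0
    return w_yaw * kappa_yaw + (1.0 - w_yaw) * kappa_vision
```

In practice the weight might rise with speed (yaw rate is reliable there) and fall when the lane camera reports high confidence, but that schedule is a design choice, not something the abstract specifies.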

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • 한국산업융합학회 논문집 / Vol. 22, No. 4 / pp.427-433 / 2019
  • This paper proposes a fuzzy inference model for map building and navigation of a mobile robot with an active camera, which navigates intelligently to a goal location in unknown environments using sensor fusion based on situational commands from the active camera sensor. Active cameras give a mobile robot the capability to estimate and track feature images over a hallway field of view. Instead of "physical sensor fusion", which generates the robot's trajectory from an environment model and sensory data, command fusion is used to govern the navigation. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the active camera's sensory data from the navigation experiments are fused into the identification process. Navigation performance improves on that achieved using fuzzy inference alone. Experimental evidence demonstrates that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.
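Command fusion, as opposed to fusing raw sensor data, blends the *commands* proposed by competing behaviors. A minimal sketch, with a hypothetical distance-based ramp membership standing in for the paper's tuned fuzzy rules:

```python
def command_fusion(goal_cmd, avoid_cmd, obstacle_dist, d_near=0.5, d_far=2.0):
    """Blend goal-approach and obstacle-avoidance steering commands.
    The ramp membership on obstacle distance (d_near, d_far in metres)
    is an illustrative assumption, not the paper's rule base."""
    w = min(1.0, max(0.0, (d_far - obstacle_dist) / (d_far - d_near)))
    return w * avoid_cmd + (1.0 - w) * goal_cmd
```

Far from obstacles the goal-approach command dominates; inside `d_near` the avoidance command takes over completely, and in between the two are blended smoothly rather than switched, which is the usual argument for command fusion.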

Data Alignment for Data Fusion in Wireless Multimedia Sensor Networks Based on M2M

  • Cruz, Jose Roberto Perez;Hernandez, Saul E. Pomares;Cote, Enrique Munoz De
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 6, No. 1 / pp.229-240 / 2012
  • Advances in MEMS and CMOS technologies have motivated the development of low-cost, low-power sensors and wireless multimedia sensor networks (WMSN). WMSNs were created to ubiquitously harvest multimedia content, and such networks have allowed researchers and engineers to glimpse new Machine-to-Machine (M2M) systems, such as the remote monitoring of biosignals for telemedicine networks. These systems require the acquisition of a large number of data streams that are simultaneously generated by multiple distributed devices, a paradigm of data generation and transmission known as event-streaming. To be useful to the application, the collected data require preprocessing, called data fusion, which entails the temporal alignment of multimedia data. A practical way to perform this task is in a centralized manner, assuming that the network nodes function only as collector entities. However, under this scheme a considerable amount of redundant information is transmitted to the central entity. To decrease such redundancy, data fusion must be performed in a collaborative way. In this paper, we propose a collaborative data alignment approach for event-streaming. Our approach identifies temporal relationships by translating timeline-based temporal dependencies into causal dependencies among the media involved.

Application of Random Forests to Assessment of Importance of Variables in Multi-sensor Data Fusion for Land-cover Classification

  • Park, No-Wook;Chi, Kwang-Hoon
    • 대한원격탐사학회지 / Vol. 22, No. 3 / pp.211-219 / 2006
  • A random forests classifier is applied to multi-sensor data fusion for supervised land-cover classification in order to account for variable importance. The random forests approach is a non-parametric ensemble classifier based on CART-like trees; its distinguishing feature is that the importance of a variable can be estimated by randomly permuting the variable of interest in all the out-of-bag samples for each tree. Two different multi-sensor data sets for supervised classification were used to illustrate the applicability of random forests: one with optical and polarimetric SAR data, and the other with multi-temporal Radarsat-1 and ENVISAT ASAR data. The experimental results show that the random forests approach can extract the variables or bands important for land-cover discrimination and achieves reasonably good classification accuracy.
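The permutation-importance mechanism the abstract describes can be shown without a full random forest: permute one band at a time and measure the accuracy drop of a fixed classifier. Here a toy rule on synthetic data, where only band 0 is informative, stands in for the trained forest:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 3))     # three synthetic "bands"
y = (X[:, 0] > 0).astype(int)     # only band 0 carries class information

def predict(X):
    # Stand-in for a trained random forest: a rule that uses band 0 only.
    return (X[:, 0] > 0).astype(int)

def permutation_importance(X, y, predict, rng):
    """Importance of band j = accuracy drop after randomly permuting
    column j (random forests do this on the out-of-bag samples)."""
    base = (predict(X) == y).mean()
    drops = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        drops.append(base - (predict(Xp) == y).mean())
    return np.array(drops)

imp = permutation_importance(X, y, predict, rng)   # band 0 dominates
```

Permuting the informative band destroys roughly half the predictions, while permuting uninformative bands changes nothing, which is how the paper ranks optical and SAR bands for land-cover discrimination.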