• Title/Abstract/Keywords: Sensor Fusion System


사물인터넷 시스템을 위한 센서 융합 FPGA 구현 (Implementation of a Sensor Fusion FPGA for an IoT System)

  • 정창민;이광엽;박태룡
    • 전기전자학회논문지
    • /
    • Vol. 19, No. 2
    • /
    • pp.142-147
    • /
    • 2015
  • In this paper, we propose the design of a Kalman-filter-based sensor fusion filter that estimates attitude by correcting and fusing information obtained from a gyro sensor and an acceleration sensor. With the recent development of sensor network technology, techniques for fusing sensor data are increasingly required. The nonlinear system model of the filter is converted into a linear system model through a Jacobian matrix computation, and the estimate is predicted through Euler integration. The proposed filter was implemented on a Xilinx Virtex-6 FPGA board. The implemented filter operates at a 74 MHz clock frequency, and the accuracy and reliability of the estimated attitude were verified by comparing the implemented filter with existing filters.
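The abstract above outlines a Kalman-filter attitude estimator that corrects gyro integration with accelerometer data. The sketch below illustrates that general idea in Python with a two-state filter (angle and gyro bias); the state layout, noise values, and sampling period are assumptions for illustration, not the paper's FPGA design.

```python
import numpy as np

def kalman_attitude(gyro_rates, accel_angles, dt=0.01):
    """Estimate one attitude angle by fusing gyro rate and the
    accelerometer-derived angle with a 2-state Kalman filter
    (state = [angle, gyro bias]).  Noise values are illustrative."""
    x = np.zeros(2)                          # [angle (rad), gyro bias (rad/s)]
    P = np.eye(2)                            # state covariance
    Q = np.diag([1e-5, 1e-7])                # process noise (assumed)
    R = np.array([[1e-2]])                   # accel measurement noise (assumed)
    F = np.array([[1.0, -dt], [0.0, 1.0]])   # state transition
    B = np.array([dt, 0.0])                  # gyro rate enters as an input
    H = np.array([[1.0, 0.0]])               # only the angle is observed

    estimates = []
    for w, z in zip(gyro_rates, accel_angles):
        # Predict: Euler-integrate the bias-corrected gyro rate
        x = F @ x + B * w
        P = F @ P @ F.T + Q
        # Update with the accelerometer-derived angle
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

# Example: stationary IMU with a small constant gyro bias
n = 1000
est = kalman_attitude(gyro_rates=0.02 + 0.01 * np.random.randn(n),
                      accel_angles=0.05 * np.random.randn(n))
print(est[-1])   # stays near 0 rad despite the gyro bias
```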

역공학에서 센서융합에 의한 효율적인 데이터 획득 (Efficient Digitizing in Reverse Engineering By Sensor Fusion)

  • 박영근;고태조;김희술
    • 한국정밀공학회지
    • /
    • Vol. 18, No. 9
    • /
    • pp.61-70
    • /
    • 2001
  • This paper introduces a new digitization method with sensor fusion for shape measurement in reverse engineering. Digitization can be classified into contact and non-contact types according to the measurement device. The key concerns in digitization are speed and accuracy: the non-contact type excels in speed, while the contact type excels in accuracy. Sensor fusion in digitization aims to combine the merits of both types so that the system can be automated. First, a non-contact vision system rapidly acquires coarse 3D point data; this step identifies and localizes the object lying at an unknown position on the table. Second, accurate 3D point data are obtained automatically with a scanning probe, guided by the previously measured coarse data. In this research, a large number of equally spaced measuring points were commanded along the line acquired by the vision system. Finally, the digitized 3D point data are fitted to a rational B-spline surface, and the free-form surface information can be transferred to a commercial CAD/CAM system via IGES translation in order to machine the modeled geometric shape.
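The final fitting step described above can be sketched with SciPy's bivariate spline routines. This is a non-rational B-spline approximation under assumed grid and smoothing parameters, not the paper's actual NURBS fitting or IGES export.

```python
import numpy as np
from scipy import interpolate

# Synthetic stand-in for digitized (x, y, z) points on a free-form surface
x, y = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
z = 0.2 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)

# Fit a bicubic B-spline surface; s controls the smoothing of the fit
tck = interpolate.bisplrep(x.ravel(), y.ravel(), z.ravel(), kx=3, ky=3, s=0.01)

# Evaluate the fitted surface on a finer grid, e.g. before export to CAD
xe = np.linspace(0, 1, 50)
ye = np.linspace(0, 1, 50)
ze = interpolate.bisplev(xe, ye, tck)
print(ze.shape)   # (50, 50)
```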


농업기계 내비게이션을 위한 INS/GPS 통합 연구 (Study on INS/GPS Sensor Fusion for Agricultural Vehicle Navigation System)

  • 노광모;박준걸;장영창
    • Journal of Biosystems Engineering
    • /
    • Vol. 33, No. 6
    • /
    • pp.423-429
    • /
    • 2008
  • This study investigated the effects of inertial navigation system (INS) / global positioning system (GPS) sensor fusion for agricultural vehicle navigation. An extended Kalman filter algorithm was adopted for INS/GPS sensor fusion in an integrated mode, and a vehicle dynamic model was used instead of a navigation state-error model. The INS/GPS system consisted of a low-cost gyroscope, an odometer, and a GPS receiver, and its performance was tested through computer simulations. When the measurement noise of the GPS receiver was 10, 1.0, 0.5, and 0.2 m (1σ), the RMS position and heading errors of the INS/GPS system on a 5 m/s straight path were markedly reduced to 10%, 35%, 40%, and 60% of those obtained from the GPS receiver alone, respectively. The reduction of position and heading errors when using INS/GPS rather than stand-alone GPS can provide more stable steering of agricultural equipment. Therefore, the low-cost INS/GPS system using the extended Kalman filter algorithm may enable autonomous navigation that meets performance requirements such as stable steering and small position errors even in slow-speed operation.
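A minimal sketch of the kind of extended Kalman filter described above, fusing dead reckoning (odometer speed and gyro yaw rate) with GPS position fixes. The state vector, motion model, and noise levels are illustrative assumptions rather than the paper's tuned vehicle dynamic model.

```python
import numpy as np

def ekf_ins_gps(odom_speed, gyro_yaw, gps_xy, dt=0.2):
    """EKF with state [x, y, heading]: predict from odometer speed and
    gyro yaw rate, correct with GPS fixes.  Noise values are assumed."""
    x = np.zeros(3)
    P = np.eye(3)
    Q = np.diag([0.05, 0.05, 0.01])          # process noise (assumed)
    R = np.diag([1.0, 1.0])                  # GPS noise, 1 m std (assumed)
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])          # GPS observes position only

    track = []
    for v, w, z in zip(odom_speed, gyro_yaw, gps_xy):
        # Predict with the nonlinear vehicle motion model
        th = x[2]
        x = x + np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, w * dt])
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0,  1.0]])     # Jacobian of the motion model
        P = F @ P @ F.T + Q
        # Correct with the GPS fix
        y = np.asarray(z) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(3) - K @ H) @ P
        track.append(x.copy())
    return np.array(track)

# Example: straight 5 m/s run with 1 m (1-sigma) GPS noise
n = 300
rng = np.random.default_rng(0)
truth = np.stack([np.arange(1, n + 1) * 5.0 * 0.2, np.zeros(n)], axis=1)
track = ekf_ins_gps(np.full(n, 5.0), np.zeros(n),
                    truth + rng.normal(0.0, 1.0, (n, 2)))
print(track[-1])   # close to [300, 0, 0]
```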

2차원 라이다와 상업용 영상-관성 기반 주행 거리 기록계를 이용한 3차원 점 구름 지도 작성 시스템 개발 (Development of 3D Point Cloud Mapping System Using 2D LiDAR and Commercial Visual-inertial Odometry Sensor)

  • 문종식;이병윤
    • 대한임베디드공학회논문지
    • /
    • Vol. 16, No. 3
    • /
    • pp.107-111
    • /
    • 2021
  • A 3D point cloud map is an essential element in various fields, including precise autonomous navigation systems. However, generating a 3D point cloud map with a single sensor is limited by the high price of such sensors. To solve this problem, we propose a precise 3D mapping system based on low-cost sensor fusion. Generating a point cloud map requires estimating the current position and attitude and describing the surrounding environment. In this paper, a commercial visual-inertial odometry sensor is used to estimate the current position and attitude states. Based on these state values, the 2D LiDAR measurements describing the surrounding environment are accumulated into a point cloud map. To analyze the performance of the proposed algorithm, it was compared with a 3D LiDAR-based SLAM (simultaneous localization and mapping) algorithm. The results confirm that a precise 3D point cloud map can be generated with the low-cost sensor fusion system proposed in this paper.
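The core mapping step described above, projecting each 2D LiDAR scan into the world frame using the pose reported by the visual-inertial odometry sensor, can be sketched as follows. Frame conventions and the synthetic scan are assumptions for illustration.

```python
import numpy as np

def scan_to_world(ranges, angles, R_wb, t_wb):
    """Project one 2D LiDAR scan into the world frame using the pose
    (rotation R_wb, translation t_wb) from visual-inertial odometry."""
    pts = np.stack([ranges * np.cos(angles),       # scan points in the
                    ranges * np.sin(angles),       # sensor frame; the 2D
                    np.zeros_like(ranges)], axis=1)  # LiDAR plane is z = 0
    return pts @ R_wb.T + t_wb                     # rotate, then translate

# One synthetic scan and one VIO pose; accumulating such scans over the
# trajectory builds the 3D point cloud map.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
ranges = np.full_like(angles, 3.0)
yaw = np.deg2rad(30.0)
R_wb = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                 [np.sin(yaw),  np.cos(yaw), 0.0],
                 [0.0,          0.0,         1.0]])
t_wb = np.array([1.0, 2.0, 0.5])
cloud = scan_to_world(ranges, angles, R_wb, t_wb)
print(cloud.shape)   # (181, 3)
```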

Fusion of Sonar and Laser Sensor for Mobile Robot Environment Recognition

  • Kim, Kyung-Hoon;Cho, Hyung-Suck
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 2001년도 ICCAS
    • /
    • pp.91.3-91
    • /
    • 2001
  • A sensor fusion scheme for mobile robot environment recognition that incorporates range data and contour data is proposed. The ultrasonic sensor provides only a coarse spatial description, but it guarantees, with relatively high belief, that the space within its sonic cone is free of obstacles. The laser structured-light system provides a detailed contour description of the environment, but it is prone to optical noise and is easily affected by surface reflectivity. The overall fusion process is composed of two stages: noise elimination and belief update. Dempster-Shafer evidential reasoning is applied at each stage. Open-space estimation from sonar range measurements allows noisy lines to be eliminated from the laser sensor data. Comparing actual sonar data to the simulated sonar data enables ...
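Dempster-Shafer combination, applied at both fusion stages above, can be sketched for a two-hypothesis frame (occupied / free). The mass assignments below are illustrative, not the paper's calibration of the sonar and laser evidence.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions over subsets of {occupied, free} with
    Dempster's rule; subsets are frozensets, and the full frame encodes
    'unknown'.  Conflicting mass is discarded and renormalized."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

OCC, FREE = frozenset({"occupied"}), frozenset({"free"})
UNKNOWN = OCC | FREE

# Sonar: strong belief the cone is free; laser: weak evidence of an edge
m_sonar = {FREE: 0.7, UNKNOWN: 0.3}
m_laser = {OCC: 0.2, FREE: 0.3, UNKNOWN: 0.5}
print(dempster_combine(m_sonar, m_laser))   # 'free' dominates after fusion
```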


센서퓨젼 기반의 인공신경망을 이용한 드릴 마모 모니터링 (Sensor Fusion and Neural Network Analysis for Drill-Wear Monitoring)

  • ;권오양
    • 한국공작기계학회논문집
    • /
    • Vol. 17, No. 1
    • /
    • pp.77-85
    • /
    • 2008
  • The objective of this study is to construct a sensor fusion system for tool-condition monitoring (TCM) that leads to more efficient and economical drill usage. Drill-wear monitoring plays an important role in automated machining processes, as it can help prevent damage to tools and workpieces and optimize drill usage. In this study, we present the architecture of a multi-layer feed-forward neural network with the Levenberg-Marquardt training algorithm, based on sensor fusion, for monitoring the drill-wear condition. The input features to the neural networks were extracted from AE, vibration, and current signals using wavelet packet transform (WPT) analysis. Training and testing were performed over a moderate range of cutting conditions in the dry drilling of steel plates. The results show good drill-wear monitoring performance for the proposed method of sensor fusion and neural network analysis.
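A hedged sketch of the feature-extraction and classification pipeline described above: wavelet-packet energy features feeding a small feed-forward network. The signals are synthetic stand-ins, and scikit-learn's L-BFGS solver substitutes for the Levenberg-Marquardt training used in the paper.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wpt_energy_features(signal, wavelet="db4", level=3):
    """Normalized energy of each wavelet-packet node at `level`,
    used as input features (wavelet and level are illustrative)."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data ** 2)
                         for node in wp.get_level(level, order="freq")])
    return energies / energies.sum()

def synth_segment(worn, rng, n=1024):
    """Hypothetical sensor segment: worn drills shift energy to higher bands."""
    t = np.arange(n)
    freq = 0.25 if worn else 0.05
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n)

rng = np.random.default_rng(0)
X = np.array([wpt_energy_features(synth_segment(worn, rng))
              for worn in (0, 1) for _ in range(20)])
y = np.array([worn for worn in (0, 1) for _ in range(20)])

# Feed-forward network; scikit-learn has no Levenberg-Marquardt solver,
# so L-BFGS stands in for the paper's training algorithm.
clf = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs",
                    max_iter=2000, random_state=0).fit(X, y)
print(clf.score(X, y))
```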

A Cyber-Physical Information System for Smart Buildings with Collaborative Information Fusion

  • Liu, Qing;Li, Lanlan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 16, No. 5
    • /
    • pp.1516-1539
    • /
    • 2022
  • This article presents a cyber-physical information-fusion IoT system that we designed for smart buildings. In essence, it is a computer system that couples physical quantities in buildings with quantitative analysis and control. On the Internet-of-Things side, it is governed by a monitoring system based on sensor networks and computer-based algorithms. Following an agent-based design, we realized both human-machine interaction (HMI) and machine-machine interaction (MMI): HMI through the human-machine interface, and MMI through embedded computing, sensors, controllers, execution devices, and a wireless communication network. This article mainly focuses on the role of wireless sensor networks and MMI in environmental monitoring, a function fundamental to building security, environmental control, HVAC, and other smart-building control systems. The article not only discusses various network applications and their agent-based implementation but also demonstrates our collaborative information fusion strategy. When the sensor system produces unstable physical measurements, this strategy provides a stable excitation for the system through collaborative information fusion, thereby preventing the jitter and unstable responses caused by uncertain disturbances and environmental factors. The article also reports system test results, which show that through the CPS interaction of HMI and MMI, the smart-building IoT system can achieve comprehensive monitoring, providing support and extensibility for advanced automation management.
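The collaborative fusion strategy is described only qualitatively above; one common way to suppress jitter from an unstable node is to weight redundant readings by their inverse variance, as in the sketch below. The sensors, window size, and weighting scheme are assumptions for illustration, not the authors' design.

```python
import numpy as np

def collaborative_fuse(readings, window=10):
    """Fuse redundant readings (e.g. zone temperature) from several nodes.
    Each sensor is weighted by the inverse of its recent variance, so a
    node that starts jittering is automatically down-weighted.
    `readings` has shape (time, sensors); parameters are illustrative."""
    readings = np.asarray(readings, dtype=float)
    recent = readings[-window:]
    var = recent.var(axis=0) + 1e-6            # avoid division by zero
    weights = (1.0 / var) / np.sum(1.0 / var)
    return float(readings[-1] @ weights)

# Three nodes measure the same zone; the third becomes noisy and jittery
rng = np.random.default_rng(1)
steady = 22.0 + 0.05 * rng.standard_normal((50, 2))
jittery = 22.0 + 1.5 * rng.standard_normal((50, 1))
print(collaborative_fuse(np.hstack([steady, jittery])))   # stays near 22.0
```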

센서 데이터 융합을 이용한 이동 로보트의 자세 추정 (The Posture Estimation of Mobile Robots Using Sensor Data Fusion Algorithm)

  • 이상룡;배준영
    • 대한기계학회논문집
    • /
    • Vol. 16, No. 11
    • /
    • pp.2021-2032
    • /
    • 1992
  • In this study, a signal-processing circuit and algorithm were developed for a multi-sensor system that can accurately estimate the posture of a mobile robot while it is driving, by combining two encoders that measure the rotation of the robot's drive motors with a gyro sensor that measures the robot's angular velocity, and performance tests were carried out to model the measurement equation of the gyro sensor. Probability theory was then applied to the derived measurement equations to develop a sensor data fusion algorithm that efficiently fuses the output signals of the multi-sensor system, so as to minimize the effect of the measurement errors inherent in the sensors. To verify the validity of the proposed fusion algorithm, driving experiments were performed and the actual posture of the mobile robot was compared with the results of the fusion algorithm.
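A minimal sketch of the sensor combination described above: the yaw rate implied by the two wheel encoders is fused with the gyro yaw rate by inverse-variance weighting before dead reckoning the pose. The noise variances and wheel base are assumptions, not the paper's identified measurement model.

```python
import numpy as np

def fuse_posture(enc_left, enc_right, gyro_rate, wheel_base=0.4,
                 var_enc=0.02, var_gyro=0.005, dt=0.05):
    """Dead-reckon a differential-drive robot pose, fusing the yaw rate
    from the two wheel encoders with the gyro yaw rate by
    inverse-variance weighting.  All parameters are illustrative."""
    x = y = theta = 0.0
    w_enc, w_gyr = 1.0 / var_enc, 1.0 / var_gyro
    poses = []
    for vl, vr, wg in zip(enc_left, enc_right, gyro_rate):
        v = 0.5 * (vl + vr)                    # forward speed from encoders
        w_e = (vr - vl) / wheel_base           # yaw rate from encoders
        w = (w_enc * w_e + w_gyr * wg) / (w_enc + w_gyr)   # fused yaw rate
        theta += w * dt
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        poses.append((x, y, theta))
    return np.array(poses)

# Example: straight run where the encoders slip slightly but the gyro does not
n = 100
poses = fuse_posture(np.full(n, 1.0), np.full(n, 1.02), np.zeros(n))
print(poses[-1])   # heading drift is damped by the gyro weighting
```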

다중 레이더 환경에서의 바이어스 오차 추정의 가관측성에 대한 연구와 정보 융합 (A Study of Observability Analysis and Data Fusion for Bias Estimation in a Multi-Radar System)

  • 원건희;송택렬;김다솔;서일환;황규환
    • 제어로봇시스템학회논문지
    • /
    • Vol. 17, No. 8
    • /
    • pp.783-789
    • /
    • 2011
  • Improving target tracking performance through multi-sensor data fusion is a challenging task. However, biases in the measurements should be removed before the various data fusion techniques are applied. In this paper, a bias-removal algorithm using measurement data from multi-radar tracking systems is proposed and evaluated by computer simulation. To predict bias estimation performance under various geometric relations between the radar systems and the target, a system observability index is proposed and tested via computer simulation. It is also shown that target tracking using multi-sensor data fusion with bias-removed measurements yields better performance.
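As a toy illustration of bias removal before fusion, the sketch below estimates the constant relative position bias between two radars observing a common target by least squares. The geometry, noise, and bias values are synthetic, and the paper's observability index is not reproduced here.

```python
import numpy as np

# Two radars observe the same target; radar B carries a constant position
# bias.  From the noisy common measurements the relative bias is estimated
# by least squares (fitting a constant offset to the differences).
rng = np.random.default_rng(2)
truth = np.cumsum(np.ones((100, 2)) * [5.0, 2.0], axis=0)   # target track
bias_b = np.array([120.0, -80.0])                           # unknown bias
z_a = truth + rng.normal(0.0, 30.0, truth.shape)
z_b = truth + bias_b + rng.normal(0.0, 30.0, truth.shape)

diff = z_b - z_a
bias_hat, *_ = np.linalg.lstsq(np.ones((len(diff), 1)), diff, rcond=None)
print(bias_hat.ravel())          # close to [120, -80]

# Bias-compensated measurements can then be fused, e.g. by simple averaging
fused = 0.5 * (z_a + (z_b - bias_hat.ravel()))
```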

센서융합에 의한 열차위치 추정방법 (Estimation of Train Position Using Sensor Fusion Technique)

  • 윤의상;박태형;윤용기;황종규;이재호
    • 한국철도학회논문집
    • /
    • Vol. 8, No. 2
    • /
    • pp.155-160
    • /
    • 2005
  • We propose a train position estimation method for an automatic train control system. The accurate train position must be fed back continuously to the control system for safe and efficient operation of trains on the railway. In this paper, we propose a sensor fusion method integrating a tachometer, a transponder, and a Doppler sensor for estimation of the train position. The external sensors (transponder, Doppler sensor) are used to compensate for the error of the internal sensor (tachometer). A Kalman filter is also applied to reduce the measurement error of the sensors. Simulation results are then presented to verify the usefulness of the proposed method.
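A minimal sketch of the fusion structure described above, assuming a 1-D track: the tachometer drives the Kalman prediction and speed update, and an occasional transponder fix provides an absolute position correction. The Doppler sensor is omitted, and all noise values are illustrative assumptions.

```python
import numpy as np

def train_position_filter(tacho_speed, balise_fixes, dt=0.1):
    """1-D Kalman filter: predict from tachometer speed and correct with
    an absolute position whenever a transponder fix is available
    (None otherwise).  Noise values are illustrative."""
    x = np.array([0.0, 0.0])                  # [position (m), speed (m/s)]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = np.diag([0.01, 0.1])                  # process noise (assumed)
    H_s, R_s = np.array([[0.0, 1.0]]), np.array([[0.05]])   # speed obs.
    H_p, R_p = np.array([[1.0, 0.0]]), np.array([[0.5]])    # position obs.

    def update(x, P, z, H, R):
        y = z - H @ x
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        return x + K @ y, (np.eye(2) - K @ H) @ P

    out = []
    for v, fix in zip(tacho_speed, balise_fixes):
        x, P = F @ x, F @ P @ F.T + Q                        # predict
        x, P = update(x, P, np.array([v]), H_s, R_s)         # tachometer
        if fix is not None:                                  # transponder
            x, P = update(x, P, np.array([fix]), H_p, R_p)
        out.append(x[0])
    return np.array(out)

# 60 s at about 20 m/s with a transponder fix every 10 s
n = 600
speeds = 20.0 + 0.2 * np.random.default_rng(3).standard_normal(n)
fixes = [2.0 * (k + 1) if (k + 1) % 100 == 0 else None for k in range(n)]
print(train_position_filter(speeds, fixes)[-1])   # close to 1200 m
```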