
Space and Time Sensor Fusion Using an Active Camera For Mobile Robot Navigation

  • Jin, Tae-Seok;Lee, Bong-Ki;Park, Soo-Min;Lee, Kwon-Soon;Lee, Jang-Myung
    • 한국항해항만학회:학술대회논문집
    • /
    • 한국항해항만학회 2002년도 추계공동학술대회논문집
    • /
    • pp.127-132
    • /
    • 2002
  • This paper proposes a sensor-fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.

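The core of the STSF idea above — re-expressing a past range measurement in the current robot frame before fusing it with the current one — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the odometry increment and the 50/50 fusion weight are assumptions.

```python
import math

def transform_past_measurement(past_point, dx, dy, dtheta):
    """Express a point observed in the robot frame at time t-1 in the
    robot frame at time t, given the odometry increment (dx, dy, dtheta)
    of the robot between the two instants."""
    # Translate into the new origin, then rotate by -dtheta.
    px, py = past_point[0] - dx, past_point[1] - dy
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * px - s * py, s * px + c * py)

def fuse(current, transformed_past, w=0.5):
    """Weighted fusion of the current reading with the
    motion-compensated past reading (weight w is illustrative)."""
    return tuple(w * c + (1 - w) * p
                 for c, p in zip(current, transformed_past))

# The robot drove 1 m forward; an obstacle seen earlier at (3, 0)
# should now appear near (2, 0), and is fused with the new reading.
past_in_current = transform_past_measurement((3.0, 0.0), 1.0, 0.0, 0.0)
fused = fuse((2.1, 0.0), past_in_current)
```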

카메라-라이다 센서 융합을 통한 VRU 분류 및 추적 알고리즘 개발 (Vision and Lidar Sensor Fusion for VRU Classification and Tracking in the Urban Environment)

  • 김유진;이호준;이경수
    • 자동차안전학회지
    • /
    • Vol. 13, No. 4
    • /
    • pp.7-13
    • /
    • 2021
  • This paper presents a vulnerable road user (VRU) classification and tracking algorithm using a vision and LiDAR sensor fusion method for urban autonomous driving. Classification and tracking of vulnerable road users such as pedestrians, bicycles, and motorcycles are essential for autonomous driving in complex urban environments. In this paper, a real-time object detection algorithm, YOLO, and an object tracking algorithm based on the LiDAR point cloud are fused at a high level. The proposed algorithm consists of four parts. First, the object bounding boxes in pixel coordinates, obtained from YOLO, are transformed into the local coordinates of the subject vehicle using a homography matrix. Second, the LiDAR point cloud is clustered based on Euclidean distance, and the clusters are associated using GNN. In addition, the states of the clusters, including position, heading angle, velocity, and acceleration, are estimated in real time using a geometric model-free approach (GMFA). Finally, each LiDAR track is matched with a vision track using the angle information of the transformed vision track and is assigned a classification ID. The proposed fusion algorithm is evaluated via real-vehicle tests in an urban environment.
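The first step of the pipeline above — mapping a YOLO bounding box into the subject vehicle's local coordinates with a homography matrix — can be sketched as follows. The matrix H and the choice of the bottom-center pixel as the ground-contact point are illustrative assumptions, not the paper's calibration.

```python
def bbox_to_local(bbox, H):
    """Map the bottom-center pixel of a bounding box (x1, y1, x2, y2)
    to vehicle-local ground coordinates using a 3x3 homography matrix H
    given as a list of rows."""
    x1, y1, x2, y2 = bbox
    u, v = (x1 + x2) / 2.0, y2            # assumed ground-contact pixel
    hx = H[0][0] * u + H[0][1] * v + H[0][2]
    hy = H[1][0] * u + H[1][1] * v + H[1][2]
    hw = H[2][0] * u + H[2][1] * v + H[2][2]
    return hx / hw, hy / hw               # normalize homogeneous coords

# With an identity homography the mapping is just the pixel itself.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
local_xy = bbox_to_local((10.0, 10.0, 20.0, 30.0), identity)
```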

Tracking of ARPA Radar Signals Based on UK-PDAF and Fusion with AIS Data

  • Chan Woo Han;Sung Wook Lee;Eun Seok Jin
    • 한국해양공학회지
    • /
    • Vol. 37, No. 1
    • /
    • pp.38-48
    • /
    • 2023
  • To maintain the existing systems of ships while introducing autonomous operation technology, it is necessary to improve situational awareness through sensor fusion of the automatic identification system (AIS) and the automatic radar plotting aid (ARPA), which are already installed. This study proposes an algorithm for determining in real time whether AIS and ARPA signals originate from the same ship. To minimize the number of errors caused by the time series and abnormal phenomena of heterogeneous signals, a tracking method combining an unscented Kalman filter and a probabilistic data association filter is applied to the ARPA radar signals, and a position prediction method is applied to the AIS signals. In particular, the proposed algorithm determines whether the signals belong to the same vessel by comparing motion-related components among the data of the heterogeneous signals to which the corresponding methods are applied. Finally, a measurement test is conducted on a training ship. In this process, the proposed algorithm is validated using the AIS and ARPA signal data received by the voyage data recorder for the same ship, and it is further verified by comparing the test results with those obtained from raw data. It is therefore recommended to use a sensor fusion algorithm that considers the characteristics of the sensors to improve the situational awareness accuracy of existing ship systems.
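The same-vessel decision described above — comparing motion-related components of time-aligned AIS and ARPA data — can be sketched as a simple gating test. The gate thresholds are illustrative assumptions, not the values used in the study.

```python
import math

def same_vessel(arpa, ais, pos_gate=100.0, spd_gate=2.0, hdg_gate=15.0):
    """Decide whether an ARPA track and a time-aligned AIS track refer
    to the same ship by gating the position [m], speed [m/s], and
    heading [deg] differences. All thresholds are illustrative."""
    dp = math.hypot(arpa["x"] - ais["x"], arpa["y"] - ais["y"])
    ds = abs(arpa["speed"] - ais["speed"])
    # Wrap the heading difference into [-180, 180) before gating.
    dh = abs((arpa["heading"] - ais["heading"] + 180.0) % 360.0 - 180.0)
    return dp < pos_gate and ds < spd_gate and dh < hdg_gate

arpa_track = {"x": 0.0, "y": 0.0, "speed": 5.0, "heading": 90.0}
ais_track = {"x": 30.0, "y": 40.0, "speed": 5.5, "heading": 95.0}
match = same_vessel(arpa_track, ais_track)
```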

선삭공정시 공구파손의 실시간 검출에 관한 연구 (A Study on Real-time Monitoring of Tool Fracture in Turning)

  • 최덕기;주종남;이장무
    • 한국정밀공학회지
    • /
    • Vol. 12, No. 3
    • /
    • pp.130-143
    • /
    • 1995
  • This paper presents a new methodology for on-line tool breakage detection by sensor fusion of an acoustic emission (AE) sensor and a built-in force sensor. A built-in piezoelectric force sensor, instead of a tool dynamometer, was used to measure the cutting force without altering the machine tool dynamics. The sensor was inserted in the tool turret housing of an NC lathe, and FEM analysis was carried out to locate the most sensitive position for the sensor. A burst of the AE signal was used as a trigger to inspect the cutting force, and a significant drop in the cutting force was used to detect tool breakage. The algorithm was implemented on a DSP board for in-process tool breakage detection. Experimental work showed the excellent monitoring capability of the proposed tool breakage detection system.

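The detection logic described above — an AE burst triggering an inspection of the cutting force for a significant drop — can be sketched as follows. The AE threshold and the drop ratio are illustrative assumptions; the paper's DSP implementation is not reproduced here.

```python
def detect_breakage(ae, force, ae_threshold, drop_ratio=0.5):
    """Scan synchronized AE and cutting-force samples. When the AE
    signal bursts above ae_threshold, compare the force right after
    the burst with the mean force before it; a fall below
    drop_ratio * mean is declared a tool breakage.
    Returns the sample index of the burst, or None."""
    for i, a in enumerate(ae):
        if a > ae_threshold and 0 < i < len(force) - 1:
            mean_before = sum(force[:i]) / i
            if force[i + 1] < drop_ratio * mean_before:
                return i  # suspected breakage at this sample
    return None

# An AE burst at sample 2 followed by the force collapsing to 20 N.
idx = detect_breakage([0.1, 0.2, 5.0, 0.1],
                      [100.0, 100.0, 100.0, 20.0],
                      ae_threshold=1.0)
```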

Improvement of Dynamic Respiration Monitoring Through Sensor Fusion of Accelerometer and Gyro-sensor

  • Yoon, Ja-Woong;Noh, Yeon-Sik;Kwon, Yi-Suk;Kim, Won-Ki;Yoon, Hyung-Ro
    • Journal of Electrical Engineering and Technology
    • /
    • Vol. 9, No. 1
    • /
    • pp.334-343
    • /
    • 2014
  • In this paper, we suggest a method that improves the fusion of an accelerometer and a gyro sensor by using a Kalman filter, producing a higher-quality respiration signal and compensating for the weakness of using a single accelerometer. To evaluate the proposed algorithm's performance, we developed a chest belt-type module and performed aerobic and muscular exercise experiments with 10 subjects. We compared the respiration signal derived from the accelerometer alone with that from our algorithm against the standard respiration signal from a piezoelectric sensor, in both the time and frequency domains, during the aerobic and muscular exercises. We also analyzed the time delay to verify the synchronization between the output and standard signals. Our algorithm improved the respiratory-rate detection accuracy by 4.6% and 9.54% for the treadmill and leg press, respectively, both of which are dynamic exercises, and showed a small average time delay of about 0.638 s, so real-time monitoring of the respiration signal is possible. In conclusion, the suggested algorithm can acquire a higher-quality respiration signal in a dynamic exercise environment, beyond a limited static environment, to provide safer and more effective exercise and improve exercise sustainability.
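The accelerometer/gyro fusion described above can be sketched as a one-dimensional Kalman filter in which the gyro rate drives the prediction and the accelerometer-derived angle serves as the measurement. The noise variances q and r are illustrative assumptions, not the paper's tuning.

```python
def kalman_fuse(accel_angle, gyro_rate, dt, q=0.01, r=0.1):
    """1-D Kalman filter fusing two sensors: the gyro rate is
    integrated in the predict step, and each accelerometer-derived
    angle corrects the estimate. q and r are illustrative process
    and measurement noise variances."""
    x, p = accel_angle[0], 1.0
    out = []
    for z, w in zip(accel_angle, gyro_rate):
        x, p = x + w * dt, p + q              # predict with the gyro
        k = p / (p + r)                       # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p   # correct with the accel
        out.append(x)
    return out

# A static pose: the fused estimate should stay at the accel angle.
fused = kalman_fuse([1.0] * 5, [0.0] * 5, dt=0.1)
```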

자율주행을 위한 센서 데이터 융합 기반의 맵 생성 (Map Building Based on Sensor Fusion for Autonomous Vehicle)

  • 강민성;허수정;박익현;박용완
    • 한국자동차공학회논문집
    • /
    • Vol. 22, No. 6
    • /
    • pp.14-22
    • /
    • 2014
  • An autonomous vehicle requires the technology to generate maps by recognizing the surrounding environment. The vehicle's environment can be recognized using distance information from a 2D laser scanner and color information from a camera, and such sensor information is used to generate 2D or 3D maps. A 2D map is used mostly for generating routes, because it contains information only about a cross-section. In contrast, a 3D map also includes height values and can therefore be used not only for generating routes but also for finding the space accessible to the vehicle. Nevertheless, an autonomous vehicle using 3D maps has difficulty recognizing the environment in real time. Accordingly, this paper proposes a technique for generating 2D maps that guarantees real-time recognition: it uses only the color information obtained by removing the height values from 3D maps generated by fusing 2D laser scanner and camera data.
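The proposed reduction — discarding the height values of a fused 3D map and keeping only the color per 2D cell — can be sketched as follows. The grid resolution (rounding coordinates to whole units) is an illustrative assumption.

```python
def project_to_2d(points):
    """Reduce fused (x, y, z, color) points from a 3D map to a 2D
    color map: the height z is discarded, and each (x, y) grid cell
    keeps the color of the last point that fell into it."""
    grid = {}
    for x, y, z, color in points:
        grid[(round(x), round(y))] = color  # z is dropped on purpose
    return grid

# Two fused points at different heights collapse into one 2D cell.
grid = project_to_2d([(1.0, 2.0, 0.5, "gray"),
                      (1.2, 2.1, 3.0, "red")])
```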

가스절연기기의 부분방전검출을 위한 SWNT-UHF 융합센서 (SWNT-UHF Fusion Sensor for GIS Partial Discharge Detection)

  • 이상욱;장용무;백승현;이종철
    • 한국전기전자재료학회:학술대회논문집
    • /
    • 한국전기전자재료학회 2010 Summer Conference Proceedings
    • /
    • pp.120-120
    • /
    • 2010
  • To detect partial discharge (PD) events, we have studied a fusion sensor combining a UHF sensor and a single-walled carbon nanotube (SWNT) gas sensor. The UHF sensor is commonly employed to detect partial discharges in GIS-like apparatus, whereas the SWNT gas sensor is a newly proposed way to detect them. In this study, we simultaneously monitored the changes in the electrical conductance of the SWNT sensors and the signal of the UHF sensor in response to PD events, using the IEC 60270 standard method as a reference for the partial discharge events.


스마트팩토리 실현을 위한 다중센서기반 모바일로봇의 위치 및 자세제어에 관한 연구 (A Study on Orientation and Position Control of a Mobile Robot Based on Multi-Sensor Fusion for Implementation of Smart FA)

  • 동근한;김희진;배호영;김상현;백영태;한성현
    • 한국산업융합학회 논문집
    • /
    • Vol. 22, No. 2
    • /
    • pp.209-218
    • /
    • 2019
  • This study proposes a new approach to controlling the orientation and position of a mobile robot system, based on obstacle avoidance through multi-sensor fusion and autonomous travelling control, for the implementation of smart factory automation. The focus is on controlling the mobile robot with a multiple-sensor module for autonomous travelling and obstacle avoidance; the module consists of sonar sensors, PSD sensors, color recognition sensors, and position recognition sensors. Two points are emphasized for the real-time implementation of autonomous travelling control in constrained manufacturing environments. One is the development of a travelling-trajectory control algorithm that is accurate and fast while considering constraints such as uncertain nonlinear dynamic effects. The other is the real-time implementation of obstacle avoidance and autonomous travelling control based on the multiple sensors. The reliability of this study is illustrated by computer simulations and experiments on autonomous travelling control and obstacle avoidance.

비동기 이종 센서를 이용한 데이터 융합기반 근거리 표적 추적기법 (Short Range Target Tracking Based on Data Fusion Method Using Asynchronous Dissimilar Sensors)

  • 이의혁
    • 전자공학회논문지
    • /
    • Vol. 49, No. 9
    • /
    • pp.335-343
    • /
    • 2012
  • This paper describes an algorithm that tracks a target approaching at short range by fusing radar and thermal-image observations. In general, tracking-fusion methods using a Kalman filter assume synchronized radar and thermal-image data, which imposes severe limitations when they are applied to real systems that operate asynchronously. The key point of the proposed algorithm is that, when observations from the two unsynchronized sensors arrive, the time difference between the measurements is compensated using the radar range information and the track state vector, after which the measurements are fused and tracking is performed. To evaluate the proposed algorithm, its performance is compared with existing track-based fusion and measurement-fusion methods.
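The time-compensation step described above — propagating the earlier observation to a common fusion time using the track state before fusing the measurements — can be sketched as follows. The weighted measurement fusion and its weight are illustrative assumptions, not the paper's filter.

```python
def align_measurement(meas_xy, meas_time, fusion_time, track_vel):
    """Propagate an observation taken at meas_time to the common
    fusion_time using the velocity from the current track state,
    compensating the time offset between asynchronous sensors."""
    dt = fusion_time - meas_time
    return (meas_xy[0] + track_vel[0] * dt,
            meas_xy[1] + track_vel[1] * dt)

def fuse_aligned(radar_xy, other_xy, w_radar=0.7):
    """Weighted fusion of the two time-aligned measurements;
    the weight is illustrative."""
    return tuple(w_radar * r + (1.0 - w_radar) * o
                 for r, o in zip(radar_xy, other_xy))

# A thermal-image fix taken 0.1 s before the radar fix is shifted
# forward along the track velocity, then fused.
aligned = align_measurement((10.0, 0.0), 0.9, 1.0, (5.0, 0.0))
fused = fuse_aligned((10.5, 0.0), aligned)
```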

다중센서 기반 차선정보 시공간 융합기법 (Lane Information Fusion Scheme using Multiple Lane Sensors)

  • 이수목;박기광;서승우
    • 전자공학회논문지
    • /
    • Vol. 52, No. 12
    • /
    • pp.142-149
    • /
    • 2015
  • A lane detection system based on a single camera sensor is vulnerable to abrupt illumination changes and adverse weather. Sensor fusion is one way to overcome the limits of such single-sensor systems and stabilize performance. However, most existing sensor-fusion studies are limited to fusion models for objects and vehicles, which are hard to adopt directly, or they do not consider the differing signal periods and recognition ranges of lane sensors. This study therefore proposes a scheme that optimally fuses lane information while accounting for the differences among multiple sensors. The proposed fusion framework considers each sensor's variable signal-processing period and reliable recognition range, so precise fusion is possible with various combinations of lane sensors. In addition, a new lane prediction model precisely predicts fine-grained lane information from intermittently arriving lane data, synchronizing the multi-rate signals. Experiments in poor illumination environments and a quantitative evaluation verify that the proposed fusion system improves recognition performance over an existing single sensor.
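The lane prediction model described above — propagating intermittently arriving lane information to a common time so that multi-rate sensors can be compared — can be sketched with a quadratic lane model y = c0 + c1·x + c2·x², shifted by the distance the vehicle has travelled since the sensor's last update. Both the model and its propagation are illustrative assumptions, not the paper's predictor.

```python
def predict_lane(coeffs, dx):
    """Shift the coefficients (c0, c1, c2) of the lane polynomial
    y = c0 + c1*x + c2*x**2 forward by the distance dx the vehicle
    has travelled, so lanes reported at different rates can all be
    expressed at one common time before fusion.

    Substituting x -> x + dx gives:
      c0' = c0 + c1*dx + c2*dx**2
      c1' = c1 + 2*c2*dx
      c2' = c2
    """
    c0, c1, c2 = coeffs
    return (c0 + c1 * dx + c2 * dx * dx,
            c1 + 2.0 * c2 * dx,
            c2)

# A lane last reported 2 m ago, propagated to the current position.
current = predict_lane((1.0, 0.5, 0.1), 2.0)
```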