
A Fusion Sensor System for Efficient Road Surface Monitoring on UGV


  • 유성환 (Department of Electrical and Computer Engineering, Inha University) ;
  • 김서연 (Institute for Human-Centered Computing, Inha University) ;
  • 신지우 (Department of Electrical and Computer Engineering, Inha University) ;
  • 김태식 (Department of Civil and Environmental Engineering, Hongik University) ;
  • 정진만 (Department of Computer Engineering, Inha University)
  • Received : 2024.03.08
  • Accepted : 2024.04.03
  • Published : 2024.03.29

Abstract

Road surface monitoring is essential for maintaining the safety of the road environment by managing risk factors such as rutting and cracks. Autonomous-driving-based UGVs equipped with high-performance 2D laser sensors enable more precise measurements, but the increased energy consumption of these sensors is constrained by limited battery capacity. In this paper, we propose a fusion sensor system for efficient road surface monitoring on UGVs. The proposed system combines color information from cameras with depth information from line laser sensors to detect surface displacement accurately. Furthermore, a dynamic sampling algorithm controls the scanning frequency of the line laser sensor according to whether the camera detects a monitoring target, reducing unnecessary energy consumption. We present a power consumption model for the fusion sensor system and analyze its energy efficiency considering various crack distributions and sensor characteristics across mission environments. Performance analysis demonstrates that, when the active-state power consumption of the line laser sensor is set to twice that of the saving state, the proposed approach improves power consumption efficiency by 13.3% over fixed sampling under the condition λ=10, µ=10.

Road surface monitoring is an essential process for maintaining the safety of the road environment by managing risk factors such as surface rutting and crack detection. Precise measurement is possible with autonomous-driving-based UGVs carrying high-performance 2D laser sensors, but the increased energy consumption of such sensors is limited by battery capacity. In this paper, we propose a fusion sensor system for efficient road surface monitoring on a UGV. The proposed fusion sensor system combines color information from a camera with depth information from a line laser sensor to enable precise displacement detection for road surface monitoring. It also reduces unnecessary energy consumption by applying a dynamic sampling algorithm that dynamically controls the scanning frequency of the line laser sensor according to whether a monitoring target is detected by the camera sensor. We present an average power consumption model for the proposed fusion sensor system and analyze its energy efficiency considering the crack distributions and sensor characteristics of various mission environments. Performance analysis confirms that, when the active-state power consumption of the line laser sensor is twice that of the saving state and λ=10, µ=10, power consumption efficiency improves by 13.3% compared to the fixed sampling scheme.
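The abstract describes two ideas that lend themselves to a compact sketch: a camera-gated duty-cycling rule for the line laser and an average-power estimate parameterized by λ and µ. The following Python sketch is illustrative only: it assumes a simple two-state model with exponential sojourn times, the class and function names are hypothetical, and the paper's own model accounts for additional factors (e.g., camera power and mission parameters) that this sketch omits, so its output is not expected to reproduce the reported 13.3% figure.

```python
# Illustrative sketch only (not the authors' implementation): camera-triggered
# duty cycling of a line laser sensor plus a two-state average-power estimate.
# All names, rates, and power figures below are hypothetical.

from dataclasses import dataclass


@dataclass
class LaserPowerModel:
    p_active: float  # power draw while scanning at the full frequency (W)
    p_saving: float  # power draw in the reduced-frequency saving state (W)


def select_scan_state(crack_detected_by_camera: bool) -> str:
    """Dynamic sampling rule: run the line laser at the full scan frequency
    only while the camera reports a monitoring target (e.g., a crack) in view."""
    return "active" if crack_detected_by_camera else "saving"


def average_power(model: LaserPowerModel, lam: float, mu: float) -> float:
    """Long-run average laser power if crack regions are entered with rate `lam`
    and left with rate `mu` (exponential sojourn times assumed), so the sensor
    spends lam / (lam + mu) of the mission in the active state."""
    frac_active = lam / (lam + mu)
    return frac_active * model.p_active + (1.0 - frac_active) * model.p_saving


if __name__ == "__main__":
    # Active-state draw set to twice the saving-state draw, as in the paper's scenario.
    model = LaserPowerModel(p_active=2.0, p_saving=1.0)
    p_dynamic = average_power(model, lam=10.0, mu=10.0)  # lambda = mu = 10
    p_fixed = model.p_active  # fixed sampling keeps the laser always active
    print(f"dynamic: {p_dynamic:.2f} W, fixed: {p_fixed:.2f} W, "
          f"reduction: {100.0 * (p_fixed - p_dynamic) / p_fixed:.1f}%")
```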


Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. RS-2023-00252501).
