Estimation of Urban Traffic State Using Black Box Camera

  • Haechan Cho (Dept. of Civil and Environmental Engineering, Korea Advanced Institute of Science and Technology) ;
  • Yeohwan Yoon (Dept. of Highway & Transportation Research, Korea Institute of Civil Engineering and Building Technology) ;
  • Hwasoo Yeo (Dept. of Civil and Environmental Engineering, Korea Advanced Institute of Science and Technology)
  • Received : 2022.11.30
  • Accepted : 2023.02.15
  • Published : 2023.04.30

Abstract

Traffic state information for urban areas is essential for effective traffic operation and traffic control. However, installing traffic sensors on numerous road sections is extremely expensive. Accordingly, estimating the traffic state using vehicle-mounted black box cameras, a sensor with a high market penetration rate, is a more effective solution. However, previously proposed methods based on object tracking or optical flow have high computational costs and require consecutive frames to obtain traffic states. We therefore propose a method that detects vehicles and lanes with object detection networks and sets the region between the detected lanes as a region of interest to estimate the traffic density of the corresponding area. The proposed method uses only computationally inexpensive object detection models and can estimate traffic states from sampled frames rather than consecutive frames, so the estimation can be matched to the available computing resources. In addition, the traffic density estimation accuracy exceeded 90% on black box videos collected from two bus routes with different characteristics.
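The density estimation step described in the abstract can be illustrated with a short sketch. The following Python example is a minimal illustration, not the authors' implementation: it assumes vehicle bounding boxes from an object detector (e.g., a YOLOv5-style model) and lane boundaries from a lane detection model are already available, and the function names, the bottom-centre heuristic, and the ROI length parameter are illustrative assumptions.

```python
# Minimal illustrative sketch (not the paper's implementation):
# count vehicles whose detections fall inside the region of interest
# between two detected lane boundaries, then convert the count to a density.

from typing import List, Tuple

Box = Tuple[float, float, float, float]    # vehicle bounding box (x1, y1, x2, y2) in pixels
Lane = List[Tuple[float, float]]           # lane boundary as a polyline of (x, y) points


def build_roi_polygon(left_lane: Lane, right_lane: Lane) -> List[Tuple[float, float]]:
    """Region of interest: the polygon enclosed by the left and right lane boundaries."""
    return left_lane + right_lane[::-1]


def point_in_polygon(point: Tuple[float, float], polygon: List[Tuple[float, float]]) -> bool:
    """Standard ray-casting point-in-polygon test."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def count_vehicles_in_roi(vehicle_boxes: List[Box], roi: List[Tuple[float, float]]) -> int:
    """Count vehicles whose bottom-centre point (approximate road contact point) lies inside the ROI."""
    count = 0
    for x1, y1, x2, y2 in vehicle_boxes:
        bottom_centre = ((x1 + x2) / 2.0, y2)
        if point_in_polygon(bottom_centre, roi):
            count += 1
    return count


def estimate_density(vehicle_boxes: List[Box], left_lane: Lane, right_lane: Lane,
                     roi_length_km: float) -> float:
    """Traffic density (vehicles/km) for one sampled frame; roi_length_km is an assumed, pre-measured value."""
    roi = build_roi_polygon(left_lane, right_lane)
    return count_vehicles_in_roi(vehicle_boxes, roi) / roi_length_km
```

Because every quantity here is computed from a single frame, the density can be obtained from sampled frames at whatever rate the available computing resources allow, which is the advantage the abstract emphasizes over tracking- and optical-flow-based approaches.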

Acknowledgement

This research was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (Ministry of Science and ICT) in 2022 (2022-0-00784, Development of intelligent road and traffic situation detection technology using mobile infrastructure video sensors). This research was also supported by the Korea Institute of Civil Engineering and Building Technology in 2022 (Research on a mobile video-based intelligent road situation analysis system).
