Calibration of VLP-16 Lidar Sensor and Vision Cameras Using the Center Coordinates of a Spherical Object

  • Received : 2018.08.20
  • Accepted : 2018.12.25
  • Published : 2019.02.28

Abstract

360-degree 3-dimensional lidar sensors and vision cameras are widely used in the development of autonomous driving technology for automobiles, drones, and other platforms. However, existing techniques for calibrating the external transformation between the lidar and camera sensors have the disadvantage of requiring either specially fabricated calibration objects or objects of impractically large size. In this paper, we introduce a simple method for calibrating the two sensors using a single spherical object. We calculate the sphere center coordinates from four 3-D points selected by RANSAC from the range data of the sphere, and detect the 2-dimensional coordinates of the object center in the camera image to calibrate the two sensors. Because the image of a spherical object always maintains a circular shape regardless of the viewing angle, data acquisition is straightforward. The proposed method yields a reprojection error of about 2 pixels, and its performance is analyzed by comparison with existing methods.

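As the abstract notes, the sphere center is computed from four 3-D range points chosen by RANSAC; the paper's exact procedure is given in Table 1. The following is only an illustrative sketch of the four-point step, using the standard fact that subtracting the sphere equation at one point from the equations at the other three yields a linear system in the center (the function names are ours, not the paper's):

```python
def sphere_center(p0, p1, p2, p3):
    """Center of the sphere passing through four non-coplanar 3D points.

    Subtracting the sphere equation |c - p0|^2 = r^2 from the equations
    at p1..p3 cancels r^2 and |c|^2, leaving a 3x3 linear system
    (pi - p0) . c = (|pi|^2 - |p0|^2) / 2, solved here with Cramer's rule
    in pure Python.
    """
    def sq(p):
        return p[0] * p[0] + p[1] * p[1] + p[2] * p[2]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    a = [[p[i] - p0[i] for i in range(3)] for p in (p1, p2, p3)]
    b = [(sq(p) - sq(p0)) / 2.0 for p in (p1, p2, p3)]
    d = det3(a)
    if abs(d) < 1e-12:
        raise ValueError("the four points are (nearly) coplanar")
    center = []
    for col in range(3):
        m = [row[:] for row in a]
        for r in range(3):
            m[r][col] = b[r]
        center.append(det3(m) / d)
    return tuple(center)
```

In a RANSAC loop, one would repeatedly sample four range points, compute the candidate center with a routine like this, and keep the hypothesis whose sphere (candidate center plus the known ball radius) has the most inliers among the remaining range points.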
Fig. 1. A Lidar-Camera System for 3D Map Generation (a) System View (b) Transformation between Lidar and 6 Cameras

Fig. 2. 3D Transformation Relationship between a Lidar Sensor and a Camera Defined by Three 3D Points

Fig. 3. Acquisition of Lidar and Camera Data. A Ball is Moved in Front of Both Sensors, and 3D Range and 2D Image Data are Acquired at Several Positions

Fig. 4. A Method of Detecting the Center Point of a Sphere Using Any Four Points on the Sphere Surface

Fig. 5. Examples of Detecting the Center of a Sphere in a 2D Image (a) Original Image (b) Spherical Object Color Detection (Adaptive Threshold) (c) Spherical Object Detection (Ellipse Fitting)

Fig. 6. (a) Velodyne 16-Channel Lidar Sensor and 6 Vision Cameras (b) Calibration Ball

Fig. 7. Data Acquisition of a Spherical Ball at a Distance of (a) 1 m (b) 2 m (c) 3 m

Fig. 8. A Planar Object for Comparison with a Conventional Calibration Method

Fig. 9. Data Acquisition of a Circular Planar Object at a Distance of (a) 1 m (b) 2 m (c) 3 m

Fig. 10. Depth Data Obtained from the Lidar Sensor (a) Ball Object (Hand-Held by a Person) (b) Planar Object

Fig. 11. Examples of the 3D Point Clouds of the Ball Surface

Fig. 12. Result of the Projection (a) Depth Data of the Planar Object (b) Projection Result on a Plane

Fig. 13. The Process of Finding the Center Point of a Planar Object (a) the Edge Points (b) a Center Axis (c) the Points Projected on a Plane (d) the Detected Center Point

Fig. 14. Reprojection Error of the 1st Camera at Different Calibration Distances. The Blue Bar is the Average Error and the Red Line is the Standard Deviation

Fig. 15. The Result of Modeling a Building and a Basketball Court Using a Lidar Sensor and Six Cameras (a) Indoor Data (b) Outdoor Data

Table 1. Pseudocode for Calculating the Ball Center

Table 2. Number of Positions Where the Calibration Objects are Placed for 3D and 2D Data Acquisition

Table 3. Reprojection Error of Each Camera View

Table 4. Comparison of Reprojection Error with Conventional Methods (in Pixels)
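The reprojection errors reported in Table 3, Table 4, and Fig. 14 measure how far the detected 2D ball centers fall from the 3D centers projected through the estimated calibration. A minimal pinhole-model sketch of that measurement follows; the intrinsic matrix K, rotation R, and translation t used below are hypothetical placeholders, not the paper's calibrated values:

```python
import math

def project(point, K, R, t):
    """Project a 3D point into pixel coordinates with a pinhole model."""
    # Transform from the lidar (world) frame into the camera frame.
    pc = [sum(R[i][j] * point[j] for j in range(3)) + t[i] for i in range(3)]
    # Perspective division followed by the intrinsic mapping.
    u = K[0][0] * pc[0] / pc[2] + K[0][2]
    v = K[1][1] * pc[1] / pc[2] + K[1][2]
    return (u, v)

def reprojection_error(pixels, points, K, R, t):
    """Mean Euclidean distance between detected and projected 2D centers."""
    errs = [math.hypot(u - pu, v - pv)
            for (u, v), p in zip(pixels, points)
            for (pu, pv) in [project(p, K, R, t)]]
    return sum(errs) / len(errs)
```

With the detected sphere centers as `pixels` and the lidar-derived centers as `points`, an average of this distance over all calibration positions gives the per-camera figures of the kind reported in Table 3.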
