3D Reconstruction of Structure Fusion-Based on UAS and Terrestrial LiDAR

  • Han, Seung-Hee (Urban & Transportation Engineering Major, Dept. of Civil & Environmental Engineering, College of Engineering, Kongju National University) ;
  • Kang, Joon-Oh (Dept. of Urban Construction Engineering, College of Urban Science, Incheon National University) ;
  • Oh, Sung-Jong (Dept. of Urban Construction Engineering, College of Urban Science, Incheon National University) ;
  • Lee, Yong-Chang (Dept. of Urban Engineering, College of Urban Science, Incheon National University)
  • Received : 2018.11.30
  • Accepted : 2018.12.30
  • Published : 2018.12.30

Abstract

A digital twin is a technology that creates a virtual replica of a real-world object on a computer, analyzes past and present operational status by fusing the structure, context, and operation of various physical systems with attribute information, and predicts future conditions to support countermeasures. 3D reconstruction technologies (UAS, LiDAR, GNSS, etc.) are core technologies for digital twins, so related research and industrial applications have been active in recent years. However, both UAS (Unmanned Aerial System) photogrammetry and LiDAR (Light Detection And Ranging) leave blind spots, i.e., areas that are not reconstructed, depending on the shape of the object, and these must be compensated. Terrestrial LiDAR can acquire the point cloud of an object precisely and quickly at short range, but blind spots occur on the upper parts of the object, which limits complete digital twin modeling. A UAS can model a specific range of objects with high accuracy using high-resolution imagery at low altitude, and it has the advantage of generating a dense point cloud based on SfM (Structure-from-Motion) image analysis. However, the UAS operates relatively far from the target compared with terrestrial LiDAR, image analysis is time-consuming, accuracy on the side faces of the object is degraded, and its own blind spots must be compensated. In this study, the UAS and terrestrial LiDAR point clouds were fused and then re-optimized, so that the residual errors of each modeling method were compensated and a mutually corrected result was obtained. The accuracy of the fusion-based 3D model was better than 1 cm, and it is expected to be useful for digital twin construction.
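The fusion step described above requires registering the UAS point cloud into the terrestrial LiDAR coordinate frame before joint re-optimization, and the reported sub-centimeter accuracy is an RMSE computed at surveyed checkpoints. The sketch below is a minimal, generic illustration of those two ideas, not the authors' actual workflow (which used commercial SfM and scanner software): a rigid-body alignment of shared tie points via the standard Kabsch/SVD solution, followed by per-axis and 3D RMSE computation of the kind compared in Tables 5 and 6. All point values and names here are synthetic.

```python
import numpy as np

def kabsch_align(source, target):
    """Least-squares rigid transform (R, t) mapping `source` points onto
    `target` points (Kabsch/Procrustes solution via SVD)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                  # guard against a reflection
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

def checkpoint_rmse(model_pts, survey_pts):
    """Per-axis RMSE and full 3D RMSE between modelled and surveyed
    checkpoint coordinates."""
    diff = model_pts - survey_pts
    per_axis = np.sqrt((diff ** 2).mean(axis=0))
    rmse_3d = np.sqrt((diff ** 2).sum(axis=1).mean())
    return per_axis, rmse_3d

# Hypothetical tie points shared by the two clouds (noise-free, in metres).
rng = np.random.default_rng(0)
uas_pts = rng.uniform(0.0, 10.0, (6, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])           # 90 deg rotation about z
t_true = np.array([0.20, -0.10, 0.05])
lidar_pts = uas_pts @ R_true.T + t_true          # "scanner frame" copy

R, t = kabsch_align(uas_pts, lidar_pts)
aligned = uas_pts @ R.T + t                      # UAS cloud in LiDAR frame
rmse_xyz, rmse_3d = checkpoint_rmse(aligned, lidar_pts)
# rmse_3d is essentially zero here because the example is noise-free;
# with real clouds the residual reflects the combined modeling error.
```

In practice this closed-form alignment would only initialize a registration; a fine registration (e.g., ICP over the dense clouds) and a joint re-optimization would follow, which is where the mutual correction of the two sensors' residual errors comes from.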

Figure 1. Blind Spot of UAS and Terrestrial LiDAR

Figure 2. Study Flow Chart

Figure 3. Conjugate Condition

Figure 4. Analysis of SfM Image by Close Range Photogrammetry

Figure 5. Measuring Points Coordinates by a Laser Scanner

Figure 6. Terrestrial LiDAR Scanner (Trimble SX10)

Figure 7. UAV Platform Sensor (Phantom 4 Pro)

Figure 8. Network-RTK GNSS

Figure 9. Ground Control Point (GCP) and Checkpoint on Daum Map (○ : GCP, □ : Checkpoint)

Figure 10. Point Clouds Based on UAS

Figure 11. Test Object and Terrestrial LiDAR Install Point on Daum Map

Figure 12. Point Clouds Based on Terrestrial LiDAR

Figure 13. 3D Model Based on UAS+LiDAR

Figure 14. Blind Spot Real Photo

Figure 15. UAS Blind Spot and Complemented Blind Spot in UAS+LiDAR

Table 1. Trimble SX10 Specification

Table 2. Phantom 4 Pro Specification

Table 3. R8 GNSS Specification

Table 4. GCP and Checkpoint

Table 5. Checkpoint Accuracy Comparison in 3D Model Based on UAS

Table 6. Checkpoint Accuracy Comparison in 3D Model Based on UAS and Terrestrial LiDAR

References

  1. Boon, M., Greenfield, R. and Tesfamichael, S. (2016), "Unmanned Aerial Vehicle (UAV) Photogrammetry Produces Accurate High-resolution Orthophotos, Point Clouds and Surface Models for Mapping Wetlands", South African Journal of Geomatics, Vol. 5, No. 2, pp. 186-200. https://doi.org/10.4314/sajg.v5i2.7
  2. Cho, Hyung-Sik, Son, Hong-Kyu, Park, Hyo-Geun, Lee, Bin and Park, Je-Seong (2013), "Accuracy Analysis of Terrestrial LiDAR Data According to Registration Method", Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography Spring Conference, pp. 389-390.
  3. Han, Seung-Hee (2014), "Development and Evaluation of a Low-Cost, Small, Autonomous-Navigation UAS for Acquiring Topographic Information", Journal of the Korean Society of Civil Engineers, Vol. 34, No. 4, pp. 1343-1351. https://doi.org/10.12652/Ksce.2014.34.4.1343
  4. Han, Seung-Hee (2017), "Accuracy Comparison of Image-Based Processing and GCP Application in Orthoimage Production Using a Drone", Proceedings of the Korean Society for Geospatial Information Science Spring Conference, pp. 93-94.
  5. Hartley, R. I. (1997), "Self-calibration of Stationary Cameras", International Journal of Computer Vision, Vol. 22, No. 1, pp. 5-23. https://doi.org/10.1023/A:1007957826135
  6. Kang, Joon-Oh and Lee, Yong-Chang (2016), "A Preliminary Study for Drone-Based Exterior Safety Inspection of Bridges", Proceedings of the Korean Society for Geospatial Information Science Conference, pp. 207-210.
  7. Kang, Joon-Oh, Kim, Dal-Joo, Han, Woong-Ji and Lee, Yong-Chang (2018), "3D Model Implementation of a Sedimentary Rock Wall Based on Terrestrial LiDAR and UAS", Proceedings of the 2018 Joint Fall Conference, pp. 339-340.
  8. Kim, Dal-Joo and Lee, Yong-Chang (2017), "Combined Analysis of Three RGB Images for 3D Modeling of Stone Structures", Proceedings of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography Conference, pp. 239-241.
  9. Lee, Yong-Chang (2015), "Positional Accuracy Evaluation of High-Density Point Data Based on Rotary-Wing UAS Imagery", Journal of the Korean Society for Geospatial Information Science, Vol. 23, No. 2, pp. 39-48. https://doi.org/10.7319/kogsis.2015.23.2.039
  10. Lee, Yong-Chang (2016), "UAS Image-Based Spatial Analysis", Incheon National University, pp. 141-302.
  11. National Geographic Information Institute (2018), Notice on the Enactment of Work Guidelines for Public Surveying Using Unmanned Aerial Vehicles, pp. 1-13.

Cited by

  1. A Study on the 3D Reconstruction and Historical Verification of a Reclining Stone Buddha Based on UAS, CRP, and Terrestrial LiDAR Fusion, vol.51, pp.1, 2018, https://doi.org/10.22640/lxsiri.2021.51.1.111