Automatic Extraction Method of Control Point Based on Geospatial Web Service

  • Received : 2014.04.08
  • Accepted : 2014.06.10
  • Published : 2014.06.30

Abstract

This paper proposes a method for automatically extracting control points based on a Geospatial Web Service. The proposed method consists of three steps: 1) the first step acquires reference data covering the footprint of the target image through the Geospatial Web Service; 2) the second step finds candidate control points in the reference data and the target image with the SURF algorithm; 3) the final step applies the RANSAC algorithm to the candidate control points and keeps the correctly matched points as the final control points. Because the reference imagery and elevation data are managed and acquired through a Geospatial Web Service that follows the OGC standards, the proposed method is convenient to operate and readily extensible. The method was tested on SPOT-1, SPOT-5, and IKONOS satellite images, with military standard data used as the reference imagery and elevation data. It produced a consistent accuracy below an RMSE of 5 pixels regardless of the sensor, acquisition date, and resolution of the target image, and the results indicate that the accuracy can improve further as the resolution of the target image improves. The use of military standard reference data also demonstrates the potential of the proposed method for military applications.
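
The three steps described above can be sketched with common open-source tools. The following Python snippet is a minimal illustration of the idea, not the authors' implementation: it requests a reference image chip for the target footprint through an OGC WMS GetMap call (the server URL and layer name are hypothetical placeholders; a GeoServer instance publishing the reference imagery would fill this role), finds candidate control points with SURF (available in OpenCV's non-free xfeatures2d module from opencv-contrib), and keeps only the RANSAC-consistent matches as final control points. The elevation data that the paper retrieves through WCS is omitted for brevity.

```python
# Minimal sketch of the three-step pipeline, under the assumptions stated above.
import io

import cv2
import numpy as np
import requests
from PIL import Image

WMS_URL = "http://example.com/geoserver/wms"   # hypothetical WMS endpoint
REFERENCE_LAYER = "reference:imagery"          # hypothetical reference-image layer


def fetch_reference_image(bbox, width, height):
    """Step 1: acquire reference data covering the target footprint via OGC WMS GetMap."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": REFERENCE_LAYER,
        "crs": "EPSG:4326",
        # WMS 1.3.0 with EPSG:4326 uses lat/lon axis order: min_lat, min_lon, max_lat, max_lon
        "bbox": ",".join(map(str, bbox)),
        "width": width,
        "height": height,
        "format": "image/png",
    }
    resp = requests.get(WMS_URL, params=params, timeout=60)
    resp.raise_for_status()
    return np.array(Image.open(io.BytesIO(resp.content)).convert("L"))


def extract_control_points(reference_img, target_img):
    """Steps 2-3: SURF candidate matches filtered to final control points by RANSAC."""
    # SURF lives in the non-free xfeatures2d module (opencv-contrib, non-free build).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_ref, desc_ref = surf.detectAndCompute(reference_img, None)
    kp_tgt, desc_tgt = surf.detectAndCompute(target_img, None)

    # Candidate control points: nearest-neighbour matches passing Lowe's ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    candidates = [m for m, n in matcher.knnMatch(desc_tgt, desc_ref, k=2)
                  if m.distance < 0.7 * n.distance]

    # RANSAC keeps only matches consistent with a single projective transform.
    src = np.float32([kp_tgt[m.queryIdx].pt for m in candidates]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in candidates]).reshape(-1, 1, 2)
    _, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    inliers = [c for c, keep in zip(candidates, inlier_mask.ravel()) if keep]
    # Final control points: (target image coordinate, reference image coordinate) pairs.
    return [(kp_tgt[m.queryIdx].pt, kp_ref[m.trainIdx].pt) for m in inliers]
```

For the matching to work in practice, the reference chip should be requested at a pixel size comparable to the target image's ground sample distance so that the scale ranges examined by SURF overlap.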

References

  1. Baillarin, S., Bouillon, A., Bernard, M. and Chikhi, M., 2005, Using a three dimensional spatial database to orthorectify automatically remote sensing images, In Proceedings of the ISPRS Hangzhou 2005 Workshop, ISPRS, pp. 89-94.
  2. Bay, H., Tuytelaars, T. and Gool, L. V., 2006, SURF: speeded-up robust features, Computer Vision - ECCV 2006, Lecture Notes in Computer Science, Springer, Vol. 3951, pp. 404-417.
  3. Bouchiha, R. and Besbes, K., 2013, Automatic remote-sensing image registration using SURF, International Journal of Computer Theory and Engineering, IACSIT, Vol. 5, No. 1, pp. 88-92.
  4. Chang, Y. S., Oh, J. H. and Kim, K. O., 2007, The trend of geospatial web technologies, Electronics and Telecommunications Trends, ETRI, Vol. 22, No. 3, pp. 124-135.
  5. Choi, S. Y. and Sin, D. S., 2003, Modeling of SPOT-5 HRG stereo pair, Agency for Defense Development Report, ADD, pp. 26-27.
  6. Fischler, M. A. and Bolles, R. C., 1981, Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography, Communications of the ACM, ACM, Vol. 24, No. 6, pp. 381-395. https://doi.org/10.1145/358669.358692
  7. Gianinetto, M. and Scaioni, M., 2008, Automated geometric correction of high-resolution pushbroom satellite data, Photogrammetric Engineering & Remote Sensing, ASPRS, Vol. 74, No. 1, pp. 107-116. https://doi.org/10.14358/PERS.74.1.107
  8. Guo, H., Cheng, C. and Yang, Y., 2010, An automated registration of RS image based on SURF and piecewise linear transformation, Conference on Environmental Science and Information Application Technology, ESIAT, pp. 133-136.
  9. Han, Y. K., Kim, Y. M., Byun, Y. G., Choi, J. W., Han, D. Y. and Kim, Y. I., 2011, Automatic registration of high-resolution images in urban areas using local properties of features, Milwaukee 2011 ASPRS Annual Conference Proceedings, ASPRS.
  10. Juan, L. and Gwun, O., 2009, A comparison of SIFT, PCA-SIFT and SURF, International Journal of Image Processing, CSC Journals, Vol. 3, Issue. 4, pp. 143-152.
  11. Kang, M. H., Bang, S. N. and Lee, Y. W., 2003, Automatic measuring of GCP's image coordinates using control point patch and auxiliary points matching, Journal of the Korean Society for Geospatial Information System, KOGSIS, Vol. 11, No. 2, pp. 29-37.
  12. Kim, T. J. and Im, Y. J., 2003, Automatic satellite image registration by combination of matching and random sample consensus, IEEE Transactions on Geoscience and Remote Sensing, IEEE, Vol. 41, No. 5, pp. 1111-1117. https://doi.org/10.1109/TGRS.2003.811994
  13. Lowe, D. G., 2004, Distinctive image features from scale-invariant keypoints, the International Journal of Computer Vision, Kluwer Academic Publishers, Vol. 60, No. 2, pp. 91-110. https://doi.org/10.1023/B:VISI.0000029664.99615.94
  14. Military specification, 1995, MIL-C-89041, Controlled image base (CIB), United States Department of Defense.
  15. Military specification, 2000, MIL-PRF-89020B, Performance specification digital terrain elevation data (DTED), United States Department of Defense.
  16. Open Geospatial Consortium, 2006, OpenGIS web map server implementation specification, OGC.
  17. Open Geospatial Consortium, 2008, Web coverage service(WCS) implementation standard, OGC.
  18. Open Source Project, 2009, GeoServer, Core Contributors - Boundless and GeoSolutions, http://www.geoserver.org/
  19. Shin, D. S., 1993, Analysis of the triplet satellite image, Agency for Defense Development Report, ADD, pp. 30-31.
  20. Telecommunications Technology Association, 2009, TTAK.OT-10.0253, Web map service Ver.1.3, Telecommunications Technology Association.
  21. Yi, Z., Zhiguo, C. and Yang, X., 2008, Multi-spectral remote image registration based on SIFT, Electronics Letters, the Institution of Engineering and Technology, Vol. 44, No. 2, pp. 107-108.
  22. Zitova, B. and Flusser, J., 2003, Image registration methods: a survey, Image and Vision Computing, Elsevier, Vol. 21, No. 11, pp. 977-1000.

Cited by

  1. Matching Points Extraction Between Optical and TIR Images by Using SURF and Local Phase Correlation vol.23, pp.1, 2015, https://doi.org/10.7319/kogsis.2015.23.1.081
  2. Positional Accuracy Analysis of the Precision Image Generation System for Land Observation Satellites vol.36, pp.5, 2014, https://doi.org/10.7780/kjrs.2020.36.5.2.4