• Title/Summary/Keyword: 첨단융합건설연구단 (Center for Technology Fusion in Construction)


The Role of the Center for Technology Fusion in Construction (첨단융합건설연구단의 역할)

  • Kim, Hyoung-Kwan; Han, Seung-Heon; Kim, Moon-Kyum
    • Proceedings of the Korean Institute of Construction Engineering and Management / 2006.11a / pp.229-232 / 2006
  • The Center for Technology Fusion in Construction was established on Sep. 30, 2005, with the support of the Korea Ministry of Construction and Transportation and the Korea Institute of Construction and Transportation Technology Evaluation and Planning. It aims to develop next-generation economic growth engines through the fusion of traditional construction technology and cutting-edge emerging technologies. To achieve this vision, the center seeks to establish a system for systematic construction research based on the fusion approach. The scope of the center focuses on improving the performance of construction projects across planning, design, construction, and maintenance. Together with the newly developed Korea Construction Technology Road Map, the center is expected to contribute significantly to the development of innovative construction technologies for a world-class Korean society.

3-D Building Reconstruction from Standard IKONOS Stereo Products in Dense Urban Areas (IKONOS 컬러 입체영상을 이용한 대규모 도심지역의 3차원 건물복원)

  • Lee, Suk Kun; Park, Chung Hwan
    • KSCE Journal of Civil and Environmental Engineering Research / v.26 no.3D / pp.535-540 / 2006
  • This paper presents an effective strategy for extracting buildings and reconstructing them in 3-D from high-resolution multispectral stereo satellite images. The proposed scheme consists of three major steps: building enhancement and segmentation using both the BDT (Background Discriminant Transformation) and the ISODATA algorithm, conjugate building identification using object matching with the Hausdorff distance and color indexing, and 3-D building reconstruction using photogrammetric techniques. IKONOS multispectral stereo images were used to evaluate the scheme. As a result, the BDT technique was verified as an effective tool for enhancing building areas, since it suppresses the dominance of the background and thereby emphasizes buildings as non-background objects. In building recognition, color information alone was not enough to identify conjugate building pairs, since most buildings are composed of similar materials such as concrete. When the Hausdorff distance for edge information and color indexing for color information were combined, most segmented buildings in the stereo images were correctly identified. Finally, 3-D building models were successfully generated using space intersection with the forward RFM (Rational Function Model).
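The abstract above pairs edge-based matching via the Hausdorff distance with color indexing to identify conjugate buildings across the stereo pair. As a rough illustration only, here is a minimal Python sketch of a Hausdorff-style distance between two edge point sets; the exact variant and preprocessing used by the authors are not specified in the abstract, so the mean-of-minimum-distances form (commonly called the modified Hausdorff distance) is assumed.

```python
import numpy as np

def modified_hausdorff(edges_a: np.ndarray, edges_b: np.ndarray) -> float:
    """Hausdorff-style distance between two edge point sets.

    edges_a, edges_b: (N, 2) and (M, 2) arrays of (row, col) edge pixels.
    Uses the mean-of-minimum-distances variant; the specific modification
    applied in the paper is an assumption here, not taken from the abstract.
    """
    # Pairwise Euclidean distances between every point in A and every point in B.
    diff = edges_a[:, None, :].astype(float) - edges_b[None, :, :].astype(float)
    dists = np.linalg.norm(diff, axis=2)
    # Directed distances: average of each point's nearest neighbour in the other set.
    d_ab = dists.min(axis=1).mean()
    d_ba = dists.min(axis=0).mean()
    # Symmetric measure: the larger of the two directed distances.
    return float(max(d_ab, d_ba))
```

A smaller distance between the edge sets of two segmented buildings suggests they are more likely to be the same building seen in the left and right images; in the papers listed here this edge score is combined with a color score rather than used on its own.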

Building Detection Using Edge and Color Information of Color Imagery (컬러영상의 경계정보와 색상정보를 활용한 동일건물인식)

  • Park, Choung Hwan; Sohn, Hong Gyoo
    • KSCE Journal of Civil and Environmental Engineering Research / v.26 no.3D / pp.519-525 / 2006
  • Traditional area-based matching, as well as efficient matching methods that confine the search space using epipolar geometry and height restrictions of stereo images, still suffers from drawbacks such as mismatching and long processing times, especially in dense metropolitan areas where very tall and similar-looking buildings exist. To solve these problems, a new image matching method based on building recognition is presented. This paper describes building recognition in color stereo images using edge and color information as an elementary study for the new matching scheme. We introduce a modified Hausdorff distance to exploit edge information and a modified color indexing with a 3-D RGB histogram to exploit color information. Edge information or color information alone is not enough to find conjugate building pairs: with edge information only, the building recognition rate is 46.5%, and with color information only, 7.1%. However, the recognition rate increases markedly to 78.5% when both kinds of information are combined.
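As a companion to the edge-based sketch above, the following Python snippet illustrates color indexing with a 3-D RGB histogram via histogram intersection, plus one simple way of blending the color and edge scores. The bin count, the edge-distance scaling, and the weighting are illustrative assumptions; the paper's "modified" color indexing is not detailed in the abstract.

```python
import numpy as np

def rgb_histogram(pixels: np.ndarray, bins: int = 8) -> np.ndarray:
    """3-D RGB histogram of a segmented building region.

    pixels: (N, 3) array of RGB values in [0, 255].
    Returns a normalized (bins, bins, bins) histogram.
    """
    hist, _ = np.histogramdd(pixels.astype(float), bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist / hist.sum()

def histogram_intersection(h1: np.ndarray, h2: np.ndarray) -> float:
    """Histogram intersection; 1.0 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())

def combined_score(edge_distance: float, color_similarity: float,
                   edge_scale: float = 10.0, weight: float = 0.5) -> float:
    """Blend edge and color evidence into one matching score in [0, 1].

    The edge distance is mapped to a similarity by an exponential decay;
    edge_scale and weight are illustrative parameters, not values from the paper.
    """
    edge_similarity = float(np.exp(-edge_distance / edge_scale))
    return weight * edge_similarity + (1.0 - weight) * color_similarity
```

Candidate building pairs from the left and right images can then be ranked by `combined_score`, matching each building to its highest-scoring counterpart, which mirrors the abstract's finding that edge and color cues together recognize buildings far more reliably than either cue alone.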