• Title/Summary/Keyword: large scale mapping


A New True Ortho-photo Generation Algorithm for High Resolution Satellite Imagery

  • Bang, Ki-In; Kim, Chang-Jae
    • Korean Journal of Remote Sensing / v.26 no.3 / pp.347-359 / 2010
  • Ortho-photos provide valuable spatial and spectral information for various Geographic Information System (GIS) and mapping applications. The absence of relief displacement and the uniform scale in ortho-photos enable interested users to measure distances, compute areas, derive geographic locations, and quantify changes. Differential rectification has traditionally been used for ortho-photo generation. However, differential rectification produces serious artifacts (in the form of ghost images) when dealing with large scale imagery over urban areas. To avoid these artifacts, true ortho-photo generation techniques have been devised to remove ghost images through visibility analysis and occlusion detection. So far, the Z-buffer method has been one of the most popular methods for true ortho-photo generation. However, it is quite sensitive to the relationship between the cell size of the Digital Surface Model (DSM) and the Ground Sampling Distance (GSD) of the imaging sensor. Another critical issue in true ortho-photo generation using high resolution satellite imagery is the scan line search: because the sensor is a line camera, the perspective center corresponding to each ground point must be identified. This paper introduces an alternative methodology for true ortho-photo generation that circumvents the drawbacks of the Z-buffer technique and the existing scan line search methods. Experiments using real data compare the proposed and existing methods through qualitative and quantitative evaluation of occlusion detection as well as computational efficiency. The experimental analysis showed that the proposed method provided the best occlusion-detection success ratio and reasonable processing time compared to all other true ortho-photo generation methods tested in this paper.
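
For orientation, here is a minimal sketch of the Z-buffer visibility analysis the abstract critiques, assuming a frame camera with a single perspective center and a gridded DSM; the function names, the `project` callback, and all arrays are illustrative assumptions, not taken from the paper:

```python
# Minimal Z-buffer visibility sketch for true ortho-photo generation.
# Assumes one perspective center (a frame camera); a line camera would need
# a per-ground-point center, the "scan line search" the abstract mentions.
import numpy as np

def zbuffer_occlusion(dsm, cell, pc, img_shape, project):
    """Flag DSM cells occluded from the perspective center `pc`.

    dsm       : (H, W) array of surface elevations
    cell      : ground cell size (same units as dsm)
    pc        : (X, Y, Z) perspective center
    img_shape : (rows, cols) of the source image
    project   : function mapping a ground point (X, Y, Z) -> (row, col)
    """
    zbuf = np.full(img_shape, np.inf)      # nearest distance seen per pixel
    owner = np.full(img_shape, -1, int)    # which DSM cell claimed the pixel
    occluded = np.zeros(dsm.shape, bool)

    H, W = dsm.shape
    for i in range(H):
        for j in range(W):
            gx, gy, gz = j * cell, i * cell, dsm[i, j]
            r, c = project((gx, gy, gz))
            if not (0 <= r < img_shape[0] and 0 <= c < img_shape[1]):
                continue
            d = np.linalg.norm(np.array([gx, gy, gz]) - np.asarray(pc))
            if d < zbuf[r, c]:
                # this cell is closer: the previous owner becomes occluded
                prev = owner[r, c]
                if prev >= 0:
                    occluded[prev // W, prev % W] = True
                zbuf[r, c], owner[r, c] = d, i * W + j
            else:
                occluded[i, j] = True
    return occluded
```

The sensitivity the abstract mentions shows up directly in this sketch: when DSM cells are much finer than the image GSD, many cells compete for the same pixel and spurious occlusions appear.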

Fine mapping of qBK1, a major QTL for bakanae disease resistance in rice

  • Ham, Jeong-Gwan; Cho, Soo-Min; Kim, Tae Heon; Lee, Jong-Hee; Shin, Dongjin; Cho, Jun-Hyun; Lee, Ji-Yoon; Yoon, Young-Nam; Song, You-Chun; Oh, Myeong-Kyu; Park, Dong-Soo
    • Proceedings of the Korean Society of Crop Science Conference / 2017.06a / pp.92-92 / 2017
  • Bakanae disease is one of the oldest and most serious problems of rice production; it was first described in 1828 in Japan and has since been identified in Asia, Africa, North America, and Italy. Germinating rice seeds in seed boxes for mechanical transplantation has caused many disease problems, including bakanae disease, and the disease has become a serious problem in the breeding of hybrid rice, which involves increased raising of plants in seed beds. The indica rice variety Shingwang was selected as the resistant donor for bakanae disease. One hundred sixty-nine NILs, YR28297 ($BC_6F_4$), generated by five backcrosses of Shingwang into the genetic background of the susceptible japonica variety Ilpum, were used for QTL analysis. The rice bakanae disease pathogen CF283 was mainly used in this study, and inoculation and evaluation of bakanae disease were performed with the large-scale screening method developed by Kim et al. (2014). SSR markers evenly distributed across the rice chromosomes were selected from the Gramene database (http://www.gramene.org), and the polymorphic markers were used for frame mapping of a $BC_5F_5$ resistant line. Here, we developed 168 near-isogenic rice lines (NILs, $BC_6F_4$) to locate a QTL for resistance against bakanae disease. The lines were derived from a cross between Shingwang, a highly resistant variety (indica), and Ilpum, a highly susceptible variety (japonica). The 24 markers representing the Shingwang allele in a bakanae disease-resistant NIL, YR24982-9-1 (the parental line of the $BC_6F_4$ NILs), were located on chromosomes 1, 2, 7, 8, 10, 11, and 12. Single marker analysis using the SSR marker RM9 showed that a major QTL was located on chromosome 1. The QTL explained 65% of the total phenotypic variation in the $BC_6F_4$ NILs. The major QTL, designated qBK1, was mapped to a 91-kb region between InDel15 and InDel21. The identification of qBK1 and the closely linked marker InDel18 could be useful for improving rice bakanae disease resistance in marker-assisted breeding.
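
As context for the 65% figure: single marker analysis of this kind amounts to regressing phenotype on the genotype at one marker and reading off the proportion of variance explained. A minimal sketch under that interpretation; the genotype coding, toy data, and function name are placeholders, not the study's data:

```python
# Single-marker analysis sketch: regress phenotype on marker genotype and
# report the proportion of phenotypic variance explained (R^2), the kind of
# figure quoted as "65%" in the abstract. Data below are made-up placeholders.
import numpy as np

def single_marker_r2(genotypes, phenotypes):
    """genotypes: 0/1 codes (e.g. Ilpum vs Shingwang allele at a marker);
    phenotypes: disease scores for the same lines."""
    g = np.asarray(genotypes, float)
    y = np.asarray(phenotypes, float)
    b, a = np.polyfit(g, y, 1)            # fit y = a + b*g by least squares
    resid = y - (a + b * g)
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot          # proportion of variance explained

# toy usage with placeholder data
geno = [0, 0, 1, 1, 0, 1, 1, 0]
pheno = [8.2, 7.9, 2.1, 2.5, 7.5, 1.9, 2.8, 8.0]
print(f"PVE = {single_marker_r2(geno, pheno):.2f}")
```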


Vegetation Cover Type Mapping Over The Korean Peninsula Using Multitemporal AVHRR Data (시계열(時系列) AVHRR 위성자료(衛星資料)를 이용한 한반도 식생분포(植生分布) 구분(區分))

  • Lee, Kyu-Sung
    • Journal of Korean Society of Forest Science / v.83 no.4 / pp.441-449 / 1994
  • The two reflective channels (red and near-infrared) of Advanced Very High Resolution Radiometer (AVHRR) data were used to classify primary vegetation cover types on the Korean Peninsula. From the NOAA-11 satellite data archive of 1991, 27 daytime scenes with relatively minimal cloud coverage were obtained. After initial radiometric calibration, the normalized difference vegetation index (NDVI) was calculated for each of the 27 data sets. Four or five daily NDVI data sets were then overlaid for each of six months from February to November, and the maximum NDVI value was retained at every pixel location to produce a monthly composite. The six bands of monthly NDVI composites were nearly cloud-free and were used for computer classification of vegetation cover. Based on the temporal signatures of different vegetation cover types, generated by an unsupervised block clustering algorithm, every pixel was classified into one of six cover type categories. The classification result was evaluated both by qualitative interpretation and by quantitative comparison with existing forest statistics. Considering the frequent data acquisition, low data cost and volume, and large area coverage, AVHRR data are effective for vegetation cover type mapping at the regional scale.
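
The two processing steps described here, per-scene NDVI and monthly maximum-value compositing, are compact enough to sketch. A hedged numpy version, with array names and shapes as assumptions rather than the study's actual data layout:

```python
# Sketch of the abstract's two steps: NDVI from the red and near-infrared
# channels, then a maximum-value composite across one month's daily scenes.
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - RED) / (NIR + RED), computed per pixel."""
    red = red.astype(float)
    nir = nir.astype(float)
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)   # avoid division by zero
    return np.where(denom == 0, 0.0, (nir - red) / safe)

def monthly_max_composite(daily_scenes):
    """daily_scenes: iterable of (red, nir) array pairs for one month.
    Retaining the per-pixel maximum suppresses cloudy observations, which
    is why the composite comes out nearly cloud-free."""
    stack = np.stack([ndvi(r, n) for r, n in daily_scenes])
    return stack.max(axis=0)
```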


National Disaster Management, Investigation, and Analysis Using RS/GIS Data Fusion (RS/GIS 자료융합을 통한 국가 재난관리 및 조사·분석)

  • Seongsam Kim; Jaewook Suk; Dalgeun Lee; Junwoo Lee
    • Korean Journal of Remote Sensing / v.39 no.5_2 / pp.743-754 / 2023
  • The global occurrence of myriad natural disasters and incidents, catalyzed by climate change and extreme meteorological conditions, has engendered substantial human and material losses. International organizations such as the International Charter have established an enduring collaborative framework for real-time coordination to provide high-resolution satellite imagery and geospatial information. These resources are instrumental in managing large-scale disaster scenarios and in the expeditious execution of recovery operations. At the national level, the operational deployment of advanced national Earth observation satellites, controlled by the National Geographic Information Institute, has not only advanced geospatial data but has also contributed damage-analysis data for significant domestic and international disaster events. This special edition by the National Disaster Management Research Institute delineates the landscape of major disaster incidents in 2023 and elucidates the strategic blueprint of the government's reform of the national disaster safety system. Additionally, it encapsulates the most recent research accomplishments in artificial satellite systems, information and communication technology, and spatial information utilization, which are paramount in the institute's disaster situation management and analysis efforts. Furthermore, the publication encompasses the most recent research findings on data collection, processing, and analysis pertaining to disaster causes and damage extent. These findings are especially pertinent to the institute's on-site investigation initiatives and are informed by cutting-edge technologies, including drone-based mapping and LiDAR observation, as evidenced by a case study of the 2023 landslide damage caused by concentrated heavy rainfall.

Fast Hilbert R-tree Bulk-loading Scheme using GPGPU (GPGPU를 이용한 Hilbert R-tree 벌크로딩 고속화 기법)

  • Yang, Sidong; Choi, Wonik
    • Journal of KIISE / v.41 no.10 / pp.792-798 / 2014
  • In spatial databases, the R-tree is one of the most widely used indexing structures, and many variants have been proposed to improve its performance. Among these variants, the Hilbert R-tree is a representative method that uses the Hilbert curve to process large amounts of data without costly split operations during R-tree construction. The Hilbert R-tree, however, is hardly applicable to large-scale applications in practice, mainly due to high pre-processing costs and slow bulk-load times. To overcome these limitations, we propose a novel approach that parallelizes the Hilbert mapping and thus accelerates bulk-loading of the Hilbert R-tree in GPU memory. The GPU-based Hilbert R-tree improves bulk-loading performance by applying the inversed-cell method and exploiting parallelism when packing the R-tree structure. Our experimental results show that the proposed scheme is up to 45 times faster than traditional CPU-based bulk-loading schemes.
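
For reference, the serial baseline that such work parallelizes: map each rectangle's center to its position along a Hilbert curve, sort by that key, and pack the sorted list into nodes. A minimal CPU sketch using the classic bit-manipulation Hilbert index (not the paper's inversed-cell method); grid order and node capacity are illustrative:

```python
# Serial Hilbert bulk-loading sketch: Hilbert index, sort, pack.
def xy2d(n, x, y):
    """Map grid cell (x, y) to its distance along a Hilbert curve covering
    an n x n grid; n must be a power of two."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                          # rotate the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

def hilbert_bulk_load(rects, n=1024, leaf_cap=64):
    """rects: (xmin, ymin, xmax, ymax) tuples. Sort by the Hilbert value of
    each rectangle's center, then pack runs of leaf_cap entries into leaves;
    upper levels would be built the same way over the leaves' MBRs."""
    key = lambda r: xy2d(n, int((r[0] + r[2]) / 2), int((r[1] + r[3]) / 2))
    ordered = sorted(rects, key=key)
    return [ordered[i:i + leaf_cap] for i in range(0, len(ordered), leaf_cap)]
```

The per-rectangle `xy2d` computations are independent, which is what makes the mapping step a natural fit for GPU parallelism.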

An Efficient Multidimensional Scaling Method based on CUDA and Divide-and-Conquer (CUDA 및 분할-정복 기반의 효율적인 다차원 척도법)

  • Park, Sung-In; Hwang, Kyu-Baek
    • Journal of KIISE: Computing Practices and Letters / v.16 no.4 / pp.427-431 / 2010
  • Multidimensional scaling (MDS) is a widely used method for dimensionality reduction, whose purpose is to represent high-dimensional data in a low-dimensional space while preserving distances among objects as much as possible. MDS has mainly been applied to data visualization and feature selection. Among the various MDS methods, classical MDS is not readily applicable on normal desktop computers to data with large numbers of objects, due to its computational complexity: it must solve eigenpair problems on dissimilarity matrices based on Euclidean distance. Thus, the running time and memory requirements of classical MDS grow quickly with n (the number of objects), restricting its use in large-scale domains. In this paper, we propose an efficient approximation algorithm for classical MDS based on divide-and-conquer and CUDA. Through a set of experiments, we show that our approach is highly efficient and effective for the analysis and visualization of data consisting of several thousand objects.
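
A minimal sketch of the classical MDS step the abstract identifies as the bottleneck, the eigendecomposition of the double-centered squared-distance matrix. This is the textbook serial form, not the paper's divide-and-conquer CUDA version:

```python
# Classical MDS: double-center the squared distance matrix and take the top
# eigenpairs. The eigendecomposition is the step whose cost grows quickly
# with n, motivating the divide-and-conquer / CUDA approximation.
import numpy as np

def classical_mds(D, k=2):
    """D: (n, n) Euclidean distance matrix; returns (n, k) coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]              # keep the k largest
    L = np.sqrt(np.maximum(w[idx], 0))         # guard tiny negative eigenvalues
    return V[:, idx] * L                       # n x k embedding
```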

ChIP-seq Library Preparation and NGS Data Analysis Using the Galaxy Platform (ChIP-seq 라이브러리 제작 및 Galaxy 플랫폼을 이용한 NGS 데이터 분석)

  • Kang, Yujin; Kang, Jin; Kim, Yea Woon; Kim, AeRi
    • Journal of Life Science / v.31 no.4 / pp.410-417 / 2021
  • Next-generation sequencing (NGS) is a high-throughput technique for sequencing large numbers of DNA fragments prepared from a genome. This sequencing technique has been used to elucidate the whole genome sequences of living organisms and to analyze complementary DNA (cDNA) or chromatin-immunoprecipitated DNA (ChIPed DNA) at the genome level. After NGS, the use of proper tools is important for processing and analyzing the data with reasonable parameters. However, handling large-scale sequencing data and programming for data analysis can be difficult. The Galaxy platform, a public web service, provides many different tools for NGS data analysis and allows researchers to analyze their data in a web browser without deep knowledge of bioinformatics or programming. In this study, we explain the procedure for preparing chromatin immunoprecipitation sequencing (ChIP-seq) libraries and the steps for analyzing ChIP-seq data using the Galaxy platform. The data analysis steps include uploading the NGS data to Galaxy, quality checking, pre-mapping processing, read mapping, post-mapping processing, peak calling, and visualization by window view, heatmaps, average profiles, and correlation analysis. Analysis of our histone H3K4me1 ChIP-seq data in K562 cells shows that it correlates with public data. Thus, NGS data analysis using the Galaxy platform provides an easy entry point to bioinformatics.
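
The Galaxy steps listed here (quality check, mapping, post-mapping processing, peak calling) correspond to common command-line tools. A hedged sketch of the same step order driven from Python; the file names are placeholders, and the specific tools (FastQC, bwa, samtools, MACS2) are typical choices for this workflow, not necessarily the exact tools the study ran inside Galaxy:

```python
# Command-line equivalent of the ChIP-seq analysis steps in the abstract.
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run(["fastqc", "chip_reads.fastq"])                      # quality check
run(["bwa", "index", "hg38.fa"])                         # pre-mapping: build index
with open("chip.sam", "w") as sam:                       # read mapping
    subprocess.run(["bwa", "mem", "hg38.fa", "chip_reads.fastq"],
                   stdout=sam, check=True)
run(["samtools", "sort", "-o", "chip.bam", "chip.sam"])  # post-mapping processing
run(["samtools", "index", "chip.bam"])
run(["macs2", "callpeak", "-t", "chip.bam",              # peak calling vs. input
     "-c", "input.bam", "-n", "H3K4me1"])
```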

Case Study of the Stability of a Large Cut-Slope at a Tunnel Portal (터널 입구부 대절토 사면 안정성 사례 연구)

  • Park, Dong Soon; Bae, Jong-Soem
    • The Journal of Engineering Geology / v.25 no.1 / pp.115-129 / 2015
  • The cut-slope of a large-sectional tunnel portal is recognized as a potential area of weakness due to unstable stress distribution and possible permanent displacement. This paper presents a case study of slope failure and remediation for a large-scale cut-slope at a tunnel portal. Extensive brittle rock-slope failure occurred along discontinuities in the rock mass after 46 mm of rainfall, which caused instability of the upper part of the cut-slope. Based on a geological survey and face mapping, the failure is believed to have been caused by thin clay fill in discontinuities in the weathered rock mass and the consequent saturation-induced joint weakening. The granite-gneiss rock mass has a high content of alkali feldspar, indicating vulnerability to weathering. Immediately before the slope failure, settlement-time histories indicated a sharp increase in displacement rate, an observation that can contribute to safety management criteria for slope stability. In this case, emergency remediation was performed to prevent further hazard and to facilitate reconstruction; counterweight fill and concrete filling of voids were successfully applied. For ultimate remediation, grid anchor-blocks were used for slope stabilization, and additional rock bolts and grouting were applied inside the tunnel. Limit-equilibrium slope stability analysis and stereographic projection analyses confirmed the instability of the original slope and the effectiveness of the reinforcing methods. After the reinforcing measures were applied, instrumental monitoring indicated that the slope and the tunnel remained stable. This case study is expected to serve as a valuable reference for similar engineering cases of large-sectional slope stability.
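
A hedged sketch of the limit-equilibrium idea behind such a stability check: a factor of safety for planar sliding along a single discontinuity, where rainfall-driven saturation enters as an uplift force that reduces the resisting term. The formula is the standard planar-failure form; the parameter values are illustrative, not the paper's site data:

```python
# Limit-equilibrium factor of safety for planar sliding along one joint:
#   FS = (c*A + (W*cos(psi) - U) * tan(phi)) / (W * sin(psi))
from math import radians, sin, cos, tan

def factor_of_safety(c, phi_deg, psi_deg, W, A, U=0.0):
    """c   : cohesion on the sliding plane (kPa)
    phi : friction angle of the discontinuity (degrees)
    psi : dip of the sliding plane (degrees)
    W   : weight of the sliding block (kN per metre of slope)
    A   : area of the sliding plane (m^2 per metre)
    U   : water uplift force on the plane (kN per metre); saturation after
          rainfall raises U and drives FS below 1, as in the reported failure."""
    psi, phi = radians(psi_deg), radians(phi_deg)
    resisting = c * A + (W * cos(psi) - U) * tan(phi)
    driving = W * sin(psi)
    return resisting / driving

# illustrative numbers only: FS drops as the uplift force U grows
print(factor_of_safety(c=20.0, phi_deg=30.0, psi_deg=45.0, W=500.0, A=10.0, U=0.0))
print(factor_of_safety(c=20.0, phi_deg=30.0, psi_deg=45.0, W=500.0, A=10.0, U=200.0))
```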

Usefulness of Applying a Macro for Brain SPECT Processing (Brain SPECT Processing에 있어서 Macro Program 사용의 유용성)

  • Kim, Gye-Hwan; Lee, Hong-Jae; Kim, Jin-Eui; Kim, Hyeon-Joo
    • The Korean Journal of Nuclear Medicine Technology / v.13 no.1 / pp.35-39 / 2009
  • Purpose: Diagnostic and functional imaging software in nuclear medicine has developed significantly, but some tasks still take a great deal of time. In this article, we introduce the basic concept of a macro, to aid understanding, and its application to brain SPECT processing. We applied macro software to the SPM processing and PACS verification steps of brain SPECT processing. Materials and Methods: Within brain SPECT work, we chose SPM processing and two PACS tasks, which account for a large portion of the workload. SPM is a software package for analyzing neuroimaging data, and its purpose is quantitative analysis between groups. Results are produced by a complicated process of realignment, normalization, smoothing, and mapping, which we simplified using a macro program. After sending images to PACS, we used a simple macro program that inputs mouse coordinates directly for the processes of color mapping, gray-scale adjustment, copy, cut, and match. We then compared the time needed to produce results by hand with the time needed using the macro program, and applied these times to the number of studies in 2007. Results: In 2007, there were 115 SPM studies and 834 PACS studies, counted by Diamox study. SPM work by hand took 10 to 15 minutes depending on operator expertise, whereas the macro uniformly required 5.5 minutes. Applied to the number of studies, manual SPM work required 1,150 to 1,725 minutes (19 to 29 hours) per year, versus 632 minutes (11 hours) using the macro. Manual PACS work took 2 to 3 minutes per study versus 45 seconds using the macro; applied to the number of studies, this amounted to 1,668 to 2,502 minutes (28 to 42 hours) by hand versus 625 minutes (10 hours) with the macro, a saving of 1,043 to 1,877 minutes (17 to 31 hours). Overall, we saved 45 to 63% of the time for SPM work, 62 to 75% for PACS work, and 55 to 70% for total brain SPECT processing in 2007. Conclusions: Given the number of studies, applying macros to brain SPECT processing saved significant time, and even a task that takes little time per study can yield large savings as the number of studies grows. The time saved lets radiological technologists concentrate more on patients and reduces the probability of mistakes. Applying macros to brain SPECT processing helps both technologists and patients and contributes to improved quality of hospital service.
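
The coordinate-driven automation described above can be sketched in a few lines. A hedged illustration using pyautogui, a common GUI-automation library; the coordinates, step list, and library choice are assumptions for illustration, not the authors' actual macro tool:

```python
# Replay fixed mouse positions and key presses so repetitive UI steps
# (color mapping, gray-scale adjustment, copy, cut, match) run unattended.
import time
import pyautogui

STEPS = [
    ("click", (420, 310)),       # e.g. open the color-mapping menu
    ("click", (455, 362)),       # pick a palette entry
    ("hotkey", ("ctrl", "c")),   # copy the processed image
    ("click", (910, 540)),       # focus the verification window
    ("hotkey", ("ctrl", "v")),   # paste for comparison
]

def replay(steps, pause=0.5):
    for action, arg in steps:
        if action == "click":
            pyautogui.click(*arg)    # move to (x, y) and click
        elif action == "hotkey":
            pyautogui.hotkey(*arg)   # press a key combination
        time.sleep(pause)            # give the UI time to respond

replay(STEPS)
```

The trade-off in this style of macro is visible in the article's method: fixed screen coordinates make the script fast to write but fragile if the application layout changes.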


Community Patterning of Benthic Macroinvertebrates in Streams of South Korea by Utilizing an Artificial Neural Network (인공신경망을 이용한 남한의 저서성 대형 무척추동물 군집 유형)

  • Kwak, Inn-Sil; Liu, Guangchun; Park, Young-Seuk; Chon, Tae-Soo
    • Korean Journal of Ecology and Environment / v.33 no.3 s.91 / pp.230-243 / 2000
  • Large-scale community data were patterned using an unsupervised learning algorithm in artificial neural networks. Data on benthic macroinvertebrates in streams of South Korea, reported in publications over the 12 years from 1984 to 1995, were provided as inputs for training the Kohonen network. The taxa included in the training comprised 5 phyla, 10 classes, 26 orders, 108 families, and 571 species from 27 streams. Abundant groups were Diptera, Ephemeroptera, Trichoptera, Plecoptera, Coleoptera, Odonata, Oligochaeta, and Physidae. A wide spectrum of community compositions was observed: a few tolerant taxa were collected at polluted sites, while high species richness was observed at relatively clean sites. The mapping trained by the Kohonen network effectively showed patterns of communities from different river systems, followed by patterns of communities under different environmental disturbances. Training with the proposed artificial neural network could be an alternative for organizing community data in large-scale ecological surveys.
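
A minimal sketch of the unsupervised Kohonen training loop of the kind used here, assuming each input row is one site's taxa-abundance vector; the grid size, rates, and names are illustrative, not the study's settings:

```python
# Kohonen self-organizing map sketch: each input pulls its best-matching
# unit and that unit's grid neighbors toward it, so similar communities
# end up mapped to nearby units.
import numpy as np

def train_som(X, grid=(6, 6), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    W = rng.random((n_units, X.shape[1]))             # unit weight vectors
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5       # shrinking neighborhood
        for x in rng.permutation(X):
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))            # neighborhood kernel
            W += lr * h[:, None] * (x - W)                # pull units toward x
    return W.reshape(grid[0], grid[1], -1)
```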
