• Title/Summary/Keyword: Map size

Search results: 764

A Map-Based Boundary Input Method for Video Surveillance (영상 감시를 위한 지도기반 감시영역 입력 방법)

  • Kim, Jae-Hyeok;Maeng, Seung-Ryol
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.15 no.1
    • /
    • pp.418-424
    • /
    • 2014
  • In this paper, we propose a boundary input method for video surveillance systems. Since an intrusion by a moving object is decided by comparing its position with the surveillance boundary, boundary input is a basic function in video surveillance. Previous methods are difficult to adapt to changes in the surveillance environment, such as the size of the surveillance area and the number and positions of cameras, because they build the surveillance boundary from the image captured at the center of each camera. In our approach, the whole surveillance boundary is defined once as a polygon on a satellite map and then transformed into each camera's environment. Its key characteristic is that the boundary input is independent of the surveillance environment. Given the position of a moving object, intrusion detection has time complexity O(n), where n is the number of polygon vertices. To verify our method, we implemented a 3D simulation and confirmed that the input boundary can be reused by each camera without any redefinition.
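The O(n) intrusion check described in this abstract matches a standard ray-casting point-in-polygon test. A minimal Python sketch of that idea (an illustration of the complexity claim, not the authors' implementation):

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: cast a horizontal ray from the point and count
    edge crossings; an odd count means the point is inside. O(n) in the
    number of polygon vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Consider only edges that straddle the ray's y-coordinate
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon((2, 2), square), point_in_polygon((5, 5), square))  # True False
```

Each of the n edges is examined exactly once, which is where the O(n) bound in the abstract comes from.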

A Study on Development of Village Wetlands Inventory Using GIS and Establishment of Management Methods in Asan City, Korea (GIS를 이용한 아산시 마을습지 인벤토리 구축 및 관리 방안 연구)

  • Park, Mi Ok;Yang, Seung Bin;Koo, Bon Hak
    • Journal of the Korean Society of Environmental Restoration Technology
    • /
    • v.18 no.6
    • /
    • pp.167-177
    • /
    • 2015
  • This study was conducted to establish an inventory and propose conservation strategies for 'village wetlands' in Asan City, Korea, using GIS. Village wetlands are defined as palustrine wetlands, village embankments, and agricultural or small reservoirs located in or near a village and related to everyday life or farming. First, 807 provisional village wetlands (draft) were identified in Asan using ArcGIS 10.1; after validation through desk work and field surveys, 196 wetlands were finally confirmed as village wetlands and listed in the Asan village wetland inventory. The desk work analyzed the minimum area (greater than 625 m²), satellite images, the Korea Land Information System, land-use maps, and land-cover maps. To evaluate function and conservation value, 37 wetlands were selected for detailed survey and function assessment based on the following criteria: 1) coded as both wetland and reservoir on the digital map, 2) located less than 100 m from a village, and 3) ecologically connected to resources such as seaside mudflats, mountains, green areas, and ecological passages for small wildlife. The RAM-based wetland function assessment rated 7 wetlands 'high' (conservation), 18 'medium' (enhancement), and 12 'low' (restoration or enhancement). Enhancing biodiversity and ecosystem services through ecological management of the wetlands in Asan, and linking them with the Ecological Natural Degree, were proposed.

2-D Inundation Analysis According to Post-Spacing Density of DEMs from LiDAR Using GIS (GIS를 활용한 LiDAR 자료의 밀도에 따른 2차원 침수해석)

  • Ha, Chang-Yong;Han, Kun-Yeun;Cho, Wan-Hee
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.13 no.1
    • /
    • pp.74-88
    • /
    • 2010
  • In this study, LiDAR point data for Ulsan were thinned to generate DEMs at various resolutions. Since the LiDAR data have points at 1 m intervals, the point density was reduced to spacings of 1, 5, 10, 30, 50, and 100 m by uniformly eliminating points. A runoff analysis was performed on the Taehwa River and its tributary, the Dongcheon, with a 200-year rainfall. Two-dimensional inundation analysis was then performed for each LiDAR data density using FLUMEN, the model used to establish domestic flood risk maps. When DEM data obtained from LiDAR surveys are used, the results of this study are expected to help determine an optimal grid spacing that is economical, effective, and accurate for establishing flood defence plans, including the creation of flood risk maps.
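The uniform point elimination used above to derive coarser DEMs can be sketched as grid decimation: keep one point per cell at the target spacing. The function name, data, and spacing values below are illustrative stand-ins, not the study's actual workflow:

```python
def thin_points(points, spacing):
    """Keep one (x, y, z) point per spacing-by-spacing grid cell."""
    kept = {}
    for x, y, z in points:
        cell = (int(x // spacing), int(y // spacing))
        kept.setdefault(cell, (x, y, z))  # first point in each cell wins
    return list(kept.values())

# A toy 10 m x 10 m patch of 1 m points, thinned to 5 m spacing
dense = [(x, y, 0.0) for x in range(10) for y in range(10)]  # 100 points
print(len(thin_points(dense, 5)))  # 4
```

Regenerating the DEM from each thinned point set is what lets the study compare inundation results across grid spacings.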

A Benchmark Test of Spatial Big Data Processing Tools and a MapReduce Application

  • Nguyen, Minh Hieu;Ju, Sungha;Ma, Jong Won;Heo, Joon
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.35 no.5
    • /
    • pp.405-414
    • /
    • 2017
  • Spatial data processing often poses challenges due to the unique characteristics of spatial data, and this becomes more complex with spatial big data. Some tools have been developed and provided to users, but they are not familiar to regular users. This paper presents a benchmark test between two notable spatial big data processing tools: GIS Tools for Hadoop and SpatialHadoop. A MapReduce application is also introduced as a baseline to evaluate the effectiveness of the two tools and to measure the impact of the number of map/reduce tasks on performance. Using these tools and New York taxi trajectory data, we perform a spatial processing task: filtering the drop-off locations within the Manhattan area. The performance of the tools is observed with respect to increasing data size and varying numbers of worker nodes. The results of this study are as follows: 1) GIS Tools for Hadoop automatically creates a Quadtree index for each spatial operation, so its performance improves significantly; however, users should be familiar with Java to handle the tool conveniently. 2) SpatialHadoop does not automatically create a spatial index, so its performance on the same spatial operation is much lower than that of GIS Tools for Hadoop; however, SpatialHadoop achieved the best result in performing a range query. 3) The performance of our MapReduce application increased fourfold after changing the number of reduce tasks from 1 to 12.
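The filtering task above follows the usual MapReduce shape: a map phase tags each record, and a reduce phase aggregates counts per key. The sketch below models that shape in plain Python; the records and bounding box are made-up stand-ins, not the NYC taxi data or an accurate Manhattan boundary:

```python
from collections import Counter

# Hypothetical trajectory records: drop-off (lon, lat) pairs
records = [
    {"dropoff": (-73.98, 40.75)},
    {"dropoff": (-73.95, 40.78)},
    {"dropoff": (-74.20, 40.60)},
]

# Crude illustrative bounding box: min_lon, min_lat, max_lon, max_lat
BBOX = (-74.02, 40.70, -73.93, 40.88)

def map_phase(record):
    """Emit a (key, 1) pair per record, keyed by whether it falls in the box."""
    lon, lat = record["dropoff"]
    inside = BBOX[0] <= lon <= BBOX[2] and BBOX[1] <= lat <= BBOX[3]
    yield ("manhattan" if inside else "other", 1)

def reduce_phase(pairs):
    """Sum the 1s per key, as a reducer would."""
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

pairs = [kv for rec in records for kv in map_phase(rec)]
print(reduce_phase(pairs))  # {'manhattan': 2, 'other': 1}
```

In a real Hadoop job the map phase is parallelized over input splits and the reduce phase over keys, which is why the number of reduce tasks (1 versus 12 in the paper) affects throughput.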

A Method of Site Selection for the Artificial Recharge of Groundwater Using Geospatial Data (지형공간자료를 이용한 지하수 인공함양 적지 선정 방안)

  • Lee, Jae One;Seo, Minho;Han, Chan
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.33 no.5
    • /
    • pp.427-436
    • /
    • 2015
  • This study aims to select an optimal site for a small-scale artificial groundwater recharge system capable of pumping 50 tons/day in the dry season. First, a topographic shape satisfying the requirements of a hydraulic model experiment is defined, and an appropriate pumping efficiency is calculated through model experiments with injection and pumping scenarios. Next, a GIS (Geographic Information System) database is constructed from several geospatial datasets to explore candidate sites. In detail, watershed images are generated from a DEM (Digital Elevation Model) with 5 m cell size, which sets the minimum area for site selection, and slope maps are derived from the DEM to determine the optimal hydraulic gradient for securing an adequate aquifer undercurrent period. Finally, the suitable site for the artificial recharge system is selected by integrating all of the data: an alluvial map, the DEM, orthoimages, the slope map, and the watershed images.

Processing Method of Mass Small File Using Hadoop Platform (하둡 플랫폼을 이용한 대량의 스몰파일 처리방법)

  • Kim, Chang-Bok;Chung, Jae-Pil
    • Journal of Advanced Navigation Technology
    • /
    • v.18 no.4
    • /
    • pp.401-408
    • /
    • 2014
  • Hadoop is composed of the MapReduce programming model for distributed processing and the HDFS distributed file system. Hadoop is a suitable framework for big data processing, but processing masses of small files causes problems: one mapper is created per file, and much memory is needed to store the files' metadata. This paper compares and evaluates several methods of processing masses of small files on the Hadoop platform. Processing with general compression formats is inadequate because each file is handled by one mapper regardless of data size. Processing with sequence files and Hadoop archive files removes the NameNode memory problem by compressing and combining the small files, and Hadoop archive files are faster than sequence files in terms of small-file combining time. Processing with the CombineFileInputFormat class requires no prior combining of small files and achieves a speed similar to that of big data processing methods.
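The core trade-off above, one mapper per small file versus combined splits, can be illustrated with a toy split-packing sketch. File names, sizes, and the 128 KB split size are illustrative, and this only models the idea behind CombineFileInputFormat, not the Hadoop API:

```python
# Hypothetical in-memory stand-ins for many small HDFS files
small_files = {f"sensor_{i}.log": f"reading {i}\n".encode() for i in range(1000)}

# Naive model: one mapper per file -> 1000 map tasks for tiny inputs
naive_tasks = len(small_files)

# Combined model: pack files into splits up to a maximum size, the way
# CombineFileInputFormat groups small files into one split per mapper
MAX_SPLIT = 128 * 1024  # illustrative split size in bytes
splits, current = [], 0
for name, data in small_files.items():
    if current == 0 or current + len(data) > MAX_SPLIT:
        splits.append([])   # start a new split
        current = 0
    splits[-1].append(name)
    current += len(data)

print(naive_tasks, len(splits))  # 1000 1
```

Here 1000 tiny files fit in a single split, so one mapper replaces a thousand, which is exactly the overhead reduction the abstract attributes to CombineFileInputFormat.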

Three-dimensional cone beam computed tomography analysis of temporomandibular joint response to the Twin-block functional appliance

  • Jiang, Yuan-yuan;Sun, Lian;Wang, Hua;Zhao, Chun-yang;Zhang, Wei-Bing
    • The Korean Journal of Orthodontics
    • /
    • v.50 no.2
    • /
    • pp.86-97
    • /
    • 2020
  • Objective: To propose a three-dimensional (3D) method for evaluating temporomandibular joint (TMJ) changes during Twin-block treatment. Methods: Seventeen patients with Class II division 1 malocclusion treated using Twin-block and nine untreated patients with a similar malocclusion were included in this research. We collected their cone beam computed tomography (CBCT) data from before and 8 months after treatment. Segmentations were constructed using ITK-SNAP. Condylar volume and superficial area were measured using 3D Slicer. The 3D landmarks were identified on CBCT images by using Dolphin software to assess the condylar positional relationship. 3D models of the mandible and glenoid fossa of the patients were constructed and registered via voxel-based superimposition using 3D Slicer. Thereafter, skeletal changes could be visualized using 3DMeshMetric in any direction of the superimposition on a color-coded map. All the superimpositions were measured using the same scale on the distance color-coded map, in which red color represents overgrowth and blue color represents resorption. Results: Significant differences were observed in condylar volume, superficial area, and condylar position in both groups after 8 months. Compared with the control group (CG), the Twin-block group exhibited more obvious condyle-fossa modifications and joint positional changes. Moreover, on the color-coded map, more obvious condyle-fossa modifications could be observed in the posterior and superior directions in the Twin-block group than in the CG. Conclusions: We successfully established a 3D method for measuring and evaluating TMJ changes caused by Twin-block treatment. The treatment produced a larger condylar size and caused condylar positional changes.

A Preliminary Study on the Adjustment of Forest-based Wildlife Protection Area (산림기반 야생동식물보호구역 조경을 위한 기초연구)

  • Jang, Gab-Sue
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.36 no.1
    • /
    • pp.62-69
    • /
    • 2008
  • This study was conducted to recommend forest-based wildlife protection areas in Chungnam Province using several basic habitat conditions: forest patch size with the potential to keep wildlife safe, distance from water sources, and availability of food. The fractal dimension index was also used to capture edge-line dynamics, which can influence habitat conditions for edge species. Nature conservation management indices, including a forest map (indicating forest age class), a slope map, and an elevation map, were used to find forest patches with enough space for wildlife to live in. Water resources and their buffer areas were considered as factors for protecting the space as an ecological corridor, and deciduous stands and mixed deciduous-conifer stands were chosen as food sources for wildlife. In total, 525 forest patches were recommended for wildlife protection areas. Five of these were recommended as wildlife protection areas managed by the provincial government; the other 520 were recommended to protect local wildlife and be managed by each county or city. These forest patches are located around the Geum-buk and Geum-nam mountain ranges and are important habitat resources for keeping wildlife in the area. An ecological network can be formed by ecologically integrating these separate patches. The fractal dimension index was used to divide the forest patches into several categories by shape; patches with longer edges or more irregular shapes have a much higher possibility of being inhabited by various edge species. Comparing the wildlife protection areas recommended in this study with the current ones, we recognized that the current wildlife protection areas need boundary adjustments so that wildlife can sustain themselves with water sources and food.

A Study on Small-sized Index Structure and Fast Retrieval Method Using The RCB Trie (RCB트라이를 이용한 빠른 검색과 소용량 색인 구조에 관한 연구)

  • Jung, Kyu-Cheol
    • Journal of the Korea Society of Computer and Information
    • /
    • v.12 no.4
    • /
    • pp.11-19
    • /
    • 2007
  • This paper proposes the RCB (Reduced Compact Binary) trie to correct the faults of both the CB (Compact Binary) trie and the HCB (Hierarchical Compact Binary) trie. The CB trie was the first attempt at a compact structure, but as the amount of inputted data grew, insertion became difficult because of the dummy nodes used to balance the tree. The HCB trie, realized hierarchically, fixes a certain depth to prevent the structure from growing to the right; when that depth is reached, a new tree is created and linked to. This made input and search fast, but storage space grew because of the dummy nodes, as in the CB trie, and the many tree links. The RCB trie proposed in this paper completely eliminates dummy nodes, reducing the tree map by about 35% and the whole size by half compared with the HCB trie.
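For context, the structure that the CB/HCB/RCB variants compress is the plain binary trie, which branches on the bits of a key. A minimal Python sketch of that baseline (not the RCB trie itself):

```python
class BinaryTrieNode:
    __slots__ = ("children", "value")

    def __init__(self):
        self.children = [None, None]  # 0-branch and 1-branch
        self.value = None

class BinaryTrie:
    """Plain binary trie keyed on the bits of a key's bytes; the compact
    variants in the paper reduce the node count of this structure."""

    def __init__(self):
        self.root = BinaryTrieNode()

    def _bits(self, key):
        for byte in key.encode():
            for shift in range(7, -1, -1):
                yield (byte >> shift) & 1

    def insert(self, key, value):
        node = self.root
        for bit in self._bits(key):
            if node.children[bit] is None:
                node.children[bit] = BinaryTrieNode()
            node = node.children[bit]
        node.value = value

    def search(self, key):
        node = self.root
        for bit in self._bits(key):
            node = node.children[bit]
            if node is None:
                return None
        return node.value

t = BinaryTrie()
t.insert("map", 1)
t.insert("maps", 2)
print(t.search("map"), t.search("maps"), t.search("mop"))  # 1 2 None
```

One node per key bit is what makes the naive form large; the CB/HCB/RCB line of work is about shrinking exactly this overhead.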

A Study on the Effective Algorithms for Line Generalization (선형성 지형자료의 일반화에 대한 효율적인 알고리즘에 관한 연구)

  • 김감래;이호남
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.12 no.1
    • /
    • pp.43-52
    • /
    • 1994
  • This paper outlines a new approach to line generalization when preparing a small-scale map on the basis of an existing large-scale digital map. Line generalization is conducted with the Douglas algorithm using 1/25,000-scale topographic maps of southeastern Jeju Island, produced by the National Geographic Institute, to analyze fitness to the original and problems of graphical representation. Compared with the same-scale map generated manually, a variety of small but sometimes significant errors and modifications of topological relationships were detected. The research gives full details of three algorithms that operationalize the smallest-visible-object method, together with some empirical results. A comparison of the results produced by the new algorithms with those produced by manual generalization and the Douglas data-reduction method is provided. This paper also presents preliminary results on the relationship between the size of the smallest visible object and the data storage required by each algorithm.
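The Douglas algorithm referenced above is Douglas-Peucker line simplification: keep the point farthest from the chord between the endpoints, and recurse while that distance exceeds a tolerance. A compact Python sketch (the tolerance value is illustrative, not taken from the paper):

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the line through start and end."""
    if start == end:
        return math.dist(pt, start)
    (x1, y1), (x2, y2), (x0, y0) = start, end, pt
    num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
    return num / math.hypot(y2 - y1, x2 - x1)

def douglas_peucker(points, epsilon):
    """Simplify a polyline, keeping points farther than epsilon from the chord."""
    if len(points) < 3:
        return points
    # Find the point with the maximum distance from the endpoint chord
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Keep the farthest point and recurse on both halves
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

print(douglas_peucker([(0, 0), (1, 1), (2, 2), (3, 3)], 0.1))  # [(0, 0), (3, 3)]
```

The tolerance epsilon plays the same role as the "smallest visible object" in the paper: features smaller than what is visible at the target scale are dropped.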