• Title/Summary/Keyword: Spatial Clustering

AKARI OBSERVATION OF THE FLUCTUATION OF THE NEAR-INFRARED BACKGROUND

  • Matsumoto, T.;Seo, H.J.;Jeong, W.S.;Lee, H.M.;Matsuura, S.;Matsuhara, H.;Oyabu, S.;Pyo, J.;Wada, T.
    • Publications of The Korean Astronomical Society / v.27 no.4 / pp.363-365 / 2012
  • We report a search for fluctuations of the sky brightness toward the North Ecliptic Pole with AKARI at 2.4, 3.2, and 4.1 μm. The stacked images of the AKARI Monitor Field, 10 arcminutes in diameter, show spatial structure on scales of a few hundred arcseconds. A power spectrum analysis shows a significant excess fluctuation at angular scales larger than 100 arcseconds that cannot be explained by zodiacal light, diffuse Galactic light, shot noise of faint galaxies, or clustering of low-redshift galaxies. These findings indicate that the detected fluctuation could be attributed to the first stars of the universe, i.e., Population III stars.
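
The excess fluctuation is identified through an azimuthally averaged spatial power spectrum. Purely as an illustration (not the authors' pipeline), a naive sketch of such an analysis on a tiny synthetic image might look like the following; the O(n^4) direct DFT is only workable for toy sizes:

```python
import math, cmath

def dft2(img):
    """Naive 2-D DFT of a small square image (list of lists of floats)."""
    n = len(img)
    out = [[0j] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0j
            for x in range(n):
                for y in range(n):
                    s += img[x][y] * cmath.exp(-2j * math.pi * (u * x + v * y) / n)
            out[u][v] = s
    return out

def radial_power(img):
    """Azimuthally averaged power spectrum P(k), binned by integer radius."""
    n = len(img)
    F = dft2(img)
    sums, counts = {}, {}
    for u in range(n):
        for v in range(n):
            du, dv = min(u, n - u), min(v, n - v)  # fold negative frequencies
            k = round(math.hypot(du, dv))
            sums[k] = sums.get(k, 0.0) + abs(F[u][v]) ** 2
            counts[k] = counts.get(k, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}
```

A flat sky puts all power at k = 0; an excess at small k (large angular scales) is the kind of signature the abstract describes.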

A novel route restoring method upon geo-tagged photos

  • Wang, Guannan;Wang, Zhizhong;Zhu, Zhenmin;Wen, Saiping
    • KSII Transactions on Internet and Information Systems (TIIS) / v.7 no.5 / pp.1236-1251 / 2013
  • Sharing geo-tagged photos has become a popular social activity because such photos not only contain geographic information but also reveal people's hobbies, intentions, and mobility patterns. However, raw geo-tagged photo routes cannot provide as much information as complete GPS trajectories because of the gaps hidden in them. This paper analyzes large collections of geo-tagged photos and proposes a novel travel-route restoring method. In our approach, we first propose an Interest Measure Ratio to rank hot spots based on a density-based spatial clustering algorithm. We then apply the Hidden Semi-Markov Model and a mean-value method to model migration patterns among the hot spots and restore the significant region sequence into a complete GPS trajectory. Finally, a novel experiment is designed to demonstrate that the approach is feasible for route restoration and performs well.
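
As a rough illustration of the hot-spot step, the sketch below runs a minimal DBSCAN over 2-D photo coordinates and then ranks clusters by photo count; the count-based ranking is only a crude stand-in for the paper's Interest Measure Ratio, whose formula the abstract does not give:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nb = neighbors(i)
        if len(nb) < min_pts:
            labels[i] = -1                  # provisionally noise
            continue
        labels[i] = cid
        seeds = list(nb)
        while seeds:                        # expand the cluster
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid             # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb_j = neighbors(j)
            if len(nb_j) >= min_pts:        # j is a core point: keep growing
                seeds.extend(nb_j)
        cid += 1
    return labels

def rank_hot_spots(points, labels):
    """Rank clusters by photo count -- a crude stand-in for the paper's
    Interest Measure Ratio (its exact formula is not given here)."""
    counts = {}
    for lab in labels:
        if lab >= 0:
            counts[lab] = counts.get(lab, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)
```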

Spatio-temporal Load Analysis Model for Power Facilities using Meter Reading Data (검침데이터를 이용한 전력설비 시공간 부하분석모델)

  • Shin, Jin-Ho;Kim, Young-Il;Yi, Bong-Jae;Yang, Il-Kwon;Ryu, Keun-Ho
    • The Transactions of The Korean Institute of Electrical Engineers / v.57 no.11 / pp.1910-1915 / 2008
  • Load analysis for the distribution system and its facilities has traditionally relied on measurement equipment, and load monitoring incurs huge installation and maintenance costs. This paper presents a new model that analyzes the load of facilities under a feeder every 15 minutes using meter reading data that can be obtained from power consumers every 15 minutes or once a month, without installing any measuring equipment. After a data warehouse is constructed by interfacing with the legacy systems required for the load calculation, the relationship between the distribution system and the power consumers is established. Once the load pattern is forecast by applying clustering and classification algorithms from temporal data mining to customers not covered by Automatic Meter Reading (AMR), a single-line diagram per feeder is created and a power flow calculation is executed. The results are analyzed using various temporal and spatial methods such as an Internet Geographic Information System (GIS), single-line diagrams, and Online Analytical Processing (OLAP).
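
The clustering step over customers' load patterns could be sketched with a plain k-means on fixed-length load profiles; the abstract does not name the exact algorithm, so this is only an assumed illustration on made-up profiles:

```python
import math, random

def kmeans(profiles, k, iters=50, seed=0):
    """Plain k-means over fixed-length load profiles (lists of floats).
    Returns (assignments, centers)."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(profiles, k)]
    assign = [0] * len(profiles)
    for _ in range(iters):
        # assignment step: nearest center by Euclidean distance
        for i, p in enumerate(profiles):
            assign[i] = min(range(k), key=lambda c: math.dist(p, centers[c]))
        # update step: center = mean of its members
        for c in range(k):
            members = [profiles[i] for i in range(len(profiles)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centers
```

With hypothetical 15-minute profiles, customers with similar consumption shapes end up in the same cluster, which is the input the forecasting step would then use.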

Detection of Multiple Salient Objects by Categorizing Regional Features

  • Oh, Kang-Han;Kim, Soo-Hyung;Kim, Young-Chul;Lee, Yu-Ra
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.1 / pp.272-287 / 2016
  • Recently, various effective contrast-based salient object detection models focused on a single target have been proposed. There has been little research on detecting multiple objects, however, and it is a more challenging task than the single-target case: distinct differences between the properties of the objects introduce new difficulties. Existing models that depend on the global maximum of the data distribution are at a disadvantage when detecting multiple objects. In this paper, after analyzing the limitations of existing methods, we devise a three-stage process to detect multiple salient objects. In the first stage, regional features are extracted from over-segmented regions. In the second stage, the regional features are categorized into homogeneous clusters using the mean-shift algorithm with kernels of various sizes. In the final stage, we compute saliency scores of the categorized regions using only spatial features, without contrast features, and integrate all scores to obtain the final salient regions. In experiments, the scheme achieved superior detection accuracy on the SED2 and MSRA-ASD benchmarks, with both higher precision and better recall than state-of-the-art approaches. In particular, given multiple objects with different properties, our model significantly outperforms all existing models.
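
As a toy illustration of the second stage, the sketch below runs flat-kernel mean shift on scalar region features; the paper works on multi-dimensional regional features with kernels of several sizes, so this is a deliberate simplification:

```python
def mean_shift(values, bandwidth, iters=50):
    """Flat-kernel mean shift on scalar features: each point repeatedly moves
    to the mean of its neighbours within `bandwidth` until it stops moving."""
    modes = list(values)
    for _ in range(iters):
        moved = False
        for i, m in enumerate(modes):
            nb = [v for v in values if abs(v - m) <= bandwidth]
            new = sum(nb) / len(nb)
            if abs(new - m) > 1e-9:
                modes[i], moved = new, True
        if not moved:
            break
    # group points whose converged modes coincide (within half a bandwidth)
    labels, centers = [], []
    for m in modes:
        for j, c in enumerate(centers):
            if abs(m - c) <= bandwidth / 2:
                labels.append(j)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels
```

Unlike k-means, mean shift does not need the number of clusters in advance, which suits grouping an unknown number of object regions.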

Object Classification based on Weakly Supervised E2LSH and Saliency map Weighting

  • Zhao, Yongwei;Li, Bicheng;Liu, Xin;Ke, Shengcai
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.1 / pp.364-380 / 2016
  • The most popular approach to object classification is based on the bag-of-visual-words model, which suffers from several fundamental problems that restrict its performance, such as low time efficiency, the synonymy and polysemy of visual words, and the lack of spatial information between visual words. In view of this, an object classification method based on weakly supervised E2LSH and saliency-map weighting is proposed. First, E2LSH (Exact Euclidean Locality Sensitive Hashing) is employed to generate a group of weakly randomized visual dictionaries by clustering SIFT features of the training dataset, and the selection of hash functions is supervised, inspired by random forest ideas, to reduce the randomness of E2LSH. Second, the graph-based visual saliency (GBVS) algorithm is applied to compute the saliency map of each image and to weight the visual words according to the saliency prior. Finally, a saliency-map-weighted visual language model is used to carry out object classification. Experimental results on the Pascal 2007 and Caltech-256 datasets indicate that the distinguishability of objects is effectively improved and that our method is superior to state-of-the-art object classification methods.
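
A single E2LSH hash function for p-stable (Gaussian) LSH has the well-known form h(v) = floor((a·v + b) / w). The sketch below implements just that one ingredient, not the paper's supervised selection of hash functions:

```python
import random

class E2LSH:
    """One p-stable (Gaussian) LSH function: h(v) = floor((a.v + b) / w).
    Nearby vectors collide with high probability; distant ones rarely do."""
    def __init__(self, dim, w=4.0, seed=0):
        rng = random.Random(seed)
        self.a = [rng.gauss(0, 1) for _ in range(dim)]  # Gaussian projection
        self.b = rng.uniform(0, w)                      # random offset
        self.w = w                                      # bucket width
    def hash(self, v):
        dot = sum(ai * vi for ai, vi in zip(self.a, v))
        return int((dot + self.b) // self.w)
```

A dictionary is then built from several such functions; the paper's contribution is choosing among candidate functions in a supervised, random-forest-inspired way rather than purely at random.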

Light Contribution Based Importance Sampling for the Many-Light Problem (다광원 문제를 위한 광원 기여도 기반의 중요도 샘플링)

  • Kim, Hyo-Won;Ki, Hyun-Woo;Oh, Kyoung-Su
    • Proceedings of the Korean Information Science Society Conference / 2008.06b / pp.240-245 / 2008
  • Rendering a scene that contains many light sources realistically requires a large amount of lighting computation. The Monte Carlo method is widely used to compute illumination from many lights quickly. Based on the Monte Carlo method, this paper proposes a new importance sampling technique that can sample many light sources effectively. The proposed technique rests on two key observations: first, even when a scene contains many light sources, often only a few of them strongly affect a given region; second, pixels with low spatial coherence or pixels near shadow boundaries are dominated by different sets of lights. Based on these observations, the proposed method evaluates how much each light contributes to a given region and builds a probability density function (PDF) proportional to that contribution. To this end, pixels are clustered in image space and representative samples are selected based on the cluster structure. The contributions of the light sources are evaluated at the representative samples, a per-cluster PDF is derived from them, and the final rendering is performed. Compared with conventional sampling, the proposed technique produces images with less noise for the same number of samples. It can effectively handle scenes with many lights, diverse materials, and complex occlusion.
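
The core idea, a sampling PDF proportional to each light's estimated contribution, can be sketched in a few lines. When the PDF is exactly proportional to known contributions, the Monte Carlo estimate of the total illumination has zero variance, which motivates approximating contributions per cluster; the clustering machinery itself is omitted here:

```python
import random

def sample_lights(contributions, n_samples, rng):
    """Importance-sample light indices with probability proportional to each
    light's estimated contribution; returns (index, pdf) pairs."""
    total = sum(contributions)
    pdf = [c / total for c in contributions]
    cdf, acc = [], 0.0
    for p in pdf:                       # build the cumulative distribution
        acc += p
        cdf.append(acc)
    samples = []
    for _ in range(n_samples):
        u = rng.random()
        i = next((k for k, t in enumerate(cdf) if u <= t), len(cdf) - 1)
        samples.append((i, pdf[i]))
    return samples

def estimate(contributions, n_samples, seed=0):
    """Monte Carlo estimate of total illumination: E[c_i / pdf_i] = sum(c)."""
    rng = random.Random(seed)
    est = 0.0
    for i, p in sample_lights(contributions, n_samples, rng):
        est += contributions[i] / p     # each term equals the exact total
    return est / n_samples
```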

The Binary Tree Vector Quantization Using Human Visual Properties (인간의 시각 특성을 이용한 이진 트리 벡터 양자화)

  • 유성필;곽내정;박원배;안재형
    • Journal of Korea Multimedia Society / v.6 no.3 / pp.429-435 / 2003
  • In this paper, we propose an improved binary tree vector quantization that takes into account spatial sensitivity, one of the properties of human vision. When splitting nodes along the eigenvector in binary tree vector quantization, we apply weights that reflect how the human visual system responds to changes of the three primary colors in image blocks. We also propose a novel quality measure for quantized images that applies the modulation transfer function (MTF) to the luminance component of the quantization error of a color image. Test results show that the proposed method generates quantized images with finer color and clusters similar regions better than the conventional method. The proposed method also produces images with fewer quantization levels, reducing the resources occupied by the quantized image.
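
The eigenvector-based node split at the heart of binary tree vector quantization can be sketched as follows, without the human-visual-system weights the paper adds: project each color vector onto the principal axis of the node and split by the sign of the projection.

```python
def principal_axis(vectors, iters=100):
    """Leading eigenvector of the covariance matrix via power iteration."""
    d, n = len(vectors[0]), len(vectors)
    mean = [sum(v[j] for v in vectors) / n for j in range(d)]
    cov = [[sum((v[i] - mean[i]) * (v[j] - mean[j]) for v in vectors) / n
            for j in range(d)] for i in range(d)]
    axis = [1.0] * d
    for _ in range(iters):
        nxt = [sum(cov[i][j] * axis[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in nxt) ** 0.5
        axis = [x / norm for x in nxt]
    return mean, axis

def split_node(vectors):
    """One binary-tree VQ split: separate vectors by the sign of their
    projection onto the principal axis through the node mean."""
    mean, axis = principal_axis(vectors)
    left, right = [], []
    for v in vectors:
        proj = sum((v[j] - mean[j]) * axis[j] for j in range(len(v)))
        (left if proj < 0 else right).append(v)
    return left, right
```

The paper's modification would weight the covariance by color-channel sensitivity before extracting the eigenvector; this unweighted version is only the baseline split.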

Drought Classification Method for Jeju Island using Standard Precipitation Index (표준강수지수를 활용한 제주도 가뭄의 공간적 분류 방법 연구)

  • Park, Jae-Kyu;Lee, Jun-ho;Yang, Sung-Kee;Kim, Min-Chul;Yang, Se-Chang
    • Journal of Environmental Science International / v.25 no.11 / pp.1511-1519 / 2016
  • Jeju Island relies on groundwater for over 98% of its water resources, so studies on drought driven by climate change must continue. In this study, the representative standardized precipitation index (SPI) is classified by various criteria, and the spatial characteristics and applicability of drought in Jeju Island are evaluated from the results. Calculating the SPI (SPI 3, 6, 9, and 12) for four weather stations showed that SPI 12 varies relatively smoothly compared with SPI 6. It was also verified that the fluctuation of the SPI was greater for short-term data and that long-term data were relatively more useful for identifying extreme drought. Cluster analysis was performed with the K-means technique on two variables extracted by factor analysis; the clustering terminated after seven iterations, and two clusters were formed.
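
For illustration only, a heavily simplified SPI-like index can be computed as the z-score of a moving n-month precipitation total; the real SPI fits a gamma distribution to the totals before transforming to a standard normal, which this sketch skips:

```python
import statistics

def standardized_index(precip, window):
    """Simplified SPI-like index: z-score of the moving `window`-month
    precipitation total (real SPI fits a gamma distribution first)."""
    totals = [sum(precip[i - window + 1 : i + 1])
              for i in range(window - 1, len(precip))]
    mu = statistics.mean(totals)
    sd = statistics.pstdev(totals)
    return [(t - mu) / sd for t in totals]
```

Negative values mark drier-than-usual periods; a longer window (like SPI 12 versus SPI 6) smooths the index, matching the study's observation that short-term data fluctuate more.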

A framework for parallel processing in multiblock flow computations (다중블록 유동해석에서 병렬처리를 위한 시스템의 구조)

  • Park, Sang-Geun;Lee, Geon-U
    • Transactions of the Korean Society of Mechanical Engineers B / v.21 no.8 / pp.1024-1033 / 1997
  • The past several years have witnessed ever-increasing acceptance and adoption of parallel processing, both for high-performance scientific computing and for more general-purpose applications. Furthermore, with the increasing need to perform complex flow calculations efficiently, the message passing model on distributed networks has emerged as an important alternative to expensive supercomputers. This work provides a generic framework for parallelizing CFD-related work using the master-slave model. The framework consists of the sequential components (1) input geometry, (2) domain decomposition, (3) grid generation, (4) flow computation, (5) flow visualization, and (6) output display, but performs steps (2) to (5) in parallel on a workstation cluster. The flow computations are parallelized by having multiple copies of the flow code solve a PDE on different spatial regions on different processors, while flow data are exchanged across the region boundaries and the solution is time-stepped. The Parallel Virtual Machine (PVM) is used for distributed communication.
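
The block-wise computation with boundary exchange can be mimicked serially in a few lines: each block advances an explicit 1-D diffusion step using ghost values taken from its neighbours, standing in for the PVM message passing (the actual solver and geometry handling are of course far richer):

```python
def step(u, left_ghost, right_ghost, alpha=0.25):
    """One explicit diffusion step on a block, given ghost values from the
    neighbouring blocks (or the physical boundary)."""
    ext = [left_ghost] + u + [right_ghost]
    return [ext[i] + alpha * (ext[i - 1] - 2 * ext[i] + ext[i + 1])
            for i in range(1, len(ext) - 1)]

def solve_decomposed(u0, n_blocks, steps):
    """Split the 1-D field into blocks, time-step each block, and exchange
    boundary values between neighbours after every step -- a serial stand-in
    for the message passing the paper does with PVM."""
    size = len(u0) // n_blocks
    blocks = [u0[b * size:(b + 1) * size] for b in range(n_blocks)]
    for _ in range(steps):
        new = []
        for b, blk in enumerate(blocks):
            lg = blocks[b - 1][-1] if b > 0 else blk[0]            # reflective BC
            rg = blocks[b + 1][0] if b < n_blocks - 1 else blk[-1]  # reflective BC
            new.append(step(blk, lg, rg))
        blocks = new
    return [x for blk in blocks for x in blk]
```

Because the exchanged boundary values are exact, the decomposed run reproduces the single-domain result, which is the correctness property the real parallel version must preserve.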

Large-eddy simulation of channel flow using a spectral domain-decomposition grid-embedding technique (스펙트럴 영역분할 격자 삽입법을 이용한 채널유동의 큰 에디 모사)

  • Gang, Sang-Mo;Byeon, Do-Yeong;Baek, Seung-Uk
    • Transactions of the Korean Society of Mechanical Engineers B / v.22 no.7 / pp.1030-1040 / 1998
  • One of the main unresolved issues in large-eddy simulation (LES) of wall-bounded turbulent flows is the requirement of high spatial resolution in the near-wall region, especially in the spanwise direction. Such high resolution, required only near the wall, is generally used throughout the computational domain, making simulations of high-Reynolds-number, complex-geometry flows prohibitive. A grid-embedding strategy using a nonconforming spectral domain-decomposition method is proposed to address this limitation. The method provides an efficient way of clustering grid points in the near-wall region with spectral accuracy. LES of transitional and turbulent channel flow has been performed to evaluate the proposed grid-embedding technique. The computational domain is divided into three subdomains to resolve the near-wall regions in the spanwise direction. Spectral patching collocation methods are used for the grid embedding, and appropriate conditions are suggested for the interface matching. Results of LES using the grid-embedding strategy compare favorably with LES using a global spectral method and with direct numerical simulation. Overall, the results show that the spectral domain-decomposition grid-embedding technique provides an efficient way of resolving the near-wall region in LES of complex flows of engineering interest, allowing significant savings in CPU time and memory.
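
The wall clustering that spectral collocation provides comes directly from the Chebyshev-Gauss-Lobatto points x_j = cos(pi j / N), whose spacing shrinks like O(1/N^2) next to the endpoints but only like O(1/N) at the centre:

```python
import math

def chebyshev_lobatto(n):
    """Chebyshev-Gauss-Lobatto collocation points on [-1, 1]:
    x_j = cos(pi * j / n). They cluster near the endpoints (the walls)."""
    return [math.cos(math.pi * j / n) for j in range(n + 1)]

n = 16
pts = sorted(chebyshev_lobatto(n))
wall_gap = pts[1] - pts[0]            # spacing next to the wall, ~O(1/n^2)
mid_gap = pts[n // 2 + 1] - pts[n // 2]  # spacing at the channel centre, ~O(1/n)
```

This built-in clustering is why placing spectral subdomains near the walls resolves the near-wall region without refining the whole channel.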