• Title/Summary/Keyword: peaks-over-threshold

Search Results: 34

Adaptive thresholding for eliminating noises in 2-DE image (2차원 전기영동 영상에서 잡영을 제거하기 위한 적응적인 문턱값 결정)

  • Choi, Kwan-Deok;Kim, Mi-Ae;Yoon, Young-Woo
    • Journal of the Institute of Convergence Signal Processing / v.9 no.1 / pp.1-9 / 2008
  • One of the problems in implementing the spot detection phase of a 2-DE gel image analysis program is eliminating noise in the image. Noise that remains after the preprocessing phase causes over-segmented regions in the segmentation phase. If a fixed thresholding method, which chooses a single intensity value as the threshold, is used to identify and exclude the over-segmented background regions, spots that are invisible to the eye but represent very small amounts of proteins with important roles in the biological samples could be eliminated. This paper proposes an adaptive thresholding method based on an idea obtained from a statistical analysis of peak prominences. The adaptive thresholding method works as follows. First, we calculate an average prominence curve and fit it to an exponential function, obtaining the parameters of that function. We then calculate a threshold value using these parameters and the probability distribution of the fitting errors. Finally, we apply the threshold to each region to determine whether it is noise. According to the probability distribution of the errors, the reliability is 99.85%, and we demonstrate the correctness of the proposed method with experimental results.

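A minimal sketch of the prominence-based adaptive threshold summarized above, under assumed details the abstract does not spell out: a decaying-exponential fit of the sorted peak prominences, and a 3-sigma reading of the quoted 99.85% reliability. The function name and the synthetic profile are illustrative, not from the paper.

```python
import numpy as np
from scipy.signal import find_peaks, peak_prominences
from scipy.optimize import curve_fit

def adaptive_prominence_threshold(profile, reliability_sigma=3.0):
    """Return a prominence threshold; peaks below it are treated as noise."""
    peaks, _ = find_peaks(profile)
    prom = np.sort(peak_prominences(profile, peaks)[0])[::-1]   # descending prominences
    rank = np.arange(1, prom.size + 1, dtype=float)

    def expo(r, a, b):                     # assumed decaying-exponential form a*exp(-b*r)
        return a * np.exp(-b * r)

    (a, b), _ = curve_fit(expo, rank, prom, p0=(prom[0], 1.0 / prom.size), maxfev=10000)

    # Threshold = fitted noise level plus a multiple of the residual spread; 3 sigma
    # corresponds to the ~99.85% one-sided reliability quoted in the abstract.
    resid = prom - expo(rank, a, b)
    return expo(rank[-1], a, b) + reliability_sigma * resid.std()

# Synthetic 1-D intensity profile: low-level noise plus three genuine "spots"
rng = np.random.default_rng(0)
profile = rng.normal(0.0, 0.05, 1000)
profile[[200, 500, 800]] += [1.0, 0.6, 0.3]
print(adaptive_prominence_threshold(profile))
```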

Frequency Analysis of Partial Duration Series for Flood Discharge of Rivers (하천 홍수량에 대한 부분시계열 빈도분석)

  • Lee, Gyu-Min;Jun, Kyung-Soo
    • Proceedings of the Korea Water Resources Association Conference / 2010.05a / pp.174-178 / 2010
  • In general, the design flood is determined by estimating a design rainfall through rainfall frequency analysis, applying it to a rainfall-runoff model for the basin, and routing the computed runoff through a steady-flow simulation. This conventional procedure involves arbitrariness in estimating the design rainfall. This study therefore presents a method for estimating flood discharges using observed stage data for the target river reach. Observed hourly discharges at the Yeoju station on the Namhan River were selected for analysis, covering the period from 1988, after completion of the Chungju Dam, to 2007. Two data sets were extracted for frequency analysis, an annual maximum series and a peaks-over-threshold (POT) series, and flood quantiles were estimated from each. For the annual maximum series, the Weibull distribution was selected as the appropriate distribution, and the partial duration series (POT) frequency analysis was performed for three groups: the entire record and the first and last 10-year sub-periods. The flood quantiles estimated from the annual maximum series were larger than those from the POT analysis, and the POT results based on the most recent 10 years were larger than those based on the entire record, indicating an increasing trend in flood discharge.

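A hedged sketch of the two sampling schemes compared above: an annual-maximum series versus a declustered peaks-over-threshold (POT) series drawn from the same discharge record. The threshold quantile, 72-hour independence gap, and synthetic record are illustrative assumptions, not the study's settings.

```python
import numpy as np
import pandas as pd
from scipy import stats

def annual_maxima(q: pd.Series) -> pd.Series:
    return q.groupby(q.index.year).max()        # one flood peak per calendar year

def pot_series(q: pd.Series, threshold: float, min_gap_hours: int = 72) -> np.ndarray:
    """Independent exceedances over `threshold`, separated by at least `min_gap_hours`."""
    exc = q[q > threshold]
    peaks, last_time = [], None
    for t, v in exc.items():
        if last_time is None or (t - last_time) > pd.Timedelta(hours=min_gap_hours):
            peaks.append(v)                     # new independent flood event
        elif v > peaks[-1]:
            peaks[-1] = v                       # keep the largest value within a cluster
        last_time = t
    return np.asarray(peaks)

# Synthetic hourly discharge record standing in for the observed series
idx = pd.date_range("1988-01-01", "2007-12-31", freq="h")
q = pd.Series(np.random.default_rng(1).gamma(2.0, 150.0, len(idx)), index=idx)

u = np.quantile(q, 0.995)
ams, pot = annual_maxima(q), pot_series(q, threshold=u)

# Annual maxima -> Weibull (as selected in the study); POT exceedances -> generalized Pareto
print(stats.weibull_min.fit(ams, floc=0))
print(stats.genpareto.fit(pot - u, floc=0))
```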

Study on Optimal Sample Size for Bivariate Frequency Analysis using POT (POT 방법을 이용한 이변량 빈도해석 적정 표본크기 연구)

  • Joo, Kyungwon;Heo, Jun-Haeng
    • Proceedings of the Korea Water Resources Association Conference / 2015.05a / pp.38-38 / 2015
  • Frequency analysis using multivariate probability models has recently been studied in various fields of hydrology. Compared with conventional univariate frequency analysis, multivariate models offer more freedom in the choice of variables and can represent the physical phenomena more accurately, but limited sample sizes and difficulties in parameter estimation and goodness-of-fit testing make them hard to apply in practice. In this study, the peaks-over-threshold (POT) method was used to determine the appropriate sample size for the Cramer-von Mises (CVM) goodness-of-fit test of a copula model. Frequency analysis was performed with hourly rainfall data from the Korea Meteorological Administration station in Seoul, and the parameters of a Gumbel copula were estimated by the maximum pseudolikelihood (MPL) method. For the 50-year record, the sample size was varied from 50 to 2,500, and the appropriate sample size was determined on the basis of the CVM statistic and its p-value.

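A rough illustration of the goodness-of-fit measure used above: the Cramer-von Mises distance between the empirical copula and a fitted Gumbel copula, evaluated for several sample sizes. Estimating the Gumbel parameter by inversion of Kendall's tau is a simple stand-in for the maximum pseudolikelihood (MPL) method cited in the abstract, and the bivariate data here are synthetic.

```python
import numpy as np
from scipy import stats

def gumbel_copula_cdf(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def cvm_statistic(x, y):
    """Cramer-von Mises distance between the empirical and fitted Gumbel copulas."""
    n = len(x)
    u = stats.rankdata(x) / (n + 1)          # pseudo-observations in (0, 1)
    v = stats.rankdata(y) / (n + 1)
    tau, _ = stats.kendalltau(x, y)
    theta = 1.0 / (1.0 - tau)                # tau inversion, a stand-in for MPL estimation
    emp = np.array([np.mean((u <= u[i]) & (v <= v[i])) for i in range(n)])
    return np.sum((emp - gumbel_copula_cdf(u, v, theta)) ** 2), theta

# How the statistic behaves as the (synthetic, positively dependent) sample grows
rng = np.random.default_rng(2)
for n in (50, 500, 2500):
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], n)
    s, theta = cvm_statistic(z[:, 0], z[:, 1])
    print(n, round(s, 3), round(theta, 2))
```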

Extreme value modeling of structural load effects with non-identical distribution using clustering

  • Zhou, Junyong;Ruan, Xin;Shi, Xuefei;Pan, Chudong
    • Structural Engineering and Mechanics / v.74 no.1 / pp.55-67 / 2020
  • The common practice for predicting characteristic structural load effects (LEs) over long reference periods is to employ extreme value theory (EVT) to build limit distributions. However, most applications ignore that LEs are driven by multiple loading events and thus are not identically distributed, which is a prerequisite for EVT. In this study, we propose a composite extreme value modeling approach using clustering to (a) cluster the initial blended samples into a finite number of identically distributed subsamples using a finite mixture model, the expectation-maximization algorithm, and the Akaike information criterion; and (b) combine the limit distributions of the subsamples into a composite prediction equation using the generalized Pareto distribution based on a joint threshold. The proposed approach was validated both through numerical examples with known solutions and through engineering applications to bridge traffic LEs on a long-span bridge. The results indicate that a joint threshold greatly benefits composite extreme value modeling, that many appropriate tail-approximation models can be used, and that the resulting equation is simply a weighted sum of the component models. In the numerical examples, the proposed clustering approach produced accurate extreme-value predictions for any reference period compared with the known solutions, whereas the common practice of applying EVT to the mixture data without clustering showed large deviations. Real-world bridge traffic LEs are driven by multiple events and exhibit multi-peak distributions, and the proposed approach captures the tendency of the tail LEs better than the conventional approach. The proposed approach is expected to have wide application to general problems in which samples are driven by multiple events and are not identically distributed.
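
A hedged sketch of the cluster-then-combine idea described above: a Gaussian mixture (EM, with the number of components selected by AIC) splits the blended sample, a single joint threshold is applied, and the tail is the weighted sum of per-cluster generalized Pareto fits. The threshold quantile, component range, and toy data are illustrative, not the paper's settings.

```python
import numpy as np
from scipy.stats import genpareto
from sklearn.mixture import GaussianMixture

def composite_tail(samples, joint_threshold_q=0.95, max_components=4):
    x = np.asarray(samples).reshape(-1, 1)
    # 1) Choose the number of mixture components by AIC (finite mixture model + EM)
    models = [GaussianMixture(k, random_state=0).fit(x) for k in range(1, max_components + 1)]
    labels = min(models, key=lambda m: m.aic(x)).predict(x)

    # 2) A single joint threshold shared by all clusters
    u = np.quantile(samples, joint_threshold_q)
    zeta = np.mean(samples > u)                       # P(X > u)
    n_exc = np.sum(samples > u)

    # 3) Fit a GPD to each cluster's exceedances; weight = share of exceedances it supplies
    parts = []
    for k in np.unique(labels):
        exc = np.asarray(samples)[labels == k]
        exc = exc[exc > u] - u
        if exc.size > 10:
            parts.append((exc.size / n_exc, genpareto.fit(exc, floc=0)))
    return u, zeta, parts

def exceedance_prob(z, u, zeta, parts):
    """P(X > z) as the weighted sum of the per-cluster GPD tails above the joint threshold."""
    return zeta * sum(w * genpareto.sf(z - u, *p) for w, p in parts)

# Load effects blended from two loading events
rng = np.random.default_rng(3)
mix = np.concatenate([rng.normal(10, 2, 4000), rng.normal(20, 4, 1000)])
u, zeta, parts = composite_tail(mix)
print(round(u, 2), exceedance_prob(30.0, u, zeta, parts))
```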

Frequency analysis of storm surge using Poisson-Generalized Pareto distribution (Poisson-Generalized Pareto 분포를 이용한 폭풍해일 빈도해석)

  • Kim, Tae-Jeong;Kwon, Hyun-Han;Shin, Young-Seok
    • Journal of Korea Water Resources Association / v.52 no.3 / pp.173-185 / 2019
  • The Korean Peninsula is considered one of the areas most prone to typhoon-related disasters. In particular, the potential flood risk in coastal areas becomes greater when a storm surge and heavy rainfall occur at the same time. In this context, understanding the mechanism of the interactions between them and estimating the risk associated with their concurrent occurrence are of particular interest, especially in low-lying coastal areas. In this study, we developed a storm surge frequency analysis model based on the Poisson-Generalized Pareto (Poisson-GP) distribution, which models the occurrences of threshold exceedances, that is, the peaks over threshold (POT), within a Bayesian framework. The storm surge frequency analysis technique developed in this study is expected to contribute to improved disaster prevention technology for storm surges in coastal areas.
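
The Poisson-GP building block referred to above can be written down compactly: a Poisson rate for threshold exceedances and a generalized Pareto fit for their magnitudes give the T-year return level z_T = u + (σ/ξ)[(λT)^ξ − 1]. The sketch below uses plain maximum-likelihood fits on synthetic data; the study itself embeds this structure in a Bayesian framework.

```python
import numpy as np
from scipy.stats import genpareto

def poisson_gp_return_level(surges, years, threshold, return_period):
    exceedances = surges[surges > threshold] - threshold
    lam = exceedances.size / years                       # Poisson rate (events per year)
    xi, _, sigma = genpareto.fit(exceedances, floc=0)    # GPD shape and scale
    if abs(xi) < 1e-6:                                   # Gumbel-like limit as xi -> 0
        return threshold + sigma * np.log(lam * return_period)
    return threshold + (sigma / xi) * ((lam * return_period) ** xi - 1.0)

# Synthetic 30-year surge record (an exponential tail stands in for observed storm surges)
rng = np.random.default_rng(4)
surges = rng.exponential(0.15, 30 * 365)                 # daily surge heights in metres
u = np.quantile(surges, 0.98)
print(poisson_gp_return_level(surges, years=30, threshold=u, return_period=100))
```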

Adaptive thresholding noise elimination and asymmetric diffusion spot model for 2-DE image analysis

  • Choi, Kwan-Deok;Yoon, Young-Woo
    • Korea Information Convergence Society Conference Proceedings / 2008.06a / pp.113-116 / 2008
  • In this paper, we suggest two novel methods for implementing the spot detection phase of a 2-DE gel image analysis program: an adaptive thresholding method for eliminating noise and an asymmetric diffusion model for spot matching. Noise that remains after the preprocessing phase causes the over-segmentation problem in the subsequent segmentation phase. If a fixed thresholding method, which chooses a single intensity value as the threshold, is used to identify and exclude the over-segmented background regions, spots that are invisible to the human eye but represent very small amounts of proteins with important roles in the biological samples could be eliminated. Accordingly, we suggest an adaptive thresholding method based on an idea obtained from a statistical analysis of peak prominences. For the spot shape, there are the Gaussian model and the diffusion model. The diffusion model is closer to real spot shapes than the Gaussian model, but spots have highly varied and irregular shapes and, in particular, are asymmetric in the x and y coordinates. The reason for this irregularity is that spots cannot diffuse perfectly across the gel medium because of the characteristics of the 2-DE process. Accordingly, we suggest the asymmetric diffusion model for modeling spot shapes. In this paper we present a brief explanation of the two methods and experimental results.

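The abstract does not give the exact form of the asymmetric diffusion model, so the sketch below only illustrates the asymmetry idea with a hypothetical piecewise (four-width) Gaussian surface: separate spreads to the left/right of the spot centre in x and above/below it in y. It is a stand-in for illustration, not the authors' model.

```python
import numpy as np

def asymmetric_spot(shape, cx, cy, amp, sx_minus, sx_plus, sy_minus, sy_plus):
    """Render one spot on a (rows, cols) grid with direction-dependent widths."""
    y, x = np.mgrid[0:shape[0], 0:shape[1]].astype(float)
    sx = np.where(x < cx, sx_minus, sx_plus)      # different spread left/right of centre
    sy = np.where(y < cy, sy_minus, sy_plus)      # different spread above/below centre
    return amp * np.exp(-((x - cx) ** 2 / (2 * sx ** 2) + (y - cy) ** 2 / (2 * sy ** 2)))

# A spot that trails further to the right and downward than to the left and upward
img = asymmetric_spot((64, 64), cx=30, cy=30, amp=1.0,
                      sx_minus=3.0, sx_plus=7.0, sy_minus=4.0, sy_plus=9.0)
print(img.max(), img.shape)
```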

Multi-stage and Variable-length Peak Windowing Techniques for PAPR Reduction of OFDMA Downlink Systems (OFDMA 하향링크 시스템에서의 PAPR 저감을 위한 다단계 및 가변길이 첨두 윈도윙 기법들)

  • Lee, Sung-Eun;Min, Hyun-Kee;Bang, Keuk-Joon;Hong, Dae-Sik
    • Journal of the Institute of Electronics Engineers of Korea TC / v.45 no.2 / pp.67-74 / 2008
  • This paper proposes two peak-windowing algorithms for peak-to-average power ratio (PAPR) reduction in orthogonal frequency division multiple access (OFDMA) downlink systems. The proposed algorithms mitigate the effect of excessive suppression caused by successive peaks or relatively high peaks in the signal. First, a multi-stage peak windowing algorithm is proposed, which exploits multiple thresholds for the target PAPR in order to step the peaks down gradually. Second, a variable-length peak windowing algorithm is proposed, which adapts the window length according to whether successive peaks exist within half the window length, thereby reducing the distortion of the signal amplitude caused by window overlapping. The proposed algorithms outperform conventional peak windowing by means of window-length adaptation or sequential peak power reduction. Simulation results show the efficiency of the proposed algorithms in OFDMA downlink systems, especially WiBro systems.
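
A hedged sketch of conventional peak windowing and the multi-stage variant described above: peaks are scaled down smoothly with a Hanning window, and the multi-stage version repeats the pass with a decreasing sequence of clipping thresholds so that peaks are stepped down gradually. The window length, threshold schedule, and test signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.signal.windows import hann

def peak_window(x, threshold, win_len=33):
    """One pass of peak windowing: attenuate |x| peaks above `threshold` toward it."""
    mag = np.abs(x)
    w = hann(win_len)                        # smooth weighting, w[win_len // 2] == 1
    half = win_len // 2
    scale = np.zeros_like(mag)
    for p in find_peaks(mag, height=threshold)[0]:
        alpha = 1.0 - threshold / mag[p]     # amount by which this peak must shrink
        lo, hi = max(0, p - half), min(len(x), p + half + 1)
        scale[lo:hi] += alpha * w[half - (p - lo): half + (hi - p)]
    return x * np.clip(1.0 - scale, 0.0, 1.0)

def multi_stage_peak_window(x, thresholds, win_len=33):
    for t in sorted(thresholds, reverse=True):   # step peaks down through each threshold
        x = peak_window(x, t, win_len)
    return x

# Illustrative OFDM-like complex baseband signal with unit average power
rng = np.random.default_rng(5)
x = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) / np.sqrt(2)
y = multi_stage_peak_window(x, thresholds=[3.0, 2.5, 2.0])
print(np.abs(x).max(), np.abs(y).max())
```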

Investigating InSnZnO as an Active Layer for Non-volatile Memory Devices and Increasing Memory Window by Utilizing Silicon-rich SiOx for Charge Storage Layer

  • Park, Heejun;Nguyen, Cam Phu Thi;Raja, Jayapal;Jang, Kyungsoo;Jung, Junhee;Yi, Junsin
    • Proceedings of the Korean Vacuum Society Conference / 2016.02a / pp.324-326 / 2016
  • In this study, we investigated indium tin zinc oxide (ITZO) as an active channel for non-volatile memory (NVM) devices. The electrical and memory characteristics of NVM devices using a multi-stack gate insulator SiO2/SiOx/SiOxNy (OOxOy) with Si-rich SiOx as the charge storage layer are also reported. The transmittance of the ITZO films reached over 85%. In addition, the ITZO-based NVM devices showed good electrical properties such as a high field-effect mobility of 25.8 cm²/V·s, a low threshold voltage of 0.75 V, a low subthreshold slope of 0.23 V/dec, and a high on-off current ratio of 1.25 × 10⁷. Transmission Fourier transform infrared spectroscopy of the SiOx charge storage layer with the richest silicon content showed peaks around 2000-2300 cm⁻¹, indicating that many silicon phases and defect sources exist in the matrix of the SiOx films. In addition, the NVM device retained more than 97% of its threshold voltage shift after 10⁴ s and more than 94% after 10 years, with a low operating voltage of +11 V and a programming time of only 1 ms. Therefore, the NVM fabricated with a highly transparent ITZO active layer and the OOxOy memory stack can be applied to flexible memory systems.


Confidence Intervals for High Quantiles of Heavy-Tailed Distributions (꼬리가 두꺼운 분포의 고분위수에 대한 신뢰구간)

  • Kim, Ji-Hyun
    • The Korean Journal of Applied Statistics / v.27 no.3 / pp.461-473 / 2014
  • We consider confidence intervals for high quantiles of heavy-tailed distributions. Asymptotic confidence intervals based on the limiting distribution of the estimators are considered together with bootstrap confidence intervals. Non-parametric, parametric, and semi-parametric approaches can be applied to each of these two kinds of confidence intervals. We considered 11 confidence intervals and compared their performance in terms of actual coverage probability and interval length. A simulation study shows that two confidence intervals (the semi-parametric asymptotic confidence interval and the semi-parametric bootstrap confidence interval using a pivotal quantity) are relatively more stable under the criterion of actual coverage probability.
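
In the spirit of the semi-parametric intervals found most stable above, here is a minimal sketch of a high-quantile estimate with a bootstrap confidence interval: the bulk of the sample is treated empirically and only the upper tail is modelled with a generalized Pareto fit. The tail fraction, bootstrap size, and Pareto test data are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.stats import genpareto

def semiparametric_quantile(x, p, tail_frac=0.1):
    """Estimate the p-quantile (p close to 1) via a GPD fit to the upper tail."""
    x = np.sort(np.asarray(x))
    k = int(tail_frac * x.size)              # number of upper order statistics used
    u = x[-k - 1]                             # threshold = (1 - tail_frac) sample quantile
    xi, _, sigma = genpareto.fit(x[-k:] - u, floc=0)
    pu = k / x.size                           # empirical exceedance probability P(X > u)
    return u + (sigma / xi) * (((1 - p) / pu) ** (-xi) - 1.0)

def bootstrap_ci(x, p, level=0.95, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)
    est = [semiparametric_quantile(rng.choice(x, size=len(x), replace=True), p)
           for _ in range(n_boot)]
    lo, hi = np.quantile(est, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi

# Heavy-tailed sample (Pareto-like) and the 99.9% quantile with its bootstrap interval
sample = np.random.default_rng(6).pareto(2.0, 5000) + 1.0
print(semiparametric_quantile(sample, 0.999), bootstrap_ci(sample, 0.999))
```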

The Analysis for Flood Damage on Nam-sa Down Stream Region (남사천 하류지역 홍수피해 분석)

  • 김가현;이영대;서진호;민일규
    • Journal of Environmental Science International / v.10 no.3 / pp.217-223 / 2001
  • Where no records are available at a site, a preliminary estimate may be made from relations between floods and catchment characteristics. A number of these characteristics were chosen for testing and were measured for those catchments where mean annual flood estimates were available. Although the improvement from using extended data in the regression of flood estimates on catchment characteristics was small, this may be due to the limitations of the regression model. When an individual short-term record is to be extended, more detailed attention can be given; an example is presented of the technique that should be adopted in practice, particularly when a short-term record covers a period which is known to be biased. A method of extending the peaks-over-threshold series is presented with a numerical example. The extension of records directly from rainfall by means of a conceptual model is discussed, although the application of such methods is likely to be limited by a lack of recording raingauge information. Methods of combining information from various sources are discussed in terms of information from catchment characteristics supplemented by records, but they are generally applicable to different sources of information. The application of this technique to estimating the probable maximum flood requires more conservative assumptions about the antecedent conditions, storm profile, and unit hydrograph. It is suggested that the profile and catchment wetness index at the start of the design duration should be based on the assumption that the estimated maximum rainfall occurs in all durations centered on the storm peak.

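A small, hypothetical illustration of the catchment-characteristics relation mentioned above: a log-log regression of the mean annual flood on catchment descriptors fitted by ordinary least squares. The predictors (area, average annual rainfall) and all numbers are invented for illustration, not values from the study.

```python
import numpy as np

# Gauged catchments: area (km^2), average annual rainfall (mm), mean annual flood (m^3/s)
area = np.array([120.0, 450.0, 80.0, 900.0, 300.0, 1500.0])
rain = np.array([1100.0, 1300.0, 950.0, 1250.0, 1400.0, 1200.0])
qbar = np.array([45.0, 160.0, 22.0, 310.0, 130.0, 480.0])

# ln(QBAR) = c0 + c1 ln(AREA) + c2 ln(RAIN), solved by least squares
X = np.column_stack([np.ones_like(area), np.log(area), np.log(rain)])
coef, *_ = np.linalg.lstsq(X, np.log(qbar), rcond=None)

def mean_annual_flood(area_km2, rain_mm):
    """Preliminary estimate for an ungauged site from the fitted relation."""
    return float(np.exp(coef @ [1.0, np.log(area_km2), np.log(rain_mm)]))

print(coef, mean_annual_flood(600.0, 1200.0))
```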