• Title/Summary/Keyword: Robust Statistics

Robust determination of control parameters in K chart with respect to data structures (데이터 구조에 강건한 K 관리도의 관리 모수 결정)

  • Park, Ingkeun; Lee, Sungim
    • Journal of the Korean Data and Information Science Society, v.26 no.6, pp.1353-1366, 2015
  • The Shewhart control chart for evaluating process stability is widely used in many fields, but it relies on strict distributional assumptions. In real-life problems this assumption is often violated, since many quality characteristics follow non-normal distributions, and the problem is more serious for multivariate quality characteristics. To overcome this, many researchers have studied nonparametric control charts. Recently, the SVDD (Support Vector Data Description) control chart based on the RBF (Radial Basis Function) kernel, called the K-chart, which determines a description of the data region for the in-control process, has come into use in various fields. However, the kernel parameter and related settings must be predetermined before the K-chart can be applied. Many researchers use a grid search to optimize these parameters, but this raises problems such as selecting the search range and the computational cost and time. In this paper, we study via simulation how the efficient parameter regions vary with the data structure, propose a new method for determining the parameters that can be easily used, and discuss a choice of parameters that is robust across various data structures. In addition, we apply the method to a real example and evaluate its performance.
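The data-description step of a K-chart can be sketched with scikit-learn's `OneClassSVM`, whose ν-parameterization with an RBF kernel is equivalent to SVDD. The training data, the gamma grid, and the inside-fraction score below are illustrative assumptions for a minimal sketch, not the parameter-selection method the paper proposes.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))          # hypothetical in-control (Phase I) data

# Naive grid search over the RBF kernel parameter gamma; the score here is
# simply the fraction of training points falling inside the description.
best_gamma, best_score = None, -np.inf
for gamma in [0.01, 0.1, 1.0, 10.0]:
    chart = OneClassSVM(kernel="rbf", gamma=gamma, nu=0.05).fit(X_train)
    inside = (chart.predict(X_train) == 1).mean()   # +1 = inside the boundary
    if inside > best_score:
        best_score, best_gamma = inside, gamma

chart = OneClassSVM(kernel="rbf", gamma=best_gamma, nu=0.05).fit(X_train)
# Monitoring: negative decision values fall outside the control region.
X_new = np.vstack([rng.normal(size=(5, 2)), [[6.0, 6.0]]])
signals = chart.decision_function(X_new) < 0
```

The grid search above illustrates exactly the cost the abstract criticizes: each candidate gamma requires refitting the whole model, and the search range is chosen by hand.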

Comparison of GEE Estimation Methods for Repeated Binary Data with Time-Varying Covariates on Different Missing Mechanisms (시간-종속적 공변량이 포함된 이분형 반복측정자료의 GEE를 이용한 분석에서 결측 체계에 따른 회귀계수 추정방법 비교)

  • Park, Boram; Jung, Inkyung
    • The Korean Journal of Applied Statistics, v.26 no.5, pp.697-712, 2013
  • When analyzing repeated binary data, the generalized estimating equations (GEE) approach produces consistent estimates of the regression parameters even if an incorrect working correlation matrix is used. However, for finite samples, time-varying covariates show larger changes in their coefficients across working correlation structures than time-invariant covariates. In addition, the GEE approach may give biased estimates under missing at random (MAR). Weighted estimating equations and multiple imputation methods have been proposed to reduce bias in the parameter estimates under MAR. This article studies whether the two methods produce robust estimates across various working correlation structures for longitudinal binary data with time-varying covariates under different missing mechanisms. Through simulation, we observe that time-varying covariates show greater differences in parameter estimates across working correlation structures than time-invariant covariates. The multiple imputation method produces estimates that are more robust to the choice of working correlation structure, with smaller biases than the other methods.

A Robust Edge Detection method using Van der Waerden Statistic (Waerden 통계량을 이용한 강인한 에지검출 방법)

  • 최명희; 이호근; 김주원; 하영호
    • Journal of the Institute of Electronics Engineers of Korea SP, v.41 no.3, pp.147-153, 2004
  • This paper proposes an efficient edge detection method using the Van der Waerden statistic in original and noisy images. An edge is where the intensity of an image moves from a low value to a high value, or vice versa. We describe a nonparametric Wilcoxon test and a parametric T test, based on statistical hypothesis testing, for the detection of edges. We use a threshold determined by specifying the significance level $\alpha$, whereas Bovik, Huang and Munson consider the range of possible values of the test statistics for the threshold. The edge detection experiments show that the T and Wilcoxon methods are sensitive to noise, while the proposed Waerden method is robust on both noisy and noise-free images at $\alpha=0.0005$. A comparison of our statistical tests with the Sobel, LoG, and Canny operators shows that the Waerden method performs more effectively on both noisy and noise-free images.
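The core of a Van der Waerden edge test is a two-sample normal-scores statistic computed on the pixels on either side of a candidate edge. The window layout and pixel values below are hypothetical; this is a sketch of the statistic itself, not the paper's full detector.

```python
import numpy as np
from scipy.stats import norm, rankdata

def waerden_z(left, right):
    """Standardized Van der Waerden statistic comparing two pixel samples."""
    combined = np.concatenate([left, right])
    n, m = len(left), len(combined)
    scores = norm.ppf(rankdata(combined) / (m + 1))   # normal scores of ranks
    t = scores[:n].sum()                              # score sum of the left sample
    # Under H0 the scores sum to zero, so E[t] = 0; this is the exact variance
    # of a score sum drawn without replacement.
    var = n * (m - n) / (m * (m - 1)) * np.sum(scores ** 2)
    return t / np.sqrt(var)

rng = np.random.default_rng(2)
flat = rng.normal(0.5, 0.05, size=16)                 # uniform-intensity window
edge = np.concatenate([rng.normal(0.2, 0.05, 8),      # dark left half
                       rng.normal(0.8, 0.05, 8)])     # bright right half
z_flat = waerden_z(flat[:8], flat[8:])
z_edge = waerden_z(edge[:8], edge[8:])
```

A pixel is declared an edge when |z| exceeds the normal quantile for the chosen significance level; because only ranks enter the statistic, a single extreme noise pixel cannot dominate it.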

Algorithm for the L1-Regression Estimation with High Breakdown Point (L1-회귀추정량의 붕괴점 향상을 위한 알고리즘)

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods, v.17 no.4, pp.541-550, 2010
  • The $L_1$-regression estimator is susceptible to leverage points, even though it is highly robust to vertical outliers. This article is concerned with improving the robustness of the $L_1$-estimator in terms of the breakdown point. To do so, we dampen the influence of the leverage points by reducing the weights assigned to them. In addition, the algorithm employs a linear scaling transformation technique to solve the linear programming problem of $L_1$-estimation with higher computational efficiency on large data sets. Monte Carlo simulation results indicate that the proposed algorithm yields $L_1$-estimates that are robust to leverage points as well as to vertical outliers.
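The linear programming formulation of $L_1$-estimation that the abstract refers to can be sketched as follows: split each residual into positive and negative parts and minimize their sum. This shows only the basic LP step and its robustness to a vertical outlier, not the paper's leverage-point downweighting or the linear scaling transformation.

```python
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y):
    """L1 (least absolute deviations) fit via linear programming:
    minimize sum(u + v)  subject to  X beta + u - v = y,  u, v >= 0."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # cost only on residual parts
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

x = np.arange(10, dtype=float)
y = 1.0 + 2.0 * x
y[5] += 50.0                                  # one vertical outlier
X = np.column_stack([np.ones_like(x), x])
beta = l1_regression(X, y)                    # intercept and slope recovered
```

The vertical outlier leaves the fit untouched, whereas moving an `x` value far out (a leverage point) would drag the $L_1$ line, which is exactly the weakness the paper addresses.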

Preparation of wastewater-based reference materials for heavy metal analysis and interlaboratory study (중금속분석용 폐수표준물질 제조 및 실험실간 비교평가)

  • Kim, Young-Hee; Song, Ki-Bong; Shin, Sun-Kyoung; Lee, Jung-Sub; Jeong, Gi-Taeg; Hong, Eun-Jin; Park, Jin-Ju; Yu, Suk-Min
    • Analytical Science and Technology, v.23 no.3, pp.295-303, 2010
  • In this study, a wastewater-based reference material (RM) was prepared and certified for seven trace metal elements, with evaluation of the uncertainties. The RM was distributed to 25 laboratories for interlaboratory comparison testing. The certified values and expanded uncertainties were derived following ISO Guide 35, and the standard uncertainties for homogeneity were 0.43~2.67% of the certified values. The analytical results from the interlaboratory comparison showed normal distributions, and the robust means were higher than the certified values of the RM for all analytes.
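A robust mean of the kind used to summarize interlaboratory results can be sketched with a Huber-type iteration in the spirit of ISO 13528 Algorithm A (iteratively clip results to mean ± 1.5 × robust spread, then re-estimate). The lab values below are made up; this is an illustration of the estimator, not the study's data.

```python
import numpy as np

def robust_mean(x, tol=1e-6, max_iter=100):
    """Huber-type robust mean (Algorithm A-style iteration)."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    s = 1.483 * np.median(np.abs(x - mu))     # scaled MAD as initial spread
    for _ in range(max_iter):
        delta = 1.5 * s
        clipped = np.clip(x, mu - delta, mu + delta)
        mu_new = clipped.mean()
        s_new = 1.134 * clipped.std(ddof=1)   # bias-corrected clipped std
        if abs(mu_new - mu) < tol and abs(s_new - s) < tol:
            mu, s = mu_new, s_new
            break
        mu, s = mu_new, s_new
    return mu

lab_results = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 14.5]   # one outlying lab
consensus = robust_mean(lab_results)
```

Unlike the arithmetic mean, the consensus value barely moves when a single laboratory reports a gross outlier, which is why robust means are the usual summary for proficiency-testing rounds.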

Efficient Image Segmentation Using Morphological Watershed Algorithm (형태학적 워터쉐드 알고리즘을 이용한 효율적인 영상분할)

  • Kim, Young-Woo; Lim, Jae-Young; Lee, Won-Yeol; Kim, Se-Yun; Lim, Dong-Hoon
    • The Korean Journal of Applied Statistics, v.22 no.4, pp.709-721, 2009
  • This paper discusses efficient image segmentation using a morphological watershed algorithm that is robust to noise. Morphological image segmentation consists of four steps: image simplification, computation of the gradient image, the watershed algorithm, and region merging. Conventional watershed segmentation exhibits a serious weakness: over-segmentation of the image. In this paper we present a morphological edge detection method for detecting edges under noisy conditions, apply our watershed algorithm to the resulting gradient images, and merge regions using the Kolmogorov-Smirnov test to eliminate irrelevant regions in the resulting segmented images. Experimental results are analyzed both qualitatively, through visual inspection, and quantitatively, using percentage error as well as the computational time needed to segment images. The proposed algorithm efficiently improves segmentation accuracy and significantly reduces computation time.

Evaluation of shape similarity for 3D models (3차원 모델을 위한 형상 유사성 평가)

  • Kim, Jeong-Sik; Choi, Soo-Mi
    • The KIPS Transactions: Part A, v.10A no.4, pp.357-368, 2003
  • Evaluation of shape similarity for 3D models is essential in many areas, including medicine, mechanical engineering, and molecular biology. Moreover, as 3D models become common on the Web, much research has been devoted to the classification and retrieval of 3D models. In this paper, we describe methods for 3D shape representation and the major concepts of similarity evaluation, and analyze the key features of recent research on shape comparison after classifying it into four categories: multi-resolution, topology, 2D image, and statistics based methods. In addition, we evaluate the performance of the reviewed methods by selected criteria such as uniqueness, robustness, invariance, multi-resolution, efficiency, and comparison scope. Multi-resolution based methods reduce comparison time at the cost of increased preprocessing time. Methods using geometric and topological information can compare a wider variety of models and are robust to partial shape comparison. 2D image based methods incur overhead in time and space complexity. Statistics based methods allow shape comparison without pose normalization and are robust to affine transformations and noise.

Minimum Density Power Divergence Estimation for Normal-Exponential Distribution (정규-지수분포에 대한 최소밀도함수승간격 추정법)

  • Pak, Ro Jin
    • The Korean Journal of Applied Statistics, v.27 no.3, pp.397-406, 2014
  • Minimum density power divergence estimation has been a popular topic in the field of robust estimation since Basu et al. (1998). The minimum density power divergence estimator has strong robustness properties with little loss in asymptotic efficiency relative to the maximum likelihood estimator under the model. However, a limitation in applying this estimation method is the algebraic difficulty of an integral involved in the estimating function. This paper considers a minimum density power divergence estimation method with an approximated divergence that avoids this difficulty. As an example, we consider the normal-exponential convolution model introduced by Bolstad (2004). The estimated divergence in this case is too complicated, so a Laplace approximation is employed to obtain a manageable form. Simulations and an empirical study show that the minimum density power divergence estimators based on the approximated divergence perform adequately for the normal-exponential model in terms of bias and efficiency.
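For the plain normal model the integral in the density power divergence is available in closed form, so the estimator can be sketched directly; the normal-exponential convolution model the paper treats is exactly the case where this integral is intractable and a Laplace approximation is needed instead. The contaminated sample and the tuning constant α = 0.5 below are illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(theta, x, alpha):
    """Density power divergence objective for a N(mu, sigma^2) model:
    integral of f^(1+alpha) minus (1 + 1/alpha) * mean of f(x_i)^alpha."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                 # parameterize to keep sigma > 0
    # Closed form: int f^(1+alpha) dx = (2 pi sigma^2)^(-alpha/2) / sqrt(1+alpha)
    integral = (2 * np.pi * sigma ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    dens = norm.pdf(x, mu, sigma) ** alpha
    return integral - (1 + 1 / alpha) * dens.mean()

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 50), [10.0]])    # contaminated sample
res = minimize(dpd_objective, x0=[np.median(x), 0.0],
               args=(x, 0.5), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Because each observation enters only through f(x_i)^α, the gross outlier contributes almost nothing to the objective, so the location estimate stays near the bulk of the data where the sample mean would be dragged upward.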

Modifications of single and double EWMA feedback controllers for balancing the mean squared deviation and the adjustment variance (편차제곱평균과 수정량분산의 균형을 위한 단일 및 이중 지수가중이동평균 피드백 수정기의 수정)

  • Park, Chang-Soon; Kwon, Sung-Gu
    • Journal of the Korean Data and Information Science Society, v.20 no.1, pp.11-24, 2009
  • In an adjustment procedure, a process controller is often used to keep the process level close to target when noise is present and cannot be removed. Examples of robust controllers are the single EWMA controller and the double EWMA controller. The double EWMA controller is designed to reduce the offset in the process deviation, which the single EWMA controller cannot eliminate. In this paper, the two controllers are modified by taking an EWMA of the original controller to reduce the adjustment variance, which may become excessively large when the two given controllers are implemented. It is shown that the EWMA modification of the given controllers successfully reduces the adjustment variance, while the mean squared deviation increases only slightly.
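The trade-off the abstract describes can be seen with a minimal single EWMA (integral) adjustment rule under an assumed unit process gain and a step disturbance; the simulation below is a hypothetical illustration of the mean-squared-deviation versus adjustment-variance balance, not the modified controllers the paper derives.

```python
import numpy as np

rng = np.random.default_rng(5)
T, lam = 200, 0.2                             # horizon and EWMA weight
noise = rng.normal(0, 1, T)
offset = 5.0                                  # step disturbance off target

x = 0.0                                       # cumulative adjustment applied
dev_ctrl, dev_open, adjustments = [], [], []
for t in range(T):
    e = offset + noise[t] + x                 # observed deviation from target
    dev_ctrl.append(e)
    dev_open.append(offset + noise[t])        # no-adjustment benchmark
    a = -lam * e                              # single EWMA (integral) rule
    adjustments.append(a)
    x += a                                    # apply the adjustment

msd_ctrl = np.mean(np.square(dev_ctrl))       # mean squared deviation, controlled
msd_open = np.mean(np.square(dev_open))       # mean squared deviation, uncontrolled
adj_var = np.var(adjustments)                 # adjustment variance
```

A larger λ removes the offset faster but inflates `adj_var`; smoothing the adjustment sequence itself, as the paper's EWMA modification does, trades a small increase in `msd_ctrl` for a smaller `adj_var`.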

Improving Lane Marking Detection by Combining Horizontal 1-D LoG Filtered Scale Space and Variable Thresholding (수평 1-D LoG 필터링 스케일 공간과 가변적 문턱처리의 결합에 의한 차선 마킹 검출 개선)

  • Yoo, Hyeon-Joong
    • Journal of the Institute of Electronics Engineers of Korea SP, v.49 no.4, pp.85-94, 2012
  • Lane marking detection is essential to both intelligent transportation systems (ITS) and driver assistance systems (DAS). The objective of this paper is to provide a more robust technique for lane marking detection than traditional techniques by using a scale-space approach. Variable thresholding based on local statistics can be very effective for detecting objects, such as lane markings, that have prominent intensities. However, techniques that rely only on local statistics have the limitation of including irrelevant areas as well. We reduce the candidate areas by combining the variable thresholding result with a cost-efficient horizontal 1-D LoG filtered scale space. Through experiments on real images, we achieved significant improvement over techniques based not only on variable thresholding but also on the Hough transform, another very popular technique for this purpose.
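The two ingredients, a horizontal 1-D LoG response and a local-statistics threshold, can be sketched on a single synthetic image row; a 1-D LoG along the row is simply the second derivative of a Gaussian, available via `gaussian_filter1d` with `order=2`. The stripe position, filter scale, and window/threshold constants below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import ndimage

# Synthetic road row: dark asphalt with one bright lane-marking stripe.
rng = np.random.default_rng(6)
row = rng.normal(0.1, 0.02, 200)
row[95:105] += 0.8                            # bright stripe, width 10

# Horizontal 1-D LoG: second derivative of a Gaussian along the row.
# A bright ridge gives a strongly negative response, so negate it.
resp = -ndimage.gaussian_filter1d(row, sigma=3.0, order=2)

# Variable threshold from local statistics: mean + k * std in a sliding window.
w, k = 61, 2.0
local_mean = ndimage.uniform_filter1d(resp, size=w)
local_sq = ndimage.uniform_filter1d(resp ** 2, size=w)
local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 0.0))
detected = resp > local_mean + k * local_std
```

Intersecting the variable-thresholding mask with the LoG response computed at a few scales is what prunes the irrelevant bright areas that local statistics alone would pass through.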