• Title/Summary/Keyword: 절단정규분포 (truncated normal distribution)


Improved Estimation for Expected Sliding Distance of Caisson Breakwaters by Employment of a Doubly-Truncated Normal Distribution (이중절단정규분포의 적용을 통한 케이슨 방파제 기대활동량 평가의 향상)

  • Kim Tae-Min;Hwang Kyu-Nam;Takayama Tomotsuka
    • Journal of Korean Society of Coastal and Ocean Engineers / v.17 no.4 / pp.221-231 / 2005
  • This study concerns the Level III reliability design method for caisson breakwaters based on expected sliding distance; its objectives are to propose the employment of a doubly-truncated normal distribution and to demonstrate its validity. The effects of uncertain factors are examined, and a clear basis for employing the doubly-truncated normal distribution in the Monte Carlo computation of expected sliding distance is presented, together with the method of employing it. Although only caisson breakwaters are treated in this paper, the doubly-truncated normal distribution can be applied to various coastal structures as well as other engineering fields, so the present study is expected to be extended to those fields.
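  Sampling from a doubly-truncated normal is the core step such a Monte Carlo simulation needs. As an illustrative sketch (not the paper's code; the function name and the bisection-based quantile inversion are my own choices), inverse-CDF sampling guarantees every draw respects the truncation bounds:

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sample_doubly_truncated_normal(mu, sigma, lo, hi, rng=random):
    """Draw one sample from N(mu, sigma^2) truncated to [lo, hi]
    by inverse-CDF sampling: pick u uniformly between Phi(a) and Phi(b),
    then invert the normal CDF on [a, b] by bisection."""
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    u = rng.uniform(phi(a), phi(b))
    zl, zh = a, b
    for _ in range(80):          # bisection: interval shrinks by 2^-80
        zm = 0.5 * (zl + zh)
        if phi(zm) < u:
            zl = zm
        else:
            zh = zm
    return mu + sigma * 0.5 * (zl + zh)
```

  Unlike naive rejection sampling, this never produces the out-of-range extremes that unbounded tails would generate.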

Estimation of the Mean and Variance of a Doubly-Truncated Normal Distribution (양쪽 절단된 정규분포의 평균과 분산의 추정)

  • Choe, Yun-Yeong;Hong, Jong-Seon
    • Proceedings of the Korean Statistical Society Conference / 2002.05a / pp.127-132 / 2002
  • To estimate the mean and variance of a truncated normal distribution, a simulation study compared the method using maximum likelihood estimators based on the full sample with the sample mean and sample variance computed from only the observations remaining after truncation. For estimating the mean, we found, surprisingly, that the estimator based on the truncated data has a smaller mean squared error than the estimator based on the full sample.
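  The comparison of full-sample and truncated-sample estimators of the mean can be illustrated with a toy simulation (the truncation limits, sample size, and the use of the plain full-sample mean in place of the full-sample MLE are my own simplifications, not the paper's setup):

```python
import random
import statistics

random.seed(1)
LO, HI = -1.5, 1.5          # symmetric truncation around the true mean 0
REPS, N = 500, 200
mse_full = mse_trunc = 0.0
for _ in range(REPS):
    full = [random.gauss(0.0, 1.0) for _ in range(N)]
    kept = [x for x in full if LO <= x <= HI]   # observations surviving truncation
    mse_full += statistics.mean(full) ** 2      # squared error of full-sample mean
    mse_trunc += statistics.mean(kept) ** 2     # squared error of truncated-sample mean
print(mse_full / REPS, mse_trunc / REPS)
```

  With symmetric truncation, the retained observations have much smaller variance, which more than offsets the reduced sample size, so the truncated-sample mean ends up with the smaller MSE, consistent with the abstract's finding.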


Testing Log Normality for Randomly Censored Data (임의중도절단자료에 대한 로그정규성 검정)

  • Kim, Nam-Hyun
    • The Korean Journal of Applied Statistics / v.24 no.5 / pp.883-891 / 2011
  • For survival data we sometimes want to test a log-normality hypothesis, which can be reduced to normality by transforming the survival times. Accordingly, the Shapiro-Wilk type statistic for normality is generalized to randomly censored data based on the Kaplan-Meier product-limit estimate of the distribution function. Koziol and Green (1976) derived a randomly censored version of the Cramér-von Mises statistic under the simple hypothesis. These two test statistics are compared through a simulation study. For the distribution of the censoring variables, we consider Koziol and Green (1976)'s model and other similar models. The simulation results show that the power of the proposed statistic is higher than that of the Koziol-Green statistic, and that the proportion of censored observations (rather than the distribution of the censoring variables) strongly influences the power of the proposed statistic.
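  The Kaplan-Meier product-limit estimate on which such generalized statistics are built is easy to compute directly. A minimal sketch (the function name and data layout are my own):

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).
    events[i] = 1 if times[i] is an observed failure, 0 if censored.
    Returns (t, S(t)) pairs at each observed failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        # O(n) scans per distinct time: fine for a sketch, not for large n
        deaths = sum(e for tt, e in data if tt == t)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= ties
        i += ties
    return curve
```

  For the log-normality test described above, one would apply this to the log-transformed survival times and plug the resulting distribution-function estimate into the Shapiro-Wilk type statistic.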

A Modification of the Shapiro-Wilk Test for Exponentiality Based on Censored Data (중도절단자료에 대한 수정된 SHAPIRO-WILK 지수 검정)

  • Kim, Nam-Hyun
    • The Korean Journal of Applied Statistics / v.21 no.2 / pp.265-273 / 2008
  • Kim (2001a) presented a modification of the Shapiro and Wilk (1972) test for exponentiality based on the ratio of two asymptotically efficient estimates of scale. In this paper we modify this test statistic for censored samples. We use the normalized spacings of the sample data, as used in Samanta and Schwarz (1988) to adapt the Shapiro and Wilk (1972) statistic to censored data. As a result, the modified statistics have the same null distribution as in the uncensored case, with a corresponding reduction in sample size. A simulation study shows that the proposed statistic has higher power than the Samanta and Schwarz (1988) statistic, especially for alternatives with a coefficient of variation greater than or equal to 1.
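  The normalized spacings underlying such censored-data exponentiality tests are straightforward to form. A sketch under type-II censoring, where only the first r of n ordered failure times are observed (names are my own):

```python
def normalized_spacings(ordered_failures, n):
    """Normalized spacings D_i = (n - i + 1) * (x_(i) - x_(i-1)), with x_(0) = 0,
    for the first r observed order statistics out of a sample of size n.
    Under the exponential null, the D_i are i.i.d. exponential."""
    prev, out = 0.0, []
    for i, x in enumerate(ordered_failures, start=1):
        out.append((n - i + 1) * (x - prev))
        prev = x
    return out
```

  Because the spacings are again i.i.d. exponential under the null, a censored sample of r failures can be tested with the same null distribution as an uncensored sample of size r, which is exactly the "corresponding reduction in sample size" the abstract mentions.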

Estimation of Probability Density Function of Tidal Elevation Data using the Double Truncation Method (이중 절단 기법을 이용한 조위자료의 확률밀도함수 추정)

  • Jeong, Shin-Taek;Cho, Hong-Yeon;Kim, Jeong-Dae;Hui, Ko-Dong
    • Journal of Korean Society of Coastal and Ocean Engineers / v.20 no.3 / pp.247-254 / 2008
  • The double-peak normal distribution function (DPDF) suggested by Cho et al. (2004) gives excellent goodness-of-fit results, but because its upper and lower limits are unbounded, extremely high and low tidal elevations are frequently generated in Monte Carlo simulations. In this study, a modified DPDF is suggested that removes these problems by introducing upper- and lower-limit parameters and re-scaling parameters. The new parameters of the DPDF are optimally estimated by solving a non-linear optimization problem with the Levenberg-Marquardt scheme. The modified DPDF completely removes the unrealistically generated tidal elevations and gives a slightly better fit than the existing DPDF. In addition, the over- and under-estimation problems of the design factors are automatically prevented.

CUSUM charts for monitoring type I right-censored lognormal lifetime data (제1형 우측중도절단된 로그정규 수명 자료를 모니터링하는 누적합 관리도)

  • Choi, Minjae;Lee, Jaeheon
    • The Korean Journal of Applied Statistics / v.34 no.5 / pp.735-744 / 2021
  • Maintaining the lifetime of a product is one of the objectives of quality control. In real processes, most samples contain censored data because, in many situations, the lifetimes of all units cannot be measured due to time or cost constraints. In this paper, we propose two cumulative sum (CUSUM) control charting procedures to monitor the mean of type I right-censored lognormal lifetime data: one based on the likelihood ratio and the other based on the binomial distribution. Through simulations, we evaluate the performance of the two proposed procedures by comparing their average run lengths (ARL). The likelihood-ratio CUSUM chart performs better overall, particularly when the censoring rate is low and the shape parameter is small. Conversely, the binomial CUSUM chart performs better when the censoring rate is high, the shape parameter is large, and the change in the mean is small.
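  A binomial CUSUM of the general kind described can be sketched as follows. Here each sample of n units is reduced to a count x (for example, the number of units failing before the censoring time); the parameterization and function name are my own illustration, not the authors' exact chart:

```python
import math

def binomial_cusum(counts, n, p0, p1, h):
    """One-sided CUSUM on counts X_t ~ Bin(n, p), built from the
    log-likelihood-ratio increment for testing p = p0 against p = p1 > p0.
    Returns the first sample index t at which the statistic exceeds h,
    or None if it never signals."""
    lr_x = math.log(p1 / p0) - math.log((1 - p1) / (1 - p0))  # per-success term
    lr_c = n * math.log((1 - p1) / (1 - p0))                   # per-sample constant
    s = 0.0
    for t, x in enumerate(counts, start=1):
        s = max(0.0, s + x * lr_x + lr_c)   # CUSUM recursion, floored at 0
        if s > h:
            return t
    return None
```

  The control limit h would be chosen by simulation to hit a target in-control ARL, as in the paper's comparisons.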

A Comparison of Confidence Intervals for the Survival Function with Censored Data (중도절단된 생존함수의 신뢰구간 비교연구)

  • Lee, Gyeong-Hwa;Lee, Jae-Won
    • Proceedings of the Korean Statistical Society Conference / 2005.05a / pp.251-255 / 2005
  • In survival analysis with censored data and small sample sizes, confidence intervals based on a normal approximation are often unreliable for estimating a survival rate or for comparing the survival rates of two groups. Through simulations of various censoring scenarios and sample sizes, we compare the actual coverage probabilities of bootstrap confidence intervals based on the Kaplan-Meier, Nelson, and moment estimators and on the Cox model's ${\beta}$, along with nonparametric and likelihood-ratio confidence intervals for the survival function.


Goodness of Fit Tests for the Exponential Distribution based on Multiply Progressive Censored Data (다중 점진적 중도절단에서 지수분포의 적합도 검정)

  • Yun, Hyejeong;Lee, Kyeongjun
    • Journal of the Korean Data Analysis Society / v.20 no.6 / pp.2813-2827 / 2018
  • Progressive censoring schemes have become quite popular in reliability studies. Under progressive censoring, however, some units can fail between two observation points with their exact failure times unobserved; such losses arise in life-testing experiments, for example, when failure times cannot be recorded due to mechanical or experimental difficulties. The multiply progressive censoring scheme was therefore introduced. We derive the maximum likelihood estimator of the parameter of the exponential distribution and introduce goodness-of-fit test statistics based on order statistics and the Lorenz curve. A Monte Carlo simulation was carried out to compare the proposed test statistics, and a real data set is analyzed. For Weibull and chi-squared alternatives, the test statistics based on the Lorenz curve are more powerful than those based on order statistics.
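  For the exponential distribution, censored-data MLEs of the mean typically take a total-time-on-test form. A sketch for the simple censored case (the multiply progressive scheme in the paper leads to a similar ratio with scheme-specific risk weights; the function name and data layout are my own):

```python
def exp_mle_mean(failure_times, censoring_times):
    """MLE of the exponential mean under censoring:
    total time on test (observed failures plus censored exposures)
    divided by the number of observed failures."""
    ttt = sum(failure_times) + sum(censoring_times)
    return ttt / len(failure_times)
```

  Intuitively, censored units contribute exposure time to the numerator but no event to the denominator, which is what distinguishes this from the uncensored sample mean.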

An approximate fitting for mixture of multivariate skew normal distribution via EM algorithm (EM 알고리즘에 의한 다변량 치우친 정규분포 혼합모형의 근사적 적합)

  • Kim, Seung-Gu
    • The Korean Journal of Applied Statistics / v.29 no.3 / pp.513-523 / 2016
  • Fitting a mixture of multivariate skew normal distributions (MSNMix) with multiple skewness parameter vectors via the EM algorithm often incurs a high computational cost, because the moments and probabilities of the multivariate truncated normal distribution must be calculated in the E-step. Consequently, it is common to fit an asymmetric data set with an MSNMix with a simple skewness parameter vector, since this allows the E-step quantities to be computed in a univariate manner at low computational cost. However, adopting a simple skewness parameter is unrealistic in many situations. This paper proposes an approximate estimation method for the MSNMix with multiple skewness parameter vectors that also allows them to be treated in a univariate manner, and provides experiments demonstrating its effectiveness.

ROC Curve Fitting with Normal Mixtures (정규혼합분포를 이용한 ROC 분석)

  • Hong, Chong-Sun;Lee, Won-Yong
    • The Korean Journal of Applied Statistics / v.24 no.2 / pp.269-278 / 2011
  • Many studies have considered distribution functions and appropriate covariates corresponding to test scores in order to improve the accuracy of a diagnostic test, including the ROC curve, which represents the relationship between sensitivity and specificity. ROC analysis has been carried out with regression models including covariates under the assumption that the distribution function is known or estimable. In this work, we consider the general situation in which both the distribution function and the effects of covariates are unknown. For the ROC analysis, mixtures of normal distributions are used to estimate the distribution function fitted to credit evaluation data consisting of a score random variable and two sub-populations of parameters. The AUC measure is examined for comparison with the nonparametric and empirical ROC curves. We conclude that the method using normal mixtures fits the classical approach better than the other methods.
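  Under normal score models the AUC has a convenient closed form. Since AUC = P(S1 > S0) for independent scores from the two classes, the AUC of a normal-mixture fit is a weight-weighted sum of pairwise binormal terms; this sketch (function names and data layout are my own, not the paper's notation) shows the computation:

```python
import math

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mixture_auc(comps0, comps1):
    """AUC = P(S1 > S0) when the scores of the two classes are independent
    normal mixtures; each comps list holds (weight, mean, sd) triples.
    For one component pair, P(N1 > N0) = Phi((mu1 - mu0) / sqrt(sd0^2 + sd1^2))."""
    return sum(
        w0 * w1 * norm_cdf((m1 - m0) / math.hypot(s0, s1))
        for w0, m0, s0 in comps0
        for w1, m1, s1 in comps1
    )
```

  With one component per class this reduces to the classical binormal AUC, which gives a quick check against the empirical ROC curve.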