• Title/Summary/Keyword: Exponential estimator

Search results: 146

Some efficient ratio-type exponential estimators using the Robust regression's Huber M-estimation function

  • Vinay Kumar Yadav;Shakti Prasad
    • Communications for Statistical Applications and Methods
    • /
    • v.31 no.3
    • /
    • pp.291-308
    • /
    • 2024
  • The current article discusses ratio-type exponential estimators for estimating the mean of a finite population in sample surveys. The estimators use the Huber M-estimation function from robust regression, and their bias and mean squared error expressions are derived. They are compared with the estimators of Kadilar, Candan, and Cingi (Hacet J Math Stat, 36, 181-188, 2007). The circumstances under which the suggested estimators perform better than competing estimators are discussed. Five different population datasets with a well-recognized outlier have been used in numerical and simulation-based studies. These thorough studies seek to provide strong evidence for our claims by carefully assessing and validating the theoretical results reported in the study. The proposed estimators are intended to significantly improve both the efficiency and accuracy of estimating the mean of a finite population, so that the results obtained from statistical analyses are more reliable and precise.
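
The paper derives its own estimator forms; purely as a hedged illustration of the underlying idea, the sketch below combines a regression-adjusted sample mean, whose slope comes from Huber M-estimation fitted by iteratively reweighted least squares, with a Bahl-Tuteja-style exponential ratio factor. The data, the tuning constant $k = 1.345$, and the combined estimator form itself are assumptions for illustration, not the authors' estimators.

```python
import numpy as np

def huber_slope(x, y, k=1.345, n_iter=50):
    """Slope of y on x from Huber M-estimation, computed by IRLS."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # ordinary least squares start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale via MAD
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-12))  # Huber weights
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[1]

def exp_ratio_estimator(y_s, x_s, X_mean):
    """Illustrative robust ratio-type exponential estimator of the population mean of y:
    a regression adjustment with a Huber slope times an exponential ratio factor."""
    ybar, xbar = y_s.mean(), x_s.mean()
    b = huber_slope(x_s, y_s)
    return (ybar + b * (X_mean - xbar)) * np.exp((X_mean - xbar) / (X_mean + xbar))

rng = np.random.default_rng(1)
x = rng.gamma(4.0, 2.0, size=2000)                   # auxiliary variable (population)
y = 3.0 + 1.5 * x + rng.normal(0.0, 2.0, size=2000)  # study variable
y[:5] += 60.0                                        # a few gross outliers
idx = rng.choice(2000, size=80, replace=False)       # simple random sample
print("estimate:", exp_ratio_estimator(y[idx], x[idx], x.mean()), "  true mean:", y.mean())
```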

Penalizing the Negative Exponential Disparity in Discrete Models

  • Sahadeb Sarkar;Song, Kijoung-Song;Jeong, Dong-Bin
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.2
    • /
    • pp.517-529
    • /
    • 1998
  • When the sample size is small, the robust minimum Hellinger distance (HD) estimator can have substantially poor relative efficiency at the true model. Similarly, approximating the exact null distributions of the ordinary Hellinger distance tests with the limiting chi-square distributions can be quite inappropriate in small samples. To overcome these problems, Harris and Basu (1994) and Basu et al. (1996) recommended using a modified HD called the penalized Hellinger distance (PHD). Lindsay (1994) and Basu et al. (1997) showed that another density-based distance, namely the negative exponential disparity (NED), is a major competitor to the Hellinger distance in producing an asymptotically fully efficient and robust estimator. In this paper we investigate the small-sample performance of the estimates and tests based on the NED and the penalized NED (PNED). Our results indicate that, in the settings considered here, the NED, unlike the HD, produces estimators that perform very well in small samples, and penalizing the NED does not help. However, in testing of hypotheses, the deviance test based on the PNED appears to achieve the best small-sample level compared to tests based on the NED, HD and PHD.
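
For discrete models the disparities involved have simple closed forms. As a hedged sketch of minimum-disparity estimation with the NED (not the penalized version studied in the paper), the code below fits a Poisson mean by minimizing $\mathrm{NED}(d, m_\theta) = \sum_x (e^{-\delta(x)} - 1)\, m_\theta(x)$, where $\delta(x) = d(x)/m_\theta(x) - 1$ is the Pearson residual of the empirical pmf $d$ against the model pmf $m_\theta$; the Poisson model, the sample, and the support truncation are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def ned(sample, theta):
    """Negative exponential disparity between the empirical pmf of `sample` and a
    Poisson(theta) model: NED = sum_x (exp(-delta(x)) - 1) * m_theta(x), where
    delta(x) = d(x)/m_theta(x) - 1 is the Pearson residual."""
    xs = np.arange(sample.max() + 30)                  # support large enough for the model tail
    m = np.clip(poisson.pmf(xs, theta), 1e-300, None)  # avoid dividing by an underflowed pmf
    d = np.bincount(sample, minlength=len(xs))[:len(xs)] / len(sample)
    delta = d / m - 1.0
    return np.sum((np.exp(-delta) - 1.0) * m)

rng = np.random.default_rng(0)
data = rng.poisson(3.0, size=40)     # small sample from Poisson(3)
data[:2] = 25                        # two gross outliers

mle = data.mean()                    # non-robust maximum likelihood estimate
fit = minimize_scalar(lambda th: ned(data, th), bounds=(0.1, 20.0), method="bounded")
print(f"MLE: {mle:.3f}   minimum-NED estimate: {fit.x:.3f}")
```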


Comparison of parametric and nonparametric hazard change-point estimators (모수적과 비모수적 위험률 변화점 통계량 비교)

  • Kim, Jaehee;Lee, Sieun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.27 no.5
    • /
    • pp.1253-1262
    • /
    • 2016
  • When a change-point exists in a hazard function, it should be estimated for exact parameter or hazard estimation. In this research, we compare hazard change-point estimators. The Matthews and Farewell (1982) parametric change-point estimator is based on the likelihood, and the Zhang et al. (2014) nonparametric estimator is based on the Nelson-Aalen cumulative hazard estimator. A simulation study is carried out for data from an exponential distribution with one hazard change-point. Both data generated without censoring and data with right censoring are considered. As real data applications, the change-point estimates are computed for leukemia data and primary biliary cirrhosis data.
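
As a hedged sketch of the parametric side of such a comparison (the Matthews and Farewell (1982) formulation may differ in detail), the code below simulates exponential lifetimes whose hazard jumps from $\lambda_1$ to $\lambda_2$ at time $\tau$ and estimates $\tau$ by maximizing the profile likelihood of a piecewise-exponential model over a grid of interior candidate points; all numerical settings are assumed.

```python
import numpy as np

def profile_loglik(t, tau):
    """Profile log-likelihood of a piecewise-exponential model with hazard
    lam1 on [0, tau) and lam2 on [tau, inf), with lam1 and lam2 profiled out."""
    d1 = np.sum(t < tau)
    d2 = len(t) - d1
    T1 = np.sum(np.minimum(t, tau))           # exposure before tau
    T2 = np.sum(np.maximum(t - tau, 0.0))     # exposure after tau
    if d1 == 0 or d2 == 0:
        return -np.inf
    return d1 * np.log(d1 / T1) - d1 + d2 * np.log(d2 / T2) - d2

def mle_changepoint(t, lo_q=0.1, hi_q=0.9):
    """Grid search for the hazard change-point over interior sample quantiles."""
    grid = np.quantile(t, np.linspace(lo_q, hi_q, 200))
    ll = np.array([profile_loglik(t, tau) for tau in grid])
    return grid[np.argmax(ll)]

rng = np.random.default_rng(7)
tau_true, lam1, lam2, n = 1.0, 1.0, 3.0, 500
u = rng.exponential(1.0, n)                   # unit-rate exponentials
# invert the piecewise-constant cumulative hazard to get lifetimes with a change at tau
t = np.where(u < lam1 * tau_true, u / lam1, tau_true + (u - lam1 * tau_true) / lam2)
print("estimated change-point:", mle_changepoint(t), "  true:", tau_true)
```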

Minimum Density Power Divergence Estimation for Normal-Exponential Distribution (정규-지수분포에 대한 최소밀도함수승간격 추정법)

  • Pak, Ro Jin
    • The Korean Journal of Applied Statistics
    • /
    • v.27 no.3
    • /
    • pp.397-406
    • /
    • 2014
  • The minimum density power divergence estimation has been a popular topic in the field of robust estimation since Basu et al. (1998). The minimum density power divergence estimator has strong robustness properties with little loss in asymptotic efficiency relative to the maximum likelihood estimator under model conditions. However, a limitation in applying this estimation method is the algebraic difficulty of an integral involved in the estimating function. This paper considers a minimum density power divergence estimation method with an approximated divergence that avoids such difficulty. As an example, we consider the normal-exponential convolution model introduced by Bolstad (2004). The estimated divergence in this case is too complicated; consequently, a Laplace approximation is employed to obtain a manageable form. Simulations and an empirical study show that the minimum density power divergence estimators based on an approximated estimated divergence for the normal-exponential model perform adequately in terms of bias and efficiency.
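
The normal-exponential convolution case treated in the paper needs the Laplace approximation; as a minimal sketch of the minimum density power divergence idea itself, the code below fits a normal mean and standard deviation by minimizing the empirical objective of Basu et al., $H_n(\theta) = \int f_\theta^{1+\alpha}\,dx - (1 + 1/\alpha)\, n^{-1}\sum_i f_\theta(X_i)^\alpha$, using the closed-form normal integral $\int f_\theta^{1+\alpha}\,dx = (1+\alpha)^{-1/2}(2\pi\sigma^2)^{-\alpha/2}$. The normal model, the contaminated data, and the tuning constant $\alpha = 0.5$ are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

ALPHA = 0.5  # tuning constant; alpha -> 0 recovers maximum likelihood

def dpd_objective(params, x, alpha=ALPHA):
    """Empirical density power divergence objective for N(mu, sigma^2):
    integral(f^(1+alpha)) - (1 + 1/alpha) * mean(f(x)^alpha)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                     # keep sigma positive
    integral = (1 + alpha) ** -0.5 * (2 * np.pi * sigma**2) ** (-alpha / 2)
    return integral - (1 + 1 / alpha) * np.mean(norm.pdf(x, mu, sigma) ** alpha)

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])  # 5% contamination

fit = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x,), method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"MDPDE: mu={mu_hat:.3f}, sigma={sigma_hat:.3f};  "
      f"MLE: mu={x.mean():.3f}, sigma={x.std():.3f}")
```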

SEQUENTIAL INTERVAL ESTIMATION FOR THE EXPONENTIAL HAZARD RATE WHEN THE LOSS FUNCTION IS STRICTLY CONVEX

  • Jang, Yu Seon
    • Korean Journal of Mathematics
    • /
    • v.21 no.4
    • /
    • pp.429-437
    • /
    • 2013
  • Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed random variables having a common exponential density with unknown mean $\mu$. For the sequential confidence interval estimation of the exponential hazard rate $\theta = 1/\mu$ when the loss function is strictly convex, the following stopping rule is proposed, with $d$ the half length of the prescribed confidence interval $I_n$ for the parameter $\theta$: $\tau$ = smallest integer $n$ such that $n \geq z_{\alpha/2}^{2}\hat{\theta}^{2}/d^{2} + 2$, where $\hat{\theta} = (n-1)\bar{X}_n^{-1}/n$ is the minimum risk estimator for $\theta$ and $z_{\alpha/2}$ is defined by $P(|Z| \leq z_{\alpha/2}) = 1 - \alpha$ for $\alpha \in (0,1)$ and $Z \sim N(0,1)$. The confidence intervals $I_n$ are required to satisfy $P(\theta \in I_n) \geq 1 - \alpha$. The resulting intervals $I_\tau$ have the asymptotic consistency of the sequential procedure: $\lim_{d \rightarrow 0} P(\theta \in I_\tau) = 1 - \alpha$, where $\alpha \in (0,1)$ is given.
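
A small Monte Carlo sketch of this stopping rule (illustrative only): observations are drawn one at a time from the exponential distribution, sampling stops at the first $n$ satisfying $n \geq z_{\alpha/2}^{2}\hat{\theta}^{2}/d^{2} + 2$, and the empirical coverage of the interval is checked. Taking the interval to be $\hat{\theta}_\tau \pm d$, as well as all numerical settings, are assumptions.

```python
import numpy as np
from scipy.stats import norm

def run_once(theta, d, alpha, rng, n0=2):
    """Sample exponential observations sequentially until the stopping rule fires."""
    z = norm.ppf(1 - alpha / 2)
    x = list(rng.exponential(1.0 / theta, size=n0))     # exponential with mean mu = 1/theta
    while True:
        n = len(x)
        theta_hat = (n - 1) / (n * np.mean(x))          # minimum risk estimator of theta
        if n >= z**2 * theta_hat**2 / d**2 + 2:         # stopping rule from the abstract
            return theta_hat, n
        x.append(rng.exponential(1.0 / theta))

rng = np.random.default_rng(11)
theta, d, alpha = 2.0, 0.25, 0.05
results = [run_once(theta, d, alpha, rng) for _ in range(2000)]
cover = np.mean([abs(th - theta) <= d for th, _ in results])
print(f"empirical coverage: {cover:.3f} (target {1 - alpha}),",
      f"average stopping time: {np.mean([n for _, n in results]):.1f}")
```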

A Study on the Parameter Estimation for Testing Effort Function of Software (소프트웨어 테스트 노력 함수의 파라미터 산출에 관한 연구)

  • 최규식;김필중
    • Journal of Information Technology Applications and Management
    • /
    • v.11 no.2
    • /
    • pp.191-204
    • /
    • 2004
  • Many software reliability growth models (SRGM) have been proposed over the past several decades. Most of these models assumed the software debugging/testing effort to be constant, or did not consider it at all. Later, a few papers pointed out that evaluating software reliability with the testing effort taken into account is important. The testing-effort forms presented in these papers were exponential, Rayleigh, Weibull, or logistic functions, and one of these four types is used as a testing-effort function depending on the software development circumstances. We consider the methodology for evaluating the SRGM using the least squares estimator (LSE) and the maximum likelihood estimator (MLE) for those four functions, and then estimate the parameters by applying actual data adopted from a real field test of software under development.
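
All four candidate testing-effort forms can be fitted to cumulative-effort data by nonlinear least squares; the sketch below does exactly that on synthetic weekly effort data (not the paper's field data), using commonly cited parameterizations of the exponential, Rayleigh, Weibull, and logistic effort curves, which are assumptions here.

```python
import numpy as np
from scipy.optimize import curve_fit

# candidate cumulative testing-effort functions (a = total effort eventually consumed)
def w_exponential(t, a, b):  return a * (1 - np.exp(-b * t))
def w_rayleigh(t, a, b):     return a * (1 - np.exp(-b * t**2))
def w_weibull(t, a, b, c):   return a * (1 - np.exp(-b * t**c))
def w_logistic(t, a, A, b):  return a / (1 + A * np.exp(-b * t))

# synthetic weekly cumulative effort data (person-hours), Weibull-shaped plus noise
rng = np.random.default_rng(5)
t = np.arange(1, 21, dtype=float)
w_obs = w_weibull(t, 1000.0, 0.02, 2.0) + rng.normal(0, 15.0, t.size)

models = {
    "exponential": (w_exponential, [1000, 0.1]),
    "rayleigh":    (w_rayleigh,    [1000, 0.01]),
    "weibull":     (w_weibull,     [1000, 0.01, 1.5]),
    "logistic":    (w_logistic,    [1000, 50, 0.5]),
}
for name, (f, p0) in models.items():
    p, _ = curve_fit(f, t, w_obs, p0=p0, maxfev=20000)
    sse = np.sum((w_obs - f(t, *p)) ** 2)        # least-squares criterion
    print(f"{name:12s} SSE = {sse:10.1f}  params = {np.round(p, 4)}")
```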


Optimal Design of Accelerated Life Tests with Different Censoring Times

  • Seo, Sun-Keun;Kim, Kab-Seok
    • Journal of Korean Society for Quality Management
    • /
    • v.24 no.4
    • /
    • pp.44-58
    • /
    • 1996
  • This paper presents optimal accelerated life test plans with different censoring times for exponential, Weibull, and lognormal lifetime distributions, respectively. For an optimal plan, the low stress level, the proportion of test units allocated, and the censoring time at each stress are determined such that the asymptotic variance of the maximum likelihood estimator of a certain quantile at the use condition is minimized. The proposed plans are compared with the corresponding optimal plans with a common censoring time over a range of parameter values. Computational results indicate that the proposed plans are statistically optimal in terms of estimator accuracy when the total censoring times of the two plans are equal.
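
For the exponential distribution this design criterion has a convenient closed form. As a hedged and simplified numerical sketch (not the paper's exact formulation), assume a log-linear life-stress relation $\log \mu(\xi) = \beta_0 + \beta_1\xi$ on standardized stress $\xi \in [0, 1]$ (use condition $\xi = 0$), Type-I censoring at time $c_i$ at each stress, and the fact that the per-unit Fisher information about $\log \mu$ then equals the failure probability $1 - e^{-c/\mu}$. Since $\log t_q = \beta_0 + \log(-\log(1-q))$ at the use condition, minimizing the asymptotic variance of the estimated log quantile amounts to minimizing the $(\beta_0, \beta_0)$ element of the inverse information matrix; the grid search below does this over the low stress level and the allocation proportion. The planning values and censoring times are assumptions.

```python
import numpy as np

def avar_use_quantile(xi_low, pi_low, beta0, beta1, c_low, c_high, n=100):
    """Asymptotic variance of the MLE of log t_q at use stress (xi = 0) for an
    exponential-lifetime ALT with log-linear mean log mu = beta0 + beta1 * xi,
    Type-I censoring at c_low / c_high, n*pi_low units at xi_low, the rest at xi = 1."""
    info = np.zeros((2, 2))
    for xi, pi, c in [(xi_low, pi_low, c_low), (1.0, 1.0 - pi_low, c_high)]:
        mu = np.exp(beta0 + beta1 * xi)
        p_fail = 1.0 - np.exp(-c / mu)            # per-unit information about log(mu)
        info += n * pi * p_fail * np.array([[1.0, xi], [xi, xi**2]])
    return np.linalg.inv(info)[0, 0]              # Var of beta0_hat = Var of log-quantile

# planning values (assumed): mean life 5000 h at use stress, 50 h at the highest stress
beta0 = np.log(5000.0)
beta1 = np.log(50.0) - np.log(5000.0)
c_low, c_high = 1000.0, 300.0                     # different censoring times at the two stresses

best = min(
    ((xi, pi, avar_use_quantile(xi, pi, beta0, beta1, c_low, c_high))
     for xi in np.linspace(0.2, 0.9, 71)
     for pi in np.linspace(0.1, 0.9, 81)),
    key=lambda rec: rec[2],
)
print(f"optimal low stress = {best[0]:.3f}, allocation to low stress = {best[1]:.3f}, "
      f"asymptotic variance = {best[2]:.4f}")
```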


Mixed Replacement Designs for Life Testing with Interval Censoring

  • Tai Sup;Kesar Singh
    • Communications for Statistical Applications and Methods
    • /
    • v.6 no.2
    • /
    • pp.443-456
    • /
    • 1999
  • The estimation of mean lifetimes in the presence of interval censoring under a mixed replacement procedure is examined when the distribution of lifetimes is exponential. It is assumed that, due to physical restrictions and/or economic constraints, the number of failures is investigated only at several inspection times during the lifetime test; thus there is interval censoring. Comparisons of mixed replacement designs are made with designs with and without replacement. The maximum likelihood estimator is found in an implicit form. The Cramer-Rao lower bound, which is the asymptotic variance of the estimator, is derived. The test conditions for minimizing the Cramer-Rao lower bound and for minimizing the test costs within a desired width of the Cramer-Rao bound have been studied.
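
Leaving the replacement feature aside, a minimal version of the interval-censoring setup looks like this: n units go on test, failures are only counted at a few inspection times, and the exponential mean is obtained by maximizing the resulting grouped (multinomial) likelihood numerically, which matches the abstract's remark that the MLE is only implicit. The inspection times, sample size, and no-replacement simplification are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def interval_counts(lifetimes, inspections):
    """Failures observed in each inter-inspection interval, plus survivors at the end."""
    edges = np.concatenate([[0.0], inspections])
    counts = np.histogram(lifetimes, bins=edges)[0]
    survivors = np.sum(lifetimes > inspections[-1])
    return counts, survivors

def neg_loglik(mu, counts, survivors, inspections):
    """Grouped (multinomial) log-likelihood for exponential(mean mu) lifetimes observed
    only through interval counts; cell probabilities are F(t_j) - F(t_{j-1})."""
    edges = np.concatenate([[0.0], inspections])
    cdf = 1.0 - np.exp(-edges / mu)
    cell_p = np.diff(cdf)
    surv_p = np.exp(-inspections[-1] / mu)
    return -(np.sum(counts * np.log(cell_p)) + survivors * np.log(surv_p))

rng = np.random.default_rng(2)
mu_true, n = 120.0, 200
inspections = np.array([50.0, 100.0, 150.0, 200.0])      # inspection times (hours)
counts, survivors = interval_counts(rng.exponential(mu_true, n), inspections)

fit = minimize_scalar(neg_loglik, bounds=(1.0, 1000.0), method="bounded",
                      args=(counts, survivors, inspections))
print(f"interval-censored MLE of the mean: {fit.x:.1f} (true {mu_true})")
```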


A Study on the Optimum Parameter Estimation of Software Reliability (소프트웨어 신뢰도의 적정 파라미터 도출 기법에 관한 연구)

  • Che, Gyu-Shik;Moon, Myong-Ho
    • Journal of Information Technology Applications and Management
    • /
    • v.13 no.4
    • /
    • pp.1-12
    • /
    • 2006
  • Many software reliability growth models (SRGM) have been proposed since the software reliability issue was raised in 1972, and technologies based on them were developed to estimate and grow the reliability of software under development to a target value during the testing phase. Most of these models assumed the software debugging/testing effort to be constant, or did not consider it at all. Later, a few papers pointed out that evaluating software reliability with the testing effort taken into account is important. The testing-effort forms presented in these papers were exponential, Rayleigh, Weibull, or logistic functions, and one of these four types is used as a testing-effort function depending on the software development circumstances. I propose a methodology to evaluate the SRGM using the least squares estimator and the maximum likelihood estimator for those four functions, and then estimate the parameters by applying actual data adopted from real field tests of software under development.
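
One common way these ingredients fit together (a generic sketch, not necessarily the model used in the paper) is an NHPP software reliability growth model whose mean value function is driven by cumulative testing effort, $m(t) = a(1 - e^{-r W(t)})$ with a Rayleigh-type $W(t)$; given fault counts per testing week, $a$ and $r$ can be estimated by maximizing the grouped-data Poisson likelihood. The effort curve, its parameters, and the fault data below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def w_rayleigh(t, a_w=1000.0, b=0.01):
    """Assumed Rayleigh-type cumulative testing-effort curve, treated as known here."""
    return a_w * (1.0 - np.exp(-b * t**2))

def mean_value(t, a, r):
    """NHPP mean value function driven by testing effort: m(t) = a*(1 - exp(-r*W(t)))."""
    return a * (1.0 - np.exp(-r * w_rayleigh(t)))

def neg_loglik(params, t, faults):
    """Grouped-data Poisson negative log-likelihood for weekly fault counts."""
    a, r = np.exp(params)                         # log-parameterization keeps a, r > 0
    m = mean_value(np.concatenate([[0.0], t]), a, r)
    dm = np.maximum(np.diff(m), 1e-300)           # expected faults per week (underflow guard)
    return -np.sum(faults * np.log(dm) - dm)

rng = np.random.default_rng(9)
t = np.arange(1.0, 21.0)                          # end of each testing week
a_true, r_true = 120.0, 0.004
faults = rng.poisson(np.diff(mean_value(np.concatenate([[0.0], t]), a_true, r_true)))

fit = minimize(neg_loglik, x0=[np.log(100.0), np.log(0.01)], args=(t, faults),
               method="Nelder-Mead")
a_hat, r_hat = np.exp(fit.x)
print(f"estimated total faults a = {a_hat:.1f}, detection rate r = {r_hat:.5f}")
```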


Nonparametric estimation of hazard rates change-point (위험률의 변화점에 대한 비모수적 추정)

  • 정광모
    • The Korean Journal of Applied Statistics
    • /
    • v.11 no.1
    • /
    • pp.163-175
    • /
    • 1998
  • The change of hazard rates at an unknown time point has been of interest to many statisticians, but previous work was restricted to constant hazard rates corresponding to the exponential distribution. In this paper we generalize the change-point model so that no specific functional forms of the hazard rates are assumed. The assumed model includes various types of changes before and after the unknown time point. The Nelson estimator of the cumulative hazard function is introduced. We estimate the change-point by maximizing slope changes of the Nelson estimator. Consistency and the asymptotic distribution of the bootstrap estimator are obtained using martingale theory. Through a Monte Carlo study we check the performance of the proposed method. We also illustrate the proposed method using the Stanford Heart Transplant data set.
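
A minimal numerical sketch of this idea (no censoring, and simple two-sided least-squares slopes rather than the paper's exact statistic): compute the Nelson-Aalen cumulative hazard at the ordered failure times and take as change-point estimate the candidate time where the slopes fitted after and before it differ most. The data-generating settings and the trimming fraction are assumptions.

```python
import numpy as np

def nelson_aalen(times):
    """Nelson-Aalen cumulative hazard at the ordered failure times (no censoring)."""
    t = np.sort(times)
    n = len(t)
    increments = 1.0 / (n - np.arange(n))     # 1 / (number at risk) at each failure
    return t, np.cumsum(increments)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]             # least-squares slope

def changepoint_by_slope(times, trim=0.1):
    """Estimate the hazard change-point as the time maximizing the absolute difference
    between Nelson-Aalen slopes fitted after and before each candidate point."""
    t, H = nelson_aalen(times)
    n = len(t)
    lo, hi = int(trim * n), int((1 - trim) * n)
    diffs = [abs(slope(t[k:], H[k:]) - slope(t[:k], H[:k])) for k in range(lo, hi)]
    return t[lo + int(np.argmax(diffs))]

rng = np.random.default_rng(4)
tau, lam1, lam2, n = 1.0, 0.5, 2.5, 400
u = rng.exponential(1.0, n)
# invert the piecewise-constant cumulative hazard to get lifetimes with a change at tau
t = np.where(u < lam1 * tau, u / lam1, tau + (u - lam1 * tau) / lam2)
print("estimated change-point:", changepoint_by_slope(t), "  true:", tau)
```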
