• Title/Summary/Keyword: Estimation methods


Development of a Storage-Reliability Estimation Method Using Quantal Response Data for One-Shot Systems with Low Reliability-Decreasing Rates (미소한 신뢰도 감소율을 가지는 원샷 시스템의 가부반응 데이터를 이용한 저장 신뢰도 추정방법 개발)

  • Jang, Hyun-Jeung;Son, Young-Kap
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.35 no.10
    • /
    • pp.1291-1298
    • /
    • 2011
  • This paper proposes a new reliability estimation method for one-shot systems that uses quantal response data and is built on a parametric estimation framework. The proposed method accounts for the time-variant failure ratio in the quantal response data and thereby overcomes shortcomings of existing parametric estimation methods. To verify the proposed method, seven reliability estimation methods from the literature were compared with it in terms of the accuracy of reliability estimation. For this comparison, the sum of squared errors (SSE) of the reliability estimates was evaluated for each method over various numbers of tested samples. The accuracy comparison showed that the proposed method produced more accurate reliability estimates than any of the other methods.
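
A minimal sketch of the general idea of parametric reliability estimation from quantal (pass/fail) data, not the authors' specific method: an exponential storage-reliability curve is assumed and fitted by maximum likelihood to hypothetical binary test outcomes.

```python
# Minimal sketch, not the authors' method: fit an assumed exponential storage-
# reliability curve R(t) = exp(-lambda*t) to quantal (pass/fail) test data by
# maximizing the binomial log-likelihood. Times and counts are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

t = np.array([1.0, 2.0, 3.0, 5.0, 8.0])   # storage times (years)
n = np.array([30, 30, 30, 30, 30])        # units tested at each time
k = np.array([29, 28, 28, 26, 24])        # observed successes

def neg_log_lik(lam):
    R = np.clip(np.exp(-lam * t), 1e-12, 1 - 1e-12)   # reliability at each time
    return -np.sum(k * np.log(R) + (n - k) * np.log(1 - R))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1.0), method="bounded")
lam_hat = res.x
print(f"estimated decay rate: {lam_hat:.4f}, R(10 years): {np.exp(-10 * lam_hat):.3f}")
```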

Estimating Missing Points In Experiments (실험(實驗)에 있어서 결측점(缺測点) 추정(推定))

  • SIM, JUNG WOOK
    • Honam Mathematical Journal
    • /
    • v.4 no.1
    • /
    • pp.147-154
    • /
    • 1982
  • Methods for estimating missing points in an experimental design are described, and formulae for the estimates are given in matrix notation. The correct analysis of variance table is provided. Estimation of a single missing point and of two missing points in $2^{n}$ factorial designs is described.
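
The underlying least-squares idea can be sketched as follows (illustrative only; the paper's own formulae are not reproduced here): the missing response is replaced by the value that minimizes the residual sum of squares of the assumed model, which is quadratic in that value and so has a closed form in matrix notation. The design and data below are hypothetical.

```python
# Illustrative sketch of the least-squares idea behind missing-point estimation.
import numpy as np

# 2^3 factorial design, main-effects model (intercept, A, B, C)
X = np.array([[1, a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)],
             dtype=float)
y = np.array([20.0, 23.0, 25.0, 27.0, 22.0, 26.0, 28.0, np.nan])   # last run lost
m = 7                                                              # missing index

H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat (projection) matrix
M = np.eye(len(y)) - H                    # residual-forming matrix
y0 = np.nan_to_num(y)                     # missing entry replaced by zero
x_hat = -(M @ y0)[m] / M[m, m]            # minimizer of the residual SS
print(f"estimated missing observation: {x_hat:.2f}")
```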


Bayesian Estimation of Multinomial and Poisson Parameters Under Starshaped Restriction

  • Oh, Myong-Sik
    • Communications for Statistical Applications and Methods
    • /
    • v.4 no.1
    • /
    • pp.185-191
    • /
    • 1997
  • Bayesian estimation of multinomial and Poisson parameters under a starshaped restriction is considered. Most Bayesian estimation in order-restricted statistical inference requires high-dimensional integration, which is very difficult to evaluate; Monte Carlo integration and Gibbs sampling are among the alternative methods. The Bayesian estimation considered in this paper requires only the evaluation of incomplete beta functions, which are extensively tabulated.
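
As a small illustration of the building block the abstract refers to, and not of the paper's estimator itself, the regularized incomplete beta function that was once read from tables is now a single library call.

```python
# Building-block sketch only, not the paper's estimator: the regularized
# incomplete beta function I_x(a, b) equals the CDF of a Beta(a, b) distribution.
from scipy.special import betainc
from scipy.stats import beta

a, b, x = 3.0, 5.0, 0.4          # hypothetical parameters and evaluation point
print(betainc(a, b, x))          # I_x(a, b) via the special function
print(beta.cdf(x, a, b))         # identical value via the Beta distribution CDF
```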


A Review on Nonparametric Density Estimation Using Wavelet Methods

  • Sungho;Hwa Rak
    • Communications for Statistical Applications and Methods
    • /
    • v.7 no.1
    • /
    • pp.129-140
    • /
    • 2000
  • Wavelets constitute a new orthogonal system with direct application to density estimation. We give a brief introduction to wavelet density estimation and summarize some asymptotic results. An application to mixtures of normal distributions is implemented in S-Plus.
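
A minimal sketch of a linear wavelet density estimator with the Haar father wavelet, assuming data rescaled into [0, 1); it illustrates the general construction rather than the specific estimators reviewed in the paper.

```python
# Linear wavelet density estimator: f_hat(x) = sum_k alpha_{j,k} phi_{j,k}(x)
# with empirical scaling coefficients alpha_{j,k} = mean_i phi_{j,k}(X_i).
import numpy as np

rng = np.random.default_rng(0)
# mixture of two normals supported (essentially) inside [0, 1)
data = np.concatenate([rng.normal(0.3, 0.05, 500), rng.normal(0.7, 0.10, 500)])

def haar_phi(u, j, k):
    """Scaled and translated Haar father wavelet phi_{j,k}(u)."""
    v = 2.0**j * u - k
    return 2.0**(j / 2) * ((v >= 0) & (v < 1))

def wavelet_density(grid, data, j=4):
    est = np.zeros_like(grid)
    for k in range(2**j):                      # translations covering [0, 1)
        alpha = haar_phi(data, j, k).mean()    # empirical scaling coefficient
        est += alpha * haar_phi(grid, j, k)
    return est

grid = np.linspace(0.0, 1.0, 200)
f_hat = wavelet_density(grid, data, j=4)
print(f"integral of the estimate: {np.trapz(f_hat, grid):.3f}")   # close to 1
```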


Effects of Covariance Modeling on Estimation Accuracy in an IMU-based Attitude Estimation Kalman Filter (IMU 기반 자세 추정 칼만필터에서 공분산 모델링이 추정 정확도에 미치는 영향)

  • Choi, Ji Seok;Lee, Jung Keun
    • Journal of Sensor Science and Technology
    • /
    • v.29 no.6
    • /
    • pp.440-446
    • /
    • 2020
  • A well-known difficulty in attitude estimation based on inertial measurement unit (IMU) signals is the occurrence of external acceleration under dynamic motion conditions, as the acceleration significantly degrades the estimation accuracy. Lee et al. (2012) designed a Kalman filter (KF) that could effectively deal with the acceleration issue. Ahmed and Tahir (2017) modified this method by adjusting the acceleration-related covariance matrix, because they considered covariance modeling a pivotal factor in estimation accuracy. This study investigates the effects of covariance modeling on estimation accuracy in an IMU-based attitude estimation KF. The method proposed by Ahmed and Tahir can be divided into two variants: one uses a covariance with only diagonal components, and the other uses a covariance with both diagonal and off-diagonal components. This paper compares these three methods with respect to the motion condition and the window size required by the methods of Ahmed and Tahir. Experimental results showed that the method of Lee et al. performed best under relatively slow motion conditions, whereas the modified method using the diagonal covariance with a large window size performed best under relatively fast motion conditions.
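
The abstract does not give the covariance models themselves, but the general idea of acceleration-dependent covariance modeling can be sketched as follows; the deviation test, gain, and noise level are illustrative assumptions, not the formulations of Lee et al. or Ahmed and Tahir.

```python
# Illustrative sketch: inflate the accelerometer measurement covariance when the
# measured specific force departs from gravity, which signals external acceleration.
import numpy as np

G = 9.81            # gravity magnitude (m/s^2)
SIGMA_ACC = 0.05    # base accelerometer noise std (m/s^2), assumed

def accel_update_covariance(window, gain=10.0):
    """3x3 covariance for the accelerometer update.

    window: recent accelerometer samples, shape (N, 3). The average deviation
    of the specific-force norm from g serves as an external-acceleration proxy.
    """
    dev = np.abs(np.linalg.norm(window, axis=1) - G).mean()
    return (SIGMA_ACC**2 + gain * dev**2) * np.eye(3)   # diagonal modeling only

# usage: near-static data keeps the covariance close to the sensor-noise floor
rng = np.random.default_rng(1)
window = np.tile([0.0, 0.0, G], (10, 1)) + rng.normal(0.0, 0.05, size=(10, 3))
print(accel_update_covariance(window))
```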

A Comparative Study of Small Area Estimation Methods (소지역 추정법에 관한 비교연구)

  • Park, Jong-Tae;Lee, Sang-Eun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.12 no.2
    • /
    • pp.47-55
    • /
    • 2001
  • Means are usually the target of statistical inference, but depending on the purpose of a survey, totals can be more meaningful than means. In this study, we estimate the unemployment population of small areas using four small area estimation methods: direct, synthetic, composite, and Bayes estimation. For all the estimators considered, the average absolute bias and mean squared error were obtained in a Monte Carlo study based on data from the 1998 Economically Active Population Survey in Korea.
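
For reference, textbook forms of the direct, synthetic, and composite estimators (not necessarily the exact variants used in the paper) look like this; the data and the shrinkage constant are hypothetical.

```python
# Direct, synthetic, and composite small-area estimators in their simplest form.
import numpy as np

n_a = np.array([40, 15, 60])                  # sample sizes in the small areas
theta_direct = np.array([0.08, 0.12, 0.06])   # direct (area-specific) estimates
theta_national = 0.07                         # estimate from the large sample

synthetic = np.full_like(theta_direct, theta_national)  # stable but possibly biased
w = n_a / (n_a + 30.0)          # shrinkage weight; 30 is an assumed tuning constant
composite = w * theta_direct + (1 - w) * synthetic      # compromise of the two
print(composite)
```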


Terrain Slope Estimation Methods Using the Least Squares Approach for Terrain Referenced Navigation

  • Mok, Sung-Hoon;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences
    • /
    • v.14 no.1
    • /
    • pp.85-90
    • /
    • 2013
  • This paper presents a study on terrain referenced navigation (TRN). The extended Kalman filter (EKF) is adopted as the filter method. The Jacobian matrix of the measurement equations in the EKF consists of terrain slope terms, so accurate slope estimation is essential to keep the filter stable. Two slope estimation methods, both based on the least-squares approach, are proposed in this study. One is planar regression, which searches for the plane that best represents, in the least-squares sense, the terrain map over the region determined by the position error covariance. It is shown that this method can provide a more accurate solution than the previously developed linear regression approach, which uses lines rather than a plane in the least-squares measure. The other proposed method is weighted planar regression: additional weights formed from a Gaussian pdf are applied in the planar regression to reflect the actual pdf of the EKF position estimate. Monte Carlo simulations are conducted to compare the performance of the previous method and the two proposed methods by analyzing the filter properties of divergence probability and convergence speed. One of the slope estimation methods can then be selected after determining which of these filter properties is more significant for each mission.
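
A minimal sketch of planar regression and its Gaussian-weighted variant in the spirit described above; the grid spacing, covariance, and terrain values are hypothetical.

```python
# Fit z = a*x + b*y + c over map points around the position estimate;
# a and b are the slope terms that would enter the EKF measurement Jacobian.
import numpy as np

def plane_slopes(x, y, z, weights=None):
    """(Weighted) least-squares fit of z = a*x + b*y + c; returns (a, b, c)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    if weights is not None:
        sw = np.sqrt(weights)
        A, z = A * sw[:, None], z * sw        # standard weighted least squares
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

# terrain map samples around the current position estimate
xg, yg = np.meshgrid(np.linspace(-50, 50, 11), np.linspace(-50, 50, 11))
x, y = xg.ravel(), yg.ravel()
z = 0.02 * x - 0.01 * y + 3.0 + np.random.default_rng(2).normal(0.0, 0.2, x.size)

# weights from a Gaussian pdf centered at the position estimate
pts = np.column_stack([x, y])
P = np.diag([20.0**2, 20.0**2])               # position error covariance (assumed)
w = np.exp(-0.5 * np.sum((pts @ np.linalg.inv(P)) * pts, axis=1))

print("planar regression slopes:         ", plane_slopes(x, y, z)[:2])
print("weighted planar regression slopes:", plane_slopes(x, y, z, w)[:2])
```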

A data-adaptive maximum penalized likelihood estimation for the generalized extreme value distribution

  • Lee, Youngsaeng;Shin, Yonggwan;Park, Jeong-Soo
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.5
    • /
    • pp.493-505
    • /
    • 2017
  • Maximum likelihood estimation (MLE) of the generalized extreme value distribution (GEVD) is known to sometimes over-estimate a positive shape parameter for small sample sizes. Maximum penalized likelihood estimation (MPLE) with a Beta penalty function has been proposed to overcome this problem, but the determination of the hyperparameters (HP) of the Beta penalty function remains an issue. This paper presents data-adaptive methods for selecting the HP of the Beta penalty function in the MPLE framework; the idea is to let the data tell us which HP to use. For given data, the optimal HP is obtained from the minimum distance between the MLE and the MPLE. A bootstrap-based method is also proposed. These methods are compared with existing approaches. Performance evaluation experiments for the GEVD by Monte Carlo simulation show that the proposed methods work well in terms of bias and mean squared error. The methods are applied to Blackstone River data and Korean heavy rainfall data, where they show better performance than the MLE, the L-moments estimator, and existing MPLEs.
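
A sketch of MPLE for the GEV with a Beta penalty on the shape parameter, assumed here to lie in (-0.5, 0.5); the hyperparameter values and this mapping are illustrative assumptions, not the choices studied in the paper.

```python
# MPLE for the GEV: penalized log-likelihood l(theta) + log Beta-penalty(xi).
# Note that SciPy's genextreme uses c = -xi relative to the usual GEV convention.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import beta, genextreme

P_A, P_B = 6.0, 9.0                       # assumed Beta penalty hyperparameters

def neg_penalized_loglik(theta, data):
    mu, log_sigma, xi = theta
    sigma = np.exp(log_sigma)
    ll = genextreme.logpdf(data, c=-xi, loc=mu, scale=sigma).sum()
    if not np.isfinite(ll) or abs(xi) >= 0.5:
        return np.inf
    pen = beta.logpdf(xi + 0.5, P_A, P_B)  # Beta penalty on the shifted shape
    return -(ll + pen)

rng = np.random.default_rng(3)
data = genextreme.rvs(c=-0.1, loc=10.0, scale=2.0, size=30, random_state=rng)
fit = minimize(neg_penalized_loglik, x0=[data.mean(), np.log(data.std()), 0.0],
               args=(data,), method="Nelder-Mead")
mu_hat, sigma_hat, xi_hat = fit.x[0], np.exp(fit.x[1]), fit.x[2]
print(f"MPLE: mu={mu_hat:.2f}, sigma={sigma_hat:.2f}, xi={xi_hat:.3f}")
```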

Exploring modern machine learning methods to improve causal-effect estimation

  • Kim, Yeji;Choi, Taehwa;Choi, Sangbum
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.2
    • /
    • pp.177-191
    • /
    • 2022
  • This paper addresses the use of machine learning methods for the causal estimation of treatment effects from observational data. Even though randomized experimental trials are the gold standard for revealing potential causal relationships, observational studies are another rich source for investigating exposure effects, for example in research on the comparative effectiveness and safety of treatments, where the causal effect can be identified if the covariates contain all confounding variables. In this context, statistical regression models for the expected outcome and the probability of treatment are often imposed, and they can be combined in a clever way to yield more efficient and robust causal estimators. Recently, targeted maximum likelihood estimation and causal random forests have been proposed and extensively studied for the use of data-adaptive regression in the estimation of causal parameters. Machine learning methods are a natural choice in these settings to improve the quality of the final estimate of the treatment effect. We explore how the design and training of several machine learning algorithms can be adapted for causal inference and study their finite-sample performance through simulation experiments under various scenarios. An application to percutaneous coronary intervention (PCI) data shows that these adaptations can improve simple linear regression-based methods.
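
As one concrete example of combining outcome and treatment models, a doubly robust (AIPW) estimate of the average treatment effect can be sketched as follows; the linear and logistic nuisance models are placeholders that could be swapped for any machine learning method, and the data are simulated.

```python
# Doubly robust (AIPW) estimate of the average treatment effect (true ATE = 2).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=(n, 3))                     # confounders
p_true = 1.0 / (1.0 + np.exp(-x[:, 0]))         # true propensity score
a = rng.binomial(1, p_true)                     # treatment indicator
y = 2.0 * a + x[:, 0] + rng.normal(size=n)      # outcome

ps = LogisticRegression().fit(x, a).predict_proba(x)[:, 1]     # propensity model
mu1 = LinearRegression().fit(x[a == 1], y[a == 1]).predict(x)  # outcome model, treated
mu0 = LinearRegression().fit(x[a == 0], y[a == 0]).predict(x)  # outcome model, control

ate = np.mean(a * (y - mu1) / ps - (1 - a) * (y - mu0) / (1 - ps) + mu1 - mu0)
print(f"AIPW estimate of the average treatment effect: {ate:.2f}")
```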

A Study on Estimating Function Point Count of Domestic Software Development Projects (국내 소프트웨어 개발사업에 적합한 기능점수규모 예측방법에 관한 연구)

  • 박찬규;신수정;이현옥
    • Korean Management Science Review
    • /
    • v.20 no.2
    • /
    • pp.179-196
    • /
    • 2003
  • The function point model is the international standard method for measuring software size, which is one of the most important factors in determining software development cost. The function point model can be applied successfully only when a detailed specification of user requirements is available. In the domestic public sector, however, budgeting for software projects is carried out before software requirements are specified in detail. Therefore, an efficient function point estimation method is required to apply the function point model at the early stage of software development projects. The purpose of this paper is to compare various function point estimation methods and analyze their accuracy on domestic software projects. We consider four methods: the NESMA model, the ISBSG model, the simplified function point model, and the backfiring method. The methods are applied to about one hundred domestic projects, and their estimation errors are compared. The results can be used as a criterion for selecting an adequate model for estimating function point counts.
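
For illustration, two of the early-stage sizing ideas mentioned above can be sketched as follows; the NESMA indicative weights are the commonly cited values, and the backfiring gearing factor is a placeholder rather than an endorsed constant.

```python
# Two early-stage function point estimates in their simplest form.
# The gearing factor of 50 SLOC per FP below is an assumed placeholder.
def nesma_indicative_fp(num_ilf: int, num_eif: int) -> int:
    """Indicative count: only the internal/external logical files need to be known."""
    return 35 * num_ilf + 15 * num_eif

def backfired_fp(sloc: float, sloc_per_fp: float = 50.0) -> float:
    """Backfiring: convert source lines of code using a language gearing factor."""
    return sloc / sloc_per_fp

print(nesma_indicative_fp(num_ilf=12, num_eif=4))   # early function point estimate
print(backfired_fp(sloc=45000))                     # rough FP from code size
```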