• Title/Summary/Keyword: Interval Estimates


Bayesian Confidence Intervals in Penalized Likelihood Regression

  • Kim Young-Ju
    • Communications for Statistical Applications and Methods / v.13 no.1 / pp.141-150 / 2006
  • Penalized likelihood regression for exponential families has been considered by Kim (2005) through smoothing parameter selection and asymptotically efficient low-dimensional approximations. We derive approximate Bayesian confidence intervals based on the Bayes model associated with the lower-dimensional approximations to provide interval estimates in penalized likelihood regression, and conduct empirical studies to assess their properties.
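The Bayes-model connection behind such intervals can be sketched in a deliberately tiny setting: a one-coefficient ridge (penalized least squares) fit, where the penalized estimate equals a Gaussian posterior mean and the interval comes from the posterior variance. All numbers below (data, penalty, noise variance) are hypothetical, and this is not the paper's spline/exponential-family setting:

```python
import math

# hypothetical data, roughly y = 2x
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
lam = 1.0     # penalty / smoothing parameter (assumed already chosen)
sigma2 = 0.1  # assumed noise variance

sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))
beta = sxy / (sxx + lam)          # penalized estimate = posterior mean
post_var = sigma2 / (sxx + lam)   # posterior variance under the Gaussian prior
half = 1.96 * math.sqrt(post_var)
ci = (beta - half, beta + half)   # approximate 95% Bayesian confidence interval
```

The same posterior-variance idea, applied to low-dimensional spline approximations, is what the paper's intervals rest on.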

CONFIDENCE CURVES FOR A FUNCTION OF PARAMETERS IN NONLINEAR REGRESSION

  • Kahng, Myung-Wook
    • Journal of the Korean Statistical Society / v.32 no.1 / pp.1-10 / 2003
  • We consider obtaining graphical summaries of uncertainty in estimates of parameters in nonlinear models. A nonlinear constrained optimization algorithm is developed for likelihood-based confidence intervals for functions of the parameters in the model. The results are applied to the problem of finding significance levels in nonlinear models.
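The likelihood-based intervals behind such confidence curves can be illustrated in a much simpler case than nonlinear regression: cut the log-likelihood of a Poisson mean at the chi-square level and keep the parameter values above the cut. The data below are hypothetical:

```python
import math

def loglik(lam, data):
    # Poisson log-likelihood (constant factorial term dropped)
    return sum(-lam + k * math.log(lam) for k in data)

data = [3, 5, 4, 6, 2]
mle = sum(data) / len(data)            # maximum-likelihood estimate
cut = loglik(mle, data) - 1.92         # chi-square(1) 95% cutoff / 2
grid = [i * 0.01 for i in range(50, 1001)]
inside = [l for l in grid if loglik(l, data) >= cut]
ci = (min(inside), max(inside))        # likelihood-based 95% interval
```

Plotting the likelihood-ratio level against the parameter over this grid gives exactly the confidence-curve summary the paper constructs for functions of nonlinear-model parameters.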

Analyzing the Uncertainty of Traffic Link Flow, and Estimation of the Interval Link Flow using Korea Transport Data Base

  • Kim, Gang-Su;Kim, Jin-Seok;Jo, Hye-Jin
    • Journal of Korean Society of Transportation / v.27 no.1 / pp.117-127 / 2009
  • This study analyzed the uncertainty of forecasted link traffic flow and estimated interval link flows using the Korea Transport Data Base (KTDB), so that these risks can be reflected in feasibility studies. The uncertainty was analyzed according to the stochastic variation of the KTDB origin-destination traffic. When this variation was considered, the uncertainty of the network-wide traffic forecasts averaged 15.4%. The results also showed that the more congested a road was, the greater the uncertainty of its forecasted link flow. In particular, we estimated the variance of the forecasted traffic flow and suggested interval estimates of the forecast in place of the point estimates presented in common feasibility studies. These results are expected to contribute to the quantitative evaluation of uncertain road investment projects and to provide valuable information to decision makers for transport investment.
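As a hedged illustration of the suggested point-to-interval shift (the flow and variance below are invented, not figures from the study):

```python
import math

def interval_forecast(flow, var, z=1.96):
    """95% interval estimate around a point forecast, given its variance."""
    half = z * math.sqrt(var)
    return flow - half, flow + half

# hypothetical link flow of 32,000 veh/day with 15.4% relative uncertainty
lo, hi = interval_forecast(32000.0, (32000.0 * 0.154) ** 2)
```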

Probabilistic assessment on the basis of interval data

  • Thacker, Ben H.;Huyse, Luc J.
    • Structural Engineering and Mechanics / v.25 no.3 / pp.331-345 / 2007
  • Uncertainties enter a complex analysis from a variety of sources: variability, lack of data, human error, model simplification and lack of understanding of the underlying physics. However, for many important engineering applications, insufficient data are available to justify the choice of a particular probability density function (PDF). Sometimes the only data available are in the form of interval estimates which represent, often conflicting, expert opinion. In this paper we demonstrate that Bayesian estimation techniques can successfully be used in applications where only vague interval measurements are available. The proposed approach is intended to fit within a probabilistic framework, which is established and widely accepted. To circumvent the problem of selecting a specific PDF when only little or vague data are available, a hierarchical model of a continuous family of PDFs is used. The classical Bayesian estimation methods are expanded to make use of imprecise interval data. Each expert opinion (interval datum) is interpreted as a random interval sample of a parent PDF. Consequently, a partial conflict between experts is automatically accounted for through the likelihood function.
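A minimal sketch of the interval-data likelihood idea, assuming a normal parent PDF with known spread and a flat prior on its mean; the expert intervals below are hypothetical, and the paper's full hierarchical family of PDFs is not reproduced:

```python
import math

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# hypothetical, partially conflicting expert opinions (interval data)
intervals = [(4.0, 6.0), (4.5, 7.0), (5.5, 6.5)]
sigma = 1.0  # assumed known spread of the parent PDF
grid = [i * 0.05 for i in range(201)]  # candidate means in [0, 10]

post = []
for mu in grid:
    like = 1.0
    for a, b in intervals:
        # an interval contributes the probability mass the parent PDF puts on it
        like *= max(norm_cdf(b, mu, sigma) - norm_cdf(a, mu, sigma), 1e-300)
    post.append(like)

total = sum(post)
post = [p / total for p in post]      # posterior over the grid (flat prior)
mu_hat = grid[post.index(max(post))]  # posterior mode
```

Because each expert interval multiplies the likelihood, a partially conflicting interval down-weights but does not veto the others, which is the conflict-handling behavior the abstract describes.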

Balanced Accuracy and Confidence Probability of Interval Estimates

  • Liu, Yi-Hsin;Stan Lipovetsky;Betty L. Hickman
    • International Journal of Reliability and Applications / v.3 no.1 / pp.37-50 / 2002
  • Simultaneous estimation of the accuracy and the probability corresponding to a prediction interval is considered in this study. The traditional application of confidence-interval forecasting consists in evaluating interval limits for a given significance level: the wider the interval, the higher the probability and the lower the forecast precision. In this paper a measure of stochastic forecast accuracy is introduced, and a procedure for balanced estimation of both the prediction accuracy and the confidence probability is elaborated. A solution can be obtained via an optimization approach. The suggested method is applied to constructing confidence intervals for parameters estimated by the normal and t distributions.
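The tradeoff described above, that a wider interval carries higher probability but lower precision, can be made concrete for a normal-theory interval, whose width grows with the chosen confidence level:

```python
from statistics import NormalDist

nd = NormalDist()
sigma = 1.0  # assumed standard deviation of the forecast error
widths = {}
for conf in (0.80, 0.90, 0.95, 0.99):
    z = nd.inv_cdf(0.5 + conf / 2.0)  # two-sided critical value
    widths[conf] = 2.0 * z * sigma    # interval width at this confidence level
```

Balancing accuracy against probability amounts to choosing a point on this monotone width-versus-confidence curve, which the paper formulates as an optimization problem.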


Association measure of doubly interval censored data using a Kendall's 𝜏 estimator

  • Kang, Seo-Hyun;Kim, Yang-Jin
    • Communications for Statistical Applications and Methods / v.28 no.2 / pp.151-159 / 2021
  • In this article, our interest is in estimating the association between consecutive gap times that are subject to interval censoring. Such data are referred to as doubly interval censored data (Sun, 2006). In the context of serial events, induced dependent censoring frequently occurs, resulting in biased estimates. Our goal is to propose a Kendall's 𝜏 based association measure for doubly interval censored data. To adjust for the impact of induced dependent censoring, the inverse probability censoring weighting (IPCW) technique is implemented. Furthermore, a multiple imputation technique is applied to recover failure times that are unknown owing to interval censoring. Simulation studies demonstrate that the suggested association estimator performs well with moderate sample sizes. The proposed method is applied to a dataset of children's dental records.
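As a building block, plain Kendall's 𝜏 on fully observed pairs is computed from concordant and discordant pairs; the IPCW weighting and multiple imputation that handle the censoring are the paper's contribution and are not shown here:

```python
def kendall_tau(x, y):
    """Plain Kendall's tau for fully observed data (ties not handled)."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                conc += 1   # pair ordered the same way in x and y
            elif s < 0:
                disc += 1   # pair ordered oppositely
    return (conc - disc) / (n * (n - 1) / 2)

tau = kendall_tau([1, 2, 3, 4, 5], [1, 3, 2, 5, 4])
```

Under doubly interval censoring, the pair orderings above are not directly observable, which is why the paper weights pairs by inverse censoring probabilities and imputes the unknown failure times.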

UNCERTAINTY ANALYSIS OF DATA-BASED MODELS FOR ESTIMATING COLLAPSE MOMENTS OF WALL-THINNED PIPE BENDS AND ELBOWS

  • Kim, Dong-Su;Kim, Ju-Hyun;Na, Man-Gyun;Kim, Jin-Weon
    • Nuclear Engineering and Technology / v.44 no.3 / pp.323-330 / 2012
  • The development of data-based models requires uncertainty analysis to explain the accuracy of their predictions. In this paper, an uncertainty analysis of the support vector regression (SVR) model, a data-based model, was performed, because previous research showed that the SVR method accurately estimates the collapse moments of wall-thinned pipe bends and elbows. An analytic uncertainty analysis method was used, and estimates with a 95% confidence interval were obtained for 370 test data points. The resulting prediction interval (PI) was very narrow, which means that the predicted values are quite accurate. Therefore, the proposed SVR method can be used effectively to assess and validate the integrity of wall-thinned pipe bends and elbows.
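A generic residual-based prediction interval (not the paper's SVR-specific analytic method) conveys the basic idea of attaching a 95% band to point predictions; all numbers below are hypothetical:

```python
import statistics

# hypothetical held-out targets and model predictions
y_true = [10.2, 11.0, 9.8, 10.5, 10.9, 10.1]
y_pred = [10.0, 11.2, 9.9, 10.4, 10.7, 10.3]

resid = [a - b for a, b in zip(y_true, y_pred)]
s = statistics.stdev(resid)          # residual spread on validation data
new_pred = 10.6                      # point prediction for a new input
pi = (new_pred - 1.96 * s, new_pred + 1.96 * s)  # approximate 95% PI
```

A narrow `pi`, as reported for the SVR model's 370 test points, indicates small residual spread and hence accurate predictions.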

Interval Estimation of the Difference of two Population Proportions using Pooled Estimator

  • Hong, Chong-Sun
    • Communications for Statistical Applications and Methods / v.9 no.2 / pp.389-399 / 2002
  • To examine whether the difference between two point estimates of population proportions is statistically significant, data analysts use two techniques. The first is to check the overlap between the two associated confidence intervals. The second is the significance test introduced in most statistical textbooks under the common assumptions of consistency, asymptotic normality, and asymptotic independence of the estimates. Under the null hypothesis that the two population proportions are equal, the pooled estimator of the population proportion is preferred as a point estimator, since the two independent random samples can be regarded as collected from one population. Hence, as an alternative method, we can obtain another confidence interval for the difference of the population proportions using the pooled estimate. We conclude that, among the three methods, the overlap method under-estimates and the difference-of-proportions method over-estimates relative to the proposed method.
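A sketch of the alternative interval described above, with the standard error computed from the pooled proportion (the counts are hypothetical):

```python
import math

def pooled_diff_ci(x1, n1, x2, n2, z=1.96):
    """CI for p1 - p2 whose standard error uses the pooled proportion,
    matching the null hypothesis p1 = p2."""
    p1, p2 = x1 / n1, x2 / n2
    pool = (x1 + x2) / (n1 + n2)          # pooled estimate of the common proportion
    se = math.sqrt(pool * (1 - pool) * (1 / n1 + 1 / n2))
    d = p1 - p2
    return d - z * se, d + z * se

# e.g. 45 successes out of 100 vs 30 out of 100
lo, hi = pooled_diff_ci(45, 100, 30, 100)
```

An interval that excludes zero, as here, indicates a significant difference between the two proportions under the pooled-variance test.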

On Employing Nonparametric Bootstrap Technique in Oscillometric Blood Pressure Measurement for Confidence Interval Estimation

  • Lee, Yong-Kook;Lee, Im-Bong;Chang, Joon-Hyuk;Lee, Soo-Jeong
    • Journal of Korea Multimedia Society / v.17 no.2 / pp.200-207 / 2014
  • Blood pressure (BP) is an important vital sign for determining the health of an individual subject. Although estimation of mean arterial blood pressure is possible using oscillometric blood pressure techniques, there are no established techniques in the literature for obtaining confidence intervals (CIs) for the systolic blood pressure (SBP) and diastolic blood pressure (DBP) estimates obtained from such measurements. This paper proposes a nonparametric bootstrap technique to obtain a CI from a small number of BP measurements. The proposed algorithm uses pseudo measurements, generated by the nonparametric bootstrap, to derive pseudo maximum amplitudes (PMA) and pseudo envelopes (PE). The SBP and DBP are then derived using new relationships between the PMA and PE, together with CIs for these estimates. Application of the proposed method to an experimental dataset of 85 patients, with five sets of measurements per patient, yielded a smaller CI than the conventional Student's t-method.
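The percentile-bootstrap idea can be sketched generically; the readings below are invented, and the paper's PMA/PE construction is not reproduced:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric percentile-bootstrap CI for a statistic of the data."""
    rng = random.Random(seed)
    # resample with replacement, recompute the statistic, and sort the replicates
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# hypothetical SBP readings (mmHg) for one subject
sbp = [118, 122, 121, 125, 119, 123, 120, 124]
lo, hi = bootstrap_ci(sbp)
```

Because the bootstrap makes no distributional assumption, it can give tighter intervals than the t-method when only a handful of measurements per patient are available, which is the comparison the paper reports.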

An Improvement on Estimation for Causal Models of Categorical Variables of Abilities and Task Performance

  • Kim, Sung-Ho
    • Communications for Statistical Applications and Methods / v.7 no.1 / pp.65-86 / 2000
  • The estimates from an EM algorithm applied to a large causal model of 10 or more categorical variables are often sensitive to the initial values used. This phenomenon becomes more serious as the model structure becomes more complicated, involving more variables. In this regard, Wu (1983) recommends, among other things, running the EM several times with different sets of initial values to obtain more appropriate estimates. In this paper a new approach to initial values is proposed. The main idea is to use initial values that are calibrated to the data. A simulation result strongly indicates that the calibrated initial values give rise to estimates that are far closer to the true values than uncalibrated ones.
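A toy sketch of the initial-value idea, assuming a two-mean Gaussian mixture (unit variances, equal weights) instead of the paper's categorical causal models; here the "calibrated" initial values are simply read off the data's quartiles:

```python
import math
import random

def em_gauss_mix(data, mu1, mu2, iters=50):
    # EM for a two-component Gaussian mixture (unit variances, equal weights);
    # only the means are updated, starting from the supplied initial values
    for _ in range(iters):
        resp = []  # responsibility of component 1 for each point
        for x in data:
            a = math.exp(-0.5 * (x - mu1) ** 2)
            b = math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(a / (a + b))
        w = sum(resp)
        mu1 = sum(r * x for r, x in zip(resp, data)) / w
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - w)
    return mu1, mu2

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(100)] + \
       [random.gauss(5.0, 1.0) for _ in range(100)]
srt = sorted(data)
# initial values calibrated to the data: lower and upper quartiles
m1, m2 = sorted(em_gauss_mix(data, srt[len(data) // 4], srt[3 * len(data) // 4]))
```

Starting from data-calibrated values places each mean inside its own cluster from the first iteration, which is the intuition behind preferring calibrated over arbitrary initial values.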
