• Title/Abstract/Keyword: Likelihood Analysis

1,313 search results

직렬-최대 공차 탐색을 사용한 비스트형 DS-SS 초기 동기 시스템의 성능 분석 -수정된 상태 천이도 접근법- (Performance Analysis of a Burst-Format DS-SS Acquisition System Using a Serial-Maximum Likelihood Search: A Modified State Transition Diagram Approach)

  • 이독욱;김근묵;황금찬
    • 전자공학회논문지A / Vol. 28A No. 11 / pp.855-865 / 1991
  • A simple method for the analysis of serial-search burst-format direct-sequence spread-spectrum (DS-SS) code acquisition systems was recently proposed, and the effect of the code autocorrelation sidelobes on acquisition performance was assessed therein [4],[5]. In this paper, a new hybrid scheme combining the serial search with a maximum likelihood search is proposed to eliminate the deleterious effect of the code autocorrelation sidelobes, and its performance is analyzed for additive white Gaussian noise (AWGN) channels. To analyze the performance of this system, we generalize the method of [4],[5] to arbitrary burst-format DS-SS acquisition schemes. The results show that the new hybrid scheme has good immunity against the effect of code autocorrelation sidelobes. The validity of the presented approach has been confirmed by simulation results.

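The hybrid search described above can be sketched in a few lines: a serial threshold test screens the code offsets, and a maximum-likelihood (largest-correlation) decision resolves the survivors. The 15-chip m-sequence, noiseless channel, and threshold value below are illustrative assumptions, not the paper's exact system.

```python
def pn_sequence(length, seed=0b1001):
    # 4-bit LFSR (taps for x^4 + x + 1) mapped to a +/-1 chip sequence;
    # an illustrative 15-chip m-sequence, not the paper's spreading code.
    state, seq = seed, []
    for _ in range(length):
        seq.append(1 if state & 1 else -1)
        fb = (state ^ (state >> 1)) & 1
        state = (state >> 1) | (fb << 3)
    return seq

def correlate(rx, code, offset):
    # Periodic correlation of the received chips with the local code
    # at a hypothesized offset.
    n = len(code)
    return sum(rx[(offset + i) % n] * code[i] for i in range(n))

def hybrid_acquire(rx, code, threshold):
    # Serial search: keep every offset whose correlation clears the threshold;
    # then a maximum-likelihood decision picks the largest-correlation
    # survivor. Sidelobes below the threshold can no longer cause a false
    # lock, which is the point of the hybrid scheme.
    n = len(code)
    scores = [(correlate(rx, code, k), k) for k in range(n)]
    survivors = [s for s in scores if s[0] >= threshold]
    return max(survivors if survivors else scores)[1]
```

For a 15-chip m-sequence the in-phase correlation is 15 and every sidelobe is -1, so any threshold between those values isolates the true offset.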

균열발생시기 결정을 위한 항공기 엔진 구성품의 비파괴검사 결과에 대한 통계적 분석 (Statistical Analysis for NDI Results of Aircraft Engine Component for Determining Crack Initiation Period)

  • 최재만;권영한;최환서;양승효;우상욱;조순미;이승주
    • 대한기계학회논문집A / Vol. 33 No. 12 / pp.1482-1487 / 2009
  • In this study, statistical analysis was performed on NDI (Non-Destructive Inspection) results for the F100 engine front seal support assembly. The NDI results can be statistically treated as quantal response data. Through MLE (maximum likelihood estimation) on the quantal response data, the normal distribution is found to be the probability distribution best suited to the failure data. Moreover, the cumulative distribution function, the failure rate function, and the B-life are calculated for the fitted distribution.
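Maximum likelihood on quantal response data can be illustrated with a small sketch: each inspection reports only crack / no crack at a known usage, and the normal model enters through Phi((usage - mu) / sigma). The grid search, the data, and the parameter ranges below are illustrative assumptions; the study's actual computation is not given in the abstract.

```python
import math

def norm_cdf(t, mu, sigma):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf((t - mu) / (sigma * math.sqrt(2.0))))

def quantal_loglik(data, mu, sigma):
    # data: (usage, cracked) pairs; a part inspected at `usage` cycles is
    # cracked with probability Phi((usage - mu) / sigma) under a normal
    # crack-initiation model.
    ll = 0.0
    for t, cracked in data:
        p = min(max(norm_cdf(t, mu, sigma), 1e-12), 1.0 - 1e-12)
        ll += math.log(p if cracked else 1.0 - p)
    return ll

def quantal_mle(data, mus, sigmas):
    # Crude grid-search MLE; real analyses use Newton-type optimizers,
    # but the likelihood being maximized is the same.
    best = max((quantal_loglik(data, m, s), m, s) for m in mus for s in sigmas)
    return best[1], best[2]
```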

SAMPLE ENTROPY IN ESTIMATING THE BOX-COX TRANSFORMATION

  • Rahman, Mezbahur;Pearson, Larry M.
    • Journal of the Korean Data and Information Science Society / Vol. 12 No. 1 / pp.103-125 / 2001
  • The Box-Cox transformation is a well-known family of power transformations that brings a set of data into agreement with the normality assumption on the residuals, and hence on the response variable, of a postulated regression model. This paper proposes a new method for estimating the Box-Cox transformation parameter by maximizing the sample entropy statistic, which forces the transformed data as close to normality as possible. A comparative study of the proposed procedure with the maximum likelihood procedure, the procedure via artificial regression estimation, and the recently introduced procedure maximizing the Shapiro-Francia W' statistic is given. In addition, we provide a table of the optimal spacings parameter for computing the sample entropy statistic.

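Choosing the Box-Cox parameter by maximizing sample entropy can be sketched with Vasicek's spacing estimator: since the normal maximizes entropy among distributions with fixed variance, the transformed data are standardized before the entropy is computed. The candidate grid and the spacings parameter m below are illustrative assumptions; the paper tabulates optimal spacings.

```python
import math

def boxcox(x, lam):
    # Box-Cox power transformation of a positive observation
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def sample_entropy(data, m):
    # Vasicek's spacing-based entropy estimator H_{m,n}
    n, y = len(data), sorted(data)
    total = 0.0
    for i in range(n):
        lo, hi = y[max(i - m, 0)], y[min(i + m, n - 1)]
        total += math.log(n * (hi - lo) / (2.0 * m))
    return total / n

def entropy_boxcox(data, lams, m=2):
    # Pick the lambda whose standardized transform has maximal sample
    # entropy: the normal is entropy-maximal at fixed variance, so this
    # pushes the transformed data toward normality.
    def std_entropy(lam):
        z = [boxcox(x, lam) for x in data]
        mean = sum(z) / len(z)
        sd = math.sqrt(sum((v - mean) ** 2 for v in z) / len(z))
        return sample_entropy([(v - mean) / sd for v in z], m)
    return max(lams, key=std_entropy)
```

On lognormal-looking data, the log transform (lambda = 0) wins the entropy comparison against power transforms that leave the sample skewed.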

구상흑연주철의 피로수명분포에 대한 통계적 해석 (A Statistical Analysis on Fatigue Life Distribution in Spheroidal Graphite Cast Iron)

  • 장성수;김상태
    • 대한기계학회논문집A / Vol. 24 No. 9 / pp.2353-2360 / 2000
  • Statistical fatigue properties of metallic materials are increasingly required for reliability design. In this study, static and fatigue tests were conducted, and the normal, log-normal, and two-parameter Weibull distributions were compared at the 5% significance level using the Kolmogorov-Smirnov goodness-of-fit test. Parameter estimates obtained by the maximum likelihood method and the least squares method were compared with the experimental results. It is found that the two-parameter Weibull distribution with maximum likelihood estimation provides a good fit for the static and fatigue life data and is therefore applicable to static and fatigue life analysis of spheroidal graphite cast iron. The P-S-N curves were evaluated using the log-normal distribution, which described the fatigue life behavior very well.
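A minimal sketch of the two ingredients named above: maximum likelihood fitting of a two-parameter Weibull, and the Kolmogorov-Smirnov statistic used for the goodness-of-fit comparison. The damped fixed-point iteration for the shape parameter is one common way to solve the likelihood equation; the life data below are synthetic, not the paper's.

```python
import math

def weibull_mle(x, iters=200):
    # Two-parameter Weibull MLE: damped fixed-point iteration on the shape k
    # (the stationarity condition 1/k = sum(x^k ln x)/sum(x^k) - mean(ln x)
    # rearranged), then the closed-form scale.
    n = len(x)
    mean_log = sum(math.log(v) for v in x) / n
    k = 1.0
    for _ in range(iters):
        s1 = sum(v ** k for v in x)
        s2 = sum(v ** k * math.log(v) for v in x)
        k = 0.5 * (k + 1.0 / (s2 / s1 - mean_log))
    scale = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, scale

def ks_statistic(x, cdf):
    # Kolmogorov-Smirnov distance between the empirical CDF and a fitted CDF
    y = sorted(x)
    n = len(y)
    return max(max(i / n - cdf(y[i - 1]), cdf(y[i - 1]) - (i - 1) / n)
               for i in range(1, n + 1))
```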

New approach for analysis of progressive Type-II censored data from the Pareto distribution

  • Seo, Jung-In;Kang, Suk-Bok;Kim, Ho-Yong
    • Communications for Statistical Applications and Methods / Vol. 25 No. 5 / pp.569-575 / 2018
  • The Pareto distribution is important for analyzing data in the actuarial sciences, reliability, finance, and climatology. In general, the unknown parameters of the Pareto distribution are estimated by the maximum likelihood method, which may yield inadequate inference for small sample sizes and heavily censored data. In this paper, a new approach based on a regression framework is proposed to estimate the unknown parameters of the Pareto distribution under the progressive Type-II censoring scheme. The proposed method provides a new regression-type estimator that employs the spacings of exponential progressive Type-II censored samples. In addition, the provided estimator is consistent and outperforms the maximum likelihood estimators in terms of mean squared error and bias. The validity of the proposed method is assessed through Monte Carlo simulations and real data analysis.
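For context, the maximum likelihood baseline the abstract refers to has a closed form for a complete (uncensored) Pareto sample, sketched below; the censored-data versions and the paper's regression-type estimator are more involved and are not reproduced here.

```python
import math

def pareto_mle(x):
    # Closed-form MLE for a complete sample from a Pareto(x_m, alpha)
    # distribution with density alpha * x_m**alpha / t**(alpha + 1), t >= x_m:
    # the scale estimate is the sample minimum, and the shape estimate has
    # a log-sum form.
    xm = min(x)
    alpha = len(x) / sum(math.log(v / xm) for v in x)
    return xm, alpha
```

These estimators are consistent but, as the abstract notes, can behave poorly for small or heavily censored samples, which is what motivates the regression-type alternative.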

A visualizing method for investigating individual frailties using frailtyHL R-package

  • Ha, Il Do;Noh, Maengseok
    • Journal of the Korean Data and Information Science Society / Vol. 24 No. 4 / pp.931-940 / 2013
  • For the analysis of clustered survival data, inference for the parameters of semi-parametric frailty models has been widely studied. It is also important to investigate the potential heterogeneity in event times among clusters (e.g., centers, patients). For this purpose, interval estimation of the frailties is useful. In this paper, we propose a visualization method that presents confidence intervals of individual frailties across clusters using the frailtyHL R package, which implements h-likelihood methods for frailty models. The proposed method is demonstrated on two practical examples.

Bayesian Estimation of Three-parameter Bathtub Shaped Lifetime Distribution Based on Progressive Type-II Censoring with Binomial Removal

  • Chung, Younshik
    • Journal of the Korean Data Analysis Society / Vol. 20 No. 6 / pp.2747-2757 / 2018
  • We consider the MLE (maximum likelihood estimate) and Bayesian estimates of the three-parameter bathtub-shaped lifetime distribution based on progressive Type-II censoring with binomial removal. Jung and Chung (2018) proposed the three-parameter bathtub-shaped distribution, an extension of the two-parameter bathtub-shaped distribution of Zhang (2004), and investigated its properties and estimation. The maximum likelihood estimates are computed using the Newton-Raphson algorithm, and the Bayesian estimates are obtained under the balanced loss function using the MCMC (Markov chain Monte Carlo) method. In particular, the BSEL (balanced squared error loss) function is considered as a special form of the balanced loss function of Zellner (1994). Simulations are performed to compare the MLEs with the corresponding Bayes estimates; they show that the Bayes estimates are better than the MLEs in terms of risk. Finally, concluding remarks are given.
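The MCMC machinery the abstract relies on can be illustrated with a much simpler model: a random-walk Metropolis sampler whose posterior mean is the Bayes estimate under squared error loss (the special case of BSEL with no weight on the MLE). The Exponential likelihood with a Gamma prior below is an illustrative stand-in for the three-parameter bathtub model, chosen because its posterior is known exactly.

```python
import math, random

def log_post(lam, data, a=2.0, b=1.0):
    # Log posterior kernel: Exponential(lam) likelihood x Gamma(a, b) prior
    if lam <= 0:
        return -math.inf
    return (a + len(data) - 1) * math.log(lam) - (b + sum(data)) * lam

def mh_posterior_mean(data, iters=20000, step=0.3, seed=1):
    # Random-walk Metropolis sampler; the mean of the retained draws is the
    # Bayes estimate under squared error loss.
    rng = random.Random(seed)
    lam, draws = 1.0, []
    for i in range(iters):
        prop = lam + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop, data) - log_post(lam, data):
            lam = prop
        if i >= iters // 4:        # discard burn-in
            draws.append(lam)
    return sum(draws) / len(draws)
```

With data whose exact posterior is Gamma(a + n, b + sum(x)), the sampler's mean can be checked against the analytic posterior mean.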

Novel estimation based on a minimum distance under the progressive Type-II censoring scheme

  • Young Eun Jeon;Suk-Bok Kang;Jung-In Seo
    • Communications for Statistical Applications and Methods / Vol. 30 No. 4 / pp.411-421 / 2023
  • This paper provides a new estimating equation based on the concept of a minimum distance between the empirical and theoretical distribution functions under the most widely used progressive Type-II censoring scheme. For illustration, simulated and real datasets from a three-parameter Weibull distribution are analyzed. For comparison, the most popular estimation methods, maximum likelihood and maximum product of spacings, are developed alongside. In the analysis of the simulated datasets, the superiority of the proposed method is demonstrated through the rate of estimation failure of the likelihood-based method, and its validity is demonstrated through the mean squared errors and biases of the estimators obtained from the proposed estimating equation. In the analysis of the real dataset, two types of goodness-of-fit tests are performed to examine whether the observed dataset follows the three-parameter Weibull distribution under the progressive Type-II censoring scheme, through which the performance of the new estimating equation is assessed.
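The minimum-distance idea, choosing the parameter so that the theoretical CDF is closest to the empirical one, can be sketched with a one-parameter stand-in for the paper's three-parameter Weibull setting: an exponential rate chosen by golden-section search on a Cramer-von Mises-type distance. The distance form, plotting positions, and search interval are illustrative assumptions.

```python
import math

def cvm_distance(rate, x):
    # Cramer-von Mises-type distance between the Exponential(rate) CDF and
    # the empirical CDF evaluated at the plotting positions (i + 0.5) / n
    y = sorted(x)
    n = len(y)
    return sum(((1.0 - math.exp(-rate * y[i])) - (i + 0.5) / n) ** 2
               for i in range(n))

def min_distance_rate(x, lo=1e-3, hi=50.0, iters=80):
    # Golden-section search for the distance-minimizing rate (assumes the
    # distance is unimodal in the rate, which holds for well-behaved samples)
    g = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if cvm_distance(c, x) < cvm_distance(d, x):
            b = d
        else:
            a = c
    return (a + b) / 2.0
```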

우리나라 김과 참치의 수출 결정요인 분석 : 중력모형을 이용하여 (Analysis of Determinants of Export of Korean Laver and Tuna: Using the Gravity Model)

  • 김은지;김봉태
    • 수산경영론집 / Vol. 51 No. 4 / pp.85-96 / 2020
  • The purpose of this study is to identify the determinants of exports of Korean fishery products. For the analysis, laver and tuna, which together account for almost half of seafood exports, were selected, and a gravity model widely used in trade analysis was applied. GDP, the number of overseas Koreans, the exchange rate, FTA membership, and WTO membership were used as explanatory variables, and fixed-effect terms were included to account for the multilateral resistance that hinders trade. The analysis period is 2000 to 2018, and the Poisson pseudo-maximum likelihood (PPML) method was applied to handle the zero observations and heteroscedasticity inherent in trade data. The analysis found that GDP has a significant positive effect on both laver and tuna. The number of overseas Koreans was significant for canned tuna exports, but not for laver or the other tuna products. As the exchange rate increased, exports of laver and of tuna for sashimi increased. The impact of the FTA was confirmed for exports of dried laver and raw tuna, which supports the results of previous studies. WTO membership was not significant for laver or tuna. Based on these results, it is necessary to make good use of FTAs to expand seafood exports.
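A PPML estimator in its simplest form is just a Poisson regression fitted by maximizing the pseudo-likelihood; unlike log-linear OLS, it is defined even when some trade flows are zero. The one-covariate Newton-Raphson sketch below is illustrative; the paper's specification has multiple regressors and fixed-effect terms.

```python
import math

def ppml_fit(x, y, iters=50):
    # Poisson pseudo-maximum likelihood for log E[y] = b0 + b1 * x, fitted
    # by Newton-Raphson on the Poisson score. y only needs to be
    # non-negative (zeros included); it is not assumed integer or Poisson.
    b0, b1 = math.log(sum(y) / len(y)), 0.0   # start at the intercept-only fit
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * v) for v in x]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))              # score, b0
        g1 = sum((yi - mi) * v for yi, mi, v in zip(y, mu, x))  # score, b1
        h00 = sum(mu)                            # negative Hessian entries
        h01 = sum(mi * v for mi, v in zip(mu, x))
        h11 = sum(mi * v * v for mi, v in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det        # solve the 2x2 Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

When y equals the conditional mean exp(b0 + b1 * x) exactly, the score is zero at the true coefficients and Newton's method recovers them.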