• Title/Summary/Keyword: monte carlo methods


A Feasible Two-Step Estimator for Seasonal Cointegration

  • Seong, Byeong-Chan
    • Communications for Statistical Applications and Methods / v.15 no.3 / pp.411-420 / 2008
  • This paper considers a feasible two-step estimator for seasonal cointegration as an extension of Brüggemann and Lütkepohl (2005). It is shown that the reduced-rank maximum likelihood (ML) estimator for seasonal cointegration can still produce occasional outliers, as in non-seasonal cointegration, although their sizes are not as extreme as those in the non-seasonal case. The ML estimator (MLE) is compared with the two-step estimator in a small Monte Carlo simulation study, and we find that the two-step estimator can be an attractive alternative to the MLE, especially in small samples.
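
As a rough illustration of the kind of small Monte Carlo comparison described above, the following sketch contrasts the tail behavior of two estimators of a cointegrating coefficient in a simplified, non-seasonal bivariate system; Johansen's reduced-rank ML routine from statsmodels stands in for the seasonal reduced-rank MLE and a plain OLS regression stands in for a two-step-style estimator, so the data-generating process and both estimators are illustrative assumptions rather than the paper's.

```python
# Minimal sketch, not the paper's method: a small Monte Carlo comparing the tail
# behaviour of two estimators of a cointegrating coefficient in a simplified,
# non-seasonal bivariate system.  All constants and the DGP are illustrative.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)
BETA_TRUE, T, REPS = 1.0, 100, 500

def simulate(T):
    """y1 is a random walk; y2 = BETA_TRUE*y1 + AR(1) noise, so they are cointegrated."""
    y1 = np.cumsum(rng.standard_normal(T))
    u = np.zeros(T)
    for t in range(1, T):
        u[t] = 0.5 * u[t - 1] + rng.standard_normal()
    return np.column_stack([y1, BETA_TRUE * y1 + u])

err_ml, err_ols = [], []
for _ in range(REPS):
    data = simulate(T)
    # reduced-rank ML flavour: leading Johansen eigenvector gives the long-run relation
    evec = coint_johansen(data, det_order=0, k_ar_diff=1).evec[:, 0]
    err_ml.append(abs(-evec[0] / evec[1] - BETA_TRUE))
    # simple one-step OLS of y2 on y1 (with intercept) as a two-step-style stand-in
    slope = np.polyfit(data[:, 0], data[:, 1], 1)[0]
    err_ols.append(abs(slope - BETA_TRUE))

for name, e in [("reduced-rank ML", np.array(err_ml)), ("OLS", np.array(err_ols))]:
    print(f"{name:16s} median |error| {np.median(e):.3f}  99% |error| {np.quantile(e, 0.99):.3f}")
```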

Accuracy Measures of Empirical Bayes Estimator for Mean Rates

  • Jeong, Kwang-Mo
    • Communications for Statistical Applications and Methods / v.17 no.6 / pp.845-852 / 2010
  • Count outcomes commonly occur in disease mapping of mortality or disease rates. A Poisson distribution is usually assumed as a model for disease rates, in conjunction with a gamma prior. A small area typically refers to a small geographical area or demographic group for which very little information is available from sample surveys. In this situation, model-based estimation is very popular, in which auxiliary variables from various administrative sources are used. The empirical Bayes estimator under the Poisson-gamma model is considered together with its accuracy measures. An accuracy measure based on bootstrap samples adjusts for the underestimation incurred when the posterior variance is used as an estimator of the true mean squared error. We explain the suggested method through a practical dataset of hitters in baseball games, and perform a Monte Carlo study to compare the accuracy measures of the mean squared error.
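
The Poisson-gamma empirical Bayes setup lends itself to a compact sketch. The snippet below uses illustrative constants and a simple moment-based hyperparameter estimate of my own choosing (not the paper's, and without the bootstrap adjustment); it simulates small-area counts and compares the naive posterior variance with the Monte Carlo MSE of the EB estimator, the gap being what a bootstrap-based accuracy measure is meant to correct.

```python
# Minimal sketch of a Poisson-gamma empirical Bayes estimator for small-area rates,
# with a Monte Carlo check of how the naive posterior variance compares with the
# true MSE of the EB estimator.  Hyperparameters are moment-matched; all constants
# are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(1)
A_TRUE, B_TRUE = 2.0, 4.0                  # theta_i ~ Gamma(A, B): mean A/B = 0.5
N_AREAS, REPS = 30, 2000
expos = rng.uniform(5, 50, size=N_AREAS)   # exposures e_i

def eb_estimate(y, e):
    """EB rate estimates (A_hat + y)/(B_hat + e) with moment-matched hyperparameters."""
    r = y / e
    m, v = r.mean(), r.var(ddof=1)
    b_hat = m / max(v - m * np.mean(1.0 / e), 1e-6)   # from Var(y/e) ~= m/e + A/B^2
    a_hat = m * b_hat
    est = (a_hat + y) / (b_hat + e)                    # posterior mean of Gamma(a_hat+y, b_hat+e)
    post_var = (a_hat + y) / (b_hat + e) ** 2          # posterior variance
    return est, post_var

sq_err = np.zeros(N_AREAS)
naive_var = np.zeros(N_AREAS)
for _ in range(REPS):
    theta = rng.gamma(A_TRUE, 1.0 / B_TRUE, size=N_AREAS)
    y = rng.poisson(expos * theta)
    est, pv = eb_estimate(y, expos)
    sq_err += (est - theta) ** 2
    naive_var += pv

print("true MSE (Monte Carlo):", np.round(sq_err / REPS, 4)[:5])
print("avg posterior variance:", np.round(naive_var / REPS, 4)[:5])
```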

Test procedures for the mean and variance simultaneously under normality

  • Park, Hyo-Il
    • Communications for Statistical Applications and Methods / v.23 no.6 / pp.563-574 / 2016
  • In this study, we propose several simultaneous tests to detect differences in both means and variances in the two-sample problem when the underlying distribution is normal. For this, we apply the likelihood ratio principle and propose a likelihood ratio test. We then consider a union-intersection test after identifying the likelihood statistic as a product of two individual likelihood statistics that test the individual sub-null hypotheses. Noting that the union-intersection test can be regarded as a simultaneous test with a combination function, we also propose simultaneous tests with combination functions that combine the individual tests for each sub-null hypothesis. We apply the permutation principle to obtain the null distributions. We then provide an example to illustrate the proposed procedure and compare the efficiency of the proposed tests through a simulation study. We discuss some interesting features of the simultaneous tests as concluding remarks. Finally, we show that the likelihood ratio statistic can be expressed as a product of two individual likelihood ratio statistics.
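
A minimal sketch of the combination-function idea under the permutation principle: two per-hypothesis statistics (a mean difference and a log variance ratio, chosen here for simplicity rather than the paper's likelihood-based statistics) are recomputed over random relabelings and combined with Fisher's combination function.

```python
# Minimal sketch of a permutation-based simultaneous test for equal means and
# variances in two samples, combining per-hypothesis statistics with Fisher's
# combination function.  The statistics and combination are illustrative choices.
import numpy as np

rng = np.random.default_rng(2)

def simultaneous_perm_test(x, y, n_perm=2000):
    z = np.concatenate([x, y])
    n = len(x)

    def stats(a, b):
        t_mean = abs(a.mean() - b.mean())                    # evidence against equal means
        t_var = abs(np.log(a.var(ddof=1) / b.var(ddof=1)))   # evidence against equal variances
        return t_mean, t_var

    S = np.empty((n_perm + 1, 2))
    S[0] = stats(x, y)                                       # observed statistics in row 0
    for b in range(1, n_perm + 1):
        idx = rng.permutation(len(z))
        S[b] = stats(z[idx[:n]], z[idx[n:]])

    # per-statistic permutation p-values (larger statistic -> smaller p-value)
    P = np.array([(S[:, j][None, :] >= S[:, j][:, None]).mean(axis=1) for j in range(2)]).T
    C = -2.0 * np.log(P).sum(axis=1)                         # Fisher combination per permutation
    return (C >= C[0]).mean()                                # combined permutation p-value

x = rng.normal(0.0, 1.0, 25)
y = rng.normal(0.5, 1.5, 25)
print("simultaneous p-value:", simultaneous_perm_test(x, y))
```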

A generalized regime-switching integer-valued GARCH(1, 1) model and its volatility forecasting

  • Lee, Jiyoung;Hwang, Eunju
    • Communications for Statistical Applications and Methods / v.25 no.1 / pp.29-42 / 2018
  • We combine the integer-valued GARCH(1, 1) model with a generalized regime-switching model to propose a dynamic count time series model. Our model adopts Markov chains with time-varying, dependent transition probabilities to model dynamic count time series, called generalized regime-switching integer-valued GARCH(1, 1) (GRS-INGARCH(1, 1)) models. We derive a recursive formula for the conditional probability of the regime of the Markov chain given the past information, in terms of the transition probabilities of the Markov chain and the Poisson parameters of the INGARCH(1, 1) process. In addition, we study the forecasting of the Poisson parameter as well as the cumulative impulse response function of the model, which is a measure of the persistence of volatility. A Monte Carlo simulation is conducted to examine the performance of volatility forecasting, the behavior of the cumulative impulse response coefficients, and conditional maximum likelihood estimation; a real data application is also given.
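
A minimal simulation sketch of a two-regime Markov-switching INGARCH(1, 1) count series; the transition matrix is held constant here (unlike the paper's time-varying transition probabilities) and all parameter values are illustrative.

```python
# Minimal sketch: simulate a two-regime Markov-switching INGARCH(1,1) count series
# and report one-step-ahead intensity forecasts conditional on each regime.
# Constant transition matrix and parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

OMEGA = np.array([1.0, 3.0])   # regime-specific intercepts
ALPHA = np.array([0.3, 0.5])   # loading on the previous count
BETA  = np.array([0.4, 0.3])   # loading on the previous intensity
P = np.array([[0.95, 0.05],    # P[i, j] = Pr(regime j at t | regime i at t-1)
              [0.10, 0.90]])

def simulate_grs_ingarch(T=500):
    y = np.zeros(T, dtype=int)
    lam = np.zeros(T)
    s = np.zeros(T, dtype=int)
    lam[0] = OMEGA[0] / (1 - ALPHA[0] - BETA[0])     # start near the regime-0 mean
    y[0] = rng.poisson(lam[0])
    for t in range(1, T):
        s[t] = rng.choice(2, p=P[s[t - 1]])
        k = s[t]
        lam[t] = OMEGA[k] + ALPHA[k] * y[t - 1] + BETA[k] * lam[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam, s

y, lam, s = simulate_grs_ingarch()
# one-step-ahead intensity forecast conditional on each possible next regime
next_lam = OMEGA + ALPHA * y[-1] + BETA * lam[-1]
print("conditional one-step forecasts per regime:", np.round(next_lam, 2))
```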

A class of CUSUM tests using empirical distributions for tail changes in weakly dependent processes

  • Kim, JunHyeong;Hwang, Eunju
    • Communications for Statistical Applications and Methods / v.27 no.2 / pp.163-175 / 2020
  • We consider a wide class of general weakly dependent processes, called ψ-weak dependence, which unifies almost all weak dependence structures of interest in statistics under natural conditions on the process parameters, such as mixing, association, Bernoulli shifts, and Markovian sequences. To detect the tail behavior of weakly dependent processes, change point tests are developed by means of cumulative sum (CUSUM) statistics with empirical distribution functions of sample extremes. The null limiting distribution is established as a Brownian bridge. Its proof is based on the ψ-weak dependence structure and the existence of the phantom distribution function of stationary weakly dependent processes. A Monte Carlo study is conducted to examine the size and power of the CUSUM tests in GARCH(1, 1) models; in addition, real data applications are given with log-returns of financial data such as the Korean stock price index.
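
As a toy version of such a CUSUM test, the sketch below builds a CUSUM statistic from tail-exceedance indicators (not the paper's ECDF-of-extremes construction), uses a Bartlett long-run variance as a crude adjustment for weak dependence, and checks the empirical size under a GARCH(1, 1) null against the 5% point of the supremum of a Brownian bridge. All tuning constants are illustrative.

```python
# Toy CUSUM check of tail stability under weak dependence: exceedance indicators
# of a high empirical quantile, Bartlett long-run variance, and a small Monte
# Carlo estimate of the empirical size under a GARCH(1,1) null.
import numpy as np

rng = np.random.default_rng(4)

def garch11(n, omega=0.05, alpha=0.1, beta=0.85):
    x, sig2 = np.zeros(n), np.zeros(n)
    sig2[0] = omega / (1 - alpha - beta)
    x[0] = np.sqrt(sig2[0]) * rng.standard_normal()
    for t in range(1, n):
        sig2[t] = omega + alpha * x[t - 1] ** 2 + beta * sig2[t - 1]
        x[t] = np.sqrt(sig2[t]) * rng.standard_normal()
    return x

def cusum_tail_stat(x, q=0.95, lags=10):
    z = (x > np.quantile(x, q)).astype(float)
    z -= z.mean()
    n = len(z)
    # Bartlett long-run variance of the centred exceedance indicators
    lrv = z @ z / n
    for k in range(1, lags + 1):
        lrv += 2 * (1 - k / (lags + 1)) * (z[k:] @ z[:-k]) / n
    bridge = np.abs(np.cumsum(z))            # z is centred, so this is the CUSUM bridge
    return bridge.max() / np.sqrt(n * lrv)

CRIT_5PCT = 1.358                            # 5% point of sup |Brownian bridge|
rejects = [cusum_tail_stat(garch11(2000)) > CRIT_5PCT for _ in range(200)]
print("empirical size under GARCH(1,1) null:", np.mean(rejects))
```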

Bayesian and maximum likelihood estimation of entropy of the inverse Weibull distribution under generalized type I progressive hybrid censoring

  • Lee, Kyeongjun
    • Communications for Statistical Applications and Methods / v.27 no.4 / pp.469-486 / 2020
  • Entropy is an important quantity in statistical mechanics that was originally defined through the second law of thermodynamics. In this paper, we consider maximum likelihood estimation (MLE), maximum product spacings estimation (MPSE) and Bayesian estimation of the entropy of an inverse Weibull distribution (InW) under a generalized type I progressive hybrid censoring scheme (GePH). The MLE and MPSE of the entropy cannot be obtained in closed form; therefore, we propose using the Newton-Raphson algorithm to compute them. Further, the Bayesian estimators of the entropy of the InW based on the squared error loss function (SqL), precautionary loss function (PrL), general entropy loss function (GeL) and linex loss function (LiL) are derived. In addition, we derive Lindley's approximation (LiA) of the Bayesian estimates. Monte Carlo simulations are conducted to compare the results among the MLE, MPSE, and Bayesian estimators. A real data set based on the GePH is also analyzed for illustrative purposes.
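
A much-simplified sketch of the estimation target: numerical MLE of inverse Weibull (Fréchet-type) parameters from a complete, uncensored sample, followed by a plug-in Monte Carlo estimate of the entropy H = E[-log f(X)]. The censoring scheme is omitted and Nelder-Mead stands in for Newton-Raphson, so this illustrates the quantities involved rather than the paper's procedure.

```python
# Complete-sample sketch: numerical MLE of inverse Weibull (Frechet) parameters,
# then a plug-in Monte Carlo estimate of the entropy H = E[-log f(X)].
# Censoring is not modelled; Nelder-Mead replaces Newton-Raphson for simplicity.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
BETA_TRUE, SIGMA_TRUE = 2.0, 1.5          # shape, scale of F(x) = exp(-(x/sigma)^(-beta))

def rinvweibull(n, beta, sigma):
    u = rng.uniform(size=n)
    return sigma * (-np.log(u)) ** (-1.0 / beta)     # inverse-CDF sampling

def logpdf(x, beta, sigma):
    z = (x / sigma) ** (-beta)
    return np.log(beta / sigma) + (-beta - 1) * np.log(x / sigma) - z

x = rinvweibull(200, BETA_TRUE, SIGMA_TRUE)

def negloglik(par):
    beta, sigma = par
    if beta <= 0 or sigma <= 0:
        return np.inf
    return -logpdf(x, beta, sigma).sum()

fit = minimize(negloglik, x0=[1.0, 1.0], method="Nelder-Mead")
beta_hat, sigma_hat = fit.x

# plug-in Monte Carlo entropy: average of -log f at draws from the fitted model
sim = rinvweibull(100_000, beta_hat, sigma_hat)
entropy_hat = -logpdf(sim, beta_hat, sigma_hat).mean()
print("MLE:", np.round(fit.x, 3), " entropy estimate:", round(entropy_hat, 3))
```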

Tolerance Analysis of Automobile Steering System (Tolerance Analysis of a U-joint Assembly for Commercial Vehicle Steering Systems)

  • Lee, Jang-Yong
    • Journal of the Korean Society for Precision Engineering / v.28 no.12 / pp.1397-1402 / 2011
  • The quality of goods manufactured under a mass-production system depends largely on the accuracy of quality control. Tolerance analysis is a useful method for setting up a guide for product inspection. However, strict tolerances usually incur very high manufacturing costs. The main concern of tolerance analysis is to find optimal tolerance values that satisfy both quality and cost requirements. This paper presents three tolerance analysis methods and their corresponding results for an automobile steering system, and analyzes the merits and demerits of each method.
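
The abstract does not name the three methods, but tolerance stack-ups are commonly analyzed by worst-case, root-sum-square (RSS), and Monte Carlo methods; the sketch below compares the three on an invented linear chain of dimensions, not the paper's U-joint assembly data.

```python
# Illustrative tolerance stack-up on a hypothetical linear chain of dimensions
# (nominal +/- tolerance), comparing worst-case, RSS, and Monte Carlo analyses.
# The dimensions and spec limits below are invented for illustration.
import numpy as np

rng = np.random.default_rng(6)

# (nominal [mm], symmetric tolerance [mm]) for each stacked dimension
dims = [(25.0, 0.10), (12.0, 0.05), (40.0, 0.15), (8.0, 0.05)]
nom = sum(n for n, _ in dims)
tols = np.array([t for _, t in dims])

worst_case = tols.sum()                          # all tolerances at their extremes
rss = np.sqrt((tols ** 2).sum())                 # statistical root-sum-square spread

# Monte Carlo: each dimension ~ Normal(nominal, tol/3), i.e. tol treated as 3-sigma
N = 200_000
samples = np.array([rng.normal(n, t / 3.0, N) for n, t in dims]).sum(axis=0)
spec_lo, spec_hi = nom - 0.25, nom + 0.25        # hypothetical assembly spec limits
out_of_spec = np.mean((samples < spec_lo) | (samples > spec_hi))

print(f"nominal stack      : {nom:.3f} mm")
print(f"worst-case spread  : +/- {worst_case:.3f} mm")
print(f"RSS spread (3-sig) : +/- {rss:.3f} mm")
print(f"MC out-of-spec rate: {out_of_spec:.5f}")
```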

Enhanced Inter-Symbol Interference Cancellation Scheme for Diffusion Based Molecular Communication using Maximum Likelihood Estimation

  • Raut, Prachi;Sarwade, Nisha
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.10 / pp.5035-5048 / 2016
  • Nano-scale networks are futuristic networks regarded as enablers for the Internet of Nano Things, body area nano networks, target tracking, anomaly/abnormality detection at the molecular level, and neuronal therapy/drug delivery applications. Molecular communication is considered the most compatible communication technology for nano devices. However, connectivity in such networks is very low due to inter-symbol interference (ISI). Few research papers have addressed the issue of ISI mitigation in molecular communication, and many of the existing methods are not adaptive to dynamic environmental conditions. This paper presents an enhancement over the original Memory-1 ISI cancellation scheme, using maximum likelihood estimation of a channel parameter (λ) to make it adaptable to variable channel conditions. Results of the Monte Carlo simulation show that the connectivity (Pconn) improves by 28% for the given simulation parameters and environmental conditions when the enhanced Memory-1 cancellation method is used. Moreover, this ISI mitigation method allows a reduction in symbol time (Ts) up to 50 seconds, i.e., an improvement of 75% is achieved.
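
A simplified sketch of the maximum likelihood idea: if received molecule counts in a slot are modeled as Poisson with mean λ(h0·b_t + h1·b_{t-1}), then λ has a closed-form ML estimate from known pilot bits. The Poisson assumption and the taps h0, h1 are illustrative stand-ins for the paper's diffusion channel model.

```python
# Simplified sketch of estimating a channel-strength parameter lambda by maximum
# likelihood in a diffusion-style molecular link with one-symbol memory (ISI).
# The Poisson count model and the taps H0, H1 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)

LAM_TRUE = 40.0          # expected molecules counted per transmitted '1' at gain 1
H0, H1 = 0.6, 0.2        # fraction arriving in the current / next symbol slot
N_PILOT = 200

bits = rng.integers(0, 2, size=N_PILOT)
prev = np.concatenate([[0], bits[:-1]])
mean_shape = H0 * bits + H1 * prev                 # known "shape" m_t of the Poisson mean
counts = rng.poisson(LAM_TRUE * mean_shape)

# For y_t ~ Poisson(lambda * m_t) with known m_t, the ML estimate solves
# sum(y_t / lambda - m_t) = 0, giving lambda_hat = sum(y_t) / sum(m_t).
lam_hat = counts.sum() / mean_shape.sum()
print("true lambda:", LAM_TRUE, " ML estimate from pilots:", round(lam_hat, 2))
```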

A Goodness of Fit Approach to Major Lifetesting Problems

  • Ahmad, Ibrahim A.;Alwasel, Ibrahim A.;Mugdadi, A.R.
    • International Journal of Reliability and Applications / v.2 no.2 / pp.81-97 / 2001
  • Lifetesting problems have been the subject of investigation for over three decades. Most suggested approaches are markedly different from those used in the related but wider goodness of fit problems. In the current investigation, it is demonstrated that a goodness of fit approach is possible in many lifetesting problems and that it results in simpler procedures that are asymptotically equivalent to or better than standard ones. They may also have superior finite sample behavior. Several perennial classes are addressed here. The class of increasing failure rate (IFR) and the class of new better than used (NBU) distributions are addressed first. In addition, we provide tests for a newer and practical class, new better than used in convex ordering (NBUC), due to Cao and Wang (1991). Other classes can be developed similarly, and this point is illustrated with the classes of new better than used in expectation (NBUE) and harmonic new better than used in expectation (HNBUE).
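
One concrete instance of testing against the NBU class, as a hedged sketch rather than the paper's procedures: a simple triple-comparison statistic whose expectation equals exactly 1/4 at the exponential boundary of the class, with its null distribution approximated by Monte Carlo.

```python
# Minimal sketch of a Monte Carlo-calibrated test toward the NBU alternative,
# using T = P_hat(X_i > X_j + X_k), which equals 1/4 in expectation under the
# exponential boundary and is smaller under NBU.  Illustrative, not the paper's tests.
import numpy as np

rng = np.random.default_rng(8)

def triple_stat(x, n_triples=20_000):
    """Monte Carlo estimate of P(X_i > X_j + X_k) over random distinct index triples."""
    n = len(x)
    i, j, k = rng.integers(0, n, size=(3, n_triples))
    ok = (i != j) & (j != k) & (i != k)
    return np.mean(x[i[ok]] > x[j[ok]] + x[k[ok]])

def nbu_pvalue(x, n_null=500):
    """Small statistic values indicate NBU; calibrate under Exp(1) (statistic is scale-free)."""
    t_obs = triple_stat(x)
    null = np.array([triple_stat(rng.exponential(size=len(x))) for _ in range(n_null)])
    return np.mean(null <= t_obs)

sample = rng.weibull(2.0, size=80)      # Weibull with shape > 1 is IFR, hence NBU
print("p-value against exponentiality, NBU direction:", nbu_pvalue(sample))
```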


ONE-DIMENSIONAL TREATMENT OF MOLECULAR LINE RADIATIVE TRANSFER IN CLUMPY CLOUDS

  • Park, Yong-Sun
    • Journal of The Korean Astronomical Society / v.54 no.6 / pp.183-190 / 2021
  • We have revisited Monte Carlo radiative transfer calculations for clumpy molecular clouds. Instead of introducing a three-dimensional geometry to implement clumpy structure, we have made use of its stochastic properties in a one-dimensional geometry. Taking into account the reduction of spontaneous emission and optical depth due to clumpiness, we have derived the excitation conditions of clumpy clouds and compared them with those of three-dimensional calculations. We found that the proposed approach reproduces excitation conditions compatible with those from three-dimensional models, and reveals the dependence of the excitation conditions on the size of the clumps. When bulk motions are involved, the applicability of the approach is less clear, but the one-dimensional approach can be an excellent proxy for more rigorous three-dimensional calculations.
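
A toy one-dimensional illustration of why clumpiness matters, not the paper's line-transfer calculation: Monte Carlo averaging of photon escape over random clump realizations along a line of sight yields a larger escape fraction than a homogeneous slab with the same mean optical depth. The cell count, filling factor, and clump optical depth below are illustrative.

```python
# Toy 1D sketch: photons cross a slab of N cells, each independently occupied by
# an absorbing clump with filling factor f and clump optical depth tau_c.
# Monte Carlo averaging over clump realizations gives the clumpy escape fraction,
# compared against a homogeneous slab with the same mean optical depth.
import numpy as np

rng = np.random.default_rng(9)

N_CELLS, FILL, TAU_CLUMP = 50, 0.2, 0.5
N_PHOTONS = 100_000

# each photon sees its own random clump realization along the line of sight
n_clumps_hit = rng.binomial(N_CELLS, FILL, size=N_PHOTONS)
escaped = rng.uniform(size=N_PHOTONS) < np.exp(-TAU_CLUMP * n_clumps_hit)

mc_escape = escaped.mean()
homogeneous = np.exp(-FILL * N_CELLS * TAU_CLUMP)
print(f"Monte Carlo escape fraction (clumpy): {mc_escape:.4f}")
print(f"homogeneous slab, same mean tau     : {homogeneous:.4f}")
```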