• Title/Summary/Keyword: Classical Statistical Method

Search Results: 109

Modified inverse moment estimation: its principle and applications

  • Gui, Wenhao
    • Communications for Statistical Applications and Methods / v.23 no.6 / pp.479-496 / 2016
  • In this survey, we present a modified inverse moment estimation of parameters and its applications. We use a specific model to demonstrate its principle and how to apply the method in practice. The estimation of unknown parameters is considered. For classical maximum likelihood estimation, a necessary and sufficient condition for the existence and uniqueness of the maximum likelihood estimates of the parameters is obtained. Inverse moment and modified inverse moment estimators are proposed and their properties are studied. Monte Carlo simulations are conducted to compare the performances of these estimators. In terms of bias and mean squared error, the modified inverse moment estimator performs best in all cases considered, followed by the inverse moment estimator and the maximum likelihood estimator, especially for small sample sizes.
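
The abstract does not state which model the survey uses, so as a purely illustrative sketch of the kind of Monte Carlo bias/MSE comparison it describes, the following compares two estimators of an exponential rate: the MLE n/Σxᵢ (biased upward) and the bias-corrected variant (n−1)/Σxᵢ, which stands in for a "modified" estimator and is not the paper's own method.

```python
import numpy as np

def compare_estimators(lam=2.0, n=10, reps=20000, seed=0):
    """Monte Carlo bias/MSE comparison of two estimators of an
    exponential rate: the MLE n/sum(x) and the bias-corrected
    variant (n-1)/sum(x)."""
    rng = np.random.default_rng(seed)
    x = rng.exponential(scale=1.0 / lam, size=(reps, n))
    s = x.sum(axis=1)
    mle = n / s                # classical MLE, biased upward
    corrected = (n - 1) / s    # unbiased for the exponential rate
    out = {}
    for name, est in [("mle", mle), ("corrected", corrected)]:
        out[name] = {"bias": est.mean() - lam,
                     "mse": ((est - lam) ** 2).mean()}
    return out

results = compare_estimators()
```

For small n the correction reduces both bias and MSE, which mirrors the qualitative ranking reported in the abstract.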

Two-dimensional attention-based multi-input LSTM for time series prediction

  • Kim, Eun Been;Park, Jung Hoon;Lee, Yung-Seop;Lim, Changwon
    • Communications for Statistical Applications and Methods / v.28 no.1 / pp.39-57 / 2021
  • Time series prediction is an area of great interest to many people. Algorithms for time series prediction are widely used in many fields such as stock price, temperature, energy and weather forecasting; in addition, classical models as well as recurrent neural networks (RNNs) have been actively developed. After the attention mechanism was introduced to neural network models, many new models with improved performance were developed; models using attention twice have also recently been proposed, resulting in further performance improvements. In this paper, we consider time series prediction by introducing attention twice to an RNN model. The proposed model introduces H-attention and T-attention over output values and time-step information to select useful information. We conduct experiments on stock price, temperature and energy data and confirm that the proposed model outperforms existing models.
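
The H-/T-attention architecture itself is not given in the abstract. As a minimal numpy sketch of the "attend twice" idea only (shapes, weight vectors and the scoring rule are illustrative assumptions, not the paper's model), one can first weight the input features at each time step, then weight the time steps:

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def dual_attention(X, w_feat, w_time):
    """X: (T, D) series. Attend over the D features at each step,
    then over the T time steps, yielding one context vector."""
    alpha = softmax(X * w_feat, axis=1)     # (T, D) feature weights
    Xw = alpha * X                          # feature-weighted inputs
    beta = softmax(Xw @ w_time)             # (T,) temporal weights
    return beta @ Xw, alpha, beta           # context vector: (D,)

rng = np.random.default_rng(1)
T, D = 6, 4
X = rng.normal(size=(T, D))
ctx, alpha, beta = dual_attention(X, rng.normal(size=D), rng.normal(size=D))
```

In a trained model the score functions would be learned layers rather than fixed random vectors; the sketch only shows where the two softmax normalizations act.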

New Calibration Methods with Asymmetric Data

  • Kim, Sung-Su
    • The Korean Journal of Applied Statistics / v.23 no.4 / pp.759-765 / 2010
  • In this paper, two new inverse regression methods are introduced. One is a distance-based method, and the other is a likelihood-based method. While a model is fitted by minimizing the sum of squared prediction errors of the y's or the x's in the classical and inverse methods, respectively, the new distance-based method simultaneously minimizes the sum of both squared prediction errors. In the likelihood-based method, we propose an inverse regression with an Arnold-Beaver Skew Normal (ABSN) error distribution. Using cross validation with an asymmetric real data set, the two new and two existing methods are compared based on the relative prediction bias (RPB) criterion.
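
A minimal sketch of the distance-based idea (the exact objective and the crude grid optimizer here are illustrative assumptions; the paper's fitting procedure may differ): fit y ≈ a + bx by jointly minimizing the squared prediction errors of y given x and of x given y.

```python
import numpy as np

def combined_loss(a, b, x, y):
    """Sum of squared prediction errors in both directions:
    y from x via y ~ a + b*x, and x from y via x ~ (y - a) / b."""
    return ((y - a - b * x) ** 2).sum() + ((x - (y - a) / b) ** 2).sum()

def fit_distance_based(x, y, grid=201):
    """Crude grid search for (a, b) around the OLS fit; a real
    implementation would use a proper numerical optimizer."""
    a0, b0 = np.polyfit(x, y, 1)[::-1]      # OLS start: intercept, slope
    best = (np.inf, a0, b0)
    for a in np.linspace(a0 - 1, a0 + 1, grid):
        for b in np.linspace(b0 * 0.5, b0 * 1.5, grid):
            L = combined_loss(a, b, x, y)
            if L < best[0]:
                best = (L, a, b)
    return best

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 40)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, 40)
loss, a_hat, b_hat = fit_distance_based(x, y)
```

By construction the combined loss at the fitted (a_hat, b_hat) is no worse than at the OLS solution, since OLS only minimizes the first term.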

Watermark Detection Algorithm Using Statistical Decision Theory (통계적 판단 이론을 이용한 워터마크 검출 알고리즘)

  • 권성근;김병주;이석환;권기구;권기용;이건일
    • Journal of the Institute of Electronics Engineers of Korea CI / v.40 no.1 / pp.39-49 / 2003
  • Watermark detection plays a crucial role in copyright protection and authentication for multimedia and has classically been tackled by means of correlation-based algorithms. Nevertheless, when watermark embedding does not obey an additive rule, correlation-based detection is not the optimum choice, so a new detection algorithm is proposed that is optimum for non-additive watermark embedding. Relying on statistical decision theory, the proposed method is derived from Bayes decision theory, the Neyman-Pearson criterion, and the distribution of wavelet coefficients, making it possible to minimize the missed-detection probability subject to a given false-detection probability. The superiority of the proposed method has been tested from a robustness perspective, and the results confirm its superiority over the classical correlation-based method.
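
The Neyman-Pearson recipe in the abstract — maximize detection subject to a fixed false-detection probability — can be sketched under a Gaussian simplification (the paper instead models the actual wavelet-coefficient distribution, and the statistic and watermark below are toy assumptions): compute the likelihood-ratio statistic, then set the threshold as a quantile of its distribution under the "no watermark" hypothesis.

```python
import numpy as np

def lr_statistic(c, w, sigma=1.0):
    """Log-likelihood ratio for 'watermark w present in coefficients c'
    vs 'absent', under an i.i.d. Gaussian noise simplification."""
    return (c * w).sum() / sigma**2 - (w * w).sum() / (2 * sigma**2)

def np_threshold(w, p_fa=0.01, reps=20000, seed=0):
    """Neyman-Pearson threshold: the (1 - p_fa) quantile of the
    statistic simulated under H0 (no watermark embedded)."""
    rng = np.random.default_rng(seed)
    c0 = rng.normal(size=(reps, w.size))
    stats = np.array([lr_statistic(c, w) for c in c0])
    return np.quantile(stats, 1 - p_fa)

rng = np.random.default_rng(42)
w = rng.choice([-1.0, 1.0], size=256) * 0.1   # toy watermark pattern
thr = np_threshold(w, p_fa=0.01)

# empirical false-alarm rate of the resulting detector under H0
c_h0 = rng.normal(size=(5000, 256))
fa = np.mean([lr_statistic(c, w) > thr for c in c_h0])
```

The empirical false-alarm rate should land near the 1% design target, which is exactly the constraint the Neyman-Pearson criterion enforces.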

A dynamical stochastic finite element method based on the moment equation approach for the analysis of linear and nonlinear uncertain structures

  • Falsone, Giovanni;Ferro, Gabriele
    • Structural Engineering and Mechanics / v.23 no.6 / pp.599-613 / 2006
  • A method for the dynamical analysis of FE-discretized uncertain linear and nonlinear structures is presented. This method is based on the moment equation approach, for which the differential equations governing the first- and second-order statistical moments of the response must be solved. It is shown that these require the cross-moments between the response and the random variables characterizing the structural uncertainties, whose governing equations form an infinite hierarchy. As a consequence, a closure scheme must be applied even if the structure is linear; in this sense the proposed approach is approximate even for linear systems. For nonlinear systems, closure schemes are also necessary in order to treat the nonlinearities. The complete set of equations obtained by this procedure is shown to be linear if the structure is linear. Application of the procedure to some simple examples has shown a high level of accuracy compared with other classical approaches, such as the perturbation method, even for low levels of closure.
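
The flavor of the problem — response moments are not available exactly and must come from a truncated approximation — can be shown on the simplest possible uncertain structure (a toy static example, not the paper's dynamical hierarchy or its closure scheme): a bar with response u = F/k and random stiffness k.

```python
import numpy as np

# Static response u = F / k of a bar with uncertain stiffness
# k = k0 * (1 + eps * Z), Z ~ N(0, 1).  A truncated (second-order)
# moment approximation of E[u] is compared against Monte Carlo.
F, k0, eps = 10.0, 100.0, 0.05

# Taylor expansion of 1/k about k0 gives E[1/k] ~ (1/k0) * (1 + eps^2);
# higher-order moments of Z are dropped, i.e. the series is "closed".
u_mean_approx = (F / k0) * (1 + eps**2)

rng = np.random.default_rng(0)
k = k0 * (1 + eps * rng.normal(size=200_000))
u_mean_mc = (F / k).mean()
```

Even here the truncated moment result is approximate (terms in eps^4 and beyond are dropped), which is the same trade-off the paper's closure schemes manage for the full moment hierarchy.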

Modeling of rock slope stability and rockburst by the rock failure process analysis (RFPA) method

  • Tang, Chun'an;Tang, Shibin
    • Proceedings of the Korean Society for Rock Mechanics Conference / 2011.09a / pp.89-97 / 2011
  • Brittle failure of rock is a classical rock mechanics problem. Rock failure involves not only the initiation and propagation of a single crack, but also the complex interaction of initiation, propagation and coalescence of many cracks. As the most important feature of rock material properties is heterogeneity, a Weibull statistical distribution is employed in the rock failure process analysis (RFPA) method to describe the heterogeneity of rock properties. In this paper, applications of the RFPA method in geotechnical engineering and rockburst modeling are introduced, which can provide a reference for related research.
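
The Weibull step of such a scheme — assigning each finite element a strength drawn from a Weibull distribution — can be sketched as follows (the target mean and shape value are illustrative, not RFPA's calibrated parameters):

```python
import numpy as np
from math import gamma

def weibull_strengths(n_elements, mean_strength, m, seed=0):
    """Sample element strengths from a Weibull distribution; the shape
    parameter m is commonly read as a homogeneity index (larger m
    means a more homogeneous material)."""
    rng = np.random.default_rng(seed)
    # numpy's weibull draws with unit scale; rescale so the sample
    # mean matches the target: E[W] = scale * Gamma(1 + 1/m)
    scale = mean_strength / gamma(1 + 1 / m)
    return scale * rng.weibull(m, size=n_elements)

strengths = weibull_strengths(10_000, mean_strength=60.0, m=3.0)
```

Elements whose stress exceeds their sampled strength are then flagged as failed, which is how the heterogeneity drives progressive, many-crack failure in the simulation.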

Climate Prediction by a Hybrid Method with Emphasizing Future Precipitation Change of East Asia

  • Lim, Yae-Ji;Jo, Seong-Il;Lee, Jae-Yong;Oh, Hee-Seok;Kang, Hyun-Suk
    • The Korean Journal of Applied Statistics / v.22 no.6 / pp.1143-1152 / 2009
  • A canonical correlation analysis (CCA)-based method is proposed for the prediction of future climate change that combines information from ensembles of atmosphere-ocean general circulation models (AOGCMs) and observed climate values. This paper focuses on predictions of future climate on a regional scale, which are of potential economic value. The proposed method couples classical CCA with empirical orthogonal functions (EOF) for dimension reduction. Furthermore, we generate a distribution of climate responses, so that extreme events as well as general features such as long tails and unimodality can be revealed through the distribution. Results from real data examples demonstrate the promising empirical properties of the proposed approaches.
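
The EOF-then-CCA coupling can be sketched with standard linear algebra (the synthetic "model output" and "observed" fields below, and the component counts, are illustrative assumptions, not the paper's data or tuning):

```python
import numpy as np

def eof_reduce(X, k):
    """Project centered data onto its leading k EOFs (principal components)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def cca_correlations(X, Y):
    """Classical CCA: canonical correlations are the singular values
    of Qx' Qy, with Qx, Qy orthonormal bases of the centered data."""
    qx, _ = np.linalg.qr(X - X.mean(axis=0))
    qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    s = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

rng = np.random.default_rng(0)
n = 120
z = rng.normal(size=(n, 1))                 # shared climate signal
model_out = eof_reduce(z @ rng.normal(size=(1, 8)) + 0.3 * rng.normal(size=(n, 8)), 3)
observed  = eof_reduce(z @ rng.normal(size=(1, 6)) + 0.3 * rng.normal(size=(n, 6)), 3)
rho = cca_correlations(model_out, observed)
```

The EOF step keeps CCA well-posed when the raw fields have far more grid points than samples, which is the usual situation for gridded climate data.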

Theoretical Considerations for the Agresti-Coull Type Confidence Interval in Misclassified Binary Data (오분류된 이진자료에서 Agresti-Coull유형의 신뢰구간에 대한 이론적 고찰)

  • Lee, Seung-Chun
    • Communications for Statistical Applications and Methods / v.18 no.4 / pp.445-455 / 2011
  • Although misclassified binary data occur frequently in practice, the statistical methodology available for the data is rather limited. In particular, the interval estimation of population proportion has relied on the classical Wald method. Recently, Lee and Choi (2009) developed a new confidence interval by applying the Agresti-Coull's approach and showed the efficiency of their proposed confidence interval numerically, but a theoretical justification has not been explored yet. Therefore, a Bayesian model for the misclassified binary data is developed to consider the Agresti-Coull confidence interval from a theoretical point of view. It is shown that the Agresti-Coull confidence interval is essentially a Bayesian confidence interval.
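
For reference, the standard Agresti-Coull interval that the paper builds on (this is the classical version for correctly classified data; the paper's construction additionally accounts for misclassification) adds roughly z²/2 pseudo-successes and z²/2 pseudo-failures before applying the Wald formula:

```python
import numpy as np

def agresti_coull(x, n, z=1.96):
    """Agresti-Coull interval for a binomial proportion: apply the
    Wald formula to the adjusted counts (x + z^2/2, n + z^2)."""
    n_t = n + z**2
    p_t = (x + z**2 / 2) / n_t
    half = z * np.sqrt(p_t * (1 - p_t) / n_t)
    return p_t - half, p_t + half

lo, hi = agresti_coull(5, 10)
```

Unlike the Wald interval, this does not collapse to zero width when x = 0 or x = n, which is the main practical reason for the adjustment.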

Application of Empirical Research Methods in Information Systems Research: Gaining Lessons Through Evaluation (경영정보학 연구에 나타난 실증적 연구방법 적용상의 문제: 평가를 통해 얻은 교훈)

  • Kang, Shin-Cheol;Lee, Zoon-Ky;Choi, Jeong-Il
    • Asia pacific journal of information systems / v.16 no.2 / pp.1-25 / 2006
  • The application of appropriate research methods plays an important role in knowledge accumulation in the scientific research community. The study reported here summarizes how empirical methods have been used in IS research and what needs to be improved in their use, through a set of comprehensive research guidelines. From a survey and review of seminal and classical research guidelines, we developed our own 28-item checklist for research design, statistical analysis, and conclusions, which can be applied to all articles that employ inferential statistical methods in IS research. We then critically evaluated the usage of empirical research methods in major IS journals using the checklist, with the goal of improving the quality of academic papers. In this study, we scrutinized four major IS journals that published empirical papers from 1991 to 2000: MIS Quarterly, Journal of MIS, Information Systems Research, and Decision Sciences. As a result of this intensive evaluation, we highlight many areas that are lagging and call for greater attention to the proper usage of empirical study in IS research. The findings can serve as a checklist and guideline when IS researchers apply empirical methods.

Model selection algorithm in Gaussian process regression for computer experiments

  • Lee, Youngsaeng;Park, Jeong-Soo
    • Communications for Statistical Applications and Methods / v.24 no.4 / pp.383-396 / 2017
  • The model in our approach assumes that computer responses are a realization of a Gaussian process superimposed on a regression model, called a Gaussian process regression model (GPRM). Selecting a subset of variables, or building a good reduced model, is an important process in classical regression to identify variables influential on the responses and for further analysis such as prediction or classification. One reason to select variables for prediction is to prevent over-fitting or under-fitting to the data. The same reasoning and approach are applicable to the GPRM; however, only a few works on variable selection in the GPRM have been done. In this paper, we propose a new algorithm to build a good prediction model among candidate GPRMs. It is a post-processing step for the algorithm that includes the Welch method suggested by previous researchers. The proposed algorithm selects non-zero regression coefficients (β's) using forward and backward methods along with a Lasso-guided approach. During this process, the covariance parameters (θ's) pre-selected by the Welch algorithm are held fixed. We illustrate the superiority of our proposed models over the Welch method and non-selection models using four test functions and one real data example. Future extensions are also discussed.
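
The underlying Gaussian process prediction step can be sketched as follows; this is a minimal zero-mean GP posterior mean with an assumed RBF kernel (a GPRM would add the regression mean h(x)β whose coefficients the paper's algorithm selects, and the lengthscale/noise values here are illustrative).

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.15):
    """Squared-exponential kernel between two sets of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gpr_predict(X, y, X_new, lengthscale=0.15, noise=1e-6):
    """Posterior mean of a zero-mean GP: k(X_new, X) (K + noise I)^-1 y.
    The small noise term also acts as numerical jitter."""
    K = rbf_kernel(X, X, lengthscale) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    return rbf_kernel(X_new, X, lengthscale) @ alpha

X = np.linspace(0, 1, 8)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
y_hat = gpr_predict(X, y, X)      # near-interpolation at the design points
```

With near-zero noise the posterior mean nearly interpolates the computer-experiment outputs, which is why variable (and parameter) selection, rather than residual fitting, drives predictive accuracy in this setting.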