• Title/Summary/Keyword: Mean-Variance Analysis

The Change Point Analysis in Time Series Models

  • Lee, Sang-Yeol
    • Proceedings of the Korean Statistical Society Conference / 2005.11a / pp.43-48 / 2005
  • We consider the problem of testing for parameter changes in time series models based on a cusum test. Although the test procedure is well established for the mean and variance in time series models, the general parameter case has not been discussed in the literature. We therefore develop a cusum test for parameter change in a more general framework. As examples, we consider the change of the parameters in an RCA(1) model and that of the autocovariances of a linear process. We also consider the variance change test for unstable models with unit roots and for GARCH models. (A schematic sketch of the basic mean-change cusum statistic is given below.)

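The following is a minimal sketch of the classical cusum statistic for a single change in the mean of an independent series, not the paper's general-parameter test; the 5% critical value 1.358 for the supremum of a Brownian bridge and the use of the plain sample standard deviation (rather than a long-run variance estimator for dependent data) are simplifying assumptions.

```python
import numpy as np

def cusum_mean_change(x, crit=1.358):
    """Cusum statistic for a single change in the mean of a series.

    Under the no-change null the statistic converges to the supremum of a
    Brownian bridge; 1.358 is the approximate 5% critical value.  For
    dependent data a long-run variance estimate should replace x.std().
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    s = np.cumsum(x)                          # partial sums S_1, ..., S_n
    k = np.arange(1, n + 1)
    bridge = np.abs(s - (k / n) * s[-1])      # |S_k - (k/n) S_n|
    stat = bridge.max() / (x.std(ddof=1) * np.sqrt(n))
    khat = int(bridge.argmax()) + 1           # estimated change point
    return stat, khat, stat > crit

# Example: a mean shift halfway through the sample is detected
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
print(cusum_mean_change(x))
```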

On a robust analysis of variance based on winsorization (윈저화를 이용한 로버스트 분산분석)

  • 성내경
    • The Korean Journal of Applied Statistics / v.8 no.1 / pp.119-131 / 1995
  • Based on Monte Carlo simulation results, we propose a robust analysis-of-variance procedure that utilizes the trimmed mean and the Winsorized variance. We deal mainly with the one-way classification case. We evaluate the empirical distribution of a pseudo-F statistic based on symmetrically Winsorized sums of squares when the population is normally distributed. (A sketch of these two robust estimators follows below.)

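As a companion to the entry above, here is a minimal sketch of its two robust building blocks, the symmetric trimmed mean and the Winsorized variance; the 10% trimming proportion and the simple floor rule for the number of trimmed observations are illustrative choices, not taken from the paper.

```python
import numpy as np

def trimmed_mean_winsorized_var(x, prop=0.10):
    """Symmetric trimmed mean and Winsorized variance.

    prop: fraction of observations trimmed (Winsorized) in each tail.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    g = int(np.floor(prop * n))          # observations removed per tail
    t_mean = x[g:n - g].mean()           # mean of the central n - 2g order statistics
    w = x.copy()
    w[:g] = x[g]                         # pull the lower tail up to the g-th order statistic
    w[n - g:] = x[n - g - 1]             # pull the upper tail down
    return t_mean, w.var(ddof=1)         # Winsorized variance

rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(0.0, 1.0, 45), rng.normal(0.0, 10.0, 5)])  # with outliers
print(trimmed_mean_winsorized_var(sample))
```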

BER Analysis of a Quadrature Receiver with an Autocalibration Function (자동보정 기능을 가진 Quadrature 수신기의 BER 해석)

  • Kwon, Soon-Man;Lee, Jong-Moo;Cheon, Jong-Min;Park, Min-Kook;Kim, Jong-Moon
    • Proceedings of the KIEE Conference / 2005.10b / pp.457-459 / 2005
  • In this paper, the BER of a quadrature receiver with an autocalibration function is considered. The analysis is based on deriving the statistical characteristics of the gain and phase imbalances between the in-phase and quadrature components, which may cause severe performance degradation of the receiver. The density, mean, and variance functions of the estimates of the gain and phase imbalances are discussed. It is then shown that the estimates are asymptotically minimum-variance unbiased with respect to the integration time in sampling. A brief consideration of the BER calculation follows. (A moment-based imbalance estimator is sketched below.)

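The following is a small moment-based sketch of gain and phase imbalance estimation between the in-phase and quadrature branches; the signal model I = cos(theta), Q = g·sin(theta + phi) and the estimator form are common textbook choices and are not claimed to be the autocalibration scheme of the paper.

```python
import numpy as np

def iq_imbalance_estimates(i, q):
    """Moment-based gain and phase imbalance estimates.

    Assumed model: I = cos(theta), Q = g * sin(theta + phi), theta uniform.
    Then E[Q^2] / E[I^2] = g^2 and E[IQ] / sqrt(E[I^2] E[Q^2]) = sin(phi).
    """
    i, q = np.asarray(i, float), np.asarray(q, float)
    gain = np.sqrt(np.mean(q ** 2) / np.mean(i ** 2))
    phase = np.arcsin(np.mean(i * q) / np.sqrt(np.mean(i ** 2) * np.mean(q ** 2)))
    return gain, phase

# Longer integration (more samples) shrinks the variance of both estimates.
rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, 100_000)
i, q = np.cos(theta), 1.05 * np.sin(theta + 0.03)
print(iq_imbalance_estimates(i, q))      # approximately (1.05, 0.03)
```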

Characteristic Analysis of Normalized D-QR-RLS Algorithm (II) (정규화된 D-QR-RLS 알고리즘의 특성 분석(II))

  • Ahn, Bong-Man;Hwang, Jee-Won;Cho, Ju-Phil
    • The Journal of Korean Institute of Communications and Information Sciences / v.32 no.11C / pp.1127-1133 / 2007
  • This paper proposes a normalized QR-type LMS (least mean square) algorithm with computational complexity O(N). Because the QR-type LMS update is proportional to the variance of the input signal, the proposed algorithm normalizes the update by that variance, and its theoretical characteristics exhibit this normalized property. A convergence analysis of the normalized algorithm is carried out, and computer simulations are performed with the algorithms applied to an echo canceller. The proposed algorithm performs close to the theoretical values, and a comparison among the algorithms shows that its performance is similar to that of NLMS. (A minimal NLMS reference sketch follows below.)
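
For context on the comparison mentioned above, here is a minimal normalized LMS (NLMS) adaptive filter, i.e. the baseline whose performance the proposed algorithm is said to match; the filter order, step size, and regularization constant are illustrative, and this is not the paper's D-QR-RLS algorithm.

```python
import numpy as np

def nlms(x, d, order=32, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter.

    x: input (e.g. far-end signal), d: desired signal (e.g. microphone with echo).
    The step is divided by the instantaneous input power, which removes the
    dependence of the convergence behaviour on the input-signal variance.
    """
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]          # [x[n], x[n-1], ..., x[n-order+1]]
        e[n] = d[n] - w @ u                       # a-priori error
        w += mu * e[n] * u / (u @ u + eps)        # power-normalized update
    return w, e

# Example: identify a short echo path
rng = np.random.default_rng(3)
x = rng.normal(size=5000)
h = np.array([0.6, -0.3, 0.1])                    # unknown echo path
d = np.convolve(x, h)[:len(x)]
w, e = nlms(x, d, order=8)
print(np.round(w[:3], 2))                          # close to h
```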

Characterization of the Spatial Variability of Paper Formation Using a Continuous Wavelet Transform

  • Keller, D.Steven;Luner, Philip;Pawlak, Joel J.
    • Journal of Korea Technical Association of The Pulp and Paper Industry / v.32 no.5 / pp.14-25 / 2000
  • In this investigation, a wavelet transform analysis was used to decompose beta-radiographic formation images into spectral and spatial components. Conventional formation analysis may use spectral analysis, based on Fourier transformation or on variance versus zone size, to describe the grammage distribution of features such as flocs, streaks, and mean fiber orientation. However, these methods have limited utility for the analysis of statistically nonstationary data sets, in which the variance is not uniform with position, e.g. paper machine CD profiles (especially those that contain streaks). A continuous wavelet transform was used to analyze formation data arrays obtained from radiographic imaging of handsheets and cross-machine paper samples. The response of the analytical method to grammage, floc size distribution, and mean fiber orientation, and its sensitivity to feature localization, were assessed. From the wavelet analysis, the change in scale of grammage variation as a function of position was used to demonstrate regular and isolated differences in the formed structure. (A minimal one-dimensional wavelet-transform sketch follows below.)

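To make the wavelet decomposition concrete, here is a minimal one-dimensional continuous wavelet transform of a grammage-like profile, using a Ricker (Mexican-hat) wavelet implemented directly with numpy; the scale grid, the wavelet choice, and the synthetic floc-like feature are illustrative and not taken from the paper.

```python
import numpy as np

def ricker(points, a):
    """Ricker (Mexican-hat) wavelet with width parameter a, sampled on `points` points."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def cwt_1d(signal, scales):
    """Continuous wavelet transform: one row of coefficients per scale."""
    coeffs = np.empty((len(scales), len(signal)))
    for row, a in enumerate(scales):
        wav = ricker(min(10 * int(a), len(signal)), a)
        coeffs[row] = np.convolve(signal, wav, mode="same")
    return coeffs

# A grammage-like profile: a periodic component plus a localized floc-like feature.
x = np.linspace(0.0, 1.0, 512)
profile = np.sin(40.0 * x) + 2.0 * np.exp(-((x - 0.7) / 0.02) ** 2)
coeffs = cwt_1d(profile, scales=np.arange(1, 33))
print(coeffs.shape)                 # (32, 512): scale-by-position map of local variation
```

Rows of `coeffs` show how the scale of local grammage variation changes with position, which is exactly the position-resolved information that a global Fourier spectrum averages away.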

Selection of Survival Models for Technological Development (기술발전에 따른 생존모형 선정)

  • Oh, H.S.;Kim, C.S.;Rhee, H.K.;Yim, D.S.;Cho, J.H.
    • Journal of Korean Society of Industrial and Systems Engineering / v.32 no.4 / pp.184-191 / 2009
  • In a technology-driven environment, a depreciation estimate based on traditional life analysis results in a decelerated rate of capital recovery. The time pattern described by technological growth models needs to be incorporated into the life analysis framework, especially in industries experiencing fast technological change. The approximation technique for calculating the variance can be applied to the six growth models that were selected according to the degree of skewness and the transformation of the functions. For the Pearl, Gompertz, and Weibull growth models, the errors have zero mean and a constant variance over time. However, transformed models such as the linearized Fisher-Pry model, the linearized Gompertz growth model, and the linearized Weibull growth model have a variance that increases from zero up to the point at which the inflection occurs. It is therefore recommended that, if the variance of the error increases over time, a transformation of the observed data is appropriate. (A sketch of the Gompertz curve and its linearization follows below.)
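
A small sketch of the Gompertz growth curve and its log-log linearization, one of the transformed models referred to above; the saturation level L and the rate parameters in the example are illustrative.

```python
import numpy as np

def gompertz(t, L, b, k):
    """Gompertz growth curve y(t) = L * exp(-b * exp(-k t))."""
    return L * np.exp(-b * np.exp(-k * t))

def fit_linearized_gompertz(t, y, L):
    """Fit the linearized form ln(-ln(y / L)) = ln(b) - k * t by least squares."""
    z = np.log(-np.log(y / L))               # transform of the observations
    slope, intercept = np.polyfit(t, z, 1)   # straight line in transformed space
    return -slope, np.exp(intercept)         # (k, b)

t = np.arange(0.0, 20.0, 0.5)
y = gompertz(t, L=100.0, b=5.0, k=0.4)
print(fit_linearized_gompertz(t, y, L=100.0))     # approximately (0.4, 5.0)
```

With additive noise on y, the transformed errors are no longer homoscedastic, which is the variance behaviour of the linearized models discussed in the abstract.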

N-Step Sliding Recursion Formula of Variance and Its Implementation

  • Yu, Lang;He, Gang;Mutahir, Ahmad Khwaja
    • Journal of Information Processing Systems / v.16 no.4 / pp.832-844 / 2020
  • The degree of dispersion of a random variable can be described by its variance, which reflects the distance of the random variable from its mean. However, the time complexity of the traditional variance calculation algorithm is O(n), because it requires a full pass over all samples. When the number of samples increases, or in high-speed signal processing, algorithms with O(n) time complexity cost a huge amount of time, which may result in performance degradation of the whole system. A novel multi-step recursive algorithm for the variance of time-varying data series with O(1) (constant) time complexity is proposed in this paper. Numerical simulations and experiments are presented, and the results demonstrate that the proposed multi-step recursive algorithm effectively decreases computing time and hence significantly improves the efficiency of variance calculation for time-varying data, which demonstrates its potential value for time-consuming data analysis and high-speed signal processing. (A single-step sliding-window variant is sketched below.)
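
The paper's N-step recursion is not reproduced here, but the underlying idea can be illustrated with a single-step sliding-window variance that updates running sums in O(1) per incoming sample; the window length is illustrative, and the naive sum-of-squares form can lose precision for ill-conditioned data.

```python
import numpy as np

class SlidingVariance:
    """Sliding-window mean and variance with O(1) work per incoming sample.

    The running sum and sum of squares of the current window are updated by
    adding the new sample and removing the one that leaves the window.
    """
    def __init__(self, window):
        self.w = window
        self.buf = np.zeros(window)   # circular buffer holding the current window
        self.i = 0
        self.s = 0.0                  # running sum
        self.q = 0.0                  # running sum of squares

    def update(self, x):
        old = self.buf[self.i % self.w]
        self.buf[self.i % self.w] = x
        self.i += 1
        self.s += x - old
        self.q += x * x - old * old
        mean = self.s / self.w
        return mean, self.q / self.w - mean * mean   # population variance of the window

# Check the recursion against a direct numpy computation on the last window.
rng = np.random.default_rng(4)
data = rng.normal(size=1000)
sv = SlidingVariance(window=64)
for v in data:
    mean, var = sv.update(v)
print(np.isclose(var, np.var(data[-64:])))           # True
```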

A Study on Robust Design Optimization of Layered Plates Bonding Process Considering Uncertainties (불확정성을 고려한 적층판 결합공정의 강건최적설계)

  • Lee, Woo-Hyuk;Park, Jung-Jin;Choi, Joo-Ho;Lee, Soo-Yong
    • Transactions of the Korean Society of Mechanical Engineers A / v.31 no.1 s.256 / pp.113-120 / 2007
  • Design optimization of a layered-plate bonding process is conducted considering uncertainties in the manufacturing process, in order to reduce the crack failures arising from the residual stress at the surface of the adherend, which is caused by the different thermal expansion coefficients. Robust optimization is performed to minimize the mean as well as the variance of the residual stress, while constraining the distortion and the instantaneous maximum stress within the allowable reliability limits. In this optimization, the dimension reduction (DR) method is employed to quantify reliability measures such as the mean and variance of the layered-plate bonding response. The DR method is expected to benefit the optimization in terms of efficiency, accuracy, and simplicity. The obtained robust optimal solution is verified by Monte Carlo simulation. (A dimension-reduction moment estimate is sketched below.)
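
To illustrate the moment-quantification step, here is a univariate dimension-reduction sketch that estimates the mean and variance of a response of independent normal inputs by one-dimensional Gauss-Hermite quadratures; the toy response function stands in for the residual-stress model, and the five-node quadrature is an illustrative choice rather than the paper's setting.

```python
import numpy as np

def udr_moments(func, mu, sigma, n_quad=5):
    """Univariate dimension-reduction (DR) estimate of mean and variance.

    The response is approximated by a sum of one-dimensional functions of the
    independent normal inputs X_i ~ N(mu_i, sigma_i), so each statistical
    moment reduces to one-dimensional Gauss-Hermite quadratures.
    """
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    t, w = np.polynomial.hermite.hermgauss(n_quad)        # physicists' Hermite nodes/weights
    y0 = func(mu)
    mean, var = (1 - len(mu)) * y0, 0.0                   # mean = sum_i E[Y_i] - (n-1) Y(mu)
    for i in range(len(mu)):
        pts = np.tile(mu, (n_quad, 1))
        pts[:, i] = mu[i] + np.sqrt(2.0) * sigma[i] * t   # quadrature nodes for X_i
        yi = np.array([func(p) for p in pts])
        e1 = (w @ yi) / np.sqrt(np.pi)                    # E[Y_i]
        e2 = (w @ yi ** 2) / np.sqrt(np.pi)               # E[Y_i^2]
        mean += e1
        var += e2 - e1 ** 2                               # independent contributions add up
    return mean, var

# Toy stand-in for the residual-stress response; the DR result can be checked by Monte Carlo.
response = lambda x: x[0] ** 2 + 3.0 * x[1]
print(udr_moments(response, mu=[1.0, 2.0], sigma=[0.1, 0.2]))   # approximately (7.01, 0.40)
```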

Length-biased Rayleigh distribution: reliability analysis, estimation of the parameter, and applications

  • Kayid, M.;Alshingiti, Arwa M.;Aldossary, H.
    • International Journal of Reliability and Applications / v.14 no.1 / pp.27-39 / 2013
  • In this article, a new model based on the Rayleigh distribution is introduced. This model is useful and practical in physics, reliability, and life testing. The statistical and reliability properties of this model are presented, including the moments, the hazard rate, the reversed hazard rate, and the mean residual life function, among others. In addition, it is shown that the distributions of the new model are ordered with respect to the likelihood ratio ordering, the strongest of the stochastic orderings. Four estimation methods, namely the method of moments, maximum likelihood, Bayes estimation, and uniformly minimum variance unbiased estimation, are used to estimate the parameters of this model. Simulation is used to compute the estimates and to study their properties. Finally, the suitability of this model for real data sets is shown using the chi-square goodness-of-fit test and the Kolmogorov-Smirnov statistic. (A closed-form MLE under an assumed length-biased density is sketched below.)

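Assuming the standard length-biased weighting x·f(x)/E[X] of the Rayleigh(sigma) density, which may differ in details from the parametrization used in the paper, the maximum likelihood estimator of sigma has the closed form shown below; the sampling route through the chi-square distribution with three degrees of freedom also follows from that assumed density.

```python
import numpy as np

# Assumed density: f_LB(x) = x * f(x) / E[X] with f the Rayleigh(sigma) pdf, i.e.
# f_LB(x) = x^2 exp(-x^2 / (2 sigma^2)) / (sigma^3 sqrt(pi / 2)),  x > 0.
# Setting the derivative of the log-likelihood to zero gives sigma^2 = sum(x^2) / (3 n).

def lbr_mle(x):
    """Closed-form MLE of sigma under the assumed length-biased Rayleigh density."""
    x = np.asarray(x, dtype=float)
    return np.sqrt(np.sum(x ** 2) / (3.0 * x.size))

def lbr_sample(sigma, size, rng):
    """Draw samples: under the assumed density, (X / sigma)^2 is chi-square with 3 d.o.f."""
    return sigma * np.sqrt(rng.chisquare(3, size))

rng = np.random.default_rng(5)
print(lbr_mle(lbr_sample(2.0, 100_000, rng)))   # close to 2.0
```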

Sensitivity and Reliability Analysis of Plates (판 구조물의 감도해석 및 신뢰성해석)

  • 김지호;양영순
    • Proceedings of the Computational Structural Engineering Institute Conference / 1991.10a / pp.57-62 / 1991
  • For the purpose of developing a method for efficiently calculating the design sensitivity and the reliability of complicated structures such as ship structures, the probabilistic finite element method (PFEM) is introduced to formulate the deterministic design sensitivity analysis and is incorporated with second-moment reliability methods such as MVFOSM, AFOSM, and SORM. The probabilistic design sensitivity analysis needed in reliability-based design is also performed. The reliability analysis is carried out for initial yielding failure, in which the derivatives obtained from the deterministic design sensitivity analysis are used. The present PFEM-based reliability method shows good agreement with the Monte Carlo method in terms of the variance of the response and the associated probability of failure, even at the first or the first few iteration steps. The probabilistic design sensitivity analysis explicitly evaluates the contribution of each random variable to the probability of failure. Further, the variation of the reliability index can easily be predicted from the variation of the mean and the variance of the random variables. (An MVFOSM sketch follows below.)

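As a pointer to the simplest of the second-moment methods named above, here is a mean-value first-order second-moment (MVFOSM) sketch for independent inputs: the limit-state gradient at the mean point propagates the input variances into a response variance, and the reliability index and failure probability follow; the two-variable limit-state function is a toy stand-in, not the plate model of the paper.

```python
import numpy as np
from math import erf, sqrt

def mvfosm(g, mu, sigma, h=1e-6):
    """Mean-value first-order second-moment reliability index for independent inputs.

    g: limit-state function (failure when g(x) < 0); mu, sigma: means and
    standard deviations of the random variables.  The gradient is evaluated
    numerically at the mean point by forward differences.
    """
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    g0 = g(mu)
    grad = np.array([(g(mu + h * e) - g0) / h for e in np.eye(len(mu))])
    var_g = np.sum((grad * sigma) ** 2)          # first-order variance of the response
    beta = g0 / np.sqrt(var_g)                   # reliability index
    pf = 0.5 * (1.0 - erf(beta / sqrt(2.0)))     # Phi(-beta): probability of failure
    return beta, pf

# Toy limit state: capacity minus demand.
limit_state = lambda x: x[0] - x[1]
print(mvfosm(limit_state, mu=[300.0, 200.0], sigma=[30.0, 20.0]))   # beta is about 2.77
```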