• Title/Summary/Keyword: Statistical Constraints

An Economic Statistical Design of the EWMA Control Charts with Variable Sampling Interval (VSI EWMA 관리도의 경제적 통계적 설계)

  • 송서일;박현규;정혜진
    • Journal of Korean Society for Quality Management / v.32 no.1 / pp.92-101 / 2004
  • This paper presents an economic statistical design, subject to statistical constraints, for the optimal design of an EWMA control chart with variable sampling interval. The cost function is the one proposed by Lorenzen and Vance, and the optimal design parameters include the sample size, control limit width, sampling interval, and EWMA weight. Comparing the optimal economic design of the VSI EWMA control chart with the optimal economic statistical design shows that, although the economic statistical design is more costly, it detects shifts more efficiently and improves statistical performance.
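The EWMA statistic underlying such a chart can be sketched in a few lines. This is a minimal illustration of the monitoring statistic only, not the paper's economic optimization; the weight `lam` and limit width `L` are arbitrary example values.

```python
# Minimal EWMA chart sketch: z_i = lam*x_i + (1 - lam)*z_{i-1}, with a signal
# whenever z_i leaves mu0 +/- L*sigma_z (asymptotic limits). The weight lam
# and limit width L below are arbitrary example values, not optimized ones.
import math

def ewma_signals(data, mu0, sigma, lam=0.2, L=3.0):
    """Return the indices at which the EWMA statistic exceeds its limits."""
    z = mu0
    sigma_z = sigma * math.sqrt(lam / (2.0 - lam))  # asymptotic std. dev.
    out = []
    for i, x in enumerate(data):
        z = lam * x + (1.0 - lam) * z
        if abs(z - mu0) > L * sigma_z:
            out.append(i)
    return out

# A sustained 1.5-sigma mean shift is flagged a few samples after it starts,
# even though no single observation is extreme on its own.
print(ewma_signals([0.0] * 10 + [1.5] * 10, mu0=0.0, sigma=1.0))  # → [14, 15, 16, 17, 18, 19]
```

The exponential memory is what lets the chart accumulate evidence of a small shift across successive samples, which is why the design above trades cost for detection speed.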

Rationale of the Maximum Entropy Probability Density

  • Park, B. S.
    • Journal of the Korean Statistical Society / v.13 no.2 / pp.87-106 / 1984
  • If $\{X_t\}$ is a sequence of independent identically distributed normal random variables, then the conditional probability density of $X_1, X_2, \cdots, X_n$, given the first p+1 sample autocovariances, converges to the maximum entropy probability density satisfying the corresponding covariance constraints as the length of the sample sequence tends to infinity. This establishes that the maximum entropy probability density and the associated Gaussian autoregressive process arise naturally as the answers to conditional limit problems.
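For p = 1 the limiting maximum entropy density is that of a Gaussian AR(1) process whose parameters follow from the first two autocovariances via the Yule-Walker relation; a small sketch, with made-up data:

```python
# Sketch of the maximum entropy construction for p = 1: among stationary
# processes matching the autocovariances gamma_0 and gamma_1, entropy is
# maximized by a Gaussian AR(1) process, whose coefficient follows from the
# Yule-Walker relation. The data below are invented for illustration.
def sample_autocov(x, lag):
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n

def maxent_ar1(x):
    """AR(1) coefficient and innovation variance matching gamma_0, gamma_1."""
    g0, g1 = sample_autocov(x, 0), sample_autocov(x, 1)
    phi = g1 / g0                        # Yule-Walker equation for p = 1
    return phi, g0 * (1.0 - phi * phi)   # innovation variance

phi, s2 = maxent_ar1([0.5, 1.2, 0.8, -0.3, 0.1, 0.9, 1.1, 0.2])
print(phi, s2)  # a stationary coefficient (|phi| < 1) and positive variance
```

Dividing by n (rather than n - lag) keeps the sample autocovariance sequence positive semidefinite, so the fitted coefficient is always stationary.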

Monte Carlo Estimation of Multivariate Normal Probabilities

  • Oh, Man-Suk;Kim, Seung-Whan
    • Journal of the Korean Statistical Society / v.28 no.4 / pp.443-455 / 1999
  • A simulation-based approach to estimating the probability of an arbitrary region under a multivariate normal distribution is developed. Specifically, the probability is expressed as the ratio of the unrestricted and the restricted multivariate normal density functions, where the restriction is given by the region whose probability is of interest. The density function of the restricted distribution is then estimated using a sample generated by the Gibbs sampling algorithm.
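For contrast with the density-ratio estimator described above, the naive direct Monte Carlo estimate of such a region probability, which the paper's method is designed to improve upon, can be sketched as:

```python
# Naive direct Monte Carlo for P((X, Y) in region) under a bivariate normal
# with correlation rho, using a Cholesky-style transform of iid normals.
# This is the baseline, not the paper's Gibbs/density-ratio estimator.
import math, random

def mvn2_region_prob(region, rho, n=100_000, seed=1):
    rng = random.Random(seed)
    root = math.sqrt(1.0 - rho * rho)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x, y = z1, rho * z1 + root * z2   # gives Corr(X, Y) = rho
        hits += region(x, y)
    return hits / n

# P(X > 0, Y > 0) is exactly 0.25 when rho = 0; the estimate should be close.
print(mvn2_region_prob(lambda x, y: x > 0 and y > 0, rho=0.0))
```

Direct sampling wastes nearly all draws when the region is rare, which is the situation where a restricted-sampling estimator pays off.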

Robust Inference for Testing Order-Restricted Inference

  • Kang, Moon-Su
    • The Korean Journal of Applied Statistics / v.22 no.5 / pp.1097-1102 / 2009
  • Classification of subjects with unknown distributions in a small-sample setting may involve order-restricted constraints on multivariate parameters. Such problems make the optimality of conventional likelihood-ratio-based statistical inference unattainable. Fortunately, Roy (1953) introduced the union-intersection principle (UIP), which provides an alternative avenue. A redescending M-estimator combined with that principle yields a considerably more appropriate robust testing procedure. Furthermore, a conditionally distribution-free test based upon exact permutation theory is used to generate p-values, even in small samples. Applications of this method are illustrated with simulated data and a real data example (Lobenhofer et al., 2002).
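The exact permutation idea in the abstract can be illustrated in its simplest form, a two-sample mean difference. The paper's actual statistic combines a redescending M-estimator with the union-intersection principle; the plain mean difference below only shows how permutation p-values arise in small samples.

```python
# Exact permutation p-value for a two-sample difference in means. The paper
# pairs this conditioning idea with a robust union-intersection statistic;
# the plain mean difference here is only for illustration.
from itertools import combinations

def perm_pvalue(a, b):
    """One-sided p-value of mean(a) - mean(b) under exact permutation."""
    pooled = a + b
    n, k = len(pooled), len(a)
    observed = sum(a) / len(a) - sum(b) / len(b)
    count = total = 0
    for idx in combinations(range(n), k):    # every relabeling of the pool
        grp = [pooled[i] for i in idx]
        rest = [pooled[i] for i in range(n) if i not in idx]
        stat = sum(grp) / k - sum(rest) / (n - k)
        count += stat >= observed
        total += 1
    return count / total

# Fully separated groups: only 1 of the C(6, 3) = 20 relabelings matches.
print(perm_pvalue([2.1, 2.5, 2.8], [1.0, 1.2, 1.4]))  # → 0.05
```

Because the null distribution is built from the observed values themselves, the p-value is valid without any distributional assumption, which is the "conditionally distribution-free" property cited above.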

Constrained $L_1$-Estimation in Linear Regression

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods / v.5 no.3 / pp.581-589 / 1998
  • An algorithm is proposed for $L_1$-estimation with linear equality and inequality constraints in the linear regression model. The algorithm employs a linear scaling transformation to obtain the optimal solution of a linear programming type problem, and a special scheme is used to maintain the feasibility of the updated solution at each iteration. The convergence of the proposed algorithm is proved. In addition, updating and orthogonal decomposition techniques are employed to improve computational efficiency and numerical stability.
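The flavor of constrained $L_1$-estimation can be shown in the simplest special case: regression through the origin with a nonnegativity constraint, where the piecewise-linear objective makes the solution a projected weighted median. This is only a sketch of that special case, not the paper's general LP-based algorithm.

```python
# Sketch of constrained L1 estimation in the simplest special case:
# regression through the origin, y ~= b*x with b >= 0. The objective
# sum(|y_i - b*x_i|) = sum(|x_i| * |y_i/x_i - b|) is convex and piecewise
# linear, so its unconstrained minimizer is a weighted median of the ratios
# y_i/x_i, and the constrained minimizer is its projection onto b >= 0.
# (Ties in the weighted median are handled crudely; the paper's algorithm
# solves the general linearly constrained problem via linear programming.)
def l1_slope_nonneg(x, y):
    pts = sorted((yi / xi, abs(xi)) for xi, yi in zip(x, y) if xi != 0)
    total = sum(w for _, w in pts)
    acc = 0.0
    for r, w in pts:            # walk to the weighted median of the ratios
        acc += w
        if acc >= total / 2.0:
            return max(0.0, r)  # project onto the constraint b >= 0

print(l1_slope_nonneg([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 4.1]))
```

With several covariates and general linear constraints this median structure disappears, which is why the LP formulation and the feasibility-preserving updates in the paper are needed.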

A Penalized Principal Component Analysis using Simulated Annealing

  • Park, Chongsun;Moon, Jong Hoon
    • Communications for Statistical Applications and Methods / v.10 no.3 / pp.1025-1036 / 2003
  • A variable selection algorithm for principal component analysis using a penalty function is proposed. We use the fact that the usual principal component problem can be expressed as a maximization problem with appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm is used to search for optimal solutions under the penalty functions. A comparison of several well-known penalty functions through simulation reveals that the HARD penalty function is the best in several respects. Illustrations with real and simulated examples are provided.
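The simulated-annealing search over variable subsets can be sketched with a toy penalized objective. Everything numeric below (the scores, the redundancy term, the penalty weight) is invented for illustration; the paper's objective is a penalized principal component criterion, not this one.

```python
# Toy simulated annealing over variable subsets with a penalty on subset
# size, illustrating only the search strategy. The scores, the redundancy
# term, and the penalty weight are invented; the paper's objective is a
# penalized principal component criterion.
import math, random

SCORES = [3.0, 2.5, 2.4, 0.3, 0.2]   # made-up marginal usefulness per variable
REDUND = {(1, 2): 2.3}               # variables 1 and 2 largely overlap

def objective(subset, lam=0.5):
    val = sum(SCORES[i] for i in subset) - lam * len(subset)
    for (i, j), r in REDUND.items():
        if i in subset and j in subset:
            val -= r                 # penalize selecting redundant pairs
    return val

def anneal(n_vars=5, steps=3000, seed=7):
    rng = random.Random(seed)
    cur, cur_val = set(), objective(set())
    best, best_val = set(), cur_val
    for t in range(1, steps + 1):
        temp = 1.0 / t               # simple (fast) cooling schedule
        cand = set(cur)
        cand.symmetric_difference_update({rng.randrange(n_vars)})  # flip one
        val = objective(cand)
        if val > cur_val or rng.random() < math.exp((val - cur_val) / temp):
            cur, cur_val = cand, val
        if cur_val > best_val:
            best, best_val = set(cur), cur_val
    return sorted(best)

print(anneal())  # best subset found; {0, 1} maximizes this toy objective
```

Occasional downhill moves at high temperature let the chain escape local optima such as selecting only one of a redundant pair, which is the reason annealing is used instead of greedy selection.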

Bayesian Variable Selection in Linear Regression Models with Inequality Constraints on the Coefficients (제한조건이 있는 선형회귀 모형에서의 베이지안 변수선택)

  • 오만숙
    • The Korean Journal of Applied Statistics / v.15 no.1 / pp.73-84 / 2002
  • Linear regression models with inequality constraints on the coefficients are frequently used in economic models due to sign or order constraints on the coefficients. In this paper, we propose a Bayesian approach to selecting significant explanatory variables in linear regression models with inequality constraints on the coefficients. Bayesian variable selection requires computation of the posterior probability of each candidate model. We propose a method which computes all the necessary posterior model probabilities simultaneously. Specifically, we obtain posterior samples from the most general model via the Gibbs sampling algorithm (Gelfand and Smith, 1990) and compute the posterior probabilities using these samples. A real example is given to illustrate the method.
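The "sample the most general model" idea reduces, in its simplest form, to estimating the posterior probability of a constraint by the fraction of unconstrained posterior draws that satisfy it. In this sketch the posterior is two independent normals with made-up means and standard deviations, standing in for an actual Gibbs sample from a regression posterior.

```python
# Sketch of the "sample the most general model" idea: draw from an
# unconstrained posterior for the coefficients and estimate the posterior
# probability of a constraint by the fraction of draws satisfying it. For
# simplicity the posterior here is two independent normals with made-up
# means and standard deviations, not a Gibbs sample from a real regression.
import random

def constraint_prob(draws, constraint):
    return sum(constraint(b) for b in draws) / len(draws)

rng = random.Random(0)
draws = [(rng.gauss(1.0, 0.5), rng.gauss(-0.2, 0.5)) for _ in range(100_000)]

# Posterior probability that both coefficients satisfy a sign constraint.
print(constraint_prob(draws, lambda b: b[0] > 0 and b[1] > 0))
```

One set of draws can be reused to score every candidate constraint region at once, which is what makes computing all model probabilities simultaneously feasible.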

The Minimization of Tolerance Cost and Quality Loss Cost by the Statistical Tolerance Allocation Method (Statistical Tolerance Allocation을 이용한 제조비용과 품질손실비용의 최소화)

  • Kim, Sunn-Ho;Kwon, Yong-Sung;Lee, Byong-Ki;Kang, Kyung-Sik
    • Journal of Korean Institute of Industrial Engineers / v.24 no.2 / pp.175-183 / 1998
  • When a product is designed, tolerances must be assigned to the product so that required functions are guaranteed and production costs are minimized. In this research, a model is suggested which allocates tolerances to components optimally according to the STA (Statistical Tolerance Allocation) method. Taking into account that dimensional errors follow statistical distributions, this model presents a discrete pseudo-boolean approach to tolerance optimization that minimizes the tolerance cost and the quality loss cost. In this approach, two methods are proposed for reducing the problem scale: 1) a method for converting the cost-minimization model into a cost-savings-maximization model, and 2) procedures to reduce the number of constraints and variables.
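The trade-off being optimized can be shown with a toy discrete allocation: pick one tolerance grade per component to minimize manufacturing cost plus a quadratic quality-loss cost under a stack-up budget. Exhaustive search stands in for the paper's pseudo-boolean formulation, and every number below is illustrative.

```python
# Toy discrete tolerance allocation: choose one tolerance grade per
# component to minimize tolerance cost plus a quadratic quality-loss cost,
# subject to a worst-case stack-up budget. Exhaustive search over the grade
# combinations stands in for the paper's pseudo-boolean formulation; every
# number below is illustrative.
from itertools import product

# (tolerance, manufacturing cost) options per component: tighter costs more.
OPTIONS = [
    [(0.01, 9.0), (0.02, 5.0), (0.04, 2.0)],
    [(0.01, 8.0), (0.03, 4.0), (0.05, 1.5)],
]
STACK_LIMIT = 0.06   # worst-case assembly tolerance budget
LOSS_K = 4000.0      # Taguchi-style quality-loss coefficient (k * t**2)

def best_allocation():
    best = None
    for combo in product(*OPTIONS):
        tols = [t for t, _ in combo]
        if sum(tols) > STACK_LIMIT:
            continue                 # violates the stack-up constraint
        cost = sum(c for _, c in combo) + LOSS_K * sum(t * t for t in tols)
        if best is None or cost < best[0]:
            best = (cost, tols)
    return best

print(best_allocation())  # middle grades win: loose tolerance raises loss cost
```

With many components the combinatorial search explodes, which motivates the paper's reduction of constraints and variables before optimizing.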

CONSTRAINING COSMOLOGICAL PARAMETERS WITH IMAGE SEPARATION STATISTICS OF GRAVITATIONALLY LENSED SDSS QUASARS: MEAN IMAGE SEPARATION AND LIKELIHOOD INCORPORATING LENS GALAXY BRIGHTNESS

  • Han, Du-Hwan;Park, Myeong-Gu
    • Journal of The Korean Astronomical Society / v.48 no.1 / pp.83-92 / 2015
  • Recent large-scale surveys such as the Sloan Digital Sky Survey have produced homogeneous samples of multiple-image gravitationally lensed quasars with well-defined selection effects. Statistical analysis of these can yield independent constraints on cosmological parameters. Here we use the image separation statistics of lensed quasars from the Sloan Digital Sky Survey Quasar Lens Search (SQLS) to derive constraints on cosmological parameters. Our analysis does not require knowledge of the magnification bias, which can only be estimated from detailed knowledge of the quasar luminosity function at all redshifts, and it accounts for the bias against small-image-separation quasars caused by selection against faint lens galaxies in the follow-up confirmation observations. We first use the mean image separation of the lensed quasars as a function of redshift to find that cosmological models with extreme curvature are inconsistent with the observed lensed quasars. We then apply the maximum likelihood test to the statistical sample of 16 lensed quasars that have both a measured redshift and magnitude for the lens galaxy. The likelihood incorporates the probability that the observed image separation is realized given the luminosity of the lens galaxy, in the same manner as Im et al. (1997). We find that the 95% confidence range for the cosmological constant (i.e., the vacuum energy density) is $0.72{\leq}{\Omega}_{\Lambda}{\leq}1.0$ for a flat universe. We also find that the equation of state parameter can be consistent with -1 as long as the matter density ${\Omega}_m{\leq}0.4$ (95% confidence range). We conclude that image separation statistics incorporating the brightness of lens galaxies can provide robust constraints on cosmological parameters.

THE ACCELERATION AND TRANSPORT OF COSMIC RAYS WITH HELIOSPHERIC EXAMPLES

  • JOKIPII J. R.
    • Journal of The Korean Astronomical Society / v.37 no.5 / pp.399-404 / 2004
  • Cosmic rays are ubiquitous in space and are apparently present wherever the matter density is small enough that they are not removed by collisions with ambient particles. The essential similarity of their energy spectra in many different regions places significant general constraints on the mechanisms for their acceleration and confinement. Diffusive shock acceleration is at present the most successful acceleration mechanism proposed and, together with transport in Kolmogorov turbulence, can account for the universal spectra. In comparison to shock acceleration, statistical acceleration, invoked in many situations, has significant disadvantages. The basic physics of acceleration and transport is discussed, and examples are shown where it apparently works very well. However, there are now well-established situations where diffusive shock acceleration cannot be the accelerator. This problem will be discussed and possible acceleration mechanisms evaluated. Statistical acceleration in these places is possible. In addition, a new mechanism, called diffusive compression acceleration, will be discussed and shown to be an attractive candidate. It has similarities with both statistical acceleration and shock acceleration.