• Title/Abstract/Keyword: Quadratic Loss

Search results: 118 items (processing time: 0.029 s)

Estimators with Nondecreasing Risk in a Multivariate Normal Distribution

  • Kim, Byung-Hwee;Koh, Tae-Wook;Baek, Hoh-Yoo
    • Journal of the Korean Statistical Society / Vol. 24, No. 1 / pp.257-266 / 1995
  • Consider a $p$-variate $(p \geq 4)$ normal distribution with mean $\boldsymbol{\theta}$ and identity covariance matrix. For estimating $\boldsymbol{\theta}$ under a quadratic loss, we investigate the behavior of the risks of Stein-type estimators which shrink the usual estimator toward the mean of the observations. By using concavity of the function appearing in the shrinkage factor, together with new expectation identities for noncentral chi-squared random variables, a characterization of estimators with nondecreasing risk is obtained.

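The shrinkage-toward-the-observation-mean rule that recurs in the abstracts above can be sketched numerically. The following is a minimal illustration, not the papers' construction: it assumes the identity-covariance normal model, uses the standard shrinkage constant $p-3$ (one dimension is absorbed by the coordinate mean), and checks the quadratic-loss risk by Monte Carlo with illustrative parameters.

```python
import random

def lindley_estimator(x):
    """Lindley-type shrinkage of an observation x ~ N(theta, I) toward the
    mean of its own coordinates; requires p >= 4."""
    p = len(x)
    xbar = sum(x) / p
    resid = [xi - xbar for xi in x]
    s = sum(r * r for r in resid)        # squared norm ||x - xbar*1||^2
    shrink = 1.0 - (p - 3) / s           # p-3: one dimension absorbed by xbar
    return [xbar + shrink * r for r in resid]

# Monte Carlo comparison of quadratic-loss risk against the usual estimator x.
random.seed(0)
p, reps = 10, 2000
theta = [2.0] * p                        # coordinates close together: shrinkage helps
loss_raw = loss_shrunk = 0.0
for _ in range(reps):
    x = [t + random.gauss(0.0, 1.0) for t in theta]
    est = lindley_estimator(x)
    loss_raw += sum((xi - ti) ** 2 for xi, ti in zip(x, theta))
    loss_shrunk += sum((ei - ti) ** 2 for ei, ti in zip(est, theta))
```

The risk of the usual estimator is exactly $p$; the shrinkage rule does best when the coordinates of $\theta$ are nearly equal, as in this run.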

Lindley Type Estimators with the Known Norm

  • Baek, Hoh-Yoo
    • Journal of the Korean Data and Information Science Society / Vol. 11, No. 1 / pp.37-45 / 2000
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\underline{\theta}}\;(p{\geq}4)$ under the quadratic loss, based on a sample ${\underline{x}}_1,\;{\cdots},\;{\underline{x}}_n$. We find an optimal decision rule within the class of Lindley type decision rules which shrink the usual one toward the mean of the observations when the underlying distribution is a variance mixture of normals and when the norm ${\parallel}{\underline{\theta}}-{\bar{\theta}}\underline{1}{\parallel}$ is known, where ${\bar{\theta}}=(1/p){\sum_{i=1}^p}{\theta}_i$ and $\underline{1}$ is the column vector of ones.


Lindley Type Estimation with Constraints on the Norm

  • Baek, Hoh-Yoo;Han, Kyou-Hwan
    • Honam Mathematical Journal / Vol. 25, No. 1 / pp.95-115 / 2003
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}\;(p{\geq}4)$ under the quadratic loss, based on a sample $X_1,\;{\cdots},\;X_n$. We find an optimal decision rule within the class of Lindley type decision rules which shrink the usual one toward the mean of the observations when the underlying distribution is a variance mixture of normals and when the norm $||{\theta}-{\bar{\theta}}1||$ is known, where ${\bar{\theta}}=(1/p)\sum_{i=1}^p{\theta}_i$ and $1$ is the column vector of ones. When the norm is restricted to a known interval, typically no optimal Lindley type rule exists, but we characterize a minimal complete class within the class of Lindley type decision rules. We also characterize the subclass of Lindley type decision rules that dominate the sample mean.


Lindley Type Estimators When the Norm is Restricted to an Interval

  • Baek, Hoh-Yoo;Lee, Jeong-Mi
    • Journal of the Korean Data and Information Science Society / Vol. 16, No. 4 / pp.1027-1039 / 2005
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}\;(p{\geq}4)$ under the quadratic loss, based on a sample $X_1$, $X_2$, $\cdots$, $X_n$. We find a Lindley type decision rule which shrinks the usual one toward the mean of the observations when the underlying distribution is a variance mixture of normals and when the norm ${\parallel}{\theta}-{\bar{\theta}}1{\parallel}$ is restricted to a known interval, where ${\bar{\theta}}=\frac{1}{p}\sum_{i=1}^{p}{\theta}_i$ and $1$ is the column vector of ones. In this case, we characterize a minimal complete class within the class of Lindley type decision rules. We also characterize the subclass of Lindley type decision rules that dominate the sample mean.


James-Stein Type Estimators Shrinking towards Projection Vector When the Norm is Restricted to an Interval

  • Baek, Hoh Yoo;Park, Su Hyang
    • Journal of Integrative Natural Science / Vol. 10, No. 1 / pp.33-39 / 2017
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}\;(p-q{\geq}3)$, where $q=\mathrm{rank}(P_V)$ for an idempotent projection matrix $P_V$, under the quadratic loss, based on a sample $X_1$, $X_2$, ${\cdots}$, $X_n$. We find a James-Stein type decision rule which shrinks towards the projection vector when the underlying distribution is a variance mixture of normals and when the norm ${\parallel}{\theta}-P_V{\theta}{\parallel}$ is restricted to a known interval. In this case, we characterize a minimal complete class within the class of James-Stein type decision rules. We also characterize the subclass of James-Stein type decision rules that dominate the sample mean.

An improvement of estimators for the multinormal mean vector with the known norm

  • Kim, Jaehyun;Baek, Hoh Yoo
    • Journal of the Korean Data and Information Science Society / Vol. 28, No. 2 / pp.435-442 / 2017
  • Consider the problem of estimating a $p{\times}1$ mean vector ${\theta}\;(p{\geq}3)$ under the quadratic loss in a multivariate normal population. We find a James-Stein type estimator which shrinks towards the projection vectors when the underlying distribution is a variance mixture of normals. In this case, the norm ${\parallel}{\theta}-K{\theta}{\parallel}$ is known, where $K$ is a projection matrix with $\mathrm{rank}(K)=q$. This class of estimators is general enough to include the class of estimators proposed by Marchand and Giri (1993). We derive the class and obtain the optimal type estimator. This result can also be applied to simple and multiple regression models in the case of $\mathrm{rank}(K){\geq}2$.
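Shrinkage toward a projection, as in the projection-vector abstracts above, can be sketched with a concrete rank-2 projection. Everything below is an illustrative assumption rather than the papers' setup: $P_V$ projects onto the span of the constant vector and a linear trend, so $q=2$ and the shrinkage constant is $p-q-2$.

```python
import random

def project_const_trend(x):
    """Orthogonal projection of x onto span{1, t}, t a linear trend (q = 2)."""
    p = len(x)
    xbar = sum(x) / p
    c = [i - (p - 1) / 2.0 for i in range(p)]   # centered trend, orthogonal to 1
    slope = sum(xi * ci for xi, ci in zip(x, c)) / sum(ci * ci for ci in c)
    return [xbar + slope * ci for ci in c]

def js_toward_projection(x, q=2):
    """James-Stein-type shrinkage of x ~ N(theta, I) toward P_V x (p - q >= 3)."""
    p = len(x)
    px = project_const_trend(x)
    resid = [xi - pi for xi, pi in zip(x, px)]
    s = sum(r * r for r in resid)               # squared norm ||x - P_V x||^2
    shrink = 1.0 - (p - q - 2) / s
    return [pi + shrink * ri for pi, ri in zip(px, resid)]

# Sanity check: when theta lies in span{1, t}, shrinkage beats the raw x.
random.seed(1)
p, reps = 10, 500
theta = [1.0 + 0.3 * i for i in range(p)]
loss_raw = loss_shrunk = 0.0
for _ in range(reps):
    x = [t + random.gauss(0.0, 1.0) for t in theta]
    est = js_toward_projection(x)
    loss_raw += sum((xi - ti) ** 2 for xi, ti in zip(x, theta))
    loss_shrunk += sum((ei - ti) ** 2 for ei, ti in zip(est, theta))
```

The closer $\theta$ is to the projection subspace, the smaller the residual norm and the larger the risk improvement over the sample mean.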

Flexible Process Performance Measures by Quadratic Loss Function

  • 정영배
    • Journal of Society of Korea Industrial and Systems Engineering / Vol. 18, No. 36 / pp.275-285 / 1995
  • In recent years there has been increasing interest in the issue of process centering in manufacturing processes. The traditional process capability indices Cp, Cpk and Cpu are used to provide measures of process performance, but these indices do not address process centering. A newer measure, the process capability index Cpm, takes into account the proximity to the target value as well as the process variation when assessing process performance. However, Cpm only considers the acceptance cost for deviation from the target value within the specification limits and does not include any economic consideration of rejected items. This paper proposes flexible process performance measures that consider the quadratic loss caused by quality deviation within the specification limits, the rejection cost associated with the disposition of rejected items, and the inspection cost. In this model, the disposition of rejected items is considered both under perfect corrective procedures and in their absence.

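The quadratic-loss view of process performance described in the abstract can be written down directly. The constants below are illustrative; only the standard Taguchi loss $L(y)=k(y-T)^2$ and the usual $C_{pm}$ definition are assumed.

```python
import math

def expected_quadratic_loss(mu, sigma, target, k=1.0):
    """Expected Taguchi loss per item: E[k*(Y - T)^2] = k*(sigma^2 + (mu - T)^2)."""
    return k * (sigma ** 2 + (mu - target) ** 2)

def cpm(lsl, usl, mu, sigma, target):
    """Capability index penalizing both spread and deviation from target."""
    return (usl - lsl) / (6.0 * math.sqrt(sigma ** 2 + (mu - target) ** 2))

# A centered process loses only through its variance; drifting off target
# adds the squared bias to the per-item loss.
centered = expected_quadratic_loss(10.0, 0.5, 10.0)     # sigma^2 = 0.25
off_target = expected_quadratic_loss(10.4, 0.5, 10.0)   # 0.25 + 0.4^2 = 0.41
```

Unlike Cp or Cpk, both quantities rise (loss) or fall (Cpm) as the mean drifts from the target even when the spread is unchanged, which is the point the abstract makes about process centering.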

An approach to improving the James-Stein estimator shrinking towards projection vectors

  • Park, Tae Ryong;Baek, Hoh Yoo
    • Journal of the Korean Data and Information Science Society / Vol. 25, No. 6 / pp.1549-1555 / 2014
  • Consider a $p$-variate normal distribution with a projection matrix $P_V$, where $p-q{\geq}3$ and $q=\mathrm{rank}(P_V)$. Using a simple property of the noncentral chi-square distribution, generalized Bayes estimators dominating the James-Stein estimator shrinking towards projection vectors under quadratic loss are given, based on the methods of Brown, Brewster and Zidek for estimating a normal variance. This result can be extended to the cases where the covariance matrix is completely unknown or ${\Sigma}={\sigma}^2I$ for an unknown scalar ${\sigma}^2$.

A Study on Time Series Prediction Using the Support Vector Machine

  • 강환일;정요원;송영기
    • Institute of Control, Robotics and Systems: Conference Proceedings / Proceedings of the 15th Conference (2000) / pp.315-315 / 2000
  • In this paper, we perform time series prediction using the SVM (Support Vector Machine). We make use of two different loss functions and two different kernel functions: i) the quadratic and $\varepsilon$-insensitive loss functions; ii) the GRBF (Gaussian Radial Basis Function) and ERBF (Exponential Radial Basis Function) kernels. The Mackey-Glass time series is used for prediction. For both cases, we compare the results of the SVM to those of an ANN (Artificial Neural Network) and show that the SVM performs better than the ANN.
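Since the abstract pairs the quadratic loss with an SVM, the quadratic-loss variant (the least-squares SVM) can be sketched compactly: the dual weights come from a single linear system rather than a quadratic program. The sketch below covers only that case, with a Gaussian RBF kernel and one-step prediction of an Euler-discretized Mackey-Glass series; the series length, embedding dimension, kernel width, regularization weight, and the omission of the bias term are all illustrative assumptions.

```python
import math

def mackey_glass(n, tau=17, beta=0.2, gamma=0.1):
    """Euler-discretized Mackey-Glass series (illustrative parameters)."""
    x = [1.2] * (tau + 1)
    for _ in range(n):
        xt, xtau = x[-1], x[-tau - 1]
        x.append(xt + beta * xtau / (1.0 + xtau ** 10) - gamma * xt)
    return x[tau + 1:]

def grbf(u, v, sigma=1.0):
    """Gaussian RBF kernel between two embedding vectors."""
    return math.exp(-sum((a - b) ** 2 for a, b in zip(u, v)) / (2.0 * sigma ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting for the dual system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    out = [0.0] * n
    for r in range(n - 1, -1, -1):
        out[r] = (M[r][n] - sum(M[r][k] * out[k] for k in range(r + 1, n))) / M[r][r]
    return out

# Delay embedding: predict the next value from the last `embed` values.
series = mackey_glass(220)
embed = 4
X = [series[i:i + embed] for i in range(len(series) - embed)]
y = series[embed:]
Xtr, ytr, Xte, yte = X[:150], y[:150], X[150:], y[150:]

# Quadratic loss => ridge-like dual: (K + I/gam) alpha = y (bias omitted).
gam = 100.0
K = [[grbf(a, z) + (1.0 / gam if i == j else 0.0)
      for j, z in enumerate(Xtr)] for i, a in enumerate(Xtr)]
alpha = solve(K, ytr)

pred = [sum(a * grbf(xi, z) for a, xi in zip(alpha, Xtr)) for z in Xte]
mse = sum((p_ - t) ** 2 for p_, t in zip(pred, yte)) / len(yte)
```

The $\varepsilon$-insensitive case from the abstract requires a QP solver and is not sketched here; with quadratic loss the whole fit reduces to the linear solve above.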

Simultaneous Optimization Using Loss Functions in Multiple Response Robust Designs

  • Kwon, Yong Man
    • Journal of Integrative Natural Science / Vol. 14, No. 3 / pp.73-77 / 2021
  • Robust design is an approach to reducing the performance variation of multiple responses in products and processes. In fact, many experimental designs require the simultaneous optimization of multiple responses. In this paper, we propose how to simultaneously optimize multiple responses for robust design when data are collected from a combined array. The proposed method is based on the quadratic loss function. An example illustrates the proposed method.
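A minimal sketch of the combined quadratic-loss criterion follows; the summary statistics, targets, and weights are hypothetical, not from the paper. Each candidate setting is scored by a weighted sum of squared bias plus variance over the responses, and the smallest total loss wins.

```python
def total_quadratic_loss(stats, targets, weights):
    """stats: list of (mean, variance) pairs, one per response, at one setting."""
    return sum(w * ((m - t) ** 2 + v)
               for (m, v), t, w in zip(stats, targets, weights))

# Hypothetical combined-array summaries: three settings, two responses.
settings = {
    "A": [(9.8, 0.40), (50.5, 1.20)],
    "B": [(10.1, 0.25), (49.5, 0.80)],
    "C": [(10.0, 0.90), (50.0, 2.00)],
}
targets, weights = [10.0, 50.0], [1.0, 0.5]
best = min(settings, key=lambda s: total_quadratic_loss(settings[s], targets, weights))
```

Here setting "C" is exactly on target for both responses but its larger variances cost more, under the quadratic loss, than "B"'s small biases, so "B" is selected.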