• Title/Summary/Keyword: OLS estimator

Search Results: 17

Jensen's Alpha Estimation Models in Capital Asset Pricing Model

  • Phuoc, Le Tan
    • The Journal of Asian Finance, Economics and Business / v.5 no.3 / pp.19-29 / 2018
  • This research examined alternatives for estimating Jensen's alpha (α) in the Capital Asset Pricing Model, discussed by Treynor (1961), Sharpe (1964), and Lintner (1965), using the robust maximum likelihood type M-estimator (MM estimator) and the Bayes estimator with a conjugate prior. In the finance literature and in practice, alpha has usually been estimated with ordinary least squares (OLS) regression on monthly return data. A sample of 50 securities was randomly selected from the S&P 500 index, and their daily and monthly returns were collected over the last five years. The robust MM estimator performed better than both the OLS and Bayes estimators in terms of efficiency, while the Bayes estimator did not outperform the OLS estimator as expected. Interestingly, daily return data gave more accurate alpha estimates than monthly return data for all three estimators (MM, OLS, and Bayes). An alternative market efficiency test based on the hypothesis H₀: α = 0 was also proposed, showing that the S&P 500 index is efficient, but not perfectly so. These findings were checked and validated with Jackknife resampling.
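As a minimal illustration of the OLS step the abstract describes, the sketch below regresses a stock's excess returns on the market's excess returns and reads Jensen's alpha off the intercept. All data and parameter values here are synthetic and purely illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example data: 60 monthly excess returns (values are illustrative).
market_excess = rng.normal(0.007, 0.04, 60)          # R_m - R_f
true_alpha, true_beta = 0.002, 1.1
stock_excess = true_alpha + true_beta * market_excess + rng.normal(0, 0.02, 60)

# OLS fit of the market model: (R_i - R_f) = alpha + beta * (R_m - R_f) + e
X = np.column_stack([np.ones_like(market_excess), market_excess])
coef, *_ = np.linalg.lstsq(X, stock_excess, rcond=None)
alpha_hat, beta_hat = coef
print(f"alpha = {alpha_hat:.4f}, beta = {beta_hat:.3f}")
```

A Jensen's-alpha test of the kind the paper proposes would then ask whether the fitted intercept differs significantly from zero.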

Weighted Least Absolute Deviation Lasso Estimator

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / v.18 no.6 / pp.733-739 / 2011
  • The least absolute shrinkage and selection operator (Lasso) improves the low prediction accuracy and poor interpretability of the ordinary least squares (OLS) estimate through $L_1$ regularization of the regression coefficients. However, the Lasso is not robust to outliers, because it minimizes a sum of squared residuals. The least absolute deviation (LAD) estimator is an alternative to OLS, but it is sensitive to leverage points. We propose a robust Lasso estimator that is not sensitive to outliers, heavy-tailed errors, or leverage points.
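A minimal sketch of the plain Lasso that the abstract contrasts with OLS, via coordinate descent with soft-thresholding on synthetic data. The weighted-LAD loss of the proposed WLAD-Lasso is not implemented here; this only shows the $L_1$ shrinkage that sets weak coefficients exactly to zero:

```python
import numpy as np

def soft_threshold(z, g):
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain Lasso by coordinate descent:
    minimizes (1/2n)*||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]      # partial residual
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(1)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0])   # sparse truth (illustrative)
y = X @ beta_true + rng.normal(scale=0.5, size=n)

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_lasso = lasso_cd(X, y, lam=0.3)
print("OLS:  ", np.round(b_ols, 2))
print("Lasso:", np.round(b_lasso, 2))
```

Unlike OLS, the Lasso fit drives the three null coefficients to exactly zero, at the cost of shrinking the active ones, which is the accuracy/interpretability trade-off the abstract refers to.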

Expressions for Shrinkage Factors of PLS Estimator

  • Kim, Jong-Duk
    • Journal of the Korean Data and Information Science Society / v.17 no.4 / pp.1169-1180 / 2006
  • Partial least squares regression (PLS) is a biased, non-least-squares regression method and an alternative to ordinary least squares regression (OLS) when predictors are highly collinear or outnumber the observations. One way to understand the properties of biased regression methods is to examine how their estimators shrink the OLS estimator. In this paper, we introduce an expression for the shrinkage factor of PLS, develop a new shrinkage expression, and prove the equivalence of the two representations. We use two near-infrared (NIR) data sets to show the general behavior of the shrinkage and, in particular, for which eigendirections PLS expands the OLS coefficients.
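A one-component PLS1 fit can sketch, on synthetic data, how PLS yields coefficients different from OLS when a predictor is nearly collinear; the paper's shrinkage-factor expressions themselves are not reproduced here, and all data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 80, 4
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=n)   # near-collinear column
y = X[:, 0] - X[:, 1] + rng.normal(scale=0.3, size=n)

# One-component PLS1 (NIPALS-style), a minimal sketch of the method studied.
w = X.T @ y
w /= np.linalg.norm(w)             # weight vector
t = X @ w                          # score vector
b_pls = w * (t @ y) / (t @ t)      # implied regression coefficients

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("PLS (1 comp):", np.round(b_pls, 2))
print("OLS:         ", np.round(b_ols, 2))
```

With one component, the PLS prediction is the projection of y onto the single score vector t, so the coefficient vector is heavily constrained relative to OLS; comparing the two vectors coordinate-wise is the simplest way to see the shrinkage the paper analyzes.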


The Use of Ridge Regression for Yield Prediction Models with Multicollinearity Problems

  • Shin, Man Yong
    • Journal of Korean Society of Forest Science / v.79 no.3 / pp.260-268 / 1990
  • Two types of ridge regression estimators were compared with the ordinary least squares (OLS) estimator in order to select the "best" estimator when multicollinearity exists. The ridge estimators were based on Mallows's (1973) $C_P$-like statistic and Allen's (1974) PRESS-like statistic. The evaluation was based on the predictive ability of a yield model developed by Matney et al. (1988), using a total of 522 plots from the Southwide Loblolly Pine Seed Source study. Both ridge estimators had better predictive ability than the OLS estimator, and the ridge estimator obtained from Mallows's statistic performed best. Ridge estimators can therefore be recommended as an alternative when multicollinearity exists among the independent variables.


Estimation of the Polynomial Errors-in-variables Model with Decreasing Error Variances

  • Moon, Myung-Sang; R. F. Gunst
    • Journal of the Korean Statistical Society / v.23 no.1 / pp.115-134 / 1994
  • A polynomial errors-in-variables model with one predictor variable and one response variable is defined, and an estimator of the model is derived following Booth's linear model estimation procedure. Since the polynomial model is a nonlinear function of the unknown regression coefficients and the error-free predictors, it is a nonlinear errors-in-variables model, and applying a linear estimation method to it requires some additional assumptions. The estimator is therefore derived under the assumption that the error variances decrease as the sample size increases. Asymptotic properties of the derived estimator are provided, and a simulation study compares its small-sample properties with those of the OLS estimator.


On Fitting Polynomial Measurement Error Models with Vector Predictor -When Interactions Exist among Predictors-

  • Myung-Sang Moon
    • Communications for Statistical Applications and Methods / v.2 no.1 / pp.1-12 / 1995
  • An estimator of the coefficients of a polynomial measurement error model with a vector predictor and first-order interaction terms is derived using Hermite polynomials. Asymptotic normality of the estimator is established, and a simulation study compares its small-sample properties with those of the OLS estimator.


Asymptotic Properties of Least Square Estimator of Disturbance Variance in the Linear Regression Model with MA(q)-Disturbances

  • Jong Hyup Lee; Seuck Heum Song
    • Communications for Statistical Applications and Methods / v.4 no.1 / pp.111-117 / 1997
  • The ordinary least squares estimator $S^2$ of the variance of the disturbances is considered in the linear regression model with autocorrelated disturbances. It is proved that the OLS estimator of the disturbance variance is asymptotically unbiased and weakly consistent when the disturbances are generated by an MA(q) process. In particular, the asymptotic unbiasedness and consistency of $S^2$ hold without any restriction on the regressor matrix.
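The claimed asymptotic unbiasedness of $S^2$ under MA(q) disturbances can be checked informally by simulation. The sketch below uses an MA(1) process with illustrative parameters and compares the average of $S^2$ over many replications with the true disturbance variance:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])

# MA(1) disturbances: u_t = e_t + theta*e_{t-1}; Var(u_t) = (1 + theta^2)*sigma_e^2
theta, sigma_e = 0.5, 1.0
var_u = (1 + theta ** 2) * sigma_e ** 2

s2_values = []
for _ in range(500):
    e = rng.normal(scale=sigma_e, size=n + 1)
    u = e[1:] + theta * e[:-1]
    y = X @ np.array([1.0, 2.0]) + u
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    s2_values.append(resid @ resid / (n - p))   # the usual OLS estimator S^2
print(f"mean S^2 = {np.mean(s2_values):.3f}  (true Var(u) = {var_u:.3f})")
```

The Monte Carlo mean of $S^2$ lands close to the true disturbance variance even though the errors are dependent, consistent with the paper's asymptotic result.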


A Parameter Estimation of Bass Diffusion Model by the Hybrid of NLS and OLS

  • Hong, Jung-Sik; Kim, Tae-Gu; Koo, Hoon-Young
    • Journal of Korean Institute of Industrial Engineers / v.37 no.1 / pp.74-82 / 2011
  • The Bass model is a cornerstone of diffusion theory and is used for forecasting demand for durables or new services. Three well-known methods for estimating the parameters of the Bass model are Ordinary Least Squares (OLS), Maximum Likelihood Estimation (MLE), and Nonlinear Least Squares (NLS). In this paper, a hybrid method incorporating OLS and NLS is presented, and its performance is analyzed and compared with OLS and NLS using simulation data and empirical data. The results show that NLS performs best in terms of accuracy, while the hybrid method performs best in terms of stability; in particular, the hybrid method performs better with less data. This matters in practice, because little data is available when a diffusion model is used to forecast demand for a new product.
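The OLS step of Bass-model estimation can be sketched as follows: the discrete Bass equation $S(t) = pM + (q - p)N(t-1) - (q/M)N(t-1)^2$ is linear in $N(t-1)$ and $N(t-1)^2$, so regressing sales on these recovers $(pM,\ q-p,\ -q/M)$ and hence $(p, q, M)$. The parameter values below are illustrative and the simulated data are noise-free, so OLS recovers them exactly; the paper's hybrid then feeds such OLS estimates into NLS as starting values:

```python
import numpy as np

# Illustrative Bass parameters: innovation p, imitation q, market potential M.
p_true, q_true, M_true = 0.03, 0.4, 1000.0

# Simulate the discrete Bass model; N is cumulative adoption N(t-1).
N = 0.0
sales, cum = [], []
for t in range(15):
    s = p_true * M_true + (q_true - p_true) * N - (q_true / M_true) * N ** 2
    cum.append(N)
    sales.append(s)
    N += s

# OLS step: regress S(t) on [1, N(t-1), N(t-1)^2], then invert
# a = p*M, b = q - p, c = -q/M  =>  c*M^2 + b*M + a = 0.
cum = np.array(cum)
X = np.column_stack([np.ones_like(cum), cum, cum ** 2])
a, b, c = np.linalg.lstsq(X, np.array(sales), rcond=None)[0]
M_hat = (-b - np.sqrt(b ** 2 - 4 * a * c)) / (2 * c)   # positive root
p_hat, q_hat = a / M_hat, -c * M_hat
print(f"p = {p_hat:.3f}, q = {q_hat:.3f}, M = {M_hat:.0f}")
```

With noisy real data this inversion is unstable, which is exactly why refining the OLS estimates with NLS, as the hybrid method does, pays off.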

The Asymptotic Unbiasedness of $S^2$ in the Linear Regression Model with Dependent Errors

  • Lee, Sang-Yeol; Kim, Young-Won
    • Journal of the Korean Statistical Society / v.25 no.2 / pp.235-241 / 1996
  • The ordinary least squares estimator of the disturbance variance in the linear regression model with stationary errors is shown to be asymptotically unbiased when the error process has a spectral density bounded from above and away from zero. Such error processes cover a broad class of stationary processes, including ARMA processes.
