• Title/Summary/Keyword: generalized least squares method (일반화최소제곱법)

A Generalized Marginal Logit Model for Repeated Polytomous Response Data (반복측정의 다가 반응자료에 대한 일반화된 주변 로짓모형)

  • Choi, Jae-Sung
    • The Korean Journal of Applied Statistics / v.21 no.4 / pp.621-630 / 2008
  • This paper discusses how to construct a generalized marginal logit model for analyzing repeated polytomous response data when some factors are applied to larger experimental units as treatments and time is applied to a smaller experimental unit as a repeated-measures factor, so two different experimental unit sizes are considered. Weighted least squares (WLS) methods are used to estimate the fixed effects in the suggested model.
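
As a point of reference for the WLS step mentioned in the abstract, the following is a minimal numpy sketch of a weighted least squares fit; the design matrix, response, and weights are illustrative toy values, not the paper's marginal logit setup.

```python
# Minimal weighted least squares (WLS) sketch: b_hat = (X' W X)^{-1} X' W y.
# All inputs here are toy values; the paper applies WLS within a marginal
# logit model for repeated polytomous responses, which is not reproduced here.
import numpy as np

def wls(X, y, w):
    """Weighted least squares estimate with diagonal weights w."""
    W = np.diag(w)
    XtW = X.T @ W
    return np.linalg.solve(XtW @ X, XtW @ y)

# toy usage: intercept and slope with one down-weighted observation
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = np.array([1.0, 1.9, 3.2, 3.8, 5.1])
w = np.array([1.0, 1.0, 0.5, 1.0, 1.0])
print(wls(X, y, w))
```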

L-Estimation for the Parameter of the AR(1) Model (AR(1) 모형의 모수에 대한 L-추정법)

  • Han Sang Moon;Jung Byoung Cheal
    • The Korean Journal of Applied Statistics / v.18 no.1 / pp.43-56 / 2005
  • In this study, a robust estimation method for the first-order autocorrelation coefficient in the AR(1) time series model with additive outliers (AO) is investigated. We propose an L-type trimmed least squares estimation method using the preliminary estimator (PE) suggested by Ruppert and Carroll (1980) for the multiple regression model. In addition, using Mallows' weight function to down-weight outliers in the design space, the bounded-influence PE (BIPE) estimator is obtained, and the mean squared error (MSE) performance of the various estimators of the autocorrelation coefficient is compared using Monte Carlo experiments. The Monte Carlo results show that the BIPE(LAD) estimator, which uses the generalized LAD as the preliminary estimator, performs well relative to the other estimators.
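
A rough illustration of the trimming idea described above, assuming a simple AR(1) regression of x_t on x_{t-1}; the crude median-of-ratios preliminary fit below is only a stand-in for the paper's preliminary estimator, and the Mallows/bounded-influence weighting is omitted.

```python
# Sketch of L-type trimmed least squares for the AR(1) coefficient: fit a crude
# preliminary estimate, trim the observations with the largest residuals, then
# refit by least squares. Illustrative only, not the paper's BIPE(LAD) estimator.
import numpy as np

def trimmed_ar1(x, trim=0.1):
    y, z = x[1:], x[:-1]                                  # regress x_t on x_{t-1}
    rho0 = np.median(y / np.where(z == 0, 1e-8, z))       # crude preliminary estimate
    resid = np.abs(y - rho0 * z)
    keep = resid <= np.quantile(resid, 1 - trim)          # drop the largest residuals
    yk, zk = y[keep], z[keep]
    return np.sum(zk * yk) / np.sum(zk * zk)              # LS slope through the origin

rng = np.random.default_rng(0)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.6 * x[t - 1] + rng.normal()
x[50] += 15.0                                             # inject one additive outlier
print(trimmed_ar1(x))                                     # should stay near 0.6
```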

A comparison study on regression with stationary nonparametric autoregressive errors (정상 비모수 자기상관 오차항을 갖는 회귀분석에 대한 비교 연구)

  • Yu, Kyusang
    • The Korean Journal of Applied Statistics / v.29 no.1 / pp.157-169 / 2016
  • We compare four methods for estimating regression coefficients in linear regression models with serially correlated errors, where the regression errors are generated by nonlinear autoregressive models. The four methods are the ordinary least squares estimator, the generalized least squares estimator, a parametric regression error correction method, and a nonparametric regression error correction method. We also discuss some properties of nonlinear autoregressive models through numerical studies with typical examples. Our numerical study suggests that no method dominates; however, the nonparametric regression error correction method works quite well.
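
For reference, the generalized least squares estimator compared above has the closed form b_GLS = (X'V^{-1}X)^{-1} X'V^{-1} y. The sketch below plugs in an AR(1) working covariance V with entries rho^{|i-j|}; the fixed rho and the toy data are assumptions for illustration (a feasible GLS would estimate rho from OLS residuals).

```python
# Hedged GLS sketch with an AR(1) error covariance V_ij = rho^|i - j|.
import numpy as np

def gls_ar1(X, y, rho):
    n = len(y)
    V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    Vinv = np.linalg.inv(V)
    XtVi = X.T @ Vinv
    return np.linalg.solve(XtVi @ X, XtVi @ y)   # (X'V^-1 X)^-1 X'V^-1 y

# toy usage with AR(1) errors of known rho
rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.7 * e[t - 1] + rng.normal()
y = X @ np.array([1.0, 2.0]) + e
print(gls_ar1(X, y, rho=0.7))
```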

An outlier weight adjustment using generalized ratio-cum-product method for two phase sampling (이중추출법에서 일반화 ratio-cum-product 방법을 이용한 이상점 가중치 보정법)

  • Oh, Jung-Taek;Shin, Key-Il
    • The Korean Journal of Applied Statistics / v.29 no.7 / pp.1185-1199 / 2016
  • Two-phase sampling (double sampling) is often used when population information is inadequate for proper stratification. Many recent papers have been devoted to estimation methods that improve the precision of the estimator by using first-phase information. In this study we suggest outlier weight adjustment methods that improve estimation precision based on the weight of the generalized ratio-cum-product estimator. Small simulation studies are conducted to compare the suggested methods with the usual method, and a real data analysis is also performed.

Bigdata Analysis of Fine Dust Theme Stock Price Volatility According to PM10 Concentration Change (PM10 농도변화에 따른 미세먼지 테마주 주가변동 빅데이터 분석)

  • Kim, Mu Jeong;Lim, Gyoo Gun
    • Journal of Service Research and Studies / v.10 no.1 / pp.55-67 / 2020
  • Fine dust has recently become one of the greatest concerns of the Korean public and has been the target of considerable effort by the national and local governments. Much academic research has been carried out on fine dust, but relatively little of it has addressed the economic field, so we wanted to know how fine dust affects the economy. Big data on PM10 concentrations and on fine dust theme stock prices were collected for the five years from 2013 to 2017, and regression analysis was performed with a linear regression model estimated by the generalized least squares method. As a result, the change in fine dust concentration was found to have an effect on the prices of the related theme stocks: when the concentration increased compared with the previous day, the fine dust theme stock prices also tended to increase. According to the year-by-year analysis of price changes from 2013 to 2017, the companies with large regression coefficients changed every year. Among them, the regression coefficients of Monalisa were repeatedly high in 2014, 2015, and 2017, Samil Pharmaceutical in 2015, 2016, and 2017, and Welcron in 2016 and 2017, so these companies were judged to be sensitive to the fine dust concentration. The companies that responded the most over the five years were Wokong, Welcron, Dongsung Pharmaceutical, Samil Pharmaceutical, and Monalisa. Once enough PM2.5 measurement data are accumulated, it would be meaningful to analyze PM2.5 concentration as an independent variable as well. In this study only the fine dust concentration is used as an independent variable, but a clearer and better-explained result could be expected by adding appropriate variables to increase the explanatory power.

Algorithms for wavefront reconstruction of Shack-Hartmann wavefront sensor (Shack-Hartmann 센서의 파면 재구성 알고리즘)

  • 서영석;백성훈;박승규;김철중
    • Proceedings of the Optical Society of Korea Conference / 2000.08a / pp.44-45 / 2000
  • To reconstruct and analyze a wavefront from the slope information obtained by a Shack-Hartmann sensor, a mathematical algorithm is needed that can reconstruct the wavefront phase from the phase gradient at each spot image. The best-known algorithms for reconstructing the wavefront phase are the three methods proposed by Hudgin, Fried, and Southwell. In this study, the centroid of each spot image was extracted from the digital image transferred from a CCD camera, the horizontal and vertical slopes were computed from the spot displacements, and the phase was reconstructed from these slopes by least-squares fitting. To reconstruct the wavefront from the slope information, each case was generalized with a matrix formulation based on the existing theory, and the analytic error relationship between phase recovery and wavefront correction was discussed. (abridged)
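
The least-squares reconstruction described above can be sketched as solving an overdetermined linear system mapping grid phases to measured slopes. The sketch below assumes a Hudgin-type forward-difference geometry and a uniform grid spacing; the geometry, grid size, and variable names are illustrative, not the paper's exact matrix formulation.

```python
# Least-squares wavefront reconstruction from Shack-Hartmann slopes (sketch).
# Phases phi live on an n x n grid; x-slopes sx have shape (n, n-1) and
# y-slopes sy have shape (n-1, n), each approximating a forward difference.
import numpy as np

def reconstruct(sx, sy, h=1.0):
    n = sx.shape[0]
    rows, cols, vals, rhs = [], [], [], []
    r = 0
    def add(i, j, ip, jp, slope):
        nonlocal r
        rows.extend([r, r]); cols.extend([i * n + j, ip * n + jp])
        vals.extend([-1.0 / h, 1.0 / h]); rhs.append(slope); r += 1
    for i in range(n):
        for j in range(n - 1):              # sx[i, j] ~ (phi[i, j+1] - phi[i, j]) / h
            add(i, j, i, j + 1, sx[i, j])
    for i in range(n - 1):
        for j in range(n):                  # sy[i, j] ~ (phi[i+1, j] - phi[i, j]) / h
            add(i, j, i + 1, j, sy[i, j])
    A = np.zeros((r, n * n))
    A[rows, cols] = vals
    phi, *_ = np.linalg.lstsq(A, np.array(rhs), rcond=None)
    return phi.reshape(n, n) - phi.mean()   # piston is unobservable, so remove it

# toy check: exact forward-difference slopes of a known surface are recovered
n = 8
true = np.add.outer(np.arange(n) ** 2, np.arange(n)) * 0.01
rec = reconstruct(np.diff(true, axis=1), np.diff(true, axis=0))
print(np.max(np.abs(rec - (true - true.mean()))))   # close to 0
```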

Power analysis of testing fixed effects with two way classification (이원혼합모형에서 고정효과 유의성검정에 대한 검정력 분석)

  • 이장택
    • The Korean Journal of Applied Statistics / v.10 no.1 / pp.177-187 / 1997
  • This article considers the power performance of tests in unbalanced two-way mixed linear models with one fixed factor. The generalized least squares (GLS) F statistic for testing that there are no differences among the effects of the levels of the fixed factor is computed with variance components estimated by Henderson's method III, the minimum norm quadratic unbiased estimator (MINQUE) with prior guess 1, maximum likelihood (ML), and restricted maximum likelihood (REML). We investigate the power performance of these test statistics. Simulation shows that the GLS F statistics based on the four estimators produce similar Type I error rates and power.
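
For orientation only, the following is a hedged sketch of a Wald/GLS-type statistic for a linear hypothesis on the fixed effects once an estimated error covariance has been plugged in; it is a generic construction, not the paper's exact Henderson III / MINQUE / ML / REML machinery, and the denominator degrees of freedom needed for a proper F test are not shown.

```python
# Wald/GLS-type statistic for H0: L b = 0 with a plugged-in error covariance Vhat
# (estimated elsewhere, e.g. by Henderson III, MINQUE, ML, or REML). Illustrative only.
import numpy as np

def gls_wald_stat(X, y, Vhat, L):
    Vinv = np.linalg.inv(Vhat)
    XtVi = X.T @ Vinv
    C = np.linalg.inv(XtVi @ X)              # approximate Cov(b_hat) given Vhat
    b = C @ (XtVi @ y)                       # GLS estimate of the fixed effects
    Lb = L @ b
    W = Lb @ np.linalg.solve(L @ C @ L.T, Lb)
    return W / np.linalg.matrix_rank(L)      # numerator of an approximate F statistic
```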

On analysis of row-column designs (행-열 실험계획의 분석에 관한 연구)

  • 백운봉
    • The Korean Journal of Applied Statistics / v.5 no.2 / pp.229-242 / 1992
  • Bradley and Stewart (1991) considered a large class of experimental designs as multidimensional block designs (MBDs); the simplest MBD can be considered a row-column design (RCD). They presented the intrablock analysis of variance for a general row-column design. In this article, a generalized least squares solution for Bradley and Stewart's example is considered under the assumption that row and column effects are random. This is an application of a revised version of Paik's (1990a, 1990b) method, and the Appendix is devoted to that revised method.

Application of the GPS Data Simplification Methods for Railway Alignments Reconstruction (철도 선형 복원을 위한 GPS 데이터 단순화 방법의 적용)

  • 정의환
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.22 no.1 / pp.63-71 / 2004
  • This research reconstructs railway alignments from GPS data. To reduce unnecessary linear data and improve accuracy, four simplification algorithms are investigated in order to find the optimum simplification method, and the methods are evaluated with two displacement measures between the observed data and their simplifications. The results show that the complexity of the lines alone is not a practical criterion for choosing a simplification algorithm, and that the Douglas-Peucker method produces little displacement between the observed data and its simplification. By applying the Douglas-Peucker method to the observed linear GPS data on the railway track, the design elements of the horizontal alignment are calculated, showing that good results for reconstructing the alignment elements can be obtained with the methods and algorithms of this study.
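
Since the abstract names the Douglas-Peucker method as the best-performing simplification, here is a short generic sketch of that algorithm; the point format, tolerance, and toy track below are illustrative assumptions, not the paper's data or parameters.

```python
# Douglas-Peucker simplification: keep the endpoints, find the vertex farthest
# from the chord, and recurse while that distance exceeds the tolerance tol.
import numpy as np

def douglas_peucker(pts, tol):
    pts = np.asarray(pts, dtype=float)
    if len(pts) < 3:
        return pts
    a, b = pts[0], pts[-1]
    ab = b - a
    diff = pts[1:-1] - a
    # perpendicular distance of each interior point to the chord a-b
    d = np.abs(ab[0] * diff[:, 1] - ab[1] * diff[:, 0]) / (np.linalg.norm(ab) + 1e-12)
    i = int(np.argmax(d)) + 1
    if d[i - 1] <= tol:
        return np.vstack([a, b])               # the whole span collapses to the chord
    left = douglas_peucker(pts[:i + 1], tol)
    right = douglas_peucker(pts[i:], tol)
    return np.vstack([left[:-1], right])       # avoid duplicating the split point

# toy 2-D track: prints the vertices retained after simplification
track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5.0), (4, 6.0), (5, 7.0)]
print(douglas_peucker(track, tol=0.5))
```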

Prediction of movie audience numbers using hybrid model combining GLS and Bass models (GLS와 Bass 모형을 결합한 하이브리드 모형을 이용한 영화 관객 수 예측)

  • Kim, Bokyung;Lim, Changwon
    • The Korean Journal of Applied Statistics / v.31 no.4 / pp.447-461 / 2018
  • Domestic film industry sales are increasing every year. Theaters are the primary sales channel for movies, and the number of theater admissions also affects the sale of ancillary rights; therefore, the number of theater admissions is an important factor directly linked to movie industry sales. In this paper we consider a hybrid model that combines a multiple linear regression model and the Bass model to predict the audience number for a specific day. By combining the two models, the predicted value from the regression analysis is corrected toward that of the Bass model. Three films with different release dates were used in the analysis. The all-subsets regression method is used to generate all possible variable combinations, and 5-fold cross-validation is used to fit the model five times. The predicted value is taken from the model with the smallest root mean square error and then combined with the predicted value of the Bass model to obtain the final prediction. We confirmed that as more past data become available, the weight on the Bass model increases and a larger correction is added to the predicted value.
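
To make the two building blocks concrete, the snippet below sketches the Bass diffusion curve for cumulative audience and a simple convex combination of a regression prediction with the Bass prediction for one day. The market size m, the coefficients p and q, the weight w, and the daily granularity are illustrative assumptions, not the paper's fitted values or weighting scheme.

```python
# Bass model cumulative adoptions and a weighted hybrid with a regression prediction.
import numpy as np

def bass_cumulative(t, m, p, q):
    """m * F(t), where F(t) = (1 - exp(-(p+q)t)) / (1 + (q/p) * exp(-(p+q)t))."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def hybrid_forecast(reg_pred, bass_pred, w):
    """Convex combination: weight w on the Bass prediction, 1 - w on the regression."""
    return w * bass_pred + (1.0 - w) * reg_pred

t = np.arange(0, 8)                                               # days since release
daily = np.diff(bass_cumulative(t, m=1_000_000, p=0.03, q=0.38))  # Bass daily audience
print(hybrid_forecast(reg_pred=120_000.0, bass_pred=daily[3], w=0.4))
```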