• Title/Summary/Keyword: Trimmed Mean Squared Error (절사평균제곱오차)

Search results: 4

A Comparative Study on Spatial Lattice Data Analysis - A Case Where Outlier Exists - (공간 격자데이터 분석에 대한 우위성 비교 연구 - 이상치가 존재하는 경우 -)

  • Kim, Su-Jung; Choi, Seung-Bae; Kang, Chang-Wan; Cho, Jang-Sik
    • Communications for Statistical Applications and Methods / v.17 no.2 / pp.193-204 / 2010
  • Recently, researchers in the various fields that require spatial analysis have become increasingly interested in spatial statistics. Data with spatial correlation call for methodologies that account for that correlation, and methods for spatial data analysis have been developed accordingly. Lattice data, one type of spatial data, are analyzed in three steps: (1) definition of the spatial neighborhood, (2) definition of the spatial weights, and (3) analysis using spatial models. This paper shows, using the trimmed mean squared error statistic, that a spatial statistical analysis method is superior to a general statistical method in terms of estimation when the spatial lattice data being analyzed contain outliers. To demonstrate the validity and usefulness of this approach, we perform a small simulation study and present an empirical example with crime data from BusanJin-Gu, Korea.
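
    A minimal sketch of how a trimmed mean squared error criterion can compare two estimators on contaminated data. The trimming rule (drop the largest alpha-fraction of squared errors), the simulated field, and the choice of mean vs. median as stand-ins for the paper's general vs. spatial estimators are illustrative assumptions, not the authors' implementation.

    ```python
    # Sketch of a trimmed mean squared error (TMSE) criterion, assuming
    # TMSE = mean of squared errors after discarding the largest
    # alpha-fraction of them (an assumed definition for illustration).
    import numpy as np

    def trimmed_mse(estimates, truth, alpha=0.1):
        """Mean of squared errors after trimming the largest alpha fraction."""
        sq_err = np.sort((np.asarray(estimates) - truth) ** 2)
        keep = int(np.ceil(len(sq_err) * (1.0 - alpha)))
        return sq_err[:keep].mean()

    rng = np.random.default_rng(0)
    truth = 0.0
    # Replications of two location estimators on 5%-contaminated samples;
    # the mean and median stand in for "general" vs. "robust" estimators.
    n_rep, n = 500, 100
    mean_est, median_est = [], []
    for _ in range(n_rep):
        x = rng.normal(truth, 1.0, n)
        outliers = rng.random(n) < 0.05
        x[outliers] += rng.normal(10.0, 1.0, outliers.sum())
        mean_est.append(x.mean())
        median_est.append(np.median(x))

    print("TMSE(mean)  :", trimmed_mse(mean_est, truth))
    print("TMSE(median):", trimmed_mse(median_est, truth))
    ```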

Fixed-Width Booth-folding Squarer Design (고정길이 Booth-Folding 제곱기 디자인)

  • Cho Kyung-Ju; Chung Jin-Gyun
    • The Journal of Korean Institute of Communications and Information Sciences / v.30 no.8C / pp.832-837 / 2005
  • This paper presents a design method for a fixed-width squarer that receives a W-bit input and produces a W-bit squared product. To efficiently compensate for the quantization error, modified Booth encoder signals (rather than multiplier coefficients) are used to generate the error compensation bias. The truncated bits are divided into two groups (a major and a minor group) according to their effect on the quantization error, and a different error compensation method is applied to each group. Simulations show that the performance of the proposed method is close to that of the rounding method and much better than that of the truncation method and the conventional method. The proposed method also achieves up to 28% and 27% reductions in area and power consumption, respectively, compared with the ideal squarer.
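
    A small numeric sketch of the fixed-width idea: a W-bit input ideally yields a 2W-bit square, and a fixed-width design keeps only the top W bits. The sketch compares plain truncation, rounding, and a constant compensation bias; the Booth-encoder-derived bias and the major/minor bit grouping of the paper are not modeled, and the averaged-bias rule below is only an illustrative assumption.

    ```python
    # Fixed-width squaring sketch: keep the top W bits of the 2W-bit square.
    # "truncate" drops the low W bits, "round" adds half an LSB first, and
    # "bias" adds an assumed constant compensation equal to the average of
    # the discarded low half (not the paper's Booth-based compensation).
    W = 8

    def fixed_width_square(x, mode="truncate", bias=0):
        full = x * x                              # exact 2W-bit square
        if mode == "truncate":
            return full >> W
        if mode == "round":
            return (full + (1 << (W - 1))) >> W
        if mode == "bias":
            return (full + bias) >> W
        raise ValueError(mode)

    inputs = range(1 << W)
    # Assumed compensation: the average value of the discarded low W bits.
    avg_low = sum((x * x) & ((1 << W) - 1) for x in inputs) // (1 << W)

    for mode, bias in [("truncate", 0), ("round", 0), ("bias", avg_low)]:
        errs = [fixed_width_square(x, mode, bias) - (x * x) / 2.0**W for x in inputs]
        mse = sum(e * e for e in errs) / len(errs)
        print(f"{mode:8s}  mean error = {sum(errs)/len(errs):+.3f}  MSE = {mse:.3f}")
    ```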

A Trimmed Spatial Median Estimator Using Bootstrap Method (붓스트랩을 활용한 최적 절사공간중위수 추정량)

  • Lee, Dong-Hee; Jung, Byoung-Cheol
    • The Korean Journal of Applied Statistics / v.23 no.2 / pp.375-382 / 2010
  • In this study, we propose a robust estimator of the multivariate location parameter, a spatial median based on data trimming that extends the trimmed mean of the univariate setting. The trimming quantity of this estimator is determined by the bootstrap method, and its covariance matrix is estimated using the double bootstrap method; this extends the work of Jhun et al. (1993) to the multivariate case. A Monte Carlo study shows that the proposed trimmed spatial median estimator is more efficient than the spatial median, while its covariance matrix based on the double bootstrap overcomes the underestimation problem that occurs with the single bootstrap method.
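
    A minimal sketch of a trimmed spatial median: compute the spatial (geometric) median by Weiszfeld iterations, drop the alpha-fraction of points farthest from it, and recompute. The bootstrap rule used here (pick the trimming fraction with the smallest bootstrap variance over a small grid) is an assumption, not the authors' procedure, and the double-bootstrap covariance step is omitted.

    ```python
    # Trimmed spatial median with a bootstrap-chosen trimming fraction.
    # Weiszfeld's algorithm gives the spatial (geometric) median; the
    # variance-minimizing grid search below is an illustrative assumption.
    import numpy as np

    def spatial_median(X, n_iter=200, eps=1e-8):
        """Geometric median via Weiszfeld iterations."""
        m = X.mean(axis=0)
        for _ in range(n_iter):
            d = np.linalg.norm(X - m, axis=1)
            d = np.where(d < eps, eps, d)          # avoid division by zero
            m_new = (X / d[:, None]).sum(axis=0) / (1.0 / d).sum()
            if np.linalg.norm(m_new - m) < eps:
                break
            m = m_new
        return m

    def trimmed_spatial_median(X, alpha):
        """Drop the alpha fraction of points farthest from the spatial median."""
        m = spatial_median(X)
        d = np.linalg.norm(X - m, axis=1)
        keep = d <= np.quantile(d, 1.0 - alpha) if alpha > 0 else np.ones(len(X), bool)
        return spatial_median(X[keep])

    def choose_alpha_by_bootstrap(X, alphas=(0.0, 0.05, 0.1, 0.2), B=200, seed=0):
        """Pick the trimming fraction with the smallest bootstrap variance (assumed rule)."""
        rng = np.random.default_rng(seed)
        n = len(X)
        best_alpha, best_var = None, np.inf
        for a in alphas:
            reps = np.array([trimmed_spatial_median(X[rng.integers(0, n, n)], a)
                             for _ in range(B)])
            v = np.trace(np.cov(reps.T))           # total variance of the estimator
            if v < best_var:
                best_alpha, best_var = a, v
        return best_alpha

    rng = np.random.default_rng(1)
    X = rng.normal(0, 1, (100, 2))
    X[:10] += 8.0                                   # 10% outliers
    alpha = choose_alpha_by_bootstrap(X)
    print("chosen alpha:", alpha)
    print("trimmed spatial median:", trimmed_spatial_median(X, alpha))
    ```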

A simulation study for various propensity score weighting methods in clinical problematic situations (임상에서 발생할 수 있는 문제 상황에서의 성향 점수 가중치 방법에 대한 비교 모의실험 연구)

  • Siseong Jeong; Eun Jeong Min
    • The Korean Journal of Applied Statistics / v.36 no.5 / pp.381-397 / 2023
  • The most representative design used in clinical trials is randomization, which allows the treatment effect to be estimated accurately. In an observational study without randomization, however, comparisons between the treatment group and the control group are biased by various unadjusted differences, such as differences in patient characteristics. Propensity score weighting is a widely used way to address this problem: it minimizes bias by adjusting for such confounders when assessing treatment effects. Inverse probability weighting, the most popular method, assigns each subject a weight proportional to the inverse of the conditional probability of receiving the treatment actually assigned, given the observed covariates. This method often suffers from extreme propensity scores, however, resulting in biased estimates and excessive variance. Several alternatives, including trimming, overlap weights, and matching weights, have been proposed to mitigate these issues. In this paper, we conduct a simulation study comparing the performance of various propensity score weighting methods under diverse situations, such as limited overlap, a misspecified propensity score model, and treatment contrary to prediction. The simulation results show that overlap weights and matching weights consistently outperform inverse probability weighting and trimming in terms of bias, root mean squared error, and coverage probability.
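
    A compact sketch of the weighting schemes named in the abstract, computed from a logistic propensity model and plugged into a weighted difference-in-means estimate of the treatment effect. The simulated data, the homogeneous true effect, and the 0.1/0.9 trimming thresholds are assumptions for illustration, not the paper's simulation design.

    ```python
    # Weighting schemes built from an estimated propensity score e(x) = P(T=1|x):
    #   IPW      : w = T/e + (1-T)/(1-e)
    #   trimming : IPW restricted to 0.1 <= e <= 0.9 (threshold is an assumption)
    #   overlap  : w = T*(1-e) + (1-T)*e
    #   matching : w = min(e, 1-e) / (T*e + (1-T)*(1-e))
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    X = rng.normal(size=(n, 3))
    true_ps = 1 / (1 + np.exp(-(1.5 * X[:, 0] - 1.0 * X[:, 1])))   # limited overlap
    T = rng.binomial(1, true_ps)
    tau = 2.0                                                       # true effect
    Y = tau * T + X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=n)

    e = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]       # estimated PS

    def weighted_effect(Y, T, w):
        """Weighted difference in means between treated and controls."""
        return (np.sum(w * T * Y) / np.sum(w * T)
                - np.sum(w * (1 - T) * Y) / np.sum(w * (1 - T)))

    weights = {
        "IPW":      T / e + (1 - T) / (1 - e),
        "trimmed":  np.where((e >= 0.1) & (e <= 0.9), T / e + (1 - T) / (1 - e), 0.0),
        "overlap":  T * (1 - e) + (1 - T) * e,
        "matching": np.minimum(e, 1 - e) / np.where(T == 1, e, 1 - e),
    }
    for name, w in weights.items():
        print(f"{name:9s} estimate = {weighted_effect(Y, T, w):.3f}  (true = {tau})")
    ```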