Title/Summary/Keyword: Kernel estimate


Comparison of Jump-Preserving Smoothing and Smoothing Based on Jump Detector

  • Park, Dong-Ryeon
    • Communications for Statistical Applications and Methods, v.16 no.3, pp.519-528, 2009
  • This paper deals with nonparametric estimation of a discontinuous regression curve. A considerable amount of research has been done on this topic, and it falls into two categories: the indirect approach and the direct approach. The main goal of the indirect approach is to obtain good estimates of the jump locations, whereas the main goal of the direct approach is to obtain a good estimate of the regression curve as a whole. Because the two approaches appear quite different in nature, it is often said that comparing them makes little sense, and a thorough comparison has therefore been lacking. However, even though the main issue of the indirect approach is the estimation of jump locations, it clearly yields an estimate of the regression curve as a by-product. The question is whether this by-product of the indirect approach is as good as the main result of the direct approach. The performance of the two approaches is compared through a simulation study, and it turns out that the indirect approach is a very competitive tool for estimating the discontinuous regression curve itself.
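
As a rough illustration of the indirect approach described above, the following is a minimal sketch assuming Gaussian kernels, a single jump, and synthetic data; the function names (`estimate_jump_location`, `piecewise_nw_smoother`), the bandwidths, and the toy model are assumptions of this example, not the estimators studied in the paper.

```python
import numpy as np

def one_sided_means(x, y, t, h):
    """Kernel-weighted means of y strictly left and right of the point t."""
    w = np.exp(-0.5 * ((x - t) / h) ** 2)
    wl, wr = w * (x < t), w * (x > t)
    return (wl @ y) / max(wl.sum(), 1e-12), (wr @ y) / max(wr.sum(), 1e-12)

def estimate_jump_location(x, y, h, trim=10):
    """Step 1 (indirect approach): the jump is where the one-sided means differ most."""
    candidates = x[trim:-trim]                     # stay away from the boundaries
    gaps = [abs(np.subtract(*one_sided_means(x, y, t, h))) for t in candidates]
    return candidates[int(np.argmax(gaps))]

def piecewise_nw_smoother(x, y, tau, h, grid):
    """Step 2: Nadaraya-Watson smoothing, using only data on the same side of tau."""
    fit = np.empty_like(grid)
    for i, g in enumerate(grid):
        same_side = (x <= tau) if g <= tau else (x > tau)
        w = np.exp(-0.5 * ((x - g) / h) ** 2) * same_side
        fit[i] = (w @ y) / max(w.sum(), 1e-12)
    return fit

# toy data: m(x) = sin(2*pi*x) with a jump of size 2 at x = 0.5
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 2.0 * (x > 0.5) + rng.normal(0, 0.3, x.size)
tau_hat = estimate_jump_location(x, y, h=0.05)
curve = piecewise_nw_smoother(x, y, tau_hat, h=0.05, grid=np.linspace(0, 1, 101))
print("estimated jump location:", round(float(tau_hat), 3))
```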

A Video Deblurring Algorithm based on Sharpness Metric for Uniform Sharpness between Frames

  • Lee, Byung-Ju; Lee, Dong-Bok; Song, Byung Cheol
    • Journal of the Institute of Electronics and Information Engineers, v.50 no.4, pp.127-136, 2013
  • This paper proposes a video deblurring algorithm that maintains uniform sharpness between frames. Unlike previous algorithms that use fixed parameters, the proposed algorithm keeps the sharpness uniform by adjusting the parameters for each frame. First, we estimate the initial blur kernel and perform deconvolution, then measure the sharpness of the deblurred image. To maintain uniform sharpness, we adjust the regularization parameter and the kernel according to the measured sharpness and perform deconvolution again. Experimental results show that the proposed algorithm achieves outstanding deblurring results while providing consistent sharpness.
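
The sketch below illustrates only the per-frame parameter adjustment idea; it substitutes a simple frequency-domain Wiener-style deconvolution and a gradient-energy sharpness measure for the paper's deconvolution method and sharpness metric, and the target sharpness, adjustment factor, and function names are assumptions made for the example.

```python
import numpy as np

def wiener_deconvolve(frame, kernel, lam):
    """Regularized inverse filtering: X = conj(H) * Y / (|H|^2 + lam)."""
    H = np.fft.fft2(kernel, s=frame.shape)
    Y = np.fft.fft2(frame)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

def sharpness(img):
    """Mean gradient magnitude as a crude sharpness measure."""
    gy, gx = np.gradient(img)
    return float(np.mean(np.hypot(gx, gy)))

def deblur_frame(frame, kernel, target_sharpness, lam=1e-2, n_iter=5):
    """Deconvolve, measure sharpness, and re-adjust the regularization weight."""
    for _ in range(n_iter):
        restored = wiener_deconvolve(frame, kernel, lam)
        s = sharpness(restored)
        # weaker regularization sharpens the result, stronger regularization smooths it
        lam *= 1.5 if s > target_sharpness else 1 / 1.5
    return restored, lam

# toy example: blur a synthetic frame with a 5x5 box kernel, then deblur it
rng = np.random.default_rng(1)
sharp = rng.random((64, 64))
kernel = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(kernel, s=sharp.shape)))
restored, lam = deblur_frame(blurred, kernel, target_sharpness=sharpness(sharp))
print("final regularization weight:", lam)
```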

Standardized Total Tract Digestibility of Phosphorus in Copra Expellers, Palm Kernel Expellers, and Cassava Root Fed to Growing Pigs

  • So, A.R.; Shin, S.Y.; Kim, B.G.
    • Asian-Australasian Journal of Animal Sciences, v.26 no.11, pp.1609-1613, 2013
  • An experiment was conducted to determine the apparent total tract digestibility (ATTD) and standardized total tract digestibility (STTD) of phosphorus (P) in copra expellers (CE), palm kernel expellers (PKE), and cassava root (CR). Eight barrows (initial BW of 40.0 kg, SD = 4.5) were individually housed in metabolism crates. A replicated $4{\times}3$ incomplete Latin square design was employed involving 4 dietary treatments, 3 periods, and 8 animals. Three experimental diets contained 40% CE, PKE, or CR as the only source of P. A P-free diet based mainly on corn starch, sucrose, and gelatin was also prepared to estimate the basal endogenous loss of P. The marker-to-marker method was used for fecal collection. Values for the ATTD of P in CE and PKE were greater than in CR (46.0 and 39.7 vs. -14.0%; p<0.05). However, the STTD of P did not differ greatly among the test ingredients (56.5, 49.0, and 43.2% in CE, PKE, and CR, respectively). In conclusion, the ATTD of P in CE and PKE was greater than that in CR, but the STTD of P did not differ greatly among CE, PKE, and CR.
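
For reference, the relations typically used to compute these quantities (stated here as general background, not quoted from the paper) express both digestibilities in terms of P intake, fecal P output, and the basal endogenous P loss (EPL) estimated from the P-free diet:

$$
\mathrm{ATTD}\,(\%) = \frac{P_{\mathrm{intake}} - P_{\mathrm{fecal}}}{P_{\mathrm{intake}}} \times 100, \qquad
\mathrm{STTD}\,(\%) = \frac{P_{\mathrm{intake}} - \left(P_{\mathrm{fecal}} - \mathrm{EPL}\right)}{P_{\mathrm{intake}}} \times 100 .
$$

Correcting the fecal output for the endogenous loss is why the STTD values reported above are uniformly higher than the corresponding ATTD values.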

Asymptotic optimal bandwidth selection in kernel regression function estimation

  • Seong, Kyoung-Ha; Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society, v.9 no.1, pp.19-27, 1998
  • We consider a bandwidth selection method that attains the asymptotically optimal convergence rate $n^{-1/2}$ in kernel regression function estimation. The proposed selector takes the Mean Averaged Squared Error (MASE) as its performance criterion and uses its Taylor expansion up to the fourth order; the bandwidth is then chosen to minimize the estimated approximation of the MASE. Finally, we establish the relative convergence rate between the optimal bandwidth and the proposed bandwidth.
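
As background for the convergence-rate statement, a kernel-regression MASE admits the standard asymptotic approximation below (a textbook second-order form, not the fourth-order expansion developed in the paper), with constants $c_1$ and $c_2$ depending on the kernel, the regression function, and the error variance:

$$
\mathrm{MASE}(h) \approx c_1 h^{4} + \frac{c_2}{nh}, \qquad
h_{\mathrm{opt}} = \left(\frac{c_2}{4\,c_1}\right)^{1/5} n^{-1/5},
$$

and a data-driven selector $\hat{h}$ is said to attain the rate $n^{-1/2}$ when $\hat{h}/h_{\mathrm{opt}} - 1 = O_p(n^{-1/2})$.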

Design and Performance Test of Large-Area Susceptor for the Improvement of Temperature Uniformity

  • Yang, Hac Jin; Kim, Seong Kun; Cho, Jung Kun
    • Journal of the Korea Academia-Industrial cooperation Society, v.16 no.6, pp.3714-3721, 2015
  • Although sheath-type heating lines are generally used for susceptor heaters, temperature uniformity deteriorates for large-area susceptors operated at high temperatures. We developed a new design and prototype of a susceptor using sheet metal to improve temperature uniformity. Temperature uniformity below 1.4% at a surface temperature of $450^{\circ}C$ was verified for the prototype. We also developed a kernel regression algorithm to estimate the measured temperature from temperature training data. The reliability of the measured temperature uniformity was confirmed by a comparative analysis of the predicted and measured data.
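
A minimal sketch of the kind of kernel (Nadaraya-Watson) regression that can map measured temperatures to predictions over the susceptor surface; the sensor layout, bandwidth, and synthetic temperature profile below are stand-ins, not the paper's learning data or algorithm details.

```python
import numpy as np

def nadaraya_watson(train_xy, train_temp, query_xy, h=20.0):
    """Gaussian-kernel weighted average of training temperatures at each query point."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-0.5 * d2 / h**2)
    return (w @ train_temp) / w.sum(axis=1)

# synthetic susceptor surface: hotter in the center, slightly cooler at the rim
rng = np.random.default_rng(2)
train_xy = rng.uniform(-150, 150, size=(40, 2))            # sensor positions (mm)
r = np.linalg.norm(train_xy, axis=1)
train_temp = 450 - 0.02 * r + rng.normal(0, 0.5, r.size)   # measured temperatures (deg C)

grid = np.stack(np.meshgrid(np.linspace(-150, 150, 31),
                            np.linspace(-150, 150, 31)), axis=-1).reshape(-1, 2)
pred = nadaraya_watson(train_xy, train_temp, grid)
uniformity = (pred.max() - pred.min()) / pred.mean() * 100
print(f"predicted temperature non-uniformity: {uniformity:.2f} %")
```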

An Algorithm of Score Function Generation using Convolution-FFT in Independent Component Analysis

  • Kim, Woong-Myung; Lee, Hyon-Soo
    • The KIPS Transactions: Part B, v.13B no.1 s.104, pp.27-34, 2006
  • In this study, we propose a new algorithm that generates the score function in ICA (Independent Component Analysis) using entropy theory. Generating the score function requires an estimate of the probability density function of the source signals, and that density must be differentiable. We therefore use kernel density estimation to derive the score function of each signal. After rewriting the estimator in convolution form to speed up the density estimation, we use the FFT algorithm to compute the convolution quickly. The proposed score function generation method reduces the error, that is, the density difference between the recovered signals and the original signals. In computer simulations of the blind source separation problem, the estimated densities are closer to those of the original signals than with Extended Infomax and Fixed-Point ICA, and the SNR (Signal-to-Noise Ratio) between the recovered and original signals is improved.
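
A rough sketch of the FFT-convolution route to a kernel density estimate and the resulting score function $\varphi(x) = -p'(x)/p(x)$; the binning, the bandwidth, and the use of `scipy.signal.fftconvolve` are choices made for this illustration rather than details taken from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def kde_score_fft(samples, n_bins=512, bandwidth=0.1):
    """Bin the samples, convolve with a Gaussian kernel via FFT, return (grid, pdf, score)."""
    lo, hi = samples.min() - 3 * bandwidth, samples.max() + 3 * bandwidth
    grid = np.linspace(lo, hi, n_bins)
    dx = grid[1] - grid[0]
    hist, _ = np.histogram(samples, bins=n_bins, range=(lo, hi), density=True)
    kernel = np.exp(-0.5 * (np.arange(-4 * bandwidth, 4 * bandwidth, dx) / bandwidth) ** 2)
    kernel /= kernel.sum()
    pdf = np.clip(fftconvolve(hist, kernel, mode="same"), 1e-12, None)
    score = -np.gradient(pdf, dx) / pdf        # phi(x) = -p'(x) / p(x)
    return grid, pdf, score

# toy example: score function of a bimodal source signal
rng = np.random.default_rng(3)
s = np.concatenate([rng.normal(-2, 0.5, 2000), rng.normal(2, 0.5, 2000)])
grid, pdf, score = kde_score_fft(s)
print("pdf integrates to about", round(float(np.sum(pdf) * (grid[1] - grid[0])), 3))
```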

Initialization of Fuzzy C-Means Using Kernel Density Estimation

  • Heo, Gyeong-Yong; Kim, Kwang-Baek
    • Journal of the Korea Institute of Information and Communication Engineering, v.15 no.8, pp.1659-1664, 2011
  • Fuzzy C-Means (FCM) is one of the most widely used clustering algorithms and has been applied successfully in many areas. However, FCM has some shortcomings, and initial prototype selection is one of them. Since FCM is only guaranteed to converge to a local optimum, different initial prototypes result in different clusterings, so much care should be given to the selection of the initial prototypes. In this paper, a new initialization method for FCM using kernel density estimation (KDE) is proposed to resolve the initialization problem. KDE can be used to estimate a non-parametric data distribution and is useful for estimating local density. In the proposed method, after KDE, one initial prototype is placed in the densest region and the density of that region is then reduced; iterating this process yields the set of initial prototypes. Experimental results demonstrate that prototypes obtained in this way give better results than the randomly selected prototypes commonly used in FCM.
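
A minimal sketch of the density-based initialization idea, assuming a Gaussian kernel density estimate and a Gaussian "deflation" of the density around each chosen prototype; the deflation width and the 2-D toy data are assumptions of this example, not the paper's settings.

```python
import numpy as np

def kde_init(data, n_clusters, h=0.5):
    """Pick prototypes one by one at the current density peak, then damp that region."""
    d2 = ((data[:, None, :] - data[None, :, :]) ** 2).sum(axis=2)
    density = np.exp(-0.5 * d2 / h**2).sum(axis=1)     # KDE evaluated at each sample
    prototypes = []
    for _ in range(n_clusters):
        idx = int(np.argmax(density))
        prototypes.append(data[idx])
        # reduce the density near the chosen prototype so the next peak lies elsewhere
        d2_to_proto = ((data - data[idx]) ** 2).sum(axis=1)
        density *= 1.0 - np.exp(-0.5 * d2_to_proto / h**2)
    return np.array(prototypes)

# toy example: three Gaussian blobs in 2-D
rng = np.random.default_rng(4)
data = np.vstack([rng.normal(c, 0.3, size=(100, 2)) for c in ((0, 0), (3, 0), (0, 3))])
print(kde_init(data, n_clusters=3).round(2))
```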

An Efficiency Assessment for Reflectance Normalization of RapidEye Employing BRD Components of Wide-Swath satellite

  • Kim, Sang-Il; Han, Kyung-Soo; Yeom, Jong-Min
    • Korean Journal of Remote Sensing, v.27 no.3, pp.303-314, 2011
  • Surface albedo is an important parameter of the surface energy budget, and its accurate quantification is of major interest to the global climate modeling community. In this paper, we therefore consider the direct solution of kernel-based bidirectional reflectance distribution function (BRDF) models for retrieving the normalized reflectance of a high-resolution satellite. BRDF effects can be resolved in wide-swath satellite data such as SPOT/VGT (VEGETATION), which provide sufficient angular sampling, but high-resolution satellites cannot obtain sufficient angular sampling over a pixel within a short period because of their narrow swath, which makes it difficult to run a semi-empirical BRDF model for reflectance normalization of high-resolution satellites. The principal purpose of this study is to estimate the normalized reflectance of a high-resolution satellite (RapidEye) using BRDF components derived from SPOT/VGT. We use a semi-empirical BRDF model to estimate the BRDF components from SPOT/VGT and to normalize the RapidEye reflectance. The study uses SPOT/VGT S1 (daily) data together with the multispectral RapidEye sensor. The isotropic value, which corresponds to the normalized reflectance, is closely related to the BRDF parameters and kernels, and we present scatter plots of the relationship between the SPOT/VGT and RapidEye isotropic values. A linear regression analysis is performed using the SPOT/VGT parameters (the isotropic, geometric, and volumetric scattering values) and the RapidEye kernel values (the geometric and volumetric scattering kernels); because BRDF parameters are difficult to calculate directly from high-resolution satellites, the BRDF parameters of SPOT/VGT are used. Weights for the geometric value, the volumetric scattering value, and the error term are determined through the regression models. As a result, the weighting obtained by linear regression produced good agreement: for all sites, the SPOT/VGT and RapidEye isotropic values were highly correlated (in terms of RMSE and bias) and generally very consistent.
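
The kernel-based semi-empirical BRDF model referred to above is commonly written in the following form (the particular geometric and volumetric kernel definitions, e.g. Li-sparse and Ross-thick, are not specified here):

$$
R(\theta_s, \theta_v, \phi) = f_{\mathrm{iso}} + f_{\mathrm{geo}}\,K_{\mathrm{geo}}(\theta_s, \theta_v, \phi) + f_{\mathrm{vol}}\,K_{\mathrm{vol}}(\theta_s, \theta_v, \phi),
$$

where $\theta_s$, $\theta_v$, and $\phi$ are the solar zenith, view zenith, and relative azimuth angles, $f_{\mathrm{iso}}$ is the isotropic coefficient used here as the normalized reflectance, and $f_{\mathrm{geo}}$ and $f_{\mathrm{vol}}$ weight the geometric and volumetric scattering kernels.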

Bandwidth selections based on cross-validation for estimation of a discontinuity point in density

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society, v.23 no.4, pp.765-775, 2012
  • Cross-validation is a popular method for selecting the bandwidth in all types of kernel estimation. Maximum likelihood cross-validation, least squares cross-validation, and biased cross-validation have been proposed for bandwidth selection in kernel density estimation. For the case in which the probability density function has a discontinuity point, Huh (2012) proposed a bandwidth selection method based on maximum likelihood cross-validation. In this paper, two forms of cross-validation with one-sided kernel functions are proposed for selecting the bandwidths used to estimate the location and jump size of the discontinuity point of a density. These methods are motivated by least squares cross-validation and biased cross-validation, respectively. Simulated examples compare the finite-sample performance of the two proposed methods with that of Huh (2012).
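
For context, the classical least squares cross-validation criterion that motivates one of the proposed selectors is, in its standard two-sided form (the one-sided variants developed in the paper are not reproduced here),

$$
\mathrm{LSCV}(h) = \int \hat{f}_h(x)^2\,dx - \frac{2}{n}\sum_{i=1}^{n} \hat{f}_{h,-i}(X_i),
$$

where $\hat{f}_{h,-i}$ denotes the kernel density estimate computed without the $i$-th observation; the bandwidth is chosen as the minimizer of $\mathrm{LSCV}(h)$.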

Development of MKDE-ebd for Estimation of Multivariate Probabilistic Distribution Functions

  • Kang, Young-Jin; Noh, Yoojeong; Lim, O-Kaung
    • Journal of the Computational Structural Engineering Institute of Korea, v.32 no.1, pp.55-63, 2019
  • In engineering problems, many random variables are correlated, and the correlation of the input random variables has a great influence on the reliability analysis results for mechanical systems. However, correlated variables are often treated as independent variables or modeled by specific parametric joint distributions because modeling joint distributions is difficult, and when the correlated data are insufficient it becomes even harder to model the joint distribution correctly. In this study, multivariate kernel density estimation with bounded data is proposed to estimate various types of highly nonlinear joint distributions. Since it combines the given data with bounded data generated from confidence intervals of uniform distribution parameters for the given data, it is less sensitive to data quality and to the number of data points. It therefore yields conservative statistical modeling and reliability analysis results, and its performance is verified through statistical simulations and engineering examples.
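
A minimal sketch of a plain multivariate Gaussian kernel density estimate of the kind MKDE-ebd builds on; the bounded-data augmentation that gives the method its name is not reproduced here, and `scipy.stats.gaussian_kde` with its default bandwidth is used purely for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

# correlated two-dimensional input data (e.g. two correlated material properties)
rng = np.random.default_rng(5)
cov = [[1.0, 0.8], [0.8, 1.0]]
samples = rng.multivariate_normal([0.0, 0.0], cov, size=200)

# multivariate KDE fitted to the data; gaussian_kde expects one column per sample
kde = gaussian_kde(samples.T)

# evaluate the estimated joint density at a few query points
queries = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, -2.0]])
print(kde(queries.T).round(4))
```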