• Title/Summary/Keyword: Kernel parameter


An Efficiency Assessment for Reflectance Normalization of RapidEye Employing BRD Components of Wide-Swath Satellite

  • Kim, Sang-Il;Han, Kyung-Soo;Yeom, Jong-Min
    • Korean Journal of Remote Sensing
    • /
    • v.27 no.3
    • /
    • pp.303-314
    • /
    • 2011
  • Surface albedo is an important parameter of the surface energy budget, and its accurate quantification is of major interest to the global climate modeling community. In this paper, we therefore consider the direct solution of kernel-based bidirectional reflectance distribution function (BRDF) models for retrieving the normalized reflectance of a high-resolution satellite. BRD effects can be resolved in wide-swath satellite data such as SPOT/VGT (VEGETATION), which provides sufficient angular sampling, but high-resolution satellites cannot obtain sufficient angular sampling over a pixel within a short period because of their narrow-swath scanning, which makes it difficult to apply a semi-empirical BRDF model directly for reflectance normalization. The principal purpose of this study is to estimate the normalized reflectance of a high-resolution satellite (RapidEye) using BRDF components derived from SPOT/VGT. We use a semi-empirical BRDF model to estimate the BRDF components from SPOT/VGT and to normalize the reflectance of RapidEye. This study used SPOT/VGT S1 (daily) data; the high-resolution sensor considered is the multispectral RapidEye. The isotropic value, which corresponds to the normalized reflectance, was closely related to the BRDF parameters and the kernels, and we show a scatter plot of the relationship between the SPOT/VGT and RapidEye isotropic values. A linear regression analysis relates the SPOT/VGT parameters (isotropic, geometric, and volumetric scattering values) to the RapidEye kernel values (geometric and volumetric scattering kernels). Because BRDF parameters are difficult to calculate directly from high-resolution satellites, we use the BRDF parameters of SPOT/VGT, and we determine weights for the geometric value, volumetric scattering value, and error term through the regression models. As a result, the weighting obtained through linear regression produced good agreement: for all sites, the SPOT/VGT and RapidEye isotropic values were highly correlated (in terms of RMSE and bias) and generally very consistent.
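The semi-empirical model described above is linear in its parameters, so the inversion step can be sketched with ordinary least squares. The kernel values below are synthetic placeholders, not the actual Ross-Thick/Li-Sparse kernels, which depend on sun/view geometry; the parameter values are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical angular samples from a wide-swath sensor (e.g. SPOT/VGT)
n = 50
k_vol = rng.uniform(-0.2, 0.6, n)   # volumetric scattering kernel values
k_geo = rng.uniform(-1.0, 0.0, n)   # geometric scattering kernel values

# "True" BRDF parameters used to simulate observed reflectance
f_iso, f_vol, f_geo = 0.25, 0.10, 0.05
refl = f_iso + f_vol * k_vol + f_geo * k_geo + rng.normal(0, 0.005, n)

# Least-squares inversion of the kernel-driven model
# refl = f_iso + f_vol * K_vol + f_geo * K_geo
A = np.column_stack([np.ones(n), k_vol, k_geo])
f_hat, *_ = np.linalg.lstsq(A, refl, rcond=None)

# The recovered isotropic parameter f_hat[0] plays the role of the
# normalized (view/sun-angle-free) reflectance
print(f_hat)
```

The narrow-swath problem is that a RapidEye pixel never accumulates enough distinct (k_vol, k_geo) samples for this system to be well conditioned, which is why the study borrows the fitted parameters from SPOT/VGT instead.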

Selection of bandwidth for local linear composite quantile regression smoothing (국소 선형 복합 분위수 회귀에서의 평활계수 선택)

  • Jhun, Myoungshic;Kang, Jongkyeong;Bang, Sungwan
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.5
    • /
    • pp.733-745
    • /
    • 2017
  • Local composite quantile regression is a useful non-parametric regression method widely used for its high efficiency. Kernel-based smoothing is typically used in the estimation process, and its performance depends largely on the smoothing parameter rather than on the kernel itself. The $L_2$-norm is generally used as the criterion for assessing the performance of the regression function, and many studies have been conducted on selecting smoothing parameters that minimize the mean square error (MSE) or mean integrated square error (MISE). In this paper, we explore the optimal selection of the smoothing parameter, which determines the performance of non-parametric regression models based on local linear composite quantile regression. As evaluation criteria for the choice of smoothing parameter, we use the mean absolute error (MAE) and mean integrated absolute error (MIAE), which have not been researched extensively due to mathematical difficulties. We prove the uniqueness of the optimal smoothing parameter based on MAE and MIAE. Furthermore, we compare the optimal smoothing parameter based on the proposed criteria (MAE and MIAE) with that based on the existing criteria (MSE and MISE), and investigate the properties of the proposed method through simulation studies in various settings.
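The bandwidth-selection idea can be sketched in a simulation where the true regression function is known, so the MAE of the fit is computable directly. For brevity this sketch uses a plain local-linear estimator with a Gaussian kernel rather than the composite-quantile variant; the grid of bandwidths and the test function are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
m = np.sin(2 * np.pi * x)                 # true regression function
y = m + rng.normal(0, 0.3, x.size)

def local_linear(x0, x, y, h):
    """Local linear fit at x0 with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = w[:, None] * X
    beta = np.linalg.solve(X.T @ W, X.T @ (w * y))
    return beta[0]                        # intercept = fitted value at x0

# Choose the bandwidth minimizing mean absolute error against the truth
grid = np.linspace(0.02, 0.3, 15)
mae = [np.mean([abs(local_linear(x0, x, y, h) - m0)
                for x0, m0 in zip(x, m)]) for h in grid]
h_opt = grid[int(np.argmin(mae))]
print(h_opt)
```

In practice the true function is unknown, which is exactly why the paper studies MAE/MIAE-based criteria and their minimizers theoretically rather than by oracle comparison.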

Evaluation of the usefulness of various CT kernel applications in PET/CT attenuation correction (PET/CT 감쇠보정시 다양한 CT Kernel 적용에 따른 유용성 평가)

  • Lee, Jae-Young;Seong, Yong-Jun;Yoon, Seok-Hwan;Park, Chan-Rok;Lee, Hong-Jae;Noh, Kyung-Wun
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.21 no.2
    • /
    • pp.37-43
    • /
    • 2017
  • Purpose: PET/CT attenuation correction is now commonly performed with CTAC (Computed Tomography Attenuation Correction), which enables quantitative evaluation via the SUV (Standardized Uptake Value). The purpose of this study is to evaluate the SUV and to find a proper CT kernel by applying various CT kernels to CTAC during PET/CT reconstruction. Materials and Methods: A Biograph mCT 64 was used. The study was performed on 20 patients examined at our hospital from February through March 2017. Using a NEMA IEC body phantom, the data were reconstructed into PET/CT images with CTAC applied with various CT kernels. ANOVA was used to evaluate the significance of differences in the results. Results: Measuring the radioactivity concentration of the phantom, B45F differed by 96% and B80F by 6.58% relative to the B08F CT kernel, respectively, and SUVmax increased by 0.86% (B45F) and 6.54% (B80F) relative to B08F. In the patient data, lung SUVmax increased by 1.6% (B45F) and 6.6% (B80F), liver SUVmax by 0.7% and 4.7%, and bone SUVmax by 1.3% and 6.2%, respectively. For the standard deviation (SD), lung SD increased by 4.2% (B45F) and 15.4% (B80F), liver SD by 2.1% and 11%, and bone SD by 2.3% and 14.7%, respectively. No significant difference was found among the three CT kernels (P > .05). Conclusion: When a noisier CT kernel is used for PET/CT reconstruction, both SUVmax and SD in the ROI (region of interest) tend to change; as the CT kernel number increases, sharp noise in the ROI increases, so SUVmax and SD are measured higher, but the differences were not statistically significant. Therefore, using a CT kernel with low SD variation yields less variation in SUV.
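The significance test used above is a one-way ANOVA across the three kernel groups. A minimal sketch of that computation, on illustrative SUVmax readings (not values from the study), is:

```python
import numpy as np

# Hypothetical SUVmax readings under three CT reconstruction kernels
b08f = np.array([5.1, 4.8, 5.3, 5.0, 4.9])
b45f = np.array([5.2, 4.9, 5.3, 5.1, 5.0])
b80f = np.array([5.5, 5.2, 5.6, 5.4, 5.3])

groups = [b08f, b45f, b80f]
k = len(groups)
n = sum(g.size for g in groups)
grand = np.concatenate(groups).mean()

# One-way ANOVA: ratio of between-group to within-group variance
ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))
print(F)
```

The F statistic is then compared against the F(k-1, n-k) distribution to obtain the P value; the study reports P > .05, i.e. no significant kernel effect.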


Multi-User Detection using Support Vector Machines

  • Lee, Jung-Sik;Lee, Jae-Wan;Hwang, Jae-Jeong;Chung, Kyung-Taek
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.12C
    • /
    • pp.1177-1183
    • /
    • 2009
  • In this paper, support vector machines (SVMs) are applied to multi-user detection (MUD) for a direct-sequence (DS) CDMA system. This work analyzes the performance of an SVM-based multi-user detector with several kernel functions: linear, sigmoid, and Gaussian. The basic idea in SVM training is to select the proper number of support vectors by maximizing the margin between the two classes. In simulation studies, the performance of SVM-based MUD with different kernel functions is compared in terms of the number of selected support vectors, the corresponding decision boundaries, and the bit error rate. The controlling parameters in SVM training were found to affect, to some degree, SVM-based MUD with both the sigmoid and Gaussian kernels. SVM-based MUD with the Gaussian kernel is shown to outperform the other kernels.
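The kernel comparison above can be sketched on a toy two-class detection problem: the DS-CDMA signal model is not reproduced here, only the idea of fitting SVMs with the three kernels and comparing support-vector counts and accuracy. The data layout and parameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n = 200
X0 = rng.normal(loc=[-1, -1], scale=0.5, size=(n, 2))   # user bit -1
X1 = rng.normal(loc=[1, 1], scale=0.5, size=(n, 2))     # user bit +1
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

# Compare the kernels studied in the paper (rbf = Gaussian):
# number of support vectors selected and training accuracy
for kernel in ["linear", "sigmoid", "rbf"]:
    clf = SVC(kernel=kernel, gamma="scale").fit(X, y)
    print(kernel, clf.n_support_.sum(), round(clf.score(X, y), 3))
```

Fewer support vectors generally means a simpler decision boundary; the bit error rate comparison in the paper corresponds to evaluating these classifiers on noisy held-out symbols.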

Bandwidth selection for discontinuity point estimation in density (확률밀도함수의 불연속점 추정을 위한 띠폭 선택)

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.1
    • /
    • pp.79-87
    • /
    • 2012
  • In the case that the probability density function has a discontinuity point, Huh (2002) estimated the location and jump size of the discontinuity based on the difference between right and left kernel density estimators using one-sided kernel functions. In this paper, we consider a cross-validation criterion, formed from the right and left maximum likelihood cross-validations, for selecting the bandwidth used to estimate the location and jump size of the discontinuity point. The method is motivated by the one-sided cross-validation of Hart and Yi (1998). The finite-sample performance is illustrated by a simulated example.
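The right-minus-left estimator idea can be sketched directly: scan candidate locations, compare one-sided density estimates on each side, and take the point of largest disagreement as the jump location. One-sided uniform kernels are used here for simplicity; the bandwidth is fixed rather than cross-validated, and the mixture density is an illustrative example.

```python
import numpy as np

rng = np.random.default_rng(3)
# Density with a jump at x = 0.5: height 1.5 on [0, 0.5), 0.5 on [0.5, 1]
m = 2000
x = np.where(rng.uniform(size=m) < 0.75,
             rng.uniform(0.0, 0.5, m),
             rng.uniform(0.5, 1.0, m))

h = 0.05
grid = np.linspace(0.1, 0.9, 161)

def left_right_gap(t):
    left = np.mean((x >= t - h) & (x < t)) / h    # left-sided estimate
    right = np.mean((x >= t) & (x < t + h)) / h   # right-sided estimate
    return right - left

gap = np.array([left_right_gap(t) for t in grid])
i = int(np.argmax(np.abs(gap)))
t_hat, jump_hat = grid[i], gap[i]   # estimated location and signed jump size
print(t_hat, jump_hat)
```

The bandwidth h controls the trade-off the abstract discusses: too small and the gap estimate is noisy everywhere, too large and the jump is smeared out, which is why a data-driven (cross-validated) choice matters.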

Shifted Linear Interpolation with an Image-Dependent Parameter (영상에 종속적인 매개변수를 갖는 이동 선형 보간법)

  • Park, Do-Young;Yoo, Hoon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.17 no.10
    • /
    • pp.2425-2430
    • /
    • 2013
  • This paper presents a shifted linear interpolation method with an image-dependent parameter. The original shifted linear interpolation uses the optimal shift parameter 0.21, calculated by spectral analysis of the shifted linear interpolation kernel. However, the parameter can differ if the spectrum of the input image is taken into account; we therefore introduce an image-dependent parameter. Experiments show that the best shift parameter is 0.19 on average for real images. Simulation results also indicate that the proposed method is superior to the existing shifted linear interpolation, as well as to conventional methods such as linear interpolation and cubic convolution interpolation, in terms of both subjective and objective image quality.
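A 1-D sketch of shifted linear interpolation, assuming the standard formulation (a recursive prefilter so that samples are reproduced exactly, followed by linear interpolation on a grid shifted by tau): tau = 0.21 is the fixed optimum cited above, and the paper's contribution is choosing tau per image (0.19 on average).

```python
import numpy as np

def shifted_linear_interp(x, t, tau=0.21):
    """Interpolate samples x (taken at integers) at positions t."""
    n = x.size
    # Prefilter enforcing the interpolation condition
    # x[k] = (1 - tau) * c[k] + tau * c[k-1]
    c = np.empty(n)
    c[0] = x[0] / (1 - tau)
    for k in range(1, n):
        c[k] = (x[k] - tau * c[k - 1]) / (1 - tau)
    # Ordinary linear interpolation between the shifted nodes k + tau
    k = np.clip(np.floor(t - tau).astype(int), 0, n - 2)
    u = t - tau - k
    return (1 - u) * c[k] + u * c[k + 1]

x = np.sin(np.linspace(0, np.pi, 32))
t = np.array([5.0, 10.5, 20.25])
print(shifted_linear_interp(x, t))
```

The prefilter is what distinguishes this from plain linear interpolation: it redistributes the samples onto the shifted grid so that the scheme keeps the interpolation property while gaining the frequency response that motivates the 0.21 (or image-dependent) shift.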

Parameter optimization for SVM using dynamic encoding algorithm

  • Park, Young-Su;Lee, Young-Kow;Kim, Jong-Wook;Kim, Sang-Woo
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 2005.06a
    • /
    • pp.2542-2547
    • /
    • 2005
  • In this paper, we propose a method for optimizing the SVM hyperparameter and kernel parameter based on minimizing the radius/margin bound, an estimate of the leave-one-out error. The method uses the dynamic encoding algorithm for search (DEAS) together with gradient information for better optimization performance. DEAS is a recently proposed optimization algorithm based on a variable-length binary encoding. It requires less computation time than genetic algorithm (GA) based and grid-search based methods, finds global optima more reliably than gradient-based methods, and is very efficient in practical applications. Handwritten letter data of MNI steel are used to evaluate the performance.
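The DEAS optimizer itself is not reproduced here; this sketch shows the grid-search baseline the paper compares against, tuning the SVM hyperparameter C and the Gaussian-kernel parameter gamma by cross-validation on handwritten-digit data (a stand-in for the paper's handwritten-letter set). The parameter grid is an illustrative choice.

```python
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import load_digits

X, y = load_digits(return_X_y=True)

# Exhaustive grid search over (C, gamma): the costly baseline that
# DEAS-style optimizers aim to beat in computation time
grid = {"C": [0.1, 1, 10], "gamma": [1e-4, 1e-3, 1e-2]}
search = GridSearchCV(SVC(kernel="rbf"), grid, cv=3).fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Grid search scales exponentially with the number of hyperparameters, whereas DEAS refines a variable-length binary encoding of the search space, which is the source of the claimed speedup.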


NEW PRIMAL-DUAL INTERIOR POINT METHODS FOR P*(κ) LINEAR COMPLEMENTARITY PROBLEMS

  • Cho, Gyeong-Mi;Kim, Min-Kyung
    • Communications of the Korean Mathematical Society
    • /
    • v.25 no.4
    • /
    • pp.655-669
    • /
    • 2010
  • In this paper we propose new primal-dual interior point methods (IPMs) for $P_*(\kappa)$ linear complementarity problems (LCPs) and analyze the iteration complexity of the algorithm. New search directions and proximity measures are defined based on a class of kernel functions, $\psi(t)=\frac{t^2-1}{2}-\int_1^t e^{q(\frac{1}{\xi}-1)}\,d\xi$, $q \geq 1$. If a strictly feasible starting point is available and the parameter $q = \log\left(1 + a\sqrt{\frac{2\tau + 2\sqrt{2n\tau} + \theta n}{1-\theta}}\right)$, where $a = 1 + \frac{1}{\sqrt{1+2\kappa}}$, then the new large-update primal-dual interior point algorithms have $O\left((1+2\kappa)\sqrt{n}\,\log n\,\log\frac{n}{\varepsilon}\right)$ iteration complexity, which is the best known result for this method. For small-update methods, we have $O\left((1+2\kappa)q\sqrt{qn}\,\log\frac{n}{\varepsilon}\right)$ iteration complexity.
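The defining properties of a kernel function in this IPM setting are that $\psi(1)=0$ and $\psi'(1)=0$, with $\psi$ growing away from $t=1$. A quick numerical check of the kernel function above (with the integral evaluated by the trapezoid rule; q = 1 is an illustrative choice) confirms these properties:

```python
import numpy as np

def psi(t, q=1.0, steps=10_000):
    """psi(t) = (t^2 - 1)/2 - int_1^t exp(q*(1/xi - 1)) d(xi)."""
    xi = np.linspace(1.0, t, steps)
    integrand = np.exp(q * (1.0 / xi - 1.0))
    # Trapezoid rule; a descending xi (t < 1) gives the signed integral
    integral = np.sum((integrand[:-1] + integrand[1:]) * np.diff(xi)) / 2.0
    return (t * t - 1.0) / 2.0 - integral

for t in (0.5, 1.0, 2.0):
    print(t, psi(t))
```

Analytically, $\psi'(t) = t - e^{q(1/t - 1)}$, which vanishes at $t=1$, so $t=1$ (the point where an iterate sits exactly on the central path) is the unique minimizer; the proximity measure built from $\psi$ is therefore zero only on the central path.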

Modeling of Classifiers by Simple Kernel Update (단순한 커널 갱신을 통한 분류기의 설계)

  • Noh Yung-Kyun;Kim Cheong-Tag;Zhang Byoung-Tak
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2006.06a
    • /
    • pp.79-81
    • /
    • 2006
  • Kernel-based classification typically relies on the support vector machine (SVM) as a large-margin classifier. However, this approach involves optimizing Lagrange parameters, which makes training nontrivial. This optimization is a major obstacle when applying kernels to new computational models, such as DNA computing, whose results must be obtained through the design of simple procedures. In this paper, we design a classifier not through a large-margin optimization but through a simple kernel-update method that identifies the boundary between data of different labels. Applying this method with a Gaussian kernel, we show that the iterations progressively capture the structure of the data and eventually arrive at the optimized large-margin parameters. We also show that, when applied to the DNA kernel (a kernel-generation model that uses DNA molecules), this optimization method successfully classifies the well-known AML/ALL data.
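A minimal analogue of the simple-update idea is the kernel perceptron: the classifier is built by additive kernel updates on misclassified points, with no Lagrange optimization. This is an illustrative stand-in for the update rule described above, not the paper's exact DNA-kernel procedure; data and parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
X = np.vstack([rng.normal([-1, 0], 0.4, (n, 2)),
               rng.normal([1, 0], 0.4, (n, 2))])
y = np.r_[-np.ones(n), np.ones(n)]

def gauss(A, B, sigma=1.0):
    """Gaussian kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = gauss(X, X)
alpha = np.zeros(2 * n)
for _ in range(20):                      # epochs of simple updates
    for i in range(2 * n):
        if np.sign(K[i] @ (alpha * y)) != y[i]:
            alpha[i] += 1.0              # kernel update on a mistake

pred = np.sign(K @ (alpha * y))
print((pred == y).mean())
```

As in the abstract, each update only needs a kernel evaluation and a comparison, so the procedure is simple enough to map onto unconventional computing substrates, yet the iterates converge toward a boundary similar to the large-margin one on separable data.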


On Practical Choice of Smoothing Parameter in Nonparametric Classification (베이즈 리스크를 이용한 커널형 분류에서 평활모수의 선택)

  • Kim, Rae-Sang;Kang, Kee-Hoon
    • Communications for Statistical Applications and Methods
    • /
    • v.15 no.2
    • /
    • pp.283-292
    • /
    • 2008
  • The smoothing parameter, or bandwidth, plays a key role in nonparametric classification based on kernel density estimation. We consider choosing the smoothing parameter in nonparametric classification so as to optimize the Bayes risk. Hall and Kang (2005) clarified the theoretical properties of the smoothing parameter in terms of minimizing the Bayes risk and derived its optimal order, using the bootstrap to explore its numerical properties. We compare cross-validation and the bootstrap numerically in terms of the optimal order of the bandwidth, and also examine the effects on the misclassification rate. We confirm that the bootstrap method is superior to cross-validation in both respects.
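The classifier in question assigns a point to the class whose kernel density estimate is larger, with the bandwidth chosen to minimize an estimate of the misclassification risk. A sketch using a leave-one-out error over a small bandwidth grid (the bootstrap variant favored above is omitted for brevity; the two-Gaussian setup is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 150
x0 = rng.normal(0.0, 1.0, n)       # class 0 sample
x1 = rng.normal(2.0, 1.0, n)       # class 1 sample

def kde(t, data, h):
    """Gaussian kernel density estimate of `data` evaluated at points t."""
    z = (t - data[:, None]) / h
    return np.mean(np.exp(-0.5 * z ** 2), 0) / (h * np.sqrt(2 * np.pi))

def loo_error(h):
    err = 0
    for i in range(n):   # leave each point out of its own class density
        f0 = kde(np.array([x0[i]]), np.delete(x0, i), h)[0]
        f1 = kde(np.array([x0[i]]), x1, h)[0]
        err += f1 > f0                       # class-0 point misclassified
        g0 = kde(np.array([x1[i]]), x0, h)[0]
        g1 = kde(np.array([x1[i]]), np.delete(x1, i), h)[0]
        err += g0 > g1                       # class-1 point misclassified
    return err / (2 * n)

grid = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
errors = [loo_error(h) for h in grid]
h_cv = grid[int(np.argmin(errors))]
print(h_cv, min(errors))
```

The paper's point is that this risk estimate, and hence the selected bandwidth, can be improved by bootstrapping: resampling stabilizes the error curve, whose minimizer is otherwise quite variable.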