• Title/Summary/Keyword: penalty functions

Search results: 85

Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties

  • Park, Chong-Sun
    • Communications for Statistical Applications and Methods, v.14 no.1, pp.215-227, 2007
  • A variable selection algorithm for sliced inverse regression (SIR) using penalty functions is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. A small simulation study suggests that the HARD penalty function is the best at preserving the original directions compared with other well-known penalty functions. It also turned out to be effective at forcing coefficient estimates to zero for irrelevant predictors in regression analysis. Results from illustrative examples with simulated and real data sets are provided.
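As a concrete reference point for the comparisons above, the HARD (hard thresholding) penalty of Fan and Li (2001) can be sketched as follows. This is a minimal illustration of the penalty and the thresholding rule it induces in the simple orthonormal case, not the SIR algorithm of the paper.

```python
import numpy as np

def hard_penalty(theta, lam):
    """HARD thresholding penalty (see Fan and Li, 2001):
    p(theta) = lam^2 - (|theta| - lam)^2 for |theta| < lam,
    and the constant lam^2 otherwise, so large coefficients
    are not penalized further."""
    a = np.abs(theta)
    return lam**2 - (a - lam)**2 * (a < lam)

def hard_threshold(z, lam):
    """Estimator the penalty induces in the orthonormal case:
    keep z unchanged if |z| > lam, otherwise set it to zero."""
    z = np.asarray(z, dtype=float)
    return z * (np.abs(z) > lam)

coefs = np.array([0.05, -0.3, 1.2, -2.0])
print(hard_threshold(coefs, lam=0.5))  # entries below 0.5 in magnitude become zero
```

The flat tail of the penalty is what "preserves original directions": once a loading exceeds the threshold, enlarging it costs nothing extra.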

Variable Selection with Nonconcave Penalty Function on Reduced-Rank Regression

  • Jung, Sang Yong;Park, Chongsun
    • Communications for Statistical Applications and Methods, v.22 no.1, pp.41-54, 2015
  • In this article, we propose nonconcave penalties on a reduced-rank regression model to select variables and estimate coefficients simultaneously. We apply the HARD (hard thresholding) and SCAD (smoothly clipped absolute deviation) penalty functions, which are symmetric, singular at the origin, and bounded by a constant to reduce bias. In our simulation study and real data analysis, the new method is compared with an existing variable selection method using the $L_1$ penalty and exhibits competitive performance in prediction and variable selection. Instead of using only one type of penalty function, we use two or three penalty functions simultaneously, taking advantage of their different properties to select relevant predictors and improve the overall performance of model fitting.
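The SCAD penalty referred to above has a closed piecewise form. The sketch below (with the conventional choice a = 3.7) is offered only to illustrate why the penalty is bounded and bias-reducing, not as the paper's estimation code.

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001). A quadratic spline that is
    L1-like near zero, then flattens to the constant (a+1)*lam^2/2,
    so large coefficients incur no further penalty (reduced bias)."""
    t = np.abs(np.asarray(theta, dtype=float))
    small = t <= lam
    mid = (t > lam) & (t <= a * lam)
    return np.where(small, lam * t,
           np.where(mid, -(t**2 - 2*a*lam*t + lam**2) / (2*(a - 1)),
                    (a + 1) * lam**2 / 2))
```

The three pieces join continuously at t = lam and t = a*lam, which is what makes the penalty smooth enough for stable optimization while still being singular at the origin.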

Basic Studies on Development of Turn Penalty Functions in Signalized Intersections (신호교차로의 회전제약함수 개발을 위한 기초연구)

  • O, Sang-Jin;Kim, Tae-Yeong;Park, Byeong-Ho
    • Journal of Korean Society of Transportation, v.27 no.1, pp.157-167, 2009
  • This study deals with turn penalty functions in urban transportation demand forecasting. The objectives are to develop penalty functions for left-turn traffic at signalized intersections and to analyze the applicability of these functions to traffic assignment models. The motivation is that existing models cannot effectively account for the delays of left-turn traffic, which are larger than those of through traffic. To this end, the study focuses on developing penalty functions based on degrees of saturation obtained from Transyt-7F simulation results, and on analyzing the applicability of the functions through a case study of Cheongju. The major findings are as follows. First, the two penalty functions developed according to degrees of saturation are both evaluated to be statistically significant. Second, when the functions are applied to the Cheongju network, the assignment converges, though the number of iterations increases. Third, the link volumes forecasted with the turn penalty functions fit the observed data better than those from the existing models. Finally, the differences between the traffic volumes assigned by the two functional forms, exponential and divided, are very small.
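As an illustration only: the fitted functions from the paper are not reproduced here, but a turn-penalty function of the degree of saturation might enter an assignment model roughly as below. The exponential form and the coefficients a and b are hypothetical placeholders, not values estimated in the study.

```python
import math

def left_turn_penalty(saturation, a=1.0, b=2.0):
    """Hypothetical exponential turn-penalty function: added delay for a
    left-turn movement grows exponentially with the degree of saturation
    x = v/c. Coefficients a and b are illustrative only."""
    return a * math.exp(b * saturation)

def link_cost(free_flow_time, saturation, is_left_turn):
    """Generalized cost of traversing a link/turn during assignment:
    base travel time plus the turn penalty where applicable."""
    penalty = left_turn_penalty(saturation) if is_left_turn else 0.0
    return free_flow_time + penalty
```

In this scheme the left-turn cost rises sharply as the intersection approaches saturation, which is the behavior the existing models reportedly fail to capture.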

A Penalized Principal Component Analysis using Simulated Annealing

  • Park, Chongsun;Moon, Jong Hoon
    • Communications for Statistical Applications and Methods, v.10 no.3, pp.1025-1036, 2003
  • A variable selection algorithm for principal component analysis using penalty functions is proposed. We use the fact that the usual principal component problem can be expressed as a maximization problem with appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm is used to search for optimal solutions of the penalized problem. Comparisons among several well-known penalty functions through simulation reveal that the HARD penalty function is the best in several respects. Illustrations with real and simulated examples are provided.
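The search strategy described above can be sketched in a few lines: simulated-annealing maximization of a penalized variance objective over unit-norm loading vectors, here with a HARD-type penalty. This illustrates the general idea only, not the authors' exact algorithm or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_pen(w, lam):
    # HARD penalty applied coordinate-wise to the loading vector
    a = np.abs(w)
    return np.sum(lam**2 - (a - lam)**2 * (a < lam))

def penalized_pc(S, lam=0.3, iters=2000, temp=1.0, cool=0.995):
    """Simulated-annealing search for the first principal-component
    loading vector: maximize w' S w - penalty(w) over unit vectors w."""
    p = S.shape[0]
    w = rng.standard_normal(p)
    w /= np.linalg.norm(w)
    obj = lambda v: v @ S @ v - hard_pen(v, lam)
    cur, cur_val = w, obj(w)
    best, best_val = cur, cur_val
    for _ in range(iters):
        cand = cur + 0.1 * rng.standard_normal(p)   # random neighbor
        cand /= np.linalg.norm(cand)                # keep unit norm
        val = obj(cand)
        # always accept uphill moves; accept downhill with prob exp(dv/T)
        if val > cur_val or rng.random() < np.exp((val - cur_val) / temp):
            cur, cur_val = cand, val
            if val > best_val:
                best, best_val = cand, val
        temp *= cool                                # geometric cooling
    return best

S = np.diag([4.0, 1.0, 0.1])   # toy covariance matrix
w = penalized_pc(S)            # should align with the first coordinate
```

Annealing is attractive here because the penalized objective is non-smooth and non-concave, so gradient-based eigen-solvers no longer apply directly.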

Weighted Support Vector Machines with the SCAD Penalty

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.20 no.6, pp.481-490, 2013
  • Classification is an important research area, as data can now be obtained easily even when the number of predictors is huge. The support vector machine (SVM) is widely used to classify a subject into one of a set of predetermined groups because it has a sound theoretical background and outperforms other methods in many applications. The SVM can be viewed as a penalized method with the hinge loss function and a penalty function. As an alternative to the $L_2$ penalty function, Fan and Li (2001) proposed the smoothly clipped absolute deviation (SCAD) penalty, which satisfies good statistical properties. Despite their strengths, SVMs are not robust when there are outliers in the data. We develop a robust SVM method using a weight function together with the SCAD penalty function, based on the local quadratic approximation. We compare the performance of the proposed SVM with SVMs using the $L_1$ and $L_2$ penalty functions.
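The local quadratic approximation (LQA) that the method relies on can be sketched briefly: near a current estimate beta0, the SCAD penalty is replaced by a quadratic whose curvature is p'(|beta0|)/|beta0|, so each iteration reduces to a weighted ridge-type problem. This is a minimal sketch using the SCAD derivative of Fan and Li (2001); the robustness weight function of the paper is not reproduced here.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    """Derivative p'(t) of the SCAD penalty for t >= 0 (Fan and Li, 2001):
    constant lam on [0, lam], linearly decaying on (lam, a*lam], zero beyond."""
    t = np.asarray(t, dtype=float)
    return lam * ((t <= lam)
                  + np.clip(a*lam - t, 0, None) / ((a - 1)*lam) * (t > lam))

def lqa_weights(beta, lam, eps=1e-8):
    """Local quadratic approximation: the per-coefficient curvature
    p'(|beta0|)/|beta0| used as a ridge weight in the next iteration.
    Small coefficients get large weights and are shrunk toward zero."""
    b = np.abs(beta) + eps   # guard against division by zero
    return scad_deriv(b, lam) / b
```

Because the SCAD derivative vanishes beyond a*lam, large coefficients receive zero ridge weight, which is how the method avoids over-shrinking genuine signals.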

REGULARIZED PENALTY METHOD FOR NON-STATIONARY SET VALUED EQUILIBRIUM PROBLEMS IN BANACH SPACES

  • Salahuddin, Salahuddin
    • Korean Journal of Mathematics, v.25 no.2, pp.147-162, 2017
  • In this research work, we consider a general regularized penalty method for non-stationary set-valued equilibrium problems in a Banach space. We define weak coercivity conditions and establish the weak and strong convergence of the regularized penalty method.

Multiclass Support Vector Machines with SCAD

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, v.19 no.5, pp.655-662, 2012
  • Classification is an important research field in pattern recognition with high-dimensional predictors. The support vector machine (SVM) is a penalized feature selector and classifier. It is based on the hinge loss function and a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) suggested by Fan and Li (2001). We developed an algorithm for the multiclass SVM with the SCAD penalty function using the local quadratic approximation. For multiclass problems, we compared the performance of the developed method with SVMs using the $L_1$ and $L_2$ penalty functions.
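For context, a generic multiclass hinge loss (summing margins over the wrong classes) can be written as below. This is one common formulation, not necessarily the exact multiclass SVM loss used in the paper.

```python
import numpy as np

def multiclass_hinge(scores, y):
    """Sum-over-classes multiclass hinge loss: for each sample with true
    class y, every wrong class whose decision value comes within a unit
    margin of the true class's value contributes to the loss."""
    # scores: (n_samples, n_classes) decision values f_k(x)
    n = scores.shape[0]
    correct = scores[np.arange(n), y][:, None]       # true-class scores
    margins = np.maximum(0.0, scores - correct + 1.0)
    margins[np.arange(n), y] = 0.0                   # no self-margin
    return margins.sum() / n
```

In a penalized multiclass SVM, this loss would be minimized plus a SCAD (or $L_1$/$L_2$) penalty summed over all class-specific coefficient vectors.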

Real-coded Micro-Genetic Algorithm for Nonlinear Constrained Engineering Designs

  • Kim Yunyoung;Kim Byeong-Il;Shin Sung-Chul
    • Journal of Ship and Ocean Technology, v.9 no.4, pp.35-46, 2005
  • The performance of optimisation methods based on penalty functions is highly problem-dependent, and many methods require additional tuning of some variables. This tuning reflects the influence of the penalty coefficients, which depend strongly on the degree of constraint violation. Moreover, the binary-coded genetic algorithm (BGA) meets certain difficulties when dealing with continuous and/or discrete search spaces of large dimension. For these reasons, a real-coded micro-genetic algorithm (R$\mu$GA) is proposed to find the global optimum of continuous and/or discrete nonlinear constrained engineering problems without handling any penalty functions. R$\mu$GA can help avoid premature convergence and search the global solution space, owing to its wide applicability, global perspective, and inherent parallelism. The proposed R$\mu$GA approach has been demonstrated by solving three different engineering design problems. From the simulation results, it is concluded that R$\mu$GA is an effective global optimisation tool for solving continuous and/or discrete nonlinear constrained real-world optimisation problems.
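A real-coded micro-GA can be sketched as follows: a very small population is evolved, and whenever it converges it is restarted from random points while the best individual so far is kept. The operators below (tournament selection, BLX-style blend crossover) and all settings are illustrative choices, not the authors' exact R$\mu$GA.

```python
import numpy as np

rng = np.random.default_rng(1)

def micro_ga(f, bounds, pop_size=5, generations=200):
    """Real-coded micro-genetic algorithm sketch: minimize f over box
    bounds with a tiny population; restart from random points (keeping
    the elite) whenever the population collapses to near-identity."""
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    best = min(pop, key=f)
    for _ in range(generations):
        if np.ptp(pop, axis=0).max() < 1e-3 * (hi - lo).max():
            pop = rng.uniform(lo, hi, (pop_size, dim))  # restart
            pop[0] = best                               # elitism
        children = []
        for _ in range(pop_size):
            a, b = pop[rng.integers(pop_size)], pop[rng.integers(pop_size)]
            p1 = a if f(a) < f(b) else b                # tournament pick 1
            a, b = pop[rng.integers(pop_size)], pop[rng.integers(pop_size)]
            p2 = a if f(a) < f(b) else b                # tournament pick 2
            u = rng.uniform(-0.5, 1.5, dim)             # BLX-style blend
            children.append(np.clip(p1 + u * (p2 - p1), lo, hi))
        pop = np.array(children)
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand
    return best

sphere = lambda x: float(np.sum(x**2))
x = micro_ga(sphere, [(-5, 5), (-5, 5)])   # unconstrained toy objective
```

Constraints would be handled separately (e.g. by repair or feasibility rules) rather than by a penalty term, which is the point of the approach.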

Edge-Preserving Iterative Reconstruction in Transmission Tomography Using Space-Variant Smoothing (투과 단층촬영에서 공간가변 평활화를 사용한 경계보존 반복연산 재구성)

  • Jung, Ji Eun;Ren, Xue;Lee, Soo-Jin
    • Journal of Biomedical Engineering Research, v.38 no.5, pp.219-226, 2017
  • Penalized-likelihood (PL) reconstruction methods for transmission tomography are known to provide improved image quality at reduced dose levels by efficiently smoothing out noise while preserving edges. Unfortunately, however, most of the edge-preserving penalty functions used in conventional PL methods contain at least one free parameter that controls the shape of a non-quadratic penalty function to adjust the sensitivity of edge preservation. In this work, to avoid the difficulty of finding a proper value for the free parameter involved in a non-quadratic penalty function, we propose a new adaptive method of space-variant smoothing with a simple quadratic penalty function. In this method, the smoothing parameter is adaptively selected for each pixel location at each iteration, using the image roughness measured by a pixel-wise standard deviation image calculated from the previous iteration. The experimental results demonstrate that our new method not only preserves edges but also suppresses noise well in monotonic regions, without requiring additional processes to select the free parameters that might otherwise be included in a non-quadratic penalty function.
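The adaptive selection step can be illustrated with a short sketch: compute a pixel-wise standard deviation image over a small neighborhood, then set a per-pixel smoothing parameter that is large in flat regions and small near edges. The inverse-roughness rule used below is an illustrative assumption, not the paper's exact formula.

```python
import numpy as np

def local_std(img, k=3):
    """Pixel-wise standard deviation over a k x k neighborhood,
    via a box filter and the identity var = E[x^2] - (E[x])^2."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode='edge')
    # stack all k*k shifted copies of the image and average them
    win = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                    for i in range(k) for j in range(k)])
    mean = win.mean(axis=0)
    var = (win**2).mean(axis=0) - mean**2
    return np.sqrt(np.clip(var, 0, None))   # clip float round-off

def adaptive_beta(img, beta0=1.0, eps=1e-6):
    """Space-variant smoothing parameter: smooth strongly where the
    previous iterate is flat (small local std), weakly near edges
    (large local std). Hypothetical inverse-roughness rule."""
    return beta0 / (local_std(img) + eps)
```

The resulting beta map would multiply a simple quadratic roughness penalty pixel by pixel inside the PL objective, replacing the hard-to-tune shape parameter of a non-quadratic penalty.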

Variable Selection in PLS Regression with Penalty Function (벌점함수를 이용한 부분최소제곱 회귀모형에서의 변수선택)

  • Park, Chong-Sun;Moon, Guy-Jong
    • Communications for Statistical Applications and Methods, v.15 no.4, pp.633-642, 2008
  • A variable selection algorithm for partial least squares regression using penalty functions is proposed. We use the fact that the usual partial least squares regression problem can be expressed as a maximization problem with appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm can then be used to search for optimal solutions of the penalized maximization problem. The HARD penalty function is suggested as the best in several respects. Illustrations with real and simulated examples are provided.
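The penalized maximization the algorithm searches over can be sketched as an objective function: the squared sample covariance between the PLS score Xw and the response, minus a HARD-type penalty on the loadings. This is a minimal sketch of the formulation only; the simulated-annealing search over unit vectors would proceed as in the penalized PCA case.

```python
import numpy as np

def penalized_pls_objective(w, X, y, lam=0.3):
    """Objective for the first penalized PLS direction: squared sample
    covariance between the score X w and the response y, minus a
    HARD-type penalty on the loadings. The unit-norm constraint is
    handled by normalizing w before evaluation."""
    w = np.asarray(w, dtype=float)
    w = w / np.linalg.norm(w)
    Xc = X - X.mean(axis=0)              # center predictors
    yc = y - y.mean()                    # center response
    cov = (Xc @ w) @ yc / (len(y) - 1)   # sample covariance of score and y
    a = np.abs(w)
    hard = np.sum(lam**2 - (a - lam)**2 * (a < lam))
    return cov**2 - hard
```

A direction loading on an irrelevant predictor gains no covariance but still pays the penalty, so the search is pushed toward sparse loading vectors.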