• Title/Summary/Keyword: Penalty function

Search results: 294 items (processing time 0.024 s)

A Penalized Principal Component Analysis using Simulated Annealing

  • Park, Chongsun; Moon, Jong Hoon
    • Communications for Statistical Applications and Methods, Vol. 10, No. 3, pp. 1025-1036, 2003
  • A variable selection algorithm for principal component analysis using a penalty function is proposed. We exploit the fact that the usual principal component problem can be expressed as a maximization problem with appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm is used to search for optimal solutions of the penalized problem. Comparisons among several well-known penalty functions through simulation reveal that the HARD penalty function is the best in several respects. Illustrations with real and simulated examples are provided.
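The HARD thresholding penalty favored in this comparison has a simple closed form; a minimal sketch follows (the NumPy interface and the threshold name `lam` are illustrative, not the paper's code):

```python
import numpy as np

def hard_penalty(theta, lam):
    """HARD thresholding penalty: quadratic near zero (shrinking small
    loadings toward zero) and flat at lam**2 beyond lam (leaving large
    loadings unshrunk, which reduces bias)."""
    a = np.abs(np.asarray(theta, dtype=float))
    return np.where(a < lam, lam**2 - (a - lam)**2, lam**2)
```

Because the penalty is constant past the threshold, adding it to the constrained variance-maximization objective leaves large loadings untouched while pushing small ones to zero, which is what makes it attractive for variable selection.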

SMOOTHING APPROXIMATION TO l1 EXACT PENALTY FUNCTION FOR CONSTRAINED OPTIMIZATION PROBLEMS

  • BINH, NGUYEN THANH
    • Journal of applied mathematics & informatics, Vol. 33, No. 3-4, pp. 387-399, 2015
  • In this paper, a new smoothing approximation to the l1 exact penalty function for constrained optimization problems (COP) is presented. It is shown that an optimal solution to the smoothed penalty problem is an approximate optimal solution to the original optimization problem. Based on the smoothing penalty function, an algorithm for solving COP is presented, and its convergence is proved under some conditions. Numerical examples illustrate that the algorithm is efficient in solving COP.
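To illustrate the idea (not the paper's specific smoothing function), a common construction replaces the kink of max(0, t) in the l1 exact penalty with a softplus-type smooth term; `eps` and `sigma` below are illustrative parameters:

```python
import numpy as np

def smoothed_plus(t, eps=1e-2):
    # smooth stand-in for max(0, t): eps * log(1 + exp(t / eps));
    # logaddexp avoids overflow when t / eps is large
    return eps * np.logaddexp(0.0, t / eps)

def smoothed_l1_objective(f, gs, sigma, eps=1e-2):
    # penalized objective for: minimize f(x) subject to g_i(x) <= 0
    return lambda x: f(x) + sigma * sum(smoothed_plus(g(x), eps) for g in gs)
```

As eps shrinks to zero the smoothed objective approaches the nonsmooth l1 exact penalty function, which is the sense in which minimizers of the smoothed problem approximate minimizers of the original one.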

AN EXACT LOGARITHMIC-EXPONENTIAL MULTIPLIER PENALTY FUNCTION

  • Lian, Shu-jun
    • Journal of applied mathematics & informatics, Vol. 28, No. 5-6, pp. 1477-1487, 2010
  • In this paper, we give a solution approach for the constrained minimization problem based on a logarithmic-exponential multiplier penalty function. The penalty is proved exact in the sense that the local optimizers of the nonlinear problem are precisely the local optimizers of the logarithmic-exponential multiplier penalty problem.

Weighted Support Vector Machines with the SCAD Penalty

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods, Vol. 20, No. 6, pp. 481-490, 2013
  • Classification is an important research area, as data can be obtained easily even when the number of predictors is huge. The support vector machine (SVM) is widely used to classify a subject into a predetermined group because it has a sound theoretical background and performs better than other methods in many applications. The SVM can be viewed as a penalized method with the hinge loss and a penalty function. Instead of the $L_2$ penalty, Fan and Li (2001) proposed the smoothly clipped absolute deviation (SCAD) penalty, which satisfies good statistical properties. Despite their strengths, SVMs are not robust when there are outliers in the data. We develop a robust SVM method that combines a weight function with the SCAD penalty, based on the local quadratic approximation. We compare the performance of the proposed SVM with SVMs using the $L_1$ and $L_2$ penalty functions.
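The SCAD penalty referenced here has a standard piecewise form with the conventional a = 3.7 of Fan and Li; a sketch of the penalty alone, independent of the paper's weighted-SVM estimator:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """SCAD penalty: L1-like near zero, quadratic blend in the middle,
    and constant (bounded) for large coefficients to reduce bias."""
    t = np.abs(np.asarray(theta, dtype=float))
    lin = lam * t                                       # |t| <= lam
    quad = -(t**2 - 2*a*lam*t + lam**2) / (2*(a - 1))   # lam < |t| <= a*lam
    flat = (a + 1) * lam**2 / 2                         # |t| > a*lam
    return np.where(t <= lam, lin, np.where(t <= a * lam, quad, flat))
```

The three pieces join continuously at lam and a*lam, and the flat tail is what bounds the penalty by a constant, giving nearly unbiased estimates of large coefficients.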

An Optimization Technique for Crane Acceleration Using a Genetic Algorithm

  • 박창권; 김재량; 정원지; 홍대선; 권장렬; 박범석
    • Korean Society for Precision Engineering: Proceedings of the 2003 Spring Conference, pp. 1701-1704, 2003
  • This paper presents a new technique for optimizing the acceleration curve of a wafer-transfer crane, for which high speed and low vibration are desirable. The technique is based on a genetic algorithm with a penalty function, under the assumption that an initial profile of acceleration curves constitutes the first generation of the genetic algorithm. The penalty function consists of the magnitude of constraint violations and the number of violated constraints. The proposed penalty function makes the optimization converge faster than a genetic algorithm without one. The acceleration curve optimized with the genetic algorithm and commercial dynamic analysis software is shown to give accurate movement and low vibration.


Optimal Scheduling Algorithm for Minimizing the Quadratic Penalty Function of Completion Times

  • 노인규; 이정환
    • Journal of the Society of Korea Industrial and Systems Engineering, Vol. 13, No. 22, pp. 35-42, 1990
  • This paper deals with a single-machine scheduling problem with a quadratic penalty function of completion times. The objective is to find an optimal sequence that minimizes the total penalty. A new type of node-elimination procedure and a precedence relation are developed that determine the ordering between adjacent jobs, and they are incorporated into a branch-and-bound algorithm. In addition, a modified penalty function is considered, and numerical examples are provided to test the effectiveness of the algorithm.
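The objective being minimized can be written down directly; the job names, processing times, and unit weights below are illustrative:

```python
def total_quadratic_penalty(sequence, proc_time, weight=None):
    """Total penalty sum_j w_j * C_j**2 for a single machine, where
    C_j is the completion time of job j under the given sequence."""
    weight = weight or {j: 1.0 for j in sequence}
    t = 0.0
    total = 0.0
    for j in sequence:
        t += proc_time[j]          # C_j accumulates processing times
        total += weight[j] * t ** 2
    return total
```

Squaring the completion times penalizes late finishers much more heavily than a linear flow-time objective, which is why simple dispatching rules are no longer optimal and a branch-and-bound search is needed.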


Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties

  • Park, Chong-Sun
    • Communications for Statistical Applications and Methods, Vol. 14, No. 1, pp. 215-227, 2007
  • A variable selection algorithm for sliced inverse regression (SIR) using a penalty function is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. A small simulation shows that the HARD penalty function is the best at preserving the original directions compared with other well-known penalty functions. It also turns out to be effective in forcing coefficient estimates to zero for irrelevant predictors in regression analysis. Results from illustrative examples with simulated and real data sets are provided.
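In the generalized-eigenvalue formulation, a direction b maximizes b'Mb subject to b'Σb = 1, and the penalty is subtracted from that objective; a schematic sketch (the matrices and the `penalty` callable are placeholders, not the paper's estimator):

```python
import numpy as np

def penalized_sir_objective(b, M, Sigma, lam, penalty):
    """Penalized generalized-eigenvalue objective: rescale b so that
    b' Sigma b = 1, then subtract a coordinate-wise penalty so that
    directions with many small coefficients are discouraged."""
    b = np.asarray(b, dtype=float)
    b = b / np.sqrt(b @ Sigma @ b)      # enforce the normalization constraint
    return b @ M @ b - sum(penalty(abs(bi), lam) for bi in b)
```

With a bounded penalty such as HARD, large coordinates of the direction are left essentially unpenalized, so the estimated direction stays close to the unpenalized SIR direction while irrelevant coordinates are driven to zero.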

Variable Selection with Nonconcave Penalty Function on Reduced-Rank Regression

  • Jung, Sang Yong; Park, Chongsun
    • Communications for Statistical Applications and Methods, Vol. 22, No. 1, pp. 41-54, 2015
  • In this article, we propose nonconcave penalties on a reduced-rank regression model to select variables and estimate coefficients simultaneously. We apply the HARD (hard thresholding) and SCAD (smoothly clipped absolute deviation) penalty functions, which are symmetric, singular at the origin, and bounded by a constant to reduce bias. In a simulation study and a real data analysis, the new method is compared with an existing variable selection method using the $L_1$ penalty and exhibits competitive performance in prediction and variable selection. Instead of using only one type of penalty function, we use two or three penalty functions simultaneously, taking advantage of their different strengths to select relevant predictors and improve the overall performance of model fitting.

Edge-Preserving Iterative Reconstruction in Transmission Tomography Using Space-Variant Smoothing

  • 정지은; 이수진
    • Korean Society of Medical and Biological Engineering: Journal of Biomedical Engineering Research, Vol. 38, No. 5, pp. 219-226, 2017
  • Penalized-likelihood (PL) reconstruction methods for transmission tomography are known to provide improved image quality at reduced dose levels by efficiently smoothing out noise while preserving edges. Unfortunately, most edge-preserving penalty functions used in conventional PL methods contain at least one free parameter that controls the shape of a non-quadratic penalty to adjust the sensitivity of edge preservation. In this work, to avoid the difficulty of finding a proper value for the free parameter in a non-quadratic penalty function, we propose a new adaptive method of space-variant smoothing with a simple quadratic penalty function. In this method, the smoothing parameter is selected adaptively for each pixel location at each iteration, using the image roughness measured by a pixel-wise standard deviation image calculated from the previous iteration. The experimental results demonstrate that the new method not only preserves edges but also suppresses noise well in monotonic regions, without requiring additional processes to select the free parameters that would otherwise be included in a non-quadratic penalty function.
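The roughness-driven selection can be sketched as follows; the window radius and the specific mapping from roughness to the per-pixel smoothing weight are illustrative assumptions (the paper defines its own rule):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_std(img, r=1):
    """Pixel-wise standard deviation over a (2r+1) x (2r+1) neighborhood
    of the previous iterate, used as an image-roughness measure."""
    pad = np.pad(img, r, mode="edge")
    win = sliding_window_view(pad, (2 * r + 1, 2 * r + 1))
    return win.std(axis=(-1, -2))

def adaptive_beta(prev_img, beta0=1.0):
    """Shrink the quadratic-penalty weight where roughness is high, so
    likely edges are smoothed less than flat regions."""
    s = local_std(prev_img)
    return beta0 / (1.0 + s / (s.mean() + 1e-12))
```

Because the weight map is recomputed from the current iterate at each iteration, the quadratic penalty behaves like an edge-preserving one without introducing any shape parameter to tune.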

Variable Selection in PLS Regression with Penalty Function

  • 박종선; 문규종
    • Communications for Statistical Applications and Methods, Vol. 15, No. 4, pp. 633-642, 2008
  • In this paper, we consider the problem of selecting the explanatory variables needed in a partial least squares (PLS) regression model, which is widely used when there are one or more response variables and the number of explanatory variables is large relative to the number of observations, by applying a penalty function. The explanatory variables are selected by adding a penalty function to the optimization problem for each latent variable and then searching with simulated annealing. Application to real data shows that the method effectively removes unnecessary variables without substantially reducing the explanatory and predictive power of the model, so it can be applied to select an optimal subset of explanatory variables in PLS regression.
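A simulated-annealing search over variable subsets, as used here and in the penalized PCA paper above, can be sketched generically; the score interface, cooling schedule, and parameters below are illustrative, not the paper's implementation:

```python
import math
import random

def anneal_select(score, p, T0=1.0, cool=0.95, steps=200, seed=0):
    """Search over 0/1 inclusion vectors of length p, maximizing the
    penalized objective 'score' (a hypothetical callable).  Worse moves
    are accepted with probability exp(delta / T), and T cools geometrically."""
    rng = random.Random(seed)
    x = [1] * p                         # start with all variables included
    best, best_s = x[:], score(x)
    cur_s, T = best_s, T0
    for _ in range(steps):
        y = x[:]
        y[rng.randrange(p)] ^= 1        # flip one variable in or out
        s = score(y)
        if s > cur_s or rng.random() < math.exp((s - cur_s) / T):
            x, cur_s = y, s
            if s > best_s:
                best, best_s = y[:], s
        T *= cool
    return best
```

The penalty enters through `score`: a fit measure for the latent variables minus a penalty on the number (or size) of included coefficients, so the annealer trades explanatory power against model size as it explores subsets.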