• Title/Abstract/Keyword: $F_{\infty}$-norm penalty


On the Use of Adaptive Weights for the F-Norm Support Vector Machine

  • Bang, Sung-Wan; Jhun, Myoung-Shic
    • Korean Journal of Applied Statistics (응용통계연구), Vol. 25, No. 5, pp. 829-835, 2012
  • When the input features are generated by factors in a classification problem, it is more meaningful to identify important factors rather than individual features. The $F_{\infty}$-norm support vector machine (SVM) has been developed to perform automatic factor selection in classification. However, the $F_{\infty}$-norm SVM may suffer from estimation inefficiency and model selection inconsistency because it applies the same amount of shrinkage to each factor without assessing its relative importance. To overcome this limitation, we propose the adaptive $F_{\infty}$-norm ($AF_{\infty}$-norm) SVM, which penalizes the empirical hinge loss by the sum of adaptively weighted factor-wise $L_{\infty}$-norm penalties. The $AF_{\infty}$-norm SVM computes the weights from the 2-norm SVM estimator and can be formulated as a linear programming (LP) problem similar to that of the $F_{\infty}$-norm SVM. Simulation studies show that the proposed $AF_{\infty}$-norm SVM improves upon the $F_{\infty}$-norm SVM in terms of classification accuracy and factor selection performance.
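  As a sketch of the formulation the abstract describes (the tuning parameter $\lambda$, the group index notation, and the reciprocal form of the adaptive weights are illustrative assumptions, not taken verbatim from the paper), the $AF_{\infty}$-norm SVM for factors $g = 1, \dots, G$ can be written as

  $$\min_{\beta_0,\,\beta}\ \sum_{i=1}^{n}\bigl[1 - y_i(\beta_0 + x_i^{\top}\beta)\bigr]_{+} \;+\; \lambda \sum_{g=1}^{G} w_g \max_{j \in g} |\beta_j|,$$

  where the weights $w_g$ are computed from a 2-norm SVM fit $\tilde{\beta}$, e.g. $w_g = 1/\max_{j \in g}|\tilde{\beta}_j|$, and setting every $w_g = 1$ recovers the ordinary $F_{\infty}$-norm SVM. Because each $\max_{j \in g}|\beta_j|$ can be replaced by an auxiliary variable $t_g$ with linear constraints $-t_g \le \beta_j \le t_g$, and the hinge loss by slack variables $\xi_i \ge 0$, $\xi_i \ge 1 - y_i(\beta_0 + x_i^{\top}\beta)$, the whole problem reduces to the linear program mentioned in the abstract.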