Title/Summary/Keyword: Pruning ensemble


Comparison of ensemble pruning methods using Lasso-bagging and WAVE-bagging (분류 앙상블 모형에서 Lasso-bagging과 WAVE-bagging 가지치기 방법의 성능비교)

  • Kwak, Seungwoo; Kim, Hyunjoong
    • Journal of the Korean Data and Information Science Society, v.25 no.6, pp.1371-1383, 2014
  • Classification ensembles combine diverse classifiers to enhance classification accuracy. An ensemble method is known to be successful when the classifiers participating in it are accurate and diverse; in practice, however, an ensemble commonly includes less accurate and similar classifiers alongside accurate and diverse ones. Ensemble pruning methods construct an ensemble by choosing only the accurate and diverse classifiers. In this article, we proposed an ensemble pruning method called WAVE-bagging and compared its results with those of the existing pruning method called Lasso-bagging. Through an extensive empirical comparison on 26 real datasets, we showed that WAVE-bagging performs better than Lasso-bagging.
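
To make the Lasso-pruning idea concrete, here is a minimal sketch, not the paper's exact procedure (in particular, WAVE-bagging's weight-adjusted voting is not reproduced): fit an L1-penalized regression of held-out labels on each bagged member's predictions and keep only the members whose coefficients survive the penalty. The synthetic dataset, the penalty value alpha, and the +/-1 coding are illustrative assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

    # Bagged ensemble of 100 decision trees.
    bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                            random_state=0).fit(X_tr, y_tr)

    # Columns = each member's held-out predictions, coded as +/-1.
    P = np.column_stack([2 * t.predict(X_val) - 1 for t in bag.estimators_])

    # L1-penalized fit: members whose coefficients shrink to zero are pruned.
    lasso = Lasso(alpha=0.01).fit(P, 2 * y_val - 1)  # alpha chosen for illustration
    kept = np.flatnonzero(lasso.coef_)
    print(f"kept {kept.size} of {len(bag.estimators_)} members")

    # Pruned ensemble prediction: majority vote of the surviving members.
    votes = np.column_stack([bag.estimators_[i].predict(X_val) for i in kept])
    pruned = (votes.mean(axis=1) >= 0.5).astype(int)
    print("pruned accuracy:", (pruned == y_val).mean())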

Pruning the Boosting Ensemble of Decision Trees

  • Yoon, Young-Joo; Song, Moon-Sup
    • Communications for Statistical Applications and Methods, v.13 no.2, pp.449-466, 2006
  • We propose variable selection methods based on penalized regression for pruning decision tree ensembles. Pruning methods based on LASSO and SCAD are compared with the cluster pruning method in comparative studies on artificial and real datasets. According to the results, the proposed penalized-regression methods reduce the size of boosting ensembles without significantly decreasing accuracy, and they outperform the cluster pruning method. In the presence of classification noise, the proposed pruning methods can mitigate the weakness of AdaBoost to some degree.
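
The same penalized-regression recipe can be sketched for a boosted ensemble. SCAD has no scikit-learn implementation, so LASSO stands in for the penalty below; the penalty grid, the injected 10% label noise, and the coefficient-weighted vote are illustrative assumptions, not the paper's exact setup.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import train_test_split

    # flip_y injects roughly 10% label noise, the setting where AdaBoost is weakest.
    X, y = make_classification(n_samples=600, n_features=20, flip_y=0.10,
                               random_state=1)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=1)

    boost = AdaBoostClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
    P = np.column_stack([2 * w.predict(X_val) - 1 for w in boost.estimators_])

    for alpha in (0.001, 0.01, 0.05):                # illustrative penalty grid
        coef = Lasso(alpha=alpha).fit(P, 2 * y_val - 1).coef_
        kept = np.flatnonzero(coef)
        # Vote weighted by the fitted coefficients; zeroed members drop out.
        pred = (P[:, kept] @ coef[kept] >= 0).astype(int)
        print(f"alpha={alpha}: {kept.size}/200 members kept, "
              f"accuracy={(pred == y_val).mean():.3f}")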

Sparsity Increases Uncertainty Estimation in Deep Ensemble

  • Dorjsembe, Uyanga; Lee, Ju Hong; Choi, Bumghi; Song, Jae Won
    • Proceedings of the Korea Information Processing Society Conference, 2021.05a, pp.373-376, 2021
  • Deep neural networks have achieved almost human-level results on various tasks and have become popular across broad artificial intelligence domains. Uncertainty estimation is an in-demand task arising from the black-box, point-estimate behavior of deep learning. A deep ensemble provides increased accuracy together with an uncertainty estimate; however, its linearly growing size makes it infeasible for memory-intensive tasks. To address this problem, we applied model pruning and quantization to a deep ensemble and analyzed the effect in terms of uncertainty metrics. We empirically showed that pruning, which makes models sparser by zeroing irrelevant parameters, increases the ensemble members' disagreement. Increased disagreement implies increased uncertainty, which helps in making more robust predictions. Accordingly, an energy-efficient compressed deep ensemble is appropriate for memory-intensive, uncertainty-aware tasks.
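
The disagreement the authors measure can be illustrated with the standard uncertainty decomposition for ensembles: the entropy of the averaged prediction minus the mean per-member entropy (a mutual information) grows as members disagree. In this small numpy sketch, Dirichlet draws stand in for the softmax outputs of real (pruned) networks.

    import numpy as np

    def disagreement(probs):
        """probs: array of shape (members, samples, classes) of softmax outputs."""
        mean_p = probs.mean(axis=0)                                # ensemble prediction
        total = -(mean_p * np.log(mean_p + 1e-12)).sum(-1)         # predictive entropy
        member = -(probs * np.log(probs + 1e-12)).sum(-1).mean(0)  # mean member entropy
        return total - member                                      # mutual information

    rng = np.random.default_rng(0)
    agreeing    = rng.dirichlet([20, 1, 1], size=(5, 4))  # 5 members, concentrated
    disagreeing = rng.dirichlet([1, 1, 1],  size=(5, 4))  # 5 members, near-random

    for name, p in [("agreeing", agreeing), ("disagreeing", disagreeing)]:
        print(f"{name}: mean disagreement = {disagreement(p).mean():.3f}")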

Review on Genetic Algorithms for Pattern Recognition (패턴 인식을 위한 유전 알고리즘의 개관)

  • Oh, Il-Seok
    • The Journal of the Korea Contents Association, v.7 no.1, pp.58-64, 2007
  • In the pattern recognition field, there are many optimization problems with exponential search spaces. To solve them, sequential search algorithms seeking sub-optimal solutions have been used, but these algorithms have the limitation of stopping at local optima. Recently, many studies have attempted to solve such problems using genetic algorithms. This paper explains the huge search spaces of typical problems such as feature selection, classifier ensemble selection, neural network pruning, and clustering, and reviews genetic algorithms for solving them. Additionally, we present several topics worth noting for future research.
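
As a concrete instance of one surveyed problem, here is a toy genetic algorithm for classifier ensemble selection: a chromosome is a bit mask over the ensemble members, and fitness is the majority-vote accuracy of the selected subset on validation data. The simulated member predictions, population size, and mutation rate are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_members, n_val = 30, 200
    y_val = rng.integers(0, 2, n_val)
    # Simulated member predictions: each member agrees with the true label
    # with a random per-member accuracy between 55% and 80%.
    acc = rng.uniform(0.55, 0.80, n_members)
    preds = np.where(rng.random((n_members, n_val)) < acc[:, None],
                     y_val, 1 - y_val)

    def fitness(mask):
        if not mask.any():
            return 0.0
        vote = preds[mask].mean(axis=0) >= 0.5   # majority vote of selected members
        return (vote == y_val).mean()

    pop = rng.random((40, n_members)) < 0.5      # random initial population
    for gen in range(50):
        fit = np.array([fitness(m) for m in pop])
        pop = pop[np.argsort(fit)[::-1][:20]]    # truncation selection
        parents = pop[rng.integers(0, 20, (20, 2))]
        cross = rng.random((20, n_members)) < 0.5          # uniform crossover
        children = np.where(cross, parents[:, 0], parents[:, 1])
        children ^= rng.random((20, n_members)) < 0.02     # bit-flip mutation
        pop = np.vstack([pop, children])

    best = max(pop, key=fitness)
    print(f"selected {best.sum()} of {n_members} members, "
          f"accuracy={fitness(best):.3f}")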

A Standard Rose Cultivar 'Love Letter' with Thornless Stems and Red Colored Petals for Cut Flowers (무가시성 적색 절화장미 '러브레터' 육성)

  • Lee, Young Soon; Jung, Yun Kyung; Park, Mi Ok; Lim, Jae Wook
    • Horticultural Science & Technology, v.32 no.2, pp.269-275, 2014
  • A standard rose cultivar, 'Love Letter', was selected for cut flowers from the progenies of a cross between 'Red Giant' and 'Ensemble' by the rose breeding team of the Gyeonggi-do Agricultural Research & Extension Services (GARES) in 2011. The standard cultivar 'Red Giant', with red (RHS Red Group 45C) petals and 1.8 prickles per 10 cm of stem, was used as the mother plant. The standard cultivar 'Ensemble', with white and red-purple (RHS White Group 155C + RHS Red Group N57B) petals and 2.4 prickles per 10 cm of stem, was used as the pollen parent. 'Love Letter' was crossed in 2007 and seedlings were produced. After tests of specific characteristics from 2008 to 2011, this cultivar was finally selected and named. A standard type with large flowers, it has red (RHS Red Group 46A) petals, a flower diameter of 9.3 cm, and 32.4 petals per flower. Its vase life can be as long as 12 days. It takes 43 days from pruning to blooming, and cut flower productivity was 152 stems/m² per year. The cut flower stems have no thorns and are about 70.5 cm long. When the cultivar was grown below 15°C and 1,000 lux in the winter season, the petal color became dark red. 'Love Letter' was registered as a new cultivar with the Korea Seed & Variety Service (KSVS) under No. 4482 on May 8, 2013.