• Title/Abstract/Keyword: Ensemble-based algorithm

Search results: 138

Time Series Forecasting Based on Modified Ensemble Algorithm

  • 김연형;김재훈
    • 응용통계연구 / Vol. 18, No. 1 / pp. 137-146 / 2005
  • Neural networks have generally been shown to outperform traditional time series techniques in forecasting, but they have the drawback of weaker forecasting performance on time series with seasonality and trend. Recently, the Bagging Neural Network, a hybrid of the ensemble technique Bagging Algorithm and neural networks, was developed; it has been shown to produce better forecasts by substantially reducing variance and bias. However, forecasting models based on ensemble techniques still have problems with the initially assigned probabilities and with predictor selection when fitting time series data. To resolve these problems and further improve forecasting performance, this study proposes a modified Ensemble Algorithm that uses a sequential, rather than uniform, distribution for the initially assigned probabilities and employs neural networks as predictors. To evaluate the forecasting models, we forecast real data with both existing models and the proposed method and compare their forecasting accuracy in terms of MSE.
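
The abstract does not specify the exact sequential distribution; a minimal sketch of the general idea, assuming linearly increasing resampling weights so that more recent observations are drawn more often, might look like this (illustrative only, not the authors' formulation):

```python
import numpy as np

def sequential_bagging_indices(n_samples, n_bags, rng=None):
    """Draw bootstrap samples whose initial probabilities increase
    sequentially (rather than being uniform), so that later, more
    recent observations of a time series are resampled more often."""
    rng = np.random.default_rng(rng)
    p = np.arange(1, n_samples + 1, dtype=float)  # assumed linear weighting
    p /= p.sum()                                  # normalise to a distribution
    return [rng.choice(n_samples, size=n_samples, replace=True, p=p)
            for _ in range(n_bags)]
```

Each index set would then be used to fit one neural-network predictor, with the bag forecasts averaged as in ordinary bagging.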

Optimizing SVM Ensembles Using Genetic Algorithms in Bankruptcy Prediction

  • Kim, Myoung-Jong;Kim, Hong-Bae;Kang, Dae-Ki
    • Journal of information and communication convergence engineering / Vol. 8, No. 4 / pp. 370-376 / 2010
  • Ensemble learning is a method for improving the performance of classification and prediction algorithms. However, its performance can be degraded by the multicollinearity problem, in which multiple classifiers of an ensemble are highly correlated with one another. This paper proposes genetic algorithm-based optimization techniques for SVM ensembles to solve the multicollinearity problem. Empirical results on bankruptcy prediction for Korean firms indicate that the proposed optimization techniques can improve the performance of the SVM ensemble.
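
As a rough illustration of genetic-algorithm member selection (not the paper's exact encoding or fitness function), the sketch below searches for a subset of already-trained classifiers whose majority vote maximises validation accuracy; `val_preds` (per-classifier class predictions) and integer labels `y_val` are assumed inputs:

```python
import numpy as np

def ga_select_members(val_preds, y_val, pop_size=20, generations=30,
                      mutation_rate=0.05, rng=None):
    """GA search over binary masks selecting ensemble members.
    val_preds: (n_classifiers, n_samples) array of integer class predictions."""
    rng = np.random.default_rng(rng)
    n_clf = val_preds.shape[0]

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        votes = val_preds[mask.astype(bool)]
        # majority vote over the selected classifiers (non-negative int labels assumed)
        maj = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
        return (maj == y_val).mean()

    pop = rng.integers(0, 2, size=(pop_size, n_clf))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_clf)                  # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_clf) < mutation_rate      # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()].astype(bool)
```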

A Study on Classification Performance Analysis of Convolutional Neural Network using Ensemble Learning Algorithm

  • 박성욱;김종찬;김도연
    • 한국멀티미디어학회논문지 / Vol. 22, No. 6 / pp. 665-675 / 2019
  • In this paper, we compare and analyze the classification performance of the deep learning algorithm Convolutional Neural Network (CNN) according to ensemble generation and combining techniques. We used several CNN models (VGG16, VGG19, DenseNet121, DenseNet169, DenseNet201, ResNet18, ResNet34, ResNet50, ResNet101, ResNet152, GoogLeNet) to create 10 ensemble generation combinations and applied 6 combining techniques (average, weighted average, maximum, minimum, median, product) to the optimal combination. Experimental results showed that the DenseNet169-VGG16-GoogLeNet combination for ensemble generation and the product rule for ensemble combination gave the best performance. Based on this, we conclude that ensembling different models with high benchmark scores is another way to obtain good results.
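
A minimal sketch of the six combining rules named above, assuming each model outputs per-class probabilities (softmax scores); this is illustrative, not the paper's implementation:

```python
import numpy as np

def combine(probs, rule="product", weights=None):
    """Fuse per-model class probabilities with a simple combining rule.
    probs: (n_models, n_samples, n_classes) array of softmax outputs."""
    if rule == "average":
        fused = probs.mean(axis=0)
    elif rule == "weighted":
        w = np.asarray(weights, dtype=float)[:, None, None]
        fused = (w * probs).sum(axis=0) / w.sum()
    elif rule == "maximum":
        fused = probs.max(axis=0)
    elif rule == "minimum":
        fused = probs.min(axis=0)
    elif rule == "median":
        fused = np.median(probs, axis=0)
    elif rule == "product":
        fused = probs.prod(axis=0)
    else:
        raise ValueError(rule)
    return fused.argmax(axis=1)   # predicted class per sample
```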

CNN-based Weighted Ensemble Technique for ImageNet Classification

  • 정희철;최민국;김준광;권순;정우영
    • 대한임베디드공학회논문지 / Vol. 15, No. 4 / pp. 197-204 / 2020
  • The ImageNet dataset is a large-scale dataset containing various natural scene images. In this paper, we propose a convolutional neural network (CNN)-based weighted ensemble technique for the ImageNet classification task. First, in order to fuse several models, our technique uses a weight for each model, unlike the existing average-based ensemble technique. We then propose an algorithm that automatically finds the coefficients used in the later ensemble process. Our algorithm sequentially selects the model with the best performance on the validation set and then obtains a weight that improves performance when it is combined with the previously selected models. We applied the proposed algorithm to a total of 13 heterogeneous models, and as a result, 5 models were selected. These selected models were combined with weights, and we achieved a 3.297% Top-5 error rate on the ImageNet test dataset.
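
The greedy selection described above can be sketched as follows, assuming validation softmax outputs `probs` and a hypothetical grid of candidate weights; the authors' actual weight search is not specified here:

```python
import numpy as np

def greedy_weighted_ensemble(probs, y_val, candidate_weights=None):
    """Greedy forward selection of models and their fusion weights.
    probs: (n_models, n_samples, n_classes) validation softmax outputs."""
    if candidate_weights is None:
        candidate_weights = np.linspace(0.1, 1.0, 10)   # assumed weight grid

    def accuracy(fused):
        return (fused.argmax(axis=1) == y_val).mean()

    n_models = probs.shape[0]
    selected, weights = [], []
    fused = np.zeros_like(probs[0])
    best_acc = 0.0
    for _ in range(n_models):
        best = None
        for m in range(n_models):
            if m in selected:
                continue
            for w in candidate_weights:
                acc = accuracy(fused + w * probs[m])
                if acc > best_acc:
                    best_acc, best = acc, (m, w)
        if best is None:              # no remaining model improves the ensemble
            break
        selected.append(best[0])
        weights.append(best[1])
        fused = fused + best[1] * probs[best[0]]
    return selected, weights
```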

Ensemble Trading Algorithm Using Dirichlet Distribution-based Model Contribution Prediction

  • 정재용;이주홍;최범기;송재원
    • 스마트미디어저널 / Vol. 11, No. 3 / pp. 9-17 / 2022
  • Algorithmic trading, which trades financial instruments using algorithms, has the problem that its results are unstable due to the many factors at play in the market. To mitigate this problem, ensemble methods that combine trading algorithms have been proposed, but these ensemble methods also have several issues. First, they may fail to select trading algorithms that satisfy the minimum performance requirement for ensemble members (better than random), which is a prerequisite for an ensemble. Second, there is no guarantee that an ensemble model that performed well in the past will also perform well in the future. To address these issues, we propose the following method for selecting the trading algorithms to include in an ensemble model. Based on past data, we measure the contribution of the trading algorithms included in high-performing ensemble models. Because contributions based only on past data do not reflect the uncertainty arising from limited historical data, we approximate the contribution distribution with a Dirichlet distribution and sample contribution values from it to incorporate that uncertainty. A Transformer is then trained on the contribution distributions obtained from past data to predict future contributions, and the trading algorithms with high predicted future contributions are selected for the ensemble model. Experiments demonstrate that the proposed ensemble method outperforms existing ensemble methods.
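
A minimal sketch of the Dirichlet step, assuming historical contribution scores are shifted to positive values and used directly as concentration parameters (an assumption, not necessarily the paper's parameterisation); the Transformer-based prediction step is omitted:

```python
import numpy as np

def sample_contributions(past_contributions, n_samples=1000, rng=None):
    """Approximate the contribution distribution of trading algorithms with a
    Dirichlet whose concentration is proportional to the (positively shifted)
    historical contributions, then sample to reflect the uncertainty of
    limited past data."""
    rng = np.random.default_rng(rng)
    c = np.asarray(past_contributions, dtype=float)
    alpha = c - c.min() + 1e-3                 # shift so all concentrations are positive
    draws = rng.dirichlet(alpha, size=n_samples)   # (n_samples, n_algorithms)
    return draws.mean(axis=0), draws

# Example: four trading algorithms with measured historical contributions.
mean_contrib, samples = sample_contributions([0.8, 0.1, 0.4, 0.2])
```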

Optimization of Random Subspace Ensemble for Bankruptcy Prediction

  • 민성환
    • 한국IT서비스학회지 / Vol. 14, No. 4 / pp. 121-135 / 2015
  • Ensemble classification utilizes multiple classifiers instead of a single classifier, and ensemble classifiers have recently attracted much attention in the data mining community. Ensemble learning techniques have proved very useful for improving prediction accuracy. Bagging, boosting, and random subspace are the most popular ensemble methods. In random subspace, each base classifier is trained on a randomly chosen feature subspace of the original feature space, and the outputs of the base classifiers are usually aggregated by a simple majority vote. In this study, we applied the random subspace method to the bankruptcy prediction problem and proposed a method for optimizing the random subspace ensemble: a genetic algorithm was used to optimize the classifier subset of the random subspace ensemble. The proposed genetic-algorithm-based random subspace ensemble model was applied to bankruptcy prediction using a real data set and compared with other models. Experimental results showed that the proposed model outperformed the other models.
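
A minimal random subspace sketch using decision trees as base classifiers (the paper's base classifier and the genetic-algorithm pruning step are not shown); integer class labels are assumed:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_subspace_ensemble(X, y, n_estimators=30, subspace_size=0.5, rng=None):
    """Train each base classifier on a randomly chosen feature subspace."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    k = max(1, int(subspace_size * n_features))
    members = []
    for _ in range(n_estimators):
        feats = rng.choice(n_features, size=k, replace=False)  # feature subspace
        members.append((feats, DecisionTreeClassifier().fit(X[:, feats], y)))
    return members

def predict(members, X):
    """Simple majority vote over the ensemble (non-negative int labels assumed)."""
    votes = np.array([clf.predict(X[:, feats]) for feats, clf in members])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```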

A New Incremental Learning Algorithm with Probabilistic Weights Using Extended Data Expression

  • Yang, Kwangmo;Kolesnikova, Anastasiya;Lee, Won Don
    • Journal of information and communication convergence engineering / Vol. 11, No. 4 / pp. 258-267 / 2013
  • A new incremental learning algorithm using the extended data expression, based on probabilistic compounding, is presented in this paper. The incremental learning algorithm generates an ensemble of weak classifiers and compounds these classifiers into a strong classifier using weighted majority voting to improve classification performance. We introduce a new probabilistic weighted majority voting scheme founded on the extended data expression, in which the class distribution of the output is used to combine the classifiers. UChoo, a decision tree classifier for the extended data expression, is used as the base classifier, as it produces an extended output expression that defines the class distribution of the output. The extended data expression and the UChoo classifier are powerful techniques for classification and rule refinement problems. In this paper, the extended data expression is applied to obtain probabilistic results with probabilistic majority voting. To show the performance advantages, the new algorithm is compared with Learn++, an incremental ensemble-based algorithm.
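
A rough sketch of probabilistic weighted majority voting, assuming each classifier supplies a class distribution per sample and a scalar weight; this illustrates the general idea rather than UChoo's specific output format:

```python
import numpy as np

def probabilistic_weighted_vote(class_dists, clf_weights):
    """Combine classifiers by summing their weighted class distributions
    instead of their hard votes.
    class_dists: (n_classifiers, n_samples, n_classes) class distributions."""
    w = np.asarray(clf_weights, dtype=float)[:, None, None]
    combined = (w * class_dists).sum(axis=0)           # weighted sum of distributions
    combined /= combined.sum(axis=1, keepdims=True)    # renormalise per sample
    return combined.argmax(axis=1)                     # predicted class per sample
```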

Ensemble techniques and hybrid intelligence algorithms for shear strength prediction of squat reinforced concrete walls

  • Mohammad Sadegh Barkhordari;Leonardo M. Massone
    • Advances in Computational Design / Vol. 8, No. 1 / pp. 37-59 / 2023
  • Squat reinforced concrete (SRC) shear walls are a critical part of the structure for both office/residential buildings and nuclear structures due to their significant role in withstanding seismic loads. Despite this, empirical formulae in current design standards and published studies show considerable disparity in predicting SRC wall shear strength. The goal of this research is to develop and evaluate hybrid and ensemble artificial neural network (ANN) models. State-of-the-art population-based algorithms are used for the hybrid intelligence algorithms. Six models are developed: Honey Badger Algorithm (HBA) with ANN (HBA-ANN), Hunger Games Search with ANN (HGS-ANN), fitness-distance balance coyote optimization algorithm (FDB-COA) with ANN (FDB-COA-ANN), Averaging Ensemble (AE) neural network, Snapshot Ensemble (SE) neural network, and Stacked Generalization (SG) ensemble neural network. A total of 434 test results of SRC walls are utilized to train and assess the models. The results reveal that the SG model not only minimizes prediction variance but also produces predictions (R² = 0.99) that are superior to those of the other models.
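
As an illustration of stacked generalization with ANN base learners (not the authors' architectures or hyperparameters), a minimal scikit-learn sketch might look like this:

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression

def stacked_generalization(X, y, n_base=3, n_splits=5, random_state=0):
    """Stacked generalization: base ANNs produce out-of-fold predictions
    that become the input features of a simple meta-learner."""
    bases = [MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                          random_state=random_state + i) for i in range(n_base)]
    oof = np.zeros((len(y), n_base))
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=random_state)
    for tr, va in kf.split(X):
        for i, base in enumerate(bases):
            oof[va, i] = base.fit(X[tr], y[tr]).predict(X[va])
    meta = LinearRegression().fit(oof, y)       # meta-learner on stacked features
    bases = [b.fit(X, y) for b in bases]        # refit base models on all data
    return bases, meta

def predict(bases, meta, X):
    stacked = np.column_stack([b.predict(X) for b in bases])
    return meta.predict(stacked)
```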

Genetic Algorithm based Hybrid Ensemble Model

  • 민성환
    • Journal of Information Technology Applications and Management / Vol. 23, No. 1 / pp. 45-59 / 2016
  • An ensemble classifier combines the outputs of multiple classifiers, and it is widely accepted that ensemble classifiers can improve prediction accuracy. Recently, ensemble techniques have been successfully applied to bankruptcy prediction. Bagging and random subspace are the most popular ensemble techniques, and each has proved very effective in improving generalization ability. However, few studies have focused on integrating bagging and random subspace. In this study, we proposed a new hybrid ensemble model that integrates bagging and the random subspace method using a genetic algorithm to improve model performance. The proposed model was applied to bankruptcy prediction for Korean companies and compared with other models. The experimental results showed that the proposed model performs better than the other models, such as the single classifier, the original ensemble model, and the simple hybrid model.
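
A minimal sketch of the bagging + random subspace hybrid, in which each member sees both a bootstrap sample of instances and a random feature subset (the genetic-algorithm integration step is omitted); decision trees stand in for the paper's base classifiers:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def hybrid_bagging_subspace(X, y, n_estimators=30, subspace_size=0.5, rng=None):
    """Each member is trained on a bootstrap sample of instances (bagging)
    AND a random subset of features (random subspace)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    k = max(1, int(subspace_size * d))
    members = []
    for _ in range(n_estimators):
        rows = rng.choice(n, size=n, replace=True)    # bootstrap instances
        cols = rng.choice(d, size=k, replace=False)   # random feature subspace
        clf = DecisionTreeClassifier().fit(X[np.ix_(rows, cols)], y[rows])
        members.append((cols, clf))
    return members
```

A genetic algorithm (as in the selection sketch shown earlier in this listing) could then prune or weight this member pool on a validation set.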

Extreme Learning Machine Ensemble Using Bagging for Facial Expression Recognition

  • Ghimire, Deepak;Lee, Joonwhoan
    • Journal of Information Processing Systems / Vol. 10, No. 3 / pp. 443-458 / 2014
  • An extreme learning machine (ELM) is a recently proposed learning algorithm for a single-layer feedforward neural network. In this paper, we studied an ensemble of ELMs built with a bagging algorithm for facial expression recognition (FER). Facial expression analysis is widely used in behavior interpretation of emotions, cognitive science, and social interaction. This paper presents a method for FER based on histogram of oriented gradients (HOG) features using an ELM ensemble. First, the HOG features were extracted from the face image by dividing it into a number of small cells. A bagging algorithm was then used to construct many different bags of training data, each of which was used to train a separate ELM. To recognize the expression of an input face image, the HOG features were fed to each trained ELM and the results were combined by a majority voting scheme. The ELM ensemble using bagging improves the generalization capability of the network significantly. Two available facial expression datasets (JAFFE and CK+) were used to evaluate the performance of the proposed classification system. Even though the performance of an individual ELM was lower, the ELM ensemble using a bagging algorithm improved the recognition performance significantly.
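
A rough sketch of the bagged-ELM idea, using a deliberately tiny ELM (random hidden layer plus a least-squares output layer) and bootstrap resampling; HOG feature extraction and the paper's exact ELM configuration are omitted:

```python
import numpy as np

class TinyELM:
    """Minimal extreme learning machine: a random hidden layer followed by a
    least-squares output layer (illustrative only)."""
    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(rng)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        T = (y[:, None] == self.classes_).astype(float)  # one-hot targets
        self.beta = np.linalg.pinv(H) @ T                # least-squares output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return self.classes_[(H @ self.beta).argmax(axis=1)]

def bagged_elm(X, y, n_estimators=10, rng=None):
    """Bagging of ELMs: each ELM is trained on a bootstrap sample; predictions
    would then be combined by majority voting as described above."""
    rng = np.random.default_rng(rng)
    members = []
    for _ in range(n_estimators):
        idx = rng.choice(len(y), size=len(y), replace=True)  # bootstrap sample
        members.append(TinyELM(rng=rng.integers(1 << 31)).fit(X[idx], y[idx]))
    return members
```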