• Title/Summary/Keyword: Gradient Boosting (XGBoost)


A Design and Implement of Efficient Agricultural Product Price Prediction Model

  • Im, Jung-Ju; Kim, Tae-Wan; Lim, Ji-Seoup; Kim, Jun-Ho; Yoo, Tae-Yong; Lee, Won Joo
    • Journal of the Korea Society of Computer and Information / v.27 no.5 / pp.29-36 / 2022
  • In this paper, we propose an efficient agricultural product price prediction model based on a dataset provided by DACON. The model uses XGBoost and CatBoost, Gradient Boosting algorithms whose average accuracy and execution time are superior to the existing Logistic Regression and Random Forest. Based on these advantages, we design a machine learning model that predicts prices 1 week, 2 weeks, and 4 weeks ahead from previous agricultural product prices. The XGBoost model achieves its best performance by tuning hyperparameters through the library's XGBoost Regressor interface. The implemented model is verified using the API provided by DACON, and performance is evaluated for each model. Because XGBoost performs its own overfitting regularization, it achieves excellent accuracy even on a small dataset, but its temporal performance, such as training and prediction time, was found to be lower than that of LightGBM.
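
The abstract above describes per-horizon regression with the XGBoost Regressor. The following is a minimal sketch of that general pattern, not the authors' code: the DACON file name, the weekly 'price' column, the lag features, and the hyperparameter values are all assumptions made for illustration.

```python
# Sketch: one XGBoost regressor per prediction horizon (1, 2, 4 weeks) from lagged prices.
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

prices = pd.read_csv("train.csv")                 # assumed: one row per week, a 'price' column
for lag in range(1, 5):                           # lagged weekly prices as features
    prices[f"lag_{lag}"] = prices["price"].shift(lag)

models = {}
for horizon in (1, 2, 4):                         # separate model per horizon
    df = prices.copy()
    df["target"] = df["price"].shift(-horizon)    # price `horizon` weeks ahead
    df = df.dropna()
    X, y = df[[f"lag_{l}" for l in range(1, 5)]], df["target"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)
    model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
    model.fit(X_tr, y_tr)
    models[horizon] = model
    print(horizon, "weeks ahead MAE:", mean_absolute_error(y_te, model.predict(X_te)))
```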

Darknet Traffic Detection and Classification Using Gradient Boosting Techniques (Gradient Boosting 기법을 활용한 다크넷 트래픽 탐지 및 분류)

  • Kim, Jihye; Lee, Soo Jin
    • Journal of the Korea Institute of Information Security & Cryptology / v.32 no.2 / pp.371-379 / 2022
  • The darknet is built on anonymity and security, which leads to its continued abuse for various crimes and illegal activities. It is therefore very important to detect and classify darknet traffic to prevent such misuse. This work proposes a novel approach that uses Gradient Boosting techniques for darknet traffic detection and classification. The XGBoost and LightGBM algorithms achieve a detection accuracy of 99.99% and a classification accuracy of over 99%, more than 3% higher in detection and over 13% higher in classification than previous research. In particular, LightGBM detects and classifies darknet traffic better than XGBoost, reducing training time by about 1.6 times and hyperparameter tuning time by more than 10 times.
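
As an illustrative aside (not the authors' pipeline), the comparison described above boils down to fitting the two boosting classifiers on the same flow-feature table and comparing their scores. The CSV name, the 'label' column, and the hyperparameters below are placeholders.

```python
# Sketch: compare XGBoost and LightGBM on a tabular darknet-traffic dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

df = pd.read_csv("darknet_flows.csv")                          # assumed flow-feature table
X = df.drop(columns=["label"])
y = df["label"].astype("category").cat.codes                   # encode class labels as integers
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

for name, clf in [("XGBoost", XGBClassifier(n_estimators=500, max_depth=8)),
                  ("LightGBM", LGBMClassifier(n_estimators=500, num_leaves=63))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```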

Indoor positioning system using Xgboosting (Xgboosting 기법을 이용한 실내 위치 측위 기법)

  • Hwang, Chi-Gon; Yoon, Chang-Pyo; Kim, Dae-Jin
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2021.10a / pp.492-494 / 2021
  • The decision tree is widely used as a classification technique in machine learning. However, decision trees tend to overfit and, as a result, consume considerable time and resources. Bagging and boosting were introduced to address this problem: bagging builds models on multiple bootstrap samples, while boosting models the sampled data and adjusts weights to reduce overfitting. More recently, techniques such as XGBoost have been introduced to further improve performance. In this paper, we collect Wi-Fi signal data for indoor positioning, apply both the existing methods and XGBoost, and evaluate their performance.
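
A rough sketch of the evaluation step described above, assuming a Wi-Fi fingerprint table with one RSSI column per access point and a 'zone' label; the file and column names are invented for the example, and cross-validated accuracy stands in for the paper's performance evaluation.

```python
# Sketch: classify the indoor zone from Wi-Fi RSSI readings with XGBoost.
import pandas as pd
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

fingerprints = pd.read_csv("wifi_rssi.csv")            # assumed: one row per Wi-Fi scan
X = fingerprints.filter(like="ap")                     # RSSI columns, e.g. ap1..apN
y = fingerprints["zone"].astype("category").cat.codes  # indoor zone label

clf = XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
print("mean 5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```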


ConvXGB: A new deep learning model for classification problems based on CNN and XGBoost

  • Thongsuwan, Setthanun; Jaiyen, Saichon; Padcharoen, Anantachai; Agarwal, Praveen
    • Nuclear Engineering and Technology / v.53 no.2 / pp.522-531 / 2021
  • We describe a new deep learning model, Convolutional eXtreme Gradient Boosting (ConvXGB), for classification problems based on convolutional neural nets and Chen et al.'s XGBoost. As well as image data, ConvXGB also supports general classification problems through a data preprocessing module. ConvXGB consists of several stacked convolutional layers that learn the features of the input automatically, followed by XGBoost in the last layer for predicting the class labels. The model is simplified by reducing the number of parameters under appropriate conditions, since it is not necessary to re-adjust the weight values in a back-propagation cycle. Experiments on several data sets from the UCI Repository, including image and general data sets, showed that our model handled the classification problems slightly better than CNN and XGBoost alone for all the tested data sets, and was sometimes significantly better.
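
The ConvXGB idea, convolutional layers for feature learning with XGBoost replacing the softmax output layer, can be sketched as below. This is not the authors' implementation: MNIST stands in for the evaluated data sets, the layer sizes are arbitrary, and a single training epoch is used only to keep the example short.

```python
# Sketch: learn convolutional features, then use XGBoost as the final classification layer.
import tensorflow as tf
from xgboost import XGBClassifier

(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.mnist.load_data()
x_tr, x_te = x_tr[..., None] / 255.0, x_te[..., None] / 255.0

inputs = tf.keras.Input(shape=(28, 28, 1))
h = tf.keras.layers.Conv2D(16, 3, activation="relu")(inputs)
h = tf.keras.layers.MaxPooling2D()(h)
h = tf.keras.layers.Conv2D(32, 3, activation="relu")(h)
features = tf.keras.layers.GlobalAveragePooling2D()(h)
outputs = tf.keras.layers.Dense(10, activation="softmax")(features)

cnn = tf.keras.Model(inputs, outputs)
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
cnn.fit(x_tr, y_tr, epochs=1, batch_size=256)        # learn the convolutional features once

extractor = tf.keras.Model(inputs, features)          # drop the softmax head
clf = XGBClassifier(n_estimators=300, max_depth=6)    # XGBoost predicts the class labels
clf.fit(extractor.predict(x_tr), y_tr)
print("test accuracy:", clf.score(extractor.predict(x_te), y_te))
```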

A LightGBM and XGBoost Learning Method for Postoperative Critical Illness Key Indicators Analysis

  • Lei Han; Yiziting Zhu; Yuwen Chen; Guoqiong Huang; Bin Yi
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.8 / pp.2016-2029 / 2023
  • Accurate prediction of critical illness is significant for ensuring the lives and health of patients. The selection of indicators affects both the real-time capability and the accuracy of critical illness prediction. However, the diversity and complexity of these indicators make it difficult to find potential connections between them and critical illnesses. For the first time, this study proposes an indicator analysis model that extracts key indicators from the preoperative and intraoperative clinical indicators and laboratory results of critical illnesses. In this study, preoperative and intraoperative data on heart failure and respiratory failure are used to verify the model. The proposed model processes the data and extracts key indicators in four parts. To test the effectiveness of the proposed model, the key indicators are used to predict the two critical illnesses. The classifiers used in the prediction are the light gradient boosting machine (LightGBM) and eXtreme Gradient Boosting (XGBoost). The predictive performance using key indicators is better than that using all indicators. In the prediction of heart failure, LightGBM and XGBoost have sensitivities of 0.889 and 0.892, and specificities of 0.939 and 0.937, respectively. For respiratory failure, LightGBM and XGBoost have sensitivities of 0.709 and 0.689, and specificities of 0.936 and 0.940, respectively. The proposed model can effectively analyze the correlation between indicators and postoperative critical illness, making it possible to find the key indicators for postoperative critical illnesses. The model can assist doctors in extracting key indicators in time and improve the reliability and efficiency of prediction.
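
A minimal sketch of the general key-indicator pattern the abstract describes (not the authors' four-part pipeline): rank indicators by gradient-boosting feature importance, keep the top-ranked ones, and re-train on the reduced set. The file name, the outcome column, and the cut-off of ten indicators are assumptions.

```python
# Sketch: select key clinical indicators via LightGBM feature importance, then re-train.
import pandas as pd
from lightgbm import LGBMClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

df = pd.read_csv("perioperative.csv")                         # assumed indicator table
X, y = df.drop(columns=["heart_failure"]), df["heart_failure"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

full = LGBMClassifier(n_estimators=400).fit(X_tr, y_tr)       # model on all indicators
ranking = pd.Series(full.feature_importances_, index=X.columns).sort_values(ascending=False)
key_indicators = ranking.head(10).index                       # keep the 10 strongest indicators

reduced = LGBMClassifier(n_estimators=400).fit(X_tr[key_indicators], y_tr)
print("sensitivity:", recall_score(y_te, reduced.predict(X_te[key_indicators])))
```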

Mean fragmentation size prediction in an open-pit mine using machine learning techniques and the Kuz-Ram model

  • Seung-Joong Lee; Sung-Oong Choi
    • Geomechanics and Engineering / v.34 no.5 / pp.547-559 / 2023
  • We evaluated the applicability of machine learning techniques and the Kuz-Ram model for predicting the mean fragmentation size in open-pit mines. The characteristics of the in-situ rock considered here were uniaxial compressive strength, tensile strength, rock factor, and mean in-situ block size. Seventy field datasets that included these characteristics were collected to predict the mean fragmentation size. Deep neural network, support vector machine, and extreme gradient boosting (XGBoost) models were trained using the data. The performance was evaluated using the root mean squared error (RMSE) and the coefficient of determination (r2). The XGBoost model had the smallest RMSE and the highest r2 value compared with the other models. Additionally, when analyzing the error rate between the measured and predicted values, XGBoost had the lowest error rate. When the Kuz-Ram model was applied, low accuracy was observed owing to the differences in the characteristics of data used for model development. Consequently, the proposed XGBoost model predicted the mean fragmentation size more accurately than other models. If its performance is improved by securing sufficient data in the future, it will be useful for improving the blasting efficiency at the target site.
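
For illustration only, a compact version of the XGBoost regression set-up and the RMSE/r2 evaluation reported above; the CSV file, its column names, and the split ratio are assumptions standing in for the 70 field datasets.

```python
# Sketch: predict mean fragmentation size from rock characteristics and report RMSE and r2.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

df = pd.read_csv("blasting.csv")   # assumed: UCS, tensile strength, rock factor, block size
X = df[["ucs", "tensile_strength", "rock_factor", "in_situ_block_size"]]
y = df["mean_fragment_size"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("RMSE:", np.sqrt(mean_squared_error(y_te, pred)), " r2:", r2_score(y_te, pred))
```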

Optimal Sensor Location in Water Distribution Network using XGBoost Model (XGBoost 기반 상수도관망 센서 위치 최적화)

  • Hyewoon Jang; Donghwi Jung
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.217-217 / 2023
  • A water distribution network aims to supply high-quality water to users reliably, and pressure is one of the indicators used to evaluate this. As the installation of smart sensors has expanded, real-time data-driven analysis using machine learning techniques has become common, so deciding where to collect data, that is, sensor placement, is important. This study proposes a methodology that uses an eXtreme Gradient Boosting (XGBoost) model to optimize sensor locations in a large-scale water distribution network. XGBoost is an ensemble model built from multiple decision trees and uses a boosting scheme that improves performance by weighting samples according to their errors. It supports distributed and parallel processing, which makes optimal use of memory resources, trains quickly, and handles missing values within the model itself. To determine the independent variables for the model, critical nodes that represent the network are selected by considering the variability and mean of the pressure data. An XGBoost model that predicts the pressure at the critical nodes is then built, and the optimal sensor locations are selected by considering the model's performance and feature importance values. Based on this methodology, results are analyzed while varying the network layout (for example, looped and branched) and the number of nodes to identify trends according to network characteristics. Because the proposed XGBoost model minimizes additional preprocessing and can easily be applied to large-scale networks, it is expected to be useful not only for sensor placement but also for other performance optimization problems in water distribution networks through various combinations of input and output data.
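
A simplified sketch of the selection logic described above (not the authors' code): an XGBoost model predicts the pressure at an assumed critical node from the pressures at all other nodes, and candidate sensor sites are ranked by feature importance. The pressure table and the node ID are placeholders.

```python
# Sketch: rank candidate sensor nodes by their importance for predicting a critical node's pressure.
import pandas as pd
from xgboost import XGBRegressor

pressures = pd.read_csv("node_pressures.csv")        # assumed: rows = time steps, columns = node IDs
critical_node = "J101"                                # placeholder critical (representative) node
X = pressures.drop(columns=[critical_node])           # pressures at candidate sensor locations
y = pressures[critical_node]                          # pressure at the critical node

model = XGBRegressor(n_estimators=400, max_depth=5).fit(X, y)
ranking = pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False)
print(ranking.head(5))                                # top-ranked nodes = candidate sensor sites
```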


A Study On User Skin Color-Based Foundation Color Recommendation Method Using Deep Learning (딥러닝을 이용한 사용자 피부색 기반 파운데이션 색상 추천 기법 연구)

  • Jeong, Minuk; Kim, Hyeonji; Gwak, Chaewon; Oh, Yoosoo
    • Journal of Korea Multimedia Society / v.25 no.9 / pp.1367-1374 / 2022
  • In this paper, we propose an automatic cosmetic foundation recommendation system that suggests a suitable foundation product based on the user's skin color. The proposed system receives and preprocesses user images and detects skin color with OpenCV and machine learning algorithms. The system then compares the performance of training models using XGBoost, Gradient Boost, Random Forest, and Adaptive Boost (AdaBoost) on 550 data samples collected from essential bestsellers in the United States. Based on the comparison results, this paper implements the recommendation system with the best-performing machine learning model. In our experiments, the system effectively recommends a foundation matching the user's skin color with 98% accuracy. It can therefore reduce the number of trials needed to select a foundation for the user's skin color and save time in choosing one.
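
The model-comparison step described above can be sketched as a cross-validated comparison of the four ensemble classifiers; this is illustrative only, and the feature columns (RGB skin-color values) and file name are assumptions rather than the authors' dataset.

```python
# Sketch: compare the four ensemble classifiers and pick the best one for the recommender.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

df = pd.read_csv("foundation_dataset.csv")             # assumed: skin-color features + product label
X = df[["r", "g", "b"]]                                 # assumed RGB skin-color features
y = df["foundation_shade"].astype("category").cat.codes

candidates = {
    "XGBoost": XGBClassifier(n_estimators=200),
    "GradientBoosting": GradientBoostingClassifier(),
    "RandomForest": RandomForestClassifier(n_estimators=200),
    "AdaBoost": AdaBoostClassifier(),
}
for name, clf in candidates.items():
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```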

An advanced machine learning technique to predict compressive strength of green concrete incorporating waste foundry sand

  • Danial Jahed Armaghani; Haleh Rasekh; Panagiotis G. Asteris
    • Computers and Concrete / v.33 no.1 / pp.77-90 / 2024
  • Waste foundry sand (WFS) is a waste product that causes environmental hazards. WFS can be used as a partial replacement for cement or fine aggregates in concrete. A database comprising 234 compressive strength tests of concrete fabricated with WFS is used. To construct the machine learning-based prediction models, the water-to-cement ratio, WFS replacement percentage, WFS-to-cement content ratio, and fineness modulus of WFS were considered as the model's inputs, and the compressive strength of concrete was set as the model's output. A base extreme gradient boosting (XGBoost) model together with two hybrid XGBoost models combined with the tunicate swarm algorithm (TSA) and the salp swarm algorithm (SSA) were applied. The role of TSA and SSA is to identify the optimum values of the XGBoost hyperparameters to obtain higher performance. The results of these hybrid techniques were compared with the results of the base XGBoost model in order to investigate and justify the use of optimisation algorithms. The results showed that the hybrid XGBoost models are faster and more accurate than the base XGBoost technique. The XGBoost-SSA model shows superior performance compared to previously published works in the literature, offering a reduced system error rate. Although the WFS-to-cement ratio is significant, the WFS replacement percentage has a smaller influence on the compressive strength of concrete. To improve the compressive strength of concrete fabricated with WFS, the simultaneous consideration of the water-to-cement ratio and fineness modulus of WFS is recommended.
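
The tunicate and salp swarm optimisers are not off-the-shelf library components, so the sketch below substitutes a plain random search for the "propose candidates, score by cross-validation, keep the best" loop they implement; the parameter bounds and column names are assumptions.

```python
# Sketch: hyperparameter search for an XGBoost compressive-strength regressor
# (random search standing in for the TSA/SSA swarm optimisers).
import random
import pandas as pd
from xgboost import XGBRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("wfs_concrete.csv")                    # assumed columns per the abstract
X = df[["w_c_ratio", "wfs_replacement", "wfs_cement_ratio", "fineness_modulus"]]
y = df["compressive_strength"]

def fitness(params):                                    # score used by the optimiser
    return cross_val_score(XGBRegressor(**params), X, y, cv=5, scoring="r2").mean()

best, best_score = None, -1.0
for _ in range(30):                                     # swarm iterations replaced by random draws
    params = {"n_estimators": random.randint(100, 600),
              "max_depth": random.randint(2, 8),
              "learning_rate": random.uniform(0.01, 0.3)}
    s = fitness(params)
    if s > best_score:
        best, best_score = params, s
print(best, best_score)
```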

Estimating pile setup parameter using XGBoost-based optimized models

  • Xigang Du; Ximeng Ma; Chenxi Dong; Mehrdad Sattari Nikkhoo
    • Geomechanics and Engineering / v.36 no.3 / pp.259-276 / 2024
  • The undrained shear strength is widely acknowledged as a fundamental mechanical property of soil and is considered a critical engineering parameter. In recent years, researchers have employed various methodologies to evaluate the shear strength of soil under undrained conditions. These methods encompass both numerical analyses and empirical techniques, such as the cone penetration test (CPT), to gain insights into the properties and behavior of soil. However, several of these methods rely on correlation assumptions, which can lead to inconsistent accuracy and precision. This study develops innovative methods using extreme gradient boosting (XGB) to predict the pile set-up component "A" based on two distinct data sets. The first data set includes the average modified cone point bearing capacity (qt), average wall friction (fs), and effective vertical stress (σvo), while the second data set comprises the plasticity index (PI), soil undrained shear cohesion (Su), and the overconsolidation ratio (OCR). These data sets were used to develop XGBoost-based methods for predicting the pile set-up component "A". To optimize the internal hyperparameters of the XGBoost model, four optimization algorithms were employed: Particle Swarm Optimization (PSO), Social Spider Optimization (SSO), Arithmetic Optimization Algorithm (AOA), and Sine Cosine Optimization Algorithm (SCOA). The results from the first data set indicate that the XGBoost model optimized using the Arithmetic Optimization Algorithm (XGB-AOA) achieved the highest accuracy, with R2 values of 0.9962 for the training part and 0.9807 for the testing part. The performance of the developed models was further evaluated using the RMSE, MAE, and VAF indices. The results revealed that the XGB-AOA model outperformed the other models in terms of accuracy, with RMSE, MAE, and VAF values of 0.0078, 0.0015, and 99.6189 for the training part and 0.0141, 0.0112, and 98.0394 for the testing part, respectively. These findings suggest that XGB-AOA is the most accurate model for predicting the pile set-up component.
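
Of the three reported indices, RMSE and MAE are available in scikit-learn, while VAF (variance accounted for, in percent) is usually computed directly from its definition, VAF = (1 - Var(y - ŷ)/Var(y)) x 100. The arrays below are placeholders, not values from the paper.

```python
# Sketch: compute the RMSE, MAE, and VAF indices used to evaluate the models above.
import numpy as np
from sklearn.metrics import mean_squared_error, mean_absolute_error

y_true = np.array([0.31, 0.42, 0.28, 0.55])             # placeholder measured "A" values
y_pred = np.array([0.30, 0.44, 0.27, 0.53])             # placeholder model predictions

rmse = np.sqrt(mean_squared_error(y_true, y_pred))
mae = mean_absolute_error(y_true, y_pred)
vaf = (1 - np.var(y_true - y_pred) / np.var(y_true)) * 100   # variance accounted for, %
print(f"RMSE={rmse:.4f}  MAE={mae:.4f}  VAF={vaf:.2f}")
```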