http://dx.doi.org/10.9708/jksci.2022.27.05.029

A Design and Implementation of an Efficient Agricultural Product Price Prediction Model

Im, Jung-Ju (Dept. of Applied Artificial Intelligence, Hanyang University)
Kim, Tae-Wan (Dept. of Applied Artificial Intelligence, Hanyang University)
Lim, Ji-Seoup (Dept. of Applied Artificial Intelligence, Hanyang University)
Kim, Jun-Ho (Dept. of Computer Science & Engineering, Inha Technical College)
Yoo, Tae-Yong (Dept. of Computer Science & Engineering, Inha Technical College)
Lee, Won Joo (Dept. of Computer Science & Engineering, Inha Technical College)
Abstract
In this paper, we propose an efficient agricultural product price prediction model based on a dataset provided by DACON. The model uses XGBoost and CatBoost, Gradient Boosting algorithms whose average accuracy and execution time are superior to those of the existing Logistic Regression and Random Forest. Based on these advantages, we design a machine learning model that predicts prices 1 week, 2 weeks, and 4 weeks ahead from the previous prices of agricultural products. The XGBoost model achieves its best performance by tuning hyperparameters through the XGBoost Regressor, the regression interface of the XGBoost library. The implemented model is verified using the API provided by DACON, and a performance evaluation is performed for each model. Because XGBoost applies its own overfitting regularization, it achieves excellent accuracy despite the small dataset, but its temporal performance, such as training and prediction time, is lower than that of LGBM.
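The following is a minimal sketch of how such a horizon-specific XGBoost Regressor setup might look; it is not the authors' actual code, and the file name, the "price" column, the lag window, and all hyperparameter values are illustrative assumptions.

    # Minimal illustrative sketch (not the authors' code): one XGBoost Regressor
    # per prediction horizon (1, 2, and 4 weeks ahead), trained on lagged prices.
    # File name, "price" column, lag window, and hyperparameters are assumptions.
    import pandas as pd
    from xgboost import XGBRegressor

    def make_lag_features(prices: pd.Series, n_lags: int = 28) -> pd.DataFrame:
        # Each feature lag_i is the price observed i days earlier.
        return pd.DataFrame({f"lag_{i}": prices.shift(i) for i in range(1, n_lags + 1)})

    prices = pd.read_csv("train.csv")["price"]    # hypothetical daily price series
    X = make_lag_features(prices)

    models = {}
    for horizon in (7, 14, 28):                   # 1, 2, and 4 weeks ahead, in days
        y = prices.shift(-horizon)                # target: price 'horizon' days later
        mask = X.notna().all(axis=1) & y.notna()  # keep rows with full history and target
        model = XGBRegressor(
            n_estimators=500,
            max_depth=6,
            learning_rate=0.05,
            subsample=0.8,
            colsample_bytree=0.8,
            objective="reg:squarederror",
        )
        model.fit(X[mask], y[mask])
        models[horizon] = model

After training, predictions for each horizon would be obtained from models[7], models[14], and models[28] on the most recent lag features before submission through the DACON API.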
Keywords
Agricultural Product Price Forecasting; Machine Learning; Gradient Boosting Algorithm; DACON