Acknowledgement
This work was supported by the Hanshin University Research Grant.
References
- S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Computation, vol. 9, no. 8, pp. 1735-1780, Nov. 1997. DOI: 10.1162/neco.1997.9.8.1735.
- Facebook, Prophet: Automatic forecasting procedure. [Online]. Available: https://pypi.org/project/fbprophet.
- Kaggle, Rossmann store sales: Forecast sales using store, promotion, and competitor data, 2015. [Online]. Available: https://www.kaggle.com/c/rossmann-store-sales.
- S. Kohli, G. T. Godwin, and S. Urolagin, "Sales prediction using linear and KNN regression," in Advances in Machine Learning and Computational Intelligence, Springer, Singapore, pp. 321-329, 2021. DOI: 10.1007/978-981-15-5243-4_29.
- T. Weng, W. Liu, and J. Xiao, "Supply chain sales forecasting based on lightGBM and LSTM combination model," Industrial Management & Data Systems, vol. 120, no. 2, pp. 265-279, Sep. 2019. DOI: 10.1108/IMDS-03-2019-0170.
- Kaggle. [Online]. Available: https://www.kaggle.com.
- G. Ke, Q. Meng, T. Finley, T. Wang, W. Chen, W. Ma, Q. Ye, and T.-Y. Liu, "LightGBM: A highly efficient gradient boosting decision tree," in Advances in Neural Information Processing Systems, pp. 3149-3157, 2017.
- J. Li, K. Cheng, S. Wang, F. Morstatter, R. P. Trevino, J. Tang, and H. Liu, "Feature selection: A data perspective," ACM Computing Surveys, vol. 50, no. 6, pp. 1-45, Dec. 2017. DOI: 10.1145/3136625.
- Y. Saeys, T. Abeel, and Y. Van de Peer, "Robust feature selection using ensemble feature selection techniques," in Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Antwerp, Belgium, pp. 313-325, 2008. DOI: 10.1007/978-3-540-87481-2_21.
- B. H. Menze, B. M. Kelm, R. Masuch, U. Himmelreich, P. Bachert, W. Petrich, and F. A. Hamprecht, "A comparison of random forest and its Gini importance with standard chemometric methods for the feature selection and classification of spectral data," BMC Bioinformatics, vol. 10, Art. no. 213, pp. 1-16, Jul. 2009. DOI: 10.1186/1471-2105-10-213.
- F. Pan, T. Converse, D. Ahn, F. Salvetti, and G. Donato, "Feature selection for ranking using boosted trees," in Proceedings of the 18th ACM Conference on Information and Knowledge Management, Hong Kong, China, pp. 2025-2028, 2009. DOI: 10.1145/1645953.1646292.
- H. Jeon and S. Oh, "Hybrid-recursive feature elimination for efficient feature selection," Applied Sciences, vol. 10, no. 9, Art. no. 3211, May 2020. DOI: 10.3390/app10093211.
- L. Zhang and Q. Duan, "A feature selection method for multi-label text based on feature importance," Applied Sciences, vol. 9, no. 4, Art. no. 665, Feb. 2019. DOI: 10.3390/app9040665.
- E. Fezer, D. Raab, and A. Theissler, "XplainableClusterExplorer: a novel approach for interactive feature selection for clustering," in Proceedings of the 13th International Symposium on Visual Information Communication and Interaction, Eindhoven, Netherlands, pp. 1-5, 2020. DOI: 10.1145/3430036.3430066.
- X. Man and E. P. Chan, "The best way to select features? Comparing MDA, LIME, and SHAP," The Journal of Financial Data Science, vol. 3, no. 1, pp. 127-139, 2021. DOI: 10.3905/jfds.2020.1.047.
- M. T. Ribeiro, S. Singh, and C. Guestrin, "Why should I trust you?: Explaining the predictions of any classifier," in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, USA, pp. 1135-1144, 2016. DOI: 10.1145/2939672.2939778.
- T. K. Ho, "Random decision forests," in Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, Canada, vol. 1, pp. 278-282, 1995. DOI: 10.1109/ICDAR.1995.598994.