Predicting the Future Price of Export Items in Trade Using a Deep Regression Model

Deep Learning-Based Trade Export Price Prediction Model

  • 김지훈 (Dept. of Human Intelligence Information Engineering, Sangmyung University);
  • 이지항 (Dept. of Human Intelligence Information Engineering, Sangmyung University)
  • Received : 2021.12.28
  • Accepted : 2022.04.09
  • Published : 2022.10.31

Abstract

The Korea Trade-Investment Promotion Agency (KOTRA) annually publishes South Korea's trade data under the guidance of the Ministry of Trade, Industry and Energy. For each export item and trading country, the data typically include gross domestic product (GDP), the customs tariff, a business score, and the item's export price in the current and previous years. However, extracting meaningful insight from these data in order to predict the future prices of export items each year is challenging, because the volume of data accumulated over several years far exceeds the available human and computing resources. In this context, this paper proposes a multilayer perceptron that predicts the next year's price of potential export items by training on large amounts of past data at low computational and human cost.
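
For illustration, the following is a minimal sketch, assuming a PyTorch implementation, of a multilayer perceptron of the kind described above. It maps scaled tabular features (e.g. GDP, tariff, business score, previous year's price) to a single regression output, the next year's export price; the feature set, layer sizes, and training settings here are assumptions for the example, not the paper's actual configuration.

# Minimal sketch (not the authors' implementation) of an MLP regressor
# for tabular trade features; all names and hyperparameters are illustrative.
import torch
import torch.nn as nn

class ExportPriceMLP(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden),
            nn.BatchNorm1d(hidden),      # batch normalization over the hidden units
            nn.ReLU(),
            nn.Dropout(p=0.2),           # dropout regularization
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),        # single regression output: next year's price
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

if __name__ == "__main__":
    # Toy usage with random standardized features (GDP, tariff, business score, prev. price).
    model = ExportPriceMLP(n_features=4)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    x = torch.randn(256, 4)              # 256 (item, country) rows, 4 scaled features
    y = torch.randn(256)                 # scaled next-year export price

    for _ in range(100):                 # short illustrative training loop
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    print(f"final training MSE: {loss.item():.4f}")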

The KOTRA trade data provided by the Ministry of Trade, Industry and Energy contain, for each item and country, the GDP, tariff rate, business score, and the export amounts for the past and following years. However, the number of export items is very large, and manually analyzing the resulting mass of data every year to extract meaningful results requires considerable time and cost. In this study, we therefore implemented and evaluated a multilayer perceptron model that learns from large amounts of data and produces predictions quickly and at low cost. First, compared with an ordinary multivariate regression model, the deep-learning-based trade export price prediction model showed statistically superior performance in terms of prediction error and training time. Because export price data are expected to have time-series characteristics, we also predicted export prices with both a fully connected multilayer perceptron and a recurrent neural network. The recurrent neural network generalized better to new data for export price prediction, but the multilayer perceptron achieved better overall performance on trade export price prediction. If longer-term data become available, we expect that a recurrent neural network or a Transformer-based deep learning model will yield even better export price predictions.
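
As a rough illustration of the comparison described above, the sketch below trains a small LSTM-based recurrent regressor on a short yearly feature sequence and contrasts it with an ordinary multivariate linear regression baseline fitted with scikit-learn. This is an assumption-laden example, not the authors' code: the sequence length, feature count, random data, and model sizes are purely illustrative.

# Minimal sketch of the recurrent baseline vs. multivariate linear regression;
# all shapes, data, and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn

class ExportPriceLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, years, n_features); regress from the last hidden state
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1]).squeeze(-1)

if __name__ == "__main__":
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error

    torch.manual_seed(0)
    x = torch.randn(512, 3, 4)           # 512 samples, 3 past years, 4 scaled features
    y = torch.randn(512)                 # scaled next-year export price

    lstm = ExportPriceLSTM(n_features=4)
    opt = torch.optim.Adam(lstm.parameters(), lr=1e-3)
    for _ in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(lstm(x), y)
        loss.backward()
        opt.step()

    # Multivariate linear regression on the flattened yearly features.
    linreg = LinearRegression().fit(x.reshape(512, -1).numpy(), y.numpy())
    print("LSTM training MSE:", loss.item())
    print("Linear regression training MSE:",
          mean_squared_error(y.numpy(), linreg.predict(x.reshape(512, -1).numpy())))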

Acknowledgement

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. 2020R1G1A1102683). This research was also supported by the Samsung Research Funding & Incubation Center (No. SRFC-TC1603-52).
