
Machine Learning Based Model Development and Optimization for Predicting Radiation Dose Rate

  • SiHyun Lee (RMTEC Co., Ltd.) ;
  • HongYeon Lee (RMTEC Co., Ltd.) ;
  • JungMin Yeom (RMTEC Co., Ltd.)
  • Received : 2023.12.07
  • Accepted : 2023.12.19
  • Published : 2023.12.31

Abstract

In recent years, radiation has become a socially important issue, increasing the need for accurate prediction of radiation levels. In this study, machine learning based models, namely Multiple Linear Regression (MLR), Random Forest (RF), XGBoost, and LightGBM, were used to predict the hourly dose rate (nSv h-1) from a selected set of important variables. Weather data and radiation dose rates were collected for about six months in Jangseong, Jeollanam-do, and the correlations between temperature, humidity, cumulative precipitation, wind direction, wind speed, local air pressure, sea-level pressure, solar radiation, and the radiation dose rate (nSv h-1) were analyzed. In an evaluation based on RMSE (Root Mean Squared Error) and R-Squared (coefficient of determination) scores, the XGBoost model achieved an RMSE of 22.92 and an R-Squared of 0.73, the best performance among the models used. After optimizing the hyperparameters of all models with the GridSearch method and adding variables measured inside the instrument, performance improved to an RMSE of 2.39 and an R-Squared of 0.99 for both XGBoost and LightGBM.
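As a rough illustration of the workflow described above, the following Python sketch fits the four models on the weather variables, compares them by RMSE and R-Squared, and then tunes XGBoost with scikit-learn's GridSearchCV as one possible realization of the GridSearch step. The column names, input file name, train/test split, and parameter grid are illustrative assumptions, not values taken from the paper.

    # Minimal sketch of the modelling pipeline described in the abstract.
    # Column names, the CSV path, and the parameter grid are hypothetical.
    import pandas as pd
    from sklearn.model_selection import train_test_split, GridSearchCV
    from sklearn.linear_model import LinearRegression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error, r2_score
    from xgboost import XGBRegressor
    from lightgbm import LGBMRegressor

    # Weather variables used as predictors of the hourly dose rate (nSv h-1).
    FEATURES = ["temperature", "humidity", "cumulative_precipitation",
                "wind_direction", "wind_speed", "local_pressure",
                "sea_level_pressure", "solar_radiation"]
    TARGET = "dose_rate"

    df = pd.read_csv("jangseong_weather_dose.csv")  # hypothetical file name
    X_train, X_test, y_train, y_test = train_test_split(
        df[FEATURES], df[TARGET], test_size=0.2, random_state=42)

    models = {
        "MLR": LinearRegression(),
        "RF": RandomForestRegressor(random_state=42),
        "XGBoost": XGBRegressor(random_state=42),
        "LightGBM": LGBMRegressor(random_state=42),
    }

    # Baseline comparison on RMSE and R-Squared, as in the abstract.
    for name, model in models.items():
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        rmse = mean_squared_error(y_test, pred) ** 0.5
        print(f"{name}: RMSE={rmse:.2f}, R2={r2_score(y_test, pred):.2f}")

    # Grid-search hyperparameter optimization for XGBoost (example grid only).
    param_grid = {"n_estimators": [100, 300, 500],
                  "max_depth": [3, 6, 9],
                  "learning_rate": [0.01, 0.1, 0.3]}
    search = GridSearchCV(XGBRegressor(random_state=42), param_grid,
                          scoring="neg_root_mean_squared_error", cv=5)
    search.fit(X_train, y_train)
    print("Best parameters:", search.best_params_)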

Keywords

Acknowledgements

This research was supported by the Nuclear Basic Research Support Program funded by the National Research Foundation of Korea (No. 2022M2D2A201634122).

References

  1. Korea Meteorological Administration. Weather data from Jangseong, Jeollanam-do (Feb. 2022 - Mar. 2023).
  2. RMTEC. Airborne dose rate (nSv h-1) data (Feb. 2022 - Mar. 2023).
  3. Lee S. Environmental factors and meteorological variables inside a radiation meter. Chosun University.
  4. Breiman L. 2001. Random Forests. Machine Learning 45:5-32. https://doi.org/10.1023/A:1010933404324.
  5. Chen T and Guestrin C. 2016. XGBoost: A scalable tree boosting system. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16), August 2016, pp. 785-794. https://doi.org/10.1145/2939672.2939785.
  6. Ke G, Meng Q and Finley T. 2017. LightGBM: A highly efficient gradient boosting decision tree. Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS '17), December 2017, pp. 3149-3157. https://doi.org/10.5555/3294996.3295074.