Analysis of bias correction performance of satellite-derived precipitation products by deep learning model

  • Le, Xuan-Hien (Disaster Prevention Emergency Management Institute, Kyungpook National University) ;
  • Nguyen, Giang V. (Dept. of Advanced Science and Technology Convergence, Kyungpook National University) ;
  • Jung, Sungho (Dept. of Advanced Science and Technology Convergence, Kyungpook National University) ;
  • Lee, Giha (Dept. of Advanced Science and Technology Convergence, Kyungpook National University)
  • Published : 2022.05.19

Abstract

Spatiotemporal precipitation data are among the primary quantities in hydrological and climatological studies. Although the estimation of these data has made considerable progress owing to advances in remote sensing, the discrepancy between satellite-derived precipitation product (SPP) data and observed data remains substantial. This study proposes an effective deep learning model (DLM) for bias correction of SPPs. Three SPPs with a spatial resolution of 0.25°, namely TRMM (Tropical Rainfall Measuring Mission), CMORPH (CPC Morphing technique), and PERSIANN-CDR (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks), are exploited for bias correction, and APHRODITE (Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation) data are used as a benchmark to evaluate the effectiveness of the DLM. We selected the Mekong River Basin as the case study area because it is one of the largest watersheds in the world and spans many countries. The adjusted datasets demonstrate the strong performance of the DLM in bias correction of SPPs in both spatial and temporal evaluations. The findings of this study indicate that a DLM can generate reliable estimates for gridded satellite-based precipitation bias correction.
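To make the bias-correction setup concrete, the sketch below trains a tiny one-hidden-layer neural network (NumPy only) to map a synthetically biased "satellite" precipitation series toward a synthetic "observed" reference. This is a hypothetical, minimal stand-in for the paper's DLM; the data, network size, and training settings are illustrative assumptions, not the authors' actual model or the TRMM/CMORPH/PERSIANN-CDR and APHRODITE datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: a reference ("observed") daily precipitation
# series and a satellite product with multiplicative + additive bias and noise.
obs = rng.gamma(shape=2.0, scale=5.0, size=(2000, 1))         # mm/day
spp = 0.7 * obs + 2.0 + rng.normal(0.0, 1.0, size=obs.shape)  # biased product

# Standardize the input for stable gradient-descent training.
x = (spp - spp.mean()) / spp.std()

# One hidden ReLU layer trained by plain gradient descent on mean squared error.
W1 = rng.normal(0.0, 0.1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 1e-2

for _ in range(2000):
    h = np.maximum(x @ W1 + b1, 0.0)            # forward pass (ReLU)
    pred = h @ W2 + b2
    err = pred - obs                            # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err / len(obs); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0.0)               # backprop through ReLU
    gW1 = x.T @ dh / len(obs);  gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Apply the trained correction and compare mean bias before/after.
corrected = np.maximum(x @ W1 + b1, 0.0) @ W2 + b2
print("mean bias before:", float((spp - obs).mean()))
print("mean bias after: ", float((corrected - obs).mean()))
```

In practice a gridded model (e.g. a convolutional network over the 0.25° fields) would replace this pointwise network, but the training loop, the observed-data benchmark, and the before/after bias comparison follow the same pattern.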

Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2020R1A2C1102758).