• Title/Summary/Keyword: Interpolated data


Quality Test and Control of Kinematic DGPS Survey Results

  • Lim, Sam-Sung
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.10 no.5 s.23
    • /
    • pp.75-80
    • /
    • 2002
  • Depending upon geographical features and surrounding error sources in the survey field, inaccurate positioning is inevitable in a kinematic DGPS survey. Therefore, a data-inaccuracy detection algorithm and an interpolation algorithm are essential to meet the requirements of a digital map. In this study, GPS characteristics are taken into account to develop the data-inaccuracy detection algorithm. Then, the data-interpolation algorithm is obtained based on the feature type of the survey. A digital map for 20 km of a rural highway is produced by the kinematic DGPS survey, and the features of interest are lines associated with the road. Since the vertical variation of GPS data is relatively large, the trimmed mean of the vertical variation is used as the criterion for inaccuracy detection. Four cases of 0.5%, 1%, 2.5% and 5% trimming have been tested; the resulting criteria are 69 cm, 65 cm, 61 cm and 42 cm, respectively. For curved-line features, cubic spline interpolation is used to correct the inaccurate data; when the feature is more or less a straight line, the interpolation is done with a linear polynomial. Differences between the actual distances and the interpolated distances are a few centimeters in RMS.

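As a rough illustration of the detection-and-repair idea in the abstract above, the Python sketch below flags fixes whose vertical jump exceeds a trimmed-mean criterion and re-interpolates them with a cubic spline. The function name, the 5% trimming default and the assumption of a strictly increasing chainage are illustrative choices, not the paper's implementation.

```python
# Hypothetical sketch: flag suspect kinematic DGPS fixes by a trimmed-mean
# criterion on vertical variation, then repair them with a cubic spline
# (the approach described for curved features).
import numpy as np
from scipy.stats import trim_mean
from scipy.interpolate import CubicSpline

def repair_track(chainage, height, trim=0.05):
    """chainage: along-road distance [m], strictly increasing; height: heights [m]."""
    dz = np.abs(np.diff(height))              # vertical variation between fixes
    threshold = trim_mean(dz, trim)           # trimmed mean as detection criterion
    bad = np.zeros_like(height, dtype=bool)
    bad[1:] = dz > threshold                  # mark fixes that jump too much
    good = ~bad
    spline = CubicSpline(chainage[good], height[good])
    repaired = height.copy()
    repaired[bad] = spline(chainage[bad])     # re-interpolate the flagged fixes
    return repaired, threshold
```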

Application of Convolutional Neural Networks (CNN) for Bias Correction of Satellite Precipitation Products (SPPs) in the Amazon River Basin

  • Alena Gonzalez Bevacqua;Xuan-Hien Le;Giha Lee
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2023.05a
    • /
    • pp.159-159
    • /
    • 2023
  • The Amazon River basin is one of the largest basins in the world, and its ecosystem is vital for biodiversity, hydrology, and climate regulation. Thus, understanding its hydrometeorological processes is essential to the maintenance of the Amazon River basin. However, monitoring the basin remains difficult because of its size and the low density of its monitoring gauge network. To address these issues, remote-sensing products have been widely used, yet they have limitations of their own. Therefore, this study performs bias correction to improve the accuracy of Satellite Precipitation Products (SPPs) in the Amazon River basin. We use 331 rainfall stations for the observed data and two daily gridded satellite precipitation datasets (CHIRPS, TRMM). Owing to the limitations of the observed data, the analysis period was set from 1 January 1990 to 31 December 2010. The observed data were interpolated to the same resolution as the SPP data using the IDW method. For bias correction, we use convolutional neural networks (CNN) combined with an autoencoder architecture (ConvAE). To evaluate the bias-correction performance, we used statistical indicators such as NSE, RMSE, and MAD. These results can improve the quality of precipitation data in the Amazon River basin, improving its monitoring and management.

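The gauge-to-grid step mentioned in the abstract can be sketched with a plain inverse-distance-weighting routine; the power of 2, the lon/lat grid layout and the use of degree distances are illustrative assumptions, not the study's exact configuration.

```python
# Hypothetical sketch: IDW interpolation of station rainfall onto a regular
# grid matching the satellite product's resolution.
import numpy as np

def idw_grid(st_lon, st_lat, st_rain, grid_lon, grid_lat, power=2.0):
    """Interpolate station rainfall (1-D arrays) onto a 2-D lon/lat grid."""
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    # distance from every grid cell to every station (degrees; fine for a sketch)
    d = np.sqrt((glon[..., None] - st_lon) ** 2 + (glat[..., None] - st_lat) ** 2)
    d = np.maximum(d, 1e-6)                 # avoid division by zero at a station
    w = 1.0 / d ** power                    # inverse-distance weights
    return (w * st_rain).sum(axis=-1) / w.sum(axis=-1)
```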

Application of Artificial Neural Network for estimation of daily maximum snow depth in Korea (우리나라에서 일최심신적설의 추정을 위한 인공신경망모형의 활용)

  • Lee, Geon;Lee, Dongryul;Kim, Dongkyun
    • Journal of Korea Water Resources Association
    • /
    • v.50 no.10
    • /
    • pp.681-690
    • /
    • 2017
  • This study estimated the daily maximum snow depth on the Korean Peninsula using an Artificial Neural Network (ANN) model. First, the optimal ANN structure was determined through a trial-and-error approach. As a result, daily precipitation, daily mean temperature, and daily minimum temperature were chosen as the input data of the ANN. The number of hidden layers was set to 1 and the number of nodes in the hidden layer to 10. When observed values were used as the input data of the ANN model, the cross-validation correlation coefficient was 0.87, which is higher than that obtained when the daily maximum snow depth was spatially interpolated using the Ordinary Kriging method (0.40). To investigate the performance of the ANN model in estimating the daily maximum snow depth of ungauged areas, the input data of the ANN model were spatially interpolated using Ordinary Kriging; in this case, a correlation coefficient of 0.49 was obtained. The performance of the ANN model in mountainous areas above 200 m above sea level was found to be somewhat lower than in the rest of the study area. This result implies that the ANN model can be used effectively for accurate and immediate estimation of the maximum snow depth over the whole country.
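
A minimal sketch of the reported network structure (one hidden layer with ten nodes, fed by daily precipitation, mean temperature and minimum temperature) could look as follows in scikit-learn; the activation, iteration count and cross-validation setup are assumptions rather than the study's settings.

```python
# Hypothetical sketch: a one-hidden-layer, ten-node ANN mapping daily weather
# inputs to daily maximum snow depth, with a cross-validated skill score.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def fit_snow_ann(precip, t_mean, t_min, snow_depth):
    """precip, t_mean, t_min, snow_depth: 1-D daily arrays for one station."""
    X = np.column_stack([precip, t_mean, t_min])     # inputs named in the paper
    ann = MLPRegressor(hidden_layer_sizes=(10,),     # one hidden layer, 10 nodes
                       activation="tanh", max_iter=5000, random_state=0)
    r2 = cross_val_score(ann, X, snow_depth, cv=5, scoring="r2").mean()
    return ann.fit(X, snow_depth), r2                # fitted model and CV skill
```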

Generating Cartesian Tool Paths for Machining Sculptured Surfaces from 3D Measurement Data (3차원 측정자료부터 자유곡면의 가공을 위한 공구경로생성)

  • Ko, Byung-Chul;Kim, Kwang-Soo
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.19 no.3
    • /
    • pp.123-137
    • /
    • 1993
  • In this paper, an integrated approach is proposed to generate gouging-free Cartesian tool paths for machining sculptured surfaces from 3D measurement data. The integrated CAD/CAM system consists of two modules: an offset-surface module and a Cartesian tool-path module. The offset-surface module generates an offset surface of an object from its 3D measurement data, using an offsetting method and a surface-fitting method. The offsetting is based on the idea that the envelope of an inverted tool generates an offset surface without self-intersection as the center of the inverted tool moves along the surface of an object. Surface fitting is the process of constructing a compact representation to model the surface of an object based on a fairly large number of data points. The resulting offset surface is a composite Bezier surface without self-intersection. When an appropriate tool-approach direction is selected, the tool-path module generates Cartesian tool paths while the deviation of the tool paths from the surface stays within the user-specified tolerance. The tool-path module is a two-step process. The first step adaptively subdivides the offset surface into subpatches until the thickness of each subpatch is small enough to satisfy the user-defined tolerance. The second step generates the Cartesian tool paths by calculating the intersection of the slicing planes and the adaptively subdivided subpatches. This approach generates gouging-free Cartesian CL tool paths and optimizes the cutter movements by minimizing the number of interpolated points.

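The basic offsetting step, pushing each surface point outward by the tool radius along the unit normal, can be sketched for a single bicubic Bezier patch as below. The envelope-of-the-inverted-tool construction and self-intersection removal described in the paper are omitted, and the sampling density, outward-normal orientation and helper names are assumptions.

```python
# Hypothetical sketch: sample a bicubic Bezier patch and offset the samples
# by a ball-end tool radius along the (finite-difference) surface normal.
import numpy as np
from math import comb

def bernstein(i, t):
    return comb(3, i) * t**i * (1 - t)**(3 - i)

def bezier_point(ctrl, u, v):
    """ctrl: (4, 4, 3) NumPy control net; returns the surface point at (u, v)."""
    return sum(bernstein(i, u) * bernstein(j, v) * ctrl[i, j]
               for i in range(4) for j in range(4))

def offset_points(ctrl, radius, n=20, eps=1e-4):
    """Sample the patch and push each sample outward by the tool radius."""
    pts = []
    for u in np.linspace(eps, 1 - eps, n):
        for v in np.linspace(eps, 1 - eps, n):
            p = bezier_point(ctrl, u, v)
            du = bezier_point(ctrl, u + eps, v) - bezier_point(ctrl, u - eps, v)
            dv = bezier_point(ctrl, u, v + eps) - bezier_point(ctrl, u, v - eps)
            nrm = np.cross(du, dv)
            nrm /= np.linalg.norm(nrm)        # unit normal (assumed outward)
            pts.append(p + radius * nrm)      # offset along the normal
    return np.array(pts)
```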

The Development of Technique for the Visualization of Geological Information Using Geostatistics (지구통계학을 활용한 지반정보 가시화 기법 개발)

  • 송명규;김진하;황제돈;김승렬
    • Proceedings of the Korean Geotechnical Society Conference
    • /
    • 2001.03a
    • /
    • pp.501-508
    • /
    • 2001
  • A graph or topographic map can often convey larger amounts of information in a shorter time than ordinary text-based methods. To visualize information precisely it is necessary to collect all the geological information at the design stage, but in practice it is almost impossible to bore or explore the entire area to gather the required data. Tunnel engineers therefore have to rely on expert judgement based on a limited number of exploration and experimental results. In this study, several programs are developed to handle the results of geological investigations with various data-processing techniques, and a typical case study is presented. For the electrical survey, eleven points are chosen in the valley to measure resistivity using a Schlumberger array. The measured data are interpolated in three-dimensional space by kriging, and the distribution of resistivity is visualized to find weak or fractured zones. Regression analyses were performed to find the correlation length, which appears to be around 5 to 20 meters in depth. No nugget effect is assumed, and the topographic map, geologic formation, fault zone, joint geometry and the distribution of resistivity are successfully visualized using the proposed technique.

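A bare-bones ordinary-kriging routine with an exponential variogram and no nugget, in the spirit of the interpolation described above, might look like the sketch below; the sill and range values are placeholders, not the fitted parameters of the study.

```python
# Hypothetical sketch: ordinary kriging of scattered 3-D resistivity samples
# with an exponential variogram and no nugget effect.
import numpy as np

def exp_variogram(h, sill=1.0, rng=10.0):      # rng ~ 5-20 m correlation length
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xyz, values, query, sill=1.0, rng=10.0):
    """xyz: (n, 3) sample coordinates; values: (n,); query: (m, 3) points."""
    n = len(values)
    h = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(h, sill, rng)
    A[-1, -1] = 0.0                            # Lagrange-multiplier row/column
    est = np.empty(len(query))
    for k, q in enumerate(query):
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(xyz - q, axis=1), sill, rng)
        w = np.linalg.solve(A, b)
        est[k] = w[:n] @ values                # kriged estimate at the query point
    return est
```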

3D Data Compression by Modulating Function Based Decimation (변조함수를 이용한 decimation기법에 의한 3D 데이터 압축)

  • Yang, Hun-Gi;Lee, Seung-Hyeon;Gang, Bong-Sun
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.37 no.5
    • /
    • pp.16-22
    • /
    • 2000
  • This paper presents a compression algorithm applicable to transmitting HPO hologram data. The proposed algorithm exploits a modulating function to compress the bandwidth of the hologram pattern, allowing decimation due to the relaxed Nyquist sampling constraints. At the receiver, the compressed data are interpolated and then compensated by dividing by the modulating function. We also present the compression rate and analyze the resolution of the reconstructed image and the periodicity of harmonic interferences. Finally, we show the validity of the proposed algorithm by simulation, in which an image reconstructed from undersampled data is compared with one obtained through modulating-function decimation, interpolation and compensation.

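One possible reading of the modulate, decimate, interpolate and compensate chain is sketched below on a toy one-dimensional fringe signal; the chirp carrier, decimation factor and linear interpolation are illustrative assumptions and not the paper's actual hologram data or rates.

```python
# Hypothetical 1-D sketch: a matched modulating function removes the fast chirp
# so the signal can be decimated; the receiver interpolates back to the full
# grid and compensates by dividing by the modulating function.
import numpy as np

N, D = 4096, 16                        # full-rate samples and decimation factor
x = np.linspace(-1.0, 1.0, N)
carrier = 300.0                        # chirp rate standing in for the fringe carrier
field = np.exp(1j * (carrier * x**2 + 2 * np.pi * 3 * x))   # toy hologram line

m = np.exp(-1j * carrier * x**2)       # modulating function matched to the chirp
baseband = field * m                   # slowly varying -> relaxed Nyquist limit
compressed = baseband[::D]             # decimate: keep only every D-th sample
xc = x[::D]

# Receiver: interpolate to the full grid, then divide by the modulating function.
recovered = (np.interp(x, xc, compressed.real)
             + 1j * np.interp(x, xc, compressed.imag))
reconstructed = recovered / m          # compensation restores the fringe pattern
err = np.max(np.abs(reconstructed - field))   # small: the baseband is smooth
```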

Spatial interpolation of geotechnical data: A case study for Multan City, Pakistan

  • Aziz, Mubashir;Khan, Tanveer A.;Ahmed, Tauqir
    • Geomechanics and Engineering
    • /
    • v.13 no.3
    • /
    • pp.475-488
    • /
    • 2017
  • Geotechnical data contribute substantially to the cost of engineering projects due to the increasing cost of site investigations. Existing information in the form of soil maps can save considerable time and expense when deciding the scope and extent of site exploration for a proposed project site. This paper presents spatial interpolation of data obtained from soil-investigation reports of different construction sites and the development of soil maps for geotechnical characterization of the Multan area using ArcGIS. The subsurface conditions of the study area have been examined in terms of soil type and standard penetration resistance. The Inverse Distance Weighting method in the Spatial Analyst extension of ArcMap 10 has been employed to develop zonation maps at different depths of the study area. Each depth level has been interpolated as a surface to create zonation maps for soil type and standard penetration resistance. Correlations based on linear regression of standard penetration resistance values with depth have been presented for quick estimation of the strength and stiffness of soil during the preliminary planning and design stage of a proposed project in the study area. Such information helps engineers use data derived from nearby sites, or sites with similar subsoils subjected to similar geological processes, to build a preliminary ground model for a new site. Moreover, reliable information on the geometry and engineering properties of underground layers makes projects safer and more economical.
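
The depth correlation mentioned above amounts to a simple least-squares fit of standard penetration resistance (SPT-N) against depth; the sketch below uses made-up borehole readings purely for illustration.

```python
# Hypothetical sketch: linear regression of SPT-N against depth for quick
# preliminary estimates of soil strength and stiffness.
import numpy as np

def spt_depth_regression(depth_m, spt_n):
    """Fit N = a * depth + b by least squares and return (a, b, r)."""
    a, b = np.polyfit(depth_m, spt_n, deg=1)
    r = np.corrcoef(depth_m, spt_n)[0, 1]
    return a, b, r

# Example with made-up borehole readings (illustrative only):
depth = np.array([1.5, 3.0, 4.5, 6.0, 7.5, 9.0])
n_val = np.array([8, 11, 15, 17, 22, 25])
slope, intercept, r = spt_depth_regression(depth, n_val)
```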

Real-time SCR-HP (Selective catalytic reduction - high pressure) valve temperature collection and failure prediction using ARIMA (ARIMA를 활용한 실시간 SCR-HP 밸브 온도 수집 및 고장 예측)

  • Lee, Suhwan;Hong, Hyeonji;Park, Jisoo;Yeom, Eunseop
    • Journal of the Korean Society of Visualization
    • /
    • v.19 no.1
    • /
    • pp.62-67
    • /
    • 2021
  • Selective catalytic reduction (SCR) is an exhaust-gas reduction device that removes nitrogen oxides (NOx). SCR operation on a ship can be controlled through valves to minimize the economic loss from the SCR. The valve in the SCR high-pressure (HP) system is directly connected to the engine exhaust and operates at high temperature and high pressure. Long-term thermal deformation induced by engine heat weakens the sealing of the valve, which can lead to unexpected failures during ship sailing. To prevent such failures due to long-term valve thermal deformation, a failure-prediction system using an autoregressive integrated moving average (ARIMA) model was proposed. Based on a heating experiment, virtual data mimicking the temperature range around the SCR-HP valve were produced. By detecting abnormal temperature rises and falls based on the short-term ARIMA prediction, an algorithm determines whether the present temperature data are required for failure prediction. The signal processed by the data-collection algorithm was interpolated for the failure prediction. By comparing the mean absolute error (MAE) and root mean square error (RMSE), the ARIMA model and a suitable prediction instant were determined.
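
A hedged sketch of the short-term ARIMA detection step might look as follows; the model order, window length and three-sigma threshold are assumptions, not the values used in the study.

```python
# Hypothetical sketch: fit a rolling short-horizon ARIMA model to the valve
# temperature series and flag samples that depart strongly from the
# one-step-ahead forecast (abnormal rise or fall).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def detect_abnormal_steps(temps, order=(2, 1, 2), window=200, k=3.0):
    """temps: 1-D temperature series; returns indices of suspect samples."""
    flags = []
    for t in range(window, len(temps) - 1):
        fit = ARIMA(temps[t - window:t], order=order).fit()
        pred = fit.forecast(steps=1)[0]              # short-term prediction
        resid_sd = np.std(fit.resid)
        if abs(temps[t] - pred) > k * resid_sd:      # abnormal rise or fall
            flags.append(t)
    return flags
```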

Estimation of Net Flux of Water Mass and Tidal Prism at a Tidal Entrance through Bottom Tracking with ADCP (단면 유속관측을 통한 조석 유입구에서의 단면통과 유량 및 조량 산정)

  • Yang, Su-Hyun;Kim, Yong-Muk;Hwang, Kyu-Nam
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.28 no.3
    • /
    • pp.160-170
    • /
    • 2016
  • In this study, bottom-tracking observations at a tidal entrance on the Mokpo coast are performed using an ADCP in order to estimate the net flux of water mass and the tidal prism. First, a coordinate rotation accounting for the orientation of the cross-section is applied to the observed raw data to derive the predominant velocity component. The raw data are then converted to normalized sigma coordinates, and the blank-zone data near the water surface and bottom are interpolated using the von Karman equation. The net flux of water mass is calculated quantitatively from the interpolated data; the results represent well the ebb-dominant character of the Mokpo coast as well as the change of net flux with tide. Also, by complementing the definition of tidal prism proposed in past studies, a definition of tidal prism that includes the tidal condition is re-established. Based on the new definition, the tidal prism at a tidal entrance is estimated quantitatively from ADCP bottom-tracking data for the first time domestically. The calculated results are in good agreement with those of previous studies.
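
The blank-zone filling and flux integration can be sketched as below, fitting a von Karman log-law profile to the lowest valid bins and summing velocity times area over the verticals; the number of bins used in the fit and the rectangular-panel integration are simplifying assumptions, not the study's exact procedure.

```python
# Hypothetical sketch: fill unmeasured near-bed bins with a von Karman log-law
# profile, then integrate velocity over the cross-section for the discharge.
import numpy as np

KAPPA = 0.41                                   # von Karman constant

def fill_bottom_loglaw(z, u, n_fit=3):
    """z: heights above bed (ascending, > 0); u: velocities with NaN blank zones."""
    valid = ~np.isnan(u)
    zf, uf = z[valid][:n_fit], u[valid][:n_fit]      # lowest measured bins
    # u = (u*/kappa) * ln(z/z0)  ->  linear in ln(z): u = A ln(z) + B
    A, B = np.polyfit(np.log(zf), uf, 1)
    filled = u.copy()
    filled[~valid] = A * np.log(z[~valid]) + B       # extrapolate into blank zone
    return filled

def section_flux(widths, depths, mean_u):
    """Sum u_i * depth_i * width_i over verticals to get discharge [m^3/s]."""
    return np.sum(np.asarray(mean_u) * np.asarray(depths) * np.asarray(widths))
```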

Possibility analysis of future droughts using long short-term memory and standardized groundwater level index (LSTM과 SGI를 이용한 미래 가뭄 발생 가능성 분석)

  • Lim, Jae Deok;Yang, Jeong-Seok
    • Journal of Korea Water Resources Association
    • /
    • v.53 no.2
    • /
    • pp.131-140
    • /
    • 2020
  • The purpose of this study is to analyze the possibility of future droughts by calculating the Standardized Groundwater level Index (SGI) after predicting the groundwater level using a Long Short-Term Memory (LSTM) model. The groundwater level of the Kumho River basin was predicted for the next three years using the LSTM model, and the model was validated through the RMSE after training on the observation data excluding the last three years. The temporal SGI was calculated using the prediction data and the observation data. The calculated SGI was interpolated within the study area, and the spatial SGI was calculated as the average value for each catchment using the interpolated SGI. The possibility of spatio-temporal drought was analyzed using the calculated spatio-temporal SGI, and it is confirmed that there are spatio-temporal differences in the possibility of drought. Improving the deep-learning model and diversifying the validation methods are expected to yield more reliable predictions, and expanding the study area would allow a nationwide response to drought and provide important information for future water-resources management.
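
The SGI step can be sketched with a month-by-month normal-score transform, which is one common way the index is defined; the plotting-position formula below is an assumption rather than the exact method of the study.

```python
# Hypothetical sketch: compute an SGI series by converting monthly groundwater
# levels to normal scores within each calendar month.
import numpy as np
import pandas as pd
from scipy.stats import norm

def sgi(monthly_levels: pd.Series) -> pd.Series:
    """monthly_levels: groundwater level indexed by a monthly DatetimeIndex."""
    out = pd.Series(index=monthly_levels.index, dtype=float)
    for month in range(1, 13):
        sel = monthly_levels[monthly_levels.index.month == month].dropna()
        n = len(sel)
        if n == 0:
            continue
        ranks = sel.rank()                             # 1 .. n within this month
        out.loc[sel.index] = norm.ppf(ranks / (n + 1.0))   # plotting-position scores
    return out
```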