• Title/Summary/Keyword: prediction


Groundwater Level Responses due to Moderate·Small Magnitude Earthquakes Using 1Hz Groundwater Data (1Hz 지하수 데이터를 활용한 중·소규모 지진으로 인한 지하수위 반응)

  • Gahyeon Lee;Jae Min Lee;Dongkyu Park;Dong-Hun Kim;Jaehoon Jung;Soo-Hyoung Lee
    • Journal of Soil and Groundwater Environment
    • /
    • v.29 no.4
    • /
    • pp.32-43
    • /
    • 2024
  • Recently, numerous earthquakes have caused significant casualties and property damage worldwide, including major events in 2023 (Türkiye, M7.8; Morocco, M6.8) and 2024 (Noto Peninsula, Japan, M7.6; Taiwan, M7.4). In South Korea, the frequency of detectable and noticeable earthquakes has been gradually increasing since the M5.8 Gyeongju Earthquake. Notable events since 2020 include those in Jeju (M4.9), Goesan (M4.1), the East Sea (M4.5), and Gyeongju (M4.0). This study, for the first time in South Korea, monitored groundwater levels and temperatures at a 1 Hz frequency to observe groundwater responses to the moderate and small earthquakes that primarily occur within the country. Between April 23 and May 22, 2023, 17 earthquakes with magnitudes ranging from M2.0 to M4.5 were reported in the East Sea region. Analysis of groundwater level responses at the Gangneung observation station revealed fluctuations associated with five of these events. The 1 Hz observation data clearly showed groundwater level changes even for small earthquakes, indicating that groundwater is highly sensitive to the frequent small earthquakes recently occurring in South Korea. The analysis confirmed that the maximum amplitude of earthquake-induced groundwater level change is related to both the earthquake's magnitude and the distance from the epicenter. These findings highlight the importance of precise 1 Hz-level observations in earthquake-groundwater research. This study provides foundational data for earthquake monitoring and prediction and emphasizes the need for ongoing research into monitoring the changes in groundwater parameters (such as aquifer characteristics, quantity/quality, and contaminant migration) induced by earthquakes of various magnitudes that may occur within the country in the future.

In a Time of Change: Reflections on Humanities Research and Methodologies (변화의 시대, 인문학적 변화 연구와 방법에 대한 고찰)

  • Kim Dug-sam
    • Journal of the Daesoon Academy of Sciences
    • /
    • v.49
    • /
    • pp.265-294
    • /
    • 2024
  • This study begins with a question about research methods in the humanities. It is grounded in the humanities, focusing on the changes that have brought both light and darkness to the field, and on discourse regarding the research methods that explore those changes. If the role of the humanities, unlike the sciences, is to prevent the proverbial "gray rhino," and if the humanities have a role to play in moderating the uncontrollable development of the sciences, what kind of research methods should the humanities pursue? Furthermore, what kind of research methods should be pursued in the humanities in line with the development of the sciences and the changing environment? This study discusses research methods in the humanities as follows. First, in Section 2, I advocate collaboration between humanistic and scientific methods, utilizing the accumulated assets produced by the humanities while continuously introducing scientific methods. In engineering and the natural sciences, prediction of change is highly precise and far-reaching; however, it is difficult to approach change in these fields in a macro or integrated manner. Approaches that lack precision are unwelcome in disciplines that deal with the real world, and this is primarily where the humanities bear responsibility. Where science focuses on precision, the humanities focus on questions of essence, because while the ends of change have varied throughout history, the nature of change has not varied that much. Section 3 then discusses the changing environment, proposes changes to humanistic research methods, reviews and proposes inductive methods for studying change, and makes some suggestions for humanistic research on change. The humanities data accumulated by humankind in the past are abundant and have a wide range of applications.
In the future, we should not only actively accept the results of scientific advances but also actively seek systematic humanistic approaches and utilize them across disciplinary boundaries to find solutions at the intersection of scientific methods and humanistic assets.

Comparing the Performance of a Deep Learning Model (TabPFN) for Predicting River Algal Blooms with Varying Data Composition (데이터 구성에 따른 하천 조류 예측 딥러닝 모형 (TabPFN) 성능 비교)

  • Hyunseok Yang;Jungsu Park
    • Journal of Wetlands Research
    • /
    • v.26 no.3
    • /
    • pp.197-203
    • /
    • 2024
  • Algal blooms in rivers can negatively affect water source management and water treatment processes, necessitating continuous management. In this study, a multi-class classification model was developed to predict the concentration of chlorophyll-a (chl-a), one of the key indicators of algal blooms, using the Tabular Prior-data Fitted Network (TabPFN), a novel deep learning algorithm known for its relatively superior performance on small tabular datasets. The model was developed using daily observation data collected at the Buyeo water quality monitoring station from January 1, 2014, to December 31, 2022. The collected data were averaged to construct input datasets with measurement intervals of 1, 3, 6, and 12 days. Comparison of the four models built on these intervals showed that performance remains stable even as the measurement interval lengthens and the number of observations shrinks. The macro averages for the four models were as follows: precision 0.77, 0.76, 0.83, 0.84; recall 0.63, 0.65, 0.66, 0.74; F1-score 0.67, 0.69, 0.71, 0.78. The weighted averages were: precision 0.76, 0.77, 0.81, 0.84; recall 0.76, 0.78, 0.81, 0.85; F1-score 0.74, 0.77, 0.80, 0.84. This study demonstrates that a chl-a prediction model built with TabPFN performs stably even with small-scale input data, verifying the feasibility of its application in fields where the input data available for model construction are limited.
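
The macro and weighted averages reported above aggregate per-class metrics in two different ways: an unweighted mean over classes versus a mean weighted by class support. A minimal stdlib sketch with hypothetical per-class F1 values and supports (not the study's per-class results):

```python
def macro_avg(per_class):
    # Macro average: unweighted mean over all classes
    return sum(per_class.values()) / len(per_class)

def weighted_avg(per_class, support):
    # Weighted average: mean over classes weighted by class support
    total = sum(support.values())
    return sum(per_class[c] * support[c] / total for c in per_class)

# Hypothetical per-class F1 scores and supports for a 3-class chl-a model
f1 = {"low": 0.9, "mid": 0.7, "high": 0.5}
support = {"low": 60, "mid": 30, "high": 10}

macro = round(macro_avg(f1), 3)        # 0.7
weighted = round(weighted_avg(f1, support), 3)  # 0.8
```

When classes are imbalanced (as algal-bloom severity classes typically are), the macro average penalizes poor minority-class performance more than the weighted average does, which is consistent with the macro figures above being lower than the weighted ones.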

Real-Time 3D Volume Deformation and Visualization by Integrating NeRF, PBD, and Parallel Resampling (NeRF, PBD 및 병렬 리샘플링을 결합한 실시간 3D 볼륨 변형체 시각화)

  • Sangmin Kwon;Sojin Jeon;Juni Park;Dasol Kim;Heewon Kye
    • Journal of the Korea Computer Graphics Society
    • /
    • v.30 no.3
    • /
    • pp.189-198
    • /
    • 2024
  • Research combining deep learning-based models and physical simulation is making important advances in the medical field, extracting the necessary information from medical image data and enabling fast, accurate prediction of skeletal and soft tissue deformation based on physical laws. This study proposes a system that integrates Neural Radiance Fields (NeRF), Position-Based Dynamics (PBD), and Parallel Resampling to generate 3D volume data and to deform and visualize it in real time. NeRF uses 2D images and camera coordinates to produce high-resolution 3D volume data, while PBD enables real-time deformation and interaction through physics-based simulation. Parallel Resampling improves rendering efficiency by dividing the volume into tetrahedral meshes and utilizing GPU parallel processing. The system renders the deformed volume data using ray casting, leveraging GPU parallelism for fast real-time visualization. Experimental results show that the system can generate and deform 3D data without expensive equipment, demonstrating potential applications in engineering, education, and medicine.
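
PBD resolves constraints by iteratively projecting particle positions directly, rather than integrating forces. A minimal sketch of the generic PBD distance-constraint projection step in plain Python (a textbook PBD step under assumed inverse masses, not the paper's implementation):

```python
def project_distance_constraint(p1, p2, rest_len, w1=1.0, w2=1.0):
    """One PBD position-correction step for a distance constraint.
    p1, p2: particle positions (x, y, z); w1, w2: inverse masses."""
    dx = [b - a for a, b in zip(p1, p2)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist == 0 or w1 + w2 == 0:
        return list(p1), list(p2)
    # Constraint C = dist - rest_len; correct positions along the axis,
    # distributing the correction by inverse mass
    s = (dist - rest_len) / ((w1 + w2) * dist)
    p1_new = [a + w1 * s * d for a, d in zip(p1, dx)]
    p2_new = [b - w2 * s * d for b, d in zip(p2, dx)]
    return p1_new, p2_new

# Two particles 2.0 apart, constrained to rest length 1.0:
a, b = (0.0, 0.0, 0.0), (2.0, 0.0, 0.0)
na, nb = project_distance_constraint(a, b, rest_len=1.0)
# both particles move symmetrically toward each other
```

In a full solver this projection runs over all constraints (e.g., the edges of the tetrahedral meshes mentioned above) for several iterations per frame, which is what makes the method amenable to GPU parallelization.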

Predicting 30-day mortality in severely injured elderly patients with trauma in Korea using machine learning algorithms: a retrospective study

  • Jonghee Han;Su Young Yoon;Junepill Seok;Jin Young Lee;Jin Suk Lee;Jin Bong Ye;Younghoon Sul;Se Heon Kim;Hong Rye Kim
    • Journal of Trauma and Injury
    • /
    • v.37 no.3
    • /
    • pp.201-208
    • /
    • 2024
  • Purpose: The number of elderly patients with trauma is increasing; therefore, precise models are necessary to estimate the mortality risk of elderly patients with trauma for informed clinical decision-making. This study aimed to develop machine learning-based models that predict 30-day mortality in severely injured elderly patients with trauma and to compare the predictive performance of various machine learning models. Methods: This study targeted patients aged ≥65 years with an Injury Severity Score of ≥15 who visited the regional trauma center at Chungbuk National University Hospital between 2016 and 2022. Four machine learning models (logistic regression, decision tree, random forest, and eXtreme Gradient Boosting [XGBoost]) were developed to predict 30-day mortality. The models' performance was compared using metrics such as the area under the receiver operating characteristic curve (AUC), accuracy, precision, recall, specificity, and F1 score, as well as Shapley Additive Explanations (SHAP) values and learning curves. Results: The AUC values for logistic regression, decision tree, random forest, and XGBoost were 0.938, 0.863, 0.919, and 0.934, respectively. Among the four models, XGBoost demonstrated superior accuracy, precision, recall, specificity, and F1 score of 0.91, 0.72, 0.86, 0.92, and 0.78, respectively. Analysis of the important features of XGBoost using SHAP revealed that a high Glasgow Coma Scale score was associated with lower mortality probability, while higher counts of transfused red blood cells were positively correlated with mortality probability. The learning curves indicated increased generalization and robustness as the number of training examples increased.
Conclusions: We showed that machine learning models, especially XGBoost, can be used to predict 30-day mortality in severely injured elderly patients with trauma. Prognostic tools utilizing these models are helpful for physicians to evaluate the risk of mortality in elderly patients with severe trauma.
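
The F1 score is the harmonic mean of precision and recall, so the XGBoost figures quoted above can be checked directly:

```python
def f1_score(precision, recall):
    # F1 is the harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

# XGBoost precision and recall reported in the abstract
f1_xgb = f1_score(0.72, 0.86)
# rounds to 0.78, matching the reported F1 score
```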

Clinical implementation of PerFRACTION™ for pre-treatment patient-specific quality assurance

  • Sang-Won Kang;Boram Lee;Changhoon Song;Keun-Yong Eeom;Bum-Sup Jang;In Ah Kim;Jae-Sung Kim;Jin-Beom Chung;Seonghee Kang;Woong Cho;Dong-Suk Shin;Jin-Young Kim;Minsoo Chun
    • Journal of the Korean Physical Society
    • /
    • v.80
    • /
    • pp.516-525
    • /
    • 2022
  • This study assesses the clinical use of the commercial PerFRACTION™ system for patient-specific quality assurance of volumetric-modulated arc therapy. Forty-six pretreatment verification plans for patients treated using a TrueBeam STx linear accelerator for lesions in various treatment sites, such as brain, head and neck (H&N), prostate, and lung, were included in this study. All pretreatment verification plans were generated using the Eclipse treatment planning system (TPS). Dose distributions obtained from an electronic portal imaging device (EPID), ArcCHECK™, and two-dimensional (2D)/three-dimensional (3D) PerFRACTION™ were then compared with the dose distribution calculated by the Eclipse TPS. In addition, the correlation between plan complexity (the modulation complexity score and the leaf travel modulation complexity score) and the gamma passing rates (GPRs) of each quality assurance (QA) system was evaluated by calculating Spearman's rank correlation coefficient (rs) with the corresponding p-values. The GPRs of the 46 patients analyzed with 2D/3D PerFRACTION™ using the 2%/2 mm and 3%/3 mm criteria showed trends very similar to those analyzed with portal dose image prediction (PDIP) and ArcCHECK™, except for those analyzed with ArcCHECK™ using the 2%/2 mm criterion. Mostly weak or moderate correlations between GPRs and plan complexity were observed for all QA systems. For the brain, prostate, and lung cases with lower complexity, the trends of the mean rs between GPRs (using PDIP and 2D/3D PerFRACTION™, for both criteria) and the plan complexity indices were very similar, in contrast to the H&N case. Furthermore, for the high-complexity H&N case, the trend of the mean rs for 2D/3D PerFRACTION™ was similar to that of ArcCHECK™, with a slightly lower correlation than that of PDIP.
This work showed that the performance of 2D/3D PerFRACTION™ for pretreatment patient-specific QA was almost comparable to that of PDIP, although there were small differences from ArcCHECK™ in some cases. Thus, we found that PerFRACTION™ is a suitable QA system for pretreatment patient-specific QA in a variety of treatment sites.
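
Spearman's rank correlation coefficient used above is simply the Pearson correlation computed on ranks. A minimal stdlib sketch with hypothetical GPR/complexity values (not the study's data):

```python
def rank(xs):
    # Assign 1-based ranks, averaging ranks within tie groups
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's r_s = Pearson correlation of the rank vectors
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical example: GPR falls monotonically as plan complexity rises,
# giving a perfect negative rank correlation (r_s = -1)
complexity = [0.1, 0.2, 0.3, 0.5, 0.7]
gpr = [99.5, 99.0, 98.0, 96.5, 95.0]
rs = spearman(complexity, gpr)
```

Because it operates on ranks, rs captures any monotonic GPR-complexity relationship, not only linear ones, which is why it is the usual choice for this kind of QA analysis.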

Comparative assessment of sequential data assimilation-based streamflow predictions using semi-distributed and lumped GR4J hydrologic models: a case study of Namgang Dam basin (준분포형 및 집중형 GR4J 수문모형을 활용한 순차자료동화 기반 유량 예측 특성 비교: 남강댐 유역 사례)

  • Lee, Garim;Woo, Dong Kook;Noh, Seong Jin
    • Journal of Korea Water Resources Association
    • /
    • v.57 no.9
    • /
    • pp.585-598
    • /
    • 2024
  • To mitigate natural disasters and efficiently manage water resources, it is essential to enhance hydrologic prediction while reducing model structural uncertainties. This study analyzed the impact of lumped and semi-distributed GR4J model structures on simulation performance and evaluated uncertainties with and without data assimilation techniques. The Ensemble Kalman Filter (EnKF) and Particle Filter (PF) methods were applied to the Namgang Dam basin. Simulation results showed that the Kling-Gupta efficiency (KGE) index was 0.749 for the lumped model and 0.831 for the semi-distributed model, an 11.0% performance improvement with semi-distributed modeling. Additionally, the impact of uncertainties in meteorological forcings (precipitation and potential evapotranspiration) on data assimilation performance was analyzed. Optimal uncertainty conditions varied by data assimilation method for the lumped model and by sub-basin for the semi-distributed model. Moreover, shortening the calibration period used for data assimilation decreased simulation performance. Overall, the semi-distributed model combined with data assimilation showed improved flood simulation performance compared to the lumped model. Selecting hyper-parameters and calibration periods appropriate to the model structure was crucial for achieving optimal performance.
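
The KGE index used above combines three components: the linear correlation r, the variability ratio α, and the bias ratio β. A minimal sketch of the standard KGE formulation (a generic implementation, not the study's code):

```python
def kge(sim, obs):
    """Kling-Gupta efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    ss = (sum((x - ms) ** 2 for x in sim) / n) ** 0.5
    so = (sum((x - mo) ** 2 for x in obs) / n) ** 0.5
    r = sum((a - ms) * (b - mo) for a, b in zip(sim, obs)) / (n * ss * so)
    alpha = ss / so  # variability ratio
    beta = ms / mo   # bias ratio
    return 1 - ((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2) ** 0.5

# A perfect simulation reproduces the observations exactly, so KGE = 1;
# the reported 0.749 and 0.831 sit below that ideal
obs = [1.0, 2.0, 3.0, 4.0]
score = kge(obs, obs)
```

Unlike the Nash-Sutcliffe efficiency, KGE makes the correlation, variability, and bias contributions explicit, which is useful when comparing model structures such as the lumped and semi-distributed GR4J variants here.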

Study on Method to Develop Case-based Security Threat Scenario for Cybersecurity Training in ICS Environment (ICS 환경에서의 사이버보안 훈련을 위한 사례 기반 보안 위협 시나리오 개발 방법론 연구)

  • GyuHyun Jeon;Kwangsoo Kim;Jaesik Kang;Seungwoon Lee;Jung Taek Seo
    • Journal of Platform Technology
    • /
    • v.12 no.1
    • /
    • pp.91-105
    • /
    • 2024
  • As the number of cases of IT systems being applied to formerly isolated ICS (Industrial Control System) network environments continues to increase, security threats in the ICS environment have risen rapidly. Security threat scenarios help in designing security strategies in cybersecurity training, including the analysis of, prediction of, and response to cyberattacks. For successful cybersecurity training, research is needed into developing valid and reliable security threat scenarios for meaningful training. Therefore, this paper proposes a case-based security threat scenario development methodology for cybersecurity training in the ICS environment. To this end, we develop a methodology consisting of five steps based on the analysis of actual cybersecurity incidents targeting ICS. Threat techniques are standardized into a common form using objective data based on the MITRE ATT&CK framework, and a list of CVEs and CWEs corresponding to each threat technique is then identified. Additionally, vulnerable functions in the software used by ICS assets are analyzed and identified on the basis of the CWE entries. Based on the data generated in the preceding stages, security threat scenarios are developed for cybersecurity training targeting new ICS environments. Verification through a comparative analysis between the proposed methodology and existing research confirmed that the proposed method was more effective than existing methods in terms of scenario validity, appropriateness of evidence, and the variety of scenarios developed.
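
The mapping step described above (standardized threat technique → associated CVE/CWE evidence → training scenario) can be sketched as a simple data structure; all technique and vulnerability identifiers below are placeholders, not the paper's actual mappings:

```python
# Hypothetical evidence table: each standardized technique ID maps to the
# CVE and CWE identifiers collected for it (placeholder values throughout)
technique_evidence = {
    "T0000.001": {"cves": ["CVE-0000-0001"], "cwes": ["CWE-287"]},
    "T0000.002": {"cves": ["CVE-0000-0002"], "cwes": ["CWE-79", "CWE-89"]},
}

def build_scenario(technique_ids, evidence):
    """Assemble an ordered scenario: each step pairs a technique with its
    supporting CVE/CWE evidence, skipping techniques without evidence."""
    return [
        {"technique": t, **evidence[t]}
        for t in technique_ids
        if t in evidence
    ]

# An ordered attack sequence becomes a scenario with per-step evidence
scenario = build_scenario(["T0000.001", "T0000.002"], technique_evidence)
```

Keeping the evidence table separate from the scenario assembly mirrors the paper's staged design: the CVE/CWE identification steps can be updated independently as new incidents are analyzed.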


Discussion on Detection of Sediment Moisture Content at Different Altitudes Employing UAV Hyperspectral Images (무인항공 초분광 영상을 기반으로 한 고도에 따른 퇴적물 함수율 탐지 고찰)

  • Kyoungeun Lee;Jaehyung Yu;Chanhyeok Park;Trung Hieu Pham
    • Economic and Environmental Geology
    • /
    • v.57 no.4
    • /
    • pp.353-362
    • /
    • 2024
  • This study examined the spectral characteristics of sediments according to moisture content using an unmanned aerial vehicle (UAV)-based hyperspectral sensor and evaluated the efficiency of moisture content detection at different flight altitudes. For this purpose, hyperspectral images in the 400-1000 nm wavelength range were acquired and analyzed at altitudes of 40 m and 80 m for sediment samples with various moisture contents. The reflectance of the sediments generally decreased as moisture content increased. Correlation analysis between moisture content and reflectance showed a strong negative correlation (r < -0.8) across the entire 400-900 nm range. The moisture content detection model constructed using the Random Forest technique achieved detection accuracies of RMSE 2.6% and R² 0.92 at 40 m altitude and RMSE 2.2% and R² 0.95 at 80 m altitude, confirming that the difference in accuracy between altitudes was minimal. Variable importance analysis revealed that the 600-700 nm band played a crucial role in moisture content detection. This study is expected to contribute to efficient sediment moisture management and natural disaster prediction in the field of environmental monitoring.
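
The band-wise relationship between moisture content and reflectance reported above is a plain Pearson correlation. A stdlib sketch with illustrative values (not the study's measurements):

```python
def pearson(x, y):
    # Pearson correlation coefficient r between two equal-length series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Illustrative samples: reflectance at one band falls as moisture rises
moisture = [5.0, 10.0, 15.0, 20.0, 25.0]      # moisture content, %
reflectance = [0.42, 0.36, 0.31, 0.27, 0.22]  # reflectance at one band
r = pearson(moisture, reflectance)
# strongly negative, consistent with the r < -0.8 pattern reported above
```

In the study's setting this coefficient would be computed per wavelength band across samples, producing the correlation spectrum from which the influential 600-700 nm region was identified.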

Improving the Accuracy of the Mohr Failure Envelope Approximating the Generalized Hoek-Brown Failure Criterion (일반화된 Hoek-Brown 파괴기준식의 근사 Mohr 파괴포락선 정확도 개선)

  • Youn-Kyou Lee
    • Tunnel and Underground Space
    • /
    • v.34 no.4
    • /
    • pp.355-373
    • /
    • 2024
  • The Generalized Hoek-Brown (GHB) criterion is a nonlinear failure criterion specialized for rock engineering applications and has recently seen increased use. However, the GHB criterion expresses the relationship between the minimum and maximum principal stresses at failure, and when GSI≠100 it has the disadvantage of being difficult to express as an explicit relationship between the normal and shear stresses acting on the failure plane, i.e., as a Mohr failure envelope. This makes it challenging to apply the GHB criterion in numerical analysis techniques such as limit equilibrium analysis, upper-bound limit analysis, and the critical plane approach. Consequently, recent studies have attempted to express the GHB Mohr failure envelope as an approximate analytical formula, and continued attention to this line of research is needed. This study presents improved formulations for the approximate GHB Mohr failure envelope, offering higher accuracy in predicting shear strength than existing formulas. The improved formulation process employs a method to enhance the approximation accuracy of the tangential friction angle and utilizes the tangent line equation of the nonlinear GHB failure envelope to improve the accuracy of the shear strength approximation. In the latter part of this paper, the advantages and limitations of the proposed approximate GHB failure envelopes are discussed in terms of shear strength prediction accuracy and calculation time.
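
For reference, the GHB criterion in principal-stress form, with the standard 2002-edition parameter definitions (quoted from the published criterion, not derived from this abstract):

```latex
% Generalized Hoek-Brown criterion (Hoek, Carranza-Torres & Corkum, 2002)
\sigma_1' = \sigma_3' + \sigma_{ci}\left( m_b \,\frac{\sigma_3'}{\sigma_{ci}} + s \right)^{a}
% with rock-mass parameters
m_b = m_i \exp\!\left(\frac{GSI - 100}{28 - 14D}\right), \qquad
s = \exp\!\left(\frac{GSI - 100}{9 - 3D}\right), \qquad
a = \frac{1}{2} + \frac{1}{6}\left( e^{-GSI/15} - e^{-20/3} \right)
```

When GSI = 100 and D = 0, these reduce to s = 1 and a = 0.5, recovering the original Hoek-Brown form, for which an explicit Mohr envelope is available; for a ≠ 0.5 no simple explicit normal-shear stress relationship exists, which is the situation that motivates the approximate envelopes discussed in the abstract.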