• Title/Summary/Keyword: Model validation

A Study on the Prediction of Community Smart Pension Intention Based on Decision Tree Algorithm

  • Liu, Lijuan;Min, Byung-Won
    • International Journal of Contents
    • /
    • v.17 no.4
    • /
    • pp.79-90
    • /
    • 2021
  • With the deepening of population aging, pensions have become an urgent problem in most countries. Community smart pension services can effectively resolve the problems of traditional pension provision and meet the personalized, multi-level needs of the elderly. To predict the pension intention of elderly community residents more accurately, this paper applies decision tree classification to the pension data. After missing-value processing, normalization, discretization, and data reduction, a discretized sample data set is obtained. Then, by comparing the information gain and information gain ratio of the sample features, a feature ranking is determined and a C4.5 decision tree model is established. Under 10-fold cross-validation the model performs well on accuracy, precision, recall, AUC, and other indicators, with a precision of 89.5%, and can thus provide a basis for government decision-making.
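
A minimal sketch of the workflow the abstract describes (an entropy-based decision tree evaluated with 10-fold cross-validation) is shown below. Note that scikit-learn implements CART rather than C4.5, so the entropy criterion only approximates the gain-ratio splitting used in the paper; the file name and column names are hypothetical placeholders rather than the study's actual data.

```python
# Sketch: discretization + entropy-based decision tree + 10-fold cross-validation.
# scikit-learn's DecisionTreeClassifier is CART, not C4.5; file and column names
# are hypothetical placeholders (binary "intention" label assumed).
import pandas as pd
from sklearn.model_selection import cross_validate
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("pension_survey.csv")                    # hypothetical file
X, y = df.drop(columns=["intention"]), df["intention"]

model = make_pipeline(
    KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile"),  # discretization step
    DecisionTreeClassifier(criterion="entropy", random_state=0),
)

scores = cross_validate(model, X, y, cv=10,
                        scoring=["accuracy", "precision", "recall", "roc_auc"])
for metric, values in scores.items():
    if metric.startswith("test_"):
        print(metric, values.mean())
```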

A new extension of Lindley distribution: modified validation test, characterizations and different methods of estimation

  • Ibrahim, Mohamed;Yadav, Abhimanyu Singh;Yousof, Haitham M.;Goual, Hafida;Hamedani, G.G.
    • Communications for Statistical Applications and Methods
    • /
    • v.26 no.5
    • /
    • pp.473-495
    • /
    • 2019
  • In this paper, a new extension of the Lindley distribution is introduced. Characterizations of the proposed distribution based on truncated moments, the hazard and reverse hazard functions, and conditional expectation are presented. Besides these characterizations, other statistical and mathematical properties of the proposed model are also discussed. Parameter estimation is performed through several classical methods, and Bayes estimation is computed under a gamma informative prior with the squared error loss function. The performance of all estimation methods is studied via Monte Carlo simulation in the mean-square-error sense. The potential of the proposed model is illustrated with two data sets. A modified goodness-of-fit test based on the Nikulin-Rao-Robson statistic is investigated via two examples, and it is observed that the new extension can serve as an alternative lifetime model.
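
As a point of reference for the classical estimation methods mentioned in the abstract, the sketch below fits the baseline one-parameter Lindley distribution, whose density is f(x; θ) = θ²/(1+θ) (1+x) e^{-θx}, by maximum likelihood. It is not the paper's new extension, and the data are simulated placeholders.

```python
# Maximum-likelihood estimation for the classical one-parameter Lindley
# distribution, f(x; theta) = theta^2/(1+theta) * (1+x) * exp(-theta*x).
# Reference sketch only -- the paper's extension has a different density --
# and the data below are simulated placeholders.
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_lik(theta, x):
    if theta <= 0:
        return np.inf
    n = len(x)
    return -(2 * n * np.log(theta) - n * np.log(1 + theta)
             + np.sum(np.log(1 + x)) - theta * np.sum(x))

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=200)        # placeholder lifetime data

res = minimize_scalar(neg_log_lik, args=(x,), bounds=(1e-6, 50), method="bounded")
print("MLE of theta:", res.x)
```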

Analysis of the effect of aged concrete layer on RC beams, and a strengthening method employing carbon-fiber-reinforced polymer (CFRP) sheets.

  • Liana Satlykova;Young Sook Roh
    • Architectural research
    • /
    • v.26 no.2
    • /
    • pp.31-39
    • /
    • 2024
  • This numerical study analyzes the structural behavior of RC beams containing an aged concrete layer and offers a strengthening method using carbon-fiber-reinforced polymer (CFRP) sheets, focusing on modeling and analyzing the flexural performance of aged beams strengthened with CFRP. The study presents an ultimate-load model for CFRP-strengthened RC beams featuring aged concrete layers. Validation through four-point bending tests and finite element modeling demonstrated the efficacy of the model. Findings indicate that CFRP sheets significantly enhance beam strength, particularly in members with aged concrete layers, resulting in increased ultimate load capacity. Moreover, an inverse relationship between ultimate load and concrete layer height was observed, with the CFS-21-15-30 sample exhibiting the most substantial reduction. Validation of the model was performed using finite element analysis conducted in Abaqus.

Failure simulation of nuclear pressure vessel under severe accident conditions: Part II - Failure modeling and comparison with OLHF experiment

  • Eui-Kyun Park;Jun-Won Park;Yun-Jae Kim;Yukio Takahashi;Kukhee Lim;Eung Soo Kim
    • Nuclear Engineering and Technology
    • /
    • v.55 no.11
    • /
    • pp.4134-4145
    • /
    • 2023
  • This paper proposes a strain-based failure model of A533B1 pressure vessel steel to simulate failure, and applies it to OECD Lower Head Failure (OLHF) test simulations for experimental validation. The proposed model uses simple constant and linear functions based on physical failure modes, with the critical strain value determined either from the lower bound of the true fracture strain or from the average total elongation, depending on the temperature. Application to the OLHF tests shows that progressive deformation, failure time, and failure location are well predicted.
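
The temperature-dependent critical-strain criterion described above can be illustrated with a small sketch. Every number in it (temperature threshold, strain levels, slope) is a hypothetical placeholder rather than the calibrated values for A533B1 steel reported in the paper.

```python
# Illustrative sketch of a temperature-dependent, strain-based failure check.
# All numbers (temperature threshold, strain levels, slope) are hypothetical
# placeholders -- the paper calibrates its own values for A533B1 steel.
def critical_strain(temp_c: float) -> float:
    """Piecewise constant/linear critical strain as a function of temperature (degC)."""
    if temp_c < 700.0:                      # hypothetical threshold
        return 0.30                         # constant branch (placeholder value)
    # linear branch above the threshold (placeholder coefficients)
    return 0.30 + 0.0008 * (temp_c - 700.0)

def has_failed(equivalent_plastic_strain: float, temp_c: float) -> bool:
    """Flag failure once the accumulated plastic strain reaches the critical strain."""
    return equivalent_plastic_strain >= critical_strain(temp_c)

print(has_failed(0.35, 650.0))   # True  (0.35 >= 0.30)
print(has_failed(0.35, 900.0))   # False (0.35 <  0.46)
```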

Development and Validation of a Model Using Radiomics Features from an Apparent Diffusion Coefficient Map to Diagnose Local Tumor Recurrence in Patients Treated for Head and Neck Squamous Cell Carcinoma

  • Minjae Kim;Jeong Hyun Lee;Leehi Joo;Boryeong Jeong;Seonok Kim;Sungwon Ham;Jihye Yun;NamKug Kim;Sae Rom Chung;Young Jun Choi;Jung Hwan Baek;Ji Ye Lee;Ji-hoon Kim
    • Korean Journal of Radiology
    • /
    • v.23 no.11
    • /
    • pp.1078-1088
    • /
    • 2022
  • Objective: To develop and validate a model using radiomics features from an apparent diffusion coefficient (ADC) map to diagnose local tumor recurrence in head and neck squamous cell carcinoma (HNSCC). Materials and Methods: This retrospective study included 285 patients (mean age ± standard deviation, 62 ± 12 years; 220 male, 77.2%), comprising 215 for training (n = 161) and internal validation (n = 54) and 70 others for external validation, all with newly developed contrast-enhancing lesions at the primary cancer site on surveillance MRI following definitive treatment of HNSCC between January 2014 and October 2019. Of the 215 and 70 patients, 127 and 34, respectively, had local tumor recurrence. Radiomics models using radiomics scores were created separately for T2-weighted imaging (T2WI), contrast-enhanced T1-weighted imaging (CE-T1WI), and ADC maps, using the non-zero coefficients from the least absolute shrinkage and selection operator in the training set. Receiver operating characteristic (ROC) analysis was used to evaluate the diagnostic performance of each radiomics score and of known clinical parameters (age, sex, and clinical stage) in the internal and external validation sets. Results: Five radiomics features from T2WI, six from CE-T1WI, and nine from ADC maps were selected and used to develop the respective radiomics models. The area under the ROC curve (AUROC) of the ADC radiomics score was 0.76 (95% confidence interval [CI], 0.62-0.89) and 0.77 (95% CI, 0.65-0.88) in the internal and external validation sets, respectively. These were significantly higher than the AUROC values of T2WI (0.53 [95% CI, 0.40-0.67], p = 0.006), CE-T1WI (0.53 [95% CI, 0.40-0.67], p = 0.012), and the clinical parameters (0.53 [95% CI, 0.39-0.67], p = 0.021) in the external validation set. Conclusion: The radiomics model using ADC maps exhibited higher diagnostic performance than the radiomics models using T2WI or CE-T1WI and the clinical parameters in the diagnosis of local tumor recurrence in HNSCC following definitive treatment.
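
The general pattern behind the radiomics model (LASSO-type selection of features with non-zero coefficients on the training set, then AUROC evaluation on a held-out set) can be sketched as follows. The feature matrices here are random placeholders and an L1-penalized logistic regression stands in for the study's LASSO step; this is not the study's actual pipeline or data.

```python
# Pattern sketch: LASSO-style feature selection on the training set (non-zero
# coefficients), then AUROC evaluation on a held-out validation set.
# Data and variable names are placeholders, not the study's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(161, 100)), rng.integers(0, 2, 161)  # placeholder features
X_val,   y_val   = rng.normal(size=(54, 100)),  rng.integers(0, 2, 54)

# L1-penalized logistic regression plays the role of LASSO-based selection here.
lasso_clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
)
lasso_clf.fit(X_train, y_train)

n_selected = np.count_nonzero(lasso_clf.named_steps["logisticregression"].coef_)
radiomics_score = lasso_clf.predict_proba(X_val)[:, 1]
print("selected features:", n_selected)
print("validation AUROC:", roc_auc_score(y_val, radiomics_score))
```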

Calibration and Validation of SWAT for the Neponset River Watershed in Boston (보스턴 넷폰셋강의 수질체계에 대한 스왓모델의 교정과 유효성 검증)

  • Lee, Ja-Won
    • Journal of the Korean association of regional geographers
    • /
    • v.14 no.1
    • /
    • pp.19-26
    • /
    • 2008
  • A validation study was performed using the Soil and Water Assessment Tool (SWAT) model with data collected for the Neponset River watershed, which comprises roughly 130 square miles of land southwest of Boston. All of this land drains into the Neponset River, and ultimately into Boston Harbor. This paper presents the methodology of the SWAT model application. With the unmodified model, the calculated contribution of baseflow to streamflow is far too high, whereas interflow is strongly underestimated; the modified and calibrated model, in contrast, yields far better results for the catchment. The modification allows the hydrological processes to be modeled without restricting the applicability of the model to catchments with other characteristics. For this study, the SWAT 2005 model is used with ArcGIS 9.1 as an interface, and sensitivity analysis is performed to provide rough initial estimates before adjusting sensitive input parameters during the calibration period.

Hydrologic Calibration of HSPF Model using Parameter Estimation (PEST) Program at Imha Watershed (PEST를 이용한 임하호유역 HSPF 수문 보정)

  • Jeon, Ji-Hong;Kim, Tae-Il;Choi, Donghyuk;Lim, Kyung-Jae;Kim, Tae-Dong
    • Journal of Korean Society on Water Environment
    • /
    • v.26 no.5
    • /
    • pp.802-809
    • /
    • 2010
  • An automatic calibration tool for the Hydrological Simulation Program-Fortran (HSPF), the Parameter Estimation (PEST) program, was applied to the Imha lake watershed to obtain optimal hydrological parameters for HSPF. Calibration of the HSPF parameters was performed by PEST for 2004 ~ 2008, and validation was carried out to examine the model's ability using a separate data set for 1999 ~ 2003. The calibrated HSPF parameters tended to minimize water loss to the soil layer through infiltration and deep percolation and to the atmosphere through evapotranspiration, and to maximize the runoff rate. The calibration results indicated that PEST could calibrate the hydrological parameters of HSPF, yielding Nash-Sutcliffe coefficients (NS) of 0.83 and 0.97 for daily and monthly streamflow and a relative error of -3% for yearly streamflow. The validation results also showed high model efficiency, with NS values of 0.88 and 0.95 for daily and monthly streamflow and a relative error of -10% for yearly streamflow. These statistics indicate a 'very good' agreement between observed and simulated values for both calibration and validation. Overall, the PEST program was useful for the automatic calibration of HSPF, greatly reducing the time and effort required for model calibration and improving model setup.
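
The two statistics reported in the abstract, the Nash-Sutcliffe efficiency and the relative error of total flow, can be computed as in the sketch below; the observed and simulated series are placeholders rather than the study's data.

```python
# Minimal sketch of the two statistics reported in the abstract: the
# Nash-Sutcliffe efficiency (NS) and the relative error of total flow volume.
# The observed/simulated series below are placeholders, not the study's data.
import numpy as np

def nash_sutcliffe(obs: np.ndarray, sim: np.ndarray) -> float:
    """NS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_error_pct(obs: np.ndarray, sim: np.ndarray) -> float:
    """Relative error (%) of simulated total volume versus observed total volume."""
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

obs = np.array([2.1, 3.4, 5.0, 8.2, 4.3, 2.9])   # placeholder daily flows
sim = np.array([2.0, 3.8, 4.6, 7.9, 4.8, 2.5])

print("NS:", round(nash_sutcliffe(obs, sim), 3))
print("relative error (%):", round(relative_error_pct(obs, sim), 1))
```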

Bankruptcy prediction using ensemble SVM model (앙상블 SVM 모형을 이용한 기업 부도 예측)

  • Choi, Ha Na;Lim, Dong Hoon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.6
    • /
    • pp.1113-1125
    • /
    • 2013
  • Corporate bankruptcy prediction has long been an important topic in accounting and finance, and several data mining techniques have been used for it. However, a single model has many limitations when applied to real classification problems. This study proposes an ensemble SVM (support vector machine) model that assembles SVM models with different kernel functions. The ensemble is built and evaluated using a v-fold cross-validation approach, the k top-performing models are recruited into the ensemble, and classification is then carried out by majority voting of the ensemble members. We investigate the performance of the ensemble SVM classifier in terms of accuracy, error rate, sensitivity, specificity, ROC curve, and AUC, and compare it with single SVM classifiers on a financial-ratio data set and a simulation data set. The results confirm the advantages of our method: it is robust while providing good performance.
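
The general idea of the ensemble (SVMs with different kernel functions combined by majority voting and assessed by cross-validation) is sketched below. The paper additionally recruits only the k top-performing models into the ensemble; that selection step is omitted here, and the data set is synthetic.

```python
# Sketch: SVMs with different kernels combined by hard majority voting and
# assessed with cross-validation. The paper's selection of the k best models
# is omitted, and the data set is a synthetic placeholder.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

svms = [
    ("linear",  make_pipeline(StandardScaler(), SVC(kernel="linear"))),
    ("rbf",     make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
    ("poly",    make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3))),
    ("sigmoid", make_pipeline(StandardScaler(), SVC(kernel="sigmoid"))),
]

ensemble = VotingClassifier(estimators=svms, voting="hard")  # majority vote
scores = cross_val_score(ensemble, X, y, cv=5, scoring="accuracy")
print("cross-validated accuracy:", scores.mean())
```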

Sensitivity Validation Technique for Sequential Kriging Metamodel (순차적 크리깅 메타모델의 민감도 검증법)

  • Huh, Seung-Kyun;Lee, Jin-Min;Lee, Tae-Hee
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.36 no.8
    • /
    • pp.873-879
    • /
    • 2012
  • Metamodels have been used with a variety of design optimization techniques in structural engineering over the last decade because they are efficient, show excellent prediction performance, and are easily interconnected with design frameworks. Constructing a metamodel involves a sequential procedure of design of experiments, metamodeling, and validation. Because validation techniques measure the accuracy of the metamodel, in sequential kriging metamodeling the number of presampled points needed for an accurate kriging metamodel is decided by the validation technique. However, because an interpolating model such as a kriging metamodel built on computer experiments passes exactly through the responses at the presampled points, existing validation techniques require additional analyses or reconstructions of the metamodel to measure its accuracy. In this study, we suggest a sensitivity validation that requires neither additional analyses nor reconstructions of the metamodel. Fourteen two-dimensional mathematical problems and an engineering problem are used to show the feasibility of the suggested method.
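
The point about interpolating metamodels can be illustrated with a stand-in kriging model: at its own presampled points the residuals are essentially zero, so conventional checks such as leave-one-out cross-validation must refit the metamodel to measure accuracy. The sketch below uses scikit-learn's GaussianProcessRegressor on a toy one-dimensional function; it is not the paper's proposed sensitivity validation.

```python
# An interpolating kriging metamodel reproduces the responses at its own sample
# points, so accuracy must be checked elsewhere -- e.g. by leave-one-out
# cross-validation, which refits the model n times. GaussianProcessRegressor is
# used as a stand-in kriging implementation on a toy 1-D function; this is not
# the paper's proposed sensitivity validation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import LeaveOneOut, cross_val_score

def f(x):                                        # toy response function
    return np.sin(3.0 * x).ravel()

X = np.linspace(0.0, 2.0, 8).reshape(-1, 1)      # presampled points
y = f(X)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-10)
gp.fit(X, y)

# Residuals at the presampled points are ~0: no accuracy information there.
print("max |residual| at samples:", np.max(np.abs(gp.predict(X) - y)))

# Leave-one-out cross-validation measures accuracy but refits the metamodel n times.
loo_mse = -cross_val_score(gp, X, y, cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error")
print("LOO mean squared error:", loo_mse.mean())
```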

Deep Learning Model Validation Method Based on Image Data Feature Coverage (영상 데이터 특징 커버리지 기반 딥러닝 모델 검증 기법)

  • Lim, Chang-Nam;Park, Ye-Seul;Lee, Jung-Won
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.10 no.9
    • /
    • pp.375-384
    • /
    • 2021
  • Deep learning techniques have proven to deliver high performance in image processing and are applied in various fields. The most widely used methods for validating a deep learning model are holdout validation, k-fold cross-validation, and the bootstrap. These legacy methods consider the balance between classes when splitting the data set, but they do not consider the proportions of the various features that exist within the same class; if those features are ignored, validation results may be biased toward some of them. We therefore propose a deep learning model validation method for image classification based on data feature coverage that improves on the legacy methods. The proposed technique defines a data feature coverage measure that numerically quantifies how well the training set (used for training and validation of the deep learning model) and the evaluation set reflect the features of the entire data set. The data set can then be split so that coverage of all features of the entire data set is guaranteed, and the model's evaluation results can be analyzed per feature cluster. As a result, by attaching feature cluster information to the evaluation results of the trained model, the method reveals which data features affect the trained model.
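
A rough sketch of the idea of feature-coverage-aware splitting is given below: cluster the samples of a class by some feature representation, then split so that every feature cluster appears in both the training and evaluation subsets. The k-means-on-raw-pixels representation and all sizes are hypothetical placeholders; the paper defines its own coverage measure.

```python
# Sketch of feature-coverage-aware splitting: cluster one class's samples by a
# feature representation, then stratify the train/eval split on the cluster
# labels so every feature cluster is represented in both subsets. The raw-pixel
# k-means representation and all sizes are placeholders; the paper defines its
# own coverage measure.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
images = rng.random((300, 32, 32))            # placeholder image data, one class
features = images.reshape(len(images), -1)    # hypothetical feature representation

clusters = KMeans(n_clusters=5, random_state=0, n_init=10).fit_predict(features)
train_idx, eval_idx = train_test_split(np.arange(len(images)),
                                       test_size=0.2, stratify=clusters,
                                       random_state=0)

for subset, idx in [("train", train_idx), ("eval", eval_idx)]:
    covered = len(np.unique(clusters[idx]))
    print(f"{subset}: covers {covered}/5 feature clusters")
```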