• Title/Summary/Keyword: prediction error methods


Assessment of genomic prediction accuracy using different selection and evaluation approaches in a simulated Korean beef cattle population

  • Nwogwugwu, Chiemela Peter;Kim, Yeongkuk;Choi, Hyunji;Lee, Jun Heon;Lee, Seung-Hwan
    • Asian-Australasian Journal of Animal Sciences / v.33 no.12 / pp.1912-1921 / 2020
  • Objective: This study assessed genomic prediction accuracies based on different selection methods, evaluation procedures, training population (TP) sizes, heritability (h2) levels, marker densities, and pedigree error (PE) rates in a simulated Korean beef cattle population. Methods: A simulation was performed using two different selection methods, phenotypic and estimated breeding value (EBV), with an h2 of 0.1, 0.3, or 0.5 and marker densities of 10, 50, or 777K. A total of 275 males and 2,475 females were randomly selected from the last generation to simulate ten recent generations. The simulation of the PE dataset used only the EBV method of selection with a marker density of 50K and a heritability of 0.3. The proportions of errors substituted were 10%, 20%, 30%, and 40%. Genetic evaluations were performed using genomic best linear unbiased prediction (GBLUP) and single-step GBLUP (ssGBLUP) with different weighted values, and the accuracies of the predictions were determined. Results: Compared with phenotypic selection, the prediction accuracies obtained using GBLUP and ssGBLUP increased across heritability levels and TP sizes under EBV selection. However, an increase in marker density did not yield higher accuracy in either method, except when h2 was 0.3 under the EBV selection method. Under EBV selection with a heritability of 0.1 and a marker density of 10K, the prediction accuracies of GBLUP and ssGBLUP_0.95 were higher than those obtained by phenotypic selection. The prediction accuracies from ssGBLUP_0.95 outperformed those from the GBLUP method across all scenarios. When errors were introduced into the pedigree dataset, the prediction accuracies were only minimally influenced across all scenarios. Conclusion: Our study suggests that the use of ssGBLUP_0.95, EBV selection, and low marker density could help improve genetic gains in beef cattle.
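As a rough illustration of the GBLUP predictor named in this abstract, the sketch below builds a VanRaden genomic relationship matrix from simulated genotypes and computes genomic breeding values; the toy population size, marker count, simulated effects, and h2 = 0.3 are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 50                                    # animals, SNP markers (toy sizes)
M = rng.integers(0, 3, size=(n, m)).astype(float)  # genotypes coded 0/1/2
p = M.mean(axis=0) / 2.0                         # observed allele frequencies
Z = M - 2.0 * p                                  # centred genotype matrix
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))      # VanRaden genomic relationship matrix

# simulate phenotypes from true marker effects at roughly h2 = 0.3
beta = rng.normal(0.0, 0.1, m)
g_true = Z @ beta
y = g_true + rng.normal(0.0, g_true.std() * np.sqrt(0.7 / 0.3), n)

lam = 0.7 / 0.3                                  # sigma_e^2 / sigma_g^2 for h2 = 0.3
u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - y.mean())  # GEBVs

acc = np.corrcoef(u_hat, g_true)[0, 1]           # prediction accuracy vs true values
```

In practice the relationship matrix is built from thousands of animals and tens of thousands of markers, and ssGBLUP additionally blends G with the pedigree relationship matrix.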

A New Vessel Path Prediction Method Based on Anticipation of Acceleration of Vessel (가속도 예측 기반 새로운 선박 이동 경로 예측 방법)

  • Kim, Jonghee;Jung, Chanho;Kang, Dokeun;Lee, Chang Jin
    • Journal of IKEEE / v.24 no.4 / pp.1176-1179 / 2020
  • Vessel path prediction methods generally predict the latitude and longitude of a future location directly. However, with direct prediction, errors can be large because the possible output range is very broad. In addition, error accumulation can occur because recurrent neural network-based methods feed previously predicted data back in to forecast future data. In this paper, we propose a vessel path prediction method that does not predict the longitude and latitude directly. Instead, the proposed method predicts the acceleration of the vessel. The acceleration is then used to generate the velocity and direction, and these values determine the longitude and latitude of the future location. In the experiments, we show that the proposed method makes smaller errors than the direct prediction method while both methods employ the same model.
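The integration step this abstract describes (acceleration to velocity to position) can be sketched as follows; the Euler time step and flat x/y coordinates stand in for the paper's actual latitude/longitude handling, which is not specified here:

```python
import numpy as np

def rollout_from_acceleration(pos0, vel0, acc_seq, dt=1.0):
    """Integrate a predicted acceleration sequence into a position track.

    Instead of predicting coordinates directly, the model outputs accelerations;
    velocity and position then follow by simple Euler integration, which keeps
    each per-step output in a small, well-bounded range.
    """
    pos, vel = np.asarray(pos0, float), np.asarray(vel0, float)
    track = [pos.copy()]
    for acc in acc_seq:
        vel = vel + np.asarray(acc, float) * dt   # acceleration -> velocity
        pos = pos + vel * dt                      # velocity -> position
        track.append(pos.copy())
    return np.array(track)

# constant lateral acceleration curves the track away from the initial heading
track = rollout_from_acceleration([0.0, 0.0], [1.0, 0.0],
                                  [[0.0, 0.1]] * 5, dt=1.0)
```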

Estimation of Coverage Growth Functions

  • Park, Joong-Yang;Lee, Gye-Min;Kim, Seo-Yeong
    • Communications for Statistical Applications and Methods / v.18 no.5 / pp.667-674 / 2011
  • A recent trend in software reliability engineering accounts for the coverage growth behavior during testing. The coverage growth function (representing the coverage growth behavior) has become an essential component of software reliability models. Application of a coverage growth function requires the estimation of the coverage growth function. This paper considers the problem of estimating the coverage growth function. The existing maximum likelihood method is reviewed and corrected. A method of minimizing the sum of squares of the standardized prediction error is proposed for situations where the maximum likelihood method is not applicable.
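A minimal sketch of the proposed criterion, assuming a simple exponential coverage growth form c(t) = 1 - exp(-bt) and a binomial-style variance as the standardizer; both are assumptions for illustration, since the abstract does not specify the function family:

```python
import numpy as np

# toy coverage data: fraction of code covered after t test cases
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([0.18, 0.33, 0.55, 0.78, 0.95])

def coverage(t, b):
    """One common exponential coverage growth form (an assumed family)."""
    return 1.0 - np.exp(-b * t)

def sse_standardized(b, eps=1e-9):
    """Sum of squared prediction errors, each standardized by a variance proxy."""
    c = coverage(t, b)
    var = np.clip(c * (1.0 - c), eps, None)   # binomial-style variance proxy
    return np.sum((y - c) ** 2 / var)

# simple grid search for the minimizer (a numerical optimizer works equally well)
grid = np.linspace(0.01, 1.0, 1000)
b_hat = grid[np.argmin([sse_standardized(b) for b in grid])]
```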

A modified partial least squares regression for the analysis of gene expression data with survival information

  • Lee, So-Yoon;Huh, Myung-Hoe;Park, Mira
    • Journal of the Korean Data and Information Science Society / v.25 no.5 / pp.1151-1160 / 2014
  • In DNA microarray studies, the number of genes far exceeds the number of samples and the gene expression measures are highly correlated. Partial least squares regression (PLSR) is a popular method for dimension reduction and has been shown useful for the classification of microarray data in several studies. In this study, we suggest a modified version of partial least squares regression for analyzing gene expression data with survival information. The method is designed as a new gene selection procedure using PLSR with an iterative imputation of censored survival times. The mean squared error of prediction criterion is used to determine the dimension of the model. To visualize the data, a plot of variables superimposed with samples is used. The method is applied to two microarray data sets, both containing survival times. The results show that the proposed method works well for interpreting gene expression microarray data.
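A single PLSR component of the kind this method builds on can be sketched as below; the simulated expression matrix, the number of informative genes, and the fully observed response are illustrative assumptions (the paper's iterative imputation of censored times is not shown):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 30, 200                       # samples << genes, as in microarray data
X = rng.normal(size=(n, p))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n)  # 5 informative genes

Xc = X - X.mean(axis=0)              # centre predictors
yc = y - y.mean()

# first PLS component: for a univariate response, NIPALS reduces to one pass
w = Xc.T @ yc
w /= np.linalg.norm(w)               # weight vector = gene loadings
t_score = Xc @ w                     # latent score for each sample
b = (t_score @ yc) / (t_score @ t_score)
y_hat = y.mean() + b * t_score       # fitted response from one component

mse = np.mean((y - y_hat) ** 2)      # MSE-of-prediction style criterion
top_genes = np.argsort(-np.abs(w))[:5]  # candidate genes with largest loadings
```

Further components are extracted from the deflated residual matrix, and the dimension is chosen where the prediction MSE stops improving.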

Predicting the Unemployment Rate Using Social Media Analysis

  • Ryu, Pum-Mo
    • Journal of Information Processing Systems / v.14 no.4 / pp.904-915 / 2018
  • We demonstrate how social media content can be used to predict the unemployment rate, a real-world indicator. We present a novel method for predicting the unemployment rate using social media analysis based on natural language processing and statistical modeling. The system collects social media content, including news articles, blogs, and tweets written in Korean, and then extracts data for modeling using part-of-speech tagging and sentiment analysis techniques. Autoregressive integrated moving average with exogenous variables (ARIMAX) and autoregressive with exogenous variables (ARX) models for unemployment rate prediction are fitted to the analyzed data. The proposed method quantifies the social moods expressed in social media content, whereas existing methods simply present social tendencies. Our model achieved a 27.9% error reduction relative to a Google Index-based model in terms of the mean absolute percentage error.
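The ARX part of such a model can be sketched as an ordinary least squares fit on lagged values; the simulated "social mood" series, the true coefficients, and the lag orders are illustrative assumptions:

```python
import numpy as np

def fit_arx(y, x, p=1, q=1):
    """Fit y_t = c + sum_i a_i*y_{t-i} + sum_j b_j*x_{t-j} by least squares."""
    k = max(p, q)
    rows, target = [], []
    for t in range(k, len(y)):
        lagged_y = [y[t - i] for i in range(1, p + 1)]
        lagged_x = [x[t - j] for j in range(1, q + 1)]
        rows.append([1.0] + lagged_y + lagged_x)
        target.append(y[t])
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(target), rcond=None)
    return coef                        # [intercept, a_1..a_p, b_1..b_q]

rng = np.random.default_rng(2)
x = rng.normal(size=300)               # exogenous sentiment index (simulated)
y = np.zeros(300)
for t in range(1, 300):                # true process: y_t = 0.7*y_{t-1} + 0.5*x_{t-1} + noise
    y[t] = 0.7 * y[t - 1] + 0.5 * x[t - 1] + 0.1 * rng.normal()

coef = fit_arx(y, x)                   # estimates recover the true 0.7 and 0.5
```

ARIMAX extends this by differencing the series and adding moving-average terms for the noise.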

Performance Evaluation of Wavelet-based ECG Compression Algorithms over CDMA Networks (CDMA 네트워크에서의 ECG 압축 알고리즘의 성능 평가)

  • 김병수;유선국
    • The Transactions of the Korean Institute of Electrical Engineers D / v.53 no.9 / pp.663-669 / 2004
  • The mobile tele-cardiology system is a new research area that supports ubiquitous health care based on mobile telecommunication networks. Although many studies have presented modeling concepts for GSM-based mobile telemedical systems, practical application requires considering both compression performance and error corruption in the mobile environment. This paper evaluates three wavelet ECG compression algorithms over CDMA networks. The three selected methods are Rajoub's algorithm using EPE thresholding, Embedded Zerotree Wavelet (EZW) coding, and Wavelet transform Higher Order Statistics Coding (WHOSC) with linear prediction. All methodologies protected the more significant information using forward error correction coding, and we measured not only compression performance in a noise-free setting but also error robustness and delay profiles in a CDMA environment. In addition, from a field test we analyzed the PRD with respect to movement speed and the features of CDMA 1X. The test results show that Rajoub's method has low robustness under heavy error corruption, EZW exploits variable bandwidth and high-error conditions more efficiently, and WHOSC is robust over the overall BER range but loses performance on particular abnormal ECGs.
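The PRD fidelity metric used in this kind of evaluation, paired with a crude one-level Haar thresholding stand-in for the compared compression schemes (the threshold rule and keep ratio are assumptions, not any of the three algorithms):

```python
import numpy as np

def prd(original, reconstructed):
    """Percent root-mean-square difference, a standard ECG compression metric."""
    original = np.asarray(original, float)
    reconstructed = np.asarray(reconstructed, float)
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))

def haar_threshold_compress(x, keep_ratio=0.25):
    """Zero out the smallest one-level Haar detail coefficients, then invert."""
    x = np.asarray(x, float)
    avg = (x[0::2] + x[1::2]) / 2.0            # Haar approximation coefficients
    diff = (x[0::2] - x[1::2]) / 2.0           # Haar detail coefficients
    cutoff = np.quantile(np.abs(diff), 1.0 - keep_ratio)
    diff = np.where(np.abs(diff) >= cutoff, diff, 0.0)  # keep largest details
    out = np.empty_like(x)
    out[0::2], out[1::2] = avg + diff, avg - diff       # inverse transform
    return out

sig = (np.sin(np.linspace(0.0, 8.0 * np.pi, 512))
       + 0.02 * np.random.default_rng(3).normal(size=512))  # toy "ECG" signal
rec = haar_threshold_compress(sig)
err = prd(sig, rec)                            # small PRD = high fidelity
```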

Prediction of the Equivalent Coefficient of Thermal Expansion of Fiber Reinforced Plastic Lamina and Thermal Pointing Error Analysis of Satellites (섬유강화 복합재료 등가열팽창계수 예측 및 인공위성 열지향오차 해석)

  • You, Won Young;Lim, Jae Hyuk;Kim, Sun Won;Kim, Chang-Ho;Kim, Sung-Ho
    • Aerospace Engineering and Technology / v.13 no.1 / pp.76-85 / 2014
  • In this paper, the equivalent coefficient of thermal expansion (CTE) of fiber-reinforced plastic composite material is investigated with various CTE prediction schemes. Although there are several methods for predicting the equivalent CTEs, most of them have limitations or are not very accurate when their predictions are compared with test results. In the framework of computational homogenization, a representative volume element is taken from the predefined fiber volume ratio and modeled with a finite element mesh. Finally, the equivalent CTEs are obtained by applying periodic boundary conditions. To verify the performance of the proposed method, the results obtained are compared with those of the existing methods and with test results. Additionally, a thermal pointing error analysis for the star tracker support structure is conducted and its accuracy is estimated according to the CTE prediction scheme.
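One of the simpler analytical baselines that homogenization results are typically compared against is the Schapery rule-of-mixtures estimate for a unidirectional lamina; the sketch below uses typical carbon/epoxy handbook values that are illustrative only, not the paper's material data:

```python
def schapery_cte(Ef, Em, af, am, nuf, num, Vf):
    """Schapery rule-of-mixtures estimates for unidirectional lamina CTEs.

    Ef/Em: fiber/matrix Young's moduli, af/am: CTEs, nuf/num: Poisson ratios,
    Vf: fiber volume fraction. Returns (longitudinal, transverse) CTE.
    """
    Vm = 1.0 - Vf
    # longitudinal: stiffness-weighted average of constituent CTEs
    a1 = (Ef * af * Vf + Em * am * Vm) / (Ef * Vf + Em * Vm)
    nu12 = nuf * Vf + num * Vm
    # transverse: Poisson-corrected volume average
    a2 = (1.0 + nuf) * af * Vf + (1.0 + num) * am * Vm - a1 * nu12
    return a1, a2

# carbon/epoxy example with typical handbook values (illustrative only)
a1, a2 = schapery_cte(Ef=230e9, Em=3.5e9, af=-0.5e-6, am=55e-6,
                      nuf=0.2, num=0.35, Vf=0.6)
# a1 is near zero (fiber-dominated), a2 is matrix-dominated, around 30e-6 /K
```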

Predicting Human Errors in Landing Situations of Aircraft by Using SHERPA (SHERPA기법을 이용한 항공기 착륙상황에서 발생 가능한 인적오류 예측)

  • Choi, Jae-Rim;Han, Hyeok Jae;Ham, Dong-Han
    • Journal of the Korean Society for Aviation and Aeronautics / v.29 no.2 / pp.14-24 / 2021
  • This study aims to examine probable human errors when landing an airplane by using SHERPA (systematic human error reduction and prediction approach) and to propose methods for preventing the predicted human errors. It has been reported that human errors are implicated in many airplane accidents and incidents. It is important to predict probable human errors, particularly in the operation mode of human-automation interaction, and to attempt to reduce the likelihood of the predicted errors. By referring to task procedures and interviewing domain experts, we analyzed the airplane landing task using the HTA (hierarchical task analysis) method. In total, 6 sub-tasks and 19 operations were identified from the task analysis. The SHERPA method was used to predict probable human error types for each task. As a result, we identified 31 human errors and predicted their occurrence probability and criticality. Based on these, we suggested a set of methods for minimizing the probability of the predicted human errors. This study shows that SHERPA can be effectively used for predicting probable human error types in the context of the human-automation interaction needed to navigate an airplane.

Validations of Typhoon Intensity Guidance Models in the Western North Pacific (북서태평양 태풍 강도 가이던스 모델 성능평가)

  • Oh, You-Jung;Moon, Il-Ju;Kim, Sung-Hun;Lee, Woojeong;Kang, KiRyong
    • Atmosphere / v.26 no.1 / pp.1-18 / 2016
  • Eleven tropical cyclone (TC) intensity guidance models in the western North Pacific were validated over 2008~2014 using various analysis methods according to forecast lead time, year, month, intensity, rapid intensity change, track, and geographical area, with an additional focus on TCs that influenced the Korean peninsula. From the evaluation using mean absolute error and correlation coefficients for maximum wind speed forecasts up to 72 h, we found that the Hurricane Weather Research and Forecasting model (HWRF) outperforms all others overall, although the Global Forecast System (GFS), the Typhoon Ensemble Prediction System of the Japan Meteorological Agency (TEPS), and the Korean version of the Weather Research and Forecasting model (KWRF) also show good performance at some forecast lead times. In particular, HWRF shows the highest performance in predicting the intensity of strong TCs above Category 3, which may be attributed to its highest spatial resolution (~3 km). The Navy Operational Global Atmospheric Prediction System (NOGAPS) and GFS were the most improved models during 2008~2014. For initial intensity error, two Japanese models, the Japan Meteorological Agency Global Spectral Model (JGSM) and TEPS, had the smallest errors. In track forecasting, the European Centre for Medium-Range Weather Forecasts (ECMWF) model and the recent GFS model outperformed the others. The present results have significant implications for providing basic information to operational forecasters as well as for developing ensemble or consensus prediction systems.

Kriging Interpolation Methods in Geostatistics and DACE Model

  • Park, Dong-Hoon;Ryu, Je-Seon;Kim, Min-Seo;Cha, Kyung-Joon;Lee, Tae-Hee
    • Journal of Mechanical Science and Technology / v.16 no.5 / pp.619-632 / 2002
  • In recent studies on design of experiments, complex metamodeling has been investigated because defining an exact model using computer simulation is expensive and time consuming. Thus, designers often use approximate models that express the relation between inputs and outputs. In this paper, we review and compare complex metamodels built from the interaction of various data obtained by performing many physical experiments and running computer simulations. The prediction models in this paper employ interpolation schemes known as ordinary kriging, developed in the field of spatial statistics, and kriging in the Design and Analysis of Computer Experiments (DACE) model. We focus on describing the definitions, the prediction functions, and the algorithms of the two kriging methods, and assess their error measures by using several validation methods.
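The ordinary kriging predictor reviewed here can be sketched for a 1-D input with a Gaussian correlation model; the correlation parameter theta and the test function are illustrative assumptions:

```python
import numpy as np

def ordinary_kriging(X, y, x0, theta=10.0):
    """Ordinary kriging predictor with a Gaussian correlation model.

    R_ij = exp(-theta * |x_i - x_j|^2); the weights solve the correlation
    system augmented with a Lagrange multiplier that enforces the
    unbiasedness constraint (weights sum to one).
    """
    X = np.asarray(X, float).reshape(-1, 1)
    n = len(X)
    R = np.exp(-theta * (X - X.T) ** 2)           # correlation among samples
    r0 = np.exp(-theta * (X[:, 0] - x0) ** 2)     # correlation to new point
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = R
    A[n, n] = 0.0                                 # Lagrange multiplier block
    b = np.append(r0, 1.0)
    w = np.linalg.solve(A, b)[:n]                 # kriging weights
    return w @ np.asarray(y, float)

X = [0.0, 0.25, 0.5, 0.75, 1.0]
y = np.sin(np.array(X) * np.pi)                   # toy response at sample sites
pred = ordinary_kriging(X, y, 0.4)                # prediction between samples
```

Because the same correlations appear on both sides of the system, the predictor exactly interpolates the sampled responses, which is the property that makes kriging attractive for deterministic computer experiments.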