• Title/Summary/Keyword: Traditional Statistical

Search Results: 924

Machine Learning vs. Statistical Model for Prediction Modelling: Application in Medical Imaging Research (예측모형의 머신러닝 방법론과 통계학적 방법론의 비교: 영상의학 연구에서의 적용)

  • Leeha Ryu;Kyunghwa Han
    • Journal of the Korean Society of Radiology / v.83 no.6 / pp.1219-1228 / 2022
  • Clinical prediction models have been increasingly published in radiology research. In particular, as radiomics research is actively conducted, prediction models are being developed not only with traditional statistical models but also with machine learning to handle high-dimensional data. In this review, we examine the statistical and machine learning methods used in clinical prediction model research and briefly summarize the analytical methods for statistical models, machine learning, and statistical learning. Finally, we discuss several considerations for choosing a prediction modeling method.
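
The review above contrasts traditional statistical models with machine learning for high-dimensional, radiomics-style prediction tasks. The following is a minimal sketch, not taken from the paper, that assumes synthetic data and scikit-learn and compares the two approaches with cross-validated AUC:

```python
# Minimal sketch (not from the paper): contrast a traditional statistical model
# (logistic regression) with a machine learning model (random forest) on
# synthetic high-dimensional, radiomics-style data using cross-validated AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a high-dimensional feature matrix (500 features,
# only 10 of them informative) and a binary clinical outcome.
X, y = make_classification(n_samples=200, n_features=500, n_informative=10,
                           random_state=0)

models = {
    "logistic regression (statistical)": LogisticRegression(max_iter=5000),
    "random forest (machine learning)": RandomForestClassifier(n_estimators=200,
                                                               random_state=0),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {auc.mean():.3f}")
```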

Survey Analysis of the Traditional Characteristics of Children's Play-Songs (놀이노래 가사의 실태와 가사의 전통성 조사)

  • Yi, Soon Hyung
    • Korean Journal of Child Studies / v.12 no.1 / pp.68-77 / 1991
  • The purpose of this study was to analyze the words of the songs in children's games in order to determine the traditional characteristics of their rhythm and subject matter and to investigate age and sex differences in recognition of the songs. Seventy-one play songs were used for content analysis. After this, 840 subjects in 15 grades (preschool, first to 12th grades, and college students) were presented with the songs. Statistical analysis was done by one-way ANOVA and Scheffé tests. The songs exhibited traditional rhythms (3.3 or 4.4) and subject matter (nature, everyday life, and family). Some historical events, such as the Korean War, and current phenomena, such as TV, were also included. Sex differences, but not age differences, were found in recognition of the songs.

Application of Neural Networks to Forecasting Airline Passenger Volume between Korea and the U.S. (한국과 미국간 항공기 탑승객 수 예측을 위한 뉴럴네트웍의 응용)

  • 남경두
    • Proceedings of the Korean Operations and Management Science Society Conference / 1995.09a / pp.334-343 / 1995
  • In recent years, neural networks have been developed as an alternative to traditional statistical techniques. In this study, a neural network model was compared with traditional forecasting models in terms of their ability to forecast passenger traffic for flights between the U.S. and Korea. The results show that the forecasting ability of the neural networks was superior to that of the traditional models. In terms of accuracy, the performance of the neural networks was quite encouraging; measured by mean absolute deviation, the neural network performed best. The new technique is easy to learn and apply with commercial neural network software. Therefore, airline decision makers should benefit from using neural networks in forecasting passenger loads.
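
The abstract above compares a neural network with traditional forecasting models using mean absolute deviation (MAD). A minimal sketch of that kind of comparison, assuming a synthetic monthly passenger series and scikit-learn's MLPRegressor rather than the study's data or software:

```python
# Minimal sketch (synthetic monthly passenger series, not the study's data):
# compare a small neural network against a seasonal-naive forecast using mean
# absolute deviation (MAD), the accuracy measure cited in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
t = np.arange(120)
passengers = 100 + 0.8 * t + 15 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 120)

# Twelve lagged values as inputs, the next month's count as the target.
X = np.array([passengers[i:i + 12] for i in range(len(passengers) - 12)])
y = passengers[12:]
X_train, X_test, y_train, y_test = X[:-24], X[-24:], y[:-24], y[-24:]

nn = make_pipeline(StandardScaler(),
                   MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                random_state=0))
nn.fit(X_train, y_train)

mad_nn = np.mean(np.abs(nn.predict(X_test) - y_test))
mad_naive = np.mean(np.abs(X_test[:, 0] - y_test))  # value 12 months earlier
print(f"MAD, neural network : {mad_nn:.2f}")
print(f"MAD, seasonal naive : {mad_naive:.2f}")
```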

Does Sol Raise Consumer Prices? (음력설이 소비자물가에 영향을 미치는가?)

  • Lee, Geung Hui
    • The Korean Journal of Applied Statistics / v.12 no.2 / pp.387-387 / 1999
  • The traditional holiday Sol, which is based on the lunar calendar, falls in January or February and makes it difficult to analyze time series data accurately. To analyze whether Sol raises consumer prices or not, RegARIMA models and paired t-tests are used. It is found that Sol significantly raises consumer prices of food products, but Sol's effects on consumer prices of all items are not significant.
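
The abstract above tests the holiday effect with RegARIMA models, that is, regression with ARIMA errors. A minimal sketch of the idea, assuming a synthetic price index and a hypothetical Lunar New Year dummy, using statsmodels' SARIMAX with an exogenous regressor rather than the paper's specification:

```python
# Minimal sketch (synthetic price index and a hypothetical holiday dummy, not
# the paper's data): a RegARIMA-style model fitted with statsmodels SARIMAX;
# the dummy's coefficient and p-value indicate whether the Lunar New Year
# shifts the index.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
months = pd.period_range("1990-01", periods=120, freq="M")
year, month = months.year, months.month

# Hypothetical indicator: 1 in the month containing the Lunar New Year
# (alternating between January and February purely for illustration).
sol = np.where(((year % 2 == 0) & (month == 1)) |
               ((year % 2 == 1) & (month == 2)), 1.0, 0.0)

index = months.to_timestamp()
cpi = pd.Series(100 + np.cumsum(rng.normal(0.2, 0.3, 120)) + 0.8 * sol,
                index=index)                      # small built-in holiday bump
exog = pd.Series(sol, index=index, name="sol")

result = SARIMAX(cpi, exog=exog, order=(1, 1, 1)).fit(disp=False)
print("holiday effect:", round(result.params["sol"], 3),
      "p-value:", round(result.pvalues["sol"], 4))
```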

Dual Generalized Maximum Entropy Estimation for Panel Data Regression Models

  • Lee, Jaejun;Cheon, Sooyoung
    • Communications for Statistical Applications and Methods / v.21 no.5 / pp.395-409 / 2014
  • Limited, partial, or incomplete data are known as ill-posed problems. If data with ill-posed problems are analyzed by traditional statistical methods, the results are not reliable and lead to erroneous interpretations. To overcome these problems, we propose a dual generalized maximum entropy (dual GME) estimator for panel data regression models based on an unconstrained dual Lagrange multiplier method. Monte Carlo simulations for panel data regression models with exogeneity, endogeneity, and/or collinearity show that the dual GME estimator outperforms several other estimators, such as least squares and instrumental variables, even in small samples. We believe that our dual GME procedure developed for the panel data regression framework will be useful for analyzing ill-posed and endogenous data sets.
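
In generalized maximum entropy estimation, each coefficient is written as a probability-weighted combination of support points, and the dual form minimizes an unconstrained objective over one Lagrange multiplier per observation. A minimal sketch for a plain cross-sectional regression, not the paper's panel specification, with arbitrarily chosen support points:

```python
# Minimal sketch (plain cross-sectional regression, not the paper's panel
# specification; support points chosen arbitrarily): the dual GME objective is
# minimized over one Lagrange multiplier per observation, and coefficients are
# recovered as probability-weighted support points.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(0, 0.5, n)

z = np.linspace(-5, 5, 5)   # support points for every coefficient
v = np.linspace(-2, 2, 3)   # support points for every error term

def dual_gme(lam):
    # Dual objective: y'lam + sum_k log Omega_k + sum_i log Psi_i, where Omega
    # and Psi are the partition functions of the coefficient and error weights.
    a = X.T @ lam
    log_omega = np.log(np.exp(-np.outer(a, z)).sum(axis=1))
    log_psi = np.log(np.exp(-np.outer(lam, v)).sum(axis=1))
    return y @ lam + log_omega.sum() + log_psi.sum()

res = minimize(dual_gme, np.zeros(n), method="BFGS")
a = X.T @ res.x
p = np.exp(-np.outer(a, z))
p /= p.sum(axis=1, keepdims=True)   # coefficient probabilities
beta_gme = p @ z                    # recovered coefficients
print("dual GME estimates:", np.round(beta_gme, 3))
print("true coefficients :", beta_true)
```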

Development of Measurement Assurance Test Procedures between Calibrations (계기 검교정간의 보증시험 절차의 개발)

  • Yum, Bong-Jin;Cho, Jae-Gyeun;Lee, Dong-Wha
    • IE interfaces / v.6 no.1 / pp.55-65 / 1993
  • A nonstandard instrument used in the field frequently becomes out-of-calibration due to environmental noise, misuse, aging, etc. A substantial amount of loss may result if such a nonstandard instrument is used to check product quality and performance. Traditional periodic calibration at the calibration center is not capable of detecting out-of-calibration status while the instrument is in use; therefore, statistical methods need to be developed to check the status of a nonstandard instrument in the field between calibrations. Developed in this paper is a unified measurement assurance model in which statistical calibration at the calibration center and measurement assurance tests in the field are combined. We develop statistical procedures to detect changes in precision and in the coefficients of the calibration equation. Further, computational experiments are conducted to evaluate how the power of the test varies with respect to the parameters involved. Based upon the computational results, we suggest procedures for designing effective measurement assurance tests.
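
The abstract above combines statistical calibration at the center with assurance tests in the field to detect changes in the calibration equation and in precision. A simplified sketch of that idea, with hypothetical data and generic tests rather than the paper's procedures:

```python
# Simplified sketch (hypothetical data and generic tests, not the paper's
# procedures): fit the calibration equation at the calibration center, then use
# repeated field measurements of a check standard to test for a shift in the
# calibrated readings (one-sample t-test) and a loss of precision (F-test).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Calibration at the center: instrument readings against reference values.
reference = np.repeat(np.linspace(0, 10, 6), 4)
reading = 0.2 + 1.02 * reference + rng.normal(0, 0.05, reference.size)
slope, intercept, *_ = stats.linregress(reading, reference)   # calibration equation
resid_var_cal = np.var(reference - (intercept + slope * reading), ddof=2)

# In the field: repeated measurements of a check standard whose value is 5.0,
# with a hypothetical drift of +0.08 in the instrument's response.
check_value = 5.0
field = (0.2 + 1.02 * check_value + 0.08) + rng.normal(0, 0.07, 10)
field_cal = intercept + slope * field        # apply the calibration equation

t_stat, p_shift = stats.ttest_1samp(field_cal, check_value)   # shift in calibration
f_stat = np.var(field_cal, ddof=1) / resid_var_cal            # change in precision
p_prec = stats.f.sf(f_stat, field.size - 1, reference.size - 2)

print(f"shift test     p = {p_shift:.3f}")
print(f"precision test p = {p_prec:.3f}")
```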

A Comparative Study of ARIMA and Neural Network Models: A Case Study of Korea Corporate Bond Yields

  • Kim, Steven H.;Noh, Hyunju
    • Proceedings of the Korean Operations and Management Science Society Conference / 1996.10a / pp.19-22 / 1996
  • A traditional approach to the prediction of economic and financial variables takes the form of statistical models that summarize past observations and project them into the envisioned future. Over the past decade, an increasing number of organizations have turned to the use of neural networks. To date, however, many spheres of interest still lack a systematic evaluation of the statistical and neural approaches. One of these lies in the prediction of corporate bond yields for Korea. This paper reports on a comparative evaluation of ARIMA models and neural networks in the context of interest rate prediction. An additional experiment relates to an integration of the two methods. More specifically, the statistical model serves as a filter by providing estimates that are then used as input into the neural network models.
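
The integration described above uses the statistical model as a filter whose estimates feed the neural network. A minimal sketch of such a hybrid, assuming a synthetic yield series and standard libraries rather than the paper's data:

```python
# Minimal sketch (synthetic yield series, not the paper's data): an ARIMA model
# is fitted first, and its one-step-ahead predictions are fed, together with a
# lagged observation, into a small neural network, illustrating the hybrid
# scheme described above.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200
noise = rng.normal(0, 0.1, n)
yields = np.empty(n)
yields[0] = 10.0
for t in range(1, n):                        # AR(1)-like bond yield series
    yields[t] = 0.5 + 0.95 * yields[t - 1] + noise[t]

arima = ARIMA(yields, order=(1, 0, 0)).fit()
arima_pred = arima.predict()                 # in-sample one-step predictions

# Hybrid: ARIMA prediction plus the previous observation as network inputs.
X = np.column_stack([arima_pred[1:], yields[:-1]])
y = yields[1:]
nn = make_pipeline(StandardScaler(),
                   MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                random_state=0))
nn.fit(X[:-40], y[:-40])

mae_hybrid = np.mean(np.abs(nn.predict(X[-40:]) - y[-40:]))
mae_arima = np.mean(np.abs(arima_pred[1:][-40:] - y[-40:]))
print(f"MAE, ARIMA alone : {mae_arima:.4f}")
print(f"MAE, hybrid      : {mae_hybrid:.4f}")
```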

Statistical Analysis of Breakdown Field Distribution of PECVD SiN Films (PECVD SiN 막의 절연파괴 전계분포의 통계적 고찰)

  • Sung, Yung-Kwon;Han, Joo-Min;Oh, Jae-Ha
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference / 1988.05a / pp.84-87 / 1988
  • In this paper, we evaluate the breakdown and TDDB characteristics of ammonia-free PECVD SiN films, which have been widely studied as a gate insulator to substitute for silicon dioxide because of their superior film characteristics and the merit of a low-temperature process. We also propose a new statistical model by introducing a dispersion factor into traditional Weibull statistics. From a comparison of the experimental and simulation results, we attempt to link the breakdown mechanism with the statistical analysis.
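
A minimal sketch of the baseline the authors start from, fitting a traditional two-parameter Weibull distribution to hypothetical breakdown-field data (the proposed dispersion-factor modification itself is not reproduced here):

```python
# Minimal sketch (synthetic breakdown fields, not the paper's measurements, and
# without the proposed dispersion factor): fit the traditional two-parameter
# Weibull distribution that the authors take as their starting point.
import numpy as np
from scipy import stats

# Hypothetical breakdown fields (MV/cm) for a batch of SiN capacitors.
fields = stats.weibull_min.rvs(c=8.0, scale=9.5, size=100, random_state=42)

shape, loc, scale = stats.weibull_min.fit(fields, floc=0)   # location fixed at 0
print(f"Weibull shape beta = {shape:.2f}, scale eta = {scale:.2f} MV/cm")

# Cumulative failure probability F(E) = 1 - exp(-(E/eta)^beta); a larger beta
# means a tighter (less dispersed) breakdown distribution.
E = np.linspace(7, 11, 5)
print(np.round(1 - np.exp(-(E / scale) ** shape), 3))
```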

Development of an Item Selection Method for Test-Construction by using a Relationship Structure among Abilities

  • Kim, Sung-Ho;Jeong, Mi-Sook;Kim, Jung-Ran
    • Communications for Statistical Applications and Methods / v.8 no.1 / pp.193-207 / 2001
  • When designing a test set, we need to consider constraints on items that are deemed important by item developers or test specialists. The constraints are essentially on the components of the test domain or the abilities relevant to a given test set, so if the test domain could be represented in a more refined form, test construction could be made more efficient. We assume that relationships among task abilities are representable by a causal model and that item response theory (IRT) is not fully available for them. In such a case, we cannot apply traditional item selection methods that are based on IRT. In this paper, we use entropy as an uncertainty measure for making inferences on task abilities and develop an optimal item selection algorithm that most reduces the entropy of task abilities when items are selected from an item pool.
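
The selection criterion described above can be illustrated by ranking candidate items according to how much one observed response is expected to reduce the entropy of the ability distribution. A minimal sketch with a single discrete ability and hypothetical item parameters, rather than the paper's causal network of abilities:

```python
# Minimal sketch (a single discrete ability and hypothetical item parameters,
# not the paper's causal network of abilities): rank candidate items by how
# much one observed response is expected to reduce the entropy of the ability
# distribution, and greedily pick the top items for the test.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

prior = np.array([1 / 3, 1 / 3, 1 / 3])    # P(ability = low, medium, high)

# P(correct response | ability level) for five candidate items (rows = items).
p_correct = np.array([
    [0.20, 0.50, 0.80],
    [0.10, 0.30, 0.90],
    [0.40, 0.50, 0.60],
    [0.05, 0.50, 0.95],
    [0.30, 0.60, 0.70],
])

def expected_posterior_entropy(prior, pc):
    """Expected ability entropy after observing one correct/incorrect response."""
    p_resp = np.array([np.sum(prior * pc), np.sum(prior * (1 - pc))])
    post_correct = prior * pc / p_resp[0]
    post_wrong = prior * (1 - pc) / p_resp[1]
    return p_resp[0] * entropy(post_correct) + p_resp[1] * entropy(post_wrong)

remaining = list(range(len(p_correct)))
for step in range(3):                        # build a three-item test
    gains = [expected_posterior_entropy(prior, p_correct[i]) for i in remaining]
    best = remaining.pop(int(np.argmin(gains)))
    print(f"step {step + 1}: select item {best} "
          f"(expected entropy {min(gains):.3f} bits)")
```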
