• Title/Summary/Keyword: Choice Probability

Search Results: 172

Evaluation of DVH and NTCP in Hepatoma for 3D Conformal Radiation Therapy (3차원 입체조형치료에 대한 간암의 선량분포와 정상조직손상확률의 평가)

  • Chung, Kap-Soo;Yang, Han-Joon;Ko, Shin-Gwan
    • Journal of radiological science and technology
    • /
    • v.20 no.2
    • /
    • pp.79-82
    • /
    • 1997
  • Image-based three-dimensional radiation treatment planning (3D RTP) has the potential to generate superior treatment plans, and advances in computer technology and software development are quickly making 3D RTP a feasible choice for routine clinical use. However, it has become clear that evaluating a 3D plan is more difficult than evaluating a 2D plan. A number of tools have been developed to facilitate the evaluation of 3D RTP both qualitatively and quantitatively. For example, the beam's eye view (BEV) is one of the most powerful and time-saving qualitative tools, and the dose-volume histogram (DVH) has proven to be one of the most valuable quantitative tools. The DVH, however, is limited in evaluating competing plans with respect to the biological effects on tissues and critical organs. There is therefore strong interest in quantitative models that predict the likely biological response of irradiated organs and tissues, such as the tumor control probability (TCP) and the normal tissue complication probability (NTCP). The DVH and NTCP of hepatoma were evaluated for three-dimensional conformal radiotherapy (3D CRT). In addition, 3D RTP was analyzed in terms of dose optimization based on beam arrangement and beam modulation.
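The DVH-to-NTCP step described above can be sketched numerically. Below is a minimal illustration using the widely used Lyman-Kutcher-Burman (LKB) NTCP model applied to a differential DVH; the abstract does not specify which NTCP model the study used, and the DVH bins and parameter values (TD50, m, n) are illustrative assumptions, not values from the paper.

```python
import math

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose from a differential DVH.
    doses: dose per bin (Gy); volumes: fractional volume per bin (sum to 1)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

def ntcp_lkb(doses, volumes, td50, m, n):
    """LKB model: NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    t = (gEUD(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative liver DVH: 40% of volume at 10 Gy, 30% at 30 Gy, 30% at 45 Gy.
p = ntcp_lkb([10.0, 30.0, 45.0], [0.4, 0.3, 0.3], td50=40.0, m=0.15, n=0.97)
print(f"NTCP = {p:.3f}")
```

Comparing such NTCP values across candidate beam arrangements is one way a 3D plan can be ranked quantitatively when DVHs alone are inconclusive.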


Exploring modern machine learning methods to improve causal-effect estimation

  • Kim, Yeji;Choi, Taehwa;Choi, Sangbum
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.2
    • /
    • pp.177-191
    • /
    • 2022
  • This paper addresses the use of machine learning methods for causal estimation of treatment effects from observational data. Even though randomized experimental trials are the gold standard for revealing potential causal relationships, observational studies are another rich source for investigating exposure effects, for example in research on the comparative effectiveness and safety of treatments, where the causal effect can be identified if the covariates contain all confounding variables. In this context, statistical regression models for the expected outcome and for the probability of treatment are often imposed, and they can be combined in a clever way to yield more efficient and robust causal estimators. Recently, targeted maximum likelihood estimation and causal random forests have been proposed and extensively studied for the use of data-adaptive regression in estimating causal inference parameters. Machine learning methods are a natural choice in these settings to improve the quality of the final estimate of the treatment effect. We explore how the design and training of several machine learning algorithms can be adapted for causal inference and study their finite-sample performance through simulation experiments under various scenarios. An application to percutaneous coronary intervention (PCI) data shows that these adaptations can improve simple linear regression-based methods.
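The "clever combination" of an outcome model and a treatment-probability model mentioned above can be illustrated with the augmented inverse-probability-weighted (AIPW) estimator, the doubly robust form underlying methods such as TMLE. This is a stdlib-only sketch on synthetic data with a known true effect of 2.0; the data-generating process, the correct propensity model, and the deliberately crude outcome models are all assumptions for illustration.

```python
import math
import random
import statistics

random.seed(0)

def simulate(n):
    """Synthetic observational data: confounder x, treatment a, outcome y.
    The true average treatment effect is 2.0."""
    rows = []
    for _ in range(n):
        x = random.gauss(0.0, 1.0)
        e_x = 1.0 / (1.0 + math.exp(-x))        # true propensity score
        a = 1 if random.random() < e_x else 0
        y = x + 2.0 * a + random.gauss(0.0, 1.0)
        rows.append((x, a, y))
    return rows

def aipw(rows, m1, m0, e):
    """Augmented IPW (doubly robust) estimate of the average treatment
    effect, combining outcome models m1/m0 with a propensity model e."""
    scores = [m1(x) - m0(x)
              + a * (y - m1(x)) / e(x)
              - (1 - a) * (y - m0(x)) / (1.0 - e(x))
              for x, a, y in rows]
    return statistics.mean(scores)

rows = simulate(20000)
e = lambda x: 1.0 / (1.0 + math.exp(-x))        # correct propensity model
m1 = lambda x: 2.0                              # deliberately crude outcome
m0 = lambda x: 0.0                              # models (intercept only)
print(aipw(rows, m1, m0, e))                    # close to the true effect 2.0
```

Double robustness means the estimate stays consistent if either the outcome models or the propensity model is correct, which is why data-adaptive machine learning fits can be plugged into either slot.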

Study on Variability of WTP Estimates by the Estimation Methods using Dichotomous Choice Contingent Valuation Data (양분선택형 조건부가치측정(CV) 자료의 추정방법에 따른 지불의사금액의 변동성 연구)

  • Shin, Youngchul
    • Environmental and Resource Economics Review
    • /
    • v.25 no.1
    • /
    • pp.1-25
    • /
    • 2016
  • This study investigated the variability of WTP estimates (i.e., the mean or median) under ad hoc assumptions of specific parametric probability distributions (normal, logistic, lognormal, and exponential) when estimating the WTP function from dichotomous-choice CV data on mortality risk reduction. From the perspective of policy decisions, the variability of these WTP estimates is intolerable in comparison with those from the Turnbull nonparametric estimation method, which is free of ad hoc distributional assumptions. Turnbull nonparametric estimation avoids the misspecification bias that ad hoc assumptions of specific parametric distributions can introduce. Furthermore, WTP estimates from Turnbull nonparametric estimation are robust in that similar estimates are elicited from single- and double-bounded dichotomous-choice CV data, and statistically significant WTP estimates can be obtained even when the parametric estimation methods cannot produce them. When the WTP estimates from parametric methods vary considerably and no criterion of model adequacy is available, the mean WTP from Turnbull nonparametric estimation can serve as a robust estimate free of ad hoc assumptions, avoiding controversy in policy decisions.
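The Turnbull lower-bound mean WTP referred to above can be sketched as follows: monotonize the bid-level acceptance shares by pooling adjacent violators, then value the probability mass in each bid interval at its lower endpoint. The bid amounts and acceptance shares below are hypothetical, not the paper's data.

```python
def turnbull_lower_bound(bids, yes_shares):
    """Turnbull lower-bound mean WTP from single-bounded dichotomous-choice
    data. bids: ascending bid amounts; yes_shares: share of respondents
    accepting each bid."""
    # Pool adjacent violators so acceptance is non-increasing in the bid.
    merged = []  # list of [pooled share, number of cells pooled]
    for share in yes_shares:
        merged.append([share, 1])
        while len(merged) > 1 and merged[-2][0] < merged[-1][0]:
            s2, w2 = merged.pop()
            s1, w1 = merged.pop()
            merged.append([(s1 * w1 + s2 * w2) / (w1 + w2), w1 + w2])
    surv = [s for s, w in merged for _ in range(w)]
    # Value the mass in [bid_j, bid_{j+1}) at bid_j, the mass below the
    # first bid at 0, and the mass above the highest bid at that bid
    # (hence a conservative lower bound on the mean).
    mean_lb = 0.0
    for j, bid in enumerate(bids):
        s_next = surv[j + 1] if j + 1 < len(surv) else 0.0
        mean_lb += bid * (surv[j] - s_next)
    return mean_lb

# Hypothetical survey: 80% accept a 1,000-won bid, ..., 30% accept 10,000 won.
print(turnbull_lower_bound([1000, 3000, 5000, 10000], [0.80, 0.62, 0.65, 0.30]))
```

Note that the non-monotonic 0.62/0.65 pair is pooled to 0.635 rather than explained away by a distributional assumption, which is the misspecification-free treatment of sampling noise the abstract emphasizes.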

Safety Regulation of Railway Embankment using Velocity of Failure Probability (파괴확률 변화속도를 이용한 철도 성토사면의 안전관리기준)

  • Kim, Hyun-Ki;Shin, Min-Ho;Lee, Sung-Hyeok;Choi, Chan-Yong
    • Journal of the Korean Society for Railway
    • /
    • v.12 no.6
    • /
    • pp.1037-1042
    • /
    • 2009
  • In Korea, the safety of railway embankments is managed by means of the factor of safety in both dry and rainy seasons. The safety factor obtained from limit equilibrium analysis varies with external conditions, and because it exhibits no clear inflection point, it is difficult to use for managing the safety of trains. Safety measures such as warning signs, speed restrictions, and train stoppages are the best means of reducing damage at embankments where disasters are a concern. In this study, an additional index, the velocity of the failure probability, is proposed to supplement the present safety standards, based on unsaturated soil mechanics and reliability analysis. This index has an apparent inflection point near the present safety criterion, making it possible to refine the regulations used for the safety management and monitoring of embankments.
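The proposed index, the velocity of the failure probability, can be sketched as a finite difference of a failure-probability time series, with an alarm raised when the rate of change exceeds a threshold. The time series, threshold, and alarm rule below are illustrative assumptions, not the paper's criteria.

```python
def failure_prob_velocity(times, probs):
    """Finite-difference rate of change of the failure probability with
    respect to time (e.g., hours of rainfall). A sketch of the warning
    index described in the abstract, not the paper's implementation."""
    return [(p1 - p0) / (t1 - t0)
            for (t0, p0), (t1, p1) in zip(zip(times, probs),
                                          zip(times[1:], probs[1:]))]

def first_alarm(times, probs, threshold):
    """Return the first time at which the velocity exceeds the (illustrative)
    threshold, or None if it never does."""
    for t, v in zip(times[1:], failure_prob_velocity(times, probs)):
        if v > threshold:
            return t
    return None

# Failure probability rising slowly, then sharply, during a storm (hours):
times = [0, 1, 2, 3, 4, 5]
probs = [0.01, 0.012, 0.015, 0.05, 0.20, 0.45]
print(first_alarm(times, probs, threshold=0.02))
```

The inflection the abstract describes shows up here as the jump in velocity between hours 2 and 3, well before the failure probability itself looks alarming.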

The application of Multiple Discrete Continuous Extreme Value Model on fresh meat purchase in Korea (다중 이산 연속선택모형(MDCEV)을 이용한 한국 소비자의 신선육 구매 결정 요인)

  • Song, Cheol Ho;Eom, Jin Yong;Jang, Ik Hoon;Choe, Young Chan
    • Journal of Agricultural Extension & Community Development
    • /
    • v.24 no.4
    • /
    • pp.249-264
    • /
    • 2017
  • Modeling consumer demand for fresh meat requires accounting for distinctive features that other food products do not share. Most fresh meat products are unbranded, bought by weight, and affected by macro shocks such as seasonality, holiday effects, and disease outbreaks. Furthermore, consumers tend to purchase multiple categories of fresh meat in a given week. We therefore apply a multiple discrete/continuous model to fresh meat consumption data to study the effects of macro shocks, as well as of price changes, on fresh meat sales. The results show that each fresh meat is relatively more likely to be bought in its own peak season, compared with imported pork, which serves as the reference category in this analysis. To clarify the effect of disease outbreaks, we perform a further analysis of the effect of livestock disease on fresh meat purchase probability. It shows that the 2014 avian influenza outbreak had a strong negative impact on the purchase probability of chicken, and that foot-and-mouth disease had a negative impact on the purchase probability of pork and beef for part of the outbreak periods.

A Study on Greenhouse Farmers' Willingness to Pay of Agricultural Water Supply through Pipeline (관수로 농업용수 공급에 대한 시설재배 농가의 비용 지불의사 연구)

  • Lim, Cheong-Ryong;Park, Seong-gyeong;Chung, Won-ho
    • Journal of Korean Society of Rural Planning
    • /
    • v.24 no.2
    • /
    • pp.109-114
    • /
    • 2018
  • In this study, we estimate greenhouse farmers' willingness to pay for agricultural water supplied through pipelines. First, orthogonal and block designs were used in the questionnaire to make the survey easier to administer. Second, a theoretical model was constructed by specifying a random utility function, and its parameters were estimated with a conditional logit model. Third, all estimated coefficients were statistically significant at the 1% level. The results are summarized as follows. First, the probability of selection increases when maintenance is carried out by the Korea Rural Community Corporation or a local government. Second, the probability of selection increases when the quality of the pipeline water supply exceeds the current level. Third, the marginal willingness to pay is 44 won per ton if the Korea Rural Community Corporation carries out maintenance and 25 won per ton if a local government does. Fourth, across the quality levels of the agricultural water supply, the marginal willingness to pay is 101, 114, and 120 won per ton, respectively. This study can serve as baseline data for setting the cost of agricultural water supplied through pipelines.
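The marginal WTP figures above follow from a standard property of the conditional logit with linear utility: holding utility fixed, the marginal WTP for an attribute is the negative ratio of its coefficient to the cost coefficient. The coefficients below are hypothetical values chosen only to reproduce the 44-won figure, not the paper's estimates.

```python
def marginal_wtp(beta_attr, beta_cost):
    """Marginal willingness to pay implied by a conditional logit with
    linear utility V = ... + beta_attr * attribute + beta_cost * cost:
    the cost increase that exactly offsets a unit gain in the attribute."""
    return -beta_attr / beta_cost

# Hypothetical coefficients: negative on cost (per won), positive on
# maintenance by the Korea Rural Community Corporation.
beta_cost = -0.02
beta_krc = 0.88
print(marginal_wtp(beta_krc, beta_cost))  # about 44 won per ton
```

The same ratio applied to the other attribute coefficients would yield the remaining per-ton figures reported in the abstract.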

A Study on Error of Frequency Rainfall Estimates Using Random Variates (무작위변량을 이용한 강우빈도분석시 내외삽오차에 관한 연구)

  • Chai, Han Kyu;Eam, Ki Ok
    • Journal of Industrial Technology
    • /
    • v.20 no.A
    • /
    • pp.159-167
    • /
    • 2000
  • In rainfall frequency analysis, error terms arise because the record lengths of the available data differ, and the choice of probability distribution matters. Error analyses typically use the sample data directly, but this is problematic when the number of sample data is small. In this study, therefore, daily rainfall data were randomized by the Monte Carlo method to account for sampling variability: synthetic records of the same length as the observed record were drawn from an assumed parent distribution, and the frequency analysis results were compared against that parent distribution. In conclusion, for rainfall frequency analysis in the Chuncheon region, the Gamma-II distribution showed the smallest RMSE. When the RMSE of the frequency estimates was evaluated with random variates, it varied considerably: the RMSE increased gradually with the return period and decreased as the number of data increased.
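The Monte Carlo experiment described above can be sketched: draw synthetic records from an assumed parent distribution, refit the distribution, and measure the RMSE of the fitted T-year quantile against the parent's true quantile. To stay stdlib-only, this sketch uses a Gumbel parent with a method-of-moments fit rather than the Gamma-II distribution the study favored; the parameters and record lengths are illustrative.

```python
import math
import random
import statistics

random.seed(1)

def gumbel_sample(mu, beta):
    u = random.random()
    return mu - beta * math.log(-math.log(u))

def gumbel_quantile(mu, beta, T):
    """T-year quantile: non-exceedance probability p = 1 - 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def fit_gumbel_moments(sample):
    """Method-of-moments fit: beta = s*sqrt(6)/pi, mu = mean - gamma*beta."""
    s = statistics.stdev(sample)
    beta = s * math.sqrt(6.0) / math.pi
    return statistics.mean(sample) - 0.5772156649 * beta, beta

def rmse_of_quantile(n, T, reps=500, mu=50.0, beta=20.0):
    """RMSE of the fitted T-year quantile over `reps` synthetic records
    of length n drawn from the true Gumbel(mu, beta) parent."""
    true_q = gumbel_quantile(mu, beta, T)
    errs = []
    for _ in range(reps):
        sample = [gumbel_sample(mu, beta) for _ in range(n)]
        mu_hat, beta_hat = fit_gumbel_moments(sample)
        errs.append(gumbel_quantile(mu_hat, beta_hat, T) - true_q)
    return math.sqrt(statistics.fmean(e * e for e in errs))

r10 = rmse_of_quantile(n=30, T=10)         # short record, modest return period
r100 = rmse_of_quantile(n=30, T=100)       # error grows with return period...
r100_long = rmse_of_quantile(n=60, T=100)  # ...and shrinks with record length
print(r10, r100, r100_long)
```

The two comparisons reproduce the qualitative conclusion of the abstract: RMSE rises with the return period and falls as the number of data increases.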


A General Procedure for Testing Equivalence

  • Sung Nae Kyung
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.2
    • /
    • pp.491-501
    • /
    • 1998
  • Motivated by bioequivalence studies, which involve comparisons of pharmaceutically equivalent dosage forms, we propose a more general decision rule for simultaneously showing equivalence between multiple means and a control mean. This testing procedure addresses the situation in which one must decide on the bioequivalence of an original drug product and several generic formulations of that drug. The general test is developed from a spherical confidence region, a direct extension of the usual t-based confidence interval rule formally approved by the U.S. Food and Drug Administration. We characterize the test by its probability-of-rejection curves and assess its performance via Monte Carlo simulation. Since the manufacturer's main concern is the proper choice of sample sizes, we provide optimal sample sizes from the Monte Carlo simulation results. We also consider an application of the generalized equivalence test to a repeated measures design.
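The t-based confidence-interval rule that the proposed spherical region generalizes can be sketched for a single mean difference: declare equivalence when the 100(1 - 2α)% interval lies inside the equivalence limits, the two-one-sided-tests form of the FDA rule. A normal critical value stands in for the t quantile here, an approximation made to keep the sketch stdlib-only; the data and limits are illustrative.

```python
import math
import statistics

def tost_equivalent(diffs, lower, upper, alpha=0.05):
    """CI-inclusion equivalence rule (two one-sided tests): declare
    equivalence when the 100*(1 - 2*alpha)% interval for the mean
    difference lies inside (lower, upper). A normal critical value
    approximates the t quantile (an assumption for this sketch)."""
    n = len(diffs)
    m = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)
    z = statistics.NormalDist().inv_cdf(1.0 - alpha)
    return lower < m - z * se and m + z * se < upper

# Test-vs-reference differences well inside equivalence limits of +/-0.2:
within = [0.02, -0.01, 0.03, 0.0, 0.01, -0.02, 0.02, 0.01, -0.01, 0.0]
print(tost_equivalent(within, lower=-0.2, upper=0.2))
```

The spherical region of the paper replaces this single interval with one simultaneous region covering several generic-versus-control differences at once.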


Choice of Statistical Calibration Procedures When the Standard Measurement is Also Subject to Error

  • Lee, Seung-Hoon;Yum, Bong-Jin
    • Journal of the Korean Statistical Society
    • /
    • v.14 no.2
    • /
    • pp.63-75
    • /
    • 1985
  • This paper considers a statistical calibration problem in which the standard as well as the nonstandard measurement is subject to error. Since the classical approach cannot handle this situation properly, a functional relationship model with the additional feature of prediction is proposed. Four approaches to the problem, combining two estimation techniques (ordinary and grouping least squares) with two prediction methods (classical and inverse prediction), are considered. The performance of each approach is assessed by Monte Carlo simulation in terms of the probability of concentration. The simulation results indicate that ordinary least squares with inverse prediction is generally preferred for interpolation, while grouping least squares with classical prediction turns out to be better for extrapolation.
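The two prediction methods compared above can be sketched for simple linear calibration: classical prediction fits y on x and inverts the fitted line at a new observation y0, while inverse prediction regresses x on y directly and evaluates at y0. The data below are illustrative; on exactly linear data the two coincide, and they diverge once measurement error enters either variable.

```python
import statistics

def fit_ols(x, y):
    """Ordinary least-squares intercept and slope for y regressed on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def classical_predict(x, y, y0):
    """Classical calibration: fit y = a + b*x, then invert: x0 = (y0 - a)/b."""
    a, b = fit_ols(x, y)
    return (y0 - a) / b

def inverse_predict(x, y, y0):
    """Inverse calibration: regress x directly on y and evaluate at y0."""
    a, b = fit_ols(y, x)
    return a + b * y0

# On exactly linear data (y = 1 + 2x) the two methods agree at y0 = 6:
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
print(classical_predict(x, y, 6.0), inverse_predict(x, y, 6.0))
```

The paper's setting adds error to the standard measurement x as well, which is why the comparison between these two predictors (and between the two estimation techniques) becomes non-trivial.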


Probabilistic assessment on the basis of interval data

  • Thacker, Ben H.;Huyse, Luc J.
    • Structural Engineering and Mechanics
    • /
    • v.25 no.3
    • /
    • pp.331-345
    • /
    • 2007
  • Uncertainties enter a complex analysis from a variety of sources: variability, lack of data, human error, model simplification, and incomplete understanding of the underlying physics. For many important engineering applications, however, insufficient data are available to justify the choice of a particular probability density function (PDF). Sometimes the only data available are interval estimates representing, often conflicting, expert opinion. In this paper we demonstrate that Bayesian estimation techniques can be used successfully in applications where only vague interval measurements are available. The proposed approach is intended to fit within an established and widely accepted probabilistic framework. To circumvent the problem of selecting a specific PDF when only little or vague data are available, a hierarchical model over a continuous family of PDFs is used. Classical Bayesian estimation methods are extended to make use of imprecise interval data: each expert opinion (interval datum) is interpreted as a random interval sample from a parent PDF. Consequently, partial conflict between experts is automatically accounted for through the likelihood function.
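The likelihood treatment of interval data described above can be sketched with a normal parent PDF: the likelihood of an expert's interval [l, u] is the probability mass F(u) - F(l) the parent assigns to it, and a grid posterior over the parent's mean then weighs partially conflicting intervals automatically. The intervals, the grid, and the known σ below are simplifying assumptions for illustration, not the paper's hierarchical model.

```python
import statistics

def interval_posterior(intervals, mu_grid, sigma):
    """Grid posterior over the mean of a normal parent PDF given interval
    data: each interval's likelihood is the mass F(u) - F(l) the parent
    assigns to it. Flat prior over mu_grid; sigma assumed known."""
    post = []
    for mu in mu_grid:
        nd = statistics.NormalDist(mu, sigma)
        like = 1.0
        for lo, hi in intervals:
            like *= nd.cdf(hi) - nd.cdf(lo)
        post.append(like)
    total = sum(post)
    return [p / total for p in post]

# Two partially conflicting expert intervals; the posterior over the mean
# concentrates where they overlap.
intervals = [(8.0, 12.0), (10.0, 14.0)]
grid = [i * 0.5 for i in range(10, 40)]   # candidate means 5.0 .. 19.5
post = interval_posterior(intervals, grid, sigma=2.0)
best = grid[max(range(len(grid)), key=post.__getitem__)]
print(best)
```

Because a conflicting interval simply contributes a smaller probability mass rather than a hard constraint, partial disagreement between experts is down-weighted instead of causing a contradiction, which is the behavior the abstract highlights.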