Title/Summary/Keyword: monte carlo methods

Assessment of Applicability of Portable HPGe Detector with In Situ Object Counting System based on Performance Evaluation of Thyroid Radiobioassays

  • Park, MinSeok;Kwon, Tae-Eun;Pak, Min Jung;Park, Se-Young;Ha, Wi-Ho;Jin, Young-Woo
    • Journal of Radiation Protection and Research / v.42 no.2 / pp.83-90 / 2017
  • Background: Thyroid radiobioassay measurements vary from case to case owing to the individual characteristics of the subjects, especially through potential variation in the counting efficiency. The In Situ Object Counting System (ISOCS) was developed to perform efficiency calibration based on Monte Carlo calculation, as an alternative to conventional calibration methods. The purpose of this study is to evaluate the applicability of ISOCS to thyroid radiobioassays by comparison with a conventional thyroid monitoring system. Materials and Methods: The efficiency calibration of a portable high-purity germanium (HPGe) detector was performed using ISOCS software, whereas the conventional efficiency calibration, which requires a radioactive calibration source, was applied to a scintillator-based thyroid monitor. Four radioiodine samples containing $^{125}I$ and $^{131}I$, in both aqueous solution and gel form, were measured to evaluate radioactivity in the thyroid. The ANSI/HPS N13.30 performance criteria, which comprise the relative bias, relative precision, and root-mean-squared error, were applied to evaluate the performance of the measurement systems. Results and Discussion: The portable HPGe detector could measure both radioiodines with ISOCS, but the thyroid monitor could not measure $^{125}I$ because of the limited energy resolution of the NaI(Tl) scintillator. The $^{131}I$ results from both detectors agreed to within 5% of the certified results, and the $^{125}I$ results from the portable HPGe detector agreed to within 10%. All measurement results complied with the ANSI/HPS N13.30 performance criteria. Conclusion: The results of the intercomparison program indicate the feasibility of applying ISOCS software to direct thyroid radiobioassays. A portable HPGe detector with ISOCS software offers convenient efficiency calibration and higher energy resolution for identifying photopeaks than a conventional thyroid monitor with a NaI(Tl) scintillator, and its application in a radiation emergency can improve the response in terms of internal contamination monitoring.
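
The ANSI/HPS N13.30 criteria named above reduce to three summary statistics over a set of test measurements. Below is a minimal sketch in Python, assuming the commonly cited definitions (relative bias as the mean relative error against the certified activity, relative precision as the standard deviation of the individual relative errors, and the root-mean-squared error combining both); the sample values are hypothetical, not the study's data.

```python
import numpy as np

def n13_30_metrics(measured, certified):
    """Relative bias, relative precision, and RMSE for a bioassay
    performance test (sketch of ANSI/HPS N13.30-style statistics)."""
    rel_err = (np.asarray(measured) - certified) / certified  # per-sample relative error
    bias = rel_err.mean()                   # relative bias B_r
    precision = rel_err.std(ddof=1)         # relative precision S_Br
    rmse = np.sqrt(bias**2 + precision**2)  # root-mean-squared error
    return bias, precision, rmse

# Hypothetical repeated measurements of a certified 5.00 kBq sample.
measured = [4.85, 5.10, 4.95, 5.20, 4.90]
print(n13_30_metrics(measured, certified=5.00))
```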

A Feasibility Study on Using Neural Network for Dose Calculation in Radiation Treatment (방사선 치료 선량 계산을 위한 신경회로망의 적용 타당성)

  • Lee, Sang Kyung;Kim, Yong Nam;Kim, Soo Kon
    • Journal of Radiation Protection and Research / v.40 no.1 / pp.55-64 / 2015
  • Dose calculations, a crucial requirement for radiotherapy treatment planning systems, must be both accurate and fast. Conventional treatment planning dose algorithms are rapid but lack precision, while Monte Carlo methods are the most accurate but time consuming. A combined system, in which Monte Carlo methods calculate part of the domain of interest and a neural network calculates the rest, can compute the dose distribution both rapidly and accurately. A preliminary study showed that neural networks can approximate functions containing discontinuities and inflection points, features that dose distributions in inhomogeneous media also exhibit. The performance of the scaled conjugate gradient algorithm and the Levenberg-Marquardt algorithm, used to train the neural network with different numbers of neurons, was compared. Finally, dose distributions of a homogeneous phantom calculated by a commercial treatment planning system were used as training data for the neural network. For the homogeneous phantom, the mean squared error of the percent depth dose was 0.00214. Further work is planned to develop the neural network model for 3-dimensional dose calculations in homogeneous and inhomogeneous phantoms.
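
As a toy illustration of the function-fitting task described above, the sketch below trains a small neural network on a synthetic percent-depth-dose curve and reports the mean squared error. scikit-learn provides neither the scaled conjugate gradient nor the Levenberg-Marquardt trainer used in the paper, so L-BFGS stands in; the curve is an invented build-up-plus-decay shape, not clinical data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Synthetic percent-depth-dose: rapid build-up then exponential fall-off.
depth = np.linspace(0.0, 30.0, 300).reshape(-1, 1)              # depth in cm
pdd = (100.0 * (1.0 - np.exp(-2.0 * depth)) * np.exp(-0.05 * depth)).ravel()

# Small MLP; L-BFGS used here in place of SCG / Levenberg-Marquardt.
net = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                   solver="lbfgs", max_iter=5000, random_state=0)
net.fit(depth, pdd)

print("MSE:", mean_squared_error(pdd, net.predict(depth)))
```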

An Improved Monte-Carlo Simulation Method for Typhoon Risk Assessment in Korea (개선(改善)된 Monte-Carlo 시뮬레이션 방법(方法)에 의한 한국(韓國)의 태풍위험도(颱風危險度) 분석(分析))

  • Cho, Hyo Nam;Chang, Dong Il;Cha, Cheol Jun
    • KSCE Journal of Civil and Environmental Engineering Research / v.7 no.4 / pp.159-165 / 1987
  • This study proposes an operational method for typhoon risk assessment in Korea, using statistical analysis and a probabilistic description of typhoons at a site. Two alternative simulation and fitting methods are discussed for predicting probabilistic typhoon wind speeds by indirect methods. A commonly used indirect method is Russell's procedure, which generates about 1,000 simulated typhoon wind data, statistically evaluates the baseline distribution, and then fits the results to the Weibull distribution based on a probabilistic description of the climatological characteristics and the wind-field model of typhoons at a site. The alternative procedure proposed in this paper simulates extreme typhoon wind data for about 150-200 years and directly fits the generated data to the Weibull distribution. The computational results show that the proposed simulation method is more economical and reasonable for typhoon risk assessment based on the indirect method. Using the proposed indirect method, the probabilistic design wind speed for transmission towers in the typhoon-prone region along the south-western coast is investigated.
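
The direct-fitting step of the proposed procedure can be sketched as follows: generate a modest sample of annual-extreme wind speeds and fit a Weibull distribution to them directly. The generating distribution and all parameter values below are invented placeholders for the paper's wind-field model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1987)

# Stand-in for the typhoon wind-field simulation: ~200 years of
# annual-extreme wind speeds (m/s), drawn from an arbitrary Gumbel.
annual_extremes = stats.gumbel_r.rvs(loc=30.0, scale=6.0, size=200,
                                     random_state=rng)

# Fit a Weibull distribution directly to the simulated extremes.
shape, loc, scale = stats.weibull_min.fit(annual_extremes, floc=0.0)

# 50-year return wind speed: non-exceedance probability 1 - 1/50.
v50 = stats.weibull_min.ppf(1.0 - 1.0 / 50.0, shape, loc=loc, scale=scale)
print(f"Weibull shape={shape:.2f}, scale={scale:.1f}, V50={v50:.1f} m/s")
```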

Construction of Logic Trees and Hazard Curves for Probabilistic Tsunami Hazard Analysis (확률론적 지진해일 재해도평가를 위한 로직트리 작성 및 재해곡선 산출 방법)

  • Jho, Myeong Hwan;Kim, Gun Hyeong;Yoon, Sung Bum
    • Journal of Korean Society of Coastal and Ocean Engineers / v.31 no.2 / pp.62-72 / 2019
  • Because of the difficulties in forecasting the intensity and source location of tsunamis, countermeasures prepared on a deterministic basis can fail to work properly. Thus, there is an increasing demand for tsunami hazard analyses that treat the uncertainties of tsunami behavior probabilistically. In this paper a fundamental study is conducted to perform probabilistic tsunami hazard analysis (PTHA) for the tsunamis that have caused disasters on the east coast of Korea. A logic tree approach is employed to consider the uncertainties of the initial free-surface displacement and the tsunami height distribution along the coast. The branches of the logic tree are constructed to reflect the characteristics of tsunamis that have attacked the east coast of Korea. The computational time for extracting the fractile curves increases nonlinearly with the number of branches, so an improved method that remains practical even for a very large number of branches is proposed to save computational time. The performance of the discrete weight distribution method, proposed first in this study, is compared with those of the conventional sorting method and the Monte Carlo method. The proposed method is comparable to the conventional methods in accuracy and is more efficient in computational time than the conventional sorting method. The Monte Carlo method, however, is more efficient than the other two methods when the numbers of branches and fault segments increase significantly.
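
The fractile-extraction step that dominates the cost can be sketched directly: at each hazard level, sort the branch exceedance probabilities, accumulate the branch weights, and read off the weighted percentile. This corresponds to the conventional sorting method the study benchmarks against; the branch values and weights below are hypothetical.

```python
import numpy as np

def fractile_curve(branch_probs, weights, q):
    """Weighted q-fractile across logic-tree branches, one hazard level
    at a time (conventional sorting method; sketch only).

    branch_probs: (n_branches, n_levels) exceedance probabilities
    weights:      (n_branches,) branch weights summing to 1
    """
    out = np.empty(branch_probs.shape[1])
    for j in range(branch_probs.shape[1]):
        order = np.argsort(branch_probs[:, j])   # sort branches at this level
        cum = np.cumsum(weights[order])          # cumulative branch weight
        out[j] = branch_probs[order, j][np.searchsorted(cum, q)]
    return out

rng = np.random.default_rng(0)
probs = rng.uniform(1e-5, 1e-2, size=(8, 5))   # 8 branches, 5 tsunami heights
w = np.full(8, 1.0 / 8.0)                       # equal branch weights
print(fractile_curve(probs, w, q=0.84))         # 84th-percentile hazard curve
```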

Resistance Factors of Driven Steel Pipe Piles for LRFD Design in Korea (LRFD 설계를 위한 국내 항타강관말뚝의 저항계수 산정)

  • Park, Jae Hyun;Huh, Jungwon;Kim, Myung Mo;Kwak, Kiseok
    • KSCE Journal of Civil and Environmental Engineering Research / v.28 no.6C / pp.367-377 / 2008
  • As part of a study to develop LRFD (Load and Resistance Factor Design) codes for foundation structures in Korea, resistance factors for the static bearing capacity of driven steel pipe piles were calibrated in the framework of reliability theory. Fifty-seven data sets of static load tests and soil property tests conducted across the country were collected, and the test piles were sorted into two cases: SPT N at the pile tip less than 50, and SPT N at the pile tip equal to or greater than 50. The static bearing capacity formula and the Meyerhof method using N values were applied to calculate the expected design bearing capacities of the piles. The resistance bias factors were evaluated for the two static design methods by comparing the representative measured bearing capacities with the expected design values. Reliability analysis was performed with two advanced methods, the First Order Reliability Method (FORM) and the Monte Carlo Simulation (MCS) method, using resistance bias factor statistics. The target reliability indices were selected as 2.0 and 2.33 for the group pile case and 2.5 for the single pile case, in consideration of the reliability level of current design practice, the redundancy of pile groups, the acceptable risk level, construction quality control, and the significance of individual structures. Resistance factors for driven steel pipe piles were recommended based on the results derived from FORM and MCS.
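
The MCS side of such a calibration reduces to a short loop: sample resistance and load effects from assumed distributions, estimate the failure probability, and convert it to a reliability index. The lognormal statistics below are illustrative placeholders, not the study's calibrated values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 2_000_000

# Assumed lognormal resistance bias (mean 1.2, COV 0.4); placeholders.
mu_R, cov_R = 1.2, 0.4
sigma_ln = np.sqrt(np.log(1.0 + cov_R**2))
mu_ln = np.log(mu_R) - 0.5 * sigma_ln**2
R = rng.lognormal(mu_ln, sigma_ln, n)    # resistance samples
Q = rng.lognormal(np.log(0.5), 0.1, n)   # load-effect samples (placeholder)

pf = np.mean(R < Q)                      # Monte Carlo failure probability
beta = -norm.ppf(pf)                     # reliability index
print(f"pf = {pf:.2e}, beta = {beta:.2f}")
```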

Estimation of Interaction Effects among Nucleotide Sequence Variants in Animal Genomes

  • Lee, Chaeyoung;Kim, Younyoung
    • Asian-Australasian Journal of Animal Sciences / v.22 no.1 / pp.124-130 / 2009
  • Estimating genetic interaction effects in animal genomics is one of the most challenging problems because the phenotypic variation of economically important traits may be largely explained by interaction effects among multiple nucleotide sequence variants under various environmental exposures. Genetic improvement of economic animals can be expected from understanding multi-locus genetic interaction effects associated with economic traits. Most analyses in animal breeding and genetics, however, have excluded the possibility of genetic interaction effects from their analytical models. This review discusses the history of estimating genetic interactions and the difficulties in analyzing interaction effects. Furthermore, two recently developed methods for assessing genetic interactions are introduced to animal genomics. One is the restricted partition method, a nonparametric grouping-based approach that iteratively merges the genotypes with the smallest difference into a new group; the other is the Bayesian method, which draws inferences about genetic interaction effects from their marginal posterior distributions and attains the marginalization of the joint posterior distribution through Gibbs sampling as a Markov chain Monte Carlo technique. Developing appropriate and efficient methods for assessing genetic interactions is urgently needed for an accurate understanding of the genetic architecture of complex traits of economic animals.
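
As a minimal illustration of the Gibbs-sampling machinery invoked above, the sketch below alternates between the full conditionals of a normal mean and variance under standard noninformative priors; a real genetic-interaction model would replace this toy likelihood with one carrying multi-locus effect terms.

```python
import numpy as np

rng = np.random.default_rng(7)
y = rng.normal(2.0, 1.5, size=100)      # toy phenotype data
n, ybar = len(y), y.mean()

mu, sigma2 = 0.0, 1.0                   # initial states
samples = []
for it in range(5000):
    # Full conditional of mu given sigma2 (flat prior): N(ybar, sigma2/n).
    mu = rng.normal(ybar, np.sqrt(sigma2 / n))
    # Full conditional of sigma2 given mu: inverse-gamma(n/2, SSE/2).
    sse = np.sum((y - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(shape=n / 2.0, scale=2.0 / sse)
    if it >= 1000:                      # discard burn-in
        samples.append((mu, sigma2))

post = np.array(samples)
print("posterior means (mu, sigma2):", post.mean(axis=0))
```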

On the use of weighted adaptive nearest neighbors for missing value imputation (가중 적응 최근접 이웃을 이용한 결측치 대치)

  • Yum, Yunjin;Kim, Dongjae
    • The Korean Journal of Applied Statistics / v.31 no.4 / pp.507-516 / 2018
  • The k-nearest neighbors (KNN) imputation method is widely used among single imputation methods because it remains robust even when a parametric model such as multivariate normality does not hold. We propose a weighted adaptive nearest neighbors imputation method that combines the adaptive nearest neighbors method, which accounts for the local features of the data, with the weighted k-nearest neighbors method, which is less sensitive to extreme values or outliers among the k nearest neighbors. We conducted a Monte Carlo simulation study to compare the performance of the proposed imputation method with that of previous imputation methods.
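
A bare-bones version of the weighted ingredient of the proposal: impute a missing entry as an inverse-distance-weighted average over the k nearest complete rows. The adaptive per-observation choice of k, which the proposed method adds, is omitted here, and the data are synthetic.

```python
import numpy as np

def wknn_impute(X, row, col, k=5):
    """Impute X[row, col] from the k nearest complete rows,
    weighting neighbors by inverse distance (sketch; fixed k)."""
    obs = [j for j in range(X.shape[1]) if j != col and not np.isnan(X[row, j])]
    donors = np.where(~np.isnan(X).any(axis=1))[0]     # fully observed rows
    d = np.linalg.norm(X[donors][:, obs] - X[row, obs], axis=1)
    nearest = donors[np.argsort(d)[:k]]
    w = 1.0 / (np.sort(d)[:k] + 1e-9)                  # inverse-distance weights
    return np.average(X[nearest, col], weights=w)

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 4))
X[:, 3] += X[:, 0]                  # correlate column 3 with column 0
X[10, 3] = np.nan                   # plant one missing value
print("imputed:", wknn_impute(X, row=10, col=3))
```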

Deterministic and reliability-based design of necessary support pressures for tunnel faces

  • Li, Bin;Yao, Kai;Li, Hong
    • Geomechanics and Engineering / v.22 no.1 / pp.35-48 / 2020
  • This paper provides methods for the deterministic and reliability-based design of the support pressures necessary to prevent tunnel face collapse. The deterministic method is developed by extending the use of the unique load multiplier embedded in OptumG2/G3, which is intended to determine the maximum load that a system can support. Both two-dimensional and three-dimensional examples are presented to illustrate the applications, and the obtained solutions are validated against those derived from existing methods. The reliability-based method is developed by incorporating the Response Surface Method and the advanced first-order second-moment reliability method into a bisection algorithm, which continuously updates the support pressure within previously determined brackets until the difference between the computed reliability index and the user-defined target is less than a specified tolerance. The two-dimensional reliability-based support pressure is validated against Monte Carlo simulations, whereas the three-dimensional solution is compared with the relationship between support pressure and resulting reliability index reported in the existing literature. Finally, a parametric study is carried out to investigate the influence of various factors on the required support pressure.
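
The bisection loop described above is easy to sketch once the reliability analysis is wrapped in a function. `reliability_index` below is a hypothetical stand-in for the Response Surface/FOSM computation, which in practice would drive the numerical model.

```python
def reliability_index(support_pressure):
    """Hypothetical stand-in for the RSM/FOSM analysis: in this toy,
    beta simply grows linearly with the applied support pressure."""
    return 0.08 * support_pressure - 1.0

def design_pressure(beta_target, lo, hi, tol=1e-3):
    """Bisect on the support pressure (bracket [lo, hi] assumed to
    contain the solution) until |beta - beta_target| < tol."""
    while True:
        mid = 0.5 * (lo + hi)
        beta = reliability_index(mid)
        if abs(beta - beta_target) < tol:
            return mid
        if beta < beta_target:
            lo = mid   # beta too low: more support pressure needed
        else:
            hi = mid   # beta too high: support pressure can be reduced

print(design_pressure(beta_target=2.5, lo=0.0, hi=100.0))
```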

On using computational versus data-driven methods for uncertainty propagation of isotopic uncertainties

  • Radaideh, Majdi I.;Price, Dean;Kozlowski, Tomasz
    • Nuclear Engineering and Technology / v.52 no.6 / pp.1148-1155 / 2020
  • This work presents two different methods for quantifying and propagating the uncertainty associated with fuel composition at end of life for cask criticality calculations. The first, computational approach uses parametric uncertainties, including those associated with nuclear data, fuel geometry, material composition, and plant operation, to perform forward depletion on Monte-Carlo-sampled inputs; these uncertainties are based on experiments and prior experience in criticality safety. The second, data-driven approach relies on radiochemical assay data to derive code bias information, and the code bias data are used to perturb the isotopic inventory. For both approaches, the uncertainty in keff for the cask is propagated by performing forward criticality calculations on inputs sampled from the distributions obtained from each approach. The data-driven approach yielded an uncertainty higher than that of the computational approach by about 500 pcm. An exploration is also made of whether considering correlations between isotopes at end of life affects the keff uncertainty, and the results demonstrate an effect of about 100 pcm.
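
Stripped of the physics, both approaches reduce to the same loop: sample the uncertain inputs, run the model on each sample, and take the spread of the outputs. `keff_model` below is a hypothetical cheap surrogate for the criticality code, and the input distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2020)

def keff_model(enrichment, density):
    """Hypothetical cheap surrogate for a cask criticality calculation."""
    return 0.90 + 0.015 * enrichment + 0.02 * density

# Sample uncertain inputs (distributions invented for illustration).
n = 10_000
enrichment = rng.normal(4.5, 0.05, n)   # wt% U-235
density = rng.normal(10.4, 0.10, n)     # fuel density, g/cm^3

# Forward propagation: run the model on every sampled input.
keff = keff_model(enrichment, density)
# 1 pcm = 1e-5 in keff, hence the factor of 1e5.
print(f"keff = {keff.mean():.5f} +/- {keff.std(ddof=1) * 1e5:.0f} pcm")
```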

A Comparative Study on Tests of Correlation (상관계수에 대한 검정법 비교)

  • Cho, Hyun-Joo;Song, Myung-Unn;Jeong, Dong-Myung;Song, Jae-Kee
    • Journal of the Korean Data and Information Science Society / v.7 no.2 / pp.235-245 / 1996
  • In this paper, we study several methods for testing hypotheses about the correlation coefficient: the Approximate method, the Empirical method, and the Bootstrap method. The Approximate method is based on Fisher's Z-transformation, while the Empirical and Bootstrap methods approximate the distribution of the sample correlation coefficient by Monte Carlo simulation and the bootstrap technique, respectively. To compare the tests, we computed their powers under various alternatives. We find that the Approximate test performs very well even in small samples, and that all tests have almost the same power in large samples.
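
The two main ingredients compared above can be sketched side by side: a Fisher Z-test of H0: rho = rho0, and a bootstrap p-value from resampling pairs with the bootstrap distribution re-centered at the null value (one common convention among several). The data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1996)

def fisher_z_test(x, y, rho0=0.0):
    """Approximate two-sided test of H0: rho = rho0 via Fisher's Z."""
    n, r = len(x), np.corrcoef(x, y)[0, 1]
    z = (np.arctanh(r) - np.arctanh(rho0)) * np.sqrt(n - 3)
    return 2.0 * stats.norm.sf(abs(z))

def bootstrap_test(x, y, rho0=0.0, B=2000):
    """Bootstrap two-sided p-value: resample pairs, re-center the
    bootstrap correlations at rho0, and compare tail areas."""
    n, r = len(x), np.corrcoef(x, y)[0, 1]
    boot = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)            # resample pairs with replacement
        boot[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    return np.mean(np.abs(boot - r) >= abs(r - rho0))

x = rng.normal(size=30)
y = 0.5 * x + rng.normal(size=30)
print("Fisher Z p:", fisher_z_test(x, y))
print("bootstrap p:", bootstrap_test(x, y))
```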
