• Title/Summary/Keyword: Monte Carlo Analysis


Feasibility study of a dedicated nuclear desalination system: Low-pressure Inherent heat sink Nuclear Desalination plant (LIND)

  • Kim, Ho Sik;NO, Hee Cheon;Jo, YuGwon;Wibisono, Andhika Feri;Park, Byung Ha;Choi, Jinyoung;Lee, Jeong Ik;Jeong, Yong Hoon;Cho, Nam Zin
    • Nuclear Engineering and Technology / v.47 no.3 / pp.293-305 / 2015
  • In this paper, we suggest the conceptual design of a water-cooled reactor system for a low-pressure inherent heat sink nuclear desalination plant (LIND) that applies the safety-related design concepts of high-temperature gas-cooled reactors to a water-cooled reactor for inherent and passive safety features. Through a scoping analysis, we found that the current LIND design satisfies several essential thermal-hydraulic and neutronic design requirements. In a thermal-hydraulic analysis using an analytical method based on the Wooton-Epstein correlation, we checked the possibility of safely removing decay heat through the steel containment even if all the active safety systems failed. In a neutronic analysis using the Monte Carlo N-Particle transport code, we estimated a cycle length of approximately 6 years at 200 $MW_{th}$ and 4.5% enrichment. The very long cycle length and simple safety features minimize the burdens of operation, maintenance, and spent-fuel management, with a positive impact on economic feasibility. Finally, because a nuclear reactor should not be directly coupled to a desalination system, to prevent the leakage of radioactive material into the desalinated water, three types of intermediate systems were studied: a steam-producing system, a hot-water system, and an organic Rankine cycle system.
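The decay-heat-removal argument can be sketched numerically. The snippet below uses the classic Way-Wigner decay-heat approximation as a stand-in for the paper's Wooton-Epstein-based method (the actual correlation is not reproduced here), with the 200 $MW_{th}$ rating from the abstract and an assumed 6-year full-power operating history:

```python
import numpy as np

def way_wigner_decay_heat(t_s, t_op_s):
    """Decay-heat fraction P/P0 via the classic Way-Wigner approximation
    (a stand-in for the correlation used in the paper).
    t_s: seconds after shutdown; t_op_s: seconds of prior full-power operation."""
    return 0.0622 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

P0 = 200e6                  # W, rated thermal power from the abstract
t_op = 6 * 365.25 * 86400   # s, one 6-year cycle at full power (assumption)

for t in [60.0, 3600.0, 86400.0, 30 * 86400.0]:
    frac = way_wigner_decay_heat(t, t_op)
    print(f"t = {t:>9.0f} s  P/P0 = {frac:.4f}  P = {frac * P0 / 1e6:.2f} MW")
```

The decaying megawatt-scale power levels are what a passive containment heat sink would have to reject.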

Reliability-Based Design Optimization Using Enhanced Pearson System (개선된 피어슨 시스템을 이용한 신뢰성기반 최적설계)

  • Kim, Tae-Kyun;Lee, Tae-Hee
    • Transactions of the Korean Society of Mechanical Engineers A / v.35 no.2 / pp.125-130 / 2011
  • Since conventional optimization, a deterministic method, does not consider the uncertainty involved in modeling or manufacturing processes, an optimum design is often determined to lie on the boundary of the feasible region of the constraints. Reliability-based design optimization is a method for obtaining a solution by minimizing the objective function while satisfying the reliability constraints. This method combines an optimization process with a reliability analysis that quantifies the uncertainties related to the design variables. Moment-based reliability analysis calculates the reliability of a system on the basis of statistical moments. From these moments, the Pearson system estimates seven types of distributions and determines the reliability of the system. However, the Pearson Type IV distribution is technically difficult to handle in practice. In this study, we propose an enhanced Pearson Type IV distribution based on a kriging model and validate its accuracy by comparison with a Monte Carlo simulation. Finally, reliability-based design optimization is performed for a system with a Type IV distribution by using the proposed method.
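A minimal illustration of the role the statistical moments play: the snippet below computes the first four moments of a hypothetical skewed performance function and contrasts a Monte Carlo failure probability with a naive two-moment (normal) fit, the gap a Pearson-family fit is meant to close. All distributions and numbers are invented for illustration:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical nonlinear performance function; failure when g < 0.
x1 = rng.normal(5.0, 0.5, 1_000_000)
x2 = rng.lognormal(0.0, 0.25, 1_000_000)
g = x1 - 2.5 * x2  # skewed response -> a two-moment normal fit is rough

# First four statistical moments: the input the Pearson system works from.
mean, std = g.mean(), g.std()
skew = np.mean(((g - mean) / std) ** 3)
kurt = np.mean(((g - mean) / std) ** 4)
print(f"mean {mean:.3f}  std {std:.3f}  skew {skew:.3f}  kurt {kurt:.3f}")

# Reference Monte Carlo failure probability vs. the naive normal estimate.
pf_mc = np.mean(g < 0.0)
pf_norm = 0.5 * (1.0 - erf(mean / std / sqrt(2.0)))
print(f"Pf(MC) = {pf_mc:.5f}, Pf(normal fit) = {pf_norm:.5f}")
```

The heavy left tail makes the two-moment estimate optimistic; a distribution fitted to all four moments tracks the Monte Carlo reference far more closely.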

Average Data Rate Analysis for Data Exchanging Nodes via Relay by Concurrent Transmission (데이타 교환 노드의 동시 전송 릴레이 이용을 위한 평균 데이터 전송률 분석)

  • Kwon, Taehoon
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.11 no.6 / pp.638-644 / 2018
  • Relay systems have recently gained attention as one of the key technologies for 5G because of their capability for cell-coverage extension and power gain. Relays can be exploited for small-cell base stations and for autonomous networks, in which communication devices cooperate with each other; relay technology is therefore expected to enable low-power, large-capacity communication. To maximize the benefit of a limited number of relays, an efficient relay selection method is required. In particular, when two nodes exchange data with each other via a relay, relay selection can maximize the average data rate through the spatial location of the relay, so the average data rate must first be analyzed as a function of the relay selection. In this paper, we analyze the average data rate when two nodes exchange data via dual-hop decode-and-forward relaying, considering the interference caused by concurrent transmission under a Nakagami-m fading channel. The correctness of the analysis is verified by Monte Carlo simulation. The results show that concurrent transmission is superior to non-concurrent transmission in the high required-data-rate region rather than in the low required-data-rate region.
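As a minimal sketch of the kind of Monte Carlo check described, the snippet below averages the rate of a dual-hop decode-and-forward link under Nakagami-m fading, whose power gains follow a Gamma(m, 1/m) distribution. The concurrent-transmission interference modeled in the paper is omitted for brevity, and the fading parameter and mean SNR are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
m, snr_db, n = 2.0, 10.0, 500_000  # Nakagami-m parameter, mean SNR (assumptions)
snr = 10 ** (snr_db / 10)

# Nakagami-m power gains are Gamma(m, 1/m) distributed (unit mean).
g1 = rng.gamma(m, 1.0 / m, n)  # node -> relay hop
g2 = rng.gamma(m, 1.0 / m, n)  # relay -> node hop

# Decode-and-forward: the end-to-end rate is limited by the weaker hop;
# the 1/2 prefactor accounts for the two transmission phases.
rate = 0.5 * np.log2(1.0 + snr * np.minimum(g1, g2))
print(f"average rate: {rate.mean():.3f} bit/s/Hz")
```

Closed-form averages derived analytically can be validated against exactly this kind of sample mean.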

Financial Analysis Model Development by Applying Optimization Method in Residential Officetel (최적화 기법을 활용한 주거용 오피스텔 수지분석 모델 개발)

  • Jang, Jun-Ho;Ha, Sun-Geun;Son, Ki-Young;Son, Seung-Hyun
    • Journal of the Korea Institute of Building Construction / v.19 no.1 / pp.67-76 / 2019
  • The domestic construction industry is changing with demand and supply preferences along with urbanization and economic development. Accordingly, initial risk assessment is more important than before. In particular, demand for lease-based investment products such as commercial and office buildings has surged as a substitute for financial products because of banks' low interest rates. The objective of this study is therefore to present a basic financial analysis model for residential officetels developed by applying an optimization method. To achieve this objective, first, previous studies are reviewed. Second, a causal loop diagram is structured based on the collected data. Third, the system dynamics method is used to develop a cost-income simulation model and an optimization model sequentially. Finally, the developed model is verified by analyzing a case project. In the future, the proposed model can help decision makers determine whether to proceed with an officetel development project.
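A toy stock-and-flow loop in the spirit of the cost-income simulation described above; every figure (prices, rates, durations) is hypothetical:

```python
# Minimal cost-income stock-and-flow sketch; all inputs are invented.
months = 36
sale_price, units, sell_rate = 250.0, 400, 0.05  # price/unit, stock, monthly fraction sold
land_cost, build_cost_pm = 30_000.0, 1_200.0     # lump-sum land cost, monthly construction outflow

cash, sold = -land_cost, 0.0
for t in range(months):
    cash -= build_cost_pm if t < 24 else 0.0     # construction phase only
    new_sales = sell_rate * (units - sold)       # demand proportional to remaining stock
    sold += new_sales
    cash += new_sales * sale_price
print(f"units sold: {sold:.0f}, final cash balance: {cash:.0f}")
```

A real system dynamics model would add feedback loops (e.g. price responding to vacancy) and then search the parameter space for the profit-maximizing configuration.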

Development of Risk Analysis Structure for Large-scale Underground Construction in Urban Areas (도심지 대규모 지하공사의 리스크 분석 체계 개발)

  • Seo, Jong-Won;Yoon, Ji-Hyeok;Kim, Jeong-Hwan;Jee, Sung-Hyun
    • Journal of the Korean Geotechnical Society / v.26 no.3 / pp.59-68 / 2010
  • Systematic risk management is necessary in large-scale urban construction because of its complicated and various risk factors. Problems of obstructions, adjacent structures, safety, environment, traffic, and geotechnical properties need to be solved because urban construction, unlike general earthwork, proceeds in limited space. Therefore, a dedicated risk management system is necessary to manage not only geotechnical properties but also social and cultural uncertainties. This research analyzes the current state of risk management techniques. Risk factors were identified, and the importance of each factor was estimated through a survey. A systematically categorized database was established, and a risk extraction module and matrix and score modules were developed based on it. Expected construction budget and schedule distributions can be computed by Monte Carlo analysis of the probabilities and influences, and the budget and schedule distributions before and after risk response can be compared and analyzed, so the risks are manageable over the entire construction period. This system will be the foundation of standardization and integration: procurement, efficiency improvement, and effective time and resource management become available through the development and application of an integrated management technique. In conclusion, a decrease in cost and time is expected from the systemization of project management.
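The Monte Carlo budget step can be sketched as follows, with a hypothetical three-item risk register; the occurrence probabilities and normally distributed impacts are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
base_cost = 1000.0  # hypothetical baseline budget (arbitrary units)

# Hypothetical risk register: (probability of occurrence, impact mean and std).
risks = [(0.30, (50.0, 15.0)),   # adjacent-structure damage
         (0.15, (120.0, 40.0)),  # unexpected ground condition
         (0.40, (30.0, 10.0))]   # traffic / environmental constraint

total = np.full(n, base_cost)
for p, (mu, sigma) in risks:
    occurs = rng.random(n) < p                 # Bernoulli occurrence per trial
    total += occurs * rng.normal(mu, sigma, n) # add impact only where it occurs
print(f"mean cost: {total.mean():.1f}, 90th percentile: {np.percentile(total, 90):.1f}")
```

The same distribution computed before and after mitigation quantifies the value of each risk response.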

Informative Role of Marketing Activity in Financial Market: Evidence from Analysts' Forecast Dispersion

  • Oh, Yun Kyung
    • Asia Marketing Journal / v.15 no.3 / pp.53-77 / 2013
  • As advertising and promotions are categorized as operating expenses, managers tend to reduce marketing budgets to improve short-term profitability. Gauging the value and accountability of marketing spending is therefore considered a major research priority in marketing. Responding to this call, recent studies have documented that the financial market reacts positively to a firm's marketing activity or to marketing-related outcomes such as brand equity and customer satisfaction. However, prior studies focus on the relation between marketing variables and financial-market variables. This study proposes a channel through which marketing activity increases firm valuation. Specifically, we propose that a firm's marketing activity increases the level of the firm's product-market information and thereby decreases the dispersion in financial analysts' earnings forecasts. With less uncertainty about the firm's future prospects, the firm's managers and shareholders face less information asymmetry, which reduces the firm's cost of capital and thereby increases the valuation of the firm. To our knowledge, this is the first paper to examine how informational benefits can mediate the effect of marketing activity on firm value. To test whether marketing activity contributes to firm value by mitigating information asymmetry, this study employs longitudinal data containing 12,824 firm-year observations on 2,337 distinct firms from 1981 to 2006. Firm value is measured by Tobin's Q and the one-year-ahead buy-and-hold abnormal return (BHAR). Following the prior literature, dispersion in analysts' earnings forecasts is used as a proxy for the information gap between management and shareholders. For model specification, the three-step regression approach is adopted to identify the mediating effect. All models are estimated using Markov chain Monte Carlo (MCMC) methods to test the statistical significance of the mediating effect.
The analysis shows that marketing intensity has a significant negative relationship with dispersion in analysts' earnings forecasts. After including the mediator variable for analyst dispersion, the effect of marketing intensity on firm value drops from 1.199 (p < .01) to 1.130 (p < .01) in the Tobin's Q model, and from .192 (p < .01) to .188 (p < .01) in the BHAR model. The results suggest that analysts' forecast dispersion partially accounts for the positive effect of marketing on firm valuation. Additionally, the same analysis was conducted with an alternative dependent variable (forecast accuracy) and an alternative marketing metric (advertising intensity), and it supports the robustness of the main results. In sum, the results provide empirical evidence that marketing activity can increase shareholder value by mitigating the problem of information asymmetry in the capital market. The findings have important implications for managers. First, managers should be cognizant of the role of marketing activity in providing information to the financial market as well as to the consumer market; they should take investors' reactions into account when designing marketing communication messages so as to reduce the cost of capital. Second, this study shows a channel through which marketing creates shareholder value and highlights the accountability of marketing: in addition to the direct impact of marketing on firm value, an indirect channel operating through reduced information asymmetry should be considered. Potentially, marketing managers can justify their spending from the perspective of increasing long-term shareholder value.
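The three-step mediation logic can be illustrated on synthetic data. The sketch below uses plain OLS instead of the paper's MCMC estimation, and the coefficients generating the data are invented; it only shows how a total effect splits into direct and indirect (mediated) parts:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000

# Synthetic data with the hypothesized structure: marketing intensity (x)
# lowers analyst dispersion (m), and both affect firm value (y).
x = rng.normal(size=n)
m = -0.5 * x + rng.normal(size=n)            # mediator: forecast dispersion
y = 1.0 * x - 0.4 * m + rng.normal(size=n)   # outcome: e.g. Tobin's Q

def ols(cols, target):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(X, target, rcond=None)[0]

total = ols([x], y)[1]            # step 1: total effect of x on y
a = ols([x], m)[1]                # step 2: effect of x on the mediator
coefs = ols([x, m], y)            # step 3: y on both x and m
direct, b = coefs[1], coefs[2]
print(f"total {total:.3f} = direct {direct:.3f} + indirect {a * b:.3f}")
```

The drop from the total to the direct coefficient once the mediator enters the model is exactly the pattern the paper reports for marketing intensity and analyst dispersion.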


Evaluation of the CNESTEN's TRIGA Mark II research reactor physical parameters with TRIPOLI-4® and MCNP

  • H. Ghninou;A. Gruel;A. Lyoussi;C. Reynard-Carette;C. El Younoussi;B. El Bakkari;Y. Boulaich
    • Nuclear Engineering and Technology / v.55 no.12 / pp.4447-4464 / 2023
  • This paper focuses on the development of a new computational model of the CNESTEN's TRIGA Mark II research reactor using the 3D continuous-energy Monte Carlo code TRIPOLI-4 (T4). This new model was developed to perform neutronic simulations and determine quantities of interest such as the kinetic parameters of the reactor, control-rod worths, power peaking factors, and neutron flux distributions. It is also a key tool for accurately designing new experiments in the TRIGA reactor, analyzing those experiments, and carrying out sensitivity and uncertainty studies. The geometry and materials data of the MCNP reference model were used to build the T4 model, so the differences between the two models are mainly due to the mathematical approaches of the two codes. The study presented in this article is divided into two parts. The first part deals with the development and validation of the T4 model. The results obtained with the T4 model were compared to the existing MCNP reference model and to the experimental results from the Final Safety Analysis Report (FSAR). Different core configurations were investigated via simulations to test the computational model's reliability in predicting the physical parameters of the reactor. As fairly good agreement among the results was found, it seems reasonable to assume that the T4 model can accurately reproduce the MCNP calculated values. The second part of this study is devoted to the sensitivity and uncertainty (S/U) studies that were carried out to quantify the nuclear data uncertainty in the multiplication factor keff. For that purpose, the T4 model was used to calculate the sensitivity profiles of keff to the nuclear data. The integrated sensitivities were compared to the results of previous works carried out with the MCNP and SCALE-6.2 simulation tools, and differences of less than 5% were obtained for most of these quantities except for the C-graphite sensitivities.
Moreover, the nuclear data uncertainties in the keff were derived using the COMAC-V2.1 covariance matrices library and the calculated sensitivities. The results have shown that the total nuclear data uncertainty in the keff is around 585 pcm using the COMAC-V2.1. This study also demonstrates that the contribution of zirconium isotopes to the nuclear data uncertainty in the keff is not negligible and should be taken into account when performing S/U analysis.
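The propagation step behind such uncertainty figures is the "sandwich rule", var(keff) = S^T C S, combining the calculated sensitivity vector with a covariance matrix. A toy example with invented sensitivities and covariance entries (not the paper's COMAC-V2.1 data):

```python
import numpy as np

# "Sandwich rule": var(keff) = S^T C S, with S the vector of integrated
# sensitivities and C the nuclear-data covariance matrix.
# All numbers here are hypothetical placeholders.
S = np.array([0.30, -0.05, 0.12])          # dk/k per unit relative cross-section change
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])   # relative covariances between nuclear data
var = S @ C @ S
print(f"keff uncertainty: {1e5 * np.sqrt(var):.0f} pcm")
```

Each diagonal term contributes S_i^2 C_ii, which is why even a modest sensitivity (such as that to zirconium isotopes) matters when its data uncertainty is large.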

Survival Analysis for White Non-Hispanic Female Breast Cancer Patients

  • Khan, Hafiz Mohammad Rafiqullah;Saxena, Anshul;Gabbidon, Kemesha;Stewart, Tiffanie Shauna-Jeanne;Bhatt, Chintan
    • Asian Pacific Journal of Cancer Prevention / v.15 no.9 / pp.4049-4054 / 2014
  • Background: Race and ethnicity are significant factors in predicting the survival time of breast cancer patients. In this study, we applied advanced statistical methods to predict the survival of White non-Hispanic female breast cancer patients diagnosed between 1973 and 2009 in the United States (U.S.). Materials and Methods: Demographic data from the Surveillance, Epidemiology, and End Results (SEER) database were used for the purpose of this study. Nine states were randomly selected from 12 U.S. cancer registries, and a stratified random sampling method was used to select 2,000 female breast cancer patients from these nine states. We compared four types of advanced statistical probability models to identify the best-fit model for the White non-Hispanic female breast cancer survival data. Three model-building criteria were used to measure and compare the goodness of fit of the models: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Deviance Information Criterion (DIC). In addition, we used a novel Bayesian method and the Markov chain Monte Carlo technique to determine the posterior density function of the parameters. After evaluating the model parameters, we selected the model with the lowest DIC value and, using this Bayesian method, derived the predictive survival density for future survival time and its related inferences. Results: The analytical sample of White non-Hispanic women included 2,000 breast cancer cases from the SEER database (1973-2009). The majority of cases were married (55.2%), the mean age at diagnosis was 63.61 years (SD = 14.24), and the mean survival time was 84 months (SD = 35.01). After comparing the four statistical models, the results suggested that the exponentiated Weibull model (DIC = 19818.220) was the better fit for the White non-Hispanic females' breast cancer survival data.
This model predicted the survival times (in months) for White non-Hispanic women using precise estimates of the model parameters. Conclusions: By using modern model-building criteria, we determined that the data best fit the exponentiated Weibull model. We incorporated precise estimates of the parameters into the predictive model and evaluated the survival inference for the White non-Hispanic female population. This method of analysis will assist researchers in making scientific and clinical conclusions when assessing the survival time of breast cancer patients.
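For reference, the exponentiated Weibull survival function that such an MCMC sampler targets has a simple closed form; the parameter values below are hypothetical, not the paper's posterior estimates:

```python
import numpy as np

def exp_weibull_sf(t, k, scale, theta):
    """Survival function of the exponentiated Weibull distribution:
    S(t) = 1 - [1 - exp(-(t/scale)^k)]^theta.
    theta = 1 recovers the ordinary Weibull."""
    return 1.0 - (1.0 - np.exp(-(t / scale) ** k)) ** theta

# Hypothetical parameters; the abstract reports a mean survival near 84 months.
t = np.array([12.0, 60.0, 84.0, 120.0])  # months since diagnosis
print(exp_weibull_sf(t, k=1.5, scale=90.0, theta=1.2))
```

The extra exponent theta gives the hazard shapes (bathtub, unimodal) that the plain Weibull cannot represent, which is one reason the model can out-fit simpler alternatives on DIC.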

Variability of Mid-plane Symmetric Functionally Graded Material Beams in Free Vibration (중립면 대칭 기능경사재료 보의 자유진동 변화도)

  • Nguyen, Van Thuan;Noh, Hyuk-Chun
    • Journal of the Computational Structural Engineering Institute of Korea / v.31 no.3 / pp.127-132 / 2018
  • In this paper, a scheme for evaluating the variability in the eigenmodes of functionally graded material (FGM) beams is proposed within the framework of perturbation-based stochastic analysis. As the random parameter, the spatially varying elastic modulus of the FGM along the axial direction at the mid-surface of the beam is chosen, and the through-thickness variation of the elastic modulus is assumed to follow the original exponential form. In deriving the formulation, a first-order Taylor expansion of the eigenmodes is employed. As an example, a simply supported FGM beam having an elastic modulus symmetric with respect to the mid-surface is chosen. A Monte Carlo analysis is also performed to check whether the proposed scheme gives reasonable outcomes. The two schemes are found to give almost identical results for the mean and standard deviation of the eigenmodes. With the proposed scheme, the standard-deviation shape of each eigenmode can be evaluated easily. The deviated mode shape is found to have one more zero-slope point than the mother mode shape, irrespective of the order of the mode, and the deviation from the mean is larger for the higher modes than for the lower modes.
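The perturbation-versus-Monte Carlo comparison can be sketched for the simplest related quantity, the fundamental frequency of a simply supported beam with a random (spatially uniform, here) elastic modulus; all beam data and the 10% coefficient of variation are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Fundamental circular frequency of a simply supported uniform beam:
# omega = (pi/L)^2 * sqrt(E*I / (rho*A)); E is the random parameter.
L, I, rho, A = 2.0, 8.0e-9, 2700.0, 1.0e-4   # hypothetical beam data (SI units)
mu_E, cov_E = 70.0e9, 0.10                   # mean modulus and 10% COV (assumption)

def omega(E):
    return (np.pi / L) ** 2 * np.sqrt(E * I / (rho * A))

# Monte Carlo statistics of the frequency.
E = rng.normal(mu_E, cov_E * mu_E, 200_000)
w = omega(E)

# First-order perturbation: std(omega) ~ |d omega / dE| * std(E)
#                                      = omega(mu_E) / (2 mu_E) * std(E).
std_pert = omega(mu_E) / (2.0 * mu_E) * cov_E * mu_E
print(f"MC std: {w.std():.3f} rad/s, perturbation std: {std_pert:.3f} rad/s")
```

At a modest coefficient of variation the two standard deviations agree closely, mirroring the paper's finding for the mode statistics.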

Probabilistic Optimization for Improving Soft Marine Ground using a Low Replacement Ratio (해상 연약지반의 저치환율 개량에 대한 확률론적 최적화)

  • Han, Sang-Hyun;Kim, Hong-Yeon;Yea, Geu-Guwen
    • The Journal of Engineering Geology / v.26 no.4 / pp.485-495 / 2016
  • To reinforce and improve the soft ground under a breakwater while using materials efficiently, the replacement ratio and the leaving period of the surcharge load are optimized probabilistically. Bayesian updating of the random variables using prior information decreases uncertainty by up to 39.8%, and using prior information with more samples results in a sharp decrease in uncertainty. Replacement ratios of 15%-40% are analyzed using the First Order Reliability Method and Monte Carlo simulation to optimize the replacement ratio. The results show that replacement ratios of 20% and 25% are acceptable in the column jet grouting area and the granular compaction pile area, respectively. Life-cycle costs are also compared to optimize the replacement ratios within the allowable ranges; a range of 20%-30% is the most economical over the total life cycle, meaning that the initial construction cost, maintenance cost, and failure-loss cost are minimized. Probabilistic analysis of the leaving period shows that three months is acceptable. Design optimization with respect to life-cycle cost is important to minimize maintenance costs and retain the performance of the structures for the required period. Therefore, more case studies that consider the maintenance costs of soil structures are necessary to establish relevant design codes.
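The FORM-versus-Monte Carlo comparison used in such studies reduces, in the simplest case, to a linear limit state with normal variables, where the FORM reliability index is exact. The values below are invented stand-ins for capacity and demand at a given replacement ratio:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)

# Linear limit state g = R - S with independent normal variables
# (hypothetical capacity vs. demand at one candidate replacement ratio).
mu_R, sd_R, mu_S, sd_S = 1.8, 0.25, 1.0, 0.20

# FORM: for a linear g with normal variables the reliability index is exact.
beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
pf_form = 0.5 * (1.0 - erf(beta / sqrt(2.0)))   # Pf = Phi(-beta)

# Crude Monte Carlo check of the same failure probability.
n = 2_000_000
pf_mc = np.mean(rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n) < 0.0)
print(f"beta = {beta:.2f}, Pf(FORM) = {pf_form:.5f}, Pf(MC) = {pf_mc:.5f}")
```

Repeating this at each candidate replacement ratio, and weighing the resulting failure probabilities against construction and maintenance costs, is the essence of the life-cycle optimization described above.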