• Title/Summary/Keyword: statistical uncertainties

Uncertainty Requirement Analysis for the Orbit, Attitude, and Burn Performance of the 1st Lunar Orbit Insertion Maneuver

  • Song, Young-Joo; Bae, Jonghee; Kim, Young-Rok; Kim, Bang-Yeop
    • Journal of Astronomy and Space Sciences / v.33 no.4 / pp.323-333 / 2016
  • In this study, the uncertainty requirements for orbit, attitude, and burn performance were estimated and analyzed for the execution of the 1st lunar orbit insertion (LOI) maneuver of the Korea Pathfinder Lunar Orbiter (KPLO) mission. During the early design phase of the system, this analysis is an essential design factor because the 1st LOI maneuver is the largest burn that uses the onboard propulsion system, and the success of the lunar capture is directly affected by the performance achieved. For the analysis, the spacecraft is assumed to have already approached periselene on a hyperbolic arrival trajectory around the moon. In addition, diverse arrival conditions and mission constraints were considered, such as varying periselene approach velocity, altitude, and orbital period of the capture orbit after execution of the 1st LOI maneuver. The current analysis assumed an impulsive LOI maneuver, and two-body equations of motion were adopted to simplify the problem for a preliminary analysis. Monte Carlo simulations were performed to analyze statistically the diverse uncertainties that might arise at the moment the maneuver is executed. As a result, three major requirements were estimated for the early design phase. First, the minimum burn performance required for capture around the moon was estimated. Second, the orbit, attitude, and maneuver burn performance requirements needed to keep the 1st elliptical orbit achieved around the moon within the specified orbital period were estimated and analyzed simultaneously. Finally, the dispersion requirements on the B-plane aiming points needed to meet the target insertion goal were analyzed; these can be used as reference target guidelines for a mid-course correction (MCC) maneuver during the transfer. More detailed system requirements for the KPLO mission, particularly for the spacecraft bus itself and for the flight dynamics subsystem at the ground control center, are expected to be prepared and established based on the current results, including a contingency trajectory design plan.
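
A minimal Monte Carlo sketch of the kind of dispersion analysis this abstract describes, under the same impulsive-burn, two-body assumptions; the periselene altitude, approach speed, target period, and 1-sigma error levels below are illustrative placeholders, not KPLO values.

```python
import numpy as np

rng = np.random.default_rng(1)

MU = 4902.8            # lunar gravitational parameter [km^3/s^2]
R_MOON = 1737.4        # mean lunar radius [km]

# illustrative nominal arrival conditions (assumed, not KPLO values)
h_peri = 100.0                          # periselene altitude [km]
r_peri = R_MOON + h_peri
v_arr = 2.45                            # hyperbolic approach speed at periselene [km/s]
T_target = 12.0 * 3600.0                # target capture-orbit period [s]

# nominal retro-burn sized from the target period (two-body, impulsive, tangential)
a_target = (MU * T_target**2 / (4 * np.pi**2)) ** (1.0 / 3.0)
v_capture = np.sqrt(MU * (2.0 / r_peri - 1.0 / a_target))
dv_nom = v_arr - v_capture

# assumed 1-sigma execution/knowledge uncertainties
n = 100_000
v = v_arr + 0.005 * rng.standard_normal(n)           # approach-speed error [km/s]
r = r_peri + 2.0 * rng.standard_normal(n)            # periselene-altitude error [km]
dv = dv_nom * (1.0 + 0.01 * rng.standard_normal(n))  # burn-magnitude error (1 %)
ang = np.deg2rad(0.5) * rng.standard_normal(n)       # attitude (pointing) error [rad]

# post-burn speed with the burn applied slightly off the anti-velocity direction
v_post = np.sqrt((v - dv * np.cos(ang)) ** 2 + (dv * np.sin(ang)) ** 2)

energy = 0.5 * v_post**2 - MU / r                    # specific orbital energy
captured = energy < 0.0
a = -MU / (2.0 * energy[captured])
period_hr = 2.0 * np.pi * np.sqrt(a**3 / MU) / 3600.0

print(f"nominal LOI dv                     : {dv_nom * 1000:.1f} m/s")
print(f"capture probability                : {captured.mean():.4f}")
print(f"period within 12 +/- 1 h (captured): {np.mean(np.abs(period_hr - 12.0) < 1.0):.4f}")
```

Because the nominal burn is sized from the target period, the script directly reports how often the dispersed burn still yields lunar capture and how often the resulting orbit stays inside the assumed period band.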

Improvement of WRF forecast meteorological data by Model Output Statistics using linear, polynomial and scaling regression methods

  • Jabbari, Aida; Bae, Deg-Hyo
    • Proceedings of the Korea Water Resources Association Conference / 2019.05a / pp.147-147 / 2019
  • Numerical Weather Prediction (NWP) models determine the future state of the weather by forcing current weather conditions into atmospheric models. The NWP models approximate the physical dynamics mathematically with nonlinear differential equations; however, these approximations include uncertainties. The errors of the NWP estimations can be related to the initial and boundary conditions and to the model parameterization. Developments in meteorological forecast models have not solved the issues related to these inevitable biases. In spite of the efforts to incorporate all sources of uncertainty into the forecast, and regardless of the methodologies applied to generate the forecast ensembles, forecasts are still subject to errors and systematic biases. Statistical post-processing increases the accuracy of the forecast data by decreasing these errors. Error prediction of the NWP models, that is, updating the NWP model outputs (model output statistics), is one way to improve the model forecast. Regression methods (linear, polynomial, and scaling regression) are applied in the present study to improve the real-time forecast skill. Such post-processing consists of two main steps: first, a regression is built between forecast and measurement available during a certain training period; second, the regression is applied to new forecasts. In this study, the WRF real-time forecast data, in comparison with the observed data, had systematic biases; the errors related to the NWP model forecasts were reflected in the underestimation of the meteorological data forecast by the WRF model. The promising results are expected to show that the post-processing techniques applied in this study improve the meteorological forecast data provided by the WRF model, and a comparison between the various bias-correction methods will show the strengths and weaknesses of each method.
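
A compact sketch of the two-step MOS procedure the abstract describes (build a regression between forecast and measurement over a training period, then apply it to new forecasts); the synthetic bias model and regression degrees here are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic training period: WRF-like forecasts that systematically underestimate the obs
obs_train = 15.0 + 8.0 * rng.random(200)                     # e.g. temperature [deg C]
fcst_train = 0.85 * obs_train - 1.0 + rng.normal(0, 0.8, 200)

# new forecasts to be corrected (verification period)
obs_new = 15.0 + 8.0 * rng.random(60)
fcst_new = 0.85 * obs_new - 1.0 + rng.normal(0, 0.8, 60)

# step 1: build the regressions between forecast and measurement on the training period
lin = np.polyfit(fcst_train, obs_train, deg=1)               # linear MOS
poly = np.polyfit(fcst_train, obs_train, deg=2)              # polynomial MOS
scale = obs_train.mean() / fcst_train.mean()                 # simple scaling factor

# step 2: apply each correction to the new forecasts and compare errors
corrected = {
    "raw": fcst_new,
    "linear": np.polyval(lin, fcst_new),
    "polynomial": np.polyval(poly, fcst_new),
    "scaling": scale * fcst_new,
}

rmse = lambda f: np.sqrt(np.mean((f - obs_new) ** 2))
for name, f in corrected.items():
    print(f"{name:10s} RMSE = {rmse(f):.2f}")
```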

An assessment of the applicability of multigroup cross sections generated with Monte Carlo method for fast reactor analysis

  • Lin, Ching-Sheng; Yang, Won Sik
    • Nuclear Engineering and Technology / v.52 no.12 / pp.2733-2742 / 2020
  • This paper presents an assessment of the applicability of multigroup cross sections generated with Monte Carlo tools to fast reactor analysis based on transport calculations. 33-group cross section sets were generated for simple one-dimensional (1-D) and two-dimensional (2-D) sodium-cooled fast reactor problems using the SERPENT code and applied to deterministic steady-state and depletion calculations. Relative to the reference continuous-energy SERPENT results, with the transport-corrected P0 scattering cross section, the k-eff value was overestimated by 506 and 588 pcm for the 1-D and 2-D problems, respectively, since anisotropic scattering is important in fast reactors. When the scattering order was increased to P5, the 1-D and 2-D errors increased to 577 and 643 pcm, respectively. A sensitivity and uncertainty analysis with the PERSENT code indicated that these large k-eff errors cannot be attributed to the statistical uncertainties of the cross sections and that they are likely due to the approximate anisotropic scattering matrices determined by scalar flux weighting. The anisotropic scattering cross sections were alternatively generated using the MC2-3 code and merged with the SERPENT cross sections. The mixed cross section set consistently reduced the errors in k-eff, assembly powers, and nuclide densities. For example, in the 2-D calculation with P3 scattering order, the k-eff error was reduced from 634 pcm to -223 pcm, the maximum error in assembly power was reduced from 2.8% to 0.8%, and the RMS error was reduced from 1.4% to 0.4%. The maximum error in the nuclide densities at the end of a 12-month depletion, which occurred in 237Np, was reduced from 3.4% to 1.5%. The errors of the other nuclides were also reduced consistently, for example, from 1.1% to 0.1% for 235U, from 2.2% to 0.7% for 238Pu, and from 1.6% to 0.2% for 241Pu. These results indicate that the scalar-flux-weighted anisotropic scattering cross sections of SERPENT may not be adequate for application to fast reactors where anisotropic scattering is important.
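
A toy numpy illustration of the weighting issue the abstract raises: collapsing the same fine-group P1 scattering matrix with a scalar-flux (phi0) spectrum versus a P1 flux-moment (phi1) spectrum yields different coarse-group matrices. All fine-group numbers below are invented for the sketch; this is not the SERPENT or MC2-3 procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

F, C = 6, 2                        # fine and coarse group counts (3 fine per coarse)
coarse_of = np.array([0, 0, 0, 1, 1, 1])

# made-up fine-group P1 scattering matrix [from-group f -> to-group f'] and spectra
sig_p1 = 0.1 * rng.random((F, F))
phi0 = np.array([1.0, 0.9, 0.7, 0.5, 0.3, 0.2])   # scalar flux spectrum
phi1 = np.array([0.4, 0.5, 0.6, 0.4, 0.2, 0.1])   # P1 flux-moment spectrum (differs from phi0)

def collapse(sig, w):
    """Weight-and-sum collapse of a scattering matrix to coarse groups."""
    out = np.zeros((C, C))
    for g in range(C):
        src = coarse_of == g
        for gp in range(C):
            dst = coarse_of == gp
            out[g, gp] = (w[src, None] * sig[np.ix_(src, dst)]).sum() / w[src].sum()
    return out

print("phi0-weighted P1 matrix:\n", collapse(sig_p1, phi0))
print("phi1-weighted P1 matrix:\n", collapse(sig_p1, phi1))
```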

The Effect of the Project Plan Variance on Customer Satisfaction: Focus on IT Project Results (프로젝트 계획의 변동이 고객만족도에 미치는 영향: IT 프로젝트 결과 중심으로)

  • Yoon, Hyeong-Seok; Lee, Seouk-Joo; Kim, Seung-Chul; Park, So-Hyun
    • Journal of Information Technology Services / v.21 no.5 / pp.51-64 / 2022
  • Companies plan and execute IT projects using information technology as a means to secure external competitiveness. However, IT projects carry high risks and uncertainties because their outputs (systems, services, products) are intangible, and changes to the plan frequently occur during project execution. As a result, most IT projects are closed without achieving the target performance, which can waste company resources and money and, in turn, lead to the loss of opportunities to enter new markets. This study analyzes the effect of changes in the project plan on customer satisfaction, which is taken as the measure of project performance, and examines the importance of project planning in achieving the target performance of an IT project. For the empirical analysis, data from about 500 actual projects were collected. For the analysis, statistical methods such as simple and multiple regression and a control-effect analysis were applied. The results show that scope changes affect cost changes and schedule changes, and that changes in scope and cost affect project performance. Theoretically, this study provides evidence that good project performance follows when an IT project is executed as planned. Practically, it suggests the need for a change in project management by showing that thorough execution of the project planning stage can improve project performance in the Korean project management culture, where the planning stage is often performed poorly in order to implement IT projects rapidly.
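
Roughly the kind of model the abstract's empirical analysis implies: an ordinary least squares regression of customer satisfaction on scope, cost, and schedule changes. The data below are synthetic stand-ins for the roughly 500 projects, and the coefficients are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500                                   # roughly the number of projects in the study

# synthetic project-level data (illustrative only, not the study's data set)
scope_chg = rng.normal(0, 1, n)                       # scope change (standardized)
cost_chg = 0.6 * scope_chg + rng.normal(0, 1, n)      # cost change partly driven by scope
sched_chg = 0.5 * scope_chg + rng.normal(0, 1, n)     # schedule change partly driven by scope
satisfaction = 4.0 - 0.4 * scope_chg - 0.3 * cost_chg + rng.normal(0, 0.5, n)

# multiple regression: satisfaction ~ scope + cost + schedule (ordinary least squares)
X = np.column_stack([np.ones(n), scope_chg, cost_chg, sched_chg])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
print("intercept, scope, cost, schedule coefficients:", np.round(beta, 3))
```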

Forecasting volatility index by temporal convolutional neural network (Causal temporal convolutional neural network를 이용한 변동성 지수 예측)

  • Ji Won Shin; Dong Wan Shin
    • The Korean Journal of Applied Statistics / v.36 no.2 / pp.129-139 / 2023
  • Forecasting volatility is essential to avoiding the risk caused by the uncertainties of a financial asset. Complicated features of financial volatility, such as the ambiguity between non-stationarity and stationarity, asymmetry, long memory, and sudden, fairly large outlier-like values, pose great challenges to volatility forecasting. In order to address such complicated features implicitly, we consider machine learning models such as LSTM (1997) and GRU (2014), which are known to be suitable for time series forecasting. However, these recurrent models suffer from vanishing gradients, an enormous amount of computation, and large memory requirements. To address these problems, a causal temporal convolutional network (TCN) model, an advanced form of 1D CNN, is also applied. It is confirmed that the overall forecasting power of the TCN model is higher than that of the RNN models in forecasting VIX, VXD, and VXN, the daily volatility indices of the S&P 500, DJIA, and Nasdaq, respectively.
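
A minimal causal TCN in PyTorch of the sort the abstract compares against RNNs: dilated 1-D convolutions with left-only padding so no future values leak into the forecast. The layer sizes and the synthetic input are assumptions, not the authors' architecture or data.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConv1d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):
        # left-pad so the convolution never sees future time steps
        return self.conv(F.pad(x, (self.pad, 0)))

class TinyTCN(nn.Module):
    def __init__(self, channels=16, levels=3, kernel_size=3):
        super().__init__()
        layers, in_ch = [], 1
        for i in range(levels):
            layers += [CausalConv1d(in_ch, channels, kernel_size, dilation=2 ** i), nn.ReLU()]
            in_ch = channels
        self.net = nn.Sequential(*layers)
        self.head = nn.Linear(channels, 1)

    def forward(self, x):                  # x: (batch, 1, seq_len)
        h = self.net(x)                    # (batch, channels, seq_len)
        return self.head(h[:, :, -1])      # one-step-ahead forecast from the last step

# toy usage on a synthetic volatility-like series (stand-in for VIX windows)
torch.manual_seed(0)
series = torch.abs(torch.randn(8, 1, 64))
model = TinyTCN()
pred = model(series)                       # shape (8, 1)
print(pred.shape)
```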

Using machine learning to forecast and assess the uncertainty in the response of a typical PWR undergoing a steam generator tube rupture accident

  • Tran Canh Hai Nguyen; Aya Diab
    • Nuclear Engineering and Technology / v.55 no.9 / pp.3423-3440 / 2023
  • In this work, a multivariate time-series machine learning meta-model is developed to predict the transient response of a typical nuclear power plant (NPP) undergoing a steam generator tube rupture (SGTR). The model employs Recurrent Neural Networks (RNNs), including the Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and a hybrid CNN-LSTM model. To address the uncertainty inherent in such predictions, a Bayesian Neural Network (BNN) was implemented. The models were trained using a database generated by the Best Estimate Plus Uncertainty (BEPU) methodology, coupling the thermal-hydraulics code RELAP5/SCDAP/MOD3.4 to the statistical tool DAKOTA to predict the variation in system response under various operational and phenomenological uncertainties. The RNN models successfully capture the underlying characteristics of the data with reasonable accuracy, and the BNN-LSTM approach offers an additional layer of insight into the level of uncertainty associated with the predictions. The results demonstrate that LSTM outperforms GRU, while the hybrid CNN-LSTM model is computationally the most efficient. This study aims to gain a better understanding of the capabilities and limitations of machine learning models in the context of nuclear safety. By expanding the application of ML models to more severe accident scenarios, where operators are under extreme stress and prone to errors, ML models can provide valuable support and act as expert systems to assist in decision-making while minimizing the chances of human error.
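
The paper's uncertainty layer is a BNN-LSTM trained on a RELAP5/SCDAP-DAKOTA database; as a lightweight stand-in, the sketch below uses Monte Carlo dropout on a small LSTM forecaster with synthetic inputs to produce a predictive mean and spread. All shapes and hyperparameters are assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class DropoutLSTM(nn.Module):
    """Small LSTM forecaster with dropout kept active at inference (MC dropout)."""
    def __init__(self, n_features=4, hidden=32, p=0.2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.drop = nn.Dropout(p)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(self.drop(out[:, -1, :]))   # predict the next value of one variable

model = DropoutLSTM()

# synthetic stand-in for a transient window (e.g. normalized plant parameters, 60 steps)
x = torch.randn(16, 60, 4)

# Monte Carlo dropout: keep dropout on and sample the network repeatedly
model.train()                             # deliberately leaves dropout active
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])   # (100, 16, 1)

mean = samples.mean(dim=0)                # predictive mean per input window
std = samples.std(dim=0)                  # spread, a proxy for predictive uncertainty
print(mean.shape, std.shape)
```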

Characterization of Soil Variability of Songdo Area in Incheon (인천 송도지역 지반의 변동성 분석)

  • Kim, Dong-Hee; An, Shin-Whan; Kim, Jae-Jung; Lee, Woo-Jin
    • Journal of the Korean Geotechnical Society / v.25 no.6 / pp.73-88 / 2009
  • Geotechnical variability is a complex feature that results from many independent sources of uncertainty and is mainly affected by inherent variability and measurement errors. This study evaluates the coefficient of variation (COV) of soil properties and soil layers in the Songdo area of Incheon, Korea. Since soil variability is sensitive to soil layer and soil type, the COVs by soil layer (reclaimed layer and marine layer) and the COVs by soil type (clay and silt) were evaluated separately. It is observed that the geotechnical variability of the marine layer and of clay is relatively smaller than that of the reclaimed layer and of silt. In addition, in the interpretation of the strength parameters of fresh and weathered rock, the highly weathered rock and soil show higher COVs. The proposed COVs of the Songdo area can be used in reliability-based design procedures.
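
The coefficient of variation used throughout this abstract is simply the standard deviation divided by the mean, computed per layer or soil type; a minimal sketch with made-up values (not the Songdo data):

```python
import numpy as np

# illustrative cone-resistance values grouped by layer/soil type (assumed, not the Songdo data)
samples = {
    "reclaimed fill": np.array([3.1, 4.8, 2.6, 5.4, 3.9, 6.1]),
    "marine clay":    np.array([1.2, 1.4, 1.1, 1.3, 1.5, 1.2]),
    "silt":           np.array([2.0, 3.4, 2.9, 1.6, 3.8, 2.2]),
}

for layer, x in samples.items():
    cov = x.std(ddof=1) / x.mean()        # coefficient of variation = std / mean
    print(f"{layer:15s} mean={x.mean():5.2f}  COV={cov:.2f}")
```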

Probabilistic Stability Analysis of Slopes by the Limit Equilibrium Method Considering Spatial Variability of Soil Property (지반물성의 공간적 변동성을 고려한 한계평형법에 의한 확률론적 사면안정 해석)

  • Cho, Sung-Eun; Park, Hyung-Choon
    • Journal of the Korean Geotechnical Society / v.25 no.12 / pp.13-25 / 2009
  • In this paper, a numerical procedure for probabilistic slope stability analysis that considers the spatial variability of soil properties is presented. The procedure extends the deterministic analysis based on the limit equilibrium method of slices to a probabilistic approach that accounts for the uncertainties and spatial variation of the soil parameters. Like the Random Finite Element Method (RFEM), the approach makes no a priori assumptions about the critical failure surface, while reducing the solution time required to perform the analysis. Two-dimensional random fields are generated based on a Karhunen-Loève expansion in a fashion consistent with a specified marginal distribution function and an autocorrelation function. A Monte Carlo simulation is then used to determine the statistical response based on the random fields. A series of analyses was performed to verify the applicability of the proposed method and to study the effects of the uncertainty caused by spatial heterogeneity on the stability of a slope. The results show that the proposed method can efficiently consider the various failure mechanisms caused by the spatial variability of soil properties in a probabilistic slope stability assessment.
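
A 1-D toy version of the random-field machinery the abstract describes: a discrete Karhunen-Loève expansion of an exponential autocorrelation model, translation to a lognormal marginal, and a Monte Carlo loop over a simple strength-versus-demand check. The grid, correlation length, strength statistics, and demand are assumptions, and the real procedure works on 2-D fields with a limit-equilibrium search for the critical surface.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D grid of points along a prescribed slip surface (toy stand-in for the 2-D field)
n, L = 50, 30.0                     # number of points, surface length [m]
x = np.linspace(0.0, L, n)
theta = 10.0                        # autocorrelation distance [m] (assumed)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / theta)   # exponential autocorrelation

# discrete Karhunen-Loeve expansion: eigen-decomposition of the correlation matrix
lam, phi = np.linalg.eigh(C)
idx = np.argsort(lam)[::-1][:15]    # keep the 15 dominant modes
lam, phi = lam[idx], phi[:, idx]

mean_su, cov_su = 40.0, 0.3         # undrained strength statistics [kPa] (assumed)
sig_ln = np.sqrt(np.log(1 + cov_su**2))
mu_ln = np.log(mean_su) - 0.5 * sig_ln**2

n_mc, demand = 20_000, 30.0         # Monte Carlo samples, mobilized demand [kPa] (assumed)
xi = rng.standard_normal((n_mc, lam.size))
g = xi @ (phi * np.sqrt(lam)).T     # standard-normal field realizations, shape (n_mc, n)
su = np.exp(mu_ln + sig_ln * g)     # lognormal marginal via translation

fs = su.mean(axis=1) / demand       # toy factor of safety per realization
print("P(failure) ~", np.mean(fs < 1.0))
```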

A Study on the Probabilistic Analysis Method Considering Spatial Variability of Soil Properties (지반의 공간적 변동성을 고려한 확률론적 해석기법에 관한 연구)

  • Cho, Sung-Eun; Park, Hyung-Choon
    • Journal of the Korean Geotechnical Society / v.24 no.8 / pp.111-123 / 2008
  • Geotechnical engineering problems are characterized by many sources of uncertainty. Some of these sources are connected to the uncertainties of soil properties involved in the analysis. In this paper, a numerical procedure for a probabilistic analysis that considers the spatial variability of soil properties is presented to study the response of spatially random soil. The approach integrates a commercial finite difference method and random field theory into the framework of a probabilistic analysis. Two-dimensional non-Gaussian random fields are generated based on a Karhunen-Loève expansion in a fashion consistent with a specified marginal distribution function and an autocorrelation function. A Monte Carlo simulation is then used to determine the statistical response based on the random fields. A series of analyses were performed to study the effects of uncertainty due to the spatial heterogeneity on the settlement and bearing capacity of a rough strip footing. The simulations provide insight into the application of uncertainty treatment to the geotechnical problem and show the importance of the spatial variability of soil properties with regard to the outcome of a probabilistic assessment.

Real-time private consumption prediction using big data (빅데이터를 이용한 실시간 민간소비 예측)

  • Seung Jun Shin; Beomseok Seo
    • The Korean Journal of Applied Statistics / v.37 no.1 / pp.13-38 / 2024
  • As economic uncertainties have increased recently due to COVID-19, there is a growing need to quickly grasp private consumption trends that directly reflect the economic situation of private economic entities. This study proposes a method of estimating private consumption in real-time by comprehensively utilizing big data as well as existing macroeconomic indicators. In particular, it is intended to improve the accuracy of private consumption estimation by comparing and analyzing various machine learning methods that are capable of fitting ultra-high-dimensional big data. As a result of the empirical analysis, it has been demonstrated that when the number of covariates including big data is large, variables can be selected in advance and used for model fit to improve private consumption prediction performance. In addition, as the inclusion of big data greatly improves the predictive performance of private consumption after COVID-19, the benefit of big data that reflects new information in a timely manner has been shown to increase when economic uncertainty is high.