• Title/Summary/Keyword: uncertainty factor


Classification of Proximity Relations Using Multiple Fuzzy Alpha Cut (MFAC) (MFAC를 사용한 근접관계의 분류)

  • Ryu, Kyung-Hyun;Chung, Hwan-Mook
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.1
    • /
    • pp.139-144
    • /
    • 2008
  • Generally, the real systems that are the objects of decision-making are highly variable and often involve uncertainty. To address this, statistical methods such as significance levels, certainty factors, and sensitivity analysis have been used. In this paper, we propose a method for fuzzy decision-making based on MFAC (Multiple Fuzzy Alpha Cut) to improve the definiteness of classification results with similarity evaluation. In the proposed method, MFAC is used to extract multiple ${\alpha}$-levels with proximity degrees from the proximity relation, combining the relative Hamming distance with the max-min method, and to minimize the number of data associated with the partition intervals extracted by MFAC. To determine the final decision-making alternative, we compute a weighted value between the data extracted by MFAC. The experimental results show that the proposed method classifies more simply and definitely than conventional methods and efficiently determines an alternative for the decision-maker, as confirmed by statistical significance testing of the sample data.
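
A minimal sketch of the building block this abstract relies on: the alpha-cut of a fuzzy proximity relation, applied at several levels. The paper's exact MFAC procedure is not reproduced here; the function names and the example matrix are illustrative assumptions.

```python
import numpy as np

def alpha_cut(relation: np.ndarray, alpha: float) -> np.ndarray:
    """Return the crisp relation {(i, j) : mu(i, j) >= alpha}."""
    return (relation >= alpha).astype(int)

def multiple_alpha_cuts(relation, alphas):
    """Apply several alpha-levels, as in a multiple-fuzzy-alpha-cut scheme."""
    return {a: alpha_cut(relation, a) for a in sorted(alphas)}

# Example fuzzy proximity (similarity) matrix for four alternatives.
R = np.array([
    [1.0, 0.8, 0.4, 0.2],
    [0.8, 1.0, 0.5, 0.3],
    [0.4, 0.5, 1.0, 0.9],
    [0.2, 0.3, 0.9, 1.0],
])

for a, cut in multiple_alpha_cuts(R, [0.5, 0.8]).items():
    print(f"alpha = {a}:\n{cut}")
```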

Optimum design of lead-rubber bearing system with uncertainty parameters

  • Fan, Jian;Long, Xiaohong;Zhang, Yanping
    • Structural Engineering and Mechanics
    • /
    • v.56 no.6
    • /
    • pp.959-982
    • /
    • 2015
  • In this study, a non-stationary random earthquake Clough-Penzien model is used to describe earthquake ground motion. Using stochastic direct integration in combination with an equivalent linear method, a solution is established to describe the non-stationary response of a lead-rubber bearing (LRB) system to a stochastic earthquake. Two parameters are used to develop an optimization method for bearing design: the post-yielding stiffness and the normalized yield strength of the isolation bearing. Taking the minimization of the maximum energy response level of the upper structure subjected to an earthquake as the objective function, with the constraints that the bearing failure probability is no more than 5% and the second shape factor of the bearing is less than 5, a calculation method for the two optimal design parameters is presented. In this optimization process, a radial basis function (RBF) response surface was applied in place of the implicit objective function and constraints, and a sequential quadratic programming (SQP) algorithm was used to solve the optimization problems. To account for the uncertainties of the structural parameters and the seismic ground motion input parameters in the bearing design optimization, convex set models (such as the interval model and the ellipsoidal model) are used to describe the uncertain parameters. Subsequently, the optimal bearing design parameters were expanded at their median values into first-order Taylor series expansions, and the Lagrange multiplier method was used to determine the upper and lower boundaries of the parameters. Finally, using a calculation example, the impacts of the site soil parameters, input peak ground acceleration, bearing diameter, and rubber shore hardness on the optimization parameters are investigated.
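
A hedged sketch of the bounding step the abstract describes: propagating interval (convex-set) uncertainty through a first-order Taylor expansion about the median values. The design function `g` and the parameter intervals below are illustrative placeholders, not the paper's model.

```python
import numpy as np

def interval_bounds_first_order(g, x_med, half_widths, eps=1e-6):
    """Bound g over the box x_med +/- half_widths via a linearization."""
    x_med = np.asarray(x_med, dtype=float)
    grad = np.empty_like(x_med)
    for i in range(x_med.size):  # central finite-difference gradient
        dx = np.zeros_like(x_med)
        dx[i] = eps
        grad[i] = (g(x_med + dx) - g(x_med - dx)) / (2 * eps)
    g0 = g(x_med)
    spread = np.abs(grad) @ np.asarray(half_widths)  # worst-case linear term
    return g0 - spread, g0 + spread

# Example: an optimal design parameter as a (made-up) function of
# peak ground acceleration and rubber hardness.
g = lambda x: 0.8 * x[0] + 0.05 * x[1] ** 2
lo, hi = interval_bounds_first_order(g, x_med=[0.4, 60.0], half_widths=[0.05, 5.0])
print(f"parameter bounds: [{lo:.3f}, {hi:.3f}]")
```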

A novel evidence theory model and combination rule for reliability estimation of structures

  • Tao, Y.R.;Wang, Q.;Cao, L.;Duan, S.Y.;Huang, Z.H.H.;Cheng, G.Q.
    • Structural Engineering and Mechanics
    • /
    • v.62 no.4
    • /
    • pp.507-517
    • /
    • 2017
  • Due to the discontinuous nature of uncertainty quantification in conventional evidence theory (ET), the computational cost of reliability analysis based on an ET model is very high. A novel ET model based on fuzzy distributions, together with a corresponding combination rule to synthesize the judgments of experts, is put forward in this paper. The intersection and union of the membership functions are defined as the belief and plausibility membership functions, respectively, and Murphy's average combination rule is adopted to combine the basic probability assignments of the focal elements. The combined membership functions are then transformed into an equivalent probability density function by a normalizing factor. Finally, a reliability analysis procedure is presented for structures with a mixture of epistemic and aleatory uncertainties, in which the equivalent normalization method is adopted to solve for the upper and lower bounds of the reliability. The effectiveness of the procedure is demonstrated by a numerical example and an engineering example. The results show that the reliability interval calculated by the suggested method is almost identical to that obtained by the conventional method, while the computational cost of the suggested procedure is much lower. The suggested ET model provides a new way to flexibly represent epistemic uncertainty and an efficient method to estimate the reliability of structures with a mixture of epistemic and aleatory uncertainties.
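
A hedged sketch of one ingredient the abstract names, Murphy's averaging combination rule for basic probability assignments (BPAs): average the experts' BPAs, then combine the average with itself n-1 times by Dempster's rule. The focal elements and example BPAs are illustrative, not taken from the paper.

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination for two BPAs over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

def murphy_combine(bpas):
    """Average the BPAs, then Dempster-combine the average n-1 times."""
    n = len(bpas)
    avg = {}
    for m in bpas:
        for s, v in m.items():
            avg[s] = avg.get(s, 0.0) + v / n
    result = avg
    for _ in range(n - 1):
        result = dempster(result, avg)
    return result

A, B = frozenset({"safe"}), frozenset({"fail"})
m1 = {A: 0.7, B: 0.1, A | B: 0.2}   # expert 1
m2 = {A: 0.5, B: 0.3, A | B: 0.2}   # expert 2
print(murphy_combine([m1, m2]))
```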

MEASURING THE INFLUENCE OF TASK COMPLEXITY ON HUMAN ERROR PROBABILITY: AN EMPIRICAL EVALUATION

  • Podofillini, Luca;Park, Jinkyun;Dang, Vinh N.
    • Nuclear Engineering and Technology
    • /
    • v.45 no.2
    • /
    • pp.151-164
    • /
    • 2013
  • A key input for the assessment of Human Error Probabilities (HEPs) with Human Reliability Analysis (HRA) methods is the evaluation of the factors influencing human performance (often referred to as Performance Shaping Factors, PSFs). In general, the definition of these factors and the supporting guidance are such that their evaluation involves significant subjectivity. This affects the repeatability of HRA results as well as the collection of HRA data for model construction and verification. In this context, the present paper considers the TAsk COMplexity (TACOM) measure, developed by one of the authors to quantify the complexity of procedure-guided tasks (performed by the operating crew of nuclear power plants in emergency situations), and evaluates its use to represent, objectively and quantitatively, the task complexity issues relevant to HRA methods. In particular, TACOM scores are calculated for five Human Failure Events (HFEs) for which empirical evidence on the HEPs (albeit with large uncertainty) and influencing factors is available from the International HRA Empirical Study. The empirical evaluation has shown promising results: the TACOM score increases as the empirical HEP of the selected HFEs increases. Except for one case, TACOM scores are well distinguished across difficulty categories (e.g., "easy" vs. "somewhat difficult"), while values corresponding to tasks within the same category are very close. Despite some important limitations related to the small number of HFEs investigated and the large uncertainty in their HEPs, this paper presents one of the few attempts to empirically study the effect of a performance shaping factor on the human error probability. This type of study is important to strengthen the empirical basis of HRA methods, ensuring that 1) the definitions of the PSFs cover the influences important for HRA (i.e., those influencing the error probability), and 2) the quantitative relationships among PSFs and error probability are adequately represented.
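
A hedged illustration of the kind of empirical check the paper performs: testing whether a task-complexity score increases with the empirical human error probability. The five (score, HEP) pairs below are hypothetical placeholders, not the paper's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical TACOM-like scores and empirical HEPs for five HFEs.
tacom_scores = np.array([3.1, 3.8, 4.2, 4.9, 5.5])
empirical_heps = np.array([1e-3, 4e-3, 1e-2, 5e-2, 2e-1])

# Rank correlation on log-HEPs, since HEPs span orders of magnitude.
rho, p_value = spearmanr(tacom_scores, np.log10(empirical_heps))
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```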

A Study on Effect Analysis and Design Optimization of Tire and ABS Logic for Vehicle Braking Performance Improvement (차량 제동성능 개선을 위한 타이어 인자 분석 및 최적설계에 대한 연구)

  • Ki, Won Yong;Lee, Gwang Woo;Heo, Seung Jin;Kang, Dae Oh;Kim, Ki Woon
    • Transactions of the Korean Society of Automotive Engineers
    • /
    • v.24 no.5
    • /
    • pp.581-587
    • /
    • 2016
  • Braking is a basic and important safety feature for all vehicles, and the final braking performance of a vehicle is determined by its ABS performance and tire performance. However, the combination of an excellent ABS and excellent tires does not always ensure good braking performance. This is because tire behavior is non-linear, and the repeated increase and decrease of wheel slip during ABS activation is hard to predict, which increases the uncertainty of tire performance prediction. Furthermore, existing studies predicted braking performance with an ABS controller based on wheel slip control, which differs from an actual vehicle's ABS that controls wheel angular acceleration, thereby reducing the prediction accuracy of the braking performance. This paper reverse-engineered an ABS controlling angular acceleration from brake pressure and other information obtained in vehicle tests, and established a braking performance prediction model by combining a multi-body dynamics (MBD) vehicle model with a magic formula (MF) tire model. The established analysis model was verified against the results of braking tests on an actual vehicle. Using this analysis model, the braking effect of each vehicle factor was analyzed, and a tire with optimized braking performance was finally designed. As a result of this study, it was possible to design an MF tire model whose braking performance improved by 9.2 %.
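
A minimal sketch of the Pacejka "magic formula" (MF) tire model named in the abstract, in its basic longitudinal form. The B, C, D, E coefficients below are generic illustrative values, not the paper's optimized tire parameters.

```python
import numpy as np

def magic_formula(slip, B=10.0, C=1.9, D=1.0, E=0.97):
    """Normalized longitudinal force as a function of wheel slip ratio."""
    Bx = B * slip
    return D * np.sin(C * np.arctan(Bx - E * (Bx - np.arctan(Bx))))

# Sweep slip ratios of the kind an ABS cycles through.
slip = np.linspace(0.0, 0.3, 7)
for s, m in zip(slip, magic_formula(slip)):
    print(f"slip = {s:.2f} -> normalized force = {m:.3f}")
```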

The Effect of Unreliable Default Parameters in Forecasting Delay and Level of Service of Signalized Intersections (초기변수의 불확실성이 신호교차로 지체모형 및 서비스수준 예측에 미치는 영향 분석)

  • Kim, Sung-Deuk;Park, Won-Kyu;Kim, Kyung-Kyu
    • Journal of Navigation and Port Research
    • /
    • v.27 no.4
    • /
    • pp.471-478
    • /
    • 2003
  • At a signalized intersection, capacity analysis is conducted with a large amount of input data on roadway, traffic, and signal conditions, and the level of service (LOS) is determined from the delay estimated as a measure of effectiveness (MOE) of this procedure. However, the errors caused by uncertainty in the field input data (turning volumes, lane geometry, signal timing, approach grade, percentage of heavy vehicles, peak hour factor, arrival type, etc.), which form the basis for determining capacity and LOS, are usually not considered, so the reliability of the estimated capacity and LOS goes unverified. This study therefore examines whether uncertainty in input data such as traffic volume, percentage of heavy vehicles, and roadway geometry on the approach lanes of the studied intersection affects the capacity analysis and LOS determination, and suggests how to minimize their influence.
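
A hedged sketch of the kind of sensitivity test the abstract describes: perturb an uncertain input (here, traffic volume) and observe how the delay estimate moves. Webster's uniform-delay term is used purely as a stand-in for the paper's HCM-style delay model; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_delay(volume, capacity=1800.0, cycle=120.0, g_over_c=0.45):
    """Webster's uniform delay (s/veh) for one lane group."""
    x = np.minimum(volume / capacity, 0.99)  # degree of saturation
    return 0.5 * cycle * (1 - g_over_c) ** 2 / (1 - g_over_c * x)

# A field count of 1500 veh/h with an assumed +/-10% measurement uncertainty.
volumes = rng.normal(1500.0, 150.0, size=10_000)
delays = uniform_delay(volumes)
print(f"delay: mean = {delays.mean():.1f} s, std = {delays.std():.1f} s")
```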

Development of Load and Resistance Factor Design of Mound Breakwater Against Circular Failure (경사식 방파제 원호파괴에 대한 하중저항계수 설계법 개발)

  • Kim, unghwan;Huh, Jungwon;Lee, Kicheol;Kim, Dongwook
    • Journal of the Korean Geosynthetics Society
    • /
    • v.18 no.4
    • /
    • pp.205-214
    • /
    • 2019
  • A load and resistance factor design (LRFD) method for mound breakwaters against circular failure was developed in this study. To achieve this goal, the uncertainties of the soil, mound, and concrete cap parameters were determined. Eight design cases of domestic mound breakwaters were collected and analyzed. Monte Carlo simulation was implemented to determine the most critical slip surfaces of the design cases. Using the Monte Carlo results, reliability analyses were performed with the First-Order Reliability Method (FORM). Optimal load and resistance factors were calculated from the reliability analysis results, and final load and resistance factors were proposed based on the calculated optimal factors.
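
A hedged sketch of the Monte Carlo step the abstract describes: sample uncertain parameters, evaluate a limit-state function, and estimate the failure probability. The limit state g = R - S below is a generic stand-in for the paper's circular-slip safety margin, not its actual model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 200_000

# Illustrative lognormal resistance and load effects.
resistance = rng.lognormal(mean=np.log(1.5), sigma=0.15, size=n)
load = rng.lognormal(mean=np.log(1.0), sigma=0.20, size=n)

g = resistance - load               # failure when g < 0
pf = np.mean(g < 0.0)
beta = -norm.ppf(pf)                # equivalent reliability index
print(f"Pf = {pf:.4f}, beta = {beta:.2f}")
```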

Reliability Analysis of Plane Failure in Rock Slope (암반사면의 평면파괴에 대한 신뢰성해석)

  • 장연수;오승현;김종수
    • Journal of the Korean Geotechnical Society
    • /
    • v.18 no.4
    • /
    • pp.119-126
    • /
    • 2002
  • A reliability analysis is performed to investigate how the uncertainty arising from the small number of in-situ samples and the inherent heterogeneity of the ground influences the probability of failure of a rock cut slope. The results are compared with those of a deterministic slope stability analysis. The random variables used are the unit weight of the rock, the angle of the potential failure plane, and the cohesion and internal friction angle of the joints. It was found that a rock slope whose factor of safety satisfies the minimum safety factor in the deterministic analysis can still have a high probability of failure in the reliability analysis when weak geological strata are present in the cut slope. Among the random parameters included in the reliability analysis, the probability of failure of the rock slope is most sensitive to the mean and standard deviation of the joint cohesion. The sensitivities of the mean values are larger than those of the standard deviations, which means that accurate estimation of the means of the in-situ geotechnical properties is important.
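
A hedged sketch of a first-order second-moment (FOSM) reliability check for planar rock-slope failure, using the standard plane-failure safety factor FS = (cA + W cos(psi) tan(phi)) / (W sin(psi)). The statistics below are illustrative, not the paper's site data, and the paper's own method may differ in detail.

```python
import numpy as np

def fs(c, phi_deg, psi_deg=35.0, W=2000.0, A=30.0):
    """Plane-failure factor of safety for cohesion c (kPa) and friction phi."""
    psi, phi = np.radians(psi_deg), np.radians(phi_deg)
    return (c * A + W * np.cos(psi) * np.tan(phi)) / (W * np.sin(psi))

means = {"c": 25.0, "phi_deg": 30.0}           # illustrative joint statistics
stds = {"c": 7.5, "phi_deg": 3.0}

mu_fs = fs(**means)
var_fs = 0.0
for name in means:                              # finite-difference FOSM
    bumped = dict(means)
    bumped[name] += 1e-4
    dfs = (fs(**bumped) - mu_fs) / 1e-4
    var_fs += (dfs * stds[name]) ** 2

beta = (mu_fs - 1.0) / np.sqrt(var_fs)          # limit state: FS = 1
print(f"mean FS = {mu_fs:.2f}, beta = {beta:.2f}")
```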

A study on dose variation due to exchange of the upper and lower jaws in a linear accelerator (선형가속기에서 상위조리개와 하위조리개의 교환에 의한 선량 변화의 고찰)

  • Lim CK.;Kim HN.;Song KW.
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.11 no.1
    • /
    • pp.6-10
    • /
    • 1999
  • The field size can affect beam output, so the monitor units (MU) vary with the field size dependence. The purpose of this study is to evaluate and compare the dose variation caused by exchanging the collimator jaws. The measurements were performed with a Wellhofer dosimetry system (water phantom, ion chamber, electrometer, system controller, build-up cap, etc.) and two types of linear accelerator (Mevatron KD, Mevatron MX). Scatter contributes to the field size dependence, and the scatter correction is separated into collimator and phantom components; the collimator component can be affected by exchanging the jaws. The collimator scatter factor (Sc) was measured in air with a build-up cap: 1) square fields from 5 cm to 40 cm were measured; 2) the upper jaw was kept constant at 10 cm while the lower jaw was varied from 5 cm to 40 cm; and 3) the lower jaw was kept constant at 10 cm while the upper jaw was varied from 5 cm to 40 cm. The total scatter factor (Scp) was measured in water at Dmax, following the same procedure as the collimator scatter factor measurements. The phantom scatter factors were then obtained from the equation Sp = Scp/Sc. The measured data were normalized to the reference field size ($10{\times}10$), and rectangular fields were converted to equivalent square fields so that the three data sets could be compared. As the collimator setting was varied, the output changed. In conclusion, the error found was small, but it must be eliminated if we intend to reach the commonly stated goal of $5\%$ overall uncertainty in dose determination.
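
A minimal sketch of the two relations the abstract uses: the phantom scatter factor Sp = Scp/Sc, and the common equivalent-square rule (side = 2ab/(a+b)) for comparing rectangular jaw settings with square fields. The numeric values below are illustrative, not the measured data.

```python
def phantom_scatter(scp: float, sc: float) -> float:
    """Sp = Scp / Sc (total scatter over collimator scatter)."""
    return scp / sc

def equivalent_square(a: float, b: float) -> float:
    """Side of the equivalent square for an a x b rectangular field (cm)."""
    return 2.0 * a * b / (a + b)

# A 10 cm x 30 cm jaw setting behaves roughly like a 15 cm square field.
side = equivalent_square(10.0, 30.0)
sp = phantom_scatter(scp=1.035, sc=1.012)
print(f"equivalent square = {side:.1f} cm, Sp = {sp:.3f}")
```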


Estimation of Mass Discrimination Factor for a Wide Range of m/z by Argon Artificial Isotope Mixtures and NF3 Gas

  • Min, Deullae;Lee, Jin Bok;Lee, Christopher;Lee, Dong Soo;Kim, Jin Seog
    • Bulletin of the Korean Chemical Society
    • /
    • v.35 no.8
    • /
    • pp.2403-2409
    • /
    • 2014
  • The absolute isotope ratio is a critical constituent in the determination of atomic weight. To measure an absolute isotope ratio with a mass spectrometer, a mass discrimination factor, $f_{MD}$, is needed to convert the measured isotope ratio into the real isotope ratio of the gas molecules. If $f_{MD}$ could be predicted, the absolute isotope ratio of a chemical species would be measurable in the absence of enriched isotopically pure materials or isotope references. This work employed gravimetrically prepared isotope mixtures of argon (Ar) to obtain $f_{MD}$ at m/z 40 in a magnetic-sector gas mass spectrometer (gas/MS). In addition, we compared the nitrogen isotope ratio of nitrogen trifluoride ($NF_3$) with that of nitrogen ($N_2$) thermally decomposed from the same $NF_3$, in order to identify the difference of $f_{MD}$ values over the extensive m/z region from 28 to 71. Our results show that $f_{MD}$ at m/z 40 was $-0.044%{\pm}0.017%$ (k = 1) from measurement of the Ar artificial isotope mixtures, and that the $f_{MD}$ difference over the m/z range from 28 to 71, observed from $NF_3$ and $N_2$, was $-0.12%{\pm}0.14%$. Combining this work with the $f_{MD}$ values reported by another team, IRMM, if an $f_{MD}$ of $-0.16%{\pm}0.14%$ is applied to isotope ratio measurements from $N_2$ to $SF_6$, the absolute isotope ratio can be determined within a relative uncertainty of 0.2 %.
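
A hedged sketch of how a per-mass-unit discrimination factor can be derived from a gravimetric mixture and applied, assuming the common linear mass-bias law R_true = R_measured (1 + f_MD dm), where dm is the isotope mass difference. The abstract does not spell out its convention, so this is an illustration with made-up numbers, not the paper's equation.

```python
def f_md_from_mixture(r_true: float, r_measured: float, dm: float) -> float:
    """Per-mass-unit discrimination factor from a gravimetric mixture."""
    return (r_true / r_measured - 1.0) / dm

def correct_ratio(r_measured: float, f_md: float, dm: float) -> float:
    """Convert a measured isotope ratio to a 'true' ratio (linear law)."""
    return r_measured * (1.0 + f_md * dm)

# Illustrative numbers only: a gravimetric 36Ar/40Ar mixture (dm = 4).
f = f_md_from_mixture(r_true=0.5000, r_measured=0.5010, dm=4.0)
print(f"f_MD = {f:.5f} per mass unit")
print(f"corrected ratio = {correct_ratio(0.5010, f, 4.0):.4f}")
```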