• Title/Summary/Keyword: Mean Value Reliability Method

A study on the fault detection efficiency of software (소프트웨어의 결함 검출 효과에 관한 연구)

  • Kim, Sun-Il;Che, Gyu-Shik;Jo, In-June
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.4
    • /
    • pp.737-743
    • /
    • 2008
  • We compare our parameter estimation methodology with an existing method, considering testing effort and fault detection rate simultaneously in software reliability modeling. Generally speaking, the fault detection/removal mechanism depends on how previous fault detection/removal results and the testing effort expended on the software are applied. Fault removal efficiency strongly influences reliability growth and the testing and removal costs during the development stage. It is a very useful measure throughout all development stages and helps the developer estimate debugging efficiency and, furthermore, anticipate the amount of additional work required.
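The abstract above describes an effort-dependent NHPP in general terms without giving its equations, so the following is only an illustrative sketch: a mean value function driven by cumulative testing effort W(t), with assumed functional forms and parameter values (a, b, alpha, beta are hypothetical, not the paper's).

```python
# Illustrative sketch only (not the paper's model): an NHPP whose mean value
# function is driven by cumulative testing effort W(t) rather than raw time.
import numpy as np

def testing_effort(t, alpha=100.0, beta=0.05):
    """Cumulative testing effort W(t); here a simple exponential consumption curve."""
    return alpha * (1.0 - np.exp(-beta * t))

def mean_value(t, a=150.0, b=0.02):
    """Expected number of faults detected by time t, with fault detection rate b
    acting on the consumed testing effort and total fault content a."""
    return a * (1.0 - np.exp(-b * testing_effort(t)))

t = np.linspace(0.0, 100.0, 11)      # hypothetical test weeks
print(np.round(mean_value(t), 1))    # expected cumulative detected faults
```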

Reliability-Based Service Life Estimation of Concrete in Marine Environment (신뢰성이론에 기반한 해양환경 콘크리트의 내구수명 평가)

  • Kim, Ki-Hyun;Cha, Soo-Won
    • Journal of the Korea Concrete Institute
    • /
    • v.22 no.4
    • /
    • pp.595-603
    • /
    • 2010
  • The Monte Carlo simulation technique is often used to predict the service life of concrete structures subjected to chloride penetration in marine environments based on probability theory. The Monte Carlo method, however, gives different results each time the simulation is run. On the other hand, the moment method, which is frequently used in reliability analysis, requires negligible computational cost compared with the simulation technique and gives a constant result for the same problem. Thus, in this study, the moment method was applied to the calculation of the corrosion-initiation probability. For this purpose, computer programs to calculate failure probabilities were developed using the first-order second-moment (FOSM) and second-order second-moment (SOSM) methods, respectively. In the analysis examples with the developed programs, SOSM was found to give more accurate results than FOSM. The sensitivity analysis showed that the factor affecting the corrosion-initiation probability the most was the cover depth, and that the corrosion-initiation probability was influenced more by its coefficient of variation than by its mean value.
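The abstract does not reproduce its limit-state function, so the following is a hedged FOSM sketch under assumed values: chloride ingress from Fick's second law, corrosion initiation when the chloride content at the cover depth exceeds a critical threshold, and the failure probability from a first-order Taylor expansion about the mean point. All means and coefficients of variation are hypothetical.

```python
# Hedged FOSM sketch (assumed parameters, not the paper's program):
# limit state g = C_crit - C(cover, t); Pf ~ Phi(-beta) with beta = mu_g / sigma_g.
import numpy as np
from scipy.special import erfc
from scipy.stats import norm

def chloride(cover_mm, D_mm2_yr, Cs, t_yr=50.0):
    """Fick's second-law chloride content at depth cover_mm after t_yr years."""
    return Cs * erfc(cover_mm / (2.0 * np.sqrt(D_mm2_yr * t_yr)))

def g(x):    # x = [cover depth, diffusion coeff., surface chloride, critical chloride]
    cover, D, Cs, Ccrit = x
    return Ccrit - chloride(cover, D, Cs)

mu  = np.array([65.0, 25.0, 0.45, 0.15])   # assumed mean values
cov = np.array([0.15, 0.20, 0.20, 0.10])   # assumed coefficients of variation
sigma = mu * cov

# First-order (mean-point) Taylor expansion of g via central differences.
eps = 1e-6 * mu
grad = np.array([(g(mu + np.eye(4)[i] * eps[i]) - g(mu - np.eye(4)[i] * eps[i])) / (2.0 * eps[i])
                 for i in range(4)])
beta = g(mu) / np.sqrt(np.sum((grad * sigma) ** 2))
print(f"beta = {beta:.2f}, corrosion-initiation probability = {norm.cdf(-beta):.3f}")
```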

The Bayesian Inference for Software Reliability Models Based on NHPP (NHPP에 기초한 소프트웨어 신뢰도 모형에 대한 베이지안 추론에 관한 연구)

  • Lee, Sang-Sik;Kim, Hui-Cheol;Song, Yeong-Jae
    • The KIPS Transactions:PartD
    • /
    • v.9D no.3
    • /
    • pp.389-398
    • /
    • 2002
  • Software reliability growth models are used in the testing stages of software development to model the error content and the time intervals between software failures. This paper presents a stochastic model for the software failure phenomenon based on a nonhomogeneous Poisson process (NHPP) and performs Bayesian inference using prior information. The failure process is analyzed to develop a suitable mean value function for the NHPP, and expressions are given for several performance measures. Actual software failure data are compared with several models with respect to the constant reflecting the quality of testing. The performance measures and parametric inferences of the suggested models, using the Rayleigh distribution and the Laplace distribution, are discussed. The results of the suggested models are applied to real software failure data and compared with the Goel model. Parameter point estimates and 95% credible intervals were obtained by Gibbs sampling. Model selection using the sum of squared errors was employed, and the approach is illustrated with a numerical example based on the NTDS data.
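As a small illustration of the sum-of-squared-errors model selection mentioned at the end of the abstract, the sketch below compares two candidate NHPP mean value functions against a hypothetical cumulative-failure record; the data, parameter values, and the Rayleigh-type form are assumptions for illustration, not the paper's.

```python
# Illustrative SSE-based model comparison for NHPP mean value functions
# (hypothetical data and parameters).
import numpy as np

def m_goel(t, a, b):        # Goel-Okumoto: m(t) = a (1 - e^{-b t})
    return a * (1.0 - np.exp(-b * t))

def m_rayleigh(t, a, b):    # Rayleigh-type: m(t) = a (1 - e^{-b t^2})
    return a * (1.0 - np.exp(-b * t ** 2))

def sse(model, t, n_obs, params):
    return float(np.sum((n_obs - model(t, *params)) ** 2))

t = np.array([1, 2, 3, 4, 5, 6], dtype=float)          # hypothetical test intervals
n = np.array([12, 20, 26, 30, 33, 34], dtype=float)    # hypothetical cumulative failures

print("Goel-Okumoto SSE :", round(sse(m_goel, t, n, (36.0, 0.45)), 2))
print("Rayleigh-type SSE:", round(sse(m_rayleigh, t, n, (35.0, 0.15)), 2))
```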

Localized reliability analysis on a large-span rigid frame bridge based on monitored strains from the long-term SHM system

  • Liu, Zejia;Li, Yinghua;Tang, Liqun;Liu, Yiping;Jiang, Zhenyu;Fang, Daining
    • Smart Structures and Systems
    • /
    • v.14 no.2
    • /
    • pp.209-224
    • /
    • 2014
  • With more and more long-term structural health monitoring (SHM) systems in service, applying monitored data to assess the reliability of bridges has attracted growing attention. In this paper, based on a long-term SHM system in which the sensors were embedded from the beginning of bridge construction, a method to calculate the localized reliability around an embedded sensor is recommended and implemented. In the reliability analysis, the probability distribution of loading is taken from the statistics of stress converted from the monitored strain, which directly covers the effects of both live and dead loads; this means that the mean value and deviation of the loads are derived entirely from the monitored data. The probability distribution of resistance is correspondingly taken from the statistics of the strength of the bridge material. Using five years of monitored strains, the localized reliabilities around the monitoring sensors of a bridge were computed by this method. Further, the monitored stresses were divided into two time segments within a one-year period, according to the local climate conditions, to count the loading probability distribution, which makes it possible to examine the reliability in different time segments and its evolution trend. The results show that the reliabilities and their trends differ between parts of the bridge, although all parts remain reliable. The method recommended in this paper is a feasible way to obtain localized reliabilities from the monitored data of a long-term SHM system, and it would help bridge engineers and managers decide on an inspection or maintenance strategy.
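The abstract states that the load-effect statistics come from the monitored strain and the resistance statistics from the material strength; a minimal sketch of such a localized reliability index, assuming normal distributions and placeholder numbers (the modulus, strength statistics, and strain series are all hypothetical), is given below.

```python
# Minimal sketch of a localized reliability index at one sensor, assuming
# normal load effect S and resistance R: beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2).
import numpy as np
from scipy.stats import norm

E = 3.45e4                                       # MPa, assumed concrete modulus
rng = np.random.default_rng(0)
strain = rng.normal(120e-6, 15e-6, 5 * 365)      # stand-in for 5 years of daily monitored strain
stress = E * strain                              # MPa, load effect transferred from strain

mu_S, sd_S = stress.mean(), stress.std(ddof=1)   # load-effect statistics from monitoring
mu_R, sd_R = 26.8, 3.2                           # MPa, assumed material strength statistics

beta = (mu_R - mu_S) / np.sqrt(sd_R ** 2 + sd_S ** 2)
print(f"beta = {beta:.2f}, Pf = {norm.cdf(-beta):.2e}")
```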

Minimal clinically important difference of mouth opening in oral submucous fibrosis patients: a retrospective study

  • Kaur, Amanjot;Rustagi, Neeti;Ganesan, Aparna;PM, Nihadha;Kumar, Pravin;Chaudhry, Kirti
    • Journal of the Korean Association of Oral and Maxillofacial Surgeons
    • /
    • v.48 no.3
    • /
    • pp.167-173
    • /
    • 2022
  • Objectives: The purpose of this study was to estimate the minimal clinically important difference (MCID) of mouth opening (MO) and patient satisfaction in surgically treated oral submucous fibrosis (OSMF) patients. Materials and Methods: The status of MO was collected preoperatively (T0), postoperatively at 3 months (T1), and at a minimum of 6 months postoperatively (T2). MCID was determined through the anchor-based approach with the change difference method, the mean change method, and the receiver operating characteristic (ROC) curve method. Results: In this study, 35 patients were enrolled and completed postoperative follow-up (T2), with an average duration of 18.1 months. At T1, the change difference method gave an MCID of MO of 14.89 mm, and the ROC curve exhibited an 11.5 mm gain in MO (sensitivity 81.8%, specificity 100%, area under the curve [AUC] 0.902) classified as the MCID reported by patients. At T2, the MCID of MO was 9.75 mm using the change difference method and 11.75 mm by the mean change method. The ROC curve revealed that the MCID of MO at T2 was 10.5 mm with 73.9% sensitivity and 83.3% specificity (AUC 0.873). The kappa value was 0.91, confirming the reliability of the data. Conclusion: This study demonstrated MCID values that indicate the clinical relevance of surgical treatment of OSMF if the minimum possible gain in MO is approximately 10 mm.
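For readers unfamiliar with the ROC-based MCID, the sketch below shows the usual cutoff selection by maximizing Youden's J (sensitivity + specificity - 1) on hypothetical mouth-opening gains with a hypothetical patient-reported anchor; it illustrates the method class, not the study's data.

```python
# ROC-style MCID cutoff via Youden's J on hypothetical data (not the study's).
import numpy as np

gain_mm  = np.array([4, 6, 8, 9, 10, 11, 12, 13, 14, 15, 16, 18])        # MO gain per patient (mm)
improved = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1], dtype=bool)    # anchor: patient-reported improvement

best_j, mcid = -1.0, None
for cut in np.unique(gain_mm):
    pred = gain_mm >= cut              # classify "improved" when the gain reaches the cutoff
    sens = np.mean(pred[improved])     # sensitivity among truly improved patients
    spec = np.mean(~pred[~improved])   # specificity among patients without improvement
    if sens + spec - 1.0 > best_j:
        best_j, mcid = sens + spec - 1.0, cut

print(f"MCID cutoff = {mcid} mm (Youden J = {best_j:.2f})")
```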

Methods and Sample Size Effect Evaluation for Wafer Level Statistical Bin Limits Determination with Poisson Distributions (포아송 분포를 가정한 Wafer 수준 Statistical Bin Limits 결정방법과 표본크기 효과에 대한 평가)

  • Park, Sung-Min;Kim, Young-Sig
    • IE interfaces
    • /
    • v.17 no.1
    • /
    • pp.1-12
    • /
    • 2004
  • In the modern semiconductor device manufacturing industry, statistical bin limits on wafer-level test bin data are used to minimize the value added to defective product as well as to protect end customers from potential quality and reliability excursions. Most wafer-level test bin data show skewed distributions. By Monte Carlo simulation, this paper evaluates methods for determining statistical bin limits and the effect of sample size. In the simulation, wafer-level test bin data are assumed to follow the Poisson distribution, so typical shapes of the data distribution can be specified in terms of the distribution's parameter. This study examines three different methods: 1) a percentile-based method; 2) data transformation; and 3) Poisson model fitting. The mean square error is adopted as a performance measure for each simulation scenario, and a case study is presented. Results show that the percentile- and transformation-based methods give more stable statistical bin limits for the real dataset. However, with highly skewed distributions, the transformation-based method should be used with caution in determining statistical bin limits. When the data are well fitted by a certain probability distribution, the model-fitting approach can be used. As for the sample size effect, the mean square error appears to decrease exponentially with sample size.
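A minimal sketch of two of the three approaches compared above, the percentile-based limit and the Poisson model-fitting limit, applied to a synthetic per-wafer fail-bin count; the Poisson mean, quantile level, and sample size are assumptions for illustration only.

```python
# Sketch: empirical-percentile vs Poisson-model statistical bin limits
# (synthetic data; parameters are assumptions).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
bin_counts = rng.poisson(lam=3.0, size=200)    # stand-in for one bin's per-wafer fail counts

# 1) Percentile-based limit: a high empirical quantile of the observed counts.
limit_percentile = np.quantile(bin_counts, 0.995)

# 3) Model-fitting limit: estimate the Poisson mean, take the same quantile from the model.
lam_hat = bin_counts.mean()
limit_model = poisson.ppf(0.995, lam_hat)

print(f"percentile-based limit: {limit_percentile:.0f}, Poisson-fit limit: {limit_model:.0f}")
```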

Evaluation of Life Expectancy of Power System Equipment Using Probability Distribution (확률분포를 이용한 전력설비의 기대여명 추정)

  • Kim, Gwang-Won;Hyun, Seung-Ho
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.22 no.10
    • /
    • pp.49-55
    • /
    • 2008
  • This paper presents a novel method for evaluating the life expectancy of power system equipment. Life expectancy means the expected remaining lifetime; it can be usefully applied to maintenance planning, equipment replacement planning, and reliability assessment. The proposed method consists of three steps. First, the cumulative probability of retirement over future years is evaluated for the targeted age. Second, the cumulative probability is modeled by a well-known cumulative distribution function (CDF) such as the Weibull distribution. Last, the life expectancy is evaluated as the mean value of the model. Since the model CDF is established, the proposed method can also evaluate the probability of equipment retirement within a specific number of years. The developed method is applied to examples of generators in combined cycle power plants to show its effectiveness.
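Once the retirement CDF has been fitted, the last step described above reduces to a conditional mean of the remaining life, E[T - a | T > a], computed by integrating the survival function S(t) from the current age a to infinity and dividing by S(a). A hedged sketch with assumed Weibull parameters follows.

```python
# Hedged sketch (assumed Weibull parameters): life expectancy of equipment that
# has survived to a given age, from the fitted retirement CDF.
import numpy as np
from scipy.integrate import quad
from scipy.stats import weibull_min

shape, scale = 3.2, 35.0    # assumed fitted Weibull parameters (years)
age = 20.0                  # current age of the generator (years)

S = lambda t: weibull_min.sf(t, shape, scale=scale)    # survival function of the fitted model
remaining, _ = quad(S, age, np.inf)                    # integral of S(t) over the remaining life
print(f"life expectancy at age {age:.0f}: {remaining / S(age):.1f} more years")
```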

Reliability and Accuracy of the Deployable Particulate Impact Sampler for Application to Spatial PM2.5 Sampling in Seoul, Korea (서울시 PM2.5 공간 샘플링을 위한 Deployable Particulate Impact Sampler의 성능 검증 연구)

  • Oh, Gyu-Lim;Heo, Jong-Bae;Yi, Seung-Muk;Kim, Sun-Young
    • Journal of Korean Society for Atmospheric Environment
    • /
    • v.33 no.3
    • /
    • pp.277-288
    • /
    • 2017
  • Previous studies of the health effects of PM2.5 performed spatial monitoring campaigns to assess the spatial variability of PM2.5 across people's residences. Highly reliable, portable, and cost-effective samplers are useful for such campaigns. This study aimed to investigate the applicability of the Deployable Particulate Impact Sampler (DPIS), one of the compact impact samplers, to spatial monitoring campaigns of PM2.5 in Seoul, Korea. The investigation focused on the consistency of PM2.5 concentrations measured by DPISs compared to those measured by the Low-volume Cyclone sampler (LCS). The LCS has operated at a fixed site on the Seoul National University Yeongeon campus, Seoul, Korea since 2003 and has provided qualified PM2.5 data. PM2.5 sampling with DPISs was carried out at the same site from November 17, 2015 through February 3, 2016. PM2.5 concentrations were quantified by the gravimetric method. Using a duplicated DPIS, we confirmed the reliability of the DPIS by computing relative precision and a mean square error-based R squared value (R²). Relative precision was one minus the difference of measurements between the two samplers relative to their sum. For accuracy, we compared PM2.5 concentrations from four DPISs (DPIS_Tg, DPIS_To, DPIS_Qg, and DPIS_Qo) to those of the LCS. The four samplers combined two types of collection filters (Teflon, T; quartz, Q) and impaction discs (glass fiber filter, g; pre-oiled porous plastic disc, o). We assessed accuracy using the accuracy value, which is one minus the difference between DPIS and LCS PM2.5 relative to LCS PM2.5, in addition to R². The DPIS showed high reliability (average precision = 97.28%, R² = 0.98). Accuracy was generally high for all DPISs (average accuracy = 83.78~88.88%, R² = 0.89~0.93) except for DPIS_Qg (77.35~78.35%, 0.82~0.84). The high accuracy of the DPIS compared to the LCS suggests that the DPIS will help assess people's individual exposure to PM2.5 in extensive spatial monitoring campaigns.
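The precision and accuracy definitions quoted in the abstract are simple enough to restate in a few lines; the sketch below applies them to hypothetical concentration values (the numbers are placeholders, not the campaign's measurements).

```python
# Agreement metrics as defined in the abstract, applied to hypothetical values.
import numpy as np

dpis_a = np.array([28.4, 31.0, 45.2, 22.7, 38.9])   # ug/m3, duplicate DPIS no. 1 (hypothetical)
dpis_b = np.array([27.9, 31.8, 44.1, 23.1, 39.6])   # ug/m3, duplicate DPIS no. 2 (hypothetical)
lcs    = np.array([27.5, 32.4, 46.0, 24.0, 40.3])   # ug/m3, reference LCS (hypothetical)

precision = 1.0 - np.abs(dpis_a - dpis_b) / (dpis_a + dpis_b)   # 1 - |difference| / sum, per sample
accuracy  = 1.0 - np.abs(dpis_a - lcs) / lcs                    # 1 - |DPIS - LCS| / LCS, per sample
r2 = 1.0 - np.sum((dpis_a - lcs) ** 2) / np.sum((lcs - lcs.mean()) ** 2)   # MSE-based R squared

print(f"mean precision {precision.mean():.1%}, mean accuracy {accuracy.mean():.1%}, R^2 {r2:.2f}")
```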

Development of an Efficient Optimization Technique for Robust Design by Approximating Probability Constraints (확률조건의 근사화를 통한 효율적인 강건 최적설계 기법의 개발)

  • Jeong, Do-Hyeon;Lee, Byeong-Chae
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.24 no.12
    • /
    • pp.3053-3060
    • /
    • 2000
  • An alternative formulation is presented for robust optimization problems, and an efficient computational scheme for reliability estimation is proposed. Both design variables and design parameters are considered as random variables about their nominal values. To ensure the robustness of the objective performance, a new cost function bounding the performance and a new constraint limiting the performance variation are introduced. The constraint variations are regulated by considering the probability of feasibility. Each probability constraint is transformed into a sub-optimization problem and then resolved with the modified advanced first-order second-moment (AFOSM) method for computational efficiency. The proposed robust optimization method has the advantages that the mean value and the variation of the performance function are controlled simultaneously and that second-order sensitivity information is not required even for gradient-based optimization. The suggested method is examined by solving three examples, and the results are compared with those of the deterministic case and those available in the literature.
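The abstract refers to a modified AFOSM step without detailing it; as a rough illustration of the classical AFOSM sub-problem it builds on, the sketch below searches the most probable point in standard-normal space for a hypothetical constraint (the performance function, nominal values, and standard deviations are all assumptions, not the paper's).

```python
# Rough AFOSM-style sketch (classical form, hypothetical constraint): the
# reliability index beta is the distance from the origin to the limit state
# g = 0 in standard-normal space.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

mu    = np.array([5.0, 3.0])    # assumed nominal values of the random design variables
sigma = np.array([0.4, 0.3])    # assumed standard deviations

def g(x):                       # hypothetical performance constraint, feasible when g > 0
    return x[0] * x[1] - 12.0

# Most probable point search: minimize ||u||^2 subject to g(mu + sigma*u) = 0.
res = minimize(lambda u: float(np.dot(u, u)), x0=np.zeros(2),
               constraints={"type": "eq", "fun": lambda u: g(mu + sigma * u)})
beta = float(np.sqrt(res.fun))
print(f"beta = {beta:.2f}, probability of constraint violation = {norm.cdf(-beta):.4f}")
```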

Radiation Flux Impact in High Density Residential Areas - A Case Study from Jungnang area, Seoul - (고밀도 주거지역에서의 복사플럭스 영향 연구 - 서울시 중랑구 지역을 대상으로 -)

  • YI, Chae-Yeon;KWON, Hyuk-Gi;Lindberg, Fredrik
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.21 no.4
    • /
    • pp.26-49
    • /
    • 2018
  • The purpose of this study was to verify the reliability of the solar radiation model and discuss its applicability to the urban area of Seoul for summer heat-stress mitigation. We extended the study area closer to the city scale and enhanced the spatial resolution sufficiently to determine pedestrian-level urban radiance. The domain was a 4 km² residential area with high-rise building sites. Radiance modelling (SOLWEIG) was performed with LiDAR (Light Detection and Ranging)-based detailed geomorphological land cover shapes. The radiance model was evaluated against surface energy balance (SEB) observations and showed the highest accuracy on a clear summer day. When the mean radiation temperature (MRT) was simulated, the highest values occurred over low-rise building areas and road surfaces with little shadow effect; in contrast, high-rise building and vegetated areas, where the effect of shadows was large, showed relatively low mean radiation temperatures. The method proposed in this study exhibits high reliability for managing heat stress in urban areas at pedestrian height. It is applicable to many urban micro-climate management functions related to natural and artificial urban settings, for example, when new urban infrastructure is planned.