• Title/Summary/Keyword: failure intensity function


A Light Exposure Correction Algorithm Using Binary Image Segmentation and Adaptive Fusion Weights (이진화 영상분할기법과 적응적 융합 가중치를 이용한 광노출 보정기법)

  • Han, Kyu-Phil
    • Journal of Korea Multimedia Society
    • /
    • v.24 no.11
    • /
    • pp.1461-1471
    • /
    • 2021
  • This paper presents a light exposure correction algorithm for unpleasing images acquired under a light-metering failure. Conventional tone mapping and gamma correction methods adopt a function mapping with the same input and output range, so their results are pleasing when the intensity distribution is nearly symmetric about its average; for asymmetric cases, however, the correction is insufficient in either the bright or the dark regions. Histogram modification approaches work well on images with varied patterns, but they generate unintended noise in flat regions because they forcibly shift the intensity distribution. To correct both bright and dark areas sufficiently, the proposed algorithm therefore calculates gamma coefficients from primary parameters extracted from the global distribution, and the fusion weights are determined adaptively from complementary parameters using the classification information of a binary segmentation. As a result, the proposed algorithm yields good outputs for images with both symmetric and asymmetric distributions, even at severe exposure values.
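As a rough illustration of the kind of correction the abstract describes, the sketch below (Python/NumPy) applies two gamma mappings driven by the global intensity average and fuses them with weights tied to a binary bright/dark mask. The gamma formulas, the weight rule, and the synthetic image are assumptions made here for illustration, not the authors' parameterization.

```python
import numpy as np

def exposure_correct(img, mask):
    """Illustrative exposure correction: two gamma-mapped versions of the
    input are fused with weights tied to a binary bright/dark segmentation.
    `img` is a float array in [0, 1]; `mask` is 1 for bright pixels, 0 for dark."""
    mean = img.mean()
    # Hypothetical gamma coefficients driven by the global intensity average:
    gamma_down = 1.0 + mean        # exponent > 1: darkens, for over-exposed regions
    gamma_up = 1.0 / (2.0 - mean)  # exponent < 1: brightens, for under-exposed regions
    darker = img ** gamma_down
    brighter = img ** gamma_up
    # Adaptive fusion weight: bright (mask=1) pixels lean on the darkened
    # version, dark (mask=0) pixels on the brightened one, softened by the
    # local intensity itself.
    w = 0.5 * (mask + img)
    return np.clip(w * darker + (1.0 - w) * brighter, 0.0, 1.0)

# Example: a synthetic under-exposed gradient image
img = np.tile(np.linspace(0.0, 0.4, 256), (64, 1))
mask = (img > img.mean()).astype(float)
out = exposure_correct(img, mask)
print(round(img.mean(), 3), round(out.mean(), 3))
```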

The Comparative Study of Software Optimal Release Time of Finite NHPP Model Considering Property of Nonlinear Intensity Function (비선형 강도함수 특성을 이용한 유한고장 NHPP모형에 근거한 소프트웨어 최적방출시기 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence
    • /
    • v.11 no.9
    • /
    • pp.159-166
    • /
    • 2013
  • This paper studies the decision problem of optimal release policies: when to stop testing a software system in the development phase and transfer it to the user. Assuming faults are corrected or modified under a finite-failure non-homogeneous Poisson process model, release policies are proposed for a half-logistic life distribution, which is widely used in reliability because of its flexible shape and scale parameters. Optimal software release policies are discussed that minimize the total average software cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, the parameters are estimated by maximum likelihood from failure-time data and the optimal software release time is computed. Using the release time as prior information, potential security damage can be reduced.
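The cost-based release-time idea can be sketched as follows, assuming a finite-failure NHPP whose mean value function is m(t) = a·F(t) with a half-logistic F, and a simple three-term cost model (in-test fix cost, field fix cost, testing cost per unit time). The cost coefficients and parameter values below are hypothetical, not the paper's data.

```python
import numpy as np

def m(t, a, sigma):
    """Mean value function of a finite-failure NHPP with a half-logistic
    failure-time distribution: m(t) = a * (1 - exp(-t/sigma)) / (1 + exp(-t/sigma))."""
    z = np.exp(-t / sigma)
    return a * (1.0 - z) / (1.0 + z)

def total_cost(T, a, sigma, c1=1.0, c2=5.0, c3=0.1, life_cycle=100.0):
    """Illustrative cost model: fixing faults during testing (c1), fixing the
    remaining faults in the field (c2 > c1), plus testing effort (c3 per unit time)."""
    return c1 * m(T, a, sigma) + c2 * (m(life_cycle, a, sigma) - m(T, a, sigma)) + c3 * T

# Grid search for the release time that minimizes the total expected cost
ts = np.linspace(0.1, 100.0, 2000)
costs = total_cost(ts, a=30.0, sigma=20.0)
print("optimal release time ~", round(ts[np.argmin(costs)], 2))
```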

A BAYESIAN APPROACH FOR A DECOMPOSITION MODEL OF SOFTWARE RELIABILITY GROWTH USING A RECORD VALUE STATISTICS

  • Choi, Ki-Heon;Kim, Hee-Cheul
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.1
    • /
    • pp.243-252
    • /
    • 2001
  • The failure points of a decomposition process are defined as the union of the failure points of two component point processes for software reliability systems. Because sampling directly from the likelihood function of the decomposition model is difficult, a Gibbs sampler can be applied in a straightforward manner. A Markov chain Monte Carlo method with data augmentation is developed to compute the features of the posterior distribution. For model determination, we explore the prequential conditional predictive ordinate criterion, which selects the model with the largest posterior likelihood among models built from all possible subsets of the component intensity functions. A numerical example with a simulated data set is given.
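One step of such a data-augmentation scheme can be sketched as below: given candidate component intensities, each failure epoch is attributed to a component with probability proportional to that component's intensity at the epoch. The two intensity forms (a constant-rate component and a decaying, Goel-Okumoto-style component) and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lam1(t, beta1):
    """Hypothetical component 1: constant failure intensity."""
    return np.full_like(t, beta1)

def lam2(t, a, b):
    """Hypothetical component 2: exponentially decaying intensity a*b*exp(-b*t)."""
    return a * b * np.exp(-b * t)

def sample_latent_labels(times, beta1, a, b):
    """Data-augmentation step: for each failure epoch, draw the latent
    indicator of the component that generated it, with probability
    proportional to that component's intensity at the epoch."""
    p1 = lam1(times, beta1) / (lam1(times, beta1) + lam2(times, a, b))
    return rng.binomial(1, p1)  # 1 -> component 1, 0 -> component 2

times = np.sort(rng.uniform(0, 50, size=30))  # simulated failure epochs
labels = sample_latent_labels(times, beta1=0.3, a=20.0, b=0.1)
print("epochs attributed to component 1:", labels.sum())
```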

Probabilistic sensitivity analysis of multi-span highway bridges

  • Bayat, M.;Daneshjoo, F.;Nistico, N.
    • Steel and Composite Structures
    • /
    • v.19 no.1
    • /
    • pp.237-262
    • /
    • 2015
  • In this study, different intensity measures are compared for evaluating the nonlinear response of a bridge structure. The paper presents the analytical seismic fragility of a three-span concrete girder highway bridge; the bridge modeling parameters are described in detail and the model is verified. A fragility function relates the intensity of the ground motion to the probability of exceeding a certain damage state. Incremental dynamic analysis (IDA) is applied to the bridge for medium to strong ground motions, using a suite of 20 earthquake records covering a range of PGAs in the nonlinear dynamic analyses. Comprehensive sensitivity analyses of the bridge response are carried out, and the efficiency and practicality of the candidate intensity measures are studied, accounting for their sensitivity to the bridge period, in order to identify a proficient intensity measure for this type of structure. A three-dimensional finite element (FE) model of the bridge is developed and analyzed. The numerical results show that the bridge response is very sensitive to the ground motions when PGA and Sa(Ti, 5%) are used as the intensity measure (IM), and that the failure probability of the bridge system is dominated by the bridge piers.
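A common way to compare IMs for efficiency, in line with what the abstract describes, is to regress the log of the structural demand on the log of each candidate IM and compare the residual dispersions (smaller dispersion means a more efficient IM). The sketch below does this on synthetic IDA-style data; the record counts, drift model and all numbers are hypothetical, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(1)

def im_efficiency(im, edp):
    """Efficiency of an intensity measure: dispersion (std of residuals) of a
    log-log regression of structural demand on the IM. Smaller is better."""
    x, y = np.log(im), np.log(edp)
    slope, intercept = np.polyfit(x, y, 1)
    return np.std(y - (intercept + slope * x))

# Hypothetical IDA results for 20 records: pier drift demand vs two candidate IMs
pga = rng.lognormal(mean=-1.0, sigma=0.5, size=20)
sa_t1 = pga * rng.lognormal(mean=0.0, sigma=0.2, size=20)   # correlated spectral IM
drift = 0.02 * sa_t1 ** 1.1 * rng.lognormal(0.0, 0.15, 20)  # demand driven mostly by Sa(T1)
print("dispersion | PGA   :", round(im_efficiency(pga, drift), 3))
print("dispersion | Sa(T1):", round(im_efficiency(sa_t1, drift), 3))
```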

Design of bivariate step-stress partially accelerated degradation test plan using copula and gamma process

  • Srivastava, P.W.;Manisha, Manisha;Agarwal, M.L.
    • International Journal of Reliability and Applications
    • /
    • v.17 no.1
    • /
    • pp.21-49
    • /
    • 2016
  • Many mechanical, electrical and electronic products have more than one performance characteristic (PC). For example, the performance degradation of rubidium discharge lamps can be characterized by the rubidium consumption or by the decreasing intensity of the lamp. A product may degrade in all of its PCs, which may be independent or dependent. This paper deals with the design of an optimal bivariate step-stress partially accelerated degradation test (PADT) with degradation paths modelled by a gamma process; the dependency between the PCs is modelled through a Frank copula function. In partial step-stress loading, the unit is tested at the usual stress for some time and then the stress is accelerated, which helps prevent over-stressing of the test specimens. Failure occurs when a performance characteristic first crosses its critical value. Under a constraint on total experimental cost, the optimal test duration and the optimal number of inspections at each intermediate stress level are obtained using a variance optimality criterion.
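The degradation model can be sketched as two gamma-process paths whose per-step increments are coupled through a Frank copula, sampled here with the standard conditional-distribution method. The shape, scale, copula parameter and failure threshold below are placeholders, not the paper's design values.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(2)

def frank_pair(n, theta):
    """Sample n dependent uniform pairs from a Frank copula by inverting the
    conditional CDF (conditional-distribution sampling)."""
    u = rng.uniform(size=n)
    p = rng.uniform(size=n)
    a, d = np.exp(-theta * u), np.exp(-theta)
    v = -np.log(1.0 + p * (d - 1.0) / (a - p * (a - 1.0))) / theta
    return u, v

def degradation_paths(n_steps, shape, scale, theta):
    """Two gamma-process degradation paths whose per-step increments are
    coupled through a Frank copula; cumulative sums give the paths."""
    u, v = frank_pair(n_steps, theta)
    inc1 = gamma.ppf(u, a=shape, scale=scale)
    inc2 = gamma.ppf(v, a=shape, scale=scale)
    return np.cumsum(inc1), np.cumsum(inc2)

path1, path2 = degradation_paths(n_steps=50, shape=1.5, scale=0.2, theta=5.0)
threshold = 10.0  # hypothetical critical value for the first PC
fail_step = np.argmax(path1 > threshold) if (path1 > threshold).any() else None
print("first crossing step, PC1:", fail_step)
```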

Estimation Method of Key Block Size on a Large Scale Rock Slope by Simulation of 3-D Rock Joint System (3차원 절리계 모사를 통한 대규모 암반비탈면 파괴블록크기 추정방법)

  • Kim, Dong-Hee;Jung, Hyuk-Il;Kim, Seok-Ki;Lee, Woo-Jin;Ryu, Dong-Woo
    • Journal of the Korean Geotechnical Society
    • /
    • v.23 no.10
    • /
    • pp.97-107
    • /
    • 2007
  • Evaluating slope stability by assuming the failure block to be the entire slope is appropriate for small-scale slopes, but not for large-scale slopes. An appropriate estimate of the failure block size is therefore required, since the safety factor and the joint strength parameters are functions of the failure block size. In this paper, the failure block size is investigated by generating a 3-dimensional rock joint system based on statistical joint data obtained from the studied slope, such as joint orientation, spacing and 3-dimensional joint intensity. The results indicate that 33 potential failure blocks exist in the studied slope, with block heights ranging from 1.4 m to 38.7 m and an average of 15.2 m. In addition, the data obtained from the 3-dimensional joint system are directly applicable to probability analysis and to 2- and 3-dimensional discontinuity analyses.
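A much-simplified 1-D analogue of such a stochastic joint-system simulation is sketched below: joint positions along the slope height follow a Poisson process (exponential spacings), and the largest gap between adjacent joints is reported as the critical block size. This ignores joint orientation and 3-D intersections and only illustrates the Monte Carlo idea; the mean spacing and slope height are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_block_heights(slope_height, mean_spacing, n_realizations=1000):
    """1-D analogue of a stochastic joint system: generate exponential joint
    spacings along the slope height and take the largest gap between adjacent
    joints as the critical block size for each realization."""
    sizes = []
    for _ in range(n_realizations):
        spacings = rng.exponential(mean_spacing, size=int(3 * slope_height / mean_spacing))
        joints = np.cumsum(spacings)
        joints = joints[joints < slope_height]
        edges = np.concatenate(([0.0], joints, [slope_height]))
        sizes.append(np.max(np.diff(edges)))
    return np.array(sizes)

sizes = simulate_block_heights(slope_height=60.0, mean_spacing=5.0)
print("mean / max critical block size (m):", round(sizes.mean(), 1), round(sizes.max(), 1))
```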

A software reliability model with a Burr Type III fault detection rate function

  • Song, Kwang Yoon;Chang, In Hong;Choi, Min Su
    • International Journal of Reliability and Applications
    • /
    • v.17 no.2
    • /
    • pp.149-158
    • /
    • 2016
  • We enjoy a very comfortable life thanks to modern civilization; however, that comfort is not guaranteed. Developing a software system is a difficult and complex process, so the main focus of software development is on improving the reliability and stability of the software system, and this has motivated the development of software reliability models. NHPP software reliability models are built from the fault intensity (failure intensity) function and the mean value function within a controlled testing environment to estimate reliability metrics such as the number of residual faults, the failure rate, and the reliability of the software. In this paper, we present a new NHPP software reliability model with a Burr Type III fault detection rate, and we compare the goodness-of-fit of the proposed model and other NHPP models on two software testing datasets. The results show that the proposed model fits significantly better than the other NHPP software reliability models.
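For reference, a finite-failure NHPP built on a Burr Type III detection-time distribution has the closed forms sketched below: mean value function m(t) = a·F(t) and failure intensity λ(t) = a·f(t). The parameter values are illustrative, and the paper's exact parameterization may differ.

```python
import numpy as np

def burr3_cdf(t, c, k):
    """Burr Type III CDF: F(t) = (1 + t^(-c))^(-k)."""
    return (1.0 + t ** (-c)) ** (-k)

def mean_value(t, a, c, k):
    """Finite-failure NHPP mean value function m(t) = a * F(t)."""
    return a * burr3_cdf(t, c, k)

def intensity(t, a, c, k):
    """Failure intensity lambda(t) = a * f(t), the derivative of m(t)."""
    return a * c * k * t ** (-c - 1.0) * (1.0 + t ** (-c)) ** (-k - 1.0)

t = np.linspace(0.1, 50.0, 6)
print(np.round(mean_value(t, a=100.0, c=2.0, k=1.5), 2))
print(np.round(intensity(t, a=100.0, c=2.0, k=1.5), 3))
```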

Numerical Fracture Mechanics Evaluation on Surface Cracks in a Spherical Oxygen Holder (구형 산소용기 내 표면균열에 대한 수치파괴역학 평가)

  • Cho, Doo-Ho;Kim, Jong-Min;Chang, Yoon-Suk;Choi, Jae-Boong;Kim, Young-Jin;Han, Sang-In
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.33 no.11
    • /
    • pp.1187-1194
    • /
    • 2009
  • During the last decade, the likelihood of flaw occurrence has increased rapidly worldwide as the operating times of petrochemical facilities have grown. For instance, a recent in-service inspection detected three surface cracks of different sizes in the welded parts of a spherical oxygen holder in Korea. While the API 579 code provides engineering assessment procedures to determine crack driving forces, in the present work numerical analyses are carried out for the cracked oxygen holder to investigate the effects of the complex geometry, the analysis model and the residual stress. In the detailed finite element analysis, stress intensity factors are determined from both the full three-dimensional model and an equivalent plate model; as an alternative, stress intensity factors for the equivalent plate model are also calculated using the well-known influence stress function technique. Finally, a parametric structural integrity evaluation of the cracked oxygen holder is conducted using the failure assessment diagram method, the J/T method and the DPFAD method. The effects of geometry and other factors are examined, and the key findings from the simulations are discussed, enabling practical safety margins to be determined for spherical components containing a defect.
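A minimal failure-assessment-diagram check, in the spirit of the R6 / API 579 Option 1 curve, is sketched below. The stress intensity factor, toughness and stress values are hypothetical and are not taken from the oxygen holder analysis; the Lr,max cut-off check is omitted for brevity.

```python
import numpy as np

def fad_limit(lr):
    """Option-1 failure assessment curve:
    Kr_limit = (1 - 0.14*Lr^2) * (0.3 + 0.7*exp(-0.65*Lr^6))."""
    return (1.0 - 0.14 * lr ** 2) * (0.3 + 0.7 * np.exp(-0.65 * lr ** 6))

def assess(k_i, k_mat, sigma_ref, sigma_y):
    """Place an assessment point (Lr, Kr) on the FAD and report acceptability.
    K values in MPa*sqrt(m), stresses in MPa."""
    lr = sigma_ref / sigma_y
    kr = k_i / k_mat
    return lr, kr, kr <= fad_limit(lr)

# Hypothetical numbers for one surface crack:
lr, kr, ok = assess(k_i=40.0, k_mat=120.0, sigma_ref=150.0, sigma_y=350.0)
print(f"Lr={lr:.2f}, Kr={kr:.2f}, acceptable={ok}")
```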

Fragility Analysis of Staggered Wall Structures (격간벽 구조의 취약도 해석)

  • Beak, Donggirl;Kwon, Kwangho;Kim, Jinkoo
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.25 no.5
    • /
    • pp.397-404
    • /
    • 2012
  • Fragility curves show the probability of a system reaching a limit state as a function of some measure of seismic intensity. To obtain fragility curves of six- and twelve-story staggered wall structures with a middle corridor, incremental dynamic analyses were carried out using twenty-two pairs of earthquake records, and the failure probabilities at various intensities of seismic load were investigated. The performance of staggered wall structures with columns added along the central corridor, and of structures with their first-story walls replaced by columns, was compared with that of regular staggered wall structures. Based on the analysis results, it was concluded that staggered wall structures with central columns have the largest safety margin at the same level of seismic load.
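A typical way to turn IDA results into a fragility curve is to fit a lognormal CDF to the intensity levels at which each record first drives the structure to the limit state. The sketch below does this on synthetic collapse intensities; the 22-record count echoes the abstract, but the numbers are simulated here and are not the study's results.

```python
import numpy as np
from scipy.stats import norm

def fit_fragility(im_at_failure):
    """Fit a lognormal fragility curve to the IM levels at which each record
    first caused the limit state (moments of ln IM: median and dispersion)."""
    ln_im = np.log(im_at_failure)
    median, beta = np.exp(ln_im.mean()), ln_im.std(ddof=1)
    return median, beta

def fragility(im, median, beta):
    """P(limit state | IM = im) as a lognormal CDF."""
    return norm.cdf(np.log(im / median) / beta)

# Hypothetical collapse intensities from 22 record pairs of an IDA study
rng = np.random.default_rng(4)
im_collapse = rng.lognormal(mean=np.log(0.8), sigma=0.35, size=22)
median, beta = fit_fragility(im_collapse)
print("median IM:", round(median, 2), " dispersion:", round(beta, 2))
print("P(failure | Sa=1.0g) =", round(fragility(1.0, median, beta), 2))
```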

Bayesian Computation for Superposition of MUSA-OKUMOTO and ERLANG(2) processes (MUSA-OKUMOTO와 ERLANG(2)의 중첩과정에 대한 베이지안 계산 연구)

  • Choi, Ki-Heon;Kim, Hee-Cheul
    • The Korean Journal of Applied Statistics
    • /
    • v.11 no.2
    • /
    • pp.377-387
    • /
    • 1998
  • A Markov chain Monte Carlo method with data augmentation is developed to compute the features of the posterior distribution. For each observed failure epoch, we introduce latent variables that indicate which component of the superposition model generated it. This data augmentation approach facilitates specification of the transition measure in the Markov chain, and Metropolis algorithms along with Gibbs steps are proposed to perform the Bayesian inference for such models. For model determination, we explore the prequential conditional predictive ordinate (PCPO) criterion, which selects the model with the largest posterior likelihood among models built from all possible subsets of the component intensity functions. To relax the assumption of a monotonic intensity function, we consider in this paper the superposition of Musa-Okumoto and Erlang(2) models. A numerical example with a simulated dataset is given.
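For reference, the superposed failure intensity combines the Musa-Okumoto (logarithmic Poisson) intensity with a finite-failure Erlang(2) intensity, as sketched below; the parameter values are placeholders, not the paper's estimates.

```python
import numpy as np

def musa_okumoto_intensity(t, lam0, theta):
    """Musa-Okumoto (logarithmic Poisson) intensity: lam0 / (1 + lam0*theta*t)."""
    return lam0 / (1.0 + lam0 * theta * t)

def erlang2_intensity(t, a, b):
    """Finite-failure NHPP intensity with an Erlang(2) fault-detection time:
    lambda(t) = a * b^2 * t * exp(-b*t)."""
    return a * b ** 2 * t * np.exp(-b * t)

def superposed_intensity(t, lam0, theta, a, b):
    """Superposition model: the overall failure intensity is the sum of the
    two component intensities."""
    return musa_okumoto_intensity(t, lam0, theta) + erlang2_intensity(t, a, b)

t = np.linspace(0.0, 20.0, 5)
print(np.round(superposed_intensity(t, lam0=2.0, theta=0.1, a=15.0, b=0.3), 3))
```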
