• Title/Summary/Keyword: Cumulative distribution function


ON CONSISTENCY OF SOME NONPARAMETRIC BAYES ESTIMATORS WITH RESPECT TO A BETA PROCESS BASED ON INCOMPLETE DATA

  • Hong, Jee-Chang;Jung, In-Ha
    • The Pure and Applied Mathematics
    • /
    • v.5 no.2
    • /
    • pp.123-132
    • /
    • 1998
  • Let F and G denote the distribution functions of the failure times and the censoring variables in a random censorship model. Susarla and Van Ryzin (1978) verified consistency of $F_{\alpha}$, the NPBE of F with respect to the Dirichlet process prior $D(\alpha)$, under the assumption that F and G are continuous. Assuming that A, the cumulative hazard function, is distributed according to a beta process with parameters c, $\alpha$, Hjort (1990) obtained the Bayes estimator $A_{c,\alpha}$ of A under a squared error loss function. Using the product-integral theory developed by Gill and Johansen (1990), the Bayes estimator $F_{c,\alpha}$ is recovered from $A_{c,\alpha}$. The continuity assumption on F and G is removed in our proof of the consistency of $A_{c,\alpha}$ and $F_{c,\alpha}$. Our result extends Susarla and Van Ryzin (1978), since a particular transform of a beta process is a Dirichlet process and the class of beta processes is much larger than the class of Dirichlet processes.
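The product-integral step mentioned in the abstract, recovering a distribution function from a cumulative hazard, can be illustrated with a minimal sketch: the discrete relation is $F(t) = 1 - \prod_{s \le t}\bigl(1 - \Delta A(s)\bigr)$. The code below applies it to Nelson-Aalen-type hazard increments computed from a small hypothetical censored sample; it shows only the product-integral mechanics, not Hjort's beta-process posterior, and all data and names are illustrative.

```python
import numpy as np

def product_integral_cdf(times, hazard_increments):
    """Recover a distribution function from discrete cumulative-hazard
    increments via the product-integral F(t) = 1 - prod_{s<=t}(1 - dA(s))."""
    survival = np.cumprod(1.0 - np.asarray(hazard_increments))
    return times, 1.0 - survival

# Hypothetical right-censored sample (sorted times, event indicators); not from the paper.
times = np.array([2.0, 3.0, 5.0, 7.0, 11.0])
events = np.array([1, 1, 0, 1, 1])          # 0 = censored observation
at_risk = np.arange(len(times), 0, -1)      # subjects still at risk at each time
dA = events / at_risk                       # Nelson-Aalen-type increments
t, F_hat = product_integral_cdf(times, dA)
print(np.column_stack([t, F_hat]))          # coincides with the Kaplan-Meier CDF here
```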


Statistical Characteristics of Response Consistency Parameters in Analytic Hierarchy Process (AHP에서의 응답일관성 모수의 통계적 특성과 활용 방안)

  • 고길곤;이경전
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.26 no.4
    • /
    • pp.71-82
    • /
    • 2001
  • Using the computer simulation method, we investigate the probability distribution of the maximum eigenvalue of the pair-wise comparison matrix, which has been used as a parameter for measuring the consistency of responses in the analytic hierarchy process (AHP). We show that the shape of the distribution of the maximum eigenvalue differs according to the dimension of the matrix. In addition, we cannot find any evidence that the distribution of the Consistency Index is a Normal distribution, as has been claimed in the previous literature. Accordingly, we suggest using the so-called K-index, calculated based on the concept of the cumulative distribution function rather than on that of the arithmetic mean, because the probability distribution cannot be assumed to be Normal. We interpret the simulation results by comparing them with the suggestion of Saaty[11]. Our results show that using Saaty's value could be too generous when the dimension of the matrix is 3 and too strict when it is 4 or more. Finally, we propose new criteria for measuring response consistency in AHP.
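A minimal simulation in the spirit of the experiment described above: generate random reciprocal pairwise-comparison matrices on Saaty's 1-9 scale, compute the maximum eigenvalue and the Consistency Index $CI = (\lambda_{max} - n)/(n - 1)$, and read a cutoff off the empirical distribution rather than assuming normality. The percentile used and all names are illustrative assumptions, not the authors' K-index definition.

```python
import numpy as np

def random_pairwise_matrix(n, rng):
    """Random reciprocal matrix with upper-triangular entries drawn from
    Saaty's scale {1/9, ..., 1/2, 1, 2, ..., 9}."""
    scale = np.concatenate([1.0 / np.arange(2, 10), [1.0], np.arange(2, 10)])
    A = np.ones((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            A[i, j] = rng.choice(scale)
            A[j, i] = 1.0 / A[i, j]
    return A

def consistency_index(A):
    """CI = (lambda_max - n) / (n - 1) from the principal eigenvalue."""
    lam_max = np.max(np.real(np.linalg.eigvals(A)))
    n = A.shape[0]
    return (lam_max - n) / (n - 1)

rng = np.random.default_rng(0)
n, trials = 4, 5000
ci = np.array([consistency_index(random_pairwise_matrix(n, rng)) for _ in range(trials)])
# Empirical-CDF-based cutoff (illustrative percentile, in the spirit of the K-index):
print("10th-percentile CI for n = 4:", np.quantile(ci, 0.10))
```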


Estimation of sewer deterioration by Weibull distribution function (와이블 분포함수를 이용한 하수관로 노후도 추정)

  • Kang, Byongjun;Yoo, Soonyu;Park, Kyoohong
    • Journal of Korean Society of Water and Wastewater
    • /
    • v.34 no.4
    • /
    • pp.251-258
    • /
    • 2020
  • Sewer deterioration models are needed to forecast the remaining life expectancy of sewer networks by assessing their conditions. In this study, the occurrence probability of a serious defect (condition state 3, CS3), at which a sewer rehabilitation program should be implemented, was evaluated using four probability distribution functions: normal, lognormal, exponential, and Weibull. A sample of 252 km of CCTV-inspected sewer pipe data in city Z was first collected. The effective data (284 sewer sections, 8.15 km) with reliable information were then extracted and classified into 3 groups considering the sub-catchment area, sewer material, and sewer pipe size. The Anderson-Darling test was conducted, and the Weibull distribution was selected as the best-fitting probability distribution for sewer defect occurrence. The shape parameters (β) and scale parameters (η) of the Weibull distribution were estimated from the data sets of the 3 classified groups, together with standard errors, 95% confidence intervals, and log-likelihood values. Plots of the probability density function and cumulative distribution function were obtained using the estimated parameter values, which could be used to indicate the quantitative level of risk of CS3 occurrence. Sewer data groups 1, 2, and 3 were estimated to exceed a 50% CS3 occurrence probability at the 13th, 11th, and 16th year after installation, respectively. For all data groups, the time at which the CS3 occurrence probability exceeds 90% was predicted to be the 27th to 30th year after installation.
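As a rough illustration of how a fitted Weibull model yields ages of the kind quoted above, the sketch below inverts the Weibull CDF to find the age at which the CS3 occurrence probability reaches a given level, via $t_p = \eta\,(-\ln(1-p))^{1/\beta}$. The shape and scale parameters are hypothetical placeholders, since the paper's fitted β and η values are not reproduced in the abstract.

```python
import numpy as np

def weibull_cdf(t, beta, eta):
    """Cumulative probability of a serious defect (CS3) by age t (years)."""
    return 1.0 - np.exp(-(t / eta) ** beta)

def age_at_probability(p, beta, eta):
    """Invert the Weibull CDF: age at which the CS3 probability reaches p."""
    return eta * (-np.log(1.0 - p)) ** (1.0 / beta)

# Hypothetical parameters, not the paper's fitted values.
beta, eta = 2.5, 15.0
for p in (0.5, 0.9):
    print(f"P(CS3) reaches {p:.0%} at about {age_at_probability(p, beta, eta):.1f} years")
```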

An Experimental Study on the Droplet Size Distribution of Sprinkler Spray for Residential Building (주거용 스프링클러 분무의 액적크기 분포에 관한 실험적 연구)

  • Kim, Sung Chan;Kim, Jung Yong
    • Journal of ILASS-Korea
    • /
    • v.20 no.3
    • /
    • pp.175-180
    • /
    • 2015
  • A series of sprinkler discharging tests was conducted to measure the droplet size and its distribution for residential fire sprinkler heads. Droplet sizes in the sprinkler spray were measured using a laser diffraction method for flush, circular, and pendent type sprinkler heads. In this study, the $D_{v,50}$ of the flush type sprinkler heads ranged between $530{\sim}1040{\mu}m$, and those of the circular and pendent types were $988{\mu}m$ and $916{\mu}m$, respectively. The measured cumulative volume distributions followed a combination of the log-normal and Rosin-Rammler distributions, which is widely used in computational fire analysis, and the parameters of the distribution function were obtained from the best-fit line through the measured data.
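For reference, a combined log-normal/Rosin-Rammler cumulative volume fraction of the kind referred to above can be written with a log-normal branch below the volume-median diameter and a Rosin-Rammler branch above it. The sketch below uses the pendent-type median from the abstract but illustrative spread parameters; it is a generic form, not the paper's fitted curve.

```python
from math import erf, exp, log, sqrt

def cumulative_volume_fraction(d, dv50, gamma=2.4, sigma=0.6):
    """Combined log-normal / Rosin-Rammler cumulative volume fraction
    (spread parameters gamma and sigma are illustrative choices)."""
    if d <= dv50:
        # log-normal branch, expressed through the error function
        return 0.5 * (1.0 + erf(log(d / dv50) / (sigma * sqrt(2.0))))
    # Rosin-Rammler branch, matched to 0.5 at the median diameter
    return 1.0 - exp(-0.693 * (d / dv50) ** gamma)

dv50 = 916.0  # micrometres, pendent-type median from the abstract
for d in (300.0, 916.0, 1500.0):
    print(f"d = {d:6.0f} um -> cumulative volume fraction {cumulative_volume_fraction(d, dv50):.3f}")
```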

Impact of Outdated CSI on the Performance of Incremental Amplify-and-Forward Opportunistic Relaying

  • Zhou, Tsingsong;Gao, Qiang;Fei, Li
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.6
    • /
    • pp.2567-2582
    • /
    • 2016
  • This paper investigates the impact of outdated channel state information (CSI) on the performance of incremental amplify-and-forward (AF) opportunistic relaying (OR) over dual-hop Rayleigh fading channels. From the definition of a distribution function, we obtain the cumulative distribution function (CDF) of the actual combined signal-to-noise ratio (SNR) received at the destination. Based on this CDF, closed-form expressions for the average spectral efficiency and outage probability are derived for incremental AF OR under outdated CSI. Numerical results show that in the region of low average SNR of the direct link, outdated CSI deteriorates the system performance, whereas in the high-SNR region it has almost no impact on the system performance.
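The link between the CDF of the received SNR and the outage probability can be illustrated with a Monte-Carlo sketch: select a relay on outdated CSI, evaluate the actual end-to-end SNR, and read the outage probability as the empirical CDF at the threshold. The model below (min-of-two-hops bound on the AF SNR, correlated Rayleigh gains, no direct link, no incremental step) is a simplified assumption for illustration, not the paper's closed-form derivation.

```python
import numpy as np

rng = np.random.default_rng(1)

def outage_probability(num_relays, rho, avg_snr, gamma_th, trials=100_000):
    """Monte-Carlo sketch: relay selection uses outdated first-hop CSI,
    outage is measured on the actual end-to-end SNR (min-of-hops bound)."""
    def rayleigh(shape):
        return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

    h_old = rayleigh((trials, num_relays))                   # outdated first-hop gain
    h_now = rho * h_old + np.sqrt(1 - rho**2) * rayleigh((trials, num_relays))
    g = rayleigh((trials, num_relays))                       # second-hop gain

    snr1_old = avg_snr * np.abs(h_old)**2
    snr1_now = avg_snr * np.abs(h_now)**2
    snr2 = avg_snr * np.abs(g)**2

    best = np.argmax(np.minimum(snr1_old, snr2), axis=1)     # selection on outdated CSI
    rows = np.arange(trials)
    snr_e2e = np.minimum(snr1_now[rows, best], snr2[rows, best])
    return np.mean(snr_e2e < gamma_th)                       # empirical CDF at gamma_th

print(outage_probability(num_relays=3, rho=0.8, avg_snr=10.0, gamma_th=3.0))
```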

OPTIMUM DESIGN OF AN AUTOMOTIVE CATALYTIC CONVERTER FOR MINIMIZATION OF COLD-START EMISSIONS USING A MICRO GENETIC ALGORITHM

  • Kim, Y.D.;Kim, W.S.
    • International Journal of Automotive Technology
    • /
    • v.8 no.5
    • /
    • pp.563-573
    • /
    • 2007
  • Optimal design of an automotive catalytic converter for minimization of cold-start emissions is numerically performed using a micro genetic algorithm for two optimization problems: optimal geometry design of the monolith for various operating conditions and optimal axial catalyst distribution. The optimal design process considered in this study consists of three modules: analysis, optimization, and control. The analysis module is used to evaluate the objective functions with a one-dimensional single channel model and the Romberg integration method. It obtains new design variables from the control module, produces the CO cumulative emissions and the integral value of a catalyst distribution function over the monolith volume, and provides objective function values to the control module. The optimal design variables for minimizing the objective functions are determined by the optimization module using a micro genetic algorithm. The control module manages the optimal design process that mainly takes place in both the analysis and optimization modules.
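A micro genetic algorithm of the kind used above keeps a very small population, relies on elitism and recombination rather than mutation, and restarts when the population converges. The sketch below shows that loop on a placeholder two-variable objective; the objective, bounds, and parameter choices are assumptions standing in for the paper's one-dimensional channel model and emission integral.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    """Placeholder for the cold-start CO emission objective
    (not the paper's channel model); minimum near x = (0.3, 0.7)."""
    return (x[0] - 0.3)**2 + (x[1] - 0.7)**2

def micro_ga(obj, bounds, pop_size=5, generations=200):
    """Minimal micro-GA sketch: tiny population, elitism, tournament
    selection, uniform crossover, restart when diversity collapses."""
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    for _ in range(generations):
        fit = np.array([obj(ind) for ind in pop])
        elite = pop[np.argmin(fit)].copy()
        # tournament selection
        parents = []
        for _ in range(pop_size):
            i, j = rng.integers(0, pop_size, size=2)
            parents.append(pop[i] if fit[i] < fit[j] else pop[j])
        parents = np.array(parents)
        # uniform crossover (classic micro-GA uses no mutation)
        mask = rng.random((pop_size, dim)) < 0.5
        pop = np.where(mask, parents, parents[rng.permutation(pop_size)])
        pop[0] = elite                                   # keep the elite
        if np.max(np.abs(pop - elite)) < 1e-2:           # converged -> restart
            pop = rng.uniform(lo, hi, size=(pop_size, dim))
            pop[0] = elite
    fit = np.array([obj(ind) for ind in pop])
    return pop[np.argmin(fit)]

best = micro_ga(objective, bounds=[(0.0, 1.0), (0.0, 1.0)])
print("best design variables:", best)
```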

An Adaptive Histogram Redistribution Algorithm Based on Area Ratio of Sub-Histogram for Contrast Enhancement (명암비 향상을 위한 서브-히스토그램 면적비 기반의 적응형 히스토그램 재분배 알고리즘)

  • Park, Dong-Min;Choi, Myung-Ruyl
    • The KIPS Transactions:PartB
    • /
    • v.16B no.4
    • /
    • pp.263-270
    • /
    • 2009
  • Histogram Equalization (HE) is a very popular technique for enhancing the contrast of an image. HE stretches the dynamic range of an image using the cumulative distribution function of the given input image, thereby improving its contrast. However, HE has a well-known problem: when it is applied for contrast enhancement, a significant change in brightness can occur. To resolve this problem, we propose an Adaptive Contrast Enhancement Algorithm using Sub-histogram Area-Ratioed Histogram Redistribution, a new method that helps reduce excessive contrast enhancement. The proposed algorithm redistributes the dynamic range of an input image using its mean luminance value and the ratio of sub-histogram areas. Experimental results show that this redistribution effectively reduces the significant change in brightness, and the output image preserves the naturalness of the original image even when it has a poor histogram distribution.
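The baseline behaviour described above, stretching the dynamic range through the input image's cumulative distribution function, corresponds to classic histogram equalization, sketched below for an 8-bit image. This is the standard HE step only, not the authors' sub-histogram area-ratio redistribution; the test image is hypothetical.

```python
import numpy as np

def histogram_equalize(image):
    """Classic histogram equalization: map each gray level through the
    normalized cumulative distribution function of the input image."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf_min = cdf[cdf > 0][0]
    # stretch the CDF onto the full dynamic range [0, 255]
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0), 0, 255)
    return lut.astype(np.uint8)[image]

# Hypothetical low-contrast 8-bit image concentrated in a narrow gray band.
rng = np.random.default_rng(3)
img = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)
out = histogram_equalize(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```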

Histogram Equalization using Gamma Transformation (감마변환을 사용한 히스토그램 평활화)

  • Chung, Soyoung;Chung, Min Gyo
    • KIISE Transactions on Computing Practices
    • /
    • v.20 no.12
    • /
    • pp.646-651
    • /
    • 2014
  • Histogram equalization generally has the disadvantage that if the gray-level distribution of an image is concentrated in one place, the gray-level range of the output image is excessively expanded, which produces a visually unnatural result. A gamma transformation can reduce such unnatural appearances since it operates in a nonlinear regime. Therefore, this paper proposes a new histogram equalization method that can improve image quality by using a gamma transformation. The proposed method 1) derives the proper form of the gamma transformation from the average brightness of the input image, 2) linearly combines this gamma transformation with the CDF (Cumulative Distribution Function) of the image to obtain a new CDF, and 3) finally performs histogram equalization using the new CDF. The experimental results show that, relative to existing methods, the proposed method provides good performance in terms of quantitative measures such as entropy, UIQ, and SSIM, and also enhances the image quality more naturally from a visual perspective.
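A minimal sketch of the idea summarized above: build a gamma curve from the average brightness, linearly blend it with the image's CDF, and use the blend as the mapping curve. The specific gamma rule (mapping the mean toward mid-gray) and the blending weight are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def gamma_cdf_equalize(image, weight=0.5):
    """Blend a brightness-derived gamma curve with the image CDF and use
    the blend as the equalization mapping (all choices illustrative)."""
    levels = np.arange(256) / 255.0
    mean = image.mean() / 255.0
    gamma = np.log(0.5) / np.log(mean)            # maps the mean toward mid-gray
    gamma_curve = levels ** gamma
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = np.cumsum(hist) / hist.sum()
    blended = weight * gamma_curve + (1.0 - weight) * cdf   # the "new CDF"
    lut = np.round(blended * 255.0).astype(np.uint8)
    return lut[image]

rng = np.random.default_rng(4)
img = rng.integers(30, 90, size=(64, 64), dtype=np.uint8)   # dark test image
out = gamma_cdf_equalize(img)
print(round(img.mean(), 1), "->", round(out.mean(), 1))
```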

Study on Predictable Program of Fire·Explosion Accident Using Poisson Distribution Function & Societal Risk Criteria in City Gas (Poisson분포를 이용한 도시가스 화재 폭발사고의 발생 예측프로그램 및 사회적 위험기준에 관한 연구)

  • Ko, Jae-Sun;Kim, Hyo;Lee, Su-Kyoung
    • Fire Science and Engineering
    • /
    • v.20 no.1 s.61
    • /
    • pp.6-14
    • /
    • 2006
  • City gas accident data have been collected and analysed not only to predict fire and explosion accidents but also to establish societal risk criteria. Accidents from the recent 11 years have been classified roughly into 3 groups, "release", "explosion", and "fire", and into 16 groups in detail. Based on the Poisson probability distribution functions, the 'careless work-explosion-pipeline' and 'joint loosening & erosion-release-pipeline' items turned out to have the lowest and highest frequencies, respectively, among the accidents of the recent 11 years, and proper countermeasures must therefore be carried out. To assess the societal risk tendency of fatal gas accidents and establish clearer safety policies, the D. O. Hogon equation and the regression method have been used to determine the acceptable range in the F-N curve of cumulative casualties. Further work requires systematically building a continuous database of fire and explosion accidents to obtain reliable analyses; standard codification will also be required.
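The Poisson step above amounts to estimating a mean accident rate per category from the historical record and evaluating the Poisson probability mass function at the counts of interest. A minimal sketch with a hypothetical category (22 events over 11 years) follows; the counts are illustrative, not the paper's data.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of exactly k accidents in one period, given mean rate lam."""
    return lam**k * exp(-lam) / factorial(k)

# Hypothetical accident category: 22 recorded events over 11 years -> lam = 2 per year.
lam = 22 / 11
print("P(no accident next year)  :", round(poisson_pmf(0, lam), 4))
print("P(exactly two accidents)  :", round(poisson_pmf(2, lam), 4))
print("P(at least one accident)  :", round(1 - poisson_pmf(0, lam), 4))
```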

CHARACTERIZATIONS BASED ON THE INDEPENDENCE OF THE EXPONENTIAL AND PARETO DISTRIBUTIONS BY RECORD VALUES

  • LEE MIN-YOUNG;CHANG SE-KYUNG
    • Journal of applied mathematics & informatics
    • /
    • v.18 no.1_2
    • /
    • pp.497-503
    • /
    • 2005
  • This paper presents characterizations of the independence of the exponential and Pareto distributions by record values. Let $\{X_n,\; n \ge 1\}$ be a sequence of independent and identically distributed (i.i.d.) random variables with a continuous cumulative distribution function (cdf) $F(x)$ and probability density function (pdf) $f(x)$. Let $Y_n = \max\{X_1, X_2, \ldots, X_n\}$ for $n \ge 1$. We say $X_j$ is an upper record value of $\{X_n,\; n \ge 1\}$ if $Y_j > Y_{j-1}$, $j > 1$. The indices at which the upper record values occur are given by the record times $\{u(n)\}$, $n \ge 1$, where $u(n) = \min\{j \mid j > u(n-1),\; X_j > X_{u(n-1)}\}$ for $n \ge 2$ and $u(1) = 1$. Then $F(x) = 1 - e^{-x/\sigma}$, $x > 0$, $\sigma > 0$ if and only if $\frac{X_{u(n)}}{X_{u(n+1)}}$ and $X_{u(n+1)}$, $n \ge 1$, are independent. Also, $F(x) = 1 - x^{-\theta}$, $x > 1$, $\theta > 0$ if and only if $\frac{X_{u(n+1)}}{X_{u(n)}}$ and $X_{u(n)}$, $n \ge 1$, are independent.
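The exponential direction of the characterization above can be checked empirically: simulate upper record values from an exponential sample and verify that $X_{u(n)}/X_{u(n+1)}$ is uncorrelated with $X_{u(n+1)}$ (zero correlation is a necessary consequence of independence, not a proof of it). The record index and sample sizes below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(5)

def upper_records(sample):
    """Upper record values of a sequence (the first observation is a record)."""
    running_max = np.maximum.accumulate(sample)
    is_record = np.concatenate(([True], running_max[1:] > running_max[:-1]))
    return sample[is_record]

n = 3                                   # inspect the pair (X_u(3), X_u(4)), illustrative
ratios, next_records = [], []
for _ in range(3000):
    rec = upper_records(rng.exponential(scale=1.0, size=5000))
    if len(rec) > n:                    # need at least n + 1 records in this replication
        ratios.append(rec[n - 1] / rec[n])
        next_records.append(rec[n])
corr = np.corrcoef(ratios, next_records)[0, 1]
print(f"sample correlation (should be near 0): {corr:.3f}")
```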