• Title/Summary/Keyword: Optimal Distribution Estimation

218 search results

Discriminant analysis using empirical distribution function

  • Kim, Jae Young;Hong, Chong Sun
    • Journal of the Korean Data and Information Science Society / v.28 no.5 / pp.1179-1189 / 2017
  • In this study, we propose an alternative method for discriminant analysis that uses a multivariate empirical distribution function to express multivariate data as a simple one-dimensional statistic. The method amounts to estimating an optimal threshold, based on classification accuracy measures, from the empirical distribution function of data composed of classes. The result can also be represented visually on a two-dimensional plane and discussed in terms of measures from ROC curves, surfaces, and manifolds. To explore the usefulness of this method for discriminant analysis, we compare the proposed method with existing methods through simulations and illustrative examples. The proposed method is found to perform better in some cases.
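As a loose illustration of the idea above (not the authors' implementation): reduce each multivariate point to a single scalar with a multivariate empirical distribution function, then pick the threshold on that scalar that maximizes classification accuracy. All data, class means, and sizes below are synthetic assumptions.

```python
import numpy as np

def mv_ecdf(train, x):
    """Multivariate empirical distribution function: the fraction of
    training points componentwise <= x (a d-dimensional point becomes
    one scalar in [0, 1])."""
    return float(np.mean(np.all(train <= x, axis=1)))

def best_threshold(scores, labels):
    """Choose the cut on the 1-D ECDF statistic maximizing accuracy."""
    cands = np.unique(scores)
    accs = [np.mean((scores >= c) == labels) for c in cands]
    return float(cands[int(np.argmax(accs))])

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=(200, 2))     # synthetic class 0
b = rng.normal(1.5, 1.0, size=(200, 2))     # synthetic class 1
train = np.vstack([a, b])
labels = np.array([0] * 200 + [1] * 200)
scores = np.array([mv_ecdf(train, x) for x in train])
t = best_threshold(scores, labels)
acc = float(np.mean((scores >= t) == labels))
```

Because the scalar statistic lives in [0, 1], the chosen threshold is directly comparable across datasets, which is what enables the ROC-style visualization the abstract mentions.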

On the New Age Replacement Policy (새로운 연령교체 방식의 개발)

  • Seo, Sun-Keun
    • Journal of Applied Reliability / v.16 no.4 / pp.280-286 / 2016
  • Purpose: Recently, Jiang defined the tradeoff B life, which minimizes the sum of the life lost to the preventive maintenance (PM) and corrective maintenance (CM) contributions, and set the optimal replacement age of the age replacement policy to this tradeoff life. In this paper, Jiang's model, which considers only the known lifetime distribution, is extended by assigning different weights to the PM and CM parts in order to reflect practical maintenance situations. Methods: The new age replacement model is formulated, and the meaning of the weight factor is expressed through the implied cost of failure under the asymptotic expected cost model; it is also discussed under a one-cycle expected cost criterion. Results: The proposed model is applied to Weibull and lognormal lifetime distributions, and optimal PM replacement ages are derived with the corresponding implied cost of failure. Conclusion: A new age replacement policy is provided that avoids estimating the cost of failure required by the classical asymptotic expected cost criterion based on the renewal process.
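For context, a minimal sketch of the classical renewal-reward age replacement model that Jiang's tradeoff-life criterion modifies (this is the standard asymptotic cost-rate form, not the weighted variant of the paper; the Weibull parameters and PM/CM costs are illustrative assumptions):

```python
import numpy as np

def cost_rate(T, beta, eta, cp, cf, n=2000):
    """Asymptotic expected cost per unit time for age replacement at age T
    under a Weibull(beta, eta) lifetime: replace preventively at cost cp,
    or correctively at cost cf if failure comes first."""
    F = lambda t: 1.0 - np.exp(-(t / eta) ** beta)
    dt = T / n
    t = (np.arange(n) + 0.5) * dt
    mean_cycle = np.sum(1.0 - F(t)) * dt        # E[min(lifetime, T)]
    return (cp * (1.0 - F(T)) + cf * F(T)) / mean_cycle

# grid search for the optimal PM age (all parameters illustrative)
Ts = np.linspace(0.1, 3.0, 300)
costs = [cost_rate(T, beta=2.5, eta=1.0, cp=1.0, cf=5.0) for T in Ts]
T_opt = float(Ts[int(np.argmin(costs))])
```

With an increasing failure rate (beta > 1) and cf > cp, the cost rate has an interior minimum, which is the optimal replacement age.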

ATSC Digital Television Signal Detection with Spectral Correlation Density

  • Yoo, Do-Sik;Lim, Jongtae;Kang, Min-Hong
    • Journal of Communications and Networks / v.16 no.6 / pp.600-612 / 2014
  • In this paper, we consider the problem of spectrum sensing for Advanced Television Systems Committee (ATSC) digital television (DTV) signal detection. To exploit the cyclostationarity of ATSC DTV signals, we employ the spectral correlation density (SCD) as the decision statistic and propose an optimal detection algorithm. The major difficulty is in obtaining the probability distribution functions of the SCD. To overcome this difficulty, we probabilistically model the pilot frequency location and employ a Gaussian approximation for the SCD distribution. We then obtain a practically implementable detection algorithm that outperforms the industry-leading systems by 2-3 dB. We also propose various techniques that greatly reduce the system complexity with a performance degradation of only a few tenths of a decibel. Finally, we show how robust the system is to estimation errors in the noise power spectral density level and in the probability distribution of the pilot frequency location.
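The Gaussian approximation step can be sketched generically: once the decision statistic is modeled as Gaussian under each hypothesis, the detection threshold follows from the target false-alarm rate. The means and variances below are illustrative stand-ins, not the paper's actual SCD statistics.

```python
import math
import numpy as np

def q_inv(p, lo=-10.0, hi=10.0):
    """Inverse Gaussian tail function: solve Q(x) = p for x by bisection,
    where Q(x) = 0.5*erfc(x/sqrt(2))."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Gaussian-approximated detection (illustrative stand-in for the SCD
# statistic): ~N(0,1) under noise only, ~N(3,1) when a pilot is present.
pfa_target = 0.01
thresh = q_inv(pfa_target)                 # threshold for 1% false alarm
rng = np.random.default_rng(5)
h0 = rng.normal(0.0, 1.0, 100000)          # noise-only trials
h1 = rng.normal(3.0, 1.0, 100000)          # signal-present trials
pfa = float(np.mean(h0 > thresh))
pd = float(np.mean(h1 > thresh))
```

The empirical false-alarm rate matches the design target while the detection probability depends on the separation between the two hypotheses, which is where the paper's 2-3 dB gain materializes.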

A Study on Development Cost Attributes Analysis of NHPP Software Reliability Model Based on Rayleigh Distribution and Inverse Rayleigh Distribution (레일리 분포와 역-레일리 분포에 근거한 NHPP 소프트웨어 신뢰성 모형의 개발비용 속성 분석에 관한 연구)

  • Yang, Tae-Jin
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.12 no.6 / pp.554-560 / 2019
  • In this study, the finite-failure NHPP Rayleigh and NHPP Inverse Rayleigh distribution models, which are widely used in the software reliability field, were applied to a software development cost model, and their development cost attributes and optimal release times were compared and analyzed. Software failure time data were used for the analysis; parameters were estimated by the maximum likelihood method, and the resulting nonlinear equations were solved using the bisection method. As a result, the Rayleigh model was confirmed to be relatively superior to the Inverse Rayleigh model, because its software development cost is lower and its software release time is earlier. Through this study, the development cost attributes of the Rayleigh and Inverse Rayleigh models, for which no prior research examples exist, were newly analyzed. We also expect that software developers can use this study as a basic guideline for exploring software reliability improvement methods and development cost attributes.
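To make the "bisection on a nonlinear equation" step concrete: a Rayleigh NHPP mean value function combined with a hypothetical linear cost model yields an optimality condition dC/dT = 0 that bisection solves directly. The cost model and all coefficients below are illustrative assumptions, not the paper's fitted values.

```python
import math

def bisection(g, lo, hi, tol=1e-8, max_iter=200):
    """Standard bisection root finder; g(lo) and g(hi) must differ in sign."""
    g_lo = g(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        g_mid = g(mid)
        if abs(g_mid) < tol or hi - lo < tol:
            return mid
        if (g_lo < 0) == (g_mid < 0):
            lo, g_lo = mid, g_mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Rayleigh NHPP mean value function m(T) = a*(1 - exp(-T^2 / (2 b^2))),
# with a hypothetical release-cost model C(T) = c1*m(T)
# + c2*(a - m(T)) + c3*T; the optimal release time solves dC/dT = 0.
a, b = 100.0, 10.0
c1, c2, c3 = 1.0, 5.0, 2.0                 # illustrative cost coefficients
m_prime = lambda T: a * T / b ** 2 * math.exp(-T ** 2 / (2 * b ** 2))
dC = lambda T: (c1 - c2) * m_prime(T) + c3

T_opt = bisection(dC, 10.0, 50.0)          # larger root = cost minimum
```

Since debugging post-release (c2) is costlier than pre-release (c1), the derivative changes sign from negative to positive at the larger root, which is the cost-minimizing release time.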

A Study on the Distribution Estimation of Personal Data Leak Incidents (개인정보유출 사고의 분포 추정에 관한 연구)

  • Hwang, Yoon-hee;Yoo, Jinho
    • Journal of the Korea Institute of Information Security & Cryptology / v.26 no.3 / pp.799-808 / 2016
  • To find the pattern of personal data leak incidents and determine which distribution fits them, this paper surveyed the personal data leak incidents reported by the media from 2011 to 2014. Based on the results, this research estimated the statistical distribution using the K-S statistic and tested the goodness of fit. As a result, it was shown quantitatively that, at the 95% significance level, the Poisson and exponential distributions fit well, implying that major personal data leak incidents occur about 12 times a year on average. This study can help organizations predict losses from personal data leak incidents and plan information security investments, and it can also serve as data for cyber-insurance requirements.
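The K-S goodness-of-fit step above can be sketched as follows: compute the one-sample Kolmogorov-Smirnov statistic of observed inter-incident gaps against a fitted exponential distribution. The data here are synthetic placeholders, not the incidents the paper collected.

```python
import numpy as np

def ks_stat_exponential(samples):
    """One-sample Kolmogorov-Smirnov statistic against an exponential
    distribution whose rate is estimated from the data (MLE)."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    lam = 1.0 / x.mean()                    # MLE of the rate
    cdf = 1.0 - np.exp(-lam * x)
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(0, n) / n)
    return float(max(d_plus, d_minus))

rng = np.random.default_rng(1)
# hypothetical inter-incident gaps in months; about 12 incidents a year
# corresponds to a mean gap of roughly one month
gaps = rng.exponential(scale=1.0, size=48)
d = ks_stat_exponential(gaps)
```

A small D relative to the critical value (about 1.36/sqrt(n) at the 5% level) is what supports the exponential fit; an exponential gap distribution is equivalent to Poisson-distributed annual counts.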

The Auto Regressive Parameter Estimation and Pattern Classification of EKG Signals for Automatic Diagnosis (심전도 신호의 자동분석을 위한 자기회귀모델 변수추정과 패턴분류)

  • 이윤선;윤형로
    • Journal of Biomedical Engineering Research / v.9 no.1 / pp.93-100 / 1988
  • This paper presents the results of pattern discriminant analysis of an AR (autoregressive) model parameter group representing HRV (heart rate variability), treated as time-series data. HRV data were extracted using the correct R-point of the EKG wave, which was A/D converted from the I/O port by both hardware and software functions. The data length (N) and optimal order (P) used for the analysis were determined using Burg's maximum entropy method and Akaike's Information Criterion test. Representative values were extracted from the distribution of the results and used as the index for determining the range of the pattern discriminant analysis. By carrying out the pattern discriminant analysis, the clustering performance was checked, creating the test pattern where the clustering was optimal. The results showed, first, that the HRV data were sufficient to ensure stationarity, and second, that the pattern discriminant analysis was able to discriminate even though the optimal order of each syndrome was dissimilar.
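The AR-order-selection step can be sketched as follows. For simplicity this uses a least-squares AR fit rather than Burg's maximum entropy method, and a synthetic AR(2) series stands in for a real HRV recording; both are assumptions for illustration.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit; returns coefficients and residual variance."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef, float(np.mean(resid ** 2))

def aic_order(x, p_max=10):
    """Pick the AR order minimizing Akaike's Information Criterion."""
    best_p, best_aic = 1, np.inf
    for p in range(1, p_max + 1):
        _, s2 = fit_ar(x, p)
        aic = len(x) * np.log(s2) + 2 * p
        if aic < best_aic:
            best_p, best_aic = p, aic
    return best_p

rng = np.random.default_rng(2)
x = np.zeros(600)                     # synthetic stand-in for an HRV series
for t in range(2, 600):               # true model: AR(2)
    x[t] = 0.7 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
p_hat = aic_order(x)
coef2, s2 = fit_ar(x, 2)
```

The fitted AR coefficients then serve as the low-dimensional feature vector on which the pattern discriminant analysis operates.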


Evolutionary Algorithms with Distribution Estimation by Variational Bayesian Mixtures of Factor Analyzers (변분 베이지안 혼합 인자 분석에 의한 분포 추정을 이용하는 진화 알고리즘)

  • Cho Dong-Yeon;Zhang Byoung-Tak
    • Journal of KIISE:Software and Applications / v.32 no.11 / pp.1071-1083 / 2005
  • By estimating the probability distributions of the good solutions in the current population, some researchers try to find the optimal solution more efficiently. In particular, finite mixtures of distributions play a very useful role in dealing with complex problems. However, it is difficult to choose the number of components in a mixture model and to merge the superior partial solutions represented by each component. In this paper, we propose a new continuous evolutionary optimization algorithm that estimates the distribution with variational Bayesian mixtures of factor analyzers. This technique can estimate the number of mixture components automatically and combine good sub-solutions by sampling new individuals using the latent variables. In a comparison with two probabilistic model-based evolutionary algorithms, the proposed scheme achieves superior performance on traditional benchmark function optimization. We also successfully estimate the parameters of an S-system for the dynamic modeling of biochemical networks.
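The core loop of any estimation-of-distribution algorithm is: select elite solutions, refit the probabilistic model, resample. The sketch below uses a single diagonal Gaussian, a drastic simplification of the variational Bayesian mixture-of-factor-analyzers model in the paper; all parameters and the benchmark function are illustrative.

```python
import numpy as np

def gaussian_eda(f, dim, pop=100, elite_frac=0.3, iters=60, seed=3):
    """Toy estimation-of-distribution algorithm with one diagonal Gaussian:
    fit the model to the elite solutions each generation and resample."""
    rng = np.random.default_rng(seed)
    mu = np.full(dim, 3.0)                 # start away from the optimum
    sigma = np.full(dim, 2.0)
    n_elite = int(elite_frac * pop)
    for _ in range(iters):
        X = rng.normal(mu, sigma, size=(pop, dim))
        scores = np.array([f(x) for x in X])
        elite = X[np.argsort(scores)[:n_elite]]
        mu = elite.mean(axis=0)                        # refit the model
        sigma = 0.5 * sigma + 0.5 * elite.std(axis=0)  # smoothed variance
    return mu

sphere = lambda x: float(np.sum(x ** 2))   # benchmark, minimum 0 at origin
x_star = gaussian_eda(sphere, dim=5)
```

Replacing the single Gaussian with a mixture model, as the paper does, lets the search track several promising regions at once instead of collapsing onto one.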

Maximum A Posteriori Estimation-based Adaptive Search Range Decision for Accelerating HEVC Motion Estimation on GPU

  • Oh, Seoung-Jun;Lee, Dongkyu
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.9 / pp.4587-4605 / 2019
  • High Efficiency Video Coding (HEVC) suffers from high computational complexity due to its quad-tree structure in motion estimation (ME). This paper proposes an adaptive search range decision algorithm for accelerating HEVC integer-pel ME on a GPU, which estimates the optimal search range (SR) using a MAP (Maximum A Posteriori) estimator. There are three main contributions. First, we define the motion feature as the standard deviation of the motion vector difference values in a CTU. Second, a MAP estimator is proposed that theoretically estimates the motion feature of the current CTU from the motion feature of a temporally adjacent CTU and its SR, without any data dependency; thus the SR for the current CTU is determined in parallel. Finally, the values of the prior distribution and the likelihood for each discretized motion feature are computed in advance and stored in a look-up table to further reduce the computational complexity. Experimental results on conventional HEVC test sequences show that the proposed algorithm achieves high average time reductions with no subjective quality loss and little BD-bitrate increase.
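The look-up-table MAP step can be sketched generically: multiply a prior over discretized motion-feature bins by the likelihood of the observed bin, take the argmax, and map it to a search range. The tables, bin count, and feature-to-SR mapping below are hypothetical stand-ins, not the paper's trained values.

```python
import numpy as np

# Hypothetical precomputed tables over five discretized motion-feature bins.
prior = np.array([0.35, 0.30, 0.20, 0.10, 0.05])   # P(true feature bin)
# likelihood[i, j] = P(observe bin j in the co-located CTU | true bin i)
likelihood = np.full((5, 5), 0.1)
np.fill_diagonal(likelihood, 0.6)                  # each row sums to 1

def map_search_range(obs_bin):
    """MAP estimate of the current CTU's motion-feature bin from the
    temporally adjacent CTU's observation, mapped to a search range."""
    posterior = prior * likelihood[:, obs_bin]     # unnormalized posterior
    m_hat = int(np.argmax(posterior))
    return 8 * (m_hat + 1)                         # hypothetical SR in pels
```

Because each CTU's decision reads only precomputed tables and the co-located CTU's value, every CTU can decide its SR in parallel, which is what makes the scheme GPU-friendly.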

Estimation of Drought Rainfall According to Consecutive Duration and Return Period Using Probability Distribution (확률분포에 의한 지속기간 및 빈도별 가뭄우량 추정)

  • Lee, Soon Hyuk;Maeng, Sung Jin;Ryoo, Kyong Sik
    • Proceedings of the Korea Water Resources Association Conference / 2004.05b / pp.1103-1106 / 2004
  • The objective of this study is to derive design drought rainfall using the L-moment methodology, including tests of homogeneity, independence, and outliers, for annual minimum monthly rainfall data from 57 rainfall stations in Korea, over consecutive durations of 1, 2, 4, 6, 9, and 12 months. To select an appropriate distribution for the annual minimum monthly rainfall data at each station, the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto (GPA) distributions are applied, and their appropriateness is judged by the L-moment ratio diagram and the Kolmogorov-Smirnov (K-S) test. For the annual minimum monthly rainfall measured at each station and that simulated by Monte Carlo techniques, the parameters of the selected GEV and GPA distributions are calculated by the L-moment methodology and the design drought rainfall is derived. Through comparative analysis of the design drought rainfall derived from the GEV and GPA distributions at each station, the optimal design drought rainfall by rainfall station is provided.
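The L-moment parameter estimation mentioned above can be sketched for the GEV case using Hosking's well-known approximation for the shape parameter; the rainfall series here is replaced by synthetic draws from a known GEV, so only the estimator itself mirrors the methodology.

```python
import math
import numpy as np

def gev_lmoments(samples):
    """GEV fit by L-moments (Hosking's shape approximation); returns
    location xi, scale alpha, shape k in Hosking's sign convention."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()                                  # probability weighted moments
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    t3 = l3 / l2                                   # L-skewness
    c = 2.0 / (3.0 + t3) - math.log(2.0) / math.log(3.0)
    k = 7.8590 * c + 2.9554 * c * c
    alpha = l2 * k / (math.gamma(1 + k) * (1 - 2.0 ** (-k)))
    xi = l1 - alpha * (1 - math.gamma(1 + k)) / k
    return xi, alpha, k

rng = np.random.default_rng(4)
# synthetic annual-minimum series from a known GEV (xi=0, alpha=1, k=0.2)
u = rng.uniform(size=2000)
sample = (1.0 - (-np.log(u)) ** 0.2) / 0.2
xi, alpha, k = gev_lmoments(sample)
```

L-moment estimators are preferred in regional frequency analysis because they are far less sensitive to outliers than ordinary moments, which matters for short hydrological records.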


Optimum failure-censored step-stress partially accelerated life test for the truncated logistic life distribution

  • Srivastava, P.W.;Mittal, N.
    • International Journal of Reliability and Applications / v.13 no.1 / pp.19-35 / 2012
  • This paper presents an optimum design of a step-stress partially accelerated life test (PALT) plan that allows the test condition to be changed from use to accelerated conditions upon the occurrence of a fixed number of failures. Various life distribution models, such as the exponential, Weibull, log-logistic, and Burr type-XII, have been used in the literature to analyze PALT data. Different life distribution models are needed because, with the limited data typical of modern high-reliability devices, using the correct life distribution model helps prevent unnecessary and expensive planned replacements. Truncated distributions arise when sample selection is not possible in some sub-region of the sample space. In this paper, it is assumed that the lifetimes of the items follow the logistic distribution truncated at zero, since the time to failure of an item cannot be negative. The optimum step-stress PALT plan, which finds the optimal proportion of units failed at the normal use condition, is determined using the D-optimality criterion. The method is explained with a numerical example, and sensitivity and comparative analyses are also carried out.
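The D-optimality criterion mentioned above chooses the design that maximizes the determinant of the Fisher information matrix. A minimal sketch on a toy two-level linear life-stress model (deliberately not the truncated-logistic PALT information matrix of the paper):

```python
import numpy as np

def d_criterion(p, s_use=0.0, s_acc=1.0):
    """Determinant of the (normalized) information matrix for a two-level
    stress design with a fraction p of units at use stress, under a simple
    linear life-stress model y = a + b*s + error."""
    sbar = p * s_use + (1 - p) * s_acc
    s2bar = p * s_use ** 2 + (1 - p) * s_acc ** 2
    M = np.array([[1.0, sbar], [sbar, s2bar]])
    return float(np.linalg.det(M))

ps = np.linspace(0.01, 0.99, 99)
p_opt = float(ps[int(np.argmax([d_criterion(p) for p in ps]))])
```

Here det M reduces to p(1-p)(s_acc - s_use)^2, so the toy optimum splits units evenly; the paper's optimal proportion differs because its information matrix comes from the truncated logistic likelihood.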
