• Title/Summary/Keyword: multi-class probability estimates

Search Results: 3

Prediction of Protein Subcellular Localization using Label Power-set Classification and Multi-class Probability Estimates (레이블 멱집합 분류와 다중클래스 확률추정을 사용한 단백질 세포내 위치 예측)

  • Chi, Sang-Mun
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.10 / pp.2562-2570 / 2014
  • Knowledge of protein subcellular localization is an important hint for inferring the function of unknown proteins. Recently, there has been considerable research on predicting the subcellular localization of proteins that exist simultaneously at multiple subcellular locations. In this paper, label power-set classification is improved for the accurate prediction of multiple subcellular localizations. The multi-labels predicted by the label power-set classifier are combined with their prediction probabilities to give the final result. To obtain accurate multi-class probability estimates, this paper employs the pair-wise comparison and error-correcting output code frameworks. Prediction experiments on protein subcellular localization show significant performance improvement.
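
A minimal sketch of the label power-set idea described in this abstract: each distinct combination of localization labels is mapped to one power-set class, a multi-class classifier whose class probabilities come from pairwise coupling is trained on those classes, and predictions are decoded back into label sets together with their probabilities. The Python/scikit-learn code, toy features, and function names below are illustrative assumptions, not the authors' implementation (their error-correcting output code coupling is omitted).

    import numpy as np
    from sklearn.svm import SVC

    def fit_label_powerset(X, Y):
        # Map every distinct row of the binary label matrix Y to one
        # power-set class, then fit a multi-class classifier whose class
        # probabilities come from pairwise (one-vs-one) coupling.
        combos, y = np.unique(Y, axis=0, return_inverse=True)
        clf = SVC(kernel="rbf", probability=True)
        clf.fit(X, y.ravel())
        return clf, combos

    def predict_multilabel(clf, combos, X, top_k=2):
        # For each sample, return the label sets of the top_k most probable
        # power-set classes together with their estimated probabilities.
        proba = clf.predict_proba(X)
        order = np.argsort(proba, axis=1)[:, ::-1][:, :top_k]
        return [[(combos[c], proba[i, c]) for c in row]
                for i, row in enumerate(order)]

    # Toy usage: random "protein feature" vectors and 3 localization labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))
    Y = rng.integers(0, 2, size=(200, 3))
    clf, combos = fit_label_powerset(X, Y)
    print(predict_multilabel(clf, combos, X[:2]))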

Multi-focus Image Fusion Technique Based on Parzen-windows Estimates (Parzen 윈도우 추정에 기반한 다중 초점 이미지 융합 기법)

  • Atole, Ronnel R.; Park, Daechul
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.8 no.4 / pp.75-88 / 2008
  • This paper presents a spatial-level, nonparametric multi-focus image fusion technique based on kernel estimates of the class-conditional probability density functions underlying input image blocks. Image fusion is approached as a classification task whose posterior class probabilities, $P(w_i \mid B_i^{kl})$, are calculated with likelihood density functions estimated from the training patterns. For each of the $C$ input images $I_i$, the proposed method defines a class $w_i$ and forms the fused image $Z(k,l)$ from a decision map represented by a set of $P \times Q$ blocks $B_i^{kl}$ whose features maximize a discriminant function based on the Bayesian decision principle. Performance of the proposed technique is evaluated in terms of RMSE and mutual information (MI) as output quality measures. The width of the kernel functions, $\sigma$, was varied, and different kernels and block sizes were applied in the performance evaluation. The proposed scheme was tested with $C=2$ and $C=3$ input images, and the results exhibited good performance.
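
A minimal sketch of the Parzen-window classification step described above, assuming each image block is summarized by a small focus-feature vector and that labelled training patterns are available; the function names, the feature choice, and the use of scikit-learn's KernelDensity are assumptions for illustration, not the authors' implementation.

    import numpy as np
    from sklearn.neighbors import KernelDensity

    def fit_class_densities(train_feats, train_labels, sigma=0.5):
        # One Gaussian Parzen-window estimate of p(x | w_i) per class
        # (i.e. per source image), with kernel width sigma.
        models = {}
        for c in np.unique(train_labels):
            kde = KernelDensity(kernel="gaussian", bandwidth=sigma)
            kde.fit(train_feats[train_labels == c])
            models[c] = kde
        return models

    def decision_map(models, block_feats, priors=None):
        # Bayesian decision rule: for each block, pick the class whose
        # log-likelihood plus log-prior is largest, i.e. the source image
        # judged to be in focus at that block position.
        classes = sorted(models)
        if priors is None:
            priors = {c: 1.0 / len(classes) for c in classes}
        scores = np.stack([models[c].score_samples(block_feats) + np.log(priors[c])
                           for c in classes], axis=1)
        return np.asarray(classes)[np.argmax(scores, axis=1)]

The fused image would then be assembled by copying each $P \times Q$ block from the source image selected by the decision map.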


A Note on the Bias in the Multi-nomial Classification (다항분류상 편의에 관한 연구)

  • 윤용운
    • Journal of Korean Society of Industrial and Systems Engineering / v.1 no.1 / pp.45-48 / 1978
  • If two inspectors classify the items in a lot into $m$ classes, it is possible that each of them misclassifies some items, thus causing bias. Expressions are obtained for the limits of this bias in estimating the proportions of the different classes. From the results of their classifications, limits for the estimates of the proportions are worked out, based on assumptions regarding the magnitudes of the probabilities of misclassification. Suppose that $P_{ti}\ (t=1,2)$ is the probability that the $t$-th inspector correctly classifies an item in class $A_i$, and $q_{tji}$ is the probability that he misclassifies into $A_j$ an item actually belonging to $A_i$, so that $P_{ti} + \sum_{j \neq i} q_{tji} = 1$. An estimate of the proportion $P_k$ of class $A_k$ in the lot is $\hat{P}_k = r_{kk} + \frac{1}{2}\sum_{j \neq k}\left(r_{kj} + r_{jk}\right)$, and the % bias in $\hat{P}_k$ is $\frac{E(\hat{P}_k) - P_k}{P_k} \times 100$.
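
As a concrete illustration of the estimator and the % bias formula above, the toy computation below assumes that $r_{kj}$ is the proportion of items classified into $A_k$ by the first inspector and into $A_j$ by the second, and that the two inspectors err independently given the true class; both are assumptions made for illustration, and the numerical misclassification rates are invented.

    import numpy as np

    P = np.array([0.5, 0.3, 0.2])   # true class proportions P_i
    # C[t, j, i]: probability that inspector t classifies an item of class A_i into A_j
    C = np.array([
        [[0.90, 0.08, 0.05],
         [0.06, 0.85, 0.10],
         [0.04, 0.07, 0.85]],
        [[0.92, 0.05, 0.06],
         [0.05, 0.90, 0.04],
         [0.03, 0.05, 0.90]],
    ])

    # Expected joint classification proportions r_{kj} (independence assumed)
    r = np.einsum("i,ki,ji->kj", P, C[0], C[1])

    m = len(P)
    P_hat = np.array([r[k, k] + 0.5 * sum(r[k, j] + r[j, k] for j in range(m) if j != k)
                      for k in range(m)])
    pct_bias = (P_hat - P) / P * 100
    print(P_hat, pct_bias)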
