• Title/Abstract/Keyword: Kullback-Leibler's directed divergence

Search results: 1

On a Balanced Classification Rule

  • Kim, Hea-Jung
    • Journal of the Korean Statistical Society / Vol.24 No.2 / pp.453-470 / 1995
  • We describe a constrained optimal classification rule for the case in which the prior probability that an observation belongs to one of the two populations is unknown. This is done by suggesting a balanced design for the classification experiment and constructing the optimal rule under the balanced-design condition. The rule is characterized by a constrained minimization of the total risk of misclassification; the constraint is constructed by equating the Kullback-Leibler directed divergence measures obtained from the two population conditional densities. The efficacy of the suggested rule is examined through two-group normal classification, which indicates that, when little is known about the relative population sizes, dramatic gains in classification accuracy can be achieved.

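The abstract builds its constraint from the two Kullback-Leibler directed divergences between the population conditional densities. As a rough illustration only, and not the paper's actual balanced rule, the sketch below computes the two directed divergences I(1:2) and I(2:1) in closed form for a pair of univariate normal populations and classifies an observation by its log-likelihood ratio; the means, standard deviations, and cutoff value are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical two-group normal populations (parameter values are illustrative only).
mu1, sd1 = 0.0, 1.0
mu2, sd2 = 2.0, 1.5

def kl_directed_normal(mu_a, sd_a, mu_b, sd_b):
    """Kullback-Leibler directed divergence I(a:b) = E_a[log f_a(X) - log f_b(X)]
    between two univariate normal densities, in closed form."""
    var_a, var_b = sd_a ** 2, sd_b ** 2
    return np.log(sd_b / sd_a) + (var_a + (mu_a - mu_b) ** 2) / (2.0 * var_b) - 0.5

# The two directed divergences obtained from the population conditional densities.
I_12 = kl_directed_normal(mu1, sd1, mu2, sd2)  # I(1:2)
I_21 = kl_directed_normal(mu2, sd2, mu1, sd1)  # I(2:1)
print(f"I(1:2) = {I_12:.4f}, I(2:1) = {I_21:.4f}")

def classify(x, cutoff=0.0):
    """Log-likelihood-ratio rule: assign x to population 1 when
    log f1(x) - log f2(x) exceeds the cutoff (0 corresponds to equal priors)."""
    llr = norm.logpdf(x, mu1, sd1) - norm.logpdf(x, mu2, sd2)
    return 1 if llr > cutoff else 2

print(classify(0.5), classify(1.8))
```

In the paper the cutoff is not the naive zero used here; it is determined through the constraint obtained by equating the divergence-based quantities, which is what yields the balanced behaviour when the prior probabilities are unknown.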