Abstract
We describe a constrained optimal classification rule for the case in which the prior probability that an observation belongs to one of the two populations is unknown. This is done by suggesting a balanced design for the classification experiment and constructing the optimal rule under the balanced-design condition. The rule is characterized by a constrained minimization of the total risk of misclassification; the constraint is constructed by equating the Kullback-Leibler directed divergence measures obtained from the two population conditional densities. The efficacy of the suggested rule is examined through two-group normal classification, which indicates that, when little is known about the relative population sizes, dramatic gains in classification accuracy can be achieved.
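As a rough numerical illustration of the balanced two-group normal setting, the sketch below is an assumption for exposition, not the paper's derivation: for two equal-variance normal populations, the likelihood-ratio rule with the midpoint cutoff equalizes the two conditional error rates, which serves here as a stand-in for the divergence-equating (balanced-design) constraint. The function `classify` and all parameter values are hypothetical.

```python
import random

def classify(x, mu1, mu2):
    # Likelihood-ratio rule for two equal-variance normals.
    # The midpoint cutoff balances the two conditional error
    # rates -- used here as a stand-in for the paper's
    # divergence-equating (balanced-design) constraint.
    cut = (mu1 + mu2) / 2.0
    return 1 if x < cut else 2  # assumes mu1 < mu2

random.seed(0)
mu1, mu2, sigma = 0.0, 2.0, 1.0  # hypothetical parameter values
n = 10000

# Estimate the conditional misclassification rate for each population.
err1 = sum(classify(random.gauss(mu1, sigma), mu1, mu2) != 1
           for _ in range(n)) / n
err2 = sum(classify(random.gauss(mu2, sigma), mu1, mu2) != 2
           for _ in range(n)) / n

# With equal variances, both error rates should be close to
# 1 - Phi((mu2 - mu1) / (2 * sigma)) = 1 - Phi(1), about 0.159.
print(err1, err2)
```

Because the cutoff does not depend on the unknown prior, the two conditional error rates stay balanced regardless of the relative population sizes, which is the point of the balanced design.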