ON THE QUANTIZERS FOR SMALL TRAINING SEQUENCES

  • Kim, Dong-Sik (School of Electronics and Information Engineering, Hankuk University of Foreign Studies)
  • Published: 2009.01.12

Abstract

In order to design a good quantizer for the underlying distribution from a training sequence (TS), the traditional approach is to seek the empirical minimum based on the empirical risk minimization principle. As the size of the TS increases, this approach yields a good quantizer for the true distribution. However, when the TS is relatively small, searching for the empirical minimum causes an overfitting problem that can even worsen the performance of the trained codebook. In this paper, the performance of codebooks trained on small TSs is studied, and it is shown that a piecewise uniform codebook can outperform an empirically minimized codebook.
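The overfitting phenomenon described above can be illustrated with a minimal sketch, not the paper's actual method: a scalar codebook is trained on a tiny Gaussian TS with the Lloyd algorithm (a standard empirical-risk minimizer for quantizer design), and its distortion on the TS and on fresh test data is compared against a fixed uniform codebook that ignores the TS entirely. The Gaussian source, the support [-3, 3], and all sizes are illustrative assumptions.

```python
import random

random.seed(0)

def lloyd_train(ts, k, iters=50):
    """Train a k-point scalar codebook on ts via the Lloyd algorithm
    (empirical risk minimization of squared-error distortion)."""
    codebook = sorted(random.sample(ts, k))  # initialize from TS samples
    for _ in range(iters):
        # nearest-neighbor partition of the TS
        cells = [[] for _ in range(k)]
        for x in ts:
            i = min(range(k), key=lambda j: (x - codebook[j]) ** 2)
            cells[i].append(x)
        # centroid update; keep old codeword if a cell is empty
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return codebook

def mse(codebook, data):
    """Mean squared-error distortion of data under nearest-codeword mapping."""
    return sum(min((x - c) ** 2 for c in codebook) for x in data) / len(data)

k = 8
ts = [random.gauss(0, 1) for _ in range(16)]        # small TS
test = [random.gauss(0, 1) for _ in range(100000)]  # proxy for true distribution

trained = lloyd_train(ts, k)

# uniform codebook over an assumed support, built without the TS
lo, hi = -3.0, 3.0
step = (hi - lo) / k
uniform = [lo + (i + 0.5) * step for i in range(k)]

print("trained: TS MSE =", mse(trained, ts), " test MSE =", mse(trained, test))
print("uniform: TS MSE =", mse(uniform, ts), " test MSE =", mse(uniform, test))
```

The trained codebook achieves much lower distortion on the TS itself, since eight codewords nearly memorize sixteen samples; on the large test set, however, its advantage shrinks or reverses, which is the small-TS overfitting effect the abstract refers to.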

Keywords